Spatial Data Platform from SpaceCurve for Real-Time Operational Intelligence
The SpaceCurve Spatial Data Platform delivers strong spatial and temporal analytics capabilities. SpaceCurve CEO Dane Coyer and CTO Andrew Rogers tell us more.
Bringing the benefits of Big Data technology to the Operations department requires robust handling of two particular dimensions: space (or location) and time. A key requirement for optimizing logistics is a clear understanding of the “when” and “where” for all inventory. While most of today’s data architectures do include both dimensions, their traditional design places significant limits on the spatial and temporal analysis they can perform. SpaceCurve has developed a promising solution to overcome these limitations and obtain deep insights through space-time analysis.
SpaceCurve has developed the first Spatial Data Platform specifically engineered to organize and enable the analysis of large-scale spatial data. The platform delivers unprecedented time-to-value using space and time as the key indices for all associated data. SpaceCurve continuously fuses geospatial, sensor, IoT, social media, location, and other streaming and historical data while making the data immediately available for analytics and operational intelligence.
The Internet of Things (IoT) requires immediate analysis of sensor representations of the real world for a wide range of purposes, from understanding consumer behavior to optimizing oil and gas production. For example, telecommunications companies can now monitor network sensor data in real time, detecting patterns of human migration to understand the density of people in a given location, physical or virtual, at a particular point in time to improve services. In transportation and logistics, the movement of people and products from one location to another can be optimized by fusing real-time environmental conditions, traffic flows, and any other operational factors to constantly determine the most economical routes possible.
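The density analysis described above can be sketched in a few lines. This is a toy illustration only, not SpaceCurve's API: location pings are bucketed into (time window, grid cell) keys and counted, with the grid and window sizes as assumed parameters.

```python
from collections import Counter

def density(pings, cell_size=1.0, window=60):
    """Count pings per (time window, x cell, y cell) bucket.

    Each ping is a (timestamp, x, y) triple; cell_size and window are
    illustrative units (e.g. degrees and seconds)."""
    counts = Counter()
    for t, x, y in pings:
        key = (int(t // window), int(x // cell_size), int(y // cell_size))
        counts[key] += 1
    return counts

# Three pings: two fall in the first minute in cell (0, 0),
# one falls in the second minute in the same cell.
pings = [(5, 0.2, 0.9), (30, 0.7, 0.1), (70, 0.5, 0.5)]
d = density(pings)
```

A real system would stream these counts continuously rather than batch them, but the keying scheme, discretized space joined with discretized time, is the core of the idea.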
SpaceCurve’s ability to continuously index and store this data concurrent with queries enables fast, interactive spatial analytics that cannot be achieved using technologies currently on the market. This is combined with a computational geometry engine that is uniquely capable of performant and ultra-precise spatial analysis operations to ensure maximum fidelity for applications with global scope.
The product includes a database engine designed to support the continuous ingestion and high-dimensionality indexing of spatial data at the extremely high data rates typical of machine-generated data sources. It also enables ad hoc queries and complex operations that run concurrent with data ingestion, immediately reflecting all new data. SpaceCurve has implemented a new approach to computational parallelism that enables highly scalable ingest, storage, and analysis of complex space and time relationships across diverse, very large data sources. This is all accomplished on commodity Linux clusters, either on customer premise or in the cloud.
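SpaceCurve's indexing internals are not described here, but a standard technique for indexing multidimensional spatial keys on a one-dimensional store (and, given the company's name, a plausibly related one) is a space-filling curve. The sketch below is illustrative only: it interleaves the bits of quantized (x, y) coordinates into a Z-order (Morton) key, so that points close in 2-D space tend to receive nearby keys and a plain sorted structure can serve spatial range queries.

```python
def morton_key(x: int, y: int, bits: int = 16) -> int:
    """Interleave the low `bits` bits of x and y into one sortable key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)      # x bits at even positions
        key |= ((y >> i) & 1) << (2 * i + 1)  # y bits at odd positions
    return key

# Nearby points produce nearby keys; a distant point does not.
near_a = morton_key(3, 5)
near_b = morton_key(3, 6)
far    = morton_key(200, 9)
```

Production engines layer much more on top (time as an extra dimension, adaptive partitioning, concurrent ingest), but the mapping from many dimensions to one sortable key is the common foundation.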
Here is my short interview with SpaceCurve CEO Dane Coyer and CTO Andrew Rogers:
What inspired you to launch SpaceCurve? What are your thoughts on the company's evolution since its launch in 2009?
Big data platforms were originally invented to allow people to analyze the Internet. A decade ago, I was involved in some of the earliest attempts to bring live sensor data into these analyses. There was a big open question at the time: how could we analyze the real world in the same way we were analyzing relationships in the virtual world? We actually proved that this was not possible using big data technologies like Hadoop. SpaceCurve was created to be the first platform capable of analyzing the real world in real-time.
When the company was first launched in 2009, we really had no idea what the shape of the market would be for our platform. The recent emergence of the Internet of Things, ubiquitous sensors, and an increasingly mobile-centric world is creating an enormous market for applications that require a platform with SpaceCurve’s unique capabilities.
What are the major limitations of traditional data architectures in handling space and time dimensions? How does the SpaceCurve spatial data platform enable us to overcome those limitations?
SpaceCurve overcomes two major limitations of traditional data architectures.
First, traditional data architectures are designed for the simple entity relationships common in enterprise, web, and social media data models where relationships are represented by shared keys. Spatiotemporal data models are unique in that all of your keys are multidimensional intervals and relationships are traversed by evaluating key intersection rather than equality. Existing scalable architectures lack the ability to efficiently express this type of relationship. SpaceCurve is the first architecture based on algorithms that can efficiently represent and analyze spatial relationships at extreme scales.
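The distinction Rogers draws can be made concrete. In the sketch below (an illustration, not SpaceCurve code), a "key" is a box of per-dimension intervals, e.g. a time span plus a spatial extent, and two records are related when their boxes intersect, rather than when their keys are equal:

```python
def intervals_intersect(a, b):
    """1-D closed intervals (lo, hi) overlap iff neither ends before the other begins."""
    return a[0] <= b[1] and b[0] <= a[1]

def boxes_intersect(box_a, box_b):
    """N-dimensional boxes (tuples of per-dimension intervals) overlap
    iff they overlap in every dimension."""
    return all(intervals_intersect(a, b) for a, b in zip(box_a, box_b))

# A key here is a (time interval, x interval, y interval) triple.
truck   = ((10, 20), (0, 5),   (0, 5))
storm   = ((15, 30), (3, 9),   (2, 8))
far_off = ((15, 30), (50, 60), (2, 8))

related   = boxes_intersect(truck, storm)    # overlaps in time, x, and y
unrelated = boxes_intersect(truck, far_off)  # disjoint in x, so no relationship
```

An equality-based join cannot express this traversal efficiently; hash partitioning by key, the workhorse of traditional scalable architectures, assumes exact matches, which is the limitation being described.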

Second, the workload profiles typical of modern spatial data applications are not supported by traditional data architectures. Machine-generated spatial data sources tend to be extremely high velocity and high volume, far beyond the design assumptions of most big data platforms, requiring data architectures that can continuously ingest millions of complex records per second and store them to disk. Furthermore, these data models tend to be operational, requiring that these real-time data sources be immediately analyzable concurrent with that ingestion. Stream processing systems do not work because the analytics are rarely summarizations. SpaceCurve developed a completely new massively parallel database design that was purpose-built for these workloads.
What are your favorite applications (or client use cases) built on the SpaceCurve technology? What kind of innovative applications do you anticipate in future?
That is like asking us to pick our favorite child. They really run the gamut, from exploiting data in existing enterprises in ways that were not previously possible to the creation of entirely new businesses that are only now possible. I tend to be drawn to the complexity of fusing numerous existing data sources, both internally generated and externally available, to unlock the value of that data for our clients. Clients may have many silos of spatially attributed data that have previously been ignored, or they have sub-optimized their business processes around the limitations of their existing technology. I have seen cases where there is tremendous value in fusing these data sets in real time and providing insights when they are most valuable, but the current business process takes 24 hours or longer to create a very limited version of what the end customers really want. The value of these insights decays very rapidly.
As far as the future, I am surprised daily by the level of innovation emerging in this new sensor based world. Right now the industry is very focused on human decisions and control of what I will call “single actor” scenarios. A human is in the decision making loop and the sensor technology is generally single purpose or single platform. We are able to spatially fuse this data together at the speed it is generated, but the main consumer is still human. Very soon much of the consumption of this data will be by other machines (things in IoT parlance), and by a wide variety of machines. Think of cooperating distribution networks, with autonomous vehicles being completely aware of their physical surroundings and being able to negotiate the most advantaged outcomes.
How do you assess the maturity level of Location Analytics? What have been the most significant achievements in this area in the last few years? What major developments do you expect in 2015?
The past few years have seen great growth in the Location Analytics arena, moving it from primarily points of interest on map-based media to a much deeper, richer, and more current collection of intelligence about predefined locations. I see these defined locations as containers you fill with details. But it is still primarily “placial” in nature, not spatial. It is still relatively static information placed on a Cartesian system that is very limited in what can be done beyond the “container”. There is also a currency issue, where even the most advanced offerings are updated in batch mode. We will soon see a very different approach to the problem, utilizing real-time streams of information that are fused with historical databases. This will enable not only immediate situational awareness but also the ability to spot patterns and abnormalities by instantaneously comparing what is happening in real time to history. Future capabilities will also not be limited to what is in the container, but will be truly spatial, even global in their capacity.
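The comparison described here, checking a live reading against the historical distribution for the same location, can be sketched as a simple z-score test. This is an illustration only; the threshold and the choice of statistic are assumptions, not SpaceCurve's method.

```python
import statistics

def is_anomalous(live_value, history, z_threshold=3.0):
    """Flag a live reading that sits more than z_threshold standard
    deviations from the historical mean for this location."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return live_value != mean  # flat history: any change is abnormal
    return abs(live_value - mean) / stdev > z_threshold

# Historical readings for one location (e.g. hourly foot traffic).
history = [100, 104, 98, 101, 97, 100]
normal_reading = is_anomalous(102, history)   # within normal variation
spike_reading  = is_anomalous(250, history)   # far outside history
```

The point of the interview's argument is latency: this check is trivial once live and historical data sit in the same queryable store, but impossible to act on if the fused view arrives 24 hours late.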