The Descartes Labs Platform is designed to answer some of the world’s most complex and pressing geospatial analytics questions. Our customers use the platform to build algorithms and models that transform their businesses quickly, efficiently, and cost-effectively.
By giving data scientists and their line-of-business colleagues the best geospatial data and modeling tools in one package, we help turn AI into a core competency.
Data science teams can use our scaling infrastructure to design models faster than ever, using our massive data archive or their own.
The Descartes Labs Platform
Customers rely on our cloud-based platform to quickly and securely scale computer vision, statistical, and machine learning models to inform business decisions with powerful raster-based analytics.
Key features and sample applications
Our extensive API documentation, tutorials, guides, and demos provide a deep knowledge base that allows users to quickly deploy high-value applications across diverse industries, including Agriculture, Oil & Gas, Mining & Metals, Power & Renewables, Shipping & Logistics, Financial Services & Insurance, and more.
Geolocation and AIS
Copernicus Sentinel data 2020
Geospatial data can add invaluable information to modern data science workflows, but it can be challenging to incorporate. Complications often arise for data scientists when they’re confronted with legacy coordinate systems, band combinations, differing resolutions, cloud cover, or seemingly endless post-processing steps.
Beyond imagery issues, further challenges stem from the massive, billion-record vector datasets generated by AIS shipping transponders, mobile devices, ground cameras, and sensors. When fusing such datasets together into a time series analysis, these difficulties can become insurmountable.
In short, we’re living through a sensor revolution and the world needs better tools to prepare and manage all of the data being produced.
The Descartes Labs Platform is such a tool, equipping customers with 15+ petabytes of analysis-ready data from some of the world’s leading satellite constellations and ground-based sensors. Customers use this data to power continental-scale models without the hassle of downloading, data preparation, storage, or compute.
Our data ingestion pipelines perform continuous loading and automatic pre-processing at speeds up to 20 gigabytes per second. The processing stages we go through for a typical image may include:
- Retrieving it from cloud storage
- Uncompressing it
- Parsing the metadata
- Identifying the bounding rectangle that contains valid data
- Cleaning the edges of the image
- Converting the raw pixel information into meaningful units
- Calibrating top of atmosphere reflectance using the appropriate constants for each satellite
- Accounting for solar distance and zenith angle
- Tiling to standard sizes
- Performing any necessary coordinate transformations
- Compressing the data back into a standard format
- Storing the result back into the cloud
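The stages above can be sketched as a chain of small, pure functions. Everything here is illustrative (the function names, calibration constants, and tile size are stand-ins, not the platform's actual pipeline), but the reflectance step follows the standard top-of-atmosphere formula, reflectance = pi * L * d^2 / (ESUN * cos(solar zenith)):

```python
import math

def to_radiance(dn, gain, bias):
    """Convert a raw digital number to at-sensor spectral radiance."""
    return gain * dn + bias

def toa_reflectance(radiance, esun, earth_sun_dist, solar_zenith_deg):
    """Top-of-atmosphere reflectance, accounting for solar distance and zenith angle."""
    return (math.pi * radiance * earth_sun_dist ** 2) / (
        esun * math.cos(math.radians(solar_zenith_deg))
    )

def tile(pixels, tile_size):
    """Split a row-major 2D grid into square tiles of tile_size x tile_size."""
    tiles = []
    for r in range(0, len(pixels), tile_size):
        for c in range(0, len(pixels[0]), tile_size):
            tiles.append([row[c:c + tile_size] for row in pixels[r:r + tile_size]])
    return tiles

# Illustrative calibration constants; in practice these vary per band and per satellite.
GAIN, BIAS, ESUN = 0.01, 0.0, 1895.0

dn_grid = [[1200, 1300], [1250, 1350]]          # raw pixel values after decompression
radiance = [[to_radiance(dn, GAIN, BIAS) for dn in row] for row in dn_grid]
reflectance = [
    [toa_reflectance(L, ESUN, 1.0, 30.0) for L in row] for row in radiance
]
tiles = tile(reflectance, 1)                    # four single-pixel tiles
```

In the real pipeline each stage runs continuously and in parallel; the point here is only the shape of the data flow, raw digital numbers in, calibrated and tiled reflectance out.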
- Landsat 4, 5, 7, 8: A joint NASA/USGS program providing the longest continuous space-based record of Earth’s land in existence
- Sentinel-2, Sentinel-3: An ESA Copernicus program providing multi-spectral imaging instruments for land, ocean and atmospheric monitoring
- MODIS: A sensor aboard NASA’s Terra and Aqua satellites that views the entire Earth’s surface every 1 to 2 days and acquires data in 36 spectral bands
- ASTER: A sensor aboard the NASA Terra satellite providing high-resolution images of Earth in 14 different bands of the electromagnetic spectrum, ranging from visible to thermal infrared light
- Airbus SPOT 6 and SPOT 7: A set of wide-swath high-resolution (1.5 m) optical imagery satellites owned and operated by Airbus Defense and Space
- Airbus Pleiades A and Pleiades B: A set of very high-resolution (0.5 m) optical imagery satellites owned and operated by Airbus Defense and Space
- NAIP: National Agricultural Imagery Program’s orthoimagery collected during the peak growing season
- Texas Orthoimagery Program: 50 cm orthoimagery collected during leaf-off conditions
- Any taskable satellite or aerial imagery from third party providers
- Sentinel-5P: An ESA Copernicus mission dedicated to atmospheric monitoring, measuring trace gases and aerosols with the TROPOMI spectrometer
- More planned for 2020 and beyond
- NOAA GOES-16, GOES-17: A pair of geostationary satellites positioned over the Western Hemisphere with a five minute refresh. Useful for detecting and monitoring storm systems, fog, wildfires, and other weather phenomena
- DMSP night lights: The Defense Meteorological Satellite Program’s cloud-free composites of average light percent from nighttime-visible bands
- NCEP CFSR: The National Centers for Environmental Prediction's Climate Forecast System Reanalysis, the most extensive gridded weather product with daily global coverage
- NOAA GSOD: The National Oceanic and Atmospheric Administration's Global Surface Summary of the Day, a higher-resolution daily weather dataset interpolated onto a grid
- NOAA GFS: The National Oceanic and Atmospheric Administration's Global Forecast System, a numerical weather prediction system published multiple times a day
- Sentinel-1: An ESA Copernicus program providing radar imaging instruments for land, ocean and atmospheric monitoring
- NASA SRTM: NASA's Shuttle Radar Topography Mission, a near-global, high-resolution digital topographic database of the Earth with data measuring slope, altitude, and aspect of topographical features
- 3DEP: The USGS's 3D Elevation Program, the highest-resolution seamless elevation dataset for the United States, derived from LiDAR
- CHIRPS: Climate Hazards Group InfraRed Precipitation data, a 30-year record of rainfall estimates
- AIS vessel tracking data: Automatic Identification System vessel-positioning data from maritime navigation safety communication systems
- Ground sensors: Custom sensors for continuous monitoring, e.g. visual and thermal cameras
- NLCD: National Land Cover Database, with impervious surface, tree canopy, and change indices for a wide variety of environmental, land management, and modeling applications
- SMAP: NASA's Soil Moisture Active Passive mission, providing low-resolution soil moisture maps
- GFSAD: Global Food Security-support Analysis Data. A 30m resolution dataset of global croplands
- Proprietary customer data sourced from internal applications or third-party vendors
- InSAR coherence: A radar technique that uses two or more synthetic aperture radar (SAR) images to generate maps of surface changes using differences in the phase of the waves returning to the satellite
- Fields: Field boundaries rasterized to 15m resolution
- MAX NDVI: Post-processed MODIS data for 16-day rolling average of NDVI
- NO2 Composites: 2-month composite of NO2 concentrations derived from Sentinel-5P
- Surface Reflectance: Atmospheric correction algorithm available for Landsat 8 and Sentinel-2
- Buildings: Building locations derived from high-resolution imagery
- Solar: Commercial solar panel locations derived from high-resolution imagery
- Surface Water: Probability of water on a pixel-by-pixel basis
- Trees: Tree locations identified over high-resolution imagery
- Wildfire Alerts: Updated fire locations derived from GOES-16 and GOES-17
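Several of the derived layers above, including MAX NDVI, are built on the standard normalized difference vegetation index, which is simple to compute from red and near-infrared reflectance. A generic sketch in plain Python (not Descartes Labs code):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    if nir + red == 0:
        return 0.0  # avoid division by zero on empty pixels
    return (nir - red) / (nir + red)

# Dense vegetation reflects strongly in near-infrared and absorbs red light,
# so healthy canopy pushes NDVI toward +1; bare soil sits near 0.
vegetation = ndvi(nir=0.50, red=0.08)
bare_soil = ndvi(nir=0.25, red=0.20)
```

A rolling maximum of this index over a 16-day window, as in the MAX NDVI product, smooths out cloud-contaminated observations, since clouds depress NDVI.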
Petabytes of geospatial data—ready for modeling
Geospatial data is large, diverse, and can be difficult to manage. That’s why many organizations neglect it. But in doing so, they surrender an important signal that complements their existing data ecosystem and can provide competitive advantage over other market participants.
Most companies are simply not equipped to tap into the value of geospatial data, but the Descartes Labs data refinery closes that gap. A core design principle of our data refinery is to enable AI development at scale. This principle permeates our solution from the bottom of the stack up through the platform's application layers. Built on Kubernetes in Google Cloud, the data refinery fuels the rapid development of geospatial analytics and forecasts. Benefits include:
Petabytes of clean, analysis-ready geospatial data from leading public and private sources
A cloud infrastructure that unlocks the potential of data and scales models to continental levels
The ability to add new data, whether from proprietary sources or from the output of analysis
A Python client library and catalog interface to access and manipulate data sources at scale
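In the platform, data access goes through the Descartes Labs Python client; the snippet below is a self-contained, hypothetical stand-in (the `search` function, product IDs, and catalog records are invented for illustration) that mimics the typical pattern of filtering a catalog by product, date range, and area of interest:

```python
from datetime import date

# Toy in-memory catalog; the real client issues the same kind of
# product / date / bounding-box query against the hosted catalog service.
CATALOG = [
    {"product": "sentinel-2:L1C", "date": date(2020, 6, 1),
     "bounds": (-106.7, 35.0, -106.4, 35.3)},
    {"product": "sentinel-2:L1C", "date": date(2020, 7, 1),
     "bounds": (-106.7, 35.0, -106.4, 35.3)},
    {"product": "landsat:LC08", "date": date(2020, 6, 15),
     "bounds": (-120.0, 38.0, -119.5, 38.5)},
]

def intersects(a, b):
    """Overlap test for (min_lon, min_lat, max_lon, max_lat) bounding boxes."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def search(product, start, end, aoi):
    """Return catalog records matching a product, date range, and area of interest."""
    return [
        s for s in CATALOG
        if s["product"] == product
        and start <= s["date"] <= end
        and intersects(s["bounds"], aoi)
    ]

aoi = (-106.8, 34.9, -106.3, 35.4)  # a box around Albuquerque, NM
scenes = search("sentinel-2:L1C", date(2020, 6, 1), date(2020, 6, 30), aoi)
```

The same three filters (what sensor, when, where) are the backbone of nearly every geospatial query, whichever client library ultimately executes them.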
The data refinery automatically ingests imagery from sensors including NASA and European Space Agency satellites, as well as ground-based detectors and weather data. Powerful data ingest pipelines perform continuous loading and pre-processing at speeds up to 20 gigabytes per second. Customers can leverage our multi-petabyte catalog of analysis-ready data or quickly ingest, clean, and calibrate their own data to make it AI-ready.
A collaborative modeling environment for your data science team
Workbench is a cloud-hosted development environment that allows you to build massively scalable workflows and applications. It offers integrated visualization tools, scalable compute, and a growing collection of functions that empower cross-functional teams to innovate in a highly collaborative manner.
With Workbench, business leaders more easily hypothesize about, explore, and understand data, while data scientists model faster, display results more dynamically, and shorten the timetable between new ideas and vetted projects.
Organizations use Workbench to model business-critical use cases with a deep set of functions and interfaces. By exposing an enterprise-grade API, the platform supercharges your return on investment.
Workbench API clients
Collaboration among data scientists, software engineers and business users is enhanced through Workbench’s notebook tools and visualization widgets. Our powerful Viewer application, in tandem with open-source interfaces and common desktop GIS software, provides unlimited flexibility. The addition of integrated training and labeling tools allows organizations to build robust models and operationalize at scale.
Integrated modeling tools
Configured in the cloud with Jupyter notebooks
Workbench allows users to visually interact with their model results within a hosted JupyterLab notebook infrastructure. It provides data scientists with a configurable library of shareable code blocks and map widgets that enable rapid model development without constant switching between code and results.
Prior to Workbench, data scientists had to configure their own development environment and build tedious command-line or visual interfaces for common geospatial workflows. This created a significant amount of overhead and frustration and made it more difficult to share repeatable results with other users. By deploying a cloud notebook solution as part of our platform offering, we remove the barriers to geospatial data science.
Rapid development with the Workflows API
The Descartes Labs Workflows API is a scalable algorithm development-and-visualization environment built within the Workbench component. Workflows enables the rapid development and reuse of algorithms and fast interactive visualization of model results on a map.
Customers can use the Descartes Labs Workflows API to quickly discover, combine, and analyze geospatial data in order to extract information to fuel predictive analytics. Customers can then store and share those models and workflows for future execution by themselves and their collaborators.
Workflows offers customers access to all the tools commonly employed by our applied scientists in a single, configurable interface, onto which last-mile integrations can be added. It enables data scientists and analysts to leverage petascale geospatial data without deep prior knowledge of geospatial concepts and techniques. With Workflows, model development time is reduced and users benefit from a fast iteration loop between having an idea, testing it, and visualizing the results.
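Workflows achieves this fast iteration loop by building deferred expression graphs that are evaluated on the backend only when results are requested. A toy version of that deferred-expression idea, which assumes nothing about the real Workflows internals, can be sketched in a few lines:

```python
class Expr:
    """A tiny deferred-expression node: arithmetic builds a graph,
    and nothing is evaluated until .compute() is called."""

    def __init__(self, fn, inputs=()):
        self.fn, self.inputs = fn, inputs

    def __add__(self, other):
        return Expr(lambda a, b: a + b, (self, _lift(other)))

    def __sub__(self, other):
        return Expr(lambda a, b: a - b, (self, _lift(other)))

    def __truediv__(self, other):
        return Expr(lambda a, b: a / b, (self, _lift(other)))

    def compute(self):
        """Recursively evaluate the graph."""
        return self.fn(*(i.compute() for i in self.inputs))

def _lift(value):
    """Wrap a plain value as a leaf node."""
    return value if isinstance(value, Expr) else Expr(lambda v=value: v)

def constant(value):
    return _lift(value)

# Band math expressed symbolically, evaluated only on compute(),
# the same shape as writing (nir - red) / (nir + red) in a lazy API.
nir, red = constant(0.5), constant(0.1)
ndvi_expr = (nir - red) / (nir + red)
result = ndvi_expr.compute()
```

Deferring evaluation like this is what lets a backend substitute tiled, parallel raster operations for the scalar arithmetic the user wrote, so the same expression scales from a map preview to a continental run.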
Descartes Labs enables rapid experimentation—so your team can hypothesize, test and discover powerful signals across your business. By accelerating the deployment of new models and applications, we make it easy to operationalize new signals and deliver them to your business team through the appropriate interface. That means you achieve real business impact as quickly as possible.
With the Descartes Labs Platform, customers build applications as diverse as forecasting the timing and impact of snowmelt on hydropower generation and detecting deforestation and land conversion across the agricultural supply chain.
At any given time, we may have tens of thousands of CPUs extracting information from complex datasets, with models generating insights within minutes of satellites passing overhead, including a model for wildfire detection.
CxOs, scientists, farmers, first responders, and policymakers utilize our platform to build applications and assess real-world impact and decision-making across multiple lines of business. Our collection of sample models is a powerful resource, expediting the development of tailored applications that leverage your internal data or target your exact use case.