Get Started

Overview

Reaching AI at scale

The Descartes Labs Platform is designed to solve some of the world’s hardest and most important geospatial AI problems—at scale. Our platform and tools allow customers to build models that can transform businesses more quickly, efficiently, and cost-effectively.

By giving data scientists and their line-of-business colleagues the best geospatial data and modeling tools in one package, we turn AI into a core competency. Data science teams can use our scaling infrastructure to design models faster than ever, using our massive data archive or their own.

The Descartes Labs Platform is made up of three critical components that work together to accelerate productivity across IT, engineering, data science, and leadership.

The Descartes Labs Platform

Customers rely on our platform to rapidly and securely scale computer vision, statistical, and machine learning models, transforming business decisions and handling nearly all raster modeling functions with a single cloud-based solution.

Key features and sample applications

Our extensive API documentation, tutorials, guides and demos provide a deep knowledge base for customers and users to quickly deploy high-value applications across various industries, including Agriculture, Oil and Gas, Metals and Mining, Power and Renewables, Shipping and Logistics, Financial Services and Insurance, and more.

Data Sources

Overview

Geospatial data can be challenging to handle inside modern data science workflows, but it can also add valuable information not available from other sources. Complications often arise when dealing with coordinate systems, band combinations, different resolutions, cloud cover, and seemingly endless post-processing steps.

Looking beyond imagery, whole new sets of challenges arise from the massive, billion-record vector datasets generated by AIS shipping transponders, mobile devices, ground cameras, and other sensors. And when these datasets are fused into a time series analysis, the difficulties can often become insurmountable.

In short, we’re living through a sensor revolution and the world needs better tools to prepare and manage all of this data.

The Descartes Labs Platform makes this easy by providing 15+ petabytes of analysis-ready data from some of the world’s leading satellite constellations and ground-based sensors. Customers use this data to power continental-scale models without having to worry about downloading, data preparation, storage, or compute.

Our data ingestion pipelines perform continuous loading and automatic pre-processing at speeds up to 20 gigabytes per second. The processing stages we go through for a typical image may include:

  • Retrieving the image from cloud storage and uncompressing it
  • Parsing the metadata and identifying the bounding rectangle that contains valid data
  • Cleaning the edges of the image
  • Converting the raw pixel information into meaningful units
  • Calibrating top-of-atmosphere reflectance using the appropriate constants for each satellite, accounting for solar distance and zenith angle
  • Tiling to standard sizes and performing any necessary coordinate transformations
  • Compressing the data back into a standard format and storing the result back into cloud storage
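
To make the reflectance-calibration step concrete, here is a minimal sketch of the standard top-of-atmosphere (TOA) reflectance conversion for a single optical band. The gain, offset, and ESUN values below are illustrative placeholders, each sensor's metadata supplies its own constants, and this is not the platform's internal implementation.

    import numpy as np

    def toa_reflectance(dn, gain, offset, esun, earth_sun_dist_au, solar_zenith_deg):
        """Convert raw digital numbers (DN) to top-of-atmosphere reflectance.

        Standard formula: rho = pi * L * d^2 / (ESUN * cos(theta_s)), where L is
        at-sensor radiance, d is the Earth-Sun distance in AU, ESUN is the band's
        mean solar exo-atmospheric irradiance, and theta_s is the solar zenith angle.
        """
        radiance = gain * dn.astype(np.float64) + offset       # DN -> radiance
        cos_theta = np.cos(np.deg2rad(solar_zenith_deg))       # solar zenith correction
        return np.pi * radiance * earth_sun_dist_au ** 2 / (esun * cos_theta)

    # Example with placeholder calibration constants for one band of one image
    dn = np.array([[4032, 5120], [3890, 4477]], dtype=np.uint16)
    rho = toa_reflectance(dn, gain=0.01, offset=-0.1, esun=1536.0,
                          earth_sun_dist_au=1.014, solar_zenith_deg=35.2)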

Datasets

Multiband Optical

  • Landsat 4, 5, 7, 8: A joint NASA/USGS program providing the longest continuous space-based record of Earth’s land in existence
  • Sentinel-2, Sentinel-3: An ESA Copernicus program providing multi-spectral imaging instruments for land, ocean and atmospheric monitoring
  • MODIS: A sensor aboard NASA’s Terra and Aqua satellites that views the entire Earth’s surface every 1 to 2 days and acquires data in 36 spectral bands
  • ASTER: A sensor aboard the NASA Terra satellite that provides high-resolution images of the planet Earth in 14 different bands of the electromagnetic spectrum, ranging from visible to thermal infrared light

High-Res Optical

  • Airbus SPOT 6 and SPOT 7: A set of wide-swath high-resolution (1.5 m) optical imagery satellites owned and operated by Airbus Defense and Space
  • Airbus Pleiades A and Pleiades B: A set of very high-resolution (0.5 m) optical imagery satellites owned and operated by Airbus Defense and Space
  • NAIP: The National Agriculture Imagery Program’s orthoimagery collected during the peak growing season
  • Texas Orthoimagery Program: 50 cm orthoimagery collected during leaf-off conditions
  • Any taskable satellite or aerial imagery from third party providers

Atmospheric

  • Sentinel-5P: An ESA Copernicus program providing a spectrometer (TROPOMI) for atmospheric monitoring of trace gases and air quality
  • More planned for 2020 and beyond

Geostationary

  • NOAA GOES-16, GOES-17: A pair of geostationary satellites positioned over the Western Hemisphere with a five-minute refresh. Useful for detecting and monitoring storm systems, fog, wildfires, and other weather phenomena
  • DMSP night lights: The Defense Meteorological Satellite Program’s cloud-free composites of average light percent from night-time visible bands

Meteorological

  • NCEP CFSR: The National Centers for Environmental Prediction’s Climate Forecast System Reanalysis: The most extensive gridded weather product with daily global coverage
  • NOAA GSOD: The National Oceanic and Atmospheric Administration’s Global Surface Summary of the Day: A higher resolution daily weather dataset interpolated onto a grid
  • NOAA GFS: The National Oceanic and Atmospheric Administration’s Global Forecast System: A numerical weather prediction system published multiple times a day

SAR

  • Sentinel-1: An ESA Copernicus program providing radar imaging instruments for land, ocean and atmospheric monitoring

Elevation

  • NASA SRTM: NASA’s Shuttle Radar Topography Mission: A near-global high-resolution digital topographic database of the Earth with data measuring slope, altitude, and aspect of topographical features
  • 3DEP: The USGS’s 3D Elevation Program: The highest-resolution seamless elevation dataset for the United States, derived from LiDAR

Hydrological

  • CHIRPS: Climate Hazards Group InfraRed Precipitation data, a 30-year record of rainfall estimates

Geolocation/AIS

  • AIS vessel tracking data: Automatic Identification System: vessel positioning data from maritime navigation safety communication systems
  • Ground sensors: Custom sensors for continuous monitoring, e.g. visual and thermal cameras

Land use

  • NLCD: National Land Cover Database: Impervious surface, tree canopy, and change indices for a wide variety of environmental, land management, and modeling applications
  • SMAP: NASA’s Soil Moisture Active Passive Data: Low resolution soil moisture maps
  • GFSAD: Global Food Security-support Analysis Data: A 30 m resolution dataset of global croplands

Internal Data

  • Proprietary customer data sourced from internal applications or third party vendors

Derivative Data

  • InSAR coherence: A radar technique that uses two or more synthetic aperture radar (SAR) images to generate maps of surface changes using differences in the phase of the waves returning to the satellite
  • Fields: Field boundaries rasterized to 15 m resolution
  • MAX NDVI: Post-processed MODIS data for 16-day rolling average of NDVI (see the sketch after this list)
  • NO2 Composites: 2-month composite of NO2 concentrations derived from Sentinel-5P
  • Surface Reflectance: Atmospheric correction algorithm available for Landsat 8 and Sentinel-2
  • Buildings: Building locations derived from high-resolution imagery
  • Solar: Commercial solar panel locations derived from high resolution imagery
  • Surface Water: Probability of water on a pixel-by-pixel basis
  • Trees: Tree locations identified over high-resolution imagery
  • Wildfire Alerts: Updated fire locations derived from GOES-16 and GOES-17
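
As a simple illustration of how a derivative layer like MAX NDVI is produced, the sketch below computes NDVI from red and near-infrared reflectance and takes the per-pixel maximum over a stack of observations. The arrays and the window size are placeholders for illustration, not the production pipeline.

    import numpy as np

    def ndvi(red, nir):
        """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
        red = red.astype(np.float64)
        nir = nir.astype(np.float64)
        return (nir - red) / np.clip(nir + red, 1e-6, None)  # guard against divide-by-zero

    # Placeholder stack of observations within a compositing window: (time, y, x)
    red_stack = np.random.rand(8, 256, 256)
    nir_stack = np.random.rand(8, 256, 256)

    ndvi_stack = ndvi(red_stack, nir_stack)
    max_ndvi = ndvi_stack.max(axis=0)  # per-pixel maximum-value composite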

Data Refinery

Petabytes of geospatial data—ready for modeling

Geospatial data is large and can be difficult to manage. That’s why many organizations neglect it. But in doing so, they give up an important signal that complements their existing data ecosystem and provides competitive advantage over other market participants. Most companies simply are not equipped to tap into its value. The data refinery closes that gap.

A core design principle of the data refinery is to enable AI development at scale. This principle permeates our solution from the bottom of the stack up through the platform’s application layers. Built on Kubernetes in the Google Cloud, the data refinery provides the fuel for the rapid development of geospatial analytics and forecasts. Benefits include:

  • Petabytes of clean, analysis-ready geospatial data from leading public and private sources
  • A cloud infrastructure that unlocks the potential of data and scales models to continental levels
  • The ability to add new data, whether from proprietary sources or from the output of analysis
  • A Python client library and catalog interface to access and manipulate data sources at scale

The data refinery automatically ingests imagery from satellites, such as those from NASA or the European Space Agency, as well as other sensors, including ground-based detectors and weather data. Powerful data ingest pipelines perform continuous loading and pre-processing at speeds up to 20 gigabytes per second. Customers can leverage our multi-petabyte catalog of analytics-ready data or quickly ingest, clean and calibrate their own data to make it AI-ready.
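
As an illustration of what catalog access looks like from the Python client, the sketch below searches for recent low-cloud Sentinel-2 scenes over an area of interest and rasters two bands into a NumPy array. The product identifier, band names, and method calls follow the client’s scenes-style interface as commonly documented; treat them as assumptions and verify them against your installed client version.

    import descarteslabs as dl

    # Placeholder area of interest (GeoJSON polygon)
    aoi = {
        "type": "Polygon",
        "coordinates": [[[-105.0, 35.6], [-104.9, 35.6],
                         [-104.9, 35.7], [-105.0, 35.7], [-105.0, 35.6]]],
    }

    # Search the catalog for low-cloud Sentinel-2 scenes over the AOI
    scenes, ctx = dl.scenes.search(
        aoi,
        products=["sentinel-2:L1C"],           # product ID is a placeholder
        start_datetime="2020-06-01",
        end_datetime="2020-07-01",
        cloud_fraction=0.1,
        limit=10,
    )

    # Mosaic the red and near-infrared bands into a NumPy array at 20 m resolution
    arr = scenes.mosaic(bands="red nir", ctx=ctx.assign(resolution=20.0))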

Workbench

A collaborative modeling environment for your data science team

Workbench is a cloud-hosted development environment that allows you to build massively scalable workflows and applications. It offers integrated visualization tools, scalable compute, and a growing collection of functions that empower cross-functional teams to innovate in a highly collaborative manner.

With Workbench, business leaders can more easily hypothesize, explore data, and gain understanding. At the same time, data scientists can model more quickly, display results more dynamically, and shorten the timetable between new ideas and vetted projects.

Organizations use Workbench to model business-critical use cases with a deep set of components. By exposing an enterprise-grade API, the platform helps supercharge geospatial data ROI.

Workbench API components

Collaboration between data scientists, software engineers, and business users is enhanced by Workbench’s notebook tools and visualization widgets. Our powerful Viewer application and open-source interfaces to common desktop GIS software provide unlimited flexibility. The addition of integrated training and labeling tools allows organizations to build robust models and operationalize them at scale.

Integrated modeling tools

Configured in the cloud with Jupyter notebooks

Workbench allows users to visually interact with their model results within a hosted JupyterLab notebook infrastructure. It provides data scientists with a configurable library of shareable code blocks and map widgets that enable rapid model development without the need to constantly switch between coding and results.

Prior to Workbench, users had to configure their own development environment and build tedious command-line or visual interfaces for common geospatial workflows. This created significant overhead and frustration, and made it more difficult to share repeatable results with other users. By deploying a cloud notebook solution as part of our native offering, we remove the barriers to geospatial data science.

Rapid development with the Workflows API

The Descartes Labs Workflows API is a scalable algorithm development and visualization environment built within the Workbench component. Workflows enables the rapid development of algorithms and models, algorithm reuse and composition, and fast interactive visualization of algorithms on a map.

Customers can use the Descartes Labs Workflows API to quickly discover, combine, and analyze geospatial data in order to extract information to fuel predictive analytics. Customers can then store and share those models and workflows for future execution by themselves and their collaborators.

Workflows offers customers access to all the tools commonly employed by our applied scientists in a single, configurable interface, onto which last-mile integrations can be added. It enables data scientists and analysts to leverage petascale geospatial data without having deep prior knowledge of geospatial concepts and techniques. With Workflows, model development time is reduced and users benefit from a fast iteration loop between having an idea, testing it, and visualizing the results.
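
To give a flavor of that iteration loop, here is a minimal sketch that lazily builds a mean-NDVI composite and displays it on the interactive map inside a Workbench notebook. The product ID, band names, and method calls reflect the descarteslabs.workflows client as typically documented; treat them as assumptions and check them against your client version.

    import descarteslabs.workflows as wf

    # Lazily define a Sentinel-2 image collection for a summer window
    ic = wf.ImageCollection.from_id(
        "sentinel-2:L1C",                      # product ID is a placeholder
        start_datetime="2020-06-01",
        end_datetime="2020-09-01",
    )

    # Band math is composed symbolically; nothing is computed yet
    nir, red = ic.unpack_bands("nir red")
    ndvi = (nir - red) / (nir + red)

    # Reduce over the time axis and visualize on the notebook map widget
    composite = ndvi.mean(axis="images")
    composite.visualize("Mean NDVI", colormap="viridis")
    wf.map  # display the interactive map in the notebook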

Applications

Faster ROI

Descartes Labs enables rapid experimentation—so your team can hypothesize, test and discover powerful signals across your business. By accelerating the deployment of new models and applications, we make it easy to operationalize new signals and deliver them to your business team through the appropriate interface. So you achieve real business impact as quickly as possible.

Using the Descartes Labs Platform, customers build applications ranging from forecasting the timing and impact of snowmelt on hydropower generation to detecting deforestation and land conversion across the agricultural supply chain.

At any given time we may have tens of thousands of CPUs extracting information from complex datasets, with models generating insights within minutes of satellites passing overhead (such as our wildfire detection model).

CxOs, scientists, farmers, first responders, and policymakers utilize our platform to build applications and assess real-world impact and decision-making across multiple lines of business. Our collection of sample models is a powerful resource, accelerating the development of tailored applications that leverage your internal data or target your exact use case.

Sample applications