Descartes Labs Joins DARPA Geospatial Cloud Analytics Program

Descartes Labs has joined DARPA’s Geospatial Cloud Analytics (GCA) program. The contract award is worth $2.9 million with a Phase 2 option of $4.2 million, for a possible total of $7.1 million, the company said in a blog post.

DARPA describes the program’s purpose as providing “instant access to the most up-to-date images anywhere in the world, as well as the cutting-edge tools to analyze them.”

“Under the GCA program, teams selected by DARPA will use the Descartes Labs Platform to build global-scale applications and offer them in the marketplace as a commercial service for data scientists,” Descartes Labs said in the blog post.

“The Descartes Labs Platform features a cloud-native infrastructure designed to provide the storage, computing, access, and tools needed to analyze massive, complex geospatial datasets, making it an ideal foundation for this DARPA program,” the company added.

“The GCA marketplace will address several specific analysis objectives, including: food security (strategic analytics), fracking (operational analytics), and maritime change detection/illegal fishing (tactical analytics),” Descartes said.

“To support these objectives, and pave the way for the development of additional applications, Descartes Labs will integrate up to 75 new datasets sourced from members of a diverse data partner network,” the company added.

Descartes Labs Raises $30 Million Series B Round

Descartes Labs has raised a Series B round of funding to develop its “data refinery,” which pulls together satellite data and refines it into usable products.

Today, we’re excited to announce Descartes Labs’ $30M Series B round of financing, led by March Capital, an LA-based venture capital fund. Our seed round was led by Crosslink Capital and our Series A was led by Cultivian Sandbox. Both Crosslink and Cultivian participated in the Series B along with a few other investors, including one of our customers, Cargill.

Less than three years ago, Descartes Labs was born as a spin-out from Los Alamos National Laboratory with the goal of using satellite imagery to model complex, global systems. In the course of tackling this audacious goal, we built what we believe to be the critical and missing component that unlocks the value of satellite imagery: a data refinery….

A data refinery is a system that pulls in raw data, cleans it up, fuses data from disparate sources, and adds tools on top of it for easier analysis.
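The blog post stops at that one-sentence definition, but the shape of such a system is easy to sketch. Below is a minimal, hypothetical Python skeleton of the idea, not Descartes Labs’ actual platform: raw scenes pass through composable ingest, cleanup, and fusion stages before any analysis tooling sees them. All stage names and behaviors here are placeholders.

```python
from typing import Callable, Iterable, List

# Hypothetical skeleton of the "data refinery" idea: raw scenes flow through
# cleanup and fusion stages before analysis tools ever see them. Stage names
# and types are illustrative, not Descartes Labs' actual API.
Stage = Callable[[List[str]], List[str]]

def refinery(stages: Iterable[Stage]) -> Stage:
    """Compose pipeline stages into one callable pipeline."""
    def run(scenes: List[str]) -> List[str]:
        for stage in stages:
            scenes = stage(scenes)
        return scenes
    return run

def ingest(scenes):   # pull in raw data
    return [s.strip() for s in scenes]

def clean(scenes):    # stand-in for coregistration, cloud masking, etc.
    return [s for s in scenes if s]

def fuse(scenes):     # stand-in for merging disparate sources
    return [" + ".join(scenes)]

pipeline = refinery([ingest, clean, fuse])
print(pipeline(["landsat ", "", "sentinel-1"]))  # ['landsat + sentinel-1']
```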

One of the hallmarks of a data refinery is data cleanup. The team at Descartes Labs spends an inordinate amount of time on the remote sensing science that enables complex models to be accurate. We’ve been working on coregistration of pixels (making sure every pixel is in the proper place), global surface reflectance (correcting for the effects of the atmosphere), and advanced cloud detection (ensuring models incorporate only the best pixels).
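None of these pipeline internals are public, but the “best pixels” idea can be illustrated with a toy example. The NumPy sketch below is a hypothetical illustration, not the company’s method: given a time stack of images and a per-pixel cloud score (both synthetic here), it averages each pixel over only its clear observations. The array shapes, the threshold, and the cloud-score input are assumptions.

```python
import numpy as np

def cloud_free_composite(stack, cloud_score, max_cloud=0.2):
    """Average each pixel over time, using only observations whose
    cloud score is below max_cloud.

    stack       : (T, H, W) array of reflectance values over T acquisitions
    cloud_score : (T, H, W) array, 0.0 = clear sky, 1.0 = fully cloudy
    """
    clear = cloud_score < max_cloud                  # usable observations
    counts = clear.sum(axis=0)                       # clear looks per pixel
    total = np.where(clear, stack, 0.0).sum(axis=0)  # sum of clear values
    # Pixels with no clear observation at all come out as NaN.
    return np.where(counts > 0, total / np.maximum(counts, 1), np.nan)

# Synthetic demo: 6 acquisitions of a 4x4 tile.
rng = np.random.default_rng(0)
scenes = rng.uniform(0.0, 0.4, size=(6, 4, 4))
clouds = rng.uniform(0.0, 1.0, size=(6, 4, 4))
print(cloud_free_composite(scenes, clouds))
```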

But even more important than data cleanup is data fusion. Every dataset has unique qualities: finer spatial resolution, more frequent imaging, the ability to see through clouds, or even to pick up radio-frequency signals and measure heat. In working on customer problems, one of the things we’ve uncovered is that a single satellite or dataset alone rarely solves the problem. Only by turning different datasets into a fused sensor, a super-sensor, can we solve it.
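A minimal version of fusion is just aligning datasets on a common grid and stacking them as channels. The sketch below is again hypothetical: it upsamples a coarse band to match a finer one by nearest-neighbor replication and stacks the two, so each pixel carries one multi-sensor observation. The 10 m / 30 m resolutions and band roles are assumed for illustration.

```python
import numpy as np

def fuse_to_common_grid(optical_10m, thermal_30m):
    """Fuse two co-located rasters into one multi-channel array.

    optical_10m : (H, W) array on a fine (e.g. 10 m) grid
    thermal_30m : (H//3, W//3) array on a coarser grid over the same area
    """
    # Nearest-neighbor upsample: replicate each coarse pixel into a
    # factor x factor block so both rasters share one grid.
    factor = optical_10m.shape[0] // thermal_30m.shape[0]
    thermal_up = np.kron(thermal_30m, np.ones((factor, factor)))
    # Stack as channels: every fine pixel now carries both measurements.
    return np.stack([optical_10m, thermal_up], axis=-1)

rng = np.random.default_rng(0)
optical = rng.random((9, 9))   # stand-in 10 m optical band
thermal = rng.random((3, 3))   # stand-in 30 m thermal band
print(fuse_to_common_grid(optical, thermal).shape)  # (9, 9, 2)
```

A real refinery would reproject rasters with proper geospatial tooling and handle georeferencing and resampling artifacts, but the stacked-channel result is the “super-sensor” idea in miniature.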
