But it’s Planet’s ability to make the data accessible to every
government and every business in the world that makes our TAM truly
massive. Today, our business starts with our core data: a daily
3- to 4-meter-per-pixel scan of the entire Earth, thousands of
daily high-resolution tasked images, and of course, our archive of
over 1 billion images going back five years. That’s over 1,500
images for every single point on the face of the Earth. But the
even bigger opportunity, and we’re still in the early days here
but we are clearly on this path, is in moving up the stack from
imagery to data and APIs, to machine learning and time series.
We’re building a platform on top of our proprietary data: services
that can be mixed and matched together like Lego blocks, things
that provide new capabilities, extract new insights, and generally
make our data more valuable and easier to use. I’ve seen this
before in my career. When I was at Twitter, we built a product
called the Firehose, which let developers subscribe to the full
stream of all public tweets, every day. It still exists, so every
time you tweet, or I tweet, or President Obama or Biden tweets, or
Nicki Minaj tweets, 50 milliseconds later any partner subscribing
to the Firehose has it pushed to them.
When we launched this, we were beyond excited at Twitter. The pulse
of the planet every single day in developers’ hands. Just think of
the cool things they’re going to build with it. We were partially
right because when you—if you look at sophisticated partners like
Google, they did some amazing things. They built tweets into real
time search so that two minutes after something happened out in the
real world, they were answering search queries they had never seen
before with tweets. But most developers blanched at the idea of
managing 5,000 tweets per second coming at them in 100 different
languages.
It turned out that the way to really scale Twitter’s data business
was to make it easier to consume: to provide services that, say,
provided a count of tweets talking about your brand, gave each one
a sentiment score, linked them back to your advertising campaigns,
and sent that to you daily in a CSV. Some companies needed the full
Firehose, but every company needed a time series of buzz about
their brand. There is a real analog with our path here at Planet,
though on a much bigger scale for us. I’m going to take you through
this in more detail.
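To make that concrete, here is a minimal sketch of that kind of
aggregation service: it rolls a stream of tweets up into a daily,
per-brand count with a rough sentiment score and writes it out as a
CSV. The record fields, the keyword matching, and the toy sentiment
scorer are illustrative assumptions, not Twitter’s actual API or
pipeline.

```python
import csv
from collections import defaultdict
from datetime import datetime

BRAND_KEYWORDS = {"acme": ["acme", "#acme"]}   # hypothetical brand configuration
POSITIVE = {"love", "great", "awesome"}        # toy sentiment lexicon
NEGATIVE = {"hate", "broken", "awful"}

def toy_sentiment(text: str) -> int:
    """Crude stand-in for a real sentiment model: +1/-1 per matched word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def daily_brand_buzz(tweets, out_path="brand_buzz.csv"):
    """tweets: iterable of dicts like {"created_at": ISO-8601 str, "text": str}."""
    buckets = defaultdict(lambda: {"count": 0, "sentiment": 0})
    for tw in tweets:
        day = datetime.fromisoformat(tw["created_at"]).date().isoformat()
        text = tw["text"].lower()
        for brand, keywords in BRAND_KEYWORDS.items():
            if any(k in text for k in keywords):
                bucket = buckets[(day, brand)]
                bucket["count"] += 1
                bucket["sentiment"] += toy_sentiment(text)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["date", "brand", "tweet_count", "avg_sentiment"])
        for (day, brand), b in sorted(buckets.items()):
            writer.writerow([day, brand, b["count"],
                             round(b["sentiment"] / b["count"], 3)])
```

A call like daily_brand_buzz(stream_of_tweets) yields one row per
brand per day, which is the spreadsheet-shaped output the speaker
is describing.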
In fact, I’m going to show you how we’re building the strategy from
the bottom up, with each layer stacking on top of and reinforcing
the ones below it. Each of the icons in these slides represents an
example of data or a single service at each layer. Here you see our
PlanetScope monitoring with our Doves, high-res tasking, and our
future hyperspectral imagery. I’ll warn you: this slide is going to
get busy, but you’ll see that’s exactly because the market
opportunity is so large and so varied for Planet and for our
partners.
Take just this first layer. Planet began by providing a cloud
interface to our raw imagery. Naturally, our first customers were
highly sophisticated at processing large volumes of geospatial
imagery. As an example, we help Corteva monitor 800,000 fields
daily across all the farmers that they serve. But we haven’t
stopped there. With over 25 terabytes of data coming down from our
satellites every day, we have to move beyond humans using their
eyeballs to look at imagery. The future is about computers running
algorithms and analysis to automatically identify objects,
patterns, and time series.
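As an illustration of what that shift looks like, here is a small
sketch, under stated assumptions, of running an algorithm over an
image time series instead of eyeballing it: given co-registered red
and near-infrared bands for one field across many dates, it computes
a mean-NDVI time series and flags sudden drops. The array shapes,
the threshold, and the synthetic data are assumptions for the
example, not Planet’s actual processing pipeline.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index per pixel."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def field_time_series(red_stack: np.ndarray, nir_stack: np.ndarray) -> np.ndarray:
    """red_stack, nir_stack: (dates, height, width) reflectance arrays."""
    return np.array([ndvi(r, n).mean() for r, n in zip(red_stack, nir_stack)])

def flag_drops(series: np.ndarray, threshold: float = 0.15) -> np.ndarray:
    """Indices of dates where mean NDVI fell sharply versus the prior date."""
    return np.where(np.diff(series) < -threshold)[0] + 1

# Usage with synthetic data: 30 daily scenes of a 64x64-pixel field.
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.15, size=(30, 64, 64))
nir = rng.uniform(0.4, 0.6, size=(30, 64, 64))
nir[20:] *= 0.5                       # simulate a mid-season disturbance
series = field_time_series(red, nir)
print("possible change on scene indices:", flag_drops(series))
```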
All of our data is analytics-ready, and that is a new Lego block
that we and our partners can use to train machine learning and
remote sensing algorithms on our proprietary data. We’re already
doing this. For example, we use our data as a base to fuse with
data that comes from other sensors: other optical sensors with
different characteristics, microwave sensors, even radar in the
form of SAR. That allows us to mix in new capabilities and get at
things like soil temperature or the ability to see through clouds.
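A hedged sketch of the fusion idea follows: stack co-registered
optical and SAR-like features for the same pixels and fit a model
that neither sensor could support alone, for example estimating a
ground variable even where optics are cloud-masked. The feature
layout, the regressor, and the synthetic data are assumptions for
illustration, not the actual fusion products described here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n_pixels = 5000

# Per-pixel features from two different sensor families (synthetic).
optical = rng.uniform(0.0, 1.0, size=(n_pixels, 4))   # e.g., 4 reflectance bands
sar = rng.uniform(-25.0, 0.0, size=(n_pixels, 2))     # e.g., 2 backscatter bands (dB)
cloudy = rng.random(n_pixels) < 0.3                    # pixels where optics are degraded
optical[cloudy] = 0.0                                  # simulate cloud-masked optics

# Synthetic target influenced by both sensor families.
target = 0.6 * optical[:, 3] + 0.02 * (sar[:, 0] + 25) + rng.normal(0, 0.05, n_pixels)

fused = np.hstack([optical, sar])                      # the "Lego block" join
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(fused, target)
print("R^2 on training pixels:", round(model.score(fused, target), 3))
```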
With these blocks in place, we’re starting to extract even more
sophisticated products. As we do, notice that these new products
don’t look anything like geospatial data. They start to look more
like a matrix, a spreadsheet, a time series, something that anybody
who knows how to use Excel can work with. This is why, with each step
along this path, we’re not just reducing time to value for our
customers, we’re also meaningfully increasing our customer base and
our TAM. Examples at this level are things like road and building
detection or ship detection, something that you can imagine any
government being interested in. Or crop classification: how many
acres of soybeans are growing in China this year? Or things like
monitoring the growth of coral or other natural assets, looking at
water reservoir levels, and so on.
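To picture what one of these spreadsheet-shaped products could look
like, here is a small sketch that rolls hypothetical per-scene
reservoir surface-area estimates, already derived from imagery, into
a monthly time series and writes it to a CSV. The input records and
field names are illustrative assumptions, not an actual Planet
product schema.

```python
import csv
from collections import defaultdict

# Hypothetical per-scene detections: (date, reservoir name, surface area in km^2).
detections = [
    ("2021-06-03", "Lake Example", 41.2),
    ("2021-06-18", "Lake Example", 40.7),
    ("2021-07-02", "Lake Example", 38.9),
    ("2021-07-21", "Lake Example", 37.5),
]

monthly = defaultdict(list)
for date, reservoir, area_km2 in detections:
    monthly[(date[:7], reservoir)].append(area_km2)   # bucket by YYYY-MM

with open("reservoir_levels.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["month", "reservoir", "mean_surface_area_km2", "scenes"])
    for (month, reservoir), areas in sorted(monthly.items()):
        writer.writerow([month, reservoir,
                         round(sum(areas) / len(areas), 2), len(areas)])
```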