Planet Labs and Google Cloud join forces in data analysis agreement

Satellite operator Planet Labs is beefing up its existing partnership with Google Cloud. Under a new agreement, Planet customers can use Google Cloud to store and process data, and access Google’s other products such as its data analytics warehouse BigQuery.

The two companies’ collaboration stretches back to 2017, when Google sold its satellite imaging business, Terra Bella, to Planet. As part of the sale agreement, Google also signed a multi-year contract to license Earth imagery from Planet for its own use. Planet, in turn, uses Google Cloud for its internal data processing and hosting.

This latest agreement will let Planet customers use products like BigQuery to analyze large volumes of satellite imaging data, reflecting “a growing demand for planetary-scale satellite data analysis, powered by the cloud,” Planet said in a news release.

“Planet customers want scalable compute and storage,” said Kevin Weil, Planet’s president of product and business. “Google Cloud customers want broader access to satellite data and analytics. This partnership is a win-win for both, as it helps customers transform their operations and compete in a digital-first world, powered by Planet’s unique data set.”

Planet operates a network of around 200 satellites – more than any government – and provides analytics services on the data it gathers. Last month, the company joined a slew of other space companies by announcing it was going public via a $2.8 billion merger with blank-check firm dMY Technology Group IV. The deal is expected to provide Planet with $545 million in cash at close, including a $200 million private investment in public equity (PIPE) from BlackRock-managed funds, Koch Strategic Platforms, Marc Benioff’s TIME Ventures and Google.

#aerospace, #data-analytics, #google, #google-cloud, #planet-labs, #satellite-constellation, #satellite-imagery, #space

Satellite operator Planet to go public in $2.8B SPAC merger

Planet, which operates a network of around 200 satellites that provides Earth imaging, as well as analytics of the data derived from that observation, is going public in a merger with special purpose acquisition company (SPAC) dMY Technology Group IV. The deal has a post-transaction equity value of $2.8 billion, and will provide Planet with $545 million in cash balance at close, including $345 million from dMY IV’s contribution, and a $200 million PIPE provided by BlackRock-managed funds, Koch Strategic Platforms, Marc Benioff’s TIME Ventures and Google.

After a bit of a lull, Planet is now the second significant private space company this week to take the SPAC route to public markets. Both are in the business of Earth observation, though Satellogic, which announced its own SPAC merger on Tuesday, operates on a much smaller scale at the moment. Planet, founded in 2010, has raised around $374 million to date and operates the largest Earth-imaging satellite constellation currently in orbit.

The company’s mission has been to transform the way Earth imaging data is collected and provided to commercial interests here on Earth. Planet’s network can provide a complete scan of all of the Earth’s landmass on a daily basis, and it offers that to customers “via a Bloomberg-like terminal for Earth data,” as Planet founder and CEO Will Marshall puts it. Access is provided on a subscription basis, and Planet says it generated over $100 million in revenue during its most recent fiscal year, which ended in January.

Planet intends to use the funds resulting from the merger in part to pay down its existing debt, and also to fund its existing operations and “support new and existing growth initiatives.” The aim is to complete the merger later this year, at which point the combined entity will trade under the ticker “PL” on the NYSE.

#blackrock, #corporate-finance, #google, #marc-benioff, #private-equity, #satellite-constellation, #satellite-imagery, #satellogic, #spac, #space, #special-purpose-acquisition-company, #tc, #time-ventures

Single.Earth to link carbon credits to crypto token market, raises $7.9M from EQT Ventures

Here’s the theory: Instead of tying the value of forests and other natural assets to the sale of raw materials such as timber, whose extraction releases CO2, what if you linked carbon and biodiversity credits to crypto tokens, and thus kept those carbon-storing resources in the ground?

That’s the theory behind Single.Earth, which has now raised a $7.9 million seed funding round led by Swedish VC EQT Ventures to, in its own words, ‘tokenize nature’. Also participating in the round was existing investor Icebreaker, and Ragnar Sass and Martin Henk, founders of Pipedrive. The funding will be used to launch its marketplace for nature-backed MERIT tokens.

Single.Earth says its ‘nature-backed’ financial system will be built on MERIT tokens. And given that the market for carbon credits is estimated to be worth more than $50 billion by 2030, and that crypto surpassed a $2 trillion market cap in 2021, the plan might just work.

It plans to build a ‘digital twin’ of nature that reveals how much any area of ecological significance in the world absorbs CO2 and retains biodiversity. Using environmental data such as satellite imagery, it aims to build global carbon models on which to base its token marketplace, generating profits through carbon compensations, ‘mining’ a new MERIT token for every 100 kg of CO2 sequestered in a specific forest or biodiverse area.

MERIT tokens can then be traded, used to compensate for a CO2 footprint, or contributed toward climate goals (at which point the token is ‘used up’ and can no longer be traded). Companies, organisations and eventually individuals will be able to purchase these tokens and own fractional amounts of natural resources, rewarded with carbon and biodiversity offsets.
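
To make those mechanics concrete, here is a minimal, purely illustrative ledger that follows the rules described above: one token minted per 100 kg of CO2 sequestered, tokens tradable until they are retired against a footprint. The class, method names and units are assumptions for illustration, not Single.Earth’s actual implementation.

```python
# Illustrative sketch only: issuance-and-retirement rules as described in the article.
from dataclasses import dataclass, field

KG_CO2_PER_TOKEN = 100  # issuance rule cited in the article

@dataclass
class MeritLedger:
    balances: dict = field(default_factory=dict)   # holder -> tradable tokens
    retired: int = 0                               # tokens already used as offsets

    def mint(self, landowner: str, kg_co2_sequestered: float) -> int:
        """Issue whole tokens for verified sequestration in a given area."""
        tokens = int(kg_co2_sequestered // KG_CO2_PER_TOKEN)
        self.balances[landowner] = self.balances.get(landowner, 0) + tokens
        return tokens

    def transfer(self, seller: str, buyer: str, tokens: int) -> None:
        """Trade tokens between holders."""
        assert self.balances.get(seller, 0) >= tokens
        self.balances[seller] -= tokens
        self.balances[buyer] = self.balances.get(buyer, 0) + tokens

    def retire(self, holder: str, tokens: int) -> None:
        """'Use up' tokens to offset a footprint; they can't be traded again."""
        assert self.balances.get(holder, 0) >= tokens
        self.balances[holder] -= tokens
        self.retired += tokens

ledger = MeritLedger()
ledger.mint("forest_owner", 25_000)          # 25 t of CO2 sequestered -> 250 tokens
ledger.transfer("forest_owner", "company", 100)
ledger.retire("company", 100)                # offsets 10 t of the company's footprint
print(ledger.balances, ledger.retired)       # {'forest_owner': 150, 'company': 0} 100
```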

Because of the traceability of blockchain and its link to a tradable token, payment to landowners would be immediate.

Single.Earth was co-founded in 2019 by CEO Merit Valdsalu and CTO Andrus Aaslaid. Valdsalu said: “Nature conservation is scalable, accessible, and makes sense financially; what’s more, it’s vital to engineer a systematic change.”

Sandra Malmberg, Venture Lead at EQT Ventures, added: “Oil was the new gold, data the new oil; now, nature is the most precious and valuable resource of all. A company having a hectare of forest saved as a key metric to scale is a company we are thrilled to back. Disrupting the economy and financial markets with a new tradable and liquid asset class that has a positive impact on the environment is an irresistible investment.”

#articles, #ceo, #cryptocurrencies, #cto, #eqt-ventures, #europe, #martin-henk, #mining, #oil, #pipedrive, #satellite-imagery, #tc

Hydrosat raises $5M seed round to deliver ground temperature data to customers

A lot of information can be gleaned about an area of land just by measuring its ground temperature. If a crop field is under stress, for example, the ground temperature will be elevated long before there’s any actual indication of the stress on the plant itself, Hydrosat CEO and co-founder Pieter Fossel explained to TechCrunch. Now, with a new $5 million injection of seed funding, he hopes to launch Hydrosat’s first surface temperature analytics product for customers.

The seed round was led by Cultivation Capital’s newly launched Geospatial Technologies Fund with participation by Freeflow Ventures, the Yield Lab, Expon Capital, Techstars, Industrious Ventures, Synovia Capital, and the University of Michigan.

The geospatial data analytics startup, which started at the end of 2017, plans to gather surface temperature data using satellites equipped with thermal infrared sensors. Beyond agricultural data, surface temperature can also provide information about wildfire risk, water stress and drought – all important variables if you believe, as Fossel does, that climate change is already starting to exert forces on the planet.

While ground temperature data is collected by legacy institutions like NASA and the European Space Agency (ESA), it’s not gathered at a very high frequency – sometimes a specific location’s ground temperature is only read every 16 days or so – or at a high resolution. Hydrosat hopes to fill in those existing data gaps. The company also collects data on other bands, using a multispectral infrared camera, but its primary value proposition is in its thermal data.

The first satellite will head to low-Earth orbit with Loft Orbital on a SpaceX Falcon 9 rocket in the second half of 2022. That mission is named after Hydrosat’s former CEO, Jakob van Zyl, who passed away from a heart attack about a year ago. Although the launches add a certain flair, Fossel stressed that the company is “a content company and a data company first.”

“We’re also developing some applications that sit on top of that [surface temperature] product that are geared towards crop yield forecasting, drought detection, and irrigation management,” he said. “Because these are all fundamentally driven by water stress and all of those applications are fundamentally enabled by our core product, which is land surface temperature data.”

Hydrosat’s first customers have been governments, in the form of a contract with the ESA and three SBIR contracts with the U.S. Air Force and Department of Defense. But through the raise, the company can start to deliver its product to commercial customers, who may include agribusinesses, insurance companies, and even other companies that want to do analytics on top of its collection of ground surface data.

“[Hydrosat] will probably start in agriculture, which is our core focus, but it could branch out across industries, because temperature is a signal of a whole host of activities beyond our focus, which is environment, water stress, food,” he explained. “Temperature is also a signal of economic activity. There’s a lot of cool use cases for temperature, from kind of a defense and security standpoint, as well.”

Looking to the future, Hydrosat has plans to launch a 16-satellite constellation to enable global monitoring. But that’s only the medium-term focus, Fossel said. The company’s long-term plans could include launching additional satellites, adding additional bands to deepen its data offerings, or building out its analytics layer. “Beyond that, it’s really about providing the underlying data that enables some of these applications in drought, food security, water stress, wildfire, and defense and security,” he added.

#funding, #geospatial-analytics, #hydrosat, #private-equity, #satellite-imagery, #satellites, #seed-round, #space, #startups

OroraTech’s space-based early wildfire warnings spark $7M investment

With wildfires becoming an ever more devastating annual phenomenon, it is in the whole planet’s interest to spot them and respond as early as possible — and the best vantage point for that is space. OroraTech is a German startup building a constellation of small satellites to power a global wildfire warning system, and will be using a freshly raised €5.8M (~$7M) Series A round to kick things off.

Wildfires destroy tens of millions of acres of forest every year, causing immense harm to people and the planet in countless ways. Once they’ve grown to a certain size, they’re near impossible to stop, so the earlier they can be located and worked against, the better.

But these fires can start just about anywhere in a dried out forest hundreds of miles wide, and literally every minute and hour counts — watch towers, helicopter flights, and other frequently used methods may not be fast or exact enough to effectively counteract this increasingly serious threat. Not to mention they’re expensive and often dangerous jobs for those who perform them.

OroraTech’s plan is to use a constellation of about 100 satellites equipped with custom infrared cameras to watch the entire globe (or at least the parts most likely to burst into flame) at once, reporting any fire bigger than ten meters across within half an hour.

Screenshot of OroraTech wildfire monitoring software showing heat detection in a forest.

Image Credits: OroraTech

To start out with, the Bavarian company has used data from over a dozen satellites already in space, in order to prove out the service on the ground. But with this funding round they are set to put their own bird in the air, a shoebox-sized satellite with a custom infrared sensor that will be launched by Spire later this year. Onboard machine learning processing of this imagery simplifies the downstream process.
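
As a rough illustration of what that onboard processing might involve, below is a toy hot-spot filter of the kind a thermal-infrared wildfire satellite could run before downlinking alerts. The temperature threshold, frame size and output format are assumptions, not OroraTech’s actual pipeline.

```python
# Illustrative sketch only: flag clusters of anomalously hot pixels in a thermal frame.
import numpy as np
from scipy import ndimage

HOT_THRESHOLD_K = 360.0  # assumed cutoff, well above normal land surface temperatures

def detect_hotspots(brightness_temp_k: np.ndarray) -> list:
    """Return (row, col, n_pixels) for each cluster of hot pixels.

    Even a fire much smaller than a pixel raises that pixel's integrated
    brightness temperature, which is how ~10 m fires can be flagged from orbit.
    """
    hot = brightness_temp_k > HOT_THRESHOLD_K
    labels, n_clusters = ndimage.label(hot)
    detections = []
    for region in range(1, n_clusters + 1):
        rows, cols = np.nonzero(labels == region)
        detections.append((int(rows.mean()), int(cols.mean()), int(rows.size)))
    return detections

frame = np.full((512, 512), 290.0)   # synthetic "cool forest" frame, in kelvin
frame[100:102, 200:202] = 400.0      # inject a small hot anomaly
print(detect_hotspots(frame))        # -> [(100, 200, 4)]
```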

14 more satellites are planned for launch by 2023, presumably once they’ve kicked the proverbial tires on the first one and come up with the inevitable improvements.

“In order to cover even more regions in the future and to be able to give warning earlier, we aim to launch our own specialized satellite constellation into orbit,” said CEO and co-founder Thomas Grübler in a press release. “We are therefore delighted to have renowned investors on board to support us with capital and technological know-how in implementing our plans.”

Mockup of an OroraTech Earth imaging satellite in space.

Those renowned investors consist of Findus Venture and Ananda Impact Ventures, which led the round, followed by APEX Ventures, BayernKapital, Clemens Kaiser, SpaceTec Capital and Ingo Baumann. The company was spun out of research done by the founders at TUM, which maintains an interest.

“It is absolutely remarkable what they have built up and achieved so far despite limited financial resources and we feel very proud that we are allowed to be part of this inspiring and ambitious NewSpace project,” APEX’s Wolfgang Neubert said, and indeed it’s impressive to have a leading space-based data service with little cash (it raised an undisclosed seed about a year ago) and no satellites.

It’s not the only company doing infrared imagery of the Earth’s surface; SatelliteVu recently raised money to launch its own, much smaller constellation, though it’s focused on monitoring cities and other high-interest areas, not the vast expanse of forests. And ConstellR is aimed (literally) at the farming world, monitoring fields for precision crop management.

With money in its pocket Orora can expand and start providing its improved detection services, though sadly, it likely won’t be upgrading before wildfire season hits the northern hemisphere this year.

#aerospace, #ananda-impact-ventures, #artificial-intelligence, #earth-imaging, #findus-venture, #funding, #fundings-exits, #greentech, #ororatech, #recent-funding, #satellite-imagery, #satellites, #science, #space, #startups, #wildfire-detection, #wildfires

As concerns rise over forest carbon offsets, Pachama’s verified offset marketplace gets $15 million

Restoring and preserving the world’s forests has long been considered one of the easiest, lowest cost, and simplest ways to reduce the amount of greenhouse gases in the atmosphere.

It’s by far the most popular method for corporations looking to take an easy first step on the long road to decarbonizing or offsetting their industrial operations. But in recent months the efficacy, validity, and reliability of a number of forest offsets have been called into question thanks to some blockbuster reporting from Bloomberg.

It’s against this uncertain backdrop that investors are coming in to shore up financing for Pachama, a company building a marketplace for forest carbon credits that it says is more transparent and verifiable thanks to its use of satellite imagery and machine learning technologies.

That pitch has brought in $15 million in new financing for the company, which co-founder and chief executive Diego Saez Gil said would be used for product development and the continued expansion of the company’s marketplace.

Launched only one year ago, Pachama has managed to land some impressive customers and backers. No less an authority on things environmental than Jeff Bezos (given how much of a negative impact Amazon operations have on the planet) gave the company a shoutout in his last letter to shareholders as Amazon’s outgoing chief executive. And the largest ecommerce company in Latin America, Mercado Libre, tapped the company to manage an $8 million offset project that’s part of a broader commitment to sustainability by the retailing giant.

Amazon’s Climate Pledge Fund is an investor in the latest round, which was led by Bill Gates’ investment firm Breakthrough Energy Ventures. Other investors included Lowercarbon Capital (the climate-focused fund from über-successful angel investor Chris Sacca), former Uber executive Ryan Graves’ Saltwater, the MCJ Collective, and new backers like Tim O’Reilly’s OATV, Ram Fhiram, Joe Gebbia, Marcos Galperin, NBA All-Star Manu Ginobili, James Beshara, Fabrice Grinda, Sahil Lavingia, and Tomi Pierucci.

That’s not even the full list of the company’s backers. What’s made Pachama so successful, and given the company the ability to attract top talent from companies like Google, Facebook, SpaceX, Tesla, OpenAI, Microsoft, Impossible Foods and Orbital Insight, is the combination of its climate mission applied to the well-understood forest offset market, said Saez Gil.

“Restoring nature is one of the most important solutions to climate change. Forests, oceans and other ecosystems not only sequester enormous amounts of CO2 from the atmosphere, but they also provide critical habitat for biodiversity and are sources of livelihood for communities worldwide. We are building the technology stack required to be able to drive funding to the restoration and conservation of these ecosystems with integrity, transparency and efficiency,” said Diego Saez Gil, co-founder and CEO at Pachama. “We feel honored and excited to have the support of such an incredible group of investors who believe in our mission and are demonstrating their willingness to support our growth for the long term.”

Customers outside of Latin America are also clamoring for access to Pachama’s offset marketplace. Microsoft, Shopify, and Softbank are also among the company’s paying buyers.

It’s another reason that investors like Y Combinator, Social Capital, Tobi Lutke, Serena Williams, Aglaé Ventures (LVMH’s tech investment arm), Paul Graham, AirAngels, Global Founders, ThirdKind Ventures, Sweet Capital, Xplorer Capital, Scott Belsky, Tim Schumacher, Gustaf Alstromer, Facundo Garreton, and Terrence Rohan, were able to commit to backing the company’s nearly $24 million haul since its 2020 launch. 

“Pachama is working on unlocking the full potential of nature to remove CO2 from the atmosphere,” said Carmichael Roberts from BEV, in a statement. “Their technology-based approach will have an enormous multiplier effect by using machine learning models for forest analysis to validate, monitor and measure impactful carbon neutrality initiatives. We are impressed by the progress that the team has made in a short period of time and look forward to working with them to scale their unique solution globally.” 

 

#aglae-ventures, #amazon, #bill-gates, #breakthrough-energy-ventures, #carbon-offset, #chris-sacca, #climate-pledge-fund, #ecommerce, #fabrice-grinda, #google, #greenhouse-gas-emissions, #impossible-foods, #james-beshara, #jeff-bezos, #joe-gebbia, #latin-america, #lowercarbon-capital, #lvmh, #machine-learning, #microsoft, #nba, #openai, #pachama, #paul-graham, #ryan-graves, #satellite-imagery, #scott-belsky, #serena-williams, #shopify, #softbank, #sweet-capital, #tc, #tesla, #tim-oreilly, #xplorer-capital, #y-combinator

Deepfake tech takes on satellite maps

While the concept of “deepfakes,” or AI-generated synthetic imagery, has been decried primarily in connection with involuntary depictions of people, the technology is dangerous (and interesting) in other ways as well. For instance, researchers have shown that it can be used to manipulate satellite imagery to produce real-looking — but totally fake — overhead maps of cities.

The study, led by Bo Zhao from the University of Washington, is not intended to alarm anyone but rather to show the risks and opportunities involved in applying this rather infamous technology to cartography. In fact, their approach has as much in common with “style transfer” techniques (redrawing images in an impressionistic, crayon or other arbitrary style) as with deepfakes as they are commonly understood.

The team trained a machine learning system on satellite images of three different cities: Seattle, nearby Tacoma and Beijing. Each has its own distinctive look, just as a painter or medium does. For instance, Seattle tends to have larger overhanging greenery and narrower streets, while Beijing is more monochrome and — in the images used for the study — the taller buildings cast long, dark shadows. The system learned to associate details of a street map (like Google’s or Apple’s) with those of the satellite view.

The resulting machine learning agent, when given a street map, returns a realistic-looking faux satellite image of what that area would look like if it were in any of those cities. In the following image, the map corresponds to the top right satellite image of Tacoma, while the lower versions show how it might look in Seattle and Beijing.
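
For readers curious what such a system looks like in code, below is a minimal sketch of conditional map-to-satellite translation in the general style of pix2pix-type models. It is illustrative only; the study’s actual architecture, losses and training setup may differ.

```python
# Minimal sketch of map-to-satellite image translation (pix2pix-style); illustrative only.
import torch
import torch.nn as nn

class Down(nn.Module):
    """Halve spatial resolution while increasing channel depth."""
    def __init__(self, cin, cout):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(cin, cout, 4, stride=2, padding=1),
            nn.BatchNorm2d(cout),
            nn.LeakyReLU(0.2),
        )
    def forward(self, x):
        return self.block(x)

class Up(nn.Module):
    """Double spatial resolution while decreasing channel depth."""
    def __init__(self, cin, cout):
        super().__init__()
        self.block = nn.Sequential(
            nn.ConvTranspose2d(cin, cout, 4, stride=2, padding=1),
            nn.BatchNorm2d(cout),
            nn.ReLU(),
        )
    def forward(self, x):
        return self.block(x)

class MapToSatGenerator(nn.Module):
    """Tiny encoder-decoder: street-map tile in, fake satellite tile out."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(Down(3, 64), Down(64, 128), Down(128, 256))
        self.dec = nn.Sequential(Up(256, 128), Up(128, 64),
                                 nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh())
    def forward(self, street_map):
        return self.dec(self.enc(street_map))

# Training on (street map, satellite image) pairs from one city teaches the
# generator that city's "style"; at inference, a map of any area yields a
# plausible-looking satellite view rendered in that style.
gen = MapToSatGenerator()
fake_satellite = gen(torch.randn(1, 3, 256, 256))  # placeholder map tile
print(fake_satellite.shape)                        # torch.Size([1, 3, 256, 256])
```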

Four images show a street map and a real satellite image of Tacoma, and two simulated satellite images of the same streets in Seattle and Beijing.

Image Credits: Zhao et al.

A close inspection will show that the fake maps aren’t as sharp as the real one, and there are probably some logical inconsistencies, such as streets that go nowhere. But at a glance the Seattle and Beijing images are perfectly plausible.

One only has to think for a few minutes to conceive of uses for fake maps like this, both legitimate and otherwise. The researchers suggest that the technique could be used to simulate imagery of places for which no satellite imagery is available — like one of these cities in the days before such things were possible, or for a planned expansion or zoning change. The system doesn’t have to imitate another place altogether — it could be trained on a more densely populated part of the same city, or one with wider streets.

It could conceivably even be used, as this rather more whimsical project was, to make realistic-looking modern maps from ancient hand-drawn ones.

Should technology like this be bent to less constructive purposes, the paper also looks at ways to detect such simulated imagery using careful examination of colors and features.

The work challenges the general assumption of the “absolute reliability of satellite images or other geospatial data,” said Zhao in a UW news article, and certainly, as with other media, that kind of thinking has to go by the wayside as new threats appear. You can read the full paper at the journal Cartography and Geographic Information Science.

#aerospace, #artificial-intelligence, #deepfakes, #mapping, #maps, #satellite-imagery, #science, #space, #tc, #university-of-washington

Mercado Libre taps Pachama to monitor and manage its $8 million investment in Latin American rainforest restoration

Mercado Libre, one of the largest e-commerce and financial services companies in Latin America by market cap, has selected the startup and Y Combinator alumnus Pachama as its strategic partner in developing projects to restore ecosystems in Latin America.

The selection of Pachama is part of a program called Regenera America, initiated by Mercado Libre, Latin America’s answer to Amazon. The $8 million that Mercado Libre is investing will go to two reforestation projects: the “Mantiqueira Conservation Project”, organized under the auspices of The Nature Conservancy, and the “Corridors of Life Project”, designed and implemented by the Instituto de Pesquisas Ecologicas.

Both projects will focus on the reforestation of over three thousand hectares through natural regeneration and the planting of over 1 million trees, restoring biodiversity corridors and protecting hydrological basins in the Atlantic Forest region of Brazil, the two companies said in a statement.

Pachama will provide satellite and machine learning technologies to verify and monitor the carbon sequestration produced by the sweeping reforestation efforts in a deal which leapfrogs Mercado Libre ahead of Microsoft as the young startup’s largest customer.

Software tools provided by Pachama will also increase the efficiency and transparency of the actual reforestation efforts on the ground, the companies said in a joint statement.

The deal between the two companies, and Mercado Libre’s big buy, was announced earlier today at a press conference in Argentina. The agreement marks the first time Mercado Libre has tapped money from a recently issued $400 million Sustainability Bond designed to finance projects of what the e-commerce giant calls “triple impact” in the Latin American region. The bond was issued by JP Morgan and BNP Paribas.

“We’re taking our first steps. We have always tried to do things the hard way and go to the core of problems. We have had a very interesting debate internally about when is the right time to start buying carbon offsets and carbon credits but we also realize that the … getting up and running of projects that generate carbon credits in Latin America was potentially even more of a challenging situation and more of a longterm solution,” said Mercado Libre chief financial officer Pedro Arnt.

“This is a building block of a longer term strategy thinking through not just what we can do for the next two or three years,” Arnt said. 

The Regenera America project has four pieces, Arnt said: measuring and reporting emissions internally for the company; buying clean energy for the company’s operations; providing electric vehicles for its own fleet and assisting its last mile and logistics partners in electrifying their own transportation; and the development of reforestation efforts across Latin America.

“This is setting up an example for more traditional industries across Latin America,” said Diego Saez-Gil, the co-founder and chief executive of Pachama. MercadoLibre is the largest company by market cap in Latin America and serves as a standard bearer for forward-thinking businesses in the region, he said. “Latin America is one of the biggest holders of biodiversity and carbon stocks in the world, and should be playing a more active role in climate mitigation.”

It’s a big step for Pachama as well. The deal marks the first time the young company has involved itself in project origination, and it provides a new revenue stream to complement its existing lines of business.

“We are incredibly excited to start helping new reforestation projects get off the ground that have the capabilities to plant millions of trees and remove millions of tons of CO2 from the atmosphere. If we are to solve climate change we need more projects like these to start as soon as possible,” said Saez-Gil in a statement. “We are confident that technologies such as AI and satellite imagery are key to scaling these efforts with high integrity, efficiency and transparency. Partnering with world-class organizations such as Mercado Libre, The Nature Conservancy and IPE for our first projects represents an incredible opportunity for us.”

#argentina, #artificial-intelligence, #biology, #bnp-paribas, #brazil, #chief-financial-officer, #clean-energy, #diego-saez-gil, #e-commerce, #ebay, #electric-vehicles, #jp-morgan, #latin-america, #mercado-libre, #mercadolibre, #microsoft, #nature-conservancy, #pachama, #partner, #paypal, #satellite-imagery, #tc, #y-combinator

SilviaTerra wants to bring the benefits of carbon offsets to every landowner everywhere

Zack Parisa and Max Nova, the co-founders of the carbon offset company SilviaTerra, have spent the last decade working on a way to democratize access to revenue generating carbon offsets.

As forestry credits become a big, booming business on the back of multi-billion dollar commitments from some of the world’s biggest companies to decarbonize their businesses, the kinds of technologies that the two founders have dedicated ten years of their lives to building are only going to become more valuable.

That’s why their company, already a profitable business, has raised $4.4 million in outside funding led by Union Square Ventures and Version One Ventures, along with Salesforce founder and driving force behind the 1 trillion trees initiative, Marc Benioff.

“Key to addressing the climate crisis is changing the balance in the so-called carbon cycle. At present, every year we are adding roughly 5 gigatons of carbon to the atmosphere. Since atmospheric carbon acts as a greenhouse gas, this increases the energy that’s retained rather than radiated back into space, which causes the earth to heat up,” writes Union Square Ventures managing partner Albert Wenger in a blog post. “There will be many ways such drawdown occurs and we will write about different approaches in the coming weeks (such as direct air capture and growing kelp in the oceans). One way that we understand well today and can act upon immediately are forests. The world’s forests today absorb a bit more than one gigaton of CO2 per year out of the atmosphere and turn it into biomass. We need to stop cutting and burning down existing forests (including preventing large scale forest fires) and we have to start planting more new trees. If we do that, the total potential for forests is around 4 to 5 gigatons per year (with some estimates as high as 9 gigatons).”

For the two founders, the new funding is the latest step in a long journey that began in the woods of Northern Alabama, where Parisa grew up.

After attending Mississippi State for forestry, Parisa went to graduate school at Yale, where he met Louisville, Kentucky native Max Nova, a computer science student who joined with Parisa to set up the company that would become SilviaTerra.

SilviaTerra co-founders Max Nova and Zack Parisa. Image Credit: SilviaTerra

The two men developed a way to combine satellite imagery with field measurements to determine the size and species of trees in every acre of forest.

While the first step was to create a map of every forest in the U.S., the ultimate goal for both men was to find a way to put a carbon market on equal footing with the timber industry. Instead of cutting trees for cash, landowners could potentially find out how much it would be worth to maintain their forestland. As the company notes, forest management had previously been driven by the economics of timber harvesting, with over $10 billion spent in the US each year.

The founders at SilviaTerra thought that the carbon market could be equally as large, but it’s hard for most landowners to access. Carbon offset projects can cost as much as $200,000 to put together, which is more than the value of the smaller offset projects for landowners like Parisa’s own family and the 40 acres they own in the Alabama forests.

There had to be a better way for smaller landowners to benefit from carbon markets too, Parisa and Nova thought.

To create this carbon economy, there needed to be a single source of record for every tree in the U.S. and while SilviaTerra had the technology to make that map, they lacked the compute power, machine learning capabilities and resources to build the map.

That’s where Microsoft’s AI for Earth program came in.

Working with AI for Earth, SilviaTerra created its first product, Basemap, to process terabytes of satellite imagery to determine the sizes and species of trees on every acre of America’s forestland. The company also worked with the U.S. Forest Service to access its data, which was used in creating this holistic view of the forest assets in the U.S.
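
To give a sense of how this kind of imagery-plus-field-measurement modeling works in general, here is a small synthetic sketch: a model is fit on a few hundred field plots and then applied to satellite-derived features for every acre. The features, model choice and numbers are invented for illustration and are not SilviaTerra’s actual Basemap methodology.

```python
# Illustrative sketch only: predict per-acre forest attributes from satellite features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Pretend satellite-derived features for 10,000 one-acre cells (band reflectances,
# texture metrics, etc.); in practice these come from terabytes of imagery.
features_all_acres = rng.normal(size=(10_000, 8))

# A few hundred field plots where crews actually measured standing biomass.
plot_idx = rng.choice(10_000, size=300, replace=False)
plot_features = features_all_acres[plot_idx]
plot_biomass_tons = 40 + 15 * plot_features[:, 0] + rng.normal(scale=5, size=300)

# Fit on the measured plots, then predict every acre in the map.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(plot_features, plot_biomass_tons)
biomass_map = model.predict(features_all_acres)

print(f"estimated mean biomass: {biomass_map.mean():.1f} t/acre over {biomass_map.size:,} acres")
```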

With the data from Basemap in hand, the company has created what it calls the Natural Capital Exchange. This program uses SilviaTerra’s unparalleled access to information about local forests, and its knowledge of how those forests are currently used, to supply offset projects covering land that would have been harvested were it not for the offset money coming in.

Currently, many forestry projects are being passed off to offset buyers as legitimate offsets on land that was never going to be harvested in the first place, rendering the project meaningless and useless in any real way as an offset for carbon dioxide emissions.

“It’s a bloodbath out there,” said Nova of the scale of the problem with fraudulent offsets in the industry. “We’re not repackaging existing forest carbon projects and trying to connect the demand side with projects that already exist. We use technology to unlock a new supply of forest carbon offsets.”

The first Natural Capital Exchange project was actually launched and funded by Microsoft back in 2019. In it, 20 Western Pennsylvania land owners originated forest carbon credits through the program, showing that the offsets could work for landowners with 40 acres, or, as the company said, 40,000.

Landowners involved in SilviaTerra’s pilot carbon offset program paid for by Microsoft. Image Credit: SilviaTerra

“We’re just trying to get inside every landowner’s annual economic planning cycle,” said Nova. “There’s a whole field of timber economics… and we’re helping answer the question of: given the price of timber, given the price of carbon, does it make sense to reduce your planned timber harvests?”

Ultimately, the two founders believe that they’ve found a way to pay for the total land value through the creation of data around the potential carbon offset value of these forests.

It’s more than just carbon markets, as well. The tools that SilviaTerra have created can be used for wildfire mitigation as well. “We’re at the right place at the right time with the right data and the right tools,” said Nova. “It’s about connecting that data to the decision and the economics of all this.”

The launch of the SilviaTerra exchange gives large buyers a vetted source to offset carbon. In some ways it’s an enterprise corollary to the work being done by startups like Wren, another Union Square Ventures investment, that focuses on offsetting the carbon footprint of everyday consumers. It’s also a competitor to companies like Pachama, which are trying to provide similar forest offsets at scale, or 3Degrees Inc. or South Pole.

Under a Biden administration there’s even more of an opportunity for these offset companies, the founders said, given discussions underway to establish a Carbon Bank. Established through the existing Commodity Credit Corp. run by the Department of Agriculture, the Carbon Bank would pay farmers and landowners across the U.S. for forestry and agricultural carbon offset projects.

“Everybody knows that there’s more value in these systems than just the product that we harvest off of it,” said Parisa. “Until we put those benefits in the same footing as the things we cut off and send to market…. As the value of these things goes up… absolutely it is going to influence these decisions and it is a cash crop… It’s a money pump from coastal America into middle America to create these things that they need.” 

#air-pollution, #alabama, #albert-wenger, #america, #articles, #artificial-intelligence, #biden-administration, #carbon-footprint, #energy, #greenhouse-gas, #greenhouse-gas-emissions, #kentucky, #louisville, #machine-learning, #managing-partner, #marc-benioff, #microsoft, #pennsylvania, #salesforce, #satellite-imagery, #tc, #union-square-ventures, #united-states, #version-one-ventures, #yale

As the Western US burns, a forest carbon capture monitoring service nabs cash from Amazon & Bill Gates backed fund

Pachama, the forest carbon sequestration monitoring service that tracks how much carbon dioxide is actually captured in forestry offset projects, has raised $5 million in fresh funding from a clutch of high-profile investors including Amazon and Breakthrough Energy Ventures.

The investment is one of several deals that Amazon has announced today through its Climate Pledge Fund. Breakthrough Energy Ventures, the firm backed by Bill Gates and other billionaires, led the round, which brings Pachama’s total haul to $9 million so it can scale its forest restoration and conservation emissions reduction monitoring service, the company said.

With the Western United States continuing to burn through fires covering acres of drought-impacted forests, and deforestation continuing to be a problem around the world, Pachama’s solution couldn’t be more timely. The company’s remote verification and monitoring service uses satellite imagery and artificial intelligence to measure the carbon captured by forests.

It also couldn’t be more personal. Pachama’s founder, Diego Saez-Gil, lost his own home in the wildfires that tore through California earlier this year.

“We will need to restore hundreds of thousands of acres of forests and carbon credits can be the funding mechanism,” Saez-Gil wrote in a direct message.

Pachama joins two other companies that are jointly financed by Breakthrough Energy Ventures and Amazon’s Climate Pledge Fund.

Other big corporate investors also backed Pachama. Groupe Arnault’s investment arm, Aglaé Ventures, and Airbnb’s alumni fund, AirAngels, invested, as did a number of prominent family offices and early-stage funds. Sweet Capital, the fund investing the personal wealth of gaming company King.com’s management team, Serena Ventures (the investment vehicle for tennis superstar Serena Williams) and Chris Sacca’s Lowercarbon Capital fund also invested in the round, along with Third Kind Ventures and Xplorer Ventures.

“There is growing demand from businesses with ESG commitments looking for ways to become carbon neutral, and afforestation is one of the most attractive carbon removal options ready today at scale,” said Carmichael Roberts, of Breakthrough Energy Ventures, in a statement. “By leveraging technology to create new levels of measurement, monitoring, and verification of carbon removal—while also onboarding new carbon removal projects seamlessly—Pachama makes it easier for any company to become carbon neutral. With its advanced enterprise tools and resources, the company has enormous potential to accelerate carbon neutrality initiatives for businesses through afforestation.”

#airbnb, #amazon, #articles, #artificial-intelligence, #bill-gates, #breakthrough-energy-ventures, #climate-pledge-fund, #deforestation, #greenhouse-gas-emissions, #king-com, #nature, #renewable-energy, #satellite-imagery, #serena-ventures, #tc, #united-states

Private space industrialization is here

The universal glee that surrounded the launch of the crewed Dragon spacecraft made it easy to overlook that the Falcon rocket’s red glare marked the advent of a new era — that of private space industrialization. For the first time in human history, we are not merely exploring a new landmass. We, as a biological species, are advancing to a new element — the cosmos.

The whole history of humanity is the story of our struggle with space and time. Mastering new horizons, moving ever farther; driven by the desire for a better life or for profit, out of fear or out of sheer curiosity, people found ever faster, easier, cheaper and safer ways to conquer the space between here and there. When, at the beginning of the 19th century, Thomas Jefferson bought Louisiana from Napoleon, effectively doubling the territory of the United States at the time, he believed it would take thousands of years for settlers to populate these spaces in the center of the continent.

But after just a few decades, the discovery of gold in California mobilized huge masses of industrious people, created incentives for capital and demanded new technologies. As countless wagons of newcomers moved through the land, threads of railways were stretched coast to coast, cities and settlements arose, and what Jefferson envisioned more than 200 years ago was actualized — and in the span of just one human life.

Growing up in a small Mongolian village near where Genghis Khan began the 13th-century journey that resulted in the largest contiguous land empire in history, I acquired an early interest in the history of explorers. Spending many long Siberian winter twilights reading books about great geographical discoveries, I bemoaned fate for placing me in a dull era in which all new lands had been discovered and all frontiers had been mapped.

Little did I know that only a few decades later, I would be living through the most exciting time for human exploration the world had ever seen.

The next space race

In recent years, the entire space industry has been waiting and looking for what will serve as the gold rush of space. One could talk endlessly about the importance of space for humanity and how technologies developed by and for space activity help to solve problems on Earth: satellite imagery, weather, television, communications. But without a real “space fever,” the short-lived mania that pours enormous financial resources, entrepreneurial energy and engineering talent into the space industry, it will not be possible to spark a new “space race.”

Presently, the entire space economy — including rockets, communications, imagery, satellites and crewed flights — does not exceed $100 billion, which is less than 0.1% of the global economy. For comparison: during the dot-com bubble in the late 1990s, the total capitalization of companies in this sector amounted to more than 5% of global GDP. The influence of the California Gold Rush in the 1850s was so significant that it changed the entire U.S. economy, essentially creating a new economic center on the West Coast.

The current size of the space economy is not enough to cause truly tectonic shifts in the global economy. What candidates do we have for this place in the 21st century? We are all witnesses to the deployment of space internet megaconstellations, such as Starlink from SpaceX, Kuiper from Amazon and a few other smaller players. But is this market enough to create a real gold rush? The size of the global telecommunications market is an impressive $1.5 trillion (or almost 1.5% of the global economy).

If a number of factors coincide — a sharp increase in the consumption of multimedia content by unmanned car passengers, rapid growth in the Internet of Things segment — satellite telecommunications services can grow in the medium term to $1 trillion or more. Then there is reason to believe that this segment may be the driver of growth when it comes to the space economy. This, of course, is not 5% (as was the case during the dot-com era), but it is already an impressive 1% of the world economy.

But despite all the importance of telecommunications, satellite imagery and navigation, these are the traditional space applications that have been used for many decades since the beginning of the space era. What they have in common is that these are high value-added applications, often with no substitutes on the ground. Earth surveillance and global communications are difficult to do from anywhere but space.

Therefore, the high cost of space assets, caused primarily by the high cost of launch and historically amounting to tens of thousands of dollars per kilogram, was the main obstacle to space applications of the past. For the true industrialization of space and for the emergence of new space services and products (many of which will replace ones that are currently produced on Earth), a revolution is needed in the cost of launching and transporting cargo in space.

Space transports

The mastering of new territories is impossible to imagine without transport. The invention and proliferation of new means of moving people and goods — such as railways, aviation, containers — has created the modern economy that we know. Space exploration is not an exception. But the physical nature of this territory creates enormous challenges. Here on Earth, we are at the bottom of a huge gravity well.

To deliver the cargo into orbit and defeat gravity, you need to accelerate things to the prodigious velocity of 8 km/s — 10-20 times faster than a bullet. Less than 5% of a rocket’s starting mass reaches orbit. The answer, then, lies in reusability and in mass production. The tyranny of rocket science’s Tsiolkovsky equation also contributes to the large rocket sizes that are necessary. It drives the strategies for companies like SpaceX and Blue Origin, who are developing large, even gigantic, reusable rockets such as Starship or New Glenn. We’ll soon see that the cost of launching into space will be even less than a few hundred U.S. dollars per kg.

But rockets are effective only for launching huge masses into low-Earth orbits. If you need to distribute cargo into different orbits or deliver it to the very top of the gravity well — high orbits such as GEO, HEO, Lagrange points or lunar orbit — you need to add even more delta-v: another 3-6 km/s or more. If you use conventional rockets for this, the proportion of mass delivered drops from 5% to less than 1%. In many cases, if the delivered mass is much less than the capabilities of huge low-cost rockets, you need to use much more expensive (per kg of transported cargo) small and medium launchers.
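
As a back-of-the-envelope check on those fractions (a sketch assuming a single stage with an effective exhaust velocity of roughly 3.3 km/s; real vehicles stage, and their exact numbers differ), the Tsiolkovsky rocket equation gives

$$\Delta v = v_e \ln\frac{m_0}{m_f} \quad\Longrightarrow\quad \frac{m_f}{m_0} = e^{-\Delta v / v_e}.$$

For a low-Earth-orbit delta-v of about 9.5 km/s (the ~8 km/s of orbital velocity plus gravity and drag losses), $m_f/m_0 = e^{-9.5/3.3} \approx 0.06$; subtract tanks, engines and structure, and the payload lands in the few-percent range cited above. Tacking on another 4 km/s for a high orbit multiplies that by $e^{-4/3.3} \approx 0.3$, pushing the delivered fraction down toward 1%.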

This requires multimodal transportation, with huge cheap rockets delivering cargo to low-Earth orbits and then last-mile space tugs distributing cargo between target orbits, to higher orbits, to the moon and to other planets in our solar system. This is why Momentus, the company I founded in 2017 developing space tugs for “hub-and-spoke” multimodal transportation to space, is flying its first commercial mission in December 2020 on a Falcon 9 ride-share flight.

Initially, space tugs can use propellant delivered from Earth. But an increase in the scale of transportation in space, as well as demand to move cargo far from low-Earth orbit, creates the need to use a propellant that we can get not from the Earth’s surface but from the moon, from Mars or from asteroids — including near-Earth ones. Fortunately, we have a gift given to us by the solar system’s process of evolution — water. Among probable rocket fuel candidates, water is the most widely spread in the solar system.

Water has been found on the moon; in craters in the vicinity of the poles, there are huge reserves of ice. On Mars, under the ground, there is a huge ocean of frozen water. We have a vast asteroid belt between the orbits of Mars and Jupiter. At the dawn of the formation of the solar system, the gravitational might of Jupiter prevented one planet from forming, scattering fragments in the form of billions of asteroids, most of which contain water. The same gravity power of Jupiter periodically “throws out” asteroids into the inner part of the solar system, forming a group of near-Earth asteroids. Tens of thousands of near-Earth asteroids are known, of which almost a thousand are more than 1 km in diameter.

From the point of view of celestial mechanics, it is much easier to deliver water from asteroids or from the moon than from Earth. Since Earth has a powerful gravitational field, the payload-to-initial-mass ratio delivered to the very top of the gravitational well (geostationary orbit, Lagrange points or the lunar orbit) is less than 1%; whereas from the surface of the moon you can deliver 70% of the original mass, and from an asteroid 99%.

This is one of the reasons why at Momentus we’re using water as a propellant for our space tugs. We developed a novel plasma microwave propulsion system that can use solar power as an energy source and water as a propellant (simply as a reaction mass) to propel our vehicle in space. The choice of water also makes our space vehicles extremely cost-effective and simple.

The proliferation of large, reusable, low-cost rockets and in-space last-mile delivery opens up opportunities that were not possible within the old transportation price range. We assume that the price to deliver cargo to almost any point in cislunar space, from low-Earth orbit to low-lunar orbit will be well below $1,000/kg within 5-10 years. What is most exciting is that it opens up an opportunity to introduce an entirely new class of space applications, beyond traditional communication, observation and navigation; applications that will start the true industrialization of space and catalyze the process of Earth industry migration into space.

Now, let’s become space futurists, and try to predict future candidates for a space gold rush in the next 5-10 years. What will be the next frontier’s applications, enabled by low-cost space transportation? There are several candidates for trillion-dollar businesses in space.

Energy generation

Energy generation is the first and largest candidate for the gold rush, as the energy share of the global economy is about 8.2%. Power generation in space has several fantastic advantages. First, generation is continuous: in space, our sun is a large thermonuclear reactor that runs 24/7. There’s no need to store electricity at night or in bad weather. As a result, the same surface collects 10 times more energy per 24 hours than on Earth.

This is not intuitively obvious, but the absence of twilights or nighttime, and the lack of clouds, atmosphere or accumulating dust create unique conditions for the production of electricity. Due to microgravity, space power plants with much lighter structures can eventually be much less costly than terrestrial plants. The energy can be beamed to the ground via microwaves or lasers. There are, however, at least two major challenges to building space power stations that still need to be resolved. The first is the cost of launching into space, and then the cost of transportation within space.

The combination of huge rockets and reusable space tugs will reduce the cost of transporting goods from Earth to optimal orbits up to several hundred dollars per kilogram, which will make the share of transportation less than one cent per kilowatt-hour. The second problem is the amount of propellant you’ll need to stabilize vast panels that will be pushed away by solar radiation pressure. For every 1 gigawatt of power generation capacity, you’ll need 500-1,000 tons of propellant per year. So to have the same generation capacity as the U.S. (1,200 GW), you’ll need up to 1 million tons of propellant per year (eight launches of Falcon 9 per hour or one launch of Starship per hour).
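
As a quick sanity check of those launch-cadence figures, the sketch below simply redoes the article’s arithmetic; the per-launch payload masses are rough public figures and are assumptions here, not numbers from the article.

```python
# Back-of-the-envelope check of the station-keeping propellant figures above.
TONS_PER_GW_YEAR = 1_000      # article's upper estimate: t of propellant per GW per year
US_CAPACITY_GW = 1_200        # article's figure for U.S. generating capacity
FALCON9_PAYLOAD_T = 17        # assumed tons to LEO per reusable Falcon 9 flight
STARSHIP_PAYLOAD_T = 130      # assumed tons to LEO per Starship flight

propellant_t_per_year = US_CAPACITY_GW * TONS_PER_GW_YEAR   # 1,200,000 t/yr
hours_per_year = 365 * 24

f9_per_hour = propellant_t_per_year / hours_per_year / FALCON9_PAYLOAD_T
starship_per_hour = propellant_t_per_year / hours_per_year / STARSHIP_PAYLOAD_T

print(f"{propellant_t_per_year:,.0f} t of propellant per year")   # ~1.2 million tons
print(f"{f9_per_hour:.1f} Falcon 9 launches per hour")            # ~8 per hour
print(f"{starship_per_hour:.1f} Starship launches per hour")      # ~1 per hour
```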

Power generation will be the largest consumer of the propellant in cislunar space, but the delivery of propellant from Earth will be too economically inefficient. The answer lies on the moon, where 40 permanently darkened craters near the north pole contain an estimated 600 million metric tonnes of ice. That alone will be enough for many hundreds of years of space power operations.

Data processing

Centers for data computation and processing are one of the largest and fastest-growing consumers of energy on Earth. Efficiency improvements implemented over the last decade have only increased the demand for large cloud-based server farms. The United States’ data centers alone consume about 70 billion kilowatt-hours of electricity annually. Aside from the power required to operate the systems that process and store data, there is an enormous cost in energy and environmental impact to cool those systems, which translates directly to dollars spent both by governments and private industry.

Regardless of how efficiently they are operated, the expansion of data centers alongside demands for increased power consumption is not sustainable, economically or environmentally. Instead of beaming energy to the ground via microwaves or lasers, energy can be used for data processing in space. It is much easier to stream terabytes and petabytes from space than gigawatts. Power-hungry applications like AI can be easily moved to space because most of them are tolerant of latency.

Space mining

Eventually, asteroids and the moon will be the main mining provinces for humanity as a space species. Rare and precious metals, construction materials, and even regolith will be used in the building of the new space economy, space industrialization and space habitats. But the first resource that will be mined from the moon or asteroids will be water — it will be the “oil” of the future space economy.

In addition to the fact that water can be found on asteroids and other celestial bodies, it is quite easy to extract. You simply need heat to melt ice or extract water from hydrates. Water can be easily stored without cryogenic systems (like liquid oxygen or hydrogen), and it doesn’t need high-pressure tanks (like noble gases — propellant for ion engines).

At the same time, water is a unique propellant for different propulsion technologies. It can be used as water in electrothermal rocket engines (like Momentus’ microwave electrothermal engines) or can be separated into hydrogen and oxygen for chemical rocket engines.

Manufacturing

The disruption of in-space transportation costs can make space a new industrial belt for humanity. Microgravity can support creating new materials for terrestrial applications like optical fiber, without the tiny flaws that inevitably emerge during production in a strong gravity field. These flaws increase signal loss and cause large attenuation of the transmitted light. Also, microgravity can be used in the future space economy to build megastructures for power generation, space hotels for tourists and eventually human habitats. In space, you can easily have a vacuum that would be impossible to achieve on Earth. This vacuum will be extremely valuable for the production of ultrapure materials like crystals, wafers and entirely new materials. The reign of in-space manufacturing will have begun when the main source of raw materials is not Earth, but asteroids or the moon, and the main consumers are in-space industry.

The future market opportunities enabled by the disruption in space transportation are enormous. Even without space tourism, space habitats will be almost a two trillion dollar market in 10-15 years. Undoubtedly, it will lead to a space gold rush that will drive human civilization’s development for generations to come.

The final frontier

I studied in high school during the last years of the Soviet Union. The Soviet economy was collapsing, we had no sanitation in the house, and quite often we had no electricity. During those dark evenings, I studied physics and mathematics books by the light of a kerosene lamp. We had a good community library, and I could order books and magazines from larger libraries in the big cities, like Novosibirsk or Moscow. It was my window into the world. It was awesome.

I was reading about the flights of the Voyager spacecraft, and about the exploration of the solar system, and I was thinking about my future. That was the time when I realized that I both love and excel in science and math, and I decided then to become a space engineer. In an interview with a local newspaper back in 1993, I told the reporter, “I want to study advanced propulsion technologies. I dream about the future, where I can be part of space exploration and may even fly to Mars … .”

And now that future is coming.

#aerospace, #blue-origin, #column, #data-processing, #electricity, #elon-musk, #energy, #falcon, #flight, #international-space-station, #lasers, #microwave, #momentus, #opinion, #outer-space, #satellite, #satellite-imagery, #science, #space, #space-exploration, #spacecraft, #spacex, #startups, #tc, #telecommunications

Autonomous driving startup turns its AI expertise to space for automated satellite operation

Hungarian autonomous driving startup AImotive is leveraging its technology to address a different industry and growing need: autonomous satellite operation. AImotive is teaming up with C3S, a supplier of satellite and space-based technologies, to develop a hardware platform for performing AI operations onboard satellites. AImotive’s aiWare neural network accelerator will be optimized by C3S for use on satellites, which have a set of operating conditions that in many ways resembles those onboard cars on the road – but with more stringent requirements in terms of power management, and environmental operating hazards.

The goal of the team-up is to have AImotive’s technology working on satellites that are actually operational on orbit by the second half of next year. The projected applications of onboard neural network acceleration extend to a number of different functions according to the companies, including telecommunications, Earth imaging and observation, autonomously docking satellites with other spacecraft, deep space mining and more.

While it’s true that most satellites already operate in an essentially automated fashion – meaning they’re not generally manually flown at every given moment – true neural network-based onboard AI would give them much more autonomy when it comes to performing tasks, like imaging a specific area or looking for specific markers in ground- or space-based targets. AImotive and C3S also believe that local processing of data has the potential to be a significant game-changer in the satellite business.

Currently, most of the processing of data collected by satellites is done after the raw information is transmitted to ground stations. That can result in a lot of lag between data collection and the delivery of processed data to customers, particularly when the satellite operator or another go-between is acting as the processor on behalf of the client rather than just delivering raw info (and doing this analysis is also a more lucrative proposition for the data provider, of course).

AImotive’s tech could mean that processing happens locally, on the satellite where the information is captured. There’s been a big shift towards this kind of ‘computing at the edge’ in the ground-based IoT world, and it only makes sense to replicate that in space, for many of the same reasons – including that it reduces time to delivery, meaning more responsive service for paying customers.
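To make that edge-processing argument concrete, here is a minimal, hypothetical sketch in Python. It is not AImotive’s aiWare or C3S software (whose interfaces aren’t described here); it simply illustrates the general idea of running a small detection step onboard and downlinking a compact summary instead of the full raw frame.

```python
# Illustrative only: not AImotive's aiWare or C3S code, just the generic
# "computing at the edge" argument. Process onboard, downlink a compact summary
# instead of the full raw frame, and results reach customers with far less lag.
import numpy as np

def onboard_detect(frame, threshold):
    """Hypothetical onboard step: flag unusually bright pixels as 'detections'."""
    rows, cols = np.where(frame > threshold)
    return [{"row": int(r), "col": int(c), "value": float(frame[r, c])}
            for r, c in zip(rows, cols)]

rng = np.random.default_rng(42)
frame = rng.random((2048, 2048), dtype=np.float32)   # one simulated raw capture

detections = onboard_detect(frame, threshold=0.999999)
raw_mb = frame.nbytes / 1e6                  # cost of downlinking the raw frame
summary_kb = len(repr(detections)) / 1e3     # cost of downlinking only the summary

print(f"raw frame: ~{raw_mb:.0f} MB vs onboard summary: ~{summary_kb:.2f} KB "
      f"({len(detections)} detections)")
```

The point of the toy comparison is simply that a summary of detections is orders of magnitude smaller than the raw capture, which is what makes onboard processing attractive given limited downlink bandwidth.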

#aerospace, #aimotive, #artificial-intelligence, #computing, #imaging, #neural-network, #satellite, #satellite-imagery, #satellites, #science, #space, #spacecraft, #spaceflight, #tc, #telecommunications

High Earth Orbit Robotics uses imaging satellites to provide on-demand check-ups for other satellites

Maintaining satellites on orbit and ensuring they make full use of their operational lifespan has never been more important, given concerns around sustainable operations in an increasingly crowded orbital environment. As companies tighten their belts financially to deal with the ongoing economic impact of COVID-19, too, it’s more important than ever for in-space assets to live up to their max potential. A startup called High Earth Orbit (HEO) Robotics has a very clever solution that makes use of existing satellites to provide monitoring services for others, generating revenue from unused Earth imaging satellite time and providing a valuable maintenance service all at the same time.

HEO’s model employs cameras already on orbit, mounted on Earth observation satellites operated by partner companies, and tasks them with collecting images of the satellites of its customers, who want to ensure their spacecraft are in good working order, correctly oriented and with all their payloads properly deployed. Onboard instrumentation can provide satellite operators with a lot of diagnostic information, but some problems only external photography can properly identify, or require confirmation or further detail to resolve.

The beauty of HEO’s model is that it’s truly a win for all involved: Earth observation satellites generally aren’t in use at all times – they have considerable downtime, particularly when they’re over open water, HEO founder and CEO William Crowe tells me.

“We try to use the satellites at otherwise low-value times, like when they are over the ocean (which of course is most of the time),” Crowe said via email. “We also task our partners just like we would as a regular Earth-imaging business, specifying an area on Earth’s surface to image, the exception being that there is always a spacecraft in the field-of-view.”

The company is early in its trajectory, but it has just released a proof-of-concept capture of the International Space Station, as seen in slides provided by HEO. The image was captured by a satellite owned by the Korean Aerospace Research Institute and operated by commercial satellite operator SI Imaging Services. HEO’s software compensated for the relative velocity of the satellite with respect to the ISS – a very fast 10 km/s (around 6.2 miles per second). The company says it’s working toward even higher-resolution images.
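For a sense of why that compensation matters, here is a back-of-the-envelope calculation. The exposure times are assumptions for illustration, not HEO’s actual camera settings; only the 10 km/s figure comes from the article.

```python
# Back-of-the-envelope only; exposure times below are assumed, not HEO's actual
# parameters. At 10 km/s of relative velocity, even very short exposures see the
# target move by metres, which is why motion compensation is needed at all.
relative_velocity_m_per_s = 10_000          # 10 km/s, as reported
for exposure_ms in (0.1, 1.0, 5.0):         # assumed exposure times
    smear_m = relative_velocity_m_per_s * (exposure_ms / 1000)
    print(f"{exposure_ms:>4} ms exposure -> ~{smear_m:g} m of apparent target motion")
```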

Another advantage of HEO’s model is that it requires no capital expenditure in terms of the satellites used: Crowe explained that the company currently pays per use, which means it only spends when it has a client request, so the revenue covers the cost of tasking the partner satellite. HEO does plan to launch its own satellites in the “medium-term,” however, Crowe said, to fill current gaps in coverage and in anticipation of an explosion in the low Earth orbit satellite population, which is expected to grow from around 2,000 spacecraft today to as many as 100,000 or more over roughly the next decade.

HEO could ultimately provide imaging of not only other satellites, but also space debris to help with removal efforts, and even asteroids that could prove potential targets for mining and resource gathering. It’s a remarkably well-considered idea that stands to benefit from the explosion of growth in the orbital satellite industry, and also stands out among space startups because it has a near-term path to revenue that doesn’t require a massive outlay of capital up front.

#aerospace, #ceo, #imaging, #international-space-station, #robotics, #satellite, #satellite-imagery, #satellites, #space, #spacecraft, #spaceflight, #starlink, #tc

As wildfire season approaches, AI could pinpoint risky regions using satellite imagery

The U.S. has suffered from devastating wildfires over the last few years as global temperatures rise and weather patterns change, making the otherwise natural phenomenon especially unpredictable and severe. To help out, Stanford researchers have found a way to track and predict dry, at-risk areas using machine learning and satellite imagery.

Currently, the way forests and scrublands are tested for susceptibility to wildfires is by manually collecting branches and foliage and testing their water content. It’s accurate and reliable, but obviously also quite labor-intensive and difficult to scale.

Fortunately, other sources of data have recently become available. The European Space Agency’s Sentinel satellites and the NASA/USGS Landsat program have amassed a trove of imagery of the Earth’s surface that, when carefully analyzed, could provide a secondary means of assessing wildfire risk, and one no one has to risk getting splinters for.

This isn’t the first attempt to make this kind of observation from orbital imagery, but previous efforts relied heavily on visual measurements that are “extremely site-specific,” meaning the analysis method differs greatly depending on the location. No splinters, but still hard to scale. The advance leveraged by the Stanford team is the Sentinel satellites’ “synthetic aperture radar,” which can pierce the forest canopy and image the surface below.

“One of our big breakthroughs was to look at a newer set of satellites that are using much longer wavelengths, which allows the observations to be sensitive to water much deeper into the forest canopy and be directly representative of the fuel moisture content,” said the paper’s senior author, Stanford ecohydrologist Alexandra Konings, in a news release.

The team fed this new imagery, collected regularly since 2016, into a machine learning model along with the manual measurements made by the U.S. Forest Service. This let the model “learn” which features of the imagery correlate with the ground-truth measurements.

They then tested the resulting AI agent (the term is used loosely) by having it make predictions based on old data for which they already knew the answers. It was accurate, and most accurate in scrublands, one of the most common biomes of the American West and also one of the most susceptible to wildfires.
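As a rough illustration of that train-then-validate workflow, the sketch below fits a regression model to satellite-derived features against field-measured fuel moisture and scores it on held-out measurements. It uses synthetic numbers and a generic random forest; it is not the Stanford team’s actual model, features or data.

```python
# Generic sketch of the approach, with synthetic data standing in for SAR-derived
# features and Forest Service fuel-moisture measurements. Not the Stanford model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))                  # stand-ins for radar/optical features
y = 100 + 25 * X[:, 0] - 10 * X[:, 1] + rng.normal(scale=5, size=n)  # % moisture

# Hold out "old data we already know the answers to" as the test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"R^2 on held-out measurements: {r2_score(y_test, model.predict(X_test)):.2f}")
```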

You can see the results of the project in this interactive map showing the model’s predictions of dryness at different periods all over the western part of the country. That’s not so much for firefighters as it is a validation of the approach, but the same model, given up-to-date data, can make predictions about the upcoming wildfire season that could help authorities make more informed decisions about controlled burns, danger areas and safety warnings.

The researchers’ work was published in the journal Remote Sensing of Environment.

#artificial-intelligence, #greentech, #satellite-imagery, #science, #stanford-university, #tc, #wildfires

R&D Roundup: Sweat power, Earth imaging, testing ‘ghostdrivers’

I see far more research articles than I could possibly write up. This column collects the most interesting of those papers and advances, along with notes on why they may prove important in the world of tech and startups.

This week: one step closer to self-powered on-skin electronics; people dressed as car seats; how to make a search engine for 3D data; and a trio of Earth imaging projects that take on three different types of disasters.

Sweat as biofuel

Monitoring vital signs is a crucial part of healthcare and a big business across fitness, remote medicine and other industries. Unfortunately, powering devices that are low-profile and long-lasting without a bulky battery or frequent charging is a fundamental challenge. Wearables powered by body movement or other bio-derived sources are an area of much research, and this sweat-powered wireless patch is a major advance.

A figure from the paper showing the device and interactions happening inside it.

The device, described in Science Robotics, uses perspiration as both fuel and sampling material; sweat contains chemical signals that can indicate stress, medication uptake, and so on, as well as lactic acid, which can be used in power-generating reactions.

The patch performs this work on a flexible substrate and uses the generated power to transmit its data wirelessly. It’s reliable enough that it was used to control a prosthesis, albeit in limited fashion. The market for devices like this will be enormous and this platform demonstrates a new and interesting direction for researchers to take.

#artificial-intelligence, #autonomous-systems, #coronavirus, #covid-19, #cybernetics, #esa, #extra-crunch, #gadgets, #health, #imaging, #lidar, #machine-learning, #mit, #national-science-foundation, #plastics, #satellite-imagery, #science, #self-driving-car, #space, #tc, #technology, #telemedicine

NASA and Planet expand imagery partnership to all NASA-funded Earth science research

NASA and Planet have deemed their pilot partnership a success, and as a result NASA will extend its contract with Planet to provide the company’s satellite imagery of Earth to all research programs funded by the agency. NASA signed an initial contract with Planet last April to provide Planet imagery to a team of 35 researchers working on tracking what are known as ‘Essential Climate Variables,’ or ECVs.

The ECV trial showed that Planet’s imagery was useful in tracking and providing insight into a number of different Earth-based environmental events, including landslides in the Himalayan mountain range. During the study, one of the key ingredients in helping researchers detect early warning signs was the Planet constellation’s high revisit rate – the frequency with which it photographs a specific area over time.

Planet’s data covers the entire Earth at least once per day, and includes even areas of the planet not typically covered in Earth observation passes by other satellites and providers, like the Arctic. Its frequency, coverage and degree of detail all combine to make it a valuable resource for anyone conducting Earth science work, which means it’s very good news that it’s now available to hundreds more scientists working on dozens more projects.

#aerospace, #google-earth, #greentech, #nasa, #outer-space, #planet, #satellite, #satellite-constellation, #satellite-imagery, #science, #space, #tc

Valispace raises $2.4M led by JOIN Capital to become the ‘Github for hardware’

Hardware engineering is mostly document-based. A typical satellite might be described in several hundred thousand PDF documents, spreadsheets, simulation files and more, all potentially inconsistent with one another. This can lead to costly mistakes: NASA lost a $125 million Mars orbiter because one engineering team used metric units while another used English units, for instance.

Germany-headquartered Valispace, which also has offices in Portugal, dubs itself the “Github for hardware.” In other words, it’s a collaboration platform for engineers, allowing them to develop better satellites, planes, rockets, nuclear fusion reactors, cars and medical devices, you name it. It’s a browser-based application that stores engineering data and lets users interconnect values through formulas. This means that when one value is changed, all dependent values are updated, simulations are re-run and documentation is rewritten automatically.

That last point is important in this pandemic era, where making and improving medical ventilators has become a huge global issue.
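To make the formula-linking idea concrete, here is a toy sketch of how such dependency propagation can work in principle. It is a simplified illustration of the concept, not Valispace’s actual data model or API, and the mass-budget numbers are invented.

```python
# Toy illustration of formula-linked engineering values (not Valispace's API):
# each value is either a constant or derived from other values via a formula,
# so changing one input recomputes everything downstream on the next read.
class Value:
    def __init__(self, name, value=None, formula=None, inputs=()):
        self.name, self.value, self.formula, self.inputs = name, value, formula, inputs

    def get(self):
        if self.formula is not None:
            self.value = self.formula(*(v.get() for v in self.inputs))
        return self.value

# Hypothetical satellite mass budget: totals and margins update automatically.
payload = Value("payload_kg", 150.0)
bus     = Value("bus_kg", 250.0)
total   = Value("total_kg", formula=lambda p, b: p + b, inputs=(payload, bus))
margin  = Value("launch_margin_kg", formula=lambda t: 500.0 - t, inputs=(total,))

print(margin.get())    # 100.0
payload.value = 180.0  # change one value...
print(margin.get())    # 70.0 ...and every derived value reflects it
```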

Valispace has now raised a seed extension round of €2.2M / $2.4M led by Berlin-based JOIN Capital, joined by Paris-based HCVC (Hardware Club).

The funding will be used to expand into new industries (e.g. medical devices, robotics) and to grow in existing ones (aeronautics, space, automotive, energy). The company is addressing Europe’s systems engineering tool market, which is worth €7 billion, while the US market is at least as big. Its competitors include RHEA CDP4, Innoslate, JAMA and the largest player of all: the status quo.

Marco Witzmann, CEO of Valispace, said: “Valispace has proven to help engineers across industries to develop better hardware. From drones to satellites, from small electronic boxes to entire nuclear fusion reactors. When modern companies like our customers have the choice, they choose an agile engineering approach with Valispace.”

Tobias Schirmer from JOIN Capital commented: “Browser-based collaboration has become a must for any modern hardware company, as the importance of communication across teams and offices increases.”

The company now counts BMW, Momentus, Commonwealth Fusion Systems and Airbus as customers.

Witzmann previously worked as a systems engineer on Europe’s biggest satellite program (Meteosat Third Generation), while his Portugal-based co-founder Louise Lindblad (COO) worked at the European Space Agency, developing satellites and drones.

As satellite engineers, both were surprised that while the products they were working on were cutting edge, the tools to develop them seemed to be from the 80s. In 2016 they launched Valispace as a company, convincing Airbus to become one of their first customers.

#airbus, #articles, #berlin, #bmw, #ceo, #co-founder, #coo, #electrical-engineering, #energy, #engineering, #europe, #european-space-agency, #github, #hardware-club, #join-capital, #momentus, #paris, #player, #portugal, #satellite, #satellite-imagery, #simulation, #tc, #united-states