Remotely operated vehicles have changed how we explore and exploit the ocean. They can operate for far longer than human-occupied vehicles, go into areas too risky for people, and reach depths where very few craft can take a human. But even so, much of the hardware is taken up by an enclosure capable of protecting things like batteries and electronics from the pressures of the deep.
But that may not be entirely necessary, based on a report in today’s Nature. In it, a team of Chinese researchers describe adapting hardware so that it could operate a soft-bodied robot in the deep ocean. The researchers then gave the robot a ride 10 kilometers down in the Mariana Trench and showed that it worked.
Mention robots, and for many people, the first thing that comes to mind is the collection of metal and cabling that makes up things like the dancing Atlases from Boston Dynamics. But over the last decade, plenty of researchers have demonstrated that all that rigid hardware isn’t strictly necessary. Soft-bodied robots work, too, and can do interesting things like squeeze through tight spaces or incorporate living cells into their structure.
Toro this week announced its intention to acquire Left Hand Robotics. The Colorado-based startup (not to be confused with RightHand Robotics) is a natural fit for the lawn mowing giant as the producer of the RT-1000, an autonomous system capable of mowing large lawns and clearing snow from sidewalks.
Toro is best known for its personal and professional mowers and various other landscaping machines. The company also has some experience in the robotic category, beating various competitors to the punch by a decade or two with its iMow, but those efforts don’t appear to have made a major dent in its overall offerings.
Left Hand’s primary offering targets professionals, already a core category for its new parent company, though it’s easy to see how the technology might scale down as robotic personal mowers gain in popularity. The RT-1000 is a fully autonomous system designed for a variety of weather conditions, which could make a more compact version an appealing option for consumers — assuming the companies are able to shrink the cost accordingly.
The acquisition comes as companies like John Deere are strategizing their own approach to robotics, and as dedicated robotics companies like iRobot have begun to explore the space. Although iRobot’s own mower was delayed due to COVID-19, the Roomba maker believes it has cracked what’s proven to be a difficult market for many.
Left Hand has raised $8.9 million to date. Details of the transaction have not been disclosed.
The sub-category of soft robotics has transformed the way many think about the field. Often influenced by natural phenomena, the technology offers a dramatically different approach from the sort of rigid structures we traditionally think of when we discuss robots.
Soft designs offer a number of benefits, including compliance, which has already seen a number of real-world applications in manufacturing and fulfillment. But like their more rigid cousins, soft robots have their limitations. As such, designers generally choose between one or the other for a given job — or, best-case scenario, design swappable parts.
A team at MIT CSAIL is exploring a technology that could make that choice less of a trade-off. The project has been in the works since 2017, though it’s still in the early stages, largely the realm of computer simulation, with the details outlined in a new paper.
“This is the first step in trying to see if we can get the best of both worlds,” CSAIL post-doc James Bern said in a release.
In the project (or the simulated version, at least), the robot is controlled by a series of cables. Pulling on them in the right combination turns the soft structure into a hard one. The team uses the analogy of a series of muscles controlling the human arm — if the right ones are flexed, you can effectively lock a position in place.
The team will present its findings at a conference next month. For the time being, it’s working on a prototype to showcase how the system operates in a real-world setting. Combining the two fields could go a long way toward building safer collaborative robots for interacting with human workers.
Insects are a lot of things – but fragile they’re not. Sure, most can’t withstand the full force of a human foot, but for their size, they’ve evolved to be extremely rugged and resilient. Insect-sized technology, on the other hand, is generally another story.
That’s certainly been the case historically with scaled-down drones. The components tend to become more fragile the more you shrink them. Motors, in particular, both lose efficiency and weaken the smaller they get.
Earlier models from the MIT lab have relied on rigid ceramic-based materials. They did the job in terms of getting the robot airborne, but as the lab notes, “foraging bumblebees endure a collision about once every second.” In other words, if you’re going to build something this small, you need to ensure that it doesn’t break down the first time it comes into contact with something.
“The challenge of building small aerial robots is immense,” says MIT Assistant Professor Kevin Yufeng Chen.
The new drone models, which the lab describes as resembling “a cassette tape with wings,” are built with soft actuators made from carbon nanotube-coated rubber cylinders. The actuators elongate when electricity is applied, at up to 500 times per second. Doing this causes the wings to beat and the drones to take flight.
The drones are extremely lightweight as well, coming in at around 0.6 grams – basically as much as a large bumblebee. There are still limitations to these early models. Namely, the system currently requires them to be hardwired to deliver the necessary charge, as seen in the GIF below. It can be a bit of a mess. Other modifications are being made as well, including a more nature-inspired dragonfly shape for newer prototypes.
Image Credits: MIT
Should the lab be able to produce an untethered version with imaging capabilities and a decent-sized battery, the potential applications for the tiny drones are immense: everything from simple inspections currently handled by larger models to pollination and search and rescue.
Skydio has raised $170 million in a Series D funding round led by Andreessen Horowitz’s Growth Fund. That pushes it into unicorn territory, with $340 million in total funding and a post-money valuation north of $1 billion. Skydio’s fresh capital comes on the heels of its expansion last year into the enterprise market, and it intends to use the considerable pile of cash to help it expand globally and accelerate product development.
In July of last year, Skydio announced its $100 million Series C financing and debuted the X2, its first dedicated enterprise drone. The company also launched a suite of software for commercial and enterprise customers, its first departure from the consumer drone market, where it had focused since its founding in 2014.
Skydio’s debut drone, the R1, received a lot of accolades and praise for its autonomous capabilities. Unlike other consumer drones at the time, including those from recreational drone maker DJI, the R1 could track and film a target while avoiding obstacles, with no human intervention required. Skydio then released its second drone, the Skydio 2, in 2019, cutting the price by more than half while improving its autonomous tracking and video capabilities.
Late last year, Skydio brought on additional senior talent to help it address enterprise and government customers, including a software development lead who had experience at Tesla and 3D printing company Carbon. Skydio also hired two Samsara executives at the same time to work on product and engineering. Samsara provides a platform for managing cloud-based fleet operations for large enterprises.
The applications of Skydio’s technology for commercial, public sector and enterprise organizations are many and varied. Already, the company works with public utilities, fire departments, construction firms and more to do work including remote inspection, emergency response, urban planning and more. Skydio’s U.S. pedigree also puts it in prime position to capitalize on the growing interest in applications from the defense sector.
a16z previously led Skydio’s Series A round. Other investors who participated in this Series D include Lines Capital, Next47, IVP and UP.Partners.
Japanese space startup Gitai has raised a $17.1 million funding round, a Series B financing for the robotics startup. This new funding will be used for hiring, as well as funding the development and execution of an on-orbit demonstration mission for the company’s robotic technology, which will show its efficacy in performing in-space satellite servicing work. That mission is currently set to take place in 2023.
Gitai will also be staffing up in the U.S., specifically, as it seeks to expand its stateside presence in a bid to attract more business from that market.
“We are proceeding well in the Japanese market, and we’ve already contracted missions from Japanese companies, but we haven’t expanded to the U.S. market yet,” explained Gitai founder and CEO Sho Nakanose in an interview. “So we would like to get missions from U.S. commercial space companies, as a subcontractor first. We’re especially interested in on-orbit servicing, and we would like to provide general-purpose robotic solutions for an orbital service provider in the U.S.”
Nakanose told me that Gitai has plenty of experience under its belt developing robots which are specifically able to install hardware on satellites on-orbit, which could potentially be useful for upgrading existing satellites and constellations with new capabilities, for changing out batteries to keep satellites operational beyond their service life, or for repairing satellites if they should malfunction.
Gitai’s focus isn’t exclusively on extra-vehicular activity in the vacuum of space, however. It’s also performing a demonstration mission of its technical capabilities in partnership with Nanoracks using the Bishop Airlock, which is the first permanent commercial addition to the International Space Station. Gitai’s robot, codenamed S1, is an arm-style robot not unlike industrial robots here on Earth, and it’ll be showing off a number of its capabilities, including operating a control panel and changing out cables.
Long-term, Gitai’s goal is to create a robotic workforce that can assist with establishing bases and colonies on the Moon and Mars, as well as in orbit. NASA plans to build a more permanent research presence in orbit around the Moon and on its surface, with the eventual goal of reaching Mars, while private companies like SpaceX and Blue Origin look ahead to permanent colonies on Mars and large in-space habitats hosting both humans and commercial activity. Nakanose suggests there’s going to be ample need for low-cost, efficient robotic labor in all of these settings – particularly in environments that are inhospitable to human life.
Nakanose told me that he actually got started with Gitai after the loss of his mother – an unfortunate passing he said he firmly believes could have been avoided with the aid of robotic intervention. He began developing robots that could expand and augment human capability, and then researched what was likely the most useful and needed application of this technology from a commercial perspective. That research led Nakanose to conclude that space was the best long-term opportunity for a new robotics startup, and Gitai was born.
This funding was led by SPARX Innovation for the Future Co. Ltd, and includes funding from DcI Venture Growth Fund, the Dai-ichi Life Insurance Company and EP-GB (Epson’s venture investment arm).
Robotics took a small step into the wild world of SPACs this week, as Berkshire Grey announced its plan to go public by Q2. Setting aside some of the bigger issues with the reverse merger route, which we’ve discussed plenty, BG is an ideal candidate for this next major step for a number of reasons.
First, the company’s got a track record and a ton of interest. I visited their HQ early last year, before the country shut down. Their plans were already fairly aggressive, with the wind of a recently raised $263 million Series B at their back. Retailers everywhere are already looking to automation as a way of staying competitive with the ominous monolith that is Amazon.
The mega-retailer has already acquired and deployed a ton of robots in fulfillment centers across the world. The latest number I’ve seen is 200,000. That comes from early 2020, so the number has no doubt increased since then. As Locus Robotics CEO Rick Faulk told me the other week, “There are investors that want to invest in helping everyone that’s not named ‘Amazon’ compete.” As with so many things these days, it’s Amazon versus the world.
Image Credits: Berkshire Grey
Beyond its knack for raising money by the boatload, Berkshire Grey is the company you go to when you’re looking to automate a factory from the ground up. The company says current warehouse automation is somewhere in the neighborhood of 5%. It’s a figure I’ve seen tossed around before, and it certainly points to a ton of opportunity. BG’s offering isn’t lights-out automation, but it’s a pretty full-featured solution.
Locus, which just raised a healthy $150M Series E, represents a different end of the spectrum. Similar to offerings from companies like Fetch, it takes a more plug-and-play approach to automation. The lower barrier to entry means a far less costly on-ramp. It also means you don’t have to shut down your warehouses for an extended period to implement the tech. It’s a more workable solution for situations with contract-based clients or temporary seasonal needs.
The company uses a RaaS (robot-as-a-service) model to deploy its technology. That’s something you’re going to be hearing more and more of around the industry. Like the HaaS (the “h” being hardware) model, the company essentially rents out these super-pricey machines, rather than selling them outright. It’s another way to lower the barrier of entry, and it gives the robotics companies the opportunity to offer continuous service upgrades.
Image Credits: Future Acres
It’s a model Future Acres, a Southern California agtech startup, is exploring as it comes out of stealth. It’s still early days for the company, which spun out of Wavemaker Partners (which also developed food service robotics company Miso). Among other things, the company is looking toward a crowdfunded raise by way of SeedInvest. I’ve not seen a lot of robotics companies take that route, so it will be interesting to see how it plays out.
Like logistics, agtech is shaping up to be a pretty massive category for robotics investments. FarmWise was ahead of that curve, announcing a $14.5 million round back in 2019 (bringing its total to north of $20 million). This week the Bay Area startup added crop dusting functionality to its weed-pulling robot.
Image Credits: NASA/JPL-Caltech
NASA’s Perseverance understandably grabbed the biggest robotics headlines of the week. Landing with a parachute sporting the JPL motto, “Dare mighty things,” the rover sent back some of the best and most stunning images of Mars to date.
MSCHF’s livestream, on the other hand, was a bit more spotty. But aside from a fair number of interruptions to the feed, I suspect the company’s 40th drop went about as well as it could have hoped. After MSCHF announced that it would mount a remote-control paintball gun to the back of Spot, Boston Dynamics issued a statement condemning the move:
Our mission is to create and deliver surprisingly capable robots that inspire, delight & positively impact society. We take great care to make sure our customers intend to use our robots for legal uses. We cross-check every purchase request against the U.S. Government’s denied persons and entities lists, prior to authorizing a sale.
Image Credits: MSCHF
MSCHF seemed to bask in the attention, even before its name was revealed to the public. At the very least, the stunt was a success from the standpoint of having ignited a conversation about the future of robotics. Boston Dynamics intrinsically understands that its robots sometimes freak people out — it’s a big part of the reason we get viral videos from the company, like the recent one featuring various robots dancing to The Contours.
The ACLU notably raised concern last year after footage of Spot being used in the field by the Massachusetts State Police made the rounds. This week, the NYPD deployed a Spot robot yet again — this time at the scene of a home invasion in the Bronx (sporting a new paint job and the name “Digidog,” for some reason). Your own interpretation of those particular optics will likely depend on, among other things, your feelings about cops.
Certainly police departments have utilized robotics for bomb disposal for decades, and it’s true that Boston Dynamics (along with much of the robotics industry) got early funding from DARPA. Spot in its current form isn’t much as far as war machines go, and military drones have existed in the world for more than a decade, but I think these are important conversations to have at this stage in robotic evolution.
That’s an important ethical conversation, as is the question of what responsibility robotics manufacturers bear once their machines are out in the world. Boston Dynamics does due diligence when selling its robots, but does it continue to be responsible for them once it no longer owns them? That’s certainly not a question we’re going to answer this week.
As far as fundraising goes, Berkshire Grey is in pretty good shape. When I visited its Massachusetts headquarters last year, following a massive $263 million Series B, the company discussed some pretty aggressive growth plans. Mind you, that was before the pandemic had really touched down in the U.S. in a meaningful way.
If anything, COVID-19 has accelerated interest in automation, as companies look to safeguard themselves from the inevitable effects of future pandemics. Today, Berkshire Grey announced its intention to become the latest tech company to go public by way of SPAC. The deal, which will see it merge with Revolution Acceleration Acquisition, could value the company at up to $2.7 billion.
In a release tied to the news, BG cites a figure of 5% current warehouse automation – a number I’ve heard tossed around a lot in relation to these deals. It certainly points to big potential for growth among retailers looking to streamline fulfillment, logistics and the like. For many, it’s as simple as finding a way to stay competitive with the likes of Amazon, which has massively bolstered its own robotics efforts through acquisitions like Kiva Systems.
BG offers a kind of ground-up solution for close to full automation. That technology separates it from more plug-and-play automation solutions like Locus and Fetch Robotics, whose offerings are more focused on automating operations faster and more cheaply. BG’s ecosystem includes a variety of robotics for picking, gripping and image sensing, with north of 300 patents in the space.
“Consumer expectations have changed, putting more pressure on supply chain operations to get the right goods to the right places at the right times, as efficiently as possible,” CEO Tom Wagner said in a release tied to the news. “Over the last 12 months the pandemic amplified the already high pressure to transform, so today it is no longer a question of if companies might transform but how quickly. We are incredibly excited about this transaction, which will enable Berkshire Grey to accelerate growth and provide new and existing customers with our leading robotics solutions.”
The deal would bring in up to $413 million in cash for the company, which says it plans to use the funding to address a backlog of customers and build out an international presence. The deal is expected to close in Q2.
Bay Area-based AI startup Symbio today announced its “official launch.” Backed by a total of $30 million in funding, the company has struck deals with both Nissan and Toyota to implement its software in U.S.-based factories.
The company says its SymbioDCS technology is capable of dramatically increasing automation with factory robots on the assembly line.
“To the end customer, the proposition is pretty straightforward,” CEO and co-founder Max Reynolds tells TechCrunch. “We’re improving the efficiency of their automation. The high-level goal is to increase the capacity of the factory and enable them to build more product, more quickly, more flexibly.”
The company closed a $15 million Series B in December of last year. That adds to a $12 million Series A in 2018, $2.5 million seed two years prior and a $500,000 pre-seed. This latest round was led by ACME Capital, joining existing investors Andreessen Horowitz, Eclipse Ventures and The House Fund.
Image Credits: Symbio
“Instead of exclusively providing automation solutions, Symbio is also designing the tools that enable the developers and domain experts working in manufacturing to create their own automation solutions and easily adapt them to new tasks,” UC Berkeley professor Anca Dragan said in a statement tied to the news. “To do this, they are building products that leverage AI strengths and human insight in a symbiotic way.”
Founded in 2014, the company employs around 40, mostly engineers, largely based in California. Reynolds explains that the current level of automated manufacturing in automotive is actually far lower than one might expect. “Assembly is less than 5% automated, across the board,” he says. “Even in this core vertical, there’s a ton of headroom and opportunity for growth.”
When people ask me which robotics categories are poised for the biggest growth, I often point to agriculture. The technology already has a strong foothold in areas like warehousing and logistics, but it’s impossible to look at the American – and global – farming community and not see a lot of potential for human-assisted automation.
The category still seems fairly wide open — but not for lack of interest. There are a number of companies both large and small carving out niches in the category. For now, at least, it seems there’s room for a number of different players. After all, needs vary greatly from farm to farm and crop to crop.
Santa Monica-based Future Acres is launching today, with plans to tackle grape picking. An outgrowth of Wavemaker Partners — the same firm that gave the world burger-flipping Miso Robotics — the startup is also announcing its first robot, Carry.
Image Credits: Future Acres
“We see Carry as a kind of harvesting sidekick for workers. It’s an autonomous harvesting companion,” CEO Suma Reddy tells TechCrunch. “What it can do in the real world is transport up to 500 lbs. of crops in all terrain and all weather. It can increase production efficiency by up to 80%, which means it pays for itself in only 80 days.”
Carry relies on AI to transport hand-picked crops, working alongside humans rather than attempting to replace the delicate picking process outright. The company is expecting that farms will purchase multiple machines that can work in tandem to speed up their process and help reduce the human strain of moving the crops around manually.
Image Credits: Future Acres
The company is still in the early stages, having developed a prototype of Carry, and it’s exploring some partnerships for development. The system would run $10,000 to $15,000 up front, though the company says it’s looking at a RaaS (robotics as a service) model as a way to defer that cost.
Interest in agricultural robotics has only increased during the pandemic, amid health concerns and labor issues. The company is building on that interest by launching a campaign on SeedInvest, in hopes of raising $3 million, in addition to funding already provided by Wavemaker.
A full description of the rover’s descent and mission can be found here, but briefly stated here’s what happened:
After the capsule decelerates from interplanetary velocity in the atmosphere, the heat shield is jettisoned and the parachute deployed. Beneath the heat shield are a number of cameras and instruments, which scan the landscape to find a good landing spot. At a certain altitude and speed the parachute is detached and the “jetpack” lower stage takes over, using rockets to maneuver toward the landing area. At about 70 feet above the surface, the “skycrane” dangles the rover out of the lander and softly plops it down on the ground before the jetpack flies off to crash at a safe distance.
Image Credits: NASA/JPL-Caltech
The whole process takes about seven minutes, the last few seconds of which are an especially white-knuckle ride.
While previous rovers sent back lots of telemetry and some imagery, this level of visual documentation is a first. Even InSight, launched in 2018, wasn’t able to send back this kind of footage.
“This is the first time we’ve actually been able to capture an event like the landing of a spacecraft on Mars,” said Mike Watkins, head of JPL, at a press conference. “These are really amazing videos, we all binge watched them over the weekend if you can call a one minute video binge watching. We will learn something by looking at the performance of the vehicle in these videos but a lot of it is also to bring you along on our journey.”
The team discussed the entry, descent, and landing camera system or EDL cams, which were made both to monitor how the process went and to provide the visceral experience that the whole team craved.
“I don’t know about you, but it is unlikely at this point in my career that I will pilot a spacecraft down to the surface of Mars,” said Matt Wallace, deputy project manager of Perseverance at JPL. “But when you see this imagery I think you will feel like you are getting a glimpse into what it would be like to land successfully in Jezero crater with perseverance.”
There were upward-facing cameras on the capsule, jetpack, and rover, and downward-facing cameras on the latter two as well, providing shots in both directions for practically the whole process. This image of the heat shield falling away feels iconic already – revealing the desert landscape of Mars much like film we’ve seen of Apollo landings on the Moon:
Image Credits: NASA/JPL-Caltech
You can see the whole thing below:
Over 30 gigabytes of imagery were captured during the descent, even though one of the cameras failed when the parachute deployed.
Practically every frame of the video offers new information about the process of landing on Mars — for instance, one of the springs used to eject the heat shield can be seen to have disconnected, though it didn’t affect the process. All the footage has been and no doubt will continue to be scrutinized for other insights.
In addition to these amazing landing videos, Perseverance has sent back a number of full-color images taken by its navigation cameras, though not all of its systems are up and running yet. The team stitched together the first images of Perseverance inspecting itself and its surroundings to form this panorama:
Image Credits: NASA/JPL-Caltech
We’ll have many, many more images soon as the team processes and uploads them.
As a parting “gift,” the team provided the remarkable first sound recording from the surface of Mars; they hoped that this would both provide new insights and also let anyone who can’t see the images experience the landing in a different way.
The EDL system included a microphone to capture the sound of the landing, but it sadly didn’t work during the descent. It is, however, working perfectly well on the surface and has now captured the ambience of the Red Planet — and while the sound of a gust of wind may not be particularly alien, it’s incredible to think that this truly is wind blowing across another world.
I’ve piloted Spot a number of times in a number of different settings. I had the chance to control the robot for the first time at one of our robotics events a few years back, and I drove one around an obstacle course at Boston Dynamics’ headquarters. More recently, I navigated it via web browser as a test of the robot’s new remote interface.
But a recent test drive was different. For one thing, it wasn’t officially sanctioned by Boston Dynamics. Of course, the highly sophisticated quadrupedal robot has been out in the world for a while, and a few enterprising souls have begun to offer a remote Spot walking experience through the streets of San Francisco.
The latest project from MSCHF isn’t that. That should come as no surprise, of course. The Brooklyn-based company is never that straightforward. It’s the same organization that gave us the “pirate radio” streaming service All The Streams.FM and that wild Amazon Echo ultrasonic jammer. More than anything, its projects are comments — on privacy, on consumerism or, in this case, a kind of dystopian foreshadowing of what robotics might become.
Like the rest of the world, the company was fascinated when Boston Dynamics put Spot up for sale — but unlike most of us, MSCHF actually managed to cobble together $75,000 to buy one.
And then it mounted a paintball gun to its back.
Image Credits: MSCHF
Starting Wednesday, users will be able to pilot a Spot unit through MSCHF’s site, and fire off a paintball gun in a closed setting. The company calls it “Spot’s Rampage.”
“The stream will start Wednesday at 1 PM EST,” MSCHF’s Daniel Greenberg told TechCrunch. “We will have a four-camera livestream going and as long as you’re on the site on your phone, you will have an equal chance of being able to control Spot, and every two minutes the driver will change. It should go for a few hours.”
Ahead of the launch of Spot’s web portal, the company built an API on top of Spot’s SDK to remotely control both the robot and the paintball gun mounted to its back. It’s a setup Boston Dynamics isn’t particularly thrilled with. Understandably so. For a company that has long been dealing with the blowback of cautionary science fiction like Black Mirror, the optics of a third party mounting a gun — even one that shoots paint — are less than ideal.
Boston Dynamics tells TechCrunch that it was interested in working with the company early on.
“They came to us with the idea that they were going to do a creative project with Spot,” a rep told TechCrunch. “They’re a creative group of guys, who have done a bunch of creative things. In our conversations, we said that if you want to cooperate with us, we want to make it clear that the robots will not be used in any way that hurts people.”
Boston Dynamics balked when the paintball gun entered the conversation. On Friday, it issued the following statement through Twitter:
Today we learned that an art group is planning a spectacle to draw attention to a provocative use of our industrial robot, Spot. To be clear, we condemn the portrayal of our technology in any way that promotes violence, harm, or intimidation. Our mission is to create and deliver surprisingly capable robots that inspire, delight & positively impact society. We take great care to make sure our customers intend to use our robots for legal uses. We cross-check every purchase request against the U.S. Government’s denied persons and entities lists, prior to authorizing a sale.
In addition, all buyers must agree to our Terms and Conditions of Sale, which state that our products must be used in compliance with the law, and cannot be used to harm or intimidate people or animals. Any violation of our Terms of Sale will automatically void the product’s warranty and prevent the robot from being updated, serviced, repaired or replaced. Provocative art can help push useful dialogue about the role of technology in our daily lives. This art, however, fundamentally misrepresents Spot and how it is being used to benefit our daily lives.
The statement is in line with the language in Spot’s contract, which prohibits using the robot to do anything illegal, or to intimidate or harm people. The company says it does additional “due diligence” with potential customers, including background checks.
Image Credits: MSCHF
The application is something of a gray area where Boston Dynamics is concerned. MSCHF approached the robotics company with its idea and Boston Dynamics balked, believing it wasn’t in line with the stated mission for the quadrupedal robots. The official Spot’s Rampage site notes:
We talked with Boston Dynamics and they HATED [emphasis theirs] this idea. They said they would give us another TWO Spots for FREE if we took the gun off. That just made us want to do this even more and if our Spot stops working just know they have a backdoor override built into each and every one of these little robots.
Boston Dynamics says the company’s “understanding of the interaction” is “inaccurate.”
“We get approached by marketing opportunities all the time to create a really fantastic and compelling experience,” the company adds. “Selling one robot is not that interesting. Creating an amazing interactive experience is really compelling for us. One of the things they pitched to us was an interactive idea. It’s an expensive robot and they wanted to create an interactive experience where anybody can control the robot. We thought that was super cool and compelling.”
Boston Dynamics says it pitched the idea of using Spot’s robot arm to paint the physical space with a brush, rather than using the paintball gun. The company also offered to send technicians to the site to help maintain the robot during the stream, along with a few models as backup.
MSCHF’s inclusion of the paintball gun is, ultimately, about more than simply painting the canvas. The image of the robot with a gun — even one that only shoots paint — is menacing. And that’s kind of the point.
“It’s easy to look at these robots dance and cavort and see them as cute semi-sentient little friends,” says Greenberg. “They’re endearing when they mess up and fall over. We’ve adopted the trappings of that scenario by creating a ‘bull-in-a-china-shop’ scenario. Still, it’s worth remembering the big versions of Spot [Big Dog] were explicitly military mules, and that their public deployments tend to be by city agencies and law enforcement. At the end of the day, Spot is a terrestrial UAV – when you get to drive this robot and experience the thrill of pulling the trigger your adrenaline spikes — but, we hope, a few minutes later you feel a distinct chill. Anyone in their right mind knows these little cuties will kill people sooner or later.”
While early Boston Dynamics robots were, indeed, funded by DARPA for use as transport vehicles, the company is quick to distance itself from even the remotest hint of ominous imagery. Boston Dynamics came under fire from the ACLU after showcasing footage of a Spot being used in Massachusetts State police drills onstage at a TechCrunch robotics event.
Image Credits: MSCHF
The company told TechCrunch at the time:
Right now we’re at a scale where we can pick and choose the partners we engage with and make sure that they have a similar deployment and a vision for how robots are used. For example, not using robots in a way that would physically harm or intimidate people. But also have a realistic expectation for what a robot can and cannot do.
As MSCHF prepares to launch its event, the company is echoing those sentiments.
Image Credits: MSCHF
But the question of whether the company can put the toothpaste back in the tube remains. In cases of violations of the Terms of Service, the company can opt not to renew the license, which effectively deactivates it the next time a firmware update is due. Other cases could essentially void the warranty, meaning the company won’t service it.
A paintball gun being fired in a closed space likely doesn’t fall under harm, intimidation or illegal activity, however. So it’s not entirely clear whether Boston Dynamics has a direct course of action in this case.
“This is something we’re evaluating now, around this particular use case,” Boston Dynamics says. “We do have other terms of service in there, regarding modification of the robot in a way that makes it unsafe. We’re trying to understand what the implications are.”
Boston Dynamics (whose sale to Hyundai is expected to close in June) has devoted a good deal of time to showcasing the various tasks the robot can perform, from routine inspections at hazard sites to the complex dance moves it’s performed in a recent viral video. MSCHF’s primary — and, really, only — use is an interactive art piece.
“To be honest, we don’t have any further plans [for the robot],” says Greenberg. “I know we won’t do another drop with it as we do not do repeats, so we will just have to get really creative. Maybe a walking cup holder.”
The Perseverance Mars rover landed safely yesterday, but only after a series of complex maneuvers as it descended at high speed through the atmosphere, known by the team as the “seven minutes of terror.” NASA has just shared a hair-raising image of the rover as it dangled from its jetpack above the Martian landscape, making that terror a lot easier to understand.
Published with others to the rover’s Twitter account (as always, in the first person), the image is among the first sent back from the rover; black-and-white shots from its navigation cameras appeared almost instantly after landing, but this is the first time we’ve seen the rover — or anything, really — from this perspective.
The image was taken by cameras on the descent stage or “jetpack,” a rocket-powered descent module that took over once the craft had sufficiently slowed via both atmospheric friction and its parachute. Once the heat shield was jettisoned, Perseverance scanned the landscape for a safe landing location, and once that was found, the jetpack’s job was to fly it there.
The image at the top of the story was taken by the descent stage’s “down-look cameras.” Image Credits: NASA/JPL-Caltech
When it was about 70 feet above the landing spot, the jetpack would have deployed the “sky crane,” a set of cables that would lower the rover to the ground from a distance that safely allowed the jetpack to rocket itself off to a crash landing far away.
The image at top was taken just moments before landing — it’s a bit hard to tell whether those swirls in the Martian soil are hundreds, dozens or just a handful of feet below, but follow-up images made it clear that the rocks you can see are pebbles, not boulders.
Image Credits: NASA/JPL-Caltech
The images are a reminder that the processes we observe only third-hand, via telemetry tracked at an HQ millions of miles away, are in fact very physical, fast and occasionally brutal things. Seeing such an investment of time and passion dangling from cords above a distant planet after a descent that started at 5 kilometers per second, and required about a hundred different things to go right or else end up just another crater on Mars… it’s sobering and inspiring.
That said, that first person perspective may not even be the most impressive shot of the descent. Shortly after releasing that, NASA published an astonishing image from the Mars Reconnaissance Orbiter, which managed to capture Perseverance mid-fall under its parachute:
Image Credits: NASA/JPL-Caltech/University of Arizona
Keep in mind that MRO was 700 km away, and traveling at over 3 km/second at the time this shot was taken. “The extreme distance and high speeds of the two spacecraft were challenging conditions that required precise timing and for Mars Reconnaissance Orbiter to both pitch upward and roll hard to the left so that Perseverance was viewable by HiRISE at just the right moment,” NASA wrote in the description of the photo.
Chances are we’re going to be treated to a fuller picture of the “seven minutes of terror” soon, once NASA collects enough imagery from Perseverance, but for now the images above serve as reminders of the ingenuity and skill of the team there, and perhaps a sense of wonder and awe at the capabilities of science and engineering.
Massachusetts-based Locus Robotics today announced a $150 million Series E. The round, led by Tiger Global Management and Bond, brings the firm’s total to around $250 million to date, and values the robotics company at $1 billion. Locus is notable for a more modular and flexible solution for automating warehouses than many of its competitors (see: Berkshire Grey). The company essentially leases out robotic fleets to organizations looking to automate logistics.
“We can change the wings on the plane while it’s flying,” CEO Rick Faulk tells TechCrunch. “Basically no one else can do that. Companies want flexible automation. They don’t want to bolt anything to the floor. If you’re a third-party logistics company and you have a two, three, four-year contract, the last thing you want to do is invest $25-$50 million to buy a massive solution, bolt it to the floor and be locked into all of this upfront expense.”
The company currently has some 4,000 robots deployed across 80 sites. Roughly 80% of its deployments are in the U.S., with the remaining 20% in Europe. Part of this massive funding round will go toward expanding international operations, including a bigger push into the EU, as well as the APAC region, where it presently doesn’t have much of a footprint.
The company will also be investing in R&D, sales and marketing and increasing its current headcount of 165 by 75 in the coming year.
The pandemic is clearly a driver in interest around this brand of automation, with more companies looking toward robotics for help.
“COVID has put a spike in the growth of online ordering, clearly, and that spike is probably a four to five year jump,” says Faulk. “If you look at the trend of e-commerce, it’s been on a steady upward tick. It was about 11% last year and COVID put a spike up to 16/17%. We think that genie’s out of the bottle, and it’s not going back any time soon.”
The funding round also points to a company that seemingly has no desire to be acquired by a larger name, akin to Kiva Systems’ transformation into Amazon Robotics.
“We have no interest in being acquired,” the CEO says. “We think we can build the most and greatest value by operating independently. There are investors that want to invest in helping everyone that’s not named ‘Amazon’ compete.”
There will be one more robot on Mars tomorrow afternoon. The Perseverance rover will touch down just before 1:00 Pacific, beginning a major new expedition to the planet and kicking off a number of experiments — from a search for traces of life to the long-awaited Martian helicopter. Here’s what you can expect from Perseverance tomorrow and over the next few years.
It’s a big, complex mission — and like the Artemis program, is as much about preparing for the future, in which people will visit the Red Planet, as it is about learning more about it in the present. Perseverance is ambitious even among missions to Mars.
If you want to follow along live, NASA TV’s broadcast of the landing starts at 11:15 AM Pacific, providing context and interviews as the craft makes its final approach:
Until then, however, you might want to brush up on what Perseverance will be getting up to.
Seven months of anticipation and seven minutes of terror
Image Credits: NASA/JPL-Caltech
First, the car-sized rover has to get to the surface safely. It’s been traveling for seven months to reach the Red Planet, its arrival heralded by new orbiters from the UAE and China, which both entered orbit last week.
Perseverance isn’t looking to stick around in orbit, however, and will plunge directly into the thin atmosphere of Mars. The spacecraft carrying the rover has made small adjustments to its trajectory to be sure that it enters at the right time and angle to put Perseverance above its target, the Jezero crater.
The process of deceleration and landing will take about seven minutes once the craft enters the atmosphere. The landing process is the most complex and ambitious ever undertaken by an interplanetary mission, and goes as follows.
After slowing down in the atmosphere like a meteor to a leisurely 940 MPH or so, the parachute will deploy, slowing the descender over the next minute or two to a quarter of that speed. At the same time, the heat shield will separate, exposing the instruments on the underside of the craft.
Image Credits: NASA/JPL-Caltech
This is a crucial moment, as the craft will then autonomously — there’s no time to send the data to Earth — scan the area below it with radar and other instruments and find what it believes to be an optimal landing location.
Once it does so, from more than a mile up, the parachute will detach and the rover will continue downwards in a “powered descent” using a sort of jetpack that will take it down to just 70 feet above the surface. At this point the rover detaches, suspended at the end of a 21-foot “Sky Crane,” and as the jetpack descends the cable extends; once it touches down, the jetpack boosts itself away, Sky Crane and all, to crash somewhere safely distant.
All that takes place in about 410 seconds, during which time the team will be sweating madly and chewing their pencils. It’s all right here in this diagram for quick reference:
Jezero Crater was chosen as a region rich in possibilities for finding evidence of life, but also a good venue for many other scientific endeavors.
The most similar to previous missions are the geology and astrobiology goals. Jezero was “home to an ancient delta, flooded with water.” Tons of materials coalesce in deltas that not only foster life, but record its presence. Perseverance will undertake a detailed survey of the area in which it lands to help characterize the former climate of Mars.
Part of that investigation will specifically test for evidence of life, such as deposits of certain minerals in patterns likely to have resulted from colonies of microbes rather than geological processes. It’s not expected that the rover will stumble across any living creatures, but you know the team all secretly hope this astronomically unlikely possibility will occur.
One of the more future-embracing science goals is to collect and sequester samples from the environment in a central storage facility, which can then be sent back to Earth — though they’re still figuring out how to handle that last detail. The samples themselves will be carefully cut from the rock rather than drilled or chipped out, leaving them in pristine condition for analysis later.
Image Credits: NASA/JPL-Caltech
Perseverance will spend some time doubling back on its path to place as many as 30 capsules full of sampled material in a central depot, which will be kept sealed until such a time as they can be harvested and returned to Earth.
The whole time, the rover will be acting as a mobile science laboratory, taking all kinds of readings as it goes. Some of the signs of life it’s looking for only emerge from detailed analysis of the soil, for instance, so sophisticated imaging and spectroscopy instruments, PIXL and SHERLOC, are on board. It also carries a ground-penetrating radar (RIMFAX) to observe the fine structure of the landscape beneath it. And MEDA will continuously take measurements of temperature, wind, pressure, dust characteristics, and so on.
Of course the crowd-pleasing landscapes and “selfies” NASA’s rovers have become famous for will also be beamed back to Earth regularly. It has 19 cameras, though mostly they’ll be used for navigation and science purposes.
Exploring takes a little MOXIE and Ingenuity
Image Credits: NASA/JPL-Caltech
Perseverance is part of NASA’s long-term plan to visit the Red Planet in person, and it carries a handful of tech experiments that could contribute to that mission.
The most popular one, and for good reason, is the Ingenuity Mars Helicopter. This little solar-powered two-rotor craft will be the first ever demonstration of powered flight on another planet (the jetpack Perseverance rode in on doesn’t count).
The goals are modest: the main one is simply to take off and hover in the thin air a few feet off the ground for 20 to 30 seconds, then land safely. This will provide crucial real-world data about how a craft like this will perform on Mars, how much dust it kicks up, and all kinds of other metrics that future aerial craft will take into account. If the first flight goes well, the team plans additional ones that may look like the GIF above.
Being able to fly around on another planet would be huge for science and exploration, and eventually for industry and safety once people are there. Drones have already become crucial tools for all kinds of surveying, rescue operations and other tasks here on Earth — why wouldn’t the same be true on Mars? Plus, it’ll get some great shots from its onboard cameras.
MOXIE is the other forward-looking experiment, and could be even more important (though less flashy) than the helicopter. It stands for Mars Oxygen In-Situ Resource Utilization Experiment, and it’s all about trying to make breathable oxygen from the planet’s thin, mostly carbon dioxide atmosphere.
This isn’t about making oxygen to breathe, though it could be used for that too. MOXIE is about making oxygen at scales large enough that it could be used to provide rocket fuel for future takeoffs. Though if habitats like these ever end up getting built, it will be good to have plenty of O2 on hand just in case.
For a round trip to Mars, sourcing fuel there rather than trucking it all the way from Earth to burn on the way back is an immense improvement in many ways. The 30-50 tons of liquid oxygen that would normally be brought over in the tanks could instead be functional payloads, and that kind of tonnage goes a long way when you’re talking about freeze-dried food, electronics and other supplies.
MOXIE will be attempting, at a small scale (it’s about the size of a car battery, and future oxygen generators would be a hundred times bigger), to isolate oxygen from the CO2 surrounding it. The team is expecting about 10 grams per hour, but it will only be on intermittently so as not to draw too much power. With luck it’ll be enough of a success that this method can be pursued more seriously in the near future.
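For a rough sense of the timescales involved, here is the arithmetic using only the figures quoted above. This is a back-of-the-envelope sketch: it assumes a future generator really does produce 100 times MOXIE’s rate and runs continuously, both of which are simplifications.

```python
# Back-of-the-envelope estimate of how long a scaled-up MOXIE-style
# generator would need to run, using only the figures quoted above.
# Assumptions: ~100x MOXIE's ~10 g/hour, running nonstop.

MOXIE_RATE_G_PER_HR = 10   # demo unit's expected output
SCALE_FACTOR = 100         # "a hundred times bigger"
TONS_NEEDED = (30, 50)     # liquid oxygen for a return launch

rate = MOXIE_RATE_G_PER_HR * SCALE_FACTOR   # grams per hour
for tons in TONS_NEEDED:
    grams = tons * 1_000_000                # metric tons -> grams
    hours = grams / rate
    years = hours / (24 * 365)
    print(f"{tons} t -> {hours:,.0f} h (~{years:.1f} Earth years)")
```

Even at the scaled-up rate, producing a return flight’s worth of oxygen is a multi-year job, which is why the idea is to land such a generator well before any crew arrives.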
Image Credits: NASA/JPL-Caltech
One of the big challenges for previous rovers is that they have essentially been remote controlled with a 30-minute delay: scientists on Earth examine the surroundings, then send instructions like “go forward 40 centimeters, turn front wheels 5 degrees to the right, go 75 centimeters,” and so on. This not only means a lot of work for the team, but a huge delay as the rover makes its moves, waits half an hour for more instructions to arrive, then repeats the process over and over.
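That delay is simply light travel time, and it varies with the orbital geometry. A quick calculation, assuming the commonly cited Earth-Mars distance range of roughly 55 to 400 million kilometers, shows where the half-hour round trip comes from:

```python
# One-way light travel time between Earth and Mars, illustrating
# the round-trip command delay. Distances are approximate bounds
# of the Earth-Mars range over the synodic cycle.

C_KM_PER_S = 299_792  # speed of light in km/s (rounded)

for label, dist_mkm in [("closest", 55), ("farthest", 400)]:
    one_way_min = dist_mkm * 1_000_000 / C_KM_PER_S / 60
    print(f"{label}: ~{one_way_min:.0f} min one way, "
          f"~{2 * one_way_min:.0f} min round trip")
```

At the far end of the range, a single command-and-response cycle eats up most of an hour, which is why onboard autonomy matters so much.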
Perseverance breaks with its forebears with a totally new autonomous navigation system. It has high-resolution, wide-angle color cameras and a dedicated processing unit for turning images into terrain maps and choosing paths through them, much like a self-driving car.
Being able to go farther on its own means the rover can cover far more ground. The longest drive ever recorded in a single Martian day was 702 feet by Opportunity (RIP). Perseverance will aim to cover about that distance on average, and with far less human input. Chances are it’ll set a new record pretty quickly once it’s done tiptoeing around for the first few days.
In fact the first 30 sols after the terrifying landing will be mostly checks, double checks, instrument deployments, more checks, and rather unimpressive-looking short rolls around the immediate area. But remember, if all goes well, this thing could still be rolling around Mars in 10 or 15 years when people start showing up. This is just the very beginning of a long, long mission.
Hyundai Motor Group is back with a new “walking car” robot that can use its wheels to roll along a path or stand up and navigate tougher terrain on its legs. This time, the concept is designed to carry cargo and is small enough to be carried by a drone.
The TIGER robot — short for transforming intelligent ground excursion robot — is the first “uncrewed” ultimate mobility vehicle (UMV) concept to come out of New Horizons Studio, the Mountain View, Calif., facility that is home to Hyundai Motor Group’s UMV development. Tiger follows in the wheeled footsteps of Elevate, a larger concept vehicle designed to carry people that the company unveiled in 2019 at the CES tech trade show.
Image Credits: Screenshot/Hyundai
While concepts don’t always translate into real products, New Horizons Studio head John Suh told TechCrunch that his aim is to bring Tiger to life “as soon as possible,” adding that it would likely be a five-year process.
Suh said the team will spend the next two years focused on solving some core technical problems to establish a baseline design. In 2023 and 2024, the team will reach the beta-product stage and begin advanced testing before Tiger finally becomes a product customers can buy.
Today’s version of the Tiger is based on a modular platform architecture, just like its larger cousin. The robot has a leg and wheel locomotion system, 360-degree directional control, a storage bay that can carry goods and a range of sensors for remote observation. It’s also designed to connect to a drone, which can charge the robot while flying it to its destination.
The Tiger has two modes that are deployed depending on the terrain. On smoother, less complex surfaces, the robot’s legs retract and the vehicle uses all four wheels to move. If the vehicle gets stuck or faces an obstacle like a small wall, berm or log, it can stand up, lock the wheels and then walk.
This is the first version of Tiger — known as X-1 for “experimental” — suggesting New Horizons will be bringing out more variants in the future. This one was created in partnership with engineering design software company Autodesk and concept design firm Sundberg-Ferar.
What, you are no doubt asking, is a worm blob? Well, it’s a blob of worms, obviously. More specifically, it’s a blob of California blackworms. It’s not a flock, nor a swarm, nor a school. It’s a big, undulating mass of Lumbriculus variegatus, tangled up but somehow moving as one.
Roboticists, of course, have a long, storied history of drawing inspiration from nature. This time out, a team at Georgia Tech studied the aforementioned worm blob in hopes of gaining insight into its unusual form of locomotion. The researchers believe they can apply some of the learnings to rethink the way robots move.
The team published its findings in an academic journal earlier this month. According to the research, the blobs — which range from 10 to 50,000 individual organisms — are a kind of survival mechanism to adapt to things like changing temperatures. A few individuals are capable of moving the larger group, with around two or three being required to move a group of five.
The researchers set up a series of six 3D-printed robots with two arms and two light sensors apiece. Mesh and pins on the arms allowed the robots to become entangled with one another.
“Depending on the intensity, the robots try to move away from the light,” researcher Yasemin Ozkan-Aydin said in a release tied to the news. While there was no direct communication between the robots, they effectively operated as a group. “They generate emergent behavior that is similar to what we saw in the worms.”
Image Credits: Georgia Tech
The scientists think that sort of collective action can be applied to make individual robots more collaborative and cohesive units. “Often people want to make robot swarms do specific things, but they tend to be operating in pristine environments with simple situations,” Professor Daniel Goldman says of the research. “With these blobs, the whole point is that they work only because of physical interaction among the individuals. That’s an interesting factor to bring into robotics.”
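The core idea, each robot reacting only to local light with no communication, is simple enough to sketch. The following toy simulation is purely illustrative (the falloff function, gain and positions are invented for the example, and this is not the Georgia Tech robots’ actual control code), but it shows how a shared local rule produces coordinated group motion:

```python
# Minimal 1-D sketch of decentralized phototaxis: each agent moves
# away from a light source at a speed set by the local intensity,
# with no communication between agents. Illustrative only -- not
# the Georgia Tech robots' actual control code.

LIGHT_POS = 0.0

def intensity(x):
    """Light falls off with distance (inverse-square, clipped near 0)."""
    return 1.0 / max((x - LIGHT_POS) ** 2, 0.25)

def step(positions, gain=0.1):
    new = []
    for x in positions:
        direction = 1.0 if x >= LIGHT_POS else -1.0  # away from light
        new.append(x + direction * gain * intensity(x))
    return new

agents = [0.5, 0.8, 1.2, 1.5]   # hypothetical starting positions
for _ in range(50):
    agents = step(agents)
print(agents)  # every agent has drifted away from the light
```

No agent knows where the others are, yet the group retreats from the light together, which is the flavor of emergent behavior the researchers describe.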
DoorDash is expanding its robotic footprint into the kitchen. The delivery service is set to acquire Chowbotics, a Bay Area-based robotics company best known for its salad-making robot, Sally. TechCrunch has confirmed the acquisition, which was first reported by The Wall Street Journal.
“We have long admired the work that Chowbotics has done to increase access to fresh meals, with its groundbreaking robotics product and vision,” DoorDash co-founder Stanley Tang said in a comment offered to TechCrunch. “At DoorDash, we are always working to innovate and continue improving how we support our merchant partners and their success — and are excited to leverage this technology to do so in new ways. With the Chowbotics team on board, we can explore new use cases and customers, providing another service to help our merchants grow.”
Founded in 2014, Chowbotics has raised around $21 million to date, including an $11 million round back in 2018. The company’s vending machine-style salad bar robot was already well-positioned for the pandemic, removing a human element from the food preparation process — not to mention the fact that salad bars and buffets tend to be open air affairs. In October, the startup added a contactless feature to the robot, letting users order ahead of time, via app.
“Joining the DoorDash team unlocks new possibilities for Chowbotics and the technology that this team has built over the past seven years,” CEO Rick Wilmer said in a statement. “As the leader in food delivery and on-demand logistics, DoorDash has the unparalleled reach and expertise to help us grow and deploy our technology more broadly, so together, we can make fresh, nutritious food easy for more people.”
It’s not entirely clear how the company’s technology will fit into the delivery service’s current offering, though DoorDash notes it will “improve consumer access to fresh and safe meals, and enhance our robust merchant offerings and logistics platform.” It also remains to be seen whether Chowbotics will continue to operate as its own entity within the broader DoorDash. We’ve reached out for more insight.
“At DoorDash, we strive to become a merchant’s first call when they want to grow their business,” Tang said. “What excites us most about Chowbotics is that the team has developed a remarkable tool for helping merchants grow. Bringing Chowbotics’ technology into the DoorDash platform gives us a new opportunity to help merchants expand their current menu offerings and reach new customers in new markets — which is a fundamental part of our merchant-first approach to empowering local economies.”
DoorDash has been working with robotics companies for a number of years now. Perhaps the most prominent example is a partnership with Starship Technologies to explore food delivery robots, though that technology has hit a fair number of roadblocks from local officials not eager to turn their sidewalks over to robots. The delivery company likens Chowbotics’ kiosk-style technology to its work with ghost kitchens, effectively serving as a conduit to help expand food options at local merchants, whether in store or through delivery. The former will likely be of more interest once the current pandemic is in the rearview mirror.
Details of the acquisition have not been disclosed.
Additive manufacturing has proven an ideal solution for certain tasks, but the technology still lags behind more traditional methods in a number of categories. One of the biggest is the requirement for post-printing assembly: 3D printers can create extremely complex components, but an outside party (be it human or machine) is required to put them together.
MIT’s CSAIL department this week showcased “LaserFactory,” a new project that aims to produce robotics, drones and other machines that can be fabricated as part of a “one-stop shop.” The system consists of a software kit and a hardware platform designed to create structures and assemble circuitry and sensors for the machine.
A more fully realized version of the project will be showcased at an event in May, but the team is pulling back the curtain a bit to show what the concept looks like in practice. Here’s a breakdown from CSAIL’s page:
Let’s say a user has aspirations to create their own drone. They’d first design their device by placing components on it from a parts library, and then draw on circuit traces, which are the copper or aluminum lines on a printed circuit board that allow electricity to flow between electronic components. They’d then finalize the drone’s geometry in the 2D editor. In this case, they’d use propellers and batteries on the canvas, wire them up to make electrical connections, and draw the perimeter to define the quadcopter’s shape.
Printing circuit boards is certainly nothing new. What sets CSAIL’s machine apart is the breadth of functionality that’s been jammed into it. An accompanying video lays it out pretty well:
Of course, this is early days — we’re still months out from the official presentation. There are a lot of questions, and more to the point, a lot of potential points of failure for a complex machine like this — especially one that seems to have non-experts as a target audience.
“Making fabrication inexpensive, fast, and accessible to a layman remains a challenge,” PhD student and lead author Martin Nisser says in the release. “By leveraging widely available manufacturing platforms like 3D printers and laser cutters, LaserFactory is the first system that integrates these capabilities and automates the full pipeline for making functional devices in one system.”
The software appears to be a big piece of the puzzle, allowing users to view a version of the product before it’s printed. After that, of course, it’s too late to make changes.
There’s something difficult to reconcile watching Spot walk up a flight of stairs in some industrial setting. After years of watching viral videos of Boston Dynamics robots performing aesthetically impressive feats, there’s a banality to the quadruped performing those dull, dirty and dangerous tasks that roboticists love to talk about.
But six and a half months after the company opened Spot up for sale (and more than 400 units sold, per BD), companies have been deploying the advanced piece of machinery in some downright dreary settings. Yesterday morning, I had the opportunity to pilot the robot around one of them, from the comfort of my own desk.
This week the Hyundai-owned robotics pioneer is introducing Scout, a browser-based interface for remotely controlling the robot. The arrival will also be joined by a self-charging “Enterprise” edition of the robot and the already announced Spot Arm. All of the new hardware is available starting today through Boston Dynamics’ site (it’s one of those “call for a quote” pricing deals), though Scout will also be compatible with any version of the robot.
Image Credits: Boston Dynamics
That said, the company is recommending it be paired with the self-docking Enterprise version. After all, the robot runs about 90 minutes on a charge, so if you’re intending to use it to monitor a situation without people present, that’s probably the way to go.
I’ve driven Spot around a few times in person — and like those, there’s a bit of a learning curve here. Boston Dynamics estimates it will take around 15 minutes to get you fully up to speed, but after a minute or two, I was able to send the robot up and down a flight of stairs at BD HQ. There are, thankfully, a whole bunch of cameras and other sensors built into the $75,000 robot to help you avoid doing something really stupid.
Image Credits: Boston Dynamics
The system will work with Bluetooth gaming controllers, but for my demo, I was stuck with the keyboard. There’s a basic WASD control scheme that should be familiar if you’ve done any PC gaming. The arrow keys, meanwhile, can be used to switch between four cameras, giving you a view from all sides. There are a number of additional views, including a terrain mode that gives you a kind of top-down rendered view of the robot. That’s probably the best way to view all of the immediate obstacles — or better still, you can use picture-in-picture to get multiple views at once.
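For readers curious what a WASD scheme boils down to, it is essentially a key-to-velocity lookup. The sketch below is entirely hypothetical (the key names and velocity values are invented, and this is not Boston Dynamics’ actual Scout API), but it captures the general shape of that translation:

```python
# Hypothetical sketch of a WASD-style teleop mapping -- the kind of
# key-to-velocity translation a browser interface might perform.
# Values and structure are illustrative, not Boston Dynamics' API.

KEY_TO_VELOCITY = {
    "w": (0.5, 0.0),    # forward, m/s
    "s": (-0.5, 0.0),   # backward
    "a": (0.0, 0.3),    # strafe left
    "d": (0.0, -0.3),   # strafe right
}

def command_from_keys(pressed):
    """Sum the velocity contributions of all currently held keys."""
    vx = sum(KEY_TO_VELOCITY[k][0] for k in pressed if k in KEY_TO_VELOCITY)
    vy = sum(KEY_TO_VELOCITY[k][1] for k in pressed if k in KEY_TO_VELOCITY)
    return (vx, vy)

print(command_from_keys({"w", "a"}))  # forward while strafing left
```

Summing contributions is what lets diagonal movement (two keys held at once) fall out of the mapping for free.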
I found myself using “click to go” a lot, as well. It essentially works the way it sounds: You click on a point on the ground and Spot walks toward it. The feature is designed primarily for those with connection issues — imagine, say, you’ve got poor Spot deployed on some oil rig somewhere.
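Boston Dynamics hasn’t detailed how click-to-go is implemented, but conceptually it comes down to casting a ray from the clicked pixel through a camera model and intersecting it with the ground plane. Here’s a minimal sketch under that assumption, for an idealized, level, forward-facing pinhole camera (all the parameter values are invented):

```python
def click_to_ground(u, v, focal_px, cam_height_m):
    """Project a click at (u, v) pixels from the image center onto the
    ground plane, for a level pinhole camera mounted cam_height_m above
    the floor. v > 0 means the click is below the image center."""
    if v <= 0:
        return None  # ray points at or above the horizon, so it never hits ground
    t = cam_height_m / v          # scale at which the ray drops to floor level
    return (focal_px * t, u * t)  # (forward, lateral) offset in meters

# Invented numbers: a click 100 px below center and 50 px right, with a
# 500 px focal length and the camera 1.0 m off the ground, lands roughly
# 5 m ahead and 0.5 m to the right.
target = click_to_ground(50, 100, 500, 1.0)
```

The robot would then treat that ground point as a local navigation goal, with its own terrain sensing handling whatever lies along the way.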
“[W]e have a power plant customer who had a possible equipment failure. They were able to use the robot to repeatedly inspect something that, if it failed, could have been dangerous to a human inspector,” Spot’s Chief Engineer Zack Jackowski told TechCrunch. “So they logged in, checked on this pipe repeatedly and were able to avoid an expensive shutdown.”
Image Credits: Boston Dynamics
There’s also a “stair mode” that positions the robot for walking up and down stairs. The feature needs to be manually toggled on and off, though the robot should be able to walk up a flight of stairs in normal mode (I did this in my demo, and it didn’t appear to give anyone on staff a heart attack). For the time being, the functionality is limited to line of sight. Jackowski adds, “We have all sorts of crazy plans to extend that to building scale, but the first thing we want to get out there is line of vision.”
In addition to a new docking connector on the robot’s bum, the Enterprise version will also sport an enhanced CPU and improved wireless connectivity. It will ship either as a bundle with the dock or solo.
Sadly, we weren’t able to take the new arm out for a spin, but Jackowski did offer some detail on that functionality, noting, “you issue the arm commands, like ‘move your hand here’ or ‘pick up this object’ or ‘turn this valve,’ and the robot’s actually smart enough to figure out, ‘hey, if I’m going to turn that valve, I need to stand over here, I need to shift my weight like this, I need to figure out how to keep the right part of my wrist limp to accommodate how that valve moves.’ ”
San Jose-based robotics company Fetch unveiled its latest robot this morning. The PalletTransport1500 is an autonomous bot designed specifically to replace forklifts in warehouses. The systems, which are designed to pick up and deliver pallets, are capable of supporting payloads of up to 2,504 pounds.
The device joins a number of robotic forklift solutions from various companies, including Toyota, though Amazon’s own Kiva Systems-produced robots are likely still the best-known pallet-moving robots in the game.
The system was developed with Honeywell Intelligrated’s Momentum warehouse software. Fetch, of course, already offers a number of different warehouse robotic solutions, building out a kind of autonomous ecosystem. The company’s systems are notable for their relative flexibility over other full-scale solutions.
Per a press release, the new robot is designed to remove humans from the pallet-moving equation. Actions include,
Cross-docking: the AMR can transport pallets directly from inbound to outbound shipment areas. After pallets are unloaded from the truck, the AMR carries pallets routed from the inbound trailers/containers directly to the respective outbound shipping area location.
Returns: once inbound items are sorted based on product type or vendor, the AMR transports pallets to their appropriate return station (inventory, recycle, charity, etc.)
Warehouse transport: after received products are unloaded and palletized, the AMR moves inventory to storage locations based on business needs
This product category was no doubt one of its most highly demanded, given the fairly common occurrence of forklift-related accidents. Per numbers from OSHA, “forklifts cause about 85 fatal accidents per year; 34,900 accidents result in serious injury; and 61,800 are classified as non-serious.” That’s a pretty big source of workplace accidents. The agency adds that if you assign one accident per machine, somewhere in the neighborhood of 11% of U.S. forklifts are involved in an accident.
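That 11% figure checks out as a back-of-the-envelope calculation. The fleet size below is an assumption on my part (it’s the estimate commonly attributed to OSHA), as is the agency’s simplification of one accident per machine:

```python
# Rough sanity check of the ~11% figure, using OSHA's accident counts.
fatal = 85
serious = 34_900
non_serious = 61_800

total_accidents = fatal + serious + non_serious   # 96,785 per year
fleet_size = 855_900                              # assumed U.S. forklift count

share = total_accidents / fleet_size
print(f"{share:.1%}")   # about 11%
```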
In addition to these concerns, COVID-19 related shutdowns have no doubt made the move toward automated fulfillment systems all the more compelling over the past year.
You’d be forgiven for being underwhelmed by the output from SoftBank Robotics thus far. The firm’s best-known product to date is almost certainly Pepper, a humanoid robot designed for greeting and signage that grew out of its 2015 acquisition of French robotics company Aldebaran.
There’s also the matter of the investment firm’s acquisition and eventual sale of Boston Dynamics. The deal certainly went a ways toward accelerating the company’s go-to-market approach, but Boston Dynamics changed hands fairly quickly when it was sold to Hyundai late last year (SoftBank maintains a 20% stake).
The latest wrinkle in SoftBank’s robotic ambitions is nothing if not interesting. The firm announced today that it is joining forces with Iris Ohyama. The Japanese brand, which will hold a 51% stake in the venture (with SoftBank controlling the remainder), is best known for its home goods. The company makes a broad range of products that includes, as Reuters put it, “everything from rice to rice cookers.”
You’ll be able to add robotics to that list, soon enough. The newly formed Iris Robotics has set an extremely aggressive goal of $965 million in sales by 2025. In a joint press release, the company noted Covid-19-related concerns as a major catalyst in the launch of the division. Certainly that makes strategic sense. There’s little question that the past year has kickstarted serious interest in robotics and automation.
The first couple of products from the venture don’t appear especially ambitious out of the gate, however. To start, it seems they’ll be rolling out “Iris Editions” of a pair of existing devices: Bear Robotics’ restaurant robot Servi and cleaning robot, Whiz.
Here’s a quote from the SoftBank Robotics CEO (forgive the Google Translate):
With the urgent need to realize the new normal in the corona virus, various new expectations are being placed on robots. This strong partnership with Iris Ohyama is a huge step forward for the expansion and penetration of robot solutions. Taking full advantage of the strengths of both companies, we will respond quickly to the challenges facing society.
Certainly the technical ambitions seem more modest than what the folks at companies like Boston Dynamics are currently working on, but Iris Ohyama seems well positioned to make some headway in the home robotics category to start.
German drone technology startup Wingcopter has raised a $22 million Series A – its first significant venture capital raise after mostly bootstrapping. The company, which focuses on drone delivery, has come a long way since its founding in 2017, having developed, built and flown its Wingcopter 178 heavy-lift cargo delivery drone using its proprietary, patented tilt-rotor propulsion mechanism, which combines the benefits of vertical take-off and landing with the advantages of fixed-wing aircraft for longer-distance horizontal flight.
This new Series A round was led by Silicon Valley VC Xplorer Capital, as well as German growth fund Futury Regio Growth. Wingcopter CEO and co-founder Tom Plümmer explained in an interview that the addition of an SV-based investor is particularly important to the startup, since it’s in the process of preparing its entry into the U.S., with plans for an American facility, both for flight testing to satisfy FAA requirements for operational certification and, eventually, for U.S.-based drone production.
Wingcopter has already been operating commercially in a few different markets globally, including in Vanuatu in partnership with Unicef for vaccine delivery to remote areas, in Tanzania for two-way medical supply delivery, and in Ireland, where it completed the world’s first delivery of insulin by drone beyond visual line of sight (BVLOS, the industry’s technical term for when a drone flies beyond the visual range of a human operator who has the ability to take control in case of emergencies).
Wingcopter CEO and co-founder Tom Plümmer
Wingcopter has so far pursued a business as an OEM manufacturer of drones, and has had paying customers eager to purchase its hardware effectively since day one (Plümmer told me that they had at least one customer wiring them money before they even had a bank account set up for the business). But it’s also now getting into the business of offering drone delivery-as-a-service. After doing the hard work of building its technology from the ground up, and seeking out the necessary regulatory approvals to operate in multiple markets around the world, Plümmer says that he and his co-founders realized that operating a service business not only meant a new source of revenue, but also better served the needs of many of its potential customers.
“We learned during this process, through applying for permission, receiving these permissions and working now in five continents in multiple countries, flying BVLOS, that actually operating drones is something we are now very good at,” he said. “This was actually becoming a really good source of income, and ended up actually making up more than half of our revenue at some point. Also, looking at scalability of the business model of being an OEM, it’s kind of […] linear.”
Linear growth with solid revenue and steady demand was fine for Wingcopter as a bootstrapped startup founded by university students supported by a small initial investment from family and friends. But Plümmer says the company saw so much potential in the technology it had developed, and in the emerging drone delivery market, that the exponential growth curve of its drone delivery-as-a-service model helped make traditional VC backing make sense. In the early days, Plümmer says, Wingcopter had been approached by VCs, but at the time it didn’t make sense for what they were trying to do; that’s changed.
“We were really lucky to bootstrap over the last four years,” Plümmer said. “Basically, just by selling drones and creating revenue, we could employ our first 30 employees. But at some point, you realize you want to really plan with that revenue, so you want to have monthly revenues, which generally repeat like a software business – like software as a service.”
Wingcopter 178 cargo drone performing a delivery for Merck.
Wingcopter has also established a useful hedge regarding its service business, not only by being its own hardware supplier, but also by having worked closely with many global flight regulators on their regulatory process through the early days of commercial drone flights. They’re working with the FAA on its certification process now, for instance, with Plümmer saying that they participate in weekly calls with the regulator on its upcoming certification process for BVLOS drone operators. Understanding the regulatory environment, and even helping architect it, is a major selling point for partners who don’t want to have to build out that kind of expertise and regulatory team in-house.
Meanwhile, the company will continue to act as an OEM as well, selling its Wingcopter 178 heavy-lift model, which can fly up to 75 miles at speeds of up to 100 mph and carry payloads of up to around 13 lbs. Because of its unique tilt-rotor mechanism, it’s not only more efficient in flight, but it can also fly in much windier conditions – and take off and land in harsher conditions than most drones, too.
Plümmer tells me that Wingcopter doesn’t intend to rest on its laurels in the hardware department, either; it’s going to be introducing a new model of drone soon, with different capabilities that expand the company’s addressable market, both as an OEM and in its drones-as-a-service business.
With its U.S. expansion, Wingcopter will still look to focus specifically on the delivery market, but Plümmer points out that there’s no reason its unique technology couldn’t also work well to serve markets including observation and inspection, or to address needs in the communication space as well. The one market that Wingcopter doesn’t intend to pursue, however, is military and defense. While these are popular customers in the aerospace and drone industries, Plümmer says that Wingcopter has a mission “to create sustainable and efficient drone solutions for improving and saving lives,” and says the startup looks at every potential customer and ensures that it aligns with its vision – which defense customers do not.
While the company has just announced the close of its Series A round, Plümmer says they’re already in talks with some potential investors to join a Series B. It’s also going to be looking for U.S.-based talent in embedded systems software and flight operations testing, to help with the testing process required for its certification by the FAA.
Plümmer sees a long tail of value to be built from Wingcopter’s patented tilt-rotor design, with potential applications in a range of industries, and he says that Wingcopter won’t be looking at any potential exits via M&A until it has fully realized that value. Meanwhile, the company is also starting to sow the seeds of its own potential future customers, with training programs in drone flights and operations it’s putting on in partnership with UNICEF’s African Drone and Data Academy. Wingcopter clearly envisions a bright future for drone delivery, and its work in focusing its efforts on building differentiating hardware, plus the role it’s playing in setting the regulatory agenda globally, could help position it at the center of that future.
MIT researchers are looking to address the significant gap between how quickly robots can process information (relatively slowly) and how fast they can move (very quickly, thanks to modern hardware advances), and they’re using something called “robomorphic computing” to do it. The method, designed by MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) graduate Dr. Sabrina Neuman, results in custom computer chips that can offer hardware acceleration as a means to faster response times.
Custom-built chips tailored to a very specific purpose are not new – if you’re using a modern iPhone, you have one in that device right now. But they have become more popular as companies and technologists look to do more local computing on devices with more conservative power and computing constraints, rather than round-tripping data to large data centers via network connections.
In this case, the method involves creating hyper-specific chips that are designed based on a robot’s physical layout and its intended use. By taking into account the requirements a robot has in terms of its perception of its surroundings, its mapping and understanding of its position within those surroundings, and its motion planning resulting from said mapping and its required actions, researchers can design processing chips that greatly increase the efficiency of that last stage by supplementing software algorithms with hardware acceleration.
The classic example of hardware acceleration that most people encounter on a regular basis is a graphics processing unit, or GPU. A GPU is essentially a processor designed specifically for the task of handling graphical computing operations – like display rendering and video playback. GPUs are popular because almost all modern computers run into graphics-intensive applications, but custom chips for a range of different functions have become much more popular lately thanks to the advent of more customizable and efficient small-run chip fabrication techniques.
Here’s a description of how Neuman’s system works specifically in the case of optimizing a hardware chip design for robot control, per MIT News:
The system creates a customized hardware design to best serve a particular robot’s computing needs. The user inputs the parameters of a robot, like its limb layout and how its various joints can move. Neuman’s system translates these physical properties into mathematical matrices. These matrices are “sparse,” meaning they contain many zero values that roughly correspond to movements that are impossible given a robot’s particular anatomy. (Similarly, your arm’s movements are limited because it can only bend at certain joints — it’s not an infinitely pliable spaghetti noodle.)
The system then designs a hardware architecture specialized to run calculations only on the non-zero values in the matrices. The resulting chip design is therefore tailored to maximize efficiency for the robot’s computing needs. And that customization paid off in testing.
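To make that concrete, here’s an illustrative sketch of the core idea, skipping computation on structurally zero entries. This is not MIT’s actual tooling, and the four-joint robot and matrix values are made up for illustration:

```python
# A made-up 4-joint robot whose anatomy rules out certain joint
# couplings, represented as structural zeros. In robomorphic computing,
# the sparsity pattern is fixed at design time, so generated hardware
# only needs multipliers for the nonzero positions.

def sparsity_pattern(matrix):
    """Positions of structurally nonzero entries. These are fixed by
    the robot's anatomy, so they can be baked into the chip design."""
    return [(i, j) for i, row in enumerate(matrix)
                   for j, val in enumerate(row) if val != 0]

def specialized_matvec(pattern, matrix, vector):
    """Multiply using only the baked-in nonzero positions, skipping the
    zeros a general-purpose processor would still iterate over."""
    result = [0.0] * len(matrix)
    for i, j in pattern:
        result[i] += matrix[i][j] * vector[j]
    return result

M = [[1.0, 0.0, 0.0, 0.0],
     [0.5, 2.0, 0.0, 0.0],
     [0.0, 0.0, 3.0, 0.0],
     [0.0, 0.0, 0.5, 1.5]]
pattern = sparsity_pattern(M)   # only 6 of 16 entries need hardware
print(specialized_matvec(pattern, M, [1.0, 1.0, 1.0, 1.0]))
# [1.0, 2.5, 3.0, 2.0]
```

A general-purpose chip would loop over all 16 entries here; hardware built around the pattern does the work of 6.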
Neuman’s team used a field-programmable gate array (FPGA), which is sort of like a midpoint between a fully custom chip and an off-the-shelf CPU, and it achieved significantly better performance than the latter. That means that were you to actually custom manufacture a chip from scratch, you could expect much more significant performance improvements.
Making robots react faster to their environments isn’t just about increasing manufacturing speed and efficiency – though it will do that. It’s also about making robots even safer to work with in situations where people are working directly alongside and in collaboration with them. That remains a significant barrier to more widespread use of robotics in everyday life, meaning this research could help unlock the sci-fi future of humans and robots living in integrated harmony.
Parents with kids stuck learning at home during the pandemic have had to look for alternative activities to promote the hands-on learning experiences kids are missing out on due to attending class virtually. The New York-based educational technology startup Thimble aims to help address this problem by offering a subscription service for STEM-based projects that allow kids to make robotics, electronics and other tech using a combination of kits shipped to the home and live online instruction.
Thimble began back in 2016 as a Kickstarter project, when it raised $300,000 in 45 days to develop its STEM-based robotics and programming kits. The next year, it began selling its kits to schools, largely in New York, for use in the classroom or in after-school programs. Over the years that followed, Thimble scaled its customer base to around 250 schools across New York, Pennsylvania and California, which would buy the kits and gain access to teacher training.
But the COVID-19 pandemic changed the course of Thimble’s business.
“A lot of schools were in panic mode. They were not sure what was happening, and so their spending was frozen for some time,” explains Thimble co-founder and CEO Oscar Pedroso, whose background is in education. “Even our top customers that I would call would just [say], ‘hey, this is not a good time. We think we’re going to be closing schools down.’”
Pedroso realized that the company would have to quickly pivot to begin selling directly to parents instead.
Image Credits: Thimble
Around April, it made the shift — effectively entering the B2C market for the first time.
The company today offers parents a subscription that allows them to receive up to 15 different STEM-focused project kits and a curriculum that includes live instruction from an educator. One kit is shipped every three months, though an accelerated program is available that ships with more frequency.
The first kit covers basic electronics, where kids learn how to build simple circuits like a doorbell, a kitchen timer and a music composer. The kit is designed so kids can experience “quick wins” to keep their attention and whet their appetite for more projects. This leads into future kits like those offering a Wi-Fi robot, a little drone, an LED compass that lights up, and a synthesizer that lets kids become their own D.J.
Image Credits: Thimble
While any family can use the kits to help kids experience hands-on electronics and robotics, Pedroso says that about 70% of subscribers are those where the child already has a knack for doing these sorts of projects. The remaining 30% are those where the parents are looking to introduce the concepts of robotics and programming, to see if the kids show an interest. Around 40% of the students are girls.
The subscription is more expensive than some DIY projects at $59.99 per month (or $47.99/mo if paid annually), but that’s because it includes live instruction in the form of weekly one-hour Zoom classes. Thimble has part-time employees who are not just able to teach the material, but can do so in a way that appeals to children: by being passionate, energetic and capable of jumping in to help if they sense a child is having an issue or getting frustrated. Two of the five teachers are women. One instructor is bilingual and teaches some classes in Spanish.
During class, one teacher instructs while a second helps moderate the chat room and answer the questions that kids ask in there.
The live classes will have around 15-20 students each, but Thimble additionally offers a package for small groups that reduces class size. These could be used by homeschool “pods” or other groups.
Image Credits: Thimble
“We started hearing from pods and then micro-schools,” notes Pedroso. “Those were parents who were connected to other parents, and wanted their kids to be part of the same class. They generally required a little bit more attention and wanted some things a little more customized,” he added.
These subscriptions are more expensive at $250/month, but the cost is shared among the group of parents, which brings the price down on a per-household basis. Around 10% of the total customer base is on this plan, as most customers are individual families.
Thimble also works with several community programs and nonprofits in select markets that help to subsidize the cost of the kits to make the subscriptions more affordable. These are announced, as available, through schools, newsletters, and other marketing efforts.
Since pivoting to subscriptions, Thimble has re-established a customer base and now has 1,110 paid customers. Some, however, are grandfathered in at an earlier price point, so Thimble needs to scale the business further.
In addition to the Kickstarter, Thimble has raised funds and worked on the business over the years with the help of multiple accelerators, including LearnLaunch in Boston, Halcyon in D.C. and Telluride Venture Accelerator in Colorado.
The startup, co-founded by Joel Cilli in Pittsburgh, is now around 60% closed on its seed round of $1 million, but isn’t announcing details of that at this time.
The last several years have seen a substantial increase in the capabilities of robotic exoskeleton technology. Completely understandable. For one thing, it’s that rare technology you encounter that really feels like it’s going to change lives for the better the first time you see it. I’ve had a number of demos with companies that frankly took my breath away — watching someone walk across the room for the first time in years while their spouse stands by you crying will do that.
For another thing, there are two distinct use cases for this tech. The first is the aforementioned mobility — whether it’s full paralysis or simply helping people with walking impairments move a bit more easily. The second is work. Exoskeletons have great potential to ease the burden of lifting heavy objects or standing for extended periods. For this reason, many companies like Ekso Bionics have created two distinct divisions to serve both sides.
So it’s a big potential market – albeit one that’s still going to take a number of years to mature. For that reason, we’re really only talking rough projections here. I do think there’s still space for some smaller companies to carve out a meaningful business in the category.
I also won’t be surprised when more big companies get involved in the category. It’s a good way to put your stamp on the robotics category. Samsung’s GEMS is certainly the biggest-name product in the category this week at CES — even if it didn’t warrant a ton of stage time. It debuted at the event two years ago and we were able to try it out. For now, the news centered around hardware improvements like battery life and the beginning of clinical trials – a necessary part of bringing this sort of healthcare or healthcare-adjacent product to market.
As with most of Samsung’s robotics announced at the show this week, the jury is very much still out with regards to how seriously the company is taking the product. Last year it made a brief appearance at CES as part of an “immersive workout experience.”
Image Credits: Archelis
Some smaller companies have shown off compelling entries. Japan-based Archelis Inc. is top of mind, showcasing the ArchelisFX, whose name derives from the Japanese word for “walkable chair.” The device is designed for a number of different scenarios, including people with back pain and those who have recently undergone surgery. The company says it will be available to rent or buy for around $5,000.
On the whole, the exoskeletons on display at this year’s virtual CES tend largely toward the mobility side of the equation. Notably absent was Sarcos Robotics, which announced a partnership with Delta Air Lines at last year’s event. In September, the company built on that interest to raise a $40 million round.
Mapping the ocean’s floor is a surprisingly vital enterprise, which helps with a range of activities including shipping, coastal protection, and deep-sea resource gathering. It’s also a very costly and time-consuming activity, which can be demanding and dangerous for those involved. Saildrone is a startup focused on building out autonomous exploratory vessels that can do lots of mapping, while making very little impact on the environment in which they operate, and without requiring any crew on board at all.
Saildrone’s newest robotic ocean explorer is the Surveyor, its largest vessel at 72-feet long. The Surveyor can spend up to 12 months at a stretch out at sea, and draws its power from wind (hence the large sail-like structure, which is not actually used like the sail on a sailboat) and the sun (via the solar panels dotting its above-water surfaces). Its sensor instrumentation includes sonar that can map down to 7,000 meters (around 22,000 feet). That’s not quite as deep as some of the deepest parts of the world’s oceans, but it’s plenty deep enough to cover the average depth of around 12,100 feet.
As Saildrone notes, we’ve only actually mapped around 20% of the Earth’s oceans to date – meaning we know less about them than we do the surface of Mars or the Moon. Saildrone has already been contributing to better understanding this last great frontier with its 23-foot Explorer model, which has already accumulated 500,000 nautical miles of travel on its autonomous sea voyages. The larger vessel will help not only with seafloor mapping, but also with a new DNA sample collection effort using sensors developed with the University of New Hampshire and the Monterey Bay Aquarium Research Institute, to better understand the genetic makeup of various lifeforms that occupy the water column in more parts of the sea.
Pollen Robotics turned heads at last year’s CES. After all, a humanoid robot will do that on the show floor (even if it’s only half of one). The French startup is back for this year’s show (insofar as anyone is really back for the show), with some key updates to its robot, Reachy.
The biggest news this time out is the addition of teleoperation functionality using a virtual reality setup. Using a VR headset, a remote operator is capable of viewing video through the robot’s two face cameras. VR controllers are used to manipulate the robot’s arms for pick and place operations. The functionality can be used for, among other things, training the robot to perform tasks.
Reachy is notable for being an open-source robotics platform. The $17,000 bot is potentially suitable for robotics research, including prototyping one’s own technology. As evidenced by last year’s event, it’s also a fun presentation robot, fulfilling a similar function as, say, SoftBank’s Pepper. It’s not, however, going to be taking over any manufacturing jobs any time soon.
Image Credits: Brian Heater
The robot’s software is built on the popular open-source robotics operating system, ROS 2. The on-board computer and cameras have both been upgraded since the company first showed off the robot roughly this time last year.
For the last couple of years, Samsung’s CES press conferences have featured a parade of futuristic home robots. They’re smart, dexterous and impressive (and reasonably adorable). But home robots are hard. Like, really, really hard. There’s a reason the robotic vacuum continues to be the one real viable home robot nearly 20 years after the Roomba’s introduction.
It’s the same reason the JetBot 90 AI+ Vacuum seems to be the one really viable bit of home robotics from the event. The company also showed off updates to the Bot Handy it introduced at last year’s show. That, coupled with the new Bot Care, is far more in line with the kind of humanoid designs science fiction has led us to believe we’ll be getting in the next few years.
And science fiction seems to still be an operative descriptor here. At last year’s show, the robots put on a kind of Chuck E. Cheese-style presentation, running through choreographed tasks on stage, with limited human interaction. Understandably – there’s a lot that goes into this sort of thing, and for the moment, the technology feels like proof of concept more than anything.
Image Credits: Samsung
The company mentioned the “not too distant future” in reference to the tech, while the small print at the corner of the screen said “This robot is undergoing research and development, and is not yet for sale.” That seems to be putting it mildly, as the wheeled Bot Care reminds its owner of a meeting and pops up a screen for a conference call.
I don’t think anyone is under any illusions that we’ll be seeing any of this tech during the current pandemic, though I suppose there’s an argument to be made that this is the “new normal” the company is prepping us for. The Bot Handy moving dishes from the sink to dishwasher seems roughly as realistic.
Image Credits: Samsung
I’m happy to be wrong, but I don’t think any of us are holding our breath for a viable version of this tech in the near term. We can, however, appreciate the JetBot 90 AI+ Vacuum. That, after all, has a rough date, arriving in the U.S. at some point in the first half of this year.
The robot vacuum features an on-board LiDAR sensor, coupled with an object detection algorithm that helps it build an ideal path around the user’s home. Interestingly, the camera can also be viewed remotely by the user, doubling as a kind of security cam (though Samsung seems to avoid actually using the word) and a pet monitor.
UBTech is the massively funded Chinese robotics company you’ve (probably) never heard of. If you’re familiar at all with the brand here in the States, it’s likely for its STEM or other robotic toys, including a Star Wars Storm Trooper robot from a few years back (my own first exposure to the company). But with around $940 million in funding, it seems the sky’s the limit, in terms of product expansion.
Among its announcements at CES today is the arrival of a line of UV-C disinfecting robots. The need is certainly clear, following the events of the past 12 months. In fact, UBTech isn’t even the first company to introduce a UV robot for the show – that distinction falls to LG, which is set to offer its own solution this year.
And like LG, UBTech has already begun piloting its tech. The State of Delaware’s Department of Education has deployed models in a number of locations. The company is producing a few different models of Adibot – including rolling and stationary versions. They’re available for purchase starting this week, with financing plans starting at $15 a day.
As the company notes, while the need for such functionality has been at top of mind during the COVID-19 pandemic, it certainly doesn’t start or end with that specific virus. Epidemiologists have warned that this particular strain most likely won’t be the last major pandemic of our lifetimes.
You don’t need Qoobo in your life. Nobody needs Qoobo, exactly. In fact, first reactions tend to range from befuddlement to bemusement. The robotic cat pillow doesn’t make a ton of sense on the face of it – in part because Qoobo has no face.
The handful of times I’ve interacted with the original Qoobo in person, reactions have been pretty uniform. The initial confusion gives way to the question of why such a thing needs to exist. And then, inevitably, someone asks how they can buy one of their own.
The original, larger version was fairly difficult to get here in the States for a while, owing to the limitations a small robotics company has in bringing its product to a brand-new market. I suspect there was also a question of whether such an idiosyncratic product would translate. In the end, however, there’s nothing particularly confusing about it.
Image Credits: Brian Heater
At its subtly beating heart is an attempt to deliver comfort in a small, furry package. It’s something we could all probably use more of these days. Following a successful Indiegogo campaign, the new Petit Qoobo delivers that in a smaller, more affordable design. “Petit Qoobo is a cushion-shaped robot with a tail,” the included User Guide begins. “When stroked, the tail waves gently.”
Honestly, that’s kind of the whole deal here. It’s a furry pillow with a robotic tail that waves when petted. Pet it more vigorously and the tail responds in kind. The pillow has a built-in mic that listens for sound (though not specific words), which can elicit a wag. I’ve found that things like a knock on the door or loud music can also trigger this effect. It will also just wag at random, “just to say ‘hello.’”
Petit Qoobo is sitting on my lap as I write this. And yes, it’s soothing. It’s not a replacement for a real pet – but I also know full well that my real pet (pictured above) would not be as chill about sitting on my lap while I try to get some work done. When I’m finished petting Qoobo, there’s no protest – the tail simply goes slack.
The robot will also “go to sleep” after extensive petting – in order to save on charge, one assumes. When the time comes to recharge, there’s a port located – let’s just say it’s near the tail. A zipper along the outside makes it possible to remove the fur coat altogether for cleaning.
Image Credits: Brian Heater
The tail mechanism isn’t loud, per se, but it’s audible. You can hear the actuators moving as it goes to work. Honestly, the buzzing is more charming than anything. The only time it’s an issue is when using the device as a pillow. Qoobo’s other clever trick is a quiet heartbeat that triggers when squeezed. It’s a nice, calming effect – though one that can sometimes be overpowered by the tail noise.
The device is part of a long and fascinating lineage of Japanese therapy robotics. The most notable example is probably Paro, which dates back to the ’90s. The baby seal was designed to calm and comfort patients in hospitals and nursing homes – essentially a way to bring the benefits of therapy animals without having to have actual animals involved. Of course, that project – which ultimately cost around $15 million in development – is on an entirely different scale than this product from Yukai Engineering.
Image Credits: Brian Heater
But the result isn’t entirely dissimilar. There are just certain parts of us that are wired to want to pet something furry and hear a heartbeat – both boxes this strange little robot happily checks. I certainly feel a bit calmer writing this — and that’s probably the most you can ask for, these days.