Tag Archives: robotics

#439693 Agility Robotics’ Digit is Getting ...

Agility Robotics' Digit humanoid has been taking a bit of a break from work during the pandemic. Most of what we've seen from Agility and Digit over the past year and a half has been decidedly research-y. Don't get me wrong, Digit's been busy making humans look bad and not falling over when it really should have done, but remember that Agility's goal is to make Digit into a useful, practical robot. It's not a research platform—as Agility puts it, Digit is intended to “accelerate business productivity and people's pursuit of a more fulfilling life.” As far as I can make out, this is a fancier way of saying that Digit should really be spending its time doing dull repetitive tasks so that humans don't have to, and in a new video posted today, the robot shows how it can help out with boring warehouse tote shuffling.

The highlights here for me are really in the combination of legged mobility and object manipulation. Right at the beginning of the video, you see Digit squatting all the way down, grasping a tote bin, shuffling backwards to get the bin out from under the counter, and then standing again. There's an unfortunate cut there, but the sequence is shown again at 0:44, and you can see how Digit pulls the tote towards itself and then regrasps it before lifting. Clever. And at 1:20, the robot gives a tote that it just placed on a shelf a little nudge with one arm to make sure it's in the right spot.

These are all very small things, but I think of them as highlights because the big things seem to be more or less solved in this scenario. Digit has no problem lifting things, walking around, and not mowing over the occasional human. Once all of that is sorted, whether the robot can work effectively in an environment like this comes down to these other little human-obvious things, which often make the difference between success and failure.
The clear question, though, is why Digit (or, more broadly, any bipedal robot) is the right robot to be doing this kind of job. There are other robots out there already doing tasks like these in warehouses, and they generally have wheeled bases and manipulation systems specifically designed to move totes and do nothing else. If you were to use one of those robots instead of Digit, my guess is that you'd pay less for it, it would be somewhat safer, and it would likely do the job more efficiently. Fundamentally, Digit can't out box-move a box-moving robot. But the critical thing to consider here is that as soon as you run out of boxes to move, Digit can do all kinds of other things thanks to its versatile humanoid design, while your box-moving robot can only sit in the corner and be sad until more boxes show up.
“We did not set out to build a humanoid robot. We set out to solve mobility.”
—Agility CTO Jonathan Hurst
“Digit is very, very flexible automation,” Agility CTO Jonathan Hurst told us when we asked him about this. “The value of what we're doing is in generality, and having a robot that's going to be able to work carrying totes for three or four hours, then go unload boxes from trailers for three or four hours, keep up with you if you change your workflow entirely. Many of these spaces are designed specifically around the human form factor, and it's possible for a robot like Digit to do all of these different boring, repetitive jobs. And then when things get complicated, humans are still doing it.”
The value of having a human-like robot in a human environment comes into play as soon as you start thinking about typical warehouse situations that would be trivial for a human to solve but that are impossible for wheeled robots. For example, Hurst says that Digit is capable of using a stool to reach objects on high shelves. You could, of course, design a wheeled robot with an extension system to allow it to reach high shelves, but you're now adding more cost and complexity, and the whole point of a generalist humanoid robot is that in human environments, you just don't have to worry about environmental challenges. Or that's the idea, anyway, but as Hurst explains, the fact that Digit ended up with a mostly humanoid form factor was more like a side effect of designing with specific capabilities in mind:
We did not set out to build a humanoid robot. We set out to solve mobility, and we've been on a methodical path towards understanding physical interaction in the world. Agility started with our robot Cassie, and one of the big problems with Cassie was that we didn't have enough inertia in the robot's body to counteract the leg swinging forward, which is why Digit has an upright torso. We wanted to give ourselves more control authority in the yaw direction with Cassie, so we experimented with putting a tail on the robot, and it turns out that the best tail is a pair of bilaterally symmetrical tails, one on either side.
Our goal was to design a machine that can go where people go while manipulating things in the world, and we ended up with this kind of form factor. It's a very different path for us to have gotten here than the vast majority of humanoid robots, and there's an awful lot of subtlety in our machine that is absent in most other machines.

IEEE Spectrum: So are you saying that Digit's arms sort of started out as tails to help Cassie with yaw control?
Jonathan Hurst: There are many examples like this—we've been going down this path where we find a solution to a problem like yaw control, and it happens to look like it does with animals, but it's also a solution that's optimal in several different ways, like physical interaction and being able to catch the robot when it falls. It's not like it's a compromise between one thing and another thing, it's straight up the right solution for these three different performance design goals.
Looking back, we started by asking, should we put a reaction wheel or a gyro on Cassie for yaw control? Well, that's just wasted mass. We could use a tail, and there are a lot of nice robots with tails, but usually they're for controlling pitch. It's the same with animals; if you look at lizards, they use their tails for mid-air reorienting to land on their feet after they jump. Cassie doesn't need a tail for that, but we only have a couple of small feet on the ground to work with. And if you look at other bipedal animals, every one of them has some other way of getting that yaw authority. If you watch an ostrich run, when it turns, it sticks its wing out to get the control that it needs.
And so all of these things just fall into place, and a bilaterally symmetrical pair of tails is the best way to control yaw in a biped. When you see Digit walking and its arms are swinging, that's not something that we added to make the motion look right. It looks right because it literally is right—it's the physics of mobility. And that's a good sign for us that we're on the right path to getting the performance that we want.
“We're going for general purpose, but starting with some of the easiest use cases.”
—Agility CTO Jonathan Hurst
Spectrum: We've seen Digit demonstrating very impressive mobility skills. Why are we seeing a demo in a semi-constrained warehouse environment instead of somewhere that would more directly leverage Digit's unique advantages?
Jonathan Hurst: It's about finding the earliest, most appropriate, and most valuable use cases. There's a lot to this robot, and we're not going to be just a tote packing robot. We're not building a specialized robot for this one application, but we have a couple of pretty big logistics partners who are interested in the flexibility and the manipulation capabilities of this machine. And yeah, what you're seeing now is the robot on a flattish floor, but it's also not going to be tripped up by a curb, or a step, or a wire cover, or other things on the ground. You don't have to worry about anything like that. So next, it's an easy transition to unloading trailers, where it's going to have to be stepping over gaps and up and down things and around boxes on the floor and stuff like that. We're going for general purpose, but starting with some of the easiest use cases.
Damion Shelton, CEO: We're trying to prune down the industry space, to get to something where there's a clear value proposition with a partner and deploying there. We can respect the difficulty of the general purpose use case and work to deploy early and profitably, as opposed to continuing to push for the outdoor applications. The blessing and the curse of the Ford opportunity is that it's super interesting, but also super hard. And so it's very motivating, and it's clear to us that that's where one of the ultimate opportunities is, but it's also far enough away from a deployment timeline that it just doesn't map on to a viable business model.
This is a point that every robotics company runs into sooner or later, where aspirations have to succumb to the reality of selling robots in a long-term sustainable way. It's definitely not a bad thing, it just means that we may have to adjust our expectations accordingly. No matter what kind of flashy cutting-edge capabilities your robot has, if it can't cost-effectively do dull or dirty or dangerous stuff, nobody's going to pay you money for it. And cost-effective usefulness is, arguably, one of the biggest challenges in bipedal robotics right now. In the past, I've been impressed by Digit's weightlifting skills, or its ability to climb steep and muddy hills. I'll be just as impressed when it starts making money for Agility by doing boring repetitive tasks in warehouses, because that means that Agility will be able to keep working towards those more complex, more exciting things. “It's not general manipulation, and we're not solving the grand challenges of robotics,” says Hurst. “Yet. But we're on our way.”

Posted in Human Robots

#439678 Video Friday: Afghan Girls Robotics Team ...

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – [Online Event]
IROS 2021 – September 27–October 1, 2021 – [Online Event]
ROSCon 2021 – October 20-21, 2021 – [Online Event]
Let us know if you have suggestions for next week, and enjoy today's videos.
Five members of an all-girl Afghan robotics team have arrived in Mexico, fleeing an uncertain future at home after the recent collapse of the U.S.-backed government and takeover by the Taliban.
[ Reuters ] via [ FIRST Mexico ]
Thanks, Fan!
As far as autonomous cars are concerned, there's suburban Arizona difficulty, San Francisco difficulty, and then Asia rush hour difficulty. This is a 9:38 long video that is actually worth watching in its entirety because it's a fully autonomous car from AutoX driving through a Shenzhen urban village. Don't miss the astonished pedestrians, the near-miss with a wandering dog, and the comically one-sided human-vehicle interaction on a single lane road.

The AutoX Gen5 system has 50 sensors in total, as well as a vehicle control unit with 2200 TOPS of computing power. There are 28 cameras capturing a total of 220 million pixels per second, six high-resolution LiDARs offering 15 million points per second, and 4D RADAR with 0.9-degree resolution encompassing a 360-degree view around the vehicle. Using camera-LiDAR fusion perception blind spot modules, the Gen5 system covers the entire RoboTaxi body with zero blind spots.
[ AutoX ]
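Those sensor figures imply a serious data-ingestion problem, which a quick back-of-envelope calculation makes concrete. In this Python sketch, the pixel and point rates come from the AutoX numbers above, but the bytes-per-pixel and bytes-per-point figures are my own illustrative assumptions, not AutoX specs:

```python
# Rough sensor-data budget for a perception stack like AutoX's Gen5.
# Rates are from the AutoX figures; per-sample byte sizes are assumptions.

CAMERA_PIXELS_PER_SEC = 220e6   # 28 cameras, 220 Mpixels/s total (AutoX)
LIDAR_POINTS_PER_SEC = 15e6     # six LiDARs, 15 Mpoints/s total (AutoX)

BYTES_PER_PIXEL = 3             # assumption: 8-bit RGB
BYTES_PER_POINT = 16            # assumption: x, y, z, intensity as 32-bit floats

def raw_data_rate_mb_per_sec():
    """Estimated raw camera + LiDAR data rate, in megabytes per second."""
    camera_bytes = CAMERA_PIXELS_PER_SEC * BYTES_PER_PIXEL
    lidar_bytes = LIDAR_POINTS_PER_SEC * BYTES_PER_POINT
    return (camera_bytes + lidar_bytes) / 1e6

if __name__ == "__main__":
    print(f"~{raw_data_rate_mb_per_sec():.0f} MB/s of raw sensor data")
```

Even under these modest assumptions, the stack has to ingest on the order of a gigabyte of raw sensor data per second, which goes some way toward explaining that 2200-TOPS compute budget.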
Sometimes, robots do nice things for humans.

[ US Soccer ]
Body babbling? Body babbling.

[ CVUT ]
Thanks, Fan!
Matias from the Oxford Robotics Institute writes, “This is a demonstration of our safe visual teach and repeat navigation system running on the ANYmal robot in the Corsham mines/former Cold War bunker in the UK. This is part of some testing we've been doing for the DARPA SubT challenge as part of the Cerberus team.”

[ Oxford Robotics ]
Thanks, Matias!
We built a robotic chess player with a universal robot UR5e, a 2D camera, and a deep-learning neural network to illustrate what we do at the Mechatronics, Automation, and Control System Lab at the University of Washington.
[ MACS Lab ] via [ UW Engineering ]
Thanks, Sarah!
Autonomous inspection of powerlines with quadrotors is challenging. Flights require persistent perception to keep a close look at the lines. We propose a method that uses event cameras to robustly track powerlines. The performance is evaluated in real-world flights along a powerline. The tracker is able to persistently track the powerlines, with a mean line-tracking lifetime 10x longer than existing approaches.
[ ETHZ ]
I could totally do this, I just choose not to.

[ Flexiv ]
Thanks, Yunfan!
Drone Badminton enables people with low vision to play badminton again using a drone as a ball. This has the potential to diversify the physical activity for people with low vision.
[ Digital Nature Group ]
Even with the batteries installed, the Open Dynamic Robot Initiative's quadruped is still super skinny looking.

[ ODRI ]
At USC's Center for Advanced Manufacturing, we have developed a space for multidisciplinary human-robot interaction. The Baxter robot collaborates with the user to execute their own customizable tie-dye design.
[ USC Viterbi ]
I will never understand the impulse that marketing folks have to add bizarre motor noises to robot videos.

[ DeepRobotics ]
FedEx and Berkshire Grey have teamed up to streamline small package processing.
[ FedEx ]
An ABB robot analyzes COVID tests in a fully automated, unmanned cell, ferrying specimens back and forth between stations 24 hours a day. Test results for 96 specimens can be completed every 60 minutes, for a throughput of more than 1,800 specimens per day.
[ ABB ]
Thanks, Fan!
This is, and I quote, “the best and greatest robot death scene of all time.”

[ The Black Hole ]
Thanks, Mark!
Audrow Nash interviews Melonee Wise for the Sense Think Act podcast.

[ Sense Think Act ]
Tom Galluzzo interviews Andrew Thomaz for the Crazy Hard Robots podcast.

[ Crazy Hard Robots ]


#439604 Elephant Robotics Expands Lightweight ...

This article is sponsored by Elephant Robotics.

Elephant Robotics is well known for its line of innovative products that help enhance manufacturing, assembly, education, and more. In 2020, Elephant Robotics released the world's smallest 6-axis robot arm: myCobot. Since its release, myCobot has sold over 5,000 units to clients all over the world.

Following in the footsteps of myCobot and to meet demand from more users, Elephant Robotics is now expanding its Lightweight Robot Arm Product Line.

myCobot provides an answer for affordable commercial robot arms
The idea of a lightweight commercial robot arm has been around for a long time, but factories and assembly lines are still the most common settings for robot arms. A traditional robot arm is usually heavy, loud, and difficult to program. Most importantly, the price is too high, and the cost-recovery cycle becomes unacceptably long. These issues have kept robot arms out of commercial settings.

Elephant Robotics' myCobot series, for the first time, provides an answer for all these issues.

The myCobot series of lightweight 6-axis robots has a payload from 250 grams to 2 kilograms and a working range from 280 to 600 mm. The innovative all-in-one design from Elephant Robotics allows these robots to do away with the traditional control box, with all controllers and panels integrated into the base. myCobot series robots are all open source, support various ways of programming, and are easy for beginners to use and adapt to their needs.

• myCobot 280, the line's breakout product, is an open-source robot arm with a 250 g payload. It is an ideal platform for learning ROS, V-REP, myBlockly, MATLAB, CAN, and 485 bus-mastering control.

• myCobot 320 has a 1 kg payload and a continuous working time of 8 hours. myCobot 320 provides an unprecedented option for the service industry.

• myCobot Pro 600, the top-level product of the myCobot series, features a 600 mm arm reach and a 2 kg payload. It is equipped with three harmonic drives, used in a commercial robot of this class for the first time. myCobot Pro 600 expands the use of robot arms to medical, catering, manufacturing, and other industries that have not yet benefited from automation.

The myCobot series of robotic arms offers usability, safety, and low-noise operation. Compared to other options, it's a highly competitive choice for a wide range of automation applications. It allows quick deployment and enables human-robot collaboration, increasing efficiency for businesses as a cost-effective solution.
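To make the spread of the lineup concrete, here is a small Python sketch. The spec table is transcribed from the figures above (the 320 mm reach for the myCobot 320 is inferred from its model name, since the article only gives the series-wide 280-600 mm range), and the `pick_arm` helper is my own illustration, not part of any Elephant Robotics API:

```python
# Spec table for the myCobot line, transcribed from the article's figures.
# The myCobot 320's reach is inferred from its model name (assumption).
# pick_arm() is an illustrative helper, not an Elephant Robotics API.

MYCOBOT_LINE = [
    {"model": "myCobot 280",     "payload_kg": 0.25, "reach_mm": 280},
    {"model": "myCobot 320",     "payload_kg": 1.0,  "reach_mm": 320},
    {"model": "myCobot Pro 600", "payload_kg": 2.0,  "reach_mm": 600},
]

def pick_arm(payload_kg, reach_mm):
    """Return the smallest arm in the line meeting both requirements, or None."""
    for arm in MYCOBOT_LINE:  # ordered smallest to largest
        if arm["payload_kg"] >= payload_kg and arm["reach_mm"] >= reach_mm:
            return arm["model"]
    return None

if __name__ == "__main__":
    # A 500 g end-effector working at 300 mm rules out the 280.
    print(pick_arm(0.5, 300))
```

The same table-lookup pattern extends naturally if more models join the line: add a row, and the selection logic is unchanged.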

Traditional industry + robot arm?
The myCobot series can be used for commercial scenarios including production, manufacturing, and assembly. For some more creative ideas, check out the following videos:
to make coffee, to make matcha, to give a robot massage, or to help a photographer work.

myCobot Pro as a photographer's assistant.
Elephant Robotics

The myCobot series can also be used for scientific research, educational purposes, and medical purposes.

A couple of other unique examples include using it as a smart barista to expand a coffee business; to provide an excellent experience of robot massage; to help in a photographic studio for more accurate and stable precision work; to produce efficient line work and to help print out photos continuously for the perfect combination of artistic creation and robotics.
It can also work as an assistant in a workshop for human-robot collaboration and infinite creativity. Its all-in-one design also makes it a great fit for automated guided vehicle (AGV) solutions.

All of the products in the myCobot line are open source and work with Elephant Robotics' myStudio, a one-stop platform for all of the company's robots. The platform provides continuous firmware updates, video tutorials, and maintenance and repair information (e.g. tutorials, Q&A, etc.). Users can also buy several accessories targeted at robotic collaboration applications.

Open source robot arm
The myCobot product line offers various software interfaces and adapts to the majority of development platforms. It can be integrated with applications like the Robot Operating System (ROS) and MoveIt, and various APIs, including Python, C++, C#, Java, and Arduino. It also supports multiple ways of programming, including myBlockly and RoboFlow.

Elephant aims to provide the best development experience and lower development barriers, so that more users can get their hands on myCobots and create useful applications.

“With the new myCobot series products, we are happy to enable customers to create more efficiently on a larger scale than ever before,” said Elephant Robotics cofounder and CEO Joey Song. “We have helped customers from different industries achieve automation upgrades, like the Tumor Thermal Therapy Robot in medical use.”

“We are hoping to allow more people to use our latest robotic arm,” he added, “to create and enhance their businesses and maker work.”


#439568 Corvus Robotics’ Autonomous Drones ...

Warehouses offer all kinds of opportunities for robots. Semi-structured controlled environments, lots of repetitive tasks, and humans that would almost universally rather be somewhere else. Robots have been doing great at taking over jobs that involve moving stuff from one place to another, but there are all kinds of other things that have to happen to keep warehouses operating efficiently.

Corvus Robotics, a YC-backed startup that's just coming out of stealth, has decided that they want to go after warehouse inventory tracking. That is, making sure that a warehouse knows exactly what's inside of it and where. This is a more complicated task than it seems like it should be, and not just any robot is able to do it. Corvus' solution involves autonomous drones that can fly unattended for weeks on end, collecting inventory data without any human intervention at all.

Many warehouses have a dedicated team of humans whose job is to wander around the warehouse scanning stuff to maintain an up-to-date list of where everything is, a task which is both very important and very boring. As it turns out, autonomous drones can scan up to ten times faster than humans—Corvus Robotics' drones can inventory an entire warehouse on a rolling basis in just a couple of days, while it would take a human team weeks to do the same task.

Inventory is a significant opportunity for robotics, and we've seen a bunch of different attempts at doing inventory in places like supermarkets, but warehouses are different. Warehouses can be huge, in every dimension, meaning that the kinds of robots that can make supermarket inventory work just won't cut it in a warehouse environment for the simple reason that they can't see inventory stacked on shelves all the way to the ceiling, which can be over 20m high. And this is why the drone form factor, while novel, actually offers a uniquely useful solution.
It's probably fair to think of a warehouse as a semi-structured environment, with emphasis on the “semi.” At the beginning of a deployment, Corvus will generate one map of the operating area that includes both geometric and semantic information. After that, the drones will autonomously update that map with each flight throughout their entire lifetimes. There are walls and ceilings that don't move, along with large shelving units that are mostly stationary, but those things aren't going to do your localization system any favors since they all look the same. And the stuff that does offer some uniqueness, like the items on those shelves, is changing all the time. “That's a huge problem for us,” says Mohammed Kabir, Corvus Robotics' CTO. “Being able to do place recognition at the granularity that we need while everything is changing is really hard.” If you were looking closely at the video, you may have spotted some fiducials (optical patterns placed in the environment that vision systems find easy to spot), but we're told that the video was shot in Corvus Robotics' development warehouse where those markers are used for ground truth testing.
In real deployments, fiducials (or anything else external) aren't necessary. The drone has its charging dock, and the initial map, but otherwise it's doing onboard visual-inertial SLAM (simultaneous localization and mapping), dense volumetric mapping, and motion planning with its 10-camera array and an autonomy stack running on ROS and PX4 for real-time flight control. Corvus isn't willing to let us in on all of their secrets, but they did tell us that they incorporate some of the structured components of the environment into their SLAM solution, as well as some things that are semi-static—that is, things that are unlikely to change over the duration of a single flight, helping the drone with loop closure.
One of the big parts of being able to do this is the ability to localize in very large, unstructured environments where things are constantly changing without having to rely on external infrastructure. For example, a WiFi connection back to our base station is not guaranteed, so everything needs to run on-board the drone, which is a non-trivial task. It's essentially all of the compute of a self-driving car, compressed into the drone. —Mohammed Kabir

Corvus is able to scan between 200 and 400 pallet positions per hour per drone, inclusive of recharge time. At ground level, this is probably about equivalent in speed to a human (although more sustainable). But as you start looking at inventory higher off the ground, the drone maintains a constant scan rate, while for a human, it gets exponentially harder, involving things like strapping yourself to a forklift. And of course the majority of the items in a high warehouse are not at ground level, because ground level only covers a tier or two of a space that may soar to 20 meters. Overall, Corvus says that they can do inventory up to 10x faster than a human.
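Those scan rates are easy to sanity-check against the "couple of days versus weeks" claim earlier in the piece. In this Python sketch, the drone rate is the midpoint of Corvus's stated 200-400 pallet positions per hour; the warehouse size, fleet sizes, and human rate are my own illustrative assumptions:

```python
# Sanity check of the inventory-scan claims. The drone rate is from
# Corvus's stated figures; everything else here is an assumption.

PALLET_POSITIONS = 100_000      # assumption: a mid-size high-bay warehouse
DRONE_RATE_PER_HR = 300         # midpoint of 200-400, inclusive of recharge time
HUMAN_RATE_PER_HR = 30          # assumption: roughly 10x slower, per the article

def days_to_scan(rate_per_hr, positions, hours_per_day):
    """Days needed to scan `positions` at `rate_per_hr`, working `hours_per_day`."""
    return positions / rate_per_hr / hours_per_day

if __name__ == "__main__":
    # Five drones flying unattended around the clock vs. five humans on one shift.
    print(f"5 drones: {days_to_scan(5 * DRONE_RATE_PER_HR, PALLET_POSITIONS, 24):.1f} days")
    print(f"5 humans: {days_to_scan(5 * HUMAN_RATE_PER_HR, PALLET_POSITIONS, 8):.1f} days")
```

Under these assumptions a small drone fleet finishes a rolling scan in under three days, consistent with the article's claim, while a comparable human team would need weeks; the gap widens further once high shelving slows the humans down but not the drones.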
With a few exceptions, it's unlikely that most warehouses are going to be able to go human-free in the foreseeable future, meaning that any time you talk about robot autonomy, you also have to talk about safety. “We can operate when no one's around, so our customers often schedule the drones during the third shift when the warehouse is dark,” says Mohammed Kabir. “There are also customers who want us to operate around people, which initially terrified us, because interacting with humans can be quite tricky. But over the last couple years, we've built safety systems to be able to deal with that.” In addition to the collision avoidance that comes with the 360 degree vision system that the drone uses to navigate, it has a variety of safety-first behaviors all the way up to searching for clear flat spots to land in the event of an emergency. But it sounds like the primary way that Corvus tries to maintain safety is by keeping drones and humans as separate as possible, which may involve process changes for the warehouse, explains Corvus Robotics CEO Jackie Wu. “If you see a drone in an aisle, just don't go in until it's done.”
We also asked Wu about what exactly he means when he calls the Corvus Robotics' drone “fully autonomous,” because depending on who you ask (and what kind of robot and task you're talking about), full autonomy can mean a lot of different things.
For us, full autonomy means continuous end to end operation with no human in the loop within a certain scenario or environment. Obviously, it's not level five autonomy, because nobody is doing level five, which would take some kind of generalized intelligence that can fly anywhere. But, for level four, for the warehouse interior, the drones fly on scheduled missions, intelligently find objects of interest while avoiding collisions, come back to land, recharge and share that data, all without anybody touching them. And we're able to do this repeatedly, without external localization infrastructure. —Jackie Wu

As tempting as it is, we're not going to get into the weeds here about what exactly constitutes “full autonomy” in the context of drones. Well, okay, maybe we'll get into the weeds a little bit, just to say that being able to repeatedly do a useful task end-to-end without a human in the loop seems close enough to whatever your definition of full autonomy is that it's probably a fair term to apply here. Are there other drones that are arguably more autonomous, in the sense that they require even less structure in the environment? Sure. Are those same drones arguably less autonomous because they don't autonomously recharge? Probably. Corvus Robotics' perspective that the ability to run a drone autonomously for weeks at a time is a more important component of autonomy is perfectly valid considering their use case, but I think we're at the point where “full autonomy” at this level is becoming domain-specific enough to make direct comparisons difficult and maybe not all that useful.
Corvus has just recently come out of stealth, and they're currently working on pilot projects with a handful of Global 2000 companies.


#439487 SoftBank Stops Making Pepper Robots, ...

Reuters is reporting that SoftBank stopped manufacturing Pepper robots at some point last year due to low demand, and by September, will cut about half of the 330 positions at SoftBank Robotics Europe in France. Most of the positions will be in Q&A, sales, and service, which hopefully leaves SoftBank Robotics’ research and development group mostly intact. But the cuts reflect poor long-term sales, with SoftBank Robotics Europe having lost over 100 million Euros in the past three years, according to French business news site JDN. Speaking with Nikkei, SoftBank said that this doesn’t actually mean a permanent end for Pepper, and that they “plan to resume production if demand recovers.” But things aren’t looking good.

Reuters says that “only” 27,000 Peppers were produced, but that sure seems like a lot of Peppers to me. Perhaps too many—a huge number of Peppers were used by SoftBank itself in its retail stores, and a hundred at once were turned into a cheerleading squad for the SoftBank Hawks baseball team because of the pandemic. There’s nothing wrong with either of those things, but it’s hard to use them to gauge how successful Pepper has actually been.

I won’t try to argue that Pepper would necessarily have been commercially viable in the long(er) term, since it’s a very capable robot in some ways, but not very capable in others. For example, Pepper has arms and hands with individually articulated fingers, but the robot can’t actually do much in the way of useful grasping or manipulation. SoftBank positioned Pepper as a robot that can attract attention and provide useful, socially interactive information in public places. Besides SoftBank’s own stores, Peppers have been used in banks, malls, airports, and other places of that nature. A lot of what Pepper seems to have uniquely offered was novelty, though, which ultimately may not be sustainable for a commercial robot, because at some point, the novelty just wears off and you’re basically left with a very cool looking (but expensive) kiosk.

Having said all that, the sheer number of Peppers that SoftBank put out in the world could be one of the most significant impacts that the robot has had. The fact that Pepper was able to successfully operate for long enough, and in enough places, that it even had a chance to stop becoming novel and instead become normal is an enormous achievement for Pepper specifically as well as for social robots more broadly. Angelica Lim, who worked with Pepper at SoftBank Robotics Europe for three years before founding the Rosie Lab at SFU, shared some perspective with us on this:

There has never been a robot with the ambition of Pepper. Its mission was huge—be adaptable and robust to different purposes and locations: loud sushi shops, quiet banks, and hospitals that change from hour to hour. Compare that with Alexa which has a pretty stable and quiet environment—the home. On top of that, the robot needed to respond to different ages, cultures, countries and languages. The only thing I can think of that comes close is the smartphone, and the expectation for it is much lower compared to the humanoid Pepper. Ten years ago, it was unthinkable that we could leave a robot on “in the wild” for days, weeks, months and years, and yet Pepper did it thanks to the team at SoftBank Robotics.

Peppers are still being used in education today, from elementary schools and high schools to research labs in North America, Asia and Europe. The next generation will grow up programming these, like they did with the Apple personal computer. I’m confident it’s just the next step to technology that adapts to us as humans rather than the other way around.

Pepper has been an amazing platform for HRI research as well as for STEM education more broadly, and our hope is that Pepper will continue to be impactful in those ways, whether or not any more of these robots are ever made. We also hope that SoftBank does whatever is necessary to make sure that Peppers remain useful and accessible well into the future in both software and hardware. But perhaps we’re being too pessimistic here—this is certainly not good news, but despite how it looks we don’t know for sure that it’s catastrophic for Pepper. All we can do is wait and see what happens at SoftBank Robotics Europe over the next six months, and hope that Pepper continues to get the support that it deserves.
