Category Archives: Human Robots

Everything about Humanoid Robots and Androids

#441073 Understanding the Cell: The Elementary ...

In his latest book, the oncologist and acclaimed writer Siddhartha Mukherjee focuses his narrative microscope on the cell, the elementary building block from which complex systems and life itself emerge. It is the coordination of cells that allows hearts to beat, the specialization of cells that creates robust immune systems, and the firing of cells that forms thoughts. “We need to understand cells to understand the human body,” Mukherjee writes. “We need them to understand medicine. But most essentially, we need the story of the cell to tell the story of life and of our selves.”

His account, The Song of the Cell, reads at times like an artfully written biology textbook and at times like a philosophical tract. Mukherjee starts with the invention of the microscope and the historical origins of cell biology, from which he dives into cellular anatomy. He examines the dangers of foreign cells like bacteria, and of our own cells when they misbehave, are hijacked, or fail. He then moves into more complex cellular systems: blood and the immune system, organs, and the communication between cells. “The human body functions as a citizenship of cooperating cells,” he writes. “The disintegration of this citizenship tips us from wellness into disease.”

At each step, he’s careful to draw a clear line from the discovery of cellular functions to the therapeutic potential they hold. “A hip fracture, cardiac arrest, immunodeficiency, Alzheimer’s dementia, AIDS, pneumonia, lung cancer, kidney failure, arthritis—all could be reconceived as the results of cells, or systems of cells, functioning abnormally,” Mukherjee writes. “And all could be perceived as loci of cellular therapies.”

Understanding how electrical currents affect neurons, for example, led to experiments using deep brain stimulation to treat mood disorders. And T-cells, the “door-to-door wanderers” that travel through the body and hunt for pathogens, are being trained to fight cancer as doctors better understand how these wanderers discriminate between foreign cells and the “self.”

Mukherjee, who won a Pulitzer Prize for his 2010 book The Emperor of All Maladies, is an engaging writer. He skillfully picks out the human characters and the idiosyncratic historical details that will grab readers and hold them through the drier technical sections. Take, for instance, his long discourse on the amateur and academic scientists who toyed with early microscopes. Among descriptions of lenses and petty academic fights (some things, it seems, are eternal), Mukherjee adds the delectably lewd anecdote that in the 17th century, the Dutch trader and microscope enthusiast Antonie van Leeuwenhoek trained his scopes on, among other things, his own semen and the semen of someone infected with gonorrhea. In those samples, Leeuwenhoek saw what he called “a genital animalcule,” and what we now call spermatozoa, “moving like a snake or an eel swimming in water.”

Just as Mukherjee draws clear connections between scientific discoveries and potential therapeutics, he also excels at showing the high stakes of these treatments by drawing on case studies and vivid examples from patients he’s seen over the course of his career. There is Sam P., who jokes that his fast-moving cancer will spread by the time he walks to the bathroom; and M.K., a young man ravaged by a mysterious immune disorder, whose father trekked through the snow to Boston’s North End to buy his son’s favorite meatballs and ferry them to the hospital.

And there is Emily Whitehead, who, as a child, suffered from leukemia and whose cells are stored inside a freezer named after “The Simpsons” character Krusty the Clown. Some cells were genetically modified to recognize and fight off Whitehead’s disease. The success of that therapy, called CAR-T, heralded a change in cancer treatments and Whitehead became the miraculously healthy result of centuries of scientific inquiry. “She embodied our desire to get to the luminous heart of the cell, to understand its endlessly captivating mysteries,” Mukherjee writes. “And she embodied our aching aspiration to witness the birth of a new kind of medicine—cellular therapies—based on our deciphering the physiology of cells.”

As if forays into oncology, immunology, pathology, the history of science, and neurobiology weren’t enough, Mukherjee also gets to really big questions about the ethics of cellular therapies, the meaning of disability, perfectionism, and acceptance in a world where all physical features might be altered—and even the nature of life itself. “A cell is the unit of life,” he writes. “But that begs a deeper question: What is ‘life?’”

In some ways, the cell is the perfect vessel in which to travel down these many winding, diverging, and intersecting paths. Cells are the site of some incredible stories of research, discovery, and promise, and Mukherjee gives himself ample room to investigate a diverse array of biological processes and interventions. But in trying to encompass everything that cells can be and do—both metaphorically and literally—Mukherjee ends up failing to fully explore these deep questions in a satisfying way.

It doesn’t help that he leans so heavily on metaphor. The cell is a “decoding machine,” a “dividing machine,” and an “unfamiliar spacecraft.” He likens cells to “Lego blocks,” “corporals,” “actors, players, doers, workers, builders, creators.” T-cells alone are described as both a “gumshoe detective” and a “rioting crowd disgorging inflammatory pamphlets on a rampage.” Not to mention the many cell metaphors Mukherjee quotes from others. Creating imagery readers can understand is an invaluable part of any science writer’s playbook, but so many images can also be distracting at times.

The final section grapples with the implications of enhanced humans who benefit from cellular tinkering. These “new humans” are not cyborgs or people augmented with superpowers, Mukherjee clarifies. When introducing the idea at the outset of the book, he writes, “I mean a human rebuilt anew with modified cells who looks and feels (mostly) like you and me.” But by engineering stem cells so that a person with diabetes can produce their own insulin or implanting an electrode in the brain of someone suffering from depression, Mukherjee posits that we’ve changed them in some fundamental way. Humans are a sum of their parts, he writes, but cell therapies cross a border, transforming people into a “new sum of new parts.”

This section echoes a famous philosophical thought experiment about the Ship of Theseus. Theseus left Athens in a wooden ship that, over the course of a long journey, had to be repaired. Sailors removed rotting wood and replaced the broken oars. By the time the ship returned, none of the original wood remained. Philosophers have debated the nature of the ship for centuries: Is the repaired ship the same as the one that left Athens or is it a new ship altogether?

The same question might be asked of Mukherjee’s “new humans.” How many cells must be altered in order to render us new? Do certain cells matter more than others? Or do humans possess some kind of inherent integrity—a conscience, a soul—that affects these calculations?

Mukherjee never fully arrives at an answer, but his book’s title may allude to one, recalling Walt Whitman’s Song of Myself, an ode to the interconnectedness of beings. Mukherjee urges scientists to abandon the “atomism” of examining only isolated units—be they atoms, genes, cells—in favor of a comprehensive approach that appreciates the whole of a system, or of a being. “Multicellularity evolved, again and again, because cells, while retaining their boundaries, found multiple benefits in citizenship,” he writes. “Perhaps we, too, should begin to move from the one to the many.”

This article was originally published on Undark. Read the original article.

Image Credit: Torsten Wittmann, University of California, San Francisco via NIH on Flickr


#441070 Video Friday: Turkey Sandwich

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

CoRL 2022: 14–18 December 2022, AUCKLAND, NEW ZEALAND

Enjoy today's videos!
Happy Thanksgiving, for those who celebrate it. Now spend 10 minutes watching a telepresence robot assemble a turkey sandwich.

[ Sanctuary ]
Ayato Kanada, an assistant professor at Kyushu University in Japan, wrote in to share “the world's simplest omnidirectional mobile robot.”

We propose a palm-sized omnidirectional mobile robot with two torus wheels. A single torus wheel is made of an elastic elongated coil spring in which the two ends of the coil are connected to each other, and is driven by a piezoelectric actuator (stator) that can generate 2-degree-of-freedom (axial and angular) motions. The stator converts its thrust force and torque into longitudinal and meridian motions of the torus wheel, respectively, making the torus work as an omnidirectional wheel on a plane.

[ Paper ]
Thanks, Ayato!
This work, entitled “Virtually turning robotic manipulators into worn devices: opening new horizons for wearable assistive robotics,” proposes a novel hybrid system using a virtually worn robotic arm in augmented reality and a real robotic manipulator servoed to that virtual representation. We basically aim at bringing an illusion of wearing a robotic system while its weight is fully deported. We believe that this approach could offer a solution to the critical challenge of weight and discomfort caused by robotic sensorimotor extensions (such as supernumerary robotic limbs (SRLs), prostheses, or handheld tools), and open new horizons for the development of wearable robotics.
[ Paper ]
Thanks, Nathanaël!
Engineers at Georgia Tech are the first to study the mechanics of springtails, which leap in the water to avoid predators. The researchers learned how the tiny hexapods control their jump, self-right in midair, and land on their feet in the blink of an eye. The team used the findings to build penny-sized jumping robots.
[ Georgia Tech ]
Thanks, Jason!
The European Space Agency (ESA) and the European Space Resources Innovation Centre (ESRIC) have asked European space industries and research institutions to develop innovative technologies for the exploration of resources on the Moon in the framework of the ESA-ESRIC Space Resources Challenge. As part of the challenge, teams of engineers have developed vehicles capable of prospecting for resources in a test-bed simulating the Moon's shaded polar regions. From 5 to 9 September 2022, the final of the ESA-ESRIC Space Resource Challenge took place at the Rockhal in Esch-sur-Alzette. On this occasion, lunar rover prototypes competed on a 1,800 m² 'lunar' terrain. The winning team will have the opportunity to have their technology implemented on the Moon.
[ ESA ]
Thanks, Arne!
If only cobots were as easy to use as this video from Kuka makes it seem.

The Kuka website doesn't say how much this thing costs, which means it's almost certainly not something that you impulse buy.
[ Kuka ]
We present the tensegrity aerial vehicle, a design of collision-resilient rotor robots with icosahedron tensegrity structures. With collision resilience and re-orientation ability, the tensegrity aerial vehicles can operate in cluttered environments without complex collision-avoidance strategies. These capabilities are validated by a test of an experimental tensegrity aerial vehicle operating with only onboard inertial sensors in a previously-unknown forest.
[ HiPeR Lab ]
The robotics research group Brubotics and the polymer science and physical chemistry group FYSC at the University of Brussels have together developed self-healing materials that can be scratched, punctured, or completely cut through and then heal back together, either with applied heat or even at room temperature.
[ Brubotics ]
Apparently, the World Cup needs more drone footage, because this is kinda neat.

[ DJI ]
Researchers at MIT's Center for Bits and Atoms have made significant progress toward creating robots that could build nearly anything, including things much larger than themselves, from vehicles to buildings to larger robots.
[ MIT ]
Researchers from North Carolina State University have recently developed a fast and efficient soft robotic swimmer whose motion resembles a human swimmer’s butterfly stroke. It achieves a high average swimming speed of 3.74 body lengths per second, nearly five times faster than the fastest comparable soft swimmers, as well as high power efficiency with a low cost of energy.
[ NC State ]
To facilitate sensing and physical interaction in remote and/or constrained environments, high-extension, lightweight robot manipulators are easier to transport and reach substantially further than traditional serial chain manipulators. We propose a novel planar 3-degree-of-freedom manipulator that achieves low weight and high extension through the use of a pair of spooling bistable tapes, commonly used in self-retracting tape measures, which are pinched together to form a reconfigurable revolute joint.
[ Charm Lab ]
SLURP!

[ River Lab ]
This video may encourage you to buy a drone. Or a snowmobile.

[ Skydio ]
Moxie is getting an update for the holidays!

[ Embodied ]
Robotics professor Henny Admoni answers the internet's burning questions about robots! How do you program a personality? Can robots pick up a single M&M? Why do we keep making humanoid robots? What is Elon Musk's goal for the Tesla Optimus robot? Will robots take over my job writing video descriptions…I mean, um, all our jobs? Henny answers all these questions and much more.
[ CMU ]
This GRASP on Robotics talk is from Julie Adams at Oregon State University, on “Towards Adaptive Human-Robot Teams: Workload Estimation.”

The ability for robots, be it a single robot, multiple robots or a robot swarm, to adapt to the humans with which they are teamed requires algorithms that allow robots to detect human performance in real time. The multi-dimensional workload algorithm incorporates physiological metrics to estimate overall workload and its components (i.e., cognitive, speech, auditory, visual and physical). The algorithm is sensitive to changes in a human’s individual workload components and overall workload across domains, human-robot teaming relationships (i.e., supervisory, peer-based), and individual differences. The algorithm has also been demonstrated to detect shifts in workload in real time in order to adapt the robot’s interaction with the human and autonomously change task responsibilities when the human is over- or underloaded. Recently, the algorithm was used to post-hoc analyze the resulting workload for a single human deploying a heterogeneous robot swarm in an urban environment. Current efforts are focusing on predicting the human’s future workload, recognizing the human’s current tasks, and estimating workload for previously unseen tasks.

[ UPenn ]


#441067 Golf Robot Learns To Putt Like A Pro

While being able to drive the ball 300 yards might get the fans excited, a solid putting game is often what separates a golf champion from the journeymen. A robot built by German researchers is quickly becoming a master of this short game using a clever combination of classical control engineering and machine learning.
In golf tournaments, players often scout out the greens the day beforehand to think through how they are going to play their shots, says Annika Junker, a doctoral student at Paderborn University in Germany. So she and her colleagues decided to see if giving a robot similar capabilities could help it to sink a putt from anywhere on the green, without assistance from a human.
Golfi, as the team has dubbed their creation, uses a 3D camera to take a snapshot of the green, which it then feeds into a physics-based model to simulate thousands of random shots from different positions. These are used to train a neural network that can then predict exactly how hard and in what direction to hit a ball to get it in the hole, from anywhere on the green.
Like even the best pros, it doesn’t get a hole in one every time. The goal isn’t really to build a tournament-winning golf robot though, says Junker, but to demonstrate the power of hybrid approaches to robotic control. “We try to combine data-driven and physics-based methods and we searched for a nice example, which everyone can easily understand,” she says. “It’s only a toy for us, but we hope to see some advantages of our approach for industrial applications.”
So far, the researchers have only tested their approach on a small mock-up green inside their lab. The robot, which is described in a paper due to be presented at the IEEE International Conference on Robotic Computing in Italy next month, navigates its way around the two-meter-square space on four wheels, two of which are powered. Once in position, it uses a belt-driven gear shaft with a putter attached to the end to strike the ball towards the hole.
First though, it needs to work out what shot to play given the position of the ball. The researchers begin by using a Microsoft Kinect 3D camera mounted on the ceiling to capture a depth map of the green. This data is then fed into a physics-based model, alongside other parameters like the rolling resistance of the turf, the weight of the ball and its starting velocity, to simulate three thousand random shots from various starting points.

This data is used to train a neural network that can predict how hard and in what direction to hit the ball to get it in the hole from anywhere on the green. While it’s possible to solve this problem by combining the physics-based model with classical optimization, says Junker, doing so is far more computationally expensive. And training the robot on simulated golf shots takes just five minutes, compared with around 30 to 40 hours if they collected data on real-world strokes, she adds.
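The pipeline described above can be sketched compactly: simulate many random putts with a simple physics model, then fit a network that maps ball and hole positions to stroke parameters. The snippet below is a minimal illustration under assumed simplifications (a flat green, constant rolling friction, a small scikit-learn regressor); it is not the Paderborn team’s code, and the real system also feeds the Kinect’s measured height map of the green into its physics model.

```python
# Minimal sketch of a simulate-then-learn putting pipeline.
# Assumptions (not from the paper): flat green, constant rolling friction,
# and an inverse-model network rather than the authors' exact architecture.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
MU_G = 0.65   # assumed rolling deceleration on turf [m/s^2]
GREEN = 2.0   # 2 m x 2 m mock-up green, as in the paper

def simulate_putt(start, speed, heading):
    """Return where a ball struck at `speed` along `heading` rolls to a stop."""
    distance = speed**2 / (2 * MU_G)  # constant-deceleration roll-out
    return start + distance * np.array([np.cos(heading), np.sin(heading)])

# 1) Generate a few thousand random simulated shots.
starts = rng.uniform(0.0, GREEN, size=(3000, 2))
speeds = rng.uniform(0.1, 2.0, size=3000)
headings = rng.uniform(-np.pi, np.pi, size=3000)
stops = np.array([simulate_putt(s, v, h)
                  for s, v, h in zip(starts, speeds, headings)])

# 2) Learn the inverse model: (start, stop) -> (speed, heading).
#    A fuller implementation would encode heading as sin/cos to avoid the
#    wrap-around at +/- pi; this is kept simple on purpose.
X = np.hstack([starts, stops])
y = np.column_stack([speeds, headings])
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000).fit(X, y)

# 3) Query the network: how should the robot putt from this lie to the hole?
ball, hole = np.array([0.3, 1.7]), np.array([1.5, 0.5])
speed, heading = net.predict(np.hstack([ball, hole]).reshape(1, -1))[0]
print(f"strike at {speed:.2f} m/s, heading {np.degrees(heading):.1f} deg")
```

Replacing the flat-green roll-out with a slope-aware simulation driven by the depth map is what separates a toy like this from the published system.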
Before it can make its shot, though, the robot first has to line its putter up with the ball just right, which requires working out where on the green both it and the ball are. To do so, it uses a neural network that has been trained to spot golf balls and a hard-coded object-detection algorithm that picks out colored dots on the top of the robot to work out its orientation. This positioning data is then combined with a physical model of the robot and fed into an optimization algorithm that works out how to control its wheel motors to navigate to the ball.
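For the navigation step, a textbook go-to-goal controller for a differential-drive base gives a flavor of what such an optimization has to produce: Golfi has two powered wheels, so its commands ultimately reduce to left and right wheel speeds. The gains and wheel base below are placeholder assumptions, and the paper uses its own physical model and optimization routine rather than this simple proportional scheme.

```python
import numpy as np

def wheel_speeds(pose, target, k_lin=0.5, k_ang=2.0, wheel_base=0.3):
    """Proportional go-to-goal controller for a differential-drive base.

    pose   -- (x, y, theta): position from the overhead camera plus the
              orientation recovered from the colored dots on the robot
    target -- (x, y): a point beside the ball where the putter should line up
    Returns (left, right) wheel speeds; gains and wheel base are assumptions.
    """
    dx, dy = target[0] - pose[0], target[1] - pose[1]
    distance = np.hypot(dx, dy)
    heading_error = np.arctan2(dy, dx) - pose[2]
    # Wrap the error to [-pi, pi] so the robot turns the short way around.
    heading_error = np.arctan2(np.sin(heading_error), np.cos(heading_error))

    v = k_lin * distance        # drive forward in proportion to distance
    w = k_ang * heading_error   # turn in proportion to heading error
    return v - w * wheel_base / 2.0, v + w * wheel_base / 2.0

# Example: robot at (0.2, 0.2) facing +x, staging point at (1.0, 1.5).
print(wheel_speeds((0.2, 0.2, 0.0), (1.0, 1.5)))
```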
Junker admits that the approach isn’t flawless. The current setup relies on a bird’s-eye view, which would be hard to replicate on a real golf course, and switching to cameras on the robot would present major challenges, she says. The researchers also didn’t report how often Golfi successfully sinks the putt in their paper, because the figures were thrown off by the fact that it occasionally drove over the ball, knocking it out of position. When that didn’t happen, though, Junker says it was successful six or seven times out of ten, and since they submitted the paper a colleague has reworked the navigation system to avoid the ball.
Golfi isn’t the first machine to try its hand at the sport. In 2016, a robot called LDRIC hit a hole-in-one at Arizona’s TPC Scottsdale course, and several devices have been built to test out golf clubs. But Noel Rousseau, a golf coach with a PhD in motor learning, says that typically they require an operator painstakingly setting them up for each shot, and any adjustments take considerable time. “The most impressive part to me is that the golf robot is able to find the ball, sight the hole and move itself into position for an accurate stroke,” he says.
Beyond mastering putting, the hope is that the underlying techniques the researchers have developed could translate to other robotics problems, says Niklas Fittkau, a doctoral student at Paderborn University and co-lead author of the paper. “You can also transfer that to other problems, where you have some knowledge about the system and could model parts of it to obtain some data, but you can’t model everything,” he says.


#441063 Robot Gift Guide 2022

It’s been a couple of years, but the IEEE Spectrum Robot Gift Guide is back for 2022! We’ve got all kinds of new robots, and right now is an excellent time to buy one (or a dozen), since many of them are on sale this week. We’ve tried to focus on consumer robots that are actually available (or that you can at least order), but depending on when you’re reading this guide, the prices we have here may not be up to date, and we’re not taking shipping into account.

And if these robots aren’t enough for you, many of our picks from years past are still available: check out our guides from 2019, 2018, 2017, 2016, 2015, 2014, 2013, and 2012. And as always, if you have suggestions that you’d like to share, post a comment to help the rest of us find the perfect robot gift.

Lego Robotics Kits

Lego has decided to discontinue its classic Mindstorms robotics kits, but they’ll be supported for another couple of years and this is your last chance to buy one. If you like Lego’s approach to robotics education but don’t want to invest in a system at the end of its life, Lego also makes an education kit called Spike, aimed at students in grades 6 to 8, that shares many of the same hardware and software features.

$360–$385
Lego
Sphero Indi

Indi is a clever educational robot designed to teach problem solving and screenless coding to kids as young as 4, using a small wheeled robot with a color sensor and a system of colored strips that command the robot to do different behaviors. There’s also an app to access more options, and Sphero has more robots to choose from once your kid is ready for something more.

$110
Sphero | Amazon
Nybble and Bittle

Petoi’s quadrupedal robot kits are an adorable (and relatively affordable) way to get started with legged robotics. Whether you go with Nybble the cat or Bittle the dog, you get to do some easy hardware assembly and then leverage a bunch of friendly software tools to get your little legged friend walking around and doing tricks.

$220–$260
Petoi
iRobot Root

Root educational robots have a long and noble history, and iRobot has built on that to create an inexpensive platform to help kids learn to code starting as young as age 4. There are two different versions of Root; the more expensive one includes an RGB sensor, a programmable eraser, and the ability to stick to vertical whiteboards and move around on them.

$100–$250
iRobot

TurtleBot 4

The latest generation of TurtleBot from Clearpath, iRobot, and Open Robotics is a powerful and versatile ROS (Robot Operating System) platform for research and product development. For aspiring roboticists in undergrad and possibly high school, the TurtleBot 4 is just about as good as it gets unless you want to spend an order of magnitude more. And the fact that TurtleBots are used so extensively means that if you need some help, the ROS community will (hopefully) have your back.

$1,200–$1,900
RoboShop
iRobot Create 3

Newly updated just last year, iRobot's Create 3 is the perfect platform for folks who want to build their own robot, but not all of their own robot. The rugged mobile base is essentially a Roomba without the cleaning parts, and it's easy to add your own hardware on top. It runs ROS 2, but you can get started with Python.

$300
iRobot
Mini Pupper

Mini Pupper is one of the cutest ways of getting started with ROS. This legged robot is open source, and runs ROS on a Raspberry Pi, which makes it extra affordable if you have your own board lying around. Even if you don’t, though, the Mini Pupper kit is super affordable for what you get, and is a fun hardware project if you decide to save a little extra cash by assembling it yourself.

$400–$585
MangDang
Luxonis Rae

I’m not sure whether the world is ready for ROS 2 yet, but you can get there with Rae, which combines a pocket-size mobile robot with a pair of depth cameras and an onboard computer at a shockingly low price. App support means that Rae can do cool stuff out of the box, but it’s easy to get more in-depth with it too. Rae will get delivered early next year, but it’s cool enough that we think a Kickstarter IOU is a perfectly acceptable gift.

$400
Kickstarter

Roomba Combo j7+

iRobot’s brand new top-of-the-line fully autonomous vacuuming and wet-mopping combo j7+ Roomba will get your floors clean and shiny, except for carpet, which it’s smart enough to not try to shine because it’ll cleverly lift the wet mop up out of the way. It’s also cloud connected and empties itself. You’ll have to put water in it if you want it to mop, but that’s way better than mopping yourself.

$900
iRobot
Neato D9

Neato’s robots might not be quite as pervasive as the Roomba, but they’re excellent vacuums, and they use a planar lidar system for obstacle avoidance and map making. The nice thing about lidar (besides the fact that it works in total darkness) is that Neato robots have no cameras at all and are physically incapable of collecting imagery of you or your home.

$300
Neato Robotics
Tertill

How often do you find an affordable, useful, reliable, durable, fully autonomous home robot? Not often! But Tertill is all of these things: powered entirely by the sun, it slowly prowls around your garden, whacking weeds as they sprout while avoiding your mature plants. All you have to do is make sure it can’t escape, then just let it loose and forget about it for months at a time.

$200
Tertill

Amazon Astro

If you like the idea of having a semi-autonomous mobile robot with a direct link to Amazon wandering around your house trying to be useful, then Amazon’s Astro might not sound like a terrible idea. You’ll have to apply for one, and it sounds like it’s more like a beta program, but could be fun, I guess?

$1,000
Amazon
Skydio 2+

The Skydio 2+ is an incremental (but significant) update to the Skydio 2 drone, with its magically cutting-edge obstacle avoidance and extremely impressive tracking skills. There are many drones out there that are cheaper and more portable, and if flying is your thing, get one of those. But if filming is your thing, the Skydio 2+ is the drone you want to fly.

$900
Skydio
DJI FPV

We had a blast flying DJI’s FPV drone. The VR system is exhilarating and the drone is easy to fly even for FPV beginners, but it’s powerful enough to grow along with your piloting skills. Just don’t get cocky, or you’ll crash it. Don’t ask me how I know this.

$900
DJI

ElliQ

ElliQ is an embodied voice assistant that is a lot more practical than a smart speaker. It's designed for older adults who may spend a lot of time alone at home, and can help with a bunch of things, including health and wellness tasks and communicating with friends and family. ElliQ costs $250 up front, plus a subscription of between $30 and $40 per month.

$250+
ElliQ
Moxie

Not all robots for kids are designed to teach them to code: Moxie helps to “support social-emotional development in kids through play.” The carefully designed and curated interaction between Moxie and children helps them to communicate and build social skills in a friendly and engaging way. Note that Moxie also requires a subscription fee of $40 per month.

$800
Embodied
Petit Qoobo

What is Qoobo? It is “a tailed cushion that heals your heart,” according to the folks that make it. According to us, it’s a furry round pillow that responds to your touch by moving its tail, sort of like a single-purpose cat. It’s fuzzy tail therapy!

$130
Qoobo | Amazon
Unitree Go1

Before you decide on a real dog, consider the Unitree Go1 instead. Sure it’s expensive, but you know what? So are real dogs. And unlike with a real dog, you only have to walk the Go1 when you feel like it, and you can turn it off and stash it in a closet or under a bed whenever you like. For a fully featured dynamic legged robot, it’s staggeringly cheap, just keep in mind that shipping is $1,000.

$2,700
Unitree


#441061 Self-organization: What robotics can ...

Amoebae are single-cell organisms. By means of self-organization, they can form complex structures—and do this purely through local interactions: If they have a lot of food, they disperse evenly through a culture medium. But if food becomes scarce, they emit the messenger known as cyclic adenosine monophosphate (cAMP). This chemical signal induces amoebae to gather in one place and form a multicellular aggregation. The result is a fruiting body.
