Tag Archives: way
#430579 What These Lifelike Androids Can Teach ...
For Dr. Hiroshi Ishiguro, one of the most interesting things about androids is the changing questions they pose to us, their creators, as they evolve. Does it, for example, change what it means to be human if a human-made creation starts telling you about what kind of boys ‘she’ likes?
If you want to know the answer to the boys question, you need to ask ERICA, one of Dr. Ishiguro’s advanced androids. Beneath her plastic skull and silicone skin, wires connect to AI software systems that bring her to life. Her ability to respond goes far beyond standard inquiries. Spend a little time with her, and the feeling of a distinct personality starts to emerge. From time to time, she works as a receptionist at Dr. Ishiguro and his team’s Osaka University labs. One of her android sisters is an actor who has starred in plays and a film.
ERICA’s ‘brother’ is an android version of Dr. Ishiguro himself, which has represented its creator at various events while the biological Ishiguro can remain in his offices in Japan. Microphones and cameras capture Ishiguro’s voice and face movements, which are relayed to the android. Apart from mimicking its creator, the Geminoid™ android is also capable of lifelike blinking, fidgeting, and breathing movements.
Say hello to relaxation
As technological development continues to accelerate, so do the possibilities for androids. From a position as receptionist, ERICA may well branch out into many other professions in the coming years. Companion for the elderly, comic book storyteller (an ancient profession in Japan), pop star, conversational foreign language partner, and newscaster are some of the roles and responsibilities Dr. Ishiguro sees androids taking on in the near future.
“Androids are not uncanny anymore. Most people adapt to interacting with Erica very quickly. Actually, I think that in interacting with androids, which are still different from us, we get a better appreciation of interacting with other cultures. In both cases, we are talking with someone who is different from us and learn to overcome those differences,” he says.
A lot has been written about how robots will take our jobs. Dr. Ishiguro believes these fears are blown somewhat out of proportion.
“Robots and androids will take over many simple jobs. Initially there might be some job-related issues, but new schemes, like for example a robot tax similar to the one described by Bill Gates, should help,” he says.
“Androids will make it possible for humans to relax and keep evolving. If we compare the time we spend studying now to 100 years ago, it has grown a lot. I think it needs to keep growing if we are to keep expanding our scientific and technological knowledge. In the future, we might end up spending 20 percent of our lifetime on work and 80 percent of the time on education and growing our skills.”
Android asks who you are
For Dr. Ishiguro, another aspect of robotics in general, and androids in particular, is how they question what it means to be human.
“Identity is a very difficult concept for humans sometimes. For example, I think clothes are part of our identity, in a way that is similar to our faces and bodies. We don’t change those from one day to the next, and that is why I have ten matching black outfits,” he says.
This link between physical appearance and perceived identity is one of the aspects Dr. Ishiguro is exploring. Another, closely linked, concept is the connection between the body and the sense of self. The Ishiguro avatar was once giving a presentation in Austria. Its creator recalls feeling distinctly as if he were in Austria, even sensing touch on his own body when people laid their hands on the android. When he was distracted, he felt almost ‘sucked’ back into his body in Japan.
“I am constantly thinking about my life in this way, and I believe that androids are a unique mirror that helps us formulate questions about why we are here and why we have been so successful. I do not necessarily think I have found the answers to these questions, so if you have, please share,” he says with a laugh.
His work and these questions, while extremely interesting on their own, become extra poignant when considering the predicted melding of mind and machine in the near future.
The ability to be present in several locations through avatars—virtual or robotic—raises many questions of both a philosophical and a practical nature. Then add the hypotheticals: why send a human onto the hostile surface of Mars if you could send a remote-controlled android capable of relaying everything it sees, hears, and feels?
The two ways of robotics will meet
Dr. Ishiguro sees the world of AI-human interaction as currently split roughly in two. One is the chatbot approach that companies like Amazon, Microsoft, Google, and, more recently, Apple employ, using stationary objects like smart speakers. Androids like ERICA represent the other approach.
“It is about more than the form factor. I think that the android approach is generally more story-based. We are integrating new conversation features based on assumptions about the situation and running different scenarios that expand the android’s vocabulary and interactions. Another aspect we are working on is giving androids desire and intention. Like with people, androids should have desires and intentions in order for you to want to interact with them over time,” Dr. Ishiguro explains.
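Dr. Ishiguro doesn’t describe the software behind this, but the idea of conversational scenarios steered by ‘desires’ can be made concrete with a small, purely hypothetical sketch. The scenario names, desire weights, and update rule below are invented for illustration and are not ERICA’s actual implementation.

```python
import random

# Hypothetical scenario-based dialogue manager, loosely inspired by the
# description above. Scenario names, desire values, and the selection
# rule are invented for illustration; this is not ERICA's real software.
SCENARIOS = {
    "small_talk":  ["Hello! How has your day been so far?"],
    "hobby_story": ["Lately I have been curious about manga. Do you read any?"],
    "lab_tour":    ["Would you like me to explain what this laboratory works on?"],
}

class ScenarioDialogueManager:
    def __init__(self):
        # 'Desires' bias which scenario the android wants to pursue next.
        self.desires = {"small_talk": 0.5, "hobby_story": 0.3, "lab_tour": 0.2}

    def choose_scenario(self):
        # Pick a scenario with probability proportional to its current desire.
        names = list(self.desires)
        weights = [self.desires[n] for n in names]
        return random.choices(names, weights=weights, k=1)[0]

    def respond(self, user_utterance: str) -> str:
        scenario = self.choose_scenario()
        # Very crude situational assumption: if the user asks a question,
        # stay in small talk; otherwise follow the chosen desire.
        if user_utterance.strip().endswith("?"):
            scenario = "small_talk"
        # Acting on a desire lowers it, so behavior varies over time.
        self.desires[scenario] = max(0.05, self.desires[scenario] - 0.1)
        return random.choice(SCENARIOS[scenario])

if __name__ == "__main__":
    erica = ScenarioDialogueManager()
    print(erica.respond("Nice to meet you."))
    print(erica.respond("What do you like to do?"))
```

The point of the toy model is only to show how persistent internal state, rather than one-off question answering, can make repeated interaction feel worthwhile.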
This could be said to be part of a wider trend for Japan, where many companies are developing human-like robots that often have some Internet of Things capabilities, making them able to handle some of the same tasks as an Amazon Echo. The difference in approach could be summed up in the words ‘assistant’ (Apple, Amazon, etc.) and ‘companion’ (Japan).
Dr. Ishiguro sees this as partly linked to the fact that Japanese, as a language and as a market, is somewhat limited, which has a direct impact on the viability and practicality of ‘pure’ voice recognition systems. At the same time, Japanese people have had greater exposure to positive images of robots and hold a different cultural and religious view of objects having a ‘soul’. It may also mean that Japanese companies and android scientists are stealing a march on their western counterparts.
“If you speak to an Amazon Echo, that is not a natural way to interact for humans. This is part of why we are making human-like robot systems. The human brain is set up to recognize and interact with humans. So, it makes sense to focus on developing the body for the AI mind, as well as the AI. I believe that the final goal for both Japanese and other companies and scientists is to create human-like interaction. Technology has to adapt to us, because we cannot adapt fast enough to it, as it develops so quickly,” he says.
Banner image courtesy of Hiroshi Ishiguro Laboratories, ATR. All rights reserved.
Dr. Ishiguro’s team has collaborated with partners and developed a number of android systems:
Geminoid™ HI-2 has been developed by Hiroshi Ishiguro Laboratories and Advanced Telecommunications Research Institute International (ATR).
Geminoid™ F has been developed by Osaka University and Hiroshi Ishiguro Laboratories, Advanced Telecommunications Research Institute International (ATR).
ERICA has been developed by the ERATO ISHIGURO Symbiotic Human-Robot Interaction Project.
#430556 Forget Flying Cars, the Future Is ...
Flying car concepts have been around nearly as long as their earthbound cousins, but no one has yet made them a commercial success. MIT engineers think we’ve been coming at the problem from the wrong direction; rather than putting wings on cars, we should be helping drones to drive.
The team from the university’s Computer Science and Artificial Intelligence Laboratory (CSAIL) added wheels to a fleet of eight mini-quadcopters and tested driving and flying them around a tiny toy town made out of cardboard and fabric.
Adding the ability to drive reduced the distance the drone could fly by 14 percent compared to a wheel-less version. But while driving was slower, the drone could travel 150 percent further than when flying. The result is a vehicle that combines the speed and mobility of flying with the energy-efficiency of driving.
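Taken together, those figures sketch a simple energy-budget tradeoff. As a rough, hypothetical illustration (assuming constant energy use per kilometer within each mode and reading “150 percent further” as 2.5 times the wheeled drone’s flight range, neither of which the article states explicitly), the snippet below normalizes the wheel-less flight range to 1 and shows how total range grows as more of the battery is spent driving instead of flying.

```python
# Rough back-of-the-envelope model of the flying/driving tradeoff reported
# above. Assumes energy use per km is constant within each mode; the
# numbers below simply restate the article's percentages.
FLIGHT_RANGE_NO_WHEELS = 1.00                                # normalized baseline
FLIGHT_RANGE_WITH_WHEELS = 0.86 * FLIGHT_RANGE_NO_WHEELS     # 14% penalty from wheels
DRIVE_RANGE = 2.50 * FLIGHT_RANGE_WITH_WHEELS                # "150 percent further"

for fly_fraction in (1.0, 0.75, 0.5, 0.25, 0.0):
    # Spend fly_fraction of the battery flying and the rest driving.
    total = (fly_fraction * FLIGHT_RANGE_WITH_WHEELS
             + (1.0 - fly_fraction) * DRIVE_RANGE)
    print(f"{fly_fraction:.0%} of battery spent flying -> "
          f"total range {total:.2f}x the wheel-less flight range")
```

Flying only where obstacles demand it and driving the rest of the way is what lets the hybrid beat a pure flyer on range while still clearing anything a ground robot couldn’t.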
CSAIL director Daniela Rus told MIT News their work suggested that when looking to create flying cars, it might make more sense to build on years of research into drones rather than trying to simply “put wings on cars.”
Historically, flying car concepts have looked like someone took apart a Cessna light aircraft and a family sedan, mixed all the parts up, and bolted them back together again. Not everyone has abandoned this approach—two of the most developed flying car designs from Terrafugia and AeroMobil are cars with folding wings that need an airstrip to take off.
But flying car concepts are looking increasingly drone-like these days, with multiple small rotors, electric propulsion, and vertical take-off abilities. Take the EHang 184 autonomous aerial vehicle being developed in China; the Kitty Hawk all-electric aircraft backed by Google founder Larry Page, which is little more than a quadcopter with a seat; the AirQuadOne designed by UK consortium Neva Aerospace; or Lilium Aviation’s Jet.
The attraction is obvious. Electric-powered drones are more compact, maneuverable, and environmentally friendly, making them suitable for urban environments.
Most of these vehicles are not quite the same as those proposed by the MIT engineers, as they’re pure flying machines. But a recent Airbus concept builds on the same principle that the future of urban mobility is vehicles that can both fly and drive. Its Pop.Up design is a two-passenger pod that can either be clipped to a set of wheels or hang under a quadcopter.
Importantly, they envisage their creation being autonomous in both flight and driving modes. And they’re not the only ones who think the future of flying cars is driverless. Uber has committed to developing a network of autonomous air taxis within a decade. This spring, Dubai announced it would launch a pilotless passenger drone service using the EHang 184 as early as next month (July).
While integrating fully-fledged autonomous flying cars into urban environments will be far more complex, the study by Rus and her colleagues provides a good starting point for the kind of 3D route-planning and collision avoidance capabilities this would require.
The team developed multi-robot path planning algorithms that were able to control all eight drones as they flew and drove around their mock-up city, while also making sure they didn’t crash into each other and avoided no-fly zones.
“This work provides an algorithmic solution for large-scale, mixed-mode transportation and shows its applicability to real-world problems,” Jingjin Yu, a computer science professor at Rutgers University who was not involved in the research, told MIT News.
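The article doesn’t reproduce the CSAIL planner, so the sketch below is only a toy illustration of the kind of constraints such an algorithm juggles: every vehicle must reach its goal without entering forbidden cells and without occupying the same cell at the same time step as another. It uses a simple prioritized breadth-first search over (cell, time) states; the grid size, no-fly zone, and start/goal pairs are invented, only same-cell conflicts are checked, and a robot is assumed to be out of the way once it reaches its goal.

```python
from collections import deque

# Toy prioritized multi-robot planner on a grid. This is NOT the CSAIL
# team's actual algorithm; it only illustrates the constraints such a
# planner must satisfy: avoid no-fly cells and never put two vehicles
# in the same cell at the same time step.
GRID_W, GRID_H = 8, 8
MAX_T = 64                                    # search horizon (time steps)
NO_FLY = {(3, 3), (3, 4), (4, 3), (4, 4)}     # hypothetical forbidden cells

def neighbors(cell):
    x, y = cell
    # A vehicle may wait in place or move to a 4-connected free cell.
    for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < GRID_W and 0 <= ny < GRID_H and (nx, ny) not in NO_FLY:
            yield (nx, ny)

def plan(start, goal, reserved):
    """Breadth-first search over (cell, time) states, skipping reserved ones."""
    queue = deque([(start, 0, [start])])
    seen = {(start, 0)}
    while queue:
        cell, t, path = queue.popleft()
        if cell == goal:
            return path
        if t + 1 > MAX_T:
            continue
        for nxt in neighbors(cell):
            state = (nxt, t + 1)
            if state in seen or state in reserved:
                continue
            seen.add(state)
            queue.append((nxt, t + 1, path + [nxt]))
    return None  # no conflict-free path within the horizon

def plan_fleet(tasks):
    """Plan robots in priority order, reserving each one's space-time cells.
    Simplifications: only same-cell conflicts are checked (no swap conflicts),
    and a robot is assumed to be out of the way once it reaches its goal."""
    reserved, paths = set(), []
    for start, goal in tasks:
        path = plan(start, goal, reserved)
        paths.append(path)
        if path:
            for t, cell in enumerate(path):
                reserved.add((cell, t))
    return paths

if __name__ == "__main__":
    tasks = [((0, 0), (7, 7)), ((7, 0), (0, 7)), ((0, 7), (7, 0))]
    for i, path in enumerate(plan_fleet(tasks)):
        print(f"drone {i}: {path}")
```

In practice the CSAIL planner also has to decide whether each leg is flown or driven and account for the energy cost of each mode, which this toy version ignores.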
This vision of a driverless future for flying cars might be a bit of a disappointment for those who’d envisaged themselves one day piloting their own hover car just like George Jetson. But autonomy and Uber-like ride-hailing business models are likely to be attractive, as they offer potential solutions to three of the biggest hurdles drone-like passenger vehicles face.
Firstly, it makes the vehicles accessible to anyone by removing the need to learn how to safely pilot an aircraft. Secondly, battery life still limits most electric vehicles to flight times measured in minutes. For personal vehicles this could be frustrating, but if you’re just hopping into a driverless air taxi for a five-minute trip across town, it’s unlikely to bother you.
Operators of the service simply need to make sure they have a big enough fleet to ensure a charged vehicle is never too far away, or they’ll need a way to swap out batteries easily, such as the one suggested by the makers of the Volocopter electric helicopter.
Finally, there has already been significant progress in developing the technology and regulations needed to integrate autonomous drones into our airspace, which future driverless flying cars can most likely piggyback on.
Safety requirements will inevitably be more stringent, but adding more predictable and controllable autonomous drones to the skies is likely to be more attractive to regulators than trying to license and police thousands of new amateur pilots.
Image Credit: Lilium
#428636 Stopping Killer Robots at the Source ...
Researchers suggest coding artificial intelligence in such a way that robots don’t make a distinction between human and machine.
#428505 This Week’s Awesome Stories From ...
Revisiting the first self-driving car in 1986 gives us an idea of how long this tech has been in the works, paving the way for today’s machine learning of books and video games and, perhaps, a jobless future. Plus, the Black Mirror tech that’s already here and the first horror movie trailer made by an AI.
ARTIFICIAL INTELLIGENCE: DeepMind and Blizzard to Release StarCraft II as an AI Research Environment (Oriol Vinyals | DeepMind) “DeepMind is…
#428432 This Intelligent 3D Printer Is Building ...
Imagine one day walking into a gorgeous structure—like LA’s famous Walt Disney Concert Hall—only to discover it was designed by a computer system and constructed by automated robotic arms. Ai Build, a London-based startup, aims to pave the way to 3D printing on large scales. The company is equipping industrial-grade Kuka robotic arms with artificial intelligence and “3D printing guns” to 3D print large structures with a focus on maximizing labor and material efficiency. Founder and CEO Daghan Cam dreamed up the…