Tag Archives: evolving

#431081 How the Intelligent Home of the Future ...

As Dorothy famously said in The Wizard of Oz, there’s no place like home. Home is where we go to rest and recharge. It’s familiar, comfortable, and our own. We take care of our homes by cleaning and maintaining them, and fixing things that break or go wrong.
What if our homes, on top of giving us shelter, could also take care of us in return?
According to Chris Arkenberg, this could be the case in the not-so-distant future. As part of Singularity University’s Experts On Air series, Arkenberg gave a talk called “How the Intelligent Home of The Future Will Care For You.”
Arkenberg is a research and strategy lead at Orange Silicon Valley, and was previously a research fellow at the Deloitte Center for the Edge and a visiting researcher at the Institute for the Future.
Arkenberg told the audience that there’s an evolution going on: homes are going from being smart to being connected, and will ultimately become intelligent.
Market Trends
Intelligent home technologies are just now budding, but broader trends point to huge potential for their growth. We as consumers already expect continuous connectivity wherever we go—what do you mean my phone won’t get reception in the middle of Yosemite? What do you mean the smart TV is down and I can’t stream Game of Thrones?
As connectivity has evolved from a privilege to a basic expectation, Arkenberg said, we’re also starting to have a better sense of what it means to give up our data in exchange for services and conveniences. It’s so easy to click a few buttons on Amazon and have stuff show up at your front door a few days later—never mind that data about your purchases gets recorded and aggregated.
“Right now we have single devices that are connected,” Arkenberg said. “Companies are still trying to show what the true value is and how durable it is beyond the hype.”

Connectivity is the basis of an intelligent home. To take a dumb object and make it smart, you get it online. Belkin’s Wemo, for example, lets users control lights and appliances wirelessly and remotely, and can be paired with Amazon Echo or Google Home for voice-activated control.
Speaking of voice-activated control, Arkenberg pointed out that physical interfaces are evolving, too, to the point that we’re actually getting rid of interfaces entirely, or transitioning to ‘soft’ interfaces like voice or gesture.
Drivers of Change
Consumers are open to smart home tech and companies are working to provide it. But what are the drivers making this tech practical and affordable? Arkenberg said there are three big ones:
Computation: Computers have gotten exponentially more powerful over the past few decades. If it wasn’t for processors that could handle massive quantities of information, nothing resembling an Echo or Alexa would even be possible. Artificial intelligence and machine learning are powering these devices, and they hinge on computing power too.
Sensors: “There are more things connected now than there are people on the planet,” Arkenberg said. Market research firm Gartner estimates there are 8.4 billion connected things currently in use. Wherever software can replace dedicated hardware, it’s doing so. Cheaper sensors mean we can connect more things, which can then connect to each other.
Data: “Data is the new oil,” Arkenberg said. “The top companies on the planet are all data-driven giants. If data is your business, though, then you need to keep finding new ways to get more and more data.” Home assistants are essentially data collection systems that sit in your living room and collect data about your life. That data in turn sets up the potential of machine learning.
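The “data collection system in your living room” framing can be made concrete with a small sketch. Assuming a hypothetical event log of timestamped device actions (the log format and device names here are invented for illustration), even simple frequency counting starts to reveal a household’s routine:

```python
from collections import Counter
from datetime import datetime

# Hypothetical event log a home assistant might accumulate:
# (timestamp, device, action)
events = [
    ("2017-07-10T06:58:00", "kitchen_light", "on"),
    ("2017-07-11T07:02:00", "kitchen_light", "on"),
    ("2017-07-12T06:55:00", "kitchen_light", "on"),
    ("2017-07-13T06:45:00", "kitchen_light", "on"),
]

def usual_hour(events, device, action):
    """Return the hour of day at which a given action most often occurs."""
    hours = Counter(
        datetime.fromisoformat(ts).hour
        for ts, dev, act in events
        if dev == device and act == action
    )
    hour, _count = hours.most_common(1)[0]
    return hour

print(usual_hour(events, "kitchen_light", "on"))  # most frequent switch-on hour: 6
```

A real assistant would feed logs like this into far richer models, but the same principle applies: accumulated behavioral data is what makes prediction, and machine learning, possible.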
Colonizing the Living Room
Alexa and Echo can turn lights on and off, and Nest can help you be energy-efficient. But beyond these, what does an intelligent home really look like?
Arkenberg’s vision of an intelligent home uses sensing, data, connectivity, and modeling to manage resource efficiency, security, productivity, and wellness.
Autonomous vehicles provide an interesting comparison: they’re surrounded by sensors that constantly map the world, building dynamic models that let them understand the change around them and predict what comes next. Might we want this to become a model for our homes, too? By making them smart and connecting them, Arkenberg said, they’d become “more biological.”
There are already several products on the market that fit this description. RainMachine uses weather forecasts to adjust home landscape watering schedules. Neurio monitors energy usage, identifies areas where waste is happening, and makes recommendations for improvement.
These are small steps in connecting our homes with knowledge systems and giving them the ability to understand and act on that knowledge.
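To illustrate the kind of knowledge-to-action loop these products close, here is a minimal sketch of weather-aware watering in the spirit of RainMachine. The thresholds and scaling rule are invented for illustration and are not taken from any real product:

```python
def watering_minutes(base_minutes, rain_prob, forecast_rain_mm):
    """Scale a scheduled watering run based on a weather forecast.

    A toy version of the idea behind products like RainMachine;
    the thresholds and the scaling rule here are illustrative
    assumptions, not any vendor's algorithm.
    """
    if rain_prob >= 0.6 or forecast_rain_mm >= 5.0:
        return 0  # skip entirely: nature is watering for us
    # Otherwise shorten the run in proportion to expected rainfall.
    reduction = min(forecast_rain_mm / 5.0, 1.0)
    return round(base_minutes * (1.0 - reduction))

print(watering_minutes(20, 0.1, 0.0))  # dry forecast: full 20-minute run
print(watering_minutes(20, 0.8, 2.0))  # rain likely: skip (0 minutes)
print(watering_minutes(20, 0.2, 2.5))  # some rain expected: shortened to 10
```

The interesting part is not the arithmetic but the coupling: the home consumes an external knowledge system (a forecast) and acts on it without being asked.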
He sees the homes of the future being equipped with digital ears (in the form of home assistants, sensors, and monitoring devices) and digital eyes (in the form of facial recognition technology and machine vision to recognize who’s in the home). “These systems are increasingly able to interrogate emotions and understand how people are feeling,” he said. “When you push more of this active intelligence into things, the need for us to directly interface with them becomes less relevant.”
Could our homes use these same tools to benefit our health and wellness? FREDsense uses bacteria to create electrochemical sensors that can be applied to home water systems to detect contaminants. If that’s not personal enough for you, get a load of this: ClinicAI can be installed in your toilet bowl to monitor and evaluate your biowaste. What’s the point, you ask? Early detection of colon cancer and other diseases.
What if one day, your toilet’s biowaste analysis system could link up with your fridge, so that when you opened it, it would tell you what to eat, how much, and at what time of day?
Roadblocks to Intelligence
“The connected and intelligent home is still a young category trying to establish value, but the technological requirements are now in place,” Arkenberg said. We’re already used to living in a world of ubiquitous computation and connectivity, and we have ingrained expectations about things being connected. For the intelligent home to become a widespread reality, its value needs to be established and its challenges overcome.
One of the biggest challenges will be getting used to the idea of continuous surveillance. We’ll get convenience and functionality if we give up our data, but how far are we willing to go? “Establishing security and trust is going to be a big challenge moving forward,” Arkenberg said.
There are also questions of cost and reliability, of interoperability and fragmentation across devices, and, conversely, of what Arkenberg called ‘platform lock-on,’ where you’d end up relying on a single provider’s system and be unable to integrate devices from other brands.
Ultimately, Arkenberg sees homes being able to learn about us, manage our scheduling and transit, watch our moods and our preferences, and optimize our resource footprint while predicting and anticipating change.
“This is the really fascinating provocation of the intelligent home,” Arkenberg said. “And I think we’re going to start to see this play out over the next few years.”
Sounds like a home Dorothy wouldn’t recognize, in Kansas or anywhere else.
Stock Media provided by adam121 / Pond5

Posted in Human Robots

#430855 Why Education Is the Hardest Sector of ...

We’ve all heard the warning cries: automation will disrupt entire industries and put millions of people out of jobs. In fact, up to 45 percent of existing jobs can be automated using current technology.
However, this may not necessarily apply to the education sector. After a detailed analysis of more than 2,000 work activities across more than 800 occupations, a report by McKinsey & Co. states that of all the sectors examined, “…the technical feasibility of automation is lowest in education.”
There is no doubt that technological trends will have a powerful impact on global education, both by improving the overall learning experience and by increasing global access to education. Massive open online courses (MOOCs), chatbot tutors, and AI-powered lesson plans are just a few examples of the digital transformation in global education. But will robots and artificial intelligence ever fully replace teachers?
The Most Difficult Sector to Automate
While various tasks revolving around education—like administrative tasks or facilities maintenance—are open to automation, teaching itself is not.
Effective education involves more than just the transfer of information from a teacher to a student. Good teaching requires complex social interactions and adaptation to the individual student’s learning needs. An effective teacher is not just responsive to each student’s strengths and weaknesses, but is also empathetic toward the student’s state of mind. It’s about maximizing human potential.
Furthermore, students rely on effective teachers not only to teach them the course material, but also to serve as a source of life guidance and career mentorship. Deep and meaningful human interaction is crucial, and it is something that is very difficult, if not impossible, to automate.
Automating teaching is an example of a task that would require artificial general intelligence (as opposed to narrow, task-specific intelligence). In other words, it would require an AI that understands natural human language, can be empathetic toward emotions, and can plan, strategize, and make impactful decisions under unpredictable circumstances.
This would be the kind of machine that can do anything a human can do, and it doesn’t exist—at least, not yet.
We’re Getting There
Let’s not forget how quickly AI is evolving. Just because it’s difficult to fully automate teaching doesn’t mean the world’s leading AI experts aren’t trying.
Meet Jill Watson, the teaching assistant from Georgia Institute of Technology. Watson isn’t your average TA. She’s an IBM-powered artificial intelligence that is being implemented in universities around the world. Watson is able to answer students’ questions with 97 percent certainty.
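One way to read that 97 percent figure is as a confidence threshold: the system answers on its own only when it is sufficiently sure, and escalates to a person otherwise. The sketch below assumes that routing behavior; `route_question`, `stub_classifier`, and the canned answers are hypothetical stand-ins, not Jill Watson’s actual implementation:

```python
def route_question(question, classifier, threshold=0.97):
    """Answer automatically only when the model is confident enough;
    otherwise hand the question to a human teaching assistant.

    `classifier` is any function returning (answer, confidence).
    The 0.97 threshold echoes the certainty figure quoted above,
    but the routing logic itself is an illustrative assumption.
    """
    answer, confidence = classifier(question)
    if confidence >= threshold:
        return ("bot", answer)
    return ("human", None)  # escalate to a human TA

# A stub classifier standing in for the real NLP system.
def stub_classifier(question):
    canned = {"When is the assignment due?": ("Friday at 5 pm.", 0.99)}
    return canned.get(question, ("I'm not sure.", 0.40))

print(route_question("When is the assignment due?", stub_classifier))
print(route_question("Why is my code slow?", stub_classifier))
```

The design point is that automation and human teaching are complements here: the machine absorbs the high-confidence, repetitive questions, and the ambiguous ones still reach a person.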
Technologies like this also have applications in grading and providing feedback. Some AI algorithms are being trained and refined to perform automatic essay scoring. One project has achieved a 0.945 correlation with human graders.
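The 0.945 figure is a correlation coefficient between machine-assigned and human-assigned scores, where 1.0 would mean perfect agreement. As a rough sketch of how such a number is computed (the grade data below is invented), the standard Pearson formula looks like this:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: human grades vs. a hypothetical model's predicted grades.
human = [4, 3, 5, 2, 4, 5, 3, 1]
model = [4, 3, 5, 2, 5, 4, 3, 1]
print(round(pearson(human, model), 3))
```

An evaluation like the one cited would run this comparison over thousands of essays; a coefficient near 0.945 means the automated grader ranks essays almost exactly as the human graders do.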
All of this will have a remarkable impact on online education as we know it and dramatically increase online student retention rates.

Any student with a smartphone can access a wealth of information and free courses from universities around the world. MOOCs have allowed valuable courses to become available to millions of students. But at the moment, not all participants can receive customized feedback for their work. Currently, this is limited by manpower, but in the future that may not be the case.
What chatbots like Jill Watson allow is the opportunity for hundreds of thousands, if not millions, of students to have their work reviewed and all their questions answered at a minimal cost.
AI algorithms also have a significant role to play in personalization of education. Every student is unique and has a different set of strengths and weaknesses. Data analysis can be used to improve individual student results, assess each student’s strengths and weaknesses, and create mass-customized programs. Algorithms can analyze student data and consequently make flexible programs that adapt to the learner based on real-time feedback. According to the McKinsey Global Institute, all of this data in education could unlock between $900 billion and $1.2 trillion in global economic value.
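As a toy illustration of the adaptive programs described above, consider a rule that adjusts exercise difficulty from a student’s recent answers. The thresholds and step size are illustrative assumptions, not any particular product’s algorithm:

```python
def next_difficulty(current, recent_results, step=1, lo=1, hi=10):
    """Pick the next exercise difficulty from recent answer history.

    A deliberately simple stand-in for the adaptive-learning
    algorithms described above: step up when the student is
    cruising, step down when they are struggling.
    """
    if not recent_results:
        return current
    accuracy = sum(recent_results) / len(recent_results)
    if accuracy > 0.8:
        return min(current + step, hi)   # doing well: raise difficulty
    if accuracy < 0.5:
        return max(current - step, lo)   # struggling: ease off
    return current                       # mixed results: hold steady

# 1 = correct answer, 0 = incorrect, over the last five exercises.
print(next_difficulty(5, [1, 1, 1, 1, 1]))  # raises to 6
print(next_difficulty(5, [0, 1, 0, 0, 1]))  # lowers to 4
print(next_difficulty(5, [1, 0, 1, 1, 0]))  # stays at 5
```

Real systems replace this hand-tuned rule with models trained on large cohorts of learners, but the feedback loop is the same: observe, adapt, observe again.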
Beyond Automated Teaching
It’s important to recognize that technological automation alone won’t fix the many issues in our global education system today. With the system dominated by outdated curricula, standardized tests, and an emphasis on short-term knowledge, many experts are calling for a transformation of how we teach.
It is not enough to simply automate the process. We can have a completely digital learning experience that continues to focus on outdated skills and fails to prepare students for the future. In other words, we must not only be innovative with our automation capabilities, but also with educational content, strategy, and policies.
Are we equipping students with the most important survival skills? Are we inspiring young minds to create a better future? Are we meeting the unique learning needs of each and every student? There’s no point automating and digitizing a system that is already flawed. We need to ensure the system that is being digitized is itself being transformed for the better.
Stock Media provided by davincidig / Pond5

Posted in Human Robots

#430667 Welcome to a More Discoverable ...

This week we’ve rolled out our first major round of improvements to Singularity Hub since our ground-up redesign last December. If we did it right, you’ll find that discovering the technological goodies you come here for is much easier, and so is discovering other Singularity University offerings you might be interested in.
The first and most major change is in the way Hub’s navigation is structured.
The previous categories in our header (Tech, Future, Health, Science) have been replaced by a single page, Topics, which profiles the most popular tech topics across our site. The featured topics in this menu will be updated regularly based on article performance, so you can keep up with what’s trending in AI, biotech, neuroscience, robotics, or whatever is making the biggest splash most recently.
Rolling our hottest topic category tags into one header dropdown allowed us to create greater focus on some of our newest and best offerings.
Our header now prominently features In Focus, which includes articles on how leaders can make the most of today’s accelerating pace of change by learning to think like futurists, innovators, technologists, and humanitarians. We’ve always been technological optimists, and we want to make it easy for leaders to find the stories that help make hopeful problem-solvers of us all.
We’ve added a section for Experts, which features leaders in the Singularity University community and showcases their thought leadership including interviews and books. In Events, we highlight Singularity University’s global library of local happenings and summits.
Lastly, we’re excited that our growing original video efforts—from our Ray Kurzweil series to our weekly tech news roundup posts—now live under a central Videos section on Hub. This also gives us a place to highlight our favorite video posts from around the web, including the sci-fi shorts we love so much.
Cruising through the rest of Hub, particularly our homepage, you’ll find a much greater variety of content options, including new stories, top stories, event coverage, and videos. In short, it’s everything a homepage should be. On posts, we’ve tried to keep things as clean as possible, and we put a lot of hours into streamlining our content tagging structure, making it much easier for you to click through category tags into other stories you might like.

Here’s what @singularityhub looked like 2 years ago, 2 weeks ago, & today. Check it out: https://t.co/7cmlTJwc7d pic.twitter.com/jDayIEIFNv
— Singularity Hub (@singularityhub) July 13, 2017

You’ll also see greater visibility into Singularity University events, along with clearer ways to keep up with Hub and SU both, from simple email newsletter signups to callouts for the SingularityU Hub iOS app and events like SU’s Experts on Air series.
We hope you enjoy the ever-evolving, ever-improving Singularity Hub, and we’d love to hear your feedback. Feel free to tweet us, and let us know your thoughts. You can also pitch us or email us. And as always, thank you for your support.

Posted in Human Robots

#430579 What These Lifelike Androids Can Teach ...

For Dr. Hiroshi Ishiguro, one of the most interesting things about androids is the changing questions they pose us, their creators, as they evolve. Does it, for example, do something to the concept of being human if a human-made creation starts telling you about what kind of boys ‘she’ likes?
If you want to know the answer to the boys question, you need to ask ERICA, one of Dr. Ishiguro’s advanced androids. Beneath her plastic skull and silicone skin, wires connect to AI software systems that bring her to life. Her ability to respond goes far beyond standard inquiries. Spend a little time with her, and the feeling of a distinct personality starts to emerge. From time to time, she works as a receptionist at Dr. Ishiguro and his team’s Osaka University labs. One of her android sisters is an actor who has starred in plays and a film.

ERICA’s ‘brother’ is an android version of Dr. Ishiguro himself, which has represented its creator at various events while the biological Ishiguro can remain in his offices in Japan. Microphones and cameras capture Ishiguro’s voice and face movements, which are relayed to the android. Apart from mimicking its creator, the Geminoid™ android is also capable of lifelike blinking, fidgeting, and breathing movements.
Say hello to relaxation
As technological development continues to accelerate, so do the possibilities for androids. From a position as receptionist, ERICA may well branch out into many other professions in the coming years. Companion for the elderly, comic book storyteller (an ancient profession in Japan), pop star, conversational foreign language partner, and newscaster are some of the roles and responsibilities Dr. Ishiguro sees androids taking on in the near future.
“Androids are not uncanny anymore. Most people adapt to interacting with Erica very quickly. Actually, I think that in interacting with androids, which are still different from us, we get a better appreciation of interacting with other cultures. In both cases, we are talking with someone who is different from us and learn to overcome those differences,” he says.
A lot has been written about how robots will take our jobs. Dr. Ishiguro believes these fears are blown somewhat out of proportion.
“Robots and androids will take over many simple jobs. Initially there might be some job-related issues, but new schemes, such as a robot tax similar to the one described by Bill Gates, should help,” he says.
“Androids will make it possible for humans to relax and keep evolving. If we compare the time we spend studying now compared to 100 years ago, it has grown a lot. I think it needs to keep growing if we are to keep expanding our scientific and technological knowledge. In the future, we might end up spending 20 percent of our lifetime on work and 80 percent of the time on education and growing our skills.”
Android asks who you are
For Dr. Ishiguro, another aspect of robotics in general, and androids in particular, is how they question what it means to be human.
“Identity is a very difficult concept for humans sometimes. For example, I think clothes are part of our identity, in a way that is similar to our faces and bodies. We don’t change those from one day to the next, and that is why I have ten matching black outfits,” he says.
This link between physical appearance and perceived identity is one of the aspects Dr. Ishiguro is exploring. Another closely linked concept is the connection between body and sense of self. The Ishiguro avatar was once giving a presentation in Austria. Its creator recalls feeling distinctly as if he were in Austria, even feeling the sensation of touch on his own body when people laid their hands on the android. When he was distracted, he felt almost ‘sucked’ back into his body in Japan.
“I am constantly thinking about my life in this way, and I believe that androids are a unique mirror that helps us formulate questions about why we are here and why we have been so successful. I do not necessarily think I have found the answers to these questions, so if you have, please share,” he says with a laugh.
His work and these questions, while extremely interesting on their own, become extra poignant when considering the predicted melding of mind and machine in the near future.
The ability to be present in several locations through avatars—virtual or robotic—raises many questions of both a philosophical and a practical nature. Then add the hypotheticals: why send a human onto the hostile surface of Mars if you could send a remote-controlled android capable of relaying everything it sees, hears, and feels?
The two ways of robotics will meet
Dr. Ishiguro sees the world of AI-human interaction as currently roughly split into two. One is the chat-bot approach that companies like Amazon, Microsoft, Google, and recently Apple, employ using stationary objects like speakers. Androids like ERICA represent another approach.
“It is about more than the form factor. I think that the android approach is generally more story-based. We are integrating new conversation features based on assumptions about the situation and running different scenarios that expand the android’s vocabulary and interactions. Another aspect we are working on is giving androids desire and intention. Like with people, androids should have desires and intentions in order for you to want to interact with them over time,” Dr. Ishiguro explains.
This could be said to be part of a wider trend for Japan, where many companies are developing human-like robots that often have some Internet of Things capabilities, making them able to handle some of the same tasks as an Amazon Echo. The difference in approach could be summed up in the words ‘assistant’ (Apple, Amazon, etc.) and ‘companion’ (Japan).
Dr. Ishiguro sees this as partly linked to how Japanese as a language—and as a market—is somewhat limited, which has a direct impact on the viability and practicality of ‘pure’ voice recognition systems. At the same time, Japanese people have had greater exposure to positive images of robots and hold a different cultural and religious view of objects having a ‘soul.’ However, it may also mean Japanese companies and android scientists are stealing a lap on their western counterparts.
“If you speak to an Amazon Echo, that is not a natural way to interact for humans. This is part of why we are making human-like robot systems. The human brain is set up to recognize and interact with humans. So, it makes sense to focus on developing the body for the AI mind, as well as the AI. I believe that the final goal for both Japanese and other companies and scientists is to create human-like interaction. Technology has to adapt to us, because we cannot adapt fast enough to it, as it develops so quickly,” he says.
Banner image courtesy of Hiroshi Ishiguro Laboratories, ATR; all rights reserved.
Dr. Ishiguro’s team has collaborated with partners and developed a number of android systems:
Geminoid™ HI-2 has been developed by Hiroshi Ishiguro Laboratories and Advanced Telecommunications Research Institute International (ATR).
Geminoid™ F has been developed by Osaka University, Hiroshi Ishiguro Laboratories, and Advanced Telecommunications Research Institute International (ATR).
ERICA has been developed by the ERATO ISHIGURO Symbiotic Human-Robot Interaction Project.

Posted in Human Robots

#428452 Weird flailing Japanese Robot

According to Professor Ikegami in Tokyo, Japan, this self-evolving robot is the precursor of a new robotic life form. The android contains sensors and artificial intelligence software and, although rudimentary today, the technology allows it to perceive the outside world …

Posted in Human Robots