Tag Archives: play
As Dorothy famously said in The Wizard of Oz, there’s no place like home. Home is where we go to rest and recharge. It’s familiar, comfortable, and our own. We take care of our homes by cleaning and maintaining them, and fixing things that break or go wrong.
What if our homes, on top of giving us shelter, could also take care of us in return?
According to Chris Arkenberg, this could be the case in the not-so-distant future. As part of Singularity University’s Experts On Air series, Arkenberg gave a talk called “How the Intelligent Home of The Future Will Care For You.”
Arkenberg is a research and strategy lead at Orange Silicon Valley, and was previously a research fellow at the Deloitte Center for the Edge and a visiting researcher at the Institute for the Future.
Arkenberg told the audience that there’s an evolution going on: homes are going from being smart to being connected, and will ultimately become intelligent.
Intelligent home technologies are just now budding, but broader trends point to huge potential for their growth. We as consumers already expect continuous connectivity wherever we go—what do you mean my phone won’t get reception in the middle of Yosemite? What do you mean the smart TV is down and I can’t stream Game of Thrones?
As connectivity has evolved from a privilege to a basic expectation, Arkenberg said, we’re also starting to have a better sense of what it means to give up our data in exchange for services and conveniences. It’s so easy to click a few buttons on Amazon and have stuff show up at your front door a few days later—never mind that data about your purchases gets recorded and aggregated.
“Right now we have single devices that are connected,” Arkenberg said. “Companies are still trying to show what the true value is and how durable it is beyond the hype.”
Connectivity is the basis of an intelligent home. To take a dumb object and make it smart, you get it online. Belkin’s Wemo, for example, lets users control lights and appliances wirelessly and remotely, and can be paired with Amazon Echo or Google Home for voice-activated control.
Speaking of voice-activated control, Arkenberg pointed out that physical interfaces are evolving, too, to the point that we’re actually getting rid of interfaces entirely, or transitioning to ‘soft’ interfaces like voice or gesture.
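At its core, a "soft" voice interface boils down to two steps: parse an utterance into an intent, then route that intent to a device command. The sketch below is a deliberately naive toy illustration of that idea; the device names and parsing rules are invented here, not any vendor's actual API.

```python
# Toy sketch of a voice-style interface: utterance -> intent -> device command.
# Device names and the parsing vocabulary are hypothetical illustrations.

DEVICES = {
    "lights": {"state": "off"},
    "heater": {"state": "off"},
}

def parse_intent(utterance: str):
    """Very naive intent parser: look for an action word and a known device."""
    words = utterance.lower().split()
    action = "on" if "on" in words else "off" if "off" in words else None
    device = next((w for w in words if w in DEVICES), None)
    return action, device

def handle(utterance: str) -> str:
    """Route a parsed intent to a device, or report failure to understand."""
    action, device = parse_intent(utterance)
    if action is None or device is None:
        return "Sorry, I didn't understand that."
    DEVICES[device]["state"] = action
    return f"Turning the {device} {action}."

print(handle("please turn the lights on"))  # Turning the lights on.
```

Real assistants replace the keyword matching with speech recognition and statistical intent models, but the dispatch structure is the same.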
Drivers of Change
Consumers are open to smart home tech and companies are working to provide it. But what are the drivers making this tech practical and affordable? Arkenberg said there are three big ones:
Computation: Computers have gotten exponentially more powerful over the past few decades. If it wasn’t for processors that could handle massive quantities of information, nothing resembling an Echo or Alexa would even be possible. Artificial intelligence and machine learning are powering these devices, and they hinge on computing power too.
Sensors: “There are more things connected now than there are people on the planet,” Arkenberg said. Market research firm Gartner estimates there are 8.4 billion connected things currently in use. Wherever digital can replace hardware, it’s doing so. Cheaper sensors mean we can connect more things, which can then connect to each other.
Data: “Data is the new oil,” Arkenberg said. “The top companies on the planet are all data-driven giants. If data is your business, though, then you need to keep finding new ways to get more and more data.” Home assistants are essentially data collection systems that sit in your living room and collect data about your life. That data in turn sets up the potential of machine learning.
Colonizing the Living Room
Alexa and Echo can turn lights on and off, and Nest can help you be energy-efficient. But beyond these, what does an intelligent home really look like?
Arkenberg’s vision of an intelligent home uses sensing, data, connectivity, and modeling to manage resource efficiency, security, productivity, and wellness.
Autonomous vehicles provide an interesting comparison: they’re surrounded by sensors that constantly map the world, building dynamic models that let the vehicle understand the change around it and predict what comes next. Might we want this to become a model for our homes, too? By making them smart and connecting them, Arkenberg said, they’d become “more biological.”
There are already several products on the market that fit this description. RainMachine uses weather forecasts to adjust home landscape watering schedules. Neurio monitors energy usage, identifies areas where waste is happening, and makes recommendations for improvement.
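A weather-aware sprinkler controller of this kind presumably offsets scheduled watering by the rain the forecast already supplies. The sketch below shows one way such logic could work; the scaling rule and parameter values are invented for illustration and are not RainMachine's actual algorithm.

```python
def adjusted_watering_minutes(base_minutes: float,
                              forecast_rain_mm: float,
                              rain_per_minute_mm: float = 0.5) -> float:
    """Reduce a scheduled watering run by the water the forecast already supplies.

    Assumes each minute of watering delivers `rain_per_minute_mm` of water
    (a hypothetical calibration); forecast rain offsets that, never going
    below zero minutes.
    """
    offset_minutes = forecast_rain_mm / rain_per_minute_mm
    return max(0.0, base_minutes - offset_minutes)

print(adjusted_watering_minutes(20, 0))   # 20.0 -> no rain, full schedule
print(adjusted_watering_minutes(20, 5))   # 10.0 -> 5 mm forecast offsets 10 min
print(adjusted_watering_minutes(20, 15))  # 0.0  -> heavy rain, skip watering
```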
These are small steps in connecting our homes with knowledge systems and giving them the ability to understand and act on that knowledge.
He sees the homes of the future being equipped with digital ears (in the form of home assistants, sensors, and monitoring devices) and digital eyes (in the form of facial recognition technology and machine vision to recognize who’s in the home). “These systems are increasingly able to interrogate emotions and understand how people are feeling,” he said. “When you push more of this active intelligence into things, the need for us to directly interface with them becomes less relevant.”
Could our homes use these same tools to benefit our health and wellness? FREDsense uses bacteria to create electrochemical sensors that can be applied to home water systems to detect contaminants. If that’s not personal enough for you, get a load of this: ClinicAI can be installed in your toilet bowl to monitor and evaluate your biowaste. What’s the point, you ask? Early detection of colon cancer and other diseases.
What if one day your toilet’s biowaste analysis system could link up with your fridge, so that when you opened it, the fridge would tell you what to eat, how much, and at what time of day?
Roadblocks to Intelligence
“The connected and intelligent home is still a young category trying to establish value, but the technological requirements are now in place,” Arkenberg said. We’re already used to living in a world of ubiquitous computation and connectivity, and the expectation that things will be connected has become ingrained. For the intelligent home to become a widespread reality, its value needs to be established and its challenges overcome.
One of the biggest challenges will be getting used to the idea of continuous surveillance. We’ll get convenience and functionality if we give up our data, but how far are we willing to go? “Establishing security and trust is going to be a big challenge moving forward,” Arkenberg said.
There are also challenges of cost and reliability, of interoperability and device fragmentation, and, conversely, of what Arkenberg called ‘platform lock-on,’ where you’d end up relying on a single provider’s system and be unable to integrate devices from other brands.
Ultimately, Arkenberg sees homes being able to learn about us, manage our scheduling and transit, watch our moods and our preferences, and optimize our resource footprint while predicting and anticipating change.
“This is the really fascinating provocation of the intelligent home,” Arkenberg said. “And I think we’re going to start to see this play out over the next few years.”
Sounds like a home Dorothy wouldn’t recognize, in Kansas or anywhere else.
Stock Media provided by adam121 / Pond5
On the wall of Aaron Dollar's office is a poster for R.U.R. (Rossum's Universal Robots), the 1920 Czech play that gave us the word "robot." The story ends with the eponymous robots seizing control of the factory of their origin and then wiping out nearly all of humanity. Dollar, fortunately, has something more cheerful in mind for the future of human-robot relations.
“The behavior of the computer at any moment is determined by the symbols which he is observing and his 'state of mind' at that moment.” – Alan Turing

Artificial intelligence has a memory problem. Back in early 2015, Google’s mysterious DeepMind unveiled an algorithm that could teach itself to play Atari games. Based on deep neural nets, the AI impressively mastered nostalgic favorites such as Space Invaders and Pong without needing any explicit programming…
As ROS (Robot Operating System) is used by more and more robots, a new way of building robots on top of ROS is coming into play: H-ROS, the Hardware Robot Operating System. It is currently supported by several companies that have adopted its standard interfaces. Each component runs ROS 2.0 on its own, with its own ROS nodes and topics. Building a robot becomes a matter of putting together H-ROS components that can come from different manufacturers yet still interoperate, thanks to the standard hardware interfaces defined within H-ROS. The blocks that make up the system fall into five categories: sensing, actuation, communication, cognition, and hybrid components. Below is the press release provided to us by Erle Robotics, one of the several firms currently working on this.
Erle Robotics announced a new platform that provides manufacturers tools for building interoperable robot components that can easily be exchanged between robots
Photo Credit: https://www.h-ros.com/, www.erlerobotics.com
Erle Robotics announced during ROSCon 2016 in Seoul, Korea, a new game-changing standard for building robot components, H-ROS: the Hardware Robot Operating System. H-ROS provides manufacturers tools for building interoperable robot components that can easily be exchanged or replaced between robots.
Powered by the popular Robot Operating System (ROS), H-ROS offers building-block-style parts that come as reusable and reconfigurable components, allowing developers to easily upgrade their robots with hardware from different manufacturers and add new features in seconds.
With H-ROS, building robots will be about placing H-ROS-compatible hardware components together to build new robot configurations. Constructing robots won’t be restricted to the few with advanced technical skills; it will extend to anyone with a general understanding of the sensing and actuation needed in a particular scenario.
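The component-composition idea described above can be illustrated with a toy publish/subscribe model in plain Python. This is only a sketch of the concept of interoperable components agreeing on topic interfaces; it is not the real ROS 2.0 or H-ROS API, and the component names are hypothetical.

```python
# Toy sketch of component composition over topics, in the spirit of the
# H-ROS description above. Not the real ROS 2.0 / H-ROS API.
from collections import defaultdict
from typing import Any, Callable

class Bus:
    """Minimal topic bus: components publish and subscribe by topic name."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self.subscribers[topic].append(callback)

    def publish(self, topic: str, message: Any) -> None:
        for callback in self.subscribers[topic]:
            callback(message)

# A "sensing" component from one hypothetical vendor...
def make_range_sensor(bus: Bus):
    def read(distance_m: float) -> None:
        bus.publish("range", distance_m)
    return read

# ...and an "actuation" component from another. They interoperate only
# because both agree on the "range" topic's message interface.
log = []
def motor_controller(distance_m: float) -> None:
    log.append("stop" if distance_m < 0.5 else "go")

bus = Bus()
bus.subscribe("range", motor_controller)
sensor = make_range_sensor(bus)
sensor(2.0)   # far away -> controller commands "go"
sensor(0.2)   # obstacle -> controller commands "stop"
print(log)    # ['go', 'stop']
```

The point of the standard interface is that either component could be swapped for another manufacturer's part, as long as the replacement speaks the same topic contract.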
H-ROS was initially funded by the US Defense Advanced Research Projects Agency (DARPA) through the Robotics Fast Track program in 2016 and developed by Erle Robotics. The platform has already been tested by several international manufacturers who have built robots with this technology; one example is the H-ROS Turtlebot, presented during the conference in Seoul.
H-ROS is now available for selected industry partners and will soon be released for the wider robotics community. Additional information can be requested through its official web page at https://h-ros.com/.
The post H-ROS – Hardware Robot Operating System appeared first on Roboticmagazine.