Tag Archives: hacker

#437709 iRobot Announces Major Software Update, ...

Since the release of the very first Roomba in 2002, iRobot’s long-term goal has been to deliver cleaner floors in a way that’s effortless and invisible. Which sounds pretty great, right? And arguably, iRobot has managed to do exactly this, with its most recent generation of robot vacuums that make their own maps and empty their own dustbins. For those of us who trust our robots, this is awesome, but iRobot has gradually been realizing that many Roomba users either don’t want this level of autonomy, or aren’t ready for it.

Today, iRobot is announcing a major new update to its app that represents a significant shift of its overall approach to home robot autonomy. Humans are being brought back into the loop through software that tries to learn when, where, and how you clean so that your Roomba can adapt itself to your life rather than the other way around.

To understand why this is such a shift for iRobot, let’s take a very brief look back at how the Roomba interface has evolved over the last couple of decades. The first generation of Roomba had three buttons that allowed (or required) the user to select whether the room being vacuumed was small, medium, or large. iRobot ditched that system one generation later, replacing the room size buttons with a single “clean” button. Programmable scheduling meant that users no longer needed to push any buttons at all, and with Roombas able to find their way back to their docking stations, all you needed to do was empty the dustbin. And with the most recent few generations (the S and i series), the dustbin emptying is also done for you, reducing direct interaction with the robot to once a month or less.

Image: iRobot

The point the top-end Roombas have now reached reflects a goal that iRobot has been working toward since 2002: with autonomy, scheduling, and the clean base to empty the bin, you can set up your Roomba to vacuum when you’re not home, giving you cleaner floors every single day without ever being aware that the robot is hard at work while you’re out. It’s not just hands-off, it’s brain-off. No noise, no fuss, just things being cleaner thanks to the efforts of a robot that does its best to be invisible to you. Personally, I’ve been completely sold on this idea for home robots, and iRobot CEO Colin Angle was as well.

“I probably told you that the perfect Roomba is the Roomba that you never see, you never touch, you just come home every day and it’s done the right thing,” Angle told us. “But customers don’t want that—they want to be able to control what the robot does. We started to hear this a couple of years ago, and it took a while before it sunk in, but it made sense.”

How? Angle compares it to having a human come into your house to clean, but without being allowed to tell them where or when to do their job. Maybe after a while you’d build up the amount of trust necessary for that to work, but in the short term it would likely be frustrating. And people get frustrated with their Roombas for exactly this reason. “The desire to have more control over what the robot does kept coming up, and for me, it required a pretty big shift in my view of what intelligence we were trying to build. Autonomy is not intelligence. We need to do something more.”

That something more, Angle says, is a partnership rather than autonomy. It’s an acknowledgement that not everyone has the same level of trust in robots as the people who build them. It’s an understanding that people want to feel in control of homes they’ve set up the way they want and have been cleaning the way they want, and that a robot shouldn’t just come in and do its own thing.

“Until the robot proves that it knows enough about your home and about the way that you want your home cleaned,” Angle says, “you can’t move forward.” He adds that this is one of those things that seem obvious in retrospect, but even if they’d wanted to address the issue before, they didn’t have the technology to solve the problem. Now they do. “This whole journey has been earning the right to take this next step, because a robot can’t be responsive if it’s incompetent,” Angle says. “But thinking that autonomy was the destination was where I was just completely wrong.”

The previous iteration of the iRobot app (and Roombas themselves) are built around one big fat CLEAN button. The new approach instead tries to figure out in much more detail where the robot should clean, and when, using a mixture of autonomous technology and interaction with the user.

Where to Clean
Knowing where to clean depends on your Roomba having a detailed and accurate map of its environment. For several generations now, Roombas have been using visual simultaneous localization and mapping (VSLAM) to build persistent maps of your home. So far, these maps have been used to tell the Roomba to clean in specific rooms, but that’s about it. With the new update, Roombas with cameras will be able to recognize some objects and features in your home, including chairs, tables, couches, and even countertops. The robots will use these features to identify where messes tend to happen so that they can focus on those areas—like around the dining room table or along the front of the couch.

We should take a minute here to clarify how the Roomba is using its camera. The original (primary?) purpose of the camera was for VSLAM, where the robot would take photos of your home, downsample them into QR-code-like patterns of light and dark, and then use those (with the assistance of other sensors) to navigate. Now the camera is also being used to take pictures of other stuff around your house to make that map more useful.
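To make that downsampling idea concrete, here’s a purely illustrative Python sketch: it reduces a grayscale camera frame to a coarse grid of light and dark cells, the kind of low-resolution pattern a VSLAM system might match between frames. The function name, grid size, and thresholding are all invented for illustration; iRobot’s actual pipeline is not public.

```python
import numpy as np

def to_landmark_grid(image: np.ndarray, grid: int = 8) -> np.ndarray:
    """Downsample a grayscale image into a grid x grid pattern of
    light (1) and dark (0) cells by block-averaging brightness."""
    h, w = image.shape
    # Trim so the image divides evenly into grid x grid blocks.
    h, w = h - h % grid, w - w % grid
    blocks = image[:h, :w].reshape(grid, h // grid, grid, w // grid)
    means = blocks.mean(axis=(1, 3))  # average brightness per cell
    # Threshold against the overall mean to get the QR-code-like pattern.
    return (means > means.mean()).astype(np.uint8)

# A synthetic 64x64 "frame": bright left half, dark right half.
frame = np.zeros((64, 64))
frame[:, :32] = 255.0
pattern = to_landmark_grid(frame)
print(pattern.shape)  # (8, 8)
```

Matching these compact patterns frame-to-frame (together with wheel odometry and other sensors) is far cheaper than comparing full photographs, which is why this kind of representation suits a small onboard processor.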

Photo: iRobot

This is done through machine learning using a library of images of common household objects from a floor perspective that iRobot had to develop from scratch. Angle clarified for us that this is all done via a neural net that runs on the robot, and that “no recognizable images are ever stored on the robot or kept, and no images ever leave the robot.” Worst case, if all the data iRobot has about your home gets somehow stolen, the hacker would only know that (for example) your dining room has a table in it and the approximate size and location of that table, because the map iRobot has of your place only stores symbolic representations rather than images.
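As a sketch of what such a symbolic map might look like (the class layout, labels, and coordinates here are invented for illustration, not iRobot’s actual data format), each recognized object can be stored as nothing more than a label, a position, and an approximate footprint, from which a cleaning focus zone can be derived:

```python
from dataclasses import dataclass

@dataclass
class MapObject:
    label: str                    # e.g. "dining table"
    center: tuple[float, float]   # position in meters, in the map frame
    size: tuple[float, float]     # approximate footprint in meters

def focus_zone(obj: MapObject, margin: float = 0.5) -> tuple[float, float, float, float]:
    """Bounding box (x0, y0, x1, y1) around an object, padded by a
    margin, where crumbs and messes tend to accumulate."""
    (cx, cy), (w, h) = obj.center, obj.size
    return (cx - w / 2 - margin, cy - h / 2 - margin,
            cx + w / 2 + margin, cy + h / 2 + margin)

# Only symbols like these are persisted, never camera frames.
home_map = [
    MapObject("dining table", center=(3.2, 1.5), size=(1.8, 0.9)),
    MapObject("couch", center=(0.8, 4.0), size=(2.2, 0.9)),
]
zone = focus_zone(home_map[0])
```

A map like this is exactly what the worst-case scenario in the article describes: an attacker who stole it would learn only that there is a table of roughly that size at roughly that position.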

Another useful new feature is intended to help manage what Angle calls the “evil Roomba places” that every home has: the spots that reliably cause Roombas to get stuck. If a place is evil enough that the Roomba gives up completely and has to call you for help, it will now remember, and suggest either that you make some changes or that it stop cleaning there, which seems reasonable.

When to Clean
It turns out that the primary cause of mission failure for Roombas is not that they get stuck or that they run out of battery—it’s user cancellation, usually because the robot is getting in the way or being noisy when you don’t want it to be. “If you kill a Roomba’s job because it annoys you,” points out Angle, “how is that robot being a good partner? I think it’s an epic fail.” Of course, it’s not the robot’s fault, because Roombas only clean when we tell them to, which Angle says is part of the problem. “People actually aren’t very good at making their own schedules—they tend to oversimplify, and not think through what their schedules are actually about, which leads to lots of [figurative] Roomba death.”

To help you figure out when the robot should actually be cleaning, the new app will look for patterns in when you ask the robot to clean, and then recommend a schedule based on those patterns. That might mean the robot cleans different areas at different times every day of the week. The app will also make event-based scheduling recommendations, integrated with other smart home devices. Would you prefer the Roomba to clean every time you leave the house? The app can integrate with your security system (or garage door, or any number of other things) and take care of that for you.
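A minimal sketch of that pattern-finding step could look at the history of manually started jobs and surface recurring weekday-and-hour slots. The function name and repeat threshold are hypothetical; iRobot’s actual recommendation logic is not public.

```python
from collections import Counter
from datetime import datetime

def recommend_schedule(clean_starts: list[datetime], min_count: int = 2):
    """Suggest recurring (weekday, hour) slots based on when the user
    has historically started cleaning jobs by hand."""
    slots = Counter((t.strftime("%A"), t.hour) for t in clean_starts)
    # Keep only slots that have repeated enough to look like a habit.
    return [slot for slot, n in slots.most_common() if n >= min_count]

history = [
    datetime(2021, 1, 4, 9),   # Monday 9 a.m.
    datetime(2021, 1, 11, 9),  # Monday 9 a.m.
    datetime(2021, 1, 18, 9),  # Monday 9 a.m.
    datetime(2021, 1, 6, 19),  # a one-off Wednesday evening
]
print(recommend_schedule(history))  # [('Monday', 9)]
```

The one-off Wednesday job falls below the threshold, so only the repeated Monday-morning habit becomes a suggested schedule entry.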

More generally, Roomba will now try to fit into the kinds of cleaning routines that many people already have established. For example, the app may suggest an “after dinner” routine that cleans just around the kitchen and dining room table. The app will also, to some extent, pay attention to the environment and season. It might suggest increasing your vacuuming frequency if pollen counts are especially high, or if it’s pet shedding season and you have a dog. Unfortunately, Roomba isn’t (yet?) capable of recognizing dogs on its own, so the app has to cheat a little bit by asking you some basic questions.

A Smarter App

Image: iRobot

The app update, which should be available starting today, is free. The scheduling and recommendations will work on every Roomba model, although for object recognition and anything related to mapping, you’ll need one of the more recent and fancier models with a camera. Future app updates will come on a more aggressive schedule: major releases every six months, with incremental updates even more frequently than that.

Angle also told us that this change in direction represents a substantial shift in resources for iRobot: the company has pivoted two-thirds of its engineering organization to focus on software-based collaborative intelligence rather than hardware. “It’s not like we’re done doing hardware,” Angle assured us. “But we do think about hardware differently. We view our robots as platforms that have longer life cycles, and each platform will be able to support multiple generations of software. We’ve kind of decoupled robot intelligence from hardware, and that’s a change.”

Angle believes that working toward more intelligent collaboration between humans and robots is “the brave new frontier of artificial intelligence. I expect it to be the frontier for a reasonable amount of time to come,” he adds. “We have a lot of work to do to create the type of easy-to-use experience that consumer robots need.”

Posted in Human Robots

#434260 The Most Surprising Tech Breakthroughs ...

Development across the entire information technology landscape certainly didn’t slow down this year. From CRISPR babies, to the rapid decline of the crypto markets, to a new robot on Mars, and discovery of subatomic particles that could change modern physics as we know it, there was no shortage of headline-grabbing breakthroughs and discoveries.

As 2018 comes to a close, we can pause and reflect on some of the biggest technology breakthroughs and scientific discoveries that occurred this year.

I reached out to a few Singularity University speakers and faculty across the various technology domains we cover asking what they thought the biggest breakthrough was in their area of expertise. The question posed was:

“What, in your opinion, was the biggest development in your area of focus this year? Or, what was the breakthrough you were most surprised by in 2018?”

I can share that for me, hands down, the most surprising development I came across in 2018 was learning that a publicly-traded company that was briefly valued at over $1 billion, and has over 12,000 employees and contractors spread around the world, has no physical office space and the entire business is run and operated from inside an online virtual world. This is Ready Player One stuff happening now.

For the rest, here’s what our experts had to say.

DIGITAL BIOLOGY
Dr. Tiffany Vora | Faculty Director and Vice Chair, Digital Biology and Medicine, Singularity University

“That’s easy: CRISPR babies. I knew it was technically possible, and I’ve spent two years predicting it would happen first in China. I knew it was just a matter of time, but I failed to predict the lack of oversight, the dubious consent process, the paucity of publicly available data, and the targeting of a disease that we already know how to prevent and treat, and that the children were at low risk of contracting anyway.

I’m not convinced that this counts as a technical breakthrough, since one of the girls probably isn’t immune to HIV, but it sure was a surprise.”

For more, read Dr. Vora’s summary of this recent stunning news from China regarding CRISPR-editing human embryos.

QUANTUM COMPUTING
Andrew Fursman | Co-Founder/CEO 1Qbit, Faculty, Quantum Computing, Singularity University

“There were two last-minute holiday season surprise quantum computing funding and technology breakthroughs:

First, right before the government shutdown, Congress passed priority legislation that will provide $1.2 billion for quantum computing research over the next five years. Second, there’s the rise of trapped ions as a truly viable, scalable quantum computing architecture.”

*Read this Gizmodo profile on an exciting startup in the space to learn more about this type of quantum computing

ENERGY
Ramez Naam | Chair, Energy and Environmental Systems, Singularity University

“2018 had plenty of energy surprises. In solar, we saw unsubsidized prices in the sunny parts of the world at just over two cents per kWh, or less than half the price of new coal or gas electricity. In the US southwest and Texas, new solar is also now cheaper than new coal or gas. But even more shockingly, in Germany, which is one of the least sunny countries on earth (it gets less sunlight than Canada), the average bid for new solar in a 2018 auction was less than 5 US cents per kWh. That’s as cheap as new natural gas in the US, and far cheaper than coal, gas, or any other new electricity source in most of Europe.

In fact, it’s now cheaper in some parts of the world to build new solar or wind than to run existing coal plants. Think tank Carbon Tracker calculates that, over the next 10 years, it will become cheaper to build new wind or solar than to operate coal power in most of the world, including specifically the US, most of Europe, and—most importantly—India and the world’s dominant burner of coal, China.

Here comes the sun.”

GLOBAL GRAND CHALLENGES
Darlene Damm | Vice Chair, Faculty, Global Grand Challenges, Singularity University

“In 2018 we saw a lot of areas in the Global Grand Challenges move forward—advancements in robotic farming technology and cultured meat, low-cost 3D printed housing, more sophisticated types of online education expanding to every corner of the world, and governments creating new policies to deal with the ethics of the digital world. These were the areas we were watching and had predicted there would be change.

What most surprised me was to see young people, especially teenagers, start to harness technology in powerful ways and use it as a platform to make their voices heard and drive meaningful change in the world. In 2018 we saw teenagers speak out on a number of issues related to their well-being and launch digital movements around gun and school safety, global warming, and the environment. We often talk about the harm technology can cause to young people, but on the flip side, it can be a very powerful tool for youth to start changing the world today, and something I hope we see more of in the future.”

BUSINESS STRATEGY
Pascal Finette | Chair, Entrepreneurship and Open Innovation, Singularity University

“Without a doubt the rapid and massive adoption of AI, specifically deep learning, across industries, sectors, and organizations. What was a curiosity for most companies at the beginning of the year has quickly made its way into the boardroom and leadership meetings, and all the way down into the innovation and IT department’s agenda. You are hard-pressed to find a mid- to large-sized company today that is not experimenting or implementing AI in various aspects of its business.

On the slightly snarkier side of answering this question: The very rapid decline in interest in blockchain (and cryptocurrencies). The blockchain party was short, ferocious, and ended earlier than most would have anticipated, with a huge hangover for some. The good news—with the hot air dissipated, we can now focus on exploring the unique use cases where blockchain does indeed offer real advantages over centralized approaches.”

*Author note: snark is welcome and appreciated

ROBOTICS
Hod Lipson | Director, Creative Machines Lab, Columbia University

“The biggest surprise for me this year in robotics was learning dexterity. For decades, roboticists have been trying to understand and imitate dexterous manipulation. We humans seem to be able to manipulate objects with our fingers with incredible ease—imagine sifting through a bunch of keys in the dark, or tossing and catching a cube. And while there has been much progress in machine perception, dexterous manipulation remained elusive.

There seemed to be something almost magical in how we humans physically manipulate the world around us. Decades of research in grasping and manipulation, and millions of dollars spent on robot-hand hardware development, have brought us little progress. But in late 2018, the Berkeley OpenAI group demonstrated that this hurdle may finally succumb to machine learning as well. Given 200 years’ worth of practice, machines learned to manipulate a physical object with amazing fluidity. This might be the beginning of a new age for dexterous robotics.”

MACHINE LEARNING
Jeremy Howard | Founding Researcher, fast.ai, Founder/CEO, Enlitic, Faculty Data Science, Singularity University

“The biggest development in machine learning this year has been the development of effective natural language processing (NLP).

The New York Times published an article last month titled “Finally, a Machine That Can Finish Your Sentence,” which argued that NLP neural networks have reached a significant milestone in capability and speed of development. The “finishing your sentence” capability mentioned in the title refers to a type of neural network called a “language model,” which is literally a model that learns how to finish your sentences.

Earlier this year, two systems (one, called ELMo, is from the Allen Institute for AI, and the other, called ULMFiT, was developed by me and Sebastian Ruder) showed that such a model could be fine-tuned to dramatically improve the state of the art in nearly every NLP task that researchers study. This work was further developed by OpenAI, whose approach was in turn greatly scaled up by Google Brain, which created a system called BERT that reached human-level performance on some of NLP’s toughest challenges.

Over the next year, expect to see fine-tuned language models used for everything from understanding medical texts to building disruptive social media troll armies.”

DIGITAL MANUFACTURING
Andre Wegner | Founder/CEO Authentise, Chair, Digital Manufacturing, Singularity University

“Most surprising to me was the extent and speed at which the industry finally opened up.

While previously only a few 3D printing suppliers had APIs and knew what to do with them, 2018 saw nearly every OEM (original equipment manufacturer) enabling data access and, even more surprisingly, shying away from proprietary standards and adopting MTConnect, as even stalwarts such as 3D Systems and Stratasys have done. This means that in two to three years, data access to machines will be easy, commonplace, and free. The value will be in what is being done with that data.

Another example of this openness is the seemingly endless stream of announcements of integrated workflows: GE’s announcement with most major software players to enable integrated solutions, EOS’s announcement with Siemens, and many more. It’s clear that all actors in the additive ecosystem have taken a step forward in terms of openness. The result is a faster pace of innovation, particularly in the software and data domains that are crucial to enabling the comprehensive digital workflows that drive agile and resilient manufacturing.

I’m more optimistic we’ll achieve that now than I was at the end of 2017.”

SCIENCE AND DISCOVERY
Paul Saffo | Chair, Future Studies, Singularity University, Distinguished Visiting Scholar, Stanford Media-X Research Network

“The most important development in technology this year isn’t a technology, but rather the astonishing science surprises made possible by recent technology innovations. My short list includes the discovery of the ‘neptmoon,’ a Neptune-scale moon circling a Jupiter-scale planet 8,000 light-years from us; the successful deployment of the Mars InSight lander a month ago; and the tantalizing ANITA detection (what could be a new subatomic particle, which would in turn blow the standard model wide open). The highest use of invention is to support science discovery, because those discoveries in turn lead us to the future innovations that will improve the state of the world—and fire up our imaginations.”

ROBOTICS
Pablos Holman | Inventor, Hacker, Faculty, Singularity University

“Just five or ten years ago, if you’d asked any of us technologists, ‘What is harder for robots: eyes, or fingers?’ we’d have all said eyes. Robots have extraordinary eyes now, but even in a surgical robot, the fingers are numb and don’t feel anything. Stanford robotics researchers have invented fingertips that can feel, and this will be a kingpin that allows robots to go everywhere they haven’t been yet.”

BLOCKCHAIN
Nathana Sharma | Blockchain, Policy, Law, and Ethics, Faculty, Singularity University

“2017 was the year of peak blockchain hype. 2018 has been a year of resetting expectations and technological development, even as the broader cryptocurrency markets have faced a winter. It’s now about seeing the rise of adoption and of applications that people want and need to use. An incredible piece of news from December 2018 is that Facebook is developing a cryptocurrency for users to make payments through WhatsApp. That’s surprisingly fast mainstream adoption of this new technology, and indicates how powerful it is.”

ARTIFICIAL INTELLIGENCE
Neil Jacobstein | Chair, Artificial Intelligence and Robotics, Singularity University

“I think one of the most visible improvements in AI was illustrated by the Boston Dynamics Parkour video. This was not due to an improvement in brushless motors, accelerometers, or gears. It was due to improvements in AI algorithms and training data. To be fair, the video released was cherry-picked from numerous attempts, many of which ended with a crash. However, the fact that it could be accomplished at all in 2018 was a real win for both AI and robotics.”

NEUROSCIENCE
Divya Chander | Chair, Neuroscience, Singularity University

“2018 ushered in a new era of exponential trends in non-invasive brain modulation. Changing behavior or restoring function takes on a new meaning when invasive interfaces are no longer needed to manipulate neural circuitry. The end of 2018 saw two amazing announcements: the ability to grow neural organoids (mini-brains) in a dish from neural stem cells that started expressing electrical activity, mimicking the brain function of premature babies, and the first (known) application of CRISPR to genetically alter two fetuses grown through IVF. Although this was ostensibly to provide genetic resilience against HIV infections, imagine what would happen if we started tinkering with neural circuitry and intelligence.”

Image Credit: Yurchanka Siarhei / Shutterstock.com

Posted in Human Robots

#430814 The Age of Cyborgs Has Arrived

How many cyborgs did you see during your morning commute today? I would guess at least five. Did they make you nervous? Probably not; you likely didn’t even realize they were there.
In a presentation titled “Biohacking and the Connected Body” at Singularity University Global Summit, Hannes Sjoblad informed the audience that we’re already living in the age of cyborgs. Sjoblad is co-founder of the Sweden-based biohacker network Bionyfiken, a chartered non-profit that unites DIY-biologists, hackers, makers, body modification artists and health and performance devotees to explore human-machine integration.
Sjoblad said the cyborgs we see today don’t look like Hollywood prototypes; they’re regular people who have integrated technology into their bodies to improve or monitor some aspect of their health. Sjoblad defined biohacking as applying the hacker ethic to biological systems. Some biohackers experiment on their own biology with the goal of taking the human body’s experience beyond what nature intended.
Smart insulin monitoring systems, pacemakers, bionic eyes, and cochlear implants are all examples of biohacking, according to Sjoblad. He told the audience, “We live in a time where, thanks to technology, we can make the deaf hear, the blind see, and the lame walk.” He is convinced that while biohacking could conceivably end up having Brave New World-like dystopian consequences, it can also be leveraged to improve and enhance our quality of life in multiple ways.
The field where biohacking can make the most positive impact is health. In addition to pacemakers and insulin monitors, several new technologies are being developed with the goal of improving our health and simplifying access to information about our bodies.
Ingestibles are a type of smart pill that uses wireless technology to monitor internal reactions to medications, helping doctors determine optimum dosage levels and tailor treatments to different people. Your body doesn’t absorb or process medication exactly as your neighbor’s does, so shouldn’t you each have a treatment that works best with your unique system? Colonoscopies and endoscopies could one day be replaced by miniature pill-shaped video cameras that would collect and transmit images as they travel through the digestive tract.
Security is another area where biohacking could be beneficial. One example Sjoblad gave was personalization of weapons: an invader in your house couldn’t fire your gun because it will have been matched to your fingerprint or synced with your body so that it only responds to you.
Biohacking can also simplify everyday tasks. In an impressive example of walking the walk rather than just talking the talk, Sjoblad had an NFC chip implanted in his hand. The chip contains data from everything he used to have to carry around in his pockets: credit and bank card information, key cards to enter his office building and gym, business cards, and frequent shopper loyalty cards. When he’s in line for a morning coffee or rushing to get to the office on time, he doesn’t have to root around in his pockets or bag to find the right card or key; he just waves his hand in front of a sensor and he’s good to go.
Evolved from radio frequency identification (RFID)—an older and widely deployed technology—NFC chips are activated by another chip, and small amounts of data can be transferred back and forth. No internet connection is necessary. Sjoblad sees his NFC implant as a personal key to the Internet of Things, a simple way for him to talk to the smart, connected devices around him.
Sjoblad isn’t the only person who feels a need for connection.

When British science writer Frank Swain realized he was going to go deaf, he decided to hack his hearing to be able to hear Wi-Fi. Swain developed software that tunes into wireless communication fields and uses an inbuilt Wi-Fi sensor to pick up router name, encryption modes and distance from the device. This data is translated into an audio stream where distant signals click or pop, and strong signals sound their network ID in a looped melody. Swain hears it all through an upgraded hearing aid.
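Swain’s actual software isn’t public, but the mapping he describes can be sketched along these lines, with invented thresholds and frequency ranges (and the looped network-ID melody omitted): weak, distant networks become clicks, while stronger ones get a pitch proportional to signal strength.

```python
def sonify(networks: list[tuple[str, int]]) -> list[dict]:
    """Turn a Wi-Fi scan into audio events.

    networks: (ssid, rssi_dbm) pairs from a scan. Weak, distant
    signals render as clicks; stronger ones as tones whose pitch
    rises with signal strength."""
    events = []
    for ssid, rssi in networks:
        if rssi < -80:  # very weak: just a click or pop
            events.append({"ssid": ssid, "kind": "click"})
        else:
            # Clamp to -80..-30 dBm and map linearly onto 200..2000 Hz.
            strength = min(max(rssi, -80), -30)
            freq = 200 + (strength + 80) * 36
            events.append({"ssid": ssid, "kind": "tone", "freq_hz": freq})
    return events

print(sonify([("HomeNet", -42), ("CafeWifi", -88)]))
```

A real implementation would feed events like these to a synthesizer streaming into the hearing aid; the sketch only covers the signal-to-sound mapping itself.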
Global datastreams can also become sensory experiences. Spanish artist Moon Ribas developed and implanted a chip in her elbow that is connected to the global monitoring system for seismographic sensors; each time there’s an earthquake, she feels it through vibrations in her arm.
You can feel connected to our planet, too: North Sense makes a “standalone artificial sensory organ” that connects to your body and vibrates whenever you’re facing north. It’s a built-in compass; you’ll never get lost again.
Biohacking applications are likely to proliferate in the coming years, some of them more useful than others. But there are serious ethical questions that can’t be ignored during development and use of this technology. To what extent is it wise to tamper with nature, and who gets to decide?
Most of us are probably OK with waiting in line an extra 10 minutes or occasionally having to pull up a maps app on our phones if it means we don’t need to implant computer chips into our forearms. If it’s frightening to think of criminals stealing our wallets, imagine them cutting a chunk of our skin out to get instant access to and control over our personal data. The physical invasiveness and potential for something to go wrong seem to far outweigh the benefits the average person could derive from this technology.
But that may not always be the case. It’s worth noting that the miniaturization of technology continues at a rapid pace, and the smaller things get, the less invasive (and hopefully more useful) they’ll be. Even today, there are people sensibly benefiting from biohacking. If you look closely enough, you’ll spot at least a couple of cyborgs on your commute tomorrow morning.
Image Credit: Movement Control Laboratory / University of Washington – Deep Dream Generator

Posted in Human Robots