Tag Archives: code

#436491 The Year’s Most Fascinating Tech ...

Last Saturday we took a look at some of the most-read Singularity Hub articles from 2019. This week, we’re featuring some of our favorite articles from the last year. As opposed to short pieces about what’s happening, these are long reads about why it matters and what’s coming next. Some of them make the news while others frame the news, go deep on big ideas, go behind the scenes, or explore the human side of technological progress.

We hope you find them as fascinating, inspiring, and illuminating as we did.

DeepMind and Google: The Battle to Control Artificial Intelligence
Hal Hodson | 1843
“[DeepMind cofounder and CEO Demis] Hassabis thought DeepMind would be a hybrid: it would have the drive of a startup, the brains of the greatest universities, and the deep pockets of one of the world’s most valuable companies. Every element was in place to hasten the arrival of [artificial general intelligence] and solve the causes of human misery.”

The Most Powerful Person in Silicon Valley
Katrina Brooker | Fast Company
“Billionaire Masayoshi Son—not Elon Musk, Jeff Bezos, or Mark Zuckerberg—has the most audacious vision for an AI-powered utopia where machines control how we live. And he’s spending hundreds of billions of dollars to realize it. Are you ready to live in Masa World?”

AR Will Spark the Next Big Tech Platform—Call It Mirrorworld
Kevin Kelly | Wired
“Eventually this melded world will be the size of our planet. It will be humanity’s greatest achievement, creating new levels of wealth, new social problems, and uncountable opportunities for billions of people. There are no experts yet to make this world; you are not late.”

Behind the Scenes of a Radical New Cancer Cure
Ilana Yurkiewicz | Undark
“I remember the first time I watched a patient get his Day 0 infusion. It felt anti-climactic. The entire process took about 15 minutes. The CAR-T cells are invisible to the naked eye, housed in a small plastic bag containing clear liquid. ‘That’s it?’ my patient asked when the nurse said it was over. The infusion part is easy. The hard part is everything that comes next.”

The Promise and Price of Cellular Therapies
Siddhartha Mukherjee | The New Yorker
“We like to imagine medical revolutions as, well, revolutionary—propelled forward through leaps of genius and technological innovation. But they are also evolutionary, nudged forward through the optimization of design and manufacture.”

Impossible Foods’ Rising Empire of Almost Meat
Chris Ip | Engadget
“Impossible says it wants to ultimately create a parallel universe of ersatz animal products from steak to eggs. …Yet as Impossible ventures deeper into the culinary uncanny valley, it also needs society to discard a fundamental cultural idea that dates back millennia and accept a new truth: Meat doesn’t have to come from animals.”

Inside the Amazon Warehouse Where Humans and Machines Become One
Matt Simon | Wired
“Seen from above, the scale of the system is dizzying. My robot, a little orange slab known as a ‘drive’ (or more formally and mythically, Pegasus), is just one of hundreds of its kind swarming a 125,000-square-foot ‘field’ pockmarked with chutes. It’s a symphony of electric whirring, with robots pausing for one another at intersections and delivering their packages to the slides.”

Boston Dynamics’ Robots Are Preparing to Leave the Lab—Is the World Ready?
James Vincent | The Verge
“After decades of kicking machines in parking lots, the company is set to launch its first ever commercial bot later this year: the quadrupedal Spot. It’s a crucial test for a company that’s spent decades pursuing long-sighted R&D. And more importantly, the success—or failure—of Spot will tell us a lot about our own robot future. Are we ready for machines to walk among us?”

I Cut the ‘Big Five’ Tech Giants From My Life. It Was Hell
Kashmir Hill | Gizmodo
“Critics of the big tech companies are often told, ‘If you don’t like the company, don’t use its products.’ I did this experiment to find out if that is possible, and I found out that it’s not—with the exception of Apple. …These companies are unavoidable because they control internet infrastructure, online commerce, and information flows.”

Why I (Still) Love Tech: In Defense of a Difficult Industry
Paul Ford | Wired
“The mysteries of software caught my eye when I was a boy, and I still see it with the same wonder, even though I’m now an adult. Proudshamed, yes, but I still love it, the mess of it, the code and toolkits, down to the pixels and the processors, and up to the buses and bridges. I love the whole made world. But I can’t deny that the miracle is over, and that there is an unbelievable amount of work left for us to do.”

The Peculiar Blindness of Experts
David Epstein | The Atlantic
“In business, esteemed (and lavishly compensated) forecasters routinely are wildly wrong in their predictions of everything from the next stock-market correction to the next housing boom. Reliable insight into the future is possible, however. It just requires a style of thinking that’s uncommon among experts who are certain that their deep knowledge has granted them a special grasp of what is to come.”

The Most Controversial Tree in the World
Rowan Jacobson | Pacific Standard
“…we are all GMOs, the beneficiaries of freakishly unlikely genetic mash-ups, and the real Island of Dr. Moreau is that blue-green botanical garden positioned third from the sun. Rather than changing the nature of nature, as I once thought, this might just be the very nature of nature.”

How an Augmented Reality Game Escalated Into Real-World Spy Warfare
Elizabeth Ballou | Vice
“In Ingress, players accept that every park and train station could be the site of an epic showdown, but that’s only the first step. The magic happens when other people accept that, too. When players feel like that magic is real, there are few limits to what they’ll do or where they’ll go for the sake of the game.”

The Shady Cryptocurrency Boom on the Post-Soviet Frontier
Hannah Lucinda Smith | Wired
“…although the tourists won’t guess it as they stand at Kuchurgan’s gates, admiring how the evening light reflects off the silver plaque of Lenin, this plant is pumping out juice to a modern-day gold rush: a cryptocurrency boom that is underway all across the former Soviet Union, from the battlefields of eastern Ukraine to time-warp enclaves like Transnistria and freshly annexed Crimea.”

Scientists Are Totally Rethinking Animal Cognition
Ross Andersen | The Atlantic
“This idea that animals are conscious was long unpopular in the West, but it has lately found favor among scientists who study animal cognition. …For many scientists, the resonant mystery is no longer which animals are conscious, but which are not.”

I Wrote This on a 30-Year-Old Computer
Ian Bogost | The Atlantic
“[Back then] computing was an accompaniment to life, rather than the sieve through which all ideas and activities must filter. That makes using this 30-year-old device a surprising joy, one worth longing for on behalf of what it was at the time, rather than for the future it inaugurated.”

Image Credit: Wes Hicks / Unsplash

Posted in Human Robots

#436488 Tech’s Biggest Leaps From the Last 10 ...

As we enter our third decade in the 21st century, it seems appropriate to reflect on the ways technology developed and note the breakthroughs that were achieved in the last 10 years.

The 2010s saw IBM’s Watson win at Jeopardy!, ushering in mainstream awareness of machine learning, and DeepMind’s AlphaGo defeat the world’s top Go players. It was the decade in which industrial tools like drones, 3D printers, genetic sequencing, and virtual reality (VR) all became consumer products. And it was a decade in which some alarming trends related to surveillance, targeted misinformation, and deepfakes came online.

For better or worse, the past decade was a breathtaking era in human history in which the idea of exponential growth in information technologies powered by computation became a mainstream concept.

As I did last year for 2018, I’ve asked a collection of experts across the Singularity University faculty to help frame the biggest breakthroughs and moments that gave shape to the past 10 years. I asked them what, in their opinion, was the most important breakthrough in their respective fields over the past decade.

My own answer to this question, focused in the space of augmented and virtual reality, would be the stunning announcement in March of 2014 that Facebook acquired Oculus VR for $2 billion. Although VR technology had been around for a while, it was at this precise moment that VR arrived as a consumer technology platform. Facebook, largely fueled by the singular interest of CEO Mark Zuckerberg, has funded the development of this industry, keeping alive the hope that consumer VR can become a sustainable business. In the meantime, VR has continued to grow in sophistication and usefulness, though it has yet to truly take off as a mainstream concept. That will hopefully be a development for the 2020s.

Below is a decade in review across the technology areas that are giving shape to our modern world, as described by the SU community of experts.

Digital Biology
Dr. Tiffany Vora | Faculty Director and Vice Chair, Digital Biology and Medicine, Singularity University

In my mind, this decade of astounding breakthroughs in the life sciences and medicine rests on the achievement of the $1,000 human genome in 2016. More-than-exponentially falling costs of DNA sequencing have driven advances in medicine, agriculture, ecology, genome editing, synthetic biology, the battle against climate change, and our fundamental understanding of life and its breathtaking connections. The “digital” revolution in DNA constituted an important model for harnessing other types of biological information, from personalized bio data to massive datasets spanning populations and species.

Crucially, by aggressively driving down the cost of such analyses, researchers and entrepreneurs democratized access to the source code of life, with attendant financial, cultural, and ethical consequences. Exciting, but take heed: Veritas Genetics spearheaded a $600 genome in 2019, only to shutter its US operations after its financing became entangled in the US-China trade war. Stay tuned through the early 2020s to see the pricing of DNA sequencing fall even further … and to experience the many ways that cheaper, faster harvesting of biological data will enrich your daily life.

Cryptocurrency
Alex Gladstein | Chief Strategy Officer, Human Rights Foundation

The past decade has seen Bitcoin go from just an idea on an obscure online message board to a global financial network carrying more than 100 billion dollars in value. And we’re just getting started. One recent defining moment in the cryptocurrency space has been a stunning trend underway in Venezuela, where today, the daily dollar-denominated value of Bitcoin traded now far exceeds the daily dollar-denominated value traded on the Caracas Stock Exchange. It’s just one country, but it’s a significant country, and a paradigm shift.

Governments and corporations are following Bitcoin’s success too, and are looking to launch their own digital currencies. China will launch its “DC/EP” project in the coming months, and Facebook is trying to kickstart its Libra project. There are technical and regulatory uncertainties for both, but one thing is for certain: the era of digital currency has arrived.

Business Strategy and Entrepreneurship
Pascal Finette | Chair, Entrepreneurship and Open Innovation, Singularity University

For me, without a doubt, the most interesting and quite possibly ground-shifting development in the fields of entrepreneurship and corporate innovation in the last ten years is the rapid maturing of customer-driven product development frameworks such as Lean Startup, and its subsequent adoption by corporates for their own innovation purposes.

Tools and frameworks like the Business Model Canvas, agile (software) development and the aforementioned Lean Startup methodology fundamentally shifted the way we think and go about building products, services, and companies, with many of these tools bursting onto the startup scene in the late 2000s and early 2010s.

As these tools matured, they found mass adoption not only among startups around the world but also among incumbent companies, which eagerly adopted them to increase their own innovation velocity and success.

Energy
Ramez Naam | Co-Chair, Energy and Environment, Singularity University

The 2010s were the decade that saw clean electricity, energy storage, and electric vehicles break through price and performance barriers around the world. Solar, wind, batteries, and EVs started this decade as technologies that had to be subsidized. That was the first phase of their existence. Now they’re entering their third, most disruptive phase, where shifting to clean energy and mobility is cheaper than continuing to use existing coal, gas, or oil infrastructure.

Consider that at the start of 2010, there was no place on earth where building new solar or wind was cheaper than building new coal or gas power generation. By 2015, in some of the sunniest and windiest places on earth, solar and wind had entered their second phase, where they were cost-competitive for new power. And then, in 2018 and 2019, we started to see the edge of the third phase, as building new solar and wind, in some parts of the world, was cheaper than operating existing coal or gas power plants.

Food Technology
Liz Specht, PhD | Associate Director of Science & Technology, The Good Food Institute

The arrival of mainstream plant-based meat is easily the food tech advance of the decade. Meat analogs have, of course, been around forever. But only in the last decade have companies like Beyond Meat and Impossible Foods decided to cut animals out of the process and build no-compromise meat directly from plants.

Plant-based meat is already transforming the fast-food industry. For example, the introduction of the Impossible Whopper led Burger King to their most profitable quarter in many years. But the global food industry as a whole is shifting as well. Tyson, JBS, Nestle, Cargill, and many others are all embracing plant-based meat.

Augmented and Virtual Reality
Jody Medich | CEO, Superhuman-x

The breakthrough moment for augmented and virtual reality came in 2013, when Palmer Luckey took apart an Android smartphone and added optic lenses to make the first version of the Oculus Rift. Prior to that moment, we struggled to miniaturize the components needed for low-latency head-worn devices. But thanks to the smartphone race kicked off by the iPhone in 2007, we finally had a suite of sensors, chips, displays, and computing power small enough to put on the head.

What will the next 10 years bring? Look for AR/VR to explode in a big way. We are right on the cusp of that tipping point when the tech is finally “good enough” for our linear expectations. Given all it can do today, we can’t even picture what’s possible. Just as today we can’t function without our phones, by 2029 we’ll feel lost without some AR/VR product. It will be the way we interact with computing, smart objects, and AI. Tim Cook, Apple CEO, predicts it will replace all of today’s computing devices. I can’t wait.

Philosophy of Technology
Alix Rübsaam | Faculty Fellow, Singularity University, Philosophy of Technology/Ethics of AI

The last decade has seen a significant shift in our general attitude towards the algorithms that we now know dictate much of our surroundings. Looking back at the beginning of the decade, it seems we were blissfully unaware of how the data we freely and willingly surrendered would feed the algorithms that would come to shape every aspect of our daily lives: the news we consume, the products we purchase, the opinions we hold, etc.

If I were to isolate a single publication that contributed greatly to the shift in public discourse on algorithms, it would have to be Cathy O’Neil’s Weapons of Math Destruction from 2016. It remains a comprehensive, readable, and highly informative insight into how algorithms dictate our finances, our jobs, where we go to school, or if we can get health insurance. Its publication represents a pivotal moment when the general public started to question whether we should be OK with outsourcing decision making to these opaque systems.

The ubiquity of ethical guidelines for AI and algorithms published just in the last year (perhaps most comprehensively by the AI Now Institute) fully demonstrates the shift in public opinion of this decade.

Data Science
Ola Kowalewski | Faculty Fellow, Singularity University, Data Innovation

In the last decade we entered the era of internet and smartphone ubiquity. The number of internet users doubled, with nearly 60 percent of the global population now connected online and over 35 percent owning a smartphone. With billions of people in a state of constant connectedness, and therefore of constant surveillance, the companies that built the tech infrastructure and information pipelines have come to dominate the global economy. This shift, from tech companies as underdogs to arguably the world’s major powers, sets the landscape we enter for the next decade.

Global Grand Challenges
Darlene Damm | Vice Chair, Faculty, Global Grand Challenges, Singularity University

The biggest breakthrough of the last decade in social impact and technology is that the social impact sector switched from seeing technology as something problematic to avoid to embracing it as one of the most effective ways to create social change. We now see people using exponential technologies to solve all sorts of social challenges, in areas ranging from disaster response to hunger to shelter.

The world’s leading social organizations, such as UNICEF and the World Food Programme, have launched their own venture funds and accelerators, and the United Nations recently declared that digitization is revolutionizing global development.

Digital Biology
Raymond McCauley | Chair, Digital Biology, Singularity University, Co-Founder & Chief Architect, BioCurious; Principal, Exponential Biosciences

CRISPR is bringing about a revolution in genetic engineering. It’s obvious, and it’s huge. What may not be so obvious is the widespread adoption of genetic testing. And this may have an even longer-lasting effect. It’s used to test new babies, to solve medical mysteries, and to catch serial killers. Thanks to holiday ads from 23andMe and Ancestry.com, it’s everywhere. Testing your DNA is now a common over-the-counter product. People are using it to set their diet, to pick drugs, and even for dating (or at least picking healthy mates).

And we’re just in the early stages. Further down the line, doing large-scale studies on more people, with more data, will lead to the use of polygenic risk scores to help us rank our genetic potential for everything from getting cancer to being a genius. Can you imagine what it would be like for parents to pick new babies, GATTACA-style, to get the smartest kids? You don’t have to; it’s already happening.

Artificial Intelligence
Neil Jacobstein | Chair, Artificial Intelligence and Robotics, Singularity University

The convergence of exponentially improved computing power, deep learning algorithms, and access to massive data resulted in a series of AI breakthroughs over the past decade. These included vastly improved accuracy in identifying images, making self-driving cars practical, beating several world champions in Go, and identifying gender, smoking status, and age from retinal fundus photographs.

Combined, these breakthroughs convinced researchers and investors that after 50+ years of research and development, AI was ready for prime-time applications. Now, virtually every field of human endeavor is being revolutionized by machine learning. We still have a long way to go to achieve human-level intelligence and beyond, but the pace of worldwide improvement is blistering.

Hod Lipson | Professor of Engineering and Data Science, Columbia University

The biggest moment in AI in the past decade (and in its entire history, in my humble opinion) was midnight, Pacific time, September 30, 2012: the moment when machines finally opened their eyes. It was the moment when deep learning took off, breaking stagnant decades of machine blindness, when AI couldn’t reliably tell apart even a cat from a dog. That seemingly trivial accomplishment—a task any one-year-old child can do—has had a ripple effect on AI applications from driverless cars to health diagnostics. And this is just the beginning of what is sure to be a Cambrian explosion of AI.

Neuroscience
Divya Chander | Chair, Neuroscience, Singularity University

If the 2000s were the decade of brain mapping, then the 2010s were the decade of brain writing. Optogenetics, a technique for precisely mapping and controlling neurons and neural circuits using genetically-directed light, saw incredible growth in the 2010s.

Also in the last 10 years, neuromodulation, or the ability to rewire the brain using both invasive and non-invasive interfaces and energy, has exploded in use and form. For instance, the BrainGate consortium showed how electrode arrays implanted in the motor cortex allow paralyzed people to direct a robotic arm with their thoughts. These technologies, alone or in combination with robotics, exoskeletons, and flexible, implantable electronics, also make possible a future of human augmentation.

Image Credit: Jorge Guillen from Pixabay


#436258 For Centuries, People Dreamed of a ...

This is part six of a six-part series on the history of natural language processing.

In February of this year, OpenAI, one of the foremost artificial intelligence labs in the world, announced that a team of researchers had built a powerful new text generator called the Generative Pre-Trained Transformer 2, or GPT-2 for short. The researchers trained their system on a massive corpus of web text, giving it a broad set of natural language processing (NLP) capabilities, including reading comprehension, machine translation, and the ability to generate long strings of coherent text.

But as is often the case with NLP technology, the tool held both great promise and great peril. Researchers and policy makers at the lab were concerned that their system, if widely released, could be exploited by bad actors and misappropriated for “malicious purposes.”

The people of OpenAI, which defines its mission as “discovering and enacting the path to safe artificial general intelligence,” were concerned that GPT-2 could be used to flood the Internet with fake text, thereby degrading an already fragile information ecosystem. For this reason, OpenAI decided that it would not release the full version of GPT-2 to the public or other researchers.

GPT-2 is an example of a technique in NLP called language modeling, whereby the computational system internalizes a statistical blueprint of a text so it’s able to mimic it. Just like the predictive text on your phone—which selects words based on words you’ve used before—GPT-2 can look at a string of text and then predict what the next word is likely to be based on the probabilities inherent in that text.
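The idea can be illustrated with a toy sketch (this is not OpenAI’s code, and the tiny corpus is purely hypothetical): count which words follow which, then predict the most probable successor, exactly as predictive text does.

```python
from collections import Counter, defaultdict

# A toy corpus standing in for the text a language model internalizes.
corpus = "the cat sat on the mat and the cat ran".split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most probable next word, like predictive text on a phone."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" more often than "mat" does
```

GPT-2 does something conceptually similar, except that it conditions on long stretches of preceding context rather than a single word, and learns its probabilities with a neural network instead of a lookup table.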

GPT-2 can be seen as a descendant of the statistical language modeling that the Russian mathematician A. A. Markov developed in the early 20th century (covered in part three of this series).

What’s different with GPT-2, though, is the scale of the textual data modeled by the system. Whereas Markov analyzed a string of 20,000 letters to create a rudimentary model that could predict the likelihood of the next letter of a text being a consonant or a vowel, GPT-2 used 8 million articles scraped from Reddit to predict what the next word might be within that entire dataset.

And whereas Markov manually trained his model by counting only two parameters—vowels and consonants—GPT-2 used cutting-edge machine learning algorithms to do linguistic analysis with over 1.5 billion parameters, burning through huge amounts of computational power in the process.
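Markov’s two-class model is simple enough to sketch in a few lines. This is a loose reconstruction, not his actual calculation: it uses English vowels and a stand-in sentence rather than the Russian text of Eugene Onegin that Markov analyzed.

```python
from collections import Counter

text = "whereas markov analyzed a string of letters to create a rudimentary model"
letters = [c for c in text if c.isalpha()]

VOWELS = set("aeiou")

def kind(c):
    """Classify a letter as vowel (V) or consonant (C)."""
    return "V" if c in VOWELS else "C"

# Count transitions between the two letter classes: VV, VC, CV, CC.
transitions = Counter(kind(a) + kind(b) for a, b in zip(letters, letters[1:]))

# Estimated probability that a vowel is followed by another vowel.
p_vv = transitions["VV"] / (transitions["VV"] + transitions["VC"])
print(transitions, round(p_vv, 2))
```

From counts like these, Markov could predict the likelihood that the next letter in a text would be a consonant or a vowel; GPT-2 replaces the two letter classes with a vocabulary of tens of thousands of word pieces and the hand-counted table with 1.5 billion learned parameters.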

The results were impressive. In their blog post, OpenAI reported that GPT-2 could generate synthetic text in response to prompts, mimicking whatever style of text it was shown. If you prompt the system with a line of William Blake’s poetry, it can generate a line back in the Romantic poet’s style. If you prompt the system with a cake recipe, you get a newly invented recipe in response.

Perhaps the most compelling feature of GPT-2 is that it can answer questions accurately. For example, when OpenAI researchers asked the system, “Who wrote the book The Origin of Species?” it responded: “Charles Darwin.” While the system answers correctly only some of the time, the feature does seem to be a limited realization of Gottfried Leibniz’s dream of a language-generating machine that could answer any and all human questions (described in part two of this series).

After observing the power of the new system in practice, OpenAI elected not to release the fully trained model. In the lead-up to its release in February, there had been heightened awareness about “deepfakes,” synthetic images and videos, generated via machine learning techniques, in which people do and say things they haven’t really done and said. Researchers at OpenAI worried that GPT-2 could be used to essentially create deepfake text, making it harder for people to trust textual information online.

Responses to this decision varied. On one hand, OpenAI’s caution prompted an overblown reaction in the media, with articles about the “dangerous” technology feeding into the Frankenstein narrative that often surrounds developments in AI.

Others took issue with OpenAI’s self-promotion, with some even suggesting that OpenAI purposefully exaggerated GPT-2’s power in order to create hype, contravening a norm in the AI research community, where labs routinely share data, code, and pre-trained models. As machine learning researcher Zachary Lipton tweeted, “Perhaps what's *most remarkable* about the @OpenAI controversy is how *unremarkable* the technology is. Despite their outsize attention & budget, the research itself is perfectly ordinary—right in the main branch of deep learning NLP research.”

OpenAI stood by its decision to release only a limited version of GPT-2, but has since released larger models for other researchers and the public to experiment with. As yet, there has been no reported case of a widely distributed fake news article generated by the system. But there have been a number of interesting spin-off projects, including GPT-2 poetry and a webpage where you can prompt the system with questions yourself.

There’s even a Reddit group populated entirely with text produced by GPT-2-powered bots. Mimicking humans on Reddit, the bots have long conversations about a variety of topics, including conspiracy theories and Star Wars movies.

This bot-powered conversation may signify the new condition of life online, where language is increasingly created by a combination of human and non-human agents, and where maintaining the distinction between human and non-human, despite our best efforts, is increasingly difficult.

The idea of using rules, mechanisms, and algorithms to generate language has inspired people in many different cultures throughout history. But it’s in the online world that this powerful form of wordcraft may really find its natural milieu—in an environment where the identity of speakers becomes more ambiguous, and perhaps, less relevant. It remains to be seen what the consequences will be for language, communication, and our sense of human identity, which is so bound up with our ability to speak in natural language.

This is the sixth installment of a six-part series on the history of natural language processing. Last week’s post explained how an innocent Microsoft chatbot turned instantly racist on Twitter.

You can also check out our prior series on the untold history of AI.


#436234 Robot Gift Guide 2019

Welcome to the eighth edition of IEEE Spectrum’s Robot Gift Guide!

This year we’re featuring 15 robotic products that we think will make fantastic holiday gifts. As always, we tried to include a broad range of robot types and prices, focusing mostly on items released this year. (A reminder: While we provide links to places where you can buy these items, we’re not endorsing any in particular, and a little bit of research may result in better deals.)

If you need even more robot gift ideas, take a look at our past guides: 2018, 2017, 2016, 2015, 2014, 2013, and 2012. Some of those robots are still great choices and might be way cheaper now than when we first posted about them. And if you have suggestions that you’d like to share, post a comment below to help the rest of us find the perfect robot gift.

Skydio 2

Image: Skydio

What makes robots so compelling is their autonomy, and the Skydio 2 is one of the most autonomous robots we’ve ever seen. It uses an array of cameras to map its environment and avoid obstacles in real time, making flight safe and effortless and enabling the kinds of shots that would be impossible otherwise. Seriously, this thing is magical, and it’s amazing that you can actually buy one.
$1,000
Skydio
UBTECH Jimu MeeBot 2

Image: UBTECH

The Jimu MeeBot 2.0 from UBTECH is a STEM education robot designed to be easy to build and program. It includes six servo motors, a color sensor, and LED lights. An app for iPhone or iPad provides step-by-step 3D instructions, and helps you code different behaviors for the robot. It’s available exclusively from Apple.
$130
Apple
iRobot Roomba s9+

Image: iRobot

We know that $1,400 is a crazy amount of money to spend on a robot vacuum, but the Roomba s9+ is a crazy robot vacuum. As if all of its sensors and mapping intelligence weren’t enough, it empties itself, which means that you can have your floors vacuumed every single day for a month and you don’t have to even think about it. This is what home robots are supposed to be.
$1,400
iRobot
PFF Gita

Photo: Piaggio Fast Forward

Nobody likes carrying things, which is why Gita is perfect for everyone with an extra $3,000 lying around. Developed by Piaggio Fast Forward, this autonomous robot will follow you around with a cargo hold full of your most important stuff, and do it in a way guaranteed to attract as much attention as possible.
$3,250
Gita
DJI Mavic Mini

Photo: DJI

It’s tiny, it’s cheap, and it takes good pictures—what more could you ask for from a drone? And for $400, this is an excellent drone to get if you’re on a budget and comfortable with manual flight. Keep in mind that while the Mavic Mini is small enough that you don’t need to register it with the FAA, you do still need to follow all the same rules and regulations.
$400
DJI
LEGO Star Wars Droid Commander

Image: LEGO

Designed for kids ages 8+, this LEGO set includes more than 1,000 pieces, enough to build three different droids: R2-D2, Gonk Droid, and Mouse Droid. Using a Bluetooth-controlled robotic brick called Move Hub, which connects to the LEGO BOOST Star Wars app, kids can change how the robots behave and solve challenges, learning basic robotics and coding skills.
$200
LEGO
Sony Aibo

Photo: Sony

Robot pets don’t get much more sophisticated (or expensive) than Sony’s Aibo. Strictly speaking, it’s one of the most complex consumer robots you can buy, and Sony continues to add to Aibo’s software. Recent new features include user programmability, and the ability to “feed” it.
$2,900 (free aibone and paw pads until 12/29/2019)
Sony
Neato Botvac D4 Connected

Photo: Neato

The Neato Botvac D4 may not have all of the features of its fancier and more expensive siblings, but it does have the features that you probably care the most about: The ability to make maps of its environment for intelligent cleaning (using lasers!), along with user-defined no-go lines that keep it where you want it. And it cleans quite well, too.
$350 (on sale; regularly $530)
Neato Robotics
Cubelets Curiosity Set

Photo: Modular Robotics

Cubelets are magnetic blocks that you can snap together to make an endless variety of robots with no programming and no wires. The newest set, called Curiosity, is designed for kids ages 4+ and comes with 10 robotic cubes. These include light and distance sensors, motors, and a Bluetooth module, which connects the robot constructions to the Cubelets app.
$250
Modular Robotics
Tertill

Photo: Franklin Robotics

Tertill does one simple job: It weeds your garden. It’s waterproof, dirt-proof, solar-powered, and fully autonomous, meaning that you can leave it out in your garden all summer and just enjoy eating your plants rather than taking care of them.
$350
Tertill
iRobot Root

Photo: iRobot

Root was originally developed by Harvard University as a tool to help kids progressively learn to code. iRobot has taken over Root and is now supporting the curriculum, which starts for kids before they even know how to read and should keep them busy for years afterwards.
$200
iRobot
LOVOT

Image: Lovot

Let’s be honest: Nobody is really quite sure what LOVOT is. We can all agree that it’s kinda cute, though. And kinda weird. But cute. Created by Japanese robotics startup Groove X, LOVOT does have a whole bunch of tech packed into its bizarre little body and it will do its best to get you to love it.
$2,750 (¥300,000)
LOVOT
Sphero RVR

Photo: Sphero

RVR is a rugged, versatile, easy to program mobile robot. It’s a development platform designed to be a bridge between educational robots like Sphero and more sophisticated and expensive systems like Misty. It’s mostly affordable, very expandable, and comes from a company with a lot of experience making robots.
$250
Sphero
“How to Train Your Robot”

Image: Lawrence Hall of Science

Aimed at 4th and 5th graders, “How to Train Your Robot,” written by Blooma Goldberg, Ken Goldberg, and Ashley Chase, and illustrated by Dave Clegg, is a perfect introduction to robotics for kids who want to get started with designing and building robots. But the book isn’t just for beginners: It’s also a fun, inspiring read for kids who are already into robotics and want to go further—it even introduces concepts like computer simulations and deep learning. You can download a free digital copy or request hardcopies here.
Free
UC Berkeley
MIT Mini Cheetah

Photo: MIT

Yes, Boston Dynamics’ Spot, now available for lease, is probably the world’s most famous quadruped, but MIT is starting to pump out Mini Cheetahs en masse for researchers. We’re not exactly sure how you’d manage to get one of these things short of stealing one directly from MIT, but a Mini Cheetah is our fantasy robotics gift this year. Mini Cheetah looks like a ton of fun—it’s portable, highly dynamic, super rugged, and easy to control. We want one!
Price N/A
MIT Biomimetic Robotics Lab

For more tech gift ideas, see also IEEE Spectrum’s annual Gift Guide.

Posted in Human Robots

#436220 How Boston Dynamics Is Redefining Robot ...

Gif: Bob O’Connor/IEEE Spectrum

With their jaw-dropping agility and animal-like reflexes, Boston Dynamics’ bioinspired robots have always seemed to have no equal. But that preeminence hasn’t stopped the company from pushing its technology to new heights, sometimes literally. Its latest crop of legged machines can trudge up and down hills, clamber over obstacles, and even leap into the air like a gymnast. There’s no denying their appeal: Every time Boston Dynamics uploads a new video to YouTube, it quickly racks up millions of views. These are probably the first robots you could call Internet stars.

Spot

Photo: Bob O’Connor

84 cm HEIGHT

25 kg WEIGHT

5.76 km/h SPEED

SENSING: Stereo cameras, inertial measurement unit, position/force sensors

ACTUATION: 12 DC motors

POWER: Battery (90 minutes per charge)

Boston Dynamics, once owned by Google’s parent company, Alphabet, and now by the Japanese conglomerate SoftBank, has long been secretive about its designs. Few publications have been granted access to its Waltham, Mass., headquarters, near Boston. But one morning this past August, IEEE Spectrum got in. We were given permission to do a unique kind of photo shoot that day. We set out to capture the company’s robots in action—running, climbing, jumping—by using high-speed cameras coupled with powerful strobes. The results you see on this page: freeze-frames of pure robotic agility.

We also used the photos to create interactive views, which you can explore online on our Robots Guide. These interactives let you spin the robots 360 degrees, or make them walk and jump on your screen.

Boston Dynamics has amassed a minizoo of robotic beasts over the years, with names like BigDog, SandFlea, and WildCat. When we visited, we focused on the two most advanced machines the company has ever built: Spot, a nimble quadruped, and Atlas, an adult-size humanoid.

Spot can navigate almost any kind of terrain while sensing its environment. Boston Dynamics recently made it available for lease, with plans to manufacture something like a thousand units per year. It envisions Spot, or even packs of them, inspecting industrial sites, carrying out hazmat missions, and delivering packages. And its YouTube fame has not gone unnoticed: Even entertainment is a possibility, with Cirque du Soleil auditioning Spot as a potential new troupe member.

“It’s really a milestone for us going from robots that work in the lab to these that are hardened for work out in the field,” Boston Dynamics CEO Marc Raibert says in an interview.

Atlas

Photo: Bob O’Connor

150 cm HEIGHT

80 kg WEIGHT

5.4 km/h SPEED

SENSING: Lidar and stereo vision

ACTUATION: 28 hydraulic actuators

POWER: Battery

Our other photographic subject, Atlas, is Boston Dynamics’ biggest celebrity. This 150-centimeter-tall (4-foot-11-inch-tall) humanoid is capable of impressive athletic feats. Its actuators are driven by a compact yet powerful hydraulic system that the company engineered from scratch. The unique system gives the 80-kilogram (176-pound) robot the explosive strength needed to perform acrobatic leaps and flips that don’t seem possible for such a large humanoid to do. Atlas has inspired a string of parody videos on YouTube and more than a few jokes about a robot takeover.

While Boston Dynamics excels at making robots, it has yet to prove that it can sell them. Ever since its founding in 1992 as a spin-off from MIT, the company has been an R&D-centric operation, with most of its early funding coming from U.S. military programs. The emphasis on commercialization seems to have intensified after the acquisition by SoftBank, in 2017. SoftBank’s founder and CEO, Masayoshi Son, is known to love robots—and profits.

The launch of Spot is a significant step for Boston Dynamics as it seeks to “productize” its creations. Still, Raibert says his long-term goals have remained the same: He wants to build machines that interact with the world dynamically, just as animals and humans do. Has anything changed at all? Yes, one thing, he adds with a grin. In his early career as a roboticist, he used to write papers and count his citations. Now he counts YouTube views.

In the Spotlight

Photo: Bob O’Connor

Boston Dynamics designed Spot as a versatile mobile machine suitable for a variety of applications. The company has not announced how much Spot will cost, saying only that it is being made available to select customers, who will be able to lease the robot. A payload bay lets you add up to 14 kilograms of extra hardware to the robot’s back. One of the accessories that Boston Dynamics plans to offer is a 6-degrees-of-freedom arm, which will allow Spot to grasp objects and open doors.

Super Senses

Photo: Bob O’Connor

Spot’s hardware is almost entirely custom-designed. It includes powerful processing boards for control as well as sensor modules for perception. The sensors are located on the front, rear, and sides of the robot’s body. Each module consists of a pair of stereo cameras, a wide-angle camera, and a texture projector, which enhances 3D sensing in low light. The sensors allow the robot to use the navigation method known as SLAM, or simultaneous localization and mapping, to get around autonomously.
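The core idea behind SLAM can be illustrated with a toy example: the robot tracks its own pose from odometry while simultaneously placing observed landmarks into the same world frame. The sketch below is purely illustrative (the class and landmark names are made up, and it is in no way Spot’s actual implementation, which fuses noisy sensor data with far more sophisticated estimators):

```python
import math

class ToySlam:
    """Toy 2D 'SLAM' sketch: dead-reckon a pose estimate from odometry
    and transform landmark sightings into the shared world frame."""

    def __init__(self):
        self.x, self.y, self.theta = 0.0, 0.0, 0.0  # pose estimate
        self.landmarks = {}  # map: landmark id -> (x, y) in world frame

    def move(self, distance, turn):
        """Advance along the current heading, then rotate in place."""
        self.x += distance * math.cos(self.theta)
        self.y += distance * math.sin(self.theta)
        self.theta += turn

    def observe(self, landmark_id, rng, bearing):
        """Record a range/bearing sighting, expressed in the world frame."""
        lx = self.x + rng * math.cos(self.theta + bearing)
        ly = self.y + rng * math.sin(self.theta + bearing)
        self.landmarks[landmark_id] = (lx, ly)

slam = ToySlam()
slam.move(2.0, 0.0)              # drive 2 m straight ahead
slam.observe("door", 1.0, 0.0)   # landmark 1 m dead ahead -> world (3, 0)
```

Real SLAM systems additionally correct the pose estimate using those same landmark observations (via filters or pose-graph optimization); this sketch shows only the coordinate-frame bookkeeping that makes localization and mapping two sides of one problem.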

Stepping Up

Photo: Bob O’Connor

In addition to its autonomous behaviors, Spot can also be steered by a remote operator with a game-style controller. But even when in manual mode, the robot still exhibits a high degree of autonomy. If there’s an obstacle ahead, Spot will go around it. If there are stairs, Spot will climb them. The robot goes into these operating modes and then performs the related actions completely on its own, without any input from the operator. To go down a flight of stairs, Spot walks backward, an approach Boston Dynamics says provides greater stability.

Funky Feet

Gif: Bob O’Connor/IEEE Spectrum

Spot’s legs are powered by 12 custom DC motors, each geared down to provide high torque. The robot can walk forward, sideways, and backward, and trot at a top speed of 1.6 meters per second. It can also turn in place. Other gaits include crawling and pacing. In one wildly popular YouTube video, Spot shows off its fancy footwork by dancing to the pop hit “Uptown Funk.”
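In a trot, diagonal leg pairs move together, alternating between supporting the body (stance) and swinging forward. Spot’s actual gait controller is proprietary; as a rough sketch of the idea, a minimal phase scheduler for a trot might look like this (function and leg names are illustrative):

```python
def trot_stance_legs(t, period=0.5):
    """Return the set of legs in stance for a trot gait at time t.
    Diagonal pairs (front-left + rear-right, front-right + rear-left)
    swap roles every half gait period."""
    phase = (t % period) / period  # normalized phase in [0, 1)
    if phase < 0.5:
        return {"front_left", "rear_right"}
    return {"front_right", "rear_left"}

# During each half period, one diagonal pair supports the body
# while the other pair swings forward.
first_half = trot_stance_legs(0.1)   # -> {'front_left', 'rear_right'}
second_half = trot_stance_legs(0.3)  # -> {'front_right', 'rear_left'}
```

A real controller layers foot trajectory generation, torque control, and balance feedback on top of such a schedule, but the alternating diagonal support pattern is what makes a trot a trot.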

Robot Blood

Photo: Bob O’Connor

Atlas is powered by a hydraulic system consisting of 28 actuators. These actuators are basically cylinders filled with pressurized fluid that can drive a piston with great force. Their high performance is due in part to custom servo valves that are significantly smaller and lighter than the aerospace models that Boston Dynamics had been using in earlier designs. Though not visible from the outside, the innards of an Atlas are filled with these hydraulic actuators as well as the lines of fluid that connect them. When one of those lines ruptures, Atlas bleeds the hydraulic fluid, which happens to be red.

Next Generation

Gif: Bob O’Connor/IEEE Spectrum

The current version of Atlas is a thorough upgrade of the original model, which was built for the DARPA Robotics Challenge in 2015. The newest robot is lighter and more agile. Boston Dynamics used industrial-grade 3D printers to make key structural parts, giving the robot a greater strength-to-weight ratio than earlier designs. The next-gen Atlas can also do something that its predecessor, famously, could not: It can get up after a fall.

Walk This Way

Photo: Bob O’Connor

To control Atlas, an operator provides general steering via a manual controller while the robot uses its stereo cameras and lidar to adjust to changes in the environment. Atlas can also perform certain tasks autonomously. For example, if you add special bar-code-type tags to cardboard boxes, Atlas can pick them up and stack them or place them on shelves.

Biologically Inspired

Photos: Bob O’Connor

Atlas’s control software doesn’t explicitly tell the robot how to move its joints, but rather it employs mathematical models of the underlying physics of the robot’s body and how it interacts with the environment. Atlas relies on its whole body to balance and move. When jumping over an obstacle or doing acrobatic stunts, the robot uses not only its legs but also its upper body, swinging its arms to propel itself just as an athlete would.
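Boston Dynamics hasn’t published Atlas’s controller, but one standard building block in model-based humanoid balance is the linear inverted pendulum, whose “capture point” tells a robot where to place its foot to absorb its momentum and come to rest. As a hedged, textbook-style illustration (not Atlas’s actual algorithm):

```python
import math

def capture_point(com_pos, com_vel, com_height, g=9.81):
    """Capture point of a linear inverted pendulum model:
    x_cp = x + v / omega, where omega = sqrt(g / z) is the
    pendulum's natural frequency for a center of mass at height z."""
    omega = math.sqrt(g / com_height)
    return com_pos + com_vel / omega

# A robot with its center of mass 1.0 m high, directly over its support
# (x = 0) and moving forward at 1.0 m/s, should step about 0.32 m ahead
# to catch itself.
step = capture_point(0.0, 1.0, 1.0)
```

The appeal of this kind of model is exactly what the article describes: rather than scripting joint angles, the controller reasons about the physics of the whole body and lets foot placement and arm motion fall out of the dynamics.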

This article appears in the December 2019 print issue as “By Leaps and Bounds.”
