
#436263 Skydio 2 Review: This Is the Drone You ...

Let me begin this review by saying that the Skydio 2 is one of the most impressive robots that I have ever seen. Over the last decade, I’ve spent enough time around robots to have a very good sense of what kinds of things are particularly challenging for them, and to set my expectations accordingly. Those expectations include things like “unstructured environments are basically impossible” and “full autonomy is impractically expensive” and “robot videos rarely reflect reality.”

Skydio’s newest drone is an exception to all of this. It’s able to fly autonomously at speed through complex environments in challenging real-world conditions in a way that’s completely effortless and stress-free for the end user, allowing you to capture the kind of video that would be otherwise impossible, even (I’m guessing) for professional drone pilots. When you see this technology in action, it’s (almost) indistinguishable from magic.

Skydio 2 Price
To be clear, the Skydio 2 is not without compromises, and the price of $999 (on pre-order with delivery of the next batch expected in spring of 2020) requires some justification. But the week I’ve had with this drone has left me feeling like its fundamental autonomous capability is so far beyond just about anything that I’ve ever experienced that I’m questioning why I would ever fly anything else again.

We’ve written extensively about Skydio, beginning in early 2016 when the company posted a video of a prototype drone dodging trees while following a dude on a bike. Even three years ago, Skydio’s tech was way better than anything we’d seen outside of a research lab, and in early 2018, they introduced their first consumer product, the Skydio R1. A little over a year later, Skydio has introduced the Skydio 2, which is smaller, smarter, and much more affordable. Here’s an overview video just to get you caught up:

Skydio sent me a Skydio 2 review unit last week, and while I’m reasonably experienced with drones in general, this is the first time I’ve tried a Skydio drone in person. I had a pretty good idea what to expect, and I was absolutely blown away. Like, I was giggling to myself while running through the woods as the drone zoomed around, deftly avoiding trees and keeping me in sight. Robots aren’t supposed to be this good.

A week is really not enough time to explore everything that the Skydio can do, especially Thanksgiving week in Washington, D.C. (a no-fly zone) in early winter. But I found a nearby state park in which I could legally and safely fly the drone, and I did my best to put the Skydio 2 through its paces.

Note: Throughout this review, we’ve got a bunch of GIFs to help illustrate different features of the drone. To fit them all in, these GIFs had to be heavily compressed. Underneath each GIF is a timestamped link to this YouTube video (also available at the bottom of the post), which you can click on to see an extended cut of the original 4K 30 fps footage. And there’s a bunch of interesting extra video in there as well.

Skydio 2 Specs

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2 is primarily made out of magnesium, which (while light) is both heavier and more rigid and durable than plastic. The offset props (the back pair are above the body, and the front pair are below) are necessary to maintain the field of view of the navigation cameras.

The Skydio 2 both looks and feels like a well-designed and carefully thought-out drone. It’s solid, and a little on the heavy side as far as drones go—it’s primarily made out of magnesium, which (while light) is both heavier and more rigid and durable than plastic. The blue and black color scheme is far more attractive than you typically see with drones.

Photo: Evan Ackerman/IEEE Spectrum

To detect and avoid obstacles, the Skydio 2 uses an array of six 4K hemispherical cameras that feed data into an NVIDIA Jetson TX2 at 30 fps, with the drone processing a million points in 3D space per second to plan the safest path.

The Skydio 2 is built around an array of six hemispherical obstacle-avoidance cameras and the NVIDIA Jetson TX2 computing module that they’re connected to. This defines the placement of the gimbal, the motors and props, and the battery, since all of this stuff has to stay out of the view of the cameras as much as possible in order for the drone to effectively avoid obstacles in any direction.

Without the bottom-mounted battery attached, the drone is quite flat. The offset props (the back pair are above the body, and the front pair are below) are necessary to maintain the field of view of the obstacle-avoidance cameras. These hemispherical cameras are on the end of each of the prop arms as well as above and below the body of the drone. They look awfully exposed, even though each is protected from ground contact by a little fin. You need to make sure these cameras are clean and smudge-free, and Skydio includes a cleaning cloth for this purpose. Underneath the drone there are slots for microSD cards, one for recording from the camera and a second one that the drone uses to store data. The attention to detail extends to the SD card insertion, which has a sloped channel that guides the card securely into its slot.

Once you snap the battery in, the drone goes from looking streamlined to looking a little chubby. Relative to other drones, the battery almost seems like an afterthought, like Skydio designed the drone and then remembered, “oops we have to add a battery somewhere, let’s just kludge it onto the bottom.” But again, the reason for this is to leave room inside the body for the NVIDIA TX2, while making sure that the battery stays out of view of the obstacle avoidance cameras.

The magnetic latching system for the battery is both solid and satisfying. I’m not sure why it’s necessary, strictly speaking, but I do like it, and it doesn’t seem like the battery will fly off even during the most aggressive maneuvers. Each battery includes an LED array that will display its charge level in 25 percent increments, as well as a button that you push to turn the drone on and off. Charging takes place via a USB-C port in the top of the drone, which I don’t like, because it means that the batteries can’t be charged on their own (like the Parrot Anafi’s battery), and that you can’t charge one battery while flying with another, like basically every other drone ever. A separate battery charger that will charge two at once is available from Skydio for an eyebrow-raising $129.

I appreciate that all of Skydio’s stuff (batteries, controller, and beacon) charges via USB-C, though. The included USB-C adapter with its beefy cable will output at up to 65 watts, which’ll charge a mostly depleted battery in under an hour. The drone turns itself on while charging, which seems unnecessary.

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2 is not foldable, making it not nearly as easy to transport as some other drones. But it does come with a nice case that mitigates this issue somewhat, and the drone plus two batteries end up as a passably flat package about the size of a laptop case.

The most obvious compromise that Skydio made with the Skydio 2 is that the drone is not foldable. Skydio CEO Adam Bry told us that adding folding joints to the arms of the Skydio 2 would have made calibrating all six cameras a nightmare and significantly impacted performance. This makes complete sense, of course, but it does mean that the Skydio 2 is not nearly as easy to transport as some other drones.

Photo: Evan Ackerman/IEEE Spectrum

Folded and unfolded: The Skydio 2 compared to the Parrot Anafi (upper left) and the DJI Mavic Pro (upper right).

The Skydio 2 does come with a very nice case that mitigates this issue somewhat, and the drone plus two batteries end up as a passably flat package about the size of a laptop case. Still, it’s just not as convenient to toss into a backpack as my Anafi, although the Mavic Mini might be even more portable.

Photo: Evan Ackerman/IEEE Spectrum

While the Skydio 2’s case is relatively compact, the non-foldable drone is overall a significantly larger package than the Parrot Anafi.

The design of the drone leads to some other compromises as well. Since landing gear would, I assume, occlude the camera system, the drone lands directly on the bottom of its battery pack, which has a slightly rubberized pad about the size of a playing card. This doesn’t feel particularly stable unless you end up on a very flat surface, and made me concerned for the exposed cameras underneath the drone as well as the lower set of props. I’d recommend hand takeoffs and landings—more on those later.

Skydio 2 Camera System

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2’s primary camera is a Sony IMX577 1/2.3″ 12.3-megapixel CMOS sensor. It’s mounted to a three-axis gimbal and records 4K video at 60 fps, or 1080p video at 120 fps.

The Skydio 2 comes with a three-axis gimbal supporting a 12-megapixel camera, just enough to record 4K video at 60 fps, or 1080p video at 120 fps. Skydio has provided plenty of evidence that its imaging system is at least as good if not better than other drone cameras. Tested against my Mavic Pro and Parrot Anafi, I found no reason to doubt that. To be clear, I didn’t do exhaustive pixel-peeping comparisons between them; you’re just getting my subjective opinion that the Skydio 2 has a totally decent camera that you won’t be disappointed with. I will say that I found the HDR photo function to be not all that great in the few situations in which I tested it—after looking at a few muddy sunset shots, I turned it off and was much happier.

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2’s 12-megapixel camera is solid, although we weren’t impressed with the HDR option.

The video stabilization is fantastic, to the point where watching the video footage can be underwhelming because it doesn’t reflect the motion of the drone. I almost wish there was a way to change to unstabilized (or less-stabilized) video so that the viewer could get a little more of a wild ride. Or, ideally, there’d be a way for the drone to provide you with a visualization of what it was doing using the data collected by its cameras. That’s probably wishful thinking, though. The drone itself doesn’t record audio because all you’d get would be an annoying buzz, but the app does record audio, so the audio from your phone gets combined with the drone video. Don’t expect great quality, but it’s better than nothing.

Skydio 2 App
The app is very simple compared to every other drone app I’ve tried, and that’s a good thing. Here’s what it looks like:

Image: Skydio

Trackable subjects get a blue “+” sign over them, and if you tap them, the “+” turns into a spinny blue circle. Once you’ve got a subject selected, you can choose from a variety of cinematic skills that the drone will execute while following you.

You get the controls that you need and the information that you need, and nothing else. Manual flight with the on-screen buttons works adequately, and the double-tap to fly function on the phone works surprisingly well, making it easy to direct the drone to a particular spot above the ground.

The settings menus are limited but functional, allowing you to change settings for the camera and a few basic tweaks for controlling the drone. One setting unique to the Skydio 2 is the height floor—since the drone only avoids static obstacles, you can set it to maintain a height of at least 8 feet above the ground while flying autonomously to make sure that if you’re flying around other people, it won’t run into anyone who isn’t absurdly tall and therefore asking for it.

Trackable subjects get a blue “+” sign over them in the app, and if you tap them, the “+” turns into a spinny blue circle. Once you’ve got a subject selected, you can choose from a variety of cinematic skills that the drone will execute while following you, and in addition, you can select “one-shot” skills that involve the drone performing a specific maneuver before returning to the previously selected cinematic skill. For example, you can tell the drone to orbit around you, and then do a “rocket” one-shot where it’ll fly straight up above you (recording the whole time, of course), before returning to its orbiting.

After you’re done flying, you can scroll through your videos and easily clip out excerpts from them and save them to your phone for sharing. Again, it’s a fairly simple interface without a lot of options. You could call it limited, I guess, but I appreciate that it just does a few things that you care about and otherwise doesn’t clutter itself up.

The real limitation of the app is that it uses Wi-Fi to connect to the Skydio 2, which restricts the range. To fly much beyond a hundred meters or so, you’ll need to use the controller or beacon instead.

Skydio 2 Controller and Beacon

Photo: Evan Ackerman/IEEE Spectrum

While the Skydio 2 controller provides a better hands-on flight experience than the phone, plus an extended range of up to 3.5 km, more experienced pilots may find manual control a bit frustrating, because the underlying autonomy will supersede their maneuvers when they start getting close to objects.

I was looking forward to using the controller, because with every other drone I’ve had, the precision that a physical controller provides is, I find, mandatory for a good flying experience and for getting the photos and videos that you want. With the Skydio 2, that’s all out the window. It’s not that the controller is useless or anything, it’s just that because the drone tracks you and avoids obstacles on its own, that level of control precision becomes largely unnecessary.

The controller itself is perfectly fine. It’s a rebranded Parrot Skycontroller3, which is the same as the one that you get with a Parrot Anafi. It’s too bad that the sticks don’t unscrew to make it a little more portable, and overall it’s functional rather than fancy, but it feels good to use and includes a sizeable antenna that makes a significant difference to the range that you get (up to 3.5 kilometers).

You definitely get a better hands-on flight experience with the controller than with the phone, so if you want to (say) zip the drone around some big open space for fun, it’s good for that. And it’s nice to be able to hand the controller to someone who’s never flown a drone before and let them take it for a spin without freaking out about them crashing it the whole time. For more experienced pilots, though, the controller is ultimately just a bit frustrating, because the underlying autonomy will supersede your control when you start getting close to objects, which (again) limits how useful the controller is relative to your phone.

I do still prefer the controller over the phone, but I’m not sure that it’s worth the extra $150, unless you plan to fly the Skydio 2 at very long distances or primarily in manual mode. And honestly, if either of those two things are your top priority, the Skydio 2 is probably not the drone for you.

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2 beacon uses GPS tracking to help the drone follow you, extending range up to 1.5 km. You can also fly with the beacon alone, no phone necessary.

The purpose of the beacon, according to Skydio, is to give the drone a way of tracking you if it can’t see you, which can happen, albeit infrequently. My initial impression of the beacon was that it was primarily useful as a range-extending bridge between my phone and the drone. But I accidentally left my phone at home one day (oops) and had to fly the drone with only the beacon, and it was a surprisingly decent experience. The beacon allows for full manual control of a sort—you can tap different buttons to rotate, fly forward, and ascend or descend. This is sufficient for takeoff and landing, for making sure that the drone is looking at you when you engage visual tracking, and for rescuing it if it gets trapped somewhere.

The rest of the beacon’s control functions are centered around a few different tracking modes, and with these, it works just about as well as your phone. You have fewer options overall, but all the basic stuff is there with just a few intuitive button clicks, including tracking range and angle. If you’re willing to deal with this relatively minor compromise, it’s nice to have your phone free for other things rather than having it monopolized by the drone.

Skydio 2 In Flight

GIF: Evan Ackerman/IEEE Spectrum

Hand takeoffs are simple and reliable.
Click here for a full resolution clip.

Starting up the Skydio 2 doesn’t require any kind of unusual calibration steps or anything like that. It prefers to be kept still, but you can start it up while holding it; it’ll just take a few seconds longer to tell you that it’s ready to go. While the drone will launch from any flat surface with significant clearance around it (it’ll tell you if it needs more room), the small footprint of the battery means that I was more comfortable hand launching it. This is not a “throw” launch; you just let the drone rest on your palm, tell it to take off, and then stay still while it gets its motors going and then gently lifts off. The liftoff is so gentle that you have to be careful not to pull your hand away too soon—I did that once and the drone, being not quite ready, dropped towards the ground, but managed to recover without much drama.

GIF: Evan Ackerman/IEEE Spectrum

Hand landings always look scary, but the Skydio 2 is incredibly gentle. After trying this once, it became the only way I ever landed the drone.
Click here for a full resolution clip.

Catching the drone for landing is perhaps very slightly more dangerous, but not any more difficult. You put the drone above and in front of you facing away, tell it to land in the app or with the beacon, and then put your hand underneath it to grasp it as it slowly descends. It settles delicately and promptly turns itself off. Every drone should land this way. The battery pack provides a good place to grip, although you do have to be mindful of the forward set of props, which (since they’re the pair that are beneath the body of the drone) are quite close to your fingers. You’ll certainly be mindful after you catch a blade with your fingers once. Which I did. For the purposes of this review and totally not by accident. No damage, for the record.

Photo: Evan Ackerman/IEEE Spectrum

You won’t be disappointed with the Skydio 2’s in-flight performance, unless you’re looking for a dedicated racing drone.

In normal flight, the Skydio 2 performs as well as you’d expect. It’s stable and manages light to moderate wind without any problems, although I did notice some occasional lateral drifting when the drone should have been in a stationary hover. While the controller gains are adjustable, the Skydio 2 isn’t quite as aggressive in flight as my Mavic Pro on Sport Mode, but again, if you’re looking for a high-speed drone, that’s really not what the Skydio is all about.

The Skydio 2 is substantially louder than my Anafi, although the Anafi is notably quiet for a drone. It’s not annoying to hear (not a high-pitched whine), but you can hear it from a ways away, and farther away than my Mavic Pro. I’m not sure whether that’s because of the absolute volume or the volume plus the pitch. In some ways, this is a feature, since you can hear the drone following you even if you’re not looking at it; you just need to be aware of the noise it makes when you’re flying it around people.

Obstacle Avoidance
The primary reason the Skydio 2 is the drone that you want to fly is its autonomous subject tracking and obstacle avoidance. Skydio’s PR videos make this capability look almost too good, and since I hadn’t tried out one of their drones before, the first thing I did with it was exactly what you’d expect: attempt to fly it directly into the nearest tree.

GIF: Evan Ackerman/IEEE Spectrum

The Skydio 2 deftly slides around trees and branches. The control inputs here were simple “forward” or “turn”; all obstacle avoidance is autonomous.
Click here for a full resolution clip.

And it just won’t do it. It slows down a bit, and then slides right around one tree after another, going over and under and around branches. I pointed the drone into a forest and just held down “forward” and away it went, without any fuss, effortlessly ducking and weaving its way around. Of course, it wasn’t effortless at all—six 4K cameras were feeding data into the NVIDIA TX2 at 30 fps, and the drone was processing a million points in 3D space per second to plan the safest path while simultaneously taking into account where I wanted it to go. I spent about 10 more minutes doing my level best to crash the drone into anything at all using a flying technique probably best described as “reckless,” but the drone was utterly unfazed. It’s incredible.
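
To make that concrete, here’s a rough sketch, in Python, of the general idea behind this kind of vision-based avoidance: take a cloud of 3D obstacle points, sample candidate flight directions, reject any that would pass too close to an obstacle, and fly along the one that best balances clearance against where the pilot asked to go. This is emphatically not Skydio’s actual planner; every function name, threshold, and number below is made up for illustration.

```python
# Toy illustration of obstacle-aware direction selection (not Skydio's planner).
# Obstacle points are assumed to be in the drone's own frame (drone at the origin).
import numpy as np

def pick_direction(obstacle_points, goal_dir, num_candidates=64,
                   safety_radius=1.0, lookahead=3.0):
    """Return a unit vector that best trades goal progress for clearance."""
    goal_dir = goal_dir / np.linalg.norm(goal_dir)

    # Sample candidate flight directions around the drone.
    rng = np.random.default_rng(0)
    candidates = rng.normal(size=(num_candidates, 3))
    candidates /= np.linalg.norm(candidates, axis=1, keepdims=True)

    best_dir, best_score = None, -np.inf
    for d in candidates:
        if len(obstacle_points):
            # Perpendicular distance from each obstacle point to the short
            # segment we'd fly along (length `lookahead`, starting at the drone).
            t = np.clip(obstacle_points @ d, 0.0, lookahead)
            offsets = obstacle_points - np.outer(t, d)
            clearance = float(np.min(np.linalg.norm(offsets, axis=1)))
        else:
            clearance = np.inf

        if clearance < safety_radius:      # would pass too close: reject outright
            continue
        progress = float(d @ goal_dir)     # alignment with the pilot's stick input
        score = progress + 0.1 * min(clearance, 5.0)
        if score > best_score:
            best_dir, best_score = d, score

    # If every candidate is blocked, command a hover rather than risk a hit.
    return best_dir if best_dir is not None else np.zeros(3)

# Example: a wall of obstacle points 2 m ahead while the pilot holds "forward" (+x).
wall = np.array([[2.0, y, z] for y in np.linspace(-1, 1, 11)
                              for z in np.linspace(-1, 1, 11)])
print(pick_direction(wall, goal_dir=np.array([1.0, 0.0, 0.0])))
```

The real system is doing vastly more than this (dense depth from six cameras, planning over time, subject tracking), but the sample, score, and pick structure is a reasonable mental model for why holding “forward” into a forest produces weaving rather than a crash.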

What knocked my socks off was telling the drone to pass through treetops—in the clip below, I’m just telling the drone to fly straight down. Watch as it weaves its way through gaps between the branches:

GIF: Evan Ackerman/IEEE Spectrum

The result of parking the Skydio 2 above some trees and holding “down” on the controller is this impressive fully autonomous descent through the branches.
Click here for a full resolution clip.

Here’s one more example, where I sent the drone across a lake and started poking around in a tree. Sometimes the Skydio 2 isn’t sure where you want it to go, and you have to give it a little bit of a nudge in a clear direction, but that’s it.

GIF: Evan Ackerman/IEEE Spectrum

In obstacle-heavy environments, the Skydio 2 prudently slows down, but it can pick its way through almost anything that it can see.
Click here for a full resolution clip.

It’s important to keep in mind that all of the Skydio 2’s intelligence is based on vision. It uses cameras to see the world, which means that it faces many of the same challenges that your eyes do. Specifically, Skydio warns against flying in the following conditions:

Skydio 2 can’t see certain visually challenging obstacles. Do not fly around thin branches, telephone or power lines, ropes, netting, wires, chain link fencing or other objects less than ½ inch in diameter.
Do not fly around transparent surfaces like windows or reflective surfaces like mirrors greater than 60 cm wide.
When the sun is low on the horizon, it can temporarily blind Skydio 2’s cameras depending on the angle of flight. Your drone may be cautious or jerky when flying directly toward the sun.

Basically, if you would have trouble seeing an obstacle, or seeing anything at all under certain flight conditions, then the Skydio 2 almost certainly will too. It gets even more problematic when challenging obstacles are combined with challenging flight conditions, which is what I’m pretty sure led to the only near-crash I had with the drone. Here’s a video:

GIF: Evan Ackerman/IEEE Spectrum

Flying around very thin branches and into the sun can cause problems for the Skydio 2’s obstacle avoidance.
Click here for a full resolution clip.

I had the Skydio 2 set to follow me on my bike (more about following and tracking in a bit). It was mid-afternoon, but since it’s late fall here in Washington, D.C., the sun doesn’t get much higher than 30 degrees above the horizon. Late fall also means that most of the deciduous trees have lost their leaves, and so there are a bunch of skinny branches all over the place. The drone was doing a pretty good job of following me along the road at a relatively slow speed, and then it clipped the branch that you can just barely see in the video above. It recovered in an acrobatic maneuver that has been mostly video-stabilized out, and resumed tracking me before I freaked out and told it to land. You can see another example here, where the drone (again) clips a branch that has the sun behind it, and this clip shows me stopping my bike before the drone runs into another branch in a similar orientation. As the video shows, it’s very hard to see the branches until it’s too late.

As far as I can tell, the drone is no worse for wear from any of this, apart from a small nick in one of the props. But this is a good illustration of a problematic situation for the Skydio 2: flying into a low sun angle around small bare branches. Should I not have been flying the drone in this situation? It’s hard to say. These probably qualify as “thin branches,” although there was plenty of room along the middle of the road. There is an open question with the Skydio 2 as to exactly how much responsibility the user should have about when and where it’s safe to fly—for branches, how thin is too thin? How low can the sun be? What if the branches are only kinda thin and the sun is only kinda low, but it’s also a little windy? Better to be safe than sorry, of course, but there’s really no way for the user (or the drone) to know what it can’t handle until it can’t handle it.

Edge cases like these aside, the obstacle avoidance just works. Even if you’re not deliberately trying to fly into branches, it’s keeping a lookout for you all the time, which means that flying the drone goes from somewhat stressful to just pure fun. I can’t emphasize enough how amazing it is to be able to fly without worrying about running into things, and how great it feels to be able to hand the controller to someone who’s never flown a drone before and say, with complete confidence, “go ahead, fly it around!”

Skydio 2 vs. DJI Mavic

Photo: Evan Ackerman/IEEE Spectrum

Both the Skydio 2 and many models of DJI’s Mavic use visual obstacle avoidance, but the Skydio 2 is so much more advanced that you can’t really compare the two systems.

It’s important to note that there’s a huge difference between the sort of obstacle avoidance that you get with a DJI Mavic, and the sort of obstacle avoidance that you get with the Skydio 2. The Mavic’s obstacle avoidance is really there to prevent you from accidentally running into things, and in that capacity, it usually works. But there are two things to keep in mind here—first, not running into things is not the same as avoiding things, because avoiding things means planning several steps ahead, not just one step.

Second, there’s the fact that the Mavic’s obstacle detection only works most of the time. Fundamentally, I don’t trust my Mavic Pro, because sometimes the safety system doesn’t kick in for whatever reason and the drone ends up alarmingly close to something. And that’s actually fine, because with the Mavic, I expect to be piloting it. It’s for this same reason that I don’t care that my Parrot Anafi doesn’t have obstacle avoidance at all: I’m piloting it anyway, and I’m a careful pilot, so it just doesn’t matter. The Skydio 2 is totally and completely different. It’s in a class by itself, and you can’t compare what it can do to anything else out there right now. Period.

Skydio 2 Tracking
Skydio’s big selling point on the Skydio 2 is that it’ll autonomously track you while avoiding obstacles. It does this visually, by watching where you go, predicting your future motion, and then planning its own motion to keep you in frame. This works better than you might expect, in that it’s really very good at not losing you. Obviously, the drone prioritizes not running into stuff over tracking you, which means that it may not always be where you feel like it should be. It’s probably trying to get there, but in obstacle-dense environments, it can take some creative paths.
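
As a way to picture that “predict, then reposition” loop, here’s a minimal sketch: extrapolate where the subject will be a moment from now, then pick a camera position that keeps them framed at a chosen range and angle. This is not Skydio’s tracking algorithm, and the function names, ranges, and angles are all hypothetical.

```python
# Toy predict-then-reposition loop for subject tracking (illustrative only).
import numpy as np

def predict_subject(position_history, dt=0.5):
    """Constant-velocity guess at where the subject will be `dt` seconds from now."""
    p_now, p_prev = position_history[-1], position_history[-2]
    velocity = (p_now - p_prev) / dt
    return p_now + velocity * dt

def desired_drone_pose(subject_future, range_m=5.0, azimuth_deg=180.0, height_m=2.5):
    """Camera position at a fixed range/angle from the predicted subject,
    plus a gaze direction that keeps the subject centered in frame."""
    az = np.radians(azimuth_deg)
    offset = np.array([range_m * np.cos(az), range_m * np.sin(az), height_m])
    drone_pos = subject_future + offset
    gaze = subject_future - drone_pos
    return drone_pos, gaze / np.linalg.norm(gaze)

# Subject jogging along +x at about 2 m/s, positions sampled every 0.5 s.
history = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]
future = predict_subject(history)
position, gaze = desired_drone_pose(future)
print("fly to", position, "looking along", gaze)
```

In the real drone, that desired pose would then be handed to the obstacle-avoidance planner, which is why the Skydio 2 sometimes takes those creative paths instead of flying straight to its ideal framing spot.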

Having said that, I found it to be very consistent with keeping me in the frame, and I only managed to lose it when changing direction while fully occluded by an obstacle, or while it was executing an avoidance maneuver that was more dynamic than normal. If you deliberately try to hide from the drone, it’s not that hard to do so if there are enough obstacles around, but I didn’t find the tracking to be something that I had to worry about in most cases. When tracking does fail and you’re not using the beacon, the drone will come to a hover. It won’t try and find you, but it will reacquire you if you get back into its field of view.

The Skydio 2 had no problem tracking me running through fairly dense trees:

GIF: Evan Ackerman/IEEE Spectrum

The Skydio 2 had no problem chasing me around through these trees, even while I was asking it to continually change its tracking angle.
Click here for a full resolution clip.

It also managed to keep up with me as I rode my bike along a tree-lined road:

GIF: Evan Ackerman/IEEE Spectrum

The Skydio 2 is easily fast enough to keep up with me on a bike, even while avoiding tree branches.
Click here for a full resolution clip.

It lost me when I asked it to follow very close behind me as I wove through some particularly branch-y trees, but it fails more or less gracefully by just sort of nope-ing out of situations when they start to get bad and coming to a hover somewhere safe.

GIF: Evan Ackerman/IEEE Spectrum

The Skydio 2 knows better than to put itself into situations that it can’t handle, and will bail to a safe spot if things get too complicated.
Click here for a full resolution clip.

After a few days of playing with the drone, I started to get to the point where I could set it to track me and then just forget about it while I rode my bike or whatever, as opposed to constantly turning around to make sure it was still behind me, which is what I was doing initially. It’s a level of trust that I don’t think would be possible with any other drone.

Should You Buy a Skydio 2?

Photo: Evan Ackerman/IEEE Spectrum

We think the Skydio 2 is fun and relaxing to fly, with unique autonomous intelligence that makes it worth the cost.

In case I haven’t said it often enough in this review, the Skydio 2 is an incredible piece of technology. As far as I know (as a robotics journalist, mind you), this represents the state of the art in commercial drone autonomy, and quite possibly the state of the art in drone autonomy, period. And it’s available for $999, which is expensive, but less money than a DJI Mavic 2 Pro. If you’re interested in a new drone, you should absolutely consider the Skydio 2.

There are some things to keep in mind—battery life is a solid but not stellar 20 minutes. Extra batteries are expensive at $99 each (the base kit includes just one). The controller and the beacon are also expensive, at $150 each. And while I think the Skydio 2 is definitely the drone you want to fly, it may not be the drone you want to travel with, since it’s bulky compared to other options.

But there’s no denying the fact that the experience is uniquely magical. Once you’ve flown the Skydio 2, you won’t want to fly anything else. This drone makes it possible to get pictures and videos that would be otherwise impossible, and you can do it completely on your own. You can trust the drone to do what it promises, as long as you’re mindful of some basic and common sense safety guidelines. And we’ve been told that the drone is only going to get smarter and more capable over time.

If you buy a Skydio 2, it comes with the following warranty from Skydio:

“If you’re operating your Skydio 2 within our Safe Flight guidelines, and it crashes, we’ll repair or replace it for free.”

Skydio trusts their drone to go out into a chaotic and unstructured world and dodge just about anything that comes its way. And after a week with this drone, I can see how they’re able to offer this kind of guarantee. This is the kind of autonomy that robots have been promising for years, and the Skydio 2 makes it real.

Detailed technical specifications are available on Skydio’s website, and if you have any questions, post a comment—we’ve got this drone for a little while longer, and I’d be happy to try out (nearly) anything with it.

Skydio 2 Review Video Highlights
This video is about 7 minutes of 4K, 30 fps footage directly from the Skydio 2. The only editing I did was cutting clips together; there’s no stabilization or color correction or anything like that. The drone will record in 4K 60 fps, so it gets smoother than this, but I, er, forgot to change the setting.

[ Skydio ]


#436202 Trump CTO Addresses AI, Facial ...

Michael Kratsios, the Chief Technology Officer of the United States, took the stage at Stanford University last week to field questions from Stanford’s Eileen Donahoe and attendees at the 2019 Fall Conference of the Institute for Human-Centered Artificial Intelligence (HAI).

Kratsios, the fourth to hold the U.S. CTO position since its creation by President Barack Obama in 2009, was confirmed in August as President Donald Trump’s first CTO. Before joining the Trump administration, he was chief of staff at investment firm Thiel Capital and chief financial officer of hedge fund Clarium Capital. Donahoe is Executive Director of Stanford’s Global Digital Policy Incubator and served as the first U.S. Ambassador to the United Nations Human Rights Council during the Obama Administration.

The conversation jumped around, hitting on both accomplishments and controversies. Kratsios touted the administration’s success in fixing policy around the use of drones, its memorandum on STEM education, and an increase in funding for basic research in AI—though the magnitude of that increase wasn’t specified. He pointed out that the Trump administration’s AI policy has been a continuation of the policies of the Obama administration, and will continue to build on that foundation. As proof of this, he pointed to Trump’s signing of the American AI Initiative earlier this year. That executive order, Kratsios said, was intended to bring various government agencies together to coordinate their AI efforts and to push the idea that AI is a tool for the American worker. The AI Initiative, he noted, also took into consideration that AI will cause job displacement, and asked private companies to pledge to retrain workers.

The administration, he said, is also looking to remove barriers to AI innovation. In service of that goal, the government will, in the next month or so, release a regulatory guidance memo instructing government agencies about “how they should think about AI technologies,” said Kratsios.

U.S. vs China in AI

A few of the exchanges between Kratsios and Donahoe hit on current hot topics, starting with the tension between the U.S. and China.

Donahoe:

“You talk a lot about unique U.S. ecosystem. In which aspect of AI is the U.S. dominant, and where is China challenging us in dominance?”

Kratsios:

“They are challenging us on machine vision. They have more data to work with, given that they have surveillance data.”

Donahoe:

“To what extent would you say the quantity of data collected and available will be a determining factor in AI dominance?”

Kratsios:

“It makes a big difference in the short term. But we do research on how we get over these data humps. There is a future where you don’t need as much data, a lot of federal grants are going to [research in] how you can train models using less data.”

Donahoe turned the conversation to a different tension—that between innovation and values.

Donahoe:

“A lot of conversation yesterday was about the tension between innovation and values, and how do you hold those things together and lead in both realms.”

Kratsios:

“We recognized that the U.S. hadn’t signed on to principles around developing AI. In May, we signed [the Organization for Economic Cooperation and Development Principles on Artificial Intelligence], coming together with other Western democracies to say that these are values that we hold dear.

[Meanwhile,] we have adversaries around the world using AI to surveil people, to suppress human rights. That is why American leadership is so critical: We want to come out with the next great product. And we want our values to underpin the use cases.”

A member of the audience pushed further:

“Maintaining U.S. leadership in AI might have costs in terms of individuals and society. What costs should individuals and society bear to maintain leadership?”

Kratsios:

“I don’t view the world that way. Our companies big and small do not hesitate to talk about the values that underpin their technology. [That is] markedly different from the way our adversaries think. The alternatives are so dire [that we] need to push efforts to bake the values that we hold dear into this technology.”

Facial recognition

And then the conversation turned to the use of AI for facial recognition, an application which (at least for police and other government agencies) was recently banned in San Francisco.

Donahoe:

“Some private sector companies have called for government regulation of facial recognition, and there already are some instances of local governments regulating it. Do you expect federal regulation of facial recognition anytime soon? If not, what ought the parameters be?”

Kratsios:

“A patchwork of regulation of technology is not beneficial for the country. We want to avoid that. Facial recognition has important roles—for example, finding lost or displaced children. There are use cases, but they need to be underpinned by values.”

A member of the audience followed up on that topic, referring to some data presented earlier at the HAI conference on bias in AI:

“Frequently the example of finding missing children is given as the example of why we should not restrict use of facial recognition. But we saw Joy Buolamwini’s presentation on bias in data. I would like to hear your thoughts about how government thinks we should use facial recognition, knowing about this bias.”

Kratsios:

“Fairness, accountability, and robustness are things we want to bake into any technology—not just facial recognition—as we build rules governing use cases.”

Immigration and innovation

A member of the audience brought up the issue of immigration:

“One major pillar of innovation is immigration, does your office advocate for it?”

Kratsios:

“Our office pushes for best and brightest people from around the world to come to work here and study here. There are a few efforts we have made to move towards a more merit-based immigration system, without congressional action. [For example, in] the H-1B visa system, you go through two lotteries. We switched the order of them in order to get more people with advanced degrees through.”

The government’s tech infrastructure

Donahoe brought the conversation around to the tech infrastructure of the government itself:

“We talk about the shiny object, AI, but the 80 percent is the unsexy stuff, at federal and state levels. We don’t have a modern digital infrastructure to enable all the services—like a research cloud. How do we create this digital infrastructure?”

Kratsios:

“I couldn’t agree more; the least partisan issue in Washington is about modernizing IT infrastructure. We spend like $85 billion a year on IT at the federal level, we can certainly do a better job of using those dollars.”


#436184 Why People Demanded Privacy to Confide ...

This is part four of a six-part series on the history of natural language processing.

Between 1964 and 1966, Joseph Weizenbaum, a German American computer scientist at MIT’s artificial intelligence lab, developed the first-ever chatbot [PDF].

While there were already some rudimentary digital language generators in existence—programs that could spit out somewhat coherent lines of text—Weizenbaum’s program was the first designed explicitly for interactions with humans. The user could type in some statement or set of statements in their normal language, press enter, and receive a response from the machine. As Weizenbaum explained, his program made “certain kinds of natural-language conversation between man and computer possible.”

He named the program Eliza after Eliza Doolittle, the working-class hero of George Bernard Shaw’s Pygmalion who learns how to talk with an upper-class accent. The new Eliza was written for the 36-bit IBM 7094, an early transistorized mainframe computer, in a programming language that Weizenbaum developed called MAD-SLIP.

Because computer time was a valuable resource, Eliza could only be run via a time-sharing system; the user interacted with the program remotely via an electric typewriter and printer. When the user typed in a sentence and pressed enter, a message was sent to the mainframe computer. Eliza scanned the message for the presence of a keyword and used it in a new sentence to form a response that was sent back, printed out, and read by the user.

To encourage ongoing dialogue, Weizenbaum designed Eliza to simulate the type of conversational style used by a Rogerian psychoanalyst. The program would take something the user said and reflect it back with a question, as can be seen in the example below. (Notice how the program takes keywords like “boyfriend” and “depressed” and re-uses them.)

Human: Men are all alike.

Computer: IN WHAT WAY?

Human: They're always bugging us about something or other.

Computer: CAN YOU THINK OF A SPECIFIC EXAMPLE?

Human: Well, my boyfriend made me come here.

Computer: YOUR BOYFRIEND MADE YOU COME HERE

Human: He says I'm depressed much of the time.

Computer: I AM SORRY TO HEAR YOU ARE DEPRESSED

Human: It's true. I am unhappy.

Weizenbaum chose this mode of dialogue for Eliza because it gave the impression that the computer understood what was being said without having to offer anything new to the conversation. It created the illusion of comprehension and engagement in a mere 200 lines of code.
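
Weizenbaum’s original program was written in MAD-SLIP for the IBM 7094, so the sketch below is not Eliza itself; it’s a tiny modern Python illustration of the same keyword-and-reflection trick, with rules and phrasing invented for the example.

```python
# Toy, modern illustration of Eliza-style keyword spotting and reflection.
# This is NOT Weizenbaum's program; the rules below are invented for the example.
import re

# Swap first- and second-person words so a fragment can be echoed back.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "i", "your": "my"}

# (keyword pattern, response template) pairs, tried in order.
RULES = [
    (r"\bi(?: am|'m) (.+)", "I AM SORRY TO HEAR YOU ARE {0}"),
    (r"\bmy (.+)",          "YOUR {0}"),
    (r"\bbecause (.+)",     "IS THAT THE REAL REASON?"),
    (r"\balways\b",         "CAN YOU THINK OF A SPECIFIC EXAMPLE?"),
    (r".*",                 "IN WHAT WAY?"),               # catch-all
]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(sentence):
    text = sentence.lower().strip()
    for pattern, template in RULES:
        match = re.search(pattern, text)
        if match:
            fragment = reflect(match.group(1)) if match.groups() else ""
            return template.format(fragment.upper().rstrip(".!?"))
    return "PLEASE GO ON"

# Fed the transcript's opening lines, even this tiny rule set mimics the style.
print(respond("Men are all alike."))
print(respond("They're always bugging us about something or other."))
print(respond("Well, my boyfriend made me come here."))
print(respond("He says I'm depressed much of the time."))
```

Fed the opening lines of the transcript above, even this toy rule set reproduces responses like “IN WHAT WAY?” and “YOUR BOYFRIEND MADE YOU COME HERE,” which is essentially the entire mechanism Weizenbaum was describing.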

To test Eliza’s capacity to engage an interlocutor, Weizenbaum invited students and colleagues into his office and let them chat with the machine while he looked on. He noticed, with some concern, that during their brief interactions with Eliza, many users began forming emotional attachments to the algorithm. They would open up to the machine and confess problems they were facing in their lives and relationships.

During their brief interactions with Eliza, many users began forming emotional attachments to the algorithm.

Even more surprising was that this sense of intimacy persisted even after Weizenbaum described how the machine worked and explained that it didn’t really understand anything that was being said. Weizenbaum was most troubled when his secretary, who had watched him build the program from scratch over many months, insisted that he leave the room so she could talk to Eliza in private.

For Weizenbaum, this experiment with Eliza made him question an idea that Alan Turing had proposed in 1950 about machine intelligence. In his paper, entitled “Computing Machinery and Intelligence,” Turing suggested that if a computer could conduct a convincingly human conversation in text, one could assume it was intelligent—an idea that became the basis of the famous Turing Test.

But Eliza demonstrated that convincing communication between a human and a machine could take place even if comprehension only flowed from one side: The simulation of intelligence, rather than intelligence itself, was enough to fool people. Weizenbaum called this the Eliza effect, and believed it was a type of “delusional thinking” that humanity would collectively suffer from in the digital age. This insight was a profound shock for Weizenbaum, and one that came to define his intellectual trajectory over the next decade.

The simulation of intelligence, rather than intelligence itself, was enough to fool people.

In 1976, he published Computer Power and Human Reason: From Judgment to Calculation [PDF], which offered a long meditation on why people are willing to believe that a simple machine might be able to understand their complex human emotions.

In this book, he argues that the Eliza effect signifies a broader pathology afflicting “modern man.” In a world conquered by science, technology, and capitalism, people had grown accustomed to viewing themselves as isolated cogs in a large and uncaring machine. In such a diminished social world, Weizenbaum reasoned, people had grown so desperate for connection that they put aside their reason and judgment in order to believe that a program could care about their problems.

Weizenbaum spent the rest of his life developing this humanistic critique of artificial intelligence and digital technology. His mission was to remind people that their machines were not as smart as they were often said to be. And that even though it sometimes appeared as though they could talk, they were never really listening.

This is the fourth installment of a six-part series on the history of natural language processing. Last week’s post described Andrey Markov and Claude Shannon’s painstaking efforts to create statistical models of language for text generation. Come back next Monday for part five, “In 2016, Microsoft’s Racist Chatbot Revealed the Dangers of Conversation.”

You can also check out our prior series on the untold history of AI.


#436114 Video Friday: Transferring Human Motion ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

We are very sad to say that MIT professor emeritus Woodie Flowers has passed away. Flowers will be remembered for (among many other things, like co-founding FIRST) the MIT 2.007 course that he began teaching in the mid-1970s, famous for its student competitions.

These competitions got a bunch of well-deserved publicity over the years; here’s one from 1985:

And the 2.007 competitions are still going strong—this year’s theme was Moonshot, and you can watch a replay of the event here.

[ MIT ]

Looks like Aibo is getting wireless integration with Hitachi appliances, which turns out to be pretty cute:

What is this magical box where you push a button and 60 seconds later fluffy pancakes come out?!

[ Aibo ]

LiftTiles are a “modular and reconfigurable room-scale shape display” that can turn your floor and walls into on-demand structures.

[ LiftTiles ]

Ben Katz, a grad student in MIT’s Biomimetics Robotics Lab, has been working on these beautiful desktop-sized Furuta pendulums:

That’s a crowdfunding project I’d pay way too much for.

[ Ben Katz ]

A clever bit of cable manipulation from MIT, using GelSight tactile sensors.

[ Paper ]

A useful display of industrial autonomy on ANYmal from the Oxford Robotics Institute.

This video is of a demonstration for the ORCA Robotics Hub showing the ANYbotics ANYmal robot carrying out industrial inspection using autonomy software from Oxford Robotics Institute.

[ ORCA Hub ] via [ DRS ]

Thanks Maurice!

Meet Katie Hamilton, a software engineer at NASA’s Ames Research Center, who got into robotics because she wanted to help people with daily life. Katie writes code for robots, like Astrobee, who are assisting astronauts with routine tasks on the International Space Station.

[ NASA Astrobee ]

Transferring human motion to a mobile robotic manipulator and ensuring safe physical human-robot interaction are crucial steps towards automating complex manipulation tasks in human-shared environments. In this work we present a robot whole-body teleoperation framework for human motion transfer. We validate our approach through several experiments using the TIAGo robot, showing this could be an easy way for a non-expert to teach a rough manipulation skill to an assistive robot.

[ Paper ]

This is pretty cool looking for an autonomous boat, but we’ll see if they can build a real one by 2020 since at the moment it’s just an average rendering.

[ ProMare ]

I had no idea that asparagus grows like this. But, sure does make it easy for a robot to harvest.

[ Inaho ]

Skip to 2:30 in this Pepper unboxing video to hear the noise it makes when tickled.

[ HIT Lab NZ ]

In this interview, Jean Paul Laumond discusses his movement from mathematics to robotics and his career contributions to the field, especially in regards to motion planning and anthropomorphic motion. Describing his involvement at CNRS and in other robotics projects, such as HILARE, he comments on the distinction in perception between the robotics approach and a mathematics one.

[ IEEE RAS History ]

Here’s a couple of videos from the CMU Robotics Institute archives, showing some of the work that took place over the last few decades.

[ CMU RI ]

In this episode of the Artificial Intelligence Podcast, Lex Fridman speaks with David Ferrucci from IBM about Watson and (you guessed it) artificial intelligence.

David Ferrucci led the team that built Watson, the IBM question-answering system that beat the top humans in the world at the game of Jeopardy. He is also the Founder, CEO, and Chief Scientist of Elemental Cognition, a company working to engineer AI systems that understand the world the way people do. This conversation is part of the Artificial Intelligence podcast.

[ AI Podcast ]

This week’s CMU RI Seminar is by Pieter Abbeel from UC Berkeley, on “Deep Learning for Robotics.”

Programming robots remains notoriously difficult. Equipping robots with the ability to learn would bypass the need for what otherwise often ends up being time-consuming, task-specific programming. This talk will describe recent progress in deep reinforcement learning (robots learning through their own trial and error), in apprenticeship learning (robots learning from observing people), and in meta-learning for action (robots learning to learn). This work has led to new robotic capabilities in manipulation, locomotion, and flight, with the same approach underlying advances in each of these domains.

[ CMU RI ]


#436100 Labrador Systems Developing Affordable ...

Developing robots for the home is still a challenge, especially if you want those robots to interact with people and help them do practical, useful things. However, the potential markets for home robots are huge, and one of the most compelling markets is for home robots that can assist humans who need them. Today, Labrador Systems, a startup based in California, is announcing a pre-seed funding round of $2 million (led by SOSV’s hardware accelerator HAX with participation from Amazon’s Alexa Fund and iRobot Ventures, among others) with the goal of expanding development and conducting pilot studies of “a new [assistive robot] platform for supporting home health.”

Labrador was founded two years ago by Mike Dooley and Nikolai Romanov. Both Mike and Nikolai have backgrounds in consumer robotics at Evolution Robotics and iRobot, but as an ’80s gamer, Mike’s bio (or at least the parts of his bio on LinkedIn) caught my attention: From 1995 to 1997, Mike worked at Brøderbund Software, helping to manage play testing for games like Myst and Riven and the Where in the World is Carmen San Diego series. He then spent three years at Lego as the product manager for MindStorms. After doing some marginally less interesting things, Mike was the VP of product development at Evolution Robotics from 2006 to 2012, where he led the team that developed the Mint floor sweeping robot. Evolution was acquired by iRobot in 2012, and Mike ended up as the VP of product development over there until 2017, when he co-founded Labrador.

I was pretty much sold at Where in the World is Carmen San Diego (the original version of which I played from a 5.25” floppy on my dad’s Apple IIe)*, but as you can see from all that other stuff, Mike knows what he’s doing in robotics as well.

And according to Labrador’s press release, what they’re doing is this:

Labrador Systems is an early stage technology company developing a new generation of assistive robots to help people live more independently. The company’s core focus is creating affordable solutions that address practical and physical needs at a fraction of the cost of commercial robots. … Labrador’s technology platform offers an affordable solution to improve the quality of care while promoting independence and successful aging.

Labrador’s personal robot, the company’s first offering, will enter pilot studies in 2020.

That’s about as light on detail as a press release gets, but there’s a bit more on Labrador’s website, including:

Our core focus is creating affordable solutions that address practical and physical needs. (we are not a social robot company)
By affordable, we mean products and technologies that will be available at less than 1/10th the cost of commercial robots.
We achieve those low costs by fusing the latest technologies coming out of augmented reality with robotics to move things in the real world.

The only hardware we’ve actually seen from Labrador at this point is a demo that they put together for Amazon’s re:MARS conference, which took place a few months ago, showing a “demonstration project” called Smart Walker:

This isn’t the home assistance robot that Labrador got its funding for, but rather a demonstration of some of their technology. So of course, the question is, what’s Labrador working on, then? It’s still a secret, but Mike Dooley was able to give us a few more details.

IEEE Spectrum: Your website shows a smart walker concept—how is that related to the assistive robot that you’re working on?

Mike Dooley: The smart walker was a request from a major senior living organization to have our robot (which is really good at navigation) guide residents from place to place within their communities. To test the idea with residents, it turned out to be much quicker to take the navigation system from the robot and put it on an existing rollator walker. So when you see the clips of the technology in the smart walker video on our website, that’s actually the robot’s navigation system localizing in real time and path planning in an environment.

“Assistive robot” can cover a huge range of designs and capabilities—can you give us any more detail about your robot, and what it’ll be able to do?

One of the core features of our robot is to help people move things where they have difficulty moving themselves, particularly in the home setting. That may sound trivial, but to someone who has impaired mobility, it can be a major daily challenge and negatively impact their life and health in a number of ways. Some examples we repeatedly hear are people not staying hydrated or taking their medication on time simply because there is a distance between where they are and the items they need. Once we have those base capabilities, i.e. the ability to navigate around a home and move things within it, then the robot becomes a platform for a wider variety of applications.

What made you decide to develop assistive robots, and why are robots a good solution for seniors who want to live independently?

Supporting independent living has been seen as a massive opportunity in robotics for some time, but also as something off in the future. The turning point for me was watching my mother enter that stage in her life and seeing her transition to using a cane, then a walker, and eventually to a wheelchair. That made the problems very real for me. It also made things much clearer about how we could start addressing specific needs with the tools that are becoming available now.

In terms of why robots can be a good solution, the basic answer is the level of need is so overwhelming that even helping with “basic” tasks can make an appreciable difference in the quality of someone’s daily life. It’s also very much about giving individuals a degree of control back over their environment. That applies to seniors as well as others whose world starts getting more complex to manage as their abilities become more impaired.

What are the particular challenges of developing assistive robots, and how are you addressing them? Why do you think there aren’t more robotics startups in this space?

The setting (operating in homes and personal spaces) and the core purpose of the product (aiding a wide variety of individuals) bring a lot of complexity to any capability you want to build into an assistive robot. Our approach is to put as much structure as we can into the system to make it functional, affordable, understandable and reliable.

I think one of the reasons you don’t see more startups in the space is that a lot of roboticists want to skip ahead and do the fancy stuff, such as taking on human-level capabilities around things like manipulation. Those are very interesting research topics, but we think those are also very far away from being practical solutions you can productize for people to use in their homes.

How do you think assistive robots and human caregivers should work together?

The ideal scenario is allowing caregivers to focus more of their time on the high-touch, personal side of care. The robot can offload the more basic support tasks as well as extend the impact of the caregiver for the long hours of the day they can’t be with someone at their home. We see that applying to both paid care providers as well as the 40 million unpaid family members and friends that provide assistance.

The robot is really there as a tool, both for individuals in need and the people that help them. What’s promising in the research discussions we’ve had so far is that even when a caregiver is present, giving control back to the individual for simple things can mean a lot in the relationship between them and the caregiver.

What should we look forward to from Labrador in 2020?

Our big goal in 2020 is to start placing the next version of the robot with individuals with different types of needs to let them experience it naturally in their own homes and provide feedback on what they like, what they don’t like, and how we can make it better. We are currently reaching out to companies in the healthcare and home health fields to participate in those studies and test specific applications related to their services. We plan to share more detail about those studies and the robot itself as we get further into 2020.

If you’re an organization (or individual) who wants to possibly try out Labrador’s prototype, the company encourages you to connect with them through their website. And as we learn more about what Labrador is up to, we’ll have updates for you, presumably in 2020.

[ Labrador Systems ]

* I just lost an hour of my life after finding out that you can play Where in the World is Carmen San Diego in your browser for free.
