#436263 Skydio 2 Review: This Is the Drone You ...

Let me begin this review by saying that the Skydio 2 is one of the most impressive robots that I have ever seen. Over the last decade, I’ve spent enough time around robots to have a very good sense of what kinds of things are particularly challenging for them, and to set my expectations accordingly. Those expectations include things like “unstructured environments are basically impossible” and “full autonomy is impractically expensive” and “robot videos rarely reflect reality.”

Skydio’s newest drone is an exception to all of this. It’s able to fly autonomously at speed through complex environments in challenging real-world conditions in a way that’s completely effortless and stress-free for the end user, allowing you to capture the kind of video that would be otherwise impossible, even (I’m guessing) for professional drone pilots. When you see this technology in action, it’s (almost) indistinguishable from magic.

Skydio 2 Price
To be clear, the Skydio 2 is not without compromises, and the price of $999 (on pre-order, with delivery of the next batch expected in spring of 2020) requires some justification. But the week I’ve had with this drone has left me feeling like its fundamental autonomous capability is so far beyond just about anything that I’ve ever experienced that I’m questioning why I would ever fly anything else again.

We’ve written extensively about Skydio, beginning in early 2016 when the company posted a video of a prototype drone dodging trees while following a dude on a bike. Even three years ago, Skydio’s tech was way better than anything we’d seen outside of a research lab, and in early 2018, they introduced their first consumer product, the Skydio R1. A little over a year later, Skydio has introduced the Skydio 2, which is smaller, smarter, and much more affordable. Here’s an overview video just to get you caught up:

Skydio sent me a Skydio 2 review unit last week, and while I’m reasonably experienced with drones in general, this is the first time I’ve tried a Skydio drone in person. I had a pretty good idea what to expect, and I was absolutely blown away. Like, I was giggling to myself while running through the woods as the drone zoomed around, deftly avoiding trees and keeping me in sight. Robots aren’t supposed to be this good.

A week is really not enough time to explore everything that the Skydio can do, especially Thanksgiving week in Washington, D.C. (a no-fly zone) in early winter. But I found a nearby state park in which I could legally and safely fly the drone, and I did my best to put the Skydio 2 through its paces.

Note: Throughout this review, we’ve got a bunch of GIFs to help illustrate different features of the drone. To fit them all in, these GIFs had to be heavily compressed. Underneath each GIF is a timestamped link to this YouTube video (also available at the bottom of the post), which you can click on to see an extended cut of the original 4K 30 fps footage. And there’s a bunch of interesting extra video in there as well.

Skydio 2 Specs

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2 is primarily made out of magnesium, which (while light) is both heavier and more rigid and durable than plastic. The offset props (the back pair are above the body, and the front pair are below) are necessary to maintain the field of view of the navigation cameras.

The Skydio 2 both looks and feels like a well-designed and carefully thought-out drone. It’s solid, and a little on the heavy side as far as drones go—it’s primarily made out of magnesium, which (while light) is both heavier and more rigid and durable than plastic. The blue and black color scheme is far more attractive than you typically see with drones.

Photo: Evan Ackerman/IEEE Spectrum

To detect and avoid obstacles, the Skydio 2 uses an array of six 4K hemispherical cameras that feed data into an NVIDIA Jetson TX2 at 30 fps, with the drone processing a million points in 3D space per second to plan the safest path.

The Skydio 2 is built around an array of six hemispherical obstacle-avoidance cameras and the NVIDIA Jetson TX2 computing module that they’re connected to. This defines the placement of the gimbal, the motors and props, and the battery, since all of this stuff has to stay out of the cameras’ view as much as possible in order for the drone to effectively avoid obstacles in any direction.

Without the bottom-mounted battery attached, the drone is quite flat. The offset props (the back pair are above the body, and the front pair are below) are necessary to maintain the field of view of the obstacle-avoidance cameras. These hemispherical cameras are on the end of each of the prop arms as well as above and below the body of the drone. They look awfully exposed, even though each is protected from ground contact by a little fin. You need to make sure these cameras are clean and smudge-free, and Skydio includes a cleaning cloth for this purpose. Underneath the drone there are slots for microSD cards, one for recording from the camera and a second one that the drone uses to store data. The attention to detail extends to the SD card insertion, which has a sloped channel that guides the card securely into its slot.

Once you snap the battery in, the drone goes from looking streamlined to looking a little chubby. Relative to other drones, the battery almost seems like an afterthought, like Skydio designed the drone and then remembered, “oops we have to add a battery somewhere, let’s just kludge it onto the bottom.” But again, the reason for this is to leave room inside the body for the NVIDIA TX2, while making sure that the battery stays out of view of the obstacle avoidance cameras.

The magnetic latching system for the battery is both solid and satisfying. I’m not sure why it’s necessary, strictly speaking, but I do like it, and it doesn’t seem like the battery will fly off even during the most aggressive maneuvers. Each battery includes an LED array that will display its charge level in 25 percent increments, as well as a button that you push to turn the drone on and off. Charging takes place via a USB-C port in the top of the drone, which I don’t like, because it means that the batteries can’t be charged on their own (like the Parrot Anafi’s battery), and that you can’t charge one battery while flying with another, like basically every other drone ever. A separate battery charger that will charge two at once is available from Skydio for an eyebrow-raising $129.

I appreciate that all of Skydio’s stuff (batteries, controller, and beacon) charges via USB-C, though. The included USB-C adapter with its beefy cable will output at up to 65 watts, which’ll charge a mostly depleted battery in under an hour. The drone turns itself on while charging, which seems unnecessary.

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2 is not foldable, making it not nearly as easy to transport as some other drones. But it does come with a nice case that mitigates this issue somewhat, and the drone plus two batteries end up as a passably flat package about the size of a laptop case.

The most obvious compromise that Skydio made with the Skydio 2 is that the drone is not foldable. Skydio CEO Adam Bry told us that adding folding joints to the arms of the Skydio 2 would have made calibrating all six cameras a nightmare and significantly impacted performance. This makes complete sense, of course, but it does mean that the Skydio 2 is not nearly as easy to transport as some other drones.

Photo: Evan Ackerman/IEEE Spectrum

Folded and unfolded: The Skydio 2 compared to the Parrot Anafi (upper left) and the DJI Mavic Pro (upper right).

The Skydio 2 does come with a very nice case that mitigates this issue somewhat, and the drone plus two batteries end up as a passably flat package about the size of a laptop case. Still, it’s just not as convenient to toss into a backpack as my Anafi, although the Mavic Mini might be even more portable.

Photo: Evan Ackerman/IEEE Spectrum

While the Skydio 2’s case is relatively compact, the non-foldable drone is overall a significantly larger package than the Parrot Anafi.

The design of the drone leads to some other compromises as well. Since landing gear would, I assume, occlude the camera system, the drone lands directly on the bottom of its battery pack, which has a slightly rubberized pad about the size of a playing card. This doesn’t feel particularly stable unless you end up on a very flat surface, and it made me concerned for the exposed cameras underneath the drone as well as the lower set of props. I’d recommend hand takeoffs and landings—more on those later.

Skydio 2 Camera System

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2’s primary camera is a Sony IMX577 1/2.3″ 12.3-megapixel CMOS sensor. It’s mounted to a three-axis gimbal and records 4K video at 60 fps, or 1080p video at 120 fps.

The Skydio 2 comes with a three-axis gimbal supporting a 12-megapixel camera, just enough to record 4K video at 60 fps, or 1080p video at 120 fps. Skydio has provided plenty of evidence that its imaging system is at least as good if not better than other drone cameras. Tested against my Mavic Pro and Parrot Anafi, I found no reason to doubt that. To be clear, I didn’t do exhaustive pixel-peeping comparisons between them; you’re just getting my subjective opinion that the Skydio 2 has a totally decent camera that you won’t be disappointed with. I will say that I found the HDR photo function to be not all that great in the few situations in which I tested it—after looking at a few muddy sunset shots, I turned it off and was much happier.

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2’s 12-megapixel camera is solid, although we weren’t impressed with the HDR option.

The video stabilization is fantastic, to the point where watching the video footage can be underwhelming because it doesn’t reflect the motion of the drone. I almost wish there was a way to change to unstabilized (or less-stabilized) video so that the viewer could get a little more of a wild ride. Or, ideally, there’d be a way for the drone to provide you with a visualization of what it was doing using the data collected by its cameras. That’s probably wishful thinking, though. The drone itself doesn’t record audio because all you’d get would be an annoying buzz, but the app does record audio, so the audio from your phone gets combined with the drone video. Don’t expect great quality, but it’s better than nothing.

Skydio 2 App
The app is very simple compared to every other drone app I’ve tried, and that’s a good thing. Here’s what it looks like:

Image: Skydio

Trackable subjects get a blue “+” sign over them, and if you tap them, the “+” turns into a spinny blue circle. Once you’ve got a subject selected, you can choose from a variety of cinematic skills that the drone will execute while following you.

You get the controls that you need and the information that you need, and nothing else. Manual flight with the on-screen buttons works adequately, and the double-tap to fly function on the phone works surprisingly well, making it easy to direct the drone to a particular spot above the ground.

The settings menus are limited but functional, allowing you to change settings for the camera and a few basic tweaks for controlling the drone. One setting unique to the Skydio 2 is the height floor—since the drone only avoids static obstacles, you can set it to maintain a height of at least 8 feet above the ground while flying autonomously, to make sure that if you’re flying around other people, it won’t run into anyone who isn’t absurdly tall and therefore asking for it.

Trackable subjects get a blue “+” sign over them in the app, and if you tap them, the “+” turns into a spinny blue circle. Once you’ve got a subject selected, you can choose from a variety of cinematic skills that the drone will execute while following you, and in addition, you can select “one-shot” skills that involve the drone performing a specific maneuver before returning to the previously selected cinematic skill. For example, you can tell the drone to orbit around you, and then do a “rocket” one-shot where it’ll fly straight up above you (recording the whole time, of course), before returning to its orbiting.

After you’re done flying, you can scroll through your videos and easily clip out excerpts from them and save them to your phone for sharing. Again, it’s a fairly simple interface without a lot of options. You could call it limited, I guess, but I appreciate that it just does a few things that you care about and otherwise doesn’t clutter itself up.

The real limitation of the app is that it uses Wi-Fi to connect to the Skydio 2, which restricts the range. To fly much beyond a hundred meters or so, you’ll need to use the controller or beacon instead.

Skydio 2 Controller and Beacon

Photo: Evan Ackerman/IEEE Spectrum

While the Skydio 2 controller provides a better hands-on flight experience than with the phone, plus an extended range of up to 3.5 km, more experienced pilots may find manual control a bit frustrating, because the underlying autonomy will supersede your maneuvers when you start getting close to objects.

I was looking forward to using the controller, because with every other drone I’ve had, the precision that a physical controller provides is, I find, mandatory for a good flying experience and to get the photos and videos that you want. With the Skydio 2, that’s all out the window. It’s not that the controller is useless or anything, it’s just that because the drone tracks you and avoids obstacles on its own, that level of control precision becomes largely unnecessary.

The controller itself is perfectly fine. It’s a rebranded Parrot Skycontroller3, which is the same as the one that you get with a Parrot Anafi. It’s too bad that the sticks don’t unscrew to make it a little more portable, and overall it’s functional rather than fancy, but it feels good to use and includes a sizeable antenna that makes a significant difference to the range that you get (up to 3.5 kilometers).

You definitely get a better hands-on flight experience with the controller than with the phone, so if you want to (say) zip the drone around some big open space for fun, it’s good for that. And it’s nice to be able to hand the controller to someone who’s never flown a drone before and let them take it for a spin without freaking out about them crashing it the whole time. For more experienced pilots, though, the controller is ultimately just a bit frustrating, because the underlying autonomy will supersede your control when you start getting close to objects, which (again) limits how useful the controller is relative to your phone.

I do still prefer the controller over the phone, but I’m not sure that it’s worth the extra $150, unless you plan to fly the Skydio 2 at very long distances or primarily in manual mode. And honestly, if either of those two things are your top priority, the Skydio 2 is probably not the drone for you.

Photo: Evan Ackerman/IEEE Spectrum

The Skydio 2 beacon uses GPS tracking to help the drone follow you, extending range up to 1.5 km. You can also fly with the beacon alone, no phone necessary.

The purpose of the beacon, according to Skydio, is to give the drone a way of tracking you if it can’t see you, which can happen, albeit infrequently. My initial impression of the beacon was that it was primarily useful as a range-extending bridge between my phone and the drone. But I accidentally left my phone at home one day (oops) and had to fly the drone with only the beacon, and it was a surprisingly decent experience. The beacon allows for full manual control of a sort—you can tap different buttons to rotate, fly forward, and ascend or descend. This is sufficient for takeoff and landing, for making sure that the drone is looking at you when you engage visual tracking, and for rescuing it if it gets trapped somewhere.

The rest of the beacon’s control functions are centered around a few different tracking modes, and with these, it works just about as well as your phone. You have fewer options overall, but all the basic stuff is there with just a few intuitive button clicks, including tracking range and angle. If you’re willing to deal with this relatively minor compromise, it’s nice to have your phone available for other things rather than having it monopolized by the drone.

Skydio 2 In Flight

GIF: Evan Ackerman/IEEE Spectrum

Hand takeoffs are simple and reliable.
Click here for a full resolution clip.

Starting up the Skydio 2 doesn’t require any kind of unusual calibration steps or anything like that. It prefers to be kept still, but you can start it up while holding it; it’ll just take a few seconds longer to tell you that it’s ready to go. While the drone will launch from any flat surface with significant clearance around it (it’ll tell you if it needs more room), the small footprint of the battery means that I was more comfortable hand launching it. This is not a “throw” launch; you just let the drone rest on your palm, tell it to take off, and then stay still while it gets its motors going and then gently lifts off. The liftoff is so gentle that you have to be careful not to pull your hand away too soon—I did that once and the drone, being not quite ready, dropped towards the ground, but managed to recover without much drama.

GIF: Evan Ackerman/IEEE Spectrum

Hand landings always look scary, but the Skydio 2 is incredibly gentle. After trying this once, it became the only way I ever landed the drone.
Click here for a full resolution clip.

Catching the drone for landing is perhaps very slightly more dangerous, but not any more difficult. You put the drone above and in front of you facing away, tell it to land in the app or with the beacon, and then put your hand underneath it to grasp it as it slowly descends. It settles delicately and promptly turns itself off. Every drone should land this way. The battery pack provides a good place to grip, although you do have to be mindful of the forward set of props, which (since they’re the pair beneath the body of the drone) are quite close to your fingers. You’ll certainly be mindful after you catch a blade with your fingers once. Which I did. For the purposes of this review and totally not by accident. No damage, for the record.

Photo: Evan Ackerman/IEEE Spectrum

You won’t be disappointed with the Skydio 2’s in-flight performance, unless you’re looking for a dedicated racing drone.

In normal flight, the Skydio 2 performs as well as you’d expect. It’s stable and manages light to moderate wind without any problems, although I did notice some occasional lateral drifting when the drone should have been in a stationary hover. While the controller gains are adjustable, the Skydio 2 isn’t quite as aggressive in flight as my Mavic Pro on Sport Mode, but again, if you’re looking for a high-speed drone, that’s really not what the Skydio is all about.

The Skydio 2 is substantially louder than my Anafi, although the Anafi is notably quiet for a drone. It’s not annoying to hear (not a high-pitched whine), but you can hear it from a ways away, and from farther away than my Mavic Pro. I’m not sure whether that’s because of the absolute volume or the volume plus the pitch. In some ways, this is a feature, since you can hear the drone following you even if you’re not looking at it; you just need to be aware of the noise it makes when you’re flying it around people.

Obstacle Avoidance
The primary reason the Skydio 2 is the drone you want to fly is its autonomous subject tracking and obstacle avoidance. Skydio’s PR videos make this capability look almost too good, and since I hadn’t tried one of their drones before, the first thing I did with it was exactly what you’d expect: attempt to fly it directly into the nearest tree.

GIF: Evan Ackerman/IEEE Spectrum

The Skydio 2 deftly slides around trees and branches. The control inputs here were simply “forward” or “turn”; all obstacle avoidance is autonomous.
Click here for a full resolution clip.

And it just won’t do it. It slows down a bit, and then slides right around one tree after another, going over and under and around branches. I pointed the drone into a forest and just held down “forward” and away it went, without any fuss, effortlessly ducking and weaving its way around. Of course, it wasn’t effortless at all—six 4K cameras were feeding data into the NVIDIA TX2 at 30 fps, and the drone was processing a million points in 3D space per second to plan the safest path while simultaneously taking into account where I wanted it to go. I spent about 10 more minutes doing my level best to crash the drone into anything at all using a flying technique probably best described as “reckless,” but the drone was utterly unfazed. It’s incredible.
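For a rough sense of what that planning loop involves, here’s a minimal sketch (in Python, with invented names) of the general technique: score candidate flight directions against an obstacle point cloud, and pick the safest one that still heads where the pilot wants. This is an illustration only, not Skydio’s actual algorithm; a real planner would also account for dynamics, sensing uncertainty, and trajectory smoothness.

```python
import numpy as np

def plan_step(points, goal_dir, num_candidates=64, horizon=2.0, safe_radius=0.5):
    """Pick a flight direction that tracks the goal while keeping every
    observed 3D point outside a safety radius (hypothetical sketch)."""
    rng = np.random.default_rng()
    # Sample candidate unit directions, biased toward the goal direction.
    candidates = rng.normal(goal_dir, 0.5, size=(num_candidates, 3))
    candidates /= np.linalg.norm(candidates, axis=1, keepdims=True)

    best_dir, best_score = None, -np.inf
    for d in candidates:
        # Closest approach of each obstacle point to the straight-line
        # path from the drone (at the origin) out to `horizon` meters.
        t = np.clip(points @ d, 0.0, horizon)
        clearance = np.linalg.norm(points - t[:, None] * d, axis=1).min()
        if clearance < safe_radius:
            continue  # this direction would pass too close to an obstacle
        # Prefer directions aligned with the goal, then extra clearance.
        score = 2.0 * float(d @ goal_dir) + clearance
        if score > best_score:
            best_dir, best_score = d, score
    return best_dir  # None means no safe direction found: hover instead
```

Run on a fresh point cloud every frame, a loop like this is what turns “I pointed it at a forest and held forward” into a safe path rather than a collision.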

What knocked my socks off was telling the drone to pass through treetops—in the clip below, I’m just telling the drone to fly straight down. Watch as it weaves its way through gaps between the branches:

GIF: Evan Ackerman/IEEE Spectrum

The result of parking the Skydio 2 above some trees and holding “down” on the controller is this impressive fully autonomous descent through the branches.
Click here for a full resolution clip.

Here’s one more example, where I sent the drone across a lake and started poking around in a tree. Sometimes the Skydio 2 isn’t sure where you want it to go, and you have to give it a little bit of a nudge in a clear direction, but that’s it.

GIF: Evan Ackerman/IEEE Spectrum

In obstacle-heavy environments, the Skydio 2 prudently slows down, but it can pick its way through almost anything that it can see.
Click here for a full resolution clip.

It’s important to keep in mind that all of the Skydio 2’s intelligence is based on vision. It uses cameras to see the world, which means that it faces many of the same challenges your eyes do. Specifically, Skydio warns against flying in the following conditions:

Skydio 2 can’t see certain visually challenging obstacles. Do not fly around thin branches, telephone or power lines, ropes, netting, wires, chain link fencing or other objects less than ½ inch in diameter.
Do not fly around transparent surfaces like windows or reflective surfaces like mirrors greater than 60 cm wide.
When the sun is low on the horizon, it can temporarily blind Skydio 2’s cameras depending on the angle of flight. Your drone may be cautious or jerky when flying directly toward the sun.

Basically, if you’d have trouble seeing a thing, or seeing under some specific flight conditions, then the Skydio 2 almost certainly will also. It gets even more problematic when challenging obstacles are combined with challenging flight conditions, which is what I’m pretty sure led to the only near-crash I had with the drone. Here’s a video:

GIF: Evan Ackerman/IEEE Spectrum

Flying around very thin branches and into the sun can cause problems for the Skydio 2’s obstacle avoidance.
Click here for a full resolution clip.

I had the Skydio 2 set to follow me on my bike (more about following and tracking in a bit). It was mid-afternoon, but since it’s late fall here in Washington, D.C., the sun doesn’t get much higher than 30 degrees above the horizon. Late fall also means that most of the deciduous trees have lost their leaves, and so there are a bunch of skinny branches all over the place. The drone was doing a pretty good job of following me along the road at a relatively slow speed, and then it clipped the branch that you can just barely see in the video above. It recovered in an acrobatic maneuver that has been mostly video-stabilized out, and resumed tracking me before I freaked and told it to land. You can see another example here, where the drone (again) clips a branch that has the sun behind it, and this clip shows me stopping my bike before the drone runs into another branch in a similar orientation. As the video shows, it’s very hard to see the branches until it’s too late.

As far as I can tell, the drone is no worse for wear from any of this, apart from a small nick in one of the props. But this is a good illustration of a problematic situation for the Skydio 2: flying into a low sun angle around small bare branches. Should I not have been flying the drone in this situation? It’s hard to say. These probably qualify as “thin branches,” although there was plenty of room along the middle of the road. There is an open question with the Skydio 2 as to how much responsibility the user should take for deciding when and where it’s safe to fly—for branches, how thin is too thin? How low can the sun be? What if the branches are only kinda thin and the sun is only kinda low, but it’s also a little windy? Better to be safe than sorry, of course, but there’s really no way for the user (or the drone) to know what it can’t handle until it can’t handle it.

Edge cases like these aside, the obstacle avoidance just works. Even if you’re not deliberately trying to fly into branches, it’s keeping a lookout for you all the time, which means that flying the drone goes from somewhat stressful to just pure fun. I can’t emphasize enough how amazing it is to be able to fly without worrying about running into things, and how great it feels to be able to hand the controller to someone who’s never flown a drone before and say, with complete confidence, “go ahead, fly it around!”

Skydio 2 vs. DJI Mavic

Photo: Evan Ackerman/IEEE Spectrum

Both the Skydio 2 and many models of DJI’s Mavic use visual obstacle avoidance, but the Skydio 2 is so much more advanced that you can’t really compare the two systems.

It’s important to note that there’s a huge difference between the sort of obstacle avoidance that you get with a DJI Mavic and the sort of obstacle avoidance that you get with the Skydio 2. The Mavic’s obstacle avoidance is really there to prevent you from accidentally running into things, and in that capacity, it usually works. But there are two things to keep in mind here—first, not running into things is not the same as avoiding things, because avoiding things means planning several steps ahead, not just one step.

Second, there’s the fact that the Mavic’s obstacle detection only works most of the time. Fundamentally, I don’t trust my Mavic Pro, because sometimes the safety system doesn’t kick in for whatever reason and the drone ends up alarmingly close to something. And that’s actually fine, because with the Mavic, I expect to be piloting it. It’s for this same reason that I don’t care that my Parrot Anafi doesn’t have obstacle avoidance at all: I’m piloting it anyway, and I’m a careful pilot, so it just doesn’t matter. The Skydio 2 is totally and completely different. It’s in a class by itself, and you can’t compare what it can do to anything else out there right now. Period.

Skydio 2 Tracking
Skydio’s big selling point on the Skydio 2 is that it’ll autonomously track you while avoiding obstacles. It does this visually, by watching where you go, predicting your future motion, and then planning its own motion to keep you in frame. This works better than you might expect, in that it’s really very good at not losing you. Obviously, the drone prioritizes not running into stuff over tracking you, which means that it may not always be where you feel like it should be. It’s probably trying to get there, but in obstacle-dense environments, it can take some creative paths.
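Conceptually, “watch, predict, keep in frame” boils down to something like the toy sketch below: a bare constant-velocity predictor plus a framing setpoint that the obstacle-avoidance planner then chases. All names and numbers here are invented for illustration; the drone’s real tracker is far more sophisticated.

```python
import numpy as np

def predict_subject(history, lookahead=1.0):
    """Constant-velocity guess at where the subject will be.
    history: list of (time, position) samples, newest last."""
    (t0, p0), (t1, p1) = history[-2], history[-1]
    velocity = (p1 - p0) / (t1 - t0)
    return p1 + velocity * lookahead

def framing_setpoint(subject_pos, range_m=6.0, azimuth_deg=30.0, height_m=3.0):
    """Drone position that keeps the predicted subject in frame at the
    requested follow range and angle (the app's tracking range and angle)."""
    a = np.radians(azimuth_deg)
    offset = np.array([range_m * np.cos(a), range_m * np.sin(a), height_m])
    return subject_pos + offset

# Each frame: predict, pick a vantage point, then hand that goal to the
# path planner, which may take a creative route to get there safely.
history = [(0.0, np.array([0.0, 0.0, 0.0])), (0.5, np.array([2.0, 0.0, 0.0]))]
goal = framing_setpoint(predict_subject(history))
```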

Having said that, I found it to be very consistent with keeping me in the frame, and I only managed to lose it when changing direction while fully occluded by an obstacle, or while it was executing an avoidance maneuver that was more dynamic than normal. If you deliberately try to hide from the drone, it’s not that hard to do so if there are enough obstacles around, but I didn’t find the tracking to be something that I had to worry about in most cases. When tracking does fail and you’re not using the beacon, the drone will come to a hover. It won’t try to find you, but it will reacquire you if you get back into its field of view.

The Skydio 2 had no problem tracking me running through fairly dense trees:

GIF: Evan Ackerman/IEEE Spectrum

The Skydio 2 had no problem chasing me around through these trees, even while I was asking it to continually change its tracking angle.
Click here for a full resolution clip.

It also managed to keep up with me as I rode my bike along a tree-lined road:

GIF: Evan Ackerman/IEEE Spectrum

The Skydio 2 is easily fast enough to keep up with me on a bike, even while avoiding tree branches.
Click here for a full resolution clip.

It lost me when I asked it to follow very close behind me as I wove through some particularly branch-y trees, but it fails more or less gracefully by just sort of nope-ing out of situations when they start to get bad and coming to a hover somewhere safe.

GIF: Evan Ackerman/IEEE Spectrum

The Skydio 2 knows better than to put itself into situations that it can’t handle, and will bail to a safe spot if things get too complicated.
Click here for a full resolution clip.

After a few days of playing with the drone, I started to get to the point where I could set it to track me and then just forget about it while I rode my bike or whatever, as opposed to constantly turning around to make sure it was still behind me, which is what I was doing initially. It’s a level of trust that I don’t think would be possible with any other drone.

Should You Buy a Skydio 2?

Photo: Evan Ackerman/IEEE Spectrum

We think the Skydio 2 is fun and relaxing to fly, with unique autonomous intelligence that makes it worth the cost.

In case I haven’t said it often enough in this review, the Skydio 2 is an incredible piece of technology. As far as I know (as a robotics journalist, mind you), this represents the state of the art in commercial drone autonomy, and quite possibly the state of the art in drone autonomy, period. And it’s available for $999, which is expensive, but less money than a Mavic 2 Pro. If you’re interested in a new drone, you should absolutely consider the Skydio 2.

There are some things to keep in mind—battery life is a solid but not stellar 20 minutes. Extra batteries are expensive at $99 each (the base kit includes just one). The controller and the beacon are also expensive, at $150 each. And while I think the Skydio 2 is definitely the drone you want to fly, it may not be the drone you want to travel with, since it’s bulky compared to other options.

But there’s no denying the fact that the experience is uniquely magical. Once you’ve flown the Skydio 2, you won’t want to fly anything else. This drone makes it possible to get pictures and videos that would be otherwise impossible, and you can do it completely on your own. You can trust the drone to do what it promises, as long as you’re mindful of some basic and common sense safety guidelines. And we’ve been told that the drone is only going to get smarter and more capable over time.

If you buy a Skydio 2, it comes with the following warranty from Skydio:

“If you’re operating your Skydio 2 within our Safe Flight guidelines, and it crashes, we’ll repair or replace it for free.”

Skydio trusts their drone to go out into a chaotic and unstructured world and dodge just about anything that comes its way. And after a week with this drone, I can see how they’re able to offer this kind of guarantee. This is the kind of autonomy that robots have been promising for years, and the Skydio 2 makes it real.

Detailed technical specifications are available on Skydio’s website, and if you have any questions, post a comment—we’ve got this drone for a little while longer, and I’d be happy to try out (nearly) anything with it.

Skydio 2 Review Video Highlights
This video is about 7 minutes of 4K, 30 fps footage directly from the Skydio 2. The only editing I did was cutting clips together; no stabilization or color correction or anything like that. The drone will record in 4K 60 fps, so it gets smoother than this, but I, er, forgot to change the setting.

[ Skydio ]

#436261 AI and the future of work: The prospects ...

AI experts gathered at MIT last week, with the aim of predicting the role artificial intelligence will play in the future of work. Will it be the enemy of the human worker? Will it prove to be a savior? Or will it be just another innovation—like electricity or the internet?

As IEEE Spectrum previously reported, this conference (“AI and the Future of Work Congress”), held at MIT’s Kresge Auditorium, offered sometimes pessimistic outlooks on the job- and industry-destroying path that AI and automation seems to be taking: Self-driving technology will put truck drivers out of work; smart law clerk algorithms will put paralegals out of work; robots will (continue to) put factory and warehouse workers out of work.

Andrew McAfee, co-director of MIT’s Initiative on the Digital Economy, said even just in the past couple years, he’s noticed a shift in the public’s perception of AI. “I remember from previous versions of this conference, it felt like we had to make the case that we’re living in a period of accelerating change and that AI’s going to have a big impact,” he said. “Nobody had to make that case today.”

Elisabeth Reynolds, executive director of MIT’s Task Force on the Work of the Future, noted that following the path of least resistance is not a viable way forward. “If we do nothing, we’re in trouble,” she said. “The future will not take care of itself. We have to do something about it.”

Panelists and speakers spoke about championing productive uses of AI in the workplace, which ultimately benefit both employees and customers.

As one example, Zeynep Ton, professor at MIT Sloan School of Management, highlighted retailer Sam’s Club’s recent rollout of a program called Sam’s Garage. Previously, customers shopping for tires for their car spent somewhere between 30 and 45 minutes with a Sam’s Club associate paging through manuals and looking up specs on websites.

But with an AI algorithm, they were able to cut that spec hunting time down to 2.2 minutes. “Now instead of wasting their time trying to figure out the different tires, they can field the different options and talk about which one would work best [for the customer],” she said. “This is a great example of solving a real problem, including [enhancing] the experience of the associate as well as the customer.”

“We think of it as an AI-first world that’s coming,” said Scott Prevost, VP of engineering at Adobe. Prevost said AI agents in Adobe’s software will behave something like a creative assistant or intern who will take care of more mundane tasks for you.

“We need a mindset change. That it is not just about minimizing costs or maximizing tax benefits, but really worrying about what kind of society we’re creating and what kind of environment we’re creating if we keep on just automating and [eliminating] good jobs.”
—Daron Acemoglu, MIT Institute Professor of Economics

Prevost cited an internal survey of Adobe customers that found 74 percent of respondents’ time was spent doing repetitive work—the kind that might be automated by an AI script or smart agent.

“It used to be you’d have the resources to work on three ideas [for a creative pitch or presentation],” Prevost said. “But if the AI can do a lot of the production work, then you can have 10 or 100. Which means you can actually explore some of the further out ideas. It’s also lowering the bar for everyday people to create really compelling output.”

In addition to changing the nature of work, noted a number of speakers at the event, AI is also directly transforming the workforce.

Jacob Hsu, CEO of the recruitment company Catalyte, spoke about using AI as a job-placement tool. The company seeks to fill myriad positions including auto mechanics, baristas, and office workers—with its sights on candidates including young people and mid-career job changers. To find them, it advertises on Craigslist, social media, and traditional media.

The prospects who sign up with Catalyte take a battery of tests. The company’s AI algorithms then match each prospect’s skills with the field best suited for their talents.

“We want to be like the Harry Potter Sorting Hat,” Hsu said.

Guillermo Miranda, IBM’s global head of corporate social responsibility, said IBM has increasingly been hiring based not on credentials but on skills. For instance, he said, as much as 50 percent of the company’s new hires in some divisions do not have a traditional four-year college degree. “As a company, we need to be much more clear about hiring by skills,” he said. “It takes discipline. It takes conviction. It takes a little bit of enforcing with H.R. by the business leaders. But if you hire by skills, it works.”

Ardine Williams, Amazon’s VP of workforce development, said the e-commerce giant has been experimenting with developing skills of the employees at its warehouses (a.k.a. fulfillment centers) with an eye toward putting them in a position to get higher-paying work with other companies.

She described an agreement Amazon had made in its Dallas fulfillment center with aircraft maker Sikorsky, which had been experiencing a shortage of skilled workers for its nearby factory. So Amazon offered its employees free certification training to seek higher-paying work at Sikorsky.

“I do that because now I have an attraction mechanism—like a G.I. Bill,” Williams said. The program is also only available for employees who have worked at least a year with Amazon. So their program offers medium-term job retention, while ultimately moving workers up the wage ladder.

Radha Basu, CEO of AI data company iMerit, said her firm aggressively hires from the pool of women and under-resourced minority communities in the U.S. and India. The company specializes in turning unstructured data (e.g. video or audio feeds) into tagged and annotated data for machine learning, natural language processing, or computer vision applications.

“There is a motivation with these young people to learn these things,” she said. “It comes with no baggage.”

Alastair Fitzpayne, executive director of The Aspen Institute’s Future of Work Initiative, said the future of work ultimately means, in bottom-line terms, the future of human capital. “We have an R&D tax credit,” he said. “We’ve had it for decades. It provides credit for companies that make new investment in research and development. But we have nothing on the human capital side that’s analogous.”

So a company that’s making a big investment in worker training does it on their own dime, without any of the tax benefits that they might accrue if they, say, spent it on new equipment or new technology. Fitzpayne said a simple tweak to the R&D tax credit could make a big difference by incentivizing new investment programs in worker training. Which still means Amazon’s pre-existing worker training programs—for a company that already famously pays no taxes—would not count.

“We need a different way of developing new technologies,” said Daron Acemoglu, MIT Institute Professor of Economics. He pointed to the clean energy sector as an example. First a consensus around the problem needs to emerge. Then a broadly agreed-upon set of goals and measurements needs to be developed (e.g., that AI and automation would, for instance, create at least X new jobs for every Y jobs that it eliminates).

Then it just needs to be implemented.

“We need to build a consensus that, along the path we’re following at the moment, there are going to be increasing problems for labor,” Acemoglu said. “We need a mindset change. That it is not just about minimizing costs or maximizing tax benefits, but really worrying about what kind of society we’re creating and what kind of environment we’re creating if we keep on just automating and [eliminating] good jobs.”

#436258 For Centuries, People Dreamed of a ...

This is part six of a six-part series on the history of natural language processing.

In February of this year, OpenAI, one of the foremost artificial intelligence labs in the world, announced that a team of researchers had built a powerful new text generator called the Generative Pre-Trained Transformer 2, or GPT-2 for short. The researchers trained the system on a massive corpus of text using unsupervised learning, giving it a broad set of natural language processing (NLP) capabilities, including reading comprehension, machine translation, and the ability to generate long strings of coherent text.

But as is often the case with NLP technology, the tool held both great promise and great peril. Researchers and policy makers at the lab were concerned that their system, if widely released, could be exploited by bad actors and misappropriated for “malicious purposes.”

The people of OpenAI, which defines its mission as “discovering and enacting the path to safe artificial general intelligence,” were concerned that GPT-2 could be used to flood the Internet with fake text, thereby degrading an already fragile information ecosystem. For this reason, OpenAI decided that it would not release the full version of GPT-2 to the public or other researchers.

GPT-2 is an example of a technique in NLP called language modeling, whereby the computational system internalizes a statistical blueprint of a text so it’s able to mimic it. Just like the predictive text on your phone—which selects words based on words you’ve used before—GPT-2 can look at a string of text and then predict what the next word is likely to be based on the probabilities inherent in that text.
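A toy version of that idea fits in a few lines. The count-based sketch below is the crude ancestor of what GPT-2 does with a neural network; everything in it is illustrative.

```python
import random
from collections import Counter, defaultdict

def train(words):
    """Count how often each word follows each other word."""
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def next_word(model, prev):
    """Sample a next word in proportion to how often it followed `prev`."""
    candidates, weights = zip(*model[prev].items())
    return random.choices(candidates, weights=weights)[0]

corpus = "the cat sat on the mat and the cat slept".split()
model = train(corpus)
print(next_word(model, "the"))  # "cat" two-thirds of the time, "mat" one-third
```

GPT-2 replaces this lookup table with a neural network that conditions on a long window of preceding text rather than on a single previous word.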

GPT-2 can be seen as a descendant of the statistical language modeling that the Russian mathematician A. A. Markov developed in the early 20th century (covered in part three of this series).

GPT-2 used cutting-edge machine learning algorithms to do linguistic analysis with over 1.5 billion parameters.

What’s different with GPT-2, though, is the scale of the textual data modeled by the system. Whereas Markov analyzed a string of 20,000 letters to create a rudimentary model that could predict the likelihood of the next letter of a text being a consonant or a vowel, GPT-2 used 8 million web pages gathered from outbound Reddit links to predict what the next word might be within that entire dataset.

And whereas Markov manually trained his model by counting only two parameters—vowels and consonants—GPT-2 used cutting-edge machine learning algorithms to do linguistic analysis with over 1.5 billion parameters, burning through huge amounts of computational power in the process.
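Markov’s side of that comparison is simple enough to reproduce directly. Here’s a sketch that estimates his two transition probabilities from any text (he did the counting by hand, on 20,000 letters of Pushkin’s Eugene Onegin):

```python
from collections import Counter

def vowel_transition_probs(text):
    """Markov's two-state model: the probability that a vowel or a
    consonant is followed by a vowel, estimated by counting."""
    vowels = set("aeiou")
    classes = ["v" if c in vowels else "c" for c in text.lower() if c.isalpha()]
    pairs = Counter(zip(classes, classes[1:]))
    return {
        state: pairs[(state, "v")] / (pairs[(state, "v")] + pairs[(state, "c")])
        for state in ("v", "c")
    }

# Any long English text works as a stand-in for Markov's hand-counted corpus.
sample = "happy families are all alike every unhappy family is unhappy in its own way"
print(vowel_transition_probs(sample))
```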

The results were impressive. In their blog post, OpenAI reported that GPT-2 could generate synthetic text in response to prompts, mimicking whatever style of text it was shown. If you prompt the system with a line of William Blake’s poetry, it can generate a line back in the Romantic poet’s style. If you prompt the system with a cake recipe, you get a newly invented recipe in response.

Perhaps the most compelling feature of GPT-2 is that it can answer questions accurately. For example, when OpenAI researchers asked the system, “Who wrote the book The Origin of Species?”—it responded: “Charles Darwin.” While only able to respond accurately some of the time, the feature does seem to be a limited realization of Gottfried Leibniz’s dream of a language-generating machine that could answer any and all human questions (described in part two of this series).

After observing the power of the new system in practice, OpenAI elected not to release the fully trained model. In the lead-up to its release in February, there had been heightened awareness about “deepfakes”—synthetic images and videos, generated via machine learning techniques, in which people do and say things they haven’t really done and said. Researchers at OpenAI worried that GPT-2 could be used to essentially create deepfake text, making it harder for people to trust textual information online.

Responses to this decision varied. On one hand, OpenAI’s caution prompted an overblown reaction in the media, with articles about the “dangerous” technology feeding into the Frankenstein narrative that often surrounds developments in AI.

Others took issue with OpenAI’s self-promotion, with some even suggesting that OpenAI purposefully exaggerated GPT-2’s power in order to create hype—while contravening a norm in the AI research community, where labs routinely share data, code, and pre-trained models. As machine learning researcher Zachary Lipton tweeted, “Perhaps what's *most remarkable* about the @OpenAI controversy is how *unremarkable* the technology is. Despite their outsize attention & budget, the research itself is perfectly ordinary—right in the main branch of deep learning NLP research.”

OpenAI stood by its decision to release only a limited version of GPT-2, but has since released larger models for other researchers and the public to experiment with. As yet, there has been no reported case of a widely distributed fake news article generated by the system. But there have been a number of interesting spin-off projects, including GPT-2 poetry and a webpage where you can prompt the system with questions yourself.

There’s even a Reddit group populated entirely with text produced by GPT-2-powered bots. Mimicking humans on Reddit, the bots have long conversations about a variety of topics, including conspiracy theories and Star Wars movies.

This bot-powered conversation may signify the new condition of life online, where language is increasingly created by a combination of human and non-human agents, and where maintaining the distinction between human and non-human, despite our best efforts, is increasingly difficult.

The idea of using rules, mechanisms, and algorithms to generate language has inspired people in many different cultures throughout history. But it’s in the online world that this powerful form of wordcraft may really find its natural milieu—in an environment where the identity of speakers becomes more ambiguous, and perhaps, less relevant. It remains to be seen what the consequences will be for language, communication, and our sense of human identity, which is so bound up with our ability to speak in natural language.

This is the sixth installment of a six-part series on the history of natural language processing. Last week’s post explained how an innocent Microsoft chatbot turned instantly racist on Twitter.

You can also check out our prior series on the untold history of AI.

#436256 Alphabet Is Developing a Robot to Take ...

Robots excel at carrying out specialized tasks in controlled environments, but put them in your average office and they’d be lost. Alphabet wants to change that by developing what they call the Everyday Robot, which could learn to help us out with our daily chores.

For a long time most robots were painstakingly hand-coded to carry out their functions, but since the deep learning revolution earlier this decade there’s been a growing effort to imbue them with AI that lets them learn new tasks through experience.

That’s led to some impressive breakthroughs, like a robotic hand nimble enough to solve a Rubik’s cube and a robotic arm that can accurately toss bananas across a room.

And it turns out Alphabet’s early-stage research and development division, Alphabet X, has also secretly been using similar machine learning techniques to develop robots adaptable enough to carry out a range of tasks in cluttered and unpredictable human environments like homes and offices.

The robots they’ve built combine a wheeled base with a single arm and a head full of sensors (including LIDAR) for 3D scanning, borrowed from Alphabet’s self-driving car division, Waymo.

At the minute, though, they’re largely restricted to sorting trash for recycling, project leader Hans Peter Brondmo writes in a blog post. While that might sound mundane, identifying different kinds of trash, grasping it, and moving it to the correct bin is still a difficult thing for a robot to do consistently. Some of the robots also have to navigate around the office to sort trash at various recycling stations.

Alphabet says even its human staff were getting it wrong 20 percent of the time, but after several months of training the robots have managed to get that down to 3.5 percent.

Every day, 30 robots toil away in what’s been dubbed the “playpen” sorting trash, and then every night thousands of virtual robots continue to practice in a simulation. This experience is then used to update the robots’ control algorithms each night. All the robots also share their experiences with the others through a process called collaborative learning.
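As a rough illustration of what “collaborative learning” plus a nightly merge could mean, here’s a toy sketch (invented structure, not Alphabet’s code) in which each robot logs its grasp attempts and one overnight merge picks the best-known approach per trash category for the whole fleet:

```python
from collections import Counter, defaultdict

class SharedGraspModel:
    """Toy stand-in for collaborative learning: each robot logs its grasp
    attempts, and one overnight merge updates the fleet-wide estimate of
    which approach works best for each trash category."""

    def __init__(self):
        self.attempts = defaultdict(Counter)   # category -> approach -> tries
        self.successes = defaultdict(Counter)  # category -> approach -> wins

    def log(self, category, approach, succeeded):
        self.attempts[category][approach] += 1
        if succeeded:
            self.successes[category][approach] += 1

    def merge(self, other):
        """Pool another robot's day of experience into the shared model."""
        for cat, counts in other.attempts.items():
            self.attempts[cat].update(counts)
        for cat, counts in other.successes.items():
            self.successes[cat].update(counts)

    def best_approaches(self):
        """Best-known approach per category, pushed fleet-wide each morning."""
        return {
            cat: max(tries, key=lambda a: self.successes[cat][a] / tries[a])
            for cat, tries in self.attempts.items()
        }
```

In a real system, the pooled experience would update a learned control policy rather than a lookup table, but the shape is the same: many robots (and thousands of simulated ones) gathering experience, one improved model pushed back out to all of them.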

The process isn’t flawless, though. Wired’s Tom Simonite notes that while the robots exhibit some uncannily smart behaviors, like stirring piles of rubbish to make it easier to grab specific items, they also frequently miss or fumble the objects they’re trying to grasp.

Nonetheless, the project’s leaders are happy with their progress so far. And the hope is that creating robots that are able to learn from little more than experience in complex environments like an office should be a first step towards general-purpose robots that can pick up a variety of useful skills to assist humans.

Taking that next step will be the major test of the project. So far there’s been limited evidence that experience gained by robots in one task can be transferred to learning another. That’s something the group hopes to demonstrate next year.

And it seems there may be more robot news coming out of Alphabet X soon. The group has several other robotics “moonshots” in the pipeline, built on technology and talent transferred over in 2016 from the remains of a broadly unsuccessful splurge on robotics startups by former Google executive Andy Rubin.

Whether this robotics renaissance at Alphabet will finally help robots break into our homes and offices remains to be seen, but with the resources they have at hand, they just may be able to make it happen.

Image Credit: Everyday Robot, Alphabet X

#436234 Robot Gift Guide 2019

Welcome to the eighth edition of IEEE Spectrum’s Robot Gift Guide!

This year we’re featuring 15 robotic products that we think will make fantastic holiday gifts. As always, we tried to include a broad range of robot types and prices, focusing mostly on items released this year. (A reminder: While we provide links to places where you can buy these items, we’re not endorsing any in particular, and a little bit of research may result in better deals.)

If you need even more robot gift ideas, take a look at our past guides: 2018, 2017, 2016, 2015, 2014, 2013, and 2012. Some of those robots are still great choices and might be way cheaper now than when we first posted about them. And if you have suggestions that you’d like to share, post a comment below to help the rest of us find the perfect robot gift.

Skydio 2

Image: Skydio

What makes robots so compelling is their autonomy, and the Skydio 2 is one of the most autonomous robots we’ve ever seen. It uses an array of cameras to map its environment and avoid obstacles in real time, making flight safe and effortless and enabling the kinds of shots that would be impossible otherwise. Seriously, this thing is magical, and it’s amazing that you can actually buy one.
$1,000
Skydio
UBTECH Jimu MeeBot 2

Image: UBTECH

The Jimu MeeBot 2.0 from UBTECH is a STEM education robot designed to be easy to build and program. It includes six servo motors, a color sensor, and LED lights. An app for iPhone or iPad provides step-by-step 3D instructions, and helps you code different behaviors for the robot. It’s available exclusively from Apple.
$130
Apple
iRobot Roomba s9+

Image: iRobot

We know that $1,400 is a crazy amount of money to spend on a robot vacuum, but the Roomba s9+ is a crazy robot vacuum. As if all of its sensors and mapping intelligence weren’t enough, it empties itself, which means that you can have your floors vacuumed every single day for a month and you don’t have to even think about it. This is what home robots are supposed to be.
$1,400
iRobot
PFF Gita

Photo: Piaggio Fast Forward

Nobody likes carrying things, which is why Gita is perfect for everyone with an extra $3,000 lying around. Developed by Piaggio Fast Forward, this autonomous robot will follow you around with a cargo hold full of your most important stuff, and do it in a way guaranteed to attract as much attention as possible.
$3,250
Gita
DJI Mavic Mini

Photo: DJI

It’s tiny, it’s cheap, and it takes good pictures—what more could you ask for from a drone? And for $400, this is an excellent drone to get if you’re on a budget and comfortable with manual flight. Keep in mind that while the Mavic Mini is small enough that you don’t need to register it with the FAA, you do still need to follow all the same rules and regulations.
$400
DJI
LEGO Star Wars Droid Commander

Image: LEGO

Designed for kids ages 8+, this LEGO set includes more than 1,000 pieces, enough to build three different droids: R2-D2, Gonk Droid, and Mouse Droid. Using a Bluetooth-controlled robotic brick called Move Hub, which connects to the LEGO BOOST Star Wars app, kids can change how the robots behave and solve challenges, learning basic robotics and coding skills.
$200
LEGO
Sony Aibo

Photo: Sony

Robot pets don’t get much more sophisticated (or expensive) than Sony’s Aibo. Strictly speaking, it’s one of the most complex consumer robots you can buy, and Sony continues to add to Aibo’s software. Recent new features include user programmability, and the ability to “feed” it.
$2,900 (free aibone and paw pads until 12/29/2019)
Sony
Neato Botvac D4 Connected

Photo: Neato

The Neato Botvac D4 may not have all of the features of its fancier and more expensive siblings, but it does have the features that you probably care the most about: The ability to make maps of its environment for intelligent cleaning (using lasers!), along with user-defined no-go lines that keep it where you want it. And it cleans quite well, too.
$530 $350 (sale)
Neato Robotics
Cubelets Curiosity Set

Photo: Modular Robotics

Cubelets are magnetic blocks that you can snap together to make an endless variety of robots with no programming and no wires. The newest set, called Curiosity, is designed for kids ages 4+ and comes with 10 robotic cubes. These include light and distance sensors, motors, and a Bluetooth module, which connects the robot constructions to the Cubelets app.
$250
Modular Robotics
Tertill

Photo: Franklin Robotics

Tertill does one simple job: It weeds your garden. It’s waterproof, dirt proof, solar powered, and fully autonomous, meaning that you can leave it out in your garden all summer and just enjoy eating your plants rather than taking care of them.
$350
Tertill
iRobot Root

Photo: iRobot

Root was originally developed by Harvard University as a tool to help kids progressively learn to code. iRobot has taken over Root and is now supporting the curriculum, which starts for kids before they even know how to read and should keep them busy for years afterwards.
$200
iRobot
LOVOT

Image: Lovot

Let’s be honest: Nobody is really quite sure what LOVOT is. We can all agree that it’s kinda cute, though. And kinda weird. But cute. Created by Japanese robotics startup Groove X, LOVOT does have a whole bunch of tech packed into its bizarre little body and it will do its best to get you to love it.
$2,750 (¥300,000)
LOVOT
Sphero RVR

Photo: Sphero

RVR is a rugged, versatile, easy to program mobile robot. It’s a development platform designed to be a bridge between educational robots like Sphero and more sophisticated and expensive systems like Misty. It’s mostly affordable, very expandable, and comes from a company with a lot of experience making robots.
$250
Sphero
“How to Train Your Robot”

Image: Lawrence Hall of Science

Aimed at 4th and 5th graders, “How to Train Your Robot,” written by Blooma Goldberg, Ken Goldberg, and Ashley Chase, and illustrated by Dave Clegg, is a perfect introduction to robotics for kids who want to get started with designing and building robots. But the book isn’t just for beginners: It’s also a fun, inspiring read for kids who are already into robotics and want to go further—it even introduces concepts like computer simulations and deep learning. You can download a free digital copy or request hardcopies here.
Free
UC Berkeley
MIT Mini Cheetah

Photo: MIT

Yes, Boston Dynamics’ Spot, now available for lease, is probably the world’s most famous quadruped, but MIT is starting to pump out Mini Cheetahs en masse for researchers, and while we’re not exactly sure how you’d manage to get one of these things short of stealing one directly from MIT, a Mini Cheetah is our fantasy robotics gift this year. Mini Cheetah looks like a ton of fun—it’s portable, highly dynamic, super rugged, and easy to control. We want one!
Price N/A
MIT Biomimetic Robotics Lab

For more tech gift ideas, see also IEEE Spectrum’s annual Gift Guide.
