Tag Archives: keep

#435674 MIT Future of Work Report: We ...

Robots aren’t going to take everyone’s jobs, but technology has already reshaped the world of work in ways that are creating clear winners and losers. And it will continue to do so without intervention, says the first report of MIT’s Task Force on the Work of the Future.

The supergroup of MIT academics was set up by MIT President Rafael Reif in early 2018 to investigate how emerging technologies will impact employment and devise strategies to steer developments in a positive direction. And the headline finding from their first publication is that it’s not the quantity of jobs we should be worried about, but the quality.

Widespread press reports of a looming “employment apocalypse” brought on by AI and automation are probably wide of the mark, according to the authors. Shrinking workforces as developed countries age and outstanding limitations in what machines can do mean we’re unlikely to have a shortage of jobs.

But while unemployment is historically low, recent decades have seen a polarization of the workforce as the number of both high- and low-skilled jobs has grown at the expense of the middle-skilled ones, driving growing income inequality and depriving the non-college-educated of viable careers.

This is at least partly attributable to the growth of digital technology and automation, the report notes, which are rendering obsolete many middle-skilled jobs based around routine work like assembly lines and administrative support.

That leaves workers to either pursue high-skilled jobs that require deep knowledge and creativity, or settle for low-paid jobs that rely on skills—like manual dexterity or interpersonal communication—that are still beyond machines, but generic to most humans and therefore not valued by employers. And the growth of emerging technology like AI and robotics is only likely to exacerbate the problem.

This isn’t the first report to note this trend. The World Bank’s 2016 World Development Report noted how technology is causing a “hollowing out” of labor markets. But the MIT report goes further in saying that the cause isn’t simply technology, but the institutions and policies we’ve built around it.

The motivation for introducing new technology is broadly assumed to be to increase productivity, but the authors note a rarely-acknowledged fact: “Not all innovations that raise productivity displace workers, and not all innovations that displace workers substantially raise productivity.”

Examples of the former include computer-aided design software that makes engineers and architects more productive, while examples of the latter include self-service checkouts and automated customer support that replace human workers, often at the expense of a worse customer experience.

While the report notes that companies have increasingly adopted the language of technology augmenting labor, in reality this has only really benefited high-skilled workers. For lower-skilled jobs the motivation is primarily labor cost savings, which highlights the other major force shaping technology’s impact on employment: shareholder capitalism.

The authors note that up until the 1980s, increasing productivity resulted in wage growth across the economic spectrum, but since then average wage growth has failed to keep pace and gains have dramatically skewed towards the top earners.

The report shies away from directly linking this trend to the birth of Reaganomics (something others have been happy to do), but it notes that American veneration of the shareholder as the primary stakeholder in a business and tax policies that incentivize investment in capital rather than labor have exacerbated the negative impacts technology can have on employment.

That means the current focus on re-skilling workers to thrive in the new economy is a necessary, but not sufficient, solution to the disruptive impact technology is having on work, the authors say.

Alongside significant investment in education, fiscal policies need to be re-balanced away from subsidizing investment in physical capital and towards boosting investment in human capital, the authors write, and workers need to have a greater say in corporate decision-making.

The authors point to other developed economies where productivity growth, income growth, and equality haven’t become so disconnected thanks to investments in worker skills, social safety nets, and incentives to invest in human capital. Whether such a radical reshaping of US economic policy is achievable in today’s political climate remains to be seen, but the authors conclude with a call to arms.

“The failure of the US labor market to deliver broadly shared prosperity despite rising productivity is not an inevitable byproduct of current technologies or free markets,” they write. “We can and should do better.”

Image Credit: Simon Abrams / Unsplash

Posted in Human Robots

#435660 Toyota Research Developing New ...

With the Olympics taking place next year in Japan, Toyota is (among other things) stepping up its robotics game to help provide “mobility for all.” We know that Toyota’s HSR will be doing work there, along with a few other mobile systems, but the Toyota Research Institute (TRI) has just announced a new telepresence robot called the T-TR1, featuring an absolutely massive screen designed to give you a near-lifesize virtual presence.

T-TR1 is a virtual mobility/tele-presence robot developed by Toyota Research Institute in the United States. It is equipped with a camera atop a large, near-lifesize display.
By projecting an image of a user from a remote location, the robot will help that person feel more physically present at the robot’s location.
With T-TR1, Toyota will give people who are physically unable to attend events such as the Games a chance to virtually attend, with an on-screen presence capable of conversation between the two locations.

TRI isn’t ready to share much more detail on this system yet (we asked, of course), but we can infer some things from the video and the rest of the info that’s out there. For example, that ball on top is a 360-degree camera (that looks a lot like an Insta360 Pro), giving the remote user just as good an awareness of their surroundings as they would have if they were there in person. There are multiple 3D-sensing systems, including at least two depth cameras plus a lidar at the base. It’s not at all clear whether the robot is autonomous or semi-autonomous (using the sensors for automated obstacle avoidance, say), and since the woman on the other end of the robot does not seem to be controlling it at all for the demo, it’s hard to make an educated guess about the level of autonomy, or even how it’s supposed to be controlled.

We really like that enormous screen—despite the fact that telepresence now requires pants. It adds to the embodiment that makes independent telepresence robots useful. It’s also nice that the robot can move fast enough to keep up with a person walking briskly. Hopefully, it’s safe for it to move at that speed in an environment more realistic than a carpeted, half-empty conference room, although it’ll probably have to leverage all of those sensors to do so. The other challenge for the T-TR1 will be bandwidth—even assuming that all of the sensor data processing and stuff is done on-robot, 360 cameras are huge bandwidth hogs, plus there’s the primary (presumably high quality) feed from the main camera, and then the video of the user coming the other way. It’s a lot of data in a very latency-sensitive application, and it’ll presumably be operating in places where connectivity is going to be a challenge due to crowds. This has always been a problem for telepresence robots—no matter how amazing your robot is, the experience will often, for better or worse, be defined by Internet connections that you may have no control over.
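Toyota hasn’t published any bitrate numbers, but a back-of-envelope tally makes the point. Every figure below is our own assumption, in the ballpark of typical compressed-video streaming rates, and the stream list itself is just a guess at how the system might be plumbed:

```python
# Rough estimate of the T-TR1's sustained bandwidth needs.
# All bitrates are assumptions based on typical compressed-video
# streaming rates; Toyota has not published any specs.

MBPS = 1e6  # bits per second

streams = {
    "360-degree camera (4K equirectangular, uplink)": 25 * MBPS,
    "primary camera (1080p, uplink)":                  5 * MBPS,
    "remote user's video (near-lifesize, downlink)":   8 * MBPS,
}

total = sum(streams.values())
for name, rate in streams.items():
    print(f"{name}: {rate / MBPS:.0f} Mbps")
print(f"sustained total: {total / MBPS:.0f} Mbps, "
      "before telemetry or any packet-loss headroom")
```

Even with aggressive compression, that’s tens of megabits per second, sustained and in both directions, over networks shared with a stadium full of phones.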

We should emphasize that Toyota has only released the bare minimum of information about the T-TR1, although we’re told that we can expect more as the 2020 Olympics approach: opening ceremonies are one year from today.

[ TRI ]

Posted in Human Robots

#435648 Surprisingly Speedy Soft Robot Survives ...

Soft robots are getting more and more popular for some very good reasons. Their relative simplicity is one. Their relative low cost is another. And for their simplicity and low cost, they’re generally able to perform very impressively, leveraging the unique features inherent to their design and construction to move themselves and interact with their environment. The other significant reason why soft robots are so appealing is that they’re durable. Without the constraints of rigid parts, they can withstand the sort of abuse that would make any roboticist cringe.

In the current issue of Science Robotics, a group of researchers from Tsinghua University in China and University of California, Berkeley, present a new kind of soft robot that’s both higher-performing and much more robust than just about anything we’ve seen before. The deceptively simple robot looks like a bent strip of paper, but it’s able to move at 20 body lengths per second and survive being stomped on by a human wearing tennis shoes. Take that, cockroaches.

This prototype robot measures just 3 centimeters by 1.5 centimeters. It takes a scanning electron microscope to actually see what the robot is made of—a thermoplastic layer is sandwiched between palladium-gold electrodes, bonded with adhesive silicone to a structural plastic at the bottom. When an AC voltage (as low as 8 volts but typically about 60 volts) is run through the electrodes, the thermoplastic extends and contracts, causing the robot’s back to flex and the little “foot” to shuffle. A complete step cycle takes just 5 milliseconds, yielding a 200-hertz gait. And technically, the robot “runs,” since it does have a brief aerial phase.

Image: Science Robotics

Photos from a high-speed camera show the robot’s gait (A to D) as it contracts and expands its body.
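
The speed claim follows directly from those gait numbers, and the arithmetic implies a remarkably small stride. Here’s a quick sanity check; the body length, gait frequency, and relative speed are from the paper, while the implied stride length is our own calculation, not a published figure:

```python
# Sanity-check the soft robot's reported gait numbers.
body_length_cm = 3.0    # robot length, from the paper
gait_hz = 200.0         # one step cycle every 5 milliseconds
speed_bl_per_s = 20.0   # reported top speed in body lengths per second

speed_cm_per_s = speed_bl_per_s * body_length_cm  # absolute speed
stride_cm = speed_cm_per_s / gait_hz              # distance covered per step cycle

print(f"absolute speed: {speed_cm_per_s:.0f} cm/s")
print(f"implied stride: {stride_cm * 10:.0f} mm per cycle "
      f"({stride_cm / body_length_cm:.2f} body lengths per step)")
```

So each “step” moves the robot only about 3 millimeters; the speed comes entirely from taking 200 of them every second.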

To put the robot’s top speed of 20 body lengths per second in perspective, have a look at this nifty chart, which shows the relative running speeds of some animals and robots versus body mass:

Image: Science Robotics

This chart shows the relative running speeds of some mammals (purple area), arthropods (orange area), and soft robots (blue area) versus body mass. For both mammals and arthropods, relative speeds show a strong negative scaling law with respect to body mass: speeds increase as body masses decrease. However, for soft robots, the relationship appears to be the opposite: speeds decrease as body mass decreases. For the little soft robots created by the researchers from Tsinghua University and UC Berkeley (red stars), the scaling law is similar to that of living animals: higher speed was attained as the body mass decreased.

If you were wondering, like we were, just what that number 39 is on that chart (top left corner), it’s a species of tiny mite that was discovered underneath a rock in California in 1916. The mite is just under 1 mm in size, but it can run at 0.8 kilometer per hour, which is 322 body lengths per second, making it by far (like, by a factor of two at least) the fastest land animal on Earth relative to size. If a human were to run that fast relative to our size, we’d be traveling at a little bit over 2,000 kilometers per hour. It’s not a coincidence that pretty much everything in the upper left of the chart is an insect—speed scales favorably with decreasing mass, since actuators have a proportionally larger effect.
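
That conversion is easy to check. Here’s a minimal sketch, assuming a 0.7 mm body for the mite (“just under 1 mm”) and a 1.8-meter human; the exact output shifts with the assumed body lengths, but it lands right around the figures above:

```python
# Convert the mite's absolute speed to body lengths per second,
# then scale that relative speed up to human size.
mite_speed_m_s = 0.8 * 1000 / 3600  # 0.8 km/h in m/s
mite_body_m = 0.0007                # assumed: "just under 1 mm"
human_body_m = 1.8                  # assumed human height

relative_speed = mite_speed_m_s / mite_body_m     # body lengths per second
human_equiv_kmh = relative_speed * human_body_m * 3.6

print(f"mite: {relative_speed:.0f} body lengths per second")
print(f"human equivalent: {human_equiv_kmh:.0f} km/h")
```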

Other notable robots on the chart with impressive speed to mass ratios are number 27, which is this magnetically driven quadruped robot from UMD, and number 86, UC Berkeley’s X2-VelociRoACH.

Anyway, back to this robot. Some other cool things about it:

You can step on it, squishing it flat with a load about 1 million times its own body weight, and it’ll keep on crawling, albeit only half as fast.
Even climbing a slope of 15 degrees, it can still manage to move at 1 body length per second.
It carries peanuts! With a payload of six times its own weight, it moves a sixth as fast, but still, it’s not like you need your peanuts delivered all that quickly anyway, do you?

Image: Science Robotics

The researchers also put together a prototype with two legs instead of one, which was able to demonstrate a potentially faster galloping gait by spending more time in the air. They suggest that robots like these could be used for “environmental exploration, structural inspection, information reconnaissance, and disaster relief,” which are the sorts of things that you suggest that your robot could be used for when you really have no idea what it could be used for. But this work is certainly impressive, with speed and robustness that are largely unmatched by other soft robots. An untethered version seems possible due to the relatively low voltages required to drive the robot, and if they can put some peanut-sized sensors on there as well, practical applications might actually be forthcoming sometime soon.

“Insect-scale Fast Moving and Ultrarobust Soft Robot,” by Yichuan Wu, Justin K. Yim, Jiaming Liang, Zhichun Shao, Mingjing Qi, Junwen Zhong, Zihao Luo, Xiaojun Yan, Min Zhang, Xiaohao Wang, Ronald S. Fearing, Robert J. Full, and Liwei Lin from Tsinghua University and UC Berkeley, is published in Science Robotics.

Posted in Human Robots

#435640 Video Friday: This Wearable Robotic Tail ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
CLAWAR 2019 – August 26-28, 2019 – Kuala Lumpur, Malaysia
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 2, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

Lakshmi Nair from Georgia Tech describes some fascinating research towards robots that can create their own tools, as presented at ICRA this year:

Using a novel capability to reason about shape, function, and attachment of unrelated parts, researchers have for the first time successfully trained an intelligent agent to create basic tools by combining objects.

The breakthrough comes from Georgia Tech’s Robot Autonomy and Interactive Learning (RAIL) research lab and is a significant step toward enabling intelligent agents to devise more advanced tools that could prove useful in hazardous – and potentially life-threatening – environments.

[ Lakshmi Nair ]

Victor Barasuol, from the Dynamic Legged Systems Lab at IIT, wrote in to share some new research on their HyQ quadruped that enables sensorless shin collision detection. This helps the robot navigate unstructured environments, and also mitigates all those painful shin strikes, because ouch.

This will be presented later this month at the International Conference on Climbing and Walking Robots (CLAWAR) in Kuala Lumpur, Malaysia.

[ IIT ]

Thanks Victor!

You used to have a tail, you know—as an embryo, about a month into your development. All mammals used to have tails, and now we just have useless tailbones, which don’t help us with balancing even a little bit. BRING BACK THE TAIL!

The tail, created by Junichi Nabeshima, Kouta Minamizawa, and MHD Yamen Saraiji from Keio University’s Graduate School of Media Design, was presented at SIGGRAPH 2019 Emerging Technologies.

[ Paper ] via [ Gizmodo ]

The noises in this video are fantastic.

[ ESA ]

Apparently the industrial revolution wasn’t a thorough enough beatdown of human knitting, because the robots are at it again.

[ MIT CSAIL ]

Skydio’s drones just keep getting more and more impressive. Now if only they’d make one that I can afford…

[ Skydio ]

The only thing more fun than watching robots is watching people react to robots.

[ SEER ]

There aren’t any robots in this video, but it’s robotics-related research, and very soothing to watch.

[ Stanford ]

#autonomousicecreamtricycle

In case it wasn’t clear, which it wasn’t, this is a Roboy project. And if you didn’t understand that first video, you definitely won’t understand this second one:

Whatever that t-shirt is at the end (Roboy in sunglasses puking rainbows…?), I need one.

[ Roboy ]

By adding electronics and computation technology to a simple cane that has been around since ancient times, a team of researchers at Columbia Engineering has transformed it into a 21st century robotic device that can provide light-touch assistance in walking to the aged and others with impaired mobility.

The light-touch robotic cane, called CANINE, acts as a cane-like mobile assistant. The device improves the individual’s proprioception, or self-awareness in space, during walking, which in turn improves stability and balance.

[ ROAR Lab ]

During the second field experiment for DARPA’s OFFensive Swarm-Enabled Tactics (OFFSET) program, which took place at Fort Benning, Georgia, teams of autonomous air and ground robots tested tactics on a mission to isolate an urban objective. Similar to the way a firefighting crew establishes a boundary around a burning building, they first identified locations of interest and then created a perimeter around the focal point.

[ DARPA ]

I think there’s a bit of new footage here of Ghost Robotics’ Vision 60 quadruped walking around without sensors on unstructured terrain.

[ Ghost Robotics ]

If you’re as tired of passenger drone hype as I am, there’s absolutely no need to watch this video of NEC’s latest hover test.

[ AP ]

As researchers teach robots to perform more and more complex tasks, the need for realistic simulation environments is growing. Existing techniques for closing the reality gap by approximating real-world physics often require extensive real world data and/or thousands of simulation samples. This paper presents TuneNet, a new machine learning-based method to directly tune the parameters of one model to match another using an iterative residual tuning technique. TuneNet estimates the parameter difference between two models using a single observation from the target and minimal simulation, allowing rapid, accurate and sample-efficient parameter estimation.

The system can be trained via supervised learning over an auto-generated simulated dataset. We show that TuneNet can perform system identification, even when the true parameter values lie well outside the distribution seen during training, and demonstrate that simulators tuned with TuneNet outperform existing techniques for predicting rigid body motion. Finally, we show that our method can estimate real-world parameter values, allowing a robot to perform sim-to-real task transfer on a dynamic manipulation task unseen during training. We are also making a baseline implementation of our code available online.
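
The loop described in that abstract is straightforward to sketch. Below is a hypothetical rendition of iterative residual tuning, not the authors’ released code: a learned network predicts a parameter difference from a pair of observations, and the estimate is refined over a handful of cheap simulations. The `residual_net` and `simulate` callables are stand-ins for whatever model and simulator you have:

```python
import numpy as np

def tune(theta_init, obs_target, residual_net, simulate, n_iters=10):
    """Iterative residual tuning, sketched after TuneNet.

    theta_init:   initial simulator parameter estimate (np.ndarray)
    obs_target:   a single observation from the target system
    residual_net: learned model mapping (obs_sim, obs_target) -> delta theta
    simulate:     runs the simulator with given parameters, returns an observation
    """
    theta = np.asarray(theta_init, dtype=float).copy()
    for _ in range(n_iters):
        obs_sim = simulate(theta)                  # one cheap simulation rollout
        delta = residual_net(obs_sim, obs_target)  # predicted parameter difference
        theta += delta                             # residual update toward the target
    return theta
```

Training, per the abstract, is supervised: pairs of simulated models with known parameter offsets are auto-generated, and the network is fit to predict the offset, which is what keeps the method sample-efficient on the real system.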

[ Paper ]

Here’s an update on what GITAI has been up to with their telepresence astronaut-replacement robot.

[ GITAI ]

Curiosity captured this 360-degree panorama of a location on Mars called “Teal Ridge” on June 18, 2019. This location is part of a larger region the rover has been exploring called the “clay-bearing unit” on the side of Mount Sharp, which is inside Gale Crater. The scene is presented with a color adjustment that approximates white balancing to resemble how the rocks and sand would appear under daytime lighting conditions on Earth.

[ MSL ]

Some updates (in English) on ROS from ROSCon France. The first is a keynote from Brian Gerkey:

And this second video is from Omri Ben-Bassat, about how to keep your Anki Vector alive using ROS:

All of the ROSCon FR talks are available on Vimeo.

[ ROSCon FR ]

Posted in Human Robots

#435632 DARPA Subterranean Challenge: Tunnel ...

The Tunnel Circuit of the DARPA Subterranean Challenge starts later this week at the NIOSH research mine just outside of Pittsburgh, Pennsylvania. From 15-22 August, 11 teams will send robots into a mine that they've never seen before, with the goal of making maps and locating items. All DARPA SubT events involve tunnels of one sort or another, but in this case, the “Tunnel Circuit” refers to mines as opposed to urban underground areas or natural caves. This month’s challenge is the first of three discrete events leading up to a huge final event in August of 2021.

While the Tunnel Circuit competition will be closed to the public, and media are only allowed access for a single day (which we'll be at, of course), DARPA has provided a substantial amount of information about what teams will be able to expect. We also have details from the SubT Integration Exercise, called STIX, which was a completely closed event that took place back in April. STIX was aimed at giving some teams (and DARPA) a chance to practice in a real tunnel environment.

For more general background on SubT, here are some articles to get you all caught up:

SubT: The Next DARPA Challenge for Robotics

Q&A with DARPA Program Manager Tim Chung

Meet The First Nine Teams

It makes sense to take a closer look at what happened at April's STIX exercise, because it is (probably) very similar to what teams will experience in the upcoming Tunnel Circuit. STIX took place at Edgar Experimental Mine in Colorado, and while no two mines are the same (and many are very, very different), there are enough similarities for STIX to have been a valuable experience for teams. Here's an overview video of the exercise from DARPA:

DARPA has also put together a much more detailed walkthrough of the STIX mine exercise, which gives you a sense of just how vast, complicated, and (frankly) challenging for robots the mine environment is:

So, that's the kind of thing that teams had to deal with back in April. Since the event was an exercise, rather than a competition, DARPA didn't really keep score, and wouldn't comment on the performance of individual teams. We've been trolling YouTube for STIX footage, though, to get a sense of how things went, and we found a few interesting videos.

Here's a nice overview from Team CERBERUS, which used drones plus an ANYmal quadruped:

Team CTU-CRAS also used drones, along with a tracked robot:

Team Robotika was brave enough to post video of a “fatal failure” experienced by its wheeled robot; the poor little bot gets rescued at about 7:00 in case you get worried:

So that was STIX. But what about the Tunnel Circuit competition this week? Here's a course preview video from DARPA:

It sort of looks like the NIOSH mine might be a bit less dusty than the Edgar mine was, but it could also be wetter and muddier. It’s hard to tell, because we’re just getting a few snapshots of what’s probably an enormous area with kilometers of tunnels that the robots will have to explore. But DARPA has promised “constrained passages, sharp turns, large drops/climbs, inclines, steps, ladders, and mud, sand, and/or water.” Combine that with the serious challenge to communications imposed by the mine itself, and robots will have to be both physically capable and almost entirely autonomous. Which is, of course, exactly what DARPA is looking to test with this challenge.

Lastly, we had a chance to catch up with Tim Chung, Program Manager for the Subterranean Challenge at DARPA, and ask him a few brief questions about STIX and what we have to look forward to this week.

IEEE Spectrum: How did STIX go?

Tim Chung: It was a lot of fun! I think it gave a lot of the teams a great opportunity to really get a taste of what these types of real world environments look like, and also what DARPA has in store for them in the SubT Challenge. STIX I saw as an experiment—a learning experience for all the teams involved (as well as the DARPA team) so that we can continue our calibration.

What do you think teams took away from STIX, and what do you think DARPA took away from STIX?

I think the thing that teams took away was that, when DARPA hosts a challenge, we have very audacious visions for what the art of the possible is. And that's what we want—in my mind, the purpose of a DARPA Grand Challenge is to provide that inspiration of, ‘Holy cow, someone thinks we can do this!’ So I do think the teams walked away with a better understanding of what DARPA's vision is for the capabilities we're seeking in the SubT Challenge, and hopefully walked away with a better understanding of the technical, physical, even maybe mental challenges of doing this in the wild—which will all roll back into how they think about the problem, and how they develop their systems.

This was a collaborative exercise, so the DARPA field team was out there interacting with the other engineers, figuring out what their strengths and weaknesses and needs might be, and even understanding how to handle the robots themselves. That will help [strengthen] connections between these university teams and DARPA going forward. Across the board, I think that collaborative spirit is something we really wish to encourage, and something that the DARPA folks were able to take away.

What do we have to look forward to during the Tunnel Circuit?

The vision here is that the Tunnel Circuit is representative of one of the three subterranean subdomains, along with urban and cave. Characteristics of all of these three subdomains will be mashed together in an epic final course, so that teams will have to face hints of tunnel once again in that final event.

Without giving too much away, the NIOSH mine will be similar to the Edgar mine in that it's a human-made environment that supports mining operations and research. But of course, every site is different, and these differences, I think, will provide good opportunities for the teams to shine.

Again, we'll be visiting the NIOSH mine in Pennsylvania during the Tunnel Circuit and will post as much as we can from there. But if you’re an actual participant in the Subterranean Challenge, please tweet me @BotJunkie so that I can follow and help share live updates.

[ DARPA Subterranean Challenge ]

Posted in Human Robots