Tag Archives: Deep learning

#436209 Video Friday: Robotic Endoscope Travels ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, WA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Kuka has just announced the results of its annual Innovation Award. From an initial batch of 30 applicants, five teams reached the finals (we were part of the judging committee). The five finalists worked for nearly a year on their applications, which they demonstrated this week at the Medica trade show in Düsseldorf, Germany. And the winner of the €20,000 prize is…Team RoboFORCE, led by the STORM Lab in the U.K., which developed a “robotic magnetic flexible endoscope for painless colorectal cancer screening, surveillance, and intervention.”

The system could improve colonoscopy procedures by reducing pain and discomfort as well as other risks such as bleeding and perforation, according to the STORM Lab researchers. It uses a magnetic field to control the endoscope, pulling rather than pushing it through the colon.

The other four finalists also presented some really interesting applications—you can see their videos below.

“Because we were so pleased with the high quality of the submissions, we will have next year’s finals again at the Medica fair, and the challenge will be named ‘Medical Robotics’,” says Rainer Bischoff, vice president for corporate research at Kuka. He adds that the selected teams will again use Kuka’s LBR Med robot arm, which is “already certified for integration into medical products and makes it particularly easy for startups to use a robot as the main component for a particular solution.”

Applications are now open for Kuka’s Innovation Award 2020. You can find more information on how to enter here. The deadline is 5 January 2020.

[ Kuka ]

Oh good, Aibo needs to be fed now.

You know what comes next, right?

[ Aibo ]

Your cat needs this robot.

It's about $200 on Kickstarter.

[ Kickstarter ]

Enjoy this tour of the Skydio offices courtesy of Skydio 2, which runs into not even one single thing.

If any Skydio employees had important piles of papers on their desks, well, they don’t anymore.

[ Skydio ]

Artificial intelligence is everywhere nowadays, but what exactly does it mean? We asked a group of MIT computer science grad students and postdocs how they personally define AI.

“When most people say AI, they actually mean machine learning, which is just pattern recognition.” Yup.

[ MIT ]

Using event-based cameras, this drone control system can track attitude at 1600 degrees per second (!).

[ UZH ]

Introduced at CES 2018, Walker is an intelligent humanoid service robot from UBTECH Robotics. Below are the new features and technologies from our latest round of development to make Walker even better.

[ Ubtech ]

Introducing the Alpha Prime by #VelodyneLidar, the most advanced lidar sensor on the market! Alpha Prime delivers an unrivaled combination of field-of-view, range, high-resolution, clarity and operational performance.

Performance looks good, but don’t expect it to be cheap.

[ Velodyne ]

Ghost Robotics’ Spirit 40 will start shipping to researchers in January of next year.

[ Ghost Robotics ]

Unitree is about to ship the first batch of their AlienGo quadrupeds as well:

[ Unitree ]

Mechanical engineering’s Sarah Bergbreiter discusses her work on micro robotics, how they draw inspiration from insects and animals, and how tiny robots can help humans in a variety of fields.

[ CMU ]

Learning contact-rich, robotic manipulation skills is a challenging problem due to the high-dimensionality of the state and action space as well as uncertainty from noisy sensors and inaccurate motor control. To combat these factors and achieve more robust manipulation, humans actively exploit contact constraints in the environment. By adopting a similar strategy, robots can also achieve more robust manipulation. In this paper, we enable a robot to autonomously modify its environment and thereby discover how to ease manipulation skill learning. Specifically, we provide the robot with fixtures that it can freely place within the environment. These fixtures provide hard constraints that limit the outcome of robot actions. Thereby, they funnel uncertainty from perception and motor control and scaffold manipulation skill learning.

[ Stanford ]

Since 2016, Verity's drones have completed more than 200,000 flights around the world. Completely autonomous, client-operated, and designed for live events, these drones make the magic real by becoming flying lights, characters, and props.

[ Verity ]

To monitor and stop the spread of wildfires, University of Michigan engineers developed UAVs that could find, map and report fires. One day UAVs like this could work with disaster response units, firefighters and other emergency teams to provide real-time accurate information to reduce damage and save lives. For their research, the University of Michigan graduate students won first place at a competition for using a swarm of UAVs to successfully map and report simulated wildfires.

[ University of Michigan ]

Here’s an important issue that I haven’t heard talked about all that much: How first responders should interact with self-driving cars.

“To put the car in manual mode, you must call Waymo.” Huh.

[ Waymo ]

Here’s what Gitai has been up to recently, from a Humanoids 2019 workshop talk.

[ Gitai ]

The latest CMU RI seminar comes from Girish Chowdhary at the University of Illinois at Urbana-Champaign on “Autonomous and Intelligent Robots in Unstructured Field Environments.”

What if a team of collaborative autonomous robots grew your food for you? In this talk, I will discuss some key advances in robotics, machine learning, and autonomy that will one day enable teams of small robots to grow food for you in your backyard in a fundamentally more sustainable way than modern mega-farms! Teams of small aerial and ground robots could be a potential solution to many of the serious problems that modern agriculture is facing. However, fully autonomous robots that operate without supervision for weeks, months, or entire growing seasons are not yet practical. I will discuss my group’s theoretical and practical work on the challenging underlying problems in robotic systems, autonomy, sensing, and learning. I will begin with our lightweight, compact, and autonomous field robot TerraSentia and the recent successes of this type of under-canopy robot for high-throughput phenotyping with deep learning-based machine vision. I will also discuss how to make a team of autonomous robots learn to coordinate to weed large agricultural farms under partial observability. These direct applications will help me make the case for the type of reinforcement learning and adaptive control that are necessary to usher in the next generation of autonomous field robots that learn to solve complex problems in harsh, changing, and dynamic environments. I will then end with an overview of our new MURI, in which we are working towards developing AI and control that leverages neurodynamics inspired by the octopus brain.

[ CMU RI ]


#436114 Video Friday: Transferring Human Motion ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

We are very sad to say that MIT professor emeritus Woodie Flowers has passed away. Flowers will be remembered for (among many other things, like co-founding FIRST) the MIT 2.007 course that he began teaching in the mid-1970s, famous for its student competitions.

These competitions got a bunch of well-deserved publicity over the years; here’s one from 1985:

And the 2.007 competitions are still going strong—this year’s theme was Moonshot, and you can watch a replay of the event here.

[ MIT ]

Looks like Aibo is getting wireless integration with Hitachi appliances, which turns out to be pretty cute:

What is this magical box where you push a button and 60 seconds later fluffy pancakes come out?!

[ Aibo ]

LiftTiles are a “modular and reconfigurable room-scale shape display” that can turn your floor and walls into on-demand structures.

[ LiftTiles ]

Ben Katz, a grad student in MIT’s Biomimetics Robotics Lab, has been working on these beautiful desktop-sized Furuta pendulums:

That’s a crowdfunding project I’d pay way too much for.

[ Ben Katz ]

A clever bit of cable manipulation from MIT, using GelSight tactile sensors.

[ Paper ]

A useful display of industrial autonomy on ANYmal from the Oxford Robotics Institute.

This video is of a demonstration for the ORCA Robotics Hub showing the ANYbotics ANYmal robot carrying out industrial inspection using autonomy software from Oxford Robotics Institute.

[ ORCA Hub ] via [ DRS ]

Thanks Maurice!

Meet Katie Hamilton, a software engineer at NASA’s Ames Research Center, who got into robotics because she wanted to help people with daily life. Katie writes code for robots like Astrobee, which assist astronauts with routine tasks on the International Space Station.

[ NASA Astrobee ]

Transferring human motion to a mobile robotic manipulator and ensuring safe physical human-robot interaction are crucial steps towards automating complex manipulation tasks in human-shared environments. In this work we present a robot whole-body teleoperation framework for human motion transfer. We validate our approach through several experiments using the TIAGo robot, showing this could be an easy way for a non-expert to teach a rough manipulation skill to an assistive robot.

[ Paper ]

This is pretty cool looking for an autonomous boat, but we’ll see if they can build a real one by 2020 since at the moment it’s just an average rendering.

[ ProMare ]

I had no idea that asparagus grows like this. But, sure does make it easy for a robot to harvest.

[ Inaho ]

Skip to 2:30 in this Pepper unboxing video to hear the noise it makes when tickled.

[ HIT Lab NZ ]

In this interview, Jean Paul Laumond discusses his movement from mathematics to robotics and his career contributions to the field, especially in regard to motion planning and anthropomorphic motion. Describing his involvement at CNRS and in other robotics projects, such as HILARE, he comments on the distinction in perception between the robotics approach and the mathematical one.

[ IEEE RAS History ]

Here’s a couple of videos from the CMU Robotics Institute archives, showing some of the work that took place over the last few decades.

[ CMU RI ]

In this episode of the Artificial Intelligence Podcast, Lex Fridman speaks with David Ferrucci from IBM about Watson and (you guessed it) artificial intelligence.

David Ferrucci led the team that built Watson, the IBM question-answering system that beat the top humans in the world at the game of Jeopardy. He is also the Founder, CEO, and Chief Scientist of Elemental Cognition, a company working to engineer AI systems that understand the world the way people do. This conversation is part of the Artificial Intelligence podcast.

[ AI Podcast ]

This week’s CMU RI Seminar is by Pieter Abbeel from UC Berkeley, on “Deep Learning for Robotics.”

Programming robots remains notoriously difficult. Equipping robots with the ability to learn would bypass the need for what otherwise often ends up being time-consuming, task-specific programming. This talk will describe recent progress in deep reinforcement learning (robots learning through their own trial and error), in apprenticeship learning (robots learning from observing people), and in meta-learning for action (robots learning to learn). This work has led to new robotic capabilities in manipulation, locomotion, and flight, with the same approach underlying advances in each of these domains.
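
For readers who want a concrete picture of what “learning through trial and error” looks like in code, here is a minimal, hypothetical sketch of the policy-gradient idea (REINFORCE) on a toy two-armed bandit. The rewards, learning rate, and episode count are all invented for illustration; this is not the method from the talk, just the simplest instance of the family it describes.

```python
# Minimal REINFORCE sketch on a toy two-armed bandit (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
true_means = np.array([0.2, 0.8])  # hypothetical expected reward of each arm
theta = np.zeros(2)                # softmax policy parameters (logits)
lr = 0.1                           # learning rate, chosen arbitrarily

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

for episode in range(2000):
    probs = softmax(theta)
    action = rng.choice(2, p=probs)               # trial...
    reward = rng.normal(true_means[action], 0.1)  # ...and (noisy) error signal
    # REINFORCE update: theta += lr * reward * grad log pi(action)
    grad_log_pi = -probs
    grad_log_pi[action] += 1.0
    theta += lr * reward * grad_log_pi

print("learned action probabilities:", softmax(theta))  # should favor arm 1
```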

[ CMU RI ]


#436079 Video Friday: This Humanoid Robot Will ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Northeast Robotics Colloquium – October 12, 2019 – Philadelphia, Pa., USA
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

What’s better than a robotics paper with “dynamic” in the title? A robotics paper with “highly dynamic” in the title. From Sangbae Kim’s lab at MIT, the latest exploits of Mini Cheetah:

Yes I’d very much like one please. Full paper at the link below.

[ Paper ] via [ MIT ]

A humanoid robot serving you ice cream—on his own ice cream bike: What a delicious vision!

[ Roboy ]

The Roomba “i” series and “s” series vacuums have just gotten an update that lets you set “keep out” zones, which is super useful. Tell your robot where not to go!

I feel bad, that Roomba was probably just hungry 🙁

[ iRobot ]

We wrote about Voliro’s tilt-rotor hexcopter a couple years ago, and now it’s off doing practical things, like spray painting a building pretty much the same color that it was before.

[ Voliro ]

Thanks Mina!

Here’s a clever approach for bin-picking problematic objects, like shiny things: Just grab a whole bunch, and then sort out what you need on a nice robot-friendly table.

It might take a little bit longer, but what do you care, you’re probably off sipping a cocktail with a little umbrella in it on a beach somewhere.

[ Harada Lab ]

A unique combination of the IRB 1200 and YuMi industrial robots that use vision, AI and deep learning to recognize and categorize trash for recycling.

[ ABB ]

Measuring glacial movements in situ is a challenging but necessary task for modeling glaciers and predicting their future evolution. However, installing GPS stations on ice can be dangerous and expensive, if not impossible, in the presence of large crevasses. In this project, the ASL is developing UAVs that drop and recover lightweight GPS stations over inaccessible glaciers to record the ice flow motion. This video shows the results of the first tests, performed at Gorner glacier, Switzerland, in July 2019.

[ EPFL ]

Turns out Tertills actually do a pretty great job fighting weeds.

Plus, they leave all those cute lil’ Tertill tracks.

[ Franklin Robotics ]

The online autonomous navigation and semantic mapping experiment presented [below] is conducted with the Cassie Blue bipedal robot at the University of Michigan. The sensors attached to the robot include an IMU, a 32-beam LiDAR, and an RGB-D camera. The whole online process runs in real time on a Jetson Xavier and a laptop with an i7 processor.

The resulting map is so precise that it looks like we are doing real-time SLAM (simultaneous localization and mapping). In fact, the map is based on dead-reckoning via the InvEKF.
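
As a rough illustration of what dead-reckoning means in this context, here is a minimal planar sketch: the pose estimate is propagated purely by integrating inertial measurements, with no correction against a map. The actual Cassie estimator is an invariant EKF that operates on the full 3D pose and tracks uncertainty as well; the model and numbers below are made up to show only the propagation idea.

```python
# Toy planar dead-reckoning: integrate IMU samples into a pose, no corrections.
import math

def propagate(pose, gyro_z, accel_x, dt):
    """pose = (x, y, yaw, v); advance it by one IMU sample."""
    x, y, yaw, v = pose
    yaw += gyro_z * dt              # heading from the gyro
    v += accel_x * dt               # forward speed from the accelerometer
    x += v * math.cos(yaw) * dt     # position from speed and heading
    y += v * math.sin(yaw) * dt
    return (x, y, yaw, v)

pose = (0.0, 0.0, 0.0, 0.0)
for _ in range(400):                # 4 seconds of samples at 100 Hz
    pose = propagate(pose, gyro_z=0.05, accel_x=0.2, dt=0.01)
print("dead-reckoned pose:", pose)  # error grows over time without corrections
```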

[ GTSAM ] via [ University of Michigan ]

UBTECH has announced an upgraded version of its Meebot, which is 30 percent bigger and comes with more sensors and programmable eyes.

[ UBTECH ]

ABB’s research team will be working with medical staff, scientists, and engineers to develop non-surgical medical robotics systems, including logistics and next-generation automated laboratory technologies. The team will develop robotics solutions that will help eliminate bottlenecks in laboratory work and address the global shortage of skilled medical staff.

[ ABB ]

In this video, Ian and Chris go through Misty’s SDK, discussing the languages we’ve included, the tools that make it easy for you to get started quickly, a quick rundown of how to run the skills you build, plus what’s ahead on the Misty SDK roadmap.

[ Misty Robotics ]

My guess is that this was not one of iRobot’s testing environments for the Roomba.

You know, that’s actually super impressive. And maybe if they threw one of the self-emptying Roombas in there, it would be a viable solution to the entire problem.

[ How Farms Work ]

Part of WeRobotics’ Flying Labs network, Panama Flying Labs is a local knowledge hub catalyzing social good and empowering local experts. Through training and workshops, demonstrations and missions, the Panama Flying Labs team leverages the power of drones, data, and AI to promote entrepreneurship, build local capacity, and confront the pressing social challenges faced by communities in Panama and across Central America.

[ Panama Flying Labs ]

Go on a virtual flythrough of the NIOSH Experimental Mine, one of two courses used in the recent DARPA Subterranean Challenge Tunnel Circuit Event held August 15-22, 2019. The data used for this partial flythrough tour were collected using 3D LIDAR sensors similar to the sensors commonly used on autonomous mobile robots.

[ SubT ]

Special thanks to PBS, Mark Knobil, Joe Seamans, Stan Brandorff, and the many others who produced this program in 1991.

It features Reid Simmons (and his 1-year-old son), David Wettergreen, Red Whittaker, Mac Macdonald, Omead Amidi, and other Field Robotics Center alumni building the planetary walker prototype called Ambler. The team gets ready for an important demo for NASA.

[ CMU RI ]

As art and technology merge, roboticist Madeline Gannon explores the frontiers of human-robot interaction across the arts, sciences and society, and explores what this could mean for the future.

[ Sonar+D ]


#436044 Want a Really Hard Machine Learning ...

What’s the world’s hardest machine learning problem? Autonomous vehicles? Robots that can walk? Cancer detection?

Nope, says Julian Sanchez. It’s agriculture.

Sanchez might be a little biased. He is the director of precision agriculture for John Deere, and is in charge of adding intelligence to traditional farm vehicles. But he does have a little perspective, having spent time working on software for both medical devices and air traffic control systems.

I met with Sanchez and Alexey Rostapshov, head of digital innovation at John Deere Labs, at the organization’s San Francisco offices last month. Labs launched in 2017 to take advantage of the area’s tech expertise, both to apply machine learning to in-house agricultural problems and to work with partners to build technologies that play nicely with Deere’s big green machines. Deere’s neighbors in San Francisco’s tech-heavy South of Market are LinkedIn, Salesforce, and Planet Labs, which puts it in a good position for recruiting.

“We’ve literally had folks knock on the door and say, ‘What are you doing here?’” says Rostapshov, and some return to drop off resumes.

Here’s why Sanchez believes agriculture is such a big challenge for artificial intelligence.

“It’s not just about driving tractors around,” he says, although autonomous driving technologies are part of the mix. (John Deere is doing a lot of work with precision GPS to improve autonomous driving, for example, and allow tractors to plan their own routes around fields.)

But more complex than the driving problem, says Sanchez, are the classification problems.

Corn: A Classic Classification Problem

[Photo: Tekla Perry]

One key effort, Sanchez says, is AI systems “that allow me to tell whether grain being harvested is good quality or low quality and to make automatic adjustment systems for the harvester.” The company is already selling an early version of this image analysis technology. But the many differences between grain types, and grains grown under different conditions, make this task a tough one for machine learning.

“Take corn,” Sanchez says. “Let’s say we are building a deep learning algorithm to detect this corn. And we take lots of pictures of kernels to give it. Say we pick those kernels in central Illinois. But, one mile over, the farmer planted a slightly different hybrid which has slightly different coloration of yellow. Meanwhile, this other farm harvested three days later in a field five miles away; it’s the same hybrid, but it also looks different.

“It’s an overwhelming classification challenge, and that’s just for corn. But you are not only doing it for corn, you have to add 20 more varieties of grain to the mix; and some, like canola, are almost microscopic.”
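
To make the setup concrete, here is a hypothetical PyTorch sketch of the kind of model being described: a small convolutional network that labels kernel images as good or low quality. Deere’s actual architecture, data, and labels are not public, so every name and shape below is invented.

```python
# Hypothetical kernel-quality classifier (illustrative architecture only).
import torch
import torch.nn as nn

class KernelQualityNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, 2)  # classes: good vs. low quality

    def forward(self, x):             # x: (batch, 3, 64, 64) kernel images
        h = self.features(x)
        return self.head(h.flatten(1))

model = KernelQualityNet()
batch = torch.randn(8, 3, 64, 64)     # stand-in for photographed kernels
print(model(batch).shape)             # torch.Size([8, 2])
```

The network is the easy part; the quote above is about the data: the same class boundary has to survive hybrids, fields, and harvest dates that each shift what “yellow” looks like.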

Even the ground conditions vary dramatically—far more than road conditions, Sanchez points out.

“Let’s say we are building a deep learning algorithm to detect how much residue is left on the soil after a harvest, including stubble and some chaff. Let’s drive 2,000 acres of fields in the Midwest looking at residue. That’s great, but I guarantee that if you go drive those the next year, it will look significantly different.

“Deep learning is great at interpolating conditions between what it knows; it is not good at extrapolating to situations it hasn’t seen. And in agriculture, you always feel that there is a set of conditions that you haven’t yet classified.”
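
That interpolation-versus-extrapolation point is easy to reproduce. The toy scikit-learn sketch below, using invented two-dimensional “residue” features rather than any real field data, fits a classifier on one year’s distribution and evaluates it on a shifted distribution standing in for next year’s fields; accuracy drops sharply.

```python
# Toy demonstration of distribution shift (invented features, not Deere data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_fields(n, offset):
    # Two classes (e.g., high vs. low residue) in a 2-D feature space.
    X = np.vstack([rng.normal(0.0 + offset, 1.0, (n, 2)),
                   rng.normal(2.0 + offset, 1.0, (n, 2))])
    y = np.array([0] * n + [1] * n)
    return X, y

X_train, y_train = make_fields(500, offset=0.0)  # "this year's" fields
X_next, y_next = make_fields(500, offset=1.5)    # "next year": shifted look

clf = LogisticRegression().fit(X_train, y_train)
print("same-year accuracy:", clf.score(X_train, y_train))  # high
print("next-year accuracy:", clf.score(X_next, y_next))    # much lower
```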

A Flood of Big Data

The scale of the data is also daunting, Rostapshov points out. “We are one of the largest users of cloud computing services in the world,” he says. “We are gathering 5 to 15 million measurements per second from 130,000 connected machines globally. We have over 150 million acres in our databases, using petabytes and petabytes [of storage]. We process more data than Twitter does.”

Much of this information is so-called dirty data, meaning it doesn’t share a common format or structure, because it comes not only from a wide variety of John Deere machines but also from some 100 other companies that have access to the platform, contributing weather information, aerial imagery, and soil analyses.

As a result, says Sanchez, Deere has had to make “tremendous investments in back-end data cleanup.”
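
Here is a minimal sketch of what that kind of cleanup involves: the same physical quantity arrives under different field names and units depending on the source, and everything has to be normalized into one canonical schema before analysis. The record formats, field names, and units below are hypothetical, not Deere’s actual data model.

```python
# Hypothetical normalization of heterogeneous telemetry into one schema.
from dataclasses import dataclass

@dataclass
class Measurement:
    machine_id: str
    timestamp_s: float
    fuel_rate_lph: float  # canonical unit: liters per hour

def normalize(record: dict) -> Measurement:
    if "fuelRateGalPerHr" in record:              # imagined partner format
        lph = record["fuelRateGalPerHr"] * 3.785  # gallons/hr -> liters/hr
        return Measurement(record["id"], record["ts"], lph)
    if "fuel_lph" in record:                      # imagined in-house format
        return Measurement(record["machine"], record["time"], record["fuel_lph"])
    raise ValueError(f"unknown record shape: {sorted(record)}")

raw = [
    {"id": "machine-001", "ts": 1572000000.0, "fuelRateGalPerHr": 4.2},
    {"machine": "machine-042", "time": 1572000001.0, "fuel_lph": 18.5},
]
print([normalize(r) for r in raw])
```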


“We have gotten progressively more skilled at that problem,” he says. “We started simply by cleaning up our own data. You’d think it would be nice and neat, since it’s coming from our own machines, but there is a wide variety of different models and different years. Then we started geospatially tagging the agronomic data—the information about where you are applying herbicides and fertilizer and the like—coming in from our vehicles. When we started bringing in other data, from drones, say, we were already good at cleaning it up.”

John Deere’s Hiring Pitch

Hard problems can be a good thing to have for a company looking to hire machine learning engineers.

“Our opening line to potential recruits,” Sanchez says, “is ‘This stuff matters.’ Then, if we get a chance to talk to them more, we follow up with ‘Not only does this stuff matter, but the problems are really hard and interesting.’ When we explain the variability in farming and how we have to apply all the latest tools to these problems, we get their attention.”

Software engineers “know that feeding a growing population is a massive problem and are excited about the prospect of making a difference,” Rostapshov says.

Only 20 engineers work in the San Francisco labs right now, and that’s on a busy day—some of the researchers spend part of their time at Blue River Technology, a startup based in Sunnyvale that was acquired by Deere in 2017. About half of the researchers are focusing on AI. The Lab is in the process of doubling its office space (no word on staffing plans for that expansion yet).


Company-wide, Deere has thousands of software engineers, with many using AI and machine learning tools in their work, and about the same number of mechanical and electrical engineers, Sanchez reports. “If you look at our hiring 10 years ago,” he says, “it was heavily weighted to mechanical engineers. But if you look at those numbers now, it is by a large majority [engineers working] in the software space. We still need mechanical engineers—we do build green machines—but if you go by our footprint of tech talent, it is pretty safe to call John Deere a software company. And if you follow the key conversations that are happening in the company right now, 95 percent of them are software-related.”

For now, these software engineers are focused on developing technologies that allow farmers to “do more with less,” Sanchez says, meaning to get more and better crops from less fuel, less seed, less fertilizer, less pesticide, and fewer workers, and to put together building blocks that, he says, could eventually lead to fully autonomous farm vehicles. The data Deere collects today, for the most part, stays in silos (the virtual kind), with AI algorithms that analyze specific sets of data to provide guidance to individual farmers. At some point, however, with tools to anonymize data and buy-in from farmers, aggregating data could provide some powerful insights.

“We are not asking farmers for that yet,” Sanchez says. “We are not doing aggregation to look for patterns. We are focused on offering technology that allows an individual farmer to use less, on positioning ourselves to be in a neutral spot. We are not about selling you more seed or more fertilizer. So we are building up a good trust level. In the long term, we can have conversations about doing more with deep learning.”


#436021 AI Faces Speed Bumps and Potholes on Its ...

Implementing machine learning in the real world isn’t easy. The tools are available and the road is well-marked—but the speed bumps are many.

That was the conclusion of panelists wrapping up a day of discussions at the IEEE AI Symposium 2019, held at Cisco’s San Jose, Calif., campus last week.

The toughest problem, says Ben Irving, senior manager of Cisco’s strategy innovations group, is people.

It’s tough to find data scientist expertise, he indicated, so companies are looking into non-traditional sources of personnel, like political science. “There are some untapped areas with a lot of untapped data science expertise,” Irving says.

Lazard’s artificial intelligence manager Trevor Mottl agreed that would-be data scientists don’t need formal training or experience to break into the field. “This field is changing really rapidly,” he says. “There are new language models coming out every month, and new tools, so [anyone should] expect to not know everything. Experiment, try out new tools and techniques, read, study, spend time; there aren’t any true experts at this point because the foundational elements are shifting so rapidly.”

“It is a wonderful time to get into a field,” he reasons, noting that it doesn’t take long to catch up because “there aren’t 20 years of history.”

Confusion about what different kinds of machine learning specialists do doesn’t help the personnel situation. An audience member asked panelists to explain the difference between data scientist, data analyst, and data engineer. Darrin Johnson, Nvidia global director of technical marketing for enterprise, admitted it’s hard to sort out, and any two companies could define the positions differently. “Sometimes,” he says, particularly at smaller companies, “a data scientist plays all three roles. But as companies grow, there are different groups that ingest data, clean data, and use data. At some companies, training and inference are separate. It really depends, which is a challenge when you are trying to hire someone.”

Mitigating the risks of a hot job market

The competition to hire data scientists, analysts, engineers, or whatever companies call them requires that managers make sure any work being done is structured and comprehensible at all times, the panelists cautioned.

“We need to remember that our data scientists go home every day and sometimes they don’t come back because they go home and then go to a different company,” says Lazard’s Mottl. “That’s a fact of life. If you give people choice on [how they do development], and have a successful person who gets poached by a competitor, you have to either hire a team to unwrap what that person built or jettison their work and rebuild it.”

By contrast, he says, “places that have structured coding and structured commits and organized constructions of software have done very well.”

But keeping all of a company’s engineers working with the same languages and on the same development paths is not easy to do in a field that moves as fast as machine learning. Zongjie Diao, Cisco director of product management for machine learning, quipped: “I have a data scientist friend who says the speed at which he changes girlfriends is less than the speed at which he changes languages.”

The data scientist/IT manager clash

Once a company finds the data engineers and scientists it needs and gets them started on applying machine learning to its operations, one of the first obstacles they face just might be the company’s IT department, the panelists suggested.

“IT is process oriented,” Mottl says. The IT team “knows how to keep data secure, to set up servers. But when you bring in a data science team, they want sandboxes, they want freedom, they want to explore and play.”

Also, Nvidia’s Johnson pointed out, “There is a language barrier.” The AI world, he says, is very different from networking or storage, and data scientists find it hard to articulate their requirements to IT.

On the ground or in the cloud?

And then there is the decision of where exactly machine learning should happen—on site, or in the cloud? At Lazard, Mottl says, the deep learning engineers do their experimentation on premises; that’s their sandbox. “But when we deploy, we deploy in the cloud,” he says.

Nvidia, Johnson says, thinks the opposite approach is better. “We see the cloud as the sandbox,” he says. “So you can run as many experiments as possible, fail fast, and learn faster.”

For Cisco’s Irving, the “where” of machine learning depends on the confidentiality of the data.

Mottl, who says rolling machine learning technology into operation can hit resistance from all across the company, had one last word of caution for those aiming to implement AI:

Data scientists are building things that might change the ways other people in the organization work, like sales and even knowledge workers. [You need to] think about the internal stakeholders and prepare them, because the last thing you want to do is to create a valuable new thing that nobody likes and people take potshots against.

The AI Symposium was organized by the Silicon Valley chapters of the IEEE Young Professionals, the IEEE Consultants’ Network, and IEEE Women in Engineering and supported by Cisco.
