
#435619 Video Friday: Watch This Robot Dog ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
RoboBusiness 2019 – October 1-3, 2019 – Santa Clara, CA, USA
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

Team PLUTO (University of Pennsylvania, Ghost Robotics, and Exyn Technologies) put together this video giving us a robot’s-eye-view (or whatever they happen to be using for eyes) of the DARPA Subterranean Challenge tunnel circuits.

[ PLUTO ]

Zhifeng Huang has been improving his jet-stepping humanoid robot, which features new hardware and the ability to take larger and more complex steps.

This video reports the latest progress of an ongoing project that uses a ducted-fan propulsion system to improve a humanoid robot’s ability to step over large ditches. The landing point of the robot’s swing foot can be placed not only forward but also to the side. While maintaining quasi-static balance, the robot was able to step over a ditch 450 mm wide (up to 97% of the robot’s leg length) in 3D stepping.

[ Paper ]

Thanks Zhifeng!

These underactuated hands from Matei Ciocarlie’s lab at Columbia are magically able to reconfigure themselves to grasp different object types with just one or two motors.

[ Paper ] via [ ROAM Lab ]

This is one reason we should pursue not “autonomous cars” but “fully autonomous cars” that never require humans to take over. We can’t be trusted.

During our early days as the Google self-driving car project, we invited some employees to test our vehicles on their commutes and weekend trips. What we were testing at the time was similar to the highway driver-assist features available on cars today, where the car takes over the boring parts of the driving, but if something outside its ability occurs, the driver has to take over immediately.

What we saw was that our testers put too much trust in that technology. They were doing things like texting, applying makeup, and even falling asleep that made it clear they would not be ready to take over driving if the vehicle asked them to. This is why we believe that nothing short of full autonomy will do.

[ Waymo ]

Buddy is a DIY and fetchingly minimalist social robot (of sorts) that will be coming to Kickstarter this month.

We have created a new Arduino kit. His name is Buddy. He is a DIY social robot meant to serve as a replacement for Jibo, Cozmo, or any of the other bots that are no longer available. Fully 3D printed and supported, he adds much more to our series of Arduino STEM robotics kits.

Buddy is able to look around, map his surroundings, and react to changes within them. He can be surprised, and he will always have a unique reaction to changes. The kit can be built very easily in less than an hour, and it is robust enough to take the abuse that kids can give it in a classroom.

[ Littlebots ]

The android Mindar, based on the Buddhist deity of mercy, preaches sermons at Kodaiji temple in Kyoto, and its human colleagues predict that with artificial intelligence it could one day acquire unlimited wisdom. Developed at a cost of almost $1 million (¥106 million) in a joint project between the Zen temple and robotics professor Hiroshi Ishiguro, the robot teaches about compassion and the dangers of desire, anger and ego.

[ Japan Times ]

I’m not sure whether it’s the sound or what, but this thing scares me for some reason.

[ BIRL ]

This gripper uses magnets as a sort of adjustable spring for dynamic stiffness control, which seems pretty clever.

[ Buffalo ]

What a package of medicine sees while being flown by drone from a hospital to a remote clinic in the Dominican Republic. The drone flew 11 km horizontally and 800 meters vertically, and I can’t even imagine what it would take to make that drive.

[ WeRobotics ]

My first ride in a fully autonomous car was at Stanford in 2009. I vividly remember getting in the back seat of a descendant of Junior, and watching the steering wheel turn by itself as the car executed a perfect parking maneuver. Ten years later, it’s still fun to watch other people have that experience.

[ Waymo ]

Flirtey, the pioneer of the commercial drone delivery industry, has unveiled the much-anticipated first video of its next-generation delivery drone, the Flirtey Eagle. The aircraft designer and manufacturer also unveiled the Flirtey Portal, a sophisticated takeoff and landing platform that enables scalable store-to-door operations, and an autonomous software platform that enables drones to deliver safely to homes.

[ Flirtey ]

EPFL scientists are developing new approaches for improved control of robotic hands – in particular for amputees – that combine individual finger control and automation for improved grasping and manipulation. This interdisciplinary proof-of-concept between neuroengineering and robotics was successfully tested on three amputees and seven healthy subjects.

[ EPFL ]

This video is a few years old, but we’ll take any excuse to watch the majestic sage-grouse be majestic in all their majesticness.

[ UC Davis ]

I like the idea of a game of soccer (or, football to you weirdos in the rest of the world) where the ball has a mind of its own.

[ Sphero ]

Looks like the whole delivery glider idea is really taking off! Or, you know, not taking off.

Weird that they didn’t show the landing, because it sure looked like it was going to plow into the side of the hill at full speed.

[ Yates ] via [ sUAS News ]

This video is from a 2018 paper, but it’s not like we ever get tired of seeing quadrupeds do stuff, right?

[ MIT ]

Founder and Head of Product, Ian Bernstein, and Head of Engineering, Morgan Bell, have been involved in the Misty project for years and they have learned a thing or two about building robots. Hear how and why Misty evolved into a robot development platform, learn what some of the earliest prototypes did (and why they didn’t work for what we envision), and take a deep dive into the technology decisions that form the Misty II platform.

[ Misty Robotics ]

Lex Fridman interviews Vijay Kumar on the Artificial Intelligence Podcast.

[ AI Podcast ]

This week’s CMU RI Seminar is from Ross Knepper at Cornell, on Formalizing Teamwork in Human-Robot Interaction.

Robots out in the world today work for people but not with people. Before robots can work closely with ordinary people as part of a human-robot team in a home or office setting, robots need the ability to acquire a new mix of functional and social skills. Working with people requires a shared understanding of the task, capabilities, intentions, and background knowledge. For robots to act jointly as part of a team with people, they must engage in collaborative planning, which involves forming a consensus through an exchange of information about goals, capabilities, and partial plans. Often, much of this information is conveyed through implicit communication. In this talk, I formalize components of teamwork involving collaboration, communication, and representation. I illustrate how these concepts interact in the application of social navigation, which I argue is a first-class example of teamwork. In this setting, participants must avoid collision by legibly conveying intended passing sides via nonverbal cues like path shape. A topological representation using the braid groups enables the robot to reason about a small enumerable set of passing outcomes. I show how implicit communication of topological group plans achieves rapid convergence to a group consensus, and how a robot in the group can deliberately influence the ultimate outcome to maximize joint performance, yielding pedestrian comfort with the robot.
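As a back-of-the-envelope illustration of why that set of passing outcomes is small and enumerable: if every unordered pair of agents must resolve to a left or a right pass, a group of n agents has at most 2^(n(n-1)/2) joint outcomes to consider. The sketch below enumerates them in Python; it is a simplification for intuition only, not the braid-group machinery from the talk.

```python
from itertools import combinations, product

def passing_outcomes(agents):
    """Enumerate every joint passing plan for a group of agents,
    treating each unordered pair as an independent left/right choice.
    This is a simplified stand-in for the braid-group representation
    described in the talk, which also captures how pairwise choices
    compose over time."""
    pairs = list(combinations(agents, 2))
    for sides in product(["left", "right"], repeat=len(pairs)):
        yield dict(zip(pairs, sides))

# Three agents -> 3 pairs -> 2**3 = 8 candidate joint outcomes.
for plan in passing_outcomes(["robot", "alice", "bob"]):
    print(plan)
```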

[ CMU RI ]

In this week’s episode of Robots in Depth, Per speaks with Julien Bourgeois about Claytronics, a project from Carnegie Mellon and Intel to develop “programmable matter.”

Julien started out as a computer scientist. He was always privately interested in robotics, and had the opportunity to get into micro robots when his lab was merged into the FEMTO-ST Institute. He later worked with Seth Copen Goldstein at Carnegie Mellon on the Claytronics project.

Julien shows an enlarged mock-up of the small robots that make up programmable matter, catoms, and speaks about how they are designed. Currently he is working on a unit that is one centimeter in diameter and he shows us the very small CPU that goes into that model.

[ Robots in Depth ]


#433799 The First Novel Written by AI Is ...

Last year, a novelist went on a road trip across the USA. The trip was an attempt to emulate Jack Kerouac—to go out on the road and find something essential to write about in the experience. There is, however, a key difference between this writer and anyone else talking your ear off in the bar. This writer is just a microphone, a GPS, and a camera hooked up to a laptop and a whole bunch of linear algebra.

People who are optimistic that artificial intelligence and machine learning won’t put us all out of a job say that human ingenuity and creativity will be difficult to imitate. The classic argument is that, just as machines freed us from repetitive manual tasks, machine learning will free us from repetitive intellectual tasks.

This leaves us free to spend more time on the rewarding aspects of our work, pursuing creative hobbies, spending time with loved ones, and generally being human.

In this worldview, creative works like a great novel or symphony, and the emotions they evoke, cannot be reduced to lines of code. Humans retain a dimension of superiority over algorithms.

But is creativity a fundamentally human phenomenon? Or can it be learned by machines?

And if they learn to understand us better than we understand ourselves, could the great AI novel—tailored, of course, to your own predispositions in fiction—be the best you’ll ever read?

Maybe Not a Beach Read
This is the futurist’s view, of course. The reality, as the jury-rigged contraption in Ross Goodwin’s Cadillac for that road trip can attest, is some way off.

“This is very much an imperfect document, a rapid prototyping project. The output isn’t perfect. I don’t think it’s a human novel, or anywhere near it,” Goodwin said of the novel that his machine created. 1 the Road is currently marketed as the first novel written by AI.

Once the neural network has been trained, it can generate any length of text that the author desires, either at random or working from a specific seed word or phrase. Goodwin used the sights and sounds of the road trip to provide these seeds: the novel is written one sentence at a time, based on images, locations, dialogue from the microphone, and even the computer’s own internal clock.
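To make the seeding mechanism concrete, here is a minimal sketch of character-by-character sampling from a trained language model, starting from a seed phrase. The `model` object and its `predict_next_char_probs` method are hypothetical stand-ins for illustration; this is not Goodwin’s actual code.

```python
import random

def generate(model, seed, length=280, temperature=0.8):
    """Sample text one character at a time from a trained
    character-level language model, starting from a seed string.
    `model.predict_next_char_probs(text)` is assumed to return a
    dict mapping each candidate character to its probability."""
    text = seed
    for _ in range(length):
        probs = model.predict_next_char_probs(text)
        # Temperature reshapes the distribution: low values keep the
        # output conservative, high values make it more surreal.
        weights = {c: p ** (1.0 / temperature) for c, p in probs.items()}
        total = sum(weights.values())
        r = random.uniform(0, total)
        cumulative = 0.0
        for char, w in weights.items():
            cumulative += w
            if cumulative >= r:
                text += char
                break
    return text

# A seed drawn from a sensor, e.g. the computer's clock:
# generate(model, "It was nine seventeen in the morning")
```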

The results are… mixed.

The novel begins suitably enough, quoting the time: “It was nine seventeen in the morning, and the house was heavy.” Descriptions of locations begin according to the Foursquare dataset fed into the algorithm, but rapidly veer off into the weeds, becoming surreal. While experimentation in literature is a wonderful thing, repeatedly quoting longitude and latitude coordinates verbatim is unlikely to win anyone the Booker Prize.

Data In, Art Out?
Neural networks as creative agents have some advantages. They excel at being trained on large datasets, identifying the patterns in those datasets, and producing output that follows those same rules. Music inspired by or written by AI has become a growing subgenre—there’s even a pop album by human-machine collaborators called the Songularity.

A neural network can “listen to” all of Bach and Mozart in hours, and train itself on the works of Shakespeare to produce passable pseudo-Bard. The idea of artificial creativity has become so widespread that there’s even a meme format about forcibly training neural network ‘bots’ on human writing samples, with hilarious consequences—although the best joke was undoubtedly human in origin.

The AI that roamed from New York to New Orleans was an LSTM (long short-term memory) neural net. By default, information contained in individual neurons is preserved, and only small parts can be “forgotten” or “learned” in an individual timestep, rather than neurons being entirely overwritten.
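For readers curious what that selective memory looks like in practice, here is the textbook LSTM cell update in plain NumPy. It is the standard formulation, not necessarily the exact variant behind 1 the Road.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One timestep of a standard LSTM cell. The cell state `c` is
    the long-term memory: the forget gate decides how much of it to
    keep, and the input gate decides how much new content to write,
    so nothing is wholesale overwritten."""
    z = W @ np.concatenate([x, h_prev]) + b  # W: (4n, len(x)+n), b: (4n,)
    n = h_prev.size
    f = sigmoid(z[:n])           # forget gate: what old memory to keep
    i = sigmoid(z[n:2 * n])      # input gate: how much new info to write
    o = sigmoid(z[2 * n:3 * n])  # output gate: what to expose
    g = np.tanh(z[3 * n:])       # candidate new content
    c = f * c_prev + i * g       # blend old memory with new
    h = o * np.tanh(c)           # hidden state read out from memory
    return h, c
```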

The LSTM architecture performs better than previous recurrent neural networks at tasks such as handwriting and speech recognition. The neural net—and its programmer—looked further in search of literary influences, ingesting 60 million words (360 MB) of raw literature according to Goodwin’s recipe: one third poetry, one third science fiction, and one third “bleak” literature.

In this way, Goodwin has some creative control over the project; the source material influences the machine’s vocabulary and sentence structuring, and hence the tone of the piece.
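A sketch of how such a sourcing recipe might be implemented appears below; the genre file paths are hypothetical, since the article does not describe Goodwin’s actual preprocessing.

```python
def build_corpus(paths_by_genre, words_per_genre=20_000_000):
    """Concatenate an equal-sized slice of each genre into one training
    corpus, roughly following the one-third poetry, one-third science
    fiction, one-third 'bleak' literature recipe (3 x 20M = 60M words)."""
    corpus = []
    for genre, paths in paths_by_genre.items():
        words = []
        for path in paths:
            with open(path, encoding="utf-8") as f:
                words.extend(f.read().split())
            if len(words) >= words_per_genre:
                break
        corpus.extend(words[:words_per_genre])
    return " ".join(corpus)

# Hypothetical layout:
# corpus = build_corpus({"poetry": ["poetry.txt"],
#                        "scifi": ["scifi.txt"],
#                        "bleak": ["bleak.txt"]})
```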

The Thoughts Beneath the Words
The problem with artificially intelligent novelists is the same problem with conversational artificial intelligence that computer scientists have been trying to solve since Turing’s day. Machines can detect and reproduce complex patterns increasingly better than humans can, but they have no understanding of what those patterns mean.

Goodwin’s neural network spits out sentences one letter at a time, on a tiny printer hooked up to the laptop. Statistical associations such as those tracked by neural nets can form words from letters, and sentences from words, but they know nothing of character or plot.

When talking to a chatbot, the code has no real understanding of what’s been said before, and there is no dataset large enough to train it through all of the billions of possible conversations.

Unless restricted to a predetermined set of options, it loses the thread of the conversation after a reply or two. In a similar way, the creative neural nets have no real grasp of what they’re writing, and no way to produce anything with any overarching coherence or narrative.

Goodwin’s experiment is an attempt to add some coherent backbone to the AI “novel” by repeatedly grounding it with stimuli from the cameras or microphones—the thematic links and narrative provided by the American landscape the neural network drives through.

Goodwin feels that this approach (the car itself moving through the landscape, as if a character) borrows some continuity and coherence from the journey itself. “Coherent prose is the holy grail of natural-language generation—feeling that I had somehow solved a small part of the problem was exhilarating. And I do think it makes a point about language in time that’s unexpected and interesting.”

AI Is Still No Kerouac
A coherent tone and semantic “style” might be enough to produce some vaguely convincing teenage poetry, as Google did, and experimental fiction that uses neural networks can have intriguing results. But wading through the surreal AI prose of this era, searching for some meaning or motif beyond novelty value, can be a frustrating experience.

Maybe machines can learn the complexities of the human heart and brain, or how to write evocative or entertaining prose. But they’re a long way off, and somehow “more layers!” or a bigger corpus of data doesn’t feel like enough to bridge that gulf.

Real attempts by machines to write fiction have so far been broadly incoherent, but with flashes of poetry—dreamlike, hallucinatory ramblings.

Neural networks might not be capable of writing intricately plotted works with charm and wit, like Dickens or Dostoevsky, but there’s still an eeriness to trying to decipher the surreal, Finnegans Wake-like mish-mash.

You might see, in the odd line, the flickering ghost of something like consciousness, a deeper understanding. Or you might just see fragments of meaning thrown into a neural network blender, full of hype and fury, obeying rules in an occasionally striking way, but ultimately signifying nothing. In that sense, at least, the RNN’s grappling with metaphor feels like a metaphor for the hype surrounding the latest AI summer as a whole.

Or, as the human author of On The Road put it: “You guys are going somewhere or just going?”

Image Credit: eurobanks / Shutterstock.com


#428367 Fusion for Energy signs multi-million ...

Fusion for Energy signs multi-million deal with Airbus Safran Launchers, Nuvia Limited and Cegelec CEM to develop robotics equipment for ITER
The contract, with a value of nearly 100 million EUR, is considered the single biggest robotics deal to date in the field of fusion energy. The state-of-the-art equipment will form part of ITER, the world’s largest experimental fusion facility and the first in history designed to produce 500 MW of fusion power. The prestigious project brings together seven parties (China, Europe, Japan, India, the Republic of Korea, the Russian Federation and the USA), which together represent 50% of the world’s population and 80% of global GDP.
The collaboration between Fusion for Energy (F4E), the EU organisation managing Europe’s contribution to ITER, and a consortium of companies consisting of Airbus Safran Launchers (France-Germany), Nuvia Limited (UK) and Cegelec CEM (France), companies of the VINCI Group, will run for a period of seven years. The UK Atomic Energy Authority (UK), Instituto Superior Tecnico (Portugal), AVT Europe NV (Belgium) and Millennium (France) will also be part of this deal, which will deliver remotely operated systems for the transportation and confinement of components located in the ITER vacuum vessel.
The contract also carries symbolic importance, marking the signature of all the procurement packages managed by Europe in the field of remote handling. Carlo Damiani, F4E’s Project Manager for ITER Remote Handling Systems, explained that “F4E’s stake in ITER offers an unparalleled opportunity to companies and laboratories to develop expertise and an industrial culture in fusion reactors’ maintenance.”

Why does ITER require remote handling?
Remote handling refers to the high-tech systems that will help us maintain and repair the ITER machine. The space where the bulky equipment will operate is limited, and the exposure of some of the components to radioactivity prohibits any manual intervention inside the vacuum vessel.

What will be delivered through this contract?
The transfer of components from the ITER vacuum vessel to the Hot Cell building, where they will be deposited for maintenance, will need to be carried out with the help of massive double-door containers known as casks. According to current estimates, 15 of these casks will need to be manufactured; in their largest configuration they will measure 8.5 m x 3.7 m x 2.6 m and approach 100 tonnes when transporting the heaviest components. These enormous “boxes”, resembling conventional lorry containers, will be remotely operated as they move between the different levels and buildings of the machine. Apart from the transportation and confinement of components, the ITER Cask and Plug Remote Handling System will also ensure the installation of the remote handling equipment that enters the vacuum vessel to pick up the components to be removed. The technologies underpinning this system will encompass a variety of high-tech skills and comply with nuclear safety requirements. A proven manufacturing record in similar fields and the development of bespoke systems to perform mechanical transfers will be essential.


Image captions
Cut-away image of the ITER machine showing the casks at the three levels of the ITER machine. ITER IO © (Remote1 web)

Illustration of lorry next to an ITER cask. F4E © (Remote 2 web)

Aerial view of the ITER construction site, October 2016. F4E © (ITER site aerial Oct)

The consortium of companies
The consortium combines the space expertise of Airbus Safran Launchers, adapted to this extreme environment to ensure safe conditions for the ITER teams, with the nuclear experience of Nuvia, which dates back to the beginnings of the UK nuclear industry and includes solutions to some of the world’s most complex nuclear challenges, and with Cegelec CEM, a specialist in mechanical projects for the French nuclear sector that contributes over 30 years in the nuclear arena, including turnkey projects for large scientific installations and the realisation of complex mechanical systems.

Fusion for Energy
Fusion for Energy (F4E) is the European Union’s organisation for Europe’s contribution to ITER.
One of the main tasks of F4E is to work together with European industry, SMEs and research organisations to develop and provide a wide range of high technology components together with engineering, maintenance and support services for the ITER project.
F4E supports fusion R&D initiatives through the Broader Approach Agreement signed with Japan and prepares for the construction of demonstration fusion reactors (DEMO).
F4E was created by a decision of the Council of the European Union as an independent legal entity and was established in April 2007 for a period of 35 years.
Its offices are in Barcelona, Spain.
http://www.fusionforenergy.europa.eu
http://www.youtube.com/user/fusionforenergy
http://twitter.com/fusionforenergy
http://www.flickr.com/photos/fusionforenergy

ITER
ITER is a first-of-a-kind global collaboration. It will be the world’s largest experimental fusion facility and is designed to demonstrate the scientific and technological feasibility of fusion power. It is expected to produce a significant amount of fusion power (500 MW) for about seven minutes. Fusion is the process which powers the sun and the stars. When light atomic nuclei fuse together to form heavier ones, a large amount of energy is released. Fusion research is aimed at developing a safe, limitless and environmentally responsible energy source.
Europe will contribute almost half of the costs of its construction, while the other six parties to this joint international venture (China, Japan, India, the Republic of Korea, the Russian Federation and the USA), will contribute equally to the rest.
The site of the ITER project is in Cadarache, in the South of France.
http://www.iter.org

For Fusion for Energy media enquiries contact:
Aris Apollonatos
E-mail: aris.apollonatos@f4e.europa.eu
Tel: +34 93 3201833, +34 649 179 42


#428039 Naturipe Berry Growers Invests in ...

FOR IMMEDIATE RELEASE CONTACT: Gary Wishnatzki
O: (813)498-4278
C: (813)335-3959
gw@harvestcroo.com

NATURIPE BERRY GROWERS INVESTS IN HARVEST CROO ROBOTICS
Adds to the growing list of strawberry industry investors

Tampa, FL (September 20, 2016) – Naturipe Berry Growers has joined the growing list of strawberry industry investors supporting Harvest CROO Robotics’ mission to answer the need for agricultural labor with technology. Naturipe is one of the largest strawberry growers in North America. With the support of Naturipe, now more than 20% of the U.S. strawberry industry has invested in Harvest CROO Robotics.

“The lack of availability of labor to harvest strawberries is one of the great challenges facing our industry,” said Rich Amirsehhi, President and CEO of Naturipe Berry Growers. “Harvest CROO Robotics’ technology to harvest berries has tremendous promise to solve this critical problem.”

Harvest CROO Robotics continues to develop and test the latest technology for agricultural robotics. The company will test its latest prototype during the Florida strawberry season, which begins in November. Improvements include increased harvest speed and the development of an autonomous mobile platform that will carry the robotic pickers through the field. After berries are picked, they will be transferred overhead to the platform level, where they will be inspected and packed into consumer units by delta robots. The development of the packing robots next year will mark another key milestone in Harvest CROO Robotics’ technological advances.

“The technology is prepared to make a major leap this coming season,” said Bob Pitzer, Co-founder and Chief Technology Officer of Harvest CROO. “We were at commercial speed last March, at a rate of 8 seconds to pick a plant. Now, by using embedded processors and a streamlined picking-head design, we expect to easily cut that time in half.”

“Naturipe Berry Growers sees joining this collaborative effort as an important step in ensuring the sustainability of the U.S. strawberry industry and putting our growers in a position to be early adopters of the technology,” said Amirsehhi.

Harvest CROO is currently fundraising in preparation for the next round of prototypes. To learn more about Harvest CROO, including investment opportunities, contact info@harvestcroo.com.
###

About Harvest CROO:

Harvest CROO (Computerized Robotic Optimized Obtainer) began in 2012 with Gary Wishnatzki’s vision of creating a solution to the dwindling labor force in agriculture. With the expertise of Co-founder and Chief Technology Officer Bob Pitzer, they began developing the first Harvest CROO machines. In previous rounds, $1.8 million was raised through qualified investors. Many of these investors are members of the strawberry industry, including Sweet Life Farms, Sam Astin III, California Giant, Inc., Main Street Produce, Inc., Sweet Darling Sales, Inc., Innovative Produce, Inc., DG Berry, Inc., Central West, and Naturipe Berry Growers. In Round C, Harvest CROO is seeking to raise $3 million to build the next version, the Alpha unit, which will be the predecessor to a production model. To learn more about Harvest CROO, including current career opportunities for experienced engineers, contact info@harvestcroo.com.

About Naturipe Berry Growers:

Naturipe Berry Growers (NBG) is a co-op of growers that was founded in 1917 as the Central California Berry Growers Association. NBG markets its fruit through Naturipe Farms LLC, a grower-owned producer and international marketer of healthy, best-tasting, premium berries, with production primarily from multi-generation family farms located in prime berry-growing regions throughout North and South America. The diverse grower base ensures year-round availability of “locally grown” and “in-season global” conventional and organic berries. Naturipe Farms, formed in 2000, is a partnership between MBG Marketing, Hortifrut SA, Naturipe Berry Growers, and Munger Farms, with sales and customer service offices located strategically throughout the USA: Salinas, CA (HQ); Grand Junction, MI; Estero, FL; Boston, MA; Wenatchee, WA; and Atlanta, GA.
For more information visit: www.naturipefarms.com or https://www.facebook.com/Naturipe
