
#435828 Video Friday: Boston Dynamics’ ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RoboBusiness 2019 – October 1-3, 2019 – Santa Clara, Calif., USA
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

You’ve almost certainly seen the new Spot and Atlas videos from Boston Dynamics, if for no other reason than we posted about Spot’s commercial availability earlier this week. But what, are we supposed to NOT include them in Video Friday anyway? Psh! Here you go:

[ Boston Dynamics ]

Eight deadly-looking robots. One Giant Nut trophy. Tonight is the BattleBots season finale, airing on Discovery, 8 p.m. ET, or check your local channels.

[ BattleBots ]

Thanks Trey!

Speaking of battling robots… Having giant robots fight each other is one of those things that sounds really great in theory, but doesn’t work out so well in reality. And sadly, MegaBots is having to deal with reality, which means putting their giant fighting robot up on eBay.

As of Friday afternoon, the current bid is just over $100,000 with a week to go.

[ MegaBots ]

Michigan Engineering has figured out the secret formula to getting 150,000 views on YouTube: drone plus nail gun.

[ Michigan Engineering ]

Michael Burke from the University of Edinburgh writes:

We’ve been learning to scoop grapefruit segments using a PR2, by “feeling” the difference between peel and pulp. We use joint torque measurements to predict the probability that the knife is in the peel or pulp, and use this to apply feedback control to a nominal cutting trajectory learned from human demonstration, so that we remain in a position of maximum uncertainty about which medium we’re cutting. This means we slice along the boundary between the two mediums. It works pretty well!
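To make that control idea concrete, here's a minimal sketch of the maximum-uncertainty feedback loop, assuming a trained peel/pulp classifier and placeholder robot-interface functions (read_torque, move_to, and the gain are all illustrative, not from the paper):

```python
def uncertainty_feedback(p_peel, gain=0.01):
    """Steer toward maximum classifier uncertainty (p = 0.5).

    If the classifier is confident the knife is in peel (p > 0.5),
    nudge it toward the pulp; if confident it's in pulp (p < 0.5),
    nudge back toward the peel. The knife then rides the boundary.
    """
    return gain * (p_peel - 0.5)

def cut(nominal_traj, normals, read_torque, move_to, clf):
    """Track a demonstrated cutting path, corrected by torque feedback.

    normals[i] is a unit vector at waypoint i pointing from peel
    toward pulp, so a positive offset moves the knife inward.
    """
    offset = 0.0
    for waypoint, normal in zip(nominal_traj, normals):
        torque = read_torque()                      # joint torque vector
        p_peel = clf.predict_proba(torque.reshape(1, -1))[0, 1]
        offset += uncertainty_feedback(p_peel)      # integrate correction
        move_to(waypoint + offset * normal)
```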

[ Paper ] via [ Robust Autonomy and Decisions Group ]

Thanks Michael!

Hey look, it’s Jan with eight EMYS robot heads. Hi, Jan! Hi, EMYSes!

[ EMYS ]

We’re putting the KRAKEN Arm through its paces, demonstrating that it can unfold from an Express Rack locker on the International Space Station and access neighboring lockers in NASA’s FabLab system to enable transfer of materials and parts between manufacturing, inspection, and storage stations. The KRAKEN arm will be able to change between multiple ’end effector’ tools such as grippers and inspection sensors – those are in development so they’re not shown in this video.

[ Tethers Unlimited ]

UBTECH’s Alpha Mini robot, running Smart Robot’s “Maatje” software, is offering healthcare services to children at the Praktijk Intraverte Multidisciplinary Institution in the Netherlands.

This institution is using Alpha Mini to counsel children on their behavior. Alpha Mini can move and talk to children and offers games and activities to stimulate and interact with them. By talking with, helping, and motivating children, Alpha Mini helps them become more at ease in social situations.

[ UBTECH ]

Some impressive work here from Anusha Nagabandi, Kurt Konolige, Sergey Levine, and Vikash Kumar at Google Brain, training a dexterous multi-fingered hand to do that thing with two balls that I’m really bad at.

Dexterous multi-fingered hands can provide robots with the ability to flexibly perform a wide range of manipulation skills. However, many of the more complex behaviors are also notoriously difficult to control: Performing in-hand object manipulation, executing finger gaits to move objects, and exhibiting precise fine motor skills such as writing, all require finely balancing contact forces, breaking and reestablishing contacts repeatedly, and maintaining control of unactuated objects. In this work, we demonstrate that our method of online planning with deep dynamics models (PDDM) addresses both of these limitations; we show that improvements in learned dynamics models, together with improvements in online model-predictive control, can indeed enable efficient and effective learning of flexible contact-rich dexterous manipulation skills — and that too, on a 24-DoF anthropomorphic hand in the real world, using just 2-4 hours of purely real-world data to learn to simultaneously coordinate multiple free-floating objects.
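The core loop of online planning with a learned dynamics model is easy to sketch. Below is the plain random-shooting variant of the idea (PDDM itself uses a more sophisticated filtered, reward-weighted sampler); `dynamics` and `reward_fn` stand in for the learned model and task reward:

```python
import numpy as np

def mpc_action(state, dynamics, reward_fn, horizon=10, n_samples=500, act_dim=24):
    """One control step of sampling-based MPC with a learned model.

    Sample candidate action sequences, roll each out through the
    learned dynamics, and execute only the first action of the best
    sequence -- then replan from the next observed state.
    """
    actions = np.random.uniform(-1.0, 1.0, (n_samples, horizon, act_dim))
    returns = np.zeros(n_samples)
    for i in range(n_samples):
        s = state
        for t in range(horizon):
            s = dynamics(s, actions[i, t])   # learned model: s' = f(s, a)
            returns[i] += reward_fn(s)
    return actions[np.argmax(returns), 0]
```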

[ PDDM ]

Thanks Vikash!

CMU’s Ballbot has a deceptively light touch that’s ideal for leading people around.

A paper on this has been submitted to IROS 2019.

[ CMU ]

The Autonomous Robots Lab at the University of Nevada is sharing some of the work they’ve done on path planning and exploration for aerial robots during the DARPA SubT Challenge.

[ Autonomous Robots Lab ]

More proof that anything can be a drone if you staple some motors to it. Even 32 feet of styrofoam insulation.

[ YouTube ]

Whatever you think of military drones, we can all agree that they look cool.

[ Boeing ]

I appreciate the fact that iCub has eyelids, I really do, but sometimes, it ends up looking kinda sleepy in research videos.

[ EPFL LASA ]

Video shows autonomous flight of a lightweight aerial vehicle outdoors and indoors on the campus of Carnegie Mellon University. The vehicle is equipped with limited onboard sensing from a front-facing camera and a proximity sensor. The aerial autonomy is enabled by utilizing a 3D prior map built in a previous mapping step.

[ CMU ]

The Stanford Space Robotics Facility allows researchers to test innovative guidance and navigation algorithms on a realistic frictionless, underactuated system.

[ Stanford ASL ]

In this video, Ian and CP discuss Misty’s many capabilities including robust locomotion, obstacle avoidance, 3D mapping/SLAM, face detection and recognition, sound localization, hardware extensibility, photo and video capture, and programmable personality. They also talk about some of the skills he’s built using these capabilities (and others) and how those skills can be expanded upon by you.

[ Misty Robotics ]

This week’s CMU RI Seminar comes from Aaron Parness at Caltech and NASA JPL, on “Robotic Grippers for Planetary Applications.”

The previous generation of NASA missions to the outer solar system discovered salt water oceans on Europa and Enceladus, each with more liquid water than Earth – compelling targets to look for extraterrestrial life. Closer to home, JAXA and NASA have imaged sky-light entrances to lava tube caves on the Moon more than 100 m in diameter and ESA has characterized the incredibly varied and complex terrain of Comet 67P. While JPL has successfully landed and operated four rovers on the surface of Mars using a 6-wheeled rocker-bogie architecture, future missions will require new mobility architectures for these extreme environments. Unfortunately, the highest value science targets often lie in the terrain that is hardest to access. This talk will explore robotic grippers that enable missions to these extreme terrains through their ability to grip a wide variety of surfaces (shapes, sizes, and geotechnical properties). To prepare for use in space where repair or replacement is not possible, we field-test these grippers and robots in analog extreme terrain on Earth. Many of these systems are enabled by advances in autonomy. The talk will present a rapid overview of my work and a detailed case study of an underactuated rock gripper for deflecting asteroids.

[ CMU ]

Rod Brooks gives some of the best robotics talks ever. He gave this one earlier this week at UC Berkeley, on “Steps Toward Super Intelligence and the Search for a New Path.”

[ UC Berkeley ]


#435791 To Fly Solo, Racing Drones Have a Need ...

Drone racing’s ultimate vision of quadcopters weaving nimbly through obstacle courses has attracted far less excitement and investment than self-driving cars aimed at reshaping ground transportation. But the U.S. military and defense industry are betting on autonomous drone racing as the next frontier for developing AI so that it can handle high-speed navigation within tight spaces without human intervention.

The autonomous drone challenge requires split-second decision-making with six degrees of freedom instead of a car’s mere two degrees of road freedom. One research team developing the AI necessary for controlling autonomous racing drones is the Robotics and Perception Group at the University of Zurich in Switzerland. In late May, the Swiss researchers were among nine teams revealed to be competing in the two-year AlphaPilot open innovation challenge sponsored by U.S. aerospace company Lockheed Martin. The winning team will walk away with up to $2.25 million for beating other autonomous racing drones and a professional human drone pilot in head-to-head competitions.

“I think it is important to first point out that having an autonomous drone to finish a racing track at high speeds or even beating a human pilot does not imply that we can have autonomous drones [capable of] navigating in real-world, complex, unstructured, unknown environments such as disaster zones, collapsed buildings, caves, tunnels or narrow pipes, forests, military scenarios, and so on,” says Davide Scaramuzza, a professor of robotics and perception at the University of Zurich and ETH Zurich. “However, the robust and computationally efficient state estimation algorithms, control, and planning algorithms developed for autonomous drone racing would represent a starting point.”

The nine teams that made the cut—from a pool of 424 AlphaPilot applicants—will compete in four 2019 racing events organized under the Drone Racing League’s Artificial Intelligence Robotic Racing Circuit, says Keith Lynn, program manager for AlphaPilot at Lockheed Martin. To ensure an apples-to-apples comparison of each team’s AI secret sauce, each AlphaPilot team will upload its AI code into identical, specially-built drones that have the NVIDIA Xavier GPU at the core of the onboard computing hardware.

“Lockheed Martin is offering mentorship to the nine AlphaPilot teams to support their AI tech development and innovations,” says Lynn. The company “will be hosting a week-long Developers Summit at MIT in July, dedicated to workshopping and improving AlphaPilot teams’ code,” he added. He notes that each team will retain the intellectual property rights to its AI code.

The AlphaPilot challenge takes inspiration from older autonomous drone racing events hosted by academic researchers, Scaramuzza says. He credits Hyungpil Moon, a professor of robotics and mechanical engineering at Sungkyunkwan University in South Korea, for having organized the annual autonomous drone racing competition at the International Conference on Intelligent Robots and Systems since 2016.

It’s no easy task to create and train AI that can perform high-speed flight through complex environments by relying on visual navigation. One big challenge comes from how drones can accelerate sharply, take sharp turns, fly sideways, do zig-zag patterns and even perform back flips. That means camera images can suddenly appear tilted or even upside down during drone flight. Motion blur may occur when a drone flies very close to structures at high speeds and camera pixels collect light from multiple directions. Both cameras and visual software can also struggle to compensate for sudden changes between light and dark parts of an environment.

To lend AI a helping hand, Scaramuzza’s group recently published a drone racing dataset that includes realistic training data taken from a drone flown by a professional pilot in both indoor and outdoor spaces. The data, which includes complicated aerial maneuvers such as back flips, flight sequences that cover hundreds of meters, and flight speeds of up to 83 kilometers per hour, was presented at the 2019 IEEE International Conference on Robotics and Automation.

The drone racing dataset also includes data captured by the group’s special bioinspired event cameras that can detect changes in motion on a per-pixel basis within microseconds. By comparison, ordinary cameras need milliseconds (each millisecond being 1,000 microseconds) to compare motion changes in each image frame. The event cameras have already proven capable of helping drones nimbly dodge soccer balls thrown at them by the Swiss lab’s researchers.
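If you haven’t worked with event cameras: instead of frames, they emit an asynchronous stream of per-pixel events, each a (timestamp, x, y, polarity) tuple marking a brightness increase or decrease. A minimal sketch of binning such a stream into a frame-like image for conventional vision code might look like this (the tuple layout is a common convention, not any particular camera’s API):

```python
import numpy as np

def events_to_frame(events, height, width, t_start, t_end):
    """Accumulate events from [t_start, t_end) into a 2D image.

    events: iterable of (t, x, y, polarity) tuples with microsecond
    timestamps and polarity +1 (brighter) or -1 (darker).
    """
    frame = np.zeros((height, width))
    for t, x, y, polarity in events:
        if t_start <= t < t_end:
            frame[y, x] += polarity
    return frame
```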

The Swiss group’s work on the racing drone dataset received funding in part from the U.S. Defense Advanced Research Projects Agency (DARPA), which acts as the U.S. military’s special R&D arm for more futuristic projects. Specifically, the funding came from DARPA’s Fast Lightweight Autonomy program that envisions small autonomous drones capable of flying at high speeds through cluttered environments without GPS guidance or communication with human pilots.

Such speedy drones could serve as military scouts checking out dangerous buildings or alleys. They could also someday help search-and-rescue teams find people trapped in semi-collapsed buildings or lost in the woods. Being able to fly at high speed without crashing into things also makes a drone more efficient at all sorts of tasks by making the most of limited battery life, Scaramuzza says. After all, most drone battery life gets used up by the need to hover in flight and doesn’t get drained much by flying faster.

Even if AI manages to conquer the drone racing obstacle courses, that would be just the end of the beginning of the technology’s development. What would still be required? Scaramuzza specifically singled out the need to handle low-visibility conditions involving smoke, dust, fog, rain, snow, fire, and hail as some of the biggest challenges for vision-based algorithms and AI in complex real-life environments.

“I think we should develop and release datasets containing smoke, dust, fog, rain, fire, etc. if we want to allow using autonomous robots to complement human rescuers in saving people’s lives after an earthquake or natural disaster in the future,” Scaramuzza says.


#435750 Video Friday: Amazon CEO Jeff Bezos ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events):

RSS 2019 – June 22-26, 2019 – Freiburg, Germany
Hamlyn Symposium on Medical Robotics – June 23-26, 2019 – London, U.K.
ETH Robotics Summer School – June 27-July 1, 2019 – Zurich, Switzerland
MARSS 2019 – July 1-5, 2019 – Helsinki, Finland
ICRES 2019 – July 29-30, 2019 – London, U.K.
Let us know if you have suggestions for next week, and enjoy today’s videos.

Last week at the re:MARS conference, Amazon CEO and aspiring supervillain Jeff Bezos tried out this pair of dexterous robotic hands, which he described as “weirdly natural” to operate. The system combines Shadow Robot’s anthropomorphic robot hands with SynTouch’s biomimetic tactile sensors and HaptX’s haptic feedback gloves.

After playing with the robot, Bezos let out his trademark evil laugh.

[ Shadow Robot ]

The RoboMaster S1 is DJI’s advanced new educational robot that opens the door to limitless learning and entertainment. Develop programming skills, get familiar with AI technology, and enjoy thrilling FPV driving with games and competition. From young learners to tech enthusiasts, get ready to discover endless possibilities with the RoboMaster S1.

[ DJI ]

It’s very impressive to see DLR’s humanoid robot Toro dynamically balancing, even while being handed heavy objects, pushing things, and using multi-contact techniques to kick a fire extinguisher for some reason.

The paper is in RA-L, and you can find it at the link below.

[ RA-L ] via [ DLR ]

Thanks Maximo!

Is it just me, or does the Suzumori Endo Robotics Laboratory’s Super Dragon arm somehow just keep getting longer?

Suzumori Endo Lab, Tokyo Tech developed a 10 m-long articulated manipulator for investigation inside the primary containment vessel of the Fukushima Daiichi Nuclear Power Plants. We employed a coupled tendon-driven mechanism and a gravity compensation mechanism using synthetic fiber ropes to design a lightweight and slender articulated manipulator. This work was published in IEEE Robotics and Automation Letters and Transactions of the JSME.

[ Suzumori Endo Lab ]

From what I can make out thanks to Google Translate, this cute little robot duck (developed by Nissan) helps minimize weeds in rice fields by stirring up the water.

[ Nippon.com ]

Confidence in your robot is when you can just casually throw it off of a balcony 15 meters up.

[ SUTD ]

You had me at “we’re going to completely submerge this apple in chocolate syrup.”

[ Soft Robotics Inc ]

In the mid-2020s, the European Space Agency is planning on sending a robotic sample return mission to the Moon. It’s called Heracles, after the noted snake-strangler of Greek mythology.

[ ESA ]

Rethink Robotics is still around, they’re just much more German than before. And Sawyer is still hard at work stealing jobs from humans.

[ Rethink Robotics ]

The reason to watch this new video of the Ghost Robotics Vision 60 quadruped is for the 3 seconds worth of barrel roll about 40 seconds in.

[ Ghost Robotics ]

This is a relatively low-altitude drop for Squishy Robotics’ tensegrity scout, but it’s still cool to watch a robot that’s resilient enough to fall and just not worry about it.

[ Squishy Robotics ]

We control here the Apptronik DRACO bipedal robot for unsupported dynamic locomotion. DRACO consists of a 10 DoF lower body with liquid cooled viscoelastic actuators to reduce weight, increase payload, and achieve fast dynamic walking. Control and walking algorithms are designed by UT HCRL Laboratory.

I think all robot videos should be required to start with two “oops” clips followed by a “for real now” clip.

[ Apptronik ]

SAKE’s EZGripper manages to pick up a wrench, and also pick up a raspberry without turning it into instajam.

[ SAKE Robotics ]

And now: the robotic long-tongued piggy, courtesy Sony Toio.

[ Toio ]

In this video the ornithopter developed inside the ERC Advanced Grant GRIFFIN project performs its first flight. This project aims to develop a flapping-wing system with manipulation and human interaction capabilities.

A flapping-wing system with manipulation and human interaction capabilities, you say? I would like to subscribe to your newsletter.

[ GRVC ]

KITECH’s robotic hands and arms can manipulate, among other things, five boxes of Elmos. I’m not sure about the conversion of Elmos to Snuffleupaguses, although it turns out that one Snuffleupagus is exactly 1,000 pounds.

[ Ji-Hun Bae ]

The Australian Centre for Field Robotics (ACFR) has been working on agricultural robots for almost a decade, and this video sums up a bunch of the stuff that they’ve been doing, even if it’s more amusing than practical at times.

[ ACFR ]

ROS 2 is great for multi-robot coordination, like when you need your bubble level to stay really, really level.

[ Acutronic Robotics ]

We don’t hear iRobot CEO Colin Angle give a lot of talks, so this recent one (from Amazon’s re:MARS conference) is definitely worth a listen, especially considering how much innovation we’ve seen from iRobot recently.

Colin Angle, founder and CEO of iRobot, has unveiled a series of breakthrough innovations in home robots from iRobot. For the first time on stage, he discusses and demonstrates what it takes to build a truly intelligent system of robots that work together to accomplish more within the home – and enable that home, and the devices within it, to work together as one.

[ iRobot ]

In the latest episode of Robots in Depth, Per speaks with Federico Pecora from the Center for Applied Autonomous Sensor Systems at Örebro University in Sweden.

Federico talks about working on AI and service robotics. In this area he has worked on planning, especially focusing on why a particular goal is the one that the robot should work on. To make robots as useful and user friendly as possible, he works on inferring the goal from the robot’s environment so that the user does not have to tell the robot everything.

Federico has also worked with AI robotics planning in industry to optimize results. Managing the relative importance of tasks is another challenging area there. In this context, he works on automating not only a single robot for its goal, but an entire fleet of robots for their collective goal. We get to hear about how these techniques are being used in warehouse operations, in mines and in agriculture.

[ Robots in Depth ]


#435731 Video Friday: NASA Is Sending This ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

MARSS 2019 – July 1-5, 2019 – Helsinki, Finland
ICRES 2019 – July 29-30, 2019 – London, UK
DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, PA, USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

The big news today is that NASA is sending a robot to Saturn’s moon Titan. A flying robot. The Dragonfly mission will launch in 2026 and arrive in 2034, but you knew that already, because last January, we posted a detailed article about the concept from the Applied Physics Lab at Johns Hopkins University. And now it’s not a concept anymore, yay!

Again, you can read all the details, plus an interview, in our 2018 article.

[ NASA ]

A robotic gripping arm that uses engineered bacteria to “taste” for a specific chemical has been developed by engineers at the University of California, Davis, and Carnegie Mellon University. The gripper is a proof-of-concept for biologically-based soft robotics.

The new device uses a biosensing module based on E. coli bacteria engineered to respond to the chemical IPTG by producing a fluorescent protein. The bacterial cells reside in wells with a flexible, porous membrane that allows chemicals to enter but keeps the cells inside. This biosensing module is built into the surface of a flexible gripper on a robotic arm, so the gripper can “taste” the environment through its fingers.

When IPTG crosses the membrane into the chamber, the cells fluoresce and electronic circuits inside the module detect the light. The electrical signal travels to the gripper’s control unit, which can decide whether to pick something up or release it.
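Put another way, once the biology has converted chemistry into light, the robot’s decision logic can be very simple. Here’s a hypothetical sketch of that sense-then-release loop; the function names and thresholds are mine, not the published UC Davis/CMU design:

```python
def gripper_step(read_photodetector, release, baseline=0.05, threshold=2.0):
    """Release the grasped object if the biosensor 'tastes' IPTG.

    The engineered E. coli fluoresce in the presence of the chemical;
    we treat a photodetector reading well above the no-chemical
    baseline as a positive detection. Numbers are illustrative.
    """
    signal = read_photodetector()        # fluorescence intensity
    if signal > threshold * baseline:    # chemical detected
        release()
        return "released"
    return "holding"
```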

[ UC Davis ]

The Toyota Research Institute (TRI) is taking on the hard problems in manipulation research toward making human-assist robots reliable and robust. Dr. Russ Tedrake, TRI Vice President of Robotics Research, explains how we are exploring the challenges and addressing the reliability gap by using a robot loading dishes in a dishwasher as an example task.

[ TRI ]

The Tactile Telerobot is the world’s first haptic telerobotic system that transmits realistic touch feedback to an operator located anywhere in the world. It is the product of joint collaboration between Shadow Robot Company, HaptX, and SynTouch. All Nippon Airways funded the project’s initial research and development.

What’s really unique about this is the HaptX tactile feedback system, which is something we’ve been following for several years now. It’s one of the most magical tech experiences I’ve ever had, and you can read about it here and here.

[ HaptX ]

Thanks Andrew!

I love how snake robots can emulate some of the fanciest moves of real snakes, and then also do bonkers things that real snakes never do.

[ Matsuno Lab ]

Here are a couple interesting videos from the Human-Robot Interaction Lab at Tufts.

A robot is instructed to perform an action and cannot do it due to lack of sensors. But when another robot is placed nearby, it can execute the instruction by tacitly tapping into the other robot’s mind and using that robot’s sensors for its own actions. Yes, it’s automatic, and yes, it’s the BORG!

Two Nao robots are instructed to perform a dance and are able to do it right after instruction. Moreover, they can switch roles immediately, and even a third different PR2 robot can perform the dance right away, demonstrating the ability of our DIARC architecture to learn quickly and share the knowledge with any type of robot running the architecture.

Compared to Nao, PR2 just sounds… depressed.

[ HRI Lab ]

This work explores the problem of robot tool construction – creating tools from parts available in the environment. We advance the state-of-the-art in robotic tool construction by introducing an approach that enables the robot to construct a wider range of tools with greater computational efficiency. Specifically, given an action that the robot wishes to accomplish and a set of building parts available to the robot, our approach reasons about the shape of the parts and potential ways of attaching them, generating a ranking of part combinations that the robot then uses to construct and test the target tool. We validate our approach on the construction of five tools using a physical 7-DOF robot arm.
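The ranking step, in spirit, could look like the sketch below: enumerate orderings of parts, score each for shape suitability and attachability, and return a sorted list for the robot to construct and test. The two scoring functions are stand-ins for the paper’s actual geometric and attachment reasoning:

```python
from itertools import permutations

def rank_tool_candidates(parts, action, shape_score, attach_score):
    """Rank two-part combinations as candidate tools for an action.

    shape_score(part, action): how well the part's geometry suits the
    action (e.g., flat and heavy for hammering).
    attach_score(head, handle): how plausibly the two parts attach.
    """
    candidates = []
    for head, handle in permutations(parts, 2):
        score = shape_score(head, action) + attach_score(head, handle)
        candidates.append((score, head, handle))
    candidates.sort(key=lambda c: c[0], reverse=True)  # best first
    return candidates
```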

[ RAIL Lab ] via [ RSS ]

We like Magazino’s approach to warehouse picking: constrain the problem to something you can reliably solve, like shoeboxes.

Magazino has announced a new pricing model for their robots. You pay 55k euros for the robot itself, and then after that, all you pay to keep the robot working is 6 cents per pick, so the robot is only costing you money for the work that it actually does.
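The arithmetic behind that model is worth a quick sketch: the up-front 55,000 euros amortizes away as pick volume grows, leaving the 6-cent fee as the effective marginal cost.

```python
def cost_per_pick(picks, robot_cost=55_000.00, fee=0.06):
    """Effective euros per pick: amortized robot price plus per-pick fee."""
    return robot_cost / picks + fee

for n in (100_000, 1_000_000, 10_000_000):
    print(f"{n:>10,} picks -> {cost_per_pick(n):.3f} EUR/pick")
# 100,000 -> 0.610, 1,000,000 -> 0.115, 10,000,000 -> 0.066
```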

[ Magazino ]

Thanks Florin!

Human-robot collaborations are happening across factories worldwide, yet very few smaller businesses are using them, due to high costs or the difficulty of customization. Elephant Robotics, a new player from Shenzhen, the Silicon Valley of Asia, has set its sights on helping smaller businesses gain access to smart robotics. They created the Catbot (a collaborative robotic arm) that will offer high efficiency and flexibility to various industries.

The Catbot is set to help with everything from education projects, photography, and massage to being a personal barista or co-playing a table game. The customizations are endless. To increase its flexibility of usage, the Catbot is extremely easy to program, from high-precision tasks up to hefty ground-covering projects.

[ Elephant Robotics ]

Thanks Johnson!

Dronistics, an EPFL spin-off, has been testing out their enclosed delivery drone in the Dominican Republic through a partnership with WeRobotics.

[ WeRobotics ]

QTrobot is an expressive humanoid robot designed to help children with autism spectrum disorder and children with special educational needs learn new skills. QTrobot uses simple and exaggerated facial expressions, combined with interactive games and stories, to help children improve their emotional skills. QTrobot helps children learn about and better understand emotions, and teaches them strategies to handle their emotions more effectively.

[ LuxAI ]

Here’s a typical day in the life of a Tertill solar-powered autonomous weed-destroying robot.

$300, now shipping from Franklin Robotics.

[ Tertill ]

PAL Robotics is excited to announce a new TIAGo with two arms, TIAGo++! After carefully listening to the robotics community needs, we used TIAGo’s modularity to integrate two 7-DoF arms to our mobile manipulator. TIAGo++ can help you swiftly accomplish your research goals, opening endless possibilities in mobile manipulation.

[ PAL Robotics ]

Thanks Jack!

You’ve definitely already met the Cobalt security robot, but Toyota AI Ventures just threw a pile of money at them and would therefore like you to experience this re-introduction:

[ Cobalt Robotics ] via [ Toyota AI ]

ROSIE is a mobile manipulator kit from HEBI Robotics. And if you don’t like ROSIE, the modular nature of HEBI’s hardware means that you can take her apart and make something more interesting.

[ HEBI Robotics ]

Learn about Kawasaki Robotics’ second addition to their line of duAro dual-arm collaborative robots, duAro2. This model offers an extended vertical reach (550 mm) and an increased payload capacity (3 kg/arm).

[ Kawasaki Robotics ]

Drone Delivery Canada has partnered with Peel Region Paramedics to pilot its proprietary drone delivery platform as rapid first-responder technology, with the goal of reducing response times and potentially saving lives.

[ Drone Delivery Canada ]

In this week’s episode of Robots in Depth, Per speaks with Harri Ketamo, from Headai.

Harri Ketamo talks about AI and how he aims to mimic human decision making with algorithms. Harri has done a lot of AI for computer games, creating opponents that are entertaining to play against. It is easy to develop a very bad or a very good opponent, but designing an opponent that behaves like a human, is entertaining to play against, and that you can beat is quite hard. He talks about how AI in computer games is a very important storytelling tool and an important part of making a game entertaining to play.

This work led him into other parts of the AI field. Harri thinks that we sometimes have a problem separating what is real from the type of storytelling he knows from gaming AI. He calls for critical analysis of AI and says that data has to be used to verify AI decisions and results.

[ Robots in Depth ]

Thanks Per!


#435691 Squeezing Rocket Fuel From Moon Rocks

The most valuable natural resource on the moon may be water. In addition to sustaining lunar colonists, it could also be broken down into its constituent elements—hydrogen and oxygen—and used to make rocket propellant.

Although the ancients called the dark areas on the moon maria (Latin for “seas”), it has long been clear that liquid water can’t exist on the lunar surface, where it would swiftly evaporate. Since the 1960s, though, scientists have hypothesized that the moon indeed harbors water, in the form of ice. Because the moon has a very small axial tilt—just 1.5 degrees—the floors of many polar craters remain in perpetual darkness. Water could thus condense and survive in such polar “cold traps,” where it might one day be mined.
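The propellant chemistry is simple enough for a back-of-the-envelope sketch: electrolysis splits water as 2H2O → 2H2 + O2, so each tonne of mined ice yields roughly 112 kilograms of hydrogen and 888 kilograms of oxygen by mass.

```python
# Molar masses in g/mol for the electrolysis reaction 2 H2O -> 2 H2 + O2.
M_H2O, M_H2, M_O2 = 18.015, 2.016, 31.998

def propellant_from_water(water_kg):
    """Hydrogen and oxygen masses (kg) obtainable from a mass of water."""
    moles_h2o = water_kg * 1000.0 / M_H2O
    h2_kg = moles_h2o * M_H2 / 1000.0           # one H2 per H2O
    o2_kg = (moles_h2o / 2.0) * M_O2 / 1000.0   # one O2 per two H2O
    return h2_kg, o2_kg

h2, o2 = propellant_from_water(1000.0)  # one tonne of lunar ice
print(f"{h2:.0f} kg H2 + {o2:.0f} kg O2")  # ~112 kg H2, ~888 kg O2
```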

Water Water Everywhere: Finding rich deposits of ice and extracting it should be possible but will be technically challenging for lunar settlers.

Mapping the Moon: Several lunar missions have produced strong evidence of water ice. A NASA instrument called the Moon Mineralogy Mapper (M3) found indications of water ice on the permanently shadowed floors of some polar craters. However, the measurements suggest that only a small fraction of cold traps contain ice, and that the ice is probably mixed with lunar regolith.

Rover-Mounted Drill: The most straightforward strategy for extracting water from polar ice deposits uses a rover-mounted drill. Honeybee Robotics has designed a Planetary Volatiles Extractor with a heated auger, which would cause any water ice in the drilled regolith to vaporize. That vapor would then move through a tube to a condenser unit, where it would turn back into ice.

Thermal Mining: A more ambitious scheme for extracting water from the moon is “thermal mining.” Researchers at the Colorado School of Mines have proposed redirecting the sun’s rays using heliostats mounted on a crater rim. Water trapped in the regolith would turn into vapor that would be collected in a large tent, then vented into refrigerated cold traps, where it would condense as pure water ice.

Compressed-Gas Transport: To produce rocket fuel from water ice would require an electrolyzer to break the water into hydrogen and oxygen, which would then be compressed and stored for later use. In situ production would also require vehicles to transport the processed fuel to rocket pads.

Illustrations: John MacNeill
