#439783 This Google-Funded Project Is Tracking ...
It’s crunch time on climate change. The IPCC’s latest report told the world just how bad it is, and…it’s bad. Companies, NGOs, and governments are scrambling for fixes, both short-term and long-term, from banning the sale of combustion-engine vehicles to pouring money into hydrogen to building direct air capture plants. And one initiative, launched last week, is taking an “if you can name it, you can tame it” approach by creating an independent database that measures and tracks emissions all over the world.
Climate TRACE, which stands for Tracking Real-Time Atmospheric Carbon Emissions, is a collaboration between nonprofits, tech companies, and universities (including CarbonPlan, Earthrise Alliance, and the Johns Hopkins Applied Physics Laboratory), along with former US Vice President Al Gore and others. The organization started thanks to a grant from Google, which funded an effort to measure power plant emissions using satellites. A team of fellows from Google helped build algorithms to monitor the power plants (the Google.org Fellowship was created in 2019 to let Google employees do pro bono technical work for grant recipients).
Climate TRACE uses data from satellites and other remote sensing technologies to “see” emissions. Artificial intelligence algorithms combine this data with verifiable emissions measurements to produce estimates of the total emissions coming from various sources.
These sources are divided into ten sectors—like power, manufacturing, transportation, and agriculture—each with multiple subsectors (for example, two subsectors of agriculture are rice cultivation and manure management). The total carbon emitted from January 2015 to December 2020, by the project’s estimation, was 303.96 billion tons. The biggest offender? Electricity generation. It’s no wonder, then, that states, companies, and countries are rushing to make (occasionally unrealistic) carbon-neutral pledges, and that the renewable energy industry is booming.
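To make the general idea concrete, here is a minimal sketch of how remotely sensed signals could be combined with verified measurements to estimate emissions from sources that never self-report, framed as a supervised regression problem. This is an illustration only, not Climate TRACE’s published methodology; the feature names and numbers below are invented.

```python
# Illustrative sketch only; features and figures are made up, and this is not
# Climate TRACE's actual pipeline.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical satellite-derived features for 500 power plants:
# [thermal signature, plume opacity index, estimated hours online per month]
X = rng.uniform(0, 1, size=(500, 3))

# Verified emissions (tons CO2 per month) are available for only the first 200 plants.
y_verified = 1e5 * (0.6 * X[:200, 0] + 0.3 * X[:200, 2]) + rng.normal(0, 5e3, 200)

# Learn the mapping from observable signals to verified emissions...
model = GradientBoostingRegressor().fit(X[:200], y_verified)

# ...then estimate emissions for the 300 plants with no self-reported inventory.
estimates = model.predict(X[200:])
print(f"Estimated total for unreported plants: {estimates.sum():,.0f} tons CO2/month")
```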
The founders of the initiative hope that, by increasing transparency, the database will increase accountability, thereby spurring action. Younger consumers care about climate change, and are likely to push companies and brands to do something about it.
The BBC reported that in a recent survey led by the UK’s Bath University, almost 60 percent of respondents said they were “very worried” or “extremely worried” about climate change, while more than 45 percent said feelings about the climate affected their daily lives. The survey, which drew responses from 10,000 people aged 16 to 25, found that young people in the global south are the most concerned about climate change, while in the northern hemisphere those most worried live in Portugal, which has grappled with severe wildfires. Many of the survey respondents, independent of location, reportedly feel that “humanity is doomed.”
Once this demographic reaches working age, they’ll be able to throw their weight around, and it seems likely they’ll do so in a way that puts the planet and its future at center stage. For all its sanctimoniousness, “naming and shaming” of emitters not doing their part may end up being both necessary and helpful.
Until now, Climate TRACE’s website points out, emissions inventories have been largely self-reported (I mean, what’s even the point?), and they’ve used outdated information and opaque measurement methods. Besides being independent, which is huge in itself, TRACE is using 59 trillion bytes of data from more than 300 satellites, more than 11,100 sensors, and other sources of emissions information.
“We’ve established a shared, open monitoring system capable of detecting essentially all forms of humanity’s greenhouse gas emissions,” said Gavin McCormick, executive director of coalition convening member WattTime. “This is a transformative step forward that puts timely information at the fingertips of all those who seek to drive significant emissions reductions on our path to net zero.”
Given the scale of the project, the parties involved, and how quickly it has all come together—the grant from Google was in May 2019—it seems Climate TRACE is well-positioned to make a difference.
Image Credit: NASA
#439380 Autonomous excavators ready for around ...
Researchers from Baidu Research Robotics and Auto-Driving Lab (RAL) and the University of Maryland, College Park, have introduced an autonomous excavator system (AES) that can perform material loading tasks for a long duration without any human intervention while offering performance closely equivalent to that of an experienced human operator.
#439110 Robotic Exoskeletons Could One Day Walk ...
Engineers, using artificial intelligence and wearable cameras, now aim to help robotic exoskeletons walk by themselves.
Increasingly, researchers around the world are developing lower-body exoskeletons to help people walk. These are essentially walking robots users can strap to their legs to help them move.
One problem with such exoskeletons: they often depend on manual controls to switch from one mode of locomotion to another, such as from sitting to standing, standing to walking, or walking on the ground to walking up or down stairs. Relying on joysticks or smartphone apps every time you want to change how you move can prove awkward and mentally taxing, says Brokoslaw Laschowski, a robotics researcher at the University of Waterloo in Canada.
Scientists are working on automated ways to help exoskeletons recognize when to switch locomotion modes — for instance, using sensors attached to legs that can detect bioelectric signals sent from your brain to your muscles telling them to move. However, this approach comes with a number of challenges, such as how skin conductivity can change as a person’s skin gets sweatier or dries off.
Now several research groups are experimenting with a new approach: fitting exoskeleton users with wearable cameras to provide the machines with vision data that will let them operate autonomously. Artificial intelligence (AI) software can analyze this data to recognize stairs, doors, and other features of the surrounding environment and calculate how best to respond.
Laschowski leads the ExoNet project, the first open-source database of high-resolution wearable camera images of human locomotion scenarios. It holds more than 5.6 million images of indoor and outdoor real-world walking environments. The team used this data to train deep-learning algorithms; their convolutional neural networks can already automatically recognize different walking environments with 73 percent accuracy “despite the large variance in different surfaces and objects sensed by the wearable camera,” Laschowski notes.
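As a rough sketch of that kind of classifier (this is not the ExoNet code, and the class labels below are hypothetical stand-ins for ExoNet’s actual hierarchy), one could fine-tune a pretrained convolutional network to label each wearable-camera frame with a walking-environment class; the new classification head would need to be trained on labeled frames before use.

```python
# Minimal sketch, not the ExoNet pipeline; CLASSES is a hypothetical label set and
# the replacement classification head is untrained here.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

CLASSES = ["level_ground", "incline_stairs", "decline_stairs", "door", "other"]

# Start from an ImageNet-pretrained backbone and swap in a head for our classes.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
model.eval()  # in practice, fine-tune on labeled wearable-camera frames first

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify_frame(path: str) -> str:
    """Return the predicted environment class for one camera frame."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
    return CLASSES[int(logits.argmax(dim=1))]

# e.g., classify_frame("frame_000123.jpg") might return "incline_stairs"
```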
According to Laschowski, a potential limitation of their work is their reliance on conventional 2-D images, whereas depth cameras could also capture potentially useful distance data. He and his collaborators ultimately chose not to rely on depth cameras for a number of reasons, including the fact that the accuracy of depth measurements typically degrades in outdoor lighting and with increasing distance, he says.
In similar work, researchers in North Carolina had volunteers with cameras either mounted on their eyeglasses or strapped onto their knees walk through a variety of indoor and outdoor settings to capture the kind of image data exoskeletons might use to see the world around them. The aim? “To automate motion,” says Edgar Lobaton, an electrical engineering researcher at North Carolina State University. He says they are focusing on how AI software might reduce uncertainty due to factors such as motion blur or overexposed images “to ensure safe operation. We want to ensure that we can really rely on the vision and AI portion before integrating it into the hardware.”
In the future, Laschowski and his colleagues will focus on improving the accuracy of their environmental analysis software with low computational and memory storage requirements, which are important for onboard, real-time operations on robotic exoskeletons. Lobaton and his team also seek to account for uncertainty introduced into their visual systems by movement.
Ultimately, the ExoNet researchers want to explore how AI software can transmit commands to exoskeletons so they can perform tasks such as climbing stairs or avoiding obstacles based on a system’s analysis of a user’s current movements and the upcoming terrain. With autonomous cars as inspiration, they are seeking to develop autonomous exoskeletons that can handle the walking task without human input, Laschowski says.
However, Laschowski adds, “User safety is of the utmost importance, especially considering that we're working with individuals with mobility impairments,” resulting perhaps from advanced age or physical disabilities.
“The exoskeleton user will always have the ability to override the system should the classification algorithm or controller make a wrong decision.”
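One hypothetical way such a system could turn vision output into behavior while preserving the manual override Laschowski describes is a simple mode-selection layer like the sketch below; none of the mode names or mappings come from the ExoNet project.

```python
# Hypothetical mode-selection layer; mode names and mappings are illustrative only.
from dataclasses import dataclass
from typing import Optional

MODE_FOR_ENVIRONMENT = {
    "level_ground": "walk",
    "incline_stairs": "stair_ascent",
    "decline_stairs": "stair_descent",
    "door": "stop_and_wait",
}

@dataclass
class ModeController:
    current_mode: str = "stand"

    def update(self, predicted_environment: str, user_override: Optional[str] = None) -> str:
        """Pick the next locomotion mode; a manual command always wins over the classifier."""
        if user_override is not None:
            self.current_mode = user_override
        elif predicted_environment in MODE_FOR_ENVIRONMENT:
            self.current_mode = MODE_FOR_ENVIRONMENT[predicted_environment]
        return self.current_mode

controller = ModeController()
print(controller.update("incline_stairs"))         # classifier drives: "stair_ascent"
print(controller.update("level_ground", "stand"))  # user override wins: "stand"
```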
#439105 This Robot Taught Itself to Walk in a ...
Recently, in a Berkeley lab, a robot called Cassie taught itself to walk, a little like a toddler might. Through trial and error, it learned to move in a simulated world. Then its handlers sent it strolling through a minefield of real-world tests to see how it’d fare.
And, as it turns out, it fared pretty damn well. With no further fine-tuning, the robot—which is basically just a pair of legs—was able to walk in all directions, squat down while walking, right itself when pushed off balance, and adjust to different kinds of surfaces.
It’s the first time a machine learning approach known as reinforcement learning has been so successfully applied in two-legged robots.
This likely isn’t the first robot video you’ve seen, nor the most polished.
For years, the internet has been enthralled by videos of robots doing far more than walking and regaining their balance. All that is table stakes these days. Boston Dynamics, the heavyweight champ of robot videos, regularly releases mind-blowing footage of robots doing parkour, back flips, and complex dance routines. At times, it can seem the world of I, Robot is just around the corner.
This sense of awe is well-earned. Boston Dynamics is one of the world’s top makers of advanced robots.
But they still have to meticulously hand-program and choreograph the movements of the robots in their videos. This is a powerful approach, and the Boston Dynamics team has done incredible things with it.
In real-world situations, however, robots need to be robust and resilient. They need to regularly deal with the unexpected, and no amount of choreography will do. Which is where, it’s hoped, machine learning can help.
Reinforcement learning has been most famously exploited by Alphabet’s DeepMind to train algorithms that thrash humans at some of the most difficult games. Simplistically, it’s modeled on the way we learn. Touch the stove, get burned, don’t touch the damn thing again; say please, get a jelly bean, politely ask for another.
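As a toy illustration of that loop (unrelated to the Cassie work itself), a tabular Q-learning agent in a made-up five-state world learns by trial and error which action earns the reward; every number and the environment below are invented for the example.

```python
# Toy Q-learning sketch: the agent tries actions, gets rewarded or penalized,
# and slowly updates its estimate of which action pays off in each state.
import random

n_states, n_actions = 5, 2
Q = [[0.0] * n_actions for _ in range(n_states)]
alpha, gamma, epsilon = 0.1, 0.9, 0.2  # learning rate, discount, exploration rate

def step(state, action):
    """Made-up dynamics: action 1 moves toward the goal state, action 0 moves away."""
    next_state = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if next_state == n_states - 1 else -0.01  # jelly bean vs. small cost
    return next_state, reward

for episode in range(500):
    state = 0
    for _ in range(20):
        # Explore occasionally, otherwise exploit the best-known action.
        if random.random() < epsilon:
            action = random.randrange(n_actions)
        else:
            action = max(range(n_actions), key=lambda a: Q[state][a])
        next_state, reward = step(state, action)
        # Q-learning update: nudge the estimate toward reward + discounted future value.
        Q[state][action] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state

print(Q)  # after training, action 1 has the higher value in every state
```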
In Cassie’s case, the Berkeley team used reinforcement learning to train an algorithm to walk in a simulation. It’s not the first AI to learn to walk in this manner. But skills learned in simulation don’t always translate to the real world.
Subtle differences between the two can (literally) trip up a fledgling robot as it tries out its sim skills for the first time.
To overcome this challenge, the researchers used two simulations instead of one. The first simulation, an open source training environment called MuJoCo, was where the algorithm drew upon a large library of possible movements and, through trial and error, learned to apply them. The second simulation, called Matlab SimMechanics, served as a low-stakes testing ground that more precisely matched real-world conditions.
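The workflow can be sketched schematically like this (this is not the Berkeley team’s code, and the two “simulators” below are trivial stand-ins): a controller is tuned in the fast training sim, then must pass a check in the higher-fidelity sim before any hardware trial.

```python
# Schematic two-simulator workflow; both "simulators" here are trivial stand-ins.
import random

class FastSim:
    """Stand-in for the fast training environment (MuJoCo in the Berkeley work)."""
    target = 1.0
    def rollout(self, gain):
        # Hypothetical scalar "balance error" for a controller with this gain.
        return abs(self.target - gain) + random.gauss(0, 0.05)

class HighFidelitySim(FastSim):
    """Stand-in for the validation environment (SimMechanics in the Berkeley work):
    same task, but with dynamics the training sim did not capture."""
    target = 1.05  # slightly shifted dynamics, as in any sim-to-real gap

# Stage 1: "learn" in the fast sim by searching over a single policy parameter.
train_env = FastSim()
best_gain = min((g / 100 for g in range(50, 151)),
                key=lambda g: sum(train_env.rollout(g) for _ in range(20)))

# Stage 2: gate on the high-fidelity sim before deploying to the robot.
val_env = HighFidelitySim()
val_error = sum(val_env.rollout(best_gain) for _ in range(50)) / 50
print(f"gain={best_gain:.2f}, validation error={val_error:.3f}")
print("Ready for hardware trials." if val_error < 0.2 else "Back to training.")
```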
Once the algorithm was good enough, it graduated to Cassie.
And amazingly, it didn’t need further polishing. Said another way, when it was born into the physical world, it already knew how to walk just fine. It was also quite robust. The researchers write that two motors in Cassie’s knee malfunctioned during the experiment, but the robot was able to adjust and keep on trucking.
Other labs have been hard at work applying machine learning to robotics.
Last year Google used reinforcement learning to train a (simpler) four-legged robot. And OpenAI has used it with robotic arms. Boston Dynamics, too, will likely explore ways to augment their robots with machine learning. New approaches—like this one aimed at training multi-skilled robots or this one offering continuous learning beyond training—may also move the dial. It’s early yet, however, and there’s no telling when machine learning will exceed more traditional methods.
And in the meantime, Boston Dynamics bots are testing the commercial waters.
Still, robotics researchers who were not part of the Berkeley team think the approach is promising. Edward Johns, head of Imperial College London’s Robot Learning Lab, told MIT Technology Review, “This is one of the most successful examples I have seen.”
The Berkeley team hopes to build on that success by trying out “more dynamic and agile behaviors.” So, might a self-taught parkour-Cassie be headed our way? We’ll see.
Image Credit: University of California Berkeley Hybrid Robotics via YouTube