Tag Archives: nasa
#437598 Video Friday: Sarcos Is Developing a New ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.
NASA’s Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) spacecraft unfurled its robotic arm Oct. 20, 2020, and in a first for the agency, briefly touched an asteroid to collect dust and pebbles from the surface for delivery to Earth in 2023.
[ NASA ]
New from David Zarrouk’s lab at BGU is AmphiSTAR, which Zarrouk describes as “a kind of a ground-water drone inspired by the cockroaches (sprawling) and by the Basilisk lizard (running over water). The robot hovers due to the collision of its propellers with the water (hydrodynamics not aerodynamics). The robot can crawl and swim at high and low speeds and smoothly transition between the two. It can reach 3.5 m/s on ground and 1.5m/s in water.”
AmphiSTAR will be presented at IROS, starting next week!
[ BGU ]
This is unfortunately not a great video of a video that was taken at a SoftBank Hawks baseball game in Japan last week, but it’s showing an Atlas robot doing an honestly kind of impressive dance routine to support the team.
The humanoid robot ATLAS joins the robot cheering squad in an emergency remote appearance from America!!!
Enjoy the footage on Hawks Vision♪ #sbhawks #Pepper #spot pic.twitter.com/6aTYn8GGli
— Fukuoka SoftBank Hawks (official) (@HAWKS_official)
October 16, 2020
Editor’s Note: The tweet embed above is not working for some reason—see the video here.
[ SoftBank Hawks ]
Thanks Thomas!
Sarcos is working on a new robot, which looks to be the torso of their powered exoskeleton with the human relocated somewhere else.
[ Sarcos ]
The biggest holiday of the year, International Sloth Day, was on Tuesday! To celebrate, here’s Slothbot!
[ NSF ]
This is one of those simple-seeming tasks that are really difficult for robots.
I love self-resetting training environments.
[ MIT CSAIL ]
The Chiel lab collaborates with engineers at the Center for Biologically Inspired Robotics Research at Case Western Reserve University to design novel worm-like robots that have potential applications in search-and-rescue missions, endoscopic medicine, or other scenarios requiring navigation through narrow spaces.
[ Case Western ]
ANYbotics partnered with Losinger Marazzi to explore ANYmal’s potential for patrolling construction sites to identify and report safety issues. In such a complex environment, only a robot designed to navigate difficult terrain can bring digitalization to such a physically demanding industry.
[ ANYbotics ]
Happy Halloween 2018 from Clearpath Robotics!
[ Clearpath ]
Overcoming illumination variance is a critical factor in vision-based navigation. Existing methods have tackled this radical illumination variance by proposing camera control or high dynamic range (HDR) image fusion. Despite these efforts, we have found that vision-based approaches still struggle to overcome darkness. This paper presents real-time image synthesis from a carefully controlled seed low dynamic range (LDR) image, to enable visual simultaneous localization and mapping (SLAM) in an extremely dark environment (less than 10 lux).
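The paper's actual pipeline learns the synthesis, but the core idea of generating brighter variants from a single dark seed exposure can be sketched with a simple gain-in-linear-space model. This is a toy illustration, not KAIST's method; `synthesize_exposures` and its parameters are invented here:

```python
import numpy as np

def synthesize_exposures(seed_ldr, gains=(2.0, 4.0, 8.0), gamma=2.2):
    """Generate brightened variants of a dark seed LDR image.

    seed_ldr: float array in [0, 1] (a single low-light exposure).
    Each gain scales radiance in linear space before re-applying
    the display gamma, mimicking a longer exposure of the scene.
    """
    linear = np.power(seed_ldr, gamma)           # undo display gamma
    variants = []
    for g in gains:
        boosted = np.clip(linear * g, 0.0, 1.0)  # simulate longer exposure
        variants.append(np.power(boosted, 1.0 / gamma))
    return variants

# A tiny 2x2 "image" captured in near-darkness: mostly near-black pixels.
dark = np.array([[0.02, 0.05], [0.01, 0.08]])
for v in synthesize_exposures(dark):
    print(v.round(3))
```

A SLAM front end could then extract features from whichever synthesized exposure yields the most trackable points.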
[ KAIST ]
What can MoveIt do? Who knows! Let's find out!
[ MoveIt ]
Thanks Dave!
Here we pick a cube from a starting point, manipulate it within the hand, and then put it back. To explore the capabilities of the hand, no sensors were used in this demonstration. The RBO Hand 3 uses soft pneumatic actuators made of silicone. The softness imparts considerable robustness against variations in object pose and size. This lets us design manipulation funnels that work reliably without needing sensor feedback. We take advantage of this reliability to chain these funnels into more complex multi-step manipulation plans.
[ TU Berlin ]
If this was a real solar array, King Louie would have totally cleaned it. Mostly.
[ BYU ]
Autonomous exploration is a fundamental problem for various applications of unmanned aerial vehicles (UAVs). Existing methods, however, have been demonstrated to have low efficiency, due to the lack of optimality consideration, conservative motion plans, and low decision frequencies. In this paper, we propose FUEL, a hierarchical framework that can support Fast UAV ExpLoration in complex unknown environments.
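FUEL's incremental frontier information structure is considerably more sophisticated, but the baseline it improves on, greedy frontier-based exploration, fits in a few lines. Below is a simplified 2D sketch; the grid encoding and function names are assumptions for illustration, not FUEL's API:

```python
from collections import deque

# Grid cells: 0 = free, 1 = obstacle, -1 = unknown.
def frontiers(grid):
    """Free cells adjacent to at least one unknown cell."""
    h, w = len(grid), len(grid[0])
    out = []
    for r in range(h):
        for c in range(w):
            if grid[r][c] != 0:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and grid[nr][nc] == -1:
                    out.append((r, c))
                    break
    return out

def nearest_frontier(grid, start):
    """BFS through free space to the closest frontier cell."""
    targets = set(frontiers(grid))
    if not targets:
        return None                      # exploration finished
    seen, q = {start}, deque([start])
    while q:
        r, c = q.popleft()
        if (r, c) in targets:
            return (r, c)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                q.append((nr, nc))
    return None

grid = [[0, 0, -1],
        [0, 1, -1],
        [0, 0, -1]]
print(nearest_frontier(grid, (0, 0)))  # → (0, 1)
```

FUEL replaces this greedy nearest-frontier choice with a global tour over frontier clusters, which is a large part of where its efficiency gains come from.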
[ HKUST ]
Countless precise repetitions? This is the perfect task for a robot, thought researchers at the University of Liverpool in the Department of Chemistry, and without further ado they developed an automation solution that can carry out and monitor research tasks, making autonomous decisions about what to do next.
[ Kuka ]
This video shows a demonstration of central results of the SecondHands project. In the context of maintenance and repair tasks, in warehouse environments, the collaborative humanoid robot ARMAR-6 demonstrates a number of cognitive and sensorimotor abilities such as 1) recognition of the need of help based on speech, force, haptics and visual scene and action interpretation, 2) collaborative bimanual manipulation of large objects, 3) compliant mobile manipulation, 4) grasping known and unknown objects and tools, 5) human-robot interaction (object and tool handover) 6) natural dialog and 7) force predictive control.
[ SecondHands ]
In celebration of Ada Lovelace Day, Silicon Valley Robotics hosted a panel of Women in Robotics.
[ Robohub ]
As part of the upcoming virtual IROS conference, HEBI robotics is putting together a tutorial on robotics actuation. While I’m sure HEBI would like you to take a long look at their own actuators, we’ve been assured that no matter what kind of actuators you use, this tutorial will still be informative and useful.
[ YouTube ] via [ HEBI Robotics ]
Thanks Dave!
This week’s UMD Lockheed Martin Robotics Seminar comes from Julie Shah at MIT, on “Enhancing Human Capability with Intelligent Machine Teammates.”
Every team has top performers: people who excel at working in a team to find the right solutions in complex, difficult situations. These top performers include nurses who run hospital floors, emergency response teams, air traffic controllers, and factory line supervisors. While they may outperform the most sophisticated optimization and scheduling algorithms, they often cannot tell us how they do it. Similarly, even when a machine can do the job better than most of us, it can’t explain how. In this talk I share recent work investigating effective ways to blend the unique decision-making strengths of humans and machines. I discuss the development of computational models that enable machines to efficiently infer the mental state of human teammates and thereby collaborate with people in richer, more flexible ways.
[ UMD ]
Matthew Piccoli gives a talk to the UPenn GRASP Lab on “Trading Complexities: Smart Motors and Dumb Vehicles.”
We will discuss my research journey through Penn making the world's smallest, simplest flying vehicles, and in parallel making the most complex brushless motors. What do they have in common? We'll touch on why the quadrotor went from an obscure type of helicopter to the current ubiquitous drone. Finally, we'll get into my life after Penn and what tools I'm creating to further drone and robot designs of the future.
[ UPenn ]
#437592 Coordinated Robotics Wins DARPA SubT ...
DARPA held the Virtual Cave Circuit event of the Subterranean Challenge on Tuesday in the form of a several hour-long livestream. We got to watch (along with all of the competing teams) as virtual robots explored virtual caves fully autonomously, dodging rockfalls, spotting artifacts, scoring points, and sometimes running into stuff and falling over.
Expert commentary was provided by DARPA, and we were able to watch multiple teams running at once, skipping from highlight to highlight. It was really very well done (you can watch an archive of the entire stream here), but they made us wait until the very end to learn who won: First place went to Coordinated Robotics, with BARCS taking second, and third place going to newcomer Team Dynamo.
Huge congratulations to Coordinated Robotics! It’s worth pointing out that the top three teams were separated by an incredibly small handful of points, and on a slightly different day, with slightly different artifact positions, any of them could have come out on top. This doesn’t diminish Coordinated Robotics’ victory in the least—it means that the competition was fierce, and that the problem of autonomous cave exploration with robots has been solved (virtually, at least) in several different but effective ways.
We know Coordinated Robotics pretty well at this point, but here’s an introduction video:
You heard that right—Coordinated Robotics is just Kevin Knoedler, all by himself. This would be astonishing, if we weren’t already familiar with Kevin’s abilities: He won NASA’s virtual Space Robotics Challenge by himself in 2017, and Coordinated Robotics placed first in the DARPA SubT Virtual Tunnel Circuit and second in the Virtual Urban Circuit. We asked Kevin how he managed to do so spectacularly well (again), and here’s what he told us:
IEEE Spectrum: Can you describe what it was like to watch your team of robots on the live stream, and to see them score the most points?
Kevin Knoedler: It was exciting and stressful watching the live stream. It was exciting as the top few scores were quite close for the cave circuit. It was stressful because I started out behind and worked my way up, but did not do well on the final world. Luckily, not doing well on the first and last worlds was offset by better scores on many of the runs in between. DARPA did a very nice job with their live stream of the cave circuit results.
How did you decide on the makeup of your team, and on what sensors to use?
To decide on the makeup of the team I experimented with quite a few different vehicles. I had a lot of trouble with the X2 and other small ground vehicles flipping over. Based on that I looked at the larger ground vehicles that also had a sensor capable of identifying drop-offs. The vehicles that met those criteria for me were the Marble HD2, Marble Husky, Ozbot ATR, and the Absolem. Of those ground vehicles I went with the Marble HD2. It had a downward looking depth camera that I could use to detect drop-offs and was much more stable on the varied terrain than the X2. I had used the X3 aerial vehicle before and so that was my first choice for an aerial platform.
What were some things that you learned in Tunnel and Urban that you were able to incorporate into your strategy for Cave?
In the Tunnel circuit I had learned a strategy to use ground vehicles and in the Urban circuit I had learned a strategy to use aerial vehicles. At a high level that was the biggest thing I learned from the previous circuits that I was able to apply to the Cave circuit. At a lower level I was able to apply many of the development and testing strategies from the previous circuits to the Cave circuit.
What aspect of the cave environment was most challenging for your robots?
I would say it wasn't just one aspect of the cave environment that was challenging for the robots. There were quite a few challenging aspects of the cave environment. For the ground vehicles there were frequently paths that looked good as the robot started on the path, but turned into drop-offs or difficult boulder crawls. While it was fun to see the robot plan well enough to slowly execute paths over the boulders, I was wishing that the robot was smart enough to try a different path rather than wasting so much time crawling over the large boulders. For the aerial vehicles the combination of tight paths along with large vertical spaces was the biggest challenge in the environment. The large open vertical areas were particularly challenging for my aerial robots. They could easily lose track of their position without enough nearby features to track and it was challenging to find the correct path in and out of such large vertical areas.
How will you be preparing for the SubT Final?
To prepare for the SubT Final the vehicles will be getting a lot smarter. The ground vehicles will be better at navigation and communicating with one another. The aerial vehicles will be better able to handle large vertical areas both from a positioning and a planning point of view. Finally, all of the vehicles will do a better job coordinating what areas have been explored and what areas have good leads for further exploration.
Image: DARPA
The final score for the DARPA SubT Cave Circuit virtual competition.
We also had a chance to ask SubT program manager Tim Chung a few questions at yesterday’s post-event press conference, about the course itself and what he thinks teams should have learned from the competition:
IEEE Spectrum: Having looked through some real caves, can you give some examples of some of the most significant differences between this simulation and real caves? And with the enormous variety of caves out there, how generalizable are the solutions that teams came up with?
Tim Chung: Many of the caves that I’ve had to crawl through and gotten bumps and scrapes from had a couple of different features that I’ll highlight. The first is the variations in moisture— a lot of these caves were naturally formed with streams and such, so many of the caves we went to had significant mud, flowing water, and such. And so one of the things we're not capturing in the SubT simulator is explicitly anything that would submerge the robots, or otherwise short any of their systems. So from that perspective, that's one difference that's certainly notable.
And then the other difference I think is the granularity of the terrain, whether it's rubble, sand, or just raw dirt, friction coefficients are all across the board, and I think that's one of the things that any terrestrial simulator will both struggle with and potentially benefit from— that is, terramechanics simulation abilities. Given the emphasis on mobility in the SubT simulation, we’re capturing just a sliver of the complexity of terramechanics, but I think that's probably another take away that you'll certainly see— where there’s that distinction between physical and virtual technologies.
To answer your second question about generalizability— that’s the multi-million dollar question! It’s definitely at the crux of why we have eight diverse worlds, in size, verticality, dimensions, constrained passageways, etc. But this is eight out of countless variations, and the goal of course is to be able to investigate what those key dependencies are. What I'll say is that out of the seventy-three different virtual cave tiles, which are the building blocks that make up these virtual worlds, quite a number of them were not only inspired by real-world caves, but were specifically designed so that we can essentially use these tiles as unit tests going forward. So, if I want to simulate vertical inclines, here are the tiles that are the vertical unit tests for robots, and that’s how we’re trying to think through how to tease out that generalizability factor.
What are some observations from this event that you think systems track teams should pay attention to as they prepare for the final event?
One of the key things about the virtual competition is that you submit your software, and that's it. So you have to design everything from state management to failure mode triage, really thinking about what could go wrong and then building out your autonomous capabilities either to react to some of those conditions, or to anticipate them. And to be honest I think that the humans in the loop that we have in the systems competition really are key enablers of their capability, but also could someday (if not already) be a crutch that we might not be able to do without.
Thinking through some of the failure modes in a fully autonomous software deployment setting is going to be incredibly valuable for the systems competitors, so that for example the human supervisor doesn't have to worry about those failure modes as much, or can respond in a more supervisory way rather than trying to joystick the robot around. I think that's going to be one of the greatest impacts: thinking through what it means to send these robots off to autonomously get you the information you need and complete the mission.
This isn’t to say that the humans aren't going to be useful and continue to play a role of course, but I think this shifting of the role of the human supervisor from being a state manager to being more of a tactical commander will dramatically highlight the impact of the virtual side on the systems side.
What, if anything, should we take away from one person teams being able to do so consistently well in the virtual circuit?
It’s a really interesting question. I think part of it has to do with systems integration versus software integration. There's something to be said for the richness of the technologies that can be developed, and how many people it requires to be able to develop some of those technologies. With the systems competitors, having one person try to build, manage, deploy, service, and operate all of those robots is still functionally quite challenging, whereas in the virtual competition, it really is a software deployment more than anything else. And so I think the commonality of single person teams may just be a virtue of the virtual competition not having some of those person-intensive requirements.
In terms of their strong performance, I give credit to all of these really talented folks who are taking it upon themselves to jump into the competitor pool and see how well they do, and I think that just goes to show you that whether you're one person or ten people or a hundred people on a team, a good idea translated and executed well really goes a long way.
Looking ahead, teams have a year to prepare for the final event, which is still scheduled to be held sometime in fall 2021. And even though there was no cave event for systems track teams, the fact that the final event will be a combination of tunnel, urban, and cave circuits means that systems track teams have been figuring out how to get their robots to work in caves anyway, and we’ll be bringing you some of their stories over the next few weeks.
[ DARPA SubT ]
#437562 Video Friday: Aquanaut Robot Takes to ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
IROS 2020 – October 25-29, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Bay Area Robotics Symposium – November 20, 2020 – [Online]
ACRA 2020 – December 8-10, 2020 – [Online]
Let us know if you have suggestions for next week, and enjoy today's videos.
To prepare the Perseverance rover for its date with Mars, NASA’s Mars 2020 mission team conducted a wide array of tests to help ensure a successful entry, descent and landing at the Red Planet. From parachute verification in the world’s largest wind tunnel, to hazard avoidance practice in Death Valley, California, to wheel drop testing at NASA’s Jet Propulsion Laboratory and much more, every system was put through its paces to get ready for the big day. The Perseverance rover is scheduled to land on Mars on February 18, 2021.
[ JPL ]
Awesome to see Aquanaut—the “underwater transformer” we wrote about last year—take to the ocean!
Also their new website has SHARKS on it.
[ HMI ]
Nature has inspired engineers at UNSW Sydney to develop a soft fabric robotic gripper which behaves like an elephant's trunk to grasp, pick up and release objects without breaking them.
[ UNSW ]
Collaborative robots offer increased interaction capabilities at relatively low cost but, in contrast to their industrial counterparts, they inevitably lack precision. We address this problem by relying on a dual-arm system with laser-based sensing to measure relative poses between objects of interest and compensate for pose errors coming from robot proprioception.
[ Paper ]
Developed by NAVER LABS, with Korea University of Technology & Education (Koreatech), the robot arm now features an added waist, extending the available workspace, as well as a sensor head that can perceive objects. It has also been equipped with a robot hand, the “BLT Gripper,” that can switch between various grasping methods.
[ NAVER Labs ]
In case you were still wondering why SoftBank acquired Aldebaran and Boston Dynamics:
[ RobotStart ]
DJI's new Mini 2 drone is here with a commercial so hip it makes my teeth scream.
[ DJI ]
Using simple materials, such as plastic struts and cardboard rolls, the first prototype of the RBO Hand 3 is already capable of grasping a large range of different objects thanks to its opposable thumb.
The RBO Hand 3 performs an edge grasp before handing over the object to a person. The hand actively exploits constraints in the environment (the tabletop) for grasping the object. Thanks to its compliance, this interaction is safe and robust.
[ TU Berlin ]
Flyability's Elios 2 helped researchers inspect Reactor Five at the Chernobyl nuclear disaster site in order to determine whether any uranium was present. Prior to this mission, Reactor Five had not been investigated since the disaster in April of 1986.
[ Flyability ]
Thanks Zacc!
SOTO 2 is here! Together with our development partners from the industry, we have greatly enhanced the SOTO prototype over the last two years. With the new version of the robot, Industry 4.0 will become a great deal more real: SOTO brings materials to the assembly line, just-in-time and completely autonomously.
[ Magazino ]
A drone that can fly sustainably for long distances over land and water, and can land almost anywhere, will be able to serve a wide range of applications. There are already drones that fly using ‘green’ hydrogen, but they either fly very slowly or cannot land vertically. That’s why researchers at TU Delft, together with the Royal Netherlands Navy and the Netherlands Coastguard, developed a hydrogen-powered drone that is capable of vertical take-off and landing whilst also being able to fly horizontally efficiently for several hours, much like regular aircraft. The drone uses a combination of hydrogen and batteries as its power source.
[ MAVLab ]
The National Nuclear User Facility for Hot Robotics (NNUF-HR) is an EPSRC funded facility to support UK academia and industry to deliver ground-breaking, impactful research in robotics and artificial intelligence for application in extreme and challenging nuclear environments.
[ NNUF ]
At the Karolinska University Laboratory in Sweden, an innovation project based around an ABB collaborative robot has increased efficiency and created a better working environment for lab staff.
[ ABB ]
What I find interesting about DJI's enormous new agricultural drone is that its spinning obstacle-detecting sensor is a radar, not a lidar.
Also worth noting is that it seems to detect the telephone pole, but not the support wire that you can see in the video feed, although the visualization does make it seem like it can spot the power lines above.
[ DJI ]
Josh Pieper has spent the last year building his own quadruped, and you can see what he's been up to in just 12 minutes.
[ mjbots ]
Thanks Josh!
Dr. Ryan Eustice, TRI Senior Vice President of Automated Driving, delivers a keynote speech — “The Road to Vehicle Automation, a Toyota Guardian Approach” — to SPIE's Future Sensing Technologies 2020. During the presentation, Eustice provides his perspective on the current state of automated driving, summarizes TRI's Guardian approach — which amplifies human drivers, rather than replacing them — and summarizes TRI's recent developments in core AD capabilities.
[ TRI ]
Two excellent talks this week from UPenn GRASP Lab, from Ruzena Bajcsy and Vijay Kumar.
A panel discussion on the future of robotics and societal challenges with Dr. Ruzena Bajcsy as a Roboticist and Founder of the GRASP Lab.
In this talk I will describe the role of the White House Office of Science and Technology Policy in supporting science and technology research and education, and the lessons I learned while serving in the office. I will also identify a few opportunities at the intersection of technology and policy and broad societal challenges.
[ UPenn ]
The IROS 2020 “Perception, Learning, and Control for Autonomous Agile Vehicles” workshop is all online—here's the intro, but you can click through for a playlist that includes videos of the entire program, and slides are available as well.
[ NYU ]
#437460 This Week’s Awesome Tech Stories From ...
ARTIFICIAL INTELLIGENCE
A Radical New Technique Lets AI Learn With Practically No Data
Karen Hao | MIT Technology Review
“Shown photos of a horse and a rhino, and told a unicorn is something in between, [children] can recognize the mythical creature in a picture book the first time they see it. …Now a new paper from the University of Waterloo in Ontario suggests that AI models should also be able to do this—a process the researchers call ‘less than one’-shot, or LO-shot, learning.”
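The trick behind LO-shot learning is soft labels: each example can belong partially to several classes, so a distance-weighted classifier can represent more classes than it has examples. Here is a minimal sketch loosely following the paper's soft-label kNN idea; the function name and toy data are invented for illustration:

```python
import numpy as np

def soft_knn_predict(x, prototypes, soft_labels, eps=1e-9):
    """Distance-weighted soft-label nearest-neighbor classification.

    prototypes: (m, d) example points; soft_labels: (m, k) rows
    summing to 1. With soft labels, m prototypes can encode k > m
    classes.
    """
    d = np.linalg.norm(prototypes - x, axis=1)
    w = 1.0 / (d + eps)            # inverse-distance weights
    scores = w @ soft_labels       # blend the label distributions
    return int(np.argmax(scores))

# Two prototypes encode THREE classes: class 2 "lives between"
# classes 0 and 1, like the unicorn between the horse and the rhino.
protos = np.array([[0.0], [4.0]])
labels = np.array([[0.55, 0.0, 0.45],
                   [0.0, 0.55, 0.45]])
print([soft_knn_predict(np.array([x]), protos, labels)
       for x in (0.2, 3.8, 2.0)])  # → [0, 1, 2]
```

Points near either prototype take its dominant hard class, while points in between, where the weights balance, fall to the shared third class that no single prototype represents outright.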
FUTURE
Artificial General Intelligence: Are We Close, and Does It Even Make Sense to Try?
Will Douglas Heaven | MIT Technology Review
“A machine that could think like a person has been the guiding vision of AI research since the earliest days—and remains its most divisive idea. …So why is AGI controversial? Why does it matter? And is it a reckless, misleading dream—or the ultimate goal?”
HEALTH
The Race for a Super-Antibody Against the Coronavirus
Apoorva Mandavilli | The New York Times
“Dozens of companies and academic groups are racing to develop antibody therapies. …But some scientists are betting on a dark horse: Prometheus, a ragtag group of scientists who are months behind in the competition—and yet may ultimately deliver the most powerful antibody.”
SPACE
How to Build a Spacecraft to Save the World
Daniel Oberhaus | Wired
“The goal of the Double Asteroid Redirection Test, or DART, is to slam the [spacecraft] into a small asteroid orbiting a larger asteroid 7 million miles from Earth. …It should be able to change the asteroid’s orbit just enough to be detectable from Earth, demonstrating that this kind of strike could nudge an oncoming threat out of Earth’s way. Beyond that, everything is just an educated guess, which is exactly why NASA needs to punch an asteroid with a robot.”
TRANSPORTATION
Inside Gravity’s Daring Mission to Make Jetpacks a Reality
Oliver Franklin-Wallis | Wired
“The first time someone flies a jetpack, a curious thing happens: just as their body leaves the ground, their legs start to flail. …It’s as if the vestibular system can’t quite believe what’s happening. This isn’t natural. Then suddenly, thrust exceeds weight, and—they’re aloft. …It’s that moment, lift-off, that has given jetpacks an enduring appeal for over a century.”
FUTURE OF FOOD
Inside Singapore’s Huge Bet on Vertical Farming
Megan Tatum | MIT Technology Review
“…to cram all [of Singapore’s] gleaming towers and nearly 6 million people into a land mass half the size of Los Angeles, it has sacrificed many things, including food production. Farms make up no more than 1% of its total land (in the United States it’s 40%), forcing the small city-state to shell out around $10 billion each year importing 90% of its food. Here was an example of technology that could change all that.”
COMPUTING
The Effort to Build the Mathematical Library of the Future
Kevin Hartnett | Quanta
“Digitizing mathematics is a longtime dream. The expected benefits range from the mundane—computers grading students’ homework—to the transcendent: using artificial intelligence to discover new mathematics and find new solutions to old problems.”
Image credit: Kevin Mueller / Unsplash
#437230 How Drones and Aerial Vehicles Could ...
Drones, personal flying vehicles, and air taxis may be part of our everyday life in the very near future. Drones and air taxis will create new means of mobility and transport routes. Drones will be used for surveillance, delivery, and in the construction sector as it moves towards automation.
The introduction of these aerial craft into cities will require the built environment to change dramatically. Drones and other new aerial vehicles will require landing pads, charging points, and drone ports. They could usher in new styles of building, and lead to more sustainable design.
My research explores the impact of aerial vehicles on urban design, mapping out possible future trajectories.
An Aerial Age
Already, civilian drones can vary widely in size and complexity. They can carry a range of items from high-resolution cameras, delivery mechanisms, and thermal image technology to speakers and scanners. In the public sector, drones are used in disaster response and by the fire service to tackle fires which could endanger firefighters.
During the coronavirus pandemic, drones have been used by the police to enforce lockdown. Drones normally used in agriculture have sprayed disinfectant over cities. In the UK, drone delivery trials are taking place to carry medical items to the Isle of Wight.
Alongside drones, our future cities could also be populated by vertical takeoff and landing craft (VTOL), used as private vehicles and air taxis.
These vehicles are familiar to sci-fi fans. The late Syd Mead’s illustrations of the Spinner VTOL craft in the film Blade Runner captured the popular imagination, and the screens for the Spinners in Blade Runner 2049 created by Territory Studio provided a careful design fiction of the experience of piloting these types of vehicle.
Now, though, these flying vehicles are reality. A number of companies are developing eVTOL with electric multi-rotor jets, and a whole new motorsport is being established around them.
These aircraft have the potential to change our cities. However, they need to be tested extensively in urban airspace. A study conducted by Airbus found that public concerns about VTOL use focused on the safety of those on the ground and noise emissions.
New Cities
The widespread adoption of drones and VTOL will lead to new architecture and infrastructure. Existing buildings will require adaptations: landing pads, solar photovoltaic panels for energy efficiency, charging points for delivery drones, and landscaping to mitigate noise emissions.
A number of companies are already trialing drone delivery services. Existing buildings will need to be adapted to accommodate these new networks, and new design principles will have to be implemented in future ones.
The architect Saúl Ajuria Fernández has developed a design for a delivery drone port hub. This drone port acts like a beehive where drones recharge and collect parcels for distribution. Architectural firm Humphreys & Partners’ Pier 2, a design for a modular apartment building of the future, includes a cantilevered drone port for delivery services.
The Norman Foster Foundation has designed a drone port for delivery of medical supplies and other items for rural communities in Rwanda. The structure is also intended to function as a space for the public to congregate, as well as to receive training in robotics.
Drones may also help the urban environment become more sustainable. Researchers at the University of Stuttgart have developed a re-configurable architectural roof canopy system deployed by drones. By adjusting to follow the direction of the sun, the canopy provides shade and reduces reliance on ventilation systems.
Demand for air taxis and personal flying vehicles will develop where failures in other transport systems take place. The Airbus research found that of the cities surveyed, highest demand for VTOLs was in Los Angeles and Mexico City, urban areas famous for traffic pollution. To accommodate these aerial vehicles, urban space will need to transform to include landing pads, airport-like infrastructure, and recharge points.
Furthermore, this whole logistics system in lower airspace (below 500 feet), or what I term “hover space,” will need an urban traffic management system. One great example of how this hover space could work can be seen in a speculative project from design studio Superflux in their Drone Aviary project. A number of drones with different functions move around an urban area in a network, following different paths at varying heights.
We are at a critical period in urban history, faced by climatic breakdown and pandemic. Drones and aerial vehicles can be part of a profound rethink of the urban environment.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Image Credit: NASA