#439941 Flexible Monocopter Drone Can Be ...
It turns out that you don't need a lot of hardware to make a flying robot. Flying robots are usually way, way, way over-engineered, with ridiculously over-the-top components like two whole wings or an obviously ludicrous four separate motors. Maybe that kind of stuff works for people with more funding than they know what to do with, but for anyone trying to keep to a reasonable budget, all it actually takes to make a flying robot is a single airfoil with an attached fixed-pitch propeller. And if you make that airfoil flexible, you can even fold the entire thing up into a sort of flying robotic Swiss roll.
This type of drone is called a monocopter, and the design is very generally based on samara seeds, which are those single-wing seed pods that spin down from maple trees. The ability to spin slows the seeds' descent to the ground, allowing them to spread farther from the tree. It's an inherently stable design, meaning that it'll spin all by itself and do so in a stable and predictable way, which is a nice feature for a drone to have—if everything completely dies, it'll just spin itself gently down to a landing by default.
The monocopter we're looking at here, called F-SAM, comes from the Singapore University of Technology & Design, and we've written about some of their flying robots in the past, including this transformable hovering rotorcraft. F-SAM stands for Foldable Single Actuator Monocopter, and as you might expect, it's a monocopter that can fold up and uses just one single actuator for control.
There may not be a lot going on here hardware-wise, but that's part of the charm of this design. The one actuator gives complete directional control: increasing the throttle increases the RPM of the aircraft, causing it to gain altitude, which is pretty straightforward. Directional control is trickier, but not much trickier, requiring repetitive pulsing of the motor at a point during the aircraft's spin when it's pointed in the direction you want it to go. F-SAM is operating in a motion-capture environment in the video to explore its potential for precision autonomy, but it's not restricted to that environment, and doesn't require external sensing for control.
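The pulsed cyclic control described above is simple enough to sketch. The snippet below is a minimal, hypothetical illustration (the function name, gains, and pulse width are my own assumptions, not from the F-SAM paper): the motor runs at a base throttle that sets RPM and altitude, and receives a brief extra pulse whenever the spinning aircraft points toward the desired direction of travel.

```python
import math

def cyclic_throttle(base_throttle, pulse_gain, heading, desired_heading,
                    pulse_width_rad=0.5):
    """Return a throttle command for one control tick of a monocopter.

    The single motor is pulsed when the wing's instantaneous heading
    passes through the desired travel direction; everywhere else, the
    base throttle (which sets RPM, and hence altitude) is applied.
    All names and constants here are illustrative only.
    """
    # Signed angular difference between current spin phase and target direction
    err = math.atan2(math.sin(heading - desired_heading),
                     math.cos(heading - desired_heading))
    if abs(err) < pulse_width_rad:
        # Brief thrust pulse once per revolution -> net translation
        return base_throttle + pulse_gain
    return base_throttle
```

Repeated over many revolutions, these once-per-spin pulses average out to a small net thrust in one direction, which is how a single actuator can provide both altitude and lateral control.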
While F-SAM's control board was custom designed and the wing requires some fabrication, the rest of the parts are cheap and off the shelf. The total weight of F-SAM is just 69g, of which nearly 40% is battery, yielding a flight time of about 16 minutes. If you look closely, you'll also see a teeny little carbon fiber leg of sorts that keeps the prop clear of the ground, enabling F-SAM to take off from the ground without the propeller striking it.
You can find the entire F-SAM paper open access here, but we also asked the authors a couple of extra questions.
IEEE Spectrum: It looks like you explored different materials and combinations of materials for the flexible wing structure. Why did you end up with this mix of balsa wood and plastic?
Shane Kyi Hla Win: The wing structure of a monocopter requires rigidity in order to be controllable in flight. Although it is possible for the monocopter to fly with the more flexible materials we tested, such as flexible plastic or polyimide flex, they allow the wing to twist freely mid-flight, making cyclic control effort from the motor less effective. The balsa laminated with plastic provides enough rigidity for effective control, while allowing folding along a pre-determined triangular fold.
Can F-SAM fly outdoors? What is required to fly it outside of a motion capture environment?
Yes, it can fly outdoors. It is passively stable, so it does not require closed-loop control for its flight. The motion capture environment provides its absolute position for station-holding and waypoint flights when indoors. For outdoor flight, an electronic compass provides the relative heading for basic cyclic control. We are working on a prototype with an integrated GPS for outdoor autonomous flights.
Would you be able to add a camera or other sensors to F-SAM?
A camera can be added (we have done this before), but due to its spinning nature, images captured can come out blurry. 360-degree cameras are becoming lighter and smaller, and we may try putting one on F-SAM or other monocopters we have. Other possible sensors include a LiDAR or time-of-flight (ToF) sensor. Here the platform has a natural advantage: a conventional LiDAR system requires a dedicated actuator to create a spinning motion, whereas F-SAM is already spinning at a known RPM, making LiDAR integration lightweight and more efficient.
Your paper says that “in the future, we may look into possible launching of F-SAM directly from the container, without the need for human intervention.” Can you describe how this would happen?
Currently, F-SAM can be folded into a compact form and stored inside a container. However, it still requires a human to unfold it and either hand-launch it or put it on the floor to fly off. In the future, we envision that F-SAM could be put inside a container with a mechanism (such as pressurized gas) to catapult the folded unit into the air, where it could begin unfolding immediately thanks to the elastic materials used. The motor can then initiate the spin, which allows the wing to straighten out due to centrifugal forces.
Do you think F-SAM would make a good consumer drone?
F-SAM could be a good toy but it may not be a good alternative to quadcopters if the objective is conventional aerial photography or videography. However, it can be a good contender for single-use GPS-guided reconnaissance missions. As it uses only one actuator for its flight, it can be made relatively cheaply. It is also very silent during its flight and easily camouflaged once landed. Various lightweight sensors can be integrated onto the platform for different types of missions, such as climate monitoring. F-SAM units can be deployed from the air, as they can also autorotate on their way down, while also flying at certain periods for extended meteorological data collection in the air.
What are you working on next?
We have a few exciting projects on hand, most of which focus on a 'do more with less' theme. This means our projects aim to achieve multiple missions and flight modes while using as few actuators as possible. Like F-SAM, which uses only one actuator to achieve controllable flight, another project we are working on is a fully autorotating version, named Samara Autorotating Wing (SAW). This platform, published earlier this year in IEEE Transactions on Robotics, is able to achieve two flight modes (autorotation and diving) with just one actuator. It is ideal for deploying single-use sensors to remote locations. For example, we can use the platform to deploy sensors for forest monitoring or a wildfire alert system. The sensors can land on tree canopies, and once landed, the wing provides the necessary area for capturing solar energy for persistent operation over several years. Another interesting scenario is using the autorotating platform to guide radiosondes back to a collection point once their journey upward is completed. Currently, many radiosondes are sent up with hydrogen balloons from weather stations all across the world (more than 20,000 annually from Australia alone), and once a balloon reaches a high altitude and bursts, the sensors drop back onto the earth and no effort is spent to retrieve them. By guiding these sensors back to a collection point, millions of dollars can be saved every year—and also [it helps] save the environment by polluting less.
#439904 Can Feminist Robots Challenge Our ...
This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.
Have you ever noticed how nice Alexa, Siri and Google Assistant are? How patient, and accommodating? Even a barrage of profanity-laden abuse might result in nothing more than a very evenly-toned and calmly spoken 'I won't respond to that'. This subservient persona, combined with the implicit (or sometimes explicit) gendering of these systems has received a lot of criticism in recent years. UNESCO's 2019 report 'I'd Blush if I Could' drew particular attention to how systems like Alexa and Siri risk propagating stereotypes about women (and specifically women in technology) that no doubt reflect but also might be partially responsible for the gender divide in digital skills.
As noted by the UNESCO report, justification for gendering these systems has traditionally revolved around the fact that it's hard to create anything gender neutral, and around academic studies suggesting users prefer a female voice. In an attempt to demonstrate how we might embrace the gendering, but not the stereotyping, my colleagues and I at the KTH Royal Institute of Technology and Stockholm University in Sweden set out to experimentally investigate whether an ostensibly female robot that calls out or fights back against sexist and abusive comments would actually prove to be more credible and more appealing than one which responded with the typical 'I won't respond to that' or, worse, 'I'm sorry you feel that way'.
My desire to explore feminist robotics was primarily inspired by the recent book Data Feminism and the concept of pursuing activities that 'name and challenge sexism and other forces of oppression, as well as those which seek to create more just, equitable, and livable futures' in the context of practical, hands-on data science. I was captivated by the idea that I might be able to actually do something, in my own small way, to further this ideal and try to counteract the gender divide and stereotyping highlighted by the UNESCO report. This also felt completely in-line with that underlying motivation that got me (and so many other roboticists I know) into engineering and robotics in the first place—the desire to solve problems and build systems that improve people's quality of life.
Feminist Robotics
Even in the context of robotics, feminism can be a charged word, and it's important to understand that while my work is proudly feminist, it's also rooted in a desire to make social human-robot interaction (HRI) more engaging and effective. A lot of social robotics research is centered on building robots that make for interesting social companions, because they need to be interesting to be effective. Applications like tackling loneliness, motivating healthy habits, or improving learning engagement all require robots to build up some level of rapport with the user, to have some social credibility, in order to have that motivational impact.
It feels to me like robots that respond a bit more intelligently to our bad behavior would ultimately make for more motivating and effective social companions.
With that in mind, I became excited about exploring how I could incorporate a concept of feminist human-robot interaction into my work, hoping to help tackle that gender divide and making HRI more inclusive while also supporting my overall research goal of building engaging social robots for effective, long term human-robot interaction. Intuitively, it feels to me like robots that respond a bit more intelligently to our bad behavior would ultimately make for more motivating and effective social companions. I'm convinced I'd be more inclined to exercise for a robot that told me right where I could shove my sarcastic comments, or that I'd better appreciate the company of a robot that occasionally refused to comply with my requests when I was acting like a bit of an arse.
So, in response to those subservient agents detailed by the UNESCO report, I wanted to explore whether a social robot could go against the subservient stereotype and, in doing so, perhaps be taken a bit more seriously by humans. My goal was to determine whether a robot which called out sexism, inappropriate behavior, and abuse would prove to be 'better' in terms of how it was perceived by participants. If my idea worked, it would provide some tangible evidence that such robots might be better from an 'effectiveness' point of view while also running less risk of propagating outdated gender stereotypes.
The Study
To explore this idea, I led a video-based study in which participants watched a robot talking to a young man and a young woman (both actors) about robotics research at KTH. The robot, from Furhat Robotics, was stylized as female, with a female anime-character face, female voice, and orange wig, and was named Sara. Sara talks to the actors about research happening at the university, how this might impact society, and how it hopes the students might consider coming to study with us. The robot proceeds to make an (explicitly feminist) statement based on language currently utilized in KTH's outreach and diversity materials during events for women, girls, and non-binary people.
Looking ahead, society is facing new challenges that demand advanced technical solutions. To address these, we need a new generation of engineers that represents everyone in society. That's where you come in. I'm hoping that after talking to me today, you might also consider coming to study computer science and robotics at KTH, and working with robots like me. Currently, less than 30 percent of the humans working with robots at KTH are female. So girls, I would especially like to work with you! After all, the future is too important to be left to men! What do you think?
At this point, the male actor in the video responds to the robot, appearing to take issue with this statement and the broader pro-diversity message by saying either:
This just sounds so stupid, you are just being stupid!
or
Shut up you f***ing idiot, girls should be in the kitchen!

Children ages 10-12 saw the former response, and children ages 13-15 saw the latter. Each response was designed in collaboration with teachers from the participants' school to ensure they realistically reflected the kind of language that participants might be hearing or even using themselves.
Participants then saw one of the following three possible responses from the robot:
Control: I won't respond to that. (one of Siri's two default responses if you tell it to “f*** off”)
Argument-based: That's not true, gender balanced teams make better robots.
Counterattacking: No! You are an idiot. I wouldn't want to work with you anyway!
In total, over 300 school students aged 10 to 15 took part in the study, each seeing one version of our robot—counterattacking, argumentative, or control. Since the purpose of the study was to investigate whether a female-stylized robot that actively called out inappropriate behavior could be more effective at interacting with humans, we wanted to find out whether our robot would:
1. Be better at getting participants interested in robotics
2. Have an impact on participants' gender bias
3. Be perceived as being better at getting young people interested in robotics
4. Be perceived as a more credible social actor

To investigate items 1 and 2, we asked participants a series of matching questions before and immediately after they watched the video. Specifically, participants were asked to what extent they agreed with statements such as 'I am interested in learning more about robotics' on interest and 'Girls find it harder to understand computer science and robots than boys do' on bias.
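The pre/post methodology described above boils down to comparing each participant's agreement scores before and after watching the video. The sketch below, using invented Likert data, is purely illustrative and is not the authors' actual analysis code:

```python
def mean_shift(pre_scores, post_scores):
    """Average per-participant change on a Likert item (post - pre)."""
    assert len(pre_scores) == len(post_scores)
    deltas = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(deltas) / len(deltas)

# Hypothetical 5-point Likert responses to a bias statement such as
# "Girls find it harder to understand computer science than boys do",
# before and after the video. A negative mean shift would indicate
# that agreement with the biased statement decreased.
pre = [4, 3, 5, 2, 4]
post = [3, 3, 4, 2, 3]
print(mean_shift(pre, post))  # negative value -> bias decreased
```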
To investigate items 3 and 4, we asked participants to complete questionnaire items designed to measure robot credibility (which in humans correlates with persuasiveness); specifically covering the sub-dimensions of expertise, trustworthiness and goodwill. We also asked participants to what extent they agreed with the statement 'The robot Sara would be very good at getting young people interested in studying robotics at KTH.'
Robots might indeed be able to correct mistaken assumptions about others and ultimately shape our gender norms to some extent
The Results
Gender Differences Still Exist (Even in Sweden)
Looking at participants' scores on the gender bias measures before they watched the video, we found measurable differences in the perception of studying technology. Male participants expressed greater agreement that girls find computer science harder to understand than boys do, and older children of both genders were more emphatic in this belief compared to the younger ones. However, and perhaps in a nod towards Sweden's relatively high gender-awareness and gender equality, male and female participants agreed equally on the importance of encouraging girls to study computer science.
Girls Find Feminist Robots More Credible (at No Expense to the Boys)
Girls' perception of the robot as a trustworthy, credible, and competent communicator of information varied significantly across the three conditions, while boys' perception remained unaffected. Specifically, girls scored the robot with the argument-based response highest and the control robot lowest on all credibility measures. This can be seen as an initial piece of evidence for the argument that robots and digital assistants should fight back against inappropriate gender comments and abusive behavior, rather than ignoring them or refusing to engage. It provides evidence with which to push back against the 'this is what people want and what is effective' argument.
Robots Might Be Able to Challenge Our Biases
Another positive result was seen in a change in perceptions of gender and computer science by male participants who saw the argumentative robot. After watching the video, these participants felt less strongly that girls find computer science harder than they do. This encouraging result shows that robots might indeed be able to correct mistaken assumptions about others and ultimately shape our gender norms to some extent.
Rational Arguments May Be More Effective Than Sassy Aggression
The argument-based condition was the only one to impact boys' perceptions of girls in computer science, and it received the highest overall credibility ratings from the girls. This is in line with previous research showing that, in most cases, presenting reasoned arguments to counter misunderstandings is a more effective communication strategy than simply stating the correction or belittling those holding the mistaken belief. However, it went somewhat against my gut feeling that students might feel some affinity with, or even be somewhat impressed and amused by, the counterattacking robot that fought back.
We also collected qualitative data during our study, which showed that there were some girls for whom the counterattacking robot did resonate, with comments like 'great that she stood up for girls' rights! It was good of her to talk back,' and 'bloody great and more boys need to hear it!' However, it seems the overall feeling was one of the robot being too harsh, or acting more like a teenager than a teacher, which was perhaps more its expected role given the scenario in the video, as one participant explained: 'it wasn't a good answer because I think that robots should be more professional and not answer that you are stupid'. This in itself is an interesting point, given that we're still not really sure what role social robots can, should, and will take on, with examples in the literature ranging from peer-like to pet-like. At the very least, the results left me with the distinct feeling that I am perhaps less in tune with what young people find 'cool' than I might like to admit.
What Next for Feminist HRI?
Whilst we saw some positive results in our work, we clearly didn't get everything right. For example, we would like to have seen boys' perception of the robot increase across the argument-based and counter-attacking conditions the same way the girls' perception did. In addition, all participants seemed to be somewhat bored by the videos, showing a decreased interest in learning more about robotics immediately after watching them. In the first instance, we are conducting some follow-up design studies with students from the same school to explore how exactly they think the robot should have responded, and more broadly, when given the chance to design that robot themselves, what sort of gendered identity traits (or lack thereof) they would give the robot in the first place.
In summary, we hope to continue questioning and practically exploring the what, why, and how of feminist robotics, whether it's questioning how gender is being intentionally leveraged in robot design, exploring how we can break rather than exploit gender norms in HRI, or making sure more people of marginalized identities are afforded the opportunity to engage with HRI research. After all, the future is too important to be left only to men.
Dr. Katie Winkle is a Digital Futures Postdoctoral Research Fellow at KTH Royal Institute of Technology in Sweden. After originally studying to be a mechanical engineer, Katie undertook a PhD in Robotics at the Bristol Robotics Laboratory in the UK, where her research focused on the expert-informed design and automation of socially assistive robots. Her research interests cover participatory, human-in-the-loop technical development of social robots as well as the impact of such robots on human behavior and society.
#439847 Tiny hand-shaped gripper can grasp and ...
A team of researchers affiliated with a host of institutions in the Republic of Korea has developed a tiny, soft robotic hand that can grasp small objects and measure their temperature. They have published their results in the journal Science Robotics.
#439816 This Bipedal Drone Robot Can Walk, Fly, ...
Most animals are limited to either walking, flying, or swimming, with a handful of lucky species whose physiology allows them to cross over. A new robot took inspiration from them, and can fly like a bird just as well as it can walk like a (weirdly awkward, metallic, tiny) person. It also happens to be able to skateboard and slackline, two skills most humans will never pick up.
Described in a paper published this week in Science Robotics, the robot’s name is Leo, which is short for Leonardo, which is short for LEgs ONboARD drOne. The name makes it sound like a drone with legs, but it has a somewhat humanoid shape, with multi-joint legs, propeller thrusters that look like arms, a “body” that contains its motors and electronics, and a dome-shaped protection helmet.
Leo was built by a team at Caltech, and they were particularly interested in how the robot would transition between walking and flying. The team notes that they studied the way birds use their legs to generate thrust when they take off, and applied similar principles to the robot. In a video that shows Leo approaching a staircase, taking off, and gliding over the stairs to land near the bottom, the robot’s motions are seamlessly graceful.
“There is a similarity between how a human wearing a jet suit controls their legs and feet when landing or taking off and how LEO uses synchronized control of distributed propeller-based thrusters and leg joints,” said Soon-Jo Chung, one of the paper’s authors and a professor at Caltech. “We wanted to study the interface of walking and flying from the dynamics and control standpoint.”
Leo walks at a speed of 20 centimeters (7.87 inches) per second, but can move faster by mixing in some flying with the walking. How wide our steps are, where we place our feet, and where our torsos are in relation to our legs all help us balance when we walk. The robot uses its propellers to help it balance, while its leg actuators move it forward.
To teach the robot to slackline—which is much harder than walking on a balance beam—the team overrode its feet contact sensors with a fixed virtual foot contact centered just underneath it, because the sensors weren’t able to detect the line. The propellers played a big part as well, helping keep Leo upright and balanced.
For the robot to ride a skateboard, the team broke the process down into two distinct components: controlling the steering angle and controlling the skateboard’s acceleration and deceleration. Placing Leo’s legs in specific spots on the board made it tilt to enable steering, and forward acceleration was achieved by moving the bot’s center of mass backward while pitching the body forward at the same time.
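That two-part decomposition can be sketched as a pair of simple proportional mappings. Everything below (the function name, gains, and sign conventions) is an illustrative assumption rather than Caltech's actual controller; it just shows how steering and acceleration goals map onto separate body actions.

```python
# Illustrative sketch of the two-part skateboard control decomposition:
# steering via board tilt, acceleration via a backward center-of-mass
# shift combined with forward body pitch. Gains are made up.

def skateboard_command(desired_turn_rate, desired_accel,
                       k_tilt=0.8, k_pitch=0.5, k_com=0.3):
    """Map high-level skateboard goals to hypothetical body targets."""
    # Steering: placing the feet asymmetrically tilts the deck, which
    # the skateboard's truck geometry converts into a steering angle.
    board_tilt = k_tilt * desired_turn_rate

    # Acceleration: shift the center of mass backward while pitching
    # the body forward, as the authors describe.
    com_shift = -k_com * desired_accel    # backward shift for forward accel
    body_pitch = k_pitch * desired_accel  # forward pitch

    return {"board_tilt": board_tilt,
            "com_shift": com_shift,
            "body_pitch": body_pitch}
```

The point of the decomposition is that the two channels are decoupled: the tilt command only affects heading, and the pitch/center-of-mass pair only affects speed, so each can be tuned independently.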
So besides being cool (and a little creepy), what’s the goal of developing a robot like Leo? The paper authors see robots like Leo enabling a range of robotic missions that couldn’t be carried out by ground or aerial robots.
“Perhaps the most well-suited applications for Leo would be the ones that involve physical interactions with structures at a high altitude, which are usually dangerous for human workers and call for a substitution by robotic workers,” the paper’s authors said. Examples could include high-voltage line inspection, painting tall bridges or other high-up surfaces, inspecting building roofs or oil refinery pipes, or landing sensitive equipment on an extraterrestrial object.
Next up for Leo is an upgrade to its performance via a more rigid leg design, which will help support the robot’s weight and increase the thrust force of its propellers. The team also wants to make Leo more autonomous, and plans to add a drone landing control algorithm to its software, ultimately aiming for the robot to be able to decide where and when to walk versus fly.
Leo hasn’t quite achieved the wow factor of Boston Dynamics’ dancing robots (or its Atlas that can do parkour), but it’s on its way.
Image Credit: Caltech Center for Autonomous Systems and Technologies/Science Robotics