The World’s First 3D Printed School ...
3D printed houses have been popping up all over the map. Some are hive-shaped, some can float, some are up for sale. Now this practical, cost-cutting technology is being employed for another type of building: a school.
Located on the island of Madagascar, the project is a collaboration between San Francisco-based architecture firm Studio Mortazavi and Thinking Huts, a nonprofit whose mission is to increase global access to education through 3D printing. The school will be built on the campus of a university in Fianarantsoa, a city in the south central area of the island nation.
According to the World Economic Forum, lack of physical infrastructure is one of the biggest barriers to education. Building schools requires not only funds, human capital, and building materials, but also community collaboration and ongoing upkeep and maintenance. For people to feel good about sending their kids to school each day, the buildings should be conveniently located, appealing, comfortable to spend several hours in, and of course safe. All of this is harder to accomplish than you might think, especially in low-income areas.
Because of its comparatively low cost and quick turnaround time, 3D printing has been lauded as a possible solution to housing shortages and a tool to aid in disaster relief. Cost details of the Madagascar school haven’t been released, but if 3D printed houses can go up in a day for under $10,000 or list at a much lower price than their non-3D-printed neighbors, it’s safe to say that 3D printing a school is likely substantially cheaper than building it through traditional construction methods.
The school’s modular design resembles a honeycomb, where as few or as many nodes as needed can be linked together. Each node consists of a room with two bathrooms, a closet, and a front and rear entrance. The Fianarantsoa school will have just one node to start with, but since local technologists will participate in the building process, they’ll learn the ins and outs of 3D printing and subsequently be able to add new nodes or build similar schools in other areas.
Artist rendering of the completed school. Image Credit: Studio Mortazavi/Thinking Huts
The printer for the project is coming from Hyperion Robotics, a Finnish company that specializes in 3D printing solutions for reinforced concrete. The building’s walls will be made of layers of a special cement mixture that Thinking Huts says emits less carbon dioxide than traditional concrete. The roof, doors, and windows will be sourced locally, and the whole process can be completed in less than a week, another major advantage over traditional building methods.
“We can build these schools in less than a week, including the foundation and all the electrical and plumbing work that’s involved,” said Amir Mortazavi, lead architect on the project. “Something like this would typically take months, if not even longer.”
The roof of the building will be equipped with solar panels to provide the school with power, and in a true melding of modern technology and traditional design, the pattern of its walls is based on Malagasy textiles.
Thinking Huts considered seven different countries for its first school, and ended up choosing Madagascar for the pilot based on its need for education infrastructure, stable political outlook, opportunity for growth, and renewable energy potential. However, the team is hoping the pilot will be the first of many similar projects across multiple countries. “We can use this as a case study,” Mortazavi said. “Then we can go to other countries around the world and train the local technologists to use the 3D printer and start a nonprofit there to be able to build schools.”
Construction of the school will take place in the latter half of this year, with hopes of getting students into the classroom as soon as the pandemic is no longer a major threat to the local community’s health.
Image Credit: Studio Mortazavi/Thinking Huts
When Robots Enter the World, Who Is ...
Over the last half decade or so, the commercialization of autonomous robots that can operate outside of structured environments has dramatically increased. But this relatively new transition of robotic technologies from research projects to commercial products comes with its share of challenges, many of which relate to the rapidly increasing visibility that these robots have in society.
Whether it's because of their appearance of agency, or because of their history in popular culture, robots frequently inspire people’s imagination. Sometimes this is a good thing, like when it leads to innovative new use cases. And sometimes this is a bad thing, like when it leads to use cases that could be classified as irresponsible or unethical. Can the people selling robots do anything about the latter? And even if they can, should they?
Roboticists understand that robots, fundamentally, are tools. We build them, we program them, and even the autonomous ones are just following the instructions that we’ve coded into them. However, that same appearance of agency that makes robots so compelling means that it may not be clear to people without much experience with or exposure to real robots that a robot itself isn’t inherently good or bad—rather, as a tool, a robot is a reflection of its designers and users.
This can put robotics companies into a difficult position. When they sell a robot to someone, that person can, hypothetically, use the robot in any way they want. Of course, this is the case with every tool, but it’s the autonomous aspect that makes robots unique. I would argue that autonomy brings with it an implied association between a robot and its maker, or in this case, the company that develops and sells it. I’m not saying that this association is necessarily a reasonable one, but I think that it exists, even if that robot has been sold to someone else who has assumed full control over everything it does.
“All of our buyers, without exception, must agree that Spot will not be used to harm or intimidate people or animals, as a weapon or configured to hold a weapon”
—Robert Playter, Boston Dynamics
Robotics companies are certainly aware of this, because many of them are very careful about who they sell their robots to, and very explicit about what they want their robots to be doing. But once a robot is out in the wild, as it were, how far should that responsibility extend? And realistically, how far can it extend? Should robotics companies be held accountable for what their robots do in the world, or should we accept that once a robot is sold to someone else, responsibility is transferred as well? And what can be done if a robot is being used in an irresponsible or unethical way that could have a negative impact on the robotics community?
For perspective on this, we contacted folks from three different robotics companies, each of which has experience selling distinctive mobile robots to commercial end users. We asked them the same five questions about the responsibility that robotics companies have regarding the robots that they sell, and here’s what they had to say:
Do you have any restrictions on what people can do with your robots? If so, what are they, and if not, why not?
Péter Fankhauser, CEO, ANYbotics:
We work closely with our customers to make sure that our solution provides the right approach for their problem. The target use case is therefore clear from the beginning, and we do not work with customers interested in using our robot ANYmal outside its intended applications. Specifically, we strictly exclude any military or weaponized uses; since the foundation of ANYbotics, making human work easier, safer, and more enjoyable has been close to our heart.
Robert Playter, CEO, Boston Dynamics:
Yes, we have restrictions on what people can do with our robots, which are outlined in our Terms and Conditions of Sale. All of our buyers, without exception, must agree that Spot will not be used to harm or intimidate people or animals, as a weapon or configured to hold a weapon. Spot, just like any product, must be used in compliance with the law.
Ryan Gariepy, CTO, Clearpath Robotics:
We do have strict restrictions and KYC processes which are based primarily on Canadian export control regulations. They depend on the type of equipment sold as well as where it is going. More generally, we also will not sell or support a robot if we know that it will create an uncontrolled safety hazard or if we have reason to believe that the buyer is unqualified to use the product. And, as always, we do not support using our products for the development of fully autonomous weapons systems.
More broadly, if you sell someone a robot, why should they be restricted in what they can do with it?
Péter Fankhauser, ANYbotics: We see the robot less as a simple object and more as an artificial workforce. To us, this means that usage is closely coupled with the transfer of the robot, and that both the customer and the provider agree on what the robot is expected to do. This approach is supported by what we hear from our customers, who show increasing interest in paying for robots as a service or per use.
Robert Playter, Boston Dynamics: We’re offering a product for sale. We’re going to do the best we can to stop bad actors from using our technology for harm, but we don’t have the control to regulate every use. That said, we believe that our business will be best served if our technology is used for peaceful purposes—to work alongside people as trusted assistants and remove them from harm’s way. We do not want to see our technology used to cause harm or promote violence. Our restrictions are similar to those of other manufacturers or technology companies that take steps to reduce or eliminate the violent or unlawful use of their products.
Ryan Gariepy, Clearpath Robotics: Assuming the organization doing the restricting is a private organization and the robot and its software is sold vs. leased or “managed,” there aren't strong legal reasons to restrict use. That being said, the manufacturer likewise has no obligation to continue supporting that specific robot or customer going forward. However, given that we are only at the very edge of how robots will reshape a great deal of society, it is in the best interest for the manufacturer and user to be honest with each other about their respective goals. Right now, you're not only investing in the initial purchase and relationship, you're investing in the promise of how you can help each other succeed in the future.
“If a robot is being used in a way that is irresponsible due to safety: intervene! If it’s unethical: speak up!”
—Péter Fankhauser, ANYbotics
What can you realistically do to make sure that people who buy your robots use them in the ways that you intend?
Péter Fankhauser, ANYbotics: We maintain a close collaboration with our customers to ensure their success with our solution, so we have refrained from technical measures to block unintended use.
Robert Playter, Boston Dynamics: We vet our customers to make sure that their desired applications are things that Spot can support, and are in alignment with our Terms and Conditions of Sale. We’ve turned away customers whose applications aren’t a good match with our technology. If customers misuse our technology, we’re clear in our Terms of Sale that their violations may void our warranty and prevent their robots from being updated, serviced, repaired, or replaced. We may also repossess robots that are not purchased, but leased. Finally, we will refuse future sales to customers that violate our Terms of Sale.
Ryan Gariepy, Clearpath Robotics: We typically work with our clients ahead of the purchase to make sure their expectations match reality, in particular on aspects like safety, supervisory requirements, and usability. It's far worse to sell a robot that'll sit on a shelf, or worse, cause harm, than to not sell a robot at all, so we prefer to reduce the risk of this situation in advance of receiving an order or shipping a robot.
How do you evaluate the merit of edge cases, for example if someone wants to use your robot in research or art that may push the boundaries of what you personally think is responsible or ethical?
Péter Fankhauser, ANYbotics: It’s about the dialog, understanding, and figuring out alternatives that work for all involved parties; the earlier you can have this dialog, the better.
Robert Playter, Boston Dynamics: There’s a clear line between exploring robots in research and art, and using the robot for violent or illegal purposes.
Ryan Gariepy, Clearpath Robotics: We have sold thousands of robots to hundreds of clients, and I do not recall the last situation that was not covered by a combination of export control and a general evaluation of the client's goals and expectations. I'm sure this will change as robots continue to drop in price and increase in flexibility and usability.
“You're not only investing in the initial purchase and relationship, you're investing in the promise of how you can help each other succeed in the future.”
—Ryan Gariepy, Clearpath Robotics
What should roboticists do if we see a robot being used in a way that we feel is unethical or irresponsible?
Péter Fankhauser, ANYbotics: If it’s irresponsible due to safety: intervene! If it’s unethical: speak up!
Robert Playter, Boston Dynamics: We want robots to be beneficial for humanity, which includes the notion of not causing harm. As an industry, we think robots will achieve long-term commercial viability only if people see robots as helpful, beneficial tools without worrying if they’re going to cause harm.
Ryan Gariepy, Clearpath Robotics: On a one off basis, they should speak to a combination of the user, the supplier or suppliers, the media, and, if safety is an immediate concern, regulatory or government agencies. If the situation in question risks becoming commonplace and is not being taken seriously, they should speak up more generally in appropriate forums—conferences, industry groups, standards bodies, and the like.
As more and more robots representing different capabilities become commercially available, these issues are likely to come up more frequently. The three companies we talked to certainly don’t represent every viewpoint, and we did reach out to other companies who declined to comment. But I would think (I would hope?) that everyone in the robotics community can agree that robots should be used in a way that makes people’s lives better. What “better” means in the context of art and research and even robots in the military may not always be easy to define, and inevitably there’ll be disagreement as to what is ethical and responsible, and what isn’t.
We’ll keep on talking about it, though, and do our best to help the robotics community to continue growing and evolving in a positive way. Let us know what you think in the comments.