Category Archives: Human Robots
Everything about Humanoid Robots and Androids
#437416 Robotics firm expands autonomous data ...
Back in 2013, local Brooklyn papers were excitedly reporting on a new initiative aimed at getting residents involved in cleaning up the highly polluted Gowanus Canal. Brooklyn Atlantis, as the project was known, was the brainchild of NYU Tandon Professor of Mechanical and Aerospace Engineering Maurizio Porfiri, who envisioned building and launching robotic boats to collect water-quality data and capture images of the infamous canal, which citizen scientists would then view and help classify. Those robotic boats ultimately led to the formation of the company Manifold Robotics, which aimed to further develop the unmanned surface vehicles (USVs) with sensor technology. (The fledgling company received support from PowerBridgeNY, a collaborative initiative to bring university research to market.) More recently, the startup has branched out, developing a mobile data collection platform that allows unmanned aerial vehicles (UAVs) to operate safely near power lines. Continue reading →
#437414 Curling robot able to beat some ...
A combined team of researchers from Korea and Germany has built an AI-based curling robot able to compete at a professional level. In their paper published in the journal Science Robotics, the group describes how the robot was built, how it was trained, and how well it performed when matched against professional human players. Johannes Stork of Örebro University has published a Focus piece in the same journal issue discussing the team’s work. Continue reading →
#437407 Nvidia’s Arm Acquisition Brings the ...
Artificial intelligence and mobile computing have been two of the most disruptive technologies of this century. The unification of the two companies that made them possible could have wide-ranging consequences for the future of computing.
California-based Nvidia’s graphics processing units (GPUs) have powered the deep learning revolution ever since Google researchers discovered in 2011 that they could run neural networks far more efficiently than conventional CPUs. UK company Arm’s energy-efficient chip designs have dominated the mobile and embedded computing markets for even longer.
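The advantage comes down to massively parallel arithmetic: neural networks are mostly large matrix multiplications, which a GPU can spread across thousands of cores at once. As a rough illustration, here is a minimal timing sketch, assuming PyTorch and a CUDA-capable GPU; the matrix size and repetition count are illustrative choices, not a rigorous benchmark.

# Minimal sketch: the same matrix multiply timed on CPU and GPU.
# Assumes PyTorch is installed and a CUDA-capable GPU is present;
# sizes and rep counts are illustrative, not a rigorous benchmark.
import time
import torch

def time_matmul(device: str, n: int = 2048, reps: int = 10) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)  # warm-up so one-time setup costs don't skew timing
    if device == "cuda":
        torch.cuda.synchronize()  # GPU kernels launch asynchronously
    start = time.perf_counter()
    for _ in range(reps):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / reps

print(f"CPU: {time_matmul('cpu'):.4f} s per multiply")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.4f} s per multiply")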
Now the two will join forces after the American company announced a $40 billion deal to buy Arm from its Japanese owner, Softbank. In a press release announcing the deal, Nvidia touted its potential to rapidly expand the reach of AI into all areas of our lives.
“In the years ahead, trillions of computers running AI will create a new internet-of-things that is thousands of times larger than today’s internet-of-people,” said Nvidia founder and CEO Jensen Huang. “Uniting NVIDIA’s AI computing capabilities with the vast ecosystem of Arm’s CPU, we can advance computing from the cloud, smartphones, PCs, self-driving cars and robotics, to edge IoT, and expand AI computing to every corner of the globe.”
There are good reasons to believe the hype. The two companies are absolutely dominant in their respective fields—Nvidia’s GPUs support more than 97 percent of AI computing infrastructure offered by big cloud service providers, and Arm’s chips power more than 90 percent of smartphones. And there’s little overlap in their competencies, which means the relationship could be a truly symbiotic one.
“I think the deal ‘fits like a glove’ in that Arm plays in areas that Nvidia does not or isn’t that successful, while NVIDIA plays in many places Arm doesn’t or isn’t that successful,” analyst Patrick Moorhead wrote in Forbes.
One of the most obvious directions would be to expand Nvidia’s AI capabilities to the kind of low-power edge devices that Arm excels in. There’s growing demand for AI in devices like smartphones, wearables, cars, and drones, where transmitting data to the cloud for processing is undesirable either for reasons of privacy or speed.
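In practice, “AI at the edge” means running the trained model directly on the device, so no raw data ever leaves it. Here’s a rough sketch of what on-device inference looks like, assuming the tflite-runtime package and a quantized classifier exported to TensorFlow Lite; the model file name below is a hypothetical placeholder.

# Rough sketch of on-device inference with TensorFlow Lite.
# Assumes the tflite-runtime package and an exported model file;
# "mobilenet_v2.tflite" is a hypothetical placeholder, not a bundled asset.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="mobilenet_v2.tflite")
interpreter.allocate_tensors()

input_info = interpreter.get_input_details()[0]
output_info = interpreter.get_output_details()[0]

# A dummy frame standing in for a camera capture; real code would
# resize and normalize an actual image to the model's input shape.
frame = np.random.random_sample(input_info["shape"]).astype(input_info["dtype"])

interpreter.set_tensor(input_info["index"], frame)
interpreter.invoke()  # runs entirely on the device; nothing goes to the cloud
scores = interpreter.get_tensor(output_info["index"])
print("Top class:", int(np.argmax(scores)))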
But there might also be fruitful exchanges in the other direction. Huang told Moorhead a major focus would be bringing Arm’s expertise in energy efficiency to the data center. That’s a big concern for technology companies whose electricity bills and green credentials are taking a battering thanks to the huge amounts of energy required to run millions of computer chips around the clock.
The deal may not be plain sailing, though, most notably due to the two companies’ differing business models. While Nvidia sells ready-made processors, Arm simply creates chip designs and licenses them to other companies, which then customize them to their particular hardware needs. It operates on an open-license basis whereby any company with the necessary cash can access its designs.
As a result, its designs are found in products built by hundreds of licensees, including Apple, Samsung, Huawei, Qualcomm, and even Nvidia. Some, including two of Arm’s co-founders, have raised concerns that a purchase by Nvidia, which competes with many of these companies, could harm the neutrality that has been central to Arm’s success.
It’s possible this could push more companies towards RISC-V, an open-source instruction set architecture developed by researchers at the University of California, Berkeley that rivals Arm’s and is not owned by any one company. However, there are plenty of reasons why most companies still prefer Arm over the less feature-rich open-source option, and it might take a considerable push to convince Arm’s customers to jump ship.
The deal will also have to navigate some thorny political issues. Unions, politicians, and business leaders in the UK have voiced concerns that it could lead to the loss of high-tech jobs, and government sources have suggested conditions could be placed on the deal.
Regulators in other countries could also put a spanner in the works. China is concerned that if Arm becomes US-owned, many of the Chinese companies that rely on its technology could become victims of export restrictions as the China-US trade war drags on. South Korea is also wary that the deal could create a new technology juggernaut that could dent Samsung’s growth in similar areas.
Nvidia has made commitments to keep Arm’s headquarters in the UK, which it says should lessen concerns around jobs and export restrictions. It’s also pledged to open a new world-class technology center in Cambridge and to build a state-of-the-art AI supercomputer there, powered by Arm’s chips. Whether the deal goes through still hangs in the balance, but if it does, it could spur a whole new wave of AI innovation.
Image Credit: Nvidia Continue reading →
#437402 Helping robots avoid collisions
George Konidaris still remembers his disheartening introduction to robotics. Continue reading →
#437395 Microsoft Had a Crazy Idea to Put ...
A little over two years ago, a shipping container-sized cylinder bearing Microsoft’s name and logo was lowered onto the ocean floor off the northern coast of Scotland. Inside were 864 servers, and their submersion was part of the second phase of the software giant’s Project Natick. Launched in 2015, the project’s purpose is to determine the feasibility of underwater data centers powered by offshore renewable energy.
A couple of months ago, the deep-sea servers were brought back up to the surface so engineers could inspect them and evaluate how they’d performed under water.
But wait—why were they there in the first place?
As bizarre as it seems to sink hundreds of servers into the ocean, there are actually several very good reasons to do so. According to the UN, about 40 percent of the world’s population lives within 60 miles of an ocean. As internet connectivity expands to cover most of the globe in the next few years, millions more people will come online, and a lot more servers will be needed to manage the increased demand and data they’ll generate.
In densely populated cities, real estate is expensive and can be hard to find. But you know where there’s lots of cheap, empty space? At the bottom of the ocean. This locale also carries the added benefit of being really cold (depending on where we’re talking, that is; off the coast of, say, Mumbai or Abu Dhabi, the waters are warmer).
Servers generate a lot of heat, and cooling accounts for a large share of a data center’s electricity use. Keeping not just the temperature but also the humidity level constant is important for optimal functioning of the servers; neither varies much 100 feet under water.
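One common yardstick here is power usage effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. A back-of-the-envelope sketch of why free seawater cooling matters, with illustrative numbers that are assumptions rather than Project Natick figures:

# Back-of-the-envelope sketch of power usage effectiveness (PUE):
# total facility power divided by the power reaching the IT gear.
# The numbers below are illustrative assumptions, not Project Natick data.

def pue(it_kw: float, cooling_kw: float, other_kw: float = 0.0) -> float:
    return (it_kw + cooling_kw + other_kw) / it_kw

# A conventional facility spending heavily on chillers and air handling...
print(f"Land-based: PUE = {pue(it_kw=1000, cooling_kw=500, other_kw=100):.2f}")
# ...versus a sealed vessel where the surrounding seawater does the cooling.
print(f"Underwater: PUE = {pue(it_kw=1000, cooling_kw=150, other_kw=50):.2f}")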
Finally, installing data centers on the ocean floor is, surprisingly, much faster than building them on land. Microsoft claims its server-holding cylinders will take less than 90 days to go from factory ship to operation, as compared to the average two years it takes to get a terrestrial data center up and running.
Microsoft’s Special Projects team operated the underwater data center for two years, and it took a full day to dredge it up and bring it to the surface. One of the first things researchers did was to insert test tubes into the container to take samples of the air inside; they’ll use it to try to determine how gases released from the equipment may have impacted the servers’ operating environment.
The container was filled with dry nitrogen upon deployment, which seems to have made for a much better environment than the oxygen that surrounds land-bound servers; the failure rate of the servers in the water was just one-eighth Microsoft’s typical rate for its servers on land. The team thinks the nitrogen atmosphere helped because it’s less corrosive than oxygen. The fact that no humans entered the container for the entirety of its operation helped, too (no moving components around, no turning on lights or adjusting the temperature).
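To put that one-eighth ratio in concrete terms, here is a quick arithmetic sketch; the 2 percent land-based annualized failure rate is a hypothetical assumption, and only the server count and the 1/8 ratio come from the project.

# Quick arithmetic on the reported one-eighth failure ratio.
# The 2% land-based annualized failure rate is a hypothetical assumption;
# only the server count (864) and the 1/8 ratio come from the article.
SERVERS = 864
land_rate = 0.02          # hypothetical annual failure rate on land
sea_rate = land_rate / 8  # Natick's reported ratio

print(f"Expected failures per year on land: {SERVERS * land_rate:.1f}")
print(f"Expected failures per year at sea:  {SERVERS * sea_rate:.1f}")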
Ben Cutler, a project manager in Microsoft’s Special Projects research group who leads Project Natick, believes the results of this phase of the project are sufficient to show that underwater data centers are worth pursuing. “We are now at the point of trying to harness what we have done as opposed to feeling the need to go and prove out some more,” he said.
Cutler envisions placing underwater data centers near offshore wind farms so they can be powered sustainably. The data centers of the future will require less human involvement, instead being managed and run primarily by technologies like robotics and AI. In this kind of “lights-out” data center, the servers would be swapped out about once every five years, with any that fail before then simply being taken offline.
The final step in this phase of Project Natick is to recycle all the components used for the underwater data center, including the steel pressure vessel, heat exchangers, and the servers themselves, and to restore the seabed where the cylinder rested to its original condition.
If Cutler’s optimism is a portent of things to come, it may not be long before the ocean floor is dotted with sustainable data centers supporting our ever-increasing reliance on our phones and the internet.
Image Credit: Microsoft Continue reading →