Robotics research has benefited over the last 10 years from a standardized open-source platform for research on embodied artificial intelligence (AI): the humanoid robot iCub. Created in Italy, it is available today in laboratories across Europe, the U.S., South Korea, Singapore and Japan, and more than 100 researchers worldwide contribute to developing its skills. Researchers at IIT-Istituto Italiano di Tecnologia discussed the importance of such a research platform in a paper published today in Science Robotics.
I don’t have to open the doors of AImotive’s white 2015 Prius to see that it’s not your average car. This particular Prius has been christened El Capitan, the name written below the rear doors, and two small cameras are mounted on top of the car. Bundles of wire snake out from them, as well as from the two additional cameras on the car’s hood and trunk.
Inside is where things really get interesting, though. The trunk holds a computer the size of a microwave, and a large monitor covers the passenger glove compartment and dashboard. The center console has three switches labeled “Allowed,” “Error,” and “Active.”
Budapest-based AImotive is working to provide scalable self-driving technology alongside big players like Waymo and Uber in the autonomous vehicle world. On a highway test ride with CEO Laszlo Kishonti near the company’s office in Mountain View, California, I got a glimpse of just how complex that world is.
Camera-Based Feedback System
AImotive’s approach to autonomous driving is a little different from that of some of the best-known systems. For starters, they’re using cameras, not lidar, as primary sensors. “The traffic system is visual and the cost of cameras is low,” Kishonti said. “A lidar can recognize when there are people near the car, but a camera can differentiate between, say, an elderly person and a child. Lidar’s resolution isn’t high enough to recognize the subtle differences of urban driving.”
Image Credit: AImotive
The company’s aiDrive software uses data from the camera sensors to feed information to its algorithms for hierarchical decision-making, grouped under four concurrent activities: recognition, location, motion, and control.
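The article names only the four concurrent activities, not how they connect. As a purely illustrative sketch, the loop below shows one plausible way such a hierarchy could be wired together; the `Frame` fields, stage functions, speed values, and steering rule are all invented for the example and are not AImotive's actual API.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """A single camera frame with stand-in detection data (hypothetical)."""
    objects: list          # e.g. ["car", "pedestrian"]
    lane_offset_m: float   # lateral offset from lane center, in meters

def recognize(frame):
    # Recognition: identify the objects visible in the frame.
    return frame.objects

def locate(frame):
    # Location: estimate the car's position within the lane.
    return frame.lane_offset_m

def plan_motion(objects, offset):
    # Motion: pick a steering correction that re-centers the car,
    # slowing down if a pedestrian is recognized (assumed speeds).
    steer = -offset                                     # steer back toward center
    speed = 10.0 if "pedestrian" in objects else 29.0   # m/s, illustrative values
    return steer, speed

def control(steer, speed):
    # Control: emit the actuator commands.
    return {"steering": steer, "target_speed": speed}

def aidrive_step(frame):
    """One pass through the four-stage hierarchy for a single frame."""
    objects = recognize(frame)
    offset = locate(frame)
    steer, speed = plan_motion(objects, offset)
    return control(steer, speed)
```

For instance, `aidrive_step(Frame(objects=["car"], lane_offset_m=0.3))` yields a small leftward steering correction at cruising speed.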
Kishonti pointed out that lidar has already gotten more cost-efficient, and will only continue to do so.
“Ten years ago, lidar was best because there wasn’t enough processing power to do all the calculations by AI. But the cost of running AI is decreasing,” he said. “In our approach, computer vision and AI processing are key, and for safety, we’ll have fallback sensors like radar or lidar.”
aiDrive currently runs on Nvidia chips, which Kishonti noted were originally designed for graphics and are relatively power-hungry. “We’re planning to substitute lower-cost, lower-energy chips in the next six months,” he said.
Testing in Virtual Reality
Waymo recently announced its fleet has now driven four million miles autonomously. That’s a lot of miles, and hard to compete with. But AImotive isn’t trying to compete, at least not by logging more real-life test miles. Instead, the company is doing 90 percent of its testing in virtual reality. “This is what truly differentiates us from competitors,” Kishonti said.
He outlined the three main benefits of VR testing: it can simulate scenarios too dangerous for the real world (such as hitting something), too costly (not every company has Waymo’s funds to run hundreds of cars on real roads), or too time-consuming (like waiting for rain, snow, or other weather conditions to occur naturally and repeatedly).
“Real-world traffic testing is very skewed towards the boring miles,” he said. “What we want to do is test all the cases that are hard to solve.”
On a screen that looked not unlike multiple games of Mario Kart, he showed me the simulator. Cartoon cars cruised down winding streets, outfitted with all the real-world surroundings: people, trees, signs, other cars. As I watched, a furry kangaroo suddenly hopped across one screen. “Volvo had an issue in Australia,” Kishonti explained. “A kangaroo’s movement is different than other animals since it hops instead of running.” Talk about cases that are hard to solve.
AImotive is currently testing around 1,000 simulated scenarios every night, with a steadily rising curve of successful tests. These scenarios are broken down into features, and the car’s behavior around those features is fed into a neural network. As the algorithms learn more features, the level of complexity the vehicles can handle goes up.
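A nightly batch of simulated scenarios, each tagged with the features it exercises, can be sketched as a simple pass/fail harness. Everything here is hypothetical: the scenario names, feature tags, and the pass rule (a policy passes if it has learned every feature the scenario requires) are invented to illustrate how the success curve rises as more features are learned.

```python
def run_scenario(policy, scenario):
    # Hypothetical pass/fail rule: the policy handles a scenario only
    # if it has learned every feature that scenario is tagged with.
    return all(f in policy for f in scenario["features"])

def nightly_run(policy, scenarios):
    # Fraction of the night's scenarios the current policy passes.
    results = [run_scenario(policy, s) for s in scenarios]
    return sum(results) / len(results)

scenarios = [
    {"name": "merge",    "features": {"lane_keep", "yield"}},
    {"name": "kangaroo", "features": {"lane_keep", "animal_hop"}},
    {"name": "rain",     "features": {"lane_keep", "low_visibility"}},
]

# As the system learns more features, the nightly success rate rises:
print(nightly_run({"lane_keep", "yield"}, scenarios))                # 1 of 3 pass
print(nightly_run({"lane_keep", "yield", "animal_hop"}, scenarios))  # 2 of 3 pass
```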
On the Road
After Kishonti and his colleagues filled me in on the details of their product, it was time to test it out. A safety driver sat in the driver’s seat, a computer operator in the passenger seat, and Kishonti and I in back. The driver maintained full control of the car until we merged onto the highway. Then he flicked the “Allowed” switch, his copilot pressed the “Active” switch, and he took his hands off the wheel.
What happened next, you ask?
A few things. El Capitan was going exactly the speed limit—65 miles per hour—which meant all the other cars were passing us. When a car merged in front of us or cut us off, El Cap braked accordingly (if a little abruptly). The monitor displayed the feed from each of the car’s cameras, plus multiple data fields and a simulation where a blue line marked the center of the lane, measured by the cameras tracking the lane markings on either side.
I noticed El Cap wobbling out of our lane a bit, but it wasn’t until two things happened in a row that I felt a little nervous: first we went under a bridge, then a truck pulled up next to us, both bridge and truck casting a complete shadow over our car. At that point El Cap lost it, and we swerved haphazardly to the right, narrowly missing the truck’s rear wheels. The safety driver grabbed the steering wheel and took back control of the car.
What happened, Kishonti explained, was that the shadows made it hard for the car’s cameras to see the lane markings. This was a new scenario the algorithm hadn’t previously encountered. If we’d only gone under a bridge or only been next to the truck for a second, El Cap may not have had so much trouble, but the two events happening in a row really threw the car for a loop—almost literally.
“This is a new scenario we’ll add to our testing,” Kishonti said. He added that another way for the algorithm to handle this type of scenario, rather than basing its speed and positioning on the lane markings, is to mimic nearby cars. “The human eye would see that other cars are still moving at the same speed, even if it can’t see details of the road,” he said.
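Kishonti's suggested fallback, mimicking nearby cars when the markings disappear, can be expressed as a confidence-gated switch. This is a sketch of the idea only; the confidence threshold, signatures, and heading-difference rule are invented for illustration.

```python
def lane_keeping_command(lane_confidence, lane_center_offset,
                         neighbor_heading, own_heading):
    """Pick a steering cue for one control step.

    When the cameras can see the lane markings (high confidence),
    steer back toward the lane center. When shadows hide the markings,
    fall back to matching the heading of nearby traffic.
    """
    CONFIDENCE_THRESHOLD = 0.5  # assumed tuning value
    if lane_confidence >= CONFIDENCE_THRESHOLD:
        return -lane_center_offset             # re-center in the lane
    return neighbor_heading - own_heading      # mimic nearby cars
```

Under the bridge-plus-truck shadow, `lane_confidence` would drop and the command would track the truck's heading instead of the invisible markings.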
After another brief—and thankfully uneventful—hands-off cruise down the highway, the safety driver took over, exited the highway, and drove us back to the office.
Driving into the Future
I climbed out of the car feeling amazed not only that self-driving cars are possible, but that driving is possible at all. I squint when driving into a tunnel, swerve to avoid hitting a stray squirrel, and brake gradually at stop signs—all without consciously thinking to do so. On top of learning to steer, brake, and accelerate, self-driving software has to incorporate our brains’ and bodies’ unconscious (but crucial) reactions, like our pupils dilating to let in more light so we can see in a tunnel.
Despite all the progress of machine learning, artificial intelligence, and computing power, I have a wholly renewed appreciation for the thing that’s been in charge of driving up till now: the human brain.
Kishonti seemed to feel similarly. “I don’t think autonomous vehicles in the near future will be better than the best drivers,” he said. “But they’ll be better than the average driver. What we want to achieve is safe, good-quality driving for everyone, with scalability.”
AImotive is currently working with American tech firms and with car and truck manufacturers in Europe, China, and Japan.
Image Credit: Alex Oakenman / Shutterstock.com
Cimcorp Selected to Supply Turnkey Automated Handling System to Large Turkish Tire Manufacturer, Petlas
The leading tire handling specialist’s system will handle tires in the tire-finishing and palletizing areas of the Turkish manufacturer’s expanded facility
Ulvila, Finland – November 9, 2016 – Cimcorp, a leading global supplier of turnkey automation for intralogistics and tire-handling solutions, announces it has been selected to implement a fully automated handling system in Petlas Tire Corporation’s (Petlas) factory in Kirsehir, Turkey. Based on Cimcorp’s Dream Factory solution, the automation will take care of the handling of passenger car radial (PCR) finished tires in the tire-finishing and palletizing areas. Work on the order is already underway and the turnkey material handling system will become fully operational in fall 2017.
The order, Cimcorp’s first project for Petlas, is part of a huge investment program to expand the Kirsehir plant in order to increase Petlas’ PCR production capacity and meet growing demand.
Turkey achieved record car production and export levels in 2015, with production up by 16 percent and exports up 12 percent over the preceding year. This growth rate is higher than in any other European country and, with its automotive plants rolling out 1.36 million vehicles in 2015, Turkey is now the seventh largest automotive producer in Europe.
With the production equipment – the tire-building machines, presses and testing machines – already installed, Petlas is commencing the automation of the plant’s material handling. This comprises Cimcorp’s robotic buffer stores, tire conveyors and control software – Cimcorp WCS (Warehouse Control Software) – to take care of all material flows. Using linear robots operating on overhead gantries, the system will automate the handling and transfer of finished tires from the trimming stations, through visual inspection and uniformity testing, to palletizing.
Yahya Ertem, general manager, Petlas Tire Corporation, said, “We think highly of Cimcorp’s software, which integrates the machines into one entity and keeps the flow of material and data under complete control. Cimcorp’s Dream Factory solution fits with our vision to achieve ‘excellence in business’ and will help us to achieve our strategic goals.”
Tero Peltomäki, vice president of sales and projects, Cimcorp, said, “It has been fantastic to work with the Petlas team, honing our design into the best possible solution for the Kirsehir plant. The automation will help Petlas to enhance its market position as a leading tire manufacturer and distributor and we look forward to working on future automation projects with the company.”
To receive high-resolution images, please send requests to Heidi Scott via email at: firstname.lastname@example.org
Cimcorp Group – part of Murata Machinery, Ltd. (Muratec) – is a leading global supplier of turnkey automation for intralogistics, using advanced robotics and software technologies. As well as being a manufacturer and integrator of pioneering material handling systems for the tire industry, Cimcorp has developed unique robotic solutions for order fulfillment and storage that are being used in the food & beverage, retail, e-commerce, FMCG and postal services sectors. With locations in Finland, Canada and the United States, the group has around 300 employees and has delivered over 2,000 logistics automation solutions. Designed to reduce operating costs, ensure traceability and improve efficiency, these systems are used within manufacturing and distribution centers in 40 countries across five continents. For more information, visit www.cimcorp.com.
About Petlas Tire Corporation (Petlas)
Founded in 1976, Petlas Tire Corporation has operations in 98 countries worldwide and employs 2,150 people. The company’s plant in Kirsehir currently has the capacity to produce 8 million PCR (passenger car radial) tires, 2 million agricultural tires, 500,000 TBR (truck & bus radial) tires and 300,000 OTR (off-the-road) tires per year. For more information, visit www.petlas.com.
The post Cimcorp to fully automate Turkish Tire Manufacturer Petlas appeared first on Roboticmagazine.