#435619 Video Friday: Watch This Robot Dog ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
RoboBusiness 2019 – October 1-3, 2019 – Santa Clara, CA, USA
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.
Team PLUTO (University of Pennsylvania, Ghost Robotics, and Exyn Technologies) put together this video giving us a robot’s-eye-view (or whatever they happen to be using for eyes) of the DARPA Subterranean Challenge tunnel circuits.
[ PLUTO ]
Zhifeng Huang has been improving his jet-stepping humanoid robot, which features new hardware and the ability to take larger and more complex steps.
This video reports the latest progress of an ongoing project that uses a ducted-fan propulsion system to improve a humanoid robot’s ability to step over large ditches. The landing point of the robot’s swing foot can be not only forward but also to the side. While maintaining quasi-static balance, the robot was able to step over a ditch 450 mm wide (up to 97 percent of the robot’s leg length) in 3D stepping.
[ Paper ]
Thanks Zhifeng!
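For scale, the quoted numbers pin down the geometry: a 450 mm ditch at 97 percent of leg length implies a leg of roughly 464 mm. A quick sketch of that arithmetic (the feasibility function and threshold below are illustrative, not from the paper):

```python
# The video description quotes a 450 mm ditch as "up to 97%" of the
# robot's leg length, which implies a leg of roughly 450 / 0.97 mm.
ditch_width_mm = 450
ratio = 0.97
leg_length_mm = ditch_width_mm / ratio
print(f"Implied leg length: {leg_length_mm:.0f} mm")  # ~464 mm

def ditch_is_steppable(ditch_mm, leg_mm, max_ratio=0.97):
    """Illustrative feasibility check: under quasi-static stepping,
    the ditch must not exceed max_ratio of the leg length."""
    return ditch_mm <= max_ratio * leg_mm

print(ditch_is_steppable(400, leg_length_mm))   # well within the limit
print(ditch_is_steppable(500, leg_length_mm))   # beyond it
```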
These underactuated hands from Matei Ciocarlie’s lab at Columbia are magically able to reconfigure themselves to grasp different object types with just one or two motors.
[ Paper ] via [ ROAM Lab ]
This is one reason we should pursue not “autonomous cars” but “fully autonomous cars” that never require humans to take over. We can’t be trusted.
During our early days as the Google self-driving car project, we invited some employees to test our vehicles on their commutes and weekend trips. What we were testing at the time was similar to the highway driver assist features that are now available on cars today, where the car takes over the boring parts of the driving, but if something outside its ability occurs, the driver has to take over immediately.
What we saw was that our testers put too much trust in that technology. They were doing things like texting, applying makeup, and even falling asleep that made it clear they would not be ready to take over driving if the vehicle asked them to. This is why we believe that nothing short of full autonomy will do.
[ Waymo ]
Buddy is a DIY and fetchingly minimalist social robot (of sorts) that will be coming to Kickstarter this month.
We have created a new Arduino kit. His name is Buddy. He is a DIY social robot meant to serve as a replacement for Jibo, Cozmo, or any of the other bots that are no longer available. Fully 3D printed and supported, he adds much more to our series of Arduino STEM robotics kits.
Buddy is able to look around and map his surroundings and react to changes within them. He can be surprised and he will always have a unique reaction to changes. The kit can be built very easily in less than an hour. It is even robust enough to take the abuse that kids can give it in a classroom.
[ Littlebots ]
The android Mindar, based on the Buddhist deity of mercy, preaches sermons at Kodaiji temple in Kyoto, and its human colleagues predict that with artificial intelligence it could one day acquire unlimited wisdom. Developed at a cost of almost $1 million (¥106 million) in a joint project between the Zen temple and robotics professor Hiroshi Ishiguro, the robot teaches about compassion and the dangers of desire, anger and ego.
[ Japan Times ]
I’m not sure whether it’s the sound or what, but this thing scares me for some reason.
[ BIRL ]
This gripper uses magnets as a sort of adjustable spring for dynamic stiffness control, which seems pretty clever.
[ Buffalo ]
What a package of medicine sees while being flown by drone from a hospital to a remote clinic in the Dominican Republic. The drone flew 11 km horizontally and 800 meters vertically, and I can’t even imagine what it would take to make that drive.
[ WeRobotics ]
My first ride in a fully autonomous car was at Stanford in 2009. I vividly remember getting in the back seat of a descendant of Junior, and watching the steering wheel turn by itself as the car executed a perfect parking maneuver. Ten years later, it’s still fun to watch other people have that experience.
[ Waymo ]
Flirtey, the pioneer of the commercial drone delivery industry, has unveiled the much-anticipated first video of its next-generation delivery drone, the Flirtey Eagle. The aircraft designer and manufacturer also unveiled the Flirtey Portal, a sophisticated take off and landing platform that enables scalable store-to-door operations; and an autonomous software platform that enables drones to deliver safely to homes.
[ Flirtey ]
EPFL scientists are developing new approaches for improved control of robotic hands – in particular for amputees – that combine individual finger control and automation for improved grasping and manipulation. This interdisciplinary proof of concept between neuroengineering and robotics was successfully tested on three amputees and seven healthy subjects.
[ EPFL ]
This video is a few years old, but we’ll take any excuse to watch the majestic sage-grouse be majestic in all their majesticness.
[ UC Davis ]
I like the idea of a game of soccer (or, football to you weirdos in the rest of the world) where the ball has a mind of its own.
[ Sphero ]
Looks like the whole delivery glider idea is really taking off! Or, you know, not taking off.
Weird that they didn’t show the landing, because it sure looked like it was going to plow into the side of the hill at full speed.
[ Yates ] via [ sUAS News ]
This video is from a 2018 paper, but it’s not like we ever get tired of seeing quadrupeds do stuff, right?
[ MIT ]
Founder and Head of Product, Ian Bernstein, and Head of Engineering, Morgan Bell, have been involved in the Misty project for years and they have learned a thing or two about building robots. Hear how and why Misty evolved into a robot development platform, learn what some of the earliest prototypes did (and why they didn’t work for what we envision), and take a deep dive into the technology decisions that form the Misty II platform.
[ Misty Robotics ]
Lex Fridman interviews Vijay Kumar on the Artificial Intelligence Podcast.
[ AI Podcast ]
This week’s CMU RI Seminar is from Ross Knepper at Cornell, on Formalizing Teamwork in Human-Robot Interaction.
Robots out in the world today work for people but not with people. Before robots can work closely with ordinary people as part of a human-robot team in a home or office setting, robots need the ability to acquire a new mix of functional and social skills. Working with people requires a shared understanding of the task, capabilities, intentions, and background knowledge. For robots to act jointly as part of a team with people, they must engage in collaborative planning, which involves forming a consensus through an exchange of information about goals, capabilities, and partial plans. Often, much of this information is conveyed through implicit communication. In this talk, I formalize components of teamwork involving collaboration, communication, and representation. I illustrate how these concepts interact in the application of social navigation, which I argue is a first-class example of teamwork. In this setting, participants must avoid collision by legibly conveying intended passing sides via nonverbal cues like path shape. A topological representation using the braid groups enables the robot to reason about a small enumerable set of passing outcomes. I show how implicit communication of topological group plans achieves rapid convergence to a group consensus, and how a robot in the group can deliberately influence the ultimate outcome to maximize joint performance, yielding pedestrian comfort with the robot.
[ CMU RI ]
In this week’s episode of Robots in Depth, Per speaks with Julien Bourgeois about Claytronics, a project from Carnegie Mellon and Intel to develop “programmable matter.”
Julien started out as a computer scientist. He was always interested in robotics privately but then had the opportunity to get into micro robots when his lab was merged into the FEMTO-ST Institute. He later worked with Seth Copen Goldstein at Carnegie Mellon on the Claytronics project.
Julien shows an enlarged mock-up of the small robots that make up programmable matter, catoms, and speaks about how they are designed. Currently he is working on a unit that is one centimeter in diameter and he shows us the very small CPU that goes into that model.
[ Robots in Depth ]
#432271 Your Shopping Experience Is on the Verge ...
Exponential technologies (AI, VR, 3D printing, and networks) are radically reshaping traditional retail.
E-commerce giants (Amazon, Walmart, Alibaba) are digitizing the retail industry, riding the exponential growth of computation.
Many brick-and-mortar stores have already gone bankrupt, or migrated their operations online.
Massive change is occurring in this arena.
For those “real-life stores” that survive, an evolution is taking place from a product-centric mentality to an experience-based business model by leveraging AI, VR/AR, and 3D printing.
Let’s dive in.
E-Commerce Trends
Last year, 3.8 billion people were connected online. By 2024, thanks to 5G, stratospheric and space-based satellites, we will grow to 8 billion people online, each with megabit to gigabit connection speeds.
These 4.2 billion new digital consumers will begin buying things online, a potential bonanza for the e-commerce world.
At the same time, entrepreneurs seeking to service these four-billion-plus new consumers can now skip the costly steps of procuring retail space and hiring sales clerks.
Today, thanks to global connectivity, contract production, and turnkey pack-and-ship logistics, an entrepreneur can go from an idea to building and scaling a multimillion-dollar business from anywhere in the world in record time.
And while e-commerce sales have been exploding (growing from $34 billion in Q1 2009 to $115 billion in Q3 2017), e-commerce only accounted for about 10 percent of total retail sales in 2017.
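Those two sales figures imply a steady compounding rate. A quick back-of-the-envelope check (assuming the quoted quarterly figures and treating Q1 2009 to Q3 2017 as 8.5 years) puts annualized e-commerce growth at roughly 15 percent:

```python
# Back-of-the-envelope CAGR for the quoted e-commerce figures:
# quarterly sales of $34B (Q1 2009) and $115B (Q3 2017),
# with an assumed 8.5-year span between those quarters.
start, end = 34e9, 115e9
years = 8.5

cagr = (end / start) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 15.4%
```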
In 2016, global online sales totaled $1.8 trillion. Remarkably, this $1.8 trillion was spent by only 1.5 billion people — a mere 20 percent of Earth’s global population that year.
There’s plenty more room for digital disruption.
AI and the Retail Experience
For the business owner, AI will demonetize e-commerce operations with automated customer service, ultra-accurate supply chain modeling, marketing content generation, and advertising.
In the case of customer service, imagine an AI that is trained by every customer interaction, learns how to answer any consumer question perfectly, and offers feedback to product designers and company owners as a result.
Facebook’s handover protocol allows live customer service representatives and language-learning bots to work within the same Facebook Messenger conversation.
Taking it one step further, imagine an AI that is empathic to a consumer’s frustration, that can take any amount of abuse and come back with a smile every time. As one example, meet Ava. “Ava is a virtual customer service agent, to bring a whole new level of personalization and brand experience to that customer experience on a day-to-day basis,” says Greg Cross, CEO of Ava’s creator, a New Zealand company called Soul Machines.
Predictive modeling and machine learning are also optimizing product ordering and the supply chain process. For example, Skubana, a platform for online sellers, leverages data analytics to provide entrepreneurs constant product performance feedback and maintain optimal warehouse stock levels.
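Skubana’s internals aren’t public, but the kind of stock-level optimization described above is commonly built on a reorder-point model with safety stock. A minimal sketch of that idea (the function name and all numbers are illustrative, not Skubana’s API):

```python
import math

def reorder_point(avg_daily_demand, lead_time_days, demand_std_dev, z=1.65):
    """Reorder point = expected demand over the lead time plus safety stock.

    z = 1.65 targets roughly a 95% service level under a normal
    demand assumption.
    """
    safety_stock = z * demand_std_dev * math.sqrt(lead_time_days)
    return avg_daily_demand * lead_time_days + safety_stock

# Illustrative SKU: 40 units/day average demand, 7-day supplier lead
# time, daily demand standard deviation of 12 units.
rp = reorder_point(avg_daily_demand=40, lead_time_days=7, demand_std_dev=12)
print(f"Reorder when stock falls below {rp:.0f} units")
```

A platform would fit the demand mean and variance per SKU from sales history rather than hard-coding them as done here.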
Blockchain is set to follow suit in the retail space. ShipChain and Ambrosus plan to introduce transparency and trust into shipping and production, further reducing costs for entrepreneurs and consumers.
Meanwhile, for consumers, personal shopping assistants are shifting the psychology of the standard shopping experience.
Amazon’s Alexa marks an important user interface moment in this regard.
Alexa is in her infancy with voice search and vocal controls for smart homes. Already, Amazon’s Alexa users spend more on Amazon.com, on average, than standard Amazon Prime customers — $1,700 versus $1,400.
As I’ve discussed in previous posts, the future combination of virtual reality shopping, coupled with a personalized, AI-enabled fashion advisor will make finding, selecting, and ordering products fast and painless for consumers.
But let’s take it one step further.
Imagine a future in which your personal AI shopper knows your desires better than you do. Possible? I think so. After all, our future AIs will follow us, watch us, and observe our interactions — including how long we glance at objects, our facial expressions, and much more.
In this future, shopping might be as easy as saying, “Buy me a new outfit for Saturday night’s dinner party,” followed by a surprise-and-delight moment in which the outfit that arrives is perfect.
In this future world of AI-enabled shopping, one of the most disruptive implications is that advertising is now dead.
In a world where an AI is buying my stuff, and I’m no longer in the decision loop, why would a big brand ever waste money on a Super Bowl advertisement?
The dematerialization, demonetization, and democratization of personalized shopping has only just begun.
The In-Store Experience: Experiential Retailing
In 2017, over 6,700 brick-and-mortar retail stores closed their doors, surpassing the former record year for store closures set in 2008 during the financial crisis. Regardless, business is still booming.
As shoppers seek the convenience of online shopping, brick-and-mortar stores are tapping into the power of the experience economy.
Rather than focusing on the practicality of the products they buy, consumers are instead seeking out the experience of going shopping.
The Internet of Things, artificial intelligence, and computation are exponentially improving the in-person consumer experience.
As AI dominates curated online shopping, AI and data analytics tools are also empowering real-life store owners to optimize staffing, marketing strategies, customer relationship management, and inventory logistics.
In the short term, retail store locations will serve as the next big user interface for production 3D printing (custom 3D-printed clothes at the Ministry of Supply), virtual and augmented reality (DIY skills clinics), and the Internet of Things (checkout-less shopping).
In the long term, we’ll see how our desire for enhanced productivity and seamless consumption balances with our preference for enjoyable real-life consumer experiences — all of which will be driven by exponential technologies.
One thing is certain: the nominal shopping experience is on the verge of a major transformation.
Implications
The convergence of exponential technologies has already revamped how and where we shop, how we use our time, and how much we pay.
Twenty years ago, Amazon showed us how the web could offer each of us the long tail of available reading material, and since then, the world of e-commerce has exploded.
And yet we still haven’t experienced the cost savings coming our way from drone delivery, the Internet of Things, tokenized ecosystems, the impact of truly powerful AI, or even the other major applications for 3D printing and AR/VR.
Perhaps nothing will be more transformed than today’s $20 trillion retail sector.
Hold on, stay tuned, and get your AI-enabled cryptocurrency ready.
Join Me
Abundance Digital Online Community: I’ve created a digital/online community of bold, abundance-minded entrepreneurs called Abundance Digital.
Abundance Digital is my ‘onramp’ for exponential entrepreneurs — those who want to get involved and play at a higher level. Click here to learn more.
Image Credit: Zapp2Photo / Shutterstock.com