#433282 The 4 Waves of AI: Who Will Own the ...

Recently, I picked up Kai-Fu Lee’s newest book, AI Superpowers.

Kai-Fu Lee is one of the most plugged-in AI investors on the planet, managing over $2 billion between six funds and over 300 portfolio companies in the US and China.

Drawing from his pioneering work in AI, executive leadership at Microsoft, Apple, and Google (where he served as founding president of Google China), and his founding of VC fund Sinovation Ventures, Lee shares invaluable insights about:

The four factors driving today’s AI ecosystems;
China’s extraordinary inroads in AI implementation;
Where autonomous systems are headed;
How we’ll need to adapt.

With a foothold in both Beijing and Silicon Valley, Lee looks at the power balance between Chinese and US tech behemoths—each turbocharging new applications of deep learning and sweeping up global markets in the process.

In this post, I’ll be discussing Lee’s “Four Waves of AI,” an excellent framework for understanding where AI is today and where it’s going. I’ll also feature some of the hottest Chinese tech companies leading the charge, all worth watching right now.

I’m super excited to have scored the opportunity to sit down with Kai-Fu Lee this Tuesday to discuss his book in detail via a webinar.

With Sino-US competition heating up, who will own the future of technology?

Let’s dive in.

The First Wave: Internet AI
In this first stage of AI deployment, we’re dealing primarily with recommendation engines—algorithmic systems that learn from masses of user data to curate online content personalized to each one of us.

Think Amazon’s spot-on product recommendations, or that “Up Next” YouTube video you just have to watch before getting back to work, or Facebook ads that seem to know what you’ll buy before you do.

Powered by the data flowing through our networks, internet AI leverages the fact that users automatically label data as we browse. Clicking versus not clicking; lingering on a web page longer than we did on another; hovering over a Facebook video to see what happens at the end.

These cascades of labeled data build a detailed picture of our personalities, habits, demands, and desires: the perfect recipe for more tailored content to keep us on a given platform.
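As a toy sketch of how that implicit labeling might feed a recommender, consider the snippet below. The signal names, weights, and topic-matching scheme are all illustrative assumptions on my part, not any real platform’s system:

```python
from collections import defaultdict

# Implicit engagement signals with illustrative weights (assumptions,
# not any platform's real values): a click is a stronger "label" than
# merely lingering on a page.
SIGNAL_WEIGHTS = {"click": 3.0, "linger": 1.0, "hover_video": 2.0}

def build_profile(events):
    """Fold a stream of (signal, topic) events into per-topic affinities."""
    profile = defaultdict(float)
    for signal, topic in events:
        profile[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return profile

def recommend(profile, candidates, k=2):
    """Rank candidate items by how well their topic matches the profile."""
    return sorted(candidates, key=lambda item: profile[item["topic"]],
                  reverse=True)[:k]

# A few browsing events quietly "label" the user as an AI enthusiast...
events = [("click", "ai"), ("linger", "ai"),
          ("hover_video", "travel"), ("click", "ai")]
profile = build_profile(events)

# ...so AI content floats to the top of the feed.
items = [{"id": 1, "topic": "ai"},
         {"id": 2, "topic": "travel"},
         {"id": 3, "topic": "sports"}]
print(recommend(profile, items))
```

The feedback loop comes from running this continuously: every item the user clicks becomes a new event that sharpens the next ranking.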

Currently, Lee estimates that Chinese and American companies stand head-to-head when it comes to deployment of internet AI. But given China’s data advantage, he predicts that Chinese tech giants will have a slight lead (60-40) over their US counterparts in the next five years.

While you’ve most definitely heard of Alibaba and Baidu, you’ve probably never stumbled upon Toutiao.

Starting out as a copycat of America’s wildly popular BuzzFeed, Toutiao reached a valuation of $20 billion by 2017, dwarfing BuzzFeed’s valuation by more than a factor of 10. But with almost 120 million daily active users, Toutiao doesn’t stop at creating viral content.

Equipped with natural-language processing and computer vision, Toutiao’s AI engines survey a vast network of different sites and contributors, rewriting headlines to optimize for user engagement, and processing each user’s online behavior—clicks, comments, engagement time—to curate individualized news feeds for millions of consumers.

And as users grow more engaged with Toutiao’s content, the company’s algorithms get better and better at recommending content, optimizing headlines, and delivering a truly personalized feed.

It’s this kind of positive feedback loop that fuels today’s AI giants surfing the wave of internet AI.

The Second Wave: Business AI
While internet AI takes advantage of the fact that netizens are constantly labeling data via clicks and other engagement metrics, business AI jumps on the data that traditional companies have already labeled in the past.

Think banks issuing loans and recording repayment rates; hospitals archiving diagnoses, imaging data, and subsequent health outcomes; or courts noting conviction history, recidivism, and flight risk.

While we humans make predictions based on obvious root causes (strong features), AI algorithms can process thousands of weakly correlated variables (weak features) that may have much more to do with a given outcome than the usual suspects.

By scouting out hidden correlations that escape our linear cause-and-effect logic, business AI leverages labeled data to train algorithms that outperform even the most veteran of experts.

Apply these data-trained AI engines to banking, insurance, and legal sentencing, and you get minimized default rates, optimized premiums, and plummeting recidivism rates.
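The weak-features idea is easy to demonstrate with synthetic data. In the hedged sketch below (all numbers are invented for illustration), one hundred features that are each barely better than a coin flip, pooled together, classify defaulters far more accurately than any single feature:

```python
import random

random.seed(0)

N_FEATURES = 100  # many "weak features", none decisive on its own

def borrower(defaults):
    # Each binary feature is only weakly correlated with default:
    # present 55% of the time for defaulters vs. 45% for repayers.
    p = 0.55 if defaults else 0.45
    return [1 if random.random() < p else 0 for _ in range(N_FEATURES)]

# Synthetic labeled data: 2,000 defaulters, 2,000 repayers.
data = [(borrower(d), d) for d in [True, False] * 2000]

def accuracy(predict):
    return sum(predict(x) == y for x, y in data) / len(data)

# One weak feature alone is barely better than a coin flip...
single = accuracy(lambda x: x[0] == 1)

# ...but pooling all 100 weak signals separates the classes well.
combined = accuracy(lambda x: sum(x) > N_FEATURES // 2)

print(f"single feature: {single:.2f}, all features pooled: {combined:.2f}")
```

A real business-AI system learns the per-feature weights from historical outcomes rather than simply summing them, but the principle is the same: many faint signals add up to a strong prediction.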

While Lee confidently places America in the lead (90-10) for business AI, China’s substantial lag in structured industry data could actually work in its favor going forward.

In industries where Chinese startups can leapfrog over legacy systems, China has a major advantage.

Take Chinese app Smart Finance, for instance.

While Americans embraced credit and debit cards in the 1970s, China was still in the throes of its Cultural Revolution, largely missing the bus on this technology.

Fast forward to 2017, and China’s mobile payment spending exceeded America’s by a ratio of 50 to 1. Without competition from deeply entrenched credit cards, mobile payments were an obvious upgrade to China’s cash-heavy economy, embraced by 70 percent of China’s 753 million smartphone users by the end of 2017.

But by leapfrogging over credit cards and into mobile payments, China largely left behind the notion of credit.

And here’s where Smart Finance comes in.

An AI-powered app for microfinance, Smart Finance depends almost exclusively on its algorithms to make millions of microloans. For each potential borrower, the app simply requests access to a portion of the user’s phone data.

On the basis of variables as subtle as your typing speed and battery percentage, Smart Finance can predict with astounding accuracy your likelihood of repaying a $300 loan.

Such deployments of business AI and internet AI are already revolutionizing our industries and individual lifestyles. But still on the horizon lie two even more monumental waves: perception AI and autonomous AI.

The Third Wave: Perception AI
In this wave, AI gets an upgrade with eyes, ears, and myriad other senses, merging the digital world with our physical environments.

As sensors and smart devices proliferate through our homes and cities, we are on the verge of entering a trillion-sensor economy.

Companies like China’s Xiaomi are putting out millions of IoT-connected devices, and teams of researchers have already begun prototyping smart dust—solar cell- and sensor-geared particulates that can store and communicate troves of data anywhere, anytime.

As Kai-Fu explains, perception AI “will bring the convenience and abundance of the online world into our offline reality.” Sensor-enabled hardware devices will turn everything from hospitals to cars to schools into online-merge-offline (OMO) environments.

Imagine walking into a grocery store, scanning your face to pull up your most common purchases, and then picking up a virtual assistant (VA) shopping cart. Having pre-loaded your data, the cart adjusts your usual grocery list with voice input, reminds you to get your spouse’s favorite wine for an upcoming anniversary, and guides you through a personalized store route.

While we haven’t yet leveraged the full potential of perception AI, China and the US are already making incredible strides. Given China’s hardware advantage, Lee predicts China currently has a 60-40 edge over its American tech counterparts.

Now the go-to city for startups building robots, drones, wearable technology, and IoT infrastructure, Shenzhen has turned into a powerhouse for intelligent hardware, as I discussed last week. Turbocharging output of sensors and electronic parts via thousands of factories, Shenzhen’s skilled engineers can prototype and iterate new products at unprecedented scale and speed.

With the added fuel of Chinese government support and a relaxed Chinese attitude toward data privacy, China’s lead may even reach 80-20 in the next five years.

Jumping on this wave are companies like Xiaomi, which aims to turn bathrooms, kitchens, and living rooms into smart OMO environments. Having invested in 220 companies and incubated 29 startups that produce its products, Xiaomi surpassed 85 million intelligent home devices by the end of 2017, making it the world’s largest network of these connected products.

One KFC restaurant in China has even teamed up with Alipay (Alibaba’s mobile payments platform) to pioneer a ‘pay-with-your-face’ feature. Forget cash, cards, and cell phones, and let OMO do the work.

The Fourth Wave: Autonomous AI
But the most monumental—and unpredictable—wave is the fourth and final: autonomous AI.

Integrating all previous waves, autonomous AI gives machines the ability to sense and respond to the world around them, enabling AI to move and act productively.

While today’s machines can outperform us on repetitive tasks in structured and even unstructured environments (think Boston Dynamics’ humanoid Atlas or the coming wave of autonomous vehicles), machines that can see, hear, touch, and act on the world around them will be a whole new ballgame.

Think: swarms of drones that can selectively spray and harvest entire farms with computer vision and remarkable dexterity, heat-resistant drones that can put out forest fires 100X more efficiently, or Level 5 autonomous vehicles that navigate smart roads and traffic systems all on their own.

While autonomous AI will first involve robots that create direct economic value—automating tasks on a one-to-one replacement basis—these intelligent machines will ultimately revamp entire industries from the ground up.

Kai-Fu Lee currently puts America in a commanding lead of 90-10 in autonomous AI, especially when it comes to self-driving vehicles. But Chinese government efforts are quickly ramping up the competition.

Already in China’s Zhejiang province, highway regulators and government officials have plans to build China’s first intelligent superhighway, outfitted with sensors, road-embedded solar panels and wireless communication between cars, roads and drivers.

Aimed at increasing transit efficiency by up to 30 percent while minimizing fatalities, the project may one day allow autonomous electric vehicles to continuously charge as they drive.

A similar government-fueled project involves Beijing’s new neighbor Xiong’an. Projected to attract over $580 billion in infrastructure spending over the next 20 years, Xiong’an New Area could one day become the world’s first city built around autonomous vehicles.

Baidu is already working with Xiong’an’s local government to build out this AI city with an environmental focus. Possibilities include sensor-geared cement, computer vision-enabled traffic lights, intersections with facial recognition, and parking lots turned into parks.

Lastly, Lee predicts China will almost certainly lead the charge in autonomous drones. Already, Shenzhen is home to premier drone maker DJI—a company I’ll be visiting with 24 top executives later this month as part of my annual China Platinum Trip.

Named “the best company I have ever encountered” by Chris Anderson, DJI owns an estimated 50 percent of the North American drone market, supercharged by Shenzhen’s extraordinary maker movement.

While the long-term Sino-US competitive balance in fourth wave AI remains to be seen, one thing is certain: in a matter of decades, we will witness the rise of AI-embedded cityscapes and autonomous machines that can interact with the real world and help solve today’s most pressing grand challenges.

Join Me
Webinar with Dr. Kai-Fu Lee: Dr. Kai-Fu Lee — one of the world’s most respected experts on AI — and I will discuss his latest book AI Superpowers: China, Silicon Valley, and the New World Order. Artificial Intelligence is reshaping the world as we know it. With U.S.-Sino competition heating up, who will own the future of technology? Register here for the free webinar on September 4th, 2018 from 11:00am–12:30pm PST.

Image Credit: Elena11 / Shutterstock.com

Posted in Human Robots

#432421 Cheetah III robot preps for a role as a ...

If you were to ask someone to name a new technology that emerged from MIT in the 21st century, there's a good chance they would name the robotic cheetah. Developed by the MIT Department of Mechanical Engineering's Biomimetic Robotics Lab under the direction of Associate Professor Sangbae Kim, the quadruped MIT Cheetah has made headlines for its dynamic legged gait, speed, jumping ability, and biomimetic design.


#431671 The Doctor in the Machine: How AI Is ...

Artificial intelligence has received its fair share of hype recently. However, it’s hype that’s well-founded: IDC predicts worldwide spending on AI and cognitive computing will climb to a whopping $46 billion (with a “b”) by 2020, and all the tech giants are jumping on board faster than you can say “ROI.” But what is AI, exactly?
According to Hilary Mason, AI today is being misused as a sort of catch-all term to basically describe “any system that uses data to do anything.” But it’s so much more than that. A truly artificially intelligent system is one that learns on its own, one that’s capable of crunching copious amounts of data in order to create associations and intelligently mimic actual human behavior.
It’s what powers the technology anticipating our next online purchase (Amazon), or the virtual assistant that deciphers our voice commands with incredible accuracy (Siri), or even the hipster-friendly recommendation engine that helps you discover new music before your friends do (Pandora). But AI is moving past these consumer-pleasing “nice-to-haves” and getting down to serious business: saving our butts.
Much in the same way robotics entered manufacturing, AI is making its mark in healthcare by automating mundane, repetitive tasks. This is especially true in the case of detecting cancer. By leveraging the power of deep learning, algorithms can now be trained to distinguish between sets of pixels in an image that represent cancer versus sets that don’t—not unlike how Facebook’s image recognition software tags pictures of our friends without us having to type in their names first. This software can then scour millions of medical images (MRIs, CT scans, etc.) in a single day to detect anomalies on a scope that humans just aren’t capable of. That’s huge.
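For a rough intuition of what “distinguishing between sets of pixels” means, here’s a deliberately tiny toy in Python: synthetic 8×8 “scans” stand in for medical images, crude pixel statistics stand in for the features a real deep-learning model would learn, and a nearest-centroid rule stands in for the trained network. Everything here is an illustrative assumption, nothing like a clinical system:

```python
import random

random.seed(1)

def synth_image(anomaly):
    # Toy 8x8 grayscale "scan"; anomalous images carry a bright patch.
    img = [[random.gauss(0.3, 0.1) for _ in range(8)] for _ in range(8)]
    if anomaly:
        for r in range(3, 6):
            for c in range(3, 6):
                img[r][c] += 0.5
    return img

def features(img):
    # Crude hand-picked pixel statistics; a real system learns its own
    # features with convolutional layers instead.
    flat = [p for row in img for p in row]
    return (sum(flat) / len(flat), max(flat))

# "Train" a nearest-centroid classifier on labeled scans.
train = [(features(synth_image(a)), a) for a in [True, False] * 200]

def centroid(label):
    pts = [f for f, a in train if a == label]
    return tuple(sum(v) / len(pts) for v in zip(*pts))

pos, neg = centroid(True), centroid(False)

def predict(img):
    # Classify by whichever class centroid the image's features sit nearer.
    f = features(img)
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(f, c))
    return dist(pos) < dist(neg)

held_out = [(synth_image(a), a) for a in [True, False] * 100]
acc = sum(predict(img) == a for img, a in held_out) / len(held_out)
print(f"toy accuracy on held-out scans: {acc:.2f}")
```

The toy cheats, of course: its anomalies are bright and its features are hand-chosen. The point is the workflow, where labeled scans train a model whose predictions are then checked on images it has never seen.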
As if that wasn’t enough, these algorithms are constantly learning and evolving, getting better at making these associations with each new data set that gets fed to them. Radiology, dermatology, and pathology will experience a giant upheaval as tech giants and startups alike jump in to bring these deep learning algorithms to a hospital near you.
In fact, some already are: the FDA recently gave their seal of approval for an AI-powered medical imaging platform that helps doctors analyze and diagnose heart anomalies. This is the first time the FDA has approved a machine learning application for use in a clinical setting.
But how efficient is AI compared to humans, really? Well, aside from the obvious fact that software programs don’t get bored or distracted or have to check Facebook every twenty minutes, AI is exponentially better than us at analyzing data.
Take, for example, IBM’s Watson. Watson analyzed genomic data from both tumor cells and healthy cells and was ultimately able to glean actionable insights in a mere 10 minutes. Compare that to the 160 hours it would have taken a human to analyze that same data. Diagnoses aside, AI is also being leveraged in pharmaceuticals to aid in the very time-consuming grunt work of discovering new drugs, and all the big players are getting involved.
But AI is far from being just a behind-the-scenes player. Gartner recently predicted that by 2025, 50 percent of the population will rely on AI-powered “virtual personal health assistants” for their routine primary care needs. What this means is that consumer-facing voice and chat-operated “assistants” (think Siri or Cortana) would, in effect, serve as a central hub of interaction for all our connected health devices and the algorithms crunching all our real-time biometric data. These assistants would keep us apprised of our current state of well-being, acting as a sort of digital facilitator for our personal health objectives and an always-on health alert system that would notify us when we actually need to see a physician.
Slowly, and thanks to the tsunami of data and advancements in self-learning algorithms, healthcare is transitioning from a reactive model to more of a preventative model—and it’s completely upending the way care is delivered. Whether Elon Musk’s dystopian outlook on AI holds any weight or not is yet to be determined. But one thing’s certain: for the time being, artificial intelligence is saving our lives.
Image Credit: Jolygon / Shutterstock.com


#431653 9 Robot Animals Built From Nature’s ...

Millions of years of evolution have allowed animals to develop some elegant and highly efficient solutions to problems like locomotion, flight, and dexterity. As Boston Dynamics unveils its latest mechanical animals, here’s a rundown of nine recent robots that borrow from nature and why.
SpotMini – Boston Dynamics

Starting with BigDog in 2005, the US company has built a whole stable of four-legged robots in recent years. Their first product was designed to be a robotic packhorse for soldiers that borrowed the quadrupedal locomotion of animals to travel over terrain too rough for conventional vehicles.
The US Army ultimately rejected the robot for being too noisy, according to the Guardian, but since then the company has scaled down its design, first to the Spot, then a first edition of the SpotMini that came out last year.
The latter came with a robotic arm where its head should be and was touted as a domestic helper, but a sleeker second edition without the arm was released earlier this month. There’s little detail on what the new robot is designed for, but the more polished design suggests a more consumer-focused purpose.
OctopusGripper – Festo

Festo has released a long line of animal-inspired machines over the years, from a mechanical kangaroo to robotic butterflies. Its latest creation isn’t a full animal—instead it’s a gripper based on an octopus tentacle that can be attached to the end of a robotic arm.
The pneumatically-powered device is made of soft silicone and features two rows of suction cups on its inner edge. By applying compressed air, the tentacle can wrap around a wide variety of differently shaped objects, just like its natural counterpart, and a vacuum can be applied to the larger suction cups to grip the object securely. Because it’s soft, it holds promise for robots required to operate safely in collaboration with humans.
CRAM – University of California, Berkeley

Cockroaches are renowned for their hardiness and ability to disappear down cracks that seem far too small for them. Researchers at UC Berkeley decided these capabilities could be useful for search and rescue missions and so set about experimenting on the insects to find out their secrets.
They found the bugs can squeeze into gaps a fifth of their normal standing height by splaying their legs out to the side without significantly slowing themselves down. So they built a palm-sized robot with a jointed plastic shell that could do the same to squeeze into crevices half its normal height.
Snake Robot – Carnegie Mellon University

Search and rescue missions are a common theme for animal-inspired robots, but the snake robot built by CMU researchers is one of the first to be tested in a real disaster.
A team of roboticists from the university helped Mexican Red Cross workers search collapsed buildings for survivors after the 7.1-magnitude earthquake that struck Mexico City in September. The snake design provides a small diameter and the ability to move in almost any direction, which makes the robot ideal for accessing tight spaces, though the team was unable to locate any survivors.
The snake currently features a camera on the front, but researchers told IEEE Spectrum that the experience helped them realize they should also add a microphone to listen for people trapped under the rubble.
Bio-Hybrid Stingray – Harvard University

Taking more than just inspiration from the animal kingdom, a group from Harvard built a robotic stingray out of silicone and rat heart muscle cells.
The robot uses the same synchronized undulations along the edge of its fins to propel itself as a ray does. But while a ray has two sets of muscles to pull the fins up and down, the new device has only one that pulls them down, with a springy gold skeleton that pulls them back up again. The cells are also genetically modified to be activated by flashes of light.
The project’s leader eventually hopes to engineer a human heart, and both his stingray and an earlier jellyfish bio-robot are primarily aimed at better understanding how that organ works.
Bat Bot – Caltech

Most recent advances in drone technology have come from quadcopters, but Caltech engineers think rigid devices with rapidly spinning propellers are probably not ideal for use in close quarters with humans.
That’s why they turned to soft-winged bats for inspiration. That’s no easy feat, though, considering bats use more than 40 joints with each flap of their wings, so the team had to pare the design down to nine joints to keep the robot from becoming too bulky. The simplified bat can’t ascend yet, but its onboard computer and sensors let it autonomously carry out glides, turns, and dives.
Salto – UC Berkeley

While even the most advanced robots tend to plod around, tree-dwelling animals have the ability to spring from branch to branch to clear obstacles and climb quickly. This could prove invaluable for search and rescue robots by allowing them to quickly traverse disordered rubble.
UC Berkeley engineers turned to the Senegal bush baby for inspiration after determining it scored highest in “vertical jumping agility”—a combination of how high and how frequently an animal can jump. Recreating its ability to drop into a super-low crouch that stores energy in its tendons, they built a robot that can carry out parkour-style double jumps off walls to quickly gain height.
Pleurobot – École Polytechnique Fédérale de Lausanne

Normally robots are masters of air, land, or sea, but the robotic salamander built by researchers at EPFL can both walk and swim.
Its designers used X-ray videos to carefully study how the amphibians move before using this to build a true-to-life robotic version using 3D printed bones, motorized joints, and a synthetic nervous system made up of electronic circuitry.
The robot’s low center of mass and segmented legs make it great at navigating rough terrain without losing balance, and the ability to swim gives added versatility. They also hope it will help paleontologists gain a better understanding of the movements of the first tetrapods to transition from water to land, which salamanders are the best living analog of.
Eelume – Eelume

A snakelike body isn’t only useful on land—eels are living proof it’s an efficient way to travel underwater, too. Norwegian robotics company Eelume has borrowed these principles to build a robot capable of sub-sea inspection, maintenance, and repair.
The modular design allows operators to put together their own favored configuration of joints and payloads such as sensors and tools. And while an early version of the robot used the same method of locomotion as an eel, the latest version undergoing sea trials has added a variety of thrusters for greater speeds and more maneuverability.
Image Credit: Boston Dynamics / YouTube


#431587 Atlas Jumped (not Shrugged)!

Jumping Jack Atlas! Is it just me, or does this jumping, backflipping humanoid robot scare the living daylights out of you too?
