Tag Archives: workers
#436944 Is Digital Learning Still Second Best?
As Covid-19 continues to spread, the world has gone digital on an unprecedented scale. Tens of thousands of employees are working from home, and huge conferences, like the Google I/O and Apple WWDC software extravaganzas, plan to experiment with digital events.
Universities too are sending students home. This might have meant an extended break from school not too long ago. But no more. As lecture halls go empty, an experiment into digital learning at scale is ramping up. In the US alone, over 100 universities, from Harvard to Duke, are offering online classes to students to keep the semester going.
While digital learning has been improving for some time, Covid-19 may not only tip us further into a more digitally connected reality, but also help us better appreciate its benefits. This is important because historically, digital learning has been viewed as inferior to traditional learning. But that may be changing.
The Inversion
We often think about digital technologies as ways to reach people without access to traditional services—online learning for children who don’t have schools nearby or telemedicine for patients with no access to doctors. And while these solutions have helped millions of people, they’re often viewed as “second best” and “better than nothing.” Even in more resource-rich environments, there’s an assumption one should pay more to attend an event in person—a concert, a football game, an exercise class—while digital equivalents are extremely cheap or free. Why is this? And is the situation about to change?
Take the case of Dr. Sanjeev Arora, a professor of medicine at the University of New Mexico. Arora started Project Echo because he was frustrated by how many late-stage cases of hepatitis C he encountered in rural New Mexico. He realized that if he had reached patients sooner, he could have prevented needless deaths. The solution? Digital learning for local health workers.
Project Echo connects rural healthcare practitioners to specialists at top health centers by video. The approach is collaborative: specialists share best practices and work through cases with participants, who apply them in the real world, and the group learns from edge cases. In addition to expert presentations, there are plenty of opportunities to ask questions and interact with specialists.
The method forms a digital loop of learning, practice, assessment, and adjustment.
Since 2003, Project Echo has scaled to 800 locations in 39 countries and trained over 90,000 healthcare providers. Most notably, a study in The New England Journal of Medicine found that the outcomes of hepatitis C treatment given by Project Echo-trained healthcare workers in rural and underserved areas were similar to outcomes at university medical centers. That is, digital learning in this context was equivalent to high-quality in-person learning.
If that is possible today, with simple tools, will such programs surpass traditional medical centers and schools in the future? Can digital learning more generally follow suit and have the same success? Perhaps. Going digital brings its own special toolset to the table too.
The Benefits of Digital
If you’re training people online, you can record the session to better understand their engagement levels—or even add artificial intelligence to analyze it in real time. Ahura AI, for example, founded by Bryan Talebi, aims to upskill workers through online training. Early study of their method suggests they can significantly speed up learning by analyzing users’ real-time emotions—like frustration or distraction—and adjusting the lesson plan or difficulty on the fly.
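Ahura AI's actual system is proprietary, but the core control loop, reading real-time engagement signals and nudging the lesson in response, is simple to sketch. The signal names, thresholds, and level scale below are invented for illustration:

```python
# Toy sketch of adaptive-difficulty logic. Signal names and thresholds
# are invented; each signal is assumed to be a score in [0, 1] from
# something like webcam-based emotion analysis.
def adjust_difficulty(difficulty, frustration, distraction,
                      min_level=1, max_level=10):
    """Nudge lesson difficulty one step based on engagement signals."""
    if frustration > 0.7:        # learner is struggling: ease off
        difficulty -= 1
    elif distraction > 0.7:      # learner is bored: raise the bar
        difficulty += 1
    return max(min_level, min(max_level, difficulty))

print(adjust_difficulty(5, frustration=0.9, distraction=0.2))  # 4
print(adjust_difficulty(5, frustration=0.1, distraction=0.8))  # 6
```

Run once per lesson segment, a rule this simple already keeps learners out of the frustration and boredom zones; the real systems presumably replace the hand-set thresholds with learned models.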
Other benefits of digital learning include the near-instantaneous download of course materials—rather than printing and shipping books—and being able to more easily report grades and other results, a requirement for many schools and social services organizations. And of course, as other digitized industries show, digital learning can grow and scale further at much lower costs.
To that last point, 360ed, a digital learning startup founded in 2016 by Hla Hla Win, now serves millions of children in Myanmar with augmented reality lesson plans. And Global Startup Ecosystem, founded by Christine Souffrant Ntim and Einstein Kofi Ntim in 2015, is the world’s first and largest digital accelerator program. Their entirely online programs support over 1,000 companies in 90 countries. It’s astonishing how fast both of these organizations have grown.
Notably, both examples include offline experiences too. Many of the 360ed lesson plans come with paper flashcards children use with their smartphones because the online-offline interaction improves learning. The Global Startup Ecosystem also hosts about 10 additional in-person tech summits around the world on various topics through a related initiative.
Looking further ahead, probably the most important benefit of online learning will be its potential to integrate with other digital systems in the workplace.
Imagine a medical center that has perfect information about every patient and treatment in real time, and imagine that this information is (anonymously and privately) centralized, analyzed, and shared with medical centers, research labs, pharmaceutical companies, clinical trials, policy makers, and medical students around the world. Just as self-driving cars can learn to drive better by having access to the experiences of other self-driving cars, so too can any group working to solve complex, time-sensitive challenges learn from and build on each other’s experiences.
Why This Matters
While in the long term the world will likely end up combining the best aspects of traditional and digital learning, it’s important in the near term to be more aware of the assumptions we make about digital technologies. Some of the most pioneering work in education, healthcare, and other industries may not be highly visible right now because it is in a virtual setting. Most people are unaware, for example, that the busiest emergency room in rural America is already virtual.
Once they start converging with other digital technologies, these innovations will likely become the mainstream system for all of us. Which raises more questions: What is the best business model for these virtual services? If they start delivering better healthcare and educational outcomes than traditional institutions, should they charge more? Hopefully, we will see an even bigger shift occurring, in which technology allows us to provide high quality education, healthcare, and other services to everyone at more affordable prices than today.
These are some of the topics we can consider as Covid-19 forces us into uncharted territory.
Image Credit: Andras Vas / Unsplash
#436470 Retail Robots Are on the Rise—at Every ...
The robots are coming! The robots are coming! On our sidewalks, in our skies, in our every store… Over the next decade, robots will enter the mainstream of retail.
As countless robots work behind the scenes to stock shelves, serve customers, and deliver products to our doorstep, the speed of retail will accelerate.
These changes are already underway. In this blog, we’ll elaborate on how robots are entering the retail ecosystem.
Let’s dive in.
Robot Delivery
On August 3rd, 2016, Domino’s Pizza introduced the Domino’s Robotic Unit, or “DRU” for short. The first home delivery pizza robot, the DRU looks like a cross between R2-D2 and an oversized microwave.
LIDAR and GPS sensors help it navigate, while temperature sensors keep hot food hot and cold food cold. Already, it’s been rolled out in ten countries, including New Zealand, France, and Germany, but its August 2016 debut was critical—as it was the first time we’d seen robotic home delivery.
And it won’t be the last.
A dozen or so different delivery bots are fast entering the market. Starship Technologies, for instance, a startup created by Skype founders Janus Friis and Ahti Heinla, has a general-purpose home delivery robot. Right now, the system is an array of cameras and GPS sensors, but upcoming models will include microphones, speakers, and even the ability—via AI-driven natural language processing—to communicate with customers. Since 2016, Starship has already carried out 50,000 deliveries in over 100 cities across 20 countries.
Along similar lines, Nuro—co-founded by Jiajun Zhu, one of the engineers who helped develop Google’s self-driving car—has a miniature self-driving car of its own. Half the size of a sedan, the Nuro looks like a toaster on wheels, except with a mission. This toaster has been designed to carry cargo—about 12 bags of groceries (version 2.0 will carry 20)—which it’s been doing for select Kroger stores since 2018. Domino’s also partnered with Nuro in 2019.
As these delivery bots take to our streets, others are streaking across the sky.
Back in 2013, Amazon came first, announcing Prime Air—the e-commerce giant’s promise of drone delivery in 30 minutes or less. Almost immediately, companies ranging from 7-Eleven and Walmart to Google and Alibaba jumped on the bandwagon.
While critics remain doubtful, the head of the FAA’s drone integration department recently said that drone deliveries may be “a lot closer than […] the skeptics think. [Companies are] getting ready for full-blown operations. We’re processing their applications. I would like to move as quickly as I can.”
In-Store Robots
While delivery bots start to spare us trips to the store, those who prefer shopping the old-fashioned way—i.e., in person—also have plenty of human-robot interaction in store. In fact, these robotics solutions have been around for a while.
In 2014, SoftBank introduced Pepper, a humanoid robot capable of understanding human emotion. Pepper is cute: 4 feet tall, with a white plastic body, two black eyes, a dark slash of a mouth, and a base shaped like a mermaid’s tail. Across her chest is a touch screen to aid in communication. And there’s been a lot of communication. Pepper’s cuteness is intentional, matching her mission: to help humans enjoy life as much as possible.
Over 12,000 Peppers have been sold. She serves ice cream in Japan, greets diners at a Pizza Hut in Singapore, and dances with customers at a Palo Alto electronics store. More importantly, Pepper’s got company.
Walmart uses shelf-stocking robots for inventory control. Best Buy uses a robo-cashier, allowing select locations to operate 24-7. And Lowe’s Home Improvement employs the LoweBot—a giant iPad on wheels—to help customers find the items they need while tracking inventory along the way.
Warehouse Bots
Yet the biggest benefit robots provide might be in-warehouse logistics.
In 2012, when Amazon dished out $775 million for Kiva Systems, few could have predicted that just six years later, 45,000 Kiva robots would be deployed at its fulfillment centers, helping process a whopping 306 items per second during the Christmas season.
And many other retailers are following suit.
Order jeans from the Gap, and soon they’ll be sorted, packed, and shipped with the help of a Kindred robot. Remember the old arcade game where you picked up teddy bears with a giant claw? That’s Kindred, only her claw picks up T-shirts, pants, and the like, placing them in designated drop-off zones that resemble tiny mailboxes (for further sorting or shipping).
The big deal here is democratization. Kindred’s robot is cheap and easy to deploy, allowing smaller companies to compete with giants like Amazon.
Final Thoughts
For retailers interested in staying in business, there doesn’t appear to be much choice in the way of robotics.
The US federal minimum wage is projected to reach $15 an hour by 2025 (the House of Representatives has already passed the bill, with the wage hike unfolding gradually between now and then), and many consider that number far too low.
Yet, as human labor costs continue to climb, robots won’t just be coming; they’ll be here, there, and everywhere. It’s going to become increasingly difficult for store owners to justify human workers who call in sick, show up late, and can easily get injured. Robots work 24-7. They never take a day off and never need a bathroom break, health insurance, or parental leave.
Going forward, this spells a growing challenge of technological unemployment (a blog topic I will cover in the coming month). But in retail, robotics usher in tremendous benefits for companies and customers alike.
And while professional re-tooling initiatives and the transition of human capital from retail logistics to a booming experience economy take hold, robotic retail interaction and last-mile delivery will fundamentally transform our relationship with commerce.
This blog comes from The Future is Faster Than You Think—my upcoming book, to be released Jan 28th, 2020. To get an early copy and access up to $800 worth of pre-launch giveaways, sign up here!
Join Me
(1) A360 Executive Mastermind: If you’re an exponentially and abundance-minded entrepreneur who would like coaching directly from me, consider joining my Abundance 360 Mastermind, a highly selective community of 360 CEOs and entrepreneurs whom I coach for 3 days every January in Beverly Hills, CA. Through A360, I provide my members with context and clarity about how converging exponential technologies will transform every industry. I’m committed to running A360 for the course of an ongoing 25-year journey as a “countdown to the Singularity.”
If you’d like to learn more and consider joining our 2020 membership, apply here.
(2) Abundance-Digital Online Community: I’ve also created a Digital/Online community of bold, abundance-minded entrepreneurs called Abundance-Digital. Abundance-Digital is Singularity University’s ‘onramp’ for exponential entrepreneurs — those who want to get involved and play at a higher level. Click here to learn more.
(Both A360 and Abundance-Digital are part of Singularity University — your participation opens you to a global community.)
Image Credit: Image by imjanuary from Pixabay
#436466 How Two Robots Learned to Grill and ...
The list of things robots can do seems to be growing by the week. They can play sports, help us explore outer space and the deep sea, take over some of our boring everyday tasks, and even assemble Ikea furniture.
Now they can add one more accomplishment to the list: grilling and serving a hot dog.
It seems like a pretty straightforward task, and as far as grilling goes, hot dogs are about as easy as it gets (along with, maybe, burgers? Hot dogs require more rotation, but it’s easier to tell when they’re done since they’re lighter in color).
Let’s paint a picture: you’re manning the grill at your family’s annual Fourth of July celebration. You’ve got a 10-pack of plump, juicy beef franks and a hungry crowd of relatives whose food-to-alcohol ratio is getting pretty skewed—they need some solid calories, pronto. What are the steps you need to take to get those franks from package to plate?
Each one needs to be placed on the grill, rotated every couple minutes for even cooking, removed from the grill when you deem it’s done, then—if you’re the kind of guy or gal who goes the extra mile—placed in a bun and dressed with ketchup, mustard, pickles, and the like before being handed over to salivating, too-loud Uncle Hector or sweet, bored Cousin Margaret.
While carrying out your grillmaster duties, you know better than to drop the hot dogs on the ground, leave them cooking on one side for too long, squeeze them to the point of breaking or bursting, and any other hot-dog-ruining amateur moves.
But for a robot, that’s a lot to figure out, especially one with no prior knowledge of grilling hot dogs (which, well, describes most robots).
As described in a paper published in this week’s Science Robotics, a team from Boston University programmed two robotic arms to use reinforcement learning—a branch of machine learning in which software gathers information about its environment, then learns from it by replaying its experiences and incorporating rewards—to cook and serve hot dogs.
The team used a set of formulas to specify and combine tasks (“pick up hot dog and place on the grill”), meet safety requirements (“always avoid collisions”), and incorporate general prior knowledge (“you cannot pick up another hot dog if you are already holding one”).
Baxter and Jaco—as the two robots were dubbed—were trained through computer simulations. The paper’s authors emphasized their use of what they call a “formal specification language” for training the software, with the aim of generating easily interpretable task descriptions. In reinforcement learning, they explain, being able to understand how a reward function influences an AI’s learning process is a key component in understanding the system’s behavior—but most systems lack this quality, and are thus likely to be lumped into the ‘black box’ of AI.
The robots’ decisions throughout the hot dog prep process—when to turn a hot dog, when to take it off the grill, and so on—are, the authors write, “easily interpretable from the beginning because the language is very similar to plain English.”
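The team's formal specification language is well beyond a blog post, but the flavor of the approach, readable task rules compiled into a reward signal that a policy then learns by trial and error, can be sketched in miniature. Everything below (the state variables, action names, and reward values) is invented for illustration and is not the authors' actual system:

```python
# Toy sketch (NOT the paper's formalism): plain-English-like task rules
# compiled into a reward, learned with tabular Q-learning.
import random
from collections import defaultdict

random.seed(0)
ACTIONS = ["pick_up", "place_on_grill", "wait", "serve"]  # invented names

def step(state, action):
    """Tiny world model: state = (holding, on_grill, cooked)."""
    holding, on_grill, cooked = state
    if action == "pick_up" and not holding and not on_grill:
        holding = True                      # "you can't hold two hot dogs"
    elif action == "place_on_grill" and holding:
        holding, on_grill = False, True
    elif action == "wait" and on_grill:
        cooked = True                       # grilling takes a turn
    elif action == "serve" and on_grill and cooked:
        return (False, False, False), True  # success: episode ends
    return (holding, on_grill, cooked), False

def reward(action, done):
    # Each term reads like a rule, so behavior traces back to the spec.
    if done:
        return 10.0   # "serve a fully cooked hot dog"
    if action == "serve":
        return -5.0   # "never serve an uncooked hot dog"
    return -0.1       # "waste no time"

Q = defaultdict(float)
for _ in range(2000):                       # training episodes
    s = (False, False, False)
    for _ in range(20):
        a = (random.choice(ACTIONS) if random.random() < 0.2
             else max(ACTIONS, key=lambda x: Q[(s, x)]))
        s2, done = step(s, a)
        target = reward(a, done) + (
            0.0 if done else 0.9 * max(Q[(s2, x)] for x in ACTIONS))
        Q[(s, a)] += 0.5 * (target - Q[(s, a)])
        s = s2
        if done:
            break

s, plan = (False, False, False), []         # greedy rollout
for _ in range(6):
    a = max(ACTIONS, key=lambda x: Q[(s, x)])
    plan.append(a)
    s, done = step(s, a)
    if done:
        break
print(plan)  # learned order: pick up, grill, wait, serve
```

Because each reward term maps to a readable rule, you can trace exactly why the learned policy grills before serving, which is the interpretability property the paper is after.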
Besides being a step towards more explainable AI systems, Baxter and Jaco are another example of fast-food robots—following in the footsteps of their burger and pizza counterparts—that may take over some repetitive manual tasks currently performed by human workers. As robots’ capabilities improve through incremental progress like this, they’ll be able to take on additional tasks.
In a not-so-distant future, then, you just may find yourself throwing back drinks with Uncle Hector and Cousin Margaret while your robotic replacement mans the grill, churning out hot dogs that are perfectly cooked every time.
Image Credit: Image by Muhammad Ribkhan from Pixabay
#436462 Robotic Exoskeletons, Like This One, Are ...
When you imagine an exoskeleton, chances are it might look a bit like the Guardian XO from Sarcos Robotics. The XO is literally a robot you wear (or maybe, it wears you). The suit’s powered limbs sense your movements and match their position to yours with little latency to give you effortless superstrength and endurance—lifting 200 pounds will feel like 10.
A vision of robots and humankind working together in harmony. Now, isn’t that nice?
Of course, there isn’t anything terribly novel about an exoskeleton. We’ve seen plenty of concepts and demonstrations in the last decade. These include light exoskeletons tailored to industrial settings—some of which are being tested out by the likes of Honda—and healthcare exoskeletons that support the elderly or folks with disabilities.
Full-body powered robotic exoskeletons are a bit rarer, which makes the Sarcos suit pretty cool to look at. But like all things in robotics, practicality matters as much as vision. It’s worth asking: Will anyone buy and use the thing? Is it more than a concept video?
Sarcos thinks so, and they’re excited about it. “If you were to ask the question, what does 30 years and $300 million look like,” Sarcos CEO, Ben Wolff, told IEEE Spectrum, “you’re going to see it downstairs.”
The XO appears to check a few key boxes. For one, it’s user friendly. According to Sarcos, it only takes a few minutes for the uninitiated to strap in and get up to speed. Feeling comfortable doing work with the suit takes a few hours. This is thanks to a high degree of sensor-based automation that allows the robot to seamlessly match its user’s movements.
The XO can also operate for more than a few minutes. It has two hours of battery life, and with spares on hand, it can go all day. The batteries are hot-swappable, meaning you can replace a drained battery with a new one without shutting the system down.
The suit is aimed at manufacturing, where workers are regularly moving heavy stuff around. Additionally, Wolff told CNET, the suit could see military use. But that doesn’t mean Avatar-style combat. The XO, Wolff said, is primarily about logistics (lifting and moving heavy loads) and isn’t designed to be armored, so it won’t likely see the front lines.
The system will set customers back $100,000 a year to rent, which sounds like a lot, but for industrial or military purposes, the six-figure rental may not deter would-be customers if the suit proves itself a useful bit of equipment. (And it’s reasonable to imagine the price coming down as the technology becomes more commonplace and competitors arrive.)
Sarcos got into exoskeletons a couple decades ago and was originally funded by the military (like many robotics endeavors). Videos hit YouTube as long ago as 2008, but after announcing the company was taking orders for the XO earlier this year, Sarcos says they’ll deliver the first alpha units in January, which is a notable milestone.
Broadly, robotics has advanced a lot in recent years. YouTube sensations like Boston Dynamics have regularly earned millions of views (and inevitably, headlines stoking robot fear). They went from tethered treadmill sessions to untethered backflips off boxes. While today’s robots really are vastly superior to their ancestors, they’ve struggled to prove themselves useful. A counterpoint to flashy YouTube videos, the DARPA Robotics Challenge gave birth to another meme altogether. Robots falling over. Often and awkwardly.
This year marks some of the first commercial fruits of a few decades’ research. Boston Dynamics started offering its robot dog, Spot, to select customers in 2019. Whether this proves to be a headline-worthy flash in the pan or something sustainable remains to be seen. But between fully autonomous robots and exoskeletons like the XO, the latter will likely prove easier to make practical for a variety of uses.
Whereas autonomous robots require highly advanced automation to navigate uncertain and ever-changing conditions—automation which, at the moment, remains largely elusive (though the likes of Google are pairing the latest AI with robots to tackle the problem)—an exoskeleton mainly requires physical automation. The really hard bits, like navigating and recognizing and interacting with objects, are outsourced to its human operator.
As it turns out, for today’s robots the best AI is still us. We may yet get chipper automatons like Rosie the Robot, but until then, for complicated applications, we’ll strap into our mechs for their strength and endurance, and they’ll wear us for our brains.
Image Credit: Sarcos Robotics
#436261 AI and the future of work: The prospects ...
AI experts gathered at MIT last week, with the aim of predicting the role artificial intelligence will play in the future of work. Will it be the enemy of the human worker? Will it prove to be a savior? Or will it be just another innovation—like electricity or the internet?
As IEEE Spectrum previously reported, this conference (“AI and the Future of Work Congress”), held at MIT’s Kresge Auditorium, offered sometimes pessimistic outlooks on the job- and industry-destroying path that AI and automation seems to be taking: Self-driving technology will put truck drivers out of work; smart law clerk algorithms will put paralegals out of work; robots will (continue to) put factory and warehouse workers out of work.
Andrew McAfee, co-director of MIT’s Initiative on the Digital Economy, said even just in the past couple years, he’s noticed a shift in the public’s perception of AI. “I remember from previous versions of this conference, it felt like we had to make the case that we’re living in a period of accelerating change and that AI’s going to have a big impact,” he said. “Nobody had to make that case today.”
Elisabeth Reynolds, executive director of MIT’s Task Force on the Work of the Future, noted that following the path of least resistance is not a viable way forward. “If we do nothing, we’re in trouble,” she said. “The future will not take care of itself. We have to do something about it.”
Panelists and speakers spoke about championing productive uses of AI in the workplace, which ultimately benefit both employees and customers.
As one example, Zeynep Ton, professor at MIT Sloan School of Management, highlighted retailer Sam’s Club’s recent rollout of a program called Sam’s Garage. Previously, customers shopping for tires spent somewhere between 30 and 45 minutes with a Sam’s Club associate, paging through manuals and looking up specs on websites.
But with an AI algorithm, they were able to cut that spec hunting time down to 2.2 minutes. “Now instead of wasting their time trying to figure out the different tires, they can field the different options and talk about which one would work best [for the customer],” she said. “This is a great example of solving a real problem, including [enhancing] the experience of the associate as well as the customer.”
“We think of it as an AI-first world that’s coming,” said Scott Prevost, VP of engineering at Adobe. Prevost said AI agents in Adobe’s software will behave something like a creative assistant or intern who will take care of more mundane tasks for you.
“We need a mindset change. That it is not just about minimizing costs or maximizing tax benefits, but really worrying about what kind of society we’re creating and what kind of environment we’re creating if we keep on just automating and [eliminating] good jobs.”
—Daron Acemoglu, MIT Institute Professor of Economics
Prevost cited an internal survey of Adobe customers that found 74 percent of respondents’ time was spent doing repetitive work—the kind that might be automated by an AI script or smart agent.
“It used to be you’d have the resources to work on three ideas [for a creative pitch or presentation],” Prevost said. “But if the AI can do a lot of the production work, then you can have 10 or 100. Which means you can actually explore some of the further out ideas. It’s also lowering the bar for everyday people to create really compelling output.”
In addition to changing the nature of work, noted a number of speakers at the event, AI is also directly transforming the workforce.
Jacob Hsu, CEO of the recruitment company Catalyte, spoke about using AI as a job placement tool. The company seeks to fill myriad positions including auto mechanics, baristas, and office workers—with its sights on candidates including young people and mid-career job changers. To find them, it advertises on Craigslist, social media, and traditional media.
The prospects who sign up with Catalyte take a battery of tests. The company’s AI algorithms then match each prospect’s skills with the field best suited for their talents.
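Catalyte's actual matching model is not public, but the general technique, scoring a candidate's test results against per-field profiles and picking the nearest fit, is straightforward. The field names and score dimensions below are invented for illustration:

```python
# Illustrative sketch only: match a candidate's test scores to the
# closest field profile by cosine similarity. Field names and score
# dimensions are invented, not Catalyte's actual model.
import math

FIELD_PROFILES = {                       # [logic, persistence, people]
    "software_dev":  [0.9, 0.6, 0.3],
    "auto_mechanic": [0.6, 0.9, 0.4],
    "barista":       [0.3, 0.5, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two score vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_field(scores):
    """Return the field whose profile best matches the candidate."""
    return max(FIELD_PROFILES,
               key=lambda f: cosine(scores, FIELD_PROFILES[f]))

print(best_field([0.8, 0.5, 0.2]))  # software_dev
print(best_field([0.2, 0.4, 0.9]))  # barista
```

A production system would presumably learn the profiles from outcomes data rather than hand-set them, but the "Sorting Hat" idea Hsu describes is essentially this nearest-profile lookup at scale.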
“We want to be like the Harry Potter Sorting Hat,” Hsu said.
Guillermo Miranda, IBM’s global head of corporate social responsibility, said IBM has increasingly been hiring based not on credentials but on skills. For instance, he said, as much as 50 percent of the company’s new hires in some divisions do not have a traditional four-year college degree. “As a company, we need to be much more clear about hiring by skills,” he said. “It takes discipline. It takes conviction. It takes a little bit of enforcing with H.R. by the business leaders. But if you hire by skills, it works.”
Ardine Williams, Amazon’s VP of workforce development, said the e-commerce giant has been experimenting with developing skills of the employees at its warehouses (a.k.a. fulfillment centers) with an eye toward putting them in a position to get higher-paying work with other companies.
She described an agreement Amazon had made in its Dallas fulfillment center with aircraft maker Sikorsky, which had been experiencing a shortage of skilled workers for its nearby factory. So Amazon offered its employees free certification training to seek higher-paying work at Sikorsky.
“I do that because now I have an attraction mechanism—like a G.I. Bill,” Williams said. The program is also available only to employees who have worked at least a year with Amazon, so it doubles as a medium-term retention tool while ultimately moving workers up the wage ladder.
Radha Basu, CEO of AI data company iMerit, said her firm aggressively hires from the pool of women and under-resourced minority communities in the U.S. and India. The company specializes in turning unstructured data (e.g. video or audio feeds) into tagged and annotated data for machine learning, natural language processing, or computer vision applications.
“There is a motivation with these young people to learn these things,” she said. “It comes with no baggage.”
Alastair Fitzpayne, executive director of The Aspen Institute’s Future of Work Initiative, said the future of work ultimately means, in bottom-line terms, the future of human capital. “We have an R&D tax credit,” he said. “We’ve had it for decades. It provides credit for companies that make new investment in research and development. But we have nothing on the human capital side that’s analogous.”
So a company that makes a big investment in worker training does it on its own dime, without any of the tax benefits it might accrue if it spent that money on, say, new equipment or new technology. Fitzpayne said a simple tweak to the R&D tax credit could make a big difference by incentivizing new investment in worker training. Even then, Amazon’s pre-existing worker training programs—at a company that already famously pays no taxes—would not count.
“We need a different way of developing new technologies,” said Daron Acemoglu, MIT Institute Professor of Economics. He pointed to the clean energy sector as an example. First a consensus around the problem needs to emerge. Then a broadly agreed-upon set of goals and measurements needs to be developed (e.g., that AI and automation would, for instance, create at least X new jobs for every Y jobs that it eliminates).
Then it just needs to be implemented.
“We need to build a consensus that, along the path we’re following at the moment, there are going to be increasing problems for labor,” Acemoglu said. “We need a mindset change. That it is not just about minimizing costs or maximizing tax benefits, but really worrying about what kind of society we’re creating and what kind of environment we’re creating if we keep on just automating and [eliminating] good jobs.”