
#433400 A Model for the Future of Education, and ...

As kids worldwide head back to school, I’d like to share my thoughts on the future of education.

Bottom line, how we educate our kids needs to radically change given the massive potential of exponential tech (e.g. artificial intelligence and virtual reality).

Without question, the number one driver for education is inspiration. As such, if you have a kid age 8–18, you’ll want to get your hands on an incredibly inspirational novel written by my dear friend Ray Kurzweil called Danielle: Chronicles of a Superheroine.

Danielle offers boys and girls a role model of a young woman who uses smart technologies and super-intelligence to partner with her friends to solve some of the world’s greatest challenges. It’s perfect to inspire anyone to pursue their moonshot.

Without further ado, let’s dive into the future of educating kids, and a summary of my white paper thoughts….

Just last year, edtech (education technology) investments surpassed a record high of 9.5 billion USD—up 30 percent from the year before.

Already valued at over half a billion USD, the AI in education market is set to surpass 6 billion USD by 2024.

And we’re now seeing countless new players enter the classroom, from a Soul Machines AI teacher specializing in energy use and sustainability to smart “lab schools” with personalized curricula.

As my two boys enter 1st grade, I continue asking myself: given that most elementary schools haven’t changed in many decades (perhaps a century), what do I want my kids to learn? How do I think about elementary school during an exponential era?

This post covers five subjects related to elementary school education:

Five Issues with Today’s Elementary Schools
Five Guiding Principles for Future Education
An Elementary School Curriculum for the Future
Exponential Technologies in our Classroom
Mindsets for the 21st Century

Excuse the length of this post, but if you have kids, the details might be meaningful. If you don’t, then next week’s post will return to normal length and another fun subject.

Also, if you’d like to see my detailed education “white paper,” you can view or download it here.

Let’s dive in…

Five Issues With Today’s Elementary Schools
There are probably lots of issues with today’s traditional elementary schools, but I’ll just choose a few that bother me most.

Grading: In the traditional education system, you start at an “A,” and every time you get something wrong, your score gets lower and lower. At best it’s demotivating, and at worst it has nothing to do with the world you occupy as an adult. In the gaming world (e.g. Angry Birds), it’s just the opposite. You start with zero and every time you come up with something right, your score gets higher and higher.
Sage on the Stage: Most classrooms have a teacher at the front of the room lecturing to students, half of whom are bored and half of whom are lost. The one-teacher-fits-all model comes from an era of scarcity, when great teachers and schools were rare.
Relevance: When I think back to elementary and secondary school, I realize how much of what I learned was never actually useful later in life, and how many of my critical lessons for success I had to pick up on my own (I don’t know about you, but I haven’t ever actually had to factor a polynomial in my adult life).
Imagination, Coloring inside the Lines: Probably of greatest concern to me is the factory-worker, industrial-era origin of today’s schools. Programs are so structured around rote memorization that they squash the originality out of most children. I’m reminded that “the day before something is truly a breakthrough, it’s a crazy idea.” Where do we pursue crazy ideas in our schools? Where do we foster imagination?
Boring: If learning in school is a chore, boring, or emotionless, then the most important driver of human learning, passion, is disengaged. Having our children memorize facts and figures, sit passively in class, and take mundane standardized tests completely defeats the purpose.

An average of 7,200 students drop out of high school each day, totaling 1.3 million each year. This means only 69 percent of students who start high school finish four years later. And over 50 percent of these high school dropouts name boredom as the number one reason they left.
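Those two figures are consistent with each other, as a quick check shows (a sketch; the ~180-day US school year is my assumption, not a number from the post):

```python
# Quick sanity check on the cited dropout figures. The ~180-day US school
# year is an assumption here, not a number from the post.
dropouts_per_day = 7_200
school_days_per_year = 180

dropouts_per_year = dropouts_per_day * school_days_per_year
print(f"{dropouts_per_year:,}")  # 1,296,000 -- roughly the 1.3 million cited
```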

Five Guiding Principles for Future Education
I imagine a relatively near-term future in which robotics and artificial intelligence will allow any of us, from ages 8 to 108, to easily and quickly find answers, create products, or accomplish tasks, all simply by expressing our desires.

From ‘mind to manufactured in moments.’ In short, we’ll be able to do and create almost whatever we want.

In this future, what attributes will be most critical for our children to learn to become successful in their adult lives? What’s most important for educating our children today?

For me it’s about passion, curiosity, imagination, critical thinking, and grit.

Passion: You’d be amazed at how many people don’t have a mission in life… A calling… something to jolt them out of bed every morning. The most valuable resource for humanity is the persistent and passionate human mind, so creating a future of passionate kids is so very important. For my 7-year-old boys, I want to support them in finding their passion or purpose… something that is uniquely theirs. The Apollo program and Star Trek drove my early love for all things space, and that passion drove me to learn and do.
Curiosity: Curiosity is innate in kids, yet something most adults lose over the course of their lives. Why? In a world of Google, robots, and AI, raising a kid who is constantly asking questions and running “what if” experiments can be extremely valuable. In an age of machine learning, massive data, and a trillion sensors, it is the quality of your questions that will matter most.
Imagination: Entrepreneurs and visionaries imagine the world (and the future) they want to live in, and then they create it. Kids happen to be some of the most imaginative humans around… it’s critical that they know how important and liberating imagination can be.
Critical Thinking: In a world flooded with often-conflicting ideas, baseless claims, misleading headlines, negative news, and misinformation, learning the skill of critical thinking helps find the signal in the noise. This principle is perhaps the most difficult to teach kids.
Grit/Persistence: Grit is defined as “passion and perseverance in pursuit of long-term goals,” and it has recently been widely acknowledged as one of the most important predictors of and contributors to success.

Teaching your kids not to give up, to keep trying, and to keep trying new ideas for something that they are truly passionate about achieving is extremely critical. Much of my personal success has come from such stubbornness. I joke that both XPRIZE and the Zero Gravity Corporation were “overnight successes after 10 years of hard work.”

So given those five basic principles, what would an elementary school curriculum look like? Let’s take a look…

An Elementary School Curriculum for the Future
Over the last 30 years, I’ve had the pleasure of starting two universities, International Space University (1987) and Singularity University (2007). My favorite part of co-founding both institutions was designing and implementing the curriculum. Along those lines, the following is my first shot at the type of curriculum I’d love my own boys to be learning.

I’d love your thoughts; I’ll be looking for them here: https://www.surveymonkey.com/r/DDRWZ8R

For the purpose of illustration, I’ll speak about ‘courses’ or ‘modules,’ but in reality these are just elements that would ultimately be woven together throughout the course of K-6 education.

Module 1: Storytelling/Communications

When I think about the skill that has served me best in life, it’s been my ability to present my ideas in the most compelling fashion possible, to get others on board, and to win support for birthing and growing ideas in an innovative direction. In my adult life, as an entrepreneur and a CEO, it’s been my ability to communicate clearly and tell compelling stories that has allowed me to create the future. I don’t think this lesson can start too early in life. So imagine a module, year after year, where our kids learn the art and practice of formulating and pitching their ideas: the best of oration and storytelling. Perhaps children in this class would watch TED presentations, or maybe they’d put together their own TEDx for kids. Ultimately, it’s about practice and getting comfortable with putting yourself and your ideas out there, and overcoming any fears of public speaking.

Module 2: Passions

A modern school should help our children find and explore their passion(s). Passion is the greatest gift of self-discovery. It is a source of interest and excitement, and is unique to each child.

The key to finding passion is exposure: allowing kids to experience as many adventures, careers, and passionate adults as possible. Historically, this was limited by geography and cost, and implemented by having local moms and dads present in class about their careers. “Hi, I’m Alan, Billy’s dad, and I’m an accountant. Accountants are people who…”

But in a world of YouTube and virtual reality, the ability for our children to explore 500 different possible careers or passions during their K-6 education becomes not only possible but compelling. I imagine a module where children share their newest passion each month, sharing videos (or VR experiences) and explaining what they love and what they’ve learned.

Module 3: Curiosity & Experimentation

Einstein famously said, “I have no special talent. I am only passionately curious.” Curiosity is innate in children, and often lost later in life. Arguably, curiosity is responsible for all major scientific and technological advances; it’s the desire of an individual to know the truth.

Coupled with curiosity is the process of experimentation and discovery: asking questions, creating and testing a hypothesis, and repeating experiments until the truth is found. As I’ve studied the most successful entrepreneurs and entrepreneurial companies, from Google and Amazon to Uber, their success is significantly due to their relentless use of experimentation to define their products and services.

Here I imagine a module which instills in children the importance of curiosity and gives them permission to say, “I don’t know, let’s find out.”

Further, I imagine a monthly module that teaches children how to design and execute valid and meaningful experiments. Imagine children who learn the skill of asking a question, proposing a hypothesis, designing an experiment, gathering the data, and then reaching a conclusion.
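That five-step loop can even be sketched in a few lines of code. The plant-growth scenario and every measurement below are hypothetical stand-ins for data a class would actually collect:

```python
# A sketch of the five-step experiment loop: question -> hypothesis ->
# experiment -> data -> conclusion. The plant-growth scenario and all of
# the measurements are hypothetical stand-ins for data a class would collect.
question = "Do plants near the window grow taller?"
hypothesis = "Window plants grow taller than shelf plants."

# Step 3: run the experiment (height gained in cm over a month, made-up data).
window_plants = [12.1, 11.4, 13.0, 12.6, 10.9, 12.3, 11.8, 13.4, 12.0, 11.7]
shelf_plants = [9.2, 8.8, 10.1, 9.5, 8.4, 9.9, 9.0, 10.3, 8.7, 9.6]

# Step 4: gather the data into simple summaries.
window_avg = sum(window_plants) / len(window_plants)
shelf_avg = sum(shelf_plants) / len(shelf_plants)

# Step 5: reach a (tentative) conclusion.
supported = window_avg > shelf_avg
print(f"window: {window_avg:.1f} cm, shelf: {shelf_avg:.1f} cm")
print("Hypothesis supported." if supported else "Hypothesis not supported.")
```

The point of the exercise is the loop itself: children see that a conclusion follows from the data, not from what they hoped would be true.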

Module 4: Persistence/Grit

Doing anything big, bold, and significant in life is hard work. You can’t just give up when the going gets rough. The mindset of persistence, of grit, is a learned behavior I believe can be taught at an early age, especially when it’s tied to pursuing a child’s passion.

I imagine a curriculum that, each week, studies the career of a great entrepreneur and highlights their story of persistence. It would highlight the individuals and companies that stuck with it, iterated, and ultimately succeeded.

Further, I imagine a module that combines persistence and experimentation in gameplay, such as that found in Dean Kamen’s FIRST LEGO league, where 4th graders (and up) research a real-world problem such as food safety, recycling, energy, and so on, and are challenged to develop a solution. They also must design, build, and program a robot using LEGO MINDSTORMS®, then compete on a tabletop playing field.

Module 5: Technology Exposure

In a world of rapidly accelerating technology, understanding how technologies work, what they do, and their potential for benefiting society is, in my humble opinion, critical to a child’s future. Technology and coding (more on this below) are tomorrow’s lingua franca.

In this module, I imagine teaching kids, in an age-appropriate way, through play and demonstration: giving them an overview of exponential technologies such as computation, sensors, networks, artificial intelligence, digital manufacturing, genetic engineering, augmented/virtual reality, and robotics, to name a few. This module is not about making a child an expert in any technology; it’s about giving them the language of these new tools and a conceptual overview of how they might use such a technology in the future. The goal here is to get them excited, give them demonstrations that make the concepts stick, and then let their imaginations run.

Module 6: Empathy

Empathy, defined as “the ability to understand and share the feelings of another,” has been recognized as one of the most critical skills for our children today. And while much has been written about it, and great practices exist for instilling it at home and in school, today’s new tools can accelerate it.

Virtual reality isn’t just about video games anymore. Artists, activists, and journalists now see the technology’s potential to be an empathy engine, one that can shine spotlights on everything from the Ebola epidemic to what it’s like to live in Gaza. And Jeremy Bailenson has been at the vanguard of investigating VR’s power for good.

For more than a decade, Bailenson’s lab at Stanford has been studying how VR can make us better people. Through the power of VR, volunteers at the lab have felt what it is like to be Superman (to see if it makes them more helpful), a cow (to reduce meat consumption), and even a coral (to learn about ocean acidification).

Silly as they might seem, these sorts of VR scenarios could be more effective than traditional public service ads at changing behavior. Afterwards, participants waste less paper, save more money for retirement, and are nicer to the people around them. And this could have consequences for how we teach and train everyone from cliquey teenagers to high court judges.

Module 7: Ethics/Moral Dilemmas

Related to empathy, and equally important, is the goal of infusing kids with a moral compass. Over a year ago, I toured a special school created by Elon Musk (the Ad Astra school) for his five boys (age 9 to 14). One element that is persistent in that small school of under 40 kids is the conversation about ethics and morals, a conversation manifested by debating real-world scenarios that our kids may one day face.

Here’s an example of the sort of gameplay/roleplay that I heard about at Ad Astra, which might be implemented in a module on morals and ethics. Imagine a small town on a lake, in which the majority of the town is employed by a single factory. But that factory has been polluting the lake and killing all the life in it. What do you do? Shutting down the factory means everyone loses their jobs. On the other hand, keeping the factory open means the lake is destroyed and its ecosystem dies. This kind of regular, routine conversation and gameplay lets children see the world in a critically important fashion.

Module 8: The 3R Basics (Reading, wRiting & aRithmetic)

There’s no question that young children entering kindergarten need the basics of reading, writing, and math. The only question is what’s the best way for them to get it? We all grew up in the classic mode of a teacher at the chalkboard, books, and homework at night. But I would argue that such teaching approaches are long outdated, now replaced with apps, gameplay, and the concept of the flipped classroom.

Pioneered by high school teachers Jonathan Bergmann and Aaron Sams in 2007, the flipped classroom reverses the sequence of events from that of the traditional classroom.

Students view lecture materials, usually in the form of video lectures, as homework prior to coming to class. In-class time is reserved for activities such as interactive discussions or collaborative work, all performed under the guidance of the teacher.

The benefits are clear:

Students can consume lectures at their own pace, viewing the video again and again until they get the concept, or fast-forwarding if the information is obvious.
The teacher is present while students apply new knowledge. Moving homework into class time gives teachers insight into which concepts, if any, their students are struggling with, and helps them adjust the class accordingly.
The flipped classroom produces tangible results: 71 percent of teachers who flipped their classes noticed improved grades, and 80 percent reported improved student attitudes as a result.

Module 9: Creative Expression & Improvisation

Every single one of us is creative. It’s human nature to be creative… the thing is that we each might have different ways of expressing our creativity.

We must encourage kids to discover and to develop their creative outlets early. In this module, imagine showing kids the many different ways creativity is expressed, from art to engineering to music to math, and then guiding them as they choose the area (or areas) they are most interested in. Critically, teachers (or parents) can then develop unique lessons for each child based on their interests, thanks to open education resources like YouTube and the Khan Academy. If my child is interested in painting and robots, a teacher or AI could scour the web and put together a custom lesson set from videos/articles where the best painters and roboticists in the world share their skills.

Adapting to change is critical for success, especially in our constantly changing world today. Improvisation is a skill that can be learned, and we need to be teaching it early.

In most collegiate “improv” classes, the core of great improvisation is the “Yes, and…” mindset. When acting out a scene, one actor might introduce a new character or idea, completely changing the context of the scene. It’s critical that the other actors in the scene say “Yes, and…,” accepting the new reality and then adding something new of their own.

Imagine playing similar role-play games in elementary schools, where a teacher gives the students a scene/context and constantly changes variables, forcing them to adapt and play.

Module 10: Coding

Computer science opens more doors for students than any other discipline in today’s world. Learning even the basics will help students in virtually any career, from architecture to zoology.

Coding is an important tool for computer science, in the way that arithmetic is a tool for doing mathematics and words are a tool for English. Coding creates software, but computer science is a broad field encompassing deep concepts that go well beyond coding.

Every 21st century student should also have a chance to learn about algorithms, how to make an app, or how the internet works. Computational thinking allows preschoolers to grasp concepts like algorithms, recursion and heuristics. Even if they don’t understand the terms, they’ll learn the basic concepts.

There are more than 500,000 open jobs in computing right now, representing the number one source of new wages in the US, and these jobs are projected to grow at twice the rate of all other jobs.

Coding is fun! Beyond the practical reasons for learning how to code, there’s the fact that creating a game or animation can be really fun for kids.
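To make one of those concepts concrete, a first recursion exercise in the spirit of this module might look like the following sketch (not drawn from any particular curriculum):

```python
def countdown(n: int) -> list:
    """Recursion for beginners: a function that calls itself on a smaller problem."""
    if n == 0:                          # base case: stop calling yourself
        return ["Blast off!"]
    return [str(n)] + countdown(n - 1)  # recursive step: same problem, one smaller

print(" ".join(countdown(3)))  # 3 2 1 Blast off!
```

A child doesn’t need the word “recursion” to get the idea: a big problem is solved by doing one small piece and handing the rest to the same rule.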

Module 11: Entrepreneurship & Sales

At its core, entrepreneurship is about identifying a problem (an opportunity), developing a vision on how to solve it, and working with a team to turn that vision into reality. I mentioned Elon’s school, Ad Astra: here, again, entrepreneurship is a core discipline where students create and actually sell products and services to each other and the school community.

You could recreate this basic exercise with a group of kids in lots of fun ways to teach them the basic lessons of entrepreneurship.

Related to entrepreneurship is sales. In my opinion, we need to be teaching sales to every child at an early age. Being able to “sell” an idea (again related to storytelling) has been a critical skill in my career, and it is a competency that many people simply never learned.

The lemonade stand has been a classic, though somewhat meager, lesson in sales from past generations, where a child sits on a street corner and tries to sell homemade lemonade for $0.50 to people passing by. I’d suggest we step the game up and take a more active approach in gamifying sales, and maybe having the classroom create a Kickstarter, Indiegogo or GoFundMe campaign. The experience of creating a product or service and successfully selling it will create an indelible memory and give students the tools to change the world.

Module 12: Language

A little over a year ago, I spent a week in China meeting with parents whose focus on kids’ education is extraordinary. One of the areas I found fascinating is how some of the most advanced parents are teaching their kids new languages: through games. On the tablet, the kids are allowed to play games, but only in French. A child’s desire to win fully engages them and drives their learning rapidly.

Beyond games, there’s virtual reality. We know that full immersion is what it takes to become fluent (at least later in life). A semester abroad in France or Italy, and you’ve got a great handle on the language and the culture. But what about for an eight-year-old?

Imagine a module where for an hour each day, the children spend their time walking around Italy in a VR world, hanging out with AI-driven game characters who teach them, engage them, and share the culture and the language in the most personalized and compelling fashion possible.

Exponential Technologies for Our Classrooms
If you’ve attended Abundance 360 or Singularity University, or followed my blogs, you’ll probably agree with me that the way our children will learn is going to fundamentally transform over the next decade.

Here’s an overview of the top five technologies that will reshape the future of education:

Tech 1: Virtual Reality (VR) can make learning truly immersive. It is often claimed that we remember 20 percent of what we hear, 30 percent of what we see, and up to 90 percent of what we do or simulate, and VR excels at that last scenario. VR enables students to simulate flying through the bloodstream while learning about the different cells they encounter, or to travel to Mars to inspect the surface for life.

To make this a reality, Google recently launched its Expeditions Pioneer Program for Cardboard. Under this program, thousands of schools around the world have received a kit containing everything a teacher needs to take his or her class on a virtual trip. While data on VR use in K-12 schools and colleges have yet to be gathered, the steady growth of the market is reflected in the surge of companies (including zSpace, Alchemy VR and Immersive VR Education) solely dedicated to providing schools with packaged education curricula and content.

Add to VR a related technology called augmented reality (AR), and experiential education really comes alive. Imagine wearing an AR headset that is able to superimpose educational lessons on top of real-world experiences. Interested in botany? As you walk through a garden, the AR headset superimposes the name and details of every plant you see.

Tech 2: 3D Printing is allowing students to bring their ideas to life. Never mind the computer on every desktop (or a tablet for every student); that’s a given. In the near future, teachers and students will want to have a 3D printer on the desk to help them learn core science, technology, engineering and mathematics (STEM) principles. Bre Pettis, of MakerBot Industries, in a grand but practical vision, sees a 3D printer on every school desk in America. “Imagine if you had a 3D printer instead of a LEGO set when you were a kid; what would life be like now?” asks Mr. Pettis. You could print your own mini-figures and your own blocks, and you could iterate on new designs as quickly as your imagination would allow. MakerBots are now in over 5,000 K-12 schools across the US.

Taking this one step further, you could imagine having a 3D file for most entries in Wikipedia, allowing you to print out and study an object you can only read about or visualize in VR.

Tech 3: Sensors & Networks. An explosion of sensors and networks is going to connect everyone at gigabit speeds, making access to rich video available at all times. At the same time, sensors continue to shrink in size and power consumption, becoming embedded in everything. One benefit will be the connection of sensor data with machine learning and AI (below), such that a child’s drifting attention or confusion can be easily measured and communicated. The result would be a re-presentation of the information through an alternate modality or at a different speed.

Tech 4: Machine Learning is making learning adaptive and personalized. No two students are identical—they have different modes of learning (by reading, seeing, hearing, doing), come from different educational backgrounds, and have different intellectual capabilities and attention spans. Advances in machine learning and the surging adaptive learning movement are seeking to solve this problem. Companies like Knewton and DreamBox have over 15 million students on their respective adaptive learning platforms. Soon, every education application will be adaptive, learning how to personalize the lesson for a specific student. There will be adaptive quizzing apps, flashcard apps, textbook apps, simulation apps and many more.
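The core adaptive idea can be sketched in a few lines. Real platforms like Knewton and DreamBox use far more sophisticated (and proprietary) models, so this is purely illustrative:

```python
# A minimal, purely illustrative sketch of adaptive quizzing: step difficulty
# up after a correct answer, down after a wrong one, within fixed bounds.
def next_difficulty(current: int, answered_correctly: bool,
                    lowest: int = 1, highest: int = 10) -> int:
    """Pick the next question's difficulty level from the last answer."""
    step = 1 if answered_correctly else -1
    return max(lowest, min(highest, current + step))

# Simulate one student's session: True = correct answer, False = wrong.
difficulty = 5
for correct in [True, True, False, True]:
    difficulty = next_difficulty(difficulty, correct)
print(difficulty)  # 7
```

Even this toy rule keeps each student near the edge of their ability, which is the essential promise of adaptive learning: the lesson meets the learner where they are.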

Tech 5: Artificial Intelligence or “An AI Teaching Companion.” Neal Stephenson’s book The Diamond Age presents a fascinating piece of educational technology called “A Young Lady’s Illustrated Primer.”

As described by Beat Schwendimann, “The primer is an interactive book that can answer a learner’s questions (spoken in natural language), teach through allegories that incorporate elements of the learner’s environment, and present contextual just-in-time information.

“The primer includes sensors that monitor the learner’s actions and provide feedback. The learner is in a cognitive apprenticeship with the book: The primer models a certain skill (through allegorical fairy tale characters), which the learner then imitates in real life.

“The primer follows a learning progression with increasingly more complex tasks. The educational goals of the primer are humanist: To support the learner to become a strong and independently thinking person.”

The primer, an individualized AI teaching companion, is the result of technological convergence, and it is beautifully described by YouTuber CGP Grey in his video Digital Aristotle: Thoughts on the Future of Education.

Your AI companion will have unlimited access to information in the cloud and will deliver it at the optimal speed to each student in an engaging, fun way. This AI will demonetize and democratize education, be available to everyone for free (just like Google), and offer the best education equally to the wealthiest and poorest children on the planet.

This AI companion is not a tutor who spouts facts, figures and answers, but a player on the side of the student, there to help him or her learn, and in so doing, learn how to learn better. The AI is always alert, watching for signs of frustration and boredom that may precede quitting, for signs of curiosity or interest that tend to indicate active exploration, and for signs of enjoyment and mastery, which might indicate a successful learning experience.

Ultimately, we’re heading towards a vastly more educated world. We are truly living during the most exciting time to be alive.

Mindsets for the 21st Century
Finally, it’s important for me to discuss mindsets. How we think about the future colors how we learn and what we do. I’ve written extensively about the importance of an abundance and exponential mindset for entrepreneurs and CEOs. I also think that attention to mindset in our elementary schools, when a child is shaping the mental “operating system” for the rest of their life, is even more important.

As such, I would recommend that a school adopt a set of principles that teach and promote a number of mindsets in the fabric of their programs.

Many “mindsets” are important to promote. Here are a couple to consider:

Nurturing Optimism & An Abundance Mindset:
We live in a competitive world, and kids experience a significant amount of pressure to perform. When they fall short, they feel deflated. We all fail at times; that’s part of life. If we want to raise “can-do” kids who can work through failure and come out stronger for it, it’s wise to nurture optimism. Optimistic kids are more willing to take healthy risks, are better problem-solvers, and experience positive relationships. You can nurture optimism in your school by starting each day by focusing on gratitude (what each child is grateful for), or a “positive focus” in which each student takes 30 seconds to talk about what they are most excited about, or what recent event was positively impactful to them. (NOTE: I start every meeting inside my Strike Force team with a positive focus.)

Finally, helping students understand (through data and graphs) that the world is in fact getting better (see my first book, Abundance: The Future is Better Than You Think) will help them counter the continuous stream of negative news in our media.

When kids feel confident in their abilities and excited about the world, they are willing to work harder and be more creative.

Tolerance for Failure:
Tolerating failure is a difficult lesson to learn and a difficult lesson to teach. But it is critically important to succeeding in life.

Astro Teller, who runs Google’s innovation branch “X,” talks a lot about encouraging failure. At X, they regularly try to “kill” their ideas. If they are successful in killing an idea, and thus “failing,” they save lots of time, money, and resources. The ideas they can’t kill survive and develop into billion-dollar businesses. The key is that each time an idea is killed, Astro rewards the team, literally, with cash bonuses. Their failure is celebrated, and they become heroes.

This should be reproduced in the classroom: kids should try to be critical of their best ideas (learn critical thinking), then they should be celebrated for ‘successfully failing,’ perhaps with cake, balloons, confetti, and lots of Silly String.

Join Me & Get Involved!
Abundance Digital Online Community: I have created a Digital/Online community of bold, abundance-minded entrepreneurs called Abundance Digital. This is my ‘onramp’ for exponential entrepreneurs – those who want to get involved and play at a higher level. Click here to learn more.

Image Credit: sakkarin sapu / Shutterstock.com


#432880 Google’s Duplex Raises the Question: ...

By now, you’ve probably seen Google’s new Duplex software, which promises to call people on your behalf to book appointments for haircuts and the like. As yet, it only exists in demo form, but already it seems like Google has made a big stride towards capturing a market that plenty of companies have had their eye on for quite some time. This software is impressive, but it raises questions.

Many of you will be familiar with the stilted, robotic conversations you can have with early chatbots that are, essentially, glorified menus. Instead of pressing 1 to confirm or 2 to re-enter, some of these bots would allow for simple commands like “Yes” or “No,” replacing the buttons with limited ability to recognize a few words. Using them was often a far more frustrating experience than attempting to use a menu—there are few things more irritating than a robot saying, “Sorry, your response was not recognized.”

[Embedded audio demos: Google Duplex scheduling a hair salon appointment, and Google Duplex calling a restaurant.]

Even getting the response recognized is hard enough. After all, there are countless different nuances and accents to baffle voice recognition software, and endless turns of phrase that amount to saying the same thing that can confound natural language processing (NLP), especially if you like your phrasing quirky.

You may think that standard customer-service type conversations all travel the same route, using similar words and phrasing. But when there are over 80,000 ways to order coffee, and making a mistake is frowned upon, even simple tasks require high accuracy over a huge dataset.

Advances in audio processing, neural networks, and NLP, as well as raw computing power, have meant that basic recognition of what someone is trying to say is less of an issue. Soundhound’s virtual assistant prides itself on being able to process complicated requests (perhaps needlessly complicated).

The deeper issue, as with all attempts to develop conversational machines, is one of understanding context. There are so many ways a conversation can go that attempting to construct a conversation two or three layers deep quickly runs into problems. Multiply the thousands of things people might say by the thousands they might say next, and the combinatorics of the challenge runs away from most chatbots, leaving them as either glorified menus, gimmicks, or rather bizarre to talk to.
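The combinatorial explosion described above can be made concrete with a toy back-of-the-envelope calculation (the branching factor of 1,000 is illustrative, not a measured figure):

```python
def conversation_paths(branching: int, depth: int) -> int:
    """Distinct conversation paths a scripted bot must anticipate,
    assuming `branching` possible responses at each turn."""
    return branching ** depth

# Even a modest branching factor explodes within a few turns.
for depth in range(1, 5):
    print(f"depth {depth}: {conversation_paths(1000, depth):,} paths")
```

Two or three layers deep, the script is already chasing billions of possibilities, which is why hand-built dialogue trees tend to degenerate into menus.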

Yet Google, which surely remembers from Glass the risk of premature debuts for technology, especially the kind that asks you to rethink how you interact with or trust software, must have faith in Duplex to show it on the world stage. We know that startups like Semantic Machines and x.ai have received serious funding to perform very similar functions, using natural-language conversations to perform computing tasks, schedule meetings, book hotels, or purchase items.

It’s no great leap to imagine Google will soon do the same, bringing us closer to a world of onboard computing, where Lens labels the world around us and their assistant arranges it for us (all the while gathering more and more data it can convert into personalized ads). The early demos showed some clever tricks for keeping the conversation within a fairly narrow realm where the AI should be comfortable and competent, and the blog post that accompanied the release shows just how much effort has gone into the technology.

Yet given the privacy and ethics funk the tech industry finds itself in, and people’s general unease about AI, the main reaction to Duplex’s impressive demo was concern. The voice sounded too natural, bringing to mind Lyrebird and their warnings of deepfakes. You might trust “Do the Right Thing” Google with this technology, but it could usher in an era when automated robo-callers are far more convincing.

A more human-like voice may sound like a perfectly innocuous improvement, but the fact that the assistant interjects naturalistic “umm” and “mm-hm” responses to more perfectly mimic a human rubbed a lot of people the wrong way. This wasn’t just a voice assistant trying to sound less grinding and robotic; it was actively trying to deceive people into thinking they were talking to a human.

Google is running the risk of trying to get to conversational AI by going straight through the uncanny valley.

“Google’s experiments do appear to have been designed to deceive,” said Dr. Thomas King of the Oxford Internet Institute’s Digital Ethics Lab, according to TechCrunch. “Their main hypothesis was ‘can you distinguish this from a real person?’ In this case it’s unclear why their hypothesis was about deception and not the user experience… there should be some kind of mechanism there to let people know what it is they are speaking to.”

From Google’s perspective, being able to say “90 percent of callers can’t tell the difference between this and a human personal assistant” is an excellent marketing ploy, even though statistics about how many interactions are successful might be more relevant.

In fact, Duplex runs contrary to pretty much every major recommendation about ethics for the use of robotics or artificial intelligence, not to mention certain eavesdropping laws. Transparency is key to holding machines (and the people who design them) accountable, especially when it comes to decision-making.

Then there are the more subtle social issues. One prominent effect social media has had is to allow people to silo themselves; in echo chambers of like-minded individuals, it’s hard to see how other opinions exist. Technology exacerbates this by removing the evolutionary cues that go along with face-to-face interaction. Confronted with a pair of human eyes, people are more generous. Confronted with a Twitter avatar or a Facebook interface, people hurl abuse and criticism they’d never dream of using in a public setting.

Now that we can use technology to interact with ever fewer people, will it change us? Is it fair to offload the burden of dealing with a robot onto the poor human at the other end of the line, who might have to deal with dozens of such calls a day? Google has said that if the AI is in trouble, it will put you through to a human, which might help save receptionists from the hell of trying to explain a concept to dozens of dumbfounded AI assistants all day. But there’s always the risk that failures will be blamed on the person and not the machine.

As AI advances, could we end up treating the dwindling number of people in these “customer-facing” roles as the buggiest part of a fully automatic service? Will people start accusing each other of being robots on the phone, as well as on Twitter?

Google has provided plenty of reassurances about how the system will be used. They have said they will ensure that the system is identified, and it’s hardly difficult to resolve this problem; a slight change in the script from their demo would do it. For now, consumers will likely appreciate moves that make it clear whether the “intelligent agents” that make major decisions for us, that we interact with daily, and that hide behind social media avatars or phone numbers are real or artificial.

Image Credit: Besjunior / Shutterstock.com

Posted in Human Robots

#432193 Are ‘You’ Just Inside Your Skin or ...

In November 2017, a gunman entered a church in Sutherland Springs in Texas, where he killed 26 people and wounded 20 others. He escaped in his car, with police and residents in hot pursuit, before losing control of the vehicle and flipping it into a ditch. When the police got to the car, he was dead. The episode is horrifying enough without its unsettling epilogue. In the course of their investigations, the FBI reportedly pressed the gunman’s finger to the fingerprint-recognition feature on his iPhone in an attempt to unlock it. Regardless of who’s affected, it’s disquieting to think of the police using a corpse to break into someone’s digital afterlife.

Most democratic constitutions shield us from unwanted intrusions into our brains and bodies. They also enshrine our entitlement to freedom of thought and mental privacy. That’s why neurochemical drugs that interfere with cognitive functioning can’t be administered against a person’s will unless there’s a clear medical justification. Similarly, according to scholarly opinion, law-enforcement officials can’t compel someone to take a lie-detector test, because that would be an invasion of privacy and a violation of the right to remain silent.

But in the present era of ubiquitous technology, philosophers are beginning to ask whether biological anatomy really captures the entirety of who we are. Given the role they play in our lives, do our devices deserve the same protections as our brains and bodies?

After all, your smartphone is much more than just a phone. It can tell a more intimate story about you than your best friend. No other piece of hardware in history, not even your brain, contains the quality or quantity of information held on your phone: it ‘knows’ whom you speak to, when you speak to them, what you said, where you have been, your purchases, photos, biometric data, even your notes to yourself—and all this dating back years.

In 2014, the United States Supreme Court used this observation to justify the decision that police must obtain a warrant before rummaging through our smartphones. These devices “are now such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy,” as Chief Justice John Roberts observed in his written opinion.

The Chief Justice probably wasn’t making a metaphysical point—but the philosophers Andy Clark and David Chalmers were when they argued in “The Extended Mind” (1998) that technology is actually part of us. According to traditional cognitive science, “thinking” is a process of symbol manipulation or neural computation, which gets carried out by the brain. Clark and Chalmers broadly accept this computational theory of mind, but claim that tools can become seamlessly integrated into how we think. Objects such as smartphones or notepads are often just as functionally essential to our cognition as the synapses firing in our heads. They augment and extend our minds by increasing our cognitive power and freeing up internal resources.

If accepted, the extended mind thesis threatens widespread cultural assumptions about the inviolate nature of thought, which sits at the heart of most legal and social norms. As the US Supreme Court declared in 1942: “freedom to think is absolute of its own nature; the most tyrannical government is powerless to control the inward workings of the mind.” This view has its origins in thinkers such as John Locke and René Descartes, who argued that the human soul is locked in a physical body, but that our thoughts exist in an immaterial world, inaccessible to other people. One’s inner life thus needs protecting only when it is externalized, such as through speech. Many researchers in cognitive science still cling to this Cartesian conception—only, now, the private realm of thought coincides with activity in the brain.

But today’s legal institutions are straining against this narrow concept of the mind. They are trying to come to grips with how technology is changing what it means to be human, and to devise new normative boundaries to cope with this reality. Justice Roberts might not have known about the idea of the extended mind, but it supports his wry observation that smartphones have become part of our body. If our minds now encompass our phones, we are essentially cyborgs: part-biology, part-technology. Given how our smartphones have taken over what were once functions of our brains—remembering dates, phone numbers, addresses—perhaps the data they contain should be treated on a par with the information we hold in our heads. So if the law aims to protect mental privacy, its boundaries would need to be pushed outwards to give our cyborg anatomy the same protections as our brains.

This line of reasoning leads to some potentially radical conclusions. Some philosophers have argued that when we die, our digital devices should be handled as remains: if your smartphone is a part of who you are, then perhaps it should be treated more like your corpse than your couch. Similarly, one might argue that trashing someone’s smartphone should be seen as a form of “extended” assault, equivalent to a blow to the head, rather than just destruction of property. If your memories are erased because someone attacks you with a club, a court would have no trouble characterizing the episode as a violent incident. So if someone breaks your smartphone and wipes its contents, perhaps the perpetrator should be punished as they would be if they had caused a head trauma.

The extended mind thesis also challenges the law’s role in protecting both the content and the means of thought—that is, shielding what and how we think from undue influence. Regulation bars non-consensual interference in our neurochemistry (for example, through drugs), because that meddles with the contents of our mind. But if cognition encompasses devices, then arguably they should be subject to the same prohibitions. Perhaps some of the techniques that advertisers use to hijack our attention online, to nudge our decision-making or manipulate search results, should count as intrusions on our cognitive process. Similarly, in areas where the law protects the means of thought, it might need to guarantee access to tools such as smartphones—in the same way that freedom of expression protects people’s right not only to write or speak, but also to use computers and disseminate speech over the internet.

The courts are still some way from arriving at such decisions. Besides the headline-making cases of mass shooters, there are thousands of instances each year in which police authorities try to get access to encrypted devices. Although the Fifth Amendment to the US Constitution protects individuals’ right to remain silent (and therefore not give up a passcode), judges in several states have ruled that police can forcibly use fingerprints to unlock a user’s phone. (With the new facial-recognition feature on the iPhone X, police might only need to get an unwitting user to look at her phone.) These decisions reflect the traditional concept that the rights and freedoms of an individual end at the skin.

But the concept of personal rights and freedoms that guides our legal institutions is outdated. It is built on a model of a free individual who enjoys an untouchable inner life. Now, though, our thoughts can be invaded before they have even been developed—and in a way, perhaps this is nothing new. The Nobel Prize-winning physicist Richard Feynman used to say that he thought with his notebook. Without pen and paper, a great deal of complex reflection and analysis would never have been possible. If the extended mind view is right, then even simple technologies such as these would merit recognition and protection as a part of the essential toolkit of the mind.

This article was originally published at Aeon and has been republished under Creative Commons.

Image Credit: Sergii Tverdokhlibov / Shutterstock.com

Posted in Human Robots

#432181 Putting AI in Your Pocket: MIT Chip Cuts ...

Neural networks are powerful things, but they need a lot of juice. Engineers at MIT have now developed a new chip that cuts neural nets’ power consumption by up to 95 percent, potentially allowing them to run on battery-powered mobile devices.

Smartphones these days are getting truly smart, with ever more AI-powered services like digital assistants and real-time translation. But typically the neural nets crunching the data for these services are in the cloud, with data from smartphones ferried back and forth.

That’s not ideal, as it requires a lot of communication bandwidth and means potentially sensitive data is being transmitted and stored on servers outside the user’s control. But the huge amounts of energy needed to power the GPUs neural networks run on make it impractical to implement them in devices that run on limited battery power.

Engineers at MIT have now designed a chip that cuts that power consumption by up to 95 percent by dramatically reducing the need to shuttle data back and forth between a chip’s memory and processors.

Neural nets consist of thousands of interconnected artificial neurons arranged in layers. Each neuron receives input from multiple neurons in the layer below it, and if the combined input passes a certain threshold it then transmits an output to multiple neurons above it. The strength of the connection between neurons is governed by a weight, which is set during training.

This means that for every neuron, the chip has to retrieve the input data for a particular connection and the connection weight from memory, multiply them, store the result, and then repeat the process for every input. That requires a lot of data to be moved around, expending a lot of energy.
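The per-neuron computation described above looks roughly like this in code (a plain Python sketch for illustration, not the MIT chip’s actual implementation; on conventional hardware, each loop iteration implies a round trip to memory):

```python
def neuron_output(inputs, weights, threshold):
    """Conventional per-neuron computation: fetch each input and its
    connection weight, multiply, accumulate, then apply the threshold.
    Every iteration of this loop stands in for a fetch from memory on
    a standard chip -- the data movement the new design eliminates."""
    total = 0.0
    for x, w in zip(inputs, weights):  # one memory fetch per connection
        total += x * w
    return 1 if total >= threshold else 0

print(neuron_output([0.5, 1.0, 0.2], [0.4, 0.3, 0.9], 0.5))  # fires: 0.68 >= 0.5
```

The MIT design performs all of these multiply-accumulate steps in parallel inside the memory itself, using analog circuits, so the loop’s worth of data never has to travel to a separate processor.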

The new MIT chip does away with that, instead computing all the inputs in parallel within the memory using analog circuits. That significantly reduces the amount of data that needs to be shoved around and results in major energy savings.

The approach requires the weights of the connections to be binary rather than a range of values, but previous theoretical work had suggested this wouldn’t dramatically impact accuracy, and the researchers found the chip’s results were generally within two to three percent of the conventional non-binary neural net running on a standard computer.
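To see what constraining weights to binary values means in practice, here is a toy sketch (illustrative only; the 2–3 percent accuracy figure comes from the MIT researchers’ measurements, not from anything this example proves):

```python
def binarize(weights):
    """Constrain each real-valued weight to +1 or -1."""
    return [1.0 if w >= 0 else -1.0 for w in weights]

def dot(inputs, weights):
    """Weighted sum of inputs -- the core neural-net operation."""
    return sum(x * w for x, w in zip(inputs, weights))

inputs       = [0.5, -1.2, 0.8]
real_weights = [0.3, -0.7, 0.1]

full_precision = dot(inputs, real_weights)
binary         = dot(inputs, binarize(real_weights))

# The magnitudes differ, but in this example the sign of the
# activation -- what decides whether the neuron fires -- is preserved.
print(full_precision, binary)
```

Binary weights turn each multiplication into a simple sign flip, which is what makes the analog in-memory circuitry feasible; the surprise from the theoretical work is how little accuracy this coarse quantization costs in practice.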

This isn’t the first time researchers have created chips that carry out processing in memory to reduce the power consumption of neural nets, but it’s the first time the approach has been used to run powerful convolutional neural networks popular for image-based AI applications.

“The results show impressive specifications for the energy-efficient implementation of convolution operations with memory arrays,” Dario Gil, vice president of artificial intelligence at IBM, said in a statement.

“It certainly will open the possibility to employ more complex convolutional neural networks for image and video classifications in IoT [the internet of things] in the future.”

It’s not just research groups working on this, though. The desire to get AI smarts into devices like smartphones, household appliances, and all kinds of IoT devices is driving the who’s who of Silicon Valley to pile into low-power AI chips.

Apple has already integrated its Neural Engine into the iPhone X to power things like its facial recognition technology, and Amazon is rumored to be developing its own custom AI chips for the next generation of its Echo digital assistant.

The big chip companies are also increasingly pivoting towards supporting advanced capabilities like machine learning, which has forced them to make their devices ever more energy-efficient. Earlier this year ARM unveiled two new chips: the Arm Machine Learning processor, aimed at general AI tasks from translation to facial recognition, and the Arm Object Detection processor for detecting things like faces in images.

Qualcomm’s latest mobile chip, the Snapdragon 845, features a GPU and is heavily focused on AI. The company has also released the Snapdragon 820E, which is aimed at drones, robots, and industrial devices.

Going a step further, IBM and Intel are developing neuromorphic chips whose architectures are inspired by the human brain and its incredible energy efficiency. That could theoretically allow IBM’s TrueNorth and Intel’s Loihi to run powerful machine learning on a fraction of the power of conventional chips, though they are both still highly experimental at this stage.

Getting these chips to run neural nets as powerful as those found in cloud services without burning through batteries too quickly will be a big challenge. But at the current pace of innovation, it doesn’t look like it will be too long before you’ll be packing some serious AI power in your pocket.

Image Credit: Blue Planet Studio / Shutterstock.com

Posted in Human Robots

#431925 How the Science of Decision-Making Will ...

Neuroscientist Brie Linkenhoker believes that leaders must be better prepared for future strategic challenges by continually broadening their worldviews.
As director of Worldview Stanford, Brie leads a team that produces multimedia content and immersive learning experiences to make academic research and insights accessible and usable by curious leaders, helping them understand the forces shaping the future.
Worldview Stanford has tackled such interdisciplinary topics as the power of minds, the science of decision-making, environmental risk and resilience, and trust and power in the age of big data.
We spoke with Brie about why understanding our biases is critical to making better decisions, particularly in a time of increasing change and complexity.

Lisa Kay Solomon: What is Worldview Stanford?
Brie Linkenhoker: Leaders and decision makers are trying to navigate this complex hairball of a planet that we live on and that requires keeping up on a lot of diverse topics across multiple fields of study and research. Universities like Stanford are where that new knowledge is being created, but it’s not getting out and used as readily as we would like, so that’s what we’re working on.
Worldview is designed to expand our individual and collective worldviews about important topics impacting our future. Your worldview is not a static thing, it’s constantly changing. We believe it should be informed by lots of different perspectives, different cultures, by knowledge from different domains and disciplines. This is more important now than ever.
At Worldview, we create learning experiences that are an amalgamation of all of those things.
LKS: One of your marquee programs is the Science of Decision Making. Can you tell us about that course and why it’s important?
BL: We tend to think about decision makers as being people in leadership positions, but every person who works in your organization, every member of your family, every member of the community is a decision maker. You have to decide what to buy, who to partner with, what government regulations to anticipate.
You have to think not just about your own decisions, but you have to anticipate how other people make decisions too. So, when we set out to create the Science of Decision Making, we wanted to help people improve their own decisions and be better able to predict, understand, anticipate the decisions of others.

“I think in another 10 or 15 years, we’re probably going to have really rich models of how we actually make decisions and what’s going on in the brain to support them.”

We realized that the only way to do that was to combine a lot of different perspectives, so we recruited experts from economics, psychology, neuroscience, philosophy, biology, and religion. We also brought in cutting-edge research on artificial intelligence and virtual reality and explored conversations about how technology is changing how we make decisions today and how it might support our decision-making in the future.
There’s no single set of answers. There are as many unanswered questions as there are answered questions.
LKS: One of the other things you explore in this course is the role of biases and heuristics. Can you explain the importance of both in decision-making?
BL: When I was a strategy consultant, executives would ask me, “How do I get rid of the biases in my decision-making or my organization’s decision-making?” And my response would be, “Good luck with that. It isn’t going to happen.”
As human beings we make, probably, thousands of decisions every single day. If we had to be actively thinking about each one of those decisions, we wouldn’t get out of our house in the morning, right?
We have to be able to do a lot of our decision-making essentially on autopilot to free up cognitive resources for more difficult decisions. So, we’ve evolved in the human brain a set of what we understand to be heuristics or rules of thumb.
And heuristics are great in, say, 95 percent of situations. It’s that five percent, or maybe even one percent, where they’re really not so great. That’s when we have to become aware of them, because in some situations they can become biases.
For example, it doesn’t matter so much that we’re not aware of our rules of thumb when we’re driving to work or deciding what to make for dinner. But they can become absolutely critical in situations where a member of law enforcement is making an arrest or where you’re making a decision about a strategic investment or even when you’re deciding who to hire.
Let’s take hiring for a moment.
How many years is a hire going to impact your organization? You’re potentially looking at 5, 10, 15, 20 years. Having the right person in a role could change the future of your business entirely. That’s one of those areas where you really need to be aware of your own heuristics and biases—and we all have them. There’s no getting rid of them.
LKS: We seem to be at a time when the boundaries between different disciplines are starting to blend together. How has the advancement of neuroscience helped us become better leaders? What do you see happening next?
BL: Heuristics and biases are very topical these days, thanks in part to Michael Lewis’s fantastic book, The Undoing Project, which is the story of the groundbreaking work that Nobel Prize winner Danny Kahneman and Amos Tversky did in the psychology and biases of human decision-making. Their work gave rise to the whole new field of behavioral economics.
In the last 10 to 15 years, neuroeconomics has really taken off. Neuroeconomics is the combination of behavioral economics with neuroscience. In behavioral economics, they use economic games and economic choices that have numbers associated with them and have real-world application.
For example, they ask, “How much would you spend to buy A versus B?” Or, “If I offered you X dollars for this thing that you have, would you take it or would you say no?” So, it’s trying to look at human decision-making in a format that’s easy to understand and quantify within a laboratory setting.
Now you bring neuroscience into that. You can have people doing those same kinds of tasks—making those kinds of semi-real-world decisions—in a brain scanner, and we can now start to understand what’s going on in the brain while people are making decisions. You can ask questions like, “Can I look at the signals in someone’s brain and predict what decision they’re going to make?” That can help us build a model of decision-making.
I think in another 10 or 15 years, we’re probably going to have really rich models of how we actually make decisions and what’s going on in the brain to support them. That’s very exciting for a neuroscientist.
Image Credit: Black Salmon / Shutterstock.com

Posted in Human Robots