
#433400 A Model for the Future of Education, and ...

As kids worldwide head back to school, I’d like to share my thoughts on the future of education.

Bottom line, how we educate our kids needs to radically change given the massive potential of exponential tech (e.g. artificial intelligence and virtual reality).

Without question, the number one driver for education is inspiration. As such, if you have a kid age 8–18, you’ll want to get your hands on an incredibly inspirational novel written by my dear friend Ray Kurzweil called Danielle: Chronicles of a Superheroine.

Danielle offers boys and girls a role model of a young woman who uses smart technologies and super-intelligence to partner with her friends to solve some of the world’s greatest challenges. It’s perfect to inspire anyone to pursue their moonshot.

Without further ado, let’s dive into the future of educating kids, and a summary of my white paper thoughts….

Just last year, edtech (education technology) investments surpassed a record high of 9.5 billion USD—up 30 percent from the year before.

Already valued at over half a billion USD, the AI in education market is set to surpass 6 billion USD by 2024.

And we’re now seeing countless new players enter the classroom, from a Soul Machines AI teacher specializing in energy use and sustainability to smart “lab schools” with personalized curricula.

As my two boys enter 1st grade, and given that most elementary schools haven’t changed in many decades (perhaps a century), I keep asking myself: what do I want my kids to learn? How should I think about elementary school in an exponential era?

This post covers five subjects related to elementary school education:

Five Issues with Today’s Elementary Schools
Five Guiding Principles for Future Education
An Elementary School Curriculum for the Future
Exponential Technologies in our Classroom
Mindsets for the 21st Century

Excuse the length of this post, but if you have kids, the details might be meaningful. If you don’t, then next week’s post will return to normal length and another fun subject.

Also, if you’d like to see my detailed education “white paper,” you can view or download it here.

Let’s dive in…

Five Issues With Today’s Elementary Schools
There are probably lots of issues with today’s traditional elementary schools, but I’ll just choose a few that bother me most.

Grading: In the traditional education system, you start at an “A,” and every time you get something wrong, your score gets lower and lower. At best it’s demotivating, and at worst it has nothing to do with the world you occupy as an adult. In the gaming world (e.g. Angry Birds), it’s just the opposite. You start with zero and every time you come up with something right, your score gets higher and higher.
Sage on the Stage: Most classrooms have a teacher standing at the front of the room, lecturing to a class of students, half of whom are bored and half of whom are lost. The one-teacher-fits-all model comes from an era of scarcity, when great teachers and schools were rare.
Relevance: When I think back to elementary and secondary school, I realize how much of what I learned was never actually useful later in life, and how many of my critical lessons for success I had to pick up on my own (I don’t know about you, but I haven’t ever actually had to factor a polynomial in my adult life).
Imagination, Coloring inside the Lines: Probably of greatest concern to me is the factory-worker, industrial-era origin of today’s schools. Programs are so structured around rote memorization that they squash the originality out of most children. I’m reminded that “the day before something is truly a breakthrough, it’s a crazy idea.” Where do we pursue crazy ideas in our schools? Where do we foster imagination?
Boring: If learning in school is a chore, boring, or emotionless, then the most important driver of human learning, passion, is disengaged. Having our children memorize facts and figures, sit passively in class, and take mundane standardized tests completely defeats the purpose.

An average of 7,200 students drop out of high school each day, totaling 1.3 million each year. This means only 69 percent of students who start high school finish four years later. And over 50 percent of these high school dropouts name boredom as the number one reason they left.

Five Guiding Principles for Future Education
I imagine a relatively near-term future in which robotics and artificial intelligence will allow any of us, from ages 8 to 108, to easily and quickly find answers, create products, or accomplish tasks, all simply by expressing our desires.

From ‘mind to manufactured in moments.’ In short, we’ll be able to do and create almost whatever we want.

In this future, what attributes will be most critical for our children to learn to become successful in their adult lives? What’s most important for educating our children today?

For me it’s about passion, curiosity, imagination, critical thinking, and grit.

Passion: You’d be amazed at how many people don’t have a mission in life… a calling… something to jolt them out of bed every morning. The most valuable resource for humanity is the persistent and passionate human mind, so creating a future of passionate kids is vitally important. For my 7-year-old boys, I want to support them in finding their passion or purpose… something that is uniquely theirs, in the same way that the Apollo program and Star Trek drove my early love for all things space, and that passion drove me to learn and to do.
Curiosity: Curiosity is innate in kids, yet most adults lose it over the course of their lives. Why does that matter? Because in a world of Google, robots, and AI, raising a kid who is constantly asking questions and running “what if” experiments can be extremely valuable. In an age of machine learning, massive data, and a trillion sensors, it will be the quality of your questions that matters most.
Imagination: Entrepreneurs and visionaries imagine the world (and the future) they want to live in, and then they create it. Kids happen to be some of the most imaginative humans around… it’s critical that they know how important and liberating imagination can be.
Critical Thinking: In a world flooded with often-conflicting ideas, baseless claims, misleading headlines, negative news, and misinformation, learning the skill of critical thinking helps find the signal in the noise. This principle is perhaps the most difficult to teach kids.
Grit/Persistence: Grit is defined as “passion and perseverance in pursuit of long-term goals,” and it has recently been widely acknowledged as one of the most important predictors of and contributors to success.

Teaching your kids not to give up, to keep trying, and to keep trying new ideas for something that they are truly passionate about achieving is extremely critical. Much of my personal success has come from such stubbornness. I joke that both XPRIZE and the Zero Gravity Corporation were “overnight successes after 10 years of hard work.”

So given those five basic principles, what would an elementary school curriculum look like? Let’s take a look…

An Elementary School Curriculum for the Future
Over the last 30 years, I’ve had the pleasure of starting two universities, International Space University (1987) and Singularity University (2007). My favorite part of co-founding both institutions was designing and implementing the curriculum. Along those lines, the following is my first shot at the type of curriculum I’d love my own boys to be learning.

I’d love your thoughts; I’ll be looking for them here: https://www.surveymonkey.com/r/DDRWZ8R

For the purpose of illustration, I’ll speak about ‘courses’ or ‘modules,’ but in reality these are just elements that would ultimately be woven together throughout the course of K-6 education.

Module 1: Storytelling/Communications

When I think about the skill that has served me best in life, it’s been my ability to present my ideas in the most compelling fashion possible, to get others on board, and to support an idea’s birth and growth in an innovative direction. In my adult life, as an entrepreneur and a CEO, it’s been my ability to communicate clearly and tell compelling stories that has allowed me to create the future. I don’t think this lesson can start too early in life. So imagine a module, year after year, where our kids learn the art and practice of formulating and pitching their ideas: the best of oration and storytelling. Perhaps children in this class would watch TED presentations, or maybe they’d put together their own TEDx for kids. Ultimately, it’s about practice and getting comfortable with putting yourself and your ideas out there, and overcoming any fear of public speaking.

Module 2: Passions

A modern school should help our children find and explore their passion(s). Passion is the greatest gift of self-discovery. It is a source of interest and excitement, and is unique to each child.

The key to finding passion is exposure: allowing kids to experience as many adventures, careers, and passionate adults as possible. Historically, this was limited by the realities of geography and cost, and was typically implemented by having local moms and dads present in class about their careers. “Hi, I’m Alan, Billy’s dad, and I’m an accountant. Accountants are people who…”

But in a world of YouTube and virtual reality, the ability for our children to explore 500 different possible careers or passions during their K-6 education becomes not only possible but compelling. I imagine a module where children share their newest passion each month, sharing videos (or VR experiences) and explaining what they love and what they’ve learned.

Module 3: Curiosity & Experimentation

Einstein famously said, “I have no special talent. I am only passionately curious.” Curiosity is innate in children, and too often lost later in life. Arguably, curiosity is responsible for all major scientific and technological advances; it is the desire of an individual to know the truth.

Coupled with curiosity is the process of experimentation and discovery: asking questions, forming and testing hypotheses, and experimenting repeatedly until the truth is found. As I’ve studied the most successful entrepreneurs and entrepreneurial companies, from Google and Amazon to Uber, their success is significantly due to their relentless use of experimentation to define their products and services.

Here I imagine a module which instills in children the importance of curiosity and gives them permission to say, “I don’t know, let’s find out.”

Further, I imagine a monthly module that teaches children how to design and execute valid and meaningful experiments. Imagine children who learn the skill of asking a question, proposing a hypothesis, designing an experiment, gathering the data, and then reaching a conclusion.
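To make that process concrete for the grown-ups reading along, here is a toy sketch in Python of one such experiment; the paper-airplane question, the numbers, and the simple permutation test are all my own illustration, not part of any school’s curriculum.

```python
import numpy as np

# A toy walk-through of the steps above: question, hypothesis, experiment, data, conclusion.
# Question: does paper airplane design B fly farther than design A?
# Hypothesis: B flies farther on average. (All numbers here are invented for illustration.)

rng = np.random.default_rng(7)
design_a = rng.normal(6.0, 1.0, 20)   # 20 measured flight distances for design A (meters)
design_b = rng.normal(6.8, 1.0, 20)   # 20 measured flight distances for design B (meters)

observed_gap = design_b.mean() - design_a.mean()

# Simple permutation test: if the design made no difference, shuffling the labels
# should produce gaps as large as the observed one fairly often.
combined = np.concatenate([design_a, design_b])
bigger = 0
trials = 10_000
for _ in range(trials):
    rng.shuffle(combined)
    if combined[20:].mean() - combined[:20].mean() >= observed_gap:
        bigger += 1
p_value = bigger / trials

print(f"Observed gap: {observed_gap:.2f} m, p-value: {p_value:.3f}")
print("Conclusion:", "design B likely flies farther" if p_value < 0.05 else "not enough evidence")
```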

Module 4: Persistence/Grit

Doing anything big, bold, and significant in life is hard work. You can’t just give up when the going gets rough. The mindset of persistence, of grit, is a learned behavior I believe can be taught at an early age, especially when it’s tied to pursuing a child’s passion.

I imagine a curriculum that, each week, studies the career of a great entrepreneur and highlights their story of persistence. It would highlight the individuals and companies that stuck with it, iterated, and ultimately succeeded.

Further, I imagine a module that combines persistence and experimentation in gameplay, such as that found in Dean Kamen’s FIRST LEGO league, where 4th graders (and up) research a real-world problem such as food safety, recycling, energy, and so on, and are challenged to develop a solution. They also must design, build, and program a robot using LEGO MINDSTORMS®, then compete on a tabletop playing field.

Module 5: Technology Exposure

In a world of rapidly accelerating technology, understanding how technologies work, what they do, and their potential for benefiting society is, in my humble opinion, critical to a child’s future. Technology and coding (more on this below) are the new “lingua franca” of tomorrow.

In this module, I imagine teaching kids, in an age-appropriate way, through play and demonstration: giving them an overview of exponential technologies such as computation, sensors, networks, artificial intelligence, digital manufacturing, genetic engineering, augmented/virtual reality, and robotics, to name a few. This module is not about making a child an expert in any technology; it’s about giving them the language of these new tools and a conceptual overview of how they might use such technologies in the future. The goal here is to get them excited, give them demonstrations that make the concepts stick, and then let their imaginations run.

Module 6: Empathy

Empathy, defined as “the ability to understand and share the feelings of another,” has been recognized as one of the most critical skills for our children today. And while much has been written about empathy, and great practices exist for instilling it at home and in school, today’s new tools can accelerate the process.

Virtual reality isn’t just about video games anymore. Artists, activists, and journalists now see the technology’s potential to be an empathy engine, one that can shine spotlights on everything from the Ebola epidemic to what it’s like to live in Gaza. And Jeremy Bailenson has been at the vanguard of investigating VR’s power for good.

For more than a decade, Bailenson’s lab at Stanford has been studying how VR can make us better people. Through the power of VR, volunteers at the lab have felt what it is like to be Superman (to see if it makes them more helpful), a cow (to reduce meat consumption), and even a coral (to learn about ocean acidification).

Silly as they might seem, these sorts of VR scenarios could be more effective than the traditional public service ad at changing behavior. Afterwards, participants waste less paper, save more money for retirement, and are nicer to the people around them. And this could have consequences for how we teach and train everyone from cliquey teenagers to high court judges.

Module 7: Ethics/Moral Dilemmas

Related to empathy, and equally important, is the goal of infusing kids with a moral compass. Over a year ago, I toured a special school created by Elon Musk (the Ad Astra school) for his five boys (ages 9 to 14). One constant element in that small school of under 40 kids is the conversation about ethics and morals, carried out by debating real-world scenarios that our kids may one day face.

Here’s an example of the sort of gameplay/roleplay I heard about at Ad Astra that might be implemented in a module on morals and ethics. Imagine a small town on a lake, in which the majority of the town is employed by a single factory. But that factory has been polluting the lake and killing all the life in it. What do you do? Shutting down the factory means everyone loses their jobs; keeping it open means the lake is destroyed and everything living in it dies. This kind of regular, routine conversation and gameplay lets the children see the world in a critically important fashion.

Module 8: The 3R Basics (Reading, wRiting & aRithmetic)

There’s no question that young children entering kindergarten need the basics of reading, writing, and math. The only question is: what’s the best way for them to get it? We all grew up in the classic mode of a teacher at the chalkboard, books, and homework at night. But I would argue that such teaching approaches are long outdated, now replaced with apps, gameplay, and the concept of the flipped classroom.

Pioneered by high school teachers Jonathan Bergmann and Aaron Sams in 2007, the flipped classroom reverses the sequence of events from that of the traditional classroom.

Students view lecture materials, usually in the form of video lectures, as homework prior to coming to class. In-class time is reserved for activities such as interactive discussions or collaborative work, all performed under the guidance of the teacher.

The benefits are clear:

Students can consume lectures at their own pace, viewing the video again and again until they get the concept, or fast-forwarding if the information is obvious.
The teacher is present while students apply new knowledge. Moving the homework into class time gives teachers insight into which concepts, if any, their students are struggling with, and helps them adjust the class accordingly.
The flipped classroom produces tangible results: 71 percent of teachers who flipped their classes noticed improved grades, and 80 percent reported improved student attitudes as a result.

Module 9: Creative Expression & Improvisation

Every single one of us is creative. It’s human nature to be creative… the thing is that we each might have different ways of expressing our creativity.

We must encourage kids to discover and to develop their creative outlets early. In this module, imagine showing kids the many different ways creativity is expressed, from art to engineering to music to math, and then guiding them as they choose the area (or areas) they are most interested in. Critically, teachers (or parents) can then develop unique lessons for each child based on their interests, thanks to open education resources like YouTube and the Khan Academy. If my child is interested in painting and robots, a teacher or AI could scour the web and put together a custom lesson set from videos/articles where the best painters and roboticists in the world share their skills.

Adapting to change is critical for success, especially in our constantly changing world today. Improvisation is a skill that can be learned, and we need to be teaching it early.

In most collegiate “improv” classes, the core of great improvisation is the “Yes, and…” mindset. When acting out a scene, one actor might introduce a new character or idea, completely changing the context of the scene. It’s critical that the other actors in the scene say “Yes, and…”: accept the new reality, then add something new of their own.

Imagine playing similar role-play games in elementary schools, where a teacher gives the students a scene/context and constantly changes variables, forcing them to adapt and play.

Module 10: Coding

Computer science opens more doors for students than any other discipline in today’s world. Learning even the basics will help students in virtually any career, from architecture to zoology.

Coding is an important tool for computer science, in the way that arithmetic is a tool for doing mathematics and words are a tool for English. Coding creates software, but computer science is a broad field encompassing deep concepts that go well beyond coding.

Every 21st century student should also have a chance to learn about algorithms, how to make an app, or how the internet works. Computational thinking allows preschoolers to grasp concepts like algorithms, recursion and heuristics. Even if they don’t understand the terms, they’ll learn the basic concepts.
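To give a flavor of what those words mean in practice, here is a minimal Python sketch of my own (not drawn from any particular curriculum): one function demonstrates recursion, the other a simple algorithm—binary-search guessing—that a child might act out as a number-guessing game.

```python
def count_down(n):
    """Recursion: the function calls itself with a smaller number until it reaches zero."""
    if n == 0:
        print("Blast off!")
        return
    print(n)
    count_down(n - 1)

def guess_number(low, high, secret):
    """A simple algorithm: guess the middle of the range, then narrow the range (binary search)."""
    guesses = 0
    while low <= high:
        guesses += 1
        guess = (low + high) // 2
        if guess == secret:
            return guesses
        if guess < secret:
            low = guess + 1
        else:
            high = guess - 1
    return guesses

count_down(5)
print("Found the secret number 42 in", guess_number(1, 100, 42), "guesses")
```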

There are more than 500,000 open jobs in computing right now, representing the number one source of new wages in the US, and these jobs are projected to grow at twice the rate of all other jobs.

Coding is fun! Beyond the practical reasons for learning how to code, there’s the fact that creating a game or animation can be really fun for kids.

Module 11: Entrepreneurship & Sales

At its core, entrepreneurship is about identifying a problem (an opportunity), developing a vision on how to solve it, and working with a team to turn that vision into reality. I mentioned Elon’s school, Ad Astra: here, again, entrepreneurship is a core discipline where students create and actually sell products and services to each other and the school community.

You could recreate this basic exercise with a group of kids in lots of fun ways to teach them the basic lessons of entrepreneurship.

Related to entrepreneurship is sales. In my opinion, we need to be teaching sales to every child at an early age. Being able to “sell” an idea (again related to storytelling) has been a critical skill in my career, and it is a competency that many people simply never learned.

The lemonade stand has been a classic, though somewhat meager, lesson in sales from past generations, where a child sits on a street corner and tries to sell homemade lemonade for $0.50 to people passing by. I’d suggest we step the game up and take a more active approach in gamifying sales, and maybe having the classroom create a Kickstarter, Indiegogo or GoFundMe campaign. The experience of creating a product or service and successfully selling it will create an indelible memory and give students the tools to change the world.

Module 12: Language

A little over a year ago, I spent a week in China meeting with parents whose focus on kids’ education is extraordinary. One of the areas I found fascinating is how some of the most advanced parents are teaching their kids new languages: through games. On the tablet, the kids are allowed to play games, but only in French. A child’s desire to win fully engages them and drives their learning rapidly.

Beyond games, there’s virtual reality. We know that full immersion is what it takes to become fluent (at least later in life). A semester abroad in France or Italy, and you’ve got a great handle on the language and the culture. But what about for an eight-year-old?

Imagine a module where for an hour each day, the children spend their time walking around Italy in a VR world, hanging out with AI-driven game characters who teach them, engage them, and share the culture and the language in the most personalized and compelling fashion possible.

Exponential Technologies for Our Classrooms
If you’ve attended Abundance 360 or Singularity University, or followed my blogs, you’ll probably agree with me that the way our children will learn is going to fundamentally transform over the next decade.

Here’s an overview of the top five technologies that will reshape the future of education:

Tech 1: Virtual Reality (VR) can make learning truly immersive. Research has shown that we remember 20 percent of what we hear, 30 percent of what we see, and up to 90 percent of what we do or simulate. Virtual reality delivers exactly that kind of learning-by-doing. VR enables students to simulate flying through the bloodstream while learning about the different cells they encounter, or to travel to Mars and inspect the surface for life.

To make this a reality, Google Cardboard just launched its Pioneer Expeditions product. Under this program, thousands of schools around the world have received a kit containing everything a teacher needs to take his or her class on a virtual trip. While data on VR use in K-12 schools and colleges has yet to be gathered, the steady growth of the market is reflected in the surge of companies (including zSpace, Alchemy VR, and Immersive VR Education) dedicated solely to providing schools with packaged educational curricula and content.

Add to VR a related technology called augmented reality (AR), and experiential education really comes alive. Imagine wearing an AR headset that is able to superimpose educational lessons on top of real-world experiences. Interested in botany? As you walk through a garden, the AR headset superimposes the name and details of every plant you see.

Tech 2: 3D Printing is allowing students to bring their ideas to life. Never mind the computer on every desktop (or a tablet for every student); that’s a given. In the near future, teachers and students will want or have a 3D printer on the desk to help them learn core science, technology, engineering, and mathematics (STEM) principles. Bre Pettis of MakerBot Industries, in a grand but practical vision, sees a 3D printer on every school desk in America. “Imagine if you had a 3D printer instead of a LEGO set when you were a kid; what would life be like now?” asks Mr. Pettis. You could print your own mini-figures and your own blocks, and you could iterate on new designs as quickly as your imagination would allow. MakerBots are now in over 5,000 K-12 schools across the US.

Taking this one step further, you could imagine having a 3D file for most entries in Wikipedia, allowing you to print out and study an object you can only read about or visualize in VR.

Tech 3: Sensors & Networks. An explosion of sensors and networks is going to connect everyone at gigabit speeds, making access to rich video available at all times. At the same time, sensors continue to shrink in size and power consumption, becoming embedded in everything. One benefit will be connecting sensor data with machine learning and AI (below), so that a child’s drifting attention or confusion can be easily measured and communicated; the lesson could then be re-presented through an alternate modality or at a different speed.

Tech 4: Machine Learning is making learning adaptive and personalized. No two students are identical—they have different modes of learning (by reading, seeing, hearing, doing), come from different educational backgrounds, and have different intellectual capabilities and attention spans. Advances in machine learning and the surging adaptive learning movement are seeking to solve this problem. Companies like Knewton and Dreambox have over 15 million students on their respective adaptive learning platforms. Soon, every education application will be adaptive, learning how to personalize the lesson for a specific student. There will be adaptive quizzing apps, flashcard apps, textbook apps, simulation apps and many more.
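As a rough illustration of the underlying idea (and only that—the platforms named above are proprietary and far more sophisticated), here is a toy adaptive-quiz loop that picks the next question based on a running estimate of the learner’s ability:

```python
import random

# Toy sketch of adaptive item selection (an illustrative simplification, not how
# Knewton or Dreambox actually work): pick the next question whose difficulty is
# closest to the learner's current estimated ability, then update that estimate.

questions = [
    {"text": "2 + 3", "answer": 5, "difficulty": 0.2},
    {"text": "7 x 8", "answer": 56, "difficulty": 0.5},
    {"text": "12 x 12 - 19", "answer": 125, "difficulty": 0.8},
]

ability = 0.5   # neutral starting estimate of the learner's skill (0 to 1)
step = 0.1      # how quickly the estimate moves after each answer

asked = set()
while len(asked) < len(questions):
    # Choose the unanswered question closest to the current ability estimate.
    remaining = [q for q in questions if q["text"] not in asked]
    q = min(remaining, key=lambda item: abs(item["difficulty"] - ability))
    asked.add(q["text"])

    correct = random.random() < 0.7   # simulate the learner's response
    # Nudge the ability estimate up on a correct answer, down on a miss,
    # weighted by how hard the question was.
    ability += step * (1 if correct else -1) * (0.5 + q["difficulty"])
    ability = max(0.0, min(1.0, ability))
    print(f"{q['text']:>12}  correct={correct}  ability~{ability:.2f}")
```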

Tech 5: Artificial Intelligence, or “An AI Teaching Companion.” Neal Stephenson’s book The Diamond Age presents a fascinating piece of educational technology called “A Young Lady’s Illustrated Primer.”

As described by Beat Schwendimann, “The primer is an interactive book that can answer a learner’s questions (spoken in natural language), teach through allegories that incorporate elements of the learner’s environment, and presents contextual just-in-time information.

“The primer includes sensors that monitor the learner’s actions and provide feedback. The learner is in a cognitive apprenticeship with the book: The primer models a certain skill (through allegorical fairy tale characters), which the learner then imitates in real life.

“The primer follows a learning progression with increasingly more complex tasks. The educational goals of the primer are humanist: To support the learner to become a strong and independently thinking person.”

The primer, an individualized AI teaching companion, is the result of technological convergence, and the idea is beautifully described by YouTuber CGP Grey in his video Digital Aristotle: Thoughts on the Future of Education.

Your AI companion will have unlimited access to information in the cloud and will deliver it at the optimal speed to each student in an engaging, fun way. This AI will demonetize and democratize education, be available to everyone for free (just like Google), and offer the best education to the wealthiest and poorest children on the planet alike.

This AI companion is not a tutor who spouts facts, figures and answers, but a player on the side of the student, there to help him or her learn, and in so doing, learn how to learn better. The AI is always alert, watching for signs of frustration and boredom that may precede quitting, for signs of curiosity or interest that tend to indicate active exploration, and for signs of enjoyment and mastery, which might indicate a successful learning experience.

Ultimately, we’re heading towards a vastly more educated world. We are truly living during the most exciting time to be alive.

Mindsets for the 21st Century
Finally, it’s important for me to discuss mindsets. How we think about the future colors how we learn and what we do. I’ve written extensively about the importance of an abundance and exponential mindset for entrepreneurs and CEOs. I also think that attention to mindset in our elementary schools, when a child is shaping the mental “operating system” for the rest of their life, is even more important.

As such, I would recommend that a school adopt a set of principles that teach and promote a number of mindsets in the fabric of their programs.

Many “mindsets” are important to promote. Here are a couple to consider:

Nurturing Optimism & An Abundance Mindset:
We live in a competitive world, and kids experience a significant amount of pressure to perform. When they fall short, they feel deflated. We all fail at times; that’s part of life. If we want to raise “can-do” kids who can work through failure and come out stronger for it, it’s wise to nurture optimism. Optimistic kids are more willing to take healthy risks, are better problem-solvers, and experience positive relationships. You can nurture optimism in your school by starting each day with gratitude (what each child is grateful for) or with a “positive focus,” in which each student takes 30 seconds to talk about what they are most excited about or what recent event positively impacted them. (NOTE: I start every meeting inside my Strike Force team with a positive focus.)

Finally, helping students understand (through data and graphs) that the world is in fact getting better (see my first book, Abundance: The Future is Better Than You Think) will help them counter the continuous stream of negative news in our media.

When kids feel confident in their abilities and excited about the world, they are willing to work harder and be more creative.

Tolerance for Failure:
Tolerating failure is a difficult lesson to learn and a difficult lesson to teach. But it is critically important to succeeding in life.

Astro Teller, who runs Google’s innovation branch “X,” talks a lot about encouraging failure. At X, they regularly try to “kill” their ideas. If they are successful in killing an idea, and thus “failing,” they save lots of time, money, and resources. The ideas they can’t kill survive and develop into billion-dollar businesses. The key is that each time an idea is killed, Astro rewards the team, literally, with cash bonuses. Their failure is celebrated, and they become heroes.

This should be reproduced in the classroom: kids should try to be critical of their best ideas (learn critical thinking), then they should be celebrated for ‘successfully failing,’ perhaps with cake, balloons, confetti, and lots of Silly String.

Join Me & Get Involved!
Abundance Digital Online Community: I have created a Digital/Online community of bold, abundance-minded entrepreneurs called Abundance Digital. This is my ‘onramp’ for exponential entrepreneurs – those who want to get involved and play at a higher level. Click here to learn more.

Image Credit: sakkarin sapu / Shutterstock.com


#432891 This Week’s Awesome Stories From ...

TRANSPORTATION
Elon Musk Presents His Tunnel Vision to the People of LA
Jack Stewart and Aarian Marshall | Wired
“Now, Musk wants to build this new, 2.1-mile tunnel, near LA’s Sepulveda pass. It’s all part of his broader vision of a sprawling network that could take riders from Sherman Oaks in the north to Long Beach Airport in the south, Santa Monica in the west to Dodger Stadium in the east—without all that troublesome traffic.”

ROBOTICS
Feel What This Robot Feels Through Tactile Expressions
Evan Ackerman | IEEE Spectrum
“Guy Hoffman’s Human-Robot Collaboration & Companionship (HRC2) Lab at Cornell University is working on a new robot that’s designed to investigate this concept of textural communication, which really hasn’t been explored in robotics all that much. The robot uses a pneumatically powered elastomer skin that can be dynamically textured with either goosebumps or spikes, which should help it communicate more effectively, especially if what it’s trying to communicate is, ‘Don’t touch me!’”

VIRTUAL REALITY
In Virtual Reality, How Much Body Do You Need?
Steph Yin | The New York Times
“In a paper published Tuesday in Scientific Reports, they showed that animating virtual hands and feet alone is enough to make people feel their sense of body drift toward an invisible avatar. Their work fits into a corpus of research on illusory body ownership, which has challenged understandings of perception and contributed to therapies like treating pain for amputees who experience phantom limb.”

MEDICINE
How Graphene and Gold Could Help Us Test Drugs and Monitor Cancer
Angela Chen | The Verge
“In today’s study, scientists learned to precisely control the amount of electricity graphene generates by changing how much light they shine on the material. When they grew heart cells on the graphene, they could manipulate the cells too, says study co-author Alex Savtchenko, a physicist at the University of California, San Diego. They could make it beat 1.5 times faster, three times faster, 10 times faster, or whatever they needed.”

DISASTER RELIEF
Robotic Noses Could Be the Future of Disaster Rescue—If They Can Outsniff Search Dogs
Eleanor Cummins | Popular Science
“While canine units are a tried and fairly true method for identifying people trapped in the wreckage of a disaster, analytical chemists have for years been working in the lab to create a robotic alternative. A synthetic sniffer, they argue, could potentially prove to be just as or even more reliable than a dog, more resilient in the face of external pressures like heat and humidity, and infinitely more portable.”

Image Credit: Sergey Nivens / Shutterstock.com


#432236 Why Hasn’t AI Mastered Language ...

In the myth about the Tower of Babel, people conspired to build a city and tower that would reach heaven. Their creator observed, “And now nothing will be restrained from them, which they have imagined to do.” According to the myth, God thwarted this effort by creating diverse languages so that they could no longer collaborate.

In our modern times, we’re experiencing a state of unprecedented connectivity thanks to technology. However, we’re still living under the shadow of the Tower of Babel. Language remains a barrier in business and marketing. Even though technological devices can quickly and easily connect, humans from different parts of the world often can’t.

Translation agencies step in, making presentations, contracts, outsourcing instructions, and advertisements comprehensible to all intended recipients. Some agencies also offer “localization” expertise. For instance, if a company is marketing in Quebec, the advertisements need to be in Québécois French, not European French. Risk-averse companies may be reluctant to invest in these translations. Consequently, these ventures haven’t achieved full market penetration.

Global markets are waiting, but AI-powered language translation isn’t ready yet, despite recent advancements in natural language processing and sentiment analysis. AI still has difficulty processing requests in one language, even before the added complications of translation. In November 2016, Google added a neural network to its translation tool. However, some of its translations are still socially and grammatically odd. I spoke to technologists and a language professor to find out why.

“To Google’s credit, they made a pretty massive improvement that appeared almost overnight. You know, I don’t use it as much. I will say this. Language is hard,” said Michael Housman, chief data science officer at RapportBoost.AI and faculty member of Singularity University.

He explained that the ideal scenario for machine learning and artificial intelligence is something with fixed rules and a clear-cut measure of success or failure. He named chess as an obvious example, and noted machines were able to beat the best human Go player. This happened faster than anyone anticipated because of the game’s very clear rules and limited set of moves.

Housman elaborated, “Language is almost the opposite of that. There aren’t as clearly-cut and defined rules. The conversation can go in an infinite number of different directions. And then of course, you need labeled data. You need to tell the machine to do it right or wrong.”

Housman noted that it’s inherently difficult to assign these informative labels. “Two translators won’t even agree on whether it was translated properly or not,” he said. “Language is kind of the wild west, in terms of data.”

Google’s technology is now able to consider the entirety of a sentence, as opposed to merely translating individual words. Still, the glitches linger. I asked Dr. Jorge Majfud, Associate Professor of Spanish, Latin American Literature, and International Studies at Jacksonville University, to explain why consistently accurate language translation has thus far eluded AI.

He replied, “The problem is that considering the ‘entire’ sentence is still not enough. The same way the meaning of a word depends on the rest of the sentence (more in English than in Spanish), the meaning of a sentence depends on the rest of the paragraph and the rest of the text, as the meaning of a text depends on a larger context called culture, speaker intentions, etc.”

He noted that sarcasm and irony only make sense within this widened context. Similarly, idioms can be problematic for automated translations.

“Google translation is a good tool if you use it as a tool, that is, not to substitute human learning or understanding,” he said, before offering examples of mistranslations that could occur.

“Months ago, I went to buy a drill at Home Depot and I read a sign under a machine: ‘Saw machine.’ Right below it, the Spanish translation: ‘La máquina vió,’ which means, ‘The machine did see it.’ Saw, not as a noun but as a verb in the preterit form,” he explained.
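A toy sketch shows why a word-for-word approach produces exactly this kind of error; the micro-dictionary below is invented for illustration and has nothing to do with Google’s actual system.

```python
# A hypothetical one-sense-per-word dictionary: it only knows "saw" as the past tense
# of "to see" (vio), not as the noun for the cutting tool (sierra).
dictionary = {
    "saw": "vio",
    "machine": "máquina",
    "the": "la",
}

def word_for_word(sentence):
    """Translate each word in isolation, ignoring grammar and context."""
    return " ".join(dictionary.get(word.lower(), word) for word in sentence.split())

print(word_for_word("Saw machine"))   # -> "vio máquina": nonsense, like the sign's "La máquina vió"
```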

Dr. Majfud warned, “We should be aware of the fragility of their ‘interpretation.’ Because to translate is basically to interpret, not just an idea but a feeling. Human feelings and ideas that only humans can understand—and sometimes not even we, humans, understand other humans.”

He noted that cultures, gender, and even age can pose barriers to this understanding and also contended that an over-reliance on technology is leading to our cultural and political decline. Dr. Majfud mentioned that Argentinean writer Julio Cortázar used to refer to dictionaries as “cemeteries.” He suggested that automatic translators could be called “zombies.”

Erik Cambria is an academic AI researcher and assistant professor at Nanyang Technological University in Singapore. He mostly focuses on natural language processing, which is at the core of AI-powered language translation. Like Dr. Majfud, he sees the complexity and associated risks. “There are so many things that we unconsciously do when we read a piece of text,” he told me. Reading comprehension requires multiple interrelated tasks, which haven’t been accounted for in past attempts to automate translation.

Cambria continued, “The biggest issue with machine translation today is that we tend to go from the syntactic form of a sentence in the input language to the syntactic form of that sentence in the target language. That’s not what we humans do. We first decode the meaning of the sentence in the input language and then we encode that meaning into the target language.”

Additionally, there are cultural risks involved with these translations. Dr. Ramesh Srinivasan, Director of UCLA’s Digital Cultures Lab, said that new technological tools sometimes reflect underlying biases.

“There tend to be two parameters that shape how we design ‘intelligent systems.’ One is the values and you might say biases of those that create the systems. And the second is the world if you will that they learn from,” he told me. “If you build AI systems that reflect the biases of their creators and of the world more largely, you get some, occasionally, spectacular failures.”

Dr. Srinivasan said translation tools should be transparent about their capabilities and limitations. He said, “You know, the idea that a single system can take languages that I believe are very diverse semantically and syntactically from one another and claim to unite them or universalize them, or essentially make them sort of a singular entity, it’s a misnomer, right?”

Mary Cochran, co-founder of Launching Labs Marketing, sees the commercial upside. She mentioned that listings in online marketplaces such as Amazon could potentially be auto-translated and optimized for buyers in other countries.

She said, “I believe that we’re just at the tip of the iceberg, so to speak, with what AI can do with marketing. And with better translation, and more globalization around the world, AI can’t help but lead to exploding markets.”

Image Credit: igor kisselev / Shutterstock.com


#432190 In the Future, There Will Be No Limit to ...

New planets found in distant corners of the galaxy. Climate models that may improve our understanding of sea level rise. The emergence of new antimalarial drugs. These scientific advances and discoveries have been in the news in recent months.

While representing wildly divergent disciplines, from astronomy to biotechnology, they all have one thing in common: Artificial intelligence played a key role in their scientific discovery.

One of the more recent and famous examples came out of NASA at the end of 2017. The US space agency had announced an eighth planet discovered in the Kepler-90 system. Scientists had trained a neural network—a computer with a “brain” modeled on the human mind—to re-examine data from Kepler, a space-borne telescope with a four-year mission to seek out new life and new civilizations. Or, more precisely, to find habitable planets where life might just exist.

The researchers trained the artificial neural network on a set of 15,000 previously vetted signals until it could identify true planets and false positives 96 percent of the time. It then went to work on weaker signals from nearly 700 star systems with known planets.

The machine detected Kepler 90i—a hot, rocky planet that orbits its sun about every two Earth weeks—through a nearly imperceptible change in brightness captured when a planet passes a star. It also found a sixth Earth-sized planet in the Kepler-80 system.
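For readers curious how a transit shows up in data, here is a small illustrative sketch (not NASA’s pipeline): it fabricates a light curve with periodic dips and flags the points that fall well below the baseline brightness.

```python
import numpy as np

# Illustrative sketch only: a transiting planet shows up as a small, periodic dip in a
# star's measured brightness. Here we fabricate a light curve with such dips and flag
# the points that fall well below the baseline.

rng = np.random.default_rng(0)
time = np.arange(0, 200, 0.1)                     # observation times in days
flux = 1.0 + rng.normal(0, 0.0005, time.size)     # normalized brightness plus noise

period, duration, depth = 14.0, 0.3, 0.002        # a planet transiting every ~2 weeks
in_transit = (time % period) < duration
flux[in_transit] -= depth                         # inject the nearly imperceptible dips

threshold = np.median(flux) - 3 * np.std(flux)    # anything this far below baseline is suspect
flagged = time[flux < threshold]
print(f"Flagged {flagged.size} in-transit samples; first dip near day {flagged[0]:.1f}")
```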

AI Handles Big Data
The application of AI to science is being driven by three great advances in technology, according to Ross King from the Manchester Institute of Biotechnology at the University of Manchester, leader of a team that developed an artificially intelligent “scientist” called Eve.

Those three advances include much faster computers, big datasets, and improved AI methods, King said. “These advances increasingly give AI superhuman reasoning abilities,” he told Singularity Hub by email.

AI systems can remember vast numbers of facts without error and extract information effortlessly from millions of scientific papers, not to mention exhibit flawless logical reasoning and near-optimal probabilistic reasoning, King says.

AI systems also beat humans when it comes to dealing with huge, diverse amounts of data.

That’s partly what attracted a team of glaciologists to turn to machine learning to untangle the factors involved in how heat from Earth’s interior might influence the ice sheet that blankets Greenland.

Algorithms juggled 22 geologic variables—such as bedrock topography, crustal thickness, magnetic anomalies, rock types, and proximity to features like trenches, ridges, young rifts, and volcanoes—to predict geothermal heat flux under the ice sheet throughout Greenland.

The machine learning model, for example, predicts elevated heat flux upstream of Jakobshavn Glacier, the fastest-moving glacier in the world.

“The major advantage is that we can incorporate so many different types of data,” explains Leigh Stearns, associate professor of geology at Kansas University, whose research takes her to the polar regions to understand how and why Earth’s great ice sheets are changing, questions directly related to future sea level rise.

“All of the other models just rely on one parameter to determine heat flux, but the [machine learning] approach incorporates all of them,” Stearns told Singularity Hub in an email. “Interestingly, we found that there is not just one parameter…that determines the heat flux, but a combination of many factors.”

The research was published last month in Geophysical Research Letters.
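As a sketch of what “incorporating all of them” can look like in code, here is a toy example on synthetic data; the variables and relationships are invented for illustration, and the real study used 22 geologic inputs and a carefully validated model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic, illustrative stand-ins for a few of the geologic variables the study
# combined. The point is only that a machine learning model can weigh many inputs
# at once instead of relying on a single parameter.

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.normal(0, 1, n),   # bedrock topography (scaled)
    rng.normal(0, 1, n),   # crustal thickness
    rng.normal(0, 1, n),   # magnetic anomaly
    rng.normal(0, 1, n),   # distance to nearest young rift
])
# Pretend heat flux depends on a nonlinear mix of several factors, plus noise.
y = 50 + 8 * X[:, 1] - 5 * X[:, 3] + 6 * X[:, 0] * X[:, 2] + rng.normal(0, 2, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:1500], y[:1500])
print("R^2 on held-out points:", round(model.score(X[1500:], y[1500:]), 2))
print("Relative importance of each input:", model.feature_importances_.round(2))
```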

Stearns says her team hopes to apply high-powered machine learning to characterize glacier behavior over both short and long-term timescales, thanks to the large amounts of data that she and others have collected over the last 20 years.

Emergence of Robot Scientists
While Stearns sees machine learning as another tool to augment her research, King believes artificial intelligence can play a much bigger role in scientific discoveries in the future.

“I am interested in developing AI systems that autonomously do science—robot scientists,” he said. Such systems, King explained, would automatically originate hypotheses to explain observations, devise experiments to test those hypotheses, physically run the experiments using laboratory robotics, and even interpret the results. The conclusions would then influence the next cycle of hypotheses and experiments.

His AI scientist Eve recently helped researchers discover that triclosan, an ingredient commonly found in toothpaste, could be used as an antimalarial drug against certain strains that have developed a resistance to other common drug therapies. The research was published in the journal Scientific Reports.

Automation using artificial intelligence for drug discovery has become a growing area of research, as the machines can work orders of magnitude faster than any human. AI is also being applied in related areas, such as synthetic biology for the rapid design and manufacture of microorganisms for industrial uses.

King argues that machines are better suited to unraveling the complexities of biological systems, since even the most “simple” organisms are host to thousands of genes, proteins, and small molecules that interact in complicated ways.

“Robot scientists and semi-automated AI tools are essential for the future of biology, as there are simply not enough human biologists to do the necessary work,” he said.

Creating Shockwaves in Science
The use of machine learning, neural networks, and other AI methods can often get better results in a fraction of the time it would normally take to crunch data.

For instance, scientists at the National Center for Supercomputing Applications, located at the University of Illinois at Urbana-Champaign, have a deep learning system for the rapid detection and characterization of gravitational waves. Gravitational waves are disturbances in spacetime, emanating from big, high-energy cosmic events, such as the massive explosion of a star known as a supernova. The “Holy Grail” of this type of research is to detect gravitational waves from the Big Bang.

Dubbed Deep Filtering, the method allows real-time processing of data from LIGO, a gravitational wave observatory composed of two enormous laser interferometers located thousands of miles apart in Washington State and Louisiana. The research was published in Physics Letters B. You can watch a trippy visualization of the results below.

In a more down-to-earth example, scientists published a paper last month in Science Advances on the development of a neural network called ConvNetQuake to detect and locate minor earthquakes from ground motion measurements called seismograms.
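As a rough sketch in the spirit of such a detector (not a reproduction of ConvNetQuake, whose architecture and training data differ), a tiny 1D convolutional network might classify a fixed-length, three-component seismogram window as “event” or “no event”:

```python
import torch
import torch.nn as nn

# Minimal, hypothetical sketch: a small 1D convolutional network that takes a
# three-channel seismogram window and outputs "no event" vs. "event".

class TinyQuakeNet(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):            # x: (batch, 3 channels, samples)
        h = self.features(x)
        h = h.mean(dim=-1)           # global average pooling over time
        return self.head(h)

model = TinyQuakeNet()
window = torch.randn(8, 3, 1000)     # batch of 8 ten-second windows sampled at 100 Hz
logits = model(window)
print(logits.shape)                  # torch.Size([8, 2])
```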

ConvNetQuake uncovered 17 times more earthquakes than traditional methods. Scientists say the new method is particularly useful in monitoring small-scale seismic activity, which has become more frequent, possibly due to fracking activities that involve injecting wastewater deep underground. You can learn more about ConvNetQuake in this video:

King says he believes that in the long term there will be no limit to what AI can accomplish in science. He and his team, including Eve, are currently working on developing cancer therapies under a grant from DARPA.

“Robot scientists are getting smarter and smarter; human scientists are not,” he says. “Indeed, there is arguably a case that human scientists are less good. I don’t see any scientist alive today of the stature of a Newton or Einstein—despite the vast number of living scientists. The Physics Nobel [laureate] Frank Wilczek is on record as saying (10 years ago) that in 100 years’ time the best physicist will be a machine. I agree.”

Image Credit: Romaset / Shutterstock.com


#431999 Brain-Like Chips Now Beat the Human ...

Move over, deep learning. Neuromorphic computing—the next big thing in artificial intelligence—is on fire.

Just last week, two studies individually unveiled computer chips modeled after information processing in the human brain.

The first, published in Nature Materials, found a perfect solution to deal with unpredictability at synapses—the gap between two neurons that transmit and store information. The second, published in Science Advances, further amped up the system’s computational power, filling synapses with nanoclusters of supermagnetic material to bolster information encoding.

The result? Brain-like hardware systems that compute faster—and more efficiently—than the human brain.

“Ultimately we want a chip as big as a fingernail to replace one big supercomputer,” said Dr. Jeehwan Kim, who led the first study at MIT in Cambridge, Massachusetts.

Experts are hopeful.

“The field’s full of hype, and it’s nice to see quality work presented in an objective way,” said Dr. Carver Mead, an engineer at the California Institute of Technology in Pasadena not involved in the work.

Software to Hardware
The human brain is the ultimate computational wizard. With roughly 100 billion neurons densely packed into the size of a small football, the brain can deftly handle complex computation at lightning speed using very little energy.

AI experts have taken note. The past few years saw brain-inspired algorithms that can identify faces, falsify voices, and play a variety of games at—and often above—human capability.

But software is only part of the equation. Our current computers, with their transistors and binary digital systems, aren’t equipped to run these powerful algorithms.

That’s where neuromorphic computing comes in. The idea is simple: fabricate a computer chip that mimics the brain at the hardware level. Here, data is both processed and stored within the chip in an analog manner. Each artificial synapse can accumulate and integrate small bits of information from multiple sources and fire only when it reaches a threshold—much like its biological counterpart.
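A toy software sketch of that integrate-and-fire behavior looks like the following; it is purely illustrative, since real neuromorphic chips implement this in analog hardware rather than code.

```python
# Toy sketch of integrate-and-fire: an artificial "neuron" accumulates small weighted
# inputs from several sources and only fires once the accumulated value crosses a
# threshold, then resets.

def integrate_and_fire(input_streams, weights, threshold=1.0, leak=0.95):
    potential, spikes = 0.0, []
    for t, inputs in enumerate(zip(*input_streams)):
        potential = potential * leak + sum(w * x for w, x in zip(weights, inputs))
        if potential >= threshold:
            spikes.append(t)      # the neuron "fires"
            potential = 0.0       # and resets, much like its biological counterpart
    return spikes

# Three input sources delivering small bits of signal over ten time steps.
sources = [
    [0.2, 0.0, 0.3, 0.1, 0.0, 0.4, 0.0, 0.2, 0.3, 0.1],
    [0.1, 0.2, 0.0, 0.3, 0.2, 0.0, 0.1, 0.0, 0.2, 0.3],
    [0.0, 0.1, 0.2, 0.0, 0.3, 0.1, 0.2, 0.3, 0.0, 0.2],
]
print("Fired at time steps:", integrate_and_fire(sources, weights=[0.5, 0.8, 0.6]))
```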

Experts believe the speed and efficiency gains will be enormous.

For one, the chips will no longer have to transfer data between the central processing unit (CPU) and storage blocks, which wastes both time and energy. For another, like biological neural networks, neuromorphic devices can support neurons that run millions of streams of parallel computation.

A “Brain-on-a-chip”
Optimism aside, reproducing the biological synapse in hardware form hasn’t been as easy as anticipated.

Neuromorphic chips exist in many forms, but often look like a nanoscale metal sandwich. The “bread” pieces are generally made of conductive plates surrounding a switching medium—a conductive material of sorts that acts like the gap in a biological synapse.

When a voltage is applied, as in the case of data input, ions move within the switching medium, which then creates conductive streams to stimulate the downstream plate. This change in conductivity mimics the way biological neurons change their “weight,” or the strength of connectivity between two adjacent neurons.

But so far, neuromorphic synapses have been rather unpredictable. According to Kim, that’s because the switching medium is often composed of material that can’t channel ions to exact locations on the downstream plate.

“Once you apply some voltage to represent some data with your artificial neuron, you have to erase and be able to write it again in the exact same way,” explains Kim. “But in an amorphous solid, when you write again, the ions go in different directions because there are lots of defects.”

In his new study, Kim and colleagues swapped the jelly-like switching medium for silicon, a material with only a single line of defects that acts like a channel to guide ions.

The chip starts with a thin wafer of silicon etched with a honeycomb-like pattern. On top is a layer of silicon germanium—something often present in transistors—in the same pattern. This creates a funnel-like dislocation, a kind of Grand Canal that perfectly shuttles ions across the artificial synapse.

The researchers then made a neuromorphic chip containing these synapses and shot an electrical zap through them. Incredibly, the synapses’ responses varied by only four percent—far more uniform than those of any neuromorphic device made with an amorphous switching medium.

In a computer simulation, the team built a multi-layer artificial neural network using parameters measured from their device. After tens of thousands of training examples, their neural network correctly recognized samples 95 percent of the time, just 2 percent lower than state-of-the-art software algorithms.

The upside? The neuromorphic chip requires much less space than the hardware that runs deep learning algorithms. Forget supercomputers—these chips could one day run complex computations right on our handheld devices.

A Magnetic Boost
Meanwhile, in Boulder, Colorado, Dr. Michael Schneider at the National Institute of Standards and Technology also realized that the standard switching medium had to go.

“There must be a better way to do this, because nature has figured out a better way to do this,” he says.

His solution? Nanoclusters of magnetic manganese.

Schneider’s chip contained two slices of superconducting electrodes made out of niobium, which channel electricity with no resistance. When researchers applied different magnetic fields to the synapse, they could control the alignment of the manganese “filling.”

The switch gave the chip a double boost. For one, by aligning the switching medium, the team could predict the ion flow and boost uniformity. For another, the magnetic manganese itself adds computational power. The chip can now encode data in both the level of electrical input and the direction of the magnetism, without bulking up the synapse.

It seriously worked. Firing a billion times per second, the chips ran several orders of magnitude faster than human neurons. Plus, the chips required just one ten-thousandth of the energy used by their biological counterparts, all while synthesizing input from nine different sources in an analog manner.

The Road Ahead
These studies show that we may be nearing a benchmark where artificial synapses match—or even outperform—their human inspiration.

But to Dr. Steven Furber, an expert in neuromorphic computing, we still have a ways to go before these chips hit the mainstream.

Many of the special materials used in these chips require specific temperatures, he says. Magnetic manganese chips, for example, require temperatures around absolute zero to operate, meaning they come with the need for giant cooling tanks filled with liquid helium—obviously not practical for everyday use.

Another issue is scalability. Millions of synapses are necessary before a neuromorphic device can be used to tackle everyday problems such as facial recognition. So far, no deal.

But these problems may in fact be a driving force for the entire field. Intense competition could push teams into exploring different ideas and solutions to similar problems, much like these two studies.

If so, future chips may come in diverse flavors. Similar to our vast array of deep learning algorithms and operating systems, the computer chips of the future may also vary depending on specific requirements and needs.

It is worth developing as many different technological approaches as possible, says Furber, especially as neuroscientists increasingly understand what makes our biological synapses—the ultimate inspiration—so amazingly efficient.

Image Credit: arakio / Shutterstock.com
