Tag Archives: data science

#436488 Tech’s Biggest Leaps From the Last 10 ...

As we enter our third decade in the 21st century, it seems appropriate to reflect on the ways technology developed and note the breakthroughs that were achieved in the last 10 years.

The 2010s saw IBM’s Watson win a game of Jeopardy, ushering in mainstream awareness of machine learning, along with DeepMind’s AlphaGO becoming the world’s Go champion. It was the decade that industrial tools like drones, 3D printers, genetic sequencing, and virtual reality (VR) all became consumer products. And it was a decade in which some alarming trends related to surveillance, targeted misinformation, and deepfakes came online.

For better or worse, the past decade was a breathtaking era in human history in which the idea of exponential growth in information technologies powered by computation became a mainstream concept.

As I did last year, when the focus was on 2018 alone, I’ve asked a collection of experts across the Singularity University faculty to help frame the biggest breakthroughs and moments that gave shape to the past 10 years. I asked them what, in their opinion, was the most important breakthrough in their respective fields over the past decade.

My own answer to this question, focused in the space of augmented and virtual reality, would be the stunning announcement in March of 2014 that Facebook acquired Oculus VR for $2 billion. Although VR technology had been around for a while, it was at this precise moment that VR arrived as a consumer technology platform. Facebook, largely fueled by the singular interest of CEO Mark Zuckerberg, has funded the development of this industry, keeping alive the hope that consumer VR can become a sustainable business. In the meantime, VR has continued to grow in sophistication and usefulness, though it has yet to truly take off as a mainstream concept. That will hopefully be a development for the 2020s.

Below is a decade in review across the technology areas that are giving shape to our modern world, as described by the SU community of experts.

Digital Biology
Dr. Tiffany Vora | Faculty Director and Vice Chair, Digital Biology and Medicine, Singularity University

In my mind, this decade of astounding breakthroughs in the life sciences and medicine rests on the achievement of the $1,000 human genome in 2016. More-than-exponentially falling costs of DNA sequencing have driven advances in medicine, agriculture, ecology, genome editing, synthetic biology, the battle against climate change, and our fundamental understanding of life and its breathtaking connections. The “digital” revolution in DNA constituted an important model for harnessing other types of biological information, from personalized bio data to massive datasets spanning populations and species.

Crucially, by aggressively driving down the cost of such analyses, researchers and entrepreneurs democratized access to the source code of life—with attendant financial, cultural, and ethical consequences. Exciting, but take heed: Veritas Genetics spearheaded a $600 genome in 2019, only to shutter its US operations after its funding became entangled in the US-China trade war. Stay tuned through the early 2020s to see the pricing of DNA sequencing fall even further … and to experience the many ways that cheaper, faster harvesting of biological data will enrich your daily life.

Cryptocurrency
Alex Gladstein | Chief Strategy Officer, Human Rights Foundation

The past decade has seen Bitcoin go from just an idea on an obscure online message board to a global financial network carrying more than 100 billion dollars in value. And we’re just getting started. One recent defining moment in the cryptocurrency space has been a stunning trend underway in Venezuela, where today, the daily dollar-denominated value of Bitcoin traded now far exceeds the daily dollar-denominated value traded on the Caracas Stock Exchange. It’s just one country, but it’s a significant country, and a paradigm shift.

Governments and corporations are following Bitcoin’s success too, and are looking to launch their own digital currencies. China will launch its “DC/EP” project in the coming months, and Facebook is trying to kickstart its Libra project. There are technical and regulatory uncertainties for both, but one thing is for certain: the era of digital currency has arrived.

Business Strategy and Entrepreneurship
Pascal Finette | Chair, Entrepreneurship and Open Innovation, Singularity University

For me, without a doubt, the most interesting and quite possibly ground-shifting development in the fields of entrepreneurship and corporate innovation in the last ten years is the rapid maturing of customer-driven product development frameworks such as Lean Startup, and its subsequent adoption by corporates for their own innovation purposes.

Tools and frameworks like the Business Model Canvas, agile (software) development and the aforementioned Lean Startup methodology fundamentally shifted the way we think and go about building products, services, and companies, with many of these tools bursting onto the startup scene in the late 2000s and early 2010s.

As these tools matured, they found mass adoption not only among startups around the world, but also among incumbent companies, which eagerly adopted them to increase their own innovation velocity and success.

Energy
Ramez Naam | Co-Chair, Energy and Environment, Singularity University

The 2010s were the decade that saw clean electricity, energy storage, and electric vehicles break through price and performance barriers around the world. Solar, wind, batteries, and EVs started this decade as technologies that had to be subsidized. That was the first phase of their existence. Now they’re entering their third, most disruptive phase, where shifting to clean energy and mobility is cheaper than continuing to use existing coal, gas, or oil infrastructure.

Consider that at the start of 2010, there was no place on earth where building new solar or wind was cheaper than building new coal or gas power generation. By 2015, in some of the sunniest and windiest places on earth, solar and wind had entered their second phase, where they were cost-competitive for new power. And then, in 2018 and 2019, we started to see the edge of the third phase, as building new solar and wind, in some parts of the world, was cheaper than operating existing coal or gas power plants.

Food Technology
Liz Specht, PhD | Associate Director of Science & Technology, The Good Food Institute

The arrival of mainstream plant-based meat is easily the food tech advance of the decade. Meat analogs have, of course, been around forever. But only in the last decade have companies like Beyond Meat and Impossible Foods decided to cut animals out of the process and build no-compromise meat directly from plants.

Plant-based meat is already transforming the fast-food industry. For example, the introduction of the Impossible Whopper led Burger King to their most profitable quarter in many years. But the global food industry as a whole is shifting as well. Tyson, JBS, Nestle, Cargill, and many others are all embracing plant-based meat.

Augmented and Virtual Reality
Jody Medich | CEO, Superhuman-x

The breakthrough moment for augmented and virtual reality came in 2013 when Palmer Luckey took apart an Android smartphone and added optic lenses to make the first version of the Oculus Rift. Prior to that moment, we struggled with miniaturizing the components needed to develop low-latency head-worn devices. But thanks to the smartphone race started in 2007 with the iPhone, we finally had a suite of sensors, chips, displays, and computing power small enough to put on the head.

What will the next 10 years bring? Look for AR/VR to explode in a big way. We are right on the cusp of that tipping point when the tech is finally “good enough” for our linear expectations. Given all it can do today, we can’t even picture what’s possible. Just as today we can’t function without our phones, by 2029 we’ll feel lost without some AR/VR product. It will be the way we interact with computing, smart objects, and AI. Tim Cook, Apple CEO, predicts it will replace all of today’s computing devices. I can’t wait.

Philosophy of Technology
Alix Rübsaam | Faculty Fellow, Singularity University, Philosophy of Technology/Ethics of AI

The last decade has seen a significant shift in our general attitude towards the algorithms that we now know dictate much of our surroundings. Looking back at the beginning of the decade, it seems we were blissfully unaware of how the data we freely and willingly surrendered would feed the algorithms that would come to shape every aspect of our daily lives: the news we consume, the products we purchase, the opinions we hold, etc.

If I were to isolate a single publication that contributed greatly to the shift in public discourse on algorithms, it would have to be Cathy O’Neil’s Weapons of Math Destruction from 2016. It remains a comprehensive, readable, and highly informative insight into how algorithms dictate our finances, our jobs, where we go to school, or if we can get health insurance. Its publication represents a pivotal moment when the general public started to question whether we should be OK with outsourcing decision making to these opaque systems.

The ubiquity of ethical guidelines for AI and algorithms published just in the last year (perhaps most comprehensively by the AI Now Institute) fully demonstrates the shift in public opinion of this decade.

Data Science
Ola Kowalewski | Faculty Fellow, Singularity University, Data Innovation

In the last decade we entered the era of internet and smartphone ubiquity. The number of internet users doubled: nearly 60 percent of the global population is now connected online, and more than 35 percent owns a smartphone. With billions of people in a state of constant connectedness and therefore in a state of constant surveillance, the companies that have built the tech infrastructure and information pipelines have dominated the global economy. This shift from tech companies being the underdogs to arguably the world’s major powers sets the landscape we enter for the next decade.

Global Grand Challenges
Darlene Damm | Vice Chair, Faculty, Global Grand Challenges, Singularity University

The biggest breakthrough over the last decade in social impact and technology is that the social impact sector switched from seeing technology as something problematic to avoid, to one of the most effective ways to create social change. We now see people using exponential technologies to solve all sorts of social challenges in areas ranging from disaster response to hunger to shelter.

The world’s leading social organizations, such as UNICEF and the World Food Programme, have launched their own venture funds and accelerators, and the United Nations recently declared that digitization is revolutionizing global development.

Digital Biology
Raymond McCauley | Chair, Digital Biology, Singularity University, Co-Founder & Chief Architect, BioCurious; Principal, Exponential Biosciences

CRISPR is bringing about a revolution in genetic engineering. It’s obvious, and it’s huge. What may not be so obvious is the widespread adoption of genetic testing. And this may have an even longer-lasting effect. It’s used to test new babies, to solve medical mysteries, and to catch serial killers. Thanks to holiday ads from 23andMe and Ancestry.com, it’s everywhere. Testing your DNA is now a common over-the-counter product. People are using it to set their diet, to pick drugs, and even for dating (or at least picking healthy mates).

And we’re just in the early stages. Further down the line, doing large-scale studies on more people, with more data, will lead to the use of polygenic risk scores to help us rank our genetic potential for everything from getting cancer to being a genius. Can you imagine what it would be like for parents to pick new babies, GATTACA-style, to get the smartest kids? You don’t have to; it’s already happening.
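To make the idea concrete, a polygenic risk score is, at its core, just a weighted sum: each genetic variant contributes its risk-allele count (0, 1, or 2) multiplied by an effect size estimated from large-scale association studies. The sketch below illustrates that arithmetic only; the variant IDs and effect sizes are invented for illustration, not drawn from any real study.

```python
# Hypothetical illustration of a polygenic risk score (PRS).
# Each variant contributes: (effect size) x (risk-allele count, 0/1/2).

def polygenic_risk_score(genotype, effect_sizes):
    """Weighted sum of risk-allele counts across variants."""
    return sum(effect_sizes[v] * count for v, count in genotype.items())

# Made-up effect sizes for three hypothetical variants.
effects = {"rsA": 0.12, "rsB": -0.05, "rsC": 0.30}

# One individual's allele counts at those variants.
person = {"rsA": 2, "rsB": 1, "rsC": 0}

score = polygenic_risk_score(person, effects)
print(round(score, 2))  # 0.12*2 - 0.05*1 + 0.30*0 = 0.19
```

In practice the sum runs over thousands to millions of variants, and the resulting score is interpreted relative to a reference population rather than as an absolute number.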

Artificial Intelligence
Neil Jacobstein | Chair, Artificial Intelligence and Robotics, Singularity University

The convergence of exponentially improved computing power, the deep learning algorithm, and access to massive data resulted in a series of AI breakthroughs over the past decade. These included vastly improved accuracy in identifying images, practical self-driving cars, victories over several world champions in Go, and the ability to identify gender, smoking status, and age from retinal fundus photographs.

Combined, these breakthroughs convinced researchers and investors that after 50+ years of research and development, AI was ready for prime-time applications. Now, virtually every field of human endeavor is being revolutionized by machine learning. We still have a long way to go to achieve human-level intelligence and beyond, but the pace of worldwide improvement is blistering.

Hod Lipson | Professor of Engineering and Data Science, Columbia University

The biggest moment in AI in the past decade (and in its entire history, in my humble opinion) was midnight, Pacific time, September 30, 2012: the moment when machines finally opened their eyes. It was the moment when deep learning took off, breaking stagnant decades of machine blindness, when AI couldn’t reliably tell even a cat from a dog. That seemingly trivial accomplishment—a task any one-year-old child can do—has had a ripple effect on AI applications from driverless cars to health diagnostics. And this is just the beginning of what is sure to be a Cambrian explosion of AI.

Neuroscience
Divya Chander | Chair, Neuroscience, Singularity University

If the 2000s were the decade of brain mapping, then the 2010s were the decade of brain writing. Optogenetics, a technique for precisely mapping and controlling neurons and neural circuits using genetically-directed light, saw incredible growth in the 2010s.

Also in the last 10 years, neuromodulation, or the ability to rewire the brain using both invasive and non-invasive interfaces and energy, has exploded in use and form. For instance, the Braingate consortium showed us how electrode arrays implanted into the motor cortex could be used by paralyzed people to direct a robotic arm with their thoughts. These technologies, alone or in combination with robotics, exoskeletons, and flexible, implantable electronics, also make possible a future of human augmentation.

Image Credit: Image by Jorge Guillen from Pixabay

Posted in Human Robots

#436437 Why AI Will Be the Best Tool for ...

Dmitry Kaminskiy speaks as though he were trying to unload everything he knows about the science and economics of longevity—from senolytics research that seeks to stop aging cells from spewing inflammatory proteins and other molecules to the trillion-dollar life extension industry that he and his colleagues are trying to foster—in one sitting.

At the heart of the discussion with Singularity Hub is the idea that artificial intelligence will be the engine that drives breakthroughs in how we approach healthcare and healthy aging—a concept with little traction even just five years ago.

“At that time, it was considered too futuristic that artificial intelligence and data science … might be more accurate compared to any hypothesis of human doctors,” said Kaminskiy, co-founder and managing partner at Deep Knowledge Ventures, an investment firm that is betting big on AI and longevity.

How times have changed. Artificial intelligence in healthcare is attracting more investments and deals than just about any sector of the economy, according to data research firm CB Insights. In the most recent third quarter, AI healthcare startups raised nearly $1.6 billion, buoyed by a $550 million mega-round from London-based Babylon Health, which uses AI to collect data from patients, analyze the information, find comparable matches, then make recommendations.

Even without the big bump from Babylon Health, AI healthcare startups raised more than $1 billion last quarter, including two companies focused on longevity therapeutics: Juvenescence and Insilico Medicine.

The latter has risen to prominence for its novel use of reinforcement learning and generative adversarial networks (GANs) to accelerate the drug discovery process. Insilico Medicine recently published a seminal paper that demonstrated how such an AI system could generate a drug candidate in just 46 days. Co-founder and CEO Alex Zhavoronkov said he believes there is no greater goal in healthcare today—or, really, any venture—than extending the healthy years of the human lifespan.

“I don’t think that there is anything more important than that,” he told Singularity Hub, explaining that an unhealthy society is detrimental to a healthy economy. “I think that it’s very, very important to extend healthy, productive lifespan just to fix the economy.”

An Aging Crisis
The surge of interest in longevity is coming at a time when life expectancy in the US is actually dropping, despite the fact that we spend more money on healthcare than any other nation.

A new paper in the Journal of the American Medical Association found that after six decades of gains, life expectancy for Americans has decreased since 2014, particularly among young and middle-aged adults. While some of the causes are societal, such as drug overdoses and suicide, others are health-related.

While average life expectancy in the US is 78, Kaminskiy noted that healthy life expectancy is about ten years less.

To Zhavoronkov’s point about the economy (a topic of great interest to Kaminskiy as well), the US spent $1.1 trillion on chronic diseases in 2016, according to a report from the Milken Institute, with diabetes, cardiovascular conditions, and Alzheimer’s among the most costly expenses to the healthcare system. When the indirect costs of lost economic productivity are included, the total price tag of chronic diseases in the US is $3.7 trillion, nearly 20 percent of GDP.

“So this is the major negative feedback on the national economy and creating a lot of negative social [and] financial issues,” Kaminskiy said.

Investing in Longevity
That has convinced Kaminskiy that an economy focused on extending healthy human lifespans—including the financial instruments and institutions required to support a long-lived population—is the best way forward.

He has co-authored a book on the topic with Margaretta Colangelo, another managing partner at Deep Knowledge Ventures, which has launched a specialized investment fund, Longevity.Capital, focused on the longevity industry. Kaminskiy estimates that there are now about 20 such investment funds dedicated to funding life extension companies.

In November at the inaugural AI for Longevity Summit in London, he and his collaborators also introduced the Longevity AI Consortium, an academic-industry initiative at King’s College London. Eventually, the research center will include an AI Longevity Accelerator program to serve as a bridge between startups and UK investors.

Deep Knowledge Ventures has committed about £7 million ($9 million) over the next three years to the accelerator program and to establishing similar consortiums in other regions of the world, according to Franco Cortese, a partner at Longevity.Capital and director of the Aging Analytics Agency, which has produced a series of reports on longevity.

A Cure for What Ages You
One of the most recent is an overview of Biomarkers for Longevity. A biomarker, in the case of longevity, is a measurable component of health that can indicate a disease state or a more general decline in health associated with aging. Examples range from something as simple as BMI as an indicator of obesity, which is associated with a number of chronic diseases, to sophisticated measurements of telomeres, the protective ends of chromosomes that shorten as we age.
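The simplest biomarker mentioned above is easy to make concrete: BMI is weight in kilograms divided by height in meters squared, with 30 the commonly used obesity cutoff in the standard WHO classification. A minimal sketch:

```python
# Body mass index: weight (kg) divided by height (m) squared.
# The cutoff of 30 follows the standard WHO obesity classification.

def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def is_obese(weight_kg, height_m):
    return bmi(weight_kg, height_m) >= 30.0

print(round(bmi(95.0, 1.75), 1))  # 31.0
print(is_obese(95.0, 1.75))       # True
```

Telomere length and other molecular biomarkers require lab assays rather than a two-line formula, but the principle is the same: a measurable quantity mapped to a health state.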

While some researchers are working on moonshot therapies to reverse or slow aging—with a few even arguing we could expand human life on the order of centuries—Kaminskiy said he believes understanding biomarkers of aging could make more radical interventions unnecessary.

In this vision of healthcare, people would be able to monitor their health 24-7, with sensors attuned to various biomarkers that could indicate the onset of everything from the flu to diabetes. AI would be instrumental in not just ingesting the billions of data points required to develop such a system, but also what therapies, treatments, or micro-doses of a drug or supplement would be required to maintain homeostasis.

“Consider it like Tesla with many, many detectors, analyzing the behavior of the car in real time, and a cloud computing system monitoring those signals in real time with high frequency,” Kaminskiy explained. “So the same shall be applied for humans.”

And only sophisticated algorithms, Kaminskiy argued, can make longevity healthcare work on a mass scale but at the individual level. Precision medicine becomes preventive medicine. Healthcare truly becomes a system to support health rather than a way to fight disease.

Image Credit: Photo by h heyerlein on Unsplash


#436021 AI Faces Speed Bumps and Potholes on Its ...

Implementing machine learning in the real world isn’t easy. The tools are available and the road is well-marked—but the speed bumps are many.

That was the conclusion of panelists wrapping up a day of discussions at the IEEE AI Symposium 2019, held at Cisco’s San Jose, Calif., campus last week.

The toughest problem, says Ben Irving, senior manager of Cisco’s strategy innovations group, is people.

It’s tough to find data scientist expertise, he indicated, so companies are looking into non-traditional sources of personnel, like political science. “There are some untapped areas with a lot of untapped data science expertise,” Irving says.

Lazard’s artificial intelligence manager Trevor Mottl agreed that would-be data scientists don’t need formal training or experience to break into the field. “This field is changing really rapidly,” he says. “There are new language models coming out every month, and new tools, so [anyone should] expect to not know everything. Experiment, try out new tools and techniques, read, study, spend time; there aren’t any true experts at this point because the foundational elements are shifting so rapidly.”

“It is a wonderful time to get into a field,” he reasons, noting that it doesn’t take long to catch up because there aren’t 20 years of history.

Confusion about what different kinds of machine learning specialists do doesn’t help the personnel situation. An audience member asked panelists to explain the difference between data scientist, data analyst, and data engineer. Darrin Johnson, Nvidia global director of technical marketing for enterprise, admitted it’s hard to sort out, and any two companies could define the positions differently. “Sometimes,” he says, particularly at smaller companies, “a data scientist plays all three roles. But as companies grow, there are different groups that ingest data, clean data, and use data. At some companies, training and inference are separate. It really depends, which is a challenge when you are trying to hire someone.”

Mitigating the risks of a hot job market

The competition to hire data scientists, analysts, engineers, or whatever companies call them requires that managers make sure any work being done is structured and comprehensible at all times, the panelists cautioned.

“We need to remember that our data scientists go home every day and sometimes they don’t come back because they go home and then go to a different company,” says Lazard’s Mottl. “That’s a fact of life. If you give people choice on [how they do development], and have a successful person who gets poached by a competitor, you have to either hire a team to unwrap what that person built or jettison their work and rebuild it.”

By contrast, he says, “places that have structured coding and structured commits and organized constructions of software have done very well.”

But keeping all of a company’s engineers working with the same languages and on the same development paths is not easy to do in a field that moves as fast as machine learning. Zongjie Diao, Cisco director of product management for machine learning, quipped: “I have a data scientist friend who says the speed at which he changes girlfriends is less than the speed at which he changes languages.”

The data scientist/IT manager clash

Once a company finds the data engineers and scientists it needs and gets them started on the task of applying machine learning to the company’s operations, one of the first obstacles they face just might be the company’s IT department, the panelists suggested.

“IT is process oriented,” Mottl says. The IT team “knows how to keep data secure, to set up servers. But when you bring in a data science team, they want sandboxes, they want freedom, they want to explore and play.”

Also, Nvidia’s Johnson pointed out, “There is a language barrier.” The AI world, he says, is very different from networking or storage, and data scientists find it hard to articulate their requirements to IT.

On the ground or in the cloud?

And then there is the decision of where exactly machine learning should happen—on site, or in the cloud? At Lazard, Mottl says, the deep learning engineers do their experimentation on premises; that’s their sandbox. “But when we deploy, we deploy in the cloud,” he says.

Nvidia, Johnson says, thinks the opposite approach is better. We see the cloud as “the sandbox,” he says. “So you can run as many experiments as possible, fail fast, and learn faster.”

For Cisco’s Irving, the “where” of machine learning depends on the confidentiality of the data.

Mottl, who says rolling machine learning technology into operation can hit resistance from all across the company, had one last word of caution for those aiming to implement AI:

Data scientists are building things that might change the ways other people in the organization work, like sales and even knowledge workers. [You need to] think about the internal stakeholders and prepare them, because the last thing you want to do is to create a valuable new thing that nobody likes and people take potshots against.

The AI Symposium was organized by the Silicon Valley chapters of the IEEE Young Professionals, the IEEE Consultants’ Network, and IEEE Women in Engineering and supported by Cisco.


#434772 Traditional Higher Education Is Losing ...

Should you go to graduate school? If so, why? If not, what are your alternatives? Millions of young adults across the globe—and their parents and mentors—find themselves asking these questions every year.

Earlier this month, I explored how exponential technologies are rising to meet the needs of the rapidly changing workforce.

In this blog, I’ll dive into a highly effective way to build the business acumen and skills needed to make the most significant impact in these exponential times.

To start, let’s dive into the value of graduate school versus apprenticeship—especially during this time of extraordinarily rapid growth, and the micro-diversification of careers.

The True Value of an MBA
All graduate schools are not created equal.

For complex technical trades like medicine, engineering, and law, formal graduate-level training provides a critical foundation for safe, ethical practice (until these trades are fully augmented by artificial intelligence and automation…).

For the purposes of today’s blog, let’s focus on the value of a Master in Business Administration (MBA) degree, compared to acquiring your business acumen through various forms of apprenticeship.

The Waning of Business Degrees
Ironically, business schools are facing a tough business problem. The rapid rate of technological change, a booming job market, and the digitization of education are chipping away at the traditional graduate-level business program.

The data speaks for itself.

The Decline of Graduate School Admissions
Enrollment in two-year, full-time MBA programs in the US fell by more than one-third from 2010 to 2016.

While in previous years, top business schools (e.g. Stanford, Harvard, and Wharton) were safe from the decrease in applications, this year, they also felt the waning interest in MBA programs.

Harvard Business School: 4.5 percent decrease in applications, the school’s biggest drop since 2005.
Wharton: 6.7 percent decrease in applications.
Stanford Graduate School: 4.6 percent decrease in applications.

Another signal of change began unfolding over the past week. You may have read news headlines about an emerging college admissions scam, which implicates highly selective US universities, sports coaches, parents, and students in a conspiracy to game the undergraduate admissions process.

Already, students are filing multibillion-dollar civil lawsuits arguing that the scheme has devalued their degrees or denied them a fair admissions opportunity.

MBA Graduates in the Workforce
To meet today’s business needs, startups and massive companies alike are increasingly hiring technologists, developers, and engineers in place of the MBA graduates they may have preferentially hired in the past.

While 85 percent of US employers expect to hire MBA graduates this year (a decrease from 91 percent in 2017), 52 percent of employers worldwide expect to hire graduates with a master’s in data analytics (an increase from 35 percent last year).

We’re also seeing the waning of MBA degree holders at the CEO level.

For decades, an MBA was the hallmark of upward mobility towards the C-suite of top companies.

But as exponential technologies permeate not only products but every part of the supply chain—from manufacturing and shipping to sales, marketing and customer service—that trend is changing by necessity.

Looking at the Harvard Business Review’s Top 100 CEOs in 2018 list, more CEOs on the list held engineering degrees than MBAs (34 held engineering degrees, while 32 held MBAs).

There’s much more to leading innovative companies than an advanced business degree.

How Are Schools Responding?
With disruption to the advanced business education system already here, some business schools are applying notes from their own innovation classes to brace for change.

Over the past half-decade, we’ve seen schools with smaller MBA programs shut their doors in favor of advanced degrees with more specialization. This directly responds to market demand for skills in data science, supply chain, and manufacturing.

Some degrees resemble the precise skills training of technical trades. Others are very much in line with the apprenticeship models we’ll explore next.

Regardless, this new specialization strategy is working and attracting more new students. Over the past decade (2006 to 2016), enrollment in specialized graduate business programs doubled.

Higher education is also seeing a preference shift toward for-profit trade schools, like coding boot camps. This shift is one of several forces pushing universities to adopt skill-specific advanced degrees.

But some schools are slow to adapt, raising the question: how and when will these legacy programs be disrupted? A survey of over 170 business school deans around the world showed that many programs are operating at a loss.

But if these schools are world-class business institutions, as advertised, why do they keep the doors open even while they lose money? The surveyed deans revealed an important insight: they keep the degree program open because of the program’s prestige.

Why Go to Business School?
Shorthand Credibility, Cognitive Biases, and Prestige
Regardless of what knowledge a person takes away from graduate school, attending one of the world’s most rigorous and elite programs gives grads external validation.

With over 55 percent of MBA applicants applying to just 6 percent of graduate business schools, we have a clear cognitive bias toward the perceived elite status of certain universities.

To the outside world, thanks to the power of cognitive biases, an advanced degree is credibility shorthand for your capabilities.

Simply passing through a top school's filtration system signals a baseline level of ability and merit.

And startup success statistics tend to back up that perceived enhanced capability. Let’s take, for example, universities with the most startup unicorn founders (see the figure below).

When you consider the 320+ unicorn startups around the world today, these numbers become even more impressive. Stanford’s 18 unicorn companies account for over 5 percent of global unicorns, and Harvard is responsible for producing just under 5 percent.

Combined, just these two universities (out of over 5,000 in the US, and thousands more around the world) account for 1 in 10 of the billion-dollar private companies in the world.
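The percentages above are easy to verify with back-of-the-envelope arithmetic. Note that the text gives Harvard's share only as "just under 5 percent," so the count of 15 founders below is an illustrative assumption, not a figure from the source:

```python
# Sanity check of the unicorn-founder shares cited above.
stanford_unicorns = 18    # stated in the text
harvard_unicorns = 15     # assumption: "just under 5 percent" of ~320
total_unicorns = 320      # "320+ unicorn startups around the world"

stanford_share = stanford_unicorns / total_unicorns
combined_share = (stanford_unicorns + harvard_unicorns) / total_unicorns

print(f"Stanford share: {stanford_share:.1%}")   # ~5.6%, i.e. "over 5 percent"
print(f"Combined share: {combined_share:.1%}")   # ~10.3%, i.e. roughly 1 in 10
```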

By the numbers, the prestigious reputation of these elite business programs has a firm basis in current innovation success.

While prestige may be inherent to the degree earned by graduates from these business programs, the credibility boost from holding one of these degrees is not a guaranteed path to success in the business world.

For example, you might expect that the Harvard School of Business or Stanford Graduate School of Business would come out on top when tallying up the alma maters of Fortune 500 CEOs.

It turns out that the University of Wisconsin-Madison leads the pack with 14 Fortune 500 CEOs to Harvard's 12. Beyond prestige, the success of these elite business programs translates directly into cultivating unmatched networks and relationships.

Relationships
Graduate schools—particularly at the upper echelon—are excellent at attracting sharp students.

At an elite business school, if you meet just five to ten people with extraordinary skill sets, personalities, ideas, or networks, then you have returned your $200,000 education investment.

It’s no coincidence that some 40 percent of Silicon Valley venture capitalists are alumni of either Harvard or Stanford.

From future investors to advisors, friends, and potential business partners, relationships are critical to an entrepreneur’s success.

Apprenticeships
As we saw above, graduate business degree programs are melting away in the current wave of exponential change.

With US student debt now topping $1.5 trillion, there must be a more impactful alternative to attending graduate school for those starting their careers.

When I think about the most important skills I use today as an entrepreneur, writer, and strategic thinker, they didn’t come from my decade of graduate school at Harvard or MIT… they came from my experiences building real technologies and companies, and working with mentors.

Apprenticeship comes in a variety of forms; here, I’ll cover three top-of-mind approaches:

Real-world business acumen via startup accelerators
A direct apprenticeship model
The 6 D’s of mentorship

Startup Accelerators and Business Practicum
Let’s contrast the shrinking interest in MBA programs with applications to a relatively new model of business education: startup accelerators.

Startup accelerators are short-term (typically three to six months), cohort-based programs focusing on providing startup founders with the resources (capital, mentorship, relationships, and education) needed to refine their entrepreneurial acumen.

While graduate business programs have been condensing, startup accelerators are alive, well, and expanding rapidly.

In the 10 years from 2005 (when Paul Graham founded Y Combinator) through 2015, the number of startup accelerators in the US grew more than tenfold.

The increase in startup accelerator activity hints at a larger trend: our best and brightest business minds are opting to invest their time and efforts in obtaining hands-on experience, creating tangible value for themselves and others, rather than diving into the theory often taught in business school classrooms.

The “Strike Force” Model
The Strike Force is my elite team of young entrepreneurs who work directly with me across all of my companies, travel by my side, sit in on every meeting with me, and help build businesses that change the world.

Previous Strike Force members have gone on to launch successful companies, including Bold Capital Partners, my $250 million venture capital firm.

Strike Force is an apprenticeship for the next generation of exponential entrepreneurs.

To paraphrase my good friend Tony Robbins: If you want to short-circuit the video game, find someone who’s been there and done that and is now doing something you want to one day do.

Every year, over 500,000 apprentices in the US follow this precise template. These apprentices are learning a craft they wish to master, under the mentorship of experts (skilled metal workers, bricklayers, medical technicians, electricians, and more) who have already achieved the desired result.

What if we more readily applied this model to young adults with aspirations of creating massive value through the vehicles of entrepreneurship and innovation?

For the established entrepreneur: How can you bring young entrepreneurs into your organization to create more value for your company, while also passing on your ethos and lessons learned to the next generation?

For the young, driven millennial: How can you find your mentor and convince him or her to take you on as an apprentice? What value can you create for this person in exchange for their guidance and investment in your professional development?

The 6 D’s of Mentorship
In my last blog on education, I shared how mobile device and internet penetration will transform adult literacy and basic education. Mobile phones and connectivity already create extraordinary value for entrepreneurs and young professionals looking to take their business acumen and skill set to the next level.

For all of human history up until the last decade or so, if you wanted to learn from the best and brightest in business, leadership, or strategy, you either had to track down a (often dated) book they wrote at the local library or bookstore, or be lucky enough to meet that person for a live conversation.

Now you can access the mentorship of just about any thought leader on the planet, at any time, for free.

Thanks to the power of the internet, mentorship has digitized, demonetized, dematerialized, and democratized.

What do you want to learn about?

Investing? Leadership? Technology? Marketing? Project management?

You can access a near-infinite stream of cutting-edge tools, tactics, and lessons from thousands of top performers from nearly every field—instantaneously, and for free.

For example, every one of Warren Buffett’s letters to his Berkshire Hathaway investors over the past 40 years is available for free on a device that fits in your pocket.

The rise of audio—particularly podcasts and audiobooks—is another underestimated driving force away from traditional graduate business programs and toward apprenticeships.

Over 28 million podcast episodes are available for free. Once you identify the strong signals in the noise, you’re still left with thousands of hours of long-form podcast conversation from which to learn valuable lessons.

Whenever and wherever you want, you can learn from the world’s best. In the future, mentorship and apprenticeship will only become more personalized. Imagine accessing a high-fidelity, AI-powered avatar of Bill Gates, Richard Branson, or Arthur C. Clarke (one of my early mentors) to help guide you through your career.

Virtual mentorship and coaching are powerful education forces that are here to stay.

Bringing It All Together
The education system is rapidly changing. Traditional master’s programs for business are ebbing away in the tides of exponential technologies. Apprenticeship models are reemerging as an effective way to train tomorrow’s leaders.

In a future blog, I’ll revisit the concept of apprenticeships and other effective business school alternatives.

If you are a young, ambitious entrepreneur (or the parent of one), remember that you live in the most abundant time ever in human history to refine your craft.

Right now, you have access to world-class mentorship and cutting-edge best-practices—literally in the palm of your hand. What will you do with this extraordinary power?

Join Me
Abundance-Digital Online Community: I’ve created a Digital/Online community of bold, abundance-minded entrepreneurs called Abundance-Digital. Abundance-Digital is my ‘onramp’ for exponential entrepreneurs – those who want to get involved and play at a higher level. Click here to learn more.

Image Credit: fongbeerredhot / Shutterstock.com
