Does The Online Education Revolution Mean The Death Of The Diploma?

As the options for self-education explode, what does a college education mean? And how can we measure what a good education is?

Education is changing, and it’s changing fast. Anyone can put together a personalized educational experience via digital textbooks accessible by iPad, video learning from top university faculty, or peer-led discussion. People of all demographics are gathering their own seeds of education and cultivating lush sets of hybrid tools to deal with the rapid knowledge replenishment that’s essential in an economy where massive career specialization and constant innovation reign.

This is the first piece in a Collaborative Fund-curated series on creativity and values, written by thought leaders in the for-profit, for-good business space.

What we’re witnessing is a bottom-up revolution in education: Learners, not institutions, are leading innovation. This is an era of plenty. I like to call it the Education Harvest.

But there is a huge issue that’s preventing lifelong learners from blossoming into our next generation of highly skilled—and employed—workers: There’s no accreditation process for self-taught learners.

Where is education headed? How are learners driving the movement? And how can we fix this lack of accreditation? Here are the five parts of the Education Harvest:

1. Gadgets And Blended Learning

Classrooms can be anywhere, at any time. With 75% of teens owning a cell phone (and 40% of those phones being smartphones), more learning is happening on mobile devices than ever before. Karen Cator, the United States Department of Education's director of technology, said, "I think 2012 will see an expansion of a variety of ways of getting access to the materials that students need for learning." This confluence of gadgets and learning is fueling the rise of the blended learning movement. The DOE plans to spend $30 million over the next three years to bring blended learning to 400 schools around the country.

We’re seeing evidence of this everywhere: in Apple’s digital textbooks (handheld devices coupled with social media) and on-demand online textbook rentals; in flipped classrooms powered by Khan Academy, TED Talks, and YouTube; and in mobile learning worldwide, from SchoolSMS (schools sending bulk text messages to parents, teachers, and students) in Kenya to Aakash (a touch-screen learning tablet) in India.

2. Social Learning and Collaboration

Students are taking responsibility for their own learning, and the lines between student and teacher are blurring. Learners can determine their strengths and weaknesses and connect with one another to help and teach each other based on their areas of expertise—all they need is Facebook and Twitter. Teachers are using platforms like Kickboard, TeacherTube, Edmodo, and Edutopia to share content and lessons with each other online so they don’t have to reinvent the wheel or keep content to themselves anymore. This kind of social and collaborative behavior results in teachers and students working together as peers (gasp!) on individual learning goals, thinking through solutions together.

3. Open Resources and Classrooms

Textbook companies are being forced to change now that everything’s going digital, and top learning institutions are offering their courses for free on the Internet. Educational resources, data, and technology are becoming more accessible than ever through programs like MIT’s OpenCourseWare, CK12, California’s Free Digital Textbook Initiative, and The Faculty Project. In fact, the entire Bering Strait School District in Alaska has implemented a wiki-style format for its curriculum. Which district is next?

4. Adaptive, Personalized Learning

Formal learning environments are crumbling, and learners are finding their own paths to knowledge through independent thinking and experience in the real world. Teachers are figuring out how to teach each student the way he or she learns best, and assessment is viewed as an ongoing process, since learning is not a constant. Knewton and Grockit are leaders in this kind of personalized learning, with instruction and quizzes aimed at a student’s specific needs and skills, from K-12 all the way up to higher ed. Lady Gaga’s Empowerment Foundation, supported by Harvard University, is helping students discover their interests and develop strategies best suited to their own individual future plans.

5. Creative Certification

Codecademy, General Assembly, Udemy, and Udacity all offer their students a certificate of completion, some signed by the instructors of their classes. As Scott Gray writes on the O’Reilly School of Technology blog, "No one, and I mean no one has figured out how to get end-users to pay for individual courses online unless they are attached to some kind of degree or certification." Borrowing the legitimacy and reputation of the course provider or the teachers themselves is only a baby-step solution, though. Badges raise serious concerns about quality assurance and the superficial incentives they create, and learners outside of companies with skill-building curricula can’t earn legitimacy in those skillsets without becoming an employee. The more people cull unassociated resources and experiences to learn specific skills, the more urgent it becomes to give them a place to record their efforts and successes, to study with peers, and to present their learning portfolios to future employers or partners in a meaningful way.

What This Means For The Future

The world of "free radicals"—a term Behance’s Scott Belsky coined to describe those who take their careers into their own hands—is already hard at work on the Education Harvest, and they’re being followed fast by plenty of others. But, while independent learners are cultivating social, high-quality, personalized learning experiences, there’s a problem: There’s no universally accepted proof of purchase to verify the skill level of those who are self-taught.

The education paradigm of the future is all about the doers, not the academics or theorists. A paper degree won’t stand a chance against action. Start your own company, build a website, organize an event, take on a side project, and you’ll make it. The accreditation of today is a powerful hybrid of tangible evidence of hands-on learning and social proof. Those who "course correct," so to speak, and let their passion and personal interests drive their self-powered knowledge acquisition will succeed because of the portfolios of evidence they naturally build as they learn by doing. Those who mentor and partner with them will endorse their credibility and provide the final link of trust.

Six of the top 10 Fortune 500 executives do not have an Ivy League degree, and 19 of the top 100 worked their way up to success without a college degree. Steve Jobs, Bill Gates, and Mark Zuckerberg all dropped out—or, opted out, we might say—from the traditional educational institutions that stifled their creativity and influence. Today, you advance in the world based on your performance, not a piece of paper declaring your expertise in "knowing a little about a lot of things." Employers are hiring for specific, narrow skills that aren’t fully learned in college, and they’re thinking, "What else have you done other than go to college?" when they field monotonously similar resumes.

There’s proof of a new anti-resume: my company, Skillshare, along with Kickstarter, Mzinga, and The Unreasonable Institute, refuses resumes from job applicants. Links are the new CVs, portfolios aren’t just for artists anymore, and experience reigns. In a world where 50% of people see self-employment as more secure than a full-time job, the most important skill we’ll have is the ability to go out and get the right knowledge for the right purpose at the right time. But it won’t be worth peas if there’s no way to measure the yield of our efforts. Welcome to the Education Harvest.

Comments

  • cookie crisp

    I think anybody arguing against this is a professor, or someone who did well in college and so thinks it's fine the way it is. Saying "I don't want any self-taught brain surgeons near me" is funny, because the first of the first were self-taught. I don't need to pay $20,000 a year to have people tell me what I can learn on my own, and not everyone is going to college to be a surgeon. Before college existed, people were self-taught. To me, college is a bit of an insult. It's saying that unless I come to this institution and get educated by the people here, there is no way I am capable of gaining that knowledge on my own. All my English professor did was babble about how he and his wife were on food stamps. That's what I need to go to school for? Another professor quoted everything straight out of a book I could read myself; if you're just going to tell me what's in the book, why am I paying you?

    The fact that people think college is necessary means the government has already won by making you think college teaches you things you can't learn on your own. It also says that to be a lawyer or a doctor, everyone must amass the knowledge the same way or they are not worthy of being one. I do not need law school to learn and know the law; I simply don't. College is a scheme to regulate the number of people entering a particular field at a time, and to slip money into the government's hands. Nursing was taking off, so they made it take four years instead of two to slow the number of new nurses. The fact that nobody sees this is funny. I think the social and economic systems are a joke, and people have been brainwashed into believing you need college to be smart. You need that piece of paper to get a good job in this society, but that paper is not a true reflection of your knowledge or capabilities, and that debt should not exist.

    Various politicians and lawyers have been exposed for having fake degrees. The problem is that they were exposed years after the fact. So if they were doing their jobs well while holding fake degrees and having never attended college, what does that mean? It means you don't need to sit at a desk and listen to somebody talk for years to work at a high-paying job. Why do people let the government pull this? I just want to live on a remote piece of land and learn how to farm as the world falls apart, because people seem to willingly enjoy being enslaved by the government.

  • Jambrose

    The premise that self-study is radically new and about to disintermediate traditional certifying bodies is an incorrect starting point. The web surely makes information more accessible to enterprising self-learners, but this has been the case for the last several hundred years, ever since the invention of movable type made books the scalable delivery channel for the world's best knowledge. Everyone had access to the thinking of the greatest minds of the last several centuries, but this did not replace the process of critical thinking and certification that is our global education system. Sure, there is room for improvement, and new technology opens up opportunities at the margin for the most enterprising. Sure, more information can be shared more quickly and efficiently than ever. Sure, social technologies enable an approximation of cohort learning. But beyond self-learning passion, there is intellectual rigor and demonstrated aptitude that cannot come without measurement, and institutions will still serve a vital role in providing a high degree of validation.

  • Jill Rooney

    As a professor, I am always worried when I read phrases like "self-taught learners" when they are applied in a generalized form.  I don't want a "self-taught" brain surgeon anywhere near me--or near anyone else, for that matter.  The assumption is always that everyone is capable of being a "self-taught learner"--but from what I've seen in my 20+ years in the college classroom, most students arrive in my classes without even knowing how to assess the value of the information they find.  The arguments in favor of self-teaching are predicated on a number of assumptions about the prior education and abilities of the learner.  And this is a dangerous path to go down. There's a reason not everyone is an expert.

    I also think there's a lot to be said for those who commit themselves to the pursuit of a formal education. It speaks to their ability to sustain learning over long periods and to be open to ideas from other people (and not the myopia of self-taught learning, which after all only has one response, while in the classroom there are often many to consider).

    I often wonder if the current antagonism toward completing a degree comes more from the sour grapes of those who cannot commit to the process (which is often truly legitimate) than from some overarching criticism of the educational system. When did getting a college degree become "elitist"? 

  • Anthony Arroyo

    I think that what we have here is a disparity between the Use Value and the Exchange Value of education. Teaching yourself how to do something (how I prefer to learn) has an extremely high use value, but not much exchange value. Going to Harvard (regardless of how much you actually learned) has a high exchange value, but dubious use value. Ideally, the use and the exchange would be somewhat similar. And while the "Just Do It" method is amazing, its scalability into other, non-creative, non-internetty professions may not be feasible. 

    I think that the best way to get around this is to encourage employers to develop their own entrance exams, something that is actually done in many other countries. This would make things (ideally) more meritocratic. It would actually benefit those who are self-taught, since they (we) are usually more task-oriented. Who cares that you weren't a computer science major, if you can code like one? And the advancements in testing technology will allow organizations to make these tests better and less cookie cutter. 

    That is my 75 cents. 

  • Jens

    I was thinking the same thing about the "non-internetty professions." When you think about the natural sciences and mechanical and electrical engineering, these kinds of fields need equipment; you have to work in labs or do research.

    That might be a bit difficult without institutions. Mind you, they won't necessarily have to be the institutions of today. And the theoretical parts can definitely be learned in other ways than in a classroom.

  • pdxsays

    This assumes a lot: ubiquitous connectivity, access to mentors, assured validation of content, and more. Web content has little validation for accuracy, and the majority of the 40% of students with smartphones are not hanging out on Harvard, MIT, or Howard Rheingold sites. Many students think they learn more by multitasking with devices, when they are actually retaining less and comprehending at lower levels. There is much to be desired here; beware the dumbing-down effect. Once most have dumbed down, the majority will not be able to distinguish what is knowledgeable, and the true intellectual will be an outlier to the learning community.

  • Ali

    Stanford stays on the forefront of innovation while still providing a brand-name diploma. Difficult, but not impossible.

  • superdo

    As to your comments on section 5: I think it is up to the general public to police sites like Codecademy and the upcoming ones like it to keep them honest. Free services are an excellent resource to learn from, but they should not take the place of a certification process. In any case, the work that people produce after taking such courses will speak to their skill level, not some badge of completion they earned.

  • Ian Lynch

    I run a new awarding organisation accredited by the UK government, with three Transfer of Innovation awards from the EU Lifelong Learning programme. We are essentially quality assurers. We provide technologies and services based on open source software to enable this to take place anywhere in the world. As long as competence can be demonstrated against the assessment criteria, there is no prescribed method to get there. What matters is competence, competence, and competence. The main issue with claimed competence is getting it externally corroborated by a reliable source. This can be peer review, as in the case of high-level research, but other methods will also be used.

  • Jade

    You taught me so much in this really informative little article. I will now be investigating many of the platforms you mentioned. Thank you!

  • andrewdeandrade

    You missed one of the best forms of accreditation. It already exists (for some professions) and will always be valued more highly than a certificate: social "portfolios" of completed work.

    Github, Dribbble, and Forrst are just three examples of sites that allow people to prove their worth to anyone wanting to hire them. Reputation systems will replace accreditation. Coursework will move away from testing and toward projects and activities that demonstrate tangible skills.

  • Sheena Rajan

    The future is self-study, for sure! Unfortunately, there is no accreditation for self-study, but in the end that doesn't matter, because your portfolio and knowledge of a subject will speak for themselves.

    To take this further, I've noticed that most college students become conditioned to learn by lecture. When you're self-taught, you become accustomed to learning on the fly and take the initiative to do so. That's a vital attribute to have in the digital and tech industries, which are constantly changing.

  • David E

    So true. After four years of college I finally have the urge to stop "learning for the test" and start learning things that I can translate into my career. It's not what you say you can do; it's what you can prove you have done and can continue to do.

  • Brace Carlos

    Although I think any form of higher education can never be viewed as a bad thing, I do think the traditional classroom model will always be preferred and appreciated. We cannot forget the importance of "actual" human interaction and the networking that can only come from face-to-face contact. I think online degrees are good for those who are already in the field or at a company they like.

  • SimplyDefined

    The traditional college classroom has radically changed, and it's going to force those in higher education to take a long, hard look at what is unique to the college classroom, especially since I anticipate that many employers will catch up to the idea that a degree is not the only way to "certify" one's learning.

  • deniseinark

    As a former home-schooling parent, I applaud all of this.  One thing I didn't understand, however, is how one makes the necessary contacts to begin the web of networking.  Many times openings for interns and other workers are filled because the company has a relationship with the educational organization.  Since experience is the thing that innovative employers seek over credentials, how are these young people going to get the proverbial foot in the door?