2012-04-26

Co.Exist

Reinventing College To Prepare Us For The Future, Not The Past

College was designed to prepare students for a 20th-century economy, and it’s not catching up fast enough to the realities of the modern world. How can we overhaul the entire system?

The witty name of the indie Scottish band We Were Promised Jetpacks reminds us that, not so long ago, “futurism” looked like the Jetsons TV cartoon. In the future, we were told, we’d be zooming around like George in a techno-topia where time and space could be mastered but where basic social relationships were pretty much unchanged. If that version of the future had panned out, think about all the rules and regulations we would have invented by now for training kids for responsible flying.

We would have ways for testing them to determine their maturity for solo flying. White-knuckled parents would be taking young teens around the 'hood for an airborne spin and there would be professional driving schools to make sure they learned the rules of the airways. High school would teach the safety and ethics of jetpack-powered life. Kids would be cramming for the written test, practicing for the airstrip test, and would anxiously wait out the year-long learner’s permit. They would be taking summer jobs to help pay for the expensive jetpack insurance, inflated for the risky under-25 crowd.

Needless to say, the Jetsons version of the future didn’t pan out. Instead, the whole social, economic, media, and communication fabric of life has changed dramatically since April 1993, when the Mosaic 1.0 browser went public. Historian Robert Darnton ranks our publish-yourself, YouTube, Facebook, Twitter era of customized, globally connected interaction as the fourth great Information Age, one of the most important developments in all of human history, alongside the invention of writing in Mesopotamia around 4000 B.C., movable type in the Renaissance, and Industrial Age mass printing. Yet, so far as I know, there wasn’t a single Hanna-Barbera cartoon that imagined George and Jane patiently teaching little Judy and Elroy Jetson HTML so they could zoom around the World Wide Web. A 16-year-old can upload just about anything to the Web and have it available worldwide to anyone else with an Internet connection. No flying license required.

And that’s the problem. My college students are the first generation to come of age in humanity’s fourth great Information Age and their entire education has been designed to train them for the third. They have been learning with all the forms, formulas, strategies, and assessment methods that the late 19th- and early-20th-century educators engineered to get them off the farm and into the factory, out of the shop and into the firm.

In 2012, being middle class means going to college. That, implicitly, is the goal of K-12 education in the U.S. and pretty much everywhere in the developed world. And all of that learning is designed to prepare us for the 20th century, not the one we live in.

Grades, IQ tests, timed individual achievement tests, machine-gradable multiple-choice tests, teaching to the test, the credit hour, the Master’s in Business Administration, Scholastic Aptitude Tests, GREs, Law Boards: All of these are part of what I call “scientific learning management.” That’s the educational arm of what the great theorist of industrialism, Frederick Winslow Taylor, called “scientific labor management,” the emphasis on production quotas that kept the assembly line running smoothly and led to the creation of MBA programs (he taught in the first independent one, at Dartmouth) to oversee mass production. Putting kids in rows, all starting at the same age, beginning at the same time, dividing knowledge into subjects taught in 50-minute chunks: all of this was part of retraining humanity for the regulated time, regulated work, and predetermined, regulated productivity that were key to the division of labor and the separation of home from work, labor from leisure.

How do we train youth today for a smartphone world that brings the workplace to your ear 24-7 and where your email at work also brings every imaginable distraction? What do we do to re-jigger scientific learning management for a world where self-organizing networks can bring down governments or be surveilled and exploited by them? What about our educational system takes advantage of the Wikipedia world, where humans volunteer not only their knowledge but their editorial skills to create the largest, most multilingual encyclopedia the world has ever known? Networked forms of communication change human interaction, human attention, and human labor. How do you reform education for this world?

That’s the question I’ll be addressing in a series of posts for Co.Exist on changing higher education to change the world. I will connect the institutions and practices of higher education not to the imagined future but to the very real present we live in now.

We’ll look at re-imagining general education and even Great Books core thinking as an interdisciplinary, problem-based curriculum, with service learning and entrepreneurship built right into the readings from Plato and Levinas. We’ll focus on method, and what happens when students are allowed to take charge. And we’ll discuss the challenge of taming the cost of higher ed.

Nearly 50% of students entering university do not finish at that university. Where they go has not been fully studied, but survey data indicate that they drop out because of a combination of cost, boredom, and lack of real-world relevance. We’ll look at the conditions that contribute to those three factors and pose some new possibilities, with examples of universities and even individual professors who are addressing the issue.

16 Comments

  • Tom

    What a magnificent description of the dominant thinking about education, not only at the college level but also K-12. I imagine the author would think that what she proposes could indeed be embodied in K-12. It's all about preparation, isn't it? What if I suggest that preparation implies an end, and that what happens in the schools is not in itself the end, but the means? Most of the comments share that view. Paul St. Amant, Michael Magee, and John Mack do hint at another view of education that embodies learning--not as a means to some other end, but as an end in itself. But no, schools were constructed precisely to tell the students what they must learn, not to find out what the students want to learn. Employers like students with diplomas or degrees because--let's face it--they tell the employers those students have been house-broken; they have taken orders, whether they liked them or not; have been told what to do ("pay attention, you'll be quizzed on this"), and they've done it, like it or not. Pass the tests, do the assignments, show up for class--all will get you that diploma or degree so you can get out there and get a job. Look, we all know students learn far, far more outside of school than inside. We're born learners. Look at what infants learn without a school. Youngsters want to pursue what interests them, what they find curious, and to become more self-aware and self-controlling. Some do it, or some of it; but a whole lot don't. The schools have their own ideas of what to learn.

  • Vic Williams

    It's really a cultural pattern, shared by slow-to-change medicine, eating (Google "eat your peas"), drug war/laws, etc. Nibbling at the symptoms won't nudge the pattern much.

  • David Grebow

    Great and timely post, thanks! As an educator, I see a large part of the problem being the pre-21st-century "event-based model" of learning. The model has a specified place and time scheduled for learning. Think classroom (actual or virtual). Used as the basis for education, it supports the "farm-to-firm" go-as-you-learn approach. You have the time to learn, then get tested, pass (or not), and go into the field in which you received your degree-certifications-alphabet soup. Go-as-you-learn.

    In the 21st century and onward, thanks to technology, knowledge and know-how change more rapidly in one year than in all of recorded history. It's a learn-as-you-go world in which learning, thanks again to technology, happens anytime and anywhere. The old model indicates what we learned when we received our paper that proved we had passed the test. The new model needs to include a way to track our continuous learning so we can let others see what we know and know how to do, instead of what we knew and once knew how to do.

  • John Mack

    Overstated. Students go to college and are asked to learn how to think, how to solve problems, in some cases how to be creative, in others how to perform within a specialized body of knowledge. Technology does not change any of that. Technology is just another method - bio now uses 3D illustrations on a computer, but it still studies the processes of cells and anatomy. Math uses 3D slopes, etc., but it still studies traditional logic systems using mathematical symbols and procedures. Literature and the arts still deal with the contradictions and conflicts of the human mind and heart. Accountants still learn accounting methods, although using technology not used 20 years ago. Environmental science is now far better taught, using a combination of field studies and technology. New fields of study - genetic engineering - have been introduced. Et cetera.

    People also go to college to make friends, develop allies, tap into networks. Technology can get in the way of that.

    I would imagine that at some point there will be majors or minors on how to build a local economy, using many of the tech innovations presented here on Co.Exist. Of course, developmental economics could benefit from knowing about these innovations as well, and how they are being put to use.

    Technology is best taught as a basic model to be modified and rapidly advanced in a tech company. That is happening even in college through internships. Students grounded in a basic model quickly adapt to systems beyond the model.

    The most rigid field in universities is Economics. It often uses technology to advance a theological, superstitious view of how the world "must" work.

    I don't get it. What exactly is this big nebulous paradigm shift? Are you planning to be rid of the sciences, the humanities, business studies that extensively use technology as tools, not as ends in themselves, etc.?

  • Michael Magee

    Great post. I read it through a few different sets of lenses: I've just finished a PhD on how personal epistemological belief structures interact with players learning how to play video games, I develop curriculum for a post-secondary institution, and I've run and exited a tech start-up. I'd have to say that the PhD I completed and the curriculum I have developed are firmly fixed in a different age, and the post-secondaries I've been involved with almost have a romantic vision of their place in the world. I still remember one conversation with a committee member about the relevance of universities in the modern world. He looked at me and said, "But they need us!"
    Based on my contrasting experiences in the start-up world, I'd say that was true. I think where universities fail is in communicating their relevance in a rapidly shifting world. Their research and critical thinking skills are more relevant than ever, but they insist on putting them into an academic publishing framework that is inaccessible and unintelligible to most people in the real world. I spent time with some extremely successful entrepreneurs who had dropped out of university to make millions. They could have really benefited from going back to school and getting a deeper understanding of the philosophical and historical perspectives when it came to thinking critically about the decisions they were making. They operated almost totally by intuition and lacked the self-awareness that comes with the kind of education that would have made them grow intellectually. They could all have benefited from a greater level of cognitive flexibility in their approach to problems. Even when they were aware of this deficit, they were at a loss about where to go to develop it. Their initial experiences with university had given them the perception that they wouldn't improve those abilities by going back to school.
    So yes, schools need to change but they are also going to have to work against some very negative stereotypes both held and communicated by some of the more successful people in the new information age.  

  • gheadd

    Thanks for introducing this discussion. As an adjunct at a junior college, I truly understand the issues here at our college. We also have an opportunity to bring about this paradigm shift as the older generation retires, if administration can foresee the problem! Which creates another issue altogether: those who design new curriculum and create academic policy are not abreast of the issues you've articulated in this discussion. Or if they are, they do not have the courage to challenge the system due to the position of stakeholders that control our funding!

  • Guest

    What if the problem is not university and university classes, but everything else? What if we're doing a disservice to students by preparing them for what is ultimately a hollow world of buzzword-ideas like "networking" and "entrepreneur"? I'm not old (35) and I've only been teaching in universities for a few years, but I'm concerned that the "real world" you're trying to be relevant to is the problem, not traditional arts and humanities. Like, should we really be training students for a job that will have them on their phones 24-7, or should we train them for something better? Why do we need to prepare students for an ultimately shitty world at the expense of things like close-reading and critical thinking (activities students often become bored with)?

  • Kenny

    This is excellent; can't wait to read the rest of the posts. I have had many lengthy discussions with my peers at the Claremont Colleges about this subject and am glad to see that someone is finally addressing it.

  • Aditi Ravi

    Very well said. I'm glad somebody is talking about this, particularly somebody from the academy. The paradigm shift in learning and technology needs to be addressed rather than asking students to flock to universities and accumulate redundant skills. Great idea!!! 

  • Darla B.

    The reconstruction of higher education has never been more relevant and I think it is high time that the issue is addressed and struggled over. I look forward to seeing where this goes. Best of luck!

  • Generational Justice

    If you think the university can keep up with technology, you're insane. To get a PhD in Computer Science is to be a laughingstock. By the time you finish "in-depth academic study" on anything, it's outdated. So, through the university system's perverse credentialism you have the obsolete teaching the incompetent.

    These are people who still think Marx is relevant. You also seem neglectful in your reading of Taylor; his concept of scientific management had to do with managing process, not barriers to entry.

  • Trevor Linton

    Computer Science does not teach people technology or programming. Those are left to technical schools. A PhD in Computer Science may study algorithms for machine learning or building artificially intelligent beings. Computer Science is, best put, applied mathematics, not technology. Hope that is clear.

  • Tom Tresser

    Fantastic subject. We're on it at The IIT Stuart School of Business where I just finished teaching the second offering of "Got Creativity? Strategies & Tools for the Next Economy" and the first offering of "Strategies & Tools for the Social Change Agent." Details are at http://tomsclasses.wordpress.c.... Would love to hear from other B-School instructors who are creating classes that teach design thinking, new economic models, cooperative economics, sustainable practices and creativity. tom@tresser.com

  • Paul St. Amant

    Thanks for referencing Taylor. He was so bad to work for, but so genius. I loved college, but I didn't learn half as much as I did after I graduated. So much emphasis is put on those 4 years. It is an important coming-of-age time that shouldn't have such a hefty price tag. One thing the Internet offers is a chance for high-quality lifelong learning opportunities. Udemy, Coursera, Udacity, Khan, etc., must be embraced and developed so the Internet and all of its social elements can be used for a positive learning experience and not necessarily a distraction. Flattening the higher education sector might be just what students desire to engage real-world relevance. I also look forward to your future articles.

  • Spryorjones

    I'm looking forward to the results of this work.  I believe one of our biggest tasks will be to organize and "certify" Out of School Time and other forms of independent learning that technology has made all the more plentiful and accessible.

  • David

    Totally agree. It's all tied together, since the very idea of formal diplomas, degrees, certifications, and resumes is part of the 20th-century model about which Cathy writes. An early trend is emerging in Silicon Valley, where several companies no longer accept resumes from job applicants. And the whole idea of highlighting your past accomplishments is ludicrous in the 21st century, when the shelf-life of knowledge and know-how is shorter than ever and what we need to learn is now available anywhere and anytime.

    Schools, colleges, and universities track and record your courses. Companies use Learning Management Systems to do the same for the formal education and certifications they provide. Yet all of the learning done independently, the learning that keeps a person's skills, knowledge, and abilities up-to-date, is not tracked.

    We need to bring the model into the 21st century.