Like reading, writing, and arithmetic, web literacy is both content and activity. You don’t just learn "about" reading: you learn to read. You don’t just learn "about" arithmetic: you learn to count and calculate. You don’t just learn "about" the web: you learn to make your own website. As with these other three literacies, web literacy begins simply, with basics you can build upon. For some it can lead to a profession (e.g., becoming a computer programmer), while for most it becomes part of the conceptual DNA that helps you to understand and negotiate the world you live in.
Our Information Age began, for all intents and purposes, in April of 1993, when the Mosaic 1.0 browser made the World Wide Web available—for free—not just for use but for contribution and participation by anyone with access to the Internet. Its decentralization, its open architecture, and its lack of a "director," an "owner," or even a central switching point made worldwide co-creation of knowledge, art, science, literature, animation, and all the rest possible.
No one would have believed that peers could contribute knowledge and advice, helping one another to learn through YouTube videos, Wikipedia, or other sites. In fact, if you go back to 2000, before any of those things existed, you cannot find accepted theories of human nature, economics, or learning that could have predicted that those things could and would exist in less than a decade. No one guessed Wikipedia’s success, not even its founders. We simply didn’t know that, without a work plan, a lesson plan, or a taxonomy of what "counts" as knowledge, without leadership or payments or designated roles, people—non-experts—would build the largest encyclopedia the world has ever known, because we love to share what we know with others, and we’re even willing to spend endless hours creating our own community standards, editing, and making it right.
Why haven’t we had an educational revolution that takes advantage of this human quality that we now have proof exists? Making web literacy the fourth literacy begins with the premise that not only are humans capable of learning together—we’re doing it, contributing to peer learning online, every day of our lives. That is a major educational paradigm shift, the great gift we’ve been given by those who built the web on open architecture.
Web literacy explains the world we live in and gives us the tools to contribute to that world. You can learn enough basic HTML and CSS in a few months to make your own simple website. You learn by doing. As you learn more, your website gets better, and vice versa. But a website requires content. What content do you want to put up there? If you simply copy a Disney cartoon, you run into copyright problems. Learning the basics of intellectual property is part of web literacy. Or say you are 17 and want to post some profanity or some scathing comments about a school pal, of the kind you’d make in the halls. Do you really want that content, data-minable and searchable by anyone, visible to the whole world now and in your future, including to college admissions officers? Privacy, security, and web etiquette are other basics of web literacy.
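To give a sense of how little it takes to get started, here is a minimal sketch of a one-page website; the title, text, and styling are invented for illustration, but the structure is all the HTML and CSS a first page needs:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>My First Website</title>
  <style>
    /* A little CSS: a readable column and a splash of color */
    body { max-width: 40em; margin: 2em auto; font-family: sans-serif; }
    h1   { color: darkslateblue; }
  </style>
</head>
<body>
  <h1>Hello, web!</h1>
  <p>This page is mine. I made it, and I can change it.</p>
  <!-- A link: the glue that holds the web together -->
  <p>Learn more at <a href="https://developer.mozilla.org">MDN</a>.</p>
</body>
</html>
```

Save this as a plain-text file named `index.html` and open it in any browser: you have a working web page, and publishing it to the world is only one hosting step away.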
Right now kids can go online outside of school all they want. Some schools drop in iPads as if that alone makes kids literate. But if web literacy, including web programming, were adopted by every school as a fourth basic literacy, kids would not only learn how to code; they would learn about interactivity, collaboration, the melding of the artistic and the scientific, creativity, and precision. We’d also benefit from a far more diverse technology world if every boy and girl, from every economic, cultural, and national background, were learning about programming from the time they started school.
That’s why we need an alliance of technologists and educators. If we’re going to truly change higher education to change the world, we have to begin by emphasizing web literacy as a required, basic, indispensable competency in the 21st century. To do that, we need a leadership alliance between education and technology developers—and higher education is a good place to begin, since it has far more flexibility than K-12 and exerts a tremendous pull on the shape of all education.
Economic benefits may not trickle down of their own accord, but educational requirements do. Witness all the AP classes high schools offer these days. Make something a college requirement, and pretty soon high schools change their curricula. Pretty soon, preschoolers are learning code, and they can, too. Wonderful programs like Scratch, produced by the MIT Media Lab, are designed to introduce kids to basic programming concepts. They can then graduate to web programming tools for grade- and middle-school-age kids (several of which Mozilla makes available for free), like Thimble, X-Ray Goggles, and Popcorn. They can make web pages, movies, animations, and more. It’s fun. It’s code. And, best of all, these resources are already available, for free, with plenty of helpful guides for parents and teachers as well as peer learning opportunities and young webmaker communities.
The ethic and ethos of webmaking is, as Mozilla says as part of its 2012 Summer Code: "Meet. Make. Learn." That’s a great educational philosophy for the 21st century.