By J. M. Anderson
We have entered a new digital era that appears to have made the traditional trappings of higher education--e.g., fixed curricula, attending lectures, even physically setting foot on a college or university campus--about as necessary for earning a college degree as the telegraph is for sending messages. Out with hierarchy, structure, and the top-down approach to higher education. In with collaboration, more student input, and above all else, greater interactivity.
Let's call this disruption University 2.0, which promises to be every bit as revolutionary to higher education as Web 2.0 has been to the Internet.
In the old days (Web 1.0), the Internet was largely a passive medium through which users viewed web sites created by others and had little or no input on content or design. In the new era of Web 2.0, users interact, share information, add or modify content, and collaborate in communities, such as social networking sites, blogs, Twitter, and wikis.
As Lee Siegel describes it, "Web 2.0 is what the Internet calls its philosophy of interactivity. It applies to any online experience that allows the user to help create, edit, or revise the content of a website, interact with other users, share pictures, music, and so on."
Continuous Partial Attention
Similarly, the philosophy of University 2.0 is interactivity. It touts student-centered learning and greater engagement through more online courses and Web-based activities, such as embedded quizzes that provide immediate feedback, chat rooms, threaded discussions, and course videos. More importantly, it allows students to personalize their education, giving them more freedom of choice and power to customize it to meet their individual needs.
Once University 2.0 is fully realized, people will take whole courses or customize "select parts of different courses and combine them in different ways," writes Mark Taylor in Crisis on Campus.
In fact, they will be able to buy "any portion of a course--a single session, a few weeks, the entire semester." Even the relationship between professor and student is "considerably more collaborative."
This is enticing--who doesn't want greater freedom and flexibility?--but it is questionable whether University 2.0 will produce better educated graduates. Based on what we already know about how so-called Digital Natives use modern technology, it's more likely to discourage them from obtaining the intellectual skills and habits of thought and mind that higher education is supposed to foster.
In Everything Bad Is Good for You, Steven Johnson describes the meaning of interactivity in the era of 2.0:
posting a response to an online article; maintaining three separate IM conversations at the same time; learning the tricks of a new e-mail client; configuring the video chat software properly; getting your bearings after installing a new operating system. In other words, interactivity means keeping the mind busy.
The problem with continually staying busy and trying to keep tabs on everything, writes cognitive scientist Gary Small in iBrain, is that we can never truly focus on anything and do not have time to reflect, contemplate, or make thoughtful decisions. "Our high-tech revolution has plunged us into a state of continuous partial attention." The brains of Digital Natives are being wired up for "rapid-fire cybersearches," but "the neural circuits that control the more traditional learning methods are neglected and gradually diminished."
For all the time Digital Natives spend online, only 55% of teenagers demonstrated proficiency navigating the Web, according to a study by Jakob Nielsen. The reason for such poor performance is "insufficient reading skills, less sophisticated research strategies, and a dramatically lower patience level."
While University 2.0 assumes (perhaps correctly) that students may be more motivated by greater interactivity, it can't work if they lack the intellectual ability and academic skills to perform at the college level.
Not only are college freshmen unable to write, analyze sources, or even cite sources properly, according to a recent Citation Project study, but they are also plagiarizing in greater numbers than before.
This is hardly surprising when an entire generation has been raised to assume that information is knowledge and that knowledge can be freely obtained on the Web. It's not that most students are lazy or unwilling to do the work; it's that they see nothing wrong with copying and pasting and presenting that "knowledge" as their own.
The problem is compounded because they have been taught from grade school that learning is simply a process of decoding information--like finding the correct answer on a multiple-choice exam. There's always a right or a wrong answer to every question or problem, and all they have to do is look it up. Who needs to think his or her own thoughts when the wisdom of Wikipedia is just a few clicks away?
Students need direction and guidance to overcome the mindset that knowledge is a commodity that can be acquired. They must learn deep thinking, precision, and the ability to sustain thought on complex ideas. They must acquire good habits like punctuality and the ability to behave and work together in groups. In a word, a college education should have a civilizing effect by providing them with models of what a civilized human being should be.
But University 2.0 isn't interested in this function of a college education. It encourages distraction rather than attentiveness, concentration, and intellectual discipline. It panders to short attention spans and the desire to be stimulated constantly. It turns higher education into a free-for-all by suggesting that one subject is as good as any other, and that students who follow their fancies will be equally prepared for life.
After all, today's college students take the kind of freedom that the Internet offers as a natural right, even though it leaves them like Rousseau's child on the beach:
When I see a man enamored of the various kinds of knowledge, let himself be seduced by their charm and run from one to the other without knowing how to stop himself, I believe I am seeing a child on the shore gathering shells and beginning by loading himself up with them; then, tempted by those he sees next, he throws some away and picks up others, until, overwhelmed by their multitude and not knowing anymore which to choose, he ends by throwing them all away and returning empty-handed (Emile, Book III).
Like the failed smorgasbord approach to the curriculum over the past forty years, University 2.0 turns students into mere tasters of information. They might leave college or university with full tummies, but they will also leave with empty minds.
J. M. Anderson is dean of Humanities, Fine Arts, and Social Sciences at Illinois Valley Community College, and author of The Skinny on Teaching: What You Don't Learn in Graduate School.