
Monday, December 30, 2013

The Triumph of Triviata

Almost as soon as user-generated content became an acronym (UGC), two rival interpretations appeared among cultural critics, technologists, and seemingly everyone else.  On the one hand, someone like Web guru turned NYU professor Clay Shirky (Here Comes Everybody, Cognitive Surplus) seized on the democratizing, collaborative possibilities of the social Web 2.0 movement.  Whereas Big Media once told everyone what was important (epitomized in antediluvian declarations like Cronkite's "and that's the way it is"), the Web was now making it possible for us to tell each other what we cared about, what was important.  To someone like Shirky, or Stanford law professor Lawrence Lessig (Free Culture), or Harvard technology theorist Yochai Benkler (The Wealth of Networks), it seemed that the Web was a kind of information liberation movement, destined to make all those passive readers of yesterday into tomorrow's writers and trendsetters and innovators.  It wasn't simply that we had more options with UGC--more things to look at and to enjoy--it was that we had an entire, revolutionary, technological means for large-scale social change and improvement.  To ask "What gives?" was to miss the point, and was borderline nonsensical.  "What's next?" was the only relevant question.  As the popular Microsoft ad of the time put it (ironically, referring to someone sitting at a computer): "Where do you want to go today?"  The answer, to the Web 2.0 enthusiasts and visionaries, was a resounding anywhere.

On the other hand, folks began noticing before long that much of the content generated by all these newly liberated creators wasn't worth much, to put it bluntly.  The LA Times attempted to capitalize on the new Web culture by allowing anyone to comment on and even contribute to its stories; this lasted a few days, until the sheer magnitude of silliness and irrelevance and tastelessness peppering its woebegone pages forced an about-face, and the paper discontinued the feature in disgrace (albeit quietly).  Other media giants like the New York Times or the Wall Street Journal of course launched "Web 2.0" online versions with comments sections, but these were notably safeguarded from the "mob rule" type of scenario that embarrassed the LA Times.  In general, it became apparent that while anyone could say anything and publish it online, editorial standards in the traditional sense were more, not less, necessary in such an environment.

Blogging became ubiquitous, the term entering our lexicon shortly after first appearing as "Web logs," and it gave voice to the common person, to be sure.  But most blogs were silly missives written by uninformed amateurs who either borrowed from actual reporting to regurgitate or expound on ideas and stories, or simply neglected serious discussion altogether, journalistic or otherwise, in favor of mindless off-the-cuff chatter about their significant others, their sports cars, or other minutiae that few others found worthy of reading.  A few blogs became important in serious discussions; most of the millions of others were scarcely worth knowing about.  Still, they were, all of them, "published" on do-it-yourself blogging platforms like LiveJournal or Google's Blogger, and it was all readable by anyone who cared, and all UGC.  Similar observations apply to amateur videos on YouTube, to "mashing up" content by splicing together existing artists' singles, and on and on.  In short, sans the social-change rhetoric, "UGC" was largely what one might expect by the end of the 2000s:  lots of amateurish, often inaccurate, often mendacious, and rarely publishable (in the traditional sense) written and multimedia content, everywhere.  Crap, in other words.

The sobering reality of Web 2.0 when judged by traditional media standards should not, in retrospect, have been much of a surprise.  Viewed statistically, any large sample of the population will contain very few award-winning journalists, novelists, musicians, or filmmakers.  That's life.  But the success stories, like Wikipedia, were perhaps a surprise.  Here, anonymous users collaborated in an open "wiki" environment to produce encyclopedia entries, and as the project exploded in the early 2000s, the quality of the articles appearing on Wikipedia (with some famous exceptions) seemed to confirm, not challenge, the idea that there could be "wisdom of crowds," and that Shirky et al. really were prescient in seeing the transformative social potential of Web 2.0.  Fair enough.  But notwithstanding the successes, there was a deeper problem emerging that would pose more fundamental challenges to the technological revolution of the Web.  To see it clearly and at its root, we'll need to return to the issue of search, and to Google search in particular.




