Sunday, August 26, 2007
War on Wikis: Critical Standards and Standards of Criticism
Also see Wikipedia's article on Why Wikipedia Is So Great.
And, of course, their article on Why Wikipedia Is Not So Great.
Information aside, the interesting thing here is the attempt at honest self-criticism. This is another important difference between the practices of the new media at its best and the way things are done in the old media world.
The standard for the new media is that criticism should be frankly acknowledged and frankly answered. It should not be ignored, dismissed or overborne by claims of expertise.
To say this standard is an ideal rather than universal practice is putting it mildly. A lot of new media practitioners don't do it this way and some of them are pretty scathing in the way they apply the older techniques.
The difference is they suffer for it much more.
The web is in essence a conversation. Communication is two way and neither of the participants has such an overwhelming advantage they can drown out the other party.
This was one of the first things I noticed when I started writing for the web over a decade ago. Articles on web sites drew a lot more comment, especially critical comment, than I was used to in magazines and newspapers.
Personally I thought this was great. While my fear of being wrong in public approaches an obsessive-compulsive disorder -- not uncommon in those schooled in the journalistic paradigm -- I appreciated the fact that I could learn from my readers. For one thing, it meant I could target future articles more accurately to what my readers were interested in. That's very hard to do with a magazine where there is a several-month lag between writing the article and getting the letter to the editor.
If you are writing for the web, or blogging, or social networking you have to accept the fact that your work is exposed to the opinions of others -- and those others have equally powerful channels to express their opinions.
In conventional journalism one party controls both ends of the conversation by deciding what gets into print. While letters to the editor are theoretically welcome, objections and comments are still filtered by the side with the publication.
As A. J. Liebling famously said: Freedom of the press belongs to him who owns one.
This produces some unfortunate behavior. Newspapers, for example, tend to alibi their mistakes rather than admitting them.
I remember one occasion when, as energy reporter for a major metro daily, I covered the explosion of a pole-mounted transformer belonging to the local power company. Such explosions aren't uncommon since the transformers can overheat in use. In themselves they're pretty harmless. What made this one newsworthy was that the transformer was one of the old ones whose cooling oil was laced with PCBs. That meant it was a hazardous material spill in a residential neighborhood.
Fine, but how much hazardous material? The PR guy at the utility told me it was "several quarts," so that's what I used in the story. The next day I got a call from a reader who informed me the actual amount of oil in such a transformer is about five gallons. After confirming that independently, I told my assistant editor, implying that we needed to run a correction.
His response was that five gallons is the same as several quarts, so we didn't need a correction.
It was his call, but to this day I think that story was misleading; five gallons is twenty quarts, which is not "several" by any stretch. The assistant editor simply wanted to avoid printing a correction. The prevailing theory at newspapers and magazines in those days was that if you didn't admit a mistake, you hadn't made a mistake.
While that attitude is still with us -- witness The New Republic's response to the wildly inaccurate stories from Scott Thomas Beauchamp in Iraq -- it works a lot less well even for the print media. Beauchamp's fables were quickly exposed by bloggers with military experience and The New Republic made itself a laughing stock with its lame and dishonest attempts to defend its position.
(Note: Because I'm rushed for time, and because the material in the blogosphere on this is so extensive, I am not going to provide specific links. For the military bloggers' point of view I'd recommend 'Blackfive'. For a series of summaries of the controversy, see Michael Goldfarb's articles in the "Weekly Standard". Not an unbiased source, but a lot of links to blogs and other sources. And of course there's "The New Republic's" own statements. For an example of why TNR is a laughing stock, take a look at the investigation into TNR's "investigation" of whether a Bradley Fighting Vehicle could run over a dog in the manner Beauchamp described. For a summation of the culture at TNR that led to all this, as well as an example of the journalistic CYA mentality at work, see "How The New Republic Got Suckered".)
The smart new media folks have learned this lesson already and public reaction is teaching it to the others. If you make a mistake, face up to it and deal with it honestly. Because if you don't it will be thrown back at you with hurricane force.
The War On The Wikis
Of all the manifestations of the new media, none has attracted the sheer level of bile aimed at Wikipedia, the online collaborative encyclopedia.
(In future posts we’ll examine some of that bile in detail. For now, let’s accept that it is so and move on.)
Part of that is a profound lack of understanding of how wikis work. Part of it is a knee-jerk reaction against something a lot of people see as extremely threatening. Many of those people are right to feel threatened, because the existence of Wikipedia and things like it is going to force them to change their ways or fade into irrelevance.
Let’s start with a simple fact. According to nearly every study, Wikipedia is about as accurate (in the journalistic sense of being factually correct) as the major print encyclopedias, such as the Encyclopedia Britannica. (The best compendium of the studies, ironically, is on Wikipedia. However, with the information there it’s easy to find either the original studies or reports on them.)
While the studies aren’t uniformly favorable (the Guardian’s panel of experts found material disorganized and not always helpful) they generally support Wikipedia against its real-world competitors.
Matching an encyclopedia’s accuracy is not an exalted standard, please note. Encyclopedias are notorious for being rife with errors, some of them deliberately introduced to catch copyright violators. (The late Fred Saberhagen, who was an editor and writer for the Britannica as well as a science fiction author, confirmed that to me in a conversation at a science fiction convention several years ago.) Many of the errors are simple mistakes.
But no matter what their source, there those errors sit, like flies in amber until the next edition of the encyclopedia comes out, often decades later.
This leads to a key point that Wikipedia’s critics consistently miss. Unlike printed sources, wikis have a powerful error-correction method built into the process.
In fact Wikipedia is sometimes wrong. This is especially true since another front in the war on the wikis comes from those who want to co-opt them for their own purposes by feeding them false, misleading or slanted information. This has apparently become a growth industry among the rich and powerful, with everyone from the CIA to Microsoft editing Wikipedia articles in an effort to elide inconvenient facts.
If you look at this with the mindset of print scholars it is horrendous. If anyone can edit material, how can you possibly produce accurate, reliable information? Obviously you can’t and the end product must be utterly unreliable.
Except you can and it isn’t.
Not only have examinations shown Wikipedia’s articles are about as accurate as those in conventional sources, repeated tests have shown that incorrect information, even when very subtle, is almost always corrected quickly, often within minutes. There have been a very few highly publicized exceptions, such as the claim that John Seigenthaler, former editor of the Nashville Tennessean, was involved in the murders of President Kennedy and his brother Robert, but those are both rare and very well reported when they do happen.
Part of the response to wikis is simply the normal human problem of appreciating the different. As one of my cultural anthropology professors was fond of saying, “different doesn’t mean better and it doesn’t mean worse. It means different.” Wikipedia, and wikis in general, use a profoundly different method of ensuring reliable information. Because it is so different a lot of people have trouble believing it can work as well as the traditional reliance on experts. Except it does.
Some of the attacks are a little peculiar. Lee Peoples, a law librarian at the Oklahoma City University Law Library, asked his students to analyze a Wikipedia article on administrative law. After they found out the piece had more than 50 authors, none of whom was named, the students displayed a ‘healthy skepticism’.
Skepticism about any source is good, authorship of a source is important, but the exercise seems oddly pointless as a test of accuracy. Administrative law is hardly terra incognita to a law librarian. Surely a better test would have been for Peoples or his students to compare the article to standard sources on administrative law to judge its accuracy. In fact it’s hard to see what Peoples’ example accomplished aside from scoring a rhetorical point.
So, differences aside, what’s going on here?
What is happening I think is that there are a lot of people, especially academics, who feel threatened by the easy access to information provided by the web. As a leading source of online information Wikipedia becomes the focus of that fear.
As we’ll see, one of the striking things about the vast majority of the attacks on Wikipedia is their fevered defense of experts and material “created by scholars” and “published by reputable publishers” as the only true source of knowledge. (The quotes are from Michael Gorman, whom we will meet in more detail in later posts.) The defenders have a point, but it is vastly overstated.
Indeed it reminds me mightily of the defenses put forward by Catholic theologians during the Reformation against the chaotic, pernicious and dangerous notion that anyone but an expert could interpret the Bible.
Which is, as Karl Marx was wont to say, no accident. Neither is it an accident that the people who are making these defenses of the status quo are largely the academics, intellectuals, librarians and others who have the most to lose in this epistemological earthquake.
(Not all of the hostility is from academics however. For example the “Weekly Standard” makes a habit of ladling out healthy doses of British snark on Wikipedia and the Standard is by no means a haunt of academics.)
In other words, besides some very real and serious concerns about new information sources, there is a lot of protectionism from people trying to keep their oxen from being severely gored by the new media and sources like Wikipedia.
Monday, August 20, 2007
Amateurphobia And Roiling The Clam Bed
One of the reasons our public debates go careening off at odd angles is that we, and especially Americans, have all the historical sense of a colony of cherrystone clams. (To steal a phrase from the greatest bathroom reading ever written: the Harvard Lampoon’s Bored of the Rings.)
The result is that one of our culture’s most popular characters is the Viewer With Alarm. Indeed some authors, notably the late Vance Packard (The Hidden Persuaders, etc., ad nauseam), have made long and prolific careers out of Viewing With Alarm. This has the advantage of providing a dose of unintentional humor when one comes across their forlorn relics on dusty shelves 30 years later. However the amusement is tinged with regret when the reader realizes that people not only bought this stuff, they bought into the arguments as well.
The instant Viewer With Alarm is one Andrew Keen, a persistent critic of Web 2.0, whose work, The Cult of the Amateur: How Today’s Internet Is Killing Our Culture is providing a rich source of blog fodder all across the spectrum.
Keen states his thesis baldly: “Because democratization, despite its lofty idealization, is undermining truth, souring civic discourse, and belittling expertise, experience and talent. As I noted earlier it is threatening the very future of our cultural institutions.”
As a Viewer With Alarm, Keen is long on hyperventilation and rather short on substance. Lawrence Lessig, one of Keen’s targets, calls his book an exercise in unintentional parody. Before shredding Keen’s critique of his position, Lessig summarizes one of his central arguments thusly:
“(Keen) tells us that without institutions, and standards, to signal what we can trust (like the institution (Doubleday) that decided to print his book), we won’t know what’s true and what’s false.”
If Keen really believes that, or anything close to it, he is truly one of the great blithering idiots of the 21st Century.
However, as anyone who’s ever been a professional journalist knows, blithering idiocy has never been a bar to success as a Viewer With Alarm. Some people are bound to take you seriously.
One such person is Tony Long over at Wired:
“But one of Keen’s central arguments — that the internet, by its all-inclusive nature and easy access, opens the door to amateurism-as-authority while at the same time devaluing professional currency — deserves a full airing. Basically, I think he’s right to criticize what he calls the “cut and paste” ethic that trivializes scholarship and professional ability, implying that anybody with a little pluck and the right technology can do just as well.”
To give Keen credit, he has noted one of the consequences of the internet as a medium. Just about anyone can use it and much of what is produced is narcissistic drivel. (The proof is about two mouse clicks away from you.) That conclusion is, perhaps, worthy of a bumper sticker (mine reads “The experiment has begun: A million monkeys at a million typewriters. We call it the Usenet.” Obviously it’s a very old bumper sticker.) Stretching it into a book is the intellectual equivalent of homeopathic medicine.
And that dumps us right in the middle of the clam bed.
What people like Keen utterly miss is the reaction of the audience. The audience adapts to the medium, which historically has meant that they have learned to analyze what they’re reading, hearing and seeing.
If you approach the internet with the same skill set people used to apply to newspapers, you’re in a lot of trouble. Just as you would have been in the 19th Century if you had read the new penny press with the same degree of credulity that people applied to the old-line newspapers. Or if a 16th Century reader applied the same degree of belief and acceptance to the mass of newly printed religious tracts that he or she did to what the priest said at Mass.
But they didn’t and we don’t, at least not for long. We, or most of us, are learning to exercise critical thinking skills to evaluate what we’re finding. We are learning to ignore the junk and to sift some semblance of truth from the fiction.
There’s something else at work here as well. That is the growing ‘professionalisation of amateurism’. In other words, even amateurs can learn and they generally do.
The sort of phenomenon that has Keen’s knickers in a knot happens nearly every time a new medium opens up channels of discourse. There is a huge outpouring of mostly really dreadful stuff as people take advantage of the new opportunities. Some of the stuff remains resolutely dreadful. But some of it becomes excellent.
Early in my journalistic career, the availability of cheap offset presses and photolithography as an alternative to hot metal type meant anyone could publish a ‘newspaper’ with little capital investment and even less skill. A lot of people did and the ‘underground press’ was born. It was rife with all the evil effects of amateurism Keen Views With Alarm. However mixed in with the dreck was an increasing amount of good work — what eventually became known as ‘alternative journalism’.
Similarly, when personal computers and cheap (relatively!) laser printers made desktop publishing possible, much of the early work was stomach-churningly awful. But it also opened up new channels of communication and became the foundation for the way we produce and distribute printed information today.
So once more we will see the cycle played out. And once more, despite people like Keen, the result will be ultimately beneficial.
Sunday, August 19, 2007
Eisenstein's Changes: Printing and the Web
In “The Printing Press As an Agent of Change”, Elizabeth Eisenstein listed six consequences of the shift from handwritten manuscripts to printed material.
Eisenstein’s consequences of printing were:
1) Dissemination
Printing spread information far and wide in books, pamphlets, newspapers, broadsides and other material. In the space of a hundred years or so, Europe went from being an information-poor society to a (comparatively) information-rich one.
2) Standardization
Everything from spelling to language became more uniform. It’s no accident that English as a unified language really emerges during the period when printing became popular.
3) Reorganization
With printing came the ability to organize material more effectively. In fact as information proliferated, organization became more important. Arranging information alphabetically really got its start during this period.
4) Data collection
Printing didn’t exactly make data collection easier, but it made it possible to spread the results of the data collected far more widely. Now scholars and merchants hundreds of miles apart could be sure they were working with the same information. It also meant that mistakes and inconsistencies became more obvious.
5) Preservation
With hundreds, or thousands, of copies of texts produced at a time and spread far and wide, the chances that a work would survive became much greater. Even when the religious or secular authorities tried to suppress a work they had much less chance of success.
The Catholic Church would certainly have suppressed Galileo’s “Discourses Concerning Two New Sciences,” but the manuscript was smuggled out of Italy and printed in the Protestant Netherlands; once hundreds of copies were circulating across Europe, Galileo’s discoveries were safe.
6) Amplification and reinforcement of existing trends
Everything from nationalism to Protestantism to science to the rise of vernacular languages for scholarly communication were well-established before printing arrived, but printing made all those trends more powerful and spread them more rapidly.
So now it’s 500 years later and we’re in the middle of a shift that’s at least as big as the introduction of printing and moving a lot faster. It’s instructive to compare how Eisenstein’s big six stack up on the internet and other new media.
1) Dissemination:
A big win. The new media spread everything from pornography to philosophy to physics around the world far faster and more efficiently than printed matter ever could.
2) Standardization:
A necessity. Where printing encouraged regional and national customs, languages and views of data, the new media encourage international standardization of everything from language to presentation.
In fact technical standards, from HTML to RSS, are central to the new media.
HTML provides a particularly instructive example. Ten years ago, back in the Web Paleozoic, it was accepted practice to use all kinds of non-standard tricks and clever hacks to design ‘killer’ web pages.
Reading books on web design from that era can give you the creeps.
The problem was that these non-standard methods actually interfered with communication because they broke browsers and sometimes even crashed computers. However it was impossible to convince some people that non-adherence to standards was a bad thing. They honestly didn’t understand why they couldn’t use their little hacks to make their web pages look just so. (On their computer and their browser, of course. But hey…)
The other problem goes well beyond web page design. The new media provides us with a raft of tools to aid standardization and we come to rely on them, sometimes to our detriment. The new media are still very much a work in progress and some of our tools just aren’t smart enough to do the job.
Case in point: spell checkers and the dreaded homonym/homophone problem. Spell checkers ‘know’ that ‘two’ is a word. But so are ‘to’ and ‘too’. If you rely on the spell checker to give you the right variation you’re going to produce anything from occasional illiteratisms to howlers that can make you look like an absolute idiot.
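The failure is easy to see in a toy sketch. The tiny word list and sentences below are illustrative assumptions, not any real checker's lexicon; the point is only that a tool which checks words one at a time cannot catch a homophone slip.

```python
# A toy "spell checker" of the kind described above: it only verifies
# that each token is *some* English word, so homophone errors sail
# straight through. DICTIONARY is a made-up miniature lexicon.

DICTIONARY = {"i", "went", "to", "too", "two", "the", "store"}

def misspelled(sentence):
    """Return the tokens a simple word-list checker would flag."""
    return [w for w in sentence.lower().split() if w not in DICTIONARY]

# A genuine typo is caught...
assert misspelled("I wnet to the store") == ["wnet"]
# ...but the wrong homophone is not: every word here is "correct",
# even though "two" should be "to". Catching that requires context,
# which a plain spell checker does not have.
assert misspelled("I went two the store") == []
```

Grammar checkers attack exactly this gap by looking at surrounding words, but in 2007 terms they are far from reliable, which is the "fallible tools" problem the next paragraph describes.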
The problem, of course, is that we expect standardization to the point where we’re upset if we don’t get it. We used to deal with this with paid experts called copy editors, or constant reference to dictionaries. Now we just rely on fallible tools and suffer the consequences.
3) Reorganization:
This one is really interesting. In print the reader has virtually no control over organization and presentation of material and even the author frequently doesn’t have much. In the new media we have almost total control over the organization of the material.
In fact we’re so used to being able to reorganize data to meet our whims, never mind our needs, that we feel cheated when we can’t shuffle the data around to suit our fancy.
Case in point: Wordpress.com, the site that used to host this blog, has a number of limitations on what you can do in terms of data organization and presentation. The themes (page templates) in particular suffer from a number of irritating limitations, such as not being able to have a list of previous articles automatically appear on all the templates. That lack of flexibility in data presentation is the main reason I gave up on them.
4) Data collection:
The biggest win of all. The new media are data sponges, soaking in every conceivable kind of data and making it available everywhere in the world. In fact our biggest problem is keeping from drowning in the information tsunami.
That is, of course, one of the big reasons that reorganization is so important. Selecting the kind of data we’re interested in, and the view of it we want, helps us swim with the data wave instead of being overwhelmed.
5) Preservation:
A major, major loss. Data on the web is not only ephemeral; perhaps worse, it is subject to change without notice. What was there last week may not be there today – or it may have been changed to eliminate embarrassing, damaging or particularly useful information.
History, as defined on the web, is even more mutable than it was for Winston Smith in 1984. The Ministry of Truth in Orwell’s dystopia had to physically recall and change books before shoving inconvenient material down the memory hole. With the internet, there’s no need to recall anything and the memory hole is as close as the nearest computer.
Factual information disappears in a heartbeat as well. There’s usually a huge loss of information at the end of every semester, as thousands of students leave school and their web accounts are closed. The fact that some of that material is extremely useful to a fair number of people doesn’t protect it. It vanishes silently away, leaving only 404’ed references on Google – and wailing and gnashing of teeth among those for whom the information was important.
6) Amplification and reinforcement of existing trends.
Macrocosmically, yes. Microcosmically, yes and no. The new media reinforce major historical trends like delocalization, disintermediation, specialization and anti-mediocrity.
“As Maine goes, so goes California – except twice as far and four times as fast.” The old joke isn’t quite so funny any more.
All these macrotrends can be traced back hundreds of years, some of them into the High Middle Ages, even before printing. The new media reinforce and accelerate those changes.
At a micro level, the long-tail power law applies with a vengeance. The new media offer an enormously wider range of choices in everything from friendships to industrial suppliers. The counterintuitive result is that a few of those choices become enormously more popular than the rest, yet the aggregate volume of the many less popular choices is likely to exceed the volume of the popular few.
In other words, Britney Spears becomes enormously more popular, but if your tastes run to the pibroch (the classic music of the Highland Bagpipe), you have a lot more choices available to you as well.
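The arithmetic behind that claim is easy to sketch. Assuming, purely for illustration, that popularity follows a Zipf-style power law (the relative popularity of the item ranked r is proportional to 1/r), the head of the distribution towers over any single tail item, yet the tail in aggregate still outweighs the head:

```python
# Toy illustration of the long-tail claim: under a Zipf-like popularity
# distribution, the top items dwarf any individual obscure item, but the
# aggregate of the obscure "tail" outweighs the popular "head".
# The catalog size and exponent are illustrative assumptions.

def zipf_shares(n_items, exponent=1.0):
    """Relative popularity of the item ranked r is 1 / r**exponent."""
    return [1.0 / (rank ** exponent) for rank in range(1, n_items + 1)]

sales = zipf_shares(100_000)
total = sum(sales)

head = sum(sales[:100]) / total    # the 100 most popular items
tail = sum(sales[100:]) / total    # the other 99,900

print(f"Top 100 items:        {head:.1%} of total volume")
print(f"Remaining 99,900:     {tail:.1%} of total volume")
```

With these made-up numbers the top 100 titles out of 100,000 (one tenth of one percent of the catalog) capture a bit over 40 percent of the volume, a wildly disproportionate share; even so, the other 99,900 together still account for more volume than the hits do.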
The other thing that happens is that as the choice space expands exponentially, people are likely to discover things that they like more among the choices they didn’t know they had before. That tends to produce rapid, and rapidly fluctuating, shifts in relative popularity of the various choices. Today it may be Britney Spears, tomorrow she may be replaced by someone you haven’t heard of yet.
Fungal Journalism, Michael Yon and his ilk
If you want to understand Iraq, Michael Yon is indispensable.
“Clearly, context like this is not well-served by the adversarial frame of “mainstream-versus-alternative-news.” Reporting from this war is deadly serious business. Deadly for the reporters, but how they report can also be deadly for us all.”
However if you look at the view from 10,000 feet, Michael’s work exemplifies another kind of truth. His recent dispatches demonstrate that there really is a “mainstream-versus-alternative news” split. And the truth is that division is critical to the free flow of information a democracy depends on.
To understand why, let’s back up a couple of times around Robin Hood’s barn and start our voyage of journalistic exploration with a humble fungus.
The mainstream media is an informational monoculture – the same training, the same practices, the same assumptions – much as the American corn crop of 1970 was a genetic monoculture of nearly identical hybrids. The result of this monoculture is that if you know what you’re doing you can play the mainstream media like a pinball machine. By understanding the process and the mindset you can insert toxins into the flow of information as surely and as ruthlessly as Bipolaris maydis, the Southern Corn Leaf Blight fungus, invades the stomata of corn plants. The fungus destroys the energy-producing centers of cells. The journalistic attackers hijack the process to spread their own propaganda.
International terrorists may be the latest players of Fungal Journalism Pinball, but they still aren’t the only ones. Increasingly our media is held captive by the manipulators and sometimes the results are truly absurd.
The Answer
This has potentially disastrous consequences for the United States, because a democracy utterly depends on a flow of accurate information to its citizens to function effectively. Increasingly, fungal journalism has systematically distorted that flow, poisoning it as effectively as the Southern Corn Blight fungus destroys the energy-producing centers in corn plant cells.
Resources:
http://www.michaelyon-online.com/wp/tabula-rasa.htm
The article on the massacre at al-Hamari (warning! Graphic)
http://www.michaelyon-online.com/wp/bless-the-beasts-and-children.htm
The follow up on the media non-reaction to al-Hamari
http://www.michaelyon-online.com/wp/update-on-bless-the-beasts-and-children.htm
The report that motivated this blog
http://www.michaelyon-online.com/wp/second-chances.htm
You can read more about the 1970 Corn Crisis here
http://www.cbwinfo.com/Biological/PlantPath/BM.html
and here:
http://www.sciencemag.org/cgi/content/abstract/171/3976/1113
and more about the perils of monoculture in agriculture here:
http://aboutbiodiversity.org/agbdx/cornblight.html
and here:
http://oregonstate.edu/~muirp/cropdiv.htm
For an account of Steorn and Orbo, see here:
http://www.itwire.com.au/content/view/13359/1103/
for a list of stories, see:
http://peswiki.com/index.php/Directory:Steorn_Free_Energy#Other_Press_Coverage
The Guardian reporter indicates skepticism, but the overall tone is favorable and the paper published it anyway. A classic example of a successful fungal journalism attack.
http://environment.guardian.co.uk/energy/story/0,,1858172,00.html