Wednesday, March 19, 2008

READING JOURNALISTIC TEA LEAVES -- AND GETTING IT NOT-QUITE-RIGHT

The Center For Excellence In Journalism has released its 2008 report on the state of journalism in America. The report is a gold mine of information for those with a serious interest in journalism. However, almost no one is reading past the executive summary (the thing is huge), and the summary's conclusions and (mostly implicit) predictions for the future are questionable.

Basically the report, and a lot of the commentary, reflect the usual problem with making sense of a mind-bendingly revolutionary technology (hereinafter MBRT for ease of typing). That is, when we're faced with something mind-bending and imperfectly understood, the natural instinct is to fall back on our preconceptions of how the world works -- i.e., our prejudices.

Now preconceptions about the way the world works come in two general flavors: the conventional and the radical. The conventional view is that whatever it is isn't going to make much difference, things will keep on working the way they have, and all the fuss is overblown. The radical preconception sees the MBRT as something that will remake the world into the image the particular radical preconceiver finds most attractive. Thus, socialism becomes a cure for body odor on the subways, conserving energy will lead us to a green utopia, radio will provide rich culture to the masses, computers will set us all free -- etc., etc., et dreary cetera.

(There's a third form of preconception, which is apocalyptic. Whatever it is will destroy us utterly. This tends to quickly shade over into forces driving us to the particular preconceiver's utopia. Witness Marxism and the reaction to the atomic bomb.)

The problem, of course, is that preconceptions are no better a guide to the future than they are to the present. If you're looking at a true MBRT (and they're rarer than proclaimed) it does change the world, but not totally, and not in the ways or to the degrees the radicals or the apocalyptics want to believe.

The journalism report, coming as it does out of the journalistic mainstream (not to say the journalistic establishment), is pretty thoroughly conventional. It discovers much in the report's data to support the conventional view of the impact of the new media. As such it stands in sharp contrast to the radical view of the new media and journalism. IMHO both sets of preconceptions are equally wrong.

One of the report's points which is getting a lot of play is that the 10 most popular online news sites are mostly products of the world of conventional journalism, whether a journalistic organization such as the New York Times or an aggregator such as Google or Yahoo.

That's true. However the conclusions the report's summary draws from that, and other data, are much more questionable.

The verdict on citizen media for now suggests limitations.

Now I'd like to maintain that no one ever claimed that "citizen media" didn't have limitations. Unfortunately it wouldn't be true. One of the phenomena associated with MBRT is that some people see it as doing away with everything that went before and utterly reshaping the landscape -- not surprisingly, in their own image.

Consider, for example, the "Computer Liberation" movement of the 60s and 70s, built around Ted Nelson's book "Computer Lib." The book is worth reading today to see how the predictions for an MBRT do -- and don't -- come true.

The purpose of computers is human freedom.
-- Computer Lib, 1974
(Ted Nelson)
The parallels are instructive. And Ted is both brighter and more rational than a lot of the radicals sounding off on the impact of the new media on journalism.

The summary continues:

The prospects for user-created content, once thought possibly central to the next era of journalism, for now appear more limited, even among “citizen” sites and blogs.

Here's where preconceptions start to run the authors off the rails. This statement is only true if you bought into the massive hype about the new media destroying the old journalism. Of course it's done no such thing and it's not likely to.

What it is going to do, just as personal computers did, is change the landscape fundamentally. Most user-created comment is, and will continue to be, meta-comment -- that is, comment on other sources of information. Currently the most accessible of those sources are the conventional media.

However conventional media are beginning to change in response to the new media. Some of the most important changes arise from the fact that the conventional media are no longer the only ones with a megaphone. Others come from the availability of information from much more diverse sources.

Google Headlines may aggregate conventional news sources, but it aggregates hundreds of news sources from all over the world. This gives a much broader picture of what's going on. And if you want more information DAGS (Do A Google Search) for the background, reports, documents and all sorts of other information. Don't want to do it yourself? There are usually a lot of bloggers and citizen journalists out there who will point you toward sources.

News people report the most promising parts of citizen input currently are new ideas, sources, comments and to some extent pictures and video.

First, note whose perception they are relying on: Journalists. That's an interestingly self-referential filter. Second, that's to be expected, of course. At the present time journalists are much more tightly connected with the tools to gather news. (Try calling up the governor's office as a private citizen and asking for comment on the latest state budget crisis.) However this is changing as more sources of information become available.

But citizens posting news content has proved less valuable, with too little that is new or verifiable.

If it's new and verifiable it's scooped up by the conventional media as soon as it hits the web. Which is as it should be. This phenomenon started nationally and is working its way down. It's worthwhile to look at the list of major national news stories which are broken each year by bloggers and other citizen journalists. In the area where bloggers and citizen journalists are most highly concentrated -- science and technology -- the percentage of stories that start with citizen journalists approaches 100 percent in some cases. Even political stories are increasingly broken by people like Matt Drudge, and of course the work of bloggers like Michael Yon in Iraq pretty nearly defined the war at a time when the conventional media were getting it spectacularly wrong.

But a study of citizen media contained in this report finds most of these sites do not let outsiders do more than comment on the site’s own material, the same as most traditional news sites.

In other words, we're not all Wikipedia. But the missed point here is that if you don't like what that particular citizen medium has to say, it's easy to set up your own. "Freedom of the press is guaranteed only to those who own one," the press critic A.J. Liebling famously observed in the last century. However, today that's everybody.

In other words there's an important corrective here that's lacking in conventional media because there are so many alternate voices. Less generally, but more tellingly, the claim that the response opportunities are "the same as most traditional news sites" is either mind-numbingly self-serving or breathtakingly ignorant.

The fact is that virtually every new media site, and certainly the important ones, offers far, far more opportunity for comment without the kind of editorial filtering one encounters in the letters to the editor column of a newspaper. (As the one-time managing editor of a small daily I can say this with some authority.)

Few allow the posting of news, information, community events or even letters to the editors. And blog sites are even more restricted.

This is by and large not true, as scanning through the comments sections following articles on sites like Slashdot, TechCrunch, etc. will easily demonstrate. And if you're still not satisfied with your ability to post, start your own blog.

This is so wrong that I strongly suspect the authors' problem is ignorance rather than disingenuousness. They simply don't know, and can't understand, how the new media work. The next comment supports that notion.

In short, rather than rejecting the “gatekeeper” role of traditional journalism, for now citizen journalists and bloggers appear for now to be recreating it in other places.

Ah yes, the "gatekeeper" argument. The problem with this argument is that it's specious: it equates a single blog or news site with the local newspaper. The key difference is that the newspaper, plus perhaps a couple of television stations (although the local news gerbils do a horrendous job when it comes to original reporting), are the only sources of news in the community. The blog or news site is one of dozens, perhaps hundreds, commenting on major topics.

More broadly the argument is misconceived for the simple reason that you will always have, and need, gatekeepers for any given source. Over the years journalism has been lambasted for its role as a gatekeeper, not because gatekeepers are inherently bad, but because one or two organizations had a monopoly on gatekeeping. The monopoly was the problem, not gatekeeping per se.

Finally we come to the quote that sums up what the report's authors want to believe.

more and more it appears that the biggest problem facing traditional media has less to do with where people get information than how to pay for it — the emerging reality that advertising isn’t migrating online with the consumer. The crisis in journalism, in other words, may not strictly be loss of audience. It may, more fundamentally, be the decoupling of news and advertising.

Which comes perilously close to saying that conventional media will continue to be the main source of information and this whole new media thing is overblown.

New media is indeed overblown in some (ever narrowing) circles, but conventional media will most assuredly not continue to be the public's main source of information -- at least not in the traditional sense. Looking back over the last decade there's been a sea change in journalism and the change is only beginning.

The "crisis in journalism" -- traditional journalism, anyway -- is indeed in large part that newspapers, television and other old-style media are losing advertising revenue. One area where that's particularly true is the most lucrative section of any newspaper, the classified ads. This was already starting to be a problem 30 years ago with the growth of free shoppers. The web has greatly sped up the process. So advertising is migrating to the web. It's just not supporting conventional journalism as it does so.

The Tech Liberation site has an overview of the report here. The report itself, all 180,000 words, can be found here:

Google, ontology and magic phrases

Search engines are vital to the web. Search engines also suck. I was just forcibly reminded of those facts as I struggled to find a source through Google.

For my purposes Google is the best of a bad lot. It indexes a huge number of sites, adds material fairly quickly and tries to stay up to date. But the search mechanism is fundamentally broken, because they're all fundamentally broken.

The immediate problem is trying to figure out what to search for. The larger problem is search taxonomy -- how to organize the information so the user can quickly find what he or she is looking for. The method used today is a string search. That essentially means guessing the magic phrase that refers to whatever you're looking for.

This is not only annoying as hell, it is a supremely complicated problem because not everyone uses the same words or phrases for a thing when they search for it.

I got considerable exposure to this a couple of years ago when I was acting as "Chief Staff Ontologist" (hey, I got to pick my own title) for an online yellow pages company. We wanted a way to classify the hundreds of thousands of listings in our database so customers could find the business they were looking for.

Developing a working, and workable, taxonomy just for businesses is a dauntingly complex task. Part of the problem is regionalisms. The same business is called different things in different parts of the country. You can have an "undertaker", a "funeral home", a "funeral parlor", a "mortuary", and several other terms, depending on where you are. And in some areas those terms have specific, differentiable, meanings. For example in some places in the east, mostly in large urban areas, a "funeral parlor" is a place to hold funerals. It doesn't provide embalming or other related services.

Essentially when you develop the taxonomy you have to try to read your searcher's mind, just as a searcher using a search engine has to read the mind of the people who put up the web site.

And since the searches are basically string-based, you've got no way to intelligently cross-reference topics. In fact in a search engine there usually aren't any topics.

I just spent a couple of days trying to find an appropriate "IT compliance consultant" in California for a story I am working on. It turned out the magic phrase was "security consultant" with a sub-specialty in compliance issues. Arrgh!

If you get the impression from this that I have a better solution, I hate to disabuse you but I do not. The answer undoubtedly involves what is called the semantic web -- being able to search by meaning rather than a string of characters. However the semantic web is mostly a pious wish that's struggling to achieve buzzword status.

As far as I know, no one can do a useful, generalized semantic search. The only way I know to do it is to have humans cross reference terms. A whole lot of humans doing a whole lot of cross references.

I suspect we're eventually going to get more useful search through a massive wiki-like project where people enter terms and, after flailing around and finding the magic phrase, provide a cross reference between terms. That's not an elegant solution, but given the power of the web -- and the need -- it's one that can work.