Post-Industrial Journalism

Section 3: Ecosystem

The only reason to talk about something as abstract as a news ecosystem is as a way of understanding what’s changed. The most significant recent change, of course, is the spread of the internet, connecting our computers and phones in a grid that is global, social, ubiquitous and cheap. As new capabilities go, the ability for any connected citizen to make, copy, alter, share and discuss digital content is a lulu, upending many existing assumptions about news and about media in general.

The news business in the 20th century was a fairly linear process, where reporters and editors would gather facts and observations and turn them into stories, which were then committed to ink on paper or waves in the air, and finally consumed, at the far end of those various modes of transport, by the audience.

A pipeline is the simplest metaphor for that process, whether distribution of news was organized around the printing press or the broadcast tower. Part of the conceptual simplicity of traditional media came from the clarity provided by the near-total division of roles between professionals and amateurs. Reporters and editors (and producers and engineers) worked “upstream,” which is to say, as the source of the news. They created and refined the product, decided when it was ready for consumption, and sent it out when it was.

Meanwhile, the audience was “downstream.” We were the recipients of this product, seeing it only in its final, packaged form. We could consume it, of course (our principal job), and we could talk about it around the dinner table or the water cooler, but little more. News was something we got, not something we used. If we wanted to put our own observations out in public, we needed permission from the pros, who had to be persuaded to print our letters to the editor, or to give us a few moments of airtime on a call-in show.

That pipeline model is still central to the self-conception of many institutions in the news business, but the gap between that model and the real world has grown large and is growing larger, because the formerly separate worlds of the professionals and the amateurs are intersecting more dramatically, and more unpredictably, by the day.

The main effect of digital media is that there is no main effect. The changes wrought by the internet and mobile phones, and the applications built on top of them, are so various and pervasive as to defeat any attempt to understand the current transition as a single force or factor. To understand this as a change to the ecosystem, it helps to have a sense of where the changes are showing up, and how they interact.

Here are a few surprises in our little corner of the 21st century:

  • In 2002, after Senate Minority Leader Trent Lott praised Strom Thurmond’s segregationist 1948 campaign, one of the people who did Lott in was Ed Sebesta, a historian who had been tracking racist statements made by American politicians to segregationist groups. Shortly after Lott said his praise had been an uncharacteristic slip, Sebesta contacted Josh Marshall, who ran the blog Talking Points Memo, to share similar (and similarly racist) comments made by Lott dating back to the 1980s.

    These comments undermined Lott’s ability to characterize his comments as a slip and led to his losing his Republican leadership position. Sebesta had built the database of racist speech on his own, without institutional support; Marshall was an amateur blogger (not yet having incorporated); and the source contacted the news outlet, 1,500 miles away, rather than vice versa. Indeed, as mentioned in Section 2, Talking Points Memo became the institution it is today because of what Marshall was able to do as an amateur (another example of institutional stabilization).

  • In 2005, the London transit system was bombed. Sir Ian Blair, the head of London’s Metropolitan police, went on radio and TV to announce that the cause had been an electrical failure in the underground. Within minutes of Blair’s statements, citizens began posting and analyzing pictures of a bombed double-decker bus in Tavistock Square, and in less than two hours, hundreds of blog posts were analyzing this evidence. These posts reached hundreds of thousands of readers and explicitly contradicted Blair’s characterization.

    Seeing this, and overriding the advice of his own communications staff, Blair went on air again less than two hours later to say that it had indeed been a bombing, that the police didn’t have all the answers yet, and that he would continue reporting as they knew more. When he spoke to the public, Blair had the power of all the traditional media behind him, but it was clear that merely having a consistent message on every broadcast channel in existence was no longer the same as having control.

  • Starting in 2010, in a series of reports called Dollars for Docs, ProPublica covered the flow of payments between the pharmaceutical industry and prescribing physicians. It was a story that had been covered before in bits and pieces, but ProPublica brought several things to its investigation not previously seen, including a database it assembled from data the pharmaceutical companies were required to make public, along with the ability and journalistic will to mine that database.

    Dollars for Docs was not just a new report. It was a new kind of reporting. Though much of the data used were publicly available, they had not been centralized or standardized in a form that could make them useful; armed with this database, ProPublica has been able to report on a national story, while also providing tools for other organizations to cover the same issue as a local story; as of this writing, it helped spark stories in 125 other publications. (As a nonprofit, ProPublica can be both a news retailer and wholesaler.) In addition, it has been able to make its database as local as any news story can ever get: individual users can type the name of their doctor into the database and get a customized report. The harvesting and organizing of publicly available data thus became a platform for national, local and personal reporting.
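
To make the idea of a database as a reporting platform concrete, here is a minimal sketch in Python of the kind of lookup that powers such a personalized report. The table layout, column names and sample usage are hypothetical, invented for illustration; ProPublica’s actual system is certainly more elaborate.

```python
import sqlite3

# A minimal sketch of a "Dollars for Docs"-style lookup.
# Table and column names are hypothetical, for illustration only.
conn = sqlite3.connect("payments.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS payments (
        doctor_name TEXT,     -- standardized across company disclosures
        company     TEXT,
        category    TEXT,     -- e.g., speaking, consulting, meals
        amount_usd  REAL,
        year        INTEGER
    )
""")

def report_for_doctor(name):
    """Total payments per company for one doctor: the 'personal'
    end of national/local/personal reporting."""
    return conn.execute(
        """SELECT company, SUM(amount_usd) AS total
           FROM payments
           WHERE doctor_name = ?
           GROUP BY company
           ORDER BY total DESC""",
        (name,),
    ).fetchall()

# Usage: report_for_doctor("Jane Smith") -> [("ExampleCo", 12500.0), ...]
```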

Better access to individuals, as with Ed Sebesta; to crowds, as with the London bloggers; and to machines, as with Dollars for Docs, is driving working models that would have been both unthinkable and unworkable even 10 years ago: Huffington Post’s Off the Bus project, covering every Iowa caucus in 2008 with citizen journalists, would have bankrupted the organization had it been done with stringers. The Guardian decided to crowdsource the tracking of expenses by UK members of Parliament, because the job, done by employees, would not just have cost too much but taken too long.

Journalists have always used tip lines and man-in-the-street interviews, and members of the audience have always clipped and forwarded favorite articles. What’s new here isn’t the possibility of occasional citizen involvement. What’s new is the speed and scale and leverage of that involvement, the possibility of persistent, dramatic amounts of participation by people previously relegated to largely invisible consumption. What’s new is that making public statements no longer requires pre-existing outlets or professional publishers.

Tip lines worked well only in geographically local areas, but NY Velocity was able to reach halfway around the world to get its critical interview in the Lance Armstrong doping case. Man-in-the-street interviews were random because the professionals controlled the mode and tempo of public utterances, but with Flickr and weblogs, British bloggers could discuss the London bombings in public, at will, and with no professionals anywhere in sight. Dollars for Docs took disparate data and turned them into a database, which gave ProPublica an ongoing resource that has been reused by it, other organizations, and millions of users over the course of two years and counting.

This is a change in degree so large, in other words, that it amounts to a change in kind. As Steven Levy observed, writing about the iPod, when you make something 10 percent better, you’ve made an improvement, but when you make something 10 times better, you’ve created a new thing. New digital tools can accelerate existing patterns of news gathering, shaping and publishing so dramatically that they become new things.

We are living through a shock of inclusion, where the former audience is becoming increasingly intertwined with all aspects of news, as sources who can go public on their own, as groups that can both create and comb through data in ways the professionals can’t, as disseminators and syndicators and users of the news.

This shock of inclusion is coming from the outside in, driven not by the professionals formerly in charge, but by the former audience. It is also being driven by new news entrepreneurs, the men and women who want to build new kinds of sites and services that assume, rather than ignore, the free time and talents of the public.

The importance of news isn’t going away. The importance of dedicated professionals isn’t going away. What’s going away are the linearity of the process and the passivity of the audience. What’s going away is a world where the news was made only by professionals, and consumed only by amateurs who couldn’t do much to produce news on their own, or distribute it, or act on it en bloc.

This is a change so varied and robust that we need to consider retiring the word “consumer” altogether and treat consumption as simply one behavior of many that citizens can now engage in. The kinds of changes that are coming will dwarf those we’ve already seen, as citizen involvement stops being a set of special cases and becomes core to our conception of how the news ecosystem can and should function.

Ecosystems and Control

To talk about a “news ecosystem” is to recognize that no news organization is now, or has ever been, absolute master of its own destiny. Relationships elsewhere in the ecosystem set the context for any given organization; changes in the ecosystem alter that context.

This essay began with a focus on the individual journalist, and on the various ways she can gather, process and make sense of information and events vital to public life. Most journalists do their work inside institutions; those institutions are shaped by everything from the size and makeup of the staff they employ to their self-conception and source of revenue. These institutions in turn shape the work of the journalist: which stories she can and can’t pursue, what is considered good or bad work, who her collaborators can be, and what resources are at her disposal.

Those institutions are themselves in an analogous position, operating in the part of the media environment that covers the news (and sometimes even the part that doesn’t). This news ecosystem (hereafter just “ecosystem”) is made up of other institutions—competitors, collaborators, vendors and suppliers—but it is also made up of the ways other actors affect those institutions. The audience’s preference for news about Hollywood over Washington, the presence of the competition just a click away, the Supreme Court’s current interpretation of the First Amendment, and the proliferation of high-quality cameras on mobile phones are all part of the news ecosystem of the early 21st century, the effects of the ancient and modern all mixed together.

The ecosystem also shapes institutional capability: the kinds of stories that do and don’t get pursued are affected by everything from audience and advertiser desires to narrative frames. Everyone knows how to tell the story of a cheating athlete or a business gone bankrupt, but there is no obvious narrative frame for the tension between monetary and fiscal union in the EU, even though the latter story is by far the more important. Similarly, the facts and assumptions around things like access to data, validity of sources, the nature and limits of acceptable partnerships, and so on affect what institutions believe they can and can’t do, and should and shouldn’t do.

In the pipeline model of news, the existing institutions could be thought of as a series of production bottlenecks, owned and operated by media firms, and from which they captured income from both advertisers and audience. These bottlenecks were a byproduct of the incredible cost and difficulty of reproducing and distributing information, whether via printing press or broadcast tower. As noted in the last section, this was an ecosystem in which the institutions themselves had a high degree of control over their own fates.

A large, competent staff was required to print and deliver a daily paper; an even larger one was required to make and broadcast a news program. These costs and difficulties limited competition, as did the geographic range of delivery trucks and broadcast signals. Within the small numbers of organizations that could create and distribute news, whole professional structures arose.

Newspapers and magazines saw this institutionalization first, of course; the printing press preceded not just radio and movies but also steam engines and telegraphs. The entire professional edifice of writers and editors and publishers and, later, illustrators and layout artists and fact checkers and all the rest of the apparatus that went into creating a newspaper was built around—and often quite literally on top of—the giant machines that put the ink on the paper. Radio and TV news departments followed the same pattern, inventing professional categories and practices to subdivide and systematize both the work and the categories of employment that went into making broadcast news.

Then came the internet, whose basic logic—digital replication, universally available, with no division of participants into producers and consumers—is at odds with the organizing principles of news production as it has existed since the 1600s. Abundance creates more disruption than scarcity; when everyone suddenly got a lot more freedom, every relationship in the old “charge for operating the bottleneck” model was up for grabs.

The arrival of the internet did not herald a new entrant in the news ecosystem. It heralded a new ecosystem, full stop. Advertisers could reach consumers directly, without paying a toll, and it turned out many consumers preferred it that way. Amateurs could be reporters, in the most literal sense of the word—stories from the Szechuan quake to Sullenberger’s Hudson River landing to Syrian massacres were broken by firsthand accounts. The doctrine of “fair use,” previously an escape valve for orderly reuse of small amounts of content among a small group of publishers, suddenly became the sort of opportunity that whole new businesses of aggregation and re-blogging could be built on top of. And so on.

When changes are small or localized and existing institutions are well adapted to those conditions, it doesn’t make much sense to think about things as an “ecosystem”—simply responding to competitive pressures and adapting to small and obvious changes is enough. For institutions that produce news, however, the changes of the past decade have not been small or localized.

A common theme in writing about the response to those changes by traditional news outlets is the failure of newspaper management to recognize the problems they would face. This, in our view, misdiagnoses the problem: The transition to digital production and distribution of information has so dramatically altered the relations among publishers and citizens that “stay the course” has never been an option, and, for the majority of the press that was ad-supported, there was never an option that didn’t involve painful restructuring.

A similar theme has been unpredictability and surprise, explaining the current crisis with the rationale that recent changes were so unforeseeable and have transpired so rapidly that traditional organizations were unable to adapt. This view is also wrong: There were coherent predictions of the trouble the internet would cause for the news industry going back to the late 1980s, and despite frequent invocations of “internet time,” the pace of this change has been glacial; dated from 1994 (the first year of the broadly commercial web), management has had 75 consecutive quarters to adapt.

Individual accounts of even successful adaptation to the current ecosystem make it clear how hard such adaptation is. To take one example, in August 2011, the New York Daily News launched innovative live coverage of Hurricane Irene, replacing the front page of its website with a live blog, Storm Tracker.

The News then dispatched reporters out into the city, armed with cameras and phones (often the same device) to document everything from the evacuation efforts, to residents’ struggles to shelter in place, to the effects of the wind and water itself. These live reports were interspersed with messages from weather services, emergency services and city government, all unfolding along with the storm.

The News’ effort in live disaster blogging was a triumph, for which the News rightly won considerable praise. It also almost didn’t happen. The precipitating event for Storm Tracker was not a new web strategy but the failure of an old one. The News building is on Water Street, in a Class A flood plain, so the police severely limited the number of workers who could go there on the weekend Irene blew in. This would seem to be no problem for filing digital copy, except that the News’ content management system had been engineered to be difficult to log into if you weren’t in the building.

As noted earlier by Anjali Mullany, who pioneered live blogging at the News and oversaw Storm Tracker, the need to establish a production process around a CMS creates a large but often hidden tax on attempts at innovation. In this particular case, the Daily News had taken a tool that could have been accessible to anyone working for the paper anywhere in the world, and added security constraints so that it instead behaved like a steam-driven printing press—workers had to be near the machine to operate it, even though the machine was a networked computer.

The defining need that drove the launch of Storm Tracker, in other words, wasn’t to find new ways to inform the residents of New York City during a big storm, but simply to find a way to keep the website up when terrible engineering decisions collided with terrible weather.

This was one essential factor in the launch of Storm Tracker. There was one other. In interviews with Mullany about Storm Tracker’s success, she noted that it was fortunate that Irene had hit in late August instead of early September, because in late August, most senior management were on vacation and thus could not override the decision of the News’ junior but more web-savvy staff to try something new.

As noted in Section 2, institutions are designed to resist change—that is their core competence, in the language of management consultants. The risk, of course, is that too much success in that department can preserve an institution’s internal logic right up to the moment it collapses. If what it takes to innovate in the manner of Storm Tracker is brain-dead technology management, the fear that your newsroom will be washed out to sea, and senior management gone fishing, then the prospects for orderly innovation among legacy organizations are grim. (As a dreadful coda, Hurricane Sandy flooded the Daily News building, and the users of the CMS suffered the same issue as during Irene. Even a year after the original crisis, no one had adapted the system to allow for a distributed workforce.)

Given this, the old news industry’s collective fabulation about restoring status quo ante has itself been harmful. News organizations should obviously do what they can to improve their income, but the reliable revenue, high profits and cultural norms of the news business in the 20th century are gone, and the ecosystem that reliably produced such effects is gone as well. For individual journalists and for the institutions that serve them, cost containment, plus restructuring in the direction of more impact per hour or dollar invested, is the new norm of effective news organizations, the pattern we’ve taken to calling post-industrial journalism.

Post-Industrial Ecosystem

What does post-industrial journalism look like? It starts with the assumption, introduced in Section 2, that news organizations are no longer in control of the news, as it has traditionally been understood, and that the heightened degree of public agency by citizens, governments, businesses and even loosely affiliated networks is a permanent change, to which news organizations must adapt.

As one example of this change, the ejection of the Occupy Wall Street movement from New York’s Zuccotti Park in November 2011 was broken not by the traditional press, but by the occupiers themselves, who sent word of the police action via SMS, Twitter and Facebook. More pictures and video of the event were generated by the participants than by the traditional media, in part because the overwhelming majority of available cameras were in the pockets of the occupiers and in part because the police closed the airspace above the park to news helicopters. Reporters on the scene hid their press badges because ordinary citizens had better access to the events in question than credentialed members of the press.

Similarly, the news organizations that ran leaked documents from WikiLeaks often described WikiLeaks as a source rather than as a publisher, on the rationale that WikiLeaks provided the material they were working from. This makes sense in a world where holders of important information can’t publish it on their own and where publishers don’t share source materials with one another. But there is no longer a right answer to the question, “Who is a publisher and who is a source?” WikiLeaks is a source that can publish globally; it is a publisher that collaborates on delivery of raw material with other publishers.

Coverage of events like #Occupy and Cablegate (as well as Tunisian uprisings, Syrian massacres, Indonesian tsunamis, Chinese train crashes and Chilean protests) simply cannot be described or explained using the old language of the pipeline. The best argument for thinking of news as an ecosystem is to help reexamine the roles institutions can play in that ecosystem.

Imagine dividing the new entities in the news ecosystem into three broad categories—individuals, crowds and machines (which is to say, both new sources of data and new ways of processing it). Individuals are newly powerful because each of them has access to a button that reads “Publish”; material can now appear and spread, borne on nothing but the wings of newly dense social networks. Crowds are powerful because media have become social, providing a substrate not just for individual consumption but also for group conversation. Kate Hanni was able to use newspaper comment sections to drive her “Airline Passengers Bill of Rights” because she had a better sense of those papers as watering holes than they had themselves. And machines are newly powerful because the explosion of data and analytic methods opens whole new vistas of analysis, as with lexical and social network analyses that followed the release of State Department cables.
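
As a small illustration of what the “machines” category makes possible, consider the simplest form of lexical analysis: counting which terms dominate a large document dump. The sketch below is ours, with a hypothetical file name; it is not the method actually used on the cables.

```python
import re
from collections import Counter

# Minimal sketch of lexical analysis over a document dump.
# The file name is hypothetical.
def top_terms(path, n=20):
    text = open(path, encoding="utf-8").read().lower()
    words = re.findall(r"[a-z']+", text)
    stopwords = {"the", "and", "of", "to", "a", "in", "that", "is", "for", "on"}
    return Counter(w for w in words if w not in stopwords).most_common(n)

# Usage: top_terms("cables.txt") might return
# [("embassy", 1042), ("minister", 873), ...]
```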

As with the inability to make WikiLeaks stay firmly in the category of either source or publisher, there is no stable attitude that a news outlet can take toward the new agency of individuals, the spread of ridiculously easy group-forming, or the increase in the volume of raw data and the new power of analytic tools. As the Daily News’ unwitting experiment with disaster blogging demonstrates, these are not resources that can be added to the old system to improve it. These are resources that change any institution that adopts them.

Now imagine dividing up the core operation of a news organization into three overlapping phases—gathering information about a story, shaping it into something ready to publish, and then publishing it. This taxonomy of a news pipeline into getting, making, and telling is of course simplistic, but it captures the basic logic of news production—take material broadly from the outside world, shape it into whatever your organization considers a story or a segment or a post, and then send the newly fashioned material back out into the world.

Armed with these two triads, we can ask, “How do individuals, crowds and machines affect the work of getting, making and telling?”

  • As one example of new actors in the “getting” phase, consider the cycling blog NY Velocity, founded in 2004 by cycling enthusiasts Andy Shen, Alex Ostroy and Dan Schmalz. Though the site existed mostly to cover bike racing in New York, the people running it grew increasingly alarmed at what they thought was a culture of willful blindness around the possibility that Lance Armstrong, seven-time winner of the Tour de France, had been doping with Erythropoietin, or EPO, a blood-boosting drug. NY Velocity interviewed Michael Ashenden, the Australian physician who had developed a test for EPO; in the interview, Dr. Ashenden went on the record as saying he believed, after testing a sample of Armstrong’s blood during his 1999 Tour de France win, that Armstrong had been doping. This was original, long-form reporting, and the resulting 13,000-word interview became a public rallying point for cyclists who believed not merely that Armstrong had cheated his way to those wins, but that the professional sports journalism world was far too willing to look the other way. NY Velocity’s founders were willing to pursue a lead tenaciously and publicly; not only were they completely vindicated in their suspicions, but they also demonstrated that professional journalists may simply not be covering a story well enough and that dedicated and knowledgeable insiders can sometimes fill this gap.

  • To take another intersection of traditional practice and new capability, consider the way the ability to assemble groups has changed creating a story. Huffington Post’s 2008 reporting project was able to cover every site of the Iowa caucuses because it could dispatch a volunteer to each site for an hour or two, something that would have been too expensive with freelancers and too travel-intensive for full-time staff. The volunteers for Off the Bus were not the people creating the eventual report on the caucuses—the project was instead a hybrid of distributed reporting and centralized writing of the story; it was, in a way, a return to the old separation of reporters in the field and rewrite men in offices close to the machine.

  • Still another cross-tab of existing jobs and new resources is the way a story can be told by letting machines do some of the telling. Several projects using Ushahidi, the “crisis mapping” tool, have crossed over from “resource for recovering from a crisis” to “resource for understanding the crisis as it happens.” Ushahidi has been used to create real-time maps of voter intimidation, street harassment, radiation levels and snow removal—every use of Ushahidi for newsworthy events is an example of machines altering how data are collected, collated and displayed.
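
A minimal sketch of the collect, collate and display pipeline behind such a crisis map might look like the following; the report fields and grid size are our own assumptions, not Ushahidi’s actual schema.

```python
from collections import defaultdict

# Minimal sketch of the collect/collate/display pipeline behind a
# crisis map. Report fields are hypothetical, not Ushahidi's schema.
reports = [
    {"lat": 40.714, "lon": -74.006, "category": "flooding"},
    {"lat": 40.715, "lon": -74.004, "category": "flooding"},
    {"lat": 40.730, "lon": -73.990, "category": "power outage"},
]

def collate(reports, cell=0.01):
    """Bin geotagged citizen reports into map-grid cells per category."""
    grid = defaultdict(int)
    for r in reports:
        key = (round(r["lat"] / cell) * cell,
               round(r["lon"] / cell) * cell,
               r["category"])
        grid[key] += 1
    return grid

# Each (lat, lon, category) -> count entry becomes one marker on the map.
```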

Every one of the core activities of getting, making and telling is being altered by new ways of involving individuals, groups and machines. As noted in Section 2, the significance and pervasiveness of these alterations is likely to defeat institutions’ ability to integrate change slowly. Many of the recommendations in this section are thus echoes of those from the section on institutions; when they are repeated here, it is with greater emphasis on the way that using these new resources and capabilities means adaptation to an altered ecosystem.

News as an Import-Export Business

One way to think about an ecosystem is to ask what flows between its participants. As noted, flows in the 20th century were relatively linear and predictable; where there was significant complexity in flows of information, they tended to be embedded in highly specified business dealings, as with the use of syndicated or wire service copy.

The value of an Associated Press story to an individual newspaper was reflected in the interest of local readers; a subscription to the AP was justified when that interest helped the paper generate more in ad revenue than the feed cost.

This was a system where flows of business value were specified in bilateral agreements and priced in dollars—a newspaper signs an agreement with the AP in return for access to its feed. Compare that to the Huffington Post’s original model: the realization that some of HuffPo’s published material could excerpt existing stories, add commentary, and produce an economically viable new product. Fair use has existed in this form for decades; what changed was the conditions of the ecosystem. HuffPo management realized that fair use, as applied on the web, meant that, in essence, everything is a wire service and that excerpting and commenting on unique content from the Washington Post or the New York Times was actually more valuable to readers than contracting with the AP or Thomson Reuters.

The Huffington Post has often been criticized for this stance, but this is shooting the messenger—what it did was to understand how existing law and new technology intersected. The AP itself is experimenting with holding back key stories from its subscribers, in a bid to acquire more direct traffic. Similarly, the AP’s case against Shepard Fairey, an artist who created an iconic image of Barack Obama as a derivative work from an AP photo, hinged on the idea that the AP had the right to photograph Obama without his permission but that Fairey couldn’t use that likeness to create a related one. In the Fairey case, there was no objective reality on which the case could be adjudicated—there was simply a set of legal doctrines.

The old ethic was described by Terry Heaton in a post entitled “Why don’t we trust the press?”:

Nobody ever mentions anybody else in the world of news gathering unless a copyright claim forces it. Before the Web, this was understandable, because as far as anybody knew, our reporters had all the angles on everything. The idea that the guy across town had it first was irrelevant, so why mention it? As far as our viewers or readers were concerned, we were the font of all knowledge. Besides, we had the time to gather everything we needed anyway. It was the world of the “finished” news product. But now, with news in real time, everybody can clearly see stories develop across all sources. We know who got it first. We know when something is exclusive. Our hype is just nonsense.

It has become obvious, in the new news ecosystem, that the notion of everyone producing a finished product from scratch is simply not the normal case. We are each other’s externalities. This has always been the case to some degree—newspapers famously helped set the agenda for broadcast media in the 20th century—but it was often hidden in the manner Heaton describes. The explosion of sources and the collapse in the cost of access have made the networked aspect of news more salient. The tech site Slashdot was clearly a source of story ideas for the New York Times’ Science Times section; Boing Boing sends traffic to obscure but interesting websites, which often become fodder for stories elsewhere; and so on.

In some ways, the ecosystemic aggregation, inspiration, excerpting and even wholesale ripping-off of journalistic content marks a return to earlier ages of newsgathering, in which country newspapers often consisted of little more than week-old stories copied from metropolitan dailies. The ability to aggregate news, 18th-century style, was due in part to a lack of institutional norms (was reprinting news “illegal”? Few editors probably thought of it in those terms) and in part to technology (few people in New York City would ever see a newspaper from rural Kentucky). The idea that news could be syndicated, for a fee, is a relatively new concept in journalistic history.

The syndication model that existed under the 20th-century news production regime thus isn’t coming under pressure because of bad actors, but because the basic configuration of the media landscape has changed dramatically. In the old model, reuse of material was either contractual (freelancers, wire services) or hidden. In the new model (old models, really), there are many forms of reuse; some are contractual, but most are not. The AP is a particularly visible case, but every news institution is going to have to position or reposition itself relative to new externalities in the ecosystem.

The spectrum of the exchange of value between individuals and organizations is enormous and highly gradated—there is now an institutional imperative to get good at developing the partnerships, formal and informal, that have become possible in the new ecosystem. To take one recent example, important both in itself and for what it says about the changing world, the ability to translate written and spoken material has become dramatically easier and cheaper.

Automated translation tools are far better today than they were even five years ago, as with the use of Google Translate by English speakers to read Arabic tweets; crowdsourced translation, as with dotSub or the TedTalks translators, can convert astonishing amounts of material in short periods; and institutions given to consistently bridging linguistic and cultural gaps, such as Meedan or ChinaSmack, are on the rise. Every institution in the world now faces two strategic choices: first, when, and out of what languages, do we begin translating primary source material or existing reporting to present to our audience; and, second, when, and into what languages, do we translate our own material to attempt to reach new audiences. Imagining news as a linguistic import-export business, investing in importing from Arabic into English, at potentially all levels of the cost-quality trade-off, could be valuable for any U.S. newsroom that wants to cover geopolitics, while, given the demographic trends in the United States, investment in exporting from English to Spanish could add huge value in audience acquisition and retention.

Recommendation: Get Good at Working with Partners

There is a famous photo, from the 2008 Olympics, of a phalanx of sports photographers on a platform, all angling to get what is essentially the identical shot of Michael Phelps. The redundancy pictured is staggering. There is something like half a million dollars’ worth of gear committed to capturing a single point of view, and worse is the human cost of dozens of talented photojournalists competing for minimal incremental value.

This sort of competition, where every institution has to cover the same thing in only slightly different ways, was absurd even when those organizations were flush. Now, with many resources gone and more going, it is also damaging.

News institutions need to get better at partnering with individuals, organizations, even loose networks, both to expand their purview and reduce their costs. Successful examples range from the New York Times/WNYC SchoolBook partnership, designed to improve education coverage for both participants, to the aforementioned WikiLeaks and Dollars for Docs examples, to arm’s length use of online data hosted by the Sunlight Foundation or Data.gov. In particular, finding ways to use and acknowledge the work of such partners without needing to fit everything into a “source or vendor” category would expand the range of possible collaborations.

Recommendation: Figure Out How to Use Work Systematized by Others

This is a subset of the previous recommendation. We are seeing a huge increase in structured data (data that come in a highly ordered and well-described fashion, such as a database), and a related increase in APIs (application programming interfaces, a systematic form of machine-to-machine conversation). Taken together, this means a potential rise in collaboration without cooperation, where a news outlet builds on data or interfaces made available elsewhere, without needing to ask the institution hosting the data for help or permission.
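
A minimal sketch of what collaboration without cooperation looks like in practice, assuming a hypothetical public endpoint that serves structured JSON:

```python
import json
import urllib.request

# Pulling structured data from another organization's public API,
# without asking for help or permission. The URL and field names
# are hypothetical.
URL = "https://data.example.gov/api/contracts?year=2012&format=json"

with urllib.request.urlopen(URL) as resp:
    records = json.load(resp)  # assume a list of records

# Because the data are structured, a newsroom can filter and rank
# immediately instead of re-keying documents by hand.
biggest = sorted(records, key=lambda r: r.get("amount", 0), reverse=True)[:10]
for r in biggest:
    print(r.get("vendor"), r.get("amount"))
```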

This is obviously valuable, as it provides low-cost, high-quality access to previously unavailable source material. As with so many new capabilities in the current environment, however, structured data and API access are not new tools for doing things the old way. These are tools whose adoption alters the organization that uses them.

The most obvious obstacles to taking advantage of work systematized by others are the technical skills and outlook required to use it. This problem, fortunately, is easing, as tools like Many Eyes and Fusion Tables make it easier for less tech-savvy people to explore large data sets looking for patterns. Even with this improvement, however, there is a need for basic numeracy among journalists, something we’ve taken to calling the “Final Cut vs. Excel” problem: journalism schools are more likely to teach tools related to basic video production than to basic data exploration.
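
The “Excel” side of that trade-off is modest: a few lines of exploratory analysis over a structured data set, of the kind any numerate reporter could run. The file and column names below are hypothetical.

```python
import pandas as pd

# Basic data exploration: load, summarize, look for outliers and
# concentrations. File and column names are hypothetical.
df = pd.read_csv("payments.csv")

print(df.describe())  # ranges, means, outliers at a glance

# Which companies account for the most money?
totals = df.groupby("company")["amount"].sum()
print(totals.sort_values(ascending=False).head(10))
```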

This emphasis on tools for presentation over investigation is most acutely a problem in the nation’s journalism schools, of course, but it is widespread in the industry. (As Bethany McLean of Vanity Fair said to us, “Anyone who’s good at understanding corporate balance sheets is likelier to work on Wall Street than cover it.”)

The subtler obstacles are cultural—using work systematized by others requires overcoming Not Invented Here syndrome and accepting that a higher degree of integration with outside organizations will be necessary to take advantage of new sources of data. A related obstacle: data and APIs are often freely available, but the hosting organizations want credit for helping to create something of value. This imperative pushes against the aforementioned tendency not to credit others publicly.

This logic is not just about using others’ work, of course. News organizations should do a better job of making their work systematically available to other organizations for reuse, whether by sharing data or by sharing tools and techniques. There will always be a tension between competitive and cooperative logic in the news ecosystem, but in the current environment, the cost of not undertaking shared effort has gone up, the cost of lightweight collaboration has gone down considerably, and the value of working alone has fallen.

As noted in Section 2, presence of process is often a greater obstacle to change than absence of resources. Taking advantage of work systematized by others and figuring out ways of making your work systematically useful to others are ways to do higher quality work at lower cost, but doing so requires an organization to start treating the newsroom like an import-export business, rather than an industrial shop floor.

Self-definition as Competitive Advantage

There is no solution to the present crisis. One corollary is that there is no stable state coming to the practice of news any time soon. We are not living through a transition from A to B (Walter Cronkite to Baratunde Thurston, say) but a transition from one to many, from a world where Cronkite could represent some central focus to a world with a riot of competing voices—Thurston and Rachel Maddow and Juan Cole and Andy Carvin and Solana Larsen as a few members of a cast of millions. We’ve seen this in microcosm—the transition from broadcast to cable networks on TV or, as a less popular example, from terrestrial to satellite radio, led to a shift from networks that catered to a broad swath of people to highly specific niches (Comedy Central, Food, and, on satellite radio, not just blues music but Delta blues or Chicago blues).

Linking is the basic technological affordance of the web, the feature that sets it apart from other forms of publishing, because it says to the user: “If you want to see more on the topic being discussed, you can find related material here.” It is a way of respecting the users’ interests and ability to follow the story on their own.

In the practice of news, the most basic form of linking is to source materials. A discussion of a recent indictment should link to the text of that indictment. A discussion of a scientific article should link to that article. A piece about a funny video should link to that video (or, better, embed it).

This is not sophisticated digital strategy—it is core communicative ethics, yet it is disturbing that so many journalistic outlets fail this basic test. At fault are the usual cultural obstacles (as with Terry Heaton’s observations about not giving credit), ingrained habits (news desks used to be limited by space or time constraints to excerpting source materials), and commercial concern about sending readers elsewhere.

None of these obstacles, though, merits much sympathy. The habit of not giving credit, while widely practiced, is plainly unethical. The web no longer feels novel to the audience; it’s well past time for its core practice to be internalized by journalists. And refusing to link for commercial reasons may make sense to the ad sales department, but it should horrify anyone whose job involves public service.

The public value of linking to source materials is so obvious, and so easy, that organizations that refuse to do it are announcing little more than contempt for the audience and for the ethical norms of public communication.

The internet, of course, provides infinite potential variety, making the argument in favor of niche audiences (and niche loyalty) strong here as well. In addition, the old logic of geographic segmentation of local coverage allowed news outlets to buy wire service news or syndicated packages, secure in the knowledge that their audience wouldn’t see the same content published or aired in a neighboring town. With the rise of search as an essential form of finding content, however, the average user now has access to thousands of sources for the story of the Somali pirates, the vast majority of which are drawn from the same wire service copy.

This creates a new imperative for news organizations, for which the strategy of “We are all things to all people in a 30-mile radius” is no longer effective. There are useful services to be rendered by hyperlocal organizations (the St. Louis Beacon, the Broward Bulldog), others by hyperglobal ones (the New York Times, the BBC), others still by highly specialized niche sites of analysis (Naked Capitalism, ScienceBlogs), and so on.

This is a breadth vs. depth trade-off. The web creates a huge increase in diversity over a world dominated by broadcast and print media. More recently, an increasing amount of news is flowing through social media sites, especially Twitter and Facebook; the growing dominance of the social spread of news and commentary further erodes the ability of any one site to produce an omnibus news package.

There is a place for rapidly produced, short pieces of breaking news. There is a place for moderately quickly produced analysis of moderate length (the first draft of history). There is a place for careful, detailed analysis by insiders, for insiders. There is a place for impressionistic, long-form looks at the world far away from the daily confusion of breaking news. And so on. Not many organizations, however, can pursue more than a few of these modes effectively, and none can do all of them for every subject its audience cares about.

This is partly because institutions always face breadth vs. depth trade-offs, but the internet has made them considerably worse—masses are bigger, as with the spread of the news of Michael Jackson’s death. Niches are nichier—coverage of mortgage issues at Lenderama, or Latino youth issues at Borderzine. The fastest news can be faster—the White House announcement of Osama bin Laden’s death was prefigured on Twitter more than once by independent sources.

Recommendation: Give Up on Trying to Keep Brand Imprimatur while Hollowing Out Product

This is principally a negative recommendation.

Two things that have changed dramatically in the past decade are the value of reputation (higher) and the cost of production (lower). So many sources of news are now available that any publication with a reputation for accuracy, probity or rigor has an advantage over the run-of-the-mill competition. However, digital tools have also dramatically reduced the cost of finding and publishing information, leading to a profusion of outlets that publish by the ton.

It is tempting for the publications with the good reputations to combine these two changes, to find some way to extend their reputation for high quality over new low-cost, high-volume efforts. This was the rationale that led to the creation of the Washington Post’s blogPost aggregation and commentary feature, made famous when Elizabeth Flock resigned after being reprimanded for not having attributed some of the material she was aggregating.

It’s worth quoting from the column the Post’s ombudsman, Patrick B. Pexton, wrote after she resigned:

Flock resigned voluntarily. She said that the [two] mistakes were hers. She said it was only a matter of time before she made a third one; the pressures were just too great.

But The Post failed her as much as she failed The Post. I spoke with several young bloggers at The Post this week, and some who have left in recent months, and they had the same critique.

They said that they felt as if they were out there alone in digital land, under high pressure to get web hits, with no training, little guidance or mentoring and sparse editing. Guidelines for aggregating stories are almost nonexistent, they said.

Flock and her fellow aggregators were caught between the commodity news logic of an aggregation site and the Post’s brand, a tension that also showed up in the New Yorker providing a platform for Jonah Lehrer’s recycled content; as Julie Bosman noted in the New York Times, the magazine’s “famed fact-checking department is geared toward print, not the web.” It also appeared in the Journatic scandal, where fake bylines were added to stories written by overseas freelancers.

In all of these cases, the temptation is to place a low-cost process under a high-value brand. It’s clear that rapid commodification of ordinary news is not just inevitable but desirable, to free up resources for more complex work elsewhere. It’s also clear that the temptation to make commodity news look like its non-commodified counterpart is also significant, even for institutions as august as the Post and the New Yorker.

Basic respect for the journalistic effort demands that you give people doing commodity work clear guidelines about what is and isn’t permissible. Basic respect for your audience demands that it be given clear guidelines about the source and process of news.

Offering “breaking news from around the web” and asking people in the Philippines to write what is essentially standard copy, given a particular set of facts, are both useful strategies. But presenting their output as no different from more aggressively researched, composed and checked stories creates both short- and long-term risks that are not worth the momentary arbitrage opportunity of marrying a good brand with cheap content.

The change in the ecosystem here is that functions previously executed among competitive news organizations, and especially scoops and breaking news, are now taken over by platforms. Any given news organization may set itself up to be faster at breaking sports news than Deadspin, say, or faster at breaking tech news than Scobleizer, but no organization today can consistently beat Facebook or Twitter on speed or spread.

One final observation: A core thesis of this essay is that the country’s news organizations are no longer adequate to ensuring coverage of the news on their own. This puts existing institutions in the awkward spot of needing to defend or even improve parts of the current ecosystem from which they may not profit, and which may benefit their competitors.

Were news organizations merely commercial entities, this would be impossible—Best Buy has little interest in improving the electronics ecosystem in ways that might benefit Amazon or Wal-Mart. News organizations, however, are not merely commercial entities. They are instead constituted to shield newsroom employees from most of the business questions a paper faces (however imperfect such “Chinese walls” turn out to be in practice). Indeed, if news organizations were not sources of such tremendous civic value, separate from the logic of the market, their commercial senescence would make no more difference than the closing of the local travel agent’s office.

Given this, and given the need for post-industrial journalism that makes considerably better use of an hour of a journalist’s time or a dollar of an institution’s money, news institutions large and small, commercial and nonprofit, executional and educational, should commit themselves to two changes in the current ecosystem.

Recommendation: Demand that Businesses and Governments Release Their Data Cleanly

The most valuable dollar a news organization can make is the dollar it doesn’t have to spend, and in the 21st century, the easiest dollar not to spend is the dollar spent gathering data. In keeping with our recommendation that news organizations should shift some of their priorities from covering secrets to covering mysteries, anyone who deals with governments or businesses should demand that publicly relevant data be released in a timely, interpretable and accessible way.

Timely means that the data should be made available soon after being created. It is of far less value to know what committee recommendations were after a piece of legislation has gone up for a vote. Interpretable data come in a structured and usable format. Data should be made available in flexible formats, such as XML, and not inflexible ones, like PDF. (Indeed, using a format like PDF for publishing is often a clue that an organization has something to hide.) Accessible means that the data are made readily available over the public internet, instead of being kept on paper or made available by request only. The FCC’s ruling that broadcast outlets had to publish their political ad records online, rather than keeping them available “for inspection” at the station, was a big improvement in accessibility.
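
To see why “interpretable” matters, compare the effort involved: a structured release can be consumed in a few lines of code, where a PDF demands error-prone scraping. A sketch, assuming a hypothetical XML feed of committee recommendations:

```python
import xml.etree.ElementTree as ET

# Reading a structured (XML) release takes a few lines; the element
# names here are hypothetical.
tree = ET.parse("committee_recommendations.xml")
for rec in tree.getroot().iter("recommendation"):
    print(rec.findtext("bill"), rec.findtext("position"), rec.findtext("date"))

# The PDF equivalent would require OCR or layout-aware scraping, with
# per-document error checking -- cost that structured data avoids.
```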

Every news outlet should commit some resources, however small, to taking an activist stance on this issue. Better access to better data is one of the few things that would be an obvious improvement for the news ecosystem, one where the principal obstacle is not cost but inertia, and one where the news organizations’ advantage in creating improvement is not expenditure of resources but moral suasion.

Recommendation: Recognize and Reward Collaboration

Organizations that offer grants and rewards provide a signaling mechanism for how practitioners of journalism should regard themselves and their peers.

These organizations should begin offering grants, or create award criteria or categories, that reward collaboration, either explicit, as in the case of SchoolBook, or implicit, as with organizations that provide access to their data for reuse by others, as Dollars for Docs does. Similarly, awards for successful reuse of a reporting template—for example, other news organizations ferreting out Bell, Calif.-style corruption—would help alter the current valorization of handcrafted work that tends not to be repeatable, even when the reporting uncovers a potentially widespread problem. It was a huge loss for the nation that no organization undertook a systematic look at other states’ nursing boards after the California scandal, or a harder look for off-balance-sheet vehicles after Bethany McLean wrote about Enron.

McLean noted, in an interview for this report, that a key part of her ability to study Enron was cultivating skeptics as sources—her initial interest came after a short seller characterized Enron’s financial statements as incomprehensible. This might seem like an obvious strategy, but few in the business press followed it, either before the fall of Enron or, far more alarmingly, even afterward.

Organizations that shape assumed community norms among journalists and editors should highlight efforts that build on previous work. As with all grants and awards, these changes will reach only a few institutions directly but will reach many indirectly, by communicating the kinds of work that might reap either commercially unconstrained funds, the admiration of one’s peers, or both.