
If Truth Be Told: Part 3

May 30, 2014

On June 24, 2014, PublicAffairs, an imprint of the Perseus Books Group, will publish "935 Lies: The Future of Truth and the Decline of America’s Moral Integrity." The book is the culmination of a nine-year reporting and writing effort by Charles Lewis, founder and executive editor of the Investigative Reporting Workshop. In a three-part series written exclusively for the Workshop, Lewis expands on some of the book’s major themes, including how our national integrity has been eroded and how a relative handful of reporters and other truth-seekers have tried to fight back in an increasingly hostile media environment. Part 1 of the series looks at the historical trends that helped create some of these problems. Part 2 looks at the current landscape, while Part 3 explores the outlook for change in the near and distant future. More information about "935 Lies," as well as Lewis’s other projects, is available at www.charles-lewis.com.

Last in a three-part series; to read earlier essays: Part 1; Part 2.

Earlier this year, media critics puzzled over the implications of Quakebot, a computer algorithm that Los Angeles Times journalist Ken Schwencke had written to extract certain earthquake data from the U.S. Geological Survey and paste it into a pre-written template ready for publication. What particularly earned the critics’ attention was that news of a magnitude 4.4 temblor was posted online — following Schwencke’s review of the text — a mere three minutes after the early-morning quake roused him from sleep.

Quakebot, which was crafted some two years ago, isn’t the first media foray into “robo-journalism”; in fact, Schwencke and his cohorts on the Times’s data desk had previously built a tool to extract local homicide information from coroner reports. But the speed with which news of the March 17 quake was published left some wondering whether the dwindling ranks of journalists would be further imperiled by computers — an idea Schwencke, for one, dismissed as unfounded. “The way we use it, it’s supplemental,” he told Slate magazine. “It saves people a lot of time, and for certain types of stories, it gets the information out there in usually about as good a way as anybody else would. The way I see it is, it doesn’t eliminate anybody’s job as much as it makes everybody’s job more interesting.”
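The underlying pattern is simple enough to sketch. What follows is a minimal illustration of a Quakebot-style pipeline in Python, not the Times’s actual code: it polls the U.S. Geological Survey’s public GeoJSON summary feed, keeps quakes above a threshold, and fills a prewritten template. The magnitude cutoff and the template wording are assumptions made for the example.

```python
# A minimal Quakebot-style sketch (illustrative, not the Times's code):
# poll the USGS earthquake feed, keep significant quakes, fill a template.
import json
import urllib.request
from datetime import datetime, timezone

# USGS's public GeoJSON summary feed of quakes from the past hour.
FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

# The wording and the 3.0 threshold below are assumptions for this sketch.
TEMPLATE = ("A magnitude {mag:.1f} earthquake struck {place} at {time} UTC, "
            "according to the U.S. Geological Survey.")

def draft_quake_stories(min_magnitude=3.0):
    """Return draft story text for each quake at or above the threshold."""
    with urllib.request.urlopen(FEED) as resp:
        feed = json.load(resp)
    drafts = []
    for feature in feed["features"]:
        props = feature["properties"]
        mag = props.get("mag")
        if mag is None or mag < min_magnitude:
            continue
        # USGS timestamps are milliseconds since the Unix epoch.
        when = datetime.fromtimestamp(props["time"] / 1000, tz=timezone.utc)
        drafts.append(TEMPLATE.format(mag=mag, place=props["place"],
                                      time=when.strftime("%H:%M, %B %d, %Y")))
    return drafts  # an editor would review each draft before publication

if __name__ == "__main__":
    for story in draft_quake_stories():
        print(story)
```

The crucial step Schwencke describes, a human reviewing the draft before publication, happens outside the script, which is precisely why he calls the tool supplemental.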

That conclusion may not quite jibe with the outlook at Forbes magazine and other periodicals, which now publish financial data-related stories with the curious byline “by Narrative Science.” According to the Chicago-based company, its trademarked technology, named Quill, integrates artificial intelligence and Big Data analytics in a novel way: “It can transform data into stories that are indistinguishable from those authored by people.” According to Emily Bell of the Guardian, stories generated by Narrative Science “will not be winning many Pulitzers, but [they] certainly pass the Turing Test of making one unsure whether they were written by a person or machine. . . . One could be forgiven for thinking that the apocalypse has already dropped off its bags, if not actually arrived.”
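Quill itself is proprietary, but the core idea of data-to-text generation can be shown in miniature: compute facts from raw figures, then map those facts to phrasing. The toy sketch below is invented for illustration; the company name, the figures and the wording rules are all assumptions, and Narrative Science’s actual system is vastly more sophisticated.

```python
# A toy data-to-text sketch: derive a fact from two revenue figures,
# then pick the phrasing that fact supports. Everything here is invented;
# it only illustrates the general technique behind systems like Quill.

def earnings_sentence(company, quarter, revenue, prior_revenue):
    """Turn two revenue figures into a one-sentence earnings item."""
    change = (revenue - prior_revenue) / prior_revenue
    if change == 0:
        return (f"{company} reported {quarter} revenue of ${revenue:,.0f}, "
                f"flat from the prior quarter.")
    if change > 0.10:
        verb = "surged"        # wording thresholds are arbitrary choices
    elif change > 0:
        verb = "edged up"
    else:
        verb = "fell"
    return (f"{company} reported {quarter} revenue of ${revenue:,.0f}, "
            f"which {verb} {abs(change):.1%} from the prior quarter.")

# Hypothetical figures for a hypothetical company:
print(earnings_sentence("Acme Corp.", "third-quarter", 1_250_000, 1_000_000))
# -> Acme Corp. reported third-quarter revenue of $1,250,000,
#    which surged 25.0% from the prior quarter.
```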

But Kristian Hammond, the co-founder and chief technology officer of Narrative Science, heartily disputes the contentions that computer-generated journalism is fraught with doom and unworthy of journalistic recognition: In 2011, Wired magazine reported that Hammond told a small conference of journalists and tech folks he believed a computer would win a Pulitzer Prize within five years. And when asked what percentage of news will be written by computers in 15 years, he boasted, “More than 90 percent.”

Among those on hand for Hammond’s audacious comments was Sarah Cohen, who shared a Pulitzer Prize for Investigative Reporting at The Washington Post, joined The New York Times in 2012 and now, as an editor, oversees its computer-assisted reporting projects. “I’ve found folks in those worlds really take a dim view of journalism in general,” she told me. “They kind of are like, ‘what’s so hard about that?’ and think it’s for dopes. Then they use it.” After hearing Hammond’s exchange, she noted that the ability of companies like Narrative Science to create a narrative was “based partly on scraping headlines so that they know what the tone should be. No one asked: What happens when they meet their 90 percent goal in 15 years to be the source of news and there are no more headlines to scrape?”

The new journalism ecosystem

And therein lies the rub: The proliferation of new technologies may compromise the integrity of the newsgathering business, as web-crawling machines analyze vast datasets and human decision-making gives way to automated algorithms that spit out “investigative” reports; at the same time, however, such technological developments offer journalists possibilities that may dramatically enhance their storytelling. As a result, a veteran reporter like Cohen, who is loath ever to trade her colleagues for automated bots, is still very much aware of the potential benefits of Big Data technologies. “I hope that we can use some of the methods they use for analytics in order to do our investigations — maybe we can borrow the tools of machine learning, linguistics, pattern recognition,” she told me.

Of course, Cohen and generations of journalists before her have toiled in newsrooms where investigative reporters have been turned loose on stories that no machines could ever emulate, no matter how sophisticated their algorithms, no matter how deeply they dive into the web, no matter how many terabytes of storage they’re outfitted with. For example, it takes the boldness, cunning and bravery of a Nellie Bly, as described in Part 1 of this series, to fake her way into an asylum to expose the mistreatment of patients. It takes the outrage and finesse of a Washington Post investigative reporter like Morton Mintz to expose the unapproved domestic use of the German anti-nausea drug thalidomide, which, when taken in the first trimester of pregnancy, resulted in babies born without one or more limbs. And it takes the sources, savvy and boundless horizons of a reporter like the Post’s Dana Priest to expose the secret CIA prisons in Eastern Europe, Thailand, and elsewhere — a story referenced in Part 2.

I’ve had the good fortune to conduct videotaped interviews with Mintz and Priest — as well as some two dozen other legendary journalists — for my website Investigating Power, and the how-I-did-it tales they told make clear that we will forever need such talent if our democracy is to flourish. By my estimation, no supercomputer is quite up to the task.

In our interview, Priest told me that the CIA “black sites” story took her more than two years to figure out, as she pulled on one strand after another in hopes of tying things together. “It started as a tiny, little piece of the puzzle, because you don’t even have a context; you don’t even know what you’re seeing. And then you get another one, and another one. And now you’re thinking, ‘Well, can we describe a system here? How big is it? How many people are in it?’ And every single story along the way was very painstakingly done and took forever, which was the luxury that I had.”

And she also had what no machine has yet been programmed with: instinct and luck. For example, the chase began in part with help from a foreign reporter, who knew an employee of an obscure Pakistani airport able to provide a tidbit of information — a detail of no consequence to that reporter’s readers, but one that Priest picked up on as a piece of the puzzle she was trying to construct. The airplane tail number she gleaned from that foreign newspaper was put into a database, which led her to a couple of companies, which in turn led her to other revelations, which cumulatively helped bushwhack an editorial trail that ended with a front-page blockbuster worthy of the 2006 Pulitzer Prize for Beat Reporting. “It took a lot of time,” she says, “but pretty soon we started peeling back more of the onion….”

Sadly, few reporters these days are provided the time to poke around in endless alleys, no matter how blind or fruitful they may ultimately prove to be. Blame that on the crumbling economic foundation of our legacy newspapers, which I’ve detailed in previous installments of this series. What’s clear to me is that without the financial resources required to undertake no-stone-unturned investigative journalism, we risk being deprived of the game-changing stories that have lifted the veil on political corruption, industry malfeasance, social injustices and unbridled hypocrisy. Imagine the societal consequences of a media no longer able to reveal the likes of the Watergate scandal, secret domestic eavesdropping, the My Lai massacre during the Vietnam War, the mistreatment of wounded veterans at Walter Reed Army Medical Center, sexual abuse by priests, backdated stock options, the perils of concussions in football, or the revelations that retired generals, working as radio and TV “analysts,” had in fact been recruited by the Pentagon to make the case for the war in Iraq.

Journalist Sarah Cohen works on data-driven stories at The New York Times. (Photo by Claudia Costa, datadrivenjournalism.net)

But if the traditional media no longer have the wherewithal — or perhaps the gumption — to undertake such critical reporting, we’re thankfully witnessing the emergence of what I call the new journalism ecosystem — a development I describe in depth in Chapter 8 of "935 Lies." In this new framework, the traditional role of our legacy media is augmented by — or in some cases replaced by — nonprofit enterprises with the time, talent and financial resources to produce unimpeachable investigative journalism. We’ve seen the results of this paradigm shift from such pioneers of nonprofit reporting as the Center for Investigative Reporting and the Center for Public Integrity, which between them have amassed dozens of national journalism awards. We’re witnessing the emergence of scores of local, regional and special-interest watchdogs, which have either partnered with traditional media or filled the void left by these news organizations’ editorial retreat. And there are now 18 university-affiliated reporting projects, the largest of which is our Workshop at American University.

I have never believed, however, that the search for truth should be the exclusive preserve of reporters. After all, in this still-embryonic digital age, we live in a more connected, potentially more collaborative, world in which everyone with access to the Web can publish information and thus be a “citizen journalist.” And to extend that concept, imagine a world in which nongovernmental-organization researchers, public-interest activists, corporate investigators, forensic accountants, political scientists, investigative historians, public anthropologists and veteran journalists are at times all looking in the same places. Imagine that, to varying degrees, they are all utilizing the same new data technologies, analytics and other intellectual cross-pollination possibilities, exchanging ideas and sometimes working and writing together, whether side by side or across cyberspace borders.

These are collaborative, 21st-century fact-finders, fact-checkers and, more broadly, truth-tellers, searching for information and its verification, each coming from different perspectives, interests, educational backgrounds and professional expertise, as well as culturally diverse geographic and economic circumstances. But despite their differences, they have much in common: They are all intrinsically curious and have an inordinate amount of patience, determination and mettle. They are willing to persevere in their quest for answers for months, years and sometimes even decades, if necessary.

Our truth problem

I find this fledgling global community of interest in verifiable knowledge and understanding to be exciting and auspicious, both for the future of truth and, more narrowly, for the future of journalism. For it is in our common interest, as citizens of a representative democracy, to be reasonably well-informed and able to distinguish fact from fiction. We therefore share a need to know the basic truth of the matter, whatever that matter may be.

Unfortunately, our individual and collective obligations to seek out the truth have been steadily eroded in recent decades. As our politics have become more garish and circus-like, with the art of compromise torpedoed by unyielding, self-serving ideologues, anger toward and distrust of the government are at historically high levels. Yet even as ire over how the country is governed has risen, political participation remains anemic. And beyond this troubling public disengagement, surveys reveal an appalling level of ignorance about U.S. history, geography, science and current affairs. As Watergate chronicler Carl Bernstein lamented in a 2007 interview for my Investigating Power website, “the spectacle, and the triumph, of the idiot culture” that he described in a 1992 article for The New Republic magazine has only proliferated.

So the bottom line is that until we straighten out our truth problem, we’ll continue to blunder along, easily misled by politicians and the corporations that sponsor them. Until the public is more engaged, better informed and able to know the real-time truth about the pressing issues of the day — knowledge that comes largely through high-quality, in-depth reporting, whether via our legacy media or new incarnations of independent journalism — there can be no real democracy. After all, if we cannot clearly see and broadly acknowledge our unvarnished reality as a people today, then what chance do we have to act responsibly tomorrow? For as the Old Testament Book of Proverbs put it more than two millennia ago, “Where there is no vision, the people perish.”



