Originally published here, March/April, 2012
The choreography of a typical human rights investigation goes like this: Researchers interview victims and witnesses and write their report. The local media cover it — if they can. Then those accused dismiss it; you have nothing more than stories, it’s one word against another, the sources are biased, the evidence faked. And it goes away.
On March 13, 2002, in a courtroom in The Hague, something different happened. In the trial of Slobodan Milosevic, Patrick Ball, an American statistician, presented numbers to support the case that Milosevic had pursued a deliberate policy of ethnic cleansing. “We find evidence consistent with the hypothesis that Yugoslav forces forced people from their homes, forced Albanian Kosovars from their homes, and killed people,” Ball said.
Ball made this statement under cross-examination by Milosevic’s lawyer, who was, in fact, Milosevic himself. Over two days, the former president of Yugoslavia used his time to rage at Ball: The evidence was fabricated. The organizations that gathered the data were anti-Serb, trying to “galvanize public opinion and raise hostility against the Serbs and the desire to punish them,” Milosevic insisted. War is chaos, he said — how can you be so simplistic as to think that outcomes have a single cause? Why didn’t you examine Serb refugee flows? How can you, a self-described supporter of international law, be considered objective?
These were the usual arguments. They seldom persuade, but their mere existence creates a counterweight to the accusations made by human rights groups: Someone who wants to claim it’s their word against ours now has something to grasp. But Ball offered far more as evidence than interviews with Albanians who had fled their villages. He had obtained records from Kosovo’s borders of who left and when. He had exhumation data and a wealth of information about the displaced. In short, he had numbers.
Traditionally, human rights work has been more akin to investigative reporting, but Ball is the most influential of a handful of people around the world who see that world not in terms of words, but of figures. His specialty is applying quantitative analysis to mountains of anecdotes, finding the correlations that coax out a story that cannot easily be dismissed.
Could the movements of refugees have been random? No, Ball said. He had also plotted killings of Kosovars and found that both phenomena occurred at the same times and in the same places — flight and death, hand in hand. “I remember well the moment of astonishment that I felt when I saw the killing graph for the first time,” Ball replied to Milosevic. “I assumed I had made an error, because the correlation was so close.”
Something had caused both phenomena, and Ball examined three possibilities. First, the surges in killings and flight did not happen during or shortly after NATO bombings. Nor were they consistent with the pattern of attacks by Albanian guerrilla groups. They were consistent, however, with the third hypothesis, that Serb forces conducted a systematic campaign of killing and expulsions.
In testifying, Ball was doing something other human rights workers can only fantasize about: He confronted the accused, presented him with evidence, and watched him being held to account. At that point, Milosevic in his four wars had killed some 125,000 people, more than anyone in Europe since Stalin. But now the Butcher of the Balkans sat in a courtroom that looked rather like a community college classroom, with two Dutch police officers behind him and his cell waiting for him at the end of each day’s session, rhetorical bluster his only available weapon against Ball’s evidence.
Milosevic died before the trial ended. Ball returned to Washington and then went on to Lima to work for Peru’s Truth and Reconciliation Commission — one of dozens of truth commissions, tribunals, and investigatory bodies where his methods have changed our understanding of war.
BALL IS 46, STOCKY, SHORT, and bearded, with glasses and reddish-brown hair, which he used to wear in a ponytail. His manner is mostly endearing geek. But he is also an evangelist, a true believer in the need to get history right, to tell the truth about suffering and death. Like all evangelists, he can be impatient with people who do not share his priorities; his difficulty suffering fools (a large group, apparently) does not always help his cause.
He didn’t set out to be a human rights statistician. In the 1980s, before studying for his Ph.D. at the University of Michigan, Ball got involved in protests against the Reagan administration’s intervention in Central America. He did more than protest — he headed to Matagalpa, Nicaragua, to pick coffee during the Sandinista years. He hated the work and instead built his coffee cooperative a database to do inventory control.
He first applied statistics to human rights in 1991 in El Salvador. The U.N. Commission on the Truth for El Salvador arose at an auspicious moment — the new practice of collecting comprehensive information about human rights abuses coincided with advances in computing that allowed people with ordinary personal computers to organize and use the data. Statisticians had long done work on human rights — people like William Seltzer, the former head of statistics for the United Nations, and Herb Spirer, a professor and mentor to almost everyone in the field today, had helped organizations choose the right unit of analysis, developed ways to rank countries on various indices, and figured out how to measure compliance with international treaties. But the problem of counting and classifying mass testimony was new.
Ball, working for a Salvadoran human rights group, had started producing statistical summaries of the data the group had collected. The truth commission took notice and ended up using Ball’s model. One of its analyses plotted killings by time and military unit. Killings could then be compared with a list of commanders, making it possible to identify the military officers responsible for the most brutality.
“El Salvador was a transformational moment, from cases, even lots of them, to mass evidence,” Ball says. “The Salvadoran commission was the first one to understand that many, many voices could speak together. After that, every commission had to do something along these lines.”
Since the 1984 publication of Nunca Más, the report of Argentina’s National Commission on the Disappearance of Persons — the first modern truth commission — information about the toll of war and political repression has been routinely collected on a massive scale. Truth commissions from Chile to East Timor have gathered testimonies from tens of thousands of people, a process designed to provide dignity to the victims, identify crimes that require justice or restitution, and write the story of what happened — to try, as much as possible, to give closure to the past. Today, technology has speeded up and broadened the reporting process: Crowdsourced mapping tools collect and plot thousands of text messages from people who have witnessed violence. Ordinary people with cell phones take pictures and videos of atrocities that are uploaded and seen around the world.
But possessing an ocean of testimony is not the same as knowing the truth. No matter how many cases we learn of, they might not be representative of the whole. A truth commission might be scorned by a particular linguistic or ethnic group, which means its members don’t come forward to speak. Fewer media reports of killings might actually mean fewer killings — or it could mean that journalists were intimidated into silence. Human rights groups might record a decline in violence because budget cuts forced them to fire half their outreach team. Rape might never be disclosed. Video collected by cell phones tells us only what was witnessed by people with cell phones.
“I get a tingle the way everyone does when I see video from a cell phone from the Arab Spring,” Ball told me recently. “But let’s not forget that most human rights violations are in secret — no cell phone. It’s easy to think the world is totally different because we have cell phones now. It has changed our understanding of public violence — but most violence isn’t public.”
For Ball, it was such unknowns that led to his most important early insight: “You can do precise statistics about what’s in your database,” he says, “and may be completely wrong about the world.”
Yet we need to know about the world. Those massive databases and advanced technologies mean we now expect answers, and precise ones at that. Journalists, politicians, activists, and citizens all demand numbers and labels. How many have been killed in Darfur or Libya? How many raped in Congo? Who was more deadly in Peru, the army or the guerrillas? Was it ethnic cleansing in Kosovo? Genocide in Guatemala? The answers can tip the balance between intervention and nonintervention, between justice and forgetting. They shape how those who emerge from catastrophe design new governments. They affirm or explode the stories every culture tells about itself.
There are two ways to get those answers. Most often we guess. But those guesses are based on data that can mislead us utterly. People still take the answer seriously, says Ball, “even though the science underneath it may not be any better than in the ’60s and ’70s, when we just made the stuff up. It’s not only easy to misinterpret these numbers; it’s substantially easier to misinterpret them than to interpret them correctly. You look at the graph and say, ‘Now I know what happened in Liberia.’ No, you know what happened to people who talked to the truth commission in Liberia. You can say there were at least 100 people killed. But you can’t say we had five in the north and five in the south and start drawing patterns.”
Ball’s accomplishment has been to provide an alternative to guessing: With statistical methods and the right kind of data, he can make what we know tell us what we don’t know. He has shown human rights groups, truth commissions, and international courts how to take a collection of thousands of testimonies and extract from them the magnitude and pattern of violence — to lift the fog of war.
IN THE LATE 1990s, Ball was commuting between Cape Town and Guatemala City, working for both the South African and Guatemalan truth commissions, immersed in the varying atrocities of apartheid and genocide committed by two very different regimes. Then in June 1998, when the staff of the South African Truth and Reconciliation Commission was writing its report, commissioner Mary Burton asked him what should have been a simple question: How many killings had there been in apartheid South Africa?
“According to the testimonies, 20,000,” said Ball.
“No,” said Burton. “How many were there in all?”
Uh-oh, thought Ball. “I don’t have an answer,” he told Burton.
The commission had been hobbled by a boycott from the Inkatha Freedom Party, the second-largest black political group, which had blocked its members from testifying because its leaders thought the commission was biased toward the African National Congress, with which it was involved in a long, simmering war. But just before the deadline for testifying expired, Inkatha lifted the ban and its members came forward. Those testimonies completely rewrote the commission’s conclusions, as some 8,000 Inkatha members gave evidence — about one-third of the commission’s final total. What, Ball wondered, would have happened if Inkatha hadn’t decided to participate? “Are people really coming forward in the same proportion as they suffered?” he asked himself.
As he was talking to Burton, it hit him: We don’t know what we don’t know. “It wasn’t enough to take good care of the evidence we have,” he says. “We have to transcend what people are willing to tell us. You’re thinking about: What can I fix about the data coming in? But the big problem is: What’s true in the world?”
At the University of Michigan, where Ball studied sociology in a program heavy on statistics, the solution to such problems was clear: Go out and take a random sample. You choose households at random and survey them about what happened. Since your sample is representative of the whole, you can easily extrapolate the results to the larger universe. But this was not something human rights groups knew how to do, and it would have been prohibitively expensive. It was not the answer.
Working on the Guatemalan data, Ball found the answer. He called Fritz Scheuren, a statistician with a long history of involvement in human rights projects. Scheuren reminded Ball that a solution to exactly this problem had been invented in the 19th century to count wildlife. “If you want to find out how many fish are in the pond, you can drain the pond and count them,” Scheuren explained, “but they’ll all be dead. Or you can fish, tag the fish you catch, and throw them back. Then you go another day and fish again. You count how many fish you caught the first day, and the second day, and the number of overlaps.”
The number of overlaps is key. It tells you how representative a sample is. From the overlap, you can calculate how many fish are in the whole pond. (The actual formula is this: Multiply the number of fish caught the first day by the number caught the second day, then divide that product by the overlap. That’s roughly how many fish are really in the pond.) The estimate gets more accurate if you can fish not just twice, but many more times — then you can measure the overlap between every pair of days.
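The two-sample arithmetic Scheuren describes can be sketched in a few lines. This is a minimal illustration with made-up numbers; the function name `lincoln_petersen` is the standard name for this estimator, not anything from Ball's own software:

```python
# Lincoln-Petersen capture-recapture estimate -- the "count the fish"
# method. Two independent samples plus their overlap yield an estimate
# of the whole population, including the members never observed.

def lincoln_petersen(first_catch: int, second_catch: int, overlap: int) -> float:
    """Estimate total population size from two independent samples.

    first_catch:  number caught and tagged on day one
    second_catch: number caught on day two
    overlap:      number caught on day two that were already tagged
    """
    if overlap == 0:
        raise ValueError("no overlap: the two samples cannot be linked")
    return first_catch * second_catch / overlap

# Tag 100 fish on day one; catch 80 on day two, 20 of them tagged.
# Estimated pond population: 100 * 80 / 20 = 400 fish.
print(lincoln_petersen(100, 80, 20))  # -> 400.0
```

The same logic carries over to human rights data: each database of testimonies is one day's catch, and victims who appear in more than one database are the tagged fish.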
Guatemala had three different collections of human rights testimonies about what had happened during the country’s long, bloody civil war: from the U.N. truth commission, the Catholic Church’s truth commission, and the International Center for Research on Human Rights, an organization that worked with Guatemala’s human rights groups. Working for the official truth commission, Ball used the count-the-fish method, called multiple systems estimation (MSE), to compare all three databases. He found that over the time covered by the commission’s mandate, from 1978 to 1996, 132,000 people were killed (not counting those disappeared), and that government forces committed 95.4 percent of the killings. He was also able to calculate killings by the ethnicity of the victim. Between 1981 and 1983, 8 percent of the nonindigenous population of the Ixil region was assassinated; in the Rabinal region, the figure was around 2 percent. In both those regions, though, more than 40 percent of the Mayan population was assassinated.
It was the first time anyone had applied MSE to human rights work. “He produced numbers that provided a strong, crisp basis for drawing the conclusions the commission did about violence, in a way you can’t get from testimony,” says Kate Doyle, a senior analyst with the National Security Archive at George Washington University, who has worked extensively with archives in Guatemala.
The MSE numbers were not the only evidence for genocide, nor even the most important evidence. Genocide is determined by intent. The commission found that the Guatemalan military and the civil patrols it supported carried out a coordinated plan of massacres against the Mayan population to “kill the largest number of group members possible,” reasoning that Mayans were natural allies of the guerrillas. Ball’s data powerfully bolstered that conclusion. On Feb. 26, 1999, the commissioners presented their report, “Memory of Silence,” which concluded that the state had committed “acts of genocide” against the country’s indigenous population.
What Ball learned in South Africa and Guatemala — the disconnect between what happens in a database and what happens in the world, and how to link them together — has been the basis of his work ever since. Two weeks after he left Guatemala, he traveled to Kosovo, working for the American Association for the Advancement of Science (AAAS), collaborating with Human Rights Watch and other groups to quantify the destruction of Milosevic’s fourth war.
It was the site of one of Ball’s most absurd adventures. He and his interpreter, Ilir Gocaj, had gone to save data — the records Albanian border guards kept of Albanian Kosovar families who crossed into Albania. The records — handwritten lists of the number of people in each fleeing party, place of residence, and name of the family head — were evidence of when people fled specific villages. That was key to determining why they had fled.
The problem was that the border post had moved. A Serbian attack had driven the Albanian guards about 500 meters further into Albania. The records, however, were still in the old, abandoned border post. The Albanian guards told Ball and Gocaj that the 500-meter walk was a dangerous one: Shooting had seriously injured a journalist a few days before, and the area was still under sniper fire.
But Ball needed those records. He and Gocaj staged a rescue, walking — very slowly — to the old border post, a tiny white house. The records were scattered on the floor amid shards of wood and glass. Ball and Gocaj scooped up the papers and slowly walked back. When they returned, they found the guards laughing. The Albanian border guards, it turned out, had become the only source of cigarettes for Serbs on the Kosovo side of the border. A Serb attack was unthinkable; it could cut off their nicotine. Ball and Gocaj had been the butt of a practical joke.
Later, when Ball gave a talk on his findings in The Hague, someone from the prosecutor’s office took him aside. “Would you testify?” he asked Ball. Only if he could strengthen his research, Ball said; he’d need more data. The prosecutor’s office gave him information on Albanian guerrilla activities, exhumation reports, and surveys of the displaced. Using a four-way MSE — effectively counting the fish on four different days — Ball found that the migrations and killings of Albanian Kosovars matched up perfectly, suggesting the same thing had caused both phenomena. That thing was not NATO bombings or Albanian guerrilla activity; neither fit in date or place. What was left was attacks by Serbian forces.
There was one other devastating piece of evidence. Milosevic had called a halt to Serb military activity on the evening of April 6, 1999, in honor of Orthodox Easter. Instantly, the number of killings and people fleeing dropped to nearly zero. But NATO and the Albanian guerrillas had not stopped their hostilities. Two days later, when Serbian forces went back to work, the killings and flight of Kosovar Albanians also resumed. “The statistical evidence is consistent with the hypothesis that Yugoslav forces conducted a systematic campaign of killings and expulsions,” Ball concluded.
“He directly refuted Milosevic’s allegation that people were fleeing [NATO] bombs,” recalls Fred Abrahams, a Human Rights Watch investigator who worked with Ball. “It directly undermined one of the main arguments of Milosevic’s defense.”
AT THE AAAS, Ball had been a solo practitioner. But in 2002, with a grant from the MacArthur Foundation, he began to build the Human Rights Data Analysis Group (HRDAG), and in 2003 he moved the group across the country to Benetech, a technology-in-the-public-interest nonprofit in Palo Alto, California. Over the years, Ball had worked with Benetech to create Martus, encryption software designed to preserve human rights data and keep it from being stolen and used to retarget victims. Ball is now vice president and chief scientist for the HRDAG, which has a team of 14 active staff members and consultants. They have designed databases and conducted statistical analysis for dozens of truth commissions, U.N. missions, international tribunals, and human rights groups. Ball himself has worked in Chad, Colombia, the Democratic Republic of the Congo, East Timor, Ethiopia, Peru, Sierra Leone, and Sri Lanka, among other countries.
Since Ball put together that first database, sophisticated statistical analysis has become obligatory; a truth commission would no sooner do without it than record data with pencil and paper. But the way Ball sees it, threats to good numbers still lurk everywhere — among military officers displeased with what the numbers reveal, human rights groups making irresponsible claims, and even advocates for those same technologies that are revolutionizing the field of human rights. Often it’s budget cutters seeking to skimp on data collection who pose the biggest obstacle to getting it right. Resources are limited, and despite what has sometimes been an all-out assault by Ball on truth commission staff members who disagree with him, collecting data is just one of the things truth commissions must do.
In some places, the challenge comes from political groups trying to discredit truth commissions’ conclusions. In Peru, for example, Ball’s work completely upended what Peruvians thought they knew about their country’s war against the Shining Path guerrillas. Ball joined that country’s Truth and Reconciliation Commission quite late. “Our initial idea was to do simple counts,” recalls Daniel Manrique, who was in charge of the commission’s database. “But we read Patrick’s analysis in Guatemala and knew what he was doing in Kosovo, and we said, ‘This is what we need.’” Ball merged six collections of data, reaching conclusions that were highly controversial. One startling finding was that the Shining Path guerrillas had committed not about 3 percent of the atrocities, as had been thought, but about 50 percent. In addition, the report put the number of killings at 69,000 — nearly triple previous estimates. The enormous gap was proof of the disconnect between white Peru and the indigenous highlands — 75 percent of the victims did not speak Spanish as their first language. How could so many indigenous Peruvians have died without Peru’s elite taking notice? The implied racism made many of them uncomfortable.
While the proportion of killings committed by the state was lower than previously thought, the absolute number turned out to be higher. The state and right-wing forces accused the commission of inventing numbers to discredit the Peruvian military, a critique echoed endlessly in academia and the media — and made most eloquently through brandished guns. To this day, commission members still get death threats.
Truth commissions are the organizations that take data most seriously, but they are far from the only groups that make numerical claims about human rights. And outside truth commissions, anything goes.
The International Rescue Committee (IRC), for example, has periodically studied the number of deaths in the Congo conflict; its latest study, in January 2008, put the number at 5.4 million deaths between 1998 and 2007. That has become the number for Congo — you see it everywhere. The IRC found it by figuring out how many deaths would normally have occurred, then counting actual deaths and calculating the excess. Ball thinks the group’s baseline estimate of a “normal” number of deaths was far too low — correct that estimate, he says, and you get an excess death figure that is only one-third or one-fourth as high. “But we’re not ever going to figure out Congo,” he says. Other groups, like Human Rights Watch, have embraced numbers with more restraint, Ball believes; the group, traditionally given to the “who here’s been tortured and speaks English?” style of investigation, as its program director, Iain Levine, told me, has used quantitative methods when possible since Ball’s results in Kosovo. And Ball is complimentary. “When they don’t know numbers, they say so,” he says.
It’s easy to understand sloppy numbers. Nearly every journalistic story, political speech, or campaigner’s press release about a war needs the obligatory “a conflict that has claimed ____ lives so far.” Advocacy groups are prone to filling in the blank with the highest number they can credibly cite. Journalists sometimes throw around big numbers to inflate the importance of a story. Most often, they reach for whatever number someone vaguely authoritative has used.
The two most violent revolutions of the Arab Spring — in Libya and Syria — are cases in point. In Libya, Ball says, the tools exist to get a careful estimate, but he “wouldn’t take very seriously” the numbers thrown around today. At one point, the U.S. ambassador to Libya used a figure of 30,000 deaths — three times what Libyan rebel leaders were claiming. “Have we seen more than 1,000 bodies?” Ball asks. “When I see numbers like 10,000 to 30,000 without any evidence, my broad guess is that it’s just meant to signify ‘a lot.’” The victorious opposition eventually claimed that as many as 50,000 had died in the fighting.
In Syria, the numbers seem much more precise. The Syrian Revolution General Commission, which claims to speak for the opposition, says 6,275 people were killed between March 2011 and January 2012. But is that number accurate? “Right now,” Ball told me, “that number is probably too high — there is undetected duplication. And too low, as a bunch of people fall outside it.” Killings committed at demonstrations tend to get counted, while those outside might not. “The security forces may hunt people down late at night, and those people may not get documented,” he points out. “The follow-up violence may end up killing more people than violence at demonstrations.”
In 2007, the Save Darfur Coalition, a human rights advocacy organization, was called out by the London-based European-Sudanese Public Affairs Council, a group aligned with the government of Sudan. Save Darfur had been running full-page ads stating that 400,000 people had been killed in Darfur. The council brought a complaint to Britain’s Advertising Standards Authority claiming the figure was untrue. The regulators ruled that Save Darfur should have presented that figure not as fact, but as opinion.
It was a huge embarrassment for Save Darfur. Censured in an advertising court at the behest of the government of Sudan — “how much worse can it get?” says Ball. (He thinks 100,000 deaths is probably closer to the truth.) “A human rights group should never lose a factual challenge. Our moral legitimacy depends on speaking truth to power. People who want to dismiss us say we’re just making shit up. If they’re ever right when they say that, we’re in big trouble.”
IN THE LAST FEW YEARS, the most important development in human rights reporting — and Ball’s latest battleground over what the data can tell us — has been the rise of do-it-yourself. On Jan. 3, 2008, Ory Okolloh, a Kenyan lawyer, recounted on her blog a harrowing trip through Nairobi’s post-election violence, passing battles between police and civilians. “[A]ny techies out there willing to do a mashup of where the violence and destruction is occurring using Google Maps?” she asked.
Thus began Ushahidi, Swahili for “testimony,” a platform for collecting and plotting texts, tweets, and e-mails submitted by the masses. Ushahidi is now ubiquitous. Around the globe it is used to track not only human rights abuses, but also earthquake damage, downed trees in snowstorms, election violence, pharmacy stock-outs — even America’s best hamburgers. Just as ubiquitous are breathless stories in the media about it. The Atlantic named it one of the top ideas of the year in 2010, calling it the “Zelig of 2010 disasters.”
Ushahidi is obviously of tremendous use to first responders. But its advocates think it can be more than that: something much closer to a real-time database of catastrophe in action. Christina Corbane of the European Commission’s Joint Research Center plotted Ushahidi reports against satellite maps of building damage after Haiti’s 2010 earthquake and concluded that the crowdsourced data correlated with the damage. Ushahidi ran with it: Patrick Meier, Ushahidi’s director of crisis mapping, argued that although crisis information sent in by the masses is unrepresentative and has been considered useless for serious statistical analysis, Corbane’s work proves otherwise. “[T]his crowdsourced data can help predict the spatial distribution of structural damage in Port-au-Prince,” he wrote on his blog, iRevolution.
To Ball, it is the perfect illustration of why having some data is worse than none. When he ran the numbers for Haiti, he found that the text messages and building damages were correlated with something else — the location of the buildings themselves. A map of buildings would have been a truer guide to the damage than a map plus the Ushahidi data, Ball says. Ushahidi is not the equivalent of a random sample, he argues. “It confuses the picture.” (In a scathing blog post, Meier retorted that Benetech should stick to human rights. He defended his numbers but ultimately retreated to a narrower set of claims.)
Ushahidi’s SwiftRiver platform, now in beta, allows crowdsourced validation of crowdsourced data, the way Google searches validate information on the Internet. Ball, however, warns that Ushahidi is making the same mistake he did: worrying about whether the data are true, but not whether they accurately reflect events. Ushahidi is even less representative for mapping violence, he says; at least earthquakes don’t have bad guys actively trying to keep people from releasing information. With human rights abuses, it’s entirely plausible that what you’ll hear from the worst zones is silence. “Show me video of someone in a torture chamber,” he says. “Outside of Abu Ghraib, not so much.”
Technology tempts us with the promises of instant accountability and omniscient knowledge. While previous conflicts were opaque, surely now technology has given us the tools to know. But look at what we “know” of the violent ouster of Muammar al-Qaddafi in Libya or the brewing sectarian war in Syria. It is no different from what we knew in Argentina or South Africa.
Measurement matters. Information about the number and patterns of atrocities influences decisions about where to put money, political support, troops. It affects judgments about who is guilty, whom to help, how to rebuild. “In the past we’d just call a bunch of people up,” says Ball. “We’d survey experts: You’re on the ground — is it getting better or worse? It was just ‘informed observers say,’ but no one took it that seriously. Now people do take it seriously, and that’s a problem. People need numbers very, very badly. And they don’t give a shit where they come from.”
Correction: A previous version of this article misstated the number of staff members and consultants at the Human Rights Data Analysis Group. It is 14, not nearly two dozen.