Fatal Encounters

A step toward creating an impartial, comprehensive and searchable national database of people killed during interactions with police

Methodology

Update, January 29, 2019. This is the basic description I send to people who ask: We've gone through several iterations of collecting the data. Initially, the plan was to do crowdsourced public records requests. We ended up filing about 2,500 requests, but this method turned out to be inefficient and expensive.

We then moved to a system where I made a Google spreadsheet that was editable by anyone. It had partial records on about 5,000 incidents that I'd scraped from various sources on the internet. I then checked the information provided against public news sources and moved the verified information to another Google sheet that was editable only by me. When Ferguson hit in August 2014, trolls spent more time breaking the publicly editable sheet than I had time to fix it, so I created a form (http://www.fatalencounters.org/google-form/) through which people could look at the sheet with partial information, research the missing details, and submit them.

Beginning in December 2014, we moved primarily to paid researchers while we completed state-by-year searches using internet searches and paid sites like Newsbank, Newspapers.com and, to a smaller extent, the student version of Lexis-Nexis. We completed that process in November 2017.
Currently, I have 15 Google alerts that arrive daily, and generally on Tuesdays, I go through those alerts, along with sites like GunViolenceArchive.org and a few other sources kept by volunteers, and do the updates. There are more specialized searches I do on an irregular basis. For example, I usually do research on chase deaths about once a month, and on dispositions about once a year.

D. Brian Burghart, executive director, Fatal Encounters


Fatal Encounters is a complex and rigorous project that uses several data-collection processes to ensure a high level of validity. News media coverage has predominantly focused on the crowdsourcing aspect of our project. While some of our data is crowdsourced, we have three main methods of collecting information, listed below in order of the number of records each contributes to the database:

1) Paid researchers
2) Public records requests
3) Crowdsourced data

Of the roughly 6,900 records we had as of June 15, 2015, around 85 percent were submitted by researchers we pay to log data.

Our paid researchers have several methods of getting information into the verification queue. First, we aggregate data from other large sets, like KilledByPolice or the Los Angeles Times' The Homicide Report, and from individuals, like Carla DeCeros, who have contributed their data to FE. The researchers then fill in the missing information and double-check the information that is included. When a record is complete, it is moved to the verification queue, where it is checked against published sources yet again, this time by the Principal Investigator of FE.

When an incident is reported by a volunteer (the crowd), every fact presented is compared to published media reports or public records to verify its accuracy. Volunteers gather this information from any source, such as a hometown newspaper, and submit it through our form. Once submitted, it goes to a separate spreadsheet, where we verify the information against media sources.

We have also been conducting research by state and by date. These methods are intentionally redundant so that we catch as many incidents as possible. However, we know from experience that incidents have been missed: sometimes because a death was not reported when it happened, sometimes through human error, and sometimes just because of the vagaries of the internet. To address this, FE and our sister project, EncuentrosMortales.org, have made more than 2,300 public records requests to state, federal and local law enforcement agencies. This part of the process is extremely expensive, but the documents are useful as yet another layer of redundancy. Other researchers, such as Lance Farman, have also been testing the completeness of the database against FOIA requests and have found that this method yields a 97 percent completeness rate for 11 of the states logged so far.

D. Brian Burghart is the principal manager of FE and a newspaper editor with more than 25 years of experience.