
Amazon’s “Ring” On The Congressional Privacy Hot Seat



The House Oversight and Reform Subcommittee on Economic and Consumer Policy asked for a range of information, including copies of all agreements the company has reached with local governments going back to 2013, details on the integration of any facial recognition tools, and instances where law enforcement has requested video footage from Ring.

Letter (PDF): 2020-02-19.RK to Huseman-Amazon re Ring (1).pdf


“The Subcommittee on Economic and Consumer Policy is writing to request documents and information about Ring’s partnerships with city governments and local police departments, along with the company’s policies governing the data it collects,” Krishnamoorthi wrote.  “The Subcommittee is examining traditional constitutional protections against surveilling Americans and the balancing of civil liberties and security interests.”

Ring reportedly works closely with local governments and police departments to promote its surveillance tools and has entered into agreements with cities to provide discounts on Ring products to their residents in exchange for city subsidies.  Reports also indicate that Ring has entered into agreements with police departments to provide free Ring products for giveaways to the public.

Ring reportedly tightly controls what cities and law enforcement agencies can say about Ring, requiring any public statement to be approved in advance.   In one instance, Ring is reported to have edited a police department’s press release to remove the word “surveillance.”

“The Subcommittee is seeking more information regarding why cities and law enforcement agencies enter into these agreements,” wrote Krishnamoorthi.  “The answer appears to be that Ring gives them access to a much wider system of surveillance than they could build themselves, and Ring allows law enforcement access to a network of surveillance cameras on private property without the expense to taxpayers of having to purchase, install, and monitor those cameras.”

The Subcommittee demands that Amazon provide information about these partnerships dating back to January 1, 2013.


Invasive Police Aerial Surveillance Is Widespread


Image: ACLU of California


“While the greatest risks posed by drones and aerial surveillance lie ahead as the technology continues to advance and becomes more powerful, easier to automate, and cheaper, there are already significant threats.

Drones, which already possess so much surveillance power, are widespread and broadly in use by police departments throughout the country.”

“In recent years, we’ve seen significant efforts to roll back the mass surveillance that technological advances have permitted on an unprecedented scale. In 2015, Congress passed the USA FREEDOM Act to ban bulk collection of sensitive information such as Americans’ communications metadata. And this year, the Supreme Court ruled that tracking an individual’s location from their cell phone required a warrant, creating a privacy protection even though it involved public activities. But amid these victories for privacy rights, another form of surveillance has been quite literally rising up all around us: aerial surveillance. And this snooping from the skies most often comes in the form of police departments across the country deploying powerful drones.

Aerial surveillance and the broad use of drones threaten to undermine the progress made in recent years to prevent unreasonable location tracking and government stockpiling of sensitive, personal information. With existing and emerging technologies, government may be able to use aerial surveillance to track our movements en masse and catalog participation in constitutionally protected activities such as protests, religious ceremonies, and political rallies.

Aerial Surveillance Can Be Incredibly Invasive

Existing technology that is affordable and in wide use allows law enforcement to spy on individuals over huge distances. The most prominent example is the DJI Zenmuse Z30 camera, which can be affixed to commonly used drone models such as the Inspire 2 and the Matrice. Chinese manufacturer DJI, the drone maker most favored by U.S. law enforcement, promotes the Zenmuse Z30 by describing it as “the most powerful integrated aerial zoom camera on the market with 30x optical and 6x digital zoom for a total magnification up to 180x.”

Demonstration of digital and optical zoom capacity, with footage claimed to be taken at a 3.7-mile distance. (Note: The video claims to use a Matrice 600 drone with a DJI Zenmuse Z30 drone camera) (Source: SkyLink Japan / YouTube)

The implications of this are profound and frightening. With this technology, law enforcement can use small and inconspicuous drones to snoop on individuals from thousands of feet away, and even watch activities occurring several miles away with a good degree of precision. In the air, these drones can easily reposition to adjust their view and overcome obstacles that make this type of long-distance surveillance impossible from ground level.
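To make the quoted figures concrete, here is a back-of-the-envelope calculation. The 3.7-mile figure is the distance claimed in the demonstration video; everything else is simple arithmetic, not a measured specification:

```python
# Illustrative arithmetic only: how the quoted zoom figures combine.
optical_zoom = 30      # Zenmuse Z30 optical zoom, per DJI's marketing copy
digital_zoom = 6       # additional digital zoom
total_magnification = optical_zoom * digital_zoom
print(total_magnification)  # 180

# At 180x, a subject 3.7 miles away fills roughly the field of view it
# would occupy to the naked eye from (3.7 * 5280) / 180 feet away.
apparent_distance_ft = (3.7 * 5280) / total_magnification
print(round(apparent_distance_ft, 1))  # 108.5
```

In other words, under these assumptions a drone several miles out can frame a subject roughly as a person standing a third of a football field away could.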

Invasive Aerial Surveillance is Cheap

In addition to the surveillance powers modern drones possess in terms of long-distance monitoring, automated identification, and automated tracking, technological advances are making aerial surveillance an exponentially cheaper option, and thus something that can be done more broadly and on a larger scale. The Inspire 2 costs around $3,000, and equipping it with the powerful Z30 zoom camera costs an additional $3,000. In comparison, police helicopters cost roughly $500,000 to $3,000,000. The helicopter’s operating costs of $200 to $400 per hour and the maintenance costs increase the expense of this traditional aerial surveillance tool even more.

With this cost differential, a department could potentially purchase a fleet of 500 drones in lieu of a single police chopper—a swarm of devices that can watch individuals without notice from thousands of feet away, use software to identify people in an automated manner, and follow them without human piloting. As technology improves, the potential power of this type of fleet will only increase, creating the possibility of a massive surveillance umbrella permanently buzzing over America’s cities and towns.
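The 500-drone figure follows directly from the prices cited above. This sketch covers hardware purchase cost only; it deliberately ignores operating, maintenance, and staffing costs on both sides:

```python
# Rough cost comparison using the figures cited in the text.
drone_cost = 3_000 + 3_000          # Inspire 2 airframe plus Z30 zoom camera
helicopter_cost_low = 500_000       # low end of cited police helicopter price
helicopter_cost_high = 3_000_000    # high end

# Camera-equipped drones purchasable for the price of one helicopter:
print(helicopter_cost_low // drone_cost)    # 83
print(helicopter_cost_high // drone_cost)   # 500
```

So even at the cheapest helicopter price, one airframe budget buys dozens of camera-equipped drones; at the high end, it buys the 500-drone fleet the article describes.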

Invasive Aerial Surveillance Is Widespread

Map of Inspire 2 and Matrice drones (which can be equipped with the Z30) in use by police departments throughout the country. (Source: Google Maps screenshot created based on data from the Center for the Study of the Drone at Bard College)

According to research by the Center for the Study of the Drone at Bard College, as of May 2018, at least 910 state and local public safety agencies have purchased drones (based on Federal Aviation Administration and other records). Of those, 599 are law enforcement agencies. The survey identified the make and model of drones owned by 627 of the 910 agencies. Of the 627, 523 have drones made by DJI. Of those, over 200 agencies fly either the Inspire or Matrice models, which can be equipped with the Z30 zoom camera.

Invasive Aerial Surveillance Can Identify You

With its capacity for precise zooming over long distances, aerial surveillance can, in combination with other automated identification technologies, allow for effortless cataloging of individuals and their activities. There are two prominent automated identification technologies that could allow for easy identification from immense distances: automated license plate readers and facial recognition technology. Both are already in wide use by government agencies. U.S. Immigration and Customs Enforcement maintains a nationwide network of automated license plate readers to track individuals, and the FBI already maintains a facial recognition database covering fifty percent of American adults and permits law enforcement from dozens of states to use it.

Positive automated license plate reader identification from drone footage claimed to be taken at a 1200 ft distance. (Note: The video claims to use a Matrice 100 drone with a DJI Zenmuse Z30 drone camera) (Source: Sharon Arenhaim /YouTube)

This means that the government could surreptitiously watch sensitive activities and catalog individuals. Everyone entering or exiting a political meeting, union meeting, or lawyer’s office could be identified and catalogued. Or a drone could zoom in on and scan all the cars parked outside a medical facility or church, and create a list of attendees in seconds with no human effort. These fears are not hypothetical. American Civil Liberties Union research efforts exposed the fact that the FBI was deploying aerial surveillance to record the activities of protesters in Baltimore. Vendors marketing drones to police departments highlight their ability to pick individuals out of a public gathering such as a political rally as a feature, not a cause of potential abuse.

FBI aerial surveillance of protests in Baltimore after the death of Freddie Gray in 2015. (Photo: Still from FBI footage / ACLU)

Amplifying these risks is a recent partnership between DJI and Axon, one of the leading producers of police body cameras. Axon also provides cloud computing services designed to allow law enforcement to sync data from a variety of sources, including cameras, and has spent years developing facial recognition technology for its products. With this partnership, which will allow DJI drone footage to sync with the Axon system, police drones with built-in facial recognition technology could soon become the norm.

Invasive Aerial Surveillance Can Track You

Identifying individuals from aerial surveillance footage appears to be on a path to automation, occurring on a mass scale without the need for human involvement. But is the impact of drones on privacy limited by the need for a person to remotely pilot them and actively work to follow the target being tracked? Unfortunately, the answer is no.

DJI has developed a feature for many of its drones—including models like the Inspire 2 that are commonly used by police—to allow drones to lock onto and automatically follow individuals. This technique, called “Active Track,” enables the drone to automatically follow moving items, including people, absent any human control of the drone. DJI drones in Active Track operate in a mode that allows the drone to travel at roughly 20 miles per hour, more than enough to keep pace with an individual on foot. Some drones are even programmed to automatically avoid obstacles while continuously tracking their locked-on target.

Active Track allows drones to tag and track individuals without human piloting. (Note: The video claims to use a DJI Spark drone with attached camera) (Source: DC Rainmaker / YouTube)

As with automated identification, Active Track technology decreases reliance on human labor in another aspect of aerial surveillance which has traditionally served as an impediment to mass monitoring of individuals. And this technology will only become more powerful over time.

Drones with “swarm capabilities,” which further enhance automated flight power by allowing a single pilot to control multiple drones, are already in development, such as the military’s Low-Cost UAV Swarming Technology (“LOCUST”). In the future, a single officer might be able to command a large swarm of drones, inconspicuously identifying and following many individuals over a long period of time.

Invasive Aerial Surveillance Can Be Limited

With these serious and growing risks to personal privacy, it’s important that lawmakers begin to take the threats of aerial surveillance more seriously. Luckily, drones can be fairly easily regulated. Several states have placed limits on drone-based surveillance. For example, Florida, Maine, North Dakota, and Virginia have all enacted some form of a warrant requirement for police use of drones, and Rhode Island has proposed legislation prohibiting the use of facial recognition on any images captured by drones. To be fully effective, drone regulations should take into account and allow important public safety uses that don’t threaten privacy rights, like natural disaster response and search and rescue.

Unfortunately, as we’ve previously written, the increasing use of powerful manned aerial surveillance programs remains a serious issue that drone regulations will not solve. Reasonable limits on law enforcement drone use are an excellent way to begin setting reasonable limits on all forms of aerial surveillance, but they are also just the first step in addressing the larger civil liberties issues looming above.”


Risks In Police Use of Body Camera Real Time Facial Recognition


Real Time Facial Recognition


“Real-time facial recognition is especially concerning because it means that body cameras will continuously scan the face of everyone passing police officers on the street, and immediately log and relay data.

The Wall Street Journal reported that body camera vendors are preparing body cameras with real-time facial recognition capabilities, and law enforcement agencies could potentially deploy them as soon as this fall.”


“In recent years, we at The Constitution Project have warned that adding facial recognition scanning to police body cameras poses serious risks that could undermine basic privacy and due process rights. Unfortunately, the time to prepare for these risks is running out.

 Before adding real-time facial recognition to body cameras, it’s critical that departments and lawmakers implement necessary measures to avert the unprecedented mass collection of the identity and location of individuals in public:

Set Standards for Police Action That Do Not Depend on Facial Recognition

A major issue law enforcement must confront before deploying facial recognition is its inaccuracy. It is well documented that despite its immense power, facial recognition technology is often wrong, especially when identifying racial minorities. Specifically, these systems are prone to generating false positives, in which the technology identifies a match (e.g., says a person on the street matches the face of a wanted criminal) when in reality the faces are of two entirely different people.

It’s not hard to see how this situation could spin out of control with real-time facial recognition on police body cameras. What if an officer’s camera misidentifies an innocent person as a dangerous fugitive at large, leading to a violent incident? Even a commonly accepted police use for facial recognition—searching for a missing child—could turn horribly wrong if a false positive leads an officer to confront a parent as an abductor. Body camera use has grown exponentially because many saw it as a means to improve community-police relations and reduce use-of-force incidents, but adding facial recognition could inflame these problems. Even if misidentifications do not result in use of force, a mere arrest has serious consequences for individuals. They can be detained, fingerprinted, and subject to strip searches—all merely because a computer program was wrong.

It’s critical that before police add real-time facial recognition to body cameras, they set proper limits on the degree to which officers can rely on the identifications provided by an imperfect system. At a minimum, facial recognition should not be allowed to serve as the sole basis for an arrest or any use of force. Officers should seek means of corroborating an identification, and in the event of conflict, base the decision about what action to take on the totality of the circumstances rather than completely trusting the determinations of a facial recognition program. This principle is consistent with current practices; departments such as the NYPD already require human review to confirm results when facial recognition is applied to crime scene footage. These measures are necessary not just to prevent improper confrontations and arrests, but also to avoid a perverse incentive to build systems that generate more false positives, which would give police more pretexts to stop or arrest people, while limiting their liability for those actions because the identifications were based on the technology.

Limit Facial Recognition Scans and Identifications to Serious Crimes

Another serious risk that facial recognition poses is giving police “arrest-at-will authority,” and this potential is greatest when real-time facial recognition is incorporated into body cameras. Arrests may be a common police function, but they usually occur in response to specific assignments or situations, rather than in a random or opportunistic manner.

In some municipalities, a huge portion of the population has active bench warrants for minor violations, such as unpaid parking tickets (which people often don’t know can lead to an arrest warrant). For example, a 2015 Department of Justice investigation revealed that 16,000 out of the 21,000 residents of Ferguson, Missouri, had outstanding warrants.

A patrol officer may be able to keep an eye out for the faces on a most-wanted list, but they can’t memorize tens of thousands of people with outstanding warrants for petty offenses. Facial recognition changes that: This technology can take a face and scan it against millions of photos in a second. With this tool, every officer could be notified whenever they encounter an individual with any outstanding warrant, no matter how trivial the offense, and have free rein to arrest them.

This creates serious risk of abuse, as The Constitution Project’s comprehensive report on police body cameras—whose signatories include both civil liberties advocates and former law enforcement officers—warned. This “arrest-at-will authority” could also be wielded to disrupt First Amendment-protected activities. Police could use real-time facial recognition to scan crowds at protests or political rallies, and then arrest anyone flagged for any potential offense, no matter how trivial. Fear of such abuse isn’t paranoid—we’ve already seen it attempted. In 2016, police scanned for and identified individuals with outstanding warrants among those protesting police brutality in Baltimore, using a social media scraping tool called Geofeedia. The platforms Geofeedia scraped its data from (Facebook, Twitter, Instagram) quickly cut off its access to end the program, but with police using body cameras equipped with facial recognition, there is no such middleman to block misconduct. This would allow law enforcement to directly disrupt and chill participation in First Amendment-protected activities.

The solution to these issues is simple: Facial recognition incorporated into body cameras should only be used in relation to an enumerated set of serious crimes. This would set an effective balance, preventing potential abuses stemming from overbroad use while still allowing a system to flag serious threats for officers. Limiting use of powerful technological tools to serious offenses has precedent. The Wiretap Act, which sets the foundation for law enforcement surveillance of phone calls and electronic communications, permits wiretaps only for an enumerated list of serious crimes. The government cannot wiretap everyone suspected of parking violations, and it shouldn’t be able to deploy mass surveillance across American cities for such minor offenses either.

Provide Oversight to Prevent Unfettered Location Tracking

A final risk is that real-time facial recognition in body cameras creates a new avenue for location tracking that is devoid of accountability or oversight. Currently, law enforcement location tracking is mostly conducted by tracking cellphones; the United States Supreme Court is currently reviewing whether this should require a warrant.  However, even if the Supreme Court does not impose a warrant standard, cellphone location tracking still requires some court approval. Facial recognition and body cameras, which currently do not require court approval to use, could cut out this independent oversight entirely, circumventing a basic due process protection.

Given the sheer scale of use of police body cameras in populated areas, facial recognition could allow law enforcement to rapidly scan and locate anyone they desire, and track their movements.  This would circumvent privacy rights and independent oversight. It could also chill sensitive activities: If someone sees an officer near a protest, house of worship, political rally, or medical facility, they might (and should) worry that their presence at that location (along with every other attendee) is being logged.

In order to prevent abuse and preserve due process, it’s vital that a court approve any use of body cameras with facial recognition for location tracking, just as court approval is currently a key component of oversight of other forms of electronic location tracking. The best specific rules and standards for this activity may become clearer in light of the Supreme Court’s ruling on cellphone tracking later this spring, but at a minimum police departments should begin preparing to incorporate independent oversight into any type of location tracking for body cameras with facial recognition. The technology is too powerful—and location information is too sensitive—for law enforcement to be unchecked in its use.

There are a variety of avenues toward setting effective policies for body cameras. Some police departments have directly stepped up and adopted effective internal guidelines. In other locations, cities have established rules to ensure body cameras provide accountability rather than overbroad surveillance. State legislatures have also set limits to stop body cameras from becoming too pervasive as a surveillance tool. Individuals should consider engaging at all of these different levels of government, but now is the time to act. If we do not, we could soon be in a world where the government has an eye on every street corner, with little oversight or accountability over how it uses this immense power.”






Baltimore Police and Contractor Run 30 Mile Wide Aerial Surveillance


Image: Philip Montgomery for Bloomberg Businessweek


“Since January police have been testing an aerial surveillance system adapted from the surge in Iraq.

Funding came from a private donor. No public disclosure of the program had ever been made.

The plane’s wide-angle cameras captured an area of roughly 30 square miles and continuously transmitted real-time images to analysts on the ground. The footage from the plane was instantly archived and stored on massive hard drives, allowing analysts to review it weeks later if necessary.

Since the beginning of the year, the Baltimore Police Department had been using the plane to investigate all sorts of crimes, from property thefts to shootings. The Cessna sometimes flew above the city for as many as 10 hours a day, and the public had no idea it was there.

A company called Persistent Surveillance Systems, based in Dayton, Ohio, provided the service to the police.

A half block from the city’s central police station, in a spare office suite above a parking garage, Ross McNutt, the founder of Persistent Surveillance Systems, monitored the city’s reaction to the Goodson verdict by staring at a bank of computer monitors. “It’s pretty quiet out there,” he said. The riots that convulsed the city after Gray was killed wouldn’t be repeated. “A few protesters on the corner, and not much else. The police want us to keep flying, but the clouds are getting in the way.”

McNutt said something about not being able to control the weather, pretending to shrug it off, but he was frustrated. He wanted to please the cops. Since this discreet arrangement began in January, it had felt like a make-or-break opportunity for McNutt. His company had been trying for years to snag a long-term contract with an American metropolitan police department. Baltimore seemed like his best shot to date, one that could lead to more work. He’s told police departments that his system might help them reduce crime by as much as 20 percent in their cities, and he was hoping this Baltimore job would allow him to back up the claim. “I don’t have good statistical data yet, but that’s part of the reason we’re here,” he said. McNutt believes the technology would be most effective if used in a transparent, publicly acknowledged manner; part of the system’s effectiveness, he said, rests in its potential to deter criminal activity.

McNutt is an Air Force Academy graduate, physicist, and MIT-trained astronautical engineer who in 2004 founded the Air Force’s Center for Rapid Product Development. The Pentagon asked him if he could develop something to figure out who was planting the roadside bombs that were killing and maiming American soldiers in Iraq. In 2006 he gave the military Angel Fire, a wide-area, live-feed surveillance system that could cast an unblinking eye on an entire city.

The system was built around an assembly of four to six commercially available industrial imaging cameras, synchronized and positioned at different angles, then attached to the bottom of a plane. As the plane flew, computers stabilized the images from the cameras, stitched them together and transmitted them to the ground at a rate of one per second. This produced a searchable, constantly updating photographic map that was stored on hard drives. His elevator pitch was irresistible: “Imagine Google Earth with TiVo capability.”

The images weren’t perfect. Analysts on the ground could see individual cars moving through the streets, but they couldn’t tell what make or model they might be. Pedestrians were just pixelated dots; you couldn’t distinguish a man from a woman, or an Iraqi civilian from an American soldier. Individual recognition, however, wasn’t the point; any dot could be followed backward or forward in time, which opened up all sorts of investigative possibilities.

If a roadside bomb exploded while the camera was in the air, analysts could zoom in to the exact location of the explosion and rewind to the moment of detonation. Keeping their eyes on that spot, they could further rewind the footage to see a vehicle, for example, that had stopped at that location to plant the bomb. Then they could backtrack to see where the vehicle had come from, marking all of the addresses it had visited. They also could fast-forward to see where the driver went after planting the bomb—perhaps a residence, or a rebel hideout, or a stash house of explosives. More than merely identifying an enemy, the technology could identify an enemy network.
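The rewind-and-fast-forward workflow described above can be sketched as a query over a time-indexed archive of per-second detections. Everything in this sketch—the names, the one-detection-per-frame model, and the nearest-neighbor matching—is an illustrative assumption, not Persistent Surveillance Systems' actual software:

```python
# Hypothetical sketch of the "Google Earth with TiVo" idea: given a
# time-indexed archive of detected moving "dots," step a tracked point
# backward or forward one frame (one second) at a time by matching the
# nearest detection in the adjacent frame.
from dataclasses import dataclass

@dataclass
class Detection:
    t: int       # frame timestamp, in seconds
    x: float     # map coordinates of a moving "dot"
    y: float

def follow(detections, start, direction=-1):
    """Follow a dot from `start` through time, one frame per step.

    direction=-1 rewinds (e.g. from a blast back to the vehicle that
    planted the bomb); direction=+1 fast-forwards to where it went next.
    """
    by_time = {}
    for d in detections:
        by_time.setdefault(d.t, []).append(d)
    track = [start]
    current = start
    while True:
        candidates = by_time.get(current.t + direction, [])
        if not candidates:
            break  # ran off the start/end of the archived footage
        # Greedy nearest-neighbor match in the adjacent frame.
        current = min(candidates,
                      key=lambda d: (d.x - current.x) ** 2 + (d.y - current.y) ** 2)
        track.append(current)
    return track

# Toy example: a vehicle seen at t=0..3 moving east; rewind from t=3.
dets = [Detection(t, float(t), 0.0) for t in range(4)]
print([d.t for d in follow(dets, dets[3], direction=-1)])  # [3, 2, 1, 0]
```

The point of the sketch is that once every frame is archived, following a dot in either direction is cheap bookkeeping, which is exactly why a single event can unravel an entire network of addresses.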

McNutt demonstrated the prototype to a group of Marines at a California base in 2006. “They called up their general,” McNutt recalls, “and when he saw it, he said, ‘I need this, and I need it right now—in Fallujah.’ ”

Eventually another military unit took control of the project and completed the development of Angel Fire at the Los Alamos National Laboratory in New Mexico. In 2007 the technology was deployed to Iraq. Angel Fire was eventually upgraded with all-weather and nighttime capabilities and then used as the basis for another system, called Blue Devil, which coupled wide-area cameras with narrow-focus zoom lenses in the same package.

McNutt retired from the military in 2007 and modified the technology for commercial development, increasing the number of cameras in the assembly to 12 and making the apparatus lighter and cheaper. He began attending security trade shows to fish for clients. His first real customer approached him at a security expo in Miami. His name was José Reyes Ferriz, and he was the mayor of Ciudad Juárez, in northern Mexico. In 2009 a war between the Sinaloa and Juárez drug cartels had turned his border town into the most deadly city on earth.

Reyes Ferriz offered enough money for a couple months’ worth of surveillance, and McNutt, who’s married with four children, left Ohio to temporarily set up shop at the border. Within the first hour of operations, his cameras witnessed two murders. “A 9-millimeter casing was all the evidence they’d had,” McNutt says. By tracking the assailants’ vehicles, McNutt’s small team of analysts helped police identify the headquarters of a cartel kill squad and pinpoint a separate cartel building where the murderers got paid for the hit.

The technology led to dozens of arrests and confessions, McNutt says, but within a few months the city ran out of money to continue paying for the service. Reyes Ferriz left office to mount an unsuccessful campaign for state governor.

For the next couple of years, Persistent Surveillance survived by providing services such as traffic-flow analysis for municipal planners, wildlife monitoring and border surveillance for federal agencies, and security monitoring for single events ranging from the Brickyard 400 Nascar race to Ohio State University football games. The company also did short-term projects in six countries, including in Central America and Africa, but the nature of that work is confidential, protected by nondisclosure agreements. The combination of those projects earned Persistent Surveillance about $3 million to $4 million a year in revenue, according to McNutt.

A single, long-term contract with an American police department would be worth about $2 million a year, he says. By 2012, McNutt was approaching the police departments of the 20 most crime-ridden jurisdictions in the country, marketing his services. He floated several of them an offer: Let us fly over your city to show you what we can do, and then you can decide if you want to hire us.

The Los Angeles County Sheriff’s Department quietly took him up on the offer, allowing him to conduct a nine-day trial run over Compton, a largely minority city south of L.A., in 2012. According to Patrick Bearse, operations lieutenant for the Aero Bureau of the sheriff’s department, the county recognized the potential of Persistent Surveillance’s service, but it didn’t sign a contract with the company because the technology, particularly the quality of the images, didn’t meet the department’s expectations. The city’s residents didn’t find out about the flights until a year later. Angry protesters demanded a new “citizen privacy protection policy” from local leaders, but even those leaders—from the mayor on down—hadn’t been told about the test program. “There is nothing worse than believing you are being observed by a third party unnecessarily,” Compton Mayor Aja Brown told the Los Angeles Times.

The next city to try McNutt’s technology was his home base of Dayton. After the L.A. County trial, he improved the system by more than doubling the resolution, to 192 megapixels, increased the archive’s storage capacity, and sped up the image processing to allow analysts to conduct multiple investigations simultaneously. The Dayton police department and the city council were sold on it, and they aired the idea for a contract at a series of public hearings. Joel Pruce, who teaches human rights studies at the University of Dayton, helped organize the opposition. To the objecting residents, it seemed as if it hadn’t occurred to city leaders that the surveillance program might be interpreted as a violation of some vital, unspoken trust. “At the hearings, nobody spoke in favor of it except for the people working for the city,” Pruce recalls. “The black community, in particular, said, ‘We’ve seen this type of thing before. This will target us, and you didn’t even come to us beforehand to see how we’d feel about it.’ ” Dayton’s city leaders dropped their attempts to hire the company after those hearings.

Last year the public radio program Radiolab featured Persistent Surveillance in a segment about the tricky balance between security and privacy. Shortly after that, McNutt got an e-mail on behalf of Texas-based philanthropists Laura and John Arnold. John is a former Enron trader whose hedge fund, Centaurus Advisors, made billions before he retired in 2012. Since then, the Arnolds have funded a variety of hot-button causes, including advocating for public pension rollbacks and charter schools. The Arnolds told McNutt that if he could find a city that would allow the company to fly for several months, they would donate the money to keep the plane in the air. McNutt had met the lieutenant in charge of Baltimore’s ground-based camera system on the trade-show circuit, and they’d become friendly. “We settled in on Baltimore because it was ready, it was willing, and it was just post-Freddie Gray,” McNutt says. The Arnolds donated the money to the Baltimore Community Foundation, a nonprofit that administers donations to a wide range of local civic causes.

In January, McNutt opened the office above the parking garage. The only sign greeting visitors is a piece of copy paper taped to the door that reads “Community Support Program.”

Almost everything about the surveillance program feels hush-hush; the city hasn’t yet acknowledged its existence, and the police department declined requests for interviews about the program. On Aug. 10 the U.S. Department of Justice released a 163-page report that detailed systemic abuses within the Baltimore Police Department, including unlawful stops and the use of excessive force, that disproportionately targeted poor and minority communities and led to “unnecessary, adversarial interactions with community members.” Within a week, civil rights groups filed a complaint with the Federal Communications Commission claiming that the department’s warrantless use of cell phone tower simulators known by the trade name StingRay—an activity the police acknowledged last year in court—violated federal law and targeted minorities. “The problem of racialized surveillance is particularly pronounced in Baltimore,” the complaint stated. The city was already on the defensive, even as the aerial surveillance program was shielded from the public eye.

Around 11 o’clock each morning, a printout is delivered to the Persistent Surveillance office listing all the crimes logged the previous day by Baltimore’s computer-aided dispatch—or CAD—system. The company has hired a former Baltimore cop to act as a liaison between the company and the police force, and he scans the list for cases Persistent Surveillance’s analysts might help solve, highlighting them with an orange marker.

On a Friday in late June, not long after the Goodson decision, six analysts sat at separate workstations inside the office suite. The analysts ranged in age from their early 20s to their late 50s. McNutt brought four full-timers with him from Dayton, and he’s hired several more from a local temp agency, paying $10 to $15 per hour for entry-level trainees.

Terrence Rice, a 25-year-old from Baltimore County, was one of the local hires. It was his third day on the job, and he was still getting the hang of the software. For practice, he worked on a weeks-old case involving the illegal dumping of wood. He stared at an aerial image on the twin large-screen monitors on his desk. He struggled to track a pickup as it proceeded north, squinting to differentiate between the target vehicle and others it passed on a busy roadway. He kept his cursor over the truck as it advanced frame-by-frame. “It reminds me of playing a video game,” he said, his eyes rarely leaving the screen, his back bent as he leaned in close. “And that’s what they told me over the phone. They said that if I was into video games, I might like this work.”

The highlights of the previous day’s CAD list included 13 burglaries and 11 hit-and-runs, and all of the analysts were reviewing archived images instead of tracking the live feed. They were prepared to instantly drop their individual investigations and collaborate, however, if the police called with a report of a high-profile crime, like a homicide or violent assault.

One afternoon in February, every analyst in the office had pitched in when the police responded to the shooting of a 90-year-old woman and her 82-year-old brother, who’d been hit while walking in front of a bus stop on Clifton Avenue in the Western District. In a city where gun violence had lost much of its power to shock, the crime struck a local nerve. TV crews descended on the scene, sensing a big story.

McNutt’s analysts called up the aerial images and began tracking vehicles leaving a busy shopping center across the street from the bus stop, where witnesses had placed the shooter. For about two hours, they mapped the routes of several cars leaving the parking lot, until a detective informed McNutt that the shooter probably had left the area on foot. Rewinding to the moment of the shooting, they quickly pinpointed a person who appeared to scramble away from the scene just after the gunshots.

He was little more than a faint, grainy dot with no identifying characteristics. After he crossed the parking lot, he walked past a Subway sandwich shop and proceeded down a hill behind the shopping center. He cut a corner to cross a vacant lot and ducked between two houses on a quiet residential street. Then he approached what seemed to be a stationary object sitting in the backyard of one of the houses. The analysts toggled their screens to pull up Google Earth’s Street View, and the image—taken months earlier—revealed that the object in the backyard was a car, abandoned on the grass. The suspect stopped briefly at the car before walking a few doors down and into a house.

While he was inside, a vehicle pulled up to the front of the house; a person exited the house, got in the car, and traveled about three miles to Bon Secours Hospital. The analysts tracked him to the emergency room entrance.

Because the analysts had lost so much time while tracking the cars leaving the parking lot, all of the movements they were watching were a few hours old. When the police went to the emergency room, the hospital wouldn’t release any patient information. With no identifying information at hand, the trail seemed cold.

It wasn’t. The police later that day determined that the house the suspect may have entered before he went to the hospital belonged to the girlfriend of Carl Anthony Cooper, a man with a long criminal record. Additionally, they discovered that when the suspect walked away from the shopping center, he’d passed in front of a ground-based security camera. Accessing that footage and reviewing Cooper’s mug shots on file, they found a possible match. The police couldn’t immediately figure out why he went to the hospital; some speculated that his gun might have accidentally gone off when he tucked it into his pants and the bullet grazed his leg.

Two days after the shooting, the Baltimore Police Department posted an archived picture of Cooper on its Facebook page, labeling him the city’s “Public Enemy #1.” It also posted the footage captured by the ground-based security camera, which showed him calmly carrying what appeared to be a bag of food in one hand and his cell phone in the other.

The footage baffled Facebook users, who couldn’t figure out how it implicated Cooper. In the comments section, one wrote that if the man on camera really was the shooter, he surely would have dropped his food and run. Another commenter typed: “Not saying this isn’t the suspect but what is being seen that we, the public, isn’t seeing???” Finally someone posted, “Can a detective chime in and let us know what additional information leads you to believe that he is the suspect?”

No one from the department responded. But Cooper was eventually apprehended by federal marshals in North Carolina and sent to Baltimore, where he remains in custody. The police held a press conference to announce Cooper’s capture, saying he’d face charges for the shootings, including attempted murder and assault. Nothing was said about the surveillance plane.

Even six months after the flights began, some Baltimore police officers still didn’t know exactly how the surveillance program worked. But word was spreading.

One morning in June, three plainclothes officers showed up to see McNutt. They were members of a special unit charged with investigating dirt bike crews—groups of primarily young men who recklessly drive illegal off-road motorcycles through the city. In Baltimore the crews are infamous for aggressively disrupting traffic, ignoring stoplights, and occasionally injuring and killing bystanders. Should a car accidentally collide with group members, other riders have been known to assault the driver before speeding away. City policy prevents police from chasing the bikers, because high-speed pursuits are deemed too risky.

The officers wanted to learn more about the surveillance system, and McNutt led them to a conference room to give them a demonstration. Using two large projection screens, he delivered the sales pitch he’d honed for trade shows. He called up old images from a murder in Juárez and walked the detectives through the tracking process that had led him to a cartel safe house. The spiel lasted about 20 minutes. When it was over, the sergeant in charge of the unit sat in silence for a moment, his arms crossed on his chest.

“I’m sorry,” he said. “But oh my God—this is just overwhelming right here. This is amazing.”

One of the other officers slapped the tabletop. “Let’s go get some dirt bikes, Sarge!”

The sergeant said he expected the dirt bikes to be out in force that Sunday, and some might be entering the city from out of town on Saturday. When one of the officers asked if the plane might be flying that weekend over the west side of the city, where police suspected several of the bikes would be stored, McNutt said he would make sure of it.

That Saturday morning, the Cessna rolled out of a hangar at Martin State Airport, about 10 miles east of downtown Baltimore. The plane was scheduled to make two flights of about five hours each, with a break to refuel. The pilot for the first flight was a man who declined to identify himself but said he was a local firefighter who’d flown for the U.S. Army. David Trexler, Persistent Surveillance’s director of operations, rode along in the back of the plane in case there were glitches with the cameras’ data link to the analysts. Trexler met McNutt when both were in the Air Force, and he’d worked on Angel Fire in Iraq.

The plane took off, and as it rose over the buildings of East Baltimore, the cockpit was noisy. The camera array was bolted onto the floor rails where seats normally would be, and it hung out of a broad opening in the fuselage, where the wind rushed through.

The Cessna leveled out at 8,500 feet, an altitudinal sweet spot between the planes approaching for landing at BWI Airport and those flying higher en route to the Washington airports. Occasionally, the Cessna has had to share airspace with an FBI airplane. Last year, two days after the Freddie Gray riots began, the FBI flew over Baltimore for five days—actions that were discovered when local aviation enthusiasts noticed a plane’s strange flight orbits on a public website that tracks radar data. According to information and footage released this summer by the FBI, its plane wasn’t doing the sort of wide-area motion imaging that Persistent Surveillance does but instead was zooming in on specific targets. McNutt says the FBI doesn’t coordinate its flights with him, and he doesn’t know what the agency is investigating; however, when his plane is in the air at the same time as the FBI’s, air traffic controllers insist that McNutt’s plane remain at a lower altitude than the federal craft.

From 8,500 feet, some of the landmarks below were easy to pick out. Pimlico Race Course to the north. The bold diagonal line of Pennsylvania Avenue. Paddleboats dotting the Inner Harbor close to the shore and sailboats scattered farther out. The just-detectable baseball players taking the field at Camden Yards. The Orioles were playing a doubleheader against the Tampa Bay Rays that afternoon; Trexler commented that the police were concerned Black Lives Matter demonstrators might try to disrupt the games. (Those concerns proved to be unfounded.)

Trexler was able to look at the cameras’ integrated aerial image on the computer on the plane, and he could chat with the analysts on the ground via instant message. About two hours into the flight, while he and the pilot were trading war stories about their respective tours in Iraq and Afghanistan, a message popped up. “Here we go,” Trexler said. The police had called in a shooting on the west side. “They’re probably following a bad guy through the city right now,” he guessed.

The analysts were, in fact, tracking a black SUV that had left the crime scene, and they saw that it had passed in front of three different ground-based police cameras. Those images gave them a clear picture of the suspect’s vehicle. Eventually, however, the vehicle drove beyond the range of the plane’s cameras, out of the city. They lost its trail.

Minutes later, Trexler announced, “Looks like we’ve got a new priority!” An off-duty Baltimore police detective had collided with a dirt bike rider in West Baltimore. When the detective got out of her unmarked car, other riders assaulted her. The crew probably had no idea that the officer was Dawnyell Taylor, the lead homicide detective in the Freddie Gray case.

It was exactly the sort of crime McNutt and the analysts on the ground had been primed to follow. They tracked the motorcycle involved in the accident and followed it for an hour and a half. It passed several ground-based cameras, and the police got good images of the rider and the passenger sitting behind him. Police eventually found the motorcycle, confiscated it, and arrested the man they found sitting on it.

McNutt prides himself on being a student of efficiencies. In the airport residence hotel where he’s been living since January, he keeps a closet of cargo pants and identical black polos—a uniform that saves him the trouble of choosing what to wear each day. His goatee is a recent experiment to see if he can cut grooming time by limiting the surface area he shaves (results are pending; tending to the edge work, he’s discovered, takes time). And last year, when he was strategizing how he might best silence the sort of criticism he’d attracted in Compton and Dayton, McNutt attempted to save time and trouble by directly approaching the ACLU, the organization he figured would be most likely to challenge his system on privacy grounds.

He visited the ACLU’s headquarters in Washington, and in the office of Jay Stanley, a senior policy analyst and privacy expert, McNutt explained why his cameras weren’t a threat. The aerial images couldn’t identify specific people, because the target resolution would be limited to one pixel per person. The analysts zoomed in on specific areas only in response to specific crimes reported to the police. To further ensure that his employees weren’t spying on random people or addresses, everything they did was logged and saved—every keystroke and every address they zoomed in to for a closer look. Vehicles would be tracked only over public roads in areas where people have no expectation of privacy.

McNutt cited a couple of U.S. Supreme Court cases to show Persistent Surveillance wasn’t in the business of wanton intrusion. In 1986 a case from California hinged on whether police had the right to fly over a man’s property to see inside a fence in his backyard and then bust him for growing marijuana. The court backed the police, saying that “any member of the public flying in this airspace who glanced down could have seen everything that these officers observed.” Three years later, the court similarly upheld the arrest of a man busted for growing marijuana in a greenhouse after police in a helicopter spotted the plants through the roof, which was missing two panels.

Stanley heard McNutt out and thanked him for taking the initiative to seek the ACLU’s feedback. But McNutt’s presentation shocked him to the core. As he listened to his visitor describe the type of surveillance the company was capable of doing, Stanley felt as if he were witnessing America’s privacy-vs.-security debate move into uncharted territory.

“My reaction was ‘OK, this is it,’ ” Stanley recalls. “I said to myself, ‘This is where the rubber hits the road. The technology has finally arrived, and Big Brother, which everyone has always talked about, is finally here.’ ”

The meeting took place before McNutt’s work with Baltimore was arranged, and Stanley knew other companies were beginning to work in the same general field. For example, the creators of Constant Hawk, a system that had competed for military adoption with McNutt’s Angel Fire, started a company called Logos Technologies, which provides wide-area motion cameras to organizations that can mount them to aircraft and analyze the images. (“We sell the diamond, and someone else has to mount it in the ring,” company spokesman Erik Schechter says.) This year, Logos landed its first nonmilitary contract, partnering with a Brazilian company called Altave to provide aerial monitoring of the Olympic Games in Rio de Janeiro, via blimplike aerostats floating above the city. As the sector continues to mature, Stanley predicts that more companies will enter the marketplace, and each will try to one-up the other to please law enforcement agencies, creating more flexible—and more intrusive—camera and tracking systems. The Supreme Court decisions that McNutt cited, he says, might not apply. The previous court rulings didn’t take into consideration the constancy of these systems: It’s true that anyone might be able to see into someone’s fenced-in backyard from a passing plane, but was it reasonable to argue that anyone could follow a person’s movements across a city for hours at a time? To Stanley, these are open questions.

One afternoon in June, McNutt watched his analysts dig through archived images of traffic accidents. “I’m tired of these little hit-and-runs,” he said. “Let’s have some shootings!” If it sounded crass, it wasn’t intentional; he meant the statement as a declaration of confidence in his system’s ability to solve the worst crimes, the ones that most gravely endanger public safety. He’s convinced his system can be used to examine police behavior, too, in an objective, dispassionate, and nondiscriminatory way.

McNutt often says that when he stares into the computer monitors, the dots moving along the sidewalks and streets are mere pixels to him. Nothing more. If anyone else wants to project identifying features onto them—sex, race, whatever—that’s their doing, not his. Even as the technology advances and the camera lenses continue to get more powerful, he says, his company will choose to widen its viewing area beyond the current 30 square miles rather than sharpen the image resolution. He’s exasperated when his system is criticized not for what it does, but for its potential. Yet for critics like Stanley, the two can’t be separated. Told that Persistent Surveillance Systems had been operating over a major city for months, Stanley said, “I would expect fierce controversy over this.”

McNutt says he’s sure his system can withstand a public unveiling and that the more people know about what his cameras can—and can’t—do, the fewer worries they’ll have. But the police ultimately decide who and what should be tracked. In a city that’s struggled to convince residents that its police can be trusted, the arguments are now Baltimore’s to make.”


Why Police Spying On Americans Is Everyone’s Problem



“The American tradition of prohibiting military involvement in domestic policing is designed to ensure that we maintain democratic and civilian control over an extraordinarily powerful fighting force. An army designed and equipped to protect Americans should never be turned against Americans except to quell active rebellion.

But just as the drug war fueled increased military participation and militarization in domestic policing, the war on terrorism is driving the militarization of domestic intelligence operations. Unlike the purchases of armored vehicles, military weapons, and SWAT gear, domestic intelligence activities take place mostly in the dark and neither the public nor policymakers really know what is happening.

First, military agencies are conducting domestic intelligence collection against Americans, and providing that information to law enforcement officials. The National Security Agency scoops up domestic telephone calling data, as well as the content of U.S. international communications (“inadvertently” grabbing tens of thousands of purely domestic calls each year in the process). The FBI has direct access to this material, and can use it for general criminal purposes through so-called “back door searches.”

Military officials also collect domestic intelligence for “force protection.” A military unit that was caught spying on anti-war protesters under this authority was disbanded in 2008, but the Defense Intelligence Agency picked up its “offensive counterintelligence” duties and re-established an intelligence database in 2010. National Guard units and civilians working at military agencies have been caught illegally spying on domestic protesters, and more recently, engaging in undercover law enforcement activities in violation of the Posse Comitatus Act, a law prohibiting U.S. military personnel from enforcing criminal laws.

Second, military agencies and personnel participate in formal and informal information-sharing programs at the federal and state levels, including FBI Joint Terrorism Task Forces, state and local law enforcement intelligence fusion centers, and information-sharing networks like the Navy’s Law Enforcement Information Exchange (LInX) and the FBI’s eGuardian program. Though there are legal limits to the type of work military officials can do within these programs and the information they can share, there is little to no oversight conducted to ensure they follow the law.

Third, military intelligence tactics and attitudes rub off on law enforcement personnel assigned to intelligence matters. Most nations outlaw espionage, so foreign intelligence activities have to be carried out through stealth and deception. Avoidance of the law and contempt for the truth can become habitual among intelligence officials, but they simply have no place in a democratic government’s interactions with its own citizens. Yet, throughout the history of domestic intelligence operations in the U.S., law enforcement officials have gone to the military intelligence toolbox in selecting their methods.

In 1976, the Church Committee called the tactics J. Edgar Hoover brought to bear against civil rights and peace activists in the United States “techniques of wartime” better suited for use against agents of hostile foreign nations like the Soviet Union. The Attorney General issued guidelines to ensure future FBI intelligence activities would focus on criminal activity. The Justice Department imposed similar regulations restricting state and local law enforcement criminal intelligence systems.

Restricting domestic intelligence collection to suspected criminal activity is essential to the concepts of limited government and individual liberty, whose foundation lies in what Supreme Court Justice Louis Brandeis called “the right to be let alone.” It also reinforces the rule of law. Methods used in criminal intelligence gathering tend to get exposed in the prosecutions that follow their effective use. Defendants can then challenge their legality, while judges, juries, and the public can weigh whether the government tactics are appropriate. Law enforcers can’t be law breakers.

Unfortunately, the federal government has loosened or ignored law enforcement guidelines restricting intelligence gathering in the years since 9/11, removing or weakening the criminal predicates necessary to ensure a proper focus on illegal activity. The results were predictable: increased police spying on minorities and political dissidents, and increased efforts to escape judicial and public oversight. Federal law enforcement agencies have adopted policies of “parallel construction” to mask the surveillance methods they use to gather evidence, misleading courts and depriving defendants of their right to challenge the constitutionality of those methods. Where evidence of improper FBI surveillance has leaked to the public, the Justice Department invoked “state secrets” to shut down litigation. And at the request of the State Police and FBI, the Virginia legislature exempted its intelligence fusion center from open government laws.

In a recent interview, Erik Dahl, a former Navy intelligence officer and now a professor at the Naval Postgraduate School, explained why someone trained to spy on the Soviet Navy shouldn’t be involved in domestic intelligence gathering.

Trained by the military to spy on hostile foreign nations, Dahl cautioned that “you wouldn’t want to hire me to conduct domestic surveillance.” His statement should serve as a warning to those in Congress who authorized the NSA to play a major role in seizing Americans’ electronic communications (and want to give it more authority over U.S. cyber security), and who sat silent as the FBI transitioned into a domestic intelligence agency.

It should also serve as a warning to federal, state and local law enforcement officials. As these agencies have increasingly claimed a role in intelligence collection, they’ve looked to the military and foreign intelligence agencies for tactics, expertise and personnel, without sufficiently recognizing the important distinctions between domestic and foreign intelligence.

The negative public reaction to recent militarized police tactics and equipment is an indication of the unease Americans will feel with militarized law enforcement intelligence efforts. Americans trust the NSA far less than their local police. But when the local police begin adopting intelligence methods used by the NSA and other foreign intelligence agencies, they will begin to lose that essential public support.

No one disputes that there are violent criminals, spies, and terrorists within our country, and that our law enforcement officials need adequate intelligence tools to catch them. Requiring that police focus on illegal activity doesn’t impair their mission; it puts these threats squarely in their cross-hairs. It is no surprise that Dahl’s research on successfully prevented terrorist attacks shows that traditional law enforcement techniques are far more effective than NSA mass surveillance programs.

As Dahl suggests, we need to have a “much better public discussion about intelligence.”
