Portland, Oregon. (BigStock Photo)

PORTLAND, ORE. — Lawmakers in Portland unanimously passed a privacy resolution on Wednesday that they hope will lead to more ethical and equitable city data gathering and use.

Portland, like many cities around the U.S. and world, is grappling with how to craft policies governing technologies such as facial recognition and AI-enhanced video surveillance.

The council’s resolution comes on the heels of San Francisco’s historic ban on facial recognition tech last month. Portland, meanwhile, is planning a new data management system, recently installed data-harvesting traffic signal sensors at dangerous intersections, and has partnered with controversial Alphabet-backed urban tech firm Sidewalk Labs, which provides data showing how people move around the city.

The new privacy and information protection principles, intended to guide Portland’s tech and data decisions, are among the few U.S. city policies that incorporate themes of equity, transparency, accountability and non-discrimination.

Notably, the resolution calls for Portland to devise approaches for evaluating the impacts of automated decision systems employing artificial intelligence or algorithmic models. Use of such tools, which can decide which public schools students are assigned to, where prison inmates are housed, or which worksites are visited by safety inspectors, has become a hot topic in city government and tech ethics circles.

But there are few forms of AI used by governments that have sparked as much contention as facial recognition. These AI-based biometric identification technologies are used by some cities for law enforcement and surveillance, and in public schools for security purposes.

“City use of facial recognition technologies is likely going to be an early policy conversation given recent media attention, community concerns, and related policies moving forward in other cities,” Hector Dominguez, open data coordinator at the City of Portland’s SmartCityPDX group, told GeekWire.

RealNetworks’ SAFR arms schools with facial recognition technology. (SAFR Photo)

The city will broaden the scope of the policy discussion to include other technologies that allow for real-time analysis of video, such as footage from surveillance cameras that could hypothetically be employed to build a video dossier of citizens.

It’s unclear whether facial recognition is currently used by any city agencies in Portland; no such use has been made public.

Facial recognition tech developed by companies including Amazon, Microsoft, IBM and others is highly scrutinized in part because some systems have been shown to be faulty, particularly when it comes to detecting dark-skinned faces accurately. Civil liberties advocates also fear that widespread use of the technology by government agencies or in commercial settings could turn cities into invasive surveillance states with little to no privacy protections, as has been happening in China.

Portland TV station KGW reported earlier this month that Jackson’s, a convenience store in Southeast Portland, uses facial recognition technology. Nearby in Washington County, the sheriff’s department uses Amazon’s Rekognition software to identify crime suspects.

Privacy concerns aside, Kris Henning, a criminology and criminal justice professor at Portland State University, suggested that facial recognition technologies used by law enforcement would have limited effect on reducing crime.

“Facial recognition is the shiny object that a lot of people are excited about in law enforcement,” he said. “People can argue left and right about the merits of this, when in reality it’s not going to influence crime prevention.”

A focus on equitable data use

In another indication that the city is following a trend among some of the world’s most forward-thinking municipalities, Portland’s principles call for data equity and non-discriminatory data use. The resolution requires Portland to ensure non-discriminatory data protections and to perform due diligence to understand the unintended consequences of data use. It also demands that the city “prioritize the needs of marginalized communities regarding Data and Information management, which must be considered when designing or implementing programs, services, and policies.”

Kelsey Finch, senior privacy counsel at the Future of Privacy Forum. (Future of Privacy Forum Photo)

“Portland is among the first cities in the U.S. to pass such principles, and its focus on human rights and the ethical use of data is especially progressive,” said Kelsey Finch, senior privacy counsel at the Future of Privacy Forum, which submitted a letter of support for the Portland resolution. “In formalizing its commitment to designing equitable and inclusive automated decision systems, the City of Portland is setting an important example for other cities and communities.”

Some say the goal of informing decision-making with data can be viewed as part of a broader mission of equity for communities that have historically been treated unfairly as a result of institutionalized racism, redlining, over-policing and neighborhood neglect.

“This Portland privacy resolution is saying it’s not just about protecting and securing data. It’s this recognition that in the management of data they’re taking this equitable lens through the use of the data,” said Natalie Evans Harris, head of strategic initiatives at Brighthive, a data management firm that helps clients such as Goodwill devise equitable approaches to data use and governance. “It’s also about making sure data is available to the people who need it when they need it to make decisions.”

For instance, she explained that cities might incorporate equity goals in data management policy by including data on how homelessness affects indigenous, rural or African-American populations, rather than simply looking at its effects on the general population.

Ultimately, ensuring that automated decision systems are evaluated with equity, fairness, transparency, and accountability in mind could pose a significant challenge in terms of resources and expertise, said Deirdre Mulligan, co-director of the Berkeley Center for Law and Technology. “I think the most important issue facing cities who want to do this work is that most of them lack professional staff with the expertise in privacy, bias, etc. in these algorithmic systems,” she said.

The Portland privacy resolution is part of the city’s overall data governance strategy and was a collaboration among the mayor’s office and several city agencies. A new privacy implementation workgroup, led by the Bureau of Planning and Sustainability and the Office of Equity and Human Rights, has formed to put the resolution into practice. That group, set to hold its first meeting June 25, also includes Smart City PDX, the City Attorney’s Office, Office for Community Technology, Portland Bureau of Transportation, the City Auditor’s Office, Office of Management and Finance, Portland Police Bureau and Office of Community and Civic Life.

Portland’s principles were influenced by similar efforts in Seattle and Oakland, along with the European Union’s new General Data Protection Regulation, said Dominguez. Those devising the Portland resolution leaned on guidance from privacy experts in Seattle and Oakland, as well as input from Portland’s community during public forums.
