
Emerging Tech
Nicol Turner-Lee, a fellow at the Center for Technology Innovation at the Brookings Institution, told the House Financial Services Committee in June that an explosion of data has allowed for the development of more sophisticated algorithms that can make inferences about people, ranging from their identities and demographic attributes to their preferences and likely future behaviors.
But algorithms also compare people and things to related objects, identify patterns and make predictions based on those patterns, and that’s where bad data or flawed learning protocols can create problems. A study last year by the University of California, Berkeley found that automated consumer lending systems charged black and Latino borrowers more to buy and refinance mortgages than white borrowers, resulting in hundreds of millions of dollars of collective overcharging that couldn’t be explained.
“We are seeing people being denied credit due to the factoring of digital composite profiles — which include their web browsing histories, social media profiles and other inferential characteristics — in the factoring of credit models,” Turner-Lee said. “These biases are systematically [less favorable] to individuals within particular groups when there is no relevant difference between those groups [that] justifies those harms.”
Understanding how algorithms work
Technology is not necessarily the main culprit, and the Berkeley study found that in some cases financial technology algorithms can discriminate less than their human counterparts. Instead, most experts say it’s the humans behind those flawed algorithms who are careless and feed data into a system without having a clear understanding of how the algorithm will process and relate that data to groups of people.
However, the speed and scale at which these automated systems operate mean they have the potential to nationally spread discriminatory practices that were once local and confined to certain geographic enclaves.
Although organizations are legally responsible for the ways their products, including algorithms, behave, many encounter what is known as the “black box” problem: situations where the decisions made by a machine-learning algorithm become more opaque to human managers as the technology takes in more data and makes increasingly complex inferences. The challenge has led experts to champion explainability as a key factor for regulators to assess the ethical and legal use of algorithms — essentially being able to demonstrate that an organization has insight into the information its algorithm is using to arrive at its conclusions.
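To make that idea of explainability a bit more concrete, the sketch below shows one common technique, permutation feature importance: shuffling each input in turn and measuring how much a model’s accuracy drops, which indicates how heavily the model leans on that input. The data, feature names and model here are hypothetical placeholders for illustration only, not the systems or methods described in the article or the bill.

```python
# Minimal, hypothetical sketch of an explainability check using
# permutation feature importance (scikit-learn). Illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical applicant features: income, debt ratio, and a location-based
# proxy variable (the kind of feature that can smuggle in bias).
feature_names = ["income", "debt_ratio", "zip_code_proxy"]
X = rng.normal(size=(1000, 3))
# Synthetic approval labels driven mostly by income and debt ratio.
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Shuffle each feature on held-out data and record the drop in accuracy:
# the larger the drop, the more the model depends on that feature.
result = permutation_importance(
    model, X_test, y_test, n_repeats=10, random_state=0
)
for name, score in zip(feature_names, result.importances_mean):
    print(f"{name}: {score:.3f}")
```

A report like this is one simple way an organization could show a regulator which inputs actually drive its algorithm’s decisions, though real audits of the kind the bill contemplates would go well beyond a single importance score.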
The Algorithmic Accountability Act introduced earlier this year by Sens. Cory Booker (D-N.J.) and Ron Wyden (D-Ore.) in the Senate and Rep. Yvette Clarke (D-N.Y.) in the House would give the Federal Trade Commission two years to develop regulations requiring large companies to conduct impact assessments of automated decision system algorithms and treat discrimination resulting from those decisions as “unfair or deceptive acts and practices,” opening those firms up to civil lawsuits. The assessments would look at training data for impacts on accuracy, bias, discrimination, privacy and security and would require companies to correct any discrepancies they find.
In a statement introducing the bill, Booker shared his parents’ experience with housing discrimination at the hands of real estate agents in the 1960s, saying that algorithms have the potential to bring about the same injustice but at scale and out of sight.
“The discrimination that my family faced in 1969 can be significantly harder to detect in 2019: houses that you never know are for sale, job opportunities that never present themselves and financing that you never become aware of — all due to biased algorithms,” he said.
A more proactive approach
Turner-Lee said organizations should do more to understand how their auto-

















































































