The algorithms that calculate who will reoffend discriminate against blacks (and it is not easy to correct them) | Technology

by souhaib
November 26, 2021

For now, algorithms do not dictate sentences in any legal system. But they do help judges in several countries make decisions. The best-known system of this type is Compas, widely used in the United States, which tells judges how likely prisoners are to reoffend when deciding on parole, prison leave and other benefits. Now, an analysis by MIT Technology Review has concluded that this tool discriminates against certain minorities, specifically Black people. The report agrees on this point with previous studies, but it also reaches another disturbing conclusion: that it is practically impossible to correct the system's biases.

Compas, developed by the company Northpointe and applied in New York, California, Florida and Wisconsin, among other jurisdictions, uses machine learning, an artificial intelligence technique in which the algorithm refines itself as it learns from the data it processes. The profiles of the subjects Compas analyzes are built from a questionnaire of 137 questions, ranging from whether they have suffered family abuse or have a criminal record to whether they have ever skipped school or feel discouraged. Some questions are filled in by an official using government records; others must be answered by the person in question. The answers are weighted according to a series of criteria, producing a final score from zero to 10. If it is greater than seven, the risk of recidivism is considered high. A similar system, RisCanvi, operates in Catalonia, although it does not rely on artificial intelligence but on statistical techniques such as multiple regression.
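As the article notes further down, Compas's inner workings are not public, so the short Python sketch below only illustrates the general mechanism described here: weighted questionnaire answers aggregated into a zero-to-10 score with a high-risk cutoff at seven. Every question name and weight is hypothetical, invented for illustration.

# Minimal sketch of a weighted-questionnaire risk score.
# Compas's real questions and weights are proprietary; everything
# below is a hypothetical illustration of the mechanism, not the instrument.

HYPOTHETICAL_WEIGHTS = {
    "criminal_record": 0.9,
    "family_abuse": 0.4,
    "skipped_school": 0.3,
    "feels_discouraged": 0.2,
}
HIGH_RISK_CUTOFF = 7  # scores above 7 out of 10 count as high risk

def risk_score(answers: dict) -> float:
    """Map questionnaire answers (integers 0-3) to a score from 0 to 10."""
    raw = sum(w * answers.get(q, 0) for q, w in HYPOTHETICAL_WEIGHTS.items())
    max_raw = 3 * sum(HYPOTHETICAL_WEIGHTS.values())
    return round(10 * raw / max_raw, 1)

def is_high_risk(answers: dict) -> bool:
    return risk_score(answers) > HIGH_RISK_CUTOFF

subject = {"criminal_record": 3, "skipped_school": 2, "feels_discouraged": 1}
print(risk_score(subject), is_high_risk(subject))  # 6.5 False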

An investigation by ProPublica published in 2016 cast doubt on Compas. The analysis compared the risk assessments of more than 7,000 detainees in one Florida county with whether they actually reoffended afterward. Its conclusions were devastating: the system's accuracy was similar whether it was applied to a white or a Black person, but its errors penalized Black defendants more, who were almost twice as likely as whites to be misclassified as likely reoffenders.

The impact of the ProPublica study still resonates today. It was followed by other work with similar results, making Compas a textbook example of algorithmic bias (machine errors that discriminate against specific population groups). Its accuracy was also evaluated, setting the racial question aside, and it turned out to be no better than human judgment: both the algorithm and court officials are right about 65% of the time (that is, wrong 35% of the time).

The new analysis by MIT Technology Review re-examines the database used by ProPublica (7,200 profiles from Broward County rated by Compas between 2013 and 2014), this time focusing on a random sample of 500 white and Black defendants. The conclusions are the same: “At the default threshold of Compas between 7 and 8 [high risk of recidivism], 16% of black defendants who are not re-arrested have been unnecessarily incarcerated, while the same is true for only 7% of white defendants. There is a similar difference in many jurisdictions in the US, partly due to the country's history, where the police disproportionately targeted minorities,” writes Karen Hao, the author of the analysis.
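The disparity ProPublica and Hao measure is, in essence, a false positive rate computed separately for each group: among defendants who were not re-arrested, what share had nevertheless been labeled high risk? A minimal sketch of that calculation follows, run on invented records rather than the Broward County data.

# Sketch of the per-group false positive rate described above.
# The records are synthetic stand-ins, not the actual Broward County data.

records = [
    # (group, labeled_high_risk, was_rearrested)
    ("black", True,  False),
    ("black", True,  True),
    ("black", False, False),
    ("black", True,  False),
    ("white", False, False),
    ("white", True,  True),
    ("white", False, False),
    ("white", True,  False),
]

def false_positive_rate(group: str) -> float:
    """Share of non-reoffenders in `group` who were wrongly labeled high risk."""
    not_rearrested = [r for r in records if r[0] == group and not r[2]]
    wrongly_flagged = [r for r in not_rearrested if r[1]]
    return len(wrongly_flagged) / len(not_rearrested)

for group in ("black", "white"):
    print(group, f"{false_positive_rate(group):.0%}")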

Can an algorithm be fair?

So is the algorithm racist? Or rather, is whoever programmed it racist? Not necessarily. The predictions Compas makes reflect the data used to build them. If the proportion of Black defendants who end up being arrested again is higher than that of whites, it is to be expected that the arrests the algorithm projects will also be higher for that group. “They will have a higher risk score on average, and a higher percentage of them will be assessed as high risk, both correctly and incorrectly,” Hao adds.

Hao's analysis also points out that tweaking the algorithm does not correct the overrepresentation of Black people among those assessed as high risk of recidivism, unless the system is altered to take race explicitly into account, which in the US (and in many other countries) is illegal. The conclusion: there is no way for these algorithms to act fairly.
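The bind described here is a known statistical trade-off rather than a programming mistake: a score calibrated the same way for both groups (a given score means the same probability of re-arrest regardless of race) will still produce different false positive rates whenever the groups' underlying re-arrest rates differ. The toy calculation below, using invented numbers, illustrates why; equalizing the error rates would require race-specific thresholds, which, as noted above, is illegal.

# Toy illustration (invented numbers) of the trade-off discussed above.
# Assume the score is equally calibrated for both groups: 70% of those
# flagged high risk reoffend, and 30% of those not flagged reoffend,
# regardless of group. With different base rates, the false positive
# rates still come out very different.

PPV = 0.70                   # share of flagged defendants who reoffend
RISK_IF_NOT_FLAGGED = 0.30   # share of unflagged defendants who reoffend

def false_positive_rate(base_rate: float) -> float:
    # Flag rate implied by calibration:
    # base_rate = flag_rate * PPV + (1 - flag_rate) * RISK_IF_NOT_FLAGGED
    flag_rate = (base_rate - RISK_IF_NOT_FLAGGED) / (PPV - RISK_IF_NOT_FLAGGED)
    # Among people who do NOT reoffend, the share wrongly flagged:
    return flag_rate * (1 - PPV) / (1 - base_rate)

for label, base_rate in [("group re-arrested at 60%", 0.60),
                         ("group re-arrested at 40%", 0.40)]:
    print(label, f"-> false positive rate {false_positive_rate(base_rate):.0%}")
# Prints roughly 56% vs 13%: equal calibration, unequal error burden.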

For Antonio Pueyo, professor of Psychology of Violence at the University of Barcelona and director of the research group that developed RisCanvi, that statement does not make much sense. “Both human and algorithmic decisions have biases. The reasons vary: because humans feed the algorithm inadequate data, because the algorithm was not programmed well, or because biased criteria are built into it, such as differential cut-off points for certain groups.”

The very questions used to fill in Compas are anything but innocent. One of them, for example, asks whether the subject has ever been stopped by the police. Almost any Black citizen who grew up in a predominantly Black neighborhood in the United States has had that experience, something far less common among white citizens raised in predominantly white neighborhoods. The same problems tend to surface in so-called predictive policing algorithms, the ones security forces use to forecast where a crime will take place.

And what about how the report Compas produces is actually used? Lorena Jaume-Palasí, an expert in the ethics of algorithms and an advisor to the European Parliament and the Spanish government, among others, believes that analyzing what happens after the algorithm is as important as analyzing the algorithm itself. As she told EL PAÍS, studies show that US judges who use Compas do not always heed the risk of recidivism the system reports. The algorithm may say that a subject is low risk and, because he is Black (and the judge racist), parole may be denied anyway.

Another common criticism of Compas is that the inner workings of the algorithm are unknown: how the answers to each question are weighted, and how the process arrives at each individual's final assessment. This opacity is not unique to algorithms used in justice. Frank Pasquale, a professor at Brooklyn Law School and an expert in artificial intelligence law, addressed the sector's lack of transparency as early as 2015 in his book The Black Box Society.

So what should be done with algorithms applied to justice? Eradicate them? Keep them? Improve them? Pueyo holds the latter view. “Statistical or artificial intelligence techniques are just that, techniques, and therefore they can be improved. Trying to eliminate them from professional practice out of ideological prejudice is not reasonable, since day to day they show that their advantages outweigh their limitations.”

Hao, the author of the MIT analysis, does not share that view. In the article she cites the so-called Blackstone ratio, a reference to the English jurist William Blackstone, who wrote in the 18th century that it is better that ten guilty persons escape than that one innocent suffer. Seen through that lens, no algorithm measures up.
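One way to make that objection concrete (an illustration of the argument, not a calculation that appears in the article) is to treat the Blackstone ratio as an asymmetric cost: if wrongly detaining someone who would not reoffend is ten times as costly as wrongly releasing someone who would, detention only minimizes expected cost when the estimated probability of reoffending exceeds 10/11, or about 91%, well above a cutoff of seven out of ten.

# Decision-theoretic reading of the Blackstone ratio (illustrative only).
COST_FALSE_POSITIVE = 10.0  # an innocent person suffers
COST_FALSE_NEGATIVE = 1.0   # a guilty person escapes

def should_detain(p_reoffend: float) -> bool:
    expected_cost_detain = (1 - p_reoffend) * COST_FALSE_POSITIVE
    expected_cost_release = p_reoffend * COST_FALSE_NEGATIVE
    return expected_cost_detain < expected_cost_release

threshold = COST_FALSE_POSITIVE / (COST_FALSE_POSITIVE + COST_FALSE_NEGATIVE)
print(f"detention threshold: {threshold:.0%}")  # ~91%
print(should_detain(0.75))                      # False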

You can follow EL PAÍS TECNOLOGÍA on Facebook and Twitter or sign up here to receive our weekly newsletter.


