Sentence by Numbers: The Scary Truth Behind Risk Assessment Algorithms



 

Author: Nikki Williams

Date: May 7, 2018

Link: https://www.digitalethics.org/essays/sentence-numbers-scary-truth-behind-risk-assessment-algorithms

 

First Impression: The essay probably talks about how using algorithms to determine the length of a criminal's sentence could lead to unethical outcomes, such as racially biased decisions.

 

Quote: “System validation, training and quality assurance are all steps toward ensuring fairness in risk assessment technologies. But a better idea might be to strive for transparency and the right to see and challenge data, particularly when it is being used to make decisions that have such a critical impact on the outcome of people’s lives.”

 

Reflection: 

There are cases in which a criminal's calculated sentence is unfair because of bias tied to skin color. For example, a Black defendant might be sentenced to one to two years for robbing a convenience store, while a white defendant is sentenced to less than a year for homicide. There is clear bias in how sentence lengths are determined for people of different skin colors. Ideally, there would be a machine trained to compute a criminal's sentence based only on the case, one that makes unbiased decisions. But the problem with automating sentencing decisions is that the creator of the system may also be racially biased. Since judges are not aware of what happens behind the scenes, a developer could write code that makes the algorithm give shorter sentences to white defendants and longer ones to Black defendants. This is why, before using this kind of technology, we must first check its quality and validate it, so that it makes neutral decisions about sentence length based only on the crime committed. A rough sketch of what such a check might look like follows.
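To make "check the quality and validate" a little more concrete, here is a minimal sketch in Python of one kind of fairness audit: comparing average recommended sentences across demographic groups for comparable cases. The records, group labels, and the disparity threshold below are all hypothetical and invented for illustration; they are not the essay's actual method or data.

from statistics import mean

# Hypothetical audit: compare average recommended sentence lengths across
# demographic groups. All records and numbers below are invented.
records = [
    {"group": "A", "recommended_months": 24},
    {"group": "A", "recommended_months": 30},
    {"group": "B", "recommended_months": 10},
    {"group": "B", "recommended_months": 12},
]

def mean_by_group(rows):
    # Collect recommended sentence lengths per group, then average them.
    groups = {}
    for row in rows:
        groups.setdefault(row["group"], []).append(row["recommended_months"])
    return {g: mean(vals) for g, vals in groups.items()}

averages = mean_by_group(records)
print(averages)  # e.g. {'A': 27, 'B': 11}

# Flag the tool for human review if comparable cases diverge too much
# between groups (the 6-month threshold here is arbitrary).
if max(averages.values()) - min(averages.values()) > 6:
    print("Warning: large disparity between groups; audit before deployment.")

A real validation process would also control for offense type and criminal history before comparing groups; the point here is only that disparities can be measured and flagged before a tool ever reaches a courtroom.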

 

5 Things I Learned

  1. When sentencing by numbers, there is a higher chance that the machine's assessment of the sentence will be biased.

  2. We cannot fully rely on risk assessments.

  3. Risk assessment tools are being used in courtrooms to determine the length and severity of a sentence depending on the case.

  4. Most of the time, these tools are used in different states without proper calibration, which is unfair to defendants and promotes bias (a simple calibration check is sketched after this list).

  5. If the tool is created with true gender and race neutrality, it can be a voice of reason.
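On point 4, "proper calibration" is something that can be checked empirically: within a given risk-score band, the predicted risk should roughly match the observed outcome rate for every group. Below is a minimal Python sketch of that idea; all cases, scores, and outcomes are invented purely for illustration and are not from the essay.

from statistics import mean

# Hypothetical calibration check: for one score band (predicted risk 0.80),
# compare the observed reoffense rate per group. All data is invented.
cases = [
    # (group, predicted_risk, reoffended)
    ("A", 0.8, True), ("A", 0.8, True), ("A", 0.8, False),
    ("B", 0.8, False), ("B", 0.8, False), ("B", 0.8, True),
]

def observed_rate(rows, group, score):
    # Fraction of defendants in this group and score band who reoffended.
    outcomes = [r[2] for r in rows if r[0] == group and r[1] == score]
    return mean(1 if o else 0 for o in outcomes)

for group in ("A", "B"):
    rate = observed_rate(cases, group, 0.8)
    print(f"group {group}: predicted 0.80, observed {rate:.2f}")

# A well-calibrated tool would show similar observed rates for both groups;
# a large gap means the same score implies different real-world risk
# depending on the group, which is exactly the unfairness to check for.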

5 Integrative Questions

  1. How can we be sure that these risk assessment tools are trained to make neutral decisions rather than biased ones?

  2. In which states are these being used, and how do they validate it?

  3. Will the use of risk assessment tools be a more ethical way of determining the length of a criminal's sentence?

  4. Since gender identities are diverse and continue to evolve, how are risk assessment tools updated to account for them?

  5. Will these tools replace the traditional way of determining the number of years a person will be sentenced, or are they just tools to help judges make more ethical and moral decisions?
