How do you make an algorithm accountable?


Discussion Post I

• Going forward, what do you think the ultimate solution to issues of algorithmic bias (e.g., data availability, data collection, training data, black-box models) will look like?

o Elaborate on both technical and regulatory solutions

• What does this tell us about the predictability of social science questions?

o What does this tell us about traditional data collection versus observational data from digital platforms? What is your takeaway on the next steps?

• After reviewing the issues of policing, prisons, and technology, we know that there are millions of people and measurements in this system at all times: police interactions with community members, police observations, video surveillance, media recordings, and more. The system certainly generates big data, but is it useful data? Can algorithms, computers, and the current system interact in an intelligent manner?

Discussion Post II

• How do you make an algorithm accountable? What does it mean to be fair? Is it statistical fairness? Societal fairness? Something else? (A minimal sketch of one statistical fairness notion follows this list.)

• Should solutions derive from the technical side?

o Can we define an "ethical algorithm" that we trust? Can we solve this problem from the engineering side?

o Or does this need to be solved at the societal level?

o E.g., laws, norms, and other societal tools to enact a "fair" system for all? Or is this something that cannot be fixed by one or the other alone?
o How might we imagine laws (e.g., GDPR) and algorithmic solutions (e.g., disallowing gender bias) being combined? What do you think the solution to this conundrum is?
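
To make the "statistical fairness" question concrete, below is a minimal, hypothetical sketch (in Python, with made-up data) of two commonly discussed fairness checks: demographic parity (do groups receive positive decisions at similar rates?) and comparing false-positive rates across groups. The group labels, outcomes, and predictions are illustrative assumptions, not results from any real system.

```python
import numpy as np

# Hypothetical predictions and outcomes for two groups (illustrative data only).
rng = np.random.default_rng(0)
group = rng.choice(["A", "B"], size=1000)   # protected attribute
y_true = rng.integers(0, 2, size=1000)      # actual outcome (0/1)
y_pred = rng.integers(0, 2, size=1000)      # model's decision (0/1)

def rates_by_group(y_true, y_pred, group, g):
    """Positive-prediction rate and false-positive rate for group g."""
    mask = group == g
    positive_rate = y_pred[mask].mean()
    negatives = mask & (y_true == 0)
    false_positive_rate = y_pred[negatives].mean()
    return positive_rate, false_positive_rate

for g in ["A", "B"]:
    ppr, fpr = rates_by_group(y_true, y_pred, group, g)
    print(f"Group {g}: positive-prediction rate={ppr:.2f}, false-positive rate={fpr:.2f}")

# Demographic parity asks whether positive-prediction rates match across groups;
# equalized odds additionally compares error rates such as the false-positive rate.
# The criteria can conflict, which is one reason "fair" has no single definition.
```

The point of the sketch is that each fairness criterion is easy to compute but hard to satisfy simultaneously, which is why purely technical fixes and purely legal fixes (e.g., GDPR-style rules) are usually discussed together.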

• What digital traces do you leave every day?

o Do you have any privacy concerns about these digital traces being sold to third parties?
o What are the social/individual benefits of companies aggregating digital trace data at scale?
o What are potential disadvantages?

• What creates an unequal justice system?

o What are the social consequences of an unequal justice system?
o How can we improve faith that our criminal justice system treats individuals equally?
o How might we use technology to ensure equitable outcomes in the criminal justice system?

• Why do you think incarceration myths are so prevalent?

o How do incarceration myths harm individuals with a criminal record?
o Did the lecture and/or readings dispel any incarceration myths you previously held?
o Which one(s) and why was the myth convincing?
o How could you dissuade someone from believing an incarceration myth?

• Should evidence like tire tracks, fingerprints, bullet marks, and bite marks be allowed into court?

o For society, which is worse: false positives (incarcerating individuals who did not commit a crime) or false negatives (failing to incarcerate individuals who did commit a crime)? Why?

o Should we change our legal system to incorporate more statistical expertise?
o E.g., someone to state when a statistical claim is unlikely (a worked base-rate example follows this list).
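
As a minimal sketch of the kind of claim such a statistical expert might check, the Python snippet below works through a hypothetical base-rate example: even a forensic match with a very low false-positive rate can leave a high chance that the matched person is innocent when the pool of possible suspects is large. All numbers are illustrative assumptions, not figures from any actual case or forensic method.

```python
# Hypothetical numbers (illustrative assumptions only, not from any real case).
population = 1_000_000        # pool of people who could have left the evidence
guilty = 1                    # assume exactly one true source
false_positive_rate = 0.001   # chance an innocent person matches by coincidence
true_positive_rate = 1.0      # assume the true source always matches

# Expected number of matches across the whole pool.
innocent_matches = (population - guilty) * false_positive_rate
guilty_matches = guilty * true_positive_rate

# Probability that a matching person is actually the source
# (Bayes' rule with a uniform prior over the pool).
p_source_given_match = guilty_matches / (guilty_matches + innocent_matches)

print(f"Expected innocent matches: {innocent_matches:.0f}")
print(f"P(source | match) = {p_source_given_match:.4f}")
# With these assumptions roughly 1,000 innocent people also match, so a match
# alone implies only about a 0.1% chance of guilt. Reading the 0.1% false-positive
# rate as the probability of innocence is the classic "prosecutor's fallacy".
```

This kind of arithmetic bears directly on the false-positive versus false-negative question: how much weight a court gives a "match" depends on base rates, not just on how impressive the error rate sounds.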

• Were you surprised by the racial differences in language?

o Do you think there may be other demographic differences in language?

- E.g., do more or less positive words vary by gender, socioeconomic status, etc.? (A toy counting example follows this list.)

o How can these studies and body cameras be used to inform future policing practices?
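
For the language questions above, here is a minimal sketch of one way such differences could be examined: counting how often words from a simple "positive" word list appear in transcripts, split by a demographic attribute. The transcripts, group labels, and word list are all hypothetical placeholders; real studies of body-camera language rely on far richer linguistic models and careful controls.

```python
from collections import defaultdict

# Hypothetical transcript snippets tagged with a demographic attribute.
# All text, group labels, and the word list below are placeholders.
transcripts = [
    ("group_1", "thank you for your patience sir"),
    ("group_1", "please step out of the vehicle"),
    ("group_2", "hands where I can see them now"),
    ("group_2", "thank you have a good evening"),
]

positive_words = {"thank", "please", "sir", "good"}   # toy "respectful" word list

counts = defaultdict(lambda: {"positive": 0, "total": 0})
for group, text in transcripts:
    tokens = text.lower().split()
    counts[group]["positive"] += sum(t in positive_words for t in tokens)
    counts[group]["total"] += len(tokens)

for group, c in counts.items():
    print(f"{group}: positive-word rate = {c['positive'] / c['total']:.2f}")
# A real study would control for context (e.g., reason for the stop) and use
# richer language models before attributing differences to demographics.
```

The design choice to normalize by total words (a rate rather than a raw count) matters: groups with longer interactions would otherwise look artificially more "positive."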

The response should include a reference list and use one-inch margins, Times New Roman 12-point font, double spacing, and APA style of writing and citations.
