Artificial Intelligence in the Public Sector
Case Study 6: Public Sector Data Analytics

Artificial intelligence can be used in a variety of ways that benefit humanity and society. As AI grows more advanced, computers are able to assist law enforcement agencies but, in the process, run the risk of violating personal and civil liberties. When a new technology is introduced, as in the case study (Princeton University, n.d.), officials often keep it secret to ensure the trial process runs smoothly; this violates the democratic rights of the community. The mayor implements the software as quickly as possible to reduce crime rates, but in doing so she steps beyond the power of her office by not allowing the community to be involved in the decision. When the mayor sees crime rising again, she decides to take action and changes the parameters of how the software works, turning the system into a form of predictive policing.
When a new technology such as AI is introduced into a community, there are ethical concerns about violations of privacy rights and basic democratic principles. When Mayor Hobbes decides to deploy the AI software in New Leviathan (3), she does so without informing the public or giving them a chance to discuss the implications of using such a system. She does not disclose that it will draw on their personal data and securely held records, and she does not give a town council or any other committee the opportunity to put the implementation to a vote. Hobbes has taken it upon herself to make a decision for the community that involves personal data and information collected by the government, a direct violation of the democratic principles of the United States. Although Hobbes's stated reason is to ensure that the public sees the software in a proper light rather than as predictive policing, her reasoning is unjust.
Hobbes implements the software as quickly as possible in the hope of reducing crime rates quickly (3). She readily accepts the reasoning given to her by the contracting firm and asks no additional questions (3). Hobbes oversteps the power of her office by not allowing the community to be involved in the decision and by not appointing a committee to investigate the benefits and consequences New Leviathan will face by adopting the new technology.
When Hobbes sees crime begin to rise just before her re-election, she decides to take action and changes the parameters of how the AI software works (4). This action shows how much temptation comes with access to an AI system of this kind. Her original ideals may have been just, but in the end she turns the software into the very thing she promised it would not be used for.
It is clear there are many good uses for AI, but there needs to be a form of national governance that dictates how such software may be used. Strict guidelines need to be developed that require public awareness, evaluative committees to assess each deployment, and complete transparency in dealings between the private and public sectors when these technologies are developed. Strict rules with realistic and firm punishments must be put in place to deter those who might use the software in ways it was not developed for. Any use of AI software must follow guidelines that consider the rights of the people affected, whether directly or simply through the collection and use of their data.
Sources
Princeton University. "Public Sector Data Analytics: Case Study 6." aiethics.princeton.edu/wp-content/uploads/sites/587/2018/10/Princeton-AI-Ethics-Case-Study-6.pdf