
Setting the Scene: Artificial Intelligence: Ethical Concerns



  1. Setting the Scene: Artificial Intelligence: Ethical Concerns. Prof Barbara Prainsack, Department of Political Science, University of Vienna; Department of Global Health & Social Medicine, King’s College London. Dialogue seminar with churches, religious and philosophical organisations, European Parliament, 19 March 2019

  2. “Rise of the robotic workforce” (Forbes)

  3. Artificial intelligence techniques, e.g.:
• natural language processing
• machine learning
• predictive analytics
• generative adversarial networks
• mechatronics and robotics
All consist of material technologies and human practices. [Marvin C. 1990. When Old Technologies Were New: Thinking About Electric Communication in the Late Nineteenth Century. Oxford University Press.]

  4. EGE Statement on AI, Robotics and “Autonomous” Systems (2018)
A. Substantive values:
1. Human dignity
2. Autonomy
3. Responsibility
4. Justice, equity, solidarity
5. Democracy
6. Rule of law and accountability
7. Security, safety, bodily and mental integrity
8. Data protection and privacy
9. Sustainability
B. Process:
1. Deliberation

  5. What we must not forget:
• The political economy context: iLeviathan
• What makes AI possible: datafication, platforms
• Technologies in the service of values, not the other way round!
[Prainsack B. 2019. Data donation: How to resist the iLeviathan. In: Jenny Krutzinna and Luciano Floridi (eds). The Ethics of Medical Data Donation. Dordrecht: Springer. 9-22.]
[Van Dijck J, Poell T, De Waal M. 2018. The Platform Society: Public Values in a Connective World. Oxford University Press.]

  6. Harm mitigation bodies (HMB)
Legal remedies:
• Accessible only for primary data subjects
• Need to prove culpability and/or causality
Harm mitigation (outside of the legal domain):
• Three main functions: (a) collect information on the types of harm occurring; (b) relay feedback to improve data use; (c) financial support
• Everybody who can report significant harm can appeal to it
• No fault needs to be proven
• Causal link between an action of a data processor/controller and the harm only needs to be made plausible
• Independent governance
[Prainsack B, Buyx A. 2013. A solidarity-based approach to the governance of research biobanks. Medical Law Review 21/1: 71-91.]
[Prainsack B, Buyx A. 2016. Thinking ethical and regulatory frameworks in medicine from the perspective of solidarity on both sides of the Atlantic. Theoretical Medicine and Bioethics 37: 489-501.]
[McMahon, Prainsack B, Buyx A. Big data governance needs more collective agency: The role of harm mitigation in the governance of data-rich projects. Work in progress.]

  7. Thank you for your attention • barbara.prainsack@univie.ac.at
