Knowledge Vault 4/14 - AI For Good 2018
Storytellers
Aimee Van Wynsberghe
Link to AI4Good Video (view on YouTube)

Concept Graph & Resume using Claude 3 Opus | ChatGPT-4o | Llama 3:

```mermaid
graph LR
classDef ethics fill:#d4f9d4, font-weight:bold, font-size:14px
classDef tech fill:#f9d4f9, font-weight:bold, font-size:14px
classDef robots fill:#f9e1d9, font-weight:bold, font-size:14px
classDef vision fill:#d9e6f2, font-weight:bold, font-size:14px
A[Storytellers] --> B[Ethics drives robotics/AI innovation. 1]
A --> C[Ethics studies good life, right/wrong. 2]
A --> D[Ethical language: wellbeing, dignity, health, environment. 3]
B --> E[Technology seen as neutral or value-laden. 4]
E --> F[Gendered designs show technology reflects norms. 5]
F --> G[Technology alters good life, facilitates it. 6]
G --> H[Robotics/AI will change good life. 7]
C --> I[Robot ethics: issues by stakeholder. 8]
I --> J[Designers key issues: dignity, wellbeing, capabilities. 9]
J --> K[Employment issues: enhancing, not replacing humans. 10]
K --> L[User concerns: robot abuse affects human interactions. 11]
L --> M[Debate on robots/AI as ethical agents. 12]
M --> N[Against granting rights, obligations to robots. 13]
D --> O[Ethics guides design from ideal vision. 14]
O --> P[Humanitarian good life: helping more people. 15]
P --> Q[AI-powered drones: wellbeing, dignity preservation. 16]
Q --> R[Consider drones stress, informed consent. 17]
O --> S[Vision: minimizing stereotypes in bureaucratic systems. 18]
S --> T[AI detects/predicts biases in decision-making. 19]
O --> U[AI supports environmental goals: pollution monitoring. 20]
O --> V[Ethics as development guide, not afterthought. 21]
V --> W[Address ethics first, then technological capabilities. 22]
V --> X[Use AI to check biases, express ethics. 23]
X --> Y[Ethics brings values to life via tech. 24]
Y --> Z[Start with good life, define future. 25]
Z --> AA[Define ethical values for AI 2030. 26]
class B,C,D ethics
class E,F,G,H tech
class I,J,K,L,M,N robots
class O,P,Q,R,S,T,U,V,W,X,Y,Z,AA vision
```

Resume:

1.- Ethics can be a driver of innovation in robotics and AI, not just a hindrance that slows progress.

2.- Ethics is the study of the good life - what it is and how to achieve it. It distinguishes good/bad, right/wrong.

3.- Ethical language uses values like wellbeing, dignity, health, and environmental protection that constitute or help achieve the good life.

4.- One view sees technology as neutral; another sees it as inherently value-laden, with designers' assumptions and biases shaping the end product.

5.- Examples like gendered razor blade designs show how technologies can create and reflect norms, values and meanings.

6.- Technology changes what elements make up the good life (e.g. global social connections) or helps us achieve the good life (e.g. telecommunication).

7.- Robotics and AI will similarly change what the good life looks like or how we achieve it.

8.- Robot ethics clusters issues by stakeholder - designers/developers/policymakers, users, and robots/AI systems as ethical agents.

9.- For designers, the key issues are making robots that preserve dignity, enhance wellbeing, have the capabilities needed to do so, and rely on well-trained AI.

10.- Employment issues arise - making robots that enhance rather than replace human workers. Media rhetoric also shapes perceptions of robotics/AI.

11.- For users, questions emerged around the Boston Dynamics robot dog abuse videos and how mistreating robots could affect human-human interactions.

12.- Views differ on whether robots/AI should be ethical agents that make moral decisions and what theory should guide their programming.

13.- Alternatively, some argue against this as it could require granting rights and obligations to sufficiently advanced robots/AI.

14.- Ethics can be part of the design process, starting with a vision of the ideal 2030 society and using ethics to get there.

15.- In a humanitarian context, the good life means resources going further to help more people in isolated places.

16.- AI-powered drones used by humanitarian NGOs fit this vision, but must be designed and used to preserve wellbeing and dignity.

17.- Designers must consider the potential for drones to cause physiological or psychological stress. Informed consent is also an important ethical value to uphold.

18.- Another future vision minimizes stereotyping and discrimination in bureaucratic systems like courts, healthcare, banking and policing.

19.- AI could detect biases or predict when biases may occur in decision-making by court officials and judges (a minimal illustrative sketch of such a bias check follows this list).

20.- A third scenario envisions robotics/AI supporting environmental goals by monitoring pollution, cleaning up e-waste, etc.

21.- The key is making ethics the starting point that guides the technological development, not an afterthought.

22.- This contrasts with starting from technological capabilities and only later addressing ethical issues that arise.

23.- The goal is using AI to check our biases rather than continue them, and having robotics/AI express ethical values.

24.- Ethics becomes a tool for envisioning how to bring important values to life through technology.

25.- We should consider the good life as the starting point and use ethical language to define the desired future.

26.- The question is what ethical values we want to be the defining features of AI in 2030.
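As a loose illustration of item 19, the sketch below shows the kind of simple check a bias-auditing tool might run over decision records. It is a minimal Python example with hypothetical data and function names, not something described in the talk: it computes per-group favourable-decision rates and reports the demographic-parity gap between groups.

```python
# Minimal, illustrative bias check on hypothetical decision records.
# Not from the talk: just one simple signal an auditing tool might surface.
from collections import defaultdict

def favourable_rates(records):
    """records: iterable of (group, decision) pairs, where decision is 1 (favourable) or 0."""
    totals, favourable = defaultdict(int), defaultdict(int)
    for group, decision in records:
        totals[group] += 1
        favourable[group] += decision
    return {group: favourable[group] / totals[group] for group in totals}

def demographic_parity_gap(records):
    """Largest difference in favourable-decision rates between any two groups."""
    rates = favourable_rates(records)
    return max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    # Hypothetical records: (group label, decision outcome)
    sample = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]
    print(favourable_rates(sample))        # roughly {'A': 0.67, 'B': 0.33}
    print(demographic_parity_gap(sample))  # roughly 0.33
```

A large gap does not by itself prove discrimination, but it is the sort of quantitative flag that could prompt a human reviewer to look more closely, in line with using AI to check rather than continue our biases.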

Knowledge Vault built by David Vivancos 2024