Knowledge Vault 6/24 - ICML 2017
How AI Designers will Dictate Our Civic Future
Latanya Sweeney
< Resume Image >

Concept Graph & Resume using Claude 3.5 Sonnet | ChatGPT-4o | Llama 3:

graph LR
    classDef main fill:#f9d9c9, font-weight:bold, font-size:14px
    classDef design fill:#d4f9d4, font-weight:bold, font-size:14px
    classDef issues fill:#d4d4f9, font-weight:bold, font-size:14px
    classDef research fill:#f9f9d4, font-weight:bold, font-size:14px
    classDef responsibility fill:#f9d4f9, font-weight:bold, font-size:14px

    Main[How AI Designers will Dictate Our Civic Future]
    Main --> A[Design Impact]
    Main --> B[Ethical Issues]
    Main --> C[Research and Experiments]
    Main --> D[Designer Responsibility]
    Main --> E[AI and Society]

    A --> A1[AI designers shape society through technology 1]
    A --> A2[Technology design influenced early US laws 2]
    A --> A3["Bed sensors vs. Apple Watch: design values 3"]
    A --> A4["Designers: unelected policymakers in technocracy 4"]
    A --> A5[Technology changes societal values accidentally 12]
    A --> A6[AI shifted from ideals to human mimicry 6]

    B --> B1[Anonymized health data identifiable through linking 5]
    B --> B2[Online ads show racial name bias 7]
    B --> B3[Ad bias potentially violates anti-discrimination laws 8]
    B --> B4[Biased ads violate credit reporting fairness 9]
    B --> B5[Complex US health data flow paths 10]
    B --> B6["Discrimination: complex legal landscape 20"]

    C --> C1["Sweeney's experiments inform policymakers, advocates 13"]
    C --> C2[Harvard class reveals unforeseen tech consequences 14]
    C --> C3[Student projects expose algorithmic issues 15]
    C --> C4[Facebook location leak prompted swift change 16]
    C --> C5[Washington state changed hospital data law 11]
    C --> C6[Election website vulnerabilities paper forthcoming 27]

    D --> D1[Proactive harm consideration minimizes unforeseen issues 17]
    D --> D2[Product managers bridge design and organization 18]
    D --> D3[Incorporate social considerations in foundational research 19]
    D --> D4[Technologists responsible for social considerations 24]
    D --> D5[AI designers wield immense societal influence 29]
    D --> D6[Proactive experimentation crucial for responsible design 30]

    E --> E1[Language models may perpetuate societal biases 21]
    E --> E2[Publicizing vulnerabilities often spurs fixes 22]
    E --> E3[Dataset biases perpetuate in machine learning 23]
    E --> E4["Facial recognition: rapidly shifting acceptance 25"]
    E --> E5[Privacy protection workshops had limited success 26]
    E --> E6[External audits should prompt constructive fixes 28]

    class Main main
    class A,A1,A2,A3,A4,A5,A6 design
    class B,B1,B2,B3,B4,B5,B6 issues
    class C,C1,C2,C3,C4,C5,C6 research
    class D,D1,D2,D3,D4,D5,D6 responsibility
    class E,E1,E2,E3,E4,E5,E6 responsibility

Resume:

1.- Latanya Sweeney discusses how AI designers shape society through technology design decisions, often accidentally and without oversight.

2.- Early U.S. laws on photography and phone-call recording were shaped by technology design choices, such as telephones lacking a mute button.

3.- Sleep Number bed sensors gather intimate data without user control, while the Apple Watch stores data locally, reflecting different design values.

4.- Designers are the new policymakers in a technocracy: although unelected, they ship market-driven solutions with little oversight.

5.- Sweeney's early work showed how purportedly anonymized health data could actually identify individuals by linking to publicly available information.
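To make the linkage attack concrete, here is a minimal Python sketch; the records and field names are made up for illustration (Sweeney's original demonstration joined hospital discharge data with public voter rolls on exactly these quasi-identifiers):

# Minimal sketch of a re-identification linkage attack: joining an
# "anonymized" health table to a public voter roll on quasi-identifiers.
# All records and field names are hypothetical placeholders.

health_records = [  # names removed, sensitive diagnosis kept
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "dob": "1960-01-02", "sex": "M", "diagnosis": "flu"},
]
voter_roll = [  # public record: name plus the same quasi-identifiers
    {"name": "Jane Doe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "John Roe", "zip": "02139", "dob": "1960-01-02", "sex": "M"},
]

QUASI_IDS = ("zip", "dob", "sex")

def key(rec):
    """Build the join key from the quasi-identifier fields."""
    return tuple(rec[f] for f in QUASI_IDS)

voters_by_key = {key(v): v["name"] for v in voter_roll}

for rec in health_records:
    name = voters_by_key.get(key(rec))
    if name:  # a unique match re-identifies the "anonymous" record
        print(f"{name} -> {rec['diagnosis']}")

Sweeney's research showed that the (ZIP, birth date, sex) triple alone uniquely identifies most of the U.S. population, which is why joins like this so often yield a unique match.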

6.- In the late 1990s, AI focused more on computing mathematical ideals than on mimicking human behavior; many of today's AI applications began as research from that era.

7.- Sweeney found that searching her own name generated online ads implying she had an arrest record, and that this happened more often for "black-sounding" than "white-sounding" names.
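A hedged sketch of how such a disparity can be tested for significance (the impression counts below are invented; Sweeney's study measured real ad deliveries for racially associated first names), using a standard two-proportion z-test on the rate of arrest-suggestive ads per name group:

import math

# Hypothetical impression counts, for illustration only.
black_names = {"ads": 1100, "arrest_ads": 660}   # 60% arrest-suggestive
white_names = {"ads": 1000, "arrest_ads": 480}   # 48% arrest-suggestive

def two_proportion_z(a, b):
    """Two-proportion z-test: is the arrest-ad rate the same in both groups?"""
    p1, n1 = a["arrest_ads"] / a["ads"], a["ads"]
    p2, n2 = b["arrest_ads"] / b["ads"], b["ads"]
    p = (a["arrest_ads"] + b["arrest_ads"]) / (n1 + n2)   # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))       # standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p1, p2, z, p_value

p1, p2, z, pv = two_proportion_z(black_names, white_names)
print(f"rates: {p1:.1%} vs {p2:.1%}, z = {z:.2f}, p = {pv:.4f}")

A large z with a tiny p-value indicates the gap is very unlikely to be a sampling artifact, which is what Sweeney's measurements showed at scale.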

8.- This ad delivery disparity could violate U.S. anti-discrimination laws if it led to unequal treatment, e.g. in employment decisions.

9.- A similar ad delivery bias appeared on websites aimed at black audiences, potentially violating credit reporting fairness regulations.

10.- Health data in the U.S. flows in complex ways, with only half the pathways covered by HIPAA medical privacy rules.

11.- Washington state sold hospital data cheaply in a way that could be re-identified, until Sweeney's research prompted a law change.

12.- Technology is changing fundamental societal values and institutions in often accidental ways, as designers focus on products over broader impacts.

13.- Sweeney's experiments demonstrating these issues have helped advocates, regulators, and journalists understand and address these technological shifts.

14.- She started a Harvard class where students conducted impactful experiments revealing unforeseen technology consequences and presented findings to D.C. policymakers.

15.- Example student projects included algorithms to proactively catch online fraud, a study of price discrimination by ZIP-code demographics (a toy audit sketch follows below), and investigations of privacy issues.
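As a toy illustration of the price-discrimination audit idea, the sketch below compares mean quoted prices across ZIP codes grouped by minority-population share; the ZIP codes, shares, and prices are all fabricated, not data from the student projects:

# Toy sketch of a price-discrimination audit by ZIP demographics.
# Every value below is a fabricated placeholder.
observations = [
    {"zip": "60601", "minority_share": 0.7, "price": 24.99},
    {"zip": "60601", "minority_share": 0.7, "price": 25.49},
    {"zip": "83702", "minority_share": 0.1, "price": 19.99},
    {"zip": "83702", "minority_share": 0.1, "price": 20.49},
]

def mean_price(rows):
    return sum(r["price"] for r in rows) / len(rows)

# Split quotes by the demographic makeup of the buyer's ZIP code.
high = [r for r in observations if r["minority_share"] >= 0.5]
low = [r for r in observations if r["minority_share"] < 0.5]
print(f"high-minority ZIPs: ${mean_price(high):.2f}  "
      f"low-minority ZIPs: ${mean_price(low):.2f}")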

16.- Student Aran Khanna exposed how Facebook leaked user locations, prompting a swift fix, but he also faced personal retaliation from Facebook.

17.- Designers can minimize unforeseen harms by proactively considering how things could go wrong and seeking outside perspectives early on.

18.- Product managers bridging design teams and broader organizational concerns are well-positioned to spur proactive consideration of potential downsides.

19.- Academics tend to punt these considerations to commercialization, but incorporating them into foundational research would lead to better outcomes.

20.- Discrimination is legally complex - offering a student discount is allowed, systematically charging more by race is not.

21.- Language models may reflect societal biases in training data - designers must choose whether to try to change or entrench norms.
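One way designers can probe this is an association test over word vectors. Below is a toy WEAT-style sketch (not a method from the talk); the 3-dimensional vectors are made-up placeholders standing in for a real model's embeddings:

import numpy as np

# Toy word vectors: fabricated 3-d placeholders, not real embeddings.
vec = {
    "emily": np.array([0.9, 0.1, 0.0]),
    "lakisha": np.array([0.1, 0.9, 0.0]),
    "pleasant": np.array([0.8, 0.2, 0.1]),
    "unpleasant": np.array([0.2, 0.8, 0.1]),
}

def cos(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def association(word):
    """Positive => 'word' sits closer to 'pleasant' than to 'unpleasant'."""
    return cos(vec[word], vec["pleasant"]) - cos(vec[word], vec["unpleasant"])

for name in ("emily", "lakisha"):
    print(f"{name}: association = {association(name):+.3f}")

A consistently positive score for one group of names and a negative score for another signals exactly the kind of entrenched association a designer must decide whether to correct or leave in place.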

22.- Some worry publicizing tech vulnerabilities enables abuse, but Sweeney found shining a light often spurs responsible fixes that wouldn't happen otherwise.

23.- Datasets used for machine learning may have inherent unknown biases that get perpetuated - ongoing external auditing is needed.
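An external audit can be as simple as comparing a model's error rates across groups. The sketch below uses a synthetic audit set with hypothetical group labels; large gaps in false-positive or false-negative rates flag a bias worth investigating:

from collections import defaultdict

# (group, true_label, predicted_label) triples from a held-out audit set.
# All values are synthetic placeholders.
audit_set = [
    ("A", 1, 1), ("A", 0, 1), ("A", 0, 0), ("A", 1, 0),
    ("B", 1, 1), ("B", 0, 0), ("B", 0, 0), ("B", 1, 1),
]

stats = defaultdict(lambda: {"fp": 0, "neg": 0, "fn": 0, "pos": 0})
for group, y, yhat in audit_set:
    s = stats[group]
    if y == 0:
        s["neg"] += 1
        if yhat == 1:
            s["fp"] += 1   # false positive
    else:
        s["pos"] += 1
        if yhat == 0:
            s["fn"] += 1   # false negative

for group, s in sorted(stats.items()):
    fpr = s["fp"] / s["neg"] if s["neg"] else 0.0
    fnr = s["fn"] / s["pos"] if s["pos"] else 0.0
    print(f"group {group}: FPR={fpr:.0%}  FNR={fnr:.0%}")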

24.- While co-design with users is valuable, the core responsibility lies with technologists themselves to proactively "bake in" social considerations.

25.- Facial recognition's trajectory shifted sharply in 2001: backlash over Super Bowl surveillance was followed by greater acceptance after 9/11, showing how quickly public attitudes can swing.

26.- Carnegie Mellon workshops on building in privacy protections had limited success - the onus is on core designers themselves.

27.- A forthcoming paper shows vulnerabilities in 36 state election websites during the 2016 presidential election.

28.- The top-level message is that while not everything can be anticipated, harms found through external audits should be met with constructive fixes.

29.- AI technology designers have immense power in a global technocracy to shape societal rules and values in lasting ways.

30.- Proactive steps by AI designers to envision and experiment around potential harms are crucial to responsibly wielding this influence for good.

Knowledge Vault built by David Vivancos 2024