Concept Graph & Resume using Claude 3 Opus | Chat GPT4 | Gemini Adv | Llama 3:
Resume:
1.-The talk discusses how an operational model of technological affordances can clarify connections between technical features and social effects of machine learning systems.
2.-The affordance model, called the Mechanisms and Conditions Framework, can be used to identify, observe, reimagine, rebuild or dismantle ML systems as needed.
3.-It can also be a tool for intentional ground-up design of ML systems from a place of critical analysis.
4.-Technologies are social - they reflect and shape norms, values, patterns and structures. Design choices in ML are social choices.
5.-Machine learning systems learn, develop and change over time. Their effects can snowball, self-reinforce and codify the social worlds they operate in.
6.-ML systems project an air of objectivity, which makes their problems hard to pinpoint and undo. They amplify inequalities by learning from data drawn from unequal societies.
7.-Machine learning defaults reinforce the status quo and existing margins, but defaults are not inevitable. Scrutiny through systematic frameworks can enable interventions.
8.-The Mechanisms and Conditions Framework of affordance specifies how technical features and social outcomes interact. It was introduced in the speaker's book.
9.-Affordances are how a technology's features affect its uses and functions, both direct utilities and social effects. The term balances tech's shaping effects and human agency.
10.-Two problems limit the analytic value of affordances: 1) binary application (a technology either affords or it doesn't); 2) presumption of a universal subject (the same effects for all users).
11.-The framework corrects these flaws by articulating the mechanisms (how tech affords) and conditions (for whom, under what circumstances).
12.-The mechanisms specify how technologies request, demand, encourage, discourage, refuse, and allow social action. The conditions are perception, dexterity, and cultural/institutional legitimacy.
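The talk presents the mechanisms and conditions verbally; as a rough sketch (not from the talk), an auditor could record findings against this taxonomy in a small data structure. All names and the example entry's wording are illustrative, not part of the framework itself.

```python
from dataclasses import dataclass
from enum import Enum

class Mechanism(Enum):
    # How a technical feature bears on social action, per the framework
    REQUEST = "request"
    DEMAND = "demand"
    ENCOURAGE = "encourage"
    DISCOURAGE = "discourage"
    REFUSE = "refuse"
    ALLOW = "allow"

class Condition(Enum):
    # For whom, and under what circumstances, the mechanism operates
    PERCEPTION = "perception"
    DEXTERITY = "dexterity"
    LEGITIMACY = "cultural/institutional legitimacy"

@dataclass
class AffordanceFinding:
    feature: str            # technical feature under audit
    action: str             # social action it bears on
    mechanism: Mechanism
    conditions: list        # which conditions modulate it, and for whom
    notes: str = ""

# Illustrative entry loosely based on the Amazon fulfillment example
finding = AffordanceFinding(
    feature="per-task time-rate tracking",
    action="bodily/temporal compliance by workers",
    mechanism=Mechanism.DEMAND,
    conditions=[Condition.LEGITIMACY],
    notes="workers lack legitimacy to alter the metrics",
)
print(finding.mechanism.value)  # demand
```

A checklist like this keeps an audit honest about both halves of the framework: every claimed effect must name its mechanism and the conditions under which it holds, and for whom.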
13.-The framework assumes no universal design is possible, refocusing on a "pluriverse." It makes systems observable and accountable by connecting design and social effects.
14.-Analysts can use it to audit how systems open/foreclose opportunities and reflect power. Designers can use it to reimagine systems encouraging equity, inclusivity, etc.
15.-Example 1: Amazon fulfillment centers use ML to govern workers in a de-skilling, denigrating way. An affordance analysis clarifies how and for whom.
16.-The systems request the discretization of tasks, which allows easy monitoring. Neither management nor workers have the legitimacy to alter the metrics-focused systems that demand bodily and temporal compliance.
17.-Workers resist in small ways at a cost. The affordances still encourage compliance and control by default. Reimagining could support worker dignity instead.
18.-Specific changes: Depersonalize tracking to items vs individuals. Remove/recalibrate demanding time rates. Require intervention to change schedules. Collect different data, use it more respectfully.
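Point 18's first proposal, depersonalizing tracking from individuals to items, can be concretized in a toy sketch. The scan-log format and field names here are hypothetical, not from the talk.

```python
from collections import Counter

# Hypothetical scan log: (worker_id, item_id) pairs from a fulfillment floor
scans = [("w1", "itemA"), ("w1", "itemB"), ("w2", "itemA"), ("w1", "itemA")]

# Individual-centric metric (what the talk critiques): scans per worker
per_worker = Counter(worker for worker, _ in scans)

# Item-centric alternative: throughput per item, worker identity discarded
per_item = Counter(item for _, item in scans)

print(per_item)  # Counter({'itemA': 3, 'itemB': 1})
```

The design choice is that the item-centric aggregate still supports inventory and throughput planning while refusing (in the framework's vocabulary) individualized surveillance.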
19.-These worker-centric ideas come from labor movements, concretized through affordances. Imagining alternatives sets standards for debate and resistance that corporations must contend with.
20.-Example 2: Massively parallel sequencing (MPS), a policing tool that infers ancestry, gender, appearance from crime scene DNA. It constructs racialized suspect pools.
21.-Australia has no regulations on this use of biometric data; police actively lobbied against regulation. MPS reproduces statistical racial profiling through machine learning.
22.-MPS demands the integration of character and physiology, encouraging assumed links between appearance and criminality. It discourages community input. The accused lack the legitimacy to question it.
23.-MPS needs dismantling, not reconfiguring. Critical affordance analysis highlights when this is the case - not all ML systems should exist.
24.-Two caveats: 1) Analysis and design are linked - scrutiny should aim to remake or dismantle. ML objects are dynamic, always unfinished.
25.-2) Standpoint affects affordance analyses. Computational fields' lack of diversity is a "privilege hazard" - makers can't anticipate harms to people unlike them.
26.-Marginalized perspectives are uniquely valuable for recognizing realities that privilege obscures. Affordance analysis requires a critical lens and is best wielded by diverse analysts.
27.-Design and auditing should bring in affected communities, approach humbly, and be ready to respond when systems inevitably learn to reproduce social inequities.
28.-The talk introduces the Mechanisms and Conditions framework of affordances as a tool to critically analyze and intentionally design machine learning systems.
29.-It argues ML is socially embedded and its design has high stakes. Defaults matter but aren't inevitable. Systematic frameworks can enable productive interventions.
30.-Using affordances to reimagine systems requires centering marginalized voices and accepting the need to sometimes dismantle irredeemable applications. Responsible ML needs diverse critical lenses.
Knowledge Vault built by David Vivancos 2024