Concept Graph & Summary using Claude 3 Opus | ChatGPT-4 | Gemini Adv | Llama 3:
Summary:
1.-Social media platforms allow some groups to easily add data and shape worldviews, while other groups face harassment and death threats.
2.-Many Ethiopian and Eritrean journalists have migrated from Facebook to Twitter due to ineffective moderation of misinformation, fake accounts, and propaganda.
3.-Twitter fails to suspend accounts issuing death threats, while suspending accounts of those receiving the threats. This impacts journalists and marginalized groups.
4.-The Ethical AI team at Google aimed to highlight how social media fails marginalized groups in non-English languages outside the U.S.
5.-One example given: a popular Moroccan YouTube channel published blackmail videos and private documents to harass activists, without consequences from platforms.
6.-Tech companies take insufficient action on misinformation and hate speech, focusing efforts on English while neglecting other languages and regions.
7.-Their platforms and data are used to train large language models, while people in impacted regions cannot freely participate due to harassment.
8.-Work analyzing AI risks and harms must center voices of people most impacted, not just powerful entities. Lived experience provides crucial perspective.
9.-The AI ethics community must recognize how racism and sexism lead to the dismissal of work by Black women and marginalized groups.
10.-Sharing an anthropological photo caption exemplifies problematic white/Western gazes; ethnic scholars discussing their own communities provide better alternatives.
11.-Google followed a playbook in attempting to discredit Dr. Gebru's team after their paper passed peer review. Her firing exposes the power dynamics at play.
12.-Undergrads and student movements push for accountability. Academics and institutions need to demand ethical practices from tech companies they work with.
13.-Papers should not be edited by company lawyers to censor concerns. Conferences should refuse to review such censored work as propaganda.
14.-Academia lacks true academic freedom for most beyond a privileged few. Students, postdocs and junior faculty fear retaliation for speaking out.
15.-Researchers have a responsibility to push back on unethical corporate influence, like physicists did regarding nuclear weapons development work.
16.-Building large language models actively harms people, especially in regions not benefiting from the technology. Perspectives of impacted communities must be heard.
17.-When analyzing AI risks, the first step is listening to the people most impacted by the negative consequences, not shutting them down.
18.-Technology shifts power dynamics. Currently powerful entities ignore concerns of the less powerful. Amplifying marginalized voices is crucial for accountability.
19.-Institutional change is needed so individuals don't have to sacrifice careers to do the right thing. Building worker power enables pushback.
20.-Academia needs to change incentive structures to properly value vital areas like data-related work. New tracks at conferences are positive steps.
21.-Companies lack incentive to change unethical practices if not sufficiently regulated and held accountable legally. Massive lawsuits may be necessary.
22.-Academic disciplines like anthropology, history, sociology have useful frameworks for analyzing data ethics issues but are lower in the "hierarchy of knowledge."
23.-Elevating scholars from these fields into positions of power, such as in government, is important so that the tech industry must heed their input.
24.-Dr. Gebru is excited about proposals for a research institute centering needs of Black women and communities, providing an alternative perspective.
25.-Another project she's enthusiastic about analyzes the evolution of spatial apartheid in South Africa using satellite imagery, despite the long process.
Knowledge Vault built by David Vivancos 2024