Knowledge Vault 2/69 - ICLR 2014-2023
Bruce Bassett, Richard Armstrong, Michelle Lochner, Nadeem Oozeer ICLR 2020 - Workshop - Fundamental Science in the era of AI
<Summary Image>

Concept Graph & Summary using Claude 3 Opus | ChatGPT-4 | Gemini Advanced | Llama 3:

graph LR
    classDef iclr fill:#f9d4d4, font-weight:bold, font-size:14px;
    classDef deep_learning fill:#d4f9d4, font-weight:bold, font-size:14px;
    classDef astronomy fill:#d4d4f9, font-weight:bold, font-size:14px;
    classDef chemistry fill:#f9f9d4, font-weight:bold, font-size:14px;
    classDef discussion fill:#f9d4f9, font-weight:bold, font-size:14px;
    A[Bruce Bassett et al ICLR 2020] --> B[ICLR virtual workshop on fundamental science in AI 1]
    A --> C[Deep learning + physics for cosmology uncertainty 2]
    C --> D[OOD detection in astronomy images 3]
    A --> E[Open molecular photoswitch dataset for ML 4]
    A --> F[Panel: statistics, ML merging? Bayesian science, AI complacency 5]
    A --> G[Invariant CNN classifies periodic variable stars 6]
    A --> H[Transformation importance scores interpret trained ML 7]
    A --> I[Panel: physics unknowns, AI automation, explainability, bias 8]
    A --> J[Dense net for radio galaxies, transfer learning 9]
    A --> K[Gaussian process emulator for cosmology inference 10]
    A --> L[Wasserstein GAN ensemble estimates X-ray polarization, absorption 11]
    A --> M[Bayesian NNs constrain reionization from 21cm 12]
    A --> N[Variational autoencoders compress particle physics data 13]
    A --> O[MOLSTO: active learning for chemistry datasets 14]
    O --> P[Active learning strategies compared on molecules 15]
    A --> Q[Bayesian deep learning in time-domain astronomy 16]
    A --> R[Symmetry-embedded NNs solve stellar fluid dynamics 17]
    A --> S[Simulation-based inference, uncertainties in particle physics 18]
    A --> T[Uncertainty quantification methods compared in deep learning 19]
    A --> U[Learn-as-you-go emulator quantifies cosmological simulation errors 20]
    A --> V[Panel: AI for science social good via automation 21]
    A --> W[Wasserstein GAN detects subtle galaxy merger anomalies 22]
    A --> X[Astronomical transients & personalized medicine similarities 23]
    X --> Y[dmdt transforms light curves to images for CNNs 24]
    A --> Z[Challenges of high-dimensional uncertainty quantification 25]
    A --> AA[ML can make science open, collaborative, reviewed 26]
    A --> AB[Science can advance simulation-based inference with ML 27]
    A --> AC[Lung cancer CT scan atlas for data science 28]
    AC --> AD[Breast MRI neovascularization predicts cancer progression 29]
    A --> AE[COVID-19 data science: mutations, reopening criteria 30]
    class A,B iclr;
    class C,D,G,J,K,L,M,Q,R,U,W,X,Y,Z deep_learning;
    class E,N,O,P,AB chemistry;
    class F,H,I,S,T,V,AA discussion;
    class AC,AD,AE astronomy;


1.-The workshop on Fundamental Science in the Era of AI at ICLR 2020 was held virtually.

2.-François Lanusse discussed combining deep learning and physical modeling for uncertainty quantification in cosmology using the Rubin Observatory Legacy Survey of Space and Time (LSST).

3.-Lorenzo Zanisi presented work on applying out-of-distribution detection to astronomical images using likelihood ratios from Bayesian neural networks.
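The talk's exact models are not reproduced here; as a rough illustration of the likelihood-ratio idea (the Gaussian "models" and all numbers below are hypothetical stand-ins), the OOD score is the log-density under the model fitted to the data minus the log-density under a deliberately smoothed background model:

```python
import math

def gaussian_logpdf(x, mu, sigma):
    """Log-density of a 1-D Gaussian."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def ood_score(x, full_model, background_model):
    """Likelihood-ratio OOD score: log p_full(x) - log p_bg(x).
    Low scores suggest the sample is out of distribution."""
    return gaussian_logpdf(x, *full_model) - gaussian_logpdf(x, *background_model)

# Toy stand-ins: "full" fits the in-distribution data, "background" is broader.
full = (0.0, 1.0)        # (mean, std) fitted to training data (hypothetical)
background = (0.0, 5.0)  # smoothed background model (hypothetical)

in_dist = ood_score(0.1, full, background)
far_out = ood_score(8.0, full, background)
```

An in-distribution point keeps a high score, while an outlier's score collapses, which is the signal used to flag it.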

4.-Ryan-Rhys Griffiths introduced an open-source dataset of molecular photoswitches to enable machine learning design of promising new molecules.

5.-The panel discussed whether the fields of statistics and machine learning will merge, Bayesian techniques in science, and the danger of AI making scientists complacent.

6.-Charles An presented a phase-shift-invariant temporal convolutional network for classifying periodic variable star light curves.
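One simple way to obtain phase-shift invariance (a minimal sketch, not the architecture from the talk) is to max-pool a filter response over every cyclic shift of the folded light curve, so the output cannot depend on where the phase fold starts:

```python
def cyclic_shifts(seq):
    """All cyclic shifts of a sequence (all phases of a folded light curve)."""
    return [seq[i:] + seq[:i] for i in range(len(seq))]

def correlate(seq, kernel):
    """Valid cross-correlation of seq with a short kernel (a 'temporal conv')."""
    k = len(kernel)
    return [sum(s * w for s, w in zip(seq[i:i + k], kernel))
            for i in range(len(seq) - k + 1)]

def invariant_feature(seq, kernel):
    """Max-pool the filter response over every cyclic shift, so the
    result does not depend on the phase of the input."""
    return max(max(correlate(s, kernel)) for s in cyclic_shifts(seq))

# Toy folded light curve and filter (made-up numbers).
curve = [0.0, 0.2, 1.0, 0.3, 0.1, 0.0]
shifted = curve[2:] + curve[:2]
kernel = [0.5, 1.0, 0.5]
```

Because the set of cyclic shifts of `curve` and of `shifted` is identical, the pooled feature is exactly the same for both.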

7.-John Lin discussed computing transformation importance scores to understand what a fully trained machine learning model has learned.

8.-The panel debated unknown unknowns in physics, whether AI can automate scientific discovery, model explainability, and bias in data.

9.-Ashwin Samudre presented a dense net architecture for radio galaxy classification using transfer learning and cyclical learning rates with limited data.

10.-Sooraj Bhat built a Gaussian process emulator to accelerate cosmological parameter inference for weak lensing using extreme data compression.
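The emulator in the talk targets compressed weak-lensing likelihoods; as a minimal stand-in for the core idea, here is a zero-mean Gaussian-process regressor with an RBF kernel in plain Python (the training inputs and the sin() target are made up for illustration):

```python
import math

def rbf(x1, x2, length=1.0):
    """Squared-exponential (RBF) kernel."""
    return math.exp(-(x1 - x2)**2 / (2 * length**2))

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def gp_predict(xs, ys, x_star, noise=1e-6):
    """Posterior mean of a zero-mean GP: mu(x*) = k(x*, X) (K + noise*I)^-1 y."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, x) * a for x, a in zip(xs, alpha))

# Pretend these are a few expensive likelihood evaluations (hypothetical).
xs = [0.0, 0.5, 1.0]
ys = [math.sin(x) for x in xs]
```

Once trained on a handful of expensive simulator runs, `gp_predict` replaces the simulator inside the inference loop, which is what makes the emulation cheap.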

11.-Nikita Kosolapov used a Wasserstein GAN ensemble to estimate X-ray photon polarization direction and reconstruct absorption locations in gas pixel detectors.

12.-Héctor Javier Hortúa used Bayesian neural networks and variational inference to constrain reionization history parameters from 21cm signals.

13.-Konstantin Vayser proposed using variational autoencoders to compress particle physics data to overcome detector readout constraints and enable faster simulations.

14.-Austin Tripp introduced MOLSTO, an active learning library that provides chemical property labels from simulations to enable non-experts to use chemistry datasets.

15.-Ryan-Rhys Griffiths compared different active learning strategies on molecules and found the optimal representation depends on the specific task.
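As a minimal sketch of one strategy of the kind compared in the talk (greedy uncertainty sampling; the molecule names and variances below are invented), the learner repeatedly requests a label for the pool point it is least certain about:

```python
def uncertainty_sampling(pool, predict_var, budget):
    """Greedy active learning: repeatedly label the pool point whose
    predicted variance is highest, up to the labelling budget."""
    labelled = []
    remaining = list(pool)
    for _ in range(budget):
        pick = max(remaining, key=predict_var)  # most uncertain candidate
        remaining.remove(pick)
        labelled.append(pick)                   # send to the oracle/simulator
    return labelled

# Hypothetical pool of molecule IDs with toy predicted variances.
variances = {"mol_a": 0.9, "mol_b": 0.1, "mol_c": 0.5}
chosen = uncertainty_sampling(variances, variances.get, budget=2)
# chosen == ["mol_a", "mol_c"]
```

In a real comparison the `predict_var` oracle would come from the surrogate model built on each molecular representation, which is where the task dependence the talk reports enters.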

16.-Anais Möller discussed applications of Bayesian deep learning in time domain astronomy for out-of-distribution detection and uncertainty quantification.

17.-Pablo Villanueva-Domingo embedded symmetries into neural networks for solving stellar fluid dynamics differential equations in supervised and unsupervised settings.

18.-Benjamin Nachman talked about simulation-based inference, quantifying uncertainties, and learning from unlabeled collider events in particle physics with deep learning.

19.-Sean Chevalier compared different uncertainty quantification methods in deep learning and found more training data variation is needed for good aleatoric uncertainties.
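The talk's full comparison is not reproduced here, but the standard ensemble-based split of predictive uncertainty into epistemic and aleatoric parts, which such comparisons build on, can be sketched with toy numbers:

```python
from statistics import mean, pvariance

def decompose_uncertainty(ensemble_preds):
    """Split an ensemble's predictive uncertainty:
    epistemic  = variance of the member means (model disagreement),
    aleatoric  = average of the member variances (data noise).
    ensemble_preds: list of (mean, variance) pairs, one per member."""
    means = [m for m, _ in ensemble_preds]
    variances = [v for _, v in ensemble_preds]
    return pvariance(means), mean(variances)

# Three hypothetical ensemble members predicting (mean, variance).
preds = [(1.0, 0.2), (1.2, 0.3), (0.8, 0.1)]
epistemic, aleatoric = decompose_uncertainty(preds)
```

The talk's observation maps onto the aleatoric term: if the training data do not vary enough, the mean member variance systematically underestimates the true noise.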

20.-Nathan Musoke presented a learn-as-you-go emulator framework that quantifies the errors introduced in the posterior when statistically emulating cosmological forward simulations.

21.-The panel examined whether AI can help fundamental science achieve greater social good through automation, accessibility, and cross-disciplinary collaboration.

22.-Marc Huertas-Company used a Wasserstein GAN to efficiently detect subtle galaxy merger anomalies in large astronomical surveys where classic methods fail.

23.-Pavlos Protopapas discussed the similarities between astronomical transients and personalized medicine in terms of the need for rapid follow-up.

24.-Pavlos also presented the dmdt structure function transformation of irregular light curves into images for classification with convolutional neural networks.
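A toy version of the dmdt mapping (bin edges chosen arbitrarily here, not those from the talk) histograms the time difference and magnitude difference of every observation pair, turning an irregular light curve into a fixed-size image a CNN can ingest:

```python
from itertools import combinations
from bisect import bisect_right

def dmdt_image(times, mags, dt_bins, dm_bins):
    """Map an irregularly sampled light curve to a 2-D histogram of
    (time difference, magnitude difference) over all observation pairs.
    Bin edges are open-ended on both sides."""
    grid = [[0] * (len(dm_bins) + 1) for _ in range(len(dt_bins) + 1)]
    for (t1, m1), (t2, m2) in combinations(zip(times, mags), 2):
        i = bisect_right(dt_bins, abs(t2 - t1))  # dt bin index
        j = bisect_right(dm_bins, m2 - m1)       # dm bin index
        grid[i][j] += 1
    return grid

# Hypothetical 4-point light curve (times in days, magnitudes).
times = [0.0, 1.0, 3.0, 7.0]
mags = [15.0, 15.2, 14.9, 15.1]
img = dmdt_image(times, mags, dt_bins=[2.0, 5.0], dm_bins=[-0.1, 0.1])
```

Every pair lands in exactly one cell, so the image always contains n(n-1)/2 counts regardless of how unevenly the curve was sampled.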

25.-Anais Möller mentioned the challenges of uncertainty quantification in high dimensions, especially for systematic uncertainties that go beyond the training data.

26.-Sebastian said machine learning can show other scientific fields how to make publications completely open, share code, and use open review.

27.-Benjamin Nachman suggested fundamental science has potential to advance simulation-based inference using high-fidelity simulators to generate infinite labeled training data.

28.-Ashish Mahabal presented an imaging data atlas to establish a lexicon and ontology for lung cancer screening CT scans.

29.-Ashish also examined neovascularization in breast MRIs to potentially predict cancer progression and separate patient-specific vascular structure.

30.-Ashish concluded by discussing data science projects related to COVID-19, such as analyzing virus genome mutations and determining safe reopening criteria.

Knowledge Vault built by David Vivancos 2024