- Who Can Present a Poster?
- Anyone can present a poster, but we strongly encourage undergraduate students, graduate students, and postdoctoral fellows to do so. To present a poster, provide a poster title when you register.
- Poster Blitz
- Create one slide and give a ~30 second trailer talk to encourage researchers to visit your poster.
- Upload your slide HERE by 5pm on Friday, February 23, 2024.
- Poster Parameters
- We recommend a poster that is 36 inches tall and 48 inches wide or smaller. We will provide poster boards and push pins for setup.
Emory University Organizing Committee and CODES Group
- Julianne Chung
- Matthias Chung
- Bree Ettinger
- Manuela Girotti
- Talea Mayo
- Manuela Manetta
- James Nagy
- Elizabeth Newman (GSCS 2024 Organizer)
- Levon Nurbekyan
- Lars Ruthotto
- Alessandro Veneziani
- Yuanzhe Xi
Location Information
- Address
- Math & Science Center, Emory University
- 400 Dowman Drive, Atlanta, GA 30322
- Parking
- Parking at Emory is free on the weekends.
- We recommend parking in the Oxford Road Parking Deck (201 Dowman Drive, Atlanta).
- If the Oxford Road Parking Deck is full, we recommend parking in the Peavine Parking Lot (29 Eagle Row, Atlanta).
Schedule
Time | Event | Math and Science Center (MSC) Location |
---|---|---|
8:00 - 9:00 | Registration, Check-In, Poster Setup, Continental Breakfast | MSC Atrium |
9:00 - 9:15 | Opening Remarks | MSC E208 |
Session 1 | Chair: Kai Fung (Kelvin) Kan | |
9:15 - 9:40 | **Derivative-informed Neural Operators for PDE-Constrained Optimization under Uncertainty** (Peng Chen). In this talk I will present a novel machine learning framework for solving optimization problems governed by large-scale partial differential equations (PDEs) with high-dimensional random parameters. Such optimization problems arise in Bayesian inverse problems for parameter estimation, optimal experimental design for data acquisition, and stochastic optimization for risk-averse optimal control and design. These problems are computationally prohibitive for classical methods, as estimating the statistical measures may require many solutions of an expensive-to-solve PDE at every iteration of a sampling or optimization algorithm. To address this challenge, we will present a class of Derivative-Informed Neural Operators (DINO) with the combined merits of (1) accurately approximating not only the mapping from the random parameters and optimization variables to the PDE state, but also its derivative with respect to the input variables; (2) using a reduced-basis architecture that can be efficiently constructed and is scalable to high-dimensional problems; and (3) requiring only a limited amount of training data to achieve high accuracy for both the PDE solution and the optimization solution. I will present applications in materials science, computational fluid dynamics, and structural mechanics. This talk is based on the paper "Accelerating Bayesian Optimal Experimental Design with Derivative-Informed Neural Operators." | MSC E208 |
9:45 - 10:10 | **Predictive Modeling for Instability Avoidance Under Human-structure Interaction** (Igor Belykh). Modern civil engineering structures are designed for high fidelity using sophisticated linear finite-element and other computational methods backed by stringent design codes. Yet the area of human-structure interaction remains elusive, with most analyses treating pedestrians as merely added mass. Frequent, costly failures continue to be caused by the unexplained dynamic interaction of crowds, notably the lateral instability of pedestrian bridges. Our recent experimentally validated theories have dispelled the myth that these failures are caused by spontaneous synchronization between pedestrians. Rather, it is the nonlinear response of individual humans to the incipient movement that, when scaled up, provides effective negative damping of lateral bridge vibration. In this talk, we will seek to answer the following questions. Given the heterogeneous nature of pedestrians' motion, can we understand how the interactions between individual pedestrian agents contribute to the effective negative damping? How much feedback is involved with the bridge? Most importantly, can such information be used to produce reduced-order models and estimate realistic distributions of their parameters, suitable for making practical predictions of the natural bridge damping necessary to avoid dangerous instabilities? | MSC E208 |
10:15 - 10:30 | Poster Blitz | MSC E208 |
10:30 - 12:00 | Poster Session | MSC Atrium |
12:00 - 1:15 | Lunch (provided) | MSC Atrium |
Session 2 | Chair: Lucas Onisk | |
1:30 - 1:55 | **A Novel Computational Framework for Mean-field Games** (Levon Nurbekyan). Mean-field games (MFG) are a theoretical and computational framework for analyzing games with large numbers of participants, with applications in economics, finance, industrial engineering, material design, and data science. Despite impressive advances in MFG theory and algorithms, several critical technical challenges remain unaddressed. In particular, systematic analysis and algorithms for MFG systems that do not admit a potential (variational) formulation are largely missing. In this talk, I will present a novel computational framework for non-potential MFG systems, paving the way toward general algorithms and analysis techniques for such systems. | MSC E208 |
2:00 - 2:25 | **Repulsive Curves and Surfaces** (Henrik Schumacher). Repulsive energies were originally constructed to simplify knots in R^3. The driving idea was to design energies that blow up to infinity when a time-dependent family of knots develops a self-intersection; downward gradient flows should then simplify a given knot without escaping its knot class. In this talk I will focus on a particular energy, the so-called tangent-point energy, which can be defined for curves as well as for surfaces. After outlining its geometric motivation and some theoretical results (existence, regularity), I will discuss several difficulties that arise when one attempts to optimize this energy numerically, particularly in the surface case. As we will see, a suitable choice of Riemannian metric on the infinite-dimensional space of embeddings can greatly help with the ill-conditioning that arises in high-dimensional discretizations. I will also briefly sketch how techniques like the Barnes-Hut method can reduce the algorithmic complexity enough to allow nontrivial numerical experiments on consumer hardware. Finally (and most importantly), I will present a couple of videos that employ the gradient flows of the tangent-point energy to visualize some stunning facts from the field of topology. | MSC E208 |
2:30 - 3:30 | Discussion | MSC Atrium |
Session 3 | Chair: Deepanshu Verma | |
3:45 - 4:10 | **Mirror Diffusion Model for Constrained Generative Modeling** (Molei Tao). Generative modeling is the task of generating new samples similar to those in a training data set. Denoising diffusion is a recently proposed method for generative modeling that has already become a dominant approach in the field. Nevertheless, if the training data satisfy some constraints, for example due to prior knowledge, new data generated by a generic diffusion model may no longer satisfy those constraints. This talk will report an improvement to diffusion models that regains exact constraint satisfaction, motivated by our recently developed state-of-the-art constrained sampling algorithm, the mirror Langevin algorithm. If time permits, applications to privacy/safety and quantum problems will also be briefly discussed. Joint work with Tianrong Chen, Ruilin Li, Guan-Horng Liu, Evangelos Theodorou, Santosh Vempala, Andre Wibisono, and Yuchen Zhu (alphabetical). | MSC E208 |
4:15 - 4:40 | **Learning-to-Optimize via Implicit Networks** (Samy Wu Fung) | MSC E208 |
4:45 - 5:00 | Closing Remarks | MSC E203 |
5:30 | Dinner (at your own cost) | Suggestion: Double Zero |
Acknowledgements
GSCS 2024 is supported by the Department of Mathematics at Emory University.
Previous Symposia
- 2023 GSCS at Georgia State University
- 2022 GSCS at Georgia Institute of Technology
- 2021 GSCS at University of Georgia
- 2020 GSCS at Emory University
- 2019 GSCS at Georgia Institute of Technology
- 2018 GSCS at Georgia State University
- 2017 GSCS at University of Georgia
- 2016 GSCS at Emory University
- 2015 GSCS at Georgia Institute of Technology
- 2014 GSCS at Kennesaw State University
- 2013 GSCS at Georgia State University
- 2012 GSCS at University of Georgia
- 2011 GSCS at Emory University
- 2010 GSCS at Georgia Institute of Technology
- 2009 GSCS at Emory University