Georgia Scientific Computing Symposium 2024 (GSCS 2024)

February 24, 2024
Emory University
Atlanta, USA


The Georgia Scientific Computing Symposium is a forum for professors, postdocs, graduate students and other researchers in Georgia to meet in an informal setting, to exchange ideas, and to highlight local scientific computing research. The symposium has been held every year since 2009 and is open to the entire research community.
This year, the GSCS will be held on Saturday, February 24, 2024 in the Math and Science Center at Emory University (400 Dowman Drive, Atlanta, GA 30322).

THANKS FOR COMING!

Plenary Speakers

Peng Chen
Georgia Institute of Technology

Igor Belykh
Georgia State University

Levon Nurbekyan
Emory University

Henrik Schumacher
University of Georgia, Chemnitz University of Technology

Molei Tao
Georgia Institute of Technology

Samy Wu Fung
Colorado School of Mines


Who Can Present a Poster?
Anyone can present a poster, but we strongly encourage undergraduate students, graduate students, and postdoctoral fellows to present. To present a poster, provide a poster title when you register.
Poster Blitz
Create one slide and give a ~30-second trailer talk to encourage researchers to visit your poster.
Upload your slide HERE by 5pm on Friday, February 23, 2024.
Poster Parameters
We recommend a poster that is 36 inches tall and 48 inches wide, or smaller. We will provide poster boards and push pins for setup.

Emory University Organizing Committee and CODES Group


Location Information

Address
Math & Science Center, Emory University
400 Dowman Drive, Atlanta, GA 30322
Parking
Parking at Emory is free on the weekends.
We recommend parking in the Oxford Road Parking Deck (201 Dowman Drive, Atlanta).
If the Oxford Road Parking Deck is full, we recommend parking in the Peavine Parking Lot (29 Eagle Row, Atlanta).

Schedule

All rooms listed below are in the Math and Science Center (MSC).
8:00 - 9:00   Registration, Check-In, Poster Setup, Continental Breakfast (MSC Atrium)
9:00 - 9:15   Opening Remarks (MSC E208)
Session 1 (Chair: Kai Fung (Kelvin) Kan)
9:15 - 9:40   Derivative-informed Neural Operators for PDE-Constrained Optimization under Uncertainty (MSC E208)

Peng Chen

In this talk, I will present a novel machine learning framework for solving optimization problems governed by large-scale partial differential equations (PDEs) with high-dimensional random parameters. Such optimization problems can be found in Bayesian inverse problems for parameter estimation, optimal experimental design for data acquisition, and stochastic optimization for risk-averse optimal control and design. These problems are computationally prohibitive using classical methods, as the estimation of statistical measures may require many solutions of an expensive-to-solve PDE at every iteration of a sampling or optimization algorithm. To address this challenge, we will present a class of Derivative-Informed Neural Operators (DINO) with the combined merits of (1) being able to accurately approximate not only the mapping from the inputs of random parameters and optimization variables to the PDE state, but also its derivative with respect to the input variables, (2) using a reduced basis architecture that can be efficiently constructed and is scalable to high-dimensional problems, and (3) requiring only a limited number of training data to achieve high accuracy for both the PDE solution and the optimization solution. I will present some applications in materials science, computational fluid dynamics, and structural mechanics.

This talk is based on the following papers:

Derivative-Informed Neural Operator: An Efficient Framework for High-dimensional Parametric Derivative Learning

Efficient PDE-Constrained Optimization under High-dimensional Uncertainty Using Derivative-Informed Neural Operators

Accelerating Bayesian Optimal Experimental Design with Derivative-Informed Neural Operators

9:45 - 10:10   Predictive Modeling for Instability Avoidance Under Human-structure Interaction (MSC E208)

Igor Belykh

Modern civil engineering structures are designed with high fidelity using sophisticated linear finite-element and other computational methods backed up by stringent design codes. Yet, the area of human-structure interaction remains elusive, with most analyses treating pedestrians as merely added mass. There remain frequent, costly failures caused by the unexplained dynamic interaction of crowds, notably in the lateral instability of pedestrian bridges. Our recent experimentally validated theories have dispelled the myth that these are caused by spontaneous synchronization between pedestrians. Rather, it is the nonlinear response of individual humans to the incipient movement that, when scaled up, provides effective negative damping to lateral bridge vibration. In this talk, we will seek to answer the following important questions. Given the heterogeneous nature of pedestrians’ motion, can we understand how the interactions between individual pedestrian agents contribute to the effective negative damping? How much feedback is involved with the bridge? Most importantly, is there any way to use such information to produce reduced-order models and estimate realistic distributions of their parameters, suitable for making practical predictions for the natural bridge damping necessary to avoid dangerous instabilities?

10:15 - 10:30   Poster Blitz (MSC E208)
10:30 - 12:00   Poster Session (MSC Atrium)
12:00 - 1:15   Lunch, provided (MSC Atrium)
Session 2 (Chair: Lucas Onisk)
1:30 - 1:55   A Novel Computational Framework for Mean-field Games (MSC E208)

Levon Nurbekyan

Mean-field games (MFG) are a theoretical and computational framework for analyzing games with large numbers of participants, with applications in economics, finance, industrial engineering, material design, and data science. Despite the impressive advances in MFG theory and algorithms, several critical technical challenges remain unaddressed. In particular, systematic analysis and algorithms for MFG systems that do not admit a potential (variational) formulation are largely missing. In this talk, I will present a novel computational framework for non-potential MFG systems, paving the way towards general algorithms and analysis techniques for such systems.

2:00 - 2:25   Repulsive Curves and Surfaces (MSC E208)

Henrik Schumacher

Repulsive energies were originally constructed to simplify knots in R^3. The driving idea was to design energies that blow up to infinity when a time-dependent family of knots develops a self-intersection. Thus, downward gradient flows should simplify a given knot without escaping its knot class.

In this talk I will focus on a particular energy, the so-called tangent-point energy. It can be defined for curves as well as for surfaces. After outlining its geometric motivation and some of the theoretical results (existence, regularity), I will discuss several hardships that one has to face if one attempts to numerically optimize this energy, in particular in the surface case. As we will see, a suitable choice of Riemannian metric on the infinite-dimensional space of embeddings can greatly help to deal with the ill-conditioning that arises in high-dimensional discretizations. I will also sketch briefly how techniques like the Barnes-Hut method can help to reduce the algorithmic complexity to an extent that allows for running nontrivial numerical experiments on consumer hardware.

Finally (and most importantly), I will present a couple of videos that employ the gradient flows of the tangent-point energy to visualize some stunning facts from the field of topology.

2:30 - 3:30   Discussion (MSC Atrium)
Session 3 (Chair: Deepanshu Verma)
3:45 - 4:10   Mirror Diffusion Model for Constrained Generative Modeling (MSC E208)

Molei Tao

Generative modeling is the task of generating more samples that are similar to samples in a training data set. Denoising diffusion is a recently proposed method for generative modeling, but it has already become a dominant approach in the field. Nevertheless, if the training data actually satisfy some constraints, for example due to some prior knowledge, new data generated by a generic diffusion model may no longer satisfy those constraints. This talk will report an improvement of diffusion models that regains exact constraint satisfaction, motivated by our recently developed state-of-the-art constrained sampling algorithm known as the mirror Langevin algorithm. If time permits, applications to privacy/safety and quantum problems will also be briefly discussed.

Joint work with Tianrong Chen, Ruilin Li, Guan-Horng Liu, Evangelos Theodorou, Santosh Vempala, Andre Wibisono, and Yuchen Zhu (alphabetical).

4:15 - 4:40   Learning-to-Optimize via Implicit Networks (MSC E208)

Samy Wu Fung

Learning-to-Optimize (or L2O) is an emerging approach where machine learning is used to learn an optimization algorithm. It automates the design of an optimization method based on its performance on a set of training problems. Learning optimization algorithms in an end-to-end fashion can be challenging due to their asymptotic nature. This talk discusses a class of network architectures, called implicit networks, whose outputs are defined by a fixed point (or optimality) condition, which makes them naturally suited for L2O. We will cover how to design and train these networks efficiently.

4:45 - 5:00   Closing Remarks (MSC E203)
5:30   Dinner, at your own cost (suggestion: Double Zero)

Acknowledgements

The GSCS 2024 is supported by the Department of Mathematics at Emory University.