Upcoming Seminars

Title: Off-diagonal Weyl Laws for Commuting Selfadjoint Operators
Seminar: Analysis and Differential Geometry
Speaker: Suresh Eswarathasan of Dalhousie University
Contact: David Borthwick, dborthw@emory.edu
Date: 2024-02-23 at 10:00AM
Venue: MSC W301
Abstract:
The Weyl Law concerns the asymptotics of the eigenvalue counting function for, amongst other operators, Laplacians on compact manifolds. In this talk, we focus on the joint spectrum for commuting selfadjoint operators on compact manifolds (a special case being the joint spectrum for the Laplacian and the generator for rotations on a surface of revolution). In joint work with Blake Keeler (CRM Montréal and AARMS Halifax), we prove a corresponding "off-diagonal" Weyl asymptotic in this setting. Such an asymptotic describes the covariance function for certain types of "random waves" and gives a complementary eigenvalue counting result to that of Colin de Verdière from 1979.
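For orientation, the classical on-diagonal Weyl law, of which the result above is an off-diagonal, joint-spectrum analogue, states that for the Laplacian on a compact $n$-dimensional Riemannian manifold $M$ with eigenvalues $\lambda_j$,
\[
N(\lambda) \;=\; \#\{\, j : \lambda_j \le \lambda \,\} \;\sim\; \frac{\omega_n \, \mathrm{vol}(M)}{(2\pi)^n} \, \lambda^{n/2}, \qquad \lambda \to \infty,
\]
where $\omega_n$ denotes the volume of the unit ball in $\mathbb{R}^n$.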
Title: Ramsey and density results for approximate arithmetic progressions
Seminar: Combinatorics
Speaker: Marcelo Sales of UC Irvine
Contact: Cosmin Pohoata, cosmin.pohoata@emory.edu
Date: 2024-02-23 at 4:00PM
Venue: MSC W201
Abstract:
Let $AP_k=\{a,a+d,\ldots,a+(k-1)d\}$ be an arithmetic progression of length $k$. For a given $\epsilon>0$, we call a set $AP_k(\epsilon)=\{x_0,\ldots,x_{k-1}\}$ an $\epsilon$-approximate arithmetic progression of length $k$ if, for some $a$ and $d$, the inequality $|x_i-(a+id)|<\epsilon d$ holds for all $i\in\{0,1,\ldots,k-1\}$. In this talk we discuss quantitative aspects of van der Waerden- and Szemerédi-type results in which arithmetic progressions are replaced by their $\epsilon$-approximations. Joint work with Vojtěch Rödl.
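To make the definition concrete, the following small Python check tests the $\epsilon$-approximate condition for given $a$ and $d$ (an illustrative sketch; the function name and sample values are made up, not part of the abstract).

def is_approx_ap(xs, a, d, eps):
    # eps-approximate AP condition: |x_i - (a + i*d)| < eps*d for every index i
    return all(abs(x - (a + i * d)) < eps * d for i, x in enumerate(xs))

# Example: a perturbation of the progression 0, 10, 20, 30, 40 with eps = 0.2,
# so each term may deviate from its ideal position by less than eps*d = 2.
xs = [0.5, 9.0, 21.0, 30.5, 41.0]
print(is_approx_ap(xs, a=0, d=10, eps=0.2))  # prints True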
Title: Bounds on the Torsion Subgroups of Second Cohomology
Seminar: Algebra
Speaker: Hyuk Jun Kweon of University of Georgia
Contact: Andrew Kobin, ajkobin@emory.edu
Date: 2024-02-27 at 4:00PM
Venue: MSC W301
Abstract:
Let $X \hookrightarrow \mathbb{P}^r$ be a smooth projective variety defined by homogeneous polynomials of degree $\leq d$ over an algebraically closed field $k$. Let $\mathbf{Pic}\, X$ be the Picard scheme of $X$, and let $\mathbf{Pic}^0\, X$ be the identity component of $\mathbf{Pic}\, X$. The N\'eron--Severi group scheme of $X$ is defined by $\mathbf{NS}\, X = (\mathbf{Pic}\, X)/(\mathbf{Pic}^0\, X)_{\mathrm{red}}$, and the N\'eron--Severi group of $X$ is defined by $\mathrm{NS}\, X = (\mathbf{NS}\, X)(k)$. We give explicit upper bounds on the orders of the finite group $(\mathrm{NS}\, X)_{\mathrm{tor}}$ and the finite group scheme $(\mathbf{NS}\, X)_{\mathrm{tor}}$ in terms of $d$ and $r$. As a corollary, we obtain upper bounds on the order of the torsion subgroup of the second cohomology group of $X$ and on the order of the finite group $\pi_1^{\mathrm{\acute{e}t}}(X,x_0)^{\mathrm{ab}}_{\mathrm{tor}}$. We also show that $(\mathrm{NS}\, X)_{\mathrm{tor}}$ is generated by $(\deg X -1)(\deg X - 2)$ elements in various situations.
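For context, the following are standard background facts rather than claims from the abstract: at the level of groups the objects above fit into the exact sequence
\[
0 \longrightarrow \mathrm{Pic}^0(X) \longrightarrow \mathrm{Pic}(X) \longrightarrow \mathrm{NS}(X) \longrightarrow 0,
\]
and over $k = \mathbb{C}$ one has $(\mathrm{NS}\, X)_{\mathrm{tor}} \cong H^2(X,\mathbb{Z})_{\mathrm{tor}}$, which is how a bound on the torsion of the N\'eron--Severi group transfers to the torsion of second cohomology.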
Title: Nonlinear scientific computing in machine learning and applications
Seminar: Numerical Analysis and Scientific Computing
Speaker: Wenrui Hao of Pennsylvania State University
Contact: Yuanzhe Xi, yuanzhe.xi@emory.edu
Date: 2024-02-29 at 1:00PM
Venue: MSC E300
Abstract:
Machine learning has seen remarkable success in various fields such as image classification, speech recognition, and medical diagnosis. However, this success has also raised intriguing mathematical questions about how to optimize training algorithms more efficiently and how to apply machine-learning techniques to complex mathematical problems. In this talk, I will discuss the neural network model from a nonlinear scientific computing perspective and present recent work on developing a homotopy training algorithm to train neural networks layer-by-layer and node-by-node. I will also showcase the use of neural network discretization for solving nonlinear partial differential equations. Finally, I will demonstrate how machine learning can be used to learn a mathematical model from clinical data in cases where the pathophysiology of a disease, such as Alzheimer's, is not well understood.
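As a rough illustration of the homotopy idea, here is a minimal NumPy sketch that deforms an easy regression target into a hard one through a homotopy parameter and warm-starts gradient descent at each step; it is a generic continuation scheme for a toy problem, not the layer-by-layer, node-by-node algorithm of the talk, and all model sizes and hyperparameters are made up.

import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 200)[:, None]
y_easy = X.ravel()                        # easy target: linear
y_hard = np.sin(4 * np.pi * X).ravel()    # hard target: oscillatory

W1 = rng.normal(size=(1, 32)); b1 = np.zeros(32)   # toy one-hidden-layer network
W2 = rng.normal(size=32);      b2 = 0.0

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H @ W2 + b2, H

lr = 0.05
for t in np.linspace(0.0, 1.0, 11):          # homotopy parameter sweep
    y_t = (1.0 - t) * y_easy + t * y_hard    # deformed target
    for _ in range(2000):                    # warm-started gradient descent
        pred, H = forward(X)
        err = pred - y_t
        gW2 = H.T @ err / len(X)             # gradients of mean squared error
        gb2 = err.mean()
        dH = np.outer(err, W2) * (1.0 - H**2)
        gW1 = X.T @ dH / len(X)
        gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

print("MSE on hard target:", np.mean((forward(X)[0] - y_hard) ** 2))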
Title: Structure-conforming Operator Learning via Transformers
Seminar: Numerical Analysis and Scientific Computing
Speaker: Shuhao Cao of University of Missouri-Kansas City
Contact: Yuanzhe Xi, yuanzhe.xi@emory.edu
Date: 2024-03-21 at 10:00AM
Venue: MSC W201
Abstract:
GPT, Stable Diffusion, AlphaFold 2, and other state-of-the-art deep learning models all use a neural architecture called the "Transformer". Since the publication of Google's "Attention Is All You Need" paper, the Transformer has become the ubiquitous architecture in deep learning. At the heart and soul of the Transformer is the "attention mechanism". In this talk, we shall dissect the attention mechanism through the lens of traditional numerical methods, such as Galerkin methods and hierarchical matrix decomposition. We will report numerical results on designing attention-based neural networks according to the structure of problems in traditional scientific computing, such as inverse problems for the Neumann-to-Dirichlet operator (EIT) or multiscale elliptic problems. Progress within different communities will be surveyed briefly to address some open problems on the mathematical properties of the attention mechanism in Transformers.
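For reference, the sketch below implements standard scaled dot-product attention in NumPy, together with a softmax-free "Galerkin-style" variant in the spirit of reading attention as a learnable basis expansion; the variant is an illustrative simplification, not the exact architecture discussed in the talk.

import numpy as np

def softmax(A, axis=-1):
    A = A - A.max(axis=axis, keepdims=True)   # numerical stabilization
    E = np.exp(A)
    return E / E.sum(axis=axis, keepdims=True)

def layer_norm(X, eps=1e-6):
    mu = X.mean(axis=-1, keepdims=True)
    sd = X.std(axis=-1, keepdims=True)
    return (X - mu) / (sd + eps)

def attention(Q, K, V):
    # classic: softmax(Q K^T / sqrt(d)) V, cost O(n^2 d) in sequence length n
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d)) @ V

def galerkin_style_attention(Q, K, V):
    # softmax-free: Q (K^T V) / n with normalized K and V, cost O(n d^2)
    n = Q.shape[0]
    return Q @ (layer_norm(K).T @ layer_norm(V)) / n

n, d = 256, 32
rng = np.random.default_rng(1)
Q, K, V = [rng.normal(size=(n, d)) for _ in range(3)]
print(attention(Q, K, V).shape, galerkin_style_attention(Q, K, V).shape)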
Title: Triangular modular curves
Seminar: Algebra
Speaker: Juanita Duque-Rosero of Boston University
Contact: Andrew Kobin, ajkobin@emory.edu
Date: 2024-03-26 at 4:00PM
Venue: MSC W301
Abstract:
Triangular modular curves are a generalization of modular curves and arise from quotients of the upper half-plane by congruence subgroups of hyperbolic triangle groups. These curves naturally parameterize hypergeometric abelian varieties, making them interesting arithmetic objects. In this talk, we focus on Borel-kind triangular modular curves of prime level. We show that there are finitely many such curves of any given genus and present an algorithm to enumerate them. This is joint work with John Voight.
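For background, the following are standard facts rather than claims from the abstract: a hyperbolic triangle group with signature $(a,b,c)$, where $\frac{1}{a}+\frac{1}{b}+\frac{1}{c}<1$, has the presentation
\[
\Delta(a,b,c) \;=\; \langle\, \delta_a, \delta_b, \delta_c \mid \delta_a^{a} = \delta_b^{b} = \delta_c^{c} = \delta_a \delta_b \delta_c = 1 \,\rangle,
\]
and a triangular curve is the quotient $X(\Gamma) = \Gamma \backslash \mathcal{H}$ of the upper half-plane $\mathcal{H}$ by a finite-index subgroup $\Gamma \leq \Delta(a,b,c)$, with congruence subgroups playing the role that $\Gamma_0(N) \leq \mathrm{SL}_2(\mathbb{Z})$ plays for classical modular curves.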