Spring 2020
Introduction to mathematics with knot theory
Prerequisites: Familiarity with algebra
Mentor: Kelvin Lam
Description: Knots are one of the simplest classes of mathematical objects one can think of, and given how often they show up in real life, it is not surprising that many applications arise from studying them: from your shoelaces to your DNA, and even physical models used in statistical mechanics.
We will be studying knots from a mathematical perspective. The book we will be using is "The Knot Book" by Colin Adams. The purpose of this project is to introduce the student(s) to mathematical reasoning: how mathematicians think when they study a subject or solve problems. The goal is for the student(s) to learn and practice mathematical thinking by studying knots, which are geometric and intuitive enough to balance the (much-needed) rigor and formal detail involved in studying mathematics. If time permits, we will also study some applications of knot theory to see how rigorous mathematics can lead to real-life applications.
Inequalities
Prerequisites: Math 124, 125
Mentor: Junaid Hasan
Description: In math we often talk of equalities between objects. This time let's change our focus to inequalities. We will start with a basic inequality, the arithmetic-geometric mean inequality, and slowly move toward the Cauchy-Schwarz inequality and its applications; both are stated below.
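As a taste of the two landmark results, here are the two-variable AM-GM inequality and the finite-dimensional Cauchy-Schwarz inequality over the reals:

```latex
% Arithmetic-geometric mean inequality (for a, b >= 0):
\sqrt{ab} \;\le\; \frac{a+b}{2}, \qquad \text{with equality iff } a = b.

% Cauchy-Schwarz inequality (for reals a_1, ..., a_n and b_1, ..., b_n):
\left( \sum_{i=1}^{n} a_i b_i \right)^{2}
  \;\le\;
\left( \sum_{i=1}^{n} a_i^{2} \right) \left( \sum_{i=1}^{n} b_i^{2} \right).
```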
References:
1. Inequalities, P. P. Korovkin.
2. The Cauchy-Schwarz Master Class, J. Michael Steele.
Maxima Minima without Calculus
Prerequisites: Math 120 or an interest in plane geometry
Mentor: Junaid Hasan
Description: This project is named after the wonderful book “Maxima and Minima without Calculus” by Ivan Niven and Lester H Lance.
We will focus on finding maxima and minima of plane geometric objects. So if you love drawing triangles and quadrilaterals, this project would be wonderful for you. We will compute maxima and minima without taking derivatives!
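As one example of the flavor (a standard exercise, done here with the AM-GM inequality rather than derivatives): among all rectangles with a fixed perimeter, the square has the largest area.

```latex
% Rectangle with side lengths x, y >= 0 and fixed perimeter 2(x + y) = P.
% By the AM-GM inequality,
\text{Area} = xy \;\le\; \left( \frac{x + y}{2} \right)^{2} = \frac{P^{2}}{16},
% with equality exactly when x = y, i.e., when the rectangle is a square.
```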
References:
1. Maxima and Minima without Calculus, Ivan Niven and Lester H. Lance.
Graph theory
Prerequisites: None
Mentor: Albert Artiles
Description: Here we will study the math of connecting the dots. No, I am not kidding. There is a lot of math one can do when studying how to connect dots on a piece of paper. Can you connect all the dots without your lines crossing? How many colors do you need to color all the dots so that no two connected dots are the same color? These turn out to be hard questions, which we will explore while drawing lots of beautiful pictures along the way; both can even be experimented with on a computer, as sketched below.
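The two questions above are known as planarity and graph coloring. As an illustration only (it assumes the third-party networkx library, which is not part of the project), here is how one might poke at them computationally:

```python
import networkx as nx  # assumed third-party dependency for this sketch

# "Can you connect all the dots without your lines crossing?" is planarity.
K4 = nx.complete_graph(4)  # 4 dots, every pair connected
K5 = nx.complete_graph(5)  # 5 dots, every pair connected
print(nx.check_planarity(K4)[0])  # True: K4 can be drawn without crossings
print(nx.check_planarity(K5)[0])  # False: K5 cannot (a classical fact)

# "How many colors do you need?" is graph coloring; a greedy strategy
# gives an upper bound on the number of colors required.
coloring = nx.coloring.greedy_color(K4, strategy="largest_first")
print(len(set(coloring.values())))  # 4 colors for K4
```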
Analysis!
Prerequisites: Math 124
Mentor: Camila Vásquez
Description: The things you know about the real numbers are cool and all, but what if you couldn't assume they were true until you justified them?
That's real analysis at its simplest, boiled down to its bones. And Kenneth Ross's Elementary Analysis: The Theory of Calculus is a gentle introduction to it!
Circles and Triangles and other shapes! Oh my!
Prerequisites: None
Mentor: Camila Vásquez
Description: Geometry doesn’t need to be what it was in high school. It’s not just two-column proofs that you quickly filled out. Geometry involves constructions, and has a deep history worth recognizing and exploring as well. The goal would be to start with Euclidean geometry (the one we are familiar with) and move on from there.
Beginner Level: May require some Calculus
Groups! and Graphs!
Prerequisites: Concurrent enrollment in Math 120 or higher is encouraged, but all are welcome.
Mentor: Paige Helms
Description: The mathematical goal of this project is to learn about what groups and graphs are as objects, and see the ways they can relate to each other. We’ll do this by reading through a little bit of a book called “Office Hours with a Geometric Group Theorist”. There are many things we can explore once we have some idea of what these things are, so the project is open-ended, and very much about enjoying math and having fun while we do it.
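One classic bridge between groups and graphs, central to geometric group theory, is the Cayley graph: its vertices are the group elements, and edges connect each element to its products with a chosen generating set. A minimal sketch, using the cyclic group Z/6 with generator 1 as an assumed toy example (and the third-party networkx library):

```python
import networkx as nx  # assumed third-party dependency for this sketch

# Cayley graph of the cyclic group Z/6 with generating set {1}:
# vertices are the elements 0..5, and each g is joined to g + 1 (mod 6).
n = 6
G = nx.Graph()
G.add_nodes_from(range(n))
for g in range(n):
    G.add_edge(g, (g + 1) % n)

# The group structure is now visible as a graph: six edges forming a cycle.
print(sorted(G.edges()))
```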
Controlling memory selection in the Hopfield memory network
Prerequisites: Linear algebra and differential equations; comfortable programming in MATLAB or Python
Mentor: Megan Morrison
Description: The Hopfield network is an auto-associative memory network that retrieves memories stored in the network when given an initial stimulus that vaguely resembles one of the stored memories. This network is a simple artificial neural network that can serve as a model for understanding human memory. The storage capacity of the network is limited both by its size and by the relationship between the memories selected for storage. In this project we study how to develop control procedures to hop between memories in the storage space as the complexity and instability of the memories in the system grow. A minimal sketch of the underlying network appears after Part 2 below.
Part 1) We create a directed graph for the neural network where nodes represent memories and edges represent achievable transitions between the memories. We explore the relationship between the creation of the memory network and the resulting directed graph it generates.
Part 2) We explore the inverse problem – given a directed graph, how would one select memories to store that would produce the desired graph? Different graphical structures may be desirable due to their computational efficiency, robustness, etc.
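As promised above, a minimal sketch of Hebbian storage and retrieval in a Hopfield network (the sizes and random patterns are assumptions chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5  # neurons and stored patterns (toy sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian storage: sum of outer products of the patterns, no self-connections.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def retrieve(state, steps=20):
    """Repeatedly update all neurons until the state stops changing."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties consistently
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored memory by flipping 10 neurons, then let the network settle.
noisy = patterns[0].copy()
flipped = rng.choice(N, size=10, replace=False)
noisy[flipped] *= -1
print(np.array_equal(retrieve(noisy), patterns[0]))  # expected: True at these sizes
```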
Learning the intrinsic dimension of dynamical systems with machine learning and data-driven discovery
Prerequisites: Linear algebra, differential equations, scientific computing; comfortable using Python and PyTorch for machine learning
Mentor: Megan Morrison
Description: Many high-dimensional dynamical systems have dynamics that live on a lower-dimensional manifold that is often not obvious. For example, the Hodgkin-Huxley model is a four-dimensional dynamical system whose intrinsic dynamics exist in a two-dimensional subspace, captured by the FitzHugh-Nagumo model. Finding low-dimensional models is important because it makes systems easier to analyze and visualize, and it reduces the computational expense of processes involving these systems. Deep neural network autoencoders can compress dynamics to a low-dimensional subspace. For example, previous studies have shown that two-dimensional linear embeddings can be found for some intrinsically linear systems, while nonlinear embeddings can be found by combining autoencoders with the SINDy algorithm. In this project we use deep neural network autoencoders to successively compress high-dimensional dynamical systems to lower-dimensional systems and measure the extent to which we can compress each system. We test this process on randomly generated dynamical systems and on canonical systems from the systems biology literature. We develop a process for finding the minimum dimension of the system while minimizing computational resources.
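A minimal sketch of the autoencoder idea, with assumed toy sizes (4 full dimensions, 2 latent) and random data standing in for simulated trajectories; it is an illustration of the technique, not the project's actual pipeline:

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Compress snapshots of a system to a candidate intrinsic dimension."""
    def __init__(self, n_full=4, d_latent=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_full, 32), nn.Tanh(), nn.Linear(32, d_latent))
        self.decoder = nn.Sequential(
            nn.Linear(d_latent, 32), nn.Tanh(), nn.Linear(32, n_full))

    def forward(self, x):
        return self.decoder(self.encoder(x))

x = torch.randn(1024, 4)  # stand-in for sampled system states
model = Autoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):  # short toy training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), x)
    loss.backward()
    opt.step()
print(float(loss))

# Sweeping d_latent and watching where the reconstruction error plateaus
# is one crude probe of the intrinsic dimension.
```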
Generating Functions
Prerequisites: Math 124/125/126
Mentor: Graham Gordon
Description: If you like the Fibonacci numbers, wait until you see their generating function! By recording a discrete sequence with a continuous function, you can use calculus to study the sequence, and vice versa. We can check out the book "generatingfunctionology" by Herbert S. Wilf for a fun introduction to the topic.
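To preview the punchline for the Fibonacci numbers (using the common convention F_0 = 0, F_1 = 1):

```latex
% Generating function of the Fibonacci numbers
% F_0 = 0, \; F_1 = 1, \; F_{n+1} = F_n + F_{n-1}:
\sum_{n \ge 0} F_n x^{n} \;=\; \frac{x}{1 - x - x^{2}}.
% Multiplying both sides by 1 - x - x^2 and matching coefficients of x^n
% recovers exactly the Fibonacci recurrence.
```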
Intermediate Level: Requires Math 300 (proofs) and possibly other 300-level courses
Ill-posed problems
Prerequisites: Math 308, 307, and 309
Mentor: Kirill Golubnichiy
Description: My research interests are inverse problems, ill-posed problems, differential equations, and numerical analysis. I have more than 20 publications in the domain of inverse problems for nonlinear transport equations. (A problem is ill-posed, in the sense of Hadamard, when a solution may fail to exist, fail to be unique, or fail to depend continuously on the data.)
Geometric Group Theory
Prerequisites: Math 300 is desired but not necessary
Mentor: Albert Artiles
Description: When looking at objects like a sphere, a circle, or an equilateral triangle, one starts to wonder about a way to quantify how symmetric they are. The sphere seems more symmetric than the circle, which seems more symmetric than the equilateral triangle. But what is symmetry? How can we quantify it? An answer to this, my friends, is the concept of a group. In this reading course, however, we will turn the question around: given some symmetries, what are they the symmetries of? This course does not require any prior knowledge; we will develop everything we need as we go.
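One way to make the comparison precise, stated here only as a teaser (the isometry-group notation is standard, and everything it uses will be built from scratch in the course):

```latex
% Counting symmetries (rotations and reflections) of each shape:
\lvert \operatorname{Sym}(\text{equilateral triangle}) \rvert = 6,
\qquad
\operatorname{Sym}(\text{circle}) \cong O(2),
\qquad
\operatorname{Sym}(\text{sphere}) \cong O(3).
% The triangle has finitely many symmetries, while the circle and the
% sphere each have infinitely many, with the sphere's group larger still.
```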
Markov Chains
Prerequisites: Math 308 and 394
Mentor: Anthony Sanchez
Description: A Markov chain is a random process that arises in many theoretical contexts and also in real-world situations in biology, economics, and the social sciences. The aim of this project will be to learn about Markov chains, their long-term behavior, and first-step analysis. We will read from the text Essentials of Stochastic Processes by Durrett. This project will require some background in probability (such as Math 394) and linear algebra (Math 308).
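A small taste of "long-term behavior" (the two-state weather chain below is an assumed toy example, not taken from Durrett's text):

```python
import numpy as np

# Transition matrix of a two-state weather chain (sunny = 0, rainy = 1).
# Rows are the current state, columns the next state; each row sums to 1.
P = np.array([[0.9, 0.1],   # sunny -> sunny, sunny -> rainy
              [0.5, 0.5]])  # rainy -> sunny, rainy -> rainy

# For this irreducible, aperiodic chain, the rows of P^n converge to the
# stationary distribution, which describes the long-run fraction of time
# spent in each state.
Pn = np.linalg.matrix_power(P, 50)
print(Pn[0])  # approximately (5/6, 1/6)
```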
On von Neumann’s inequality for matrices
Prerequisites: A course in linear algebra would be super useful but not strictly required.
Mentor: Raghavendra Tripathi
Description: The von Neumann inequality is one of the most important and celebrated inequalities in operator theory (a cousin of linear algebra). Though the general von Neumann inequality is highly non-trivial, in this project we will study it for matrices. We will start with the necessary linear-algebraic concepts (vector spaces, matrices, the norm of a matrix, etc.), and towards the end we will prove the von Neumann inequality for matrices, stated below.
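For reference, the statement we will work toward:

```latex
% Von Neumann's inequality: if T is a contraction (an n x n matrix with
% operator norm \lVert T \rVert \le 1) and p is any polynomial, then
\lVert p(T) \rVert \;\le\; \max_{\lvert z \rvert = 1} \lvert p(z) \rvert .
```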
Advanced Level: Requires upper-level (400-level) mathematics courses
Varieties
Prerequisites: Math 402 (Ring theory)
Mentor: Stark Ledbetter
Description: A variety is a set of solutions to a system of polynomial equations. Some varieties you know include graphs of curves in 2 dimensions and surfaces in 3 dimensions. The study of varieties ties together lots of areas of math, including algebra, topology, and geometry. We will read the excellent and award-winning book "Ideals, Varieties, and Algorithms" by Cox, Little, and O'Shea. This book handles the topic beautifully, with incredibly well-crafted examples.
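Concretely, the central object of study, over a field K and inside n-dimensional affine space:

```latex
% The affine variety cut out by polynomials f_1, ..., f_s in K[x_1, ..., x_n]:
V(f_1, \dots, f_s) \;=\; \{\, a \in K^{n} : f_1(a) = \cdots = f_s(a) = 0 \,\}.
% Examples: V(y - x^2) is a parabola in the plane, and
% V(x^2 + y^2 + z^2 - 1) is the unit sphere in 3-space.
```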
The Arcsine Law
Prerequisites: Math 395 and 425
Mentor: Anthony Sanchez
Description: Suppose we play a game consisting of successive tosses of a fair coin. Every time it lands on heads you gain one point, and every time it lands on tails I gain one point. Intuitively, in a long enough game with a fair coin, we expect you to be ahead about half the time and me to be ahead the other half.
As it turns out, this intuition is incorrect! With probability greater than 1/2, one of us will be ahead more than 85% of the time. The exact distribution of the fraction of time each player is ahead is given by the arcsine law, stated below. The main goal of this project is to understand and prove this theorem using some analysis and probability theory. We will use the wonderful text, "Heads or Tails" by Lesigne.
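For reference, the limit theorem in question, with L_n denoting the number of the first n rounds during which you are in the lead (the notation here is ours, not Lesigne's):

```latex
% Arcsine law: for every 0 \le x \le 1,
\lim_{n \to \infty} \mathbb{P}\!\left( \frac{L_n}{n} \le x \right)
  \;=\; \frac{2}{\pi} \arcsin \sqrt{x}.
% The limiting density 1 / (\pi \sqrt{x(1 - x)}) blows up near x = 0 and
% x = 1, so very lopsided games are the most likely outcomes.
```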