Courses from previous years – 2019, 2018, 2017, 2016, 2015, 2014

The program has three components this summer: Computational Bootcamp, Professional Development Sessions, and Internships.

1. **Computational Mathematics Bootcamp**

**June 1-12** (online), focusing on Data Science with R and Python. Instructor: Uma Ravat.

2. **Professional Development Sessions.** Three or four sessions were planned for June and July, with participation expected of all students receiving Internship funding; these events have been canceled in view of COVID-19 disruptions.

3. **Internships.** Various dates. Hosts to be arranged.


The program has four components. Students early in the graduate program usually do the Linear Algebra Working Group, Computational Bootcamp, and Prepare/Train Group. Students later in the graduate program usually do the Computational Bootcamp and an Internship.

1. **Linear Algebra Working Group**

**May 15-17** (239 Altgeld Hall), led by Elliot Kaplan.

2. **Computational Mathematics Bootcamp**

Part I: **May 20-31** (239 Altgeld Hall), led by Uma Ravat focusing on Data Science with R and Python.

3. **Prepare and Train Group – Sociophysics: Bias and homophily in professional hierarchies, and Models of social group competition**

Dates: **June 3-July 15** (with holiday on July 4)

Location: 159 Altgeld Hall (booked 9am-1pm each day)

Instructor: Prof. Sara Clifton

The program is a “Research Experience for Graduate Students” style endeavor: after a series of introductory lectures, the students form small groups (2-5 people) to work on open-ended, interconnected problems. Students will work during the first three weeks on models for “Bias and homophily in professional hierarchies” (building on this paper). Then the groups will change around for the second three weeks, tackling problems on “Models of social group competition” (where references include this and this).

*Overview.* The goal is to guide students through the transition from working on “canned” problems to tackling open-ended problems and formulating the problems themselves. We expect the group work to involve a mixture of computational experiments (to generate conjectures) and theory (to prove them).
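As a taste of the kind of computational experiment involved, here is a minimal sketch of a deterministic homophily model of promotion through a hierarchy. This is purely illustrative: the recurrence, parameter names, and bias mechanism are assumptions for the example, not the group's actual model.

```python
def representation_by_level(f0, b, levels):
    """Toy homophily model (illustrative only): f is the fraction of
    group A at a level; group-A candidates are favored by a bias
    factor b > 0 when promotions to the next level are decided.
    With b > 1 the fraction of group A grows up the hierarchy."""
    fractions = [f0]
    for _ in range(levels - 1):
        f = fractions[-1]
        # Promotion odds are weighted by the bias factor b.
        fractions.append(b * f / (b * f + (1 - f)))
    return fractions

# Example: start at 50/50 with a mild bias toward group A.
fracs = representation_by_level(0.5, 1.2, levels=5)
```

Even this toy recurrence already suggests conjectures of the intended flavor, e.g. that any fixed bias b > 1 drives representation toward 1 as the number of levels grows.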

**Professional Development Sessions.** New to the program this year are four professional development sessions, held on Wednesday afternoons during June. Participation is expected of all students in the Prepare/Train program.

- *Exploring Broad Careers in Mathematics* (Derek Attig, Director of Career Development, Graduate College), Wednesday June 5, 3:30-5:00pm in 308 Coble Hall
- *Managing Large and Complex Projects* (Mike Firmand, Assistant Director for Employer Outreach, Graduate College), Wednesday June 12, 3:30-5:00pm in 308 Coble Hall
- *Data presentation/visualization* (Megan Ozeran, Data Analytics & Visualization Librarian, University Library), Wednesday June 19, 3:30-5:00pm in 308 Coble Hall
- *Talking about Your Research* (Emily Wuchner, Thesis Coordinator, Graduate College), Wednesday June 26, 3:30-5:00pm in 308 Coble Hall

Refreshments provided.

4. **Internships**

Various dates. Hosts to be arranged. Interns funded through PI4 are strongly encouraged to participate in the **professional development events** listed above, provided their employer grants permission to make up the hours at another time during the week.

**What is the format of the workshop?**

Students work in groups through the worksheets (see files below). Informal presentations and discussions on the most important problems occur throughout the day. The style is informal, with students working in a collaborative environment.

**How are the worksheets structured?**

Each day’s worksheet contains a list of definitions, theorems, and exercises (about 40 each day). Some of the problems were modified from those in the references, while others are problems written for this working group.

**Day 1 Worksheet** (Day1_Sheet)

Projections and the Gram-Schmidt Process

QR Factorization

Least-squares

Linear Models: Regression

**Day 2 Worksheet** (Day2_Sheet)

Diagonalization

Symmetric matrices

Spectral Theorem

Quadratic Forms

Singular Value Decomposition

**Day 3 Worksheet** (Day3_Sheet)

Principal Component Analysis and Dimensional Reduction

Brief look at Markov Chains

LU Factorizations

Duals and annihilators

Some multilinear algebra
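Many of the Day 1 topics can be checked numerically while working through the exercises. Below is a minimal sketch, in plain Python with no libraries assumed, of classical Gram-Schmidt producing a QR factorization and the least-squares solution it yields; the function names and the example data are inventions for illustration.

```python
def gram_schmidt(cols):
    """Classical Gram-Schmidt on a list of column vectors.
    Returns orthonormal columns q and upper-triangular r with A = QR."""
    q, r = [], [[0.0] * len(cols) for _ in cols]
    for j, a in enumerate(cols):
        v = list(a)
        for i in range(j):
            r[i][j] = sum(x * y for x, y in zip(q[i], a))   # projection coefficient
            v = [x - r[i][j] * y for x, y in zip(v, q[i])]  # subtract the projection
        r[j][j] = sum(x * x for x in v) ** 0.5
        q.append([x / r[j][j] for x in v])
    return q, r

def least_squares(cols, b):
    """Least squares for A x = b: solve R x = Q^T b by back substitution."""
    q, r = gram_schmidt(cols)
    n = len(cols)
    qtb = [sum(x * y for x, y in zip(q[j], b)) for j in range(n)]
    x = [0.0] * n
    for j in reversed(range(n)):
        x[j] = (qtb[j] - sum(r[j][k] * x[k] for k in range(j + 1, n))) / r[j][j]
    return x

# Fit the line y = c0 + c1*t through (0,1), (1,3), (2,5):
# the columns of A are the constant column and the t-values.
coeffs = least_squares([[1.0, 1.0, 1.0], [0.0, 1.0, 2.0]], [1.0, 3.0, 5.0])
```

Since the three points lie exactly on y = 1 + 2t, the recovered coefficients are (1, 2), which makes the routine easy to sanity-check against the worksheet exercises.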

**Acknowledgment**

These Linear Algebra Workshop materials may be freely used by others. We ask that when materials are re-used, the following statement be included:

These materials were created by Stefan Klajbor Goderich at the University of Illinois and edited by Elliot Kaplan, with support from National Science Foundation grant DMS 1345032 “MCTP: PI4: Program for Interdisciplinary and Industrial Internships at Illinois.”

Computational Bootcamp 2018 – Part 2 (3 days on Mathematica fundamentals)

The program has four components. Students early in the graduate program usually do the Linear Algebra Working Group, Computational Bootcamp, and Prepare/Train Group. Students later in the graduate program usually do the Computational Bootcamp and an Internship.

1. **Linear Algebra Working Group**

**May 16-18** (239 Altgeld Hall), led by Stefan Klajbor Goderich

2. **Computational Mathematics Bootcamp**

Part I: **May 21-26** (239 Altgeld Hall) with focus on Data Science, led by David LeBauer.

Part II: **May 30-June 1** (239 Altgeld Hall) with focus on Mathematica Fundamentals, led by A. J. Hildebrand

3. **Prepare and Train Group – Algorithms for Analytic Combinatorics**

Dates: **June 4-July 13** (with holiday on July 4)

Location: 159 Altgeld Hall (room booked in mornings)

Instructor: Stephen Melczer (U. of Pennsylvania)

The program is a “Research Experience for Graduate Students” style endeavor: after a series of introductory lectures, the students form small groups (2-5 people) to work on open-ended interconnected problems.

*Overview.* The goal is to guide students through the transition from working on “canned” problems to tackling open-ended problems and formulating the problems themselves. We expect the group work to involve a mixture of computational experiments (to generate conjectures) and theory (to prove them).

One of the draws of combinatorics is its ability to draw on, motivate, and even push forward diverse areas of mathematics and computer science. The focus of this program will be to study the methods of analytic combinatorics – a field drawing inspiration from complex analysis, differential geometry, and algebraic geometry – from the perspective of computer algebra.

Students will begin by learning the underlying theory and implementing algorithms which have been previously described at a theoretical level. Later, students will engage with open problems of both a theoretical and computational nature, and examine new applications of this fast-growing theory. A wide range of problems of varying difficulty will be available for students with different backgrounds.

Topics include the new theory of analytic combinatorics in several variables, effective enumeration results for power series coefficients of algebraic functions, and decompositions of multivariate rational functions. Potential applications touch on areas of queuing theory, representation theory, theoretical computer science, transcendence theory, probability theory, and (of course) combinatorics.
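As a small taste of the computational side, power series coefficients of an algebraic function can be extracted directly from its defining equation. The sketch below is not one of the program's algorithms, just an illustration: it iterates the equation C = 1 + x·C² satisfied by the Catalan generating function, as a fixed point on truncated power series.

```python
def catalan_coefficients(n):
    """First n coefficients of the algebraic series C(x) = 1 + x*C(x)^2,
    found by iterating the defining equation on truncated power series
    (plain-list arithmetic, no libraries). Each pass fixes at least one
    more coefficient, so n passes suffice for n terms."""
    c = [0] * n
    c[0] = 1
    for _ in range(n):
        # Coefficients of C(x)^2 by convolution, truncated to n terms.
        sq = [sum(c[i] * c[k - i] for i in range(k + 1)) for k in range(n)]
        c = [1] + sq[: n - 1]  # 1 + x * C(x)^2, truncated
    return c

cats = catalan_coefficients(8)  # 1, 1, 2, 5, 14, 42, 132, 429
```

The convergence argument is the point: the k-th coefficient of 1 + x·C² depends only on coefficients of C below k, so each iteration provably extends the agreement by one term, a baby version of the effective enumeration results mentioned above.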

A more detailed statement of problems can be downloaded here.

4. **Internships**

Various dates. Hosts to be arranged.


**What is the format of the workshop?**

Students work in groups through the worksheets (see files below). Informal presentations and discussions on the most important problems occur throughout the day. The style is informal, with students working in a collaborative environment.

**How are the worksheets structured?**

Each day’s worksheet contains a list of definitions, theorems, and exercises (about 40 each day). Some of the problems were modified from those in the references, while others are problems written for this working group.

**Day 1 Worksheet** (tex file, tex label index, pdf file)

Projections and the Gram-Schmidt Process

QR Factorization

Least-squares

Linear Models: Regression

**Day 2 Worksheet** (tex file, tex label index, pdf file)

Diagonalization

Symmetric matrices

Spectral Theorem

Quadratic Forms

Singular Value Decomposition

**Day 3 Worksheet** (tex file, tex label index, pdf file)

Principal Component Analysis and Dimensional Reduction

Brief look at Markov Chains

LU Factorizations

Duals and annihilators

Some multilinear algebra
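The Day 3 connection between principal component analysis and eigenvectors can be tried out numerically. Here is a minimal sketch in plain Python, illustrative only (the function name and sample data are inventions): the leading principal component of centered 2-D data is the dominant eigenvector of its covariance matrix, found below by power iteration.

```python
def leading_principal_component(points, iters=200):
    """Power iteration on the 2x2 covariance matrix of centered 2-D
    data; the limiting direction is the first principal component."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    xs = [p[0] - mx for p in points]
    ys = [p[1] - my for p in points]
    # Covariance matrix [[a, b], [b, c]].
    a = sum(x * x for x in xs) / n
    b = sum(x * y for x, y in zip(xs, ys)) / n
    c = sum(y * y for y in ys) / n
    v = (1.0, 0.0)
    for _ in range(iters):
        w = (a * v[0] + b * v[1], b * v[0] + c * v[1])
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = (w[0] / norm, w[1] / norm)  # renormalize each step
    return v

# Data spread along the line y = x: the component is near (1, 1)/sqrt(2).
pc = leading_principal_component([(0, 0), (1, 1), (2, 2), (3, 3.1)])
```

The same dominant eigenvector also comes out of the singular value decomposition of the centered data matrix, tying together the Day 2 and Day 3 topics.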

The “tex label index” files are useful when editing the tex files, as they list the labels used in the tex code.

**Acknowledgment**

These Linear Algebra Workshop materials may be freely used by others. We ask that when materials are re-used, the following statement be included:

These materials were created by Stefan Klajbor Goderich at the University of Illinois, with support from National Science Foundation grant DMS 1345032 “MCTP: PI4: Program for Interdisciplinary and Industrial Internships at Illinois.”

**Instructors:**

David LeBauer, *Carl R Woese Institute for Genomic Biology*, University of Illinois

Neal Davis, *Department of Computer Science*, University of Illinois

**Teaching Assistant:**

Stefan Klajbor, *Department of Mathematics*, University of Illinois

**Room:**

239 Altgeld Hall

A two-week course designed to introduce math graduate students with little or no programming experience to methods in data analysis and computation. The goal is to prepare students to apply their understanding of math to solve problems in industry.

Courses from previous years – 2016, 2015, 2014 – focused on numerical analysis. This year the focus is shifting to the use and analysis of large and complex data.

Although the course is aimed at students with limited experience using software, you are expected to complete two introductory courses in order to become familiar with the basic syntax and operations in R and Python. Two free courses from DataCamp are **required***; completion certificates must be mailed to the instructors by midnight May 25. Each of these courses should take just a few hours to complete.

*Students who have significant experience with R and/or Python may elect to substitute a more advanced course.

- May 26: Computing Basics
- May 30-June 2: Data and Statistics in R
- June 5-June 8: Data and Machine Learning with Python
- June 9: Conclusion and Project Presentations

**Linear Algebra Working Group**

May 22-24, in 239 Altgeld Hall

**Computational Mathematics Bootcamp:**

May 26-June 9, in 239 Altgeld Hall; note Memorial Day holiday on Monday May 29

**Prepare and Train Group – Machine Learning: Algorithms and Representations**

Dates: Monday June 12 through Friday July 21; note Independence Day holiday on Tuesday July 4

Location: TBD

Instructors: Maxim Raginsky (Department of Electrical and Computer Engineering) and Matus Jan Telgarsky (Department of Computer Science)

The program is a “Research Experience for Graduate Students” style endeavor: after a series of introductory lectures, the students form small groups (2-5 people) to work on open-ended interconnected problems.

The goal is to guide students through the transition from working on “canned” problems to tackling open-ended problems and formulating the problems themselves. We expect the group work to involve a mixture of computational experiments (to generate conjectures) and theory (to prove them).

The topics will focus on probabilistic and approximation-theoretic aspects of machine learning, with emphasis on neural networks. We will introduce the probabilistic formulation of machine learning and relate the performance of commonly used learning algorithms (such as stochastic gradient descent) to the concentration of measure phenomenon. Problems of varying levels of difficulty will revolve around several open questions pertaining to stability and convergence of stochastic gradient descent. We will also cover several results characterizing neural network function classes, for instance results saying that neural networks can fit continuous functions, that neural networks gain in power with extra layers, and that neural networks can model polynomials. Open questions will cover more nuanced aspects of adding layers, as well as other neural net architectures, for instance convolutional and recurrent neural networks.
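The stochastic gradient descent at the center of these questions fits in a few lines of code. The sketch below is illustrative, not the instructors' formulation: SGD on a one-dimensional least-squares model, where each step follows the gradient of the loss at a single randomly sampled data point.

```python
import random

def sgd_linear(data, steps=2000, lr=0.05, seed=0):
    """SGD for the model y ~ w*x on (x, y) pairs: each step takes the
    gradient of the squared error at one randomly sampled point."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        x, y = rng.choice(data)
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

# Data generated from y = 3x with no noise: SGD should approach w = 3.
w_hat = sgd_linear([(x / 10, 3 * x / 10) for x in range(1, 11)])
```

Even in this toy setting one can experiment with the stability questions above, e.g. how the fluctuation of the iterates depends on the learning rate and on which points are sampled.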

**Internships**

Various dates. Hosts to be arranged.
