Triangle Lectures in Combinatorics
November 10, 2018 @ 10:00 am - 6:30 pm
The Triangle Lectures in Combinatorics is a series of combinatorial workshops held each semester on a Saturday in the Research Triangle region of North Carolina, funded by the National Science Foundation. The workshop this coming fall, the 16th installment of the Triangle Lectures in Combinatorics, will be hosted by UNC Chapel Hill in Chapel Hill, North Carolina on Saturday, November 10, 2018. It will include four invited talks as well as coffee breaks and ample time for discussions throughout the day.
10-11am, Lek-Heng Lim, Combinatorial geometry of deep neural networks
11-11:30am, coffee break
11:30am-12:30pm, Rebecca Goldin, On some (equivariant) combinatorics related to flag manifolds
12:30-2:15pm, lunch break
2:15-3:15pm, Daniel Bienstock, Approximate nonconvex optimization and tree-width
3:15-3:45pm, coffee break
3:45-4:30pm, László Babai, Global symmetry from local information: The graph isomorphism problem
4:30-4:45pm, coffee break
4:45-5:30pm, László Babai, Hidden irregularity vs. hidden symmetry: The emergence of the Johnson graph
6:30pm, informal conference dinner
We ask that participants pre-register if possible, as pre-registration is very helpful for planning our coffee breaks and for obtaining funding to support these meetings. Some funding is available for participants, especially early-career participants; most of it is restricted to US citizens and permanent residents. The first round of funding awards will be made after September 15, 2018, based on applications received by then.
Practical information: See the conference website (https://wp.math.ncsu.edu/tlc/) where some practical information (such as a list of local hotels) is already posted and more will be posted as the conference date gets closer.
Talk titles and abstracts:
Lek-Heng Lim (University of Chicago)
Title: Combinatorial geometry of deep neural networks
Abstract: A useful way to visualize a classification problem in machine learning is as the problem of finding a decision boundary that separates two collections of points (e.g., spam vs. non-spam) in some high-dimensional space. A simple data set, say one comprising points sampled from two convex sets, may be easily separated by a hyperplane, but modern complex data sets tend to require more complex, usually piecewise linear, decision boundaries. The number of linear regions of a classifier then gives a measure of its expressive power. Adopting a tropical geometric perspective, we show that the linear regions of a feedforward ReLU neural network correspond to vertices of polytopes associated with tropical rational functions. The main consequence is that a deeper network is exponentially more expressive than a shallow network; a side observation is that zonotopes form the basic building blocks for deep neural networks.
This is joint work with Greg Naitzat (Chicago) and Liwen Zhang (Facebook).
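The count of linear regions mentioned in the abstract can be illustrated with a toy computation (our own sketch, not code from the talk): on a 1-D input, a feedforward ReLU network is piecewise linear, and each linear piece corresponds to a distinct pattern of active units, so counting distinct activation patterns along a fine grid counts regions (up to grid resolution). All names and the random weights below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def activation_pattern(x, weights, biases):
    """On/off pattern of every ReLU unit of the network at input x."""
    pattern, h = [], np.atleast_1d(np.asarray(x, dtype=float))
    for W, b in zip(weights, biases):
        z = W @ h + b
        pattern.extend(z > 0)   # which units fire in this layer
        h = relu(z)
    return tuple(pattern)

def count_linear_regions(weights, biases, lo=-5.0, hi=5.0, n=2000):
    """Count distinct activation patterns along a 1-D slice of inputs."""
    return len({activation_pattern(x, weights, biases)
                for x in np.linspace(lo, hi, n)})

# One hidden layer of 8 ReLU units on a 1-D input: at most 9 regions,
# since each unit's sign can change only once along the line.
weights = [rng.standard_normal((8, 1))]
biases = [rng.standard_normal(8)]
print(count_linear_regions(weights, biases))
```

The grid-sampling count is a lower bound on the true number of regions; it misses regions narrower than the grid spacing.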
Rebecca Goldin (George Mason University)
Title: On some (equivariant) combinatorics related to flag manifolds
Abstract: Schubert calculus arose from 19th-century explorations of enumerative geometry, but has evolved into the study of specific rings associated with homogeneous spaces. The topic includes, for example, examining the product structure of the cohomology ring of the Grassmannian of k-dimensional subspaces of complex n-space, or of the equivariant cohomology ring of G/P (where P is a parabolic subgroup of a complex group G), as well as the K-theory or equivariant K-theory of the complete flag variety. In a geometrically motivated basis for each ring, the product of any two basis elements, expanded in that basis, has coefficients that are positive in an appropriate sense.
Remarkably, equivariant cohomology and K-theory, which take into account the action of a torus T (an abelian group), have some advantages over their non-equivariant counterparts, even as they carry strictly more information. They can be described simply by a sequence of polynomials (in the case of cohomology) or of rational functions (in the case of K-theory). This description is amenable to combinatorial methods and offers hope for future positive formulas for the structure constants. I will give an overview of some results on positivity, as well as a geometric (but unfortunately non-positive) recursion for computing these coefficients using divided difference operators. The last part is joint work with Allen Knutson.
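The divided difference operators mentioned at the end of the abstract admit a short, concrete sketch (ours, not the speaker's): the operator d_i sends a polynomial f to (f - s_i f)/(x_i - x_{i+1}), where s_i swaps the variables x_i and x_{i+1}. Applied to x1^2*x2, the Schubert polynomial of the longest element of S_3, they produce the Schubert polynomials of shorter permutations.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
X = [x1, x2, x3]

def divided_difference(f, i):
    """d_i f = (f - f with x_i and x_{i+1} swapped) / (x_i - x_{i+1})."""
    swapped = f.subs({X[i]: X[i + 1], X[i + 1]: X[i]}, simultaneous=True)
    return sp.cancel((f - swapped) / (X[i] - X[i + 1]))

f = x1**2 * x2                     # Schubert polynomial of w0 = 321
print(divided_difference(f, 0))    # x1*x2   (Schubert polynomial of 231)
print(divided_difference(f, 1))    # x1**2   (Schubert polynomial of 312)
```

Note that the quotient is always again a polynomial, since f - s_i f vanishes on the hyperplane x_i = x_{i+1}.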
Daniel Bienstock (Columbia)
Title: Approximate nonconvex optimization and tree-width
Abstract: The intersection graph of an optimization problem has a vertex corresponding to each variable and an edge between two vertices whenever the corresponding variables appear in the same constraint. This is a classical concept dating back more than fifty years. In this talk we discuss how the structure of the intersection graph, in particular low tree-width, can be leveraged to produce rigorous algorithms for the approximate optimization of nonconvex problems. This body of work encompasses research by other authors as well as recent work by the author and collaborators. We will also discuss applications to generalization problems in machine learning.
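The intersection graph defined in the abstract is easy to build, and its tree-width can be bounded from above with the classical min-degree elimination heuristic. The sketch below is our own illustration (the constraint sets are made up) and is not code from the talk.

```python
import itertools

# Each constraint is represented by the set of variables it involves;
# these four constraints chain the variables into a 4-cycle.
constraints = [
    {'x1', 'x2'},   # e.g. a nonconvex term x1*x2
    {'x2', 'x3'},
    {'x3', 'x4'},
    {'x4', 'x1'},
]

# Intersection graph: one vertex per variable, an edge whenever two
# variables appear together in some constraint.
adj = {}
for c in constraints:
    for u, v in itertools.combinations(sorted(c), 2):
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

def treewidth_upper_bound(adj):
    """Min-degree elimination ordering: an upper bound on tree-width."""
    g = {v: set(nbrs) for v, nbrs in adj.items()}
    width = 0
    while g:
        v = min(g, key=lambda u: len(g[u]))    # lowest-degree vertex
        nbrs = g.pop(v)
        width = max(width, len(nbrs))          # bag size minus one
        for a, b in itertools.combinations(nbrs, 2):
            g[a].add(b)                        # make the neighborhood
            g[b].add(a)                        # of v into a clique
        for u in nbrs:
            g[u].discard(v)
    return width

print(treewidth_upper_bound(adj))   # the 4-cycle has tree-width 2
```

Elimination orderings like this one are exactly the tool that turns "low tree-width" into dynamic-programming algorithms over small bags of variables.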
László Babai (University of Chicago)
Title, Part 1: Global symmetry from local information: The graph isomorphism problem
Title, Part 2: Hidden irregularity vs. hidden symmetry: The emergence of the Johnson graph
Abstract: In the first talk we explain the central, group theoretic, component of the recent quasipolynomial-time algorithm to test isomorphism of graphs. The second talk outlines the combinatorial ingredients, based on coherent configurations (certain highly regular colorings of the edges of the complete directed graph).
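The Johnson graph named in the second talk's title has a simple combinatorial definition, sketched below as our own illustration: the vertices of J(n, k) are the k-element subsets of {1, ..., n}, with an edge whenever two subsets share exactly k-1 elements.

```python
from itertools import combinations

def johnson_graph(n, k):
    """Vertices and edges of the Johnson graph J(n, k)."""
    vertices = [frozenset(c) for c in combinations(range(1, n + 1), k)]
    edges = [(u, v) for u, v in combinations(vertices, 2)
             if len(u & v) == k - 1]   # subsets differing in one element
    return vertices, edges

V, E = johnson_graph(5, 2)
print(len(V), len(E))   # prints: 10 30
```

J(5, 2) is the triangular graph T(5), the complement of the Petersen graph; every vertex of J(n, k) has degree k(n-k).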
Fall 2018 TLC Organizing committee: Prakash Belkale (UNC Chapel Hill), Gabor Pataki (UNC Chapel Hill), Richard Rimanyi (UNC Chapel Hill), and Seth Sullivant (NCSU)