COSA Workshop

Combinatorial Optimization, Statistics, and Applications

March 14-15, 2011, TU Munich, Germany

Zentralinstitut für Medizintechnik (ZIMT or IMETUM)
Boltzmannstrasse 11, 85748 Garching (map: Campus Garching)
Room EG.026 (round part of the building, close to the U-Bahn station)

Organizers: Raymond Hemmecke (TU Munich, Germany), Milan Studený (UTIA Prague, Czech Republic)


This workshop is in the spirit of the former Model Selection Days, which were held at a variety of places with the intention of bringing together researchers from different fields such as algebra, (algorithmic) discrete mathematics and optimization, computer science, and mathematical statistics, with applications for example in the life sciences or in machine learning.

There will be 3 talks in the afternoon of March 14 (Monday) and 3 talks starting early on March 15 (Tuesday). In addition to this "official part", we encourage discussions on Monday morning and Tuesday afternoon among those participants planning to stay for the full two days.

No registration is required for the workshop. However, a short email to hemmecke@ma.tum.de is appreciated if you intend to come.



Program


March 14, 09:30 a.m.   Time for discussions
March 14, 11:30 a.m.   Lunch break
March 14, 01:00 p.m.   Opening words
March 14, 01:15 p.m.   Ruriko Yoshida: Optimality of the neighbor joining algorithm and faces of the balanced minimum evolution polytope (ppt file)
March 14, 02:30 p.m.   Sonja Petrović: Existence of MLE for a directed random graph model
March 14, 04:00 p.m.   Stefanie Jegelka: Cooperative Cuts




March 15, 09:30 a.m.   Fabian Theis: Ensemble methods for graph community detection and applications
March 15, 10:45 a.m.   Milan Studený: Learning Bayesian networks viewed as an optimization problem (pdf file)
March 15, 12:15 p.m.   Silvia Lindner: Learning restricted Bayesian network structures
March 15, 01:15 p.m.   Closing words
March 15, 01:30 p.m.   Lunch break
March 15, 03:00 p.m.   Time for discussions

Speakers and abstracts


Stefanie Jegelka
MPI for Biological Cybernetics, Tübingen, Germany
Cooperative Cuts

Combinatorial problems with submodular cost functions have recently drawn interest. In a standard combinatorial problem, the sum-of-weights cost is replaced by a submodular set function. The result is a powerful model that is, however, very hard to solve. In this talk, I will introduce cooperative cuts, minimum cuts with submodular edge weights. I will outline methods to approximately solve this problem, and show an application in computer vision. If time permits, the talk will also sketch regret-minimizing online algorithms for submodular-cost combinatorial problems. This is joint work with Jeff Bilmes (University of Washington).
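
As background on the cost model, here is a minimal Python sketch of a cooperative cut on a toy graph. Everything in it (the graph, the edge "groups", and all weights) is made up for illustration; the submodular cost is a simple coverage function that charges each group at most once, whereas the standard cut cost just sums edge weights.

    from itertools import combinations

    # Toy directed graph: each edge has a weight and a "group" label.
    edges = {                 # (u, v): (weight, group)
        ("s", "a"): (1.5, "g1"),
        ("s", "b"): (1.5, "g1"),
        ("a", "t"): (1.0, "g2"),
        ("b", "t"): (1.0, "g3"),
    }
    nodes = {"s", "a", "b", "t"}
    group_weight = {"g1": 1.6, "g2": 1.0, "g3": 1.0}

    def cut_edges(S):
        """Edges leaving the node set S (which contains s but not t)."""
        return [e for e in edges if e[0] in S and e[1] not in S]

    def modular_cost(C):
        """Standard cut cost: sum of edge weights."""
        return sum(edges[e][0] for e in C)

    def cooperative_cost(C):
        """Coverage-type submodular cost: each group is charged at most once."""
        return sum(group_weight[g] for g in {edges[e][1] for e in C})

    best = {}
    inner = nodes - {"s", "t"}
    for k in range(len(inner) + 1):
        for extra in combinations(sorted(inner), k):
            C = cut_edges({"s"} | set(extra))
            for name, cost in (("modular", modular_cost), ("cooperative", cooperative_cost)):
                value = cost(C)
                if name not in best or value < best[name][0]:
                    best[name] = (value, sorted(C))

    print(best)

In this toy instance the two criteria disagree: the standard cut cuts the two edges into t (cost 2.0), while the cooperative cut prefers the two edges out of s, which share a group and are therefore jointly cheaper (cost 1.6).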


Silvia Lindner
Combinatorial Optimization
TU Munich, Germany
Learning restricted Bayesian network structures

Learning Bayesian network structures (BNS) is an NP-hard nonlinear combinatorial optimization problem. Using algebraic representatives like standard imsets or the recently introduced characteristic imsets, this problem can be transformed into a linear program in exponentially many variables. The underlying polyhedron (= convex hull of all standard imsets or of all characteristic imsets) is not yet well understood and subject to ongoing research. In this talk we introduce the problem of learning BNS and characteristic imsets (integer vectors of exponential size) as unique representatives of BNS. These integer vectors allow us to simplify proofs of known results and to easily derive known and new complexity results for learning restricted BNS, for example, decomposable BNS. Finally, we present valid inequalities for the polytope of characteristic imsets and use them to derive some first computational results on learning (restricted) BNS via characteristic imsets using the state-of-the-art optimization software CPLEX. This is joint work with Raymond Hemmecke (TUM) and Milan Studený (UTIA Prague).
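
For concreteness, the following minimal Python sketch computes a characteristic imset for a made-up three-node DAG, assuming the definition used in this line of work: for every set S of at least two nodes, c_G(S) = 1 exactly when some node i in S has all of S minus {i} among its parents, and c_G(S) = 0 otherwise.

    from itertools import combinations

    def characteristic_imset(nodes, parents):
        """c_G(S) for all S with |S| >= 2, assuming c_G(S) = 1 iff some i in S
        has S minus {i} contained in the parent set pa_G(i)."""
        c = {}
        for k in range(2, len(nodes) + 1):
            for S in combinations(sorted(nodes), k):
                c[S] = int(any(set(S) - {i} <= set(parents[i]) for i in S))
        return c

    # Hypothetical DAG a -> c <- b (a "collider" at c).
    nodes = ["a", "b", "c"]
    parents = {"a": [], "b": [], "c": ["a", "b"]}
    print(characteristic_imset(nodes, parents))
    # {('a', 'b'): 0, ('a', 'c'): 1, ('b', 'c'): 1, ('a', 'b', 'c'): 1}

Markov equivalent DAGs yield the same vector, which is the sense in which the characteristic imset serves as a unique representative of a BNS.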


Sonja Petrović
Department of Mathematics, Statistics and Computer Science
University of Illinois, Chicago, USA
Existence of MLE for a directed random graph model

The p1 model describes dyadic interactions in a social network. We study the algebraic properties of the p1 model and show that it is a toric model specified by a multi-homogeneous ideal. In this talk, I will describe our study of the Markov bases for p1 that explicitly incorporate the constraints arising from multi-homogeneity. I will also describe the properties of the corresponding toric variety and relate them to the conditions for the existence of the maximum likelihood and extended maximum likelihood estimators of the model parameters. Our results are directly relevant to the estimation and conditional goodness-of-fit testing problems in p1 models. This is joint work with Alessandro Rinaldo and Stephen E. Fienberg.
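
For orientation, the p1 model of Holland and Leinhardt assigns each dyad (i,j) one of four states: no edge, i -> j, j -> i, or a reciprocated pair of edges. In its usual log-linear form (recalled here as general background, using the standard notation rather than that of the talk) the dyad probabilities are

    \log p_{ij}(0,0) = \lambda_{ij}
    \log p_{ij}(1,0) = \lambda_{ij} + \alpha_i + \beta_j + \theta
    \log p_{ij}(0,1) = \lambda_{ij} + \alpha_j + \beta_i + \theta
    \log p_{ij}(1,1) = \lambda_{ij} + \alpha_i + \beta_j + \alpha_j + \beta_i + 2\theta + \rho

where \alpha_i and \beta_i are sender and receiver effects, \theta is an overall density parameter, \rho governs reciprocation, and \lambda_{ij} normalizes each dyad separately; roughly speaking, this dyad-by-dyad normalization is the source of the multi-homogeneity mentioned above.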


Milan Studený
Institute of Information Theory and Automation
UTIA Prague, Czech Republic
Learning Bayesian networks viewed as an optimization problem

Bayesian networks are special graphical statistical models of conditional independence structure described by acyclic directed graphs. At the beginning of the talk the algebraic approach to learning Bayesian network structure will be explained. The basic idea is to represent the structure, described traditionally by a graph, by a special vector of high dimension, called the standard imset. The usual criteria for learning Bayesian network structure then become affine functions of the standard imset. This allows one to transform the learning task into the maximization of a linear function over a special polytope, namely the convex hull of the set of standard imsets (over a fixed set of variables). To apply the efficient methods of combinatorial optimization in this area, for example linear programming procedures like the simplex method, one needs to describe that polytope in the form of a polyhedron (= using linear inequalities) or, alternatively, to characterize the edges (= faces of dimension 1) of the polytope. In short, there is a number of highly important open mathematical problems of a geometric nature concerning this polytope. In the second part of the talk I will report on attempts to solve some of these open problems. The talk will be based on joint research with Raymond Hemmecke, Silvia Lindner, Jirka Vomlel and David Haws.
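
As a pointer to the notation, the standard imset of an acyclic directed graph G over a variable set N is, in Studený's formulation, the integer vector indexed by subsets of N given by

    u_G = \delta_N - \delta_\emptyset + \sum_{i \in N} \left( \delta_{\mathrm{pa}_G(i)} - \delta_{\{i\} \cup \mathrm{pa}_G(i)} \right)

where \delta_A denotes the indicator vector of the set A and \mathrm{pa}_G(i) is the parent set of node i. Score equivalent and decomposable criteria Q can then be written as Q(G, D) = s_D - \langle t_D, u_G \rangle with a constant s_D and a data-dependent vector t_D, which is the affine dependence on the standard imset referred to above.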


Fabian Theis
Computational modeling in biology
TU Munich & Helmholtz Center for Environmental Health, Munich, Germany
Ensemble methods for graph community detection and applications

With the increasing availability of large-scale networks describing, for instance, interactions in biological systems or communication structures in the social sciences, we face the challenge of interpreting and analyzing these data sets in a comprehensive fashion. A common analysis step is to learn closely connected clusters within these networks, so-called communities. After reviewing a popular approach based on graph modularity, I will demonstrate on an example from transport networks that it is not sufficient to only identify (or approximate) the global optimum of this cost function; instead we need to understand, at least in part, the structure of the local optima, for which we propose a novel ensemble extension. In a second example, I will then consider graph structures from biology. A common particularity of these networks lies both in their k-partiteness and in the need for overlapping clusters. I will propose an extended cost function for soft partitioning of such graphs, show how to minimize it, and apply it to a manually annotated bipartite gene-complex graph. I will finish with a Bayesian extension of the learning algorithm to allow for multiple related solutions, as in the transport network example. This is joint work with Florian Blöchl and the groups of Dirk Brockmann and Volker Stümpflen.
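
As background on the modularity criterion mentioned above, here is a minimal Python sketch (the toy graph and the partition are made up) that evaluates the Newman-Girvan modularity of a given node partition; modularity-based community detection searches for a partition maximizing this score.

    import numpy as np

    def modularity(A, labels):
        """Newman-Girvan modularity Q = (1/2m) * sum_ij (A_ij - k_i*k_j/(2m)) [c_i == c_j]
        for a symmetric adjacency matrix A and a community label per node."""
        k = A.sum(axis=1)        # node degrees
        two_m = A.sum()          # 2m = total degree
        same = np.equal.outer(labels, labels)
        return ((A - np.outer(k, k) / two_m) * same).sum() / two_m

    # Toy graph: two triangles (nodes 0-2 and 3-5) joined by one edge.
    A = np.zeros((6, 6))
    for u, v in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
        A[u, v] = A[v, u] = 1.0

    labels = np.array([0, 0, 0, 1, 1, 1])
    print(round(modularity(A, labels), 3))   # 0.357 for the natural two-community split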


Ruriko Yoshida
Department of Statistics
University of Kentucky, Lexington, USA
Optimality of the neighbor joining algorithm and faces of the balanced minimum evolution polytope

Balanced minimum evolution (BME) is a statistically consistent distance-based method to reconstruct a phylogenetic tree from an alignment of molecular data. It is a weighted least-squares solution that puts more confidence on shorter distances than on longer ones. In 2008, Eickmeyer et al. showed that the BME method is equivalent to optimizing a linear function, the dissimilarity map, over the BME polytope, defined as the convex hull of the vectors obtained from Pauplin's formula applied to all binary trees. In this talk, we first show that any subtree-prune-regraft move from one binary tree to another corresponds to an edge of the BME polytope. Moreover, we describe an entire family of faces parametrized by disjoint clades, the clade-faces, which are smaller-dimensional BME polytopes themselves. Then we show that for any choice of picking neighbors (cherries), there exists a distance matrix such that the Neighbor-Joining algorithm returns the BME tree. This is joint work with David Haws and Terrell Hodge.
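
To make the linear-optimization view concrete, here is a minimal Python sketch with a made-up dissimilarity map on four taxa. Each of the three unrooted binary topologies contributes its Pauplin weight vector (a pair of leaves at topological distance t receives weight 2^(1-t)); these vectors are the vertices of the four-taxon BME polytope, and the BME tree is the topology minimizing the linear function given by the dissimilarity map.

    from itertools import combinations

    taxa = ["A", "B", "C", "D"]

    # Hypothetical dissimilarity map: A,B are close and C,D are close.
    d = {("A", "B"): 0.3, ("A", "C"): 1.0, ("A", "D"): 1.1,
         ("B", "C"): 1.0, ("B", "D"): 1.1, ("C", "D"): 0.2}

    def pauplin_vector(cherries):
        """Pauplin weights for a 4-leaf binary tree given by its two cherries:
        a cherry pair is at topological distance 2 (weight 1/2), every other
        pair is at distance 3 (weight 1/4)."""
        return {pair: (0.5 if set(pair) in cherries else 0.25)
                for pair in combinations(taxa, 2)}

    topologies = {                    # the three unrooted binary trees on 4 taxa
        "AB|CD": [{"A", "B"}, {"C", "D"}],
        "AC|BD": [{"A", "C"}, {"B", "D"}],
        "AD|BC": [{"A", "D"}, {"B", "C"}],
    }

    lengths = {}
    for name, cherries in topologies.items():
        w = pauplin_vector(cherries)                  # a vertex of the BME polytope
        lengths[name] = sum(d[p] * w[p] for p in d)   # linear function <d, w>

    print(lengths)
    print("BME tree:", min(lengths, key=lengths.get))   # AB|CD here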