
Robust and Self-organizing Networks

Final Report Summary - ROSES (Robust and Self-organizing Networks)

ROSES: Robust and self-organizing networks.

Networks in transport, logistics, and telecommunication are pivotal for modern societies. These networks are complex and of very large scale. They must be operated both efficiently and reliably.

In many applications the networks are not controlled by a central, optimizing authority. They are operated by selfish players. A market mechanism must guarantee that the players, in pursuit of their own benefit, achieve global efficiency and reliability.

A market mechanism that instills robustness in a network controlled by selfish players, at minimum loss in efficiency, is the method of choice to reconcile the benefits of a free market economy with the need for security in some of the most influential networks of our time: the networks created by credit obligations. To design regulations yielding robust credit networks, it seems reasonable to complement classical stochastic techniques with robust network optimization and algorithmic game theory.

The goal of this project is to extend recoverable robustness to applications with integer programming recovery, to applications without central control, and to stability in credit networks.

We developed a general approach to lift existing techniques for an integer linear program to its cost-robust counterpart. The method also works if the uncertainty affects a (small) number of constraints. We demonstrated the applicability of the approach by providing algorithms for robust counterparts of totally unimodular integer programs, integer programs with two variables per constraint, and unbounded knapsack problems.
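To give a flavour of how a nominal solver can be lifted to a cost-robust counterpart, the following sketch applies the classical Bertsimas-Sim reduction to a shortest path problem (a totally unimodular special case): with a budget of at most gamma simultaneous cost deviations, the robust problem decomposes into one nominal shortest path computation per candidate threshold. The instance, the attribute names 'cost' and 'dev', and the use of networkx are illustrative assumptions; this is the classical reduction, not the project's specific lifting technique.

    import networkx as nx

    def robust_shortest_path(G, s, t, gamma):
        # Bertsimas-Sim reduction: solve one nominal shortest path problem per
        # candidate threshold theta, taken from the set of deviations plus zero.
        # Each arc carries a nominal 'cost' >= 0 and a maximum deviation 'dev';
        # at most gamma arc costs deviate simultaneously (illustrative model).
        thetas = sorted({d['dev'] for _, _, d in G.edges(data=True)} | {0.0}, reverse=True)
        best, best_path = float('inf'), None
        for theta in thetas:
            H = nx.DiGraph()
            for u, v, d in G.edges(data=True):
                H.add_edge(u, v, w=d['cost'] + max(d['dev'] - theta, 0.0))
            length, path = nx.single_source_dijkstra(H, s, t, weight='w')
            if gamma * theta + length < best:
                best, best_path = gamma * theta + length, path
        return best, best_path

    # Hypothetical instance: a cheap but volatile two-arc path vs. a stable direct arc.
    G = nx.DiGraph()
    G.add_edge('s', 'a', cost=2.0, dev=3.0)
    G.add_edge('a', 't', cost=2.0, dev=3.0)
    G.add_edge('s', 't', cost=6.0, dev=0.5)
    print(robust_shortest_path(G, 's', 't', gamma=1))   # (6.5, ['s', 't'])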

We developed a novel, very elegant approach to find optimal appointment schedules in a recovery-robust setting. The basic question is when to schedule a series of appointments if the duration of each appointment can vary within an interval. The goal is to minimize the worst case of the delay cost plus the opportunity cost of idle time, which occurs when an appointment finishes but the next one is scheduled much later. We find a closed-form description of an optimal schedule for this basic problem, which also allows us to approximate the optimal order. The particular elegance of this approach is that, unlike standard robust methods, it avoids arbitrary truncations of the scenario space without being over-conservative.
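As an illustration of the setting, the following sketch evaluates the worst-case cost of a fixed schedule under interval durations, assuming a simple cost model with uniform per-unit delay and idle rates (hypothetical parameters). Since completion times are convex in the durations, the worst case over the box of durations is attained at an extreme scenario, so enumerating the 2^n vertices is exact for small n; the closed-form optimal schedule itself is not reproduced here.

    from itertools import product

    def worst_case_cost(t, lo, hi, delay_rate=1.0, idle_rate=1.0):
        # t: scheduled start times (sorted); lo, hi: duration bounds per appointment.
        # Cost = delay_rate * (total late start of appointments)
        #      + idle_rate  * (total time spent waiting for the next appointment).
        n = len(t)
        worst = 0.0
        for durations in product(*[(lo[i], hi[i]) for i in range(n)]):
            finish = t[0]                 # completion time of the previous appointment
            delay = idle = 0.0
            for i in range(n):
                start = max(t[i], finish)
                delay += start - t[i]                 # appointment i starts late
                if i > 0:
                    idle += max(0.0, t[i] - finish)   # idle time before appointment i
                finish = start + durations[i]
            worst = max(worst, delay_rate * delay + idle_rate * idle)
        return worst

    # Three appointments with uncertain durations (hypothetical data).
    print(worst_case_cost(t=[0, 30, 70], lo=[20, 25, 15], hi=[40, 45, 30]))   # 25.0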

We consider the networks formed by financial entities and the liabilities among them. We provide a linear-programming-based method to determine the unique maximal clearing vector. The method is computationally efficient and yields dual variables that measure the systemic risk incurred by individual players. Further, we extend the method to find optimal bailout strategies that either minimize the cost of an intervention or maximize the effect of an intervention with a fixed budget.
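The following sketch illustrates the linear-programming view on a toy network in the spirit of the Eisenberg-Noe clearing model; the liability matrix and external assets are made-up numbers, and under standard regularity conditions the LP optimum coincides with the maximal clearing vector.

    import numpy as np
    from scipy.optimize import linprog

    # Toy instance: L[i, j] is the liability of bank i to bank j, e[i] its external assets.
    L = np.array([[0.0, 2.0, 1.0],
                  [1.0, 0.0, 3.0],
                  [2.0, 1.0, 0.0]])
    e = np.array([1.0, 0.5, 0.5])

    p_bar = L.sum(axis=1)                                   # total liabilities per bank
    Pi = np.divide(L, p_bar[:, None], out=np.zeros_like(L), where=p_bar[:, None] > 0)

    # Maximal clearing vector p: maximize 1'p subject to
    #   0 <= p <= p_bar          (no bank pays more than it owes)
    #   p <= e + Pi' p           (no bank pays more than its available assets)
    n = len(e)
    res = linprog(c=-np.ones(n),
                  A_ub=np.eye(n) - Pi.T, b_ub=e,
                  bounds=[(0.0, pb) for pb in p_bar])
    print("clearing payments:", np.round(res.x, 3))
    print("shortfalls:       ", np.round(p_bar - res.x, 3))
    # With recent SciPy (HiGHS) the duals res.ineqlin.marginals indicate how sensitive
    # total payments are to each bank's external assets.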

Moreover, a central authority can identify players that will depend on its help in the near future and offer them a bailout guarantee in exchange for abstaining from withdrawing capital beforehand. A computational study shows that this mechanism significantly reduces the cost of an eventual bailout.

We study the basic combinatorial network flow problem from the viewpoint of robust optimization. We show that the robust maximum flow problem can be solved in polynomial time, but the robust minimum cut problem is NP-hard. We further prove that the adaptive, i.e., two-stage, versions are NP-hard. Finally, we characterize the adaptive model as a two-person zero-sum game and prove the existence of an equilibrium in such games.

Moreover, we consider a path-based formulation of flows, in contrast to the more commonly used arc-based formulation. This leads to a different model of robustness for maximum flows. We analyze this problem as well and develop a simple linear optimization model to obtain approximate solutions. Furthermore, we introduce the concept of adaptive maximum flows over time in networks with transit times on the arcs. Unlike in the deterministic case, this problem is NP-hard on series-parallel graphs even when only one arc is allowed to fail. Finally, we propose heuristics based on linear optimization models that exhibit strong computational performance on large-scale instances.
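To illustrate the path-based notion of robustness, the following sketch evaluates the guaranteed value of a given path decomposition when any single arc may fail and the flow on paths through the failed arc is lost; the instance and the single-failure scenario set are illustrative assumptions.

    def robust_value_single_failure(path_flows):
        # path_flows: list of (path, flow) pairs, each path given as a list of arcs.
        # In the path-based model, flow on paths through the failed arc is lost and
        # cannot be rerouted, so the adversary removes the arc carrying most path flow.
        total = sum(f for _, f in path_flows)
        through = {}
        for path, f in path_flows:
            for arc in path:
                through[arc] = through.get(arc, 0) + f
        return total - max(through.values(), default=0)

    # Two of the three s-t paths share the arc ('s', 'v'); the adversary fails that arc.
    paths = [([('s', 'v'), ('v', 't')], 3),
             ([('s', 'v'), ('v', 'w'), ('w', 't')], 2),
             ([('s', 'u'), ('u', 't')], 4)]
    print(robust_value_single_failure(paths))   # 9 units nominally, 4 survive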

We commence an algorithmic study of bulk robustness, a new model of robustness in combinatorial optimization. Unlike most existing models, bulk-robust combinatorial optimization features a highly non-uniform failure model. Instead of an interdiction budget, bulk-robust counterparts provide an explicit list of interdiction sets comprising the admissible set of scenarios, thus allowing one to model correlations between failures of different components in the system, interdiction sets of variable cardinality, and more. The resulting model is suitable for capturing failures of complex structures in the system.
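The following sketch illustrates the bulk-robust model on a toy shortest path instance: given an explicit list of interdiction sets, it searches by brute force for a cheapest edge set that still connects s and t after any one interdiction set is removed. The instance, the use of networkx, and the exhaustive search (viable only for tiny inputs) are illustrative assumptions, not the project's algorithms.

    from itertools import combinations
    import networkx as nx

    def survives_all_scenarios(edge_set, scenarios, s, t):
        # True iff s and t remain connected after removing any one interdiction set.
        for fail in scenarios:
            G = nx.Graph()
            G.add_nodes_from([s, t])
            G.add_edges_from(e for e in edge_set if e not in fail)
            if not nx.has_path(G, s, t):
                return False
        return True

    def bulk_robust_shortest_path(edges, cost, scenarios, s, t):
        # Exhaustive search over edge subsets -- only viable for tiny instances.
        best, best_cost = None, float('inf')
        for k in range(len(edges) + 1):
            for subset in combinations(edges, k):
                c = sum(cost[e] for e in subset)
                if c < best_cost and survives_all_scenarios(subset, scenarios, s, t):
                    best, best_cost = subset, c
        return best, best_cost

    edges = [('s', 'a'), ('a', 't'), ('s', 'b'), ('b', 't'), ('s', 't')]
    cost = {('s', 'a'): 1, ('a', 't'): 1, ('s', 'b'): 2, ('b', 't'): 2, ('s', 't'): 5}
    # Correlated failures: both arcs through 'a' fail together, or the direct arc fails.
    scenarios = [{('s', 'a'), ('a', 't')}, {('s', 't')}]
    print(bulk_robust_shortest_path(edges, cost, scenarios, 's', 't'))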

We provide complexity results and approximation algorithms for bulk-robust counterparts of the Minimum Matroid Basis problem and the Shortest Path problem. Our results rely on a variety of techniques and outline the rich and heterogeneous combinatorial structure of bulk-robust optimization.

Operations research applications often pose multicriteria problems. A multicriteria optimization setting models different, potentially conflicting interests in a society or among different players. Mathematical research on multicriteria problems predominantly revolves around the set of Pareto-optimal solutions, while in practice, methods that output a single solution are more widespread. In real-world multicriteria optimization, the reference point method is a widely used and successful example of such a method. A reference point solution is the solution closest to a given reference point in the objective space. Such methods can be viewed as fair mechanisms to reconcile the interests of different players.
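A minimal sketch of the reference point idea, assuming the candidate objective vectors are given explicitly and that closeness is measured by a weighted Chebyshev distance (one common choice; the norm and the data are assumptions made for illustration):

    def reference_point_solution(candidates, reference, weights=None):
        # Among explicitly given candidate objective vectors, return the one closest
        # to the reference point under a weighted Chebyshev distance (other norms
        # can be plugged in the same way).
        if weights is None:
            weights = [1.0] * len(reference)

        def distance(obj):
            return max(w * abs(o - r) for w, o, r in zip(weights, obj, reference))

        return min(candidates.items(), key=lambda kv: distance(kv[1]))

    # Two objectives (say, cost and delay) and three candidate solutions (made up).
    cands = {'A': (10.0, 7.0), 'B': (12.0, 4.0), 'C': (8.0, 9.0)}
    print(reference_point_solution(cands, reference=(9.0, 5.0)))   # ('A', (10.0, 7.0))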

We study the approximation of reference point solutions. In particular, we establish that approximating reference point solutions is polynomially equivalent to approximating the Pareto set. Complementing these results, we show for a number of general algorithmic techniques in single-criterion optimization how they can be lifted to reference point optimization. In particular, we lift the connection between dynamic programming and FPTASs, as well as oblivious LP-rounding techniques. The latter applies, e.g., to set cover and several machine scheduling problems.

Real-time systems increasingly contain processing units with multiple cores. To use this additional computational power in hard-deadline environments, one needs schedulability tests for task models that represent the possibility of parallel execution of the jobs of a task. A standard model is to represent a (sporadically) recurrent task by a directed acyclic graph (DAG). The nodes of the DAG correspond to the jobs of the task. All of them are released simultaneously and have to be completed within a common relative deadline, and some pairs of jobs are linked by a precedence constraint, i.e., an arc of the DAG.

This poses new challenges for analyzing whether a task system is feasible, in particular for the commonly used online algorithms earliest deadline first (EDF) and deadline monotonic (DM). We completely close the gap between the algorithmic understanding of feasibility analysis for the usual sporadic task model and the case where each sporadic task is a DAG. We show for DAG tasks that EDF has a tight speedup bound of 2 - 1/m, where m is the number of processors, while DM has a speedup bound of at most 3 - 1/m. Moreover, we present polynomial and pseudopolynomial time tests, of differing effectiveness, for determining whether a set of sporadic DAG tasks can be scheduled by EDF or DM to meet all deadlines on a specified number of processors. We remark that the effectiveness of some of our tests matches the best known algorithms for ordinary sporadic task sets, thus closing the gap.
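As a small illustration of the quantities such tests are built on, the following sketch computes the total work and the critical-path length of a DAG task and checks the classical Graham-style bound span + (work - span)/m <= deadline, which is sufficient for a single release of the task to meet its deadline under any work-conserving schedule on m processors; it is not one of the project's EDF/DM tests for sporadic task sets.

    def dag_parameters(wcet, edges):
        # wcet: node -> worst-case execution time; edges: precedence arcs (u before v).
        # Returns the total work and the critical-path length of the DAG task.
        succ = {v: [] for v in wcet}
        indeg = {v: 0 for v in wcet}
        for u, v in edges:
            succ[u].append(v)
            indeg[v] += 1
        finish = {v: wcet[v] for v in wcet}       # longest path ending at v
        queue = [v for v in wcet if indeg[v] == 0]
        while queue:                              # topological sweep (Kahn's algorithm)
            u = queue.pop()
            for v in succ[u]:
                finish[v] = max(finish[v], finish[u] + wcet[v])
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)
        return sum(wcet.values()), max(finish.values())

    def single_job_meets_deadline(wcet, edges, m, deadline):
        # Graham-style bound: any work-conserving schedule of one DAG job on m cores
        # finishes by span + (work - span)/m, so this condition is sufficient.
        work, span = dag_parameters(wcet, edges)
        return span + (work - span) / m <= deadline

    # Hypothetical DAG task with four jobs and a diamond-shaped precedence structure.
    wcet = {'a': 2, 'b': 3, 'c': 3, 'd': 1}
    edges = [('a', 'b'), ('a', 'c'), ('b', 'd'), ('c', 'd')]
    print(dag_parameters(wcet, edges))                              # (9, 6)
    print(single_job_meets_deadline(wcet, edges, m=2, deadline=8))  # True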