
2012-2013

June 19, 2013, 10:00
Reachability Problems in Hybrid Systems
Paul Bell, Loughborough University

There has been a huge amount of interest recently in understanding the complexity and computability of various questions for hybrid systems, such as reachability, stability, convergence and control. In this talk, we will present a survey of results concerning reachability in various hybrid systems, such as piecewise constant derivative systems (PCDs), piecewise affine maps, and timed and rectangular automata, together with some new results concerning hierarchical PCDs.

June 3, 2013, 14:00
On computing the Gromov hyperbolicity
Aurélien Lancin, INRIA Sophia-Antipolis

The Gromov hyperbolicity is an important parameter for analyzing complex networks, since it expresses how close the metric structure of a network is to that of a tree. It is for instance used to provide bounds on the expected stretch of greedy-routing algorithms in Internet-like graphs. However, the best known algorithm for computing this parameter has time complexity O(n^3.69), which is prohibitive for large-scale graphs. We first proposed an algorithm for determining the hyperbolicity of a graph that is scalable for large graphs. The time complexity of this algorithm is output-sensitive: it depends on the distribution of shortest-path distances in the graph and on the computed value of the hyperbolicity. Although its worst-case time complexity is O(n^4), this new algorithm is much faster than previous proposals in practice, since it uses bounds to cut the search space, which can in some cases be reduced to a single operation. Secondly, we showed how to use a decomposition by clique separators to reduce the size of the input graph. Thirdly, we proposed both multiplicative-factor and additive-constant approximation algorithms. Finally, we evaluated the performance of our algorithms on various large graphs.
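
For concreteness, the hyperbolicity of a graph can be computed from the four-point condition over all quadruples of nodes. The following Python sketch is the naive O(n^4) baseline the abstract alludes to, without the pruning bounds of the talk's algorithm (networkx assumed, G connected):

    from itertools import combinations
    import networkx as nx

    def hyperbolicity(G):
        # all-pairs shortest-path distances (assumes G is connected)
        d = dict(nx.all_pairs_shortest_path_length(G))
        delta = 0.0
        for a, b, c, e in combinations(G.nodes(), 4):
            # the three pairwise sums of the four-point condition
            s = sorted([d[a][b] + d[c][e],
                        d[a][c] + d[b][e],
                        d[a][e] + d[b][c]])
            # delta is half the gap between the two largest sums
            delta = max(delta, (s[2] - s[1]) / 2.0)
        return delta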

May 27-29, 2013
Our fantastic annual "Research Retreat"
Located in Wonderful Daverdisse this year

More information soon.

May 17, 2013, 11:00
New condition for the convergence of products of stochastic matrices using the Sarymsakov class
Weiguo Xia, University of Groningen

The convergence analysis of products of stochastic matrices is critical in establishing the effectiveness of distributed coordination algorithms for multi-agent systems. We provide new insight into the somewhat obscure definition of the Sarymsakov class of stochastic matrices and use it to construct a new necessary and sufficient condition for the convergence of products of stochastic matrices. These results are then applied to investigate a specific coordination task in multi-agent systems with asynchronous update events. The set of scrambling stochastic matrices, a subclass of the Sarymsakov class, is utilized to establish the convergence of the agents' states even when there is no common clock for the agents to synchronize their update actions.
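As an illustration of the scrambling subclass mentioned above: a stochastic matrix is scrambling if every pair of rows has positive entries in some common column, a property that is straightforward to test. A minimal sketch in Python (numpy assumed; a toy example, not the paper's construction):

    import numpy as np

    def is_scrambling(P, tol=1e-12):
        """Check that every pair of rows of the stochastic matrix P
        shares a column in which both rows are positive."""
        n = P.shape[0]
        for i in range(n):
            for j in range(i + 1, n):
                if not np.any((P[i] > tol) & (P[j] > tol)):
                    return False
        return True

    # Products of scrambling matrices converge to a rank-one matrix,
    # which is what drives consensus of the agents' states.
    P = np.array([[0.5, 0.5, 0.0],
                  [0.5, 0.0, 0.5],
                  [0.0, 0.5, 0.5]])
    print(is_scrambling(P))  # True: rows pairwise share a positive column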

May 3, 2013, 11:00
GraPHedron and House of Graphs – tools for graph theorists
Hadrien Melot (University of Mons)

In this talk we present two web-based tools for researchers in graph theory. First, we show how the system GraPHedron allows one to derive new conjectures (interactively or automatically) in extremal graph theory. We will briefly survey some theoretical results obtained with the help of GraPHedron. Then we present a database of graphs (House of Graphs). The key principle is to have a searchable database and to offer, next to complete lists of some graph classes, a list of special graphs that have already turned out to be interesting and relevant in the study of graph-theoretic problems (e.g., as extremal graphs or as counterexamples). To build an initial database, we used GraPHedron to generate and solve a set of problems in an automated way. (The House of Graphs project is joint work with Gunnar Brinkmann, Kris Coolsaet and Jan Goedgebeur (UGent).)

April 26, 2013, 11:00
Applying High Performance Computing to Multi-Area Stochastic Unit Commitment for Renewable Energy Integration
Anthony Papavasiliou (UCL)

We present a parallel implementation of a Lagrangian relaxation algorithm for solving stochastic unit commitment subject to uncertainty in wind power supply and generator and transmission line failures. We present a scenario selection algorithm inspired by importance sampling in order to formulate the stochastic unit commitment problem, and validate its performance by comparing it to a stochastic formulation with a very large number of scenarios, which we are able to solve through parallelization. We examine the impact of narrowing the duality gap on the performance of stochastic unit commitment and compare it to the impact of increasing the number of scenarios in the model. We report results on the running time of the model and discuss the applicability of the method in an operational setting.

April 19, 2013, 11:00
Recent Advances in Nonnegative Matrix Factorization
Nicolas Gillis (UCL)

Nonnegative matrix factorization (NMF) is a powerful dimensionality reduction technique, as it automatically extracts sparse and meaningful features from a set of nonnegative data vectors. NMF has many applications, for example in text mining, graph clustering (including community detection, which we briefly describe), and hyperspectral imaging. Unfortunately, NMF is NP-hard in general, and highly ill-posed. However, NMF has recently been shown to be tractable under the separability assumption, which requires the columns of the input data matrix to belong to the convex cone generated by a small subset of these columns. Since then, several algorithms have been proposed to handle this subclass of NMF problems under any small perturbation of the input matrix (these are called near-separable NMF problems). In this talk, we present some recent advances for solving near-separable NMF, including an approach using successive orthogonal projections (joint work with S. Vavasis), and another using linear optimization (joint work with R. Luce).
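
The successive-projection idea can be sketched compactly: repeatedly pick the column of maximal norm and project the data onto the orthogonal complement of that column. A minimal Python sketch of this basic scheme (not necessarily the authors' exact algorithm; numpy assumed):

    import numpy as np

    def spa(X, r):
        """Successive projection sketch for near-separable NMF: return
        indices of r columns of X presumed to generate the convex cone
        containing all columns of X."""
        R = X.astype(float).copy()
        idx = []
        for _ in range(r):
            j = int(np.argmax(np.linalg.norm(R, axis=0)))  # column of max norm
            idx.append(j)
            u = R[:, j] / np.linalg.norm(R[:, j])
            R = R - np.outer(u, u @ R)  # project out the selected direction
        return idx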

March 22, 2013, 11:00
Explaining Zipf’s law via Mental Lexicon
Weibing Deng (ISMANS, Le Mans, France)

Zipf's law is the major regularity of statistical linguistics and has served as a prototype for rank-frequency relations and scaling laws in the natural sciences. Here we show that Zipf's law, together with its applicability to a single text and its generalizations to high and low frequencies, including hapax legomena, can be derived from assuming that the words are drawn into the text with random probabilities, while their a priori density relates, via Bayesian statistics, to the stable and efficient organization of the mental lexicon of the author who produced the text.
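
For reference, the rank-frequency form of the law in question, in standard notation (the generalizations discussed in the talk modify the exponent and the behavior of the tails):

    % Zipf's law: the frequency f of the word of rank r decays as a power law,
    f(r) \;\propto\; r^{-\gamma}, \qquad \gamma \approx 1 .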

March 15, 2013, 11:00
Aggregating Probability Distribution Functions Over Time-varying Anonymous Networks
Hakan Terelius (KTH Stockholm)

The aggregation and estimation of values over networks is fundamental for distributed applications, such as wireless sensor networks. Estimating the average, minimal and maximal values has already been extensively studied in the literature. Here, we focus on estimating the entire distribution of values in a network with anonymous agents. In particular, we compare two different estimation strategies in terms of their convergence speed and accuracy. Furthermore, we also consider time-varying networks, where the size, topology and values can change over time, under some regularity assumptions.

March 8, 2013, 11:00
Numerical issues in multiscale simulation and the role of approximate macroscopic models
Giovanni Samaey (KULeuven)

Nowadays, motivated by the availability of massive amounts of computing power, computational scientists (for instance, in biology, chemistry or physics) have an inclination to move away from overly simplified (macroscopic) descriptions, and to start including processes at microscopic space and time scales directly in the mathematical models. Nevertheless, simulation of multiscale models remains computationally challenging, and requires the development of multiscale numerical methods. Moreover, one might be interested in macroscopic steady states and their stability, even if the underlying microscopic system keeps evolving. In this talk, we give an overview of a set of computational methods that couple a macroscopic level (at which a macroscopic model is not, or only incompletely, known) to a microscopic level, where only appropriately initialized, local, short bursts of simulation are performed. While such methods are highly appealing, and find their way into many applications, many questions on the numerical analysis of such schemes remain open. Here, we address some of these questions. We will discuss numerical errors that arise when transferring information between the microscopic and macroscopic levels of description, and look into the use of approximate macroscopic models to reduce the computational cost.
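
A toy version of coarse projective integration, one member of this family of methods, may help fix ideas: run a short burst of the microscopic simulator, estimate the macroscopic time derivative from the burst, and take a large extrapolation step. A minimal Python sketch, with a trivial one-variable "microscopic" model (explicit Euler on du/dt = -u) standing in for a real simulator:

    import math

    def projective_step(u, dt, k, Dt):
        for _ in range(k):                 # burst of k small micro steps
            u_prev, u = u, u + dt * (-u)   # toy micro model: du/dt = -u
        slope = (u - u_prev) / dt          # estimated macroscopic derivative
        return u + Dt * slope              # projective leap of size Dt

    u, t, dt, k, Dt = 1.0, 0.0, 1e-3, 10, 5e-2
    while t < 2.0:
        u = projective_step(u, dt, k, Dt)
        t += k * dt + Dt
    print(u, math.exp(-t))                 # sketch tracks the exact decay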

March 1, 2013, 11:00
Surprise maximization reveals the community structure of complex networks
Rodrigo Aldecoa (Instituto de Biomedicina de Valencia, Spain)

Detecting community structure in networks is of high interest in many scientific fields. Nodes within the same community are expected to share attributes, common properties or functional relationships and, therefore, the characterization of those communities can improve our understanding of complex systems. In this talk, I will present the main concepts of community detection and our recent work in the field. In particular, I will introduce Surprise, our proposed measure to evaluate the quality of the partition of a network into communities. Our results demonstrate that the use of Newman and Girvan's modularity (the most popular measure proposed to this end) is incorrect. In contrast, the strategy of maximizing Surprise recovers the expected communities in all networks tested. Moreover, the extremely small error made by this approach suggests that Surprise maximization could be an optimal way to detect the real community structure of complex networks.
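
Surprise evaluates how unlikely the observed number of intra-community links is under a hypergeometric null model. A minimal Python sketch of its computation, following the published definition as I understand it (scipy assumed):

    from scipy.stats import hypergeom

    def surprise(n_nodes, n_links, intra_pairs, intra_links):
        """Surprise of a partition: minus the log-probability of seeing
        at least `intra_links` intra-community links when the graph's
        `n_links` links are placed uniformly among all F node pairs,
        `intra_pairs` of which lie inside communities.  Larger is better."""
        F = n_nodes * (n_nodes - 1) // 2   # all possible links
        # logsf(k) = log P(X > k), so k = intra_links - 1 gives >= intra_links
        return -hypergeom.logsf(intra_links - 1, F, intra_pairs, n_links)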

February 22, 2013, 11:00
Modelling and mapping human population distributions and movements in malaria endemic countries
Catherine Linard (ULB) and Andy Tatem (University of Florida)

Modelling studies on the spatial distribution and spread of infectious diseases are becoming increasingly detailed and sophisticated, with global risk mapping and epidemic modelling studies now popular. Yet, in deriving estimates of populations at risk of disease, these spatial models must rely on existing global and regional datasets on population distribution, which are often based on outdated and coarse-resolution data. Here we first describe the development of high-resolution spatial population datasets for Africa, based on the use of land cover data to redistribute census counts and map human population distributions continent-wide. Human movements at multiple spatial and temporal scales also impact the epidemiology and control of infectious diseases. While movement patterns are becoming relatively well quantified in high-income regions of the world, they remain poorly understood in low-income regions, where the burden of infectious diseases is greatest. The advent of novel spatial digital datasets, however, is increasing our ability to measure human movement dynamics across large areas in these regions, and, when combined with mathematical models of disease transmission, such data can provide valuable guidance for control. Here, we will discuss the potential of these new datasets and methods in disease control strategic planning and present examples of the use of (i) cellphone call data records in malaria control and elimination planning in Zanzibar, Namibia and Kenya, and (ii) satellite-derived night-time lights in measles vaccination strategies in Niger.

February 8, 2013, 11:00
Mean square error in randomized averaging algorithms
Julien Hendrickx (UCL)

We consider randomized discrete-time consensus systems that compute the average on expectation, and provide a new upper bound on the mean square deviation of the consensus value from the initial average. Our results are based on a new approach, unrelated to the convergence properties of the system and easily applicable to many classes of systems. We show that a certain asymptotic accuracy can be guaranteed when there are few simultaneous interactions or when the simultaneous interactions are sufficiently uncorrelated. We particularize our result to various algorithms proposed in the literature, obtaining bounds that match or outperform the previously available ones.
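
One standard member of this class of systems is broadcast gossip, which preserves the average only on expectation. A small Python simulation sketch (hypothetical parameters, numpy assumed) that measures the squared deviation of the final consensus value from the initial average:

    import numpy as np

    rng = np.random.default_rng(0)
    n, mix, n_steps = 20, 0.5, 20000
    x0 = rng.standard_normal(n)
    x = x0.copy()

    # Broadcast gossip on the complete graph: a uniformly random node
    # broadcasts and all others move toward its value.  The sum is not
    # conserved, but the average is preserved on expectation.
    for _ in range(n_steps):
        i = rng.integers(n)
        others = np.arange(n) != i
        x[others] = (1 - mix) * x[others] + mix * x[i]

    # squared deviation of the (near-)consensus value from the initial average
    print((x.mean() - x0.mean()) ** 2)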

February 1, 2013, 11:00
Finding the stationary states of Markov chains by iterative methods
Yurii Nesterov (UCL)

In this talk, we develop new methods for approximating dominant eigenvectors of column-stochastic matrices. We analyze the Google matrix, and present an averaging scheme with a linear rate of convergence in terms of the 1-norm distance. To extend this convergence result to the general case, it is enough to assume the existence of a positive row in the matrix. Our new numerical scheme, the Reduced Power Method, can be seen as a proper averaging of a random walk in a graph with a certain service rate provided by global authorities. We also analyze the usual Power Method and obtain convenient conditions for its linear rate of convergence with respect to the 1-norm. This is joint work with Arkadi Nemirovski (Georgia Tech).
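
As background, the classical Power Method iterates x_{k+1} = A x_k and, under conditions such as the positive row assumed in the abstract, converges linearly to the dominant eigenvector. A minimal Python sketch monitoring convergence in 1-norm (numpy assumed; the example matrix is an illustrative "Google-like" matrix, not from the talk):

    import numpy as np

    def power_method(A, tol=1e-10, max_iter=10000):
        """Power iteration for a column-stochastic matrix A.  Iterates
        stay probability vectors, so no renormalization is needed;
        convergence is monitored in 1-norm."""
        n = A.shape[0]
        x = np.full(n, 1.0 / n)
        for _ in range(max_iter):
            y = A @ x
            if np.abs(y - x).sum() < tol:  # 1-norm stopping rule
                return y
            x = y
        return x

    # Small column-stochastic example with damping, so every entry is positive.
    P = np.array([[0.0, 0.5, 0.5],
                  [0.5, 0.0, 0.5],
                  [0.5, 0.5, 0.0]])
    G = 0.85 * P + 0.15 / 3
    print(power_method(G))  # approximate stationary distribution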

January 30, 2013, 11:00
Importance of individual events in temporal networks
Taro Takaguchi (University of Tokyo)

Data derived from social interactions, such as face-to-face conversations, e-mail exchanges, and phone calls, have been accumulated in the form of time-ordered sequences of events between pairs of individuals. Such data are collectively called temporal networks [1], whereby links with time stamps are assumed between pairs of nodes (i.e., individuals) involved in interaction events. We propose a centrality measure for each interaction event [2]. We assume that an event is important if it conveys new information about others to the two individuals involved in the event. We define the importance of an event by generalizing the concept of the advance of an event [3] to the case in which an individual can be involved in multiple events in a single time unit. By applying the proposed importance measure to real data sets, we find that it adequately represents the centrality of each event in the following sense. When we remove a small fraction of events with large importance values, the connectivity of the remaining temporal network decreases drastically. By contrast, the connectivity changes little when we remove a large fraction of events with small importance values (the robustness property). We perform the same event-removal analysis on randomized event sequences to investigate the origin of the robustness property. The discovered origins are also discussed. [1] P. Holme and J. Saramaki, Phys. Rep. 519, 97 (2012). [2] T. Takaguchi et al., New J. Phys. 14, 093003 (2012). [3] G. Kossinets et al., in Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, p. 435 (2008).

December 14, 2012, 11:00
Exploring Road Networks with Greedy Navigators and Their Core-Periphery Structures
Dr. Sang Hoon Lee (Oxford Centre for Industrial and Applied Mathematics, Oxford University, UK)

During the last decade of network research focusing on structural and dynamical properties of networks, the role of network users has been more or less underestimated from the bird's-eye view of the global perspective. In this era of smartphones equipped with global positioning systems, however, a user's ability to access local geometric information and find efficient pathways on networks plays a crucial role, rather than the globally optimal pathways. In this talk, I will present a simple greedy spatial navigation strategy as a probe to explore spatial networks. These greedy navigators use directional information in every move they take, avoiding traps in dead ends thanks to their memory of previous routes. I will show how centrality measures have to be modified to incorporate the navigators' behavior, and present the intriguing effect of the navigators' greediness: removing some edges may actually enhance the routing efficiency, which is reminiscent of Braess's paradox. In addition, using samples of road structures in large cities around the world, it is shown that the navigability measure we define reflects unique structural properties, which are not easy to predict from other topological characteristics. The inverse problem of optimizing network structures and their geometric layouts for better greedy navigability will also be discussed. Last but not least, I will present other properties of road networks with regard to core-periphery structures, defined on the grounds of topological/geometrical connections and backup pathways in terms of transportation.
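
The basic greedy strategy can be stated in a few lines: at each node, step to the neighbor geometrically closest to the target. A minimal Python sketch (networkx assumed; node positions stored in a "pos" attribute is an assumption of this illustration), without the memory-based dead-end escape used in the talk:

    import math
    import networkx as nx

    def greedy_route(G, source, target):
        """Greedy spatial navigation: always step to the neighbor closest
        (in Euclidean distance) to the target.  Returns the route, or
        None when stuck in a dead end (the talk's navigators escape dead
        ends using memory of previous routes)."""
        dist = lambda u, v: math.dist(G.nodes[u]["pos"], G.nodes[v]["pos"])
        route = [source]
        while route[-1] != target:
            here = route[-1]
            nxt = min(G.neighbors(here), key=lambda v: dist(v, target))
            if dist(nxt, target) >= dist(here, target):
                return None  # no neighbor makes progress: dead end
            route.append(nxt)
        return route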

Thursday December 6, 2012, 11:00
Benchmarking network analysis: Maximum Entropy Modeling
Valentin Dalibard (Cambridge University, UK)

Empirical analysis of real-world graphs has led to the discovery of structural properties or patterns of networks such as heavy-tailed degree distributions, small diameter, community structure and so on. In recent years, the overwhelming number of such measures, for instance the large number of community detection methods, has made it difficult to establish a precise understanding of what the key features of real-world graphs are.

In this talk, I will introduce maximum entropy models and present how they can be applied to networks. I will then show how, given a number of real-world graph properties, one can evaluate the maximum entropy of a graph fitting them. In this way, the information gained through the knowledge of these properties can be measured and compared.
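
As a reminder of the formalism (standard exponential-family notation, not necessarily the speaker's): fixing the expected values of observed properties c_1(G), ..., c_k(G) to their measured values yields the maximum-entropy distribution over graphs

    P(G) \;=\; \frac{1}{Z(\theta)}\,
               \exp\!\Big(\sum_{i=1}^{k} \theta_i\, c_i(G)\Big),
    \qquad
    Z(\theta) \;=\; \sum_{G'} \exp\!\Big(\sum_{i=1}^{k} \theta_i\, c_i(G')\Big),

where the Lagrange multipliers theta_i are tuned so that the expected properties match their observed values.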

November 30, 2012, 11:00
Constrained Graph Resampling for Group Assessment in Human Social Networks
Nicolas Tremblay (ENS Lyon, France)

The increasing availability of time- and space-resolved data of human activities and interactions gives insight into both the static and dynamic properties of human behavior. In practice, nevertheless, real-world datasets can often be considered as only one realisation of a particular event, giving rise to a key issue in social network analysis: the statistical significance of these properties. We focus in this work on features regarding subgroups of the networks and present a resampling (a.k.a. bootstrapping) method that enables us to add confidence intervals to such features. This in turn gives us the opportunity to compare subgroups' behaviors within any network. We apply this method to a new high-resolution dataset of face-to-face proximity collected during two co-located scientific conferences; it enables us to probe whether or not co-locating two conferences is an effective way of bringing together two different communities.

November 23, 2012, 11:00
Power Network Cyber-physical Security Analysis and Its Computation Problem
Kin Cheong Sou (KTH Stockholm, Sweden)

SCADA (Supervisory Control and Data Acquisition) systems are widely used to monitor and control the behavior of electric power networks. As the data communication of SCADA systems becomes increasingly dependent on the Internet, the threat of a data attack becomes more imminent. For better protection, it is vital to identify the measurements in the network that are vulnerable to data attack. An important vulnerability analysis problem can be posed as a constrained cardinality minimization problem, which in general is NP-hard (i.e., not tractable). This cardinality minimization problem has the electric network interpretation that, under certain constraints, the nodal voltages should be chosen to minimize the number of nonzero branch currents and nodal injection currents. This interpretation, with additional assumptions such as all branch currents being measured, leads to an equivalent generalized minimum cut problem with costly nodes. This problem turns out to be solvable in polynomial time after an appropriate reformulation, and hence the original network vulnerability analysis problem is in fact tractable. We will demonstrate the computation results numerically on a realistic network.
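
In symbols, a cardinality minimization problem of the kind described above takes the following generic security-index form (the notation is an assumption of this note, not necessarily the speaker's exact formulation):

    % sparsest undetectable attack touching a targeted measurement k,
    % with x the nodal voltages and H the map from voltages to the
    % measured branch and injection currents:
    \min_{x} \; \|Hx\|_0 \quad \text{subject to} \quad (Hx)_k = 1 .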

November 16, 2012, 11:00
Coordination and robustness in dynamical multi-agent systems
Samuel Martin (University of Grenoble, France)

In this talk, I will present a study on multi-agent systems. Such systems have numerous applications, such as multi-vehicle control in robotics, the design of smart distributed energy networks, and the modeling of opinion dynamics. In the first part, I present new results regarding consensus theory which extend the recent work of Hendrickx and Tsitsiklis on cut-balanced consensus. Then, I apply the consensus system to the control of a fleet of vehicles and present several results regarding velocity alignment (flocking). This study is based upon a graph robustness analysis aimed at preserving the connectivity of the interaction network, a concept of central importance in this work. In the last part, I will state results from collaborative work with a sociologist on the analysis of the social network linked to the controversy regarding off-road motorized leisure in France. We shall see that the algebraic graph theory used in the beginning of the talk can also be useful in this new context.
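
For reference, the continuous-time consensus dynamics and the cut-balance condition of Hendrickx and Tsitsiklis can be written as follows (standard notation, which the talk may adapt):

    % consensus dynamics with time-varying nonnegative weights a_{ij}(t):
    \dot{x}_i(t) \;=\; \sum_{j} a_{ij}(t)\,\bigl(x_j(t) - x_i(t)\bigr),

    % cut-balance: there exists K \ge 1 such that, for all t and all
    % nonempty proper agent subsets S,
    \sum_{i \in S,\; j \notin S} a_{ij}(t)
    \;\le\; K \sum_{i \in S,\; j \notin S} a_{ji}(t).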

October 26, 2012, 11:00
Community detection on real-world networks: handling dynamics, overlap and the relevance of the results
Rémy Cazabet (University of Toulouse, France)

Many solutions have been proposed to detect communities in complex networks. Recent methods such as InfoMap or the Louvain method give excellent results even on large, real networks. However, some aspects of community detection are still in their early stages, for example:
- strong overlap of communities;
- evolution of networks (and communities) over time;
- the human relevance of topological communities.
I'll present some of the work done during my PhD on these topics, including:
- an algorithm to detect communities and their evolution in strongly evolving networks;
- applications of dynamic community detection to real temporal networks;
- tools to evaluate the quality of a community decomposition on real networks;
- a Facebook application allowing users to evaluate community detection algorithms on their ego-network.

October 19, 2012, 11:00
Fast graph discovering in anonymous networks
Damiano Varagnolo (KTH Stockholm, Sweden)

"Topology matters", summarizes the following fact: the macroscopic properties and capabilities of distributed systems are dictated by the structure of the communication network. In these frameworks, having knowledge of the topology is thus useful for several reasons, e.g., to optimize the overall functioning of the system or to trigger and perform reconfiguration procedures. Here we consider the following scenario: a set of collaborative agents want to distributedly infer the topology of the underlying communication network, constrained to privacy assumptions (meaning that the peers do not want to disclose their identities), and fast inference assumptions (meaning that we assume that the network may change over time, so that the inference process must be fast -- namely, as fast as possible). In this scenario we propose and show the benefits of max consensus, i.e., the ability of computing order statistics over networks. More precisely, we show how a basic building block "estimate the size of the network" can be used to infer several properties (e.g., number of links, diameters, radii, lengths of cycles) and how all these properties can enable real-world applications (e.g., in transportation networks). Finally we describe the true purpose of the seminar, that is presenting a blue ocean, plenty of theorems awaiting to be discovered (we will describe several holes in the existing literature) and thousands of applications to be ideated and implemented.

October 9, 2012
JojoFest

October 5, 2012, 11:00
Effective Control of Networked Systems through Interfaces, Modeling and Design
Airlie Chapman (University of Washington, Seattle)

Complex dynamic networks are an integral part of our natural world with examples such as biological, chemical and social networks, and our technological world with designed networks such as the internet, power grids, and robotic networks. Of increasing importance is the manipulation and monitoring of such networks. The cornerstone of effective control and observation of networked systems is the appreciation of the interplay between system performance and network structure.
In this talk, we will provide network-theoretic approaches for the design of efficient interfaces and, where possible, effective models and designs of networks. We specifically investigate measures of controllability to quantify efficient interfaces. In the realm of effective modeling, we will present the graph-based tools of Cartesian product decomposition as a method of model reduction. For designable networks, we employ distributed protocols to dynamically rewire the network topology and improve system performance; such a technique can be applied to Unmanned Aerial Vehicle (UAV) swarms in the presence of wind gusts to promote network disturbance rejection.

October 3, 2012, 11:00 - Special seminar (special day)
The Structure and Dynamics of Economic Complexity
Cesar Hidalgo (MIT, Cambridge)

In this talk I present an empirical method, a quantitative model, and a theoretical framework that can be used to quantify the complexity of a country's economy. I show that economic complexity can explain differences in the income distribution of countries, and their dynamics, since it is highly predictive of future economic growth and of changes in a country's productive structure. When explaining economic growth, complexity out-competes measures of education, competitiveness and governance (institutions). To conclude, I will present new online visualization tools that can be used to explore the productive structures of countries and their evolution.

September 21, 2012, 11:00
Heterogeneous agents in the FX market: A matter of time horizon?
Sophie Béreau (CORE and Louvain School of Management, UCL)

This paper deals with a Bayesian extension of a behavioral finance framework ‘à la’ De Grauwe and Grimaldi (2006a), in which agents operating in the FX market differ in their forecasting time horizon for the exchange rate. More specifically, as well documented by recent surveys on trading behaviors, agents may target either the short, the medium or the long run, and thus potentially different equilibrium values for the exchange rate, as shown by Bénassy-Quéré, Béreau and Mignon (2010). In the short run, if we believe in the world described by Meese and Rogoff (1983), this leads to a chartist rule, whereas in the long run, the PPP condition appears as a natural anchor. In between, i.e. in the medium run, we implement an APEER model using Bayesian tools, as an alternative to the FEER-BEER nexus. Our results show that the stabilizing impact of the intermediate rule depends on agents' correct perception of the fundamentals.
