
**January 10, 2019 @ Bâtiment IMAG (406)** **-- Best-of-two-worlds analysis of online search, by Christoph Dürr (LIP6)**

In search problems, a mobile searcher seeks to locate a target that hides in some unknown position of the environment. Such problems are typically considered to be of an on-line nature, in that the input is unknown to the searcher, and the performance of a search strategy is usually analyzed by means of the standard framework of the competitive ratio, which compares the cost incurred by the searcher to an optimal strategy that knows the location of the target. However, one can argue that even for simple search problems, competitive analysis fails to distinguish between strategies which, intuitively, should have different performance in practice.

Motivated by the above, in this work we introduce and study measures supplementary to competitive analysis in the context of search problems. In particular, we focus on the well-known problem of linear search, informally known as the cow-path problem, for which there is an infinite number of strategies that achieve an optimal competitive ratio equal to 9. We propose a measure that reflects the rate at which the line is being explored by the searcher, and which can be seen as an extension of the bijective ratio over an uncountable set of requests. Using this measure we show that a natural strategy that explores the line aggressively is optimal among all 9-competitive strategies. This provides, in particular, a strict separation from the competitively optimal doubling strategy, which is much more conservative in terms of exploration. We also provide evidence that this aggressiveness is requisite for optimality, by showing that any optimal strategy must mimic the aggressive strategy in its first few explorations.

Joint work with Spyros Angelopoulos and Shendan Jin.
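
The conservative doubling strategy mentioned above can be sketched in a few lines. The simulation below is an illustrative reconstruction (not code from the talk): the searcher explores distances 1, 2, 4, ... on alternating sides of the origin, and its worst-case cost approaches 9 times the distance to the target.

```python
# Illustrative sketch of the classic "doubling" strategy for linear search
# (the cow-path problem). Turn i walks to distance 2**i on side (-1)**i,
# then returns to the origin; the competitive ratio of this strategy is 9.

def doubling_cost(target):
    """Total distance walked by the doubling strategy before reaching
    `target`, a nonzero position on the line (positive side is explored
    on even-indexed turns, negative side on odd-indexed turns)."""
    cost = 0.0
    i = 0
    while True:
        reach = 2 ** i
        side = 1 if i % 2 == 0 else -1
        if side * target > 0 and abs(target) <= reach:
            return cost + abs(target)   # found during this excursion
        cost += 2 * reach               # walk out and back to the origin
        i += 1

# Worst cases are targets just past a turning point: the ratio
# cost / |target| then approaches (but stays below) 9.
worst = max(doubling_cost(-(2 ** k + 1e-9)) / (2 ** k + 1e-9)
            for k in range(1, 20))
print(worst)   # just below the competitive ratio 9
```

Targets found immediately cost exactly their distance (ratio 1), which is why the supremum over all targets, not any single case, determines the competitive ratio.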

**January 24, 2019 @ Bâtiment IMAG (406)** **-- Realistic simulation of the execution of applications deployed on large distributed systems with a focus on improving file management, by Anchen Chai (INSA Lyon)**

Simulation is a powerful tool to study distributed systems. It allows researchers to evaluate different scenarios in a reproducible manner, which is hardly possible in real experiments. However, the realism of simulations is rarely investigated in the literature, leading to questionable accuracy of the simulated metrics. In this context, the main aim of our work is to improve the realism of simulations, with a focus on file transfers in a large distributed production system (the EGI federated e-Infrastructure). Based on the findings obtained from realistic simulations, we can then propose reliable recommendations to improve file management in the Virtual Imaging Platform (VIP).

In order to realistically reproduce certain behaviors of the real system in simulation, we need to obtain an inside view of it. Therefore, we collect and analyze a set of execution traces of one particular application executed on EGI via VIP. The realism of simulations is investigated with respect to two main aspects in this thesis: the simulator and the platform model.

Based on the knowledge obtained from traces, we design and implement a simulator to provide a simulated environment as close as possible to the real execution conditions for file transfers on EGI. A complete description of a realistic platform model is also built by leveraging the information registered in traces. The accuracy of our platform model is evaluated by confronting the simulation results with the ground truth of real transfers. Our proposed model is shown to largely outperform the state-of-the-art model to reproduce the real-life variability of file transfers on EGI.

Finally, we cross-evaluate different file replication strategies by simulation, using both an enhanced state-of-the-art model and our platform model built from traces. Simulation results highlight that the instantiation of the two models leads to different qualitative replication decisions, even though they reflect a similar hierarchical network topology. Last but not least, we find that selecting sites hosting a large number of executed jobs as replication targets is a reliable recommendation to improve file management in VIP. In addition, adopting our proposed dynamic replication strategy can further reduce the duration of file transfers, except in extreme cases (very poorly connected sites) that only our proposed platform model is able to capture.

**February 14, 2019 @ Bâtiment IMAG (406)** **-- Melissa: Modular External Library for In Situ Sensitivity Analysis, by Theophile Terraz (Datamove)**

Classical sensitivity analysis consists in running different instances of a numerical simulation with different sets of input parameters, storing the results to disk, and later reading them back to compute the required statistics. A simulation can be multi-dimensional, multivariate, and multivalued, and a global sensitivity analysis often requires thousands of runs. The amount of storage needed can quickly become overwhelming, and the associated long read times make computing the statistics time-consuming. To avoid this pitfall, scientists usually reduce the size of their study by running low-resolution simulations or by down-sampling output data in space and time.

Melissa bypasses this limitation by avoiding intermediate file storage. Melissa processes the data in transit, enabling very large-scale sensitivity analysis. Melissa is built around two key concepts: iterative statistics algorithms and an asynchronous client/server model for data transfer. Simulation outputs are never stored on disk. They are sent by the simulations to a parallel server, which aggregates them into the statistic fields in an iterative fashion and then discards them. This makes it possible to compute statistics maps on every mesh element for every timestep of a full-scale study (ubiquitous statistics).
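
The iterative-statistics idea can be illustrated with Welford's classic online algorithm for the mean and variance: each new simulation output updates the running statistics and can then be thrown away, so nothing is ever stored. This is a generic sketch of the principle, not Melissa's actual API.

```python
# Illustrative sketch of single-pass (iterative) statistics, the idea behind
# Melissa's in-transit processing (this is not Melissa's real API).
# Welford's online algorithm updates mean and variance one sample at a time,
# so each simulation output can be discarded right after being consumed.

class OnlineStats:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations
        self.min = float("inf")
        self.max = float("-inf")

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        self.min = min(self.min, x)
        self.max = max(self.max, x)

    @property
    def variance(self):          # unbiased sample variance
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = OnlineStats()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.update(x)              # each value is processed once, then dropped
print(stats.mean, stats.variance, stats.min, stats.max)
```

In a Melissa-style setting the same update would run in parallel on every mesh element and timestep, which is what makes ubiquitous statistics maps affordable without disk storage.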

Melissa is a file-avoiding, adaptive, fault-tolerant, and elastic framework, enabling very efficient executions on large-scale supercomputers.

Melissa comes with iterative algorithms for computing the average, variance and covariance, skewness, kurtosis, max, min, threshold exceedance, quantiles, and Sobol' indices, and can easily be extended with new algorithms.

**February 28, 2019 @ Bâtiment IMAG (406)** **-- Building Stable Conventions, by Jonathan Newton (Kyoto University)**

Abstract: Strategies of players in a population are updated according to the choice rules of agents, where each agent is a player or a coalition of players. It is shown that choice rules that satisfy a specific type of asymmetry can be combined in a variety of ways while retaining this asymmetry. It is known that, at a global level, this asymmetry implies stochastic stability of a given homogeneous strategy profile. Taken together, these results enable two approaches, one reductive, the other constructive. First, for models in which every agent follows the same choice rule, stochastic stability can be proven by showing that the asymmetry holds for a representative agent. This allows us to easily recover and extend many results from the literature. Second, agents who follow choice rules that satisfy the asymmetry can be combined arbitrarily while the same homogeneous strategy profile remains stochastically stable.

**March 7, 2019 @ Bâtiment IMAG (406)** **-- A journey to causal advertising: interventions, Datasets & Models, by Eustache Diemert (Criteo Grenoble)**

Abstract: In a culture where claims are backed with data, digital advertising must demonstrate and optimize its causal effect. We will present two use cases from this industry that lead to interesting problems. We will then describe how to generate datasets that allow for proper counterfactual learning, along with practical optimization tricks and experimental results. The presentation will include material from work published at NeurIPS Causal Learning 2018 and open datasets from Criteo.
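
Counterfactual learning of this kind commonly rests on inverse propensity scoring (IPS), a standard off-policy estimator. The sketch below is a generic illustration (not Criteo's code, and the policies and rewards are made up): logs collected under a randomized logging policy are reweighted to estimate the value of a different target policy offline.

```python
import random

# Generic sketch of inverse propensity scoring (IPS), a standard
# counterfactual estimator (not code from the talk; the policies and
# reward model below are hypothetical). Logged data from a randomized
# policy lets us estimate the expected reward of a *different* policy:
#   V(target) ~ mean over logs of  reward * target_prob(a) / logging_prob(a)

random.seed(0)
ACTIONS = [0, 1]                       # e.g. "no ad" / "show ad"

def logging_prob(a):                   # uniform randomization over actions
    return 0.5

def target_prob(a):                    # policy to evaluate: always pick 1
    return 1.0 if a == 1 else 0.0

def true_reward(a):                    # unknown environment: action 1 pays
    return 1.0 if (a == 1 and random.random() < 0.3) else 0.0

# Collect logs under the randomized logging policy.
logs = []
for _ in range(100_000):
    a = random.choice(ACTIONS)
    logs.append((a, true_reward(a), logging_prob(a)))

# IPS estimate of the target policy's expected reward.
ips = sum(r * target_prob(a) / p for a, r, p in logs) / len(logs)
print(ips)                             # should be close to 0.3
```

The key requirement is that the logging policy puts nonzero probability on every action the target policy might take; otherwise the importance weights are undefined and the estimate is biased.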

**March 21, 2019 @ Bâtiment IMAG (406)** **-- Eigenvalues and Graphs, by Fanny Dufossé (Datamove)**

TBA