Invited talks

The platform hosts six invited talks, given in joint sessions open to all the conferences, so that participants can discover recent work from a range of communities.


Torben Bach Pedersen (Monday, 9 a.m.)

Torben Bach Pedersen is a Professor of Computer Science at Aalborg University, Denmark. His research interests include many aspects of Big Data analytics, with a focus on technologies for “Big Multidimensional Data” - the integration and analysis of large amounts of complex and highly dynamic multidimensional data in domains such as logistics (indoor/outdoor moving objects), smart grids (energy data management), transport (GPS data), and Linked Open Data. He is an ACM Distinguished Scientist, and a member of the Danish Academy of Technical Sciences, the SSTD Endowment, and the SSDBM Steering Committee. He has served as Area Editor for Information Systems and Springer EDBS, PC Chair for DaWaK, DOLAP, SSDBM, and DASFAA, and regularly serves on the PCs of the major database conferences.

His website: http://people.cs.aau.dk/~tbp/

Managing Big Multidimensional Data - From Data Acquisition to Prescriptive Analytics

More and more data is being collected from a variety of new sources such as sensors, smart devices, social media, crowd-sourcing, and (Linked) Open Data. Such data is large, fast, and often complex. There is a universal wish to perform multidimensional OLAP-style analytics on such data, i.e., to turn it into “Big Multidimensional Data”. The keynote will look at challenges and solutions in managing Big Multidimensional Data. This is a multi-stage journey from the initial data acquisition, through cleansing and transformation, to (distributed) storage, indexing, and query processing, further on to building (predictive) models over the data, and ultimately to performing prescriptive analytics that integrates analytics with optimization to suggest optimal actions. Several case studies from advanced application domains such as Smart Energy, Smart Transport, and Smart Logistics will be used for illustration.
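
As a toy illustration of what OLAP-style multidimensional analytics looks like in practice (a sketch, not taken from the talk; the column names and figures below are invented), a roll-up over a small table of energy readings can be written in a few lines of Python with pandas:

    # Minimal sketch (not from the talk): OLAP-style roll-up on multidimensional
    # data using pandas; dimensions, measures and values are invented.
    import pandas as pd

    readings = pd.DataFrame({
        "region": ["North", "North", "South", "South", "South"],
        "month":  ["2017-01", "2017-02", "2017-01", "2017-01", "2017-02"],
        "device": ["meter-1", "meter-1", "meter-2", "meter-3", "meter-2"],
        "kwh":    [120.5, 98.2, 143.0, 87.4, 110.9],
    })

    # Roll up the 'device' dimension: total consumption per (region, month) cell.
    cube = readings.pivot_table(index="region", columns="month",
                                values="kwh", aggfunc="sum")
    print(cube)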

Leon van der Torre (Tuesday, 9 a.m.)

Leon van der Torre joined the University of Luxembourg as a full professor of Intelligent Systems in 2006. He developed the BOID agent architecture (with colleagues from Vrije Universiteit Amsterdam), input/output logic (with David Makinson), and the game-theoretic approach to normative multiagent systems (with Guido Boella). He is an editor of the Handbook of Deontic Logic and Normative Systems (first volume 2013, second volume in preparation), the Handbook of Formal Argumentation (in preparation), and the Handbook of Normative Multiagent Systems (in preparation), deontic logic corner editor of the Journal of Logic and Computation, and a member of the editorial boards of the Logic Journal of the IGPL, the IfCoLog Journal of Logics and their Applications, and the EPiC Series in Computer Science. Moreover, he is the coordinator of the Horizon 2020 Marie Curie RISE network “Mining and Reasoning with Norms” (MIREL, 2016-2019).

His website: http://icr.uni.lu/leonvandertorre/

Dynamic argumentation semantics

Dung introduced so-called argumentation semantics as a function from argumentation frameworks to sets of acceptable arguments. The handbooks on formal argumentation describe how this general framework for non-monotonic reasoning has been extended in many different ways over the past two decades. In this presentation I introduce a dynamic agenda going beyond dialogue, and in particular I present two new dynamic extensions of Dung's approach. The first dynamic argumentation semantics is based on an update relation removing attacks and arguments from argumentation frameworks. The fixpoints of this update semantics are the extensions of Dung's static approach. The second dynamic argumentation semantics builds on input/output argumentation frameworks, which were introduced three years ago. We introduce dynamics in this compositional approach to formal argumentation by considering input streams of arguments.
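
To make Dung's notion concrete, here is a minimal sketch (for illustration only, not code from the talk) of one classical semantics, the grounded semantics, computed as the least fixpoint of the characteristic function of a finite framework:

    # Minimal sketch (illustration only): Dung's grounded semantics as a function
    # from an argumentation framework (arguments plus an attack relation) to a
    # set of acceptable arguments.
    def grounded_extension(arguments, attacks):
        """Iterate the characteristic function from the empty set to its least fixpoint."""
        extension = set()
        while True:
            # An argument is defended if each of its attackers is attacked by the extension.
            defended = {
                a for a in arguments
                if all(any((d, b) in attacks for d in extension)
                       for (b, t) in attacks if t == a)
            }
            if defended == extension:
                return extension
            extension = defended

    # Example framework: a attacks b, b attacks c.
    print(grounded_extension({"a", "b", "c"}, {("a", "b"), ("b", "c")}))  # {'a', 'c'}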

Raúl García-Castro (Wednesday, 9 a.m.)

Dr. Raúl García-Castro is an Assistant Professor at the Computer Science School of Universidad Politécnica de Madrid (UPM), Spain. After three years as a software engineer, and since graduating in Computer Science (2003), he has been working at UPM in the Ontology Engineering Group on a wide range of European and Spanish research projects, as well as on various collaborations with industry.

His research focuses on ontological engineering, on ontology-based data and application integration, and on the benchmarking of semantic technologies. In 2008 he obtained a Ph.D. in Computer Science and Artificial Intelligence at UPM with his thesis titled “Benchmarking Semantic Web technology”, which received the Ph.D. Extraordinary Award at UPM.

He has authored more than 90 publications and regularly participates in standardization bodies (W3C, ETSI, AENOR, OASIS) and in the program committees of the most relevant conferences and workshops in his field. Since 2015 he has been a member of the Steering Committee of the International Conference on Knowledge Capture, and since 2012 he has been the Europe Liaison of the International Conference on Software Engineering and Knowledge Engineering.

Furthermore, he is a member of the editorial boards of 3 journals and has edited 2 journal special issues. He has collaborated in the organization of 16 international conferences, 4 international workshops, 3 summer schools (serving as Director of the 1st Summer School on Smart Cities and Linked Open Data), 3 VoCamps, and 5 conference tutorials.

His website: http://www.garcia-castro.com/

Technical and social aspects of semantic interoperability in the IoT

The Internet of Things (IoT) envisions an ecosystem in which physical entities, systems and information resources bridge the gap between the physical and the virtual world. The heterogeneity of such physical entities, systems and information resources, intensified by the fact that they originate from different sectors and reflect different perspectives, poses numerous challenges to the IoT vision.

One of them is the need for interoperability, since capturing the maximum value from the IoT involves multiple IoT systems working together and, therefore, seamlessly interchanging information. However, successfully achieving interoperability requires coping with different aspects, not only technological but also social and/or regulatory ones. This talk will address how these aspects influence semantic interoperability, taking into account that such interoperability requires being aware of both the information interchanged and the data model (i.e., ontology) of such information.

In order to achieve interoperability, systems not only need to successfully interchange information but also to use the information that has been interchanged. In the IoT, semantic interoperability requires interchanging not only the information itself, but also the ontologies used to represent such information and other types of information that support IoT-specific tasks (e.g., discovery). Furthermore, using the interchanged information requires, on the one hand, understanding the information (usually through an ontology) and, on the other hand, dealing with mismatches among different views of the world.
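
As a small illustration of how the data model can travel with the data (a sketch assuming the W3C SOSA ontology is used; the sensor and observation URIs are invented, and this is not code from the VICINITY project), an IoT observation can be published as RDF with rdflib:

    # Minimal sketch (illustration only): describing an IoT observation with the
    # W3C SOSA ontology so that the ontology terms travel with the data.
    # The example.org URIs below are invented for the example.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, XSD

    SOSA = Namespace("http://www.w3.org/ns/sosa/")
    EX = Namespace("http://example.org/iot/")

    g = Graph()
    g.bind("sosa", SOSA)

    g.add((EX.obs1, RDF.type, SOSA.Observation))
    g.add((EX.obs1, SOSA.madeBySensor, EX.thermometer42))
    g.add((EX.obs1, SOSA.hasSimpleResult, Literal("21.5", datatype=XSD.decimal)))

    print(g.serialize(format="turtle"))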

Dealing with such mismatches is particularly important because the landscape of IoT ontologies is fragmented, and reconciling views goes beyond solving technical issues: it requires social approaches. This need for consensual models in the IoT field has led to multiple initiatives to define consensus-driven ontologies, both in standardisation bodies and in other groups that aim to produce de facto standards. Even so, these processes require a special focus on aspects such as collaborativeness or openness, which are only partially addressed by traditional ontology engineering practices and tools and which bring new demands for them.

This talk will discuss current approaches and challenges for semantic interoperability in the IoT, covering not only technical aspects but also social ones, presented through different examples drawn from the VICINITY H2020 project and various initiatives in ontology standardisation.

Natalie van der Wal (Thursday, 9 a.m.)

Dr. Natalie van der Wal is a Research Fellow and Lecturer in the Behavioural Informatics Group, Department of Computer Science, Vrije Universiteit Amsterdam, Netherlands. She specialises in the computational modelling of cognitive and affective processes in groups. With her current research on the effects of culture on crowd behaviour for the European H2020 project ‘IMPACT’, she aims to improve safety by predicting crowd behaviour in emergency situations. She will continue her research on improving evacuations next year (2018) with her recently awarded Marie Skłodowska-Curie Fellowship, working with Professor Wandi Bruine de Bruin at the Centre for Decision Research at Leeds University Business School.

Her website: http://www.few.vu.nl/~cwl210/

Simulating Socio-Cultural Crowd Behaviour in Emergency Situations

Evacuation modelling is becoming integrated in emergency prevention and management. Crowd evacuation simulations have been used to analyse different phenomena, such as exit selection, queuing, panic propagation, escape behaviour, clogging and following behaviour. Most of these models do not incorporate psychological and social factors. Within the European H2020 project IMPACT, the effects of culture, cognitions, and emotions on crisis management and prevention are analysed. An agent-based crowd evacuation simulation model, named IMPACT, was created to study the evacuation process in a transport hub; it will be presented and discussed in this talk. To extend previous research, various socio-cultural, cognitive, and emotional factors were modelled, including: language, gender, familiarity with the environment, emotional contagion, prosocial behaviour, falls, group decision making, and compliance. The IMPACT model was validated against data from an evacuation drill using the existing EXODUS evacuation model. Results show that on all validation measures the IMPACT model is within or close to the expected validation boundaries, thereby establishing its validity. Structured simulations with the validated model revealed important findings, including: the effect of doors as bottlenecks, social contagion speeding up evacuation, falling behaviour not significantly affecting evacuation time, and travelling in groups being more beneficial for evacuation time than travelling alone. This research has important practical applications for crowd management professionals, including transport hub operators, first responders, and risk assessors.
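
The toy sketch below is not the IMPACT model; with invented parameters and behaviours, it only illustrates the general style of agent-based evacuation modelling with a simple social-contagion rule:

    # Toy sketch (not the IMPACT model): agents walk towards an exit, and
    # unaware agents may start evacuating when they see others doing so.
    import random

    random.seed(0)

    class Agent:
        def __init__(self, distance_to_exit):
            self.distance = distance_to_exit     # metres to the exit
            self.aware = random.random() < 0.2   # has noticed the alarm
            self.evacuated = False

    def step(agents, contagion=0.3, speed=1.5):
        aware_fraction = sum(a.aware for a in agents) / len(agents)
        for a in agents:
            if a.evacuated:
                continue
            # Social contagion: awareness spreads with the fraction of aware agents.
            if not a.aware and random.random() < contagion * aware_fraction:
                a.aware = True
            if a.aware:
                a.distance -= speed
                a.evacuated = a.distance <= 0

    agents = [Agent(random.uniform(5, 60)) for _ in range(200)]
    t = 0
    while not all(a.evacuated for a in agents):
        step(agents)
        t += 1
    print(f"All agents evacuated after {t} time steps")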

Michaël Perrot (Friday, 9 a.m.)

Currently a postdoctoral researcher at the MPI in Tübingen, Germany, Michaël Perrot is the winner of the 2017 AFIA PhD award, jointly with Éric Piette.

His website: https://is.tuebingen.mpg.de/de/people/mperrot

Learning Metrics with Controlled Behaviour

The goal in Machine Learning is to acquire new knowledge from data. To achieve this, many algorithms make use of a notion of distance or similarity between examples. A very representative example is the nearest neighbour classifier, which is based on the idea that two similar examples should share the same label: it thus critically depends on the notion of metric considered. Depending on the task at hand these metrics should have different properties, but manually choosing an adapted comparison function can be tedious and difficult. The idea behind Metric Learning is to automatically tailor such metrics to the problem at hand. One of the main limitations of standard methods is that the control over the behaviour of the learned metrics is often limited. In this talk I will present two approaches specifically designed to overcome this problem. In the first one we consider a general framework able to take into account a reference metric acting as a guide for the learned metric, and we theoretically study the interest of using such side information. In the second approach we propose to control the underlying transformation of the learned metric. Specifically, we use recent advances in the field of Optimal Transport to force it to follow a particular geometrical transformation.
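
As a rough illustration of the first idea (a sketch with an invented loss and invented data, not the algorithms presented in the talk), one can learn a Mahalanobis metric on labelled pairs while a regulariser keeps it close to a reference metric acting as a guide:

    # Illustrative sketch only: learn a Mahalanobis metric M on pairs of examples
    # while a regulariser pulls M towards a reference metric M_ref (the "guide").
    import numpy as np

    def pair_distances(X1, X2, M):
        diff = X1 - X2
        return np.einsum("ij,jk,ik->i", diff, M, diff)  # (x - x')^T M (x - x')

    def learn_metric(similar, different, M_ref, lam=0.1, lr=0.01, steps=200):
        M = M_ref.copy()
        for _ in range(steps):
            grad = lam * (M - M_ref)  # stay close to the reference metric
            for (x, y), sign in [(similar, 1.0), (different, -1.0)]:
                diff = x - y
                # Hinge at margin 1: pull similar pairs below it, push different pairs above it.
                active = sign * (pair_distances(x, y, M) - 1.0) > 0
                grad += sign * (diff[active].T @ diff[active]) / len(x)
            M -= lr * grad
            # Project back onto the positive semi-definite cone.
            w, V = np.linalg.eigh(M)
            M = (V * np.clip(w, 0, None)) @ V.T
        return M

    rng = np.random.default_rng(0)
    X1 = rng.normal(size=(50, 3))
    similar = (X1, X1 + 0.1 * rng.normal(size=(50, 3)))
    different = (X1, X1 + 3.0)
    print(np.round(learn_metric(similar, different, M_ref=np.eye(3)), 2))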

Éric Piette (Friday, 2 p.m.)

Currently a temporary teaching and research associate (ATER) at the Centre de Recherche en Informatique de Lens (CRIL), Éric Piette is the winner of the 2017 AFIA PhD award, jointly with Michaël Perrot.

His website: http://www.cril.univ-artois.fr/~epiette/

A new constraint-driven approach to General Game Playing

Developing a program capable of playing any strategy game, a problem often referred to as General Game Playing (GGP), is one of the Holy Grails of artificial intelligence. GGP competitions, in which each game is represented by a set of logical rules in the Game Description Language (GDL), have led research to compare many approaches, including Monte Carlo methods, the automatic construction of evaluation functions, and logic programming and ASP. In this thesis, we propose a new approach driven by stochastic constraints.

We first focus on translating GDL into stochastic constraint networks (SCSP), in order to provide a compact representation of strategy games and to enable the modelling of strategies.

We then exploit a fragment of SCSP through an algorithm called MAC-UCB, which combines the MAC (Maintaining Arc Consistency) algorithm, used to solve each level of the SCSP turn after turn, with UCB (Upper Confidence Bound), used to estimate the utility of each strategy obtained at the last level of each sequence. The efficiency of this new technique over other GGP approaches is confirmed by WoodStock, which implements MAC-UCB and currently leads the continuous GGP tournament.
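
For readers unfamiliar with the bandit side of MAC-UCB, the sketch below shows the generic UCB1 selection rule on invented strategy utilities; it is an illustration only, not the WoodStock implementation:

    # Minimal sketch (illustration only): the UCB1 rule used to balance
    # exploration and exploitation when estimating the utility of candidate
    # strategies; the reward model below is invented.
    import math
    import random

    random.seed(0)

    def ucb_select(counts, rewards, t, c=math.sqrt(2)):
        """Pick the arm maximising mean reward plus an exploration bonus."""
        for arm, n in enumerate(counts):
            if n == 0:
                return arm  # play every untried arm once
        return max(range(len(counts)),
                   key=lambda a: rewards[a] / counts[a]
                                 + c * math.sqrt(math.log(t) / counts[a]))

    true_utilities = [0.3, 0.5, 0.8]   # hidden quality of three candidate strategies
    counts = [0] * 3
    rewards = [0.0] * 3
    for t in range(1, 1001):
        arm = ucb_select(counts, rewards, t)
        counts[arm] += 1
        rewards[arm] += 1.0 if random.random() < true_utilities[arm] else 0.0

    print(counts)  # the best strategy (index 2) should be played most often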

Finally, in a last part, we propose an alternative approach to symmetry detection in stochastic games, inspired by constraint programming. We show experimentally that this approach, coupled with MAC-UCB, outperforms the best approaches in the field and enabled WoodStock to become the 2016 GGP champion.