
Keynote Lectures

The Role of Physiological Data in Neurorehabilitation
Eduardo Rocon, Consejo Superior de Investigaciones Científicas, Spain

Physiological Computing Reshapes User-system Interaction Research and Practical Application
Lucas Noldus, Noldus Information Technology bv, Netherlands

The Ordinal Nature of Psychophysiology
Georgios N. Yannakakis, University of Malta, Malta

Grand Challenges for Human-Computer Interaction: The Shift from HCI to Citizen-Environment Interaction (CEI) in Cooperative Cities and Societies
Norbert Streitz, Founder and Scientific Director, Smart Future Initiative, Germany


The Role of Physiological Data in Neurorehabilitation

Eduardo Rocon
Consejo Superior de Investigaciones Científicas

Brief Bio
Eduardo Rocon was born in Vitoria, Brazil (1979). He graduated in Electrical Engineering from the Universidade Federal do Espírito Santo (UFES) in 2001. He subsequently moved to Spain to pursue a Ph.D. in Industrial Engineering at the Universidad Politécnica de Madrid under Prof. A. Barrientos and Prof. J.L. Pons. His Ph.D. thesis (2006), for which he received the Georges Giralt PhD Award (2008), focused on the development of a rehabilitation robotic exoskeleton that provides a means of testing and validating non-grounded control strategies for robotic exoskeletons for active upper-limb tremor suppression. Dr. Rocon continued his work on tremor suppression and the application of neuroprosthetics and neurorobotics in rehabilitation under a post-doctoral contract from 2006 to 2009. In 2009, he was awarded a Ramón y Cajal contract (the most competitive and prestigious postdoctoral contract in Spain) to continue his research. At the age of 30, Dr. Rocon obtained a tenured researcher position (2010-present) at CSIC. His career was recently recognized with the prestigious Juan Lopez de Peñalver Award of the Spanish Royal Academy of Engineering. Dr. Rocon's multidisciplinary work has contributed to different aspects of robotics, neuroscience and medicine. His research activities have generated more than 150 scientific publications, 1 book, 9 book chapters, and 7 patents.

This talk will introduce our research activity focused on the development of technologies to understand, monitor and restore human motor control. As part of these activities, we have expanded our research from pure robotics to the emerging field of neural engineering, adopting emerging technologies and drawing stronger inspiration from neuroscience. In this field, the symbiotic relationship between humans and robots transcends the boundaries of simple physical interaction. It involves smart sensors, actuators, algorithms and control strategies capable of gathering and decoding complex human expressions or physiological phenomena. Once this process is complete, robots use the information to adapt, learn and optimize their functions, or even to transmit back a response resulting from a cognitive process occurring within the robot. To develop such interfaces, we have been building interfaces based on physiological data in its different dimensions (bioelectrical, biomechanical, biochemical or biophysical) in order to assess the generation, transmission and execution of movements. Our hypothesis is that this approach will improve neurophysiological knowledge of human motor control and enable the development of more robust and functional cognitive interfaces. The contribution to these research lines will be illustrated by our developments in particular scenarios: robotic solutions for tremor suppression and for the rehabilitation of people with mobility impairments, and robots, interfaces and serious games for the rehabilitation of children with Cerebral Palsy.



Physiological Computing Reshapes User-system Interaction Research and Practical Application

Lucas Noldus
Noldus Information Technology bv

Brief Bio
Lucas Noldus is the founder and managing director of Noldus Information Technology (www.noldus.com), a leading developer of software tools and integrated solutions for the study of human behavior and human-system interaction. He holds an M.Sc. degree in behavioral biology from Leiden University (1983) and a Ph.D. in behavioral ecology from Wageningen University (1989). Prior to founding Noldus Information Technology in 1989, he conducted research on animal behavior, with a focus on the development of experimental setups and data acquisition & analysis software, first at Beijing Normal University (Beijing, China, 1984), subsequently at the Insect Biology & Population Management Research Laboratory, USDA-ARS (Tifton, GA, USA, 1985), and finally at the Laboratory of Entomology, Wageningen University. He has (co)authored more than 130 papers and conference presentations on methods and techniques in behavioral research. From the start of the company, Lucas has been closely involved in the development of Noldus Information Technology's main products, most notably The Observer, EthoVision and the PhenoTyper. In 1996 he co-initiated the Measuring Behavior conference series, which has since become a biennial event. Lucas Noldus is the co-founder of two other companies: Delta Phenomics, a contract research organization for preclinical research, and TeleMetronics Biomedical, a developer of biotelemetry equipment. Besides his work for the company, Lucas Noldus serves on a range of boards and committees related to science and innovation, including ICT for Brain, Body & Behavior (i3B), Man-Machine Interaction Platform, NLR - National Aerospace Centre Advisory Committee Aerospace Operations, Radboud University Nijmegen Workfield Committee Artificial Intelligence, Netherlands Academy of Technology and Innovation (AcTI), and International Council of Academies of Engineering and Technological Sciences (CAETS). LinkedIn page: www.linkedin.com/in/lucasnoldus.

It is widely accepted that the user is central to the development of interactive systems, from initial brainstorms about the problem to be solved or the value to be created, to usability and user experience testing during later development cycles. In the past, when computer systems were primarily designed to automate routine office tasks, usability tests could be limited to assessing the user's efficiency and efficacy (through video recording, logging user behavior and counting mouse clicks), and satisfaction (through subjective reports), during task execution. Since then, we have seen an enormous diversification of applications, from traditional PC-based productivity tools to web-based information and transaction systems, mobile games, and infotainment systems embedded in cars, to name just a few. Usability testing has evolved into user experience evaluation, in which other aspects of the user-system interaction (besides ease of use) are assessed, such as engagement, trust, cognitive workload, excitement and fun. This has led to a proliferation of new techniques and tools for the measurement of eye movement, physiological signals, facial expressions and other modalities, each of which offers us a glimpse into the user's physical or mental state. In this presentation, I will address a number of technical trends that have allowed physiological sensing to extend from the research lab to practical application, and from post-hoc analysis of user-system interaction to real-time feedback and intervention. As sensors become smaller and wireless, measurements become less obtrusive and can take place in naturalistic settings with higher ecological validity. By integrating behavioral data with different physiological signals, we can detect episodes of arousal, stress or cognitive workload and take action when needed. These developments open new avenues for designing products and services that are safer, easier and more pleasant to use.




The Ordinal Nature of Psychophysiology

Georgios N. Yannakakis
University of Malta

Brief Bio
Georgios N. Yannakakis (yannakakis.net) is a Professor and Director of the Institute of Digital Games, University of Malta. He is a leading expert in the field of game artificial intelligence, with core theoretical contributions in machine learning, evolutionary computation, affective computing and player modelling, computational creativity and procedural content generation. He has published more than 220 papers and his work has been cited broadly. He has attracted funding from several EU and national research agencies and received multiple awards for work published in top-tier journals and conferences. His work has been featured in New Scientist, Science Magazine, The Guardian, Le Monde and other venues. He is regularly invited to give keynote talks at the most recognised conferences in his areas of research and has organised several of the most respected conferences in the areas of game AI and game research. He has been an Associate Editor of the IEEE Transactions on Computational Intelligence and AI in Games and the IEEE Transactions on Affective Computing; he is currently an Associate Editor of the IEEE Transactions on Games. He is the co-author of the Artificial Intelligence and Games textbook.


How is a psychological state best labelled and in turn captured by a computational model? What are the challenges of annotating the magnitude of physiological manifestations? Is it meaningful to represent any subjective phenomenon as a number of predefined classes?
What if the magnitude or the class of an emotion are simply irrelevant (or even inappropriate!) labels for modelling psychophysiology?
In this talk I will attempt to address the above questions by viewing the field of psychophysiology from an ordinal perspective. I will first outline the theoretical reasons and empirical evidence that favour ordinal labels for representing and annotating psychological states, and then discuss the good, bad and ugly practices of their processing. The advantages of the ordinal approach will be showcased via a number of representative studies in machine learning, psychophysiology, affective computing, and human-computer interaction.
I will conclude the talk by reflecting upon the main limitations of the ordinal perspective.



Grand Challenges for Human-Computer Interaction: The Shift from HCI to Citizen-Environment Interaction (CEI) in Cooperative Cities and Societies

Norbert Streitz
Founder and Scientific Director, Smart Future Initiative

Brief Bio
Dr. Norbert Streitz (Ph.D. in physics, Ph.D. in cognitive science) is a Senior Scientist and Strategic Advisor with more than 35 years of experience in information and communication technology. He is the Founder and Scientific Director of the Smart Future Initiative, launched in 2009. From 1987 to 2008, he held positions as Deputy Director and Division Manager at the Fraunhofer Institute IPSI, Darmstadt, e.g., founding and managing the research division "AMBIENTE – Smart Environments of the Future". He held teaching appointments at the Department of Computer Science, Technical University Darmstadt, for more than 15 years. Before Fraunhofer, he was Assistant Professor at the Technical University Aachen (RWTH), where he founded and managed ACCEPT (AaChen Cognitive Ergonomics ProjecT). At different times of his career, he was a post-doc research fellow at the University of California, Berkeley, a visiting scholar at Xerox PARC, Menlo Park, and at the Intelligent Systems Lab of MITI, Tsukuba Science City, Japan.

He has published/edited 25 books and authored/coauthored more than 150 scientific peer-reviewed papers. His research and teaching activities cover a wide range of areas: Cognitive Science, Human-Computer Interaction, Hypertext/Hypermedia, Computer-Supported Cooperative Work (CSCW), Ubiquitous Computing, Artificial Intelligence and Ambient Intelligence, Privacy Enhancing Technologies (Privacy by Design), Interaction and Experience Design, Hybrid Worlds, Autonomous Driving, Smart Cities and Smart Airports.

Principal Investigator and Manager of many projects funded by the European Commission (EC) (e.g., Disappearing Computer Initiative, Ambient Agoras, Towards the Humane City, …) and by industrial and public national and international funding agencies. Reviewer and evaluation expert for the EC, member of Editorial Boards (e.g., Journal of Ambient Intelligence and Smart Environments, Journal of Ambient Intelligence and Humanized Computing) and Advisory Boards of research institutes in Europe and Asia, consultant, and keynote speaker.

He has organized many conferences as general or program chair during his long career, too many to list here. For the last six years, he has been the program chair of the International Conference on Distributed, Ambient and Pervasive Interactions (DAPI), now in its sixth edition as DAPI 2018.

Instead of dealing with individual, personal desktop computers, laptops, tablets, smartphones, etc., experiences and interactions of humans with "computers" will increasingly take place in the context of interacting with "smart artifacts" integrated into the environment, and in a subsequent phase with "smart materials" constituting "smart ecosystems". This has serious implications for the future of what is currently still called "human-computer interaction".
There is not only a shift from laptops and smartphones to smart artifacts and smart materials embedded in the environment, but also a shift in terms of scale and context, ranging from individual devices for personal activities to multiple devices used in group activities and social interactions. This is followed by the progression from smart rooms to smart or cooperative buildings and their extension to smart urban environments such as smart cities and airports. The trend towards more comprehensive application contexts requires a corresponding shift from a mostly individual, person-based, user-centered design approach to a citizen-centered design approach, based on multiple people and multiple devices, for the smart urban environments we are confronted with in the urban age.

The ubiquitous and pervasive deployment of smart technology in urban environments has serious implications for privacy and security. This goes along with an increasing trend of using artificial intelligence for algorithm-based automation and autonomous systems, with the result that humans are no longer in the loop and in control. Thus, we are confronted with the challenge of addressing the corresponding design trade-offs and the need to rethink and redefine the "smart-everything" paradigm in order to move beyond "smart-only" cities to Humane, Sociable and Cooperative Hybrid Cities and Societies.

The implications will be discussed along the following summary lines:

- Shift from Human-Computer Interaction to Human-Environment Interaction
- Shift from Human-/User-Centered Design to Citizen-Centered Design
- Usable Privacy and Security by Design and by Default
- Human in the Loop and in Control vs. Automation and Autonomous Systems
- Redefining the "Smart-Everything" Paradigm

Norbert Streitz (2018). Beyond 'Smart-Only' Cities: Redefining the 'Smart-Everything' Paradigm. Journal of Ambient Intelligence and Humanized Computing, pp. 1-22. Available online first: https://link.springer.com/article/10.1007/s12652-018-0824-1