Model-Based User-Interface Adaptation by Exploiting Situations, Emotions and Software Patterns
Keywords: Situation Analytics, Emotion Recognition, Adaptive User Interfaces, HCI-Patterns, Model-based User
Interface Design.
Abstract: This paper introduces the SitAdapt architecture for adaptive interactive systems that are situation-aware and
respond to changing contexts, environments and user emotions. An observer component watches the
user during interaction with the system. The adaptation process is triggered when a given situation changes
significantly or a new situation arises. The necessary software modifications are established in real-time by
exploiting the resources of the PaMGIS MBUID development framework with software patterns and
templates at various levels of abstraction. The resulting interactive systems may serve as target applications
for end users or can be used for laboratory-based identification of user personas, optimizing user experience
and assessing marketing potential. The approach is currently being evaluated for applications from the e-
business domain.
The main contributions of this paper are the following:
- Design of architecture extensions for an existing MBUID framework for enabling real-time adaptation of target application user interfaces by using situation analytics
- Discussion of the use of situation and HCI patterns for adaptation purposes
- Detailed discussion of the adaptation process with the focus on user-related adaptations

The paper is organized in the following way. Chapter 2 discusses related work in the field of model- and pattern-based user interface development systems and defines the underlying concept of “situation”. The chapter also discusses the related work in the areas of situation-aware systems, model- and pattern-based construction of interactive systems, and emotion recognition.
Chapter 3 introduces SitAdapt, our architectural approach for developing and running situation-analytic adaptive interactive systems. The chapter focuses on discussing the adaptation process and the various adaptation types supported by the system in detail. The specification language of the HCI patterns and model fragments used in the process is discussed elsewhere (Engel et al., 2015).
Chapter 4 demonstrates the SitAdapt component in action. It gives an example of a situation recognized in a travel-booking application and shows how the adaptation process would react in order to improve the user experience and task accomplishment for a specific user.
Chapter 5 concludes the paper with a short discussion of currently evolving target applications and gives hints at our planned future work.

2 RELATED WORK

In this paper we present the user interface adaptation process of the SitAdapt system, which combines model- and pattern-based approaches for interactive system construction with visual and bio-signal-based emotion recognition technology to allow for software adaptation by real-time situation analytics. The underlying software and monitoring technology was introduced in (Märtin et al., 2016).

2.1 Model- and Pattern-based User Interface Design

During the last decades, Model-based User Interface Design (MBUID) has paved the way for many well-structured approaches, development environments and tools for building high-quality interactive systems that fulfil tough requirements regarding user experience, platform-independence, and responsiveness. At the same time, model-driven and model-based architectural and development approaches (Meixner et al., 2014) were also introduced to automate parts of the development process and shorten time-to-market for flexible interactive software products. As one de-facto process and architectural standard for MBUID, the CAMELEON reference framework (CRF) has emerged (Calvary et al., 2002). CRF defines many model categories and a comprehensive base architecture for the construction of powerful model-based development environments.
In order to allow for automation of the development process for interactive systems, to raise the quality and user experience levels of the resulting application software, and to stimulate developer creativity, many software-pattern-based design approaches were introduced during the last 15 years (Breiner et al., 2010).
In order to get the best results from both model- and pattern-based approaches, SitAdapt is integrated into the PaMGIS (Pattern-Based Modeling and Generation of Interactive Systems) development framework (Engel et al., 2015), which is based on the CAMELEON reference framework. PaMGIS contains all models proposed by CRF, but also exploits pattern collections for modeling and code generation. The CRF guides the developer on how to transform an abstract user interface over intermediate model artifacts into a final user interface. The overall structure of the PaMGIS framework with its tools and resources, the incorporated CRF models, and the new SitAdapt component is shown in figure 1.
The SitAdapt component has full access to all sub-models of the context of use model and interacts with the user interface model.
Within CRF-conforming systems the abstract user interface model (AUI) is generated from the information contained in the domain model of the application, which includes a task model and a concept model (i.e. typically an object-oriented class model defining business objects and their relations), and defines abstract user interface objects that are still independent of the context of use.
The AUI model can then be transformed into a concrete user interface model (CUI), which already exploits the context model and the dialog model; the dialog model is responsible for the dynamic user interface behavior.
In the next step PaMGIS automatically generates the final user interface model (FUI) by parsing the CUI model. To produce the actual user interface, the resulting XML-conforming UI specification can either be compiled or interpreted at runtime, depending on the target platform. Chapter 3 discusses in detail how SitAdapt complements and accesses the PaMGIS framework.

2.2 Context- and Situation-Awareness

Since the advent of smart mobile devices, HCI research has started to take into account the various new usability requirements of application software running on smaller devices with touch-screen or speech interaction, or of apps that migrate smoothly from one device type to another. Several of the needed characteristics for these apps targeted at different platforms and devices can be specified and implemented using the models and patterns residing in advanced MBUID systems. Even runtime support for responsiveness, with the interactive parts distributed or migrating from one (virtual) machine to the other and the domain objects residing in the cloud, can be modeled and managed by CRF-conforming development environments (e.g. Melchior et al., 2011).
The concept of context-aware computing was originally proposed for distributed mobile computing in (Schilit et al., 1994). In addition to the software and communication challenges to solve when dynamically migrating an application to various devices and locations within a distributed environment, the definition of context given then also included aspects such as lighting or the noise level, as well as the current social situation (e.g. are other people around? are these your peers? is one of them your manager?).
Since then, mobile software has made huge steps towards understanding of and reacting to varying situations. To capture the individual requirements of a situation, (Chang, 2016) proposed that a situation consists of an environmental context E that covers the user’s operational environment, a behavioral context B that covers the user’s social behavior by
interpreting his or her actions, and a hidden context M that includes the user’s mental states and emotions. A situation Sit at a given time t can thus be defined as Sit = <M, B, E>t. A user’s intention for using a specific software service for reaching a goal can then be formulated as the temporal sequence <Sit1, Sit2, …, Sitn>, where Sit1 is the situation that triggers the usage of a service and Sitn is the goal-satisfying situation. In (Chang et al., 2009) the Situ framework is presented, which allows the situation-based inference of the hidden mental states of users for detecting the users’ intentions and identifying their goals. The framework can be used for modeling and implementing applications that are situation-aware and adapt themselves to the users’ changing needs during runtime.
Our own work, described in the following chapters, was inspired by Situ, but puts most emphasis on maintaining the model-based approach of the PaMGIS framework by linking the domain and user interface models with the user-centric situation-aware adaptation component.
An approach for enabling rich personal analytics for mobile applications by defining a specific architectural layer between the mobile apps and the mobile OS platform is proposed in (Lee, Balan, 2014). The new layer allows access to all sensors and low-level I/O features of the used devices.
The users’ reactions to being confronted with the results of a hyper-personal analytics system and the consequences for sharing such information and for privacy are discussed in (Warshaw et al., 2015).
For implementing the emotion recognition functionality that can be exploited for inferring the desires and sentiments of individual users while working with the interactive application, the current version of SitAdapt captures both visual and biometric data signals. In (Herdin et al., 2017) we discuss the interplay of the various recognition approaches used in our system.
In (Picard, 2015) an overview of the potential of various emotion recognition technologies in the field of affective computing is given. In (Schmidt, 2016) emotion recognition technologies based on bio-signals are discussed, whereas (Qu, Wang, 2016) discusses the evolution of visual emotion recognition approaches, including the interpretation of facial macro- and micro-expressions.

2.3 Context-Adaptation

The recognition of emotions and the inference of sentiments and mental states from emotional and other data is the basis for suggesting adaptive changes of the content, behavior, and the user interface of the target application. In general, three different types of adaptation can be distinguished when focusing on user interfaces (Akiki et al., 2014), (Yigitbas et al., 2015):
- Adaptable user interfaces. The user interface is a-priori customized to the personal preferences of the user.
- Semi-automated adaptive user interfaces. The user interface provides recommendations for adaptations, which can be accepted by the user or not.
- Automated adaptive user interfaces. The user interface automatically reacts to changes in the context of the interactive application.

The SitAdapt runtime architecture co-operates with the PaMGIS framework to support both semi-automated and automated user interface adaptation. For constructing real situation-aware systems, however, user interface adaptation aspects have to be mixed with content-related static and dynamic system aspects that are typically covered by the task and context model, which together form the domain model of the framework.
In (Mens et al., 2016) a taxonomy for classifying and comparing the key concepts of diverse approaches for implementing context-awareness is presented. The paper gives a good overview of the current state-of-the-art in this field.

3 SITADAPT – ARCHITECTURE

In order to profit from earlier results in the field of MBUID systems, the SitAdapt runtime environment is integrated into the PaMGIS (Pattern-Based Modeling and Generation of Interactive Systems) development framework. The SitAdapt environment extends the PaMGIS framework and allows for modeling context changes in the user interface in real-time.
The architecture (fig. 2) consists of the following parts:
- The data interfaces from the different devices (eye-tracker, wristband, FaceReader software and metadata from the application)
- The signal synchronization component that synchronizes the data streams from the different input devices by using timestamps
- The recording component that records the eye- and gaze-tracking signal of the user and tracks his or her emotional video facial expression
with the Noldus FaceReader software as a combination of the six basic emotions (happy, sad, scared, disgusted, surprised, and angry). Other recorded data about the user are, e.g., age range and gender. The stress level and other biometric data are recorded in real-time by a wristband. In addition, application metadata are provided by the target application.
- The situation analytics component analyzes the situation by exploiting the data and a status flag from the FUI or AUI in the user interface model. A situation profile of the current user is generated as a dynamic sequence of situations.
- The decision component uses the data that are permanently provided by the situation analytics component to decide whether an adaptation of the user interface is currently meaningful and necessary. Whether an adaptation is meaningful depends on the predefined purpose of the situation-aware target application. Goals to meet can range from successful marketing activities in e-business, e.g. having the user buy an item from the e-shop or letting her or him browse through the latest special offers, to improved user experience levels, or to meeting user desires defined by the hidden mental states of the user. Such goals can be detected if one or more situations in the situation profile trigger an application-dependent or domain-independent situation pattern. Situation patterns are located in the pattern repository. If the decision component decides that an adaptation is necessary, it has to provide the artifacts from the PaMGIS pattern and model repositories to allow for the modification of the target application by the adaptation component. The situation pattern provides the links and control information for accessing and composing the HCI patterns and model fragments necessary for constructing the modifications.
- The adaptation component finally generates the necessary modifications of the interactive target application.

Figure 2: SitAdapt Architecture.

These architectural components provided by the SitAdapt system are necessary for enabling the PaMGIS framework to support automated adaptive user interfaces.

3.1 User Interface Construction

The domain model (see fig. 1) serves as the starting point of the process that is used for user interface modeling and generation. The model consists of two sub-models, the task model and the concept model.
The task model is specified using the ConcurTaskTree (CTT) notation (Paternò, 2001) as well as an XML file, the UI configuration file, generated from the CTT description and from accessing contents of the context of use model (see fig. 1).
The context of use model holds four sub-models: the user model, the device model, the UI toolkit model, and the environment model. All models play important roles in the user interface construction process and are exploited when modeling responsiveness and adapting the target application to specific device types and platforms. For the situation-aware adaptation process, however, the user model is the most relevant of these sub-models. It is structured as follows:

<UserCharacteristics>
  <UserIdentData>
  <UserAbilities>
    <USRUA_Visual>
    <USRUA_Acoustic>
    <USRUA_Motor>
    <USRUA_Mental>
  <UserExperiences>
    <USRUE_Domain>
    <USRUE_Handling>
  <UserDistraction>
  <UserLegalCapacity>
  <UserEmotionalState>
  <UserBiometricState>
    <USRBS_Pulse>
    <USRBS_BloodPressure>
    <USRBS_PulseChangeRate>
    <USRBS_E4_DataSet>
    …
    <USRBS_DeviceX_DataSet>

The user model holds both static information about the current user and dynamic data describing the emotional state as well as the biometric state. Not all of the attributes need to be filled with concrete data values. The dynamic values concerning the emotional state and the biometric state are taken from the situation profile whenever an adaptation decision has to be made (see chapter 3.2). Note, however, that the situation profile that is generated by the situation analytics component contains the whole sequence of emotional and biometric user states from session start until session termination. The temporal granularity of the situation profile is variable, starting from fractions of a second. It depends on the target application’s requirements.
A priori information can be exploited for tailoring the target application when modeling and designing the appearance and behavior of the user interface, before generating it for the first time and use. This can already be seen as part of the adaptation process.
Typical a priori data are user identification data, data about the various abilities of the user, and specific data about the user’s fluency with the target application’s domain and its handling.
Dynamic data will change over time and can be exploited for adapting the user interface at runtime. Such data include the emotional and biometric state, observed and measured by the hard- and software devices attached to the recording component.
The data structure also allows to directly integrate proprietary data formats provided by the devices used in SitAdapt, such as the Empatica E4 wristband.
If not available a priori, some of the attributes can be completed by SitAdapt after an attribute value was recognized by the system. For instance, the age of a user can be determined with good accuracy by the FaceReader software. With this information, the <UserLegalCapacity> attribute can be automatically filled in.
The generated UI configuration file contains a <ContextOfUse> tag field for each task. It has sub-tags that may serve as context variables that hold information relevant for controlling the UI configuration and, later at runtime, the adaptation process.
One of the sub-tags may for instance hold the information that a task “ticket sale” is only authorized for users from age 18. When the situation analytics component at runtime discovers that the current user is less than 18 years old, a hint is given in the final user interface (FUI) model that she or he is not authorized to buy a ticket because of her or his age.
The concept model consists of the high-level specifications of all the data elements and interaction objects from the underlying business model that are relevant for the user interface. It can therefore be seen as an interface between the business core models of an application and the user interface models. It can, e.g., be modeled using UML class diagram notation. From the concept model an XML specification is also derived.
In PaMGIS the task model serves as the primary basis for constructing the dialog model (see figure 1). The various dialogs are derived from the tasks of the task model. Additional input for modeling data types and inter-class relations is provided by the concept model. The dialog model is implemented by using Dialog Graphs (Forbrig, Reichart, 2007).
In the next step the abstract user interface (AUI) is constructed by using the input of the domain model and the dialog model. The dialog model provides the fundamental input for the AUI specification, because it is based on the user tasks. Each of the different dialogs is denoted as a <Cluster> element. All elements together compose a <Cluster> that is also specified in XML.
For transforming the AUI into the concrete user interface (CUI) the AUI elements are mapped to <Form> CUI elements. The context of use model is exploited during this transformation. For instance, by accessing the UI toolkit sub-model it can be guaranteed that only such widget types are used in the CUI that come with the used toolkit and for which the fitting code can later be generated in conjunction with the target programming or markup language used for the final user interface. The CUI specification is also written in XML.
Finally, the CUI specification has to be transformed into the final user interface (FUI). The XML specification is therefore parsed and translated into the target language, e.g., HTML, C# or Java.
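The final transformation step can be illustrated with a small sketch: a CUI-style XML specification is parsed and each widget is translated into the target markup language, here HTML. Note that the element names (<Form>, <Widget>) and the mapping table below are simplifying assumptions for illustration only and do not reproduce the actual PaMGIS schema.

```python
# Illustrative sketch of the CUI-to-FUI translation step: parse an
# XML-conforming CUI specification and emit an HTML fragment (the FUI).
# The element names and the mapping table are hypothetical, not the
# actual PaMGIS schema.
import xml.etree.ElementTree as ET

CUI_SPEC = """
<Form id="booking">
  <Widget type="TextInput" id="bahncardnumber" label="BahnCard number"/>
  <Widget type="Button" id="next" label="Next"/>
</Form>
"""

# Assumed mapping from abstract widget types to concrete HTML templates.
HTML_TEMPLATES = {
    "TextInput": '<label>{label}</label><input type="text" id="{id}"/>',
    "Button": '<button id="{id}">{label}</button>',
}

def generate_fui(cui_xml: str) -> str:
    """Parse a CUI specification and translate it into an HTML fragment."""
    form = ET.fromstring(cui_xml)
    rows = [
        HTML_TEMPLATES[w.get("type")].format(id=w.get("id"), label=w.get("label"))
        for w in form.findall("Widget")
    ]
    return '<form id="{}">\n  {}\n</form>'.format(form.get("id"), "\n  ".join(rows))

print(generate_fui(CUI_SPEC))
```

An analogous mapping table per UI toolkit would realize the constraint described above, namely that only widget types available in the used toolkit appear in the generated code.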
3.2 User Interface Adaptation Process

For modeling a situation-aware adaptive target application, SitAdapt components are involved in several parts of the entire adaptation process.
Already before the final user interface is generated and displayed for the first time, the situation can be analyzed by SitAdapt. SitAdapt can access the user attributes in the profiles. The situation analytics component gets the synchronized monitored data streams from the recording component and stores the data as the first situation of a sequence of situations in the situation profile. Thus, the user will get an adapted version of the user interface with its first display, but will not notice that an adaptation has already occurred.
At runtime a situation is stored after each defined time interval. In addition, environmental data, e.g. time of day, room lighting, number of people near the user, age and gender of the user, emotional level, stress level, etc., can be recorded, pre-evaluated, and attached to the situations in the situation profile.
A situation profile has the following structure:

<SituationProfile>
  <TargetApplication>
  <User>
  <Situation_0>
    <SituationTime> 0
    <FUI_link> NULL
    <Eye_Tracking> …
    <Gaze_Tracking> …
    <MetaData> NULL
    <Environment>
      <EnvAttrib_1> …
      …
    <EmotionalState> EmoValue_1
    <BiometricState> BioValue_1
  …
  <Situation_i>
    <SituationTime> i
    <FUI_link> FUI_object_x
    <Eye_Tracking> v1,v2
    <Gaze_Tracking> w1,w2,w3
    <MetaData>
      <Mouse> x,y
      …
    <Environment>
      …
    <EmotionalState> angry
    <BiometricState>
      <Pulse> 95
      <StressLevel> yellow
      <BloodPressure> 160:100
      …

This is the structure of the situation profile as currently used for prototypical applications and for evaluating our approach. The attribute <FUI_link> is used to identify the part of the user interface in the focus of the user. It is provided by the eye-tracking data and the mouse coordinates. For more advanced adaptation techniques additional attributes may be required (see chapter 3.3). Situations with their attributes provide a dynamic data base that allows the decision component to find one or more domain-dependent or domain-independent situation patterns in the pattern repository that match an individual situation or a sequential part of the situation profile. The situation patterns hold links to HCI patterns or model fragments that are used for adaptation purposes.
Typically, not all of the existing situation attributes are needed to find a suitable situation pattern.
Also note that some situation attributes are not type-bound. The eye- and gaze-tracking attributes, for instance, can either contain numeric coordinates or sequences of such coordinates to allow for analyzing rapidly changing eye movements in fine-grained situation profiles, e.g. when observing a car driver. However, they can also give already pre-processed application-dependent descriptions of the watched UI objects and the type of eye movements (e.g. a repeating loop between two or more visual objects).
To enable adaptations, some modeling aspects of the PaMGIS framework have to be extended.
When transforming the AUI model into the CUI model, the current situation profile of the user is checked by the decision component. The situational data in the profile may match domain-independent or application-specific situation patterns. If one or more situation patterns apply, these patterns guide how a dynamically adapted CUI can be constructed. The construction information is provided by HCI patterns and/or model fragments in the PaMGIS repositories. Each situation pattern is linked with such UI-defining artifacts.
Situation patterns are key resources for the SitAdapt adaptation process. Domain-independent situation patterns cover recurring standard situations in the user interface. Application-specific situation patterns have to be defined and added to the pattern repository when modeling a new target application. They mainly cover aspects that concern specific communications between business objects and the user interface. Existing application-specific situation patterns can be reused if they also apply for a new target application.
The CUI that was modified with respect to the triggering situation then serves as a construction template from which the FUI is generated.
However, the FUI is also monitored by SitAdapt
at runtime, after it was generated, i.e. when the user interacts with the interactive target application. Thus, the characteristics of the FUI can be modified dynamically by SitAdapt whenever the decision component assesses the occurred situational changes for the user as significant (e.g. the user gets angry or has not moved the mouse for a long time period).
In this case a flag in the UI tells SitAdapt which part of the AUI is responsible for the recognized situation within a window, dialog, widget and/or the current interaction in the FUI. Depending on the situation analytics results, the detected situation patterns will hint at the available HCI patterns, model fragments and construction resources (e.g. a reassuring color screen background, a context-aware tool tip, context-aware speech output, etc.). The decision component may thus trigger a modification of the concerned CUI parts.
The adaptation component then accesses and activates the relevant HCI patterns and/or model fragments. From the modified CUI a new version of the FUI is generated and displayed as soon as possible. After the adaptation the FUI is again monitored by the situation analytics component.

3.3 Advanced Adaptation Techniques

Two major goals of adaptive technologies are 1) to raise the effectiveness of task accomplishment and 2) to raise the level of user experience. With the resources, tools, and components available in PaMGIS and SitAdapt we plan to address these goals in the near future.
To monitor task accomplishment, links between the sequence of situations in the situation profile and the tasks in the task model have to be established. For this purpose the situation analytics component needs access to the task model. As the tasks and sub-tasks of the task model are related to business objects, linking situations to data objects in the concept model appears to be helpful. With these new communication mechanisms we can check whether the sequence of situations goes along with the planned sequence of tasks. If deviations or unforeseen data values occur, situation-aware adaptation can help the user to find back to the intended way.
Both for situation sequences matching the task sequences planned by the developer and for situations that have left the road to the hoped-for business goal, emotional and stress-level monitoring may trigger adaptations of the user interface that raise the joy of use or take pressure from the user in complex situational contexts.
Monitoring different users with SitAdapt while they are working on the tasks of various target applications in the usability lab can also lead to identifying different personas and usage patterns of the target applications. The findings can also be used for a priori adaptations in the target systems in the case where no situation analytics process is active.

4 SITADAPT IN ACTION

A simple example that demonstrates the functionality of the SitAdapt system for an interactive real-life application is the following e-business case.
A user wants to book a trip from one city to another on the website fromatob.com. This website applies a wizard HCI pattern for the booking process. When using SitAdapt for an existing proprietary web application, a simplified user interface model of the application as well as a task model fragment can be provided in order to make the adaptation functionality partly available for the existing software.
In the first step (fig. 3) the traveller has to enter her or his personal details into the wizard fields.

Figure 3: fromatob.com wizard pattern.

SitAdapt is monitoring the user during this task. The SitAdapt system records the eye movements, pulse, stress level, and the emotional state, and gets real-time information from the website (FUI_link).
The situation analytics component creates a user-specific situation from this data:

<SituationProfile>
  <TargetApplication>
  <User>
  <Situation_booking>
    <SituationTime> 60sec
    <FUI_link> Wizard_Part1
    <Eye_Tracking> Field Bahncardnumber
    <EmotionalState> angry
    <BiometricState>
      <Pulse> high
      <StressLevel> orange
  </Situation_booking>

The decision component determines whether an adaptation is necessary with the help of the pattern repository and the model repository. In this example, the component decides that the user has a problem with one field in the form of the wizard (bahncardnumber). The situation can be mapped to the domain-independent situation pattern form-field-problem. The form-field-problem pattern hints at an HCI pattern from the pattern repository to help the user in this situation. The adaptation creates a new final user interface (FUI) with a chat window in order to help the user (fig. 4).

Figure 4: fromatob.com wizard pattern with chat window.

5 CONCLUSION

In this paper we have presented the new adaptive functionality developed for the PaMGIS MBUID framework. This was made possible by integrating a situation-aware adaptation component seamlessly into the model- and pattern-based development environment and by adding runtime features.
The new SitAdapt component is now fully operational. After finalizing the signal synchronization and recording components, the system is currently being tested and evaluated with target applications from the e-business domain. In one realistic application domain that is currently being implemented we use SitAdapt to watch the user and search for recurring situation patterns in the domain of ticket sale for long-distance travel.
Another area we currently evaluate is the e-business portal of a cosmetics manufacturer. For this domain we make extensive use of usability-lab-based user tests with varying scenarios in order to get sufficient data for mining typical application-dependent situation patterns.
For both target application domains we are currently exploring usability, user experience and marketing aspects. It is our next goal to define a large set of situation patterns, both domain-dependent and universally applicable, and thus stepwise improve the intelligence level and variety of the resources of the SitAdapt decision component.

REFERENCES

Akiki, P.A., et al.: Integrating adaptive user interface capabilities in enterprise applications. In: Proceedings of the 36th International Conference on Software Engineering (ICSE 2014), pp. 712-723. ACM (2014)
Breiner, K. et al. (Eds.): PEICS: towards HCI patterns into engineering of interactive systems. In: Proc. PEICS ’10, pp. 1-3. ACM (2010)
Calvary, G., Coutaz, J., Bouillon, L. et al., 2002. “The CAMELEON Reference Framework”. Retrieved August 25, 2016 from http://giove.isti.cnr.it/projects/cameleon/pdf/CAMELEON%20D1.1RefFramework.pdf
Chang, C.K.: Situation Analytics: A Foundation for a New Software Engineering Paradigm. IEEE Computer, Jan. 2016, pp. 24-33
Chang, C.K. et al.: Situ: A Situation-Theoretic Approach to Context-Aware Service Evolution. IEEE Trans. Services Computing, vol. 2, no. 3, 2009, pp. 261-275
Engel, J., Märtin, C., Forbrig, P.: A Concerted Model-driven and Pattern-based Framework for Developing User Interfaces of Interactive Ubiquitous Applications. In: Proc. First Int. Workshop on Large-scale and Model-based Interactive Systems, Duisburg, pp. 35-41 (2015)
Engel, J., Märtin, C., Forbrig, P.: Practical Aspects of Pattern-supported Model-driven User Interface Generation. To appear in Proc. HCII 2017, Springer (2017)
Forbrig, P., Reichart, D., 2007. Spezifikation von „Multiple User Interfaces“ mit Dialoggraphen. In: Proc. INFORMATIK 2007: Informatik trifft Logistik. Beiträge der 37. Jahrestagung der Gesellschaft für Informatik e.V. (GI), Bremen
Herdin, C., Märtin, C., Forbrig, P.: SitAdapt: An Architecture for Situation-aware Runtime Adaptation of Interactive Systems. To appear in Proc. HCII 2017, Springer (2017)
Lee, Y., Balan, R.K.: The Case for Human-Centric Personal Analytics. In: Proc. WPA ’14, pp. 25-29. ACM (2014)
Märtin, C., Rashid, S., Herdin, C.: Designing Responsive
Interactive Applications by Emotion-Tracking and
Pattern-Based Dynamic User Interface Adaptations,
Proc. HCII 2016, Vol. III, pp. 28-36, Springer (2016)
Meixner, G., Calvary, G., Coutaz, J.: Introduction to
model-based user interfaces. W3C Working Group
Note 07 January 2014. http://www.w3.org/TR/mbui-intro/. Accessed 27 May 2015
Melchior, J., Vanderdonckt, J., Van Roy, P.: A Model-
Based Approach for Distributed User Interfaces, Proc.
EICS ‘2011, pp. 11-20, ACM (2011)
Mens, K. et al.: A Taxonomy of Context-Aware Software
Variability Approaches, Proc. MODULARITY
Companion’16, pp. 119-124, ACM (2016)
Paternò, F., 2001. ConcurTaskTrees: An Engineered
Approach to Model-based Design of Interactive
Systems, ISTI-C.N.R., Pisa
Picard, R.: “Recognizing Stress, Engagement, and Positive Emotion”. In: Proc. IUI 2015, March 29-April 1, 2015, Atlanta, GA, USA, pp. 3-4
Qu, F., Wang, S.-J. et al.: CAS(ME)2: A Database of Spontaneous Macro-expressions and Micro-expressions. In: M. Kurosu (Ed.): HCI 2016, Part III, LNCS 9733, pp. 48-59, Springer (2016)
Schilit, B.N., Theimer, M.M.: Disseminating
Active Map Information to Mobile Hosts, IEEE Network,
vol. 8, no. 5, pp. 22–32, (1994)
Schmidt, A. Biosignals in Human-Computer Interaction,
Interactions Jan-Feb 2016, pp. 76-79, (2016)
Warshaw, J. et al.: Can an Algorithm Know the “Real
You”? Understanding People’s Reactions to Hyper-
personal Analytics Systems, Proc. CHI 2015, pp. 797-
806, ACM (2015)
Yigitbas, E., Sauer, S., Engels, G.: A Model-Based
Framework for Multi-Adaptive Migratory User
Interfaces. In: Proceedings of the HCI 2015, Part II,
LNCS 9170, pp. 563-572, Springer (2015)