Multimodal Fusion of Smart Home and Text-based Behavior Markers for Clinical Assessment Prediction

1. INTRODUCTION

Individuals with amnestic mild cognitive impairment (MCI) and Alzheimer’s Disease typically experience symptoms such as memory loss, difficulty with language and visual-spatial abilities, decreased ability to focus, and issues with reasoning, planning, and complex decision making. Of these symptoms, difficulty with everyday memory exhibits the greatest degree of impairment and can negatively impact one’s independence and quality of life [1]. To help individuals with memory impairment perform daily activities, compensatory devices such as pagers and memory notebooks have been successfully introduced [2]. More recently, technology has enabled enhanced digital versions of such compensatory devices, utilizing mobile apps and smart environments. For example, a digital memory notebook app can use notifications to remind users to log important events, tasks, and notes. Such an app can also be coupled with a smart home. The smart home continuously and unobtrusively collects naturalistic data, which can be automatically labeled with corresponding activity labels such as cook, work, and sleep. Using these labeled activities as context, smart home algorithms can automatically deliver in-the-moment digital memory notebook prompts that remind residents to perform common, day-to-day activities and encourage digital memory notebook use [2], [3].
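As a loose illustration of this idea (not the actual EMMA or smart home implementation), the sketch below maps automatically recognized activity labels to in-the-moment notebook prompts; the activity names, prompt texts, and once-per-day trigger rule are assumptions made for the example.

```python
# Hypothetical sketch: trigger digital memory notebook prompts from
# automatically labeled smart home activities. Labels, prompt texts,
# and the once-per-day rule are illustrative assumptions only.
from typing import Optional, Set

PROMPTED_ACTIVITIES = {
    "cook": "You just finished cooking. Log the meal in your memory notebook?",
    "work": "Work session detected. Add any tasks or notes to your notebook?",
    "sleep": "Good morning! Review today's plan in your memory notebook?",
}

def maybe_prompt(activity_label: str, prompted_today: Set[str]) -> Optional[str]:
    """Return a prompt for the recognized activity, at most once per day."""
    if activity_label in PROMPTED_ACTIVITIES and activity_label not in prompted_today:
        prompted_today.add(activity_label)
        return PROMPTED_ACTIVITIES[activity_label]
    return None
```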

Beyond providing compensatory assistance, smart homes and memory notebook apps serve a dual purpose for MCI and Alzheimer’s Disease stakeholders. Automated algorithms can continuously analyze smart home sensor data to model activity and behavior patterns over time [4]. If a sudden or gradual behavior change is detected, care providers and family members can be notified and provide early treatment. Recently, researchers have begun using behavior markers and machine learning to map smart home data onto clinical health scores [5]-[8]. Such mappings could be used to regularly screen for changes in health, complementing more traditional clinical assessment methods and leading to proactive intervention. Because these mappings are based on continuous data, they may also provide additional insights and further augment the data collected during a brief visit with a physician. When combined with additional information sources, such as demographic information or wearable data, the accuracy of these health score predictions can be further improved [5]. If an intervention tool such as a memory notebook app is introduced into a daily routine, use of the tool itself can also provide a unique source of information that may correlate with cognitive health. The process of merging data from these different sources (i.e., modes or modalities) as input to machine learning algorithms is called multimodal fusion [9].
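To make the fusion idea concrete, the following minimal sketch shows one common approach, feature-level (early) fusion, in which per-participant feature vectors from each modality are concatenated before training a regressor. The feature counts, random data, and choice of regressor are assumptions for illustration and do not reflect the specific pipeline evaluated in this paper.

```python
# Minimal early-fusion sketch (illustrative only): concatenate per-participant
# features from each modality and train a single regressor on the fused vector.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
smart_home = rng.random((14, 20))    # ambient-sensor behavior markers (hypothetical)
app_usage = rng.random((14, 10))     # memory notebook usage/text markers (hypothetical)
demographics = rng.random((14, 3))   # e.g., age, education (hypothetical)
y = rng.random(14)                   # one clinical assessment score (hypothetical)

# Early fusion: one concatenated feature vector per participant.
X = np.hstack([smart_home, app_usage, demographics])

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                         scoring="neg_mean_absolute_error")
print("Leave-one-out MAE:", -scores.mean())
```

An alternative, decision-level (late) fusion, would instead train one model per modality and combine their individual predictions.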

In this paper, we explore multimodal fusion to predict ten different clinical assessment scores using regression techniques (see Section 3.2 for clinical assessment details). These scores fall into two categories:

Objective Testing Scores

Self-report Measures

Our data modalities include behavior markers extracted from ambient sensors embedded in smart homes, a memory notebook tablet app called EMMA (Electronic Memory and Management Aid) [10], and participant demographic information. We hypothesize that smart homes and digital memory notebooks offer informative behavior markers that, when used in combination, can predict multiple clinical health scores. We validate our methods and provide evidence to support our hypothesis using data collected from N = 14 participants with amnestic MCI who participated in the EMMA/smart home partnered condition of a pilot randomized controlled clinical trial [10].

This paper offers both clinical and technical contributions. In terms of clinical contributions, we introduce and evaluate methods for automatically assessing health in naturalistic, unscripted settings. We further describe how an intervention app can serve a dual role as both an assessment tool and a compensatory device. Based on our participant sample, we offer insights on the relationship between cognitive health assessments and behavior marker sets describing activity patterns and memory notebook usage. In terms of technical contributions, we introduce methods for extracting digital markers that reflect patterns in behavior, writing content/style, and intervention adherence. We describe and compare multiple approaches to fusing these multimodal data. Furthermore, we consider the design of machine learning techniques to predict precise clinical scores. Finally, we utilize joint prediction to boost assessment performance by harnessing the predictive relationships among multiple assessment measures.
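As a hedged sketch of how joint prediction of multiple scores might look (the specific method used in this paper is described later and may differ), a regressor chain lets each assessment score's prediction condition on the scores predicted before it; the data shapes, targets, and base regressor here are assumptions.

```python
# Hedged sketch of jointly predicting multiple clinical assessment scores.
# A regressor chain lets later targets condition on earlier predicted targets;
# shapes, targets, and the Ridge base model are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.multioutput import RegressorChain

rng = np.random.default_rng(0)
X = rng.random((14, 33))   # fused multimodal features (hypothetical)
Y = rng.random((14, 10))   # ten clinical assessment scores (hypothetical)

chain = RegressorChain(Ridge(alpha=1.0), random_state=0)
chain.fit(X, Y)
predictions = chain.predict(X)   # shape (14, 10): one prediction per score
```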