Title: User models for personalized physical activity interventions: Scoping review
Authors: Ghanvatkar, S 
Kankanhalli, A 
Rajan, V 
Issue Date: 2019
Citation: Ghanvatkar, S, Kankanhalli, A, Rajan, V (2019). User models for personalized physical activity interventions: Scoping review. Journal of Medical Internet Research 21 (1) : e11098. ScholarBank@NUS Repository.
Abstract:
Background: Fitness devices have spurred the development of apps that aim to motivate users, through interventions, to increase their physical activity (PA). Personalization in the interventions is essential as the target users are diverse with respect to their activity levels, requirements, preferences, and behavior.
Objective: This review aimed to (1) identify different kinds of personalization in interventions for promoting PA among any type of user group, (2) identify user models used for providing personalization, and (3) identify gaps in the current literature and suggest future research directions.
Methods: A scoping review was undertaken by searching the databases PsycINFO, PubMed, Scopus, and Web of Science. The main inclusion criteria were (1) studies that aimed to promote PA; (2) studies that had personalization, with the intention of promoting PA through technology-based interventions; and (3) studies that described user models for personalization.
Results: The literature search resulted in 49 eligible studies. Of these, 67% (33/49) of the studies focused solely on increasing PA, whereas the remaining studies had other objectives, such as maintaining a healthy lifestyle (8 studies), weight loss management (6 studies), and rehabilitation (2 studies). The reviewed studies provide personalization in 6 categories: goal recommendation, activity recommendation, fitness partner recommendation, educational content, motivational content, and intervention timing. With respect to the mode of generation, interventions were found to be semiautomated or automatic. Of these, the automatic interventions were knowledge-based, data-driven, or both. User models in the studies were constructed with parameters from 5 categories: PA profile, demographics, medical data, behavior change technique (BCT) parameters, and contextual information. Only 27 of the eligible studies evaluated the interventions for improvement in PA, and 16 of these concluded that interventions to increase PA are more effective when they are personalized.
Conclusions: This review investigates personalization in the form of recommendations or feedback for increasing PA. On the basis of the review and the gaps identified, research directions for improving the efficacy of personalized interventions are proposed. First, data-driven prediction techniques can facilitate effective personalization. Second, the use of BCTs in automated interventions, in combination with PA guidelines, is yet to be explored, and preliminary studies in this direction are promising. Third, systems with automated interventions also need to be suitably adapted to serve the specific needs of patients with clinical conditions. Fourth, previous user models focus on single-metric evaluations of PA instead of a potentially more effective, holistic, and multidimensional view. Fifth, with the widespread adoption of activity monitoring devices and mobile phones, personalized and dynamic user models can be created using available user data, including users’ social profiles. Finally, the long-term effects of such interventions, as well as the technology medium used for the interventions, need to be evaluated rigorously. © Suparna Ghanvatkar, Atreyi Kankanhalli, Vaibhav Rajan.
Source Title: Journal of Medical Internet Research
ISSN: 1438-8871
DOI: 10.2196/11098
Appears in Collections: Elements, Staff Publications

Files in This Item:
10_2196_11098.pdf (397.74 kB, Adobe PDF)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.