4 Discussion

This study set out to map the landscape of current digital self-control tools and relate them to an integrative dual systems model of self-regulation. Our review of 367 apps and browser extensions found that blocking distractions or removing user interface features were the most common approaches to digital self-control. Grouping design features into clusters, the prevalence ranking was block/removal > self-tracking > goal advancement > reward/punishment. 65% of tools focused on only one of these clusters in their core design, and most of the remainder (32%) on two. The frequencies of design features differed between the Chrome Web Store, Google Play Store, and Apple App Store, which likely reflects differences in developer permissions. When mapping design features to our dual systems model, the least commonly targeted cognitive component was unconscious habit scaffolding, followed by the delay and expectancy elements of the expected value of control.

We now discuss how these empirical observations can inform future research by pointing to: i) widely used and/or theoretically interesting design features in current digital self-control tools that are underexplored in HCI research; ii) feature gaps identified by our application of the dual systems model, showing neglected areas that could be relevant to researchers and designers; and iii) how the model may be used directly to guide research and intervention design. We then outline limitations and future work.

4.1 Research opportunities prompted by widely used or theoretically interesting design features

The market for digital self-control tools effectively amounts to hundreds of natural experiments in supporting self-control, meaning that successful tools may reveal design approaches with wider applicability. These approaches are low-hanging fruit for research studies, especially as many so far lack evaluation of their efficacy and of the transferability of their underlying design mechanisms. We highlight three such examples:

Responsibility for a virtual creature: Forest (Seekrtech 2018) ties device use to the well-being of a virtual tree. Numerous variations and clones of this approach exist among the tools reviewed, but Forest is the most popular with over 5 million users on Android alone. It presents a novel use of ‘virtual pets’ that requires the user to abstain from action (resist using their phone) rather than take action to ‘feed’ the pet, and is a seemingly successful example of influencing the reward component of expected value of control.

Redirection of activity: Timewarp (Stringinternational.com 2018) reroutes the user to a website aligned with their productivity goals when they navigate to a distracting site (e.g. from Reddit to Trello), and numerous tools implement similar functionality. Such apps effectively automate ‘implementation intentions’ (if-then rules that link a context to a desired response (Gollwitzer and Sheeran 2006)), an intervention which digital behaviour change researchers have highlighted as a promising way to scaffold the transfer of conscious System 2 goals to automatic System 1 habits (Pinder et al. 2018; Stawarz, Cox, and Blandford 2015).
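To make the mechanism concrete, the core of such a redirection tool can be sketched as a small rule table mapping distracting contexts (the ‘if’) to goal-aligned responses (the ‘then’). This is a minimal illustration in Python; the domain names and redirect targets are our own assumptions, not Timewarp’s actual rules.

```python
# Hypothetical sketch of an automated implementation intention:
# each rule links a distracting host (the 'if') to a goal-aligned
# URL (the 'then'). Domains below are illustrative assumptions.
from urllib.parse import urlparse

REDIRECT_RULES = {
    "reddit.com": "https://trello.com",            # if Reddit, then Trello
    "twitter.com": "https://calendar.google.com",  # if Twitter, then calendar
}

def resolve(url: str) -> str:
    """Return a goal-aligned URL if the requested host matches a rule."""
    host = urlparse(url).netloc.removeprefix("www.")
    return REDIRECT_RULES.get(host, url)
```

In a real browser extension the same lookup would run inside a navigation event handler, but the if-then structure is the essential part.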

Friction to override past preference: A significant number of tools not only allow the user to restrict access to digital distractions, but also add a second layer of commitment, e.g. by making blocking difficult to override, as in the browser extension Focusly (Trevorscandalios 2018), which requires a laborious combination of keystrokes to be turned off. This raises important design and ethical questions about how far a digital tool should go to hold users accountable for their past preferences (cf. Bryan, Karlan, and Nelson 2010; Lyngs et al. 2018).
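The logic of such override friction is simple to state: lifting the block must cost more effort than giving in usually feels worth. A minimal sketch, assuming a typed confirmation phrase rather than Focusly’s actual keystroke mechanism:

```python
# Illustrative sketch of a second layer of commitment: disabling the
# blocker requires reproducing a deliberately laborious confirmation
# string, so the present self must work to defeat the past self's
# commitment. The phrase itself is an assumption for illustration.
OVERRIDE_PHRASE = "I am choosing to abandon my stated goal"

def allow_override(typed: str) -> bool:
    """Only an exact, effortful re-typing of the phrase lifts the block."""
    return typed.strip() == OVERRIDE_PHRASE
```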

4.2 Gaps identified by the dual systems model

Applying our model, we also identified three cognitive mechanisms that appear underexplored by current digital self-control tools. We argue that focusing on these mechanisms could lead to powerful new designs for digital self-control:

Scaffolding habits

As in DBCIs more generally (cf. Pinder et al. 2018; Stawarz, Cox, and Blandford 2015), the least frequently targeted cognitive component relates to the scaffolding of new, desirable unconscious habits (as opposed to preventing undesired ones from being triggered, via blocking or feature removal). Habit formation is crucial for long-term behaviour change, and in the context of DBCIs, Pinder et al. suggested implementation intentions and automation of self-control as good candidate strategies for targeting habits. Some such design interventions are already being explored among current digital self-control tools: apart from the tools mentioned above that redirect activity, we highlight that four tools allow blocking functionality to be linked to the user’s location. We expect this to be a powerful way of automatically triggering a target behaviour in a desired context.
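Location-linked blocking amounts to an automated implementation intention of the form “if I arrive at the library, then block social media”. A hedged sketch of the trigger logic; the coordinates, radius, and distance approximation are our own illustrative assumptions, not taken from any reviewed tool:

```python
# Sketch of a location-triggered implementation intention:
# "if at the target location, then enable blocking". The target
# coordinates and radius below are made-up values for illustration.
import math

LIBRARY = (51.754, -1.254)   # hypothetical target location (lat, lon)
RADIUS_M = 150               # trigger radius in metres

def within_trigger_zone(lat: float, lon: float) -> bool:
    """Crude equirectangular distance check against the target zone."""
    dlat = math.radians(lat - LIBRARY[0])
    dlon = math.radians(lon - LIBRARY[1]) * math.cos(math.radians(LIBRARY[0]))
    dist_m = 6_371_000 * math.hypot(dlat, dlon)
    return dist_m <= RADIUS_M

def blocking_enabled(lat: float, lon: float) -> bool:
    # The conscious System 2 goal fires automatically in context,
    # without requiring an in-the-moment decision.
    return within_trigger_zone(lat, lon)
```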

Delay

The delay component of the expected value of control is also less commonly targeted: the share of tools with functionality targeting delay drops to 4% if we exclude the display of a timer (which raises time awareness rather than affecting actual delays). This is surprising from a theoretical perspective, because the behavioural effects of sensitivity to delay are strong, reliable, and, at least to behavioural economists, at the core of self-control difficulties (Ariely and Wertenbroch 2002; Dolan et al. 2012). Even if rewards introduced by gamification features may have the side effect of reducing the delay before self-control is rewarded, it remains surprising that only two of the 367 reviewed tools directly used delays to scaffold successful self-control (Space (Boundless Mind Inc 2018) increases launch times for distracting apps on iOS; Pipe Clogger (Croshan 2018) does the same for websites). As previous research has found people to be especially sensitive to delays in online contexts (Krishnan and Sitaraman 2013), we expect interventions that leverage delays to scaffold self-control in digital environments to be highly effective.
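The intervention logic here is simply to make the distracting option slower to reach. A minimal sketch of one plausible schedule; the linear growth and the cap are our assumptions, not the actual parameters of Space or Pipe Clogger:

```python
# Illustrative sketch of delay-based friction: each successive launch
# of a distracting app within the same hour waits a little longer
# before the app opens. The growth schedule and cap are assumptions.
def launch_delay_seconds(launches_this_hour: int, base: float = 2.0) -> float:
    """Delay grows linearly with repeated launches, capped at 30 s."""
    return min(base * (1 + launches_this_hour), 30.0)
```

Exploiting users’ sensitivity to delay, even a few seconds of added launch time may be enough to interrupt an automatic System 1 impulse.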

Expectancy

The expectancy component (i.e. how likely a user thinks it is that they will be able to reach their goal through exerting self-control) was also less frequently targeted, and mainly through timers limiting the duration for which the user tries to exert self-control. Given the crucial role of self-efficacy in Bandura’s influential work on self-regulation (Bandura 1991), this may represent another important underexplored area. One interesting approach is found in Wormhole Escaper (Bennett 2018), which lets users administer words of encouragement to themselves when they manage to suppress an urge to visit a distracting website. Insofar as this is effective, it may be by boosting the user’s confidence in their ability to exert self-control.
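The three components discussed in this section can be combined in a toy formalisation of the expected value of control. The hyperbolic discount form and the parameter k are standard in the delay-discounting literature, but their use here is our own illustrative assumption, not the reviewed model’s exact equation:

```python
# Toy formalisation of the expected value of control: the value of
# exerting self-control rises with the size of the reward and with
# expectancy (perceived likelihood of success), and falls with the
# delay before the reward arrives. Hyperbolic discounting and the
# value of k are assumptions made for illustration.
def expected_value_of_control(reward: float, expectancy: float,
                              delay: float, k: float = 0.5) -> float:
    """EVC = expectancy * reward / (1 + k * delay)."""
    return expectancy * reward / (1 + k * delay)
```

On this reading, blocking tools raise the relative delay of distractions, gamification raises reward, and encouragement features such as Wormhole Escaper raise expectancy.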

4.3 Using the model directly to guide intervention research and design

The abstracted nature of the model means it can be used at different levels of analysis, both to inspire new avenues for research and to drive specific designs:

For researchers, the model may be used to organise existing work on design interventions by the cognitive components they target, and to provide a roadmap for future studies focusing on different components of the self-regulatory system. While many other theories and frameworks are on offer for this purpose, one advantage of the dual systems model is that it provides HCI researchers with clear connections to wider psychological research on the basic mechanisms of self-regulation, which can be drawn on in design.

As such, the model may be used as a starting point for design considerations that are aligned with the cognitive mechanisms involved in self-regulation; its components can be readily expanded when more theoretical detail and specific predictions are needed. For example, the ‘reward’ component readily expands into more specific models explaining the types of stimuli that may be processed as rewards; how the timing of rewards affects their influence; how the impact of gains differs from that of losses; and so on (Berridge and Kringelbach 2015; Caraco, Martindale, and Whittam 1980; Schüll 2012; Kahneman and Tversky 1979).
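One such expansion of the ‘reward’ component is the Kahneman-Tversky value function, in which losses loom larger than equivalent gains. A hedged sketch; the parameter values are the commonly cited estimates from prospect theory, used here purely for illustration:

```python
# Sketch of the prospect-theoretic value function: concave for gains,
# convex and steeper for losses (loss aversion). Parameters alpha and
# lam are commonly cited estimates, used only as illustrative defaults.
def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain/loss of objective magnitude x."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)
```

A design reading of this function is that framing screen time as a loss (time taken from one’s goals) should motivate more strongly than framing abstinence as a gain, which is the logic behind TimeAware discussed below.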

Two recent examples in HCI research illustrate this use of psychological theory in the design process: in designing and developing TimeAware, Kim et al. were guided by work on differential sensitivity to gains versus losses, and found that their visualisation tool supported productivity more effectively when displaying time spent on distracting rather than productive activities (Kim et al. 2016). Similarly, based on dual systems theory, Adams et al. (2015) trialled ways of applying visual and auditory perception biases in design interventions to influence food choice and voice pitch. We hope our model may inspire designs that are similarly informed by psychological theory.

4.4 Limitations and future work

Our review has some limitations. First, due to space restrictions, and because information about numbers of users is not available on the Apple App Store, our analysis has focused on tool functionality, leaving consideration of install numbers and the content of user reviews to future work. We note that this is similar to the approach taken by other reviews in related areas (Stawarz, Cox, and Blandford 2015, 2014).

Second, the integrative dual systems model we have applied points to directions for future research, but its high-level formulation leaves its cognitive design space under-specified. To what extent the details of specific interventions can be anchored directly in causal theories is a point of longstanding debate (Michie et al. 2008; Hardeman et al. 2005; Ajzen 1991). A main benefit of dual systems theory, however, is that while concise, it remains directly grounded in well-established basic research on self-regulation. As mentioned above, this means that each component of the model has a substantial literature behind it, so that more detailed specifications and predictions can be found in lower-level theories on demand.

Turning to the future, self-control in relation to digital device use involves unique challenges and opportunities compared to general behaviour change research. On the one hand, portable, powerful, internet-connected devices present an unprecedented self-regulation challenge: never before have so many behavioural options, information about nearly everything, engaging games, and communication with friends, family, and strangers been instantly available. On the other hand, this very challenge presents a unique research opportunity. Precisely because digital devices afford so much functionality, they allow us to test design interventions with greater precision, greater flexibility, and dramatically lower cost than changing the physical environment. Moreover, context detection, a constant challenge in DBCI research for administering meaningful and well-timed interventions (Pinder et al. 2018), is more manageable in relation to device use, because a large amount of relevant activity can be easily measured.

Research on digital self-control tools should therefore be of wide interest as a test bed for interventions that optimise self-control in an environment where most factors can be changed at minimal cost.

References

Seekrtech. 2018. “Forest: Stay focused.” https://www.forestapp.cc.

Gollwitzer, Peter M., and Paschal Sheeran. 2006. “Implementation Intentions and Goal Achievement: A Meta-analysis of Effects and Processes.” Advances in Experimental Social Psychology 38: 69–119. https://doi.org/10.1016/S0065-2601(06)38002-1.

Pinder, Charlie, Jo Vermeulen, Benjamin R. Cowan, and Russell Beale. 2018. “Digital Behaviour Change Interventions to Break and Form Habits.” ACM Transactions on Computer-Human Interaction 25 (3): 1–66. https://doi.org/10.1145/3196830.

Stawarz, Katarzyna, Anna L Cox, and Ann Blandford. 2015. “Beyond Self-Tracking and Reminders: Designing Smartphone Apps That Support Habit Formation.” In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 2653–62. CHI ’15. New York, NY, USA: ACM. https://doi.org/10.1145/2702123.2702230.

Bryan, Gharad, Dean S. Karlan, and Scott Nelson. 2010. “Commitment Devices.” Annual Review of Economics 2: 671–98. https://doi.org/10.1146/annurev.economics.102308.124324.

Lyngs, Ulrik, Reuben Binns, Max Van Kleek, and Nigel Shadbolt. 2018. “So, Tell Me What Users Want, What They Really, Really Want!” In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, alt04:1–alt04:10. CHI EA ’18. New York, NY, USA: ACM. https://doi.org/10.1145/3170427.3188397.

Ariely, Dan, and Klaus Wertenbroch. 2002. “Procrastination, deadlines, and performance: self-control by precommitment.” Psychological Science 13 (3): 219–24. https://doi.org/10.1111/1467-9280.00441.

Dolan, Paul, Antony Elliott, Robert Metcalfe, and Ivo Vlaev. 2012. “Influencing Financial Behavior: From Changing Minds to Changing Contexts.” Journal of Behavioral Finance 13 (2): 126–42. https://doi.org/10.1080/15427560.2012.680995.

Boundless Mind Inc. 2018. “Space - You Need a Breather.” https://itunes.apple.com/gb/app/space-you-need-a-breather/id1187106675?mt=8&ign-mpt=uo%3D4.

Krishnan, S. Shunmuga, and Ramesh K. Sitaraman. 2013. “Video stream quality impacts viewer behavior: Inferring causality using quasi-experimental designs.” IEEE/ACM Transactions on Networking 21 (6): 2001–14. https://doi.org/10.1109/TNET.2013.2281542.

Bandura, Albert. 1991. “Social Cognitive Theory of Self-Regulation.” Organizational Behavior and Human Decision Processes 50 (2): 248–87. https://doi.org/10.1016/0749-5978(91)90022-L.

Berridge, Kent C., and Morten L. Kringelbach. 2015. “Pleasure Systems in the Brain.” Neuron 86 (3). Elsevier Inc.: 646–64. https://doi.org/10.1016/j.neuron.2015.02.018.

Caraco, Thomas, Steven Martindale, and Thomas S. Whittam. 1980. “An empirical demonstration of risk-sensitive foraging preferences.” Animal Behaviour 28 (3): 820–30. https://doi.org/10.1016/S0003-3472(80)80142-4.

Schüll, Natasha Dow. 2012. Addiction By Design - Machine Gambling in Las Vegas. New Jersey: Princeton University Press.

Kahneman, Daniel, and Amos Tversky. 1979. “Prospect Theory: An Analysis of Decision Under Risk.” Econometrica 47 (2): 263–92.

Kim, Young-Ho, Jae Ho Jeon, Eun Kyoung Choe, Bongshin Lee, KwonHyun Kim, and Jinwook Seo. 2016. “TimeAware: Leveraging Framing Effects to Enhance Personal Productivity.” In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 272–83. CHI ’16. New York, NY, USA: ACM. https://doi.org/10.1145/2858036.2858428.

Adams, Alexander T., Jean Costa, Malte F. Jung, and Tanzeem Choudhury. 2015. “Mindless Computing: Designing Technologies to Subtly Influence Behavior.” In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 719–30. ACM. https://doi.org/10.1145/2750858.2805843.

Stawarz, Katarzyna, Anna L Cox, and Ann Blandford. 2014. “Don’t Forget Your Pill!: Designing Effective Medication Reminder Apps That Support Users’ Daily Routines.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2269–78. CHI ’14. New York, NY, USA: ACM. https://doi.org/10.1145/2556288.2557079.

Michie, Susan, Marie Johnston, Jill Francis, Wendy Hardeman, and Martin Eccles. 2008. “From Theory to Intervention: Mapping Theoretically Derived Behavioural Determinants to Behaviour Change Techniques.” Applied Psychology 57 (4): 660–80. https://doi.org/10.1111/j.1464-0597.2008.00341.x.

Hardeman, Wendy, Stephen Sutton, Simon Griffin, Marie Johnston, Anthony White, Nicholas J. Wareham, and Ann Louise Kinmonth. 2005. “A causal modelling approach to the development of theory-based behaviour change programmes for trial evaluation.” Health Education Research 20 (6): 676–87. https://doi.org/10.1093/her/cyh022.

Ajzen, Icek. 1991. “The theory of planned behavior.” Organizational Behavior and Human Decision Processes 50 (2): 179–211. https://doi.org/10.1016/0749-5978(91)90020-T.