Sensor-based data collection of human behaviour (digital phenotyping) enables real-time monitoring of behavioural and physiological markers. This emerging approach offers immense potential to transform mental health research and care by identifying early signs of symptom exacerbation, supporting personalised interventions, and enhancing our understanding of daily lived experiences. However, despite its promise, technical and user-experience challenges limit its effectiveness. This Perspective critically examines these challenges and proposes standardisation strategies: the development of universal frameworks and protocols, adoption of open-source APIs, enhanced cross-platform interoperability, and greater collaboration between academic researchers and industry stakeholders. We also highlight the need for culturally sensitive and user-centred designs to improve equity and engagement. By addressing these gaps, standardisation can enhance data reliability, promote scalability and maximise the potential of digital phenotyping in clinical and research mental health settings.
Alam et al. outline key technical and ethical challenges in sensor-based digital phenotyping for mental health. They propose standardisation strategies to enhance data reliability, scalability and global applicability.
Introduction
Digital phenotyping (DP) refers to the moment-by-moment quantification of the individual-level human phenotype using data from personal digital devices such as smartphones and wearables1. It involves the collection and analysis of behavioural and physiological data to generate insights into an individual’s mental and physical states in real time1–3. DP has gained significant interest for use in mental health care2,4–6. By leveraging wearable devices and smartphones, DP offers real-time insights into individuals’ health, enabling the detection of subtle changes in mental and physical states that were previously difficult to detect7–9. This technique shows high sensitivity in detecting early signs of mental illness6 and can help predict relapses from smartphone data days before they become clinically apparent10–14. Recent work has even suggested that DP ‘could support gold-standard assessment and…predict symptom exacerbations’6. This is particularly promising in mental health care, as early intervention can dramatically improve outcomes14,15 for conditions such as depression16–18, anxiety17,19–23, and serious mental illnesses such as psychotic disorders7,8,16–20.
Despite its potential, DP faces critical technical challenges and usability barriers that undermine its reliability and scalability24,25. These challenges are compounded by the absence of standardised methodologies, which results in variability across platforms and studies, limiting the reproducibility and generalisability of findings. In this Perspective, we outline these challenges and propose strategies for developing universal frameworks and protocols to enable more reliable, scalable and impactful applications of DP.
Addressing technical challenges
Battery life and power consumption
One of the primary technical challenges of sensor-based data collection is battery life25. Wearable devices and smartphones rely on continuous power to collect and transmit data25, and sources such as GPS tracking26–28, accelerometers29, and continuous heart rate monitoring30–32 consume significant energy25. At a refresh rate of 1 Hz (one update per second), smartphones experience rapid battery drainage, with Samsung devices lasting approximately six hours and iPhones lasting about 5.5 h28. Location services such as GPS tracking consume approximately 13% of battery life when operating with a strong signal, but in areas with weak signal strength, battery consumption can increase substantially, reaching up to 38%27,33. Using accelerometer-based continuous sensing apps (CSAs) such as Google Fit34 increases battery consumption, particularly during high-mobility activities such as jogging or exercise29. Mobility activities such as walking or running can increase battery consumption by up to 3–4 times29,35. Day-long experiments demonstrated that smartphones running CSAs experienced battery consumption up to three times higher than those without such apps, particularly during physical activity29. Significant battery drainage is also seen when using photoplethysmography in digital devices36 for heart rate monitoring30–32. Continuous heart rate monitoring requires high energy due to frequent processing and wireless data transmission to remote servers31. This limits smartphone use in real-world scenarios to approximately 9 h on average, which inconveniences users as they must recharge their devices during the day31. Similarly, wearables experience significant battery drainage during heart rate monitoring due to data transmission requirements30,32. This limits their utility in long-term studies or real-time monitoring scenarios, as frequent recharging can disrupt data collection and affect user compliance24,25.
One approach to improve energy efficiency is adaptive sampling, which dynamically adjusts the frequency of sensor data collection based on user activity37. This reduces unnecessary power consumption by lowering the sampling rate when the user is stationary and increasing it only during movement38,39. Another strategy is sensor duty cycling, which alternates between low-power sensors, such as accelerometers, and high-power sensors, such as GPS and heart rate monitors40. By activating power-intensive sensors only when necessary, duty cycling conserves battery life without compromising data quality41. Additionally, the development of low-power wearable devices, leveraging energy-efficient chipsets42, Bluetooth Low Energy (BLE)43 and hardware-based power management algorithms, enables prolonged monitoring while reducing the frequency of recharging42,44. These innovations can allow DP applications to minimise battery drain, enhance usability and improve participant compliance in long-term studies.
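To make these two strategies concrete, the following minimal sketch (in Kotlin, with a hypothetical SensorController interface standing in for a platform sensor API) combines adaptive sampling with duty cycling: a low-power accelerometer stream drives an activity flag, and power-hungry sensors are switched on only while the user is moving. Thresholds and rates are illustrative, not recommendations.

```kotlin
import kotlin.math.abs

// Hypothetical sensor controller; a real app would wrap Android's SensorManager
// or a wearable SDK behind an interface like this.
interface SensorController {
    fun setAccelerometerRateHz(rateHz: Int)
    fun setGpsEnabled(enabled: Boolean)
    fun setHeartRateEnabled(enabled: Boolean)
}

// Adaptive sampling with simple duty cycling: the low-power accelerometer runs
// continuously, while GPS and heart rate are activated only during movement.
class AdaptiveSampler(
    private val sensors: SensorController,
    private val movementThreshold: Float = 0.5f // m/s^2 above gravity baseline; illustrative value
) {
    private var moving = false

    // Called with each gravity-compensated accelerometer magnitude sample.
    fun onAccelerationMagnitude(magnitude: Float) {
        val nowMoving = abs(magnitude) > movementThreshold
        if (nowMoving != moving) {
            moving = nowMoving
            applyPolicy()
        }
    }

    private fun applyPolicy() {
        if (moving) {
            sensors.setAccelerometerRateHz(50) // dense sampling while active
            sensors.setGpsEnabled(true)        // power-hungry sensors only when needed
            sensors.setHeartRateEnabled(true)
        } else {
            sensors.setAccelerometerRateHz(5)  // sparse sampling while stationary
            sensors.setGpsEnabled(false)
            sensors.setHeartRateEnabled(false)
        }
    }
}
```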
Furthermore, researchers may strategically prioritise the choice and use of sensors based on study aims and resource constraints. For instance, short-term studies that focus on movement may prioritise inertial measurement unit (IMU) sensors, while long-term studies assessing autonomic function may rely on intermittent heart rate variability (HRV) sampling.
Device selection is another critical consideration. For example, the Polar H10 chest strap is known for accurate HRV data collection with excellent battery life (up to 400 h)45,46, while the ActiGraph GT9X offers reliable IMU data with long-term battery support suitable for week-long recordings47. Wrist-worn devices such as the Fitbit Charge 5 balance heart rate monitoring with moderate battery life (approximately 7 days) but may offer lower data granularity. Selecting devices with built-in power-saving modes or configurable sampling rates can optimise both data quality and battery performance48,49.
By combining hardware-efficient design, adaptive sampling and intentional feature prioritisation, digital phenotyping studies can maintain a balance between data richness and battery feasibility. These decisions should be guided by the specific use case, data fidelity requirements and the anticipated level of participant engagement.
Device compatibility and app development
The heterogeneity of devices and operating systems presents another technical hurdle50. Smartphones and wearables come from various manufacturers, each with unique hardware configurations and software ecosystems25,50, leading to inconsistencies in data collection and integration, as certain devices may not support specific sensors or data formats25,50. For example, some data collection applications for DP only work on iOS51–53 or Android54, which excludes many participants and data from studies.
Beyond hardware and software differences, the choice between cross-platform and native app development further influences data collection reliability55. Cross-platform development allows applications to run on multiple operating systems using a single codebase, leveraging frameworks such as React Native, Flutter, or Xamarin55. While this approach improves accessibility and reduces development time, it often comes at the cost of performance and customisation. In contrast, native development involves building applications specifically for a single platform or operating system (e.g., Swift for iOS or Kotlin for Android), allowing deeper integration with system-level features and optimised performance55,56. Given that DP applications rely heavily on sensor-based data collection and real-time processing57, cross-platform solutions may not be the most suitable approach. Native development provides greater control over data handling, seamless integration with platform-specific health APIs, and optimised input/output (I/O) operations, making it a more reliable choice for applications requiring continuous data monitoring and precise hardware interaction56.
Recent advances in Generative AI (GenAI), particularly large language models (LLMs) and diffusion-based architectures, offer new opportunities for enhancing DP. GenAI can support the automated synthesis and contextual understanding of unstructured behavioural data such as speech, social media, journaling and passive text inputs58–60. For example, LLMs such as Generative Pretrained Transformers (GPT) and Bidirectional Encoder Representations from Transformers (BERT) variants have been shown to detect depressive or anxious language patterns with high sensitivity61,62. In clinical research, fine-tuned GenAI models can assist in generating individualised behavioural baselines, summarising daily mood reports, or simulating realistic synthetic data for rare psychiatric presentations63–65. Additionally, generative models can support just-in-time adaptive interventions (JITAIs) by tailoring mental health content or therapeutic prompts based on real-time sensor input and user preferences66. In DP applications for low-resource settings, GenAI can improve accessibility through natural language generation in regional languages and simplification of app interfaces for low-literacy users67. Careful benchmarking, human oversight and ethical safeguards are necessary for its responsible deployment in DP.
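As an illustration of the kind of integration described above, the sketch below sends a de-identified journal entry to a generative text service and asks for a neutral two-sentence summary. The endpoint URL, payload fields and prompt are entirely hypothetical; any real deployment would depend on the chosen provider's API, a de-identification pipeline and human oversight of outputs.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Hypothetical LLM service endpoint and request format, used only for illustration.
private const val LLM_ENDPOINT = "https://example.org/api/v1/generate" // placeholder URL

fun summariseJournalEntry(entryText: String): String {
    val prompt = """
        Summarise the mood and notable behavioural patterns in this journal entry
        in two neutral sentences. Do not give clinical advice.

        Entry: $entryText
    """.trimIndent()

    // Minimal JSON body assembled by hand; a real app would use a JSON library.
    val body = """{"prompt": ${jsonString(prompt)}, "max_tokens": 120}"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create(LLM_ENDPOINT))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    return response.body() // raw response; parsing depends on the provider's schema
}

// Escapes a string for embedding in a JSON document.
private fun jsonString(s: String): String =
    "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n") + "\""
```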
Interoperability is a critical area for innovation. The development of open-source frameworks and standardised APIs can facilitate seamless integration of data across various devices and platforms, fostering collaborative research and scalability59,68. Additionally, AI-powered natural language processing (NLP) and sentiment analysis can unlock new dimensions of behavioural insights by analysing voice and text data69,70. These advancements can significantly enhance the ability to detect subtle changes in mental health status.
To ensure inclusivity, technological development must prioritise accessibility. This involves designing energy-efficient devices with lower costs and user-friendly interfaces, making DP feasible for diverse populations, including those in resource-constrained settings.
Cross-platform interoperability is crucial for integrating data from various devices and applications. Currently, many wearables and apps operate within proprietary ecosystems, limiting their ability to share data seamlessly25,71,72. The use of Application Programming Interfaces (APIs) and Software Development Kits (SDKs) offers a practical solution73. APIs allow different software applications to communicate with one another, while SDKs enable developers to create compatible tools and features for existing platforms59,68,73. For example, Apple HealthKit and Google Fit provide APIs that facilitate data integration from multiple sources, but broader adoption and further refinement are needed to ensure comprehensive interoperability73. However, caution is warranted when using data extracted from such APIs and SDKs. These data are often pre-processed by the platform providers, and changes in preprocessing algorithms over time can lead to discrepancies even in historical data74. For instance, identical data exported at different time points can yield different outputs due to back-end updates, as reflected in metadata or timestamp inconsistencies75. This highlights that data from such platforms are not truly raw and should be interpreted with transparency regarding preprocessing pipelines and limitations74. Recent consensus guidelines have emphasised the importance of understanding data provenance and reproducibility in sensor-derived health data75,76. Developers can also leverage cross-platform frameworks such as React Native77,78, which allows the use of JavaScript to build apps for both iOS and Android while maintaining high performance through integration with native components78. Similarly, Flutter79, a toolkit developed by Google, enables developers to create applications for both operating systems, supporting consistent functionality and user experience across platforms80.
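Because platform-exported data are not raw, one pragmatic safeguard is to record provenance alongside every exported value so that later discrepancies between exports remain interpretable. The sketch below shows a hypothetical record structure for doing so; the field names are illustrative and are not drawn from any vendor's API.

```kotlin
import java.time.Instant

// Hypothetical provenance wrapper for a value exported from a platform API
// (e.g., a daily step count pulled from a vendor SDK). Field names are illustrative.
data class ProvenancedSample(
    val metric: String,            // e.g., "step_count"
    val value: Double,
    val measuredAt: Instant,       // when the underlying measurement was taken
    val exportedAt: Instant,       // when this copy was pulled from the API
    val sourcePlatform: String,    // e.g., "vendor-health-api"
    val sdkVersion: String,        // SDK/app version at export time
    val processingNote: String     // e.g., "vendor-aggregated; raw signal not available"
)

fun main() {
    // Recording export context makes back-end preprocessing changes traceable.
    val sample = ProvenancedSample(
        metric = "step_count",
        value = 8421.0,
        measuredAt = Instant.parse("2024-01-15T00:00:00Z"),
        exportedAt = Instant.now(),
        sourcePlatform = "vendor-health-api",
        sdkVersion = "3.2.1",
        processingNote = "vendor-aggregated daily total; preprocessing may change across backend updates"
    )
    println(sample)
}
```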
Collaboration between industry and academia is also vital for promoting interoperability. Industry stakeholders, including device manufacturers and app developers, must align their technologies with agreed-upon standards59,73. Academic researchers can provide insights into the practical challenges of implementing these standards.
Inconsistent data transmission, storage and security protocols
The lack of robust data transmission and storage solutions presents critical challenges as well25,57,81,82. DP often involves the real-time transmission of substantial data volumes, which can overwhelm existing network infrastructures57. For instance, studies have shown that high-frequency data collection from wearable devices, such as electrodermal activity or heart rate monitors, can generate datasets exceeding gigabytes daily, especially when combined with continuous geolocation tracking and other sensor data29,32,83. This high volume of data often exceeds the capabilities of low-bandwidth networks, resulting in signal loss and transmission failures84.
Inadequate storage systems, particularly those lacking scalability, struggle to handle the exponential growth of health-related data81. This challenge is compounded by insufficient encryption protocols, which leave sensitive health information vulnerable to breaches81,82. A case study in India on the use of mobile health apps found that insecure storage mechanisms led to unauthorised access to patient data, raising significant concerns about privacy and confidentiality85. Similarly, research on mental health monitoring systems noted that improper encryption during data transmission exposed sensitive behavioural and physiological data to interception, undermining both participant trust and the overall reliability of the research86,87.
To address these challenges, Kotlin Multiplatform (KMP), developed by JetBrains, offers a reliable solution for optimising data transmission88 and integration across multiple platforms89. KMP enables seamless data processing by allowing shared business logic while maintaining native performance on both iOS and Android90. Unlike traditional cross-platform frameworks, which may struggle with customising health data libraries, KMP supports efficient data handling and security by enabling encrypted data transmission instead of raw data transfer89,90. Additionally, by reducing redundant code and improving database structures across platforms, KMP enhances scalability and interoperability90, making it a suitable choice for DP applications that require secure and high-performance cross-platform compatibility. Labfront, an alternative research-grade platform, offers a low-code environment designed specifically for collecting and analysing physiological and behavioural data from wearable sensors91. Key features include customisable survey tools, secure cloud-based data storage, real-time monitoring of participant involvement, and built-in analytics capabilities91,92. Its intuitive design and minimal programming requirements make it particularly suitable for research teams with limited technical resources, enabling efficient data management across diverse study designs91. Choosing the appropriate solution depends on several factors, such as device compatibility, sensor types and data granularity, security needs, network availability and budget.
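As a rough illustration of the shared-logic pattern described above, the following sketch shows what the common (shared) portion of a KMP module might look like, with platform-specific encryption and networking supplied per target through Kotlin's expect/actual mechanism. Names and structure are illustrative; the corresponding actual implementations would live in the Android and iOS source sets of a multiplatform project.

```kotlin
// commonMain: shared business logic for packaging and uploading sensor batches.
// Platform-specific crypto and networking are supplied per target via expect/actual.

data class SensorBatch(val participantId: String, val payload: ByteArray)

// Each platform (androidMain / iosMain) provides its own implementation,
// e.g., Keystore-backed AES on Android and CryptoKit on iOS.
expect fun encrypt(data: ByteArray): ByteArray
expect suspend fun upload(encrypted: ByteArray, endpoint: String): Boolean

// Shared logic written once: encrypt before the data leave the device, then upload.
suspend fun submitBatch(batch: SensorBatch, endpoint: String): Boolean {
    val encrypted = encrypt(batch.payload)   // never transmit raw sensor data
    return upload(encrypted, endpoint)
}
```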
Unstable network connectivity remains another major obstacle84, particularly in low- and middle-income country (LMIC) settings where DP applications may operate in environments with intermittent internet access. Offline data storage with periodic uploads can mitigate this issue by allowing sensor data to be stored locally on the device and transmitted to cloud servers only when a stable connection is available40. This approach reduces the likelihood of data loss and synchronisation issues40. Data compression techniques further enhance network reliability by reducing the size of transmitted data, conserving bandwidth and ensuring faster uploads93,94. Similarly, edge computing, the processing of data close to where it is generated rather than on distant cloud servers, enables real-time data processing directly on the device, reducing reliance on cloud-based computation93,95. By analysing and filtering data locally before transmission, edge computing minimises network dependency, enhances processing efficiency and strengthens privacy protections95.
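A minimal store-and-forward sketch of offline storage with periodic, compressed uploads is shown below. The connectivity check and server upload are placeholders for platform-specific code, and the file-based queue is deliberately simplified.

```kotlin
import java.io.ByteArrayOutputStream
import java.io.File
import java.util.zip.GZIPOutputStream

// Store-and-forward sketch: sensor records are compressed and written to local
// files, and uploads are attempted only when connectivity is reported.
// `isConnected` and `uploadToServer` stand in for platform-specific code.
class OfflineQueue(private val queueDir: File) {

    init { queueDir.mkdirs() }

    // Compress a batch of readings and persist it locally.
    fun enqueue(batchId: String, records: List<String>) {
        val compressed = gzip(records.joinToString("\n").toByteArray())
        File(queueDir, "$batchId.gz").writeBytes(compressed)
    }

    // Called periodically (e.g., from a background job) to flush pending batches.
    fun flush(isConnected: () -> Boolean, uploadToServer: (ByteArray) -> Boolean) {
        if (!isConnected()) return
        queueDir.listFiles { f -> f.extension == "gz" }?.forEach { file ->
            if (uploadToServer(file.readBytes())) {
                file.delete() // remove only after the server confirms receipt
            }
        }
    }

    private fun gzip(data: ByteArray): ByteArray {
        val out = ByteArrayOutputStream()
        GZIPOutputStream(out).use { it.write(data) }
        return out.toByteArray()
    }
}
```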
Integrating energy-efficient sensing and network-independent data transmission methods is crucial for improving the sustainability and scalability of DP. Future research should continue refining these approaches to ensure that sensor-based monitoring remains reliable, particularly in diverse and resource-constrained settings.
Addressing user-centred challenges
User engagement and participation
Engaging individuals in DP initiatives requires thoughtful design that aligns with users’ values, preferences, and lived experiences.
Maintaining user participation and engagement with wearables and apps is a persistent challenge. Many individuals choose to disengage not only because of discomfort, forgetfulness or lack of perceived benefit96, but also because the technology may offer little actual value to them, or it may present physical, cognitive or contextual barriers97,98. Importantly, these decisions are not mere lapses in behaviour, but informed acts of disengagement shaped by unmet needs or unaddressed concerns99. For example, studies have reported that participants discontinued the use of digital tools due to sensory discomfort, lack of clarity on the utility of the data, and difficulties navigating interfaces that were not adapted to their cognitive or literacy level100. Similarly, barriers such as low digital literacy, mental health symptoms (e.g., fatigue, paranoia) and physical limitations can significantly impact a person’s ability to engage meaningfully with DP tools101–103.
Technical interruptions, such as forgetting to charge or wear the device82,96,104, are often secondary to more complex issues, such as poorly designed interfaces, lack of support or training, and limited adaptability to users’ routines and needs105. This introduces challenges in the continuity of data collection and can compromise the reliability of insights derived from the data25,106.
To address these challenges, future DP efforts must prioritise participatory design, where users are involved throughout the development process to ensure accessibility, relevance and inclusivity. Designs must accommodate varying cognitive abilities, language proficiency and sensory needs to support equitable engagement107.
Privacy concerns
Privacy concerns represent a significant barrier to DP technologies3,82,108,109. These methods inherently involve the collection of highly sensitive data, including behavioural, physiological and contextual information, which can provide deep insights into an individual’s health and habits108,109. However, the very richness of these data also heightens users’ fears about misuse, unauthorised access and potential breaches of confidentiality82,109. A significant driver of these concerns is the lack of transparency in data usage policies108. While some companies have made progress in simplifying their privacy documentation, many users remain unclear about how their data are collected, processed, stored or shared, leading to scepticism and distrust53,108,109. For instance, Fitbit’s privacy policy outlines core purposes for data use such as improving product functionality, personalising recommendations, enhancing cybersecurity, and fulfilling legal obligations (e.g., responding to subpoenas or law enforcement requests)110.
Ensuring robust data security is a significant issue as well108. Cybersecurity threats, including data breaches and hacking, can expose sensitive information to malicious actors109. For instance, real-time data transmissions from wearables or smartphone apps are particularly vulnerable to interception if not encrypted appropriately82,108,109. Furthermore, once collected, storing large datasets securely remains a challenge, especially for smaller organisations or research groups with limited resources57,82. These challenges highlight the urgent need for stronger regulatory frameworks, user-centric privacy safeguards and transparent data governance to build trust and ensure the ethical implementation of DP.
Building trust is critical for addressing privacy concerns in DP104,109. Developers of DP tools must implement stringent privacy safeguards, including end-to-end encryption, secure data storage protocols and multi-factor authentication to protect user data109. Further efforts must be made to develop decentralised machine learning techniques that protect users’ data. Transparent and user-friendly consent processes are equally vital, empowering users to make informed decisions about their participation108. Regular communication about data usage, anonymisation efforts and security measures can further reassure users about their data’s safety81. Ultimately, addressing privacy concerns is not just about compliance with legal standards; it is also about respecting and protecting the rights and autonomy of those who entrust their data to DP technologies108,109. Building trust through stringent privacy safeguards, clear consent protocols, transparent AI and anonymisation techniques is essential.
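For illustration, the sketch below encrypts a sensor payload with AES-GCM using standard JVM crypto APIs before it leaves the device. It is a minimal example only: in practice the key would be generated and held in a hardware-backed keystore and managed according to the study's security protocol.

```kotlin
import java.security.SecureRandom
import javax.crypto.Cipher
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec

// Illustrative AES-GCM encryption of a sensor payload before transmission.
object PayloadCrypto {
    private const val IV_BYTES = 12
    private const val TAG_BITS = 128

    fun generateKey(): SecretKey =
        KeyGenerator.getInstance("AES").apply { init(256) }.generateKey()

    // Returns the random IV prepended to the ciphertext so the receiver can decrypt.
    fun encrypt(plaintext: ByteArray, key: SecretKey): ByteArray {
        val iv = ByteArray(IV_BYTES).also { SecureRandom().nextBytes(it) }
        val cipher = Cipher.getInstance("AES/GCM/NoPadding")
        cipher.init(Cipher.ENCRYPT_MODE, key, GCMParameterSpec(TAG_BITS, iv))
        return iv + cipher.doFinal(plaintext)
    }

    fun decrypt(blob: ByteArray, key: SecretKey): ByteArray {
        val iv = blob.copyOfRange(0, IV_BYTES)
        val cipher = Cipher.getInstance("AES/GCM/NoPadding")
        cipher.init(Cipher.DECRYPT_MODE, key, GCMParameterSpec(TAG_BITS, iv))
        return cipher.doFinal(blob.copyOfRange(IV_BYTES, blob.size))
    }
}
```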
Ethics must be a guiding principle in the development and deployment of DP technologies. The vast amounts of personal and sensitive data collected pose significant privacy risks3,109. Transparent and user-centric data governance models are crucial to building trust82,109. These models should include clear consent mechanisms, robust anonymisation protocols and stringent data encryption standards81,82. Using research-focused platforms such as Labfront bypasses commercial manufacturer servers entirely: data from compatible devices (e.g., Garmin wearables) are transmitted directly to Labfront’s secure cloud, enabling compliance with regulatory and institutional requirements such as the Health Insurance Portability and Accountability Act (HIPAA) and Institutional Review Board (IRB) protocols91,92. These design choices reduce exposure to commercial data pipelines and provide researchers with greater control over data governance and participant confidentiality.
Cultural and socioeconomic barriers
The accessibility of sensor-based technologies varies widely across cultural and socioeconomic contexts111. High costs of devices and limited digital literacy in underserved populations can restrict participation in DP initiatives4,82,111. For example, in a study conducted in the United States among participants at a federally qualified health centre, cost was ranked relatively low among the barriers to adopting wearable technologies112, but this might not be the case in other settings. Furthermore, cultural attitudes towards technology and data sharing may affect willingness to engage4,113. Tailoring interventions to specific populations and ensuring equitable access are critical for global scalability.
Cultural sensitivity is a vital ethical consideration114. Mental health is profoundly influenced by cultural beliefs and practices, necessitating the co-design of interventions with local stakeholders114,115. Engaging communities in the development and testing of tools ensures their relevance, acceptability and effectiveness104. Additionally, emphasising transparency and inclusivity throughout the design and deployment process can foster trust among end-users and stakeholders alike104,109.
Lack of standardisation and strategies
Absence of universal protocols
The lack of universal protocols for data collection, processing and analysis represents a significant challenge for DP25. This absence of standardisation hinders the scalability, reproducibility and generalisability of research findings, creating barriers to the broader adoption and implementation of these technologies25,71. Currently, DP studies exhibit wide variability in key parameters such as sampling rates, device types and quality metrics71,72. Data formats, in particular, vary across platforms and devices, complicating efforts to integrate and analyse datasets from multiple sources. This lack of consistency undermines the comparability of findings across studies, limiting opportunities for meta-analyses and cross-contextual validation25,71,72.
The variability also poses challenges for interpreting results and replicating studies. Without standardised protocols, it becomes difficult to determine whether differences in findings are due to true variations in the phenomena being studied or methodological inconsistencies25,71. Moreover, the absence of universal guidelines creates inefficiencies in data sharing and collaboration57,71. Researchers must often invest significant time and resources in reformatting and preprocessing data to make it compatible with their tools and methodologies. This inefficiency not only slows down the pace of advancement108,116,117 but also increases the risk of errors and misinterpretations71,72.
Addressing these challenges requires the establishment of standardised guidelines and frameworks for DP. These protocols should define best practices for data collection, including optimal sampling rates and acceptable device specifications, to ensure consistent data quality. They should also provide guidance on data preprocessing, feature extraction and analytic methods, enabling more reliable and comparable outcomes. Ultimately, establishing universal protocols is not merely a technical necessity but a foundational step toward building trust in DP as a reliable and scalable tool for advancing personalised health and precision medicine. By promoting consistency, reproducibility and transparency, standardised guidelines can unlock the full potential of this emerging field.
Variability in methodologies across studies and platforms
Methodological variability is a significant obstacle in the advancement of DP, particularly in sensor-based data collection25. The use of diverse applications, devices and analytical approaches across studies amplifies these challenges, limiting the generalisability and scalability of findings25,71,72. Different studies often employ distinct apps and platforms for data collection, each with its own set of capabilities, data formats and compatibility requirements71,72. This diversity can result in inconsistencies in the types and quality of data captured.
Pre-processing steps, which are critical for preparing raw data for analysis, further contribute to variability24,71,72. Different studies adopt diverse techniques for handling missing data, outlier detection and noise reduction25. For instance, some research may use imputation methods to fill gaps in data, while others may discard incomplete data altogether, potentially biasing results71,72; a simple illustration of how this choice alters a derived feature is sketched below. Variations in feature extraction approaches, such as the choice of time windows or signal processing algorithms, further complicate cross-study comparisons25,72. The challenges extend into the machine learning pipeline. Variability in model selection, training protocols and evaluation metrics can lead to divergent findings, even when analysing similar datasets71,72,94. Additionally, the choice of algorithms, ranging from traditional statistical models to complex prediction models, can introduce further inconsistencies25.
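The toy example below illustrates how one such preprocessing choice, the handling of missing samples, changes a derived feature even on identical input. The window, values and strategies are invented for illustration only.

```kotlin
// Two common strategies for a feature (mean heart rate over a window) when samples are missing.
fun meanWithListwiseDeletion(window: List<Double?>): Double =
    window.filterNotNull().average()

fun meanWithLastObservationCarriedForward(window: List<Double?>): Double {
    var last = window.firstOrNull { it != null } ?: return Double.NaN
    val filled = window.map { v -> v?.also { last = it } ?: last } // carry last value forward
    return filled.average()
}

fun main() {
    // A six-sample heart-rate window (beats/min) with two missing readings.
    val window = listOf(72.0, null, 75.0, 80.0, null, 90.0)
    println(meanWithListwiseDeletion(window))              // 79.25
    println(meanWithLastObservationCarriedForward(window)) // ≈ 78.17
}
```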
Collaborative efforts to align methodologies and share best practices are essential to overcome these challenges. Standardised protocols for data collection should prioritise interoperability across apps and devices, ensuring that data from multiple sources can be seamlessly integrated. Establishing guidelines for preprocessing steps, such as unified approaches to handling missing data and feature extraction, would help reduce variability and improve the comparability of datasets. In the machine learning domain, adopting shared evaluation frameworks and benchmarking practices would promote consistency. Researchers could benefit from using open-source platforms and repositories to share pre-trained models, annotated datasets and pipelines.
Strategies for standardisation
The advancement of DP as a reliable and scalable field requires robust strategies for standardisation across data collection, processing and analysis methodologies71. Standardisation ensures consistency, reproducibility and interoperability, ultimately enhancing the generalisability and utility of findings. This section explores three key strategies for achieving standardisation: developing universal frameworks, coordinating global implementation, and leveraging pilot projects and validation studies.
Developing universal frameworks
Establishing universal frameworks is foundational for standardising DP. Examples from other fields, such as Open mHealth and HL7 FHIR (Fast Healthcare Interoperability Resources), provide valuable blueprints118. Open mHealth offers standardised schemas for health data, allowing developers to integrate diverse datasets seamlessly118,119. HL7 FHIR, widely used in clinical informatics, standardises the exchange of electronic health records across healthcare systems, ensuring data compatibility and accessibility120. Adopting similar frameworks for DP can address the variability in data formats, sampling rates and quality metrics that currently hinder the field118.
Proposals for universal data formats, protocols and reporting standards are essential. A universal data format would specify how data from wearables, apps and other devices should be structured, annotated and stored, making it easier to integrate and analyse datasets from different sources118,120,121. Protocols should define best practices for data collection, such as recommended sampling frequencies and minimum data quality thresholds, ensuring that studies generate comparable and reliable data. Reporting standards would ensure transparency in methodologies, making it easier for researchers to replicate studies and evaluate findings118.
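As a sketch of what such a universal record format could look like, the example below defines a single sensor observation with an explicit schema version, units and sampling rate so that datasets from different devices remain comparable. The field names are hypothetical and only loosely inspired by Open mHealth-style schemas; they do not represent an existing standard.

```kotlin
import java.time.Instant

// Hypothetical universal record format for one sensor observation; fields are illustrative.
data class SensorObservation(
    val schemaVersion: String,     // version of the agreed data format
    val participantId: String,     // pseudonymous study ID
    val deviceModel: String,       // e.g., "wrist-wearable-x"
    val sensorType: String,        // e.g., "heart_rate"
    val unit: String,              // e.g., "beats/min"
    val samplingRateHz: Double,    // declared so studies can be compared
    val timestamp: Instant,        // UTC timestamp of the observation
    val value: Double
) {
    // Plain JSON rendering for transport; a real pipeline would use a schema-aware serialiser.
    fun toJson(): String = """
        {"schema_version":"$schemaVersion","participant_id":"$participantId",
         "device_model":"$deviceModel","sensor_type":"$sensorType","unit":"$unit",
         "sampling_rate_hz":$samplingRateHz,"timestamp":"$timestamp","value":$value}
    """.trimIndent()
}

fun main() {
    val obs = SensorObservation(
        schemaVersion = "0.1-draft",
        participantId = "P-0042",
        deviceModel = "wrist-wearable-x",
        sensorType = "heart_rate",
        unit = "beats/min",
        samplingRateHz = 1.0,
        timestamp = Instant.parse("2024-01-15T09:30:00Z"),
        value = 68.0
    )
    println(obs.toJson())
}
```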
Global implementation and standardisation
Expanding the impact of DP globally requires coordinated efforts in standardisation and implementation. Standardisation provides a foundation for large-scale, cross-cultural research by ensuring consistency in data collection, storage and analysis24,72. Harmonising methodologies across studies will improve reproducibility, reduce variability and foster meta-analyses that yield generalisable findings71.
Standardisation efforts should focus on creating universally accepted guidelines and protocols. For instance, adopting unified metrics for data quality, sampling rates and feature extraction can help bridge methodological gaps122. Platforms such as Open mHealth and HL7 FHIR offer promising frameworks that can be adapted for DP118,119.
Moreover, global implementation necessitates culturally adaptive solutions. Collaborations between researchers, policymakers and local communities can ensure that DP tools align with cultural norms and address linguistic and literacy barriers114,123. For example, apps designed for specific regions could incorporate local languages, cultural references and intuitive visual cues, promoting user engagement and participation. Equitable access is a pressing concern in global implementation114.
In addition to technical harmonisation and culturally adaptive solutions, collaboration between industry and academia is crucial to achieve true cross-platform standardisation. Industry stakeholders, such as device manufacturers, OS developers and app creators, must commit to aligning their products with emerging universal standards for data interoperability, transparency and security. Without such alignment, even the most robust academic standards will face limited uptake. Conversely, academia brings domain expertise, ethical oversight and implementation experience that can guide responsible technology design. Initiatives such as Open mHealth and HL7 FHIR exemplify how cross-sector cooperation can produce frameworks that are both technically sound and practically scalable71,118. Ongoing dialogue, co-creation and shared governance between sectors are essential to ensure that DP tools are interoperable, equitable and responsive to end-user needs5,67,124.
Pilot projects and validation studies
Pilot projects and validation studies play a critical role in demonstrating the feasibility and benefits of standardisation efforts125. Case studies of successful standardisation initiatives can provide actionable insights and serve as models for broader implementation126. For instance, the RADAR-CNS (Remote Assessment of Disease and Relapse–Central Nervous System) project has successfully integrated data from multiple wearable devices and apps, demonstrating the potential for standardised data collection and analysis in the context of mental health research124.
Iterative testing is essential to refine protocols and address practical challenges125,126. Pilot studies should test the compatibility of proposed data formats and protocols across different devices and platforms, identifying any gaps or inconsistencies59,71,126. Validation studies can assess the reliability and accuracy of standardised methods in real-world settings, ensuring that they meet the needs of researchers, clinicians and participants alike126. These studies also provide opportunities to incorporate user feedback, ensuring that standardised approaches are practical and user-friendly104,126.
Outlook
Sensor-based data collection for DP represents a transformative approach to monitoring mental health and other conditions2,3,127. However, realising its full potential requires addressing critical challenges, including technical limitations25, user compliance82,104, privacy concerns82,109 and the lack of standardisation71. These barriers not only hinder the scalability and reliability of DP but also limit its adoption in diverse contexts, from high-resource settings to low- and middle-income countries.
Among these challenges, the absence of standardised methodologies stands out as a fundamental issue71,72. Variability in data formats, sampling rates and analytic techniques across studies creates inconsistencies that undermine reproducibility and comparability25. Without universal protocols, the field risks perpetuating fragmentation, slowing progress and reducing the generalisability of findings71,72. Standardisation offers a pathway to overcome these obstacles by fostering interoperability, enhancing data quality and enabling large-scale, cross-cultural research.
The role of standardisation extends beyond technical considerations. It has the potential to bridge divides between diverse stakeholders, including researchers, clinicians, policymakers and industry leaders. By adopting shared frameworks and open-source platforms, the field can facilitate collaboration and knowledge exchange, driving innovation and inclusivity71. Initiatives like Open mHealth and RADAR-CNS provide valuable blueprints, highlighting the feasibility and benefits of standardised approaches118,119,121,124.
The path forward calls for collective action. Researchers must work alongside technology developers to ensure that tools are interoperable and user-friendly. Policymakers should prioritise funding for standardisation initiatives and establish regulatory frameworks that promote transparency and equity. Industry players, including device manufacturers and app developers, must align their technologies with standardised guidelines to maximise their impact. Equally, end-users—patients, caregivers and community members—must be engaged throughout the development process to ensure solutions are culturally relevant and ethically sound.
By leveraging technological innovations, fostering global collaboration and embedding ethical principles, the field can evolve to provide scalable, reliable and culturally sensitive solutions for mental health care. In conclusion, sensor-based DP holds immense promise for advancing personalised health care and mental health interventions. By addressing the challenges of data collection and embracing the critical role of standardisation, the field can unlock its transformative potential. Collaborative efforts to establish universal frameworks will not only enhance the reliability and scalability of DP but also ensure its benefits are equitably distributed across populations. This shared vision will pave the way for a future where DP becomes an integral tool in improving global health outcomes.
Acknowledgements
We would like to acknowledge the wider TRANSFORM team for their support.
Author contributions
N.A. and S.J. wrote the manuscript. C.K.D. assisted in structuring the manuscript for clarity and coherence. M.S. assisted by contributing to the technical aspects of the manuscript. D.G. and S.S. provided critical feedback and contributed to refining the arguments. All authors supported the manuscript writing. Reflexivity was integral to the development of this viewpoint. The authors represent diverse cultural, disciplinary and geographical backgrounds, bringing a wide range of experiences to the discussion of DP. This diversity enriched the manuscript but also necessitated ongoing reflection to minimise bias and ensure a balanced synthesis of perspectives. The lead author, Nadia Binte Alam, is a Bangladeshi doctoral researcher in Health Sciences based in the UK, with training in the US and UK in neuropsychology and public health. Mohsin Surani is an Indian research software engineer based in the UK, contributing technical and analytical insights into data collection systems and platform architecture. Chayon Kumar Das is a Bangladesh-based clinical psychologist with extensive experience in digital mental health intervention research and community-based care. Domenico Giacco, a psychiatrist and academic based in the UK, specialises in co-design and patient-centred intervention research across Europe. Professor Swaran Singh, an internationally recognised leader in global mental health and psychiatry based in the UK, provided senior guidance and ethical oversight rooted in years of research in cross-cultural contexts. Sagar Jilka is a UK-based researcher and data scientist with expertise in digital health innovation, user engagement and AI-based behavioural analysis. The writing process was shaped by interdisciplinary dialogue. All authors critically reviewed the manuscript at multiple stages to ensure conceptual clarity and inclusivity of perspectives.
Peer review
Peer review information
Communications Medicine thanks the anonymous reviewers for their contribution to the peer review of this work.
Competing interests
This study is part of the Transforming Access to Care for Serious Mental Disorders in Slums (TRANSFORM) Project and is funded by the UK’s National Institute for Health and Care Research (NIHR) (Award number: NIHR200846). The authors declare no other financial or non-financial competing interests.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. Onnela, J. P. & Rauch, S. L. Harnessing Smartphone-based digital phenotyping to enhance behavioral and mental health. Neuropsychopharmacology41, 1691–1696 (2016).
2. Oudin, A et al. Digital phenotyping: data-driven psychiatry to redefine mental health. J. Med. Internet Res.; 2023; 25, e44502. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/37792430][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10585447]
3. Huckvale, K; Venkatesh, S; Christensen, H. Toward clinical digital phenotyping: a timely opportunity to consider purpose, quality, and safety. NPJ Digit. Med; 2019; 2, 88. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31508498][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6731256]
4. Merchant, R., Torous, J., Rodriguez-Villa, E. & Naslund, J. A. Digital technology for management of severe mental disorders in low-income and middle-income countries. Curr. Opin. Psychiatry33, 501–507 (2020).
5. Jilka, S; Giacco, D. Digital phenotyping: how it could change mental health care and why we should all keep up. J. Ment. Health; 2024; 33, pp. 439-442. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/39301756]
6. Bufano, P; Laurino, M; Said, S; Tognetti, A; Menicucci, D. Digital phenotyping for monitoring mental disorders: systematic review. J. Med. Internet Res.; 2023; 25, e46778. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/38090800][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10753422]
7. Masud, M. T. et al. Unobtrusive monitoring of behavior and movement patterns to detect clinical depression severity level via smartphone. J. Biomed. Inform.103, 103371 (2020).
8. Chen, CM et al. Towards wearable and flexible sensors and circuits integration for stress monitoring. IEEE J. Biomed. Health Inf.; 2020; 24, pp. 2208-2215.
9. Abdullah, S et al. Automatic detection of social rhythms in bipolar disorder. J. Am. Med. Inform. Assoc.; 2016; 23, pp. 538-543. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/26977102][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11740758]
10. Faurholt-Jepsen, M et al. Smartphone data as objective measures of bipolar disorder symptoms. Psychiatry Res; 2014; 217, pp. 124-127. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/24679993]
11. Faurholt-Jepsen, M et al. Daily electronic self-monitoring in bipolar disorder using smartphones—the MONARCA I trial: a randomized, placebo-controlled, single-blind, parallel group trial. Psychol. Med; 2015; 45, pp. 2691-2704. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/26220802]
12. Baker, J; Mueller, N; Onnela, JP; Ongur, D; Buckner, R. 368. Deep dynamic phenotyping: neural changes underlying fluctuations in bipolar disorder over one year. Biol. Psychiatry; 2017; 81, pp. S150-S151.
13. Ben-Zeev, D et al. CrossCheck: integrating self-report, behavioral sensing, and smartphone use to identify digital indicators of psychotic relapse. Psychiatr. Rehabil. J.; 2017; 40, pp. 266-275. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/28368138][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5593755]
14. Adler, DA et al. Predicting early warning signs of psychotic relapse from passive sensing data: an approach using encoder-decoder neural networks. JMIR mHealth uHealth; 2020; 8, e19962. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/32865506][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7490673]
15. Gumley, AI et al. Digital smartphone intervention to recognise and manage early warning signs in schizophrenia to prevent relapse: the EMPOWER feasibility cluster RCT. Health Technol. Assess.; 2022; 26, v–122.
16. Renn, BN; Pratap, A; Atkins, DC; Mooney, SD; Areán, PA. Smartphone-based passive assessment of mobility in depression: challenges and opportunities. Ment. Health Phys. Act.; 2018; 14, pp. 136-139. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/30123324][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6095666]
17. Meyerhoff, J. et al. Evaluation of changes in depression, anxiety, and social anxiety using smartphone sensor features: longitudinal cohort study. J. Med. Internet Res.23, e22844 (2021).
18. De Choudhury, M., Gamon, M., Counts, S. & Horvitz, E. Predicting Depression via Social Media. www.aaai.org (AAAI, 2013).
19. Di Matteo, D. et al. Automated screening for social anxiety, generalized anxiety, and depression from objective smartphone-collected data: cross-sectional study. J. Med. Internet Res.23, e28918 (2021).
20. Moshe, I. et al. Predicting symptoms of depression and anxiety using smartphone and wearable data. Front. Psychiatry12, 625247 (2021).
21. Place, S. et al. Behavioral indicators on a mobile sensing platform predict clinically validated psychiatric symptoms of mood and anxiety disorders. J. Med. Internet Res.19, e75 (2017).
22. Jacobson, NC; Lekkas, D; Huang, R; Thomas, N. Deep learning paired with wearable passive sensing data predicts deterioration in anxiety disorder symptoms across 17–18 years. J. Affect. Disord.; 2021; 282, pp. 104-111. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33401123]
23. Jacobson, N. C. & Feng, B. Digital phenotyping of generalized anxiety disorder: using artificial intelligence to accurately predict symptom severity using wearable sensors in daily life. Transl. Psychiatry12, 336 (2022).
24. Birk, RH; Samuel, G. Digital phenotyping for mental health: reviewing the challenges of using data to monitor and predict mental health problems. Curr. Psychiatry Rep.; 2022; 24, pp. 523-528. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/36001220]
25. Onnela, J-P. Opportunities and challenges in the collection and analysis of digital phenotyping data. Neuropsychopharmacology; 2021; 46, pp. 45-54. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/32679583]
26. Ding, N et al. Characterizing and modeling the impact of wireless signal strength on smartphone battery drain. ACM SIGMETRICS Perform. Eval. Rev.; 2013; 41, pp. 29-40.
27. Ellis Gibson. How Much Battery Does Having GPS On Use Up? Insights on GPS Drain and Consumption Rates. https://poweringautos.com (PoweringAutos, 2024).
28. Hess, B., Farahani, A. Z., Tschirschnitz, F. & von Reischach, F. Evaluation of fine-granular GPS tracking on smartphones. in Proc. 1st ACM SIGSPATIAL International Workshop on Mobile Geographic Information Systems, 33–40. https://doi.org/10.1145/2442810.2442817 (ACM, 2012).
29. Min, C. et al. Sandra helps you learn. in Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing, 421–432. https://doi.org/10.1145/2750858.2807553 (ACM, 2015).
30. Alugubelli, N., Abuissa, H. & Roka, A. Wearable devices for remote monitoring of heart rate and heart rate variability—what we know and what is coming. Sensors22, 8903 (2022).
31. Makedon, F., Clements, M., Pelachaud, C., Kalogeraki, V. & Maglogiannis, I. G. in Proc. 7th International Conference on PErvasive Technologies Related to Assistive Environments, 2014, Rhodes, Greece (PETRA ’14). (ACM, 2014).
32. Hooshmand, M; Zordan, D; Del Testa, D; Grisan, E; Rossi, M. Boosting the battery life of wearables for health monitoring through the compression of biosignals. IEEE Internet Things J.; 2017; 4, pp. 1647-1662.
33. Tawalbeh, LA; Basalamah, A; Mehmood, R; Tawalbeh, H. Greener and smarter phones for future cities: characterizing the impact of GPS signal strength on power consumption. IEEE Access; 2016; 4, pp. 858-868.
34. Henriksen, A; Hopstock, LA; Hartvigsen, G; Grimsgaard, S. Using cloud-based physical activity data from google fit and apple healthkit to expand recording of physical activity data in a population study. Stud. Health Technol. Inf.; 2017; 245, pp. 108-112.
35. Horvath, Z; Jenak, I; Brachmann, F. Battery consumption of smartphone sensors. J. Reliab. Intell. Environ.; 2017; 3, pp. 131-136.
36. De Ridder, B; Van Rompaey, B; Kampen, JK; Haine, S; Dilles, T. Smartphone apps using photoplethysmography for heart rate monitoring: meta-analysis. JMIR Cardio; 2018; 2, e4. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31758768][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6834218]
37. Masoum, A., Meratnia, N. & Havinga, P. J. M. An energy-efficient adaptive sampling scheme for wireless sensor networks. in Proc. IEEE Eighth International Conference on Intelligent Sensors, Sensor Networks and Information Processing, 231–236. https://doi.org/10.1109/ISSNIP.2013.6529794 (IEEE, 2013).
38. Alippi, C., Anastasi, G., Galperti, C., Mancini, F. & Roveri, M. Adaptive sampling for energy conservation in wireless sensor networks for snow monitoring applications. in Proc. IEEE Internatonal Conference on Mobile Adhoc and Sensor Systems, 1–6. https://doi.org/10.1109/MOBHOC.2007.4428700 (IEEE, 2007).
39. Alippi, C; Anastasi, G; Di Francesco, M; Roveri, M. An adaptive sampling algorithm for effective energy management in wireless sensor networks with energy-hungry sensors. IEEE Trans. Instrum. Meas.; 2010; 59, pp. 335-344.
40. Barnett, S et al. Intelligent sensing to inform and learn (InSTIL): a scalable and governance-aware platform for universal, smartphone-based digital phenotyping for research and clinical applications. J. Med. Internet Res.; 2019; 21, e16399. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31692450][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6868504]
41. Demirel, BU; Chen, L; Al Faruque, MA. Data-driven energy-efficient adaptive sampling using deep reinforcement learning. ACM Trans. Comput. Health.; 2023; 4, pp. 1-19.
42. Li, J et al. Low power optimisations for IoT wearable sensors based on evaluation of nine QRS detection algorithms. IEEE Open J. Circuits Syst.; 2020; 1, pp. 115-123.
43. Gomez, C; Oller, J; Paradells, J. Overview and evaluation of bluetooth low energy: an emerging low-power wireless technology. Sensors; 2012; 12, pp. 11734-11753. [PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3478807]
44. Mohamed, S et al. Energy management for wearable medical devices based on gaining–sharing knowledge algorithm. Complex Intell. Syst.; 2023; 9, pp. 6797-6811.
45. Schaffarczyk, M; Rogers, B; Reer, R; Gronwald, T. Validity of the Polar H10 Sensor for Heart Rate Variability Analysis during Resting State and Incremental Exercise in Recreational Men and Women. Sensors; 2022; 22, 6536. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/36081005][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9459793]
46. Ray Maker. Polar H10 heart rate monitor: very long term in-depth review. DCRainMaker (2021).
47. Suau, Q et al. Current knowledge about ActiGraph GT9X link activity monitor accuracy and validity in measuring steps and energy expenditure: a systematic review. Sensors; 2024; 24, 825. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/38339541][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10857518]
48. Ray Maker. Fitbit Charge 5 in-depth review. DCRainMaker (2021).
49. DeShaw, KJ et al. Methods for activity monitor validation studies: an example with the Fitbit Charge. J. Meas. Phys. Behav.; 2018; 1, pp. 130-135.
50. Hassan, L et al. Utility of consumer-grade wearable devices for inferring physical and mental health outcomes in severe mental illness: systematic review. JMIR Ment. Health; 2025; 12, e65143. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/39773905][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11751658]
51. Andrea, A., Agulia, A., Serafini, G. & Amore, M. Digital biomarkers and digital phenotyping in mental health care and prevention. Eur. J. Public Health. 30. https://academic.oup.com/eurpub/article/30/Supplement_5/ckaa165.1080/5915618 (2020).
52. Wisniewski, H., Henson, P. & Torous, J. Using a smartphone app to identify clinically relevant behavior trends via symptom report, cognition scores, and exercise levels: a case series. Front. Psychiatry10, 652 (2019).
53. Henson, P., Barnett, I., Keshavan, M. & Torous, J. Towards clinically actionable digital phenotyping targets in schizophrenia. npj Schizophr.6, 13 (2020).
54. van Berkel, N; D’Alfonso, S; Kurnia Susanto, R; Ferreira, D; Kostakos, V. AWARE-Light: a smartphone tool for experience sampling and digital phenotyping. Pers. Ubiquitous Comput.; 2023; 27, pp. 435-445.
55. Biørn-Hansen, A; Rieger, C; Grønli, TM; Majchrzak, TA; Ghinea, G. An empirical investigation of performance overhead in cross-platform mobile development frameworks. Empir. Softw. Eng.; 2020; 25, pp. 2997-3040.
56. Jayvir, S. Native vs. cross-platform app development: pros and cons. (2024).
57. Coghlan, S; D’Alfonso, S. Digital phenotyping: an epistemic and methodological analysis. Philos. Technol.; 2021; 34, pp. 1905-1928.
58. Sherstinsky, A. Fundamentals of recurrent neural network (RNN) and long short-term memory (LSTM) network. Phys. D; 2020; 404, 132306.
59. Saripalle, RK. Leveraging FHIR to Integrate activity data with electronic health record. Health Technol.; 2020; 10, pp. 341-352.
60. Singh, R; Paxton, M; Auclair, J. Regulating the AI-enabled ecosystem for human therapeutics. Commun. Med.; 2025; 5, 181. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/40382515][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12085592]
61. Ji, Z et al. Survey of hallucination in natural language generation. ACM Comput. Surv.; 2023; 55, pp. 1-38.
62. Olawade, DB et al. Enhancing mental health with artificial intelligence: current trends and future prospects. J. Med. Surg. Public Health; 2024; 3, 100099.
63. Le Glaz, A et al. Machine learning and natural language processing in mental health: systematic review. J. Med. Internet Res.; 2021; 23, e15708. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33944788][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8132982]
64. Malgaroli, M; Hull, TD; Zech, JM; Althoff, T. Natural language processing for mental health interventions: a systematic review and research framework. Transl. Psychiatry; 2023; 13, 309. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/37798296][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10556019]
65. Cornet, VP; Holden, RJ. Systematic review of smartphone-based passive sensing for health and wellbeing. J. Biomed. Inf.; 2018; 77, pp. 120-132.
66. Nahum-Shani, I et al. Just-in-time adaptive interventions (JITAIs) in mobile health: key components and design principles for ongoing health behavior support. Ann. Behav. Med.; 2018; 52, pp. 446-462. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/27663578]
67. Rodriguez, DV et al. Leveraging generative AI tools to support the development of digital solutions in health care research: case study. JMIR Hum. Factors; 2024; 11, e52885. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/38446539][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10955400]
68. Reda, R; Piccinini, F; Martinelli, G; Carbonaro, A. Heterogeneous self-tracked health and fitness data integration and sharing according to a linked open data approach. Computing; 2022; 104, pp. 835-857.
69. Sahoo, C; Wankhade, M; Singh, BK. Sentiment analysis using deep learning techniques: a comprehensive review. Int J. Multimed. Inf. Retr.; 2023; 12, 41.
70. Atmaja, BT; Sasou, A. Sentiment analysis and emotion recognition from speech using universal speech representations. Sensors; 2022; 22, 6369. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/36080828][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9460459]
71. Sameh, A; Rostami, M; Oussalah, M; Korpelainen, R; Farrahi, V. Digital phenotypes and digital biomarkers for health and diseases: a systematic review of machine learning approaches utilizing passive non-invasive signals collected via wearable devices and smartphones. Artif. Intell. Rev.; 2024; 58, 66.
72. Dlima, SD; Shevade, S; Menezes, SR; Ganju, A. Digital phenotyping in health using machine learning approaches: scoping review. JMIR Bioinform. Biotech.; 2022; 3, e39618.
73. Farshchian, B. A. & Vilarinho, T. Which mobile health toolkit should a service provider choose? A comparative evaluation of Apple HealthKit, Google Fit, and Samsung Digital Health Platform. in Proc. Ambient Intelligence. AmI 2017. Lecture Notes in Computer Science, 152–158. https://doi.org/10.1007/978-3-319-56997-0_12 (Springer, 2017).
74. Langener, AM et al. A template and tutorial for preregistering studies using passive smartphone measures. Behav. Res. Methods; 2024; 56, pp. 8289-8307. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/39112740][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11525430]
75. Armoundas, AA et al. Data interoperability for ambulatory monitoring of cardiovascular disease: a scientific statement from the American Heart Association. Circ. Genom. Precis. Med.; 2024; 17, e000095. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/38779844][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11703599]
76. Robertson, S et al. Development of a sports technology quality framework. J. Sports Sci.; 2023; 41, pp. 1983-1993. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/38305379]
77. React Native. https://reactnative.dev/.
78. Shevtsiv, N., Shvets, D. & Karabut, N. Prospects for using React Native for developing cross-platform mobile applications. Cent. Ukr. Sci. Bull. Tech. Sci. 208–213. https://doi.org/10.32515/2664-262X.2019.2(33).208-213 (2019).
79. Flutter. https://flutter.dev/.
80. Kuzmin, N., Ignatiev, K. & Grafov, D. Experience of developing a mobile application using flutter. in Proc. Information Science and Applications. Lecture Notes in Electrical Engineering, 571–575. https://doi.org/10.1007/978-981-15-1465-4_56 (Springer, 2020).
81. Perez-Pozuelo, I; Spathis, D; Gifford-Moore, J; Morley, J; Cowls, J. Digital phenotyping and sensitive health data: implications for data governance. J. Am. Med. Inform. Assoc.; 2021; 28, pp. 2002-2008. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33647989][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8363798]
82. Tomičić, A; Malešević, A; Čartolovni, A. Ethical, legal and social issues of digital phenotyping as a future solution for present-day challenges: a scoping review. Sci. Eng. Ethics; 2022; 28, 1.
83. Pantelopoulos, A; Bourbakis, NG. A Survey on wearable sensor-based systems for health monitoring and prognosis. IEEE Trans. Syst. Man Cybern. Part C; 2010; 40, pp. 1-12.
84. Siam, S. M. et al. Real-time accident detection and physiological signal monitoring to enhance motorbike safety and emergency response. https://doi.org/10.48550/arXiv.2403.19085 (2024).
85. Mukherjee, A. Implementing electronic health records in India: status, issues & way forward. Biomed. J. Sci. Tech. Res.; 2021; 33, pp. 25690-25694.
86. Zhang, D; Lim, J; Zhou, L; Dahl, AA. Breaking the data value-privacy paradox in mobile mental health systems through user-centered privacy protection: a web-based survey study. JMIR Ment. Health; 2021; 8, e31633. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34951604][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8742208]
87. Tangari, G; Ikram, M; Ijaz, K; Kaafar, MA; Berkovsky, S. Mobile health and privacy: cross sectional study. BMJ; 2021; n1248. https://doi.org/10.1136/bmj.n1248
88. Karthan, M et al. Enhancing mHealth data collection applications with sensing capabilities. Front. Public Health; 2022; 10, 926234.
89. Skantz, A. Performance Evaluation of Kotlin Multiplatform Mobile and Native iOS Development in Swift (KTH Royal Institute of Technology, 2023).
90. Wasilewski, K; Zabierowski, W. A comparison of java, flutter and kotlin/native technologies for sensor data-driven applications. Sensors; 2021; 21, 3324. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34064776][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8150988]
91. LabFront. LabFront | All-in-One Research Solution for Real World Data.
92. Hsu, C. F., Huang, H.-P., Peng, C., Masys, J. & Ahn, A. C. Obtaining actigraphy sleep report using Garmin wearable devices and labfront app. https://doi.org/10.1101/2023.11.03.23297889 (2023).
93. Borova, M; Prauzek, M; Konecny, J; Gaiova, K. A performance analysis of edge computing compression methods for environmental monitoring nodes with LoRaWAN communications. IFAC-PapersOnLine; 2022; 55, pp. 387-392.
94. Wang, X et al. HOPES: an integrative digital phenotyping platform for data collection, monitoring, and machine learning. J. Med. Internet Res.; 2021; 23, e23984. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33720028][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8074871]
95. Larrakoetxea, NG et al. Efficient machine learning on edge computing through data compression techniques. IEEE Access; 2023; 11, pp. 31676-31685.
96. Torous, J et al. Creating a digital health smartphone app and digital phenotyping platform for mental health and diverse healthcare needs: an interdisciplinary and collaborative approach. J. Technol. Behav. Sci.; 2019; 4, pp. 73-85.
97. De Boer, C et al. A call to expand the scope of digital phenotyping. J. Med. Internet Res.; 2023; 25, e39546. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/36917148][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10132029]
98. Bertolazzi, A; Quaglia, V; Bongelli, R. Barriers and facilitators to health technology adoption by older adults with chronic diseases: an integrative systematic review. BMC Public Health; 2024; 24, 506. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/38365698][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10873991]
99. Simblett, S et al. Barriers to and facilitators of engagement with mhealth technology for remote measurement and management of depression: qualitative analysis. JMIR mHealth uHealth; 2019; 7, e11325. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/30698535][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6372936]
100. Torous, J; Lipschitz, J; Ng, M; Firth, J. Dropout rates in clinical trials of smartphone apps for depressive symptoms: a systematic review and meta-analysis. J. Affect. Disord.; 2020; 263, pp. 413-419. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31969272]
101. Bexheti, A., Niforatos, E., Bahrainian, S. A., Langheinrich, M. & Crestani, F. Measuring the effect of cued recall on work meetings. in Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, 1020–1026. https://doi.org/10.1145/2968219.2968563 (ACM, 2016).
102. Harari, GM. A process-oriented approach to respecting privacy in the context of mobile phone tracking. Curr. Opin. Psychol.; 2020; 31, pp. 141-147. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31693976]
103. Naslund, JA; Marsch, LA; McHugo, GJ; Bartels, SJ. Emerging mHealth and eHealth interventions for serious mental illness: a review of the literature. J. Ment. Health; 2015; 24, pp. 321-332. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/26017625][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4924808]
104. Devraj, K; Jones, L; Higgins, B; Thomas, PBM; Moosajee, M. User-centred design and development of a smartphone application (oversight) for digital phenotyping in ophthalmology. Healthcare; 2024; 12, 2550. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/39765977][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11675816]
105. Hsin, H et al. Transforming psychiatry into data-driven medicine with digital measurement tools. npj Digit. Med.; 2018; 1, 37. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31304319][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6550182]
106. Jakobsen, JC; Gluud, C; Wetterslev, J; Winkel, P. When and how should multiple imputation be used for handling missing data in randomised clinical trials – a practical guide with flowcharts. BMC Med. Res. Methodol.; 2017; 17, 162. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/29207961][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5717805]
107. Torous, J; Onnela, J-P; Keshavan, M. New dimensions and new tools to realize the potential of RDoC: digital phenotyping via smartphones and connected devices. Transl. Psychiatry; 2017; 7, e1053.
108. Davidson, BI. The crossroads of digital phenotyping. Gen. Hosp. Psychiatry; 2022; 74, pp. 126-132. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33653612]
109. Martinez-Martin, N; Greely, HT; Cho, MK. Ethical development of digital phenotyping tools for mental health applications: Delphi study. JMIR mHealth uHealth; 2021; 9, e27343. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34319252][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8367187]
110. Google. Fitbit privacy policy. Fitbit (2024).
111. Zehra, T; Parwani, A; Abdul-Ghafar, J; Ahmad, Z. A suggested way forward for adoption of AI-Enabled digital pathology in low resource organizations in the developing world. Diagn. Pathol.; 2023; 18, 68. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/37202805][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10193335]
112. Holko, M et al. Wearable fitness tracker use in federally qualified health center patients: strategies to improve the health of all of us using digital health devices. npj Digit. Med.; 2022; 5, 53. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/35469045][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9038923]
113. Carter, H; Araya, R; Anjur, K; Deng, D; Naslund, JA. The emergence of digital mental health in low-income and middle-income countries: a review of recent advances and implications for the treatment and prevention of mental disorders. J. Psychiatr. Res.; 2021; 133, pp. 223-246.
114. Naslund, JA; Deng, D. Addressing mental health stigma in low-income and middle-income countries: a new frontier for digital mental health. Ethics Med. Public Health; 2021; 19, 100719.
115. Javed, A et al. Reducing the stigma of mental health disorders with a focus on low- and middle-income countries. Asian J. Psychiatr.; 2021; 58, 102601. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/33611083]
116. Lakhtakia, T et al. Smartphone digital phenotyping, surveys, and cognitive assessments for global mental health: initial data and clinical correlations from an international first episode psychosis study. Digit. Health; 2022; 8, 20552076221133758.
117. Loftness, B. C. et al. Toward digital phenotypes of early childhood mental health via unsupervised and supervised machine learning. https://doi.org/10.1109/EMBC40787.2023.10340806 (2023).
118. Falkenhein, I. et al. Wearable device health data mapping to open mHealth and FHIR data formats. in Healthcare Transformation with Informatics and Artificial Intelligence. https://doi.org/10.3233/SHTI230500 (IOS Press, 2023).
119. Estrin, D; Sim, I. Open mHealth architecture: an engine for health care innovation. Science; 2010; 330, pp. 759-760. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/21051617]
120. El-Sappagh, S; Ali, F; Hendawi, A; Jang, J-H; Kwak, K-S. A mobile health monitoring-and-treatment system based on integration of the SSN sensor ontology and the HL7 FHIR standard. BMC Med. Inf. Decis. Mak.; 2019; 19, 97.
121. Chen, C et al. Making sense of mobile health data: an open architecture to improve individual- and population-level health. J. Med. Internet Res.; 2012; 14, e112. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/22875563][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3510692]
122. Bauer, AM et al. Acceptability of mHealth augmentation of collaborative care: a mixed methods pilot study. Gen. Hosp. Psychiatry; 2018; 51, pp. 22-29. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/29272712]
123. Naslund, JA; Aschbrenner, KA; Bartels, SJ. Wearable devices and smartphones for activity tracking among people with serious mental illness. Ment. Health Phys. Act.; 2016; 10, pp. 10-17. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/27134654][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4845759]
124. Ranjan, Y et al. RADAR-base: open source mobile health platform for collecting, monitoring, and analyzing data using sensors, wearables, and mobile devices. JMIR mHealth uHealth; 2019; 7, e11734. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/31373275][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6694732]
125. Teresi, JA; Yu, X; Stewart, AL; Hays, RD. Guidelines for designing and evaluating feasibility pilot studies. Med. Care; 2022; 60, pp. 95-103. [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34812790][PubMedCentral: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8849521]
126. Short, TH; Pigeon, JG. Protocols and pilot studies: taking data collection projects seriously. J. Stat. Educ.; 1998; 6. https://doi.org/10.1080/10691898.1998.11910607
127. Smets, E et al. Large-scale wearable data reveal digital phenotypes for daily-life stress detection. npj Digit. Med.; 2018; 1, 67.
© The Author(s) 2025. This work is published under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (http://creativecommons.org/licenses/by-nc-nd/4.0/).