Acceptable data use vs exploitation with children’s ‘free’ digital health tools

Published: December 23, 2025 | Last updated: January 14, 2026

By Wolfgang Hackl, CEO, OncoGenomX Inc., Allschwil, Switzerland

Digital health tools are becoming an integral part of childhood and adolescence. From ADHD trackers and mental-health apps to diabetes sensors, sleep monitors, fitness wearables, neurodevelopmental screeners, and educational wellbeing platforms, children and teenagers interact with more digital health technology than any generation before them. Much of it is “free.”

But “free” in the context of minors is uniquely complex: children and teenagers are not merely smaller adults. Their rights, vulnerabilities, developmental stages, and digital environments require heightened ethical scrutiny. When the business model behind a “free” tool relies on extracting behavioral, emotional, biometric, or health-adjacent data from young users, the stakes become significantly higher. The central question becomes unavoidable: When does data use genuinely support children’s health and development, and when does it exploit their limited ability to consent?

Children’s data: uniquely sensitive, uniquely enduring

Children generate data that is:

  • Highly predictive (behavioral and emotional trajectories can shape life outcomes)
  • Easily misinterpreted (context and development are crucial)
  • Attractive to advertisers (habit formation, family segmentation, early brand influence)
  • Difficult to meaningfully consent to
  • Long-lasting (today’s data may affect future education, employment, insurance or social opportunities)

Even “wellness” apps for young people often collect:

  • Mood and mental-health symptoms
  • Stress, sleep, screen-time, and social behavior
  • Academic focus patterns
  • Movement, location, and activity
  • Images, voice recordings, and metadata
  • Social graph information (friends, contacts, peer networks)

When collected without strict guardrails, these data can follow a child into adulthood, shaping profiles they never agreed to create – and cannot easily erase.

Parental consent ≠ meaningful child protection

Many digital health systems rely on “parental consent” as the ethical and legal safeguard.

But parental consent is not always informed, realistic or sufficient, especially when:

  • Consent forms are long, technical, or opaque
  • Parents assume the tool is medically validated when it is not
  • Data flows to third parties in ways parents cannot foresee
  • Teens bypass parental oversight by independently downloading apps
  • Behavioral data are used to profile the entire household, not just the child

A growing body of research warns that family-level consent models often mask rather than mitigate exploitation risks. Transparency must be proactive, visible, and understandable to both parents and children.

How exploitation appears in “free” child-facing health technologies

While many digital health tools for young people are beneficial, harmful patterns are emerging:

  1. Behavioral tracking beyond health purpose

Apps designed for stress reduction or focus improvement sometimes harvest extensive behavioral and location data for commercial analysis.

  2. Data-driven nudges that influence vulnerable users

Engagement-driven algorithms may push motivational content, wellness messages, or product suggestions at emotionally delicate moments.

  3. Advertising and profile building

Some “free” platforms monetize mood logs, attention patterns, or sleep cycles by selling audience segments to advertisers.

  4. Predictive analytics without accountability

Tools may generate risk scores (e.g., “learning difficulty likelihood”) without clinicians, parents or educators understanding how these predictions were made – or their accuracy.

  5. Family data inference

A child’s behaviors can be used to infer parental psychological state, household stress, socioeconomic status, or purchasing power.

This is not acceptable, even if legal language appears to permit it.

A positive path: Responsible innovation for children and adolescents

The solution is not to restrict access to digital health tools – these technologies can empower young people, democratize mental-health support, and fill care gaps. Instead, innovators must adopt a child-first design philosophy built on five key commitments.

  1. Developmentally appropriate transparency

Young users need explanations they can understand: “What data are we collecting? Why? What choices do you have?”

Visual icons, videos, and simple metaphors often outperform written notices.

  2. Strict purpose limitation and minimization

Only collect what is essential for the health or wellbeing function (a brief illustrative sketch follows the list below). Children’s data should not be used to:

  • Fuel targeted advertising
  • Build long-term behavioral profiles
  • Support consumer segmentation
  • Train non-health commercial models
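
To make purpose limitation concrete, the sketch below shows one way an app could bind every collected field to a declared health purpose and reject anything outside that allowlist. It is an illustration only, not drawn from any real product: the field names, purposes, and the collect helper are all hypothetical.

```python
# Illustrative sketch of purpose limitation: every data field must map to a
# declared health purpose, and anything outside the allowlist is rejected.
# Field names and purposes are hypothetical examples, not a real product schema.

ALLOWED_PURPOSES = {"sleep_support", "stress_check_in", "clinician_report"}

# Each collectable field is bound to exactly the purposes that justify it.
FIELD_PURPOSES = {
    "sleep_duration_minutes": {"sleep_support", "clinician_report"},
    "mood_rating": {"stress_check_in", "clinician_report"},
}

def collect(field: str, value, purpose: str) -> dict:
    """Accept a data point only if the field is registered for the stated purpose."""
    if purpose not in ALLOWED_PURPOSES:
        raise ValueError(f"Unknown purpose: {purpose}")
    if purpose not in FIELD_PURPOSES.get(field, set()):
        raise ValueError(f"Field '{field}' is not permitted for purpose '{purpose}'")
    return {"field": field, "value": value, "purpose": purpose}

# Anything outside the allowlist fails loudly instead of being silently stored.
collect("sleep_duration_minutes", 540, "sleep_support")      # allowed
# collect("location", (47.55, 7.58), "advertising")          # raises ValueError
```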

  3. Meaningful control for both children and caregivers

Control must grow with the child (see the sketch after this list). This can include:

  • Gradual autonomy settings
  • Consent renewal at developmental milestones
  • Clear deletion pathways
  • Granular data-sharing options
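
As a rough sketch of what control that grows with the child could look like in practice, the example below models consent as a record that expires at developmental milestones and shifts decision rights from caregiver to teen over time. The milestone ages, scopes, and function names are assumptions chosen for illustration, not legal or clinical guidance.

```python
# Illustrative sketch of graduated consent: who decides, and when consent must
# be renewed, depends on the child's age. Ages and scopes are assumptions for
# illustration only.

from dataclasses import dataclass
from datetime import date

# Hypothetical milestones at which consent is re-sought and autonomy expands.
MILESTONE_AGES = (13, 16, 18)

@dataclass
class ConsentRecord:
    birth_date: date
    granted_on: date
    granted_by: str          # "caregiver", "joint", or "teen"
    shared_scopes: set       # e.g. {"sleep_summary"} shared with a caregiver

def age_on(birth_date: date, on: date) -> int:
    return on.year - birth_date.year - ((on.month, on.day) < (birth_date.month, birth_date.day))

def decision_maker(birth_date: date, today: date) -> str:
    """Decision rights shift from caregiver to joint to teen as the child grows."""
    age = age_on(birth_date, today)
    if age < 13:
        return "caregiver"
    if age < 16:
        return "joint"
    return "teen"

def needs_renewal(record: ConsentRecord, today: date) -> bool:
    """Consent granted before a milestone expires once that milestone is reached."""
    age_at_grant = age_on(record.birth_date, record.granted_on)
    age_now = age_on(record.birth_date, today)
    return any(age_at_grant < m <= age_now for m in MILESTONE_AGES)

record = ConsentRecord(date(2012, 5, 1), granted_on=date(2024, 9, 1),
                       granted_by="joint", shared_scopes={"sleep_summary"})
print(decision_maker(record.birth_date, date(2025, 6, 1)))   # "joint" at age 13
print(needs_renewal(record, date(2025, 6, 1)))               # True: the 13th-birthday milestone was crossed
```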

  4. Zero tolerance for harmful secondary uses

Children’s health and behavioral data must be categorically barred from:

  • Insurance or educational scoring
  • Legal surveillance
  • Manipulative persuasive design
  • High-risk predictive profiling
  • Household-level marketing

  5. High-security, high-accountability architecture

Children’s data require the strongest security standards, on a par with those used to protect genomic data (see the sketch after the list below).

This includes:

  • Encryption by default
  • Minimal attack surfaces
  • Independent audits
  • Third-party vetting
  • Robust anonymization with explicit safeguards against re-identification
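
To ground "encryption by default" and re-identification safeguards, here is a deliberately simplified sketch using the open-source Python cryptography package: records are encrypted at rest, and raw identifiers are replaced with keyed pseudonyms before any analysis. Key handling is reduced to a single line for readability; a production system would rely on a managed key service, rotation, and audited access.

```python
# Minimal sketch: encrypt records by default and pseudonymize identifiers with a
# keyed hash so analytics never see raw child identifiers. Key handling is
# intentionally simplified; a real system would use a managed KMS and rotation.

import hashlib
import hmac
import json

from cryptography.fernet import Fernet   # third-party: pip install cryptography

ENCRYPTION_KEY = Fernet.generate_key()   # in practice: fetched from a key service, never hard-coded
PSEUDONYM_KEY = b"replace-with-a-long-random-secret"

fernet = Fernet(ENCRYPTION_KEY)

def pseudonymize(child_id: str) -> str:
    """Keyed hash: stable enough for joins, but not reversible without the secret key."""
    return hmac.new(PSEUDONYM_KEY, child_id.encode(), hashlib.sha256).hexdigest()

def store_record(child_id: str, payload: dict) -> tuple[str, bytes]:
    """Encrypt the payload at rest and attach only the pseudonym, never the raw ID."""
    token = fernet.encrypt(json.dumps(payload).encode())
    return pseudonymize(child_id), token

def load_record(token: bytes) -> dict:
    return json.loads(fernet.decrypt(token).decode())

pid, token = store_record("child-42", {"sleep_minutes": 540})
assert load_record(token) == {"sleep_minutes": 540}
```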

The opportunity: Children as empowered participants, not passive data sources

Digital health tools can help young people understand their bodies, emotions and health in ways our systems have never achieved before. But success depends on trust – and trust depends on responsibility.

The companies that thrive will be those that:

  • Honor developmental differences
  • Build safety and consent into their design DNA
  • Treat data as co-owned, not captured
  • Focus on wellbeing over engagement metrics
  • Ensure that benefits return to families and society

The new generation of digital health tools must not only support children’s development – they must protect their futures.

By embedding fairness, transparency and integrity into every layer of design, digital health innovators can build a landscape where “free” truly means valuable, safe, and empowering.
