She’s 16. Sitting in a sterile clinic room, phone clenched in one hand, courage in the other.
She’s just told the nurse she wants help. Real help. For the dark thoughts she doesn’t talk about at school.
The nurse nods. Hands her a tablet.
“You need to register on the app.”
She taps. Uploads her ID. Starts to fill in her details.
Then—“Please enter a parent or guardian’s ID number.”
She freezes. She doesn’t want her parents to know. That’s the whole point.
She closes the app. Smiles at the nurse.
“I’ll come back another time,” she lies.
That’s not fiction. It’s how we’re designing data rights in South Africa today: with good intentions and bad outcomes.
And that’s the moment the penny dropped for us.
South Africa’s Protection of Personal Information Act (POPIA) defaults to requiring parental consent—called competent person consent—for anyone under 18. It’s designed to protect minors. But in today’s hyper-digital world, it’s often doing the opposite.
The problem: consent is broken by design
We see it every week:
- In healthtech, teens can legally seek care—but apps still demand a parent’s permission.
- In fintech, youth banking apps stall because legal teams fear non-compliance.
- In EdTech, schools consent on behalf of learners with no meaningful participation from the students themselves.
- On social media, under-18s are simultaneously creators, data subjects, and minors in legal limbo.
This isn’t a policy gap. It’s a design failure.
And behind that failure is a legal system that treats capacity like a switch that flips at 18, ignoring the principle of evolving capacity—a core tenet of international child rights law that recognises children’s growing ability to make informed decisions.
What the world is doing better
In the UK, the Age Appropriate Design Code makes platforms default to high privacy and design for gradual autonomy. It’s law, not a suggestion.
In Vietnam and Thailand, regulators require dual consent—from both the child and the parent—for certain age ranges.
In France, regulators co-design consent flows with teenagers, not just for them.
These aren’t theoretical tweaks. They’re market enablers.
They reduce regulatory friction.
They build user trust.
They get youth-focused products to market faster.
Our approach: evolving capacity as a design pattern
At ITLawCo, we don’t just rewrite privacy policies. We help clients redesign consent. Here’s how we do it:
1. Understand the child
We map age ranges to maturity, risk levels, and service types. A 14-year-old using a mental health chatbot isn’t the same as a 9-year-old using a spelling app. Our diagnostic helps you meet them where they are.
2. Map the journey
Where do you collect data? When do you ask for consent? What does the child see? We help you visualise this, friction points and all.
3. Co-create consent flows
We don’t just say “use plain language”. We design it: videos, icons, layered notices. We test it. We iterate.
4. Implement tiered access
We help you set up tiered permissions—think parental oversight for under-13s, joint consent for 13–15s, autonomy for 16+. This lets you scale responsibly and stay ahead of regulators.
5. Embed it in your DNA
From training to board-level risk registers, we integrate evolving capacity into your governance—not just your UI.
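As a rough illustration of the tiered-access pattern in step 4, the age-to-tier mapping might be encoded like this. This is a hypothetical sketch, not a legal standard: the tier names, thresholds, and `consent_tier` function are our own example, and real cut-offs must be set against POPIA, the service type, and the risk assessment from step 1.

```python
from enum import Enum

class ConsentTier(Enum):
    PARENTAL = "parental_oversight"   # under 13: a parent/competent person consents
    JOINT = "joint_consent"           # 13-15: child and parent consent together
    AUTONOMOUS = "autonomous"         # 16+: the child consents alone

def consent_tier(age: int) -> ConsentTier:
    """Map a user's age to a consent tier (illustrative thresholds only)."""
    if age < 13:
        return ConsentTier.PARENTAL
    if age < 16:
        return ConsentTier.JOINT
    return ConsentTier.AUTONOMOUS
```

Keeping the mapping in one place like this makes it auditable: when the regulator, a court, or your own risk review shifts a threshold, the change is one line, not a hunt through the product.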
What our clients say
“We were stuck. We couldn’t launch until we knew we were compliant. ITLawCo helped us create a consent model that was clear, defensible, and felt right to our users.”
— CEO, youth-focused fintech startup
Why it matters now
The Information Regulator is starting to pay attention. The Children’s Act already lets minors consent to health services, contraception, and more. The courts will eventually ask: why does POPIA lag behind?
If you wait for reform, you’ll be late.
If you design for rights now, you’ll lead.
Let’s talk
Want to know if your product’s consent flow is POPIA-compliant and youth-respecting?
Book a free 20-minute risk mapping call
Or download our Youth Consent Readiness Toolkit (coming soon)
📧 support@itlawco.com
One final thought
“Consent isn’t a checkbox. It’s a coming-of-age ritual in a digital world.”
We can design it to be ethical, empowering, and excellent.
Let’s build that future together.