Designing for the Social Media Ban

Portable co-founder Andrew Apostola considers how to design in light of the Online Safety Amendment (Social Media Minimum Age) Bill 2024.

Last week the Federal Government, supported by the opposition, introduced and passed legislation banning the use of social media by young people aged under 16. Called the Online Safety Amendment (Social Media Minimum Age) Bill 2024, it’s a series of amendments to the Online Safety Act 2021.

From my perspective, this is a welcome step towards addressing the harms, poor practices and lack of safety-focused design across the products operating in this space. A few years ago I wrote extensively about how we ended up in a place where vile content could be uploaded and viewed by children with little recourse for individuals. I’m a proponent of a free and open internet, but what’s clear is that we’re no longer dealing with the internet so much as with proprietary platforms that have been left to police themselves, and that have failed to invest their money in the many aspects of consumer safety an online ecosystem requires.

As someone who works in this space designing digital products for government, I spent the weekend contemplating what it would actually take to make this type of legislation work in the ecosystem we live in today in Australia.

Start With App Stores, Devices and Operating Systems

The conversation has immediately been directed towards the creators of social media platforms, yet we need a device and an operating system in place before we can even reach those products, and it is entirely practical and feasible for discussions around the technical design of controls for young people to begin here. The three most pervasive products in the market are Apple Screen Time, Google Family Link and Microsoft Family Safety.

Parents around the world complain endlessly about the lack of flexibility in these products, in particular Apple Screen Time, despite Apple touting itself as a leader in design excellence and privacy. Parental Controls first launched as part of Apple’s iOS 2.0 release in 2008, yet it took a decade for Screen Time to arrive. In 2021 Christopher Skogen, a software engineer at Apple, announced the release of the Screen Time API, which allows other software developers and product teams to build products on top of iOS.

Whilst Screen Time gives parents only a limited ability to control what happens on their children’s devices, it does provide a starting point for the new social media legislation. A large part of this legislation’s power, I believe, is that it puts in place a framework for parents to follow where previously none existed. Media around the world are inundated with stories of parents and families who have had terrible experiences and outcomes with social media content. For me, this moment bears an eerie resemblance to when activist and former presidential candidate Ralph Nader helped bring about the National Traffic and Motor Vehicle Safety Act of 1966, which for the first time put in place a safety framework for car manufacturers to follow, including the mandating of seat belts in all vehicles. This legislation means that in a year’s time there will not only be an onus on the manufacturers of these products to deal with safety; parents will have knowledge and the law behind them when making decisions about the products their children use. Tool design should come next.

What should follow is this: first, device manufacturers and operating system creators will need to design controls granular enough, in the Screen Time API and its equivalents, to be able to say to social media companies, we’ve given you all of the tools, it’s up to you now. That should include simpler onboarding rules for new phone set-ups to determine whether the primary user of the device is under 16, along with continued monitoring to ensure devices stay compliant. It should not be good enough to simply ask for a birthdate. Apple, Google and Microsoft will need to invest more than they currently do in these product teams to ensure their operating systems don’t become the subject of scandal over under-age social media use. A sketch of what that onboarding check could look like follows below.
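
As a minimal sketch, assuming iOS 16’s FamilyControls and ManagedSettings frameworks (the public face of the Screen Time API), an onboarding check could establish the device’s classification at set-up time rather than trusting a self-reported birthdate. The blanket category shield at the end is illustrative only, not a real policy.

```swift
import FamilyControls
import ManagedSettings

// Sketch of an onboarding age check using Apple's Screen Time API
// (FamilyControls, iOS 16+). The idea: establish at set-up time whether
// the device belongs to a child, instead of trusting a typed birthdate.
@MainActor
final class OnboardingAgeGate {
    private let store = ManagedSettingsStore()

    /// Has Screen Time already enrolled this device as a child's?
    var deviceIsClassified: Bool {
        AuthorizationCenter.shared.authorizationStatus == .approved
    }

    /// Request child-level authorisation during device set-up. On a child's
    /// device this requires a parent or guardian to approve via Family Sharing.
    func classifyDevice() async {
        do {
            try await AuthorizationCenter.shared.requestAuthorization(for: .child)
            applyUnder16Defaults()
        } catch {
            // The parent declined, or the device isn't in a Family Sharing group.
            print("Device remains unclassified: \(error)")
        }
    }

    /// Illustrative only: shield apps by category once the device is known
    /// to belong to a user under 16. A real policy would be far more
    /// granular than a blanket category block.
    private func applyUnder16Defaults() {
        store.shield.applicationCategories = .all()
    }
}
```

The useful design property here is that the classification is approved by a parent through Family Sharing, so the operating system, not each individual app, becomes the source of truth about the user’s age bracket.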

If I were to design this new system, I would add a layer of connectivity between devices operating within the home. The Find My product on iPhones, which allows you to register your devices and see their location, could clearly label whether a device is being operated by an adult or by a young person under 16, and provide simple prompts for devices operating on a home network to be classified. If a device without authentication enters the location or network, the parent is notified. Sure, children under 16 could still pretend to be adults, but it would make it extremely difficult for parents to pretend they don’t know. If a parent lets a 12-year-old into a car without enforcing the use of a seatbelt, a design feature of the car, then in the unfortunate case of an accident that parent would face scrutiny from law enforcement, the media and society. A hypothetical sketch of that classification layer follows.
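
Purely to make the idea concrete, here is a hypothetical data model for that home-network layer. None of these types exist in Find My or any Apple framework today; they are invented for illustration.

```swift
import Foundation

// Hypothetical sketch of the home-network classification layer described
// above. These types are invented purely to illustrate the design.
enum DeviceClassification: String, Codable {
    case adult
    case under16
    case unclassified
}

struct HouseholdDevice: Codable, Identifiable {
    let id: UUID
    let name: String
    var classification: DeviceClassification
}

struct HouseholdNetwork {
    private(set) var devices: [HouseholdDevice] = []

    /// Called when an unclassified device joins the home network,
    /// e.g. to push a notification to the parent's phone.
    var onUnclassifiedDevice: (HouseholdDevice) -> Void = { _ in }

    mutating func deviceJoined(_ device: HouseholdDevice) {
        devices.append(device)
        if device.classification == .unclassified {
            // A device with no age classification has entered the home
            // network: prompt the parent to classify it.
            onUnclassifiedDevice(device)
        }
    }
}
```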

Ultimately, I believe this is the primary way this design issue should be tackled: the manufacturers of mobile devices should be forced to build the equivalent of seatbelt and airbag cavities into their products, and make it easy for the manufacturers of those components to plug in and operate. That doesn’t remove the onus from social media companies to design better safety controls or to adopt better content and algorithmic practices, but it provides the first part of the stack necessary for a safe environment.

The Smart Adoption of Digital ID Technology

We all know that the way age verification currently works on social media platforms is an absolute joke. When I was growing up in the 1980s, the creators of “risqué” products like Leisure Suit Larry would ask users questions they determined only an adult would know the answer to. My favourite was “What is Spiro Agnew?”, which I always used to miscategorise as a type of toothpaste instead of a former vice president of the United States. For the past twenty years, most platforms have made basically no attempt to improve on this process. Why? Because there was no financial or legislative imperative to do so.

The greatest fear many people have is around forced identity verification for the use of online products: before you could sign in to a social media product, you would have to verify that you were over 16. Mandatory identity verification poses significant challenges, from privacy and security risks to exclusion and ethical concerns. It undermines anonymity, can expose sensitive data to breaches, and risks government overreach. Vulnerable groups, such as those without ID, the elderly, or communities in low-resource settings, may be excluded as a result. One of the biggest challenges is the chilling effect it could have on innovation, locking us in with the same social media networks we have today, because the barriers to entry for start-ups with few resources would be so high.

Yet these challenges present an opportunity to design safety measures that are world leading. Earlier this year the Australian Government passed the Digital ID Act, which provides a regulatory framework for digital identity systems. In 2017 Australia Post launched Digital iD, a platform that enables individuals to securely verify their identity online and in person. Digital IDs and driver’s licences are coming online in states across Australia. Whilst the scheme currently operates in the government sector, it’s set to expand in December 2026, perhaps a tad too late for the implementation of the new social media laws.

If we stop to contemplate this world for a moment, it’s entirely imaginable that social media companies could ask users to digitally confirm just the age component of their identity, using pre-existing digital infrastructure under a pre-existing legal framework. Alternatively, device and operating system manufacturers could connect to this service upon account creation or set-up, pre-authorising the user and all associated app stores that make these products available for download. To address concerns around accessibility, I’d provide a 30-day grace period upon account creation for users who need to acquire ID or seek physical authentication. The infrastructure required is relatively simple, the equivalent of payments on ecommerce platforms, so new companies could incorporate safety settings without great expense. A sketch of what that age-only check might look like follows.
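
To make “just the age component” concrete: the platform never sees a name, birthdate or document, only a signed yes-or-no answer from an accredited provider. The endpoint, token flow and response shape below are hypothetical, invented for illustration; the real Digital ID scheme defines its own accreditation rules and protocols.

```swift
import Foundation

// Hypothetical selective-disclosure age check. The platform receives only
// a signed answer to "is this person over 16?". The URL and response
// shape are invented for illustration.
struct AgeAssertion: Codable {
    let over16: Bool
    let issuer: String      // accredited identity provider
    let issuedAt: Date
    let signature: String   // provider's signature over the assertion
}

func verifyAge(verificationToken: String) async throws -> Bool {
    // The user completes verification with their chosen ID provider and
    // hands the platform a one-time token; the platform redeems it here.
    var request = URLRequest(
        url: URL(string: "https://id-provider.example/assertions/age")!
    )
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(["token": verificationToken])

    let (data, _) = try await URLSession.shared.data(for: request)
    let decoder = JSONDecoder()
    decoder.dateDecodingStrategy = .iso8601
    let assertion = try decoder.decode(AgeAssertion.self, from: data)
    // In production you would verify the signature against the provider's
    // published keys before trusting the answer.
    return assertion.over16
}
```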

Changes to Legislation

For those who despise government interference, I can see how this approach is unpalatable, yet I would argue that the landscape we operate in across healthcare, transport and social services increasingly looks like this anyway. To make all of this happen I’d recommend the following changes to legislation:

  1. Legislate explicit provisions to integrate digital ID systems into social media platforms, building on the existing requirements under the Online Safety Act and the Online Safety Amendment.
  2. Mandate privacy and security standards for storing and handling identity verification data by social media companies by amending the Privacy Act 1988 to include specific protections for digital ID-linked data in the context of social media.
  3. Amend the Digital ID Act 2024 to directly support social media platforms as accredited entities under the Digital ID System, ensuring compliance with high-security and privacy standards.
  4. Introduce consumer consent requirements in the Privacy Act 1988, ensuring users can share only the minimum data necessary for identity verification and maintain control over their personal information.
  5. Establish legislative pathways for secure and privacy-conscious data-sharing agreements between digital ID providers and social media companies by updating the Digital ID Act 2024 to include explicit guidelines for private-sector data exchange.
  6. Enact legislation in the Digital ID Act 2024 to limit government agencies from requesting or accessing more personal information than is strictly necessary during the identity verification process, ensuring minimal data exposure and adhering to the principle of data minimisation.
  7. Introduce safeguards in the Digital ID Act 2024 to restrict government access to the identity of social media users without a clear and lawful purpose, requiring judicial oversight or warrants to protect user privacy.

There is a range of other options that I believe would be available, and I would love to have the time to do the technical exploration and UX design of some of these products, to imagine a little further what this would look like.

Where do we go from here?

Andrew Apostola is Portable's co-founder. If you'd like to connect with him about this article, potential technical exploration, UX design, or other topics, reach out!
