Welcome to Future Media and an exclusive pod with two former leaders of Meta who are breaking their silence on Australia’s world-first under-16s social media ban.
The regulations are now six days old and we’ve heard a lot from government, a lot of opinions, and a lot from parents and kids.
Until now though, we haven’t heard from those inside Meta who have expert and important insights, so I invited two of them on.
Former Meta director Kelly Stonelake worked on Mark Zuckerberg’s metaverse until she warned about child harms and was laid off.
She now advocates for child safety and tech regulation and has branded social media “a breeding ground for harm”. She advises the US Federal Trade Commission and has a Substack, Overturned.
Brian Boland was Vice President of ads and marketing at Meta until 2020. He now advises governments and gave evidence to the US Senate that Meta, YouTube, X and TikTok prioritise growth over safety.
In recently unredacted court papers, he testified: “My feeling then, and my feeling now, is that they don’t meaningfully care about user safety.
“It’s not something that they spend a lot of time on. It’s not something they think about. And I don’t really think they care.”
The night before our interview, a terror attack at Bondi Beach in Australia sent violent footage flooding across social media.
I asked them whether society can rely on tech companies to self-regulate, or will it take laws like those in Australia to force change.
Brian said: “I heard someone say recently that your child will only be ready for social media when they’re ready to see suicide or violent death.
“My heart’s broken because tragically we saw that where you are in Australia overnight.
“I’m excited about the uproar over Australia’s policy. It’s forcing a really important conversation.”
Kelly and Brian spent a combined 26 years at Meta, and Kelly admits she drank the corporate Kool-Aid.
“Ten years ago when I heard arguments that tech should be left alone to do the right thing, I nodded,” she said. “I inherently believed that we would do the right thing.”
But “everything turned upside down” when she worked on the Metaverse. Leaders ignored warnings about children being targeted, she said.
Evidence it was “a proven breeding ground for predators” was set aside and “proactively solving it wasn’t a chief concern”.
“The conversation I heard was how do we protect the company?” she said. “How do we minimise liability? How can we make parents feel safer even though it’s not safe?
“That was the turning point for me. I feel ashamed I didn’t wake up earlier.
“If these companies wanted to change, they would, but it comes down to profitability and keeping people engaged.”
She praised Australia’s law for putting them on notice, with $50 million fines for non-compliance, and said she was hopeful.
“I predict we’re going to see a sea change in how tech companies and social media companies work. They’re suddenly going to be innovating in child safety.”
Brian said Meta’s blind spot on child safety was a symptom of deeply held “ideology and goals”.
He said: “They have a near religious belief that this product is good for mankind incrementally and that justifies an unimaginable amount of harm.”
He hit back at tech’s position that it’s parents’ responsibility to lock young kids out of social media, and that imposing responsibility on platforms impedes free speech.
He said: “If you were building a playground and you put in a giant sandpit with spikes in it, should we have regulations that say no spikes or just let the parents deal with it?
“It’s not the parents’ job to keep kids safe on the inherently unsafe product we’ve built.
“No-one thinks parents should be uninvolved but we should have products that are designed safely.”
Kelly joined in: “It boils my blood that tech companies have this narrative that this is a parenting issue, and that parents somehow don’t care.
“I really appreciate how Australia’s approach is a partnership, with a huge amount of parent education, webinars, FAQs, and guides on how to talk to kids.”
She added: “And that playground with the spikes? A billionaire put those spikes there for trillion-dollar profits. Who’s he to say parents need to do a better job?”
She praised Australian eSafety Commissioner Julie Inman Grant. “I watched your interview,” she said.
“I’ve also watched a lot of the interviews with Australian kids talking about how they’re getting around it.
“These aren’t reasons to side with trillion-dollar companies over kids and families. This regulation will still be a significant step in harm reduction.”
And she dismissed suggestions that kids blocked from Meta, Snap, X and YouTube would rush to more dangerous corners of the web.
“Kids aren’t seeking out the dark web. If they were, they’d be going there in the first place,” she said.
“They’re searching for inspirational break-up quotes and getting pro-suicide content.”
She cited the story of 16-year-old Mason Edens, who died after TikTok sent him pro-suicide material following a break-up. He was featured in Sydney’s Daily Telegraph.
“You’ve reported on the amount of anxiety that many young people experience about social media, and how the pressure shows up. Now they have an out,” she said. “The barrier to entry has gone up significantly.”
Brian also pushed back on the suggestion kids would be left isolated, saying: “Texting and messaging are very effective ways to communicate and set up groups.
“This free speech narrative started out as an American thing because it’s a core value (in the US Constitution), but it’s really a false argument.
“Imagine you’re in a public square where everyone’s sharing ideas. It’s a vibrant forum for speech.
“Then somebody comes in and makes a lot of noise. They put up giant speakers and a microphone and handpick people to put on the stage.
“The giant speakers ensure their volume is louder than the crowd, and it drowns out what’s really happening.
“Then the speaker says something everyone hates, and people want to argue. Everyone tries to be heard, but the speakers just drown them out.
“There’s no real speech. No discussion. No real debate. That’s what these platforms do. They’re a selective magnifier of selected speech.
“They don’t tell you what’s going on. They don’t let you hear the breadth of opinion. They don’t inform you. They get you to click, to stay on, to post, and to share.”
Meta’s internal teams are rewarded for driving engagement this way, he said.
“This is not about free speech,” he added. “They want your minutes and your clicks on their platform because they turn them into money which turns into power.”
Meta’s financial filings show it earned 97.2 per cent of its record US$164.5 billion from ads on Instagram and Facebook in 2024. WhatsApp just launched ads too.
Brian spent years as VP of ad tech at Meta and said shareholder greed was the root problem.
“The global economy is wired for shareholder value above all other things. This is a symptom of that,” he explained.
“When Facebook whistleblower Frances Haugen implicated Meta in human trafficking, cartel recruitment, and youth harms, the stock took a $3 billion hit, and soon rebounded.
“Then, a few months later, a company filing on profitability left the market thinking it was not making enough profit. That crushed the stock. Absolute nosedive.
“Meta responded by cutting costs and firing a ton of people, and the market cheered with balloons and celebrations and confetti and brought the stock right roaring back.
“The market penalises you for not making more money, and it doesn’t reward you for keeping people safe, so companies just follow that market logic.”
Australia’s new law has one major difference from others that have been tried overseas. It places the burden of keeping under-16s off on the platforms, with $50 million fines for failing.
Kelly told me: “Brian and I were at Facebook 10 years ago and it had a very good idea how old everyone was on the platform then.
“It knew their movement patterns. Were they at school from 8am to 4pm? What did they buy? What content did they engage with?
“But when the conversation moves to using the same technology to keep kids safe from proven harms, that’s where a line gets drawn.”
Brian added: “We really need to wrestle with what information we want these companies having.
“The amount of data that tech collects and how they string it together creates profiles that are more accurate than any identification that you carry. A government ID is almost inconsequential.
“The conversation should be a bigger one around what data and data privacy we should expect as a society.”
Kelly said the first test of the efficacy of the law will come once the ban has been in place for a while.
“It’ll be six months, maybe longer, before we see a meaningful shift in the data, but I expect significant differences in rates of depression and anxiety among younger people.
“I’d expect differences to be visible in rates of suicide, in outcomes at school and in participation in offline activities.
“I would be measuring how many hours of sleep are kids getting, how that is changing, and measuring instances of cyberbullying.
“There’s still access to messaging apps and texting. Cyberbullying won’t go away, but I bet we’ll see a significant decrease and that will have a ripple effect in many other areas that are causing harm to kids.”
She predicted Australia’s lead will spark a global shift.
“When evidence shows a product harms kids, society has a duty to act, and in many cases, governments have failed to take right-sized action,” she said.
“This is the beginning of what your eSafety Commissioner has called the seatbelt moment. It’s basic common sense.
“Ten years from now, we’ll see that what started last week in Australia is everywhere, and we won’t be talking about many of the issues that we’re talking about now.”
Brian added: “You guys acted. We shouldn’t miss the point that Australia took a big risk in taking a stance.
“I think it’s a brave, heroic and important step that’s incredibly hopeful.”
But the work is only beginning, he said, as AI pushes content into smart glasses and other formats.
“A few months ago, Mark Zuckerberg talked about the loneliness epidemic and that people have space for more friends,” Brian pointed out.
“So what did he do? He said he would create some AI friends for them so Meta can fill feeds and users’ lives. Do you trust Mark to create AI friends for your kid?
“Sit with that for a moment. Picture the direction these platforms have gone without guidance, without safety, and ask if you want that, because it’s going to happen.
“These technologies are going to continue to march and evolve and try to find their way into more of our lives because of the profit motive.
“That’s why regulation is so important. It’s really important to take bold, brave action and Australia has taken a step early on.
“It will be important to listen, and keep moving fast and fixing things, and tweaking the regulations.”
And he is adamant it won’t be fixed as long as Mark Zuckerberg leads Meta.
“I’ve seen no evidence that he will take it in a different direction, while I’ve seen every evidence to the contrary,” he said.
“Presented with information and data that could make the product healthier and safer, Mark chooses not to.
“Fines and fees are just a cost of doing business. A $3 billion fine sounds scary, but they’ll do the math and decide it’s worth it. The law needs Great White-sized teeth.”
What would change things is the prospect of a Big Tech leader facing criminal liability and jail time.
“That would change the temperature pretty quick. You’d see a different tenor then,” he said, “but left on their own, the leadership won’t change.”
I asked whether Australia could really change anything given Big Tech is mainly American, and the White House has shown little appetite for similar change.
Kelly told me: “I’m hopeful that the US will enact duty-of-care legislation and repeal Section 230. It’s a matter of when.
“Tech uses money to drive influence with politicians. That can mean putting data centres in their home states.
“These forces were built brick by brick, so we’re going to have to pull them apart brick by brick.
“Australia’s step is critically important to start this conversation. It shows the harms are real enough for an entire continent to make the products unavailable to under-16s.
“I expect to see other jurisdictions follow quickly, but will the US be at the front of that pack? I sure hope so, but I don’t expect that.”
Brian added: “The US is a funny place, and we’ve got our issues right now, but we love our kids.
“There will be a continued push inside the United States to figure out ways to protect kids on these platforms.”
But he then echoed something I have been pushing for months, which is that Meta’s ongoing revenue growth relies not on the US but on emerging markets.
Brian said: “The vast majority of Facebook’s users aren’t in the US. And while the revenue is concentrated there, the global economy is on the verge of some pretty massive shifts over the next 10 to 20 years.
“You could see a very different financial set of incentives for Meta if the countries that benefit most from the current economic shifts start to take this seriously.”
Let’s dive into that.
Earnings:
Today, 38 per cent of Meta’s revenue comes from the US/Canada, as Brian highlighted, followed by Asia-Pacific (~27 per cent), Europe (~23 per cent) and the rest of the world (~11 per cent).
Growth:
Meta’s biggest markets (the US, Canada, Europe and Australia) are all laggards on growth, while emerging economies in Africa and South America, along with Indonesia and India, are spiking.
Plot Meta’s current revenue against those growth trajectories and it suggests, as Brian says, that its cash cow, the US, is looking pretty spent.
Interesting stuff, but that’s just the start.
If you cross-reference Meta’s fastest growing markets with the largest populations, a strategy begins to emerge.
Then cross reference which of those countries are predicted to have the fastest compound GDP growth through to 2030, and you get this.
It gets even saucier when you flag which of these are BRICS countries which are planning to dump the US dollar as their base currency.
And even then, there’s still one glaring hole.
China 🇨🇳, with its 1.4 billion population and compound 25.5 per cent GDP growth through to 2030.
Another Meta whistleblower Sarah Wynn-Williams caused a storm last year, telling a Senate inquiry that Meta banks an unreported $18.3 billion from America’s #1 enemy.
So… add all that together, give it a stir, and you get an idea what Meta’s growth nerds are poring over while slurping industrial-sized Dr Peppers down at 1 Meta Way.
The fact is that child welfare doesn’t compute in any of those equations, but Brian says we do have agency.
“It’s our choice as citizens of the globe and citizens of our nations and citizens of our states and localities. We’re told that we’re powerless in this, but we’re not,” he said.
“We need to recapture what it means to be a citizen and have a voice about the outcomes that we want in our lives and our society.
“And it’s up to us collectively to dictate terms and not say some visionary boy wonder in Silicon Valley is going to save the world because he’s not. It’s going to be us.
“I’m hopeful that we’re taking some really good, bold steps, and that we’re going to start to see some success in these areas. We can learn from Australia, and do it more. It’s up to us.”
Kelly added: “I want to paint a picture in your head.
“We’re talking about the difference between a billionaire-owned trillion-dollar company with enough resources to fly drones over Africa to capture a user base against a child alone in their bedroom with a device glowing in their face.
“And we’re asking that child and their parents to absorb the cost of the inaction of that billionaire and his trillion-dollar company.
“Keep that image in your mind because there’s a lot of money and a lot of really smart people who want us thinking about all this in all theoretical ways.
“But this isn’t a matter of theory. This is a matter of real kids dying and experiencing irreversible harms to their lives because of what’s in the best interest of a billionaire.”