🗂️ About This Series

This is the first post in our six-part series on how personal data has been weaponised—from Cambridge Analytica to the AI-driven surveillance systems quietly profiling you today.

We’ll explore how these tools work, who’s using them, what they’re doing with your data—and what you can do to push back.

Let’s start at the beginning, with the case that blew the lid off: Cambridge Analytica.

⚠️
Not Our Usual Topic—But Please Read On
We know this isn’t a post about residential construction or the building industry. And yes—we’ve got plenty more of those coming soon.

But we also feel it’s important to pause and talk about something bigger. Because the way technology—especially AI, machine learning, and big data—is being used today will shape how we live, work, and plan for the future.

This series isn’t a rant. It’s a reality check. A look back at how companies have already used personal data in deeply manipulative ways—and a warning about how those same patterns are being dressed up with new names and slicker branding.

Personally, I think AI is great. Some of the tools out there are genuinely incredible. But like the tech boom of the early 2000s, there’s also a mountain of hype, smoke, and self-serving noise—mostly coming from the same Silicon Valley firms that love to sell visions more than they deliver value.

Take Roblox, for example. Built as a virtual playground, it’s now facing lawsuits and international backlash. Critics accuse the company of prioritising user growth, engagement, and profits over the protection of children, even as vulnerabilities have emerged—predators using voice-altering software, simulated sexual content, and more.

Roblox isn’t alone—but it’s emblematic. Platforms thriving on engagement-at-all-cost create incentives that can drown out user safety.

Silicon Valley used to run on a pay-it-forward culture. These days, it’s more about pay-to-play. That shift matters—and it’s worth paying attention to.
Your device is always listening to you
Yep, your mum was right all along. Check the link below to read and see why.

Introduction

You probably suspect your phone is listening. (BTW: it is, and that's recently been proven; see the "Active Listening" pitch deck linked in Further Reading below.)

But what if every single interaction—from customer calls to social media use—is logged, analysed, and added to a robust AI-assembled profile?

Today, these profiles are pulled together using our unique device IDs, cross-referenced and networked across apps and services.
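To make that concrete, here's a minimal, invented sketch (in Python) of how the joining works in principle. Everything in it, the events, field names, and sources, is hypothetical; the point is simply that a shared device ID acts as the join key.

```python
# Illustrative toy only: how interactions from unrelated sources can be
# stitched into one profile once they share a device/advertising ID.
from collections import defaultdict

# Hypothetical event streams, each tagged with the same device ID.
events = [
    {"device_id": "ad-1f3c", "source": "shopping_app", "signal": "searched: home loans"},
    {"device_id": "ad-1f3c", "source": "social_media", "signal": "liked: renovation pages"},
    {"device_id": "ad-1f3c", "source": "call_centre", "signal": "asked about refinancing"},
]

profiles = defaultdict(list)
for event in events:
    # The shared ID is the join key: three "separate" interactions
    # collapse into a single behavioural profile.
    profiles[event["device_id"]].append((event["source"], event["signal"]))

for device_id, signals in profiles.items():
    print(device_id, "->", signals)
```

Real identity-resolution systems do this at vastly larger scale, and add probabilistic matching (IP address, location, browser fingerprints) when no hard ID is shared.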

This isn't Minority Report either; it's a reality already happening. Remember Cambridge Analytica, the political consultancy that harvested data from up to 87 million Facebook users and turned it into predictive personality scores?

Critics and whistleblowers called it a psychological weapon, using personal data to sway elections. That scandal shocked the world, yet it barely scratched the surface of what today's AI tools can do.

But emerging technologies and heightened awareness give us a chance to push back. We can learn from the past to protect the future: by adopting privacy-preserving tools, demanding the right not to be profiled, and pushing for stronger regulation. The question is: will we act before the weapon becomes too precise to resist?

Cambridge Analytica as a “Data WMD”

So WTF exactly happened?

In the early 2010s, a Facebook app named “This Is Your Digital Life” harvested data not just from people who downloaded it, but from their friends too—without their knowledge. That expanded the dataset immensely.

Facebook itself later estimated that data on up to 87 million users was gathered, including their likes, friend networks, and interests.

Using that data, the firm Cambridge Analytica created detailed personality profiles, called psychographic profiles, based on the Big Five traits (openness, conscientiousness, extraversion, agreeableness, and neuroticism). Instead of grouping people by age or location (demographics), they segmented by behaviours and psychological traits. That's what makes it powerful, and dangerous.

🧠
A psychographic profile is a way to describe people not by where they live or how much money they make, but by their personality, beliefs, values, and lifestyle. Think of it as a profile built on why you act the way you do—your opinions, daily routines, interests, and even fears.

Unlike demographics, which capture obvious facts such as age or location, psychographics digs deeper into your mindset and motivations. Marketers and firms collect this information through surveys, social media behaviour, and digital footprints.

When combined with lots of user data (like Facebook likes or browsing habits), companies like Cambridge Analytica used psychographic profiles to predict, and even try to influence, the way you think or vote. This is why critics labelled it a real-life "data weapon".
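To show the idea (and only the idea), here's a toy sketch of like-based trait scoring. The pages, weights, and scores below are invented for illustration; the real models were statistical, trained on millions of like/trait correlations.

```python
# Toy psychographic scoring: map page likes to Big Five trait scores.
# The weights here are made up; real systems learned them from data.
LIKE_TRAIT_WEIGHTS = {
    "philosophy page": {"openness": 0.8},
    "extreme sports page": {"extraversion": 0.6, "neuroticism": -0.2},
    "planner app page": {"conscientiousness": 0.7},
}

TRAITS = ["openness", "conscientiousness", "extraversion", "agreeableness", "neuroticism"]

def score_user(likes: list[str]) -> dict[str, float]:
    """Accumulate Big Five scores from a user's page likes."""
    scores = {trait: 0.0 for trait in TRAITS}
    for like in likes:
        for trait, weight in LIKE_TRAIT_WEIGHTS.get(like, {}).items():
            scores[trait] += weight
    return scores

print(score_user(["philosophy page", "planner app page"]))
# -> openness 0.8, conscientiousness 0.7, everything else 0.0
```

Scale that up with a trained model and enough likes, and you can estimate traits for nearly anyone on the platform, no survey required.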

Why call it a “Data WMD”?

The whistleblower Christopher Wylie and others labelled this approach psychological warfare. By knowing someone's personality, Cambridge Analytica (CA) could craft personalised messages designed to emotionally influence and sway decisions, especially during the 2016 U.S. election and the Brexit campaign.

In Wylie's words, they could determine "what kind of messaging you'd be susceptible to" and push you toward a specific outcome.
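That "susceptibility" step is easy to picture too. Below is a hypothetical sketch: given a trait profile like the one scored above, pick the message framing the person is predicted to respond to most. The ad copy and trait keys are invented, not anything from CA's actual systems.

```python
# Toy targeting step: choose the ad framing that matches the
# person's strongest predicted trait. All variants are invented.
MESSAGE_VARIANTS = {
    "neuroticism": "Fear-framed ad: 'Don't let them take what's yours.'",
    "openness": "Novelty-framed ad: 'Be part of something new.'",
    "conscientiousness": "Duty-framed ad: 'Do your homework, then vote.'",
}

def pick_message(scores: dict[str, float]) -> str:
    # Target the highest-scoring trait among those we have copy for.
    dominant = max(MESSAGE_VARIANTS, key=lambda trait: scores.get(trait, 0.0))
    return MESSAGE_VARIANTS[dominant]

print(pick_message({"neuroticism": 0.9, "openness": 0.3}))
# -> the fear-framed variant
```

Two people can see two entirely different "truths" about the same candidate, each tuned to their psychological pressure points.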

So, what were the consequences?

Cambridge Analytica didn’t survive the fallout. After global outrage and mounting legal pressure, the company folded in May 2018, filing for bankruptcy. It was a textbook example of a business built on shady data practices finally crashing under the weight of public scrutiny.

Facebook—now rebranded as Meta—didn’t escape unscathed either. It faced a $5 billion fine from the U.S. Federal Trade Commission, which, at the time, was one of the largest penalties ever handed down for privacy violations. On top of that, it settled a separate class action with U.S. users for $725 million. Other regulators in places like the UK and Australia also imposed penalties for the company’s failure to protect user data.

And it's not entirely in the rear-view mirror either. In 2025, a court in Washington, D.C., revived a long-running lawsuit accusing Meta of misleading consumers in relation to the Cambridge Analytica scandal, proof that data misuse has a long tail.

🧠 What this case teaches us

The biggest takeaway from the Cambridge Analytica scandal? Personal data is incredibly powerful—and dangerous—when mixed with behavioural science.

Cambridge Analytica might be gone, but the playbook they used hasn't disappeared. In fact, it's evolved. The techniques they helped expose, like psychographic targeting, once relied on manual analysis; now AI can run them automatically, not just at scale but in real time.

What’s more unsettling is that many of the same people behind the original operation didn’t fade away. They resurfaced in new companies, under new names, continuing the same kind of work. The machinery didn’t stop—it just got re-branded and kept moving.

Darth Maul, The Phantom Menace
Still around, but with a new appearance.

Conclusion

The story of Cambridge Analytica isn't history; it's still with us, just with a new appearance. The same tactics are in use today: harder to detect, faster to deploy, and powered by AI instead of spreadsheets.

If we learnt anything from this scandal, it’s this: your data is valuable because it reveals who you are—and that makes it powerful.

In the next post in the series, we’ll look at what happened after Cambridge Analytica collapsed—and how the same people and playbook just moved on under new names.


Further Reading + Sources + References

Here’s the Pitch Deck for ‘Active Listening’ Ad Targeting
404 Media previously reported Cox Media Group (CMG) was advertising a service that claimed to target ads based on what potential customers said near device microphones. Now, here is the pitch deck CMG sent to prospective companies. Google has kicked CMG off its Partner Program in response.
Cambridge Analytica Scandal: How Investigative Reporting Exposed Data Misuse
Explore how investigative journalism uncovered the Cambridge Analytica scandal, exposing Facebook data misuse and sparking global privacy debates. Learn about its impact on tech regulation.
Facebook–Cambridge Analytica data scandal - Wikipedia
Landmark settlement of $50m from Meta for Australian users impacted by Cambridge Analytica incident
The Australian Information Commissioner has agreed to a $50 million payment program as part of an enforceable undertaking received from Meta to settle civil penalty proceedings.
USA: Families sue Character.AI over alleged role in encouraging violence, self-harm and sexual content - Business & Human Rights Resource Centre
Check out this page via the Business and Human Rights Resource Centre
Putting the ‘Capitalism’ in ‘Surveillance Capitalism’
In the 21st century, every life experience is a monetizable data point.
How Big Five personality traits influence information sharing on social media: A meta-analysis - PMC
Research interest in information sharing behavior on social media has significantly increased over the past decade. However, empirical studies on the relationship between Big Five personality traits and information sharing behavior have yielded…

Ditch Microsoft & Apple: Come to the Dark Side - Try MX Linux + Save A Packet
Fed up with Microsoft and Apple subscription fees? Embrace MX Linux: subscription-free, lightning-fast, and refreshingly easy. Revive your previous-generation computer with zero hassle. The dark side is peaceful here—no sharks allowed.
GrapheneOS: Secure Your Data and Privacy on Your Smartphone
Tired of big tech companies exploiting your personal data? GrapheneOS offers a secure, privacy-focused alternative to mainstream operating systems. In this guide, we explore what GrapheneOS is, why you should consider using it, and how to install it. Take control of your digital life today.
Secure Your Data with Cryptomator: Easy NSA-Grade Protection
Discover how Cryptomator provides NSA-grade security for your personal data. This user-friendly encryption tool ensures your files are protected from unauthorized access. Learn the importance of data security and how to easily install and use Cryptomator on various devices.
BorgBase & Vorta: Secure, Budget-Friendly Data Backup
Learn why you need a solid data-loss prevention plan, and how BorgBase + Vorta deliver encrypted, cost-effective backups with easy setup across Mac, Linux and more for under about $120 AUD/year (non-sponsored post).

✅ Want to learn how to update your privacy settings?

Naomi Brockwell’s NBTV YouTube channel features great explainers that are clear and easy to understand.

Naomi Brockwell TV