You’re Not the Product — You’re the Target
Imagine if the building industry ran like the internet. Builders would pour slabs without soil tests, guess the reinforcement, and say: “We’ll monitor performance and patch defects later.”
By the time the cracks showed, they’d have moved on, and you’d get a voucher and an apology blog post. Thoughts and prayers. Move Fast & Break Things, Baby!
No one wants that house. We shouldn’t accept that web.
To be clear: the web isn’t “free.” It’s financed by your data, your attention, and a running profile that guesses what you’ll do next.
That glossy flat-UI app, “be your best self” productivity tool, or smart home plug? Lovely interface on the front, hungry data flowing out the back to all manner of partners, providers, and grubby data-hungry companies “looking out for you.”
(Source: *Norwegian Consumer Council – Out of Control: How Consumers Are Exploited by the Online Advertising Industry*)
Right now, a version of you exists online—a "digital you." This avatar is built from every click, every search, every location check-in, and every app you use. But the thing is: you don’t own this digital self. Corporations do.
They’ve built a multi-trillion-dollar industry on a simple, brutal business model: take your personal experience for free, package it as data, and sell it to the highest bidder. You are both the product and the customer, paying for “free” services with your privacy, your attention, and your money.
They dress it up in the slick paintwork of productivity and connection. "Be your best you!" the apps cheerfully promise, while quietly vacuuming up the raw material of your life to use against you.
Dynamic pricing means you might pay more for a flight or a hotel room because you’ve searched for it before. Your data can be used to determine your creditworthiness, your insurance premiums, and even the news you see.
This is a fundamental power imbalance that undermines our autonomy, our wallets, and our democracy. The current system is designed to be opaque, forcing us into "agreements" we didn’t read and couldn’t understand if we tried, all while hiding behind complex legal and jurisdictional shields.
But it doesn’t have to be this way. We’ve reached a tipping point. It's time to stop asking nicely for privacy and start demanding it as a fundamental right.
It's time for a global Digital Bill of Rights—a simple, powerful set of rules that returns control of our digital selves to us, where it always should have been.
In this post we argue for something cleaner and future-proof:
A Digital Bill of Rights (#DBoR)
that sets bright lines everyone can understand and regulators can actually enforce.
No lawyerly BS. No unfair contract terms that strip your right to repair or to seek recourse, then end with “yes, I agree”. No “click to accept or get lost.” Just a black-and-white checklist that says:
- Consent must be explicit. No trickery, no “opt out in 47 clicks”, no “send certified/registered post to an office between 10am and 2pm, while the moon phase is ebbing, on a leap year” style opt-outs.
- No tracking by default.
- Telemetry off unless you turn it on — and if companies profit from that data, you get a big cut (an 80/20 split in your favour is fair and reasonable; it is your data, after all).
- Plain language terms, not legal sudoku.
- No scraping from non-sanctioned sources (especially biometric identifiers, health data, personally identifiable information (PII), and gait-recognition data).
- Tax where users live — the value created from a local user base should directly fund public services within that community. This must include contributing to the costs of addressing harms these platforms cause, such as funding mental health services required to counter the effects of deliberate psychological manipulation designed into applications to 'create stickiness' and 'boost user engagement.'
- Real penalties that actually deter, including disgorging ill-gotten gains and director liability (jailtime with the homies) for repeat or wilful breaches.
- Without real consequences, there is no deterrence. Monetary fines are almost useless against corporations with annual turnovers greater than the GDP of many nation-states; they simply treat them as a cost of doing business. The only way to ensure genuine compliance is to hold individuals accountable: directors and executives must face the prospect of criminal liability, including jail time, for egregious violations of the law—just as any regular citizen would.
The Illusion of Convenience: How Our Data is Stolen and Sold
We’ve all heard the marketing: “This service is free!” But as the saying goes, if you’re not paying for the product, you are the product. This is the bargain we’ve all supposedly agreed to. But let’s be honest with each other: it’s not a bargain, and it was never agreed to. It’s a one-sided heist.
The business model is called surveillance capitalism, and its currency is you. Here’s how the con works:
- The Bait: A company offers an incredibly useful, shiny, and "free" tool—a social network, a productivity app, a smart home device. They sell you on convenience, connection, and efficiency.
- The Hook: Once you’re hooked on the service, the real extraction begins. Every click, every scroll, every second you linger on a post, every location you check into is meticulously logged. This isn’t just to improve your experience; it’s to build a frighteningly detailed blueprint of your behaviour, your preferences, your fears, and your desires.
- The Theft: This blueprint—your data—is then packaged up and sold in a hidden marketplace you never see. It’s traded between data brokers, advertisers, and other third parties. Your personal information, your patterns, your very personality become commodities to be auctioned off.
They don’t call it theft, of course. They use precise words like "data collection" and "personalised advertising." But let’s call it what it is. If someone took your personal diary, photocopied it, and sold the copies to strangers without your permission, you’d call it stealing. This is no different.
The most insidious part is the "productivity paintwork." Apps for managing your life, your health, your finances, and even your home often present themselves as altruistic tools to help you "be your best you." In reality, that quest for self-improvement is a goldmine of data.
Your fitness app knows your health vulnerabilities. Your budgeting app knows your financial anxieties. Your smart home IoT device knows your daily routine. This isn’t just used to show you ads; it can be used to manipulate you, to exploit your insecurities, and to charge you more for things you desperately need—a practice known as dynamic pricing.
🥜 The slick UI and promises of a better life are just the veneer on a system designed to take as much data from you as possible, all while making you thank and pay them for the privilege.
(Source: *The Washington Post – It’s not just what you tell your phone. It’s how you tell it.*)
The Weaponisation of Our Lives: From Data to Control
So, a company has a detailed profile of you. What’s the worst that could happen? It’s just for ads, right?
Wrong. This is where the abstract concept of "data collection" gets real—and frightening. Your data isn’t just sold; it’s weaponised. It’s used to manipulate your choices, discriminate against you, and maximise corporate profit at your direct expense. The goal is no longer just to predict your behaviour, but to shape and control it at scale.
Let’s break down exactly how your data is turned against you in the real world:
- Dynamic Discrimination: This is the polite term for getting charged more because of who you are, where you live, or what you look like. Algorithms analyse your data—your income level inferred from your postcode, your browsing history showing desperation (e.g., repeatedly searching for a specific medication or last-minute flight), even the type of device you use—to present you with a personalised price. That same hotel room or batch of timber for your renovation might cost you significantly more than the person next to you, simply because the algorithm knows you can, or will, pay more.
- The Erosion of Autonomy: The content you see online—your news feed, your search results, your video recommendations—is curated by AI trained on your data. Its goal isn’t to inform you; it’s to engage you. This creates a feedback loop that pushes you toward more extreme content, keeps you scrolling, and locks you inside a filter bubble that reinforces your existing biases. Your perception of reality is quietly, algorithmically, shaped for profit. Your choices aren’t as free as you think they are.
- The "Productivity" Veneer: This is perhaps the most cynical twist. Apps and platforms that promise to help you optimise your life, manage your construction project, or improve your health are often the worst offenders. They use the guise of self-improvement to gain access to our most sensitive data. That project management software knows your budget and deadlines. That fitness tracker knows your vulnerabilities. This data can be used to sell you more stuff, but it can also be sold to insurers, employers, or data brokers, potentially affecting your premiums or job prospects.
The mantra of "move fast and break things" has come to mean breaking us—our privacy, our trust, and our social fabric. The race for market dominance treats human beings as mere data points to be optimised for engagement and monetisation. The love of neighbour and community well-being is sacrificed at the altar of shareholder value.
They don’t want to help you be your best you; they want to help themselves to the most profitable version of you they can engineer.
Why "Asking for Forgiveness" is a Deliberate Strategy
You might be thinking, "Surely this is illegal?" or "Aren't there rules against this?"
The frustrating reality is that much of this data exploitation is technically "legal" because companies have written the rules themselves. They've built a system where getting caught is just a minor cost of doing business.
Their strategy isn't to ask for permission; it's to take what they want and beg for forgiveness later—if they're ever caught.
This isn't an accident; it's a carefully crafted business plan. Here's how they get away with it:
- The Illusion of Consent: When was the last time you truly read a Terms of Service or Privacy Policy? These documents are intentionally designed to be unreadable. They are long, filled with impenetrable legalese, and buried behind multiple clicks.
- A study estimated it would take the average person 76 work days to read all the privacy policies they encounter in a single year. This isn't about informed consent; it's about creating a legal shield. You've "agreed," even if you had no realistic way of understanding what that agreement meant.
- Forced Arbitration: Buried deep within those unread terms is one of the most insidious clauses: forced arbitration. This means that if a company violates your privacy, you cannot sue them in a public court of law. You are forced into a private, secretive arbitration system that overwhelmingly favours corporations.
- Even worse, these agreements often ban class-action lawsuits, preventing individuals from banding together to challenge a company's powerful legal team. It's a rigged game from the start.
- The Calculated Risk of "Paltry Fines": Let me be blunt: when a multi-billion dollar company is fined a few million dollars for a massive data breach, that's not a punishment; it's a business expense.
- Companies conduct cost-benefit analyses. The profit generated from exploiting user data vastly outweighs the occasional, minimal fine from regulators.
- It's like getting a $10 parking ticket for illegally parking in a spot that earns you $10,000. Until the fines are debilitatingly large and personally target executive accountability, this "ask for forgiveness later" model will continue. There must be real consequences, not just a perfunctory mea culpa (or "my bad").
This entire system is built on an imbalance of power.
They hide behind their market dominance and legal firepower, offering us the candy of convenience in exchange for our digital sovereignty. They know we feel powerless to fight back. They're betting on it.
(Source: *ACCC – Google to pay $60 million for misleading Australians about location data*)
The Unavoidable Need for a Digital Bill of Rights (DBoR)
Given the scale of the problem, it’s tempting to feel hopeless. What can one person do against a global system designed to exploit them? The answer is: we don’t fight as individuals. We fight as a society, by establishing new, non-negotiable rules of the game.
Awareness is our power.
Tinkering around the edges with slightly stricter privacy policies or voluntary industry codes is like using a band-aid to fix a broken leg. The model of surveillance capitalism itself is the disease, and a Global Digital Bill of Rights (DBoR) is the only cure strong enough to address it.
Here’s why this is not just a good idea, but an unavoidable necessity:
- Data Rights are Human Rights: In the 21st century, our digital and physical lives are inseparable. The right to privacy, to autonomy, to fair treatment, and to not be exploited are fundamental human rights. They don’t cease to exist the moment we go online. A DBoR is simply an update to our social contract for the digital age, formally recognising that our rights must be protected in the virtual world just as they are in the physical one.
- The Failure of Self-Regulation: For years, we’ve heard promises from tech giants that they can regulate themselves. The result has been a continuous stream of scandals, breaches, and ever-more-invasive practices. The profit motive is too powerful to expect them to voluntarily relinquish their primary revenue source. We need strong, external, and legally enforceable boundaries.
- A Global Problem Demands a Global Solution: The internet doesn't respect national borders. A company can harvest data from Australians on servers located in another country, under the jurisdiction of a third. Piecemeal national laws, like Australia’s Privacy Act, are easily circumvented. A DBoR must establish a unified, global standard that creates a level playing field and prevents companies from shopping for the most lenient regulatory environment.
- Simplicity Over Legal Complexity: The current system thrives in the shadows of complexity. A DBoR must be built on black-and-white, easy-to-understand principles. Either a company is in compliance, or it is in breach. There should be no room for protracted legal defences or loopholes. This clarity protects everyone—it gives companies unambiguous rules to follow and gives citizens and regulators a simple yardstick for enforcement.
Waiting any longer is not an option. Every day without these fundamental protections, more of our data is harvested, more profiles are built, and more power is concentrated in the hands of a few unaccountable corporations. A Digital Bill of Rights is our chance to hit the reset button and build a digital economy that serves people, not just profit.
Power doesn’t panic until it risks being taken away.
(Source: *Australian Human Rights Commission - Human Rights and Technology Final Report 2021*)
📜 What Must Be in a Digital Bill of Rights: The Non-Negotiable Principles
A Digital Bill of Rights (DBoR) shouldn't be another lengthy, unreadable legal document. It must be a clear, powerful, and simple set of rules that anyone can understand. It’s about replacing complexity with clarity and exploitation with fairness.
Here is what a real DBoR must include—the non-negotiable principles to reclaim our digital lives:
📜 A. The Principles of Control and Consent
- Explicit, Opt-In Permission: The era of assumed consent is over. The default for any data collection must be OFF. Companies cannot collect, use, or share any data—including background telemetry and diagnostics—without first obtaining our explicit, informed, and unambiguous opt-in consent. No more pre-ticked boxes or dark patterns (a minimal sketch of this principle follows this list).
- The Right to Understand: All terms must be in plain English. Legal documents must be short, readable, and use layered summaries (a simple headline, a short summary, full details) so a reasonable person can understand what they're agreeing to in under a minute. If it’s not understandable, it’s not enforceable.
- No More Manipulation: Dark patterns are banned. Consent must be freely given, without deceptive design tricks, nudges, or interfaces that confuse or pressure us into saying yes. Saying ‘no’ must be as easy as saying ‘yes’.
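To make “off by default, opt-in only” concrete, here is a minimal sketch in Python. The class and field names are hypothetical, invented for this illustration; the point is that every flag starts as False, changes only through an explicit, timestamped user action, and is exactly as easy to switch off as on.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of DBoR-style consent: everything defaults to OFF,
# and only an explicit, logged user action can turn anything on.

@dataclass
class ConsentSettings:
    analytics: bool = False          # no tracking by default
    telemetry: bool = False          # diagnostics off unless enabled
    personalised_ads: bool = False   # no pre-ticked boxes, ever
    log: list = field(default_factory=list)  # auditable consent history

    def _validate(self, purpose: str) -> None:
        if purpose == "log" or not hasattr(self, purpose):
            raise ValueError(f"Unknown consent purpose: {purpose!r}")

    def opt_in(self, purpose: str) -> None:
        """Explicit, informed, unambiguous: one named purpose per action."""
        self._validate(purpose)
        setattr(self, purpose, True)
        self.log.append((purpose, "opt_in", datetime.now(timezone.utc)))

    def opt_out(self, purpose: str) -> None:
        """Saying 'no' is exactly as easy as saying 'yes': one call."""
        self._validate(purpose)
        setattr(self, purpose, False)
        self.log.append((purpose, "opt_out", datetime.now(timezone.utc)))

settings = ConsentSettings()   # a brand-new user shares nothing at all
settings.opt_in("telemetry")   # only an explicit action changes that
settings.opt_out("telemetry")  # and reversing it is just as simple
```

The design choice that matters is the default: a user who does nothing shares nothing, and the log gives regulators an audit trail for every change.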
📜 B. The Principles of Transparency and Fairness
- Radical Transparency: Companies must clearly disclose all commercial relationships and exactly what data is collected before gaining permission. This includes all background telemetry and data collection, which must be disclosed and turned off by default.
- Data Minimisation: Companies can only collect data that is strictly necessary for a clearly stated purpose. They can’t just hoard everything on the chance it might be useful later. If they don’t need it for the core function of their service, they can’t take it.
- The User Dividend: If a company profits from our data after we’ve opted in, we get a share. The mechanism for this profit-sharing must be transparently disclosed upfront. Our data creates their wealth; it’s time for a fair share.
- No Secret Scraping: Scraping data from non-sanctioned sources is theft, full stop. This is especially critical for sensitive biometric and facial recognition data, which must be off-limits without direct, specific consent.
📜 C. The Principles of Ownership and Agency
- You Buy, You Own: When you purchase software, a digital movie, an ebook, or a smart device, you are buying a product, not licensing a temporary, revocable access pass. The "you will own nothing" model is unacceptable. Companies cannot lock essential functionality behind ongoing subscriptions, remotely disable features, or render your property useless. The right to repair, modify, and use what you've purchased indefinitely is fundamental.
- Your Data, Your Choice: We must have a real right to access, portability, and deletion. This means the ability to easily download all our data in a usable format, take it to a competitor, and have it completely and verifiably deleted—fast, free, and without hassle (a sketch of what this could look like follows this list).
- No Retaliation: Companies cannot punish us for exercising these rights. They cannot degrade service, charge fees, or deny access to core features if we choose to opt out of data sharing or modify the products we own. The choice must be truly free.
- The Right to Audit: For high-risk AI and data systems, companies must submit to independent, external audits. We have a right to know how these "black box" systems are making decisions that affect our lives, from loan applications to job recruitment.
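As a sketch of how lightweight “access, portability, and deletion” could be in practice, consider the illustration below, again with hypothetical names: one call exports everything held about a user in a machine-readable format, and one call deletes it and hands back a receipt that can be checked later.

```python
import hashlib
import json
from datetime import datetime, timezone

# Illustrative sketch of portability and verifiable deletion.
# Names are hypothetical, not drawn from any real platform's API.

class UserDataStore:
    def __init__(self) -> None:
        self._data: dict[str, dict] = {}

    def record(self, user_id: str, key: str, value) -> None:
        self._data.setdefault(user_id, {})[key] = value

    def export_all(self, user_id: str) -> str:
        """Portability: everything held on a user, in one portable document."""
        return json.dumps(self._data.get(user_id, {}), indent=2, default=str)

    def delete_all(self, user_id: str) -> dict:
        """Deletion with a receipt the user or a regulator can check later."""
        removed = self._data.pop(user_id, {})
        digest = hashlib.sha256(
            json.dumps(removed, sort_keys=True, default=str).encode()
        ).hexdigest()
        return {
            "user_id": user_id,
            "deleted_at": datetime.now(timezone.utc).isoformat(),
            "erased_digest": digest,          # fingerprint of what was erased
            "records_remaining": user_id in self._data,  # must be False
        }

store = UserDataStore()
store.record("u1", "searches", ["flights to Perth", "timber prices"])
print(store.export_all("u1"))   # take it to a competitor, free and fast
print(store.delete_all("u1"))   # gone, with proof
```

The receipt is the key: deletion you cannot verify is just a promise.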
📜 D. The Principles of Safety and Accountability
- Strict Boundaries: No cross-context tracking without fresh consent. Data collected for one purpose (e.g., your health data in a fitness app) cannot be used for another (e.g., targeted advertising) without asking you all over again in a clear and specific way (see the sketch after this list).
- Child-First Design: Services used by children must have stricter defaults and limitations on data use by design, prioritising their wellbeing over profit. Their data deserves the highest level of protection.
- Security by Default: Companies must implement strong baseline security controls and are obligated to disclose data breaches promptly and clearly so we can protect ourselves. You can’t collect our data without also being responsible for protecting it.
- Pay Your Way: If a company generates substantial revenue from users in a country, it must pay taxes in that country. Profits generated from Australians should contribute to Australian society, helping to fund the services and infrastructure that support our digital lives.
- Unavoidable Consequences: Breaches of these rights must result in prohibitive, automatic fines based on a company's global revenue—a significant percentage, not a paltry fee. This makes violation a catastrophic business decision, not a calculated risk. The wrist-slaps are over.
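One way to picture the cross-context rule at the top of this list is purpose binding: every record carries the purpose it was collected for, and any attempt to use it for a different purpose fails unless fresh, specific consent exists. Here is a minimal Python sketch, with hypothetical names:

```python
# Hypothetical sketch of purpose binding: data carries the purpose it was
# collected for, and any other use requires fresh, specific consent.

class ConsentRequired(Exception):
    """Raised when data is requested for a purpose the user never approved."""

class PurposeBoundStore:
    def __init__(self) -> None:
        self._records: dict[str, object] = {}
        self._consents: set[tuple[str, str]] = set()  # (key, purpose) pairs

    def collect(self, key: str, value, purpose: str) -> None:
        """Store a record together with the one purpose it was gathered for."""
        self._records[key] = value
        self._consents.add((key, purpose))

    def grant(self, key: str, purpose: str) -> None:
        """Fresh consent is narrow: one record, one newly named purpose."""
        self._consents.add((key, purpose))

    def read(self, key: str, purpose: str):
        """Refuse access unless this exact (record, purpose) pair is consented."""
        if (key, purpose) not in self._consents:
            raise ConsentRequired(f"No consent to use {key!r} for {purpose!r}")
        return self._records[key]

store = PurposeBoundStore()
store.collect("resting_heart_rate", 62, purpose="fitness_insights")
store.read("resting_heart_rate", purpose="fitness_insights")  # permitted
try:
    store.read("resting_heart_rate", purpose="targeted_advertising")
except ConsentRequired as err:
    print(err)  # blocked until the user is asked again, clearly and specifically
```

Consent here is deliberately narrow: granting one new purpose for one record, never a blanket waiver.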
👉️ These principles are simple, enforceable, and designed to restore the balance of power.
Our rights cannot merely be implied; they must be enforced.
The Ripple Effect: How a DBoR Would Repair Our Social Fabric
It’s easy to see a Digital Bill of Rights as a set of restrictions—a list of "thou shalt nots" for tech companies. But that misses the bigger picture entirely. This isn’t about limiting innovation; it’s about steering it toward humanity.
Implementing these principles wouldn’t just protect our data; it would begin to repair the widespread damage caused by surveillance capitalism and create a healthier, more trustworthy digital world for everyone.
Let’s think about the ripple effects of swiftly introducing a DBoR:
- Restoring Trust in Technology: Right now, we engage with digital tools with an underlying sense of suspicion. A DBoR would change that. When you know an app is legally bound to simple, ethical rules, you can actually trust it. This trust is the foundation for technology that truly serves us, rather than exploiting us. Innovation would flourish around creating genuine value rather than perfecting new ways to mine our attention.
- Creating a More Equitable Digital Economy: The principle of the "User Dividend" fundamentally changes the economic model. Instead of a few corporations capturing all the value created by our data, that value would be shared. This could lead to fascinating new possibilities—imagine a future where your data earnings help pay your internet bill or contribute to your superannuation. The digital economy would become a partnership, not an extraction operation.
- Protecting Democracy and Social Cohesion: When algorithms are no longer optimised solely for engagement (which often means outrage), our public discourse can begin to heal. With transparency and auditing rights, we can reduce the spread of misinformation and the manipulation of our political and social choices. A society less fractured by filter bubbles and viral falsehoods is a stronger society.
- Fostering Real Innovation: For too long, "innovation" in tech has meant finding smarter ways to track users. A DBoR would force a paradigm shift. The competitive advantage would no longer be "who has the most data," but "who can build the best, most private, and most respectful product." This would unleash a wave of creativity focused on solving real human problems, not on creating addictive features.
Think of it like the introduction of building codes and quality standards in construction. They didn’t kill the industry; they made it safer, more reliable, and more professional.
They ensured that homes were built on solid foundations for the long term. A Digital Bill of Rights does the same for our digital landscape. It lays down a solid, ethical foundation upon which we can build a future that doesn’t just move fast and break things, but that moves wisely and builds things that last.
Consent is non-negotiable. You cannot take first and ask permission later. Permission comes first.
Conclusion: This is Our Last Chance to Draw the Line
We stand at a crossroads. Down one path lies the continued, unchecked rise of surveillance capitalism, powered by "AI"—a world where our digital selves are owned, sold, and weaponised against us, eroding our autonomy, our wallets, and the very fabric of our society.
Down the other lies a future where technology serves humanity, not the other way around.
The choice isn't just about your privacy either. It's about what kind of world we want to live in and what kind of digital world we will leave for our children. Will it be one governed by opaque algorithms and corporate greed, techno feudalism or by transparent rules and fundamental human rights?
The argument that we must sacrifice our privacy for convenience or innovation is a false one.
We can have incredible, powerful technology that also respects us. We can have connected lives without being constantly monitored. The notion that we must choose is a lie sold to us by those who profit from the exploitation.
The need for a global Digital Bill of Rights is not a far-fetched idea; it is a necessary one. It is the foundational framework that will allow us to harness the good of technology while protecting ourselves from its harms. It is the update our social contract desperately needs for the 21st century.
This will not happen by itself. The companies benefiting from the current system will not volunteer to change it. Change will only come when we, as citizens, consumers, and neighbours, demand it loudly and unequivocally.
So, what can you do?
- 👉️ Demand More: Ask companies for transparent data practices. Choose tools and services that respect your privacy. Exercise the rights you already have.
- 👉️ Support Advocates: Get behind organisations like Digital Rights Watch Australia who are fighting this battle on our behalf.
- 👉️ Talk About It: Have conversations with friends and family. Make digital rights a topic around the dinner table, not just in tech blogs.
- 👉️ Lobby Your Representatives: Write to your local MP. Tell them that this matters to you, their constituent. Ask them what they are doing to strengthen Australia's privacy laws and advocate for global cooperation.
We have a chance to redeem the promise of the digital age—to build a world that is more connected, more efficient, and more equitable. But first, we must build it on a foundation of trust, transparency, and rights.
We still have a choice. We know life is demanding and your attention is pulled in a dozen different directions. It’s easy to feel this is a problem for someone else to solve. But this is one fight we cannot afford to ignore. The outcome will define the digital world we leave behind—not just for us, but for our children and for generations to come. The time to secure their future is now.
Let us choose a different direction. Your voice, your privacy, and your very agency are more valuable than you know. They are the digital gold that fuels the trillion-dollar empires of surveillance capitalism.
👉️ Remember: without your data and your engagement, these trillion-dollar empires simply do not exist. That is your power.
Power doesn’t panic until it risks being taken away.
The choice is ours. United, we can fight for our rights and build a digital world that serves humanity. Divided, we risk losing it altogether. The time to act is now.
Frequently Asked Questions (FAQ)
1. What exactly is a Digital Bill of Rights?
It's a proposed set of fundamental, non-negotiable rules that would govern how companies can collect, use, and profit from our personal data. Think of it as a universal charter of digital human rights designed to put control back in the hands of individuals, not corporations.
2. Isn't this what laws like GDPR already do?
Laws like the EU's GDPR are a significant step in the right direction, but they are often complex, region-specific, and can still be circumvented by corporate legal teams. A true Digital Bill of Rights aims to be simpler, global in its application, and with much stronger, unavoidable enforcement mechanisms to prevent loopholes.
3. Won't this break the internet and make all my favourite apps paid?
Not necessarily. It would shift the business model from hidden data extraction to transparent value exchange. While some services might introduce ethical subscription models, others could be funded by advertising that you explicitly opt into, perhaps even sharing in the revenue. The goal is to create choice and fairness, not to eliminate free access.
4. How can a law possibly be enforced on a global scale?
The key is in the stringent, automatic financial penalties that apply to a company's global revenue. By making the cost of non-compliance catastrophically high—a significant percentage of worldwide income—it becomes financially irrational for any multinational corporation to ignore the rules, regardless of where they are headquartered.
5. What's the difference between this and a website's "Terms of Service"?
Terms of Service are written by companies to protect themselves. A Digital Bill of Rights would be written to protect people, establishing a floor of basic rights that no Terms of Service could override or undermine.
6. I'm not very tech-savvy. How will this help me?
It’s designed precisely for you. By establishing simple, black-and-white rules and banning deceptive design ("dark patterns"), it would make your digital life simpler and safer. Privacy would be the default setting, and any requests for your data would be in clear, plain English.
7. Does this affect more than just social media?
Absolutely. This impacts every corner of your digital life: the apps on your phone, your internet browser, your smart home devices, your online banking, the software used by your builder or accountant, and even the loyalty programs at your local supermarket.
8. What about the argument that this will stifle innovation?
This framework doesn't stifle innovation; it redirects it. Instead of rewarding companies for inventing new ways to track users, it rewards them for inventing products and services that people genuinely want and trust. True innovation shouldn't require the violation of privacy.
Further Reading
This post wouldn't be complete without looking closer to home. A whole industry has sprung up to serve the real estate sector, offering instant access to vast databases of personal information for a monthly fee.
Search for "accurate and fast service that offers the ability to search and obtain data necessary for real estate prospecting" and take a look for yourself.
This raises critical questions that go to the heart of our digital rights: Where did this data come from? And when did you ever give explicit consent for your personally identifiable information (PII) to be collected, stored, and sold in this way?
Many of these platforms do offer an opt-out or 'Remove from Database' function—a tacit admission that they hold your data without your direct permission. But this places the entire burden of protection on the individual. The more profound question remains:
Why is your private information on a database to be sold to anyone who pays the access fee in the first place?
This model illustrates the very problem a Digital Bill of Rights seeks to solve: the presumption of access and ownership by corporations over our personal data, with consent only being an afterthought.
