The Arcology Garden

How to Destroy Surveillance Capitalism - Cory Doctorow



Cory Doctorow pens a critique of The Age of Surveillance Capitalism as a 70-page essay-cum-book, and I tend to agree with most of it; I find Zuboff's arguments a bit too "tech magical thinking" to gel with my experience of machine learning and the risks of big data. I haven't read the critiqued book, though it's been on my list; Doctorow builds on it a lot by expressing the limitations of the Surveillance Capitalism ideas. Maybe next. I am slowly engaging with this text, thinking with it and integrating my opinions into it. Most of these opinions build on my wider thinking about Moving Towards Real Privacy, so it's a little difficult to read this critically and feel unbiased.

I'm taking annotations using my Kobo Aura One; my notes will be in the KoboView doc and in remarkablenotes.

I think Cory's arguments are pretty sound, but I think it's going to be really hard to motivate people through the fear and reality of a collapsing capitalist order, especially at a time when even "socialism" is a four-letter word that gets people to show up threatening you with guns and violence. But I think this is really what all this clamoring around Big Tech is about: undoing 50 years of technical debt in our government. It's the very same strain that is at the root of Donald Trump's rise to power, and of the outpouring of fascism and normalization of conspiracy: the complete corporatisation of our nation and the global economy. We blame tech, but I think it's fair to say that the problem with Big Tech is much more "big" than "tech". For this reason, I support small decentralized systems with localized moderation and control, and I support free software and open protocols, as I always have. I say a lot that Uber Turned Me in to an Anti-Capitalist, but I've always felt it, only rarely seen it: this shit is a house of cards1.

Is Surveillance Capitalism the match or the fuel behind the conflagration of Big Tech?

Doctorow begins by presenting the arguments in 'The Age of Surveillance Capitalism' as falling into a trap of "tech exceptionalism": giving outsized credit to the "mind-altering" capabilities of ML-charged marketing and design, unlike any other abusive commercial practice in history. Zuboff calls it "a rogue capitalism". I largely believe that the effects of AI in The Economy are overstated,2 and I think ascribing mystical qualities to machine learning does a disservice to technology, society, and the truth. Rather than focusing on galaxy-brain super powers, I think that modern development and design strategies need to be radically re-thought with an eye on transparency and trust. The advertising economy increasingly seems to be a self-inflated bubble, with considerable effort going in to obscuring the source of the air, and its quality.3 Doctorow asks us to look past the marketing material on the Amazon Web Services page, or the facetious claims in a patent, and to agitate for enough transparency to be able to truly do this.

So what, then, is Doctorow's root belief, if not the idea that machine learning strips us of our cognitive free will? Monopolistic control over commerce and communications is the bubble inflating the entire concept of "a tech industry", of this handful of outsized winners, and of upstart outsized winners A.K.A. startups4. Doctorow believes that we're being mystified by sales literature that may not be a reliable indicator of the product's real efficacy.

How does Surveillance Capitalism work?

Doctorow summarizes Big Tech Surveillance Capitalism as:

The Age of Surveillance Capitalism and the general sentiment around Big Tech are largely concerned with the last one: the Artificial Intelligence that is going to take our jobs and turn us into Matrix batteries. The science on this stuff is sort of out, though; largely it doesn't seem super effective. Human minds are pliable, but not infinitely so; we adapt. The abuse of our dopamine receptors is not infinite either, and so they have to constantly work to change and develop new tactics. Consider how agitated everyone is after a web-application change is rolled out: new Reddit, new Twitter, new Facebook always feel worse to use and agitate you for the first few weeks5, but eventually your brain settles down and you are forced to get used to it. Ironically, one of Big Tech's biggest "competitors", the thing eating most of its margins, is our own brain's unwillingness to be bored.

All of that behavior modification is ultimately leveraged towards increasing advertising revenue, the ad revenue starting as kindling for an M&A bonfire, because advertising is far less effective than people intuit. "Tripling the rate at which someone buys a widget sounds great unless the base rate is way less than 1% with an improved rate of … still less than 1%". Google records massive ad-revenue numbers by virtue of being at the top of a Ponzi scheme of middle-men, publishers, and interested consumers, all trying their damnedest to push responsibility to another party and largely succeeding. When objectionable advertisements appear on the web, the ad-tech industry's makeup makes it nearly impossible for a publisher to have any accountability for, say, a Nazi ad; it's some problem upstream that they'll report and wipe their hands of. Google will tell the middle-man vendor that the ad should be banned, and it'll fall into an ether of unaccountable third parties until the next time, or until some advertising manager calls up yelling. By diluting responsibility like this, we allow socially objectionable content to flourish in a way that we never could if it were a billboard advertisement or an appliance mailer.
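The arithmetic behind that "tripling" quote is worth making concrete. A minimal sketch, where the 0.1% base rate is my own illustrative assumption, not a figure from the essay:

```python
# Ad "lift" sounds impressive in relative terms but is tiny in absolute ones.
# The 0.1% base rate below is an assumed figure for illustration only.
audience = 1_000_000           # people shown the ad
base_rate = 0.001              # 0.1% would buy the widget anyway
tripled_rate = base_rate * 3   # "tripling the rate" -- still only 0.3%

baseline_buyers = int(audience * base_rate)
targeted_buyers = int(audience * tripled_rate)

print(f"baseline buyers:  {baseline_buyers}")
print(f"'tripled' buyers: {targeted_buyers} ({tripled_rate:.1%} of the audience)")
```

A 200% relative improvement still leaves 99.7% of the audience unmoved, which is the gap between the sales literature and the product.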

The idea that surveillance capitalism has nearly succeeded in building machine learning which can effectively and indefinitely modify behavior is an unfalsifiable hypothesis: If data gives a tech company even a tiny improvement in behavior prediction and modification, the company declares that it has taken the first step toward global domination with no end in sight. If a company fails to attain any improvements from gathering and analyzing data, it declares success to be just around the corner, attainable once more data is in hand.

Domination of markets is fundamentally about giving targets no escape route. Companies pursue data surveillance to meet the ends of monopolistic domination, and evaluating the tactics simply on their effectiveness, Cory reasons that monopolistic behavior is a far more important and broader issue than surveillance capitalism: if we broke the back of the surveillance capitalists and broke their companies apart, many pieces would wither on the vine, having been unhooked from the hose of capital and fresh ideas and fresh eyeballs that these behemoths rely on. While we can argue at length about the ultimate upside of it, the fact is that Google, Facebook, Amazon, and the like all had just a few "big world-changing ideas" which they leveraged into a big pile of money until the margins started to collapse, at which point they went out and copied or bought up the startups that were chewing their margins, slowly building the ornate, beautiful walled gardens. Google did not invent Android; Amazon did not invent Ring. They bought them and then plugged them into a corporate institution designed to slowly pull tendrils across the web. Google is well-known for developing or acquiring-and-expanding web services and then killing them off with little to no warning6, and in fact, I would argue that Google's execution is often disorganized and poor. One has only to look at the history of their messenger platforms to see a company that is too big and too intertwined to be able to execute as well as their quarterly reports would imply7. Cory calls this allowed inefficiency a "diseconomy of scale."

Ultimately, Cory thinks, Facebook has only a few ways to truly "modify our behavior": it can lock in all of our friends and family to one place to find out what they are up to, and it can make us angry and anxious. I don't think this is quite a rigorous argument; I think that the "behaviorists" at Facebook and the like also explicitly design to maximise time in the app (partly through anger and anxiety), and this does alter our behavior. In COVID-19 times we started calling it "doomscrolling": being trapped in the ludic loop of infinite scroll and automatic refresh, towards the end of serving advertisements, away from the self-actualization of getting to the work that needs to be done. But this lock-in is the meat of the thing. Big Tech has lobbied hard and for decades for stronger copyright rules, for legalised lock-down8, and for the sort of legislative rules that only large companies can afford to comply with.

If we’re worried about giant companies subverting markets by stripping consumers of their ability to make free choices, then vigorous antitrust enforcement seems like an excellent remedy. If we’d denied Google the right to effect its many mergers, we would also have probably denied it its total search dominance.

Technical and legal countermeasures like the DRM locks enforced by DMCA Section 1201 exist because companies know that consumers would not voluntarily submit to their terms, so they do everything they can to trick users and deprive them of the right to refuse. Despite nominally relying on concepts like "informed consent", there are legal teeth behind their efforts to claw back control: the data processor doesn't consent to your tinkering, and they will get the FBI to throw you in a box. If they violate their own rules, you might get a minor settlement in a confidential binding arbitration which they've stripped you of the right to opt out of. These are new legal inventions for lock-in, far more effective than any technical endeavor or mind-control ray.

Why does this keep happening?

Big Tech is

Doctorow talked about the first two issues; the legislative situation is obviously evolving. No one is sure whether the General Data Protection Regulation or other local legislation will cause those economies to implode, or whether it's the natural progression of enlightenment values. While the US doesn't have any comprehensive modern data-rights legislation, California and the few other states attempting to shore this up themselves find themselves fighting a problem where their constituency's rights can be effectively arbitraged by the data economy that the rest of the world's laissez-faire situation props up. Doctorow believes a major factor in this is the support of State Surveillance, and I am apt to agree:

Monopolism is key to the project of mass state surveillance. It’s true that smaller tech firms are apt to be less well-defended than Big Tech, whose security experts are drawn from the tops of their field and who are given enormous resources to secure and monitor their systems against intruders. But smaller firms also have less to protect: fewer users whose data is more fragmented across more systems and have to be suborned one at a time by state actors.

A proliferation of closed devices that aren't secure, and that we are legally prohibited from inspecting, sounds like the perfect tool for state surveillance. Of course, it's the perfect tool for Russia, China, the Five Eyes, and any other states that want to surveil either broadly or narrowly. The roots and effectiveness of state surveillance are worth exploring and pondering on their own, but consider that a system which is 99% accurate will falsely identify 9,999 people out of 1 million, and it is exponentially more expensive to increase "the number of nines"9 to decrease that false-positive rate. The US government will very loudly and proudly claim to care about the security of all networks and systems, not just critical ones, but their treatment of those networks and devices as pawns and tools should be considered, and it's impossible to ignore Edward Snowden's historic releases of information on the US surveillance infrastructure.
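This is the classic base-rate fallacy, and the numbers above can be checked directly. A quick sketch, assuming (as the text implies) one real target per million people and a 1% false-positive rate:

```python
# The base-rate fallacy: a "99% accurate" screen applied to a huge,
# mostly-innocent population still floods analysts with false alarms.
# Assumed: one real target in a million; 1% of innocents wrongly flagged.
population = 1_000_000
true_targets = 1
false_positive_rate = 0.01   # the complement of 99% accuracy

innocents = population - true_targets
false_positives = int(innocents * false_positive_rate)

print(f"false positives: {false_positives:,} out of {population:,} screened")
```

Each added "nine" of accuracy (99.9%, 99.99%, …) only divides that pile of false alarms by ten, while the cost of getting there climbs much faster.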

Without Palantir, Amazon, Google, and other major tech contractors, U.S. cops would not be able to spy on Black people, ICE would not be able to manage the caging of children at the U.S. border, and state welfare systems would not be able to purge their rolls by dressing up cruelty as empiricism and claiming that poor and vulnerable people are ineligible for assistance.

Big Tech being centralized to a dozen companies means that the leaders of all those companies can meet privately, build consensus and positions that they can all feel are reasonable, and meet with world leaders in the process. When I worked at Uber, our CEO Dara was very, very proud of himself and the company for being invited to meet with world leaders at the World Economic Forum in Davos; the look in his eyes when he talked about the experience was one of joy, and of a recognition of his surreal place in the world. Tech and finance are slowly, steadily being intertwined as Big Tech companies become investment firms, and the 'Small Tech' firms all jostle elbows to get under the bloodstained umbrella of Masayoshi Son's SoftBank Vision Fund10.

The Stasi had to recruit one in every sixty citizens of East Germany to spy on their neighbors in an effort to have total control over information11; now that ratio is maybe one in ten thousand, thanks to the advances brought about by Big Tech in concert with governments. And no one can even point to any acts of terror stopped by the panopticon, nor is it clear that the data police get from Ring, despite it being a global off-the-books extralegal surveillance network, is actually making neighborhoods safer.

How do they get away with it?

Doctorow believes that it's critical that we break up Big Tech in the same way we broke up telecom and oil and steel: a literal reading of the Sherman Act. We're deathly afraid of the economic and political fallout of fracturing these systems, terrified that it'll let hostile political activity, or the rise of fascism, go unnoticed. But it already has: the centralized system insists that the only way to keep us safe is to build tall walls and raise the ladder behind them, and asks us to ignore the fact that there are already enemies within the perimeter.

Because of the difficulty in regulating large companies, there is a revolving door of sorts between Big Tech and its regulators, and this creates a very friendly regulatory environment for these companies. After all, when your Public Servant exercise wraps up you'll need to move into a VP or C-suite role at a new company, and if there are five or ten top options, you'll be sure to give those companies better treatment on the whole. It's easy for anyone leaving the intelligence community or the military to find a home in the Trust & Security departments of Big Tech companies, and the links between the communities run deep1213.

While surveillance’s benefits are mostly overstated, its harms are, if anything, understated.

The harms of state surveillance are widely reported and widely researched, but we have no real ability to reason concretely about corporate surveillance. I think Doctorow's work here and elsewhere is an effort to build a vocabulary for this, and there are of course many others probing and attacking the surveillance economy, including Zuboff. Tech leaders are lionized and given pedestals in the way popular celebrities are; often they are intertwined with celebrity and with their company. Elon Musk is the obvious person to draw attention to here, but of course Mark Zuckerberg is nothing without Facebook, and he's doing his damnedest to keep the inverse true. Then there are Jeff Bezos and the other scions of the industry, and flameouts like Travis Kalanick and Adam Neumann. And so there is a definite social campaign to lionize these people as stars in their own right; CEOs and even VPs have brand managers and PR operatives.

But Big Tech is hardly the only industry that is too big to be regulated. In 2008 the global banking system crashed because unaccountable, fundamentally flawed economic instruments were allowed to run free, and no one was able to get justice because the banks were "too big to fail". The Chickenshit Club goes deep on this and on how the SEC got captured by the banks, and the playbook was the same: a revolving door between the SEC and the banks, a swamp of intentional obfuscation and attack on any attempts to audit, a fear that having large cases fail or fall apart would undermine trust in the institution, paralysis. I think the causes are the same, and I think eventually the outcome will be too.

So why doesn't the Department of Justice pursue antitrust cases based on the Sherman Act, which was able to break up the steel empires that all those Carnegie Libraries were named after? Doctorow blames Reagan appointees for the deregulatory land-rush, and I tend to agree. Specifically, he calls out Robert Bork, who fabricated an alternate reading of the Sherman Act, the one most commonly pushed as the reason Big Tech can't be broken up: that antitrust was only ever intended to prevent "consumer harm" in the form of higher prices. And Big Tech's big hack is in giving shit to consumers for free and charging someone else to inject their own thoughts. That's the whole thing: the ad-tech economy enables consumers to be given services in exchange for attention rather than money, so these companies can make monopolistic plays without ever tripping Bork's "higher prices" wire. It took 20 years after this for the idealistic view of computing and the early open web to be slowly stamped out while capital accrued into a handful of companies with deep ties to the US military-industrial complex14. Cory is arguing here that "Tech was born at the moment that antitrust enforcement was being dismantled, and tech fell into exactly the same pathologies that antitrust was supposed to guard against," and that this is not intrinsically because of the technology: it's the "big", not the "tech".

This phenomenon of industrial concentration is part of a wider story about wealth concentration overall as a smaller and smaller number of people own more and more of our world. This concentration of both wealth and industries means that our political outcomes are increasingly beholden to the parochial interests of the people and companies with all the money.

Here I cite Debt: The First 5000 Years (essay) and David Graeber's wider work.

What's the fallout?

Outside of the market and technical failures we've seen as a result of this, there are larger issues worth exploring. Cory Doctorow and Shoshana Zuboff present a few, and I have ideas of my own15.

On Finding An Authentic Self

What ubiquitous surveillance is doing is slowly forcing everyone into quantifiable states, by destroying, or at the very least eavesdropping on, the authentic moments where growth takes place. Like a child lost in wonder snapped back to reality by a parent, when we realize we're being watched and judged, we instinctively modulate our behavior. And so, Cory argues, even excluding the issues around power and domination, this world of targeted quantization robs us of the space to be ourselves and to develop who that person is. Without room for Making Mistakes and learning for ourselves, basically.

In the digital age, our authentic selves are inextricably tied to our digital lives. Your search history is a running ledger of the questions you’ve pondered. Your location history is a record of the places you’ve sought out and the experiences you’ve had there. Your social graph reveals the different facets of your identity, the people you’ve connected with.

To be observed in these activities is to lose the sanctuary of your authentic self.

Echoes of this ring true in my own life. I did not find space to identify my authentic self; I had to build my own private spaces to feel comfortable doing that, this place among others. If we're robbed of this by monopolists, and many are, there is no alternative. I fundamentally don't understand those who live on TikTok and accept the risks of living in public, so I'm not super inclined to bring judgment there.

The modern mobile web is designed to constantly but unreliably trickle information at us, never allowing our brains to focus on the task at hand, that self-actualization I mentioned earlier. I think about the design of Slack, specifically engineered to let work bleed into every available crack of your personal time, and your employer pays them to do it. Algorithmic timelines are often criticized from the perspective of a timeline consumer, but the algorithmic timeline also allows posts to receive interaction for much longer: more interactions, more push notifications, more excuses to hit the "see new posts" button after adding a reaction to a new comment on your reaction to your new job. It's one of the most effective tools the "behaviorists" have for abusing our dopamine receptors, and it's all an excuse to distract you from the task at hand. Google and Apple have finally begun to "snap back" on these social anxiety triggers, building in tools for silencing phones and putting timers on individual applications or categories, and I am happy to use them and advocate for their use.

"you don’t need a mind-control ray to make someone anxious"

Leaked data is also an issue, and it's nearly impossible to punish effectively, which mirrors financial crime again.16 Location data can be trivially re-identified; it's possible to identify people from their web search history; location history, a phone number, and some other now-publicly-available information are now enough to get a credit card in your name or to buy yet more profile-building information. And that's before we talk about uniquely American failures like the vast number of Social Security Numbers available on the internet.

On the Post-Truth Society

All of this is undermining our rationality, but it's part of a wider cycle: the failure of American corporatism undermines trust in institutions as a whole, and a stage-play distraction of minor threats is pursued while financialization and monopolisation truly rob us of a "right to a future tense":

The past 40 years of rising inequality and industry concentration, together with increasingly weak accountability and transparency for expert agencies, has created an increasingly urgent sense of impending doom, the sense that there are vast conspiracies afoot that operate with tacit official approval despite the likelihood they are working to better themselves by ruining the rest of us.

The collapse of the credibility of our systems for divining and upholding truths has left us in a state of epistemological chaos.

"How can you really say that vaccines are safe??" a skeptic will ask, and Doctorow and I don't have concrete answers. Of course, I trust science and medicine, but I don't trust the market forces driving pharmaceutical research; consider the rush to get a COVID-19 vaccine out without controlled trials, and without strong promises or trust in those promises that it'll be available to everyone who needs it. I feel the same way about GMO, I feel so sick that we're letting Monsanto's bad will rob the world of stable food sources, the GMO-free movement spreading while there is so much food-insecurity in the world and in the US is so so wrong to me, and its root cause to me is the same sort of monopolist control. "In a time where actual conspiracies are commonplace, conspiracy theories acquire a ring of plausibility."

What can we do about it?

Legislative Reform

Agitate for antitrust reform and a return to a literal reading of the Sherman Act. Congress could pass a law clarifying it if Bitch McConnell weren't holding the Senate hostage, but here we are! Hashtag GOTV. Find common ground with activists in other consolidated industries, and work with them. Fix software patents and DMCA Section 1201.

Rain hell and fire on these monopolist bastards and hope the world economy doesn't collapse

bans on mergers between large companies, on big companies acquiring nascent competitors, and on platform companies competing directly with the companies that rely on the platforms

Adversarial Interoperability

Beyond neutral interoperability, there is “adversarial interoperability.” That’s when a manufacturer makes a product that interoperates with another manufacturer’s product despite the second manufacturer’s objections and even if that means bypassing a security system designed to prevent interoperability.

Cory has argued for this in other essays: by allowing for and supporting interoperability in all scenarios, breaking open the walled garden and allowing for the free flow of data, we can undermine the argument that only surveillance capitalism allows better products to be developed. Cory doesn't spend much time addressing the current widespread anxiety around disinformation campaigns, though.

Adversarial interoperability was once the norm and a key contributor to the dynamic, vibrant tech scene, but now it is stuck behind a thicket of laws and regulations that add legal risks to the tried-and-true tactics of adversarial interoperability.

Re-focus Tech Exceptionalism

The solution is not to give more power to the big players, not to assume that only they can solve the problems they've enabled. That raises the ladder behind the tech companies in a way that will surely stifle any upstart innovation: "The drive to force Big Tech to use automated filters to block everything from copyright infringement to sex-trafficking to violent extremism means that tech companies will have to allocate hundreds of millions to run these compliance systems."17 And the solution cannot be to focus on one narrow aspect of a greater snake oil, as it were. I still intend to read The Age of Surveillance Capitalism; I think having more resources in the fight against snake oil is important, but there is crude in the water.

"Tech is not a substitute for democratic accountability, the rule of law, fairness, or stability — but it’s a means to achieve these things." Cory is nervous about the future, of climate change and inequality, and so am I, and like Cory I am still of the belief that free, democratically operated tech is the lever which will allow us to course-correct on these generational mistakes we've made. I've given my adult life to Free Software as a Community of Mutual Aide, to cutting out the middle-man between people, commerce, and ideas and I think it's the correct course still. I self-host my own systems not only because I want to operate them, but to provide that valued escape route – I just need to "productionise" it.

Zero-margin free tech is a rare and uniquely modern tool, shifting power and voice towards the masses more than they have ever gone before, and it's natural for power to react to this and defend itself. So far we have failed to re-focus our efforts and redouble the attack; we accepted our fate so quickly that, less than 100 years after we busted up a bunch of monopolies, we declared them unsolvable and accepted them into our lives while they refine those powers once again. "Antitrust was neutered as a key part of the project to make the wealthy wealthier, and that project has worked. The vast majority of people on Earth have a negative net worth, and even the dwindling middle class is in a precarious state, undersaved for retirement, underinsured for medical disasters, and undersecured against climate and technology shocks."

I call on myself and others to build small tech that can be operated by communities; make that tech respectful and private and interoperable and consensual. 20 or 100 people can fund one or four people to run a service that can support a thousand. We are so blinded by Big Tech inserting itself into every transaction and raising the cost of operation, but as in the case of the Dutch Public Broadcaster's switch to Contextual Ads Increasing Revenue, we'll find that these middle-men take more than they give, and they can't make it up in volume forever.

Footnotes