
Why Privacy Was Once Treated Like A Weapon

How the cryptography wars of the 1990s decided whether we’d have online shopping, private messages, or any freedom at all in the digital age.


The Battle for Encryption: A History of Freedom Online

We are living in an interesting moment in time, with many very defining technologies and trends taking shape. Some of them feel like they have turned into massive (non-physical) battles: crypto vs. the banks, centralization vs. decentralization, open source vs. closed source, big tech vs. your privacy, you vs. bots(?), and probably many more.

A lot of these wars have important outcomes, as they will determine how our future internet and (digital) lives take shape. Will we live in a world where the immense power of AI is in the hands of a very few people, or in the hands of everyone? And which of those is the safer option? That is just one example of the battles currently being fought.

This is not the first time these battles, for lack of a better word, have been fought over our internet. The internet as we know it today could have looked a lot different if the outcome of one of them had gone the other way. Understanding how we got here might give us some insight into how the future will unfold and, especially, why these battles matter so much.

Time for another history lesson, ladies and gentlemen! 🤣 Today we are going to talk about the cryptography wars: a two-decade-long struggle that determined whether, today, we can encrypt things on the internet.

Crypto(graphy) is NOT crypto

Before you think “here’s Funs again with another crypto story 😉” let me explain that crypto is not cryptography. Crypto (coins, tokens, blockchains) exists thanks to cryptography, but cryptography itself is simply the math that scrambles information so only the right person can read it.

Think of cryptography as the locks and keys of the digital world. At its core, it works in two ways.

• Symmetric encryption is like sharing a house key. The same key locks and unlocks the door. It’s fast and simple, but it has one big problem: you need a safe way to hand over the key in the first place. If someone intercepts it, they can open everything.

Symmetric encryption is still widely used today, for example in Wi-Fi, hard-drive encryption, and messaging apps once the connection is established, because it's fast and efficient.

• Asymmetric encryption (also called public key cryptography) solved that problem in the 1970s. Imagine a mailbox: anyone can drop a letter in using the public slot, but only the person with the private key can open it. Suddenly, you could send secrets to strangers without ever meeting them to exchange keys first.

This is the system that now powers secure messaging apps like WhatsApp and iMessage, your online banking login, email encryption, and even blockchain transactions.
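If you want to see the difference in practice, here's a minimal Python sketch using the widely used cryptography library. It's purely an illustration (the library choice and the messages are my own, not anything from the original protocols): the symmetric case uses one shared "house key", while the asymmetric case splits the job between a public "mailbox slot" and a private key.

```python
# pip install cryptography  -- a minimal sketch, not production code
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# --- Symmetric: the same key locks AND unlocks (the "house key") ---
shared_key = Fernet.generate_key()           # both sides must somehow get this key safely
box = Fernet(shared_key)
token = box.encrypt(b"meet me at noon")
print(box.decrypt(token))                    # b'meet me at noon'

# --- Asymmetric: public key locks, private key unlocks (the "mailbox") ---
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()        # safe to hand out to anyone

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(b"meet me at noon", oaep)
print(private_key.decrypt(ciphertext, oaep)) # only the private key can open it
```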


In other words: without cryptography, our digital lives would be completely open. And you can already feel it coming: this did not happen without a fight.

The birth of the internet

The image below shows a record of the first message ever sent over the ARPANET. It took place at 22:30 hours on October 29, 1969. The record is an excerpt from the "IMP Log" that was kept at UCLA. Professor Kleinrock was supervising his student/programmer Charley Kline (CSK), and they set up a message transmission from the UCLA SDS Sigma 7 host computer to another programmer, Bill Duvall, at the SRI SDS 940 host computer. The transmission itself was simply an attempt to "login" to SRI from UCLA. They succeeded in transmitting the "l" and the "o", and then the system crashed! Hence, the first message on the internet was "lo", as in "lo and behold!" They were able to do the full login about an hour later.

The early internet wasn’t built with any of this in mind. The goal was openness and collaboration. Academics wanted to share files, not hide them. Privacy wasn’t needed.

That worked fine when only universities, governments, and a handful of companies were online. But by the late '80s and early '90s, more people and businesses started joining. Suddenly, the flaws became obvious: the internet was like a giant postcard system. Every email, every login, every credit card number could be read by anyone handling the traffic, whether that was a nosy administrator, a hacker, or a government agency.

This is the moment when cryptography moved from being an academic curiosity to a survival tool. If the internet was ever going to support commerce, private communication, or even basic trust, it needed locks on the doors.

And that’s exactly where the conflict began because governments weren’t ready to let go of the keys.

So where did this all begin?

In the mid-1970s, two Stanford researchers, Whitfield Diffie and Martin Hellman, introduced the idea of public-key cryptography. A few years later, Ronald Rivest, Adi Shamir, and Leonard Adleman (the “RSA” in RSA encryption) turned that idea into a practical algorithm. RSA encryption is a specific form of public-key cryptography that uses very large prime numbers to create a pair of keys. One public to lock the message, and one private to unlock it. It’s the reason you can safely type your credit card number into Amazon without ever meeting them to exchange a secret key first. For the first time, ordinary people could in theory exchange secrets securely over an open network.

1977 - Ron Rivest, Adi Shamir and Leonard Adleman (RSA) of MIT
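To get a feel for the math, here's a toy RSA round trip with tiny textbook primes. This is only a sketch of the idea (real keys use primes hundreds of digits long, plus padding and other safeguards), but it shows the lock-with-public, unlock-with-private trick in a few lines of Python:

```python
# Toy RSA with textbook-sized primes -- purely illustrative, never use numbers this small
p, q = 61, 53
n = p * q                      # 3233, part of both keys
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent (must share no factors with phi)
d = pow(e, -1, phi)            # private exponent: 2753, the modular inverse of e

message = 65                   # a message encoded as a number smaller than n
ciphertext = pow(message, e, n)    # lock with the PUBLIC key (e, n)
recovered = pow(ciphertext, d, n)  # unlock with the PRIVATE key (d, n)

print(ciphertext, recovered)   # 2790 65 -- only the private key gets the message back
```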

This was revolutionary, and terrifying to governments. Up until then, strong cryptography had been almost exclusively in the hands of militaries and intelligence agencies. If everyday citizens could suddenly lock their digital doors, law enforcement and spies feared they'd lose visibility into communications.

Their response was not a light one. What set the cryptography wars off was a decision by the US government that encryption was officially classified as a munition. Yes, the same legal category as tanks, bombs, and missiles. Exporting strong encryption software without government approval was a crime (for more information/details, see this UNC School of Law paper from 1994).

This is an important point and something to let sink in for a minute. In order for the government to keep the internet open, they made it illegal to have any secrets on the OPEN world wide web. You can probably now imagine why this started the full-on Cryptography Wars: a 20-year struggle between governments trying to keep control and technologists determined to put encryption into the hands of everyone.

PGP: the first personal flashpoint

In 1991, software engineer Phil Zimmermann released Pretty Good Privacy (PGP), a free tool that let anyone encrypt email and files with strong, public-key cryptography. He shared it widely on the early internet so activists, journalists, and ordinary users could protect their communications.

Because strong encryption was treated like a munition for export, as mentioned above, PGP spreading to servers outside the U.S. looked to authorities like an unlicensed weapons export 😅. In 1993, U.S. Customs opened a criminal investigation into Zimmermann for alleged export violations. The investigation ran for years and created a chilling effect for individual developers.

PGP in action

Enter the Cypherpunks

At the same time that Phil Zimmermann was fighting his PGP battle, a group of activists, mathematicians, and hackers were organizing around a radical idea: if privacy was essential for freedom in the digital age, the only way to protect it was to build it.

They called themselves Cypherpunks. Their motto: “Cypherpunks write code.” In other words: don’t wait for governments or corporations to give you privacy. Make it yourself.

• In 1988, Timothy C. May published the Crypto Anarchist Manifesto, predicting a future where cryptography would upend the balance of power between individuals and states.

• In 1992, May, Eric Hughes, and John Gilmore started the Cypherpunks mailing list in the Bay Area. It quickly grew into thousands of members worldwide, sharing code, tactics, and philosophy.

• In 1993, Eric Hughes wrote A Cypherpunk’s Manifesto, declaring: “Privacy is necessary for an open society in the electronic age. … We cannot expect governments, corporations, or other large, faceless organizations to grant us privacy… We must defend our own privacy if we expect to have any.”

PGP was proof their ideas could work. The government’s crackdown only fueled their urgency.

The cultural pushback

What started as a government investigation into one man (Zimmermann & PGP) quickly turned into a wider cultural battle. Developers, academics, and activists argued that code itself was a form of speech and that banning or restricting it was a violation of free expression.

To make the point, MIT Press published the full PGP source code as a printed book in 1995. Why? Because under U.S. law, books were protected speech under the First Amendment, but code was treated like a weapon. By printing it, they highlighted the absurdity: the exact same information was legal if bound in paper but illegal if stored on a floppy disk.

Hackers and activists got creative. They printed the RSA encryption algorithm on T-shirts and bumper stickers with slogans like “This T-shirt is a munition.” Wearing one across a border technically counted as exporting arms. It was satire with teeth, a way to make the stakes visible to anyone, not just programmers.

The message was clear: privacy is a right, and trying to ban math is futile.

How it ended for PGP

By 1996, after three years of pressure, the U.S. government quietly dropped its investigation into Zimmermann without charges. By that point, the genie was out of the bottle: PGP had already spread worldwide and become the backbone of a new generation of secure communication tools.

Zimmermann became a symbol of digital civil liberties, and the case showed that attempts to ban cryptography would only strengthen the movement around it.

The Clipper Chip: Plan B for control

When the pushback around PGP made it clear that banning cryptography outright wasn’t going to work, the U.S. government tried a different approach. If they couldn’t stop strong encryption, maybe they could build in a secret way around it.

In 1993, the Clinton administration unveiled the Clipper Chip, a government-designed encryption device intended for phones and computers. It promised strong encryption for users, but with one catch: every device included a built-in backdoor key. Officially, this was called “key escrow.” The idea was that law enforcement could unlock any conversation or file if they got the legal authority.

The pitch was “trust us, only the good guys will use it.” But the backlash was instant and fierce. The Clipper Chip quickly became a symbol of government overreach in the digital era. Public trust collapsed, adoption never took off, and by the late 1990s the project was quietly abandoned.

Netscape & SSL: Business picks a side

While governments were busy fighting to keep control, the private sector saw the problem from a different angle: money. By 1994, the web was growing fast, but one thing held it back: no one was going to type their credit card number into a website if every packet was as readable as a postcard.

Enter Netscape, founded that same year by Silicon Valley veteran Jim Clark and 23-year-old programmer Marc Andreessen, fresh off building Mosaic, the first widely used web browser, at the University of Illinois. Andreessen would later co-found the venture capital firm Andreessen Horowitz and back companies like Facebook, Airbnb, and Coinbase, but back then he was just a student-turned-entrepreneur with a bold idea: if the web was going to go mainstream, it needed trust.

I don't know why, but Netscape was the first browser I downloaded outside of the usual Internet Explorer that came with Windows. It felt cool and different; you could add themes and more for the first time. I liked it! Good times!

Netscape built SSL (Secure Sockets Layer), a protocol that put an encrypted “lock” around web traffic. Suddenly, you could send sensitive data across the internet (shopping, banking, even logging in) without worrying that anyone along the way could read it.

That little padlock in your browser bar became the symbol of trust on the web. More importantly, it turned cryptography from a political and academic debate into a business necessity. Without SSL, there would be no Amazon, no eBay, no PayPal, no digital economy as we know it.
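That padlock still works the same way today; SSL simply grew up into TLS. As a small illustration (just a sketch using Python's standard ssl module, with example.com as a placeholder host, not anything from Netscape's original code), this is roughly what a program does behind the padlock:

```python
import socket
import ssl

# Open a TLS-protected connection the way a browser does behind the padlock.
hostname = "example.com"                      # placeholder host for illustration
context = ssl.create_default_context()        # sensible defaults: certificate + hostname checks

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print(tls.version())                  # e.g. 'TLSv1.3', SSL's modern successor
        cert = tls.getpeercert()              # the certificate the padlock is based on
        print(cert["subject"])                # who the other side claims to be
```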

While the U.S. government was still treating strong encryption like a weapon, two entrepreneurs and a scrappy startup had already put it in the hands of millions. People didn’t need to understand the math; they just clicked the lock and felt safe.

The end of round one

In 1999, the Clinton administration finally conceded. Export controls were relaxed, and mass-market products like web browsers, email clients, and operating systems were free to ship with strong encryption worldwide. What had been treated as a weapon (in the same legal category as missiles 🤣) was now just a standard feature of everyday software.

The first Crypto Wars were over. But the outcome wasn’t absolute: governments hadn’t given up, they had simply lost the opening round. And the battles over backdoors, privacy, and control of digital infrastructure would return again and again.

Apple vs. FBI: the sequel

In December 2015, after the San Bernardino attack, the FBI recovered an iPhone used by one of the shooters. They asked a judge to order Apple to build a special version of iOS that would disable safeguards and let investigators guess unlimited passcodes without the phone erasing itself.

Tim Cook refused. In an open letter, Apple argued that creating such software would be a backdoor in everything but name. If it existed once, it could be demanded again by courts around the world and eventually stolen or abused. Apple framed it as a matter of safety for all customers, not a favor to one suspect.

The fight dominated headlines for weeks. Then, in March 2016, the FBI abruptly dropped the case. A third-party security company, later revealed to be Azimuth Security, had discovered a vulnerability in the iPhone 5C’s hardware. Their exploit let investigators brute-force the passcode without triggering the phone’s data wipe. The government got into this one phone, but the principle at stake didn’t disappear.

You cannot create a backdoor that only good guys use.

Why this history matters now

Now, we’re back at the same fork in the road. Only this time the battlegrounds are bigger:

• AI — will the most powerful models be open or locked behind the walls of a few companies?
• Privacy — will we get true control over our data, or perpetual surveillance in the name of safety?
• Centralization vs. decentralization — will the digital world belong to a handful of platforms, or to the communities that build on them?

The internet we use today, with e-commerce, encrypted messaging, and basic trust, exists because technologists, businesses, and activists stood firm in the 1990s. They fought for the simple idea that privacy is not a privilege, it’s a right.

And that idea matters even more now. The more our lives move online, our work, our health, our relationships, our creativity, the more those rights must carry over. We don’t accept governments opening every letter we send by post. We shouldn’t accept it for our emails, our chats, or our data either.

Freedom in the digital age is not an upgrade or a nice-to-have. It’s the same freedom we demand offline, extended to the spaces where we now actually live.

The Crypto Wars taught us one timeless truth: if we give up that freedom once, we may never get it back. And in a world that’s more digital every day, that would mean giving up freedom itself.

PS... If you’re enjoying my articles, will you take 6 seconds and refer this to a friend? It goes a long way in helping me grow the newsletter (and help more people understand our current technology shift). Much appreciated!

PS 2... and if you are really loving it and want to buy me some coffee to support the newsletter, feel free! 😉

Want to become a marketing GURU?

What do Nicole Kidman, Amy Porterfield & the Guinness Book of World Records have in common? They’ll all be at GURU Conference 2025.

If you're obsessed with marketing like we're obsessed with marketing, Guru is the must-attend conference of the year. We'll be covering all things email marketing: B2B, B2C, newsletters, email design, AI & more.

You can expect to walk away with new email strategies, the very latest digital trends, and tips to step up your email performance. But don’t worry, we also like to have fun (dance contests, anyone?)

Don’t miss out. Join us on Nov 6th & 7th for the largest virtual & free email marketing conference.

Thank you for reading and until next time!


Who am I and why you should be here:

Over the years, I’ve navigated industries like advertising, music, sports, and gaming, always chasing what’s next and figuring out how to make it work for brands, businesses, and myself. From strategizing for global companies to experimenting with the latest tech, I’ve been on a constant journey of learning and sharing.

This newsletter is where I’ll bring all of that together—my raw thoughts, ideas, and emotions about AI, blockchain, gaming, Gen Z & Alpha, and life in general. No perfection, just me being as real as it gets.

Every week (or whenever inspiration hits), I’ll share what’s on my mind: whether it’s deep dives into tech, rants about the state of the world, or random experiments that I got myself into. The goal? To keep it valuable, human, and worth your time.
