The Real AIpocalypse Is Probably Already Here

ClueBot must be stopped. (Image made via Stable Diffusion.)

Are The Terminator, The Matrix, and other such films entertainment, or are they prophecy? With the fast progress of artificial intelligence over the last few years, that’s become a real question of real concern to real people.

Out at the edges of the opinion bell curve, we have “doomers” on one end and extreme “optimists” on the other.

The former warn us that AI will eventually supersede humankind, quite possibly enslaving, or even exterminating, us because it won’t like us (or maybe just won’t care about us either way) and because it will be able to do whatever it wants with us. In a word (actually a portmanteau), “AIpocalypse.”

The latter predict an era resembling Aaron Bastani’s “Fully Automated Luxury Communism” in which AI increases production efficiency, reduces resource scarcity, and addresses externalities so well that we’re all free to become full-time artists, philosophers, extreme sports practitioners, etc. (or, if we prefer, veg out on the couch 24/7) with our material needs fully provided for absent any effort on our part.

The AIpocalypse sounds pretty scary. Fully Automated Luxury Communism sounds kind of cool, but only if we naively assume that evil human actors won’t find ways to exploit it in service to their desire for power.

In my view, the real AIpocalypse has already arrived. It’s not fully developed, but we’re already seeing it in action.

The real AIpocalypse is a massive decrease in our ability to know what’s true and what isn’t.

Two of the most obvious manifestations:

First, “deepfake” technology that allows bad actors to “show” us events that didn’t actually occur, put words in the mouths of public figures that those public figures never said, etc. That’s already pretty far along. You may have seen such videos hawking “miracle cures” with deepfake material featuring the likes of Tom Hanks and “Dr. Phil.” It’s only going to get worse.

Second, the wave of AI “hallucination” making its way into areas as important as jurisprudence. Lawyers have repeatedly been caught submitting AI-written briefs that cite non-existent court cases: their AI “assistants” simply generated fake “case law” supporting the desired outcome, and the lawyers inserted it into court proceedings unchecked. That’s only going to get worse, too.

The problem with those two examples goes beyond immediate effects. The fake material will inevitably produce (probably already HAS produced) “source pollution.”

Suppose you carefully, intentionally avoid AI and its product, for whatever reason. Maybe you distrust its output. Maybe you just prefer to do your own research, and reach your own conclusions, from primary human-created sources.

But how can you know AI-generated content hasn’t previously “polluted” the human-created sources with “facts” that aren’t true?

You read a claim of fact in an op-ed like this one … or in a chemistry textbook. The source claims to be human-created. It may even run a disclaimer denying that AI was used in its creation.

But what if, somewhere back along the chain of knowledge transmission, someone DID use AI, and a non-fact worked its way into the body of presumptive knowledge?

The problem isn’t new. People have always lied, and often those lies have persisted and spread, becoming “common knowledge” despite being false. AI, linked to a mechanism of near-instantaneous global spread (the Internet), can produce and distribute lies far faster than humans once did by word of mouth or through print on paper.

We may already be past the point where the only way to even semi-reliably establish truth is to consult printed material published prior to 2018.

Or just learn to love living in a “post-truth” age.

Thomas L. Knapp (X: @thomaslknapp | Bluesky: @knappster.bsky.social | Mastodon: @knappster) is director and senior news analyst at the William Lloyd Garrison Center for Libertarian Advocacy Journalism (thegarrisoncenter.org). He lives and works in north central Florida.


Iran War: And The Winner Is …

Oil refinery fires in Tehran, Iran: refineries and storage facilities set ablaze by military attacks, the brightness of the flames and the dark smoke of burning fuels visible from satellites.

US president Donald Trump says that his war in Iran — currently in a supposed ceasefire — resulted in “total and complete victory. 100%. No question about it.” The Iranian regime, via a statement from its Supreme National Security Council, also claims “great victory.”

If the war is really over (I’m skeptical), who actually won?

Well, not you.

“You can no more win a war,” said Jeannette Rankin, “than you can win an earthquake.”

Rankin, the first woman ever elected to the US House of Representatives, won her seat in 1916 and entered Congress just in time to vote against US entry into World War 1. Out of the House after 1918, she managed a comeback in 1940, just in time to vote against US entry into World War 2.

We could use a few Jeannette Rankins these days.

Not so much to vote against going to war, though: Congress hasn’t bothered with that formality since 1942, leaving such decisions up to whatever emperor-in-all-but-name happens to occupy the White House and suddenly find himself in need of a distraction from the various domestic problems that presidents always get blamed for (and are sometimes actually to blame for).

The real function of a Jeannette Rankin or her equivalent is to remind us now and then of an immutable and irrefutable truth:

War is always a damaging and destructive thing.

Apart from a few politicians and generals who get to crow about “winning,” and some politically connected profiteers pre-positioned to knock down fat contracts at the expense of taxpayers, everyone, on all sides, loses.

Some — soldiers and civilian non-combatants alike — lose their lives or end up maimed or orphaned.

Others see their homes destroyed and are forced to flee to hopefully safer locations, sometimes never to return.

Even those far from the front and seemingly safe from shelling, aerial bombardment, or rocket attack find that their paychecks don’t go nearly as far and that some things just aren’t nearly as available at any price as during peacetime. I still have my mother’s World War 2 ration book. Fortunately, Americans haven’t suffered those levels of privation at any time since, but many people, in many places, have seen that and worse.

War may be, as Randolph Bourne said, “the health of the state,” but it’s all downside for regular people who just want to live their lives in peace and prosperity.

Here’s hoping that the Iran earthquake wasn’t just a foreshock, and that the aftershocks are minor. Support peace!

Thomas L. Knapp (X: @thomaslknapp | Bluesky: @knappster.bsky.social | Mastodon: @knappster) is director and senior news analyst at the William Lloyd Garrison Center for Libertarian Advocacy Journalism (thegarrisoncenter.org). He lives and works in north central Florida.


The Age of the Gilded Apple

Apple-1 advertisement, October 1976 (bottom).
Steve Wozniak’s insistence that he priced Apple’s first computer at $666 “because I, as a mathematician, liked repeating digits” is more plausible than Moses Harman’s that his periodical Lucifer the Light-Bearer with its frank discussion of what Robert C. Adams called “anarchy, socialism, free trade, free rum and free love” aimed merely to “bring light to the dwellers in darkness” like the planet Venus. Public domain.

Half a century is plenty of time for an Apple to stay fresh, or to rot.

The New York Times‘s Kalley Huang (“For Employee No. 8, Many Changes in Apple’s 50-Year History,” April 2) traces the evolution of Apple Inc. from a “scrappy start-up that assembled computers by hand” — and whose organic name was a natural fit for an environment in which “Silicon Valley’s fruit orchards hadn’t yet been taken over by office parks” — to one which “has come to define how to be a global technology company.”

In a 2014 Bloomberg interview, Steve Wozniak recalled how he had “given away my designs for the Apple-1 for free,” leaving it to Steve Jobs to take projects the other Steve had “designed for fun” (while being “totally aware that a revolution was close to starting”) and “somehow turn them into some money for both of us.” The sum of their money would become so enormous that Chris Espinosa, who admits that having “had no college degree and … only worked at one company” since 1976 doesn’t sound like much of a résumé, owns what Huang estimates is well over $100 million worth of a corporation that earns roughly a thousand times that in profit every year.

Craig Newmark’s op-ed “Craigslist Made Me Rich. Giving the Money Away is Easy” might have included Espinosa as evidence that “making money isn’t proof to me that I know something any better than someone else,” merely of being “in the right place, at the right time” to apply common sense to a new field, had it not gone to print in the same day’s edition of The New York Times. Newmark doesn’t propose any political program, keeping his distance even from any endorsement of “left-wing nonprofits” and instead promoting such voluntary philanthropic efforts as the Giving Pledge. Still, the public souring on the information industry, as captured by such titles as Douglas Rushkoff’s Throwing Rocks at the Google Bus and Tripp Mickle’s After Steve: How Apple Became a Trillion-Dollar Company and Lost Its Soul, might seem the inevitable result of its enabling such outsized yet largely fortuitous accumulations in the first place.

The Giving Pledge cofounder Bill Gates owes much of his fortune to emulating Apple. The video game Halo was first showcased at MacWorld by Jobs before it became an exclusive killer app for Microsoft’s Xbox. Gates’s Windows operating system tapped the talent of Macintosh’s iconic icon designer Susan Kare. And yet the broader impact of Apple’s innovations is hardly confined to such purely financial windfalls.

This is not just because Apple efforts like HyperCard, which made creating and viewing multimedia straightforward; the Pippin, which brought built-in Internet access to a video game console; and the Newton, which pioneered the personal digital assistant, were influential on later developments without ever becoming profitable products for Apple or anyone else.

Indeed, much of the creativity that spread from Apple’s roots in Cupertino, California to cyberspace is closer in spirit to Wozniak than Jobs. It was entirely typical for Stephen D. Young and Debra Willrett’s Backgammon, programmed for the Apple Macintosh in the same non-Orwellian year 1984 during which the desktop model was introduced, to give out a postal address for users who “enjoy it and would like to see more ‘freeware’” to “please send whatever you think it’s worth” … and permission for them to disseminate the software itself.

Huang notes that Apple’s current survival requires not just satisfying customers but withstanding “tariff whiplash, antitrust scrutiny and geopolitical turmoil.” Consumer sovereignty and cooperative networking can tame such seemingly relentless forces — and make the fruits of tech’s golden geese as common as dirt.

New Yorker Joel Schlosberg is a senior news analyst at The William Lloyd Garrison Center for Libertarian Advocacy Journalism.

PUBLICATION/CITATION HISTORY

  1. “Guest Column: The age of the Gilded Apple” by Joel Schlosberg, The Elizabethton, Tennessee Star, April 7, 2026
  2. “The Age of the Gilded Apple” by Josh Schlossberg [sic], CounterPunch, April 10, 2026