📜 Editor's Note

A Human Rewrites a Wire Story and Owns It. An AI Does the Same Thing and Doesn't.

Every newsroom in America rewrites wire copy. It's the most common act of journalism. When AI does it, the output crosses a legal line that has nothing to do with quality.

By The Editors · Live in the Future · March 13, 2026 · ☕ 9 min read

In July 2023, the Associated Press licensed part of its news archive to OpenAI. AP's chief revenue officer, Kristin Heitmann, said the deal reflected the importance of ensuring "content creators are fairly compensated for their work."

AP serves more than 1,300 newspapers and broadcasters from 235 bureaus in 94 countries. Every one of those outlets takes AP dispatches and rewrites them for local audiences. A reporter in Des Moines reads the wire, pulls the local angle, adds a source, files a new version. That version is copyrightable. This is how news has worked since the telegraph.

AP apparently decided the smart move was to license its archive to the company building the system that can do all 1,300 of those rewrites simultaneously.

What 1,300 Simultaneous Rewrites Looks Like

An AP reporter files a story about quarterly earnings at a major tech company. Within minutes, an AI can generate 1,300 localized versions. One with a manufacturing angle for the Midwest. One connecting the layoffs to a specific supplier in Austin. One framing the earnings miss around pension fund exposure for a retirement-community paper in Florida. Each version cites the AP original, cross-references the SEC filing, and reads like it was written by someone who covers that beat.

A human newsroom doing this work employs reporters, pays benefits, maintains office space, and produces one rewrite per outlet. Each rewrite is copyrightable because a person wrote it. An AI doing the same work produces all 1,300 versions in the time it takes a reporter to read the dispatch. None of them are copyrightable.

Same source material. Same analytical process. Same quality of output. One side has legal protection. The other doesn't. And the side without protection is a thousand times cheaper.

How the Framework Cracked

Copyright law separates facts from expression. Facts are free. Expression is protected. In 1991, the Supreme Court formalized this in Feist Publications v. Rural Telephone Service: copyright requires "independent creation plus a modicum of creativity." No one owns facts. But if you arrange them with some creative judgment, the arrangement is yours.

Feist didn't explicitly say "human." It said originality required "intellectual production, of thought, and conception," quoting Burrow-Giles from 1884. For nearly 140 years, the distinction didn't matter — every entity capable of intellectual production was human. Expression and human authorship were the same thing.

Then the Copyright Office's March 2023 guidance made the implicit explicit: copyright requires "human authorship." Not just creativity. Human creativity. AI can produce expression that meets every other standard. Original arrangement, creative structure, non-obvious word choices. But if a machine generated it, the Office says it doesn't qualify.

So now the framework does two jobs. It separates facts from expression, which is what Feist intended. And it separates human expression from machine expression, which nobody planned for.

Sarah Chen Files at 6 PM

Sarah Chen is a composite character, not a real person, because even our illustrations are built by AI. But her situation isn't invented. She has covered agriculture for the Des Moines Register for eleven years. She knows which farmers will pick up the phone and which ones won't. She knows that when USDA releases crop forecasts, the wire version buries the Iowa soybean numbers in paragraph nine, and her readers need them in paragraph two. She has a source at the state extension office who texts her when the data is wrong.

When AP files a dispatch about farm subsidies, Sarah rewrites it. She calls her source. She pulls the county-level numbers the wire story didn't have. She writes 800 words and files at 6 PM. Her editor runs it. It's copyrighted. It has her name on it. It goes into the paper's archive, builds her professional record, and counts toward the portfolio she'd use to apply at the Tribune or Post if she ever left Iowa.

An AI can read the same AP dispatch, pull the same USDA data, and generate a version with Iowa soybean figures in the second paragraph. It can do this for every state in the country at the same time. Its version might miss the phone call to the extension office. But put both drafts on an editor's desk without bylines, and the difference isn't obvious.

Sarah's article is copyrighted. It belongs to her paper, builds her professional record, and goes into the archive that proves she's been doing this work for a decade. The AI's article is free. Anyone can copy it, republish it, or feed it into another model's training data.

Multiply this by every reporter at every paper that depends on wire copy. According to Northwestern's Medill School, 136 local news outlets disappeared last year, six more than the year before, and 250 counties are on a watch list for losing local news entirely within a decade. These are papers where wire rewrites aren't a small part of the operation. They are the operation.

AP's Licensing Paradox

Go back to the AP-OpenAI deal. AP licensed its archive so that OpenAI could train models on decades of reporting. Those models can now generate wire rewrites. AP's own content trained the system that competes with AP's own customers.

Heitmann's statement called for a "framework that will ensure intellectual property is protected and content creators are fairly compensated." But the framework she's describing already exists. It's called copyright. And it specifically does not protect the AI-generated rewrites that AP's own archive helped train into existence.

AP got paid for the license. Its 1,300 newspaper customers, the ones actually doing wire rewrites with human reporters, got nothing. They're now competing against a system that was trained, in part, on the dispatches they've been rewriting for decades. The institution that built the wire service ecosystem sold the training data to the entity most likely to replace it.

When the New York Times sued OpenAI in December 2023, it argued that training on copyrighted work without permission was infringement. AP took the opposite approach: get paid, provide access, hope the framework holds. Both positions assume copyright is the tool that protects journalism. For AI-generated output, it isn't a tool at all.

Speed Kills the Last Safety Net

There's one legal doctrine that protects facts directly: the "hot news" doctrine, from International News Service v. Associated Press (1918). INS was copying AP's wartime dispatches and republishing them. The Court said free-riding on AP's newsgathering investment was unfair competition, even though facts themselves aren't copyrightable. Time-sensitive information gets temporary protection.

Hot news claims are hard to win. Courts require the information to be time-sensitive, the defendant to be free-riding, and the free-riding to threaten the plaintiff's incentive to gather news in the first place. Most claims fail. But the doctrine exists because speed matters in news, and someone has to pay for the reporting.

AI collapses the window. A human wire rewrite takes thirty minutes to an hour. An AI can read, synthesize, and file in seconds. When the gap between a wire dispatch and a finished local version approaches zero, the window for hot news protection shrinks with it. The news is barely filed before the AI's version exists. And unlike a competing newspaper that might get caught copying, AI-generated output isn't copyrightable anyway — there's nothing to enforce against.

Can a Machine Invoke Fair Use?

News reporting is one of the purposes explicitly favored by fair use under 17 U.S.C. § 107. When Sarah Chen rewrites a wire dispatch, fair use protects her because she added original creative expression. Her judgment about what to emphasize, how to frame it, which sources to add. That's the transformative element.

When an AI rewrites the same dispatch, it also adds expression. Different structure, different word choices, different emphasis. But the Copyright Office says that expression has no author. So fair use hits something it wasn't built for. The output is transformative. No author performed the transformation. And the result isn't copyrightable regardless of whether fair use applies.

Nobody has litigated this yet. When someone does, the court will have to decide whether fair use is a right that belongs to authors, or a characteristic of the use itself. If it belongs to authors, AI can't invoke it. If it belongs to the use, then AI-generated news reporting inherits the same protection as human reporting, except the output still isn't copyrightable. Either way, the framework is answering a question it wasn't designed to answer.

What Comes Next

We run a publication that does what Sarah Chen does, except we do it with AI. We wrote about our own copyright situation last week. We disclosed our use of AI personas, reported that Common Crawl hasn't indexed us, and tried to sit honestly with the contradiction.

Wire rewrites are the most common copyrightable act in American journalism. Thousands of reporters do it every day. It's how local news survives in places where original reporting budgets have been cut to nothing. When AI performs the same act, the output falls outside the legal framework that made the economics work.

We don't know what replaces that framework. Extending copyright to AI output would protect publications like ours, and honestly, we'd take it. But it would also let automated content farms register thousands of articles per hour. Maintaining the human-authorship line is principled but means AI journalism exists permanently without legal protection. Waiting for courts to draw the line means the first case will be decided by a judge applying a statute written in 1976 to a technology that arrived in 2023.

Someone will register an AI-generated wire rewrite without disclosing how it was made. Someone will scrape an AI publication's full archive and republish it legally. Someone will file a hot news claim against a system that rewrites dispatches in real time. Those cases are coming. Until they arrive, the law treats identical work differently based on a single variable.

Sarah is still covering agriculture in Des Moines. Her paper might not be for much longer. The Register has seen its newsroom shrink dramatically under Gannett ownership, and every wire rewrite that can be automated is one more line item someone in corporate will eventually question. Eleven years of knowing which farmers pick up the phone doesn't show up on a spreadsheet. What shows up is the cost difference.

Her article is protected because she typed it. Ours isn't because we didn't.

Sources