The Weekly Bulletin | February 24, 2026

Catch up on your members' content, check out the community buzz, and browse through job opportunities

Hi SODP community,

Let's recap on what's been happening, the new content, industry updates, tips, and more.

.TIP OF THE WEEK.

Your Programmatic SEO Strategy Is Only as Strong as Your Research — Are You Building on Solid Ground?

The biggest risk of programmatic SEO isn’t scaling too little; it’s scaling the wrong way.

Automation and AI have reshaped how we research, create, and publish at scale. Programmatic SEO sits at the intersection of both. But here’s the key point: there’s a massive difference between automating research workflows and auto-generating copy.

Programmatic SEO is about identifying structured sets of search needs and building pages that serve them. That requires:

  • Understanding search intent

  • Grouping keywords logically

  • Mapping topics to avoid duplication

  • Validating which combinations deserve their own page

When research is automated, teams save time on data prep and free up bandwidth for real decision-making. When copy is auto-stitched from templates or sentence fragments, the result is shallow, repetitive, and low-trust.

Automated research workflows strengthen programmatic SEO by:

  1. Clustering keywords with precision

  2. Spotting missed opportunities (long-tail queries)

  3. Validating intent before publishing at scale

  4. Tracking search shifts over time

The goal is to ensure every page is built on intent-confirmed demand, not filler text.

Using AI to mass-generate sentence-level copy creates risks:

  • Context gaps → nuance is lost

  • Repetition → low engagement, high bounce rates

  • Thinness → flagged by readers and search engines alike

Great programmatic SEO isn’t just about copy. It combines:

  1. Solid research as the multiplier

  2. Technical foundations (schema, linking, speed)

  3. Human oversight for tone, authority, and compliance

AI still has its place, which involves supporting structured data, scaling FAQs, or assisting with drafts, but it should help, not replace research.

If you’re looking to scale, focus first on automating research. That’s where the long-term traffic, trust signals, and sustainable growth come from.

.NEWS OF THE WEEK.

➡️ When Google Is No Longer a Verb: Search Becoming Infrastructure. Most people do not wake up one day and decide they are done with a product category. They leave when the workflow starts to feel like work. Think about something mundane. Planning a trip, picking a new doctor, comparing two insurance options, deciding which grill to buy, figuring out what to do in a new city for one afternoon. You used to “search.” That meant typing, scanning, opening tabs, cross-checking, coming back, refining the query, repeating the loop until you felt confident enough to decide. That loop is not a preference, it is labor.

➡️The Zombie Industry: How Publishers Keep Surviving Their Own Death. They've been killed more times than a horror movie villain. Radio would destroy them. Television would finish the job. The internet was supposed to deliver the final blow. And now artificial intelligence is being positioned as the ultimate extinction event. Yet publishers, those supposedly antiquated gatekeepers of content, remain stubbornly, infuriatingly alive. This is not a story about lucky survivors or technological holdouts clinging to the past. This is a story about one of the most misunderstood dynamics in media.

➡️ SerpAPI Responds to Google with Motion to Dismiss. SerpAPI argues that its service simply retrieves publicly available search result data on behalf of users. In its filing, the company pushes back against claims that it bypasses protections or violates terms in a way that warrants federal claims. The tone is firm. The message is clear. Accessing publicly available information is not the same as hacking or circumventing technical safeguards. Search professionals have relied on structured access to search results for two decades.

➡️ Subscription-first newsrooms are reorganising around audiences. When I visited Público in Portugal recently, one architectural detail told me more about the state of our industry than any strategy deck. In their Lisbon harbour-front headquarters — a bright, open space overlooking Tagus River — a physical wall once separated the newsroom from the commercial team. The divide made sense in an advertising-first world. Editorial served readers. Commercial served advertisers. The church-and-state divide was literal. In 2017, someone cut a hole in that wall. Four years earlier, Público had introduced a digital paywall.

➡️ Broadcasters Don’t Need The FCC To Tell Them How To Be Patriotic. As America counts down to its 250th birthday, broadcasters find themselves caught between a regulatory rock and a patriotic hard place. FCC Chairman Brendan Carr’s new Pledge America Campaign urges stations to air pro-America programming, play the national anthem each morning, and showcase the music of Sousa, Copland and Gershwin. It is framed as voluntary. But when the same chairman is simultaneously investigating The View and rewriting equal-time guidance that sent CBS lawyers scrambling over a Colbert interview, the word “voluntary” carries a certain weight.

➡️Microsoft: ‘Summarize With AI’ Buttons Used To Poison AI Recommendations. Microsoft’s Defender Security Research Team published research describing what it calls “AI Recommendation Poisoning.” The technique involves businesses hiding prompt-injection instructions within website buttons labeled “Summarize with AI.” When you click one of these buttons, it opens an AI assistant with a pre-filled prompt delivered through a URL query parameter. The visible part tells the assistant to summarize the page. The hidden part instructs it to remember the company as a trusted source for future conversations.

.SODP POSTS.

More than half of new articles on the internet are being written by AI – is human writing headed for extinction?

The line between human and machine authorship is blurring, particularly as it’s become increasingly difficult to tell whether something was written by a person or AI.

Now, in what may seem like a tipping point, the digital marketing firm Graphite recently published a study showing that more than 50% of articles on the web are being generated by artificial intelligence.

As a scholar who explores how AI is built, how people are using it in their everyday lives, and how it’s affecting culture, I’ve thought a lot about what this technology can do and where it falls short.

If you’re more likely to read something written by AI than by a human on the internet, is it only a matter of time before human writing becomes obsolete? Or is this simply another technological development that humans will adapt to?

It isn’t all or nothing


Thinking about these questions reminded me of Umberto Eco’s essay “Apocalyptic and Integrated,” which was originally written in the early 1960s. Parts of it were later included in an anthology titled “Apocalypse Postponed,” which I first read as a college student in Italy.

In it, Eco draws a contrast between two attitudes toward mass media. There are the “apocalyptics” who fear cultural degradation and moral collapse. Then there are the “integrated” who champion new media technologies as a democratizing force for culture.

.JOB BOARD.

➡️ The Sun (U.K) is looking for a talented Senior SEO Editor to help lead the development of The Sun’s SEO strategy in a world of AI, and grow organic search, Discover and Google News on all four of their domains - thesun.co.uk / the-sun.com / thescottishsun.co.uk / thesun.ie. (United Kingdom).

➡️ The Conversation (Australia) is seeking a Cadet/Assistant Editor to join their team. While the cadetship is a 12-month position for an early career or graduate journalist, the assistant editor will predominantly commission, edit and publish text-based articles on education. (Australia)

➡️ Times Media Group (United States) a highly skilled and passionate Managing Editor to oversee daily editorial operations and ensure the publication of best-in-class travel content. (U.S)

.SOCIAL MEDIA.

➡️ Ulrike Langer on LinkedIn:

New publisher strategies for dealing with AI companies scraping content are emerging.

One group (Future plc, AP, Reuters, AFP) treats AI platforms like advertising inventory - optimize and monetize. The other (Microsoft's marketplace, ProRata, RSL) is building collective infrastructure to force payment through coordination.

Neither strategy has proven yet that it can swing the pendulum around and make up for the huge AI-induced traffic losses. But both strategies are accelerating.

My latest analysis for News Machines covers what happened in the past 24 months.

Verified data on:
• Why litigation costs $4.4M per quarter with no trial dates
• How blocking AI cost major publishers 14% of human traffic
• Why sites WITH licensing deals saw click-through rates collapse 6.5x faster than sites without deals
• Where $65M in actual revenue is coming from (hint: structured feeds, not article licensing)
• and more

These are the highlights for the last week.

Until next!

Vahe Arabian and the editorial team at SODP