How I Use an LLM to Produce the HiveToday Newsletter
There was an 18-month gap between issue #73 and issue #74 of HiveToday.
September 2024 to March 2026. A year and a half of silence from a newsletter that's supposed to cover the pulse of the Hive ecosystem every single week.
I could give you a bunch of reasons. Life got busy. The market was quiet. The energy wasn't there. All true. But the real reason, if I'm honest? The manual process of producing a newsletter that covers an entire blockchain ecosystem was just exhausting. I burned out.
This post is about how I fixed that.
What Producing an Issue Used to Look Like
Let me paint you a picture of newsletter day in the old workflow.
Open browser tabs for Splinterlands, Ecency, PeakD, 3Speak, LeoFinance, and a dozen other project accounts. Scan each one for anything posted in the past seven days. Open separate tabs for GitLab and GitHub to check dev activity across the core repos. Query the blockchain for witness data, DHF proposals, power-up stats. Pull market prices. Track down the top rewarded posts for the week.
Then write it up. Then format it. Then copy-paste it into PeakD, WordPress, Medium, Substack, X — platform by platform, each with its own quirks.
An easy issue took four hours. A complex one could eat most of a day. And if I missed a week, the guilt of the backlog made starting the next one even harder. That's how you get an 18-month gap.
The Realization
At some point, I started thinking about which parts of that process actually required me — my judgment, my editorial instincts, my voice — and which parts were just... mechanical.
The answer was uncomfortable. About 80% of what I was doing was data retrieval and formatting. Scanning accounts, querying APIs, pulling numbers, reformatting the same content for different platforms. Repetitive, systematic, and completely delegatable.
The other 20% — what to lead with, what story the data was telling, the Editor's Take, the framing that makes a stat feel meaningful — that was actually me. That's the 20% I want to be spending my time on.
So I started asking: what if an LLM did the 80%?
How It Works Now
I don't think of it as "AI writes my newsletter." It's more accurate to say the AI is my research assistant, my first-draft writer, and my publishing intern — all in one.
The workflow has four phases.
Gather. The AI works through a systematic checklist every week: hot and trending posts on Hive, a watchlist of project accounts (Splinterlands, Ecency, PeakD, 3Speak, LeoFinance, and about a dozen others), GitLab and GitHub repos for dev activity, governance data from the blockchain (witnesses, DHF proposals, power-up stats), and market prices. Every source I used to check manually, now checked systematically, every week, without me having to remember.
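The gather phase is essentially a checklist runner. A minimal sketch of that idea in Python — the source categories come from the list above, but the function names and data shapes here are purely illustrative stand-ins for real API calls:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Source:
    """One item on the weekly gather checklist."""
    name: str
    fetch: Callable[[], list[str]]  # returns raw items for the draft stage

# Stand-ins for real fetchers (Hive trending API, GitHub/GitLab, etc.)
def fetch_trending() -> list[str]:
    return ["trending post A", "trending post B"]

def fetch_dev_activity() -> list[str]:
    return ["core repo: 12 commits this week"]

CHECKLIST = [
    Source("hot/trending posts", fetch_trending),
    Source("dev activity", fetch_dev_activity),
    # ... project watchlist, governance data, market prices
]

def run_gather(checklist: list[Source]) -> dict[str, list[str]]:
    """Work through every source, so nothing depends on memory."""
    return {src.name: src.fetch() for src in checklist}
```

The point of the structure is that adding a new source to the weekly sweep is one line in the checklist, not a new habit to remember.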
Draft. From that raw data, the AI produces a first draft of the newsletter using an established template and editorial guidelines. It knows the voice, the structure, which kinds of things get included and which don't. The draft is never perfect — it's not supposed to be — but it's a solid starting point that's 70-80% of the way there.
Publish. Python scripts push the finished issue to multiple platforms automatically. Hive, WordPress, WhiteWind (with an automatic Bluesky teaser), Nostr, Substack — what used to be an hour of copy-pasting across a dozen tabs is now a handful of commands. Manual platforms (Medium, X, Publish0x, a few others) still get a human touch, but the heavy lift is gone.
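The post doesn't show the actual publishing scripts, but the "handful of commands" shape suggests a simple dispatcher: one function per automated platform, one loop to push everywhere. A hedged sketch, with stub bodies standing in for real platform API calls:

```python
# Hypothetical sketch: one publish function per automated platform.
# Real scripts would call each platform's API; these stubs only
# demonstrate the dispatch shape.

def publish_hive(title: str, body: str) -> str:
    return f"hive:{title}"

def publish_wordpress(title: str, body: str) -> str:
    return f"wordpress:{title}"

def publish_substack(title: str, body: str) -> str:
    return f"substack:{title}"

AUTOMATED = {
    "hive": publish_hive,
    "wordpress": publish_wordpress,
    "substack": publish_substack,
    # whitewind (+ bluesky teaser) and nostr would slot in the same way
}

def publish_everywhere(title: str, body: str) -> dict[str, str]:
    """Push one finished issue to every automated platform."""
    return {name: fn(title, body) for name, fn in AUTOMATED.items()}
```

One formatted issue in, one confirmation per platform out, instead of an hour of tab-by-tab pasting.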
Track. After publishing, the system checks engagement metrics, logs follower counts across platforms, and keeps a running record of how each issue performed.
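The running record can be as simple as an append-only log with one row per issue per platform. Field names here are assumptions, not the actual schema:

```python
import datetime

# Illustrative: one snapshot row per issue per platform, appended
# after each publish run. Field names are hypothetical.
performance_log: list[dict] = []

def record_metrics(issue: int, platform: str,
                   followers: int, views: int) -> None:
    """Append a dated snapshot so issues stay comparable over time."""
    performance_log.append({
        "date": datetime.date.today().isoformat(),
        "issue": issue,
        "platform": platform,
        "followers": followers,
        "views": views,
    })

def issue_summary(issue: int) -> dict[str, int]:
    """Views per platform for one issue."""
    return {r["platform"]: r["views"]
            for r in performance_log if r["issue"] == issue}
```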
What's Still Mine
I want to be clear about this part, because I think it's easy to misunderstand what "AI-assisted" actually means.
Every issue still has a human editor. That's me.
I decide what the week's narrative thread is. I write the Editor's Take — the section where I share an actual opinion about what's happening in the ecosystem. I decide which stories lead and which ones get cut. I review the AI's draft, edit it, push back on its framing when it's wrong, and add context that only comes from knowing this community for years. I hit publish.
The AI's first draft is the floor, not the ceiling. The version that goes out is always better than what it produced.
The "so what?" that makes stats meaningful — why this power-up trend matters, what a particular development milestone actually signals for the network — that's editorial judgment. It's not something I'd trust to any first draft, human or machine.
A Note on Disclosure
HiveToday posts on Hive include an ai_tools metadata tag indicating that AI was used in research and drafting. I think transparency matters here. The research is AI-assisted. The drafting is AI-assisted. The editorial decisions and voice are mine. That's the deal, and I'd rather be upfront about it.
What This Changes
The reason HiveToday went dark for 18 months was a labor problem. The AI workflow removes that excuse.
The barrier to producing an issue is now lower than at any point in the newsletter's history. The research phase that used to take hours takes minutes. The cross-platform publishing that used to be a tedious manual chore is largely automated. The friction that accumulated into burnout has been mostly cleared.
The goal, simply, is to never miss a week again. Not because I'm more disciplined than I was before — but because the system is finally designed to make consistency easier than inconsistency.
What's Next
The workflow is functional but not finished. I want to expand coverage — more project watchlists, better signal on ecosystem activity that doesn't make it onto trending. I want the analytics layer to get smarter. I want the editorial voice to sharpen.
The vision is for HiveToday to be the most reliable chronicle of what's happening on Hive — not because a single person is grinding through it manually every week, but because the system is designed to make that level of coverage sustainable.
Issue #74 was the proof of concept. Now we build.
Image generated with NotebookLM
HiveToday publishes weekly on Hive, Substack, and across the decentralized web.