Hello everyone!
I have had the pleasure of working on Hive for six incredible years, and I am grateful for the support you have all given me. I would like to carry on working as a core developer and contributing to Hive for a seventh year.
Who am I
I have been on this chain for more than 9 years now.
I went from regular blogging, to contributing to open source on Hive, to building dapps and running a top 20 witness (now deprecated). Today I sit at rank 21 with my renewed witness. I became much more involved in core development when we forked away from Steem, rising to the occasion as we needed every hand we could get to birth the chain. I contributed to the soft fork that locked Justin Sun's funds and later to the very first hard fork that created Hive. Ever since, I've been working as a core developer contributing to Hive and Hivemind, implementing sensible requests from the community and hosting the monthly core dev meetings, where core contributors to the ecosystem get together, share what they are working on, and collaborate on ideas.
If you're interested in my full journey, I made a throwback post retracing most of my activities on Steem and then Hive here a while ago: @howo/my-hive-story
My work
Over the years I've shipped a number of features you may be familiar with: recurrent transfers, RC delegations, the Mesh API (coinbase integration), to name a few. Rather than revisiting those at length, I'd like to focus on what I've been building more recently.
Hard Fork 28
HF28 shipped a few months ago and included a bunch of my features:
Multiple recurrent transfers for the same sender/receiver pair: Previously, you could only have a single active recurrent transfer between any two accounts. This was a meaningful limitation for builders: imagine wanting to run both a subscription payment and a separate payment stream to the same account. HF28 lifts that restriction entirely, allowing any number of concurrent recurrent transfers. This is what peakd leverages for their open market feature.
Removing DHF HBD from inflation calculations: This one is directly relevant to the current inflation debate (more on that later). Before this change, there was a risk of an uncontrolled inflationary spiral: if the HIVE price dropped far enough, the HBD sitting in the DHF would represent a growing share of market cap, which would increase inflation, which would push the price down further, which would increase inflation again, and so on. The only way to prevent this was a proposal to burn DHF funds, but the protocol caps burns at 1% of the fund per day, far too slow to react to a fast-moving market event (and it would also force us to burn the majority of the DHF). This change eliminates that risk entirely. A side effect is that inflation via the reward pool and witness rewards dropped significantly post hard fork.
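To make the feedback loop concrete, here is a toy model. All numbers and the flat 5% rate are made up for illustration; the real protocol formula is more involved. The point is that virtual supply counts HBD at the feed price, so a falling HIVE price inflates the HIVE-equivalent of any HBD that is counted, and the amount of new HIVE printed grows with it:

```python
# Toy model of the pre-HF28 feedback loop. All figures are illustrative,
# not real chain parameters.

def virtual_supply(hive_supply, liquid_hbd, dhf_hbd, price, count_dhf):
    """HIVE-equivalent supply: HBD is converted at the feed price."""
    hbd = liquid_hbd + (dhf_hbd if count_dhf else 0.0)
    return hive_supply + hbd / price

def yearly_new_hive(hive_supply, liquid_hbd, dhf_hbd, price, count_dhf,
                    inflation_rate=0.05):
    """New HIVE printed per year at a flat 5% of virtual supply (simplified)."""
    return inflation_rate * virtual_supply(
        hive_supply, liquid_hbd, dhf_hbd, price, count_dhf)

HIVE = 400_000_000   # hypothetical circulating HIVE
HBD = 10_000_000     # hypothetical liquid HBD
DHF = 30_000_000     # hypothetical HBD held by the DHF

# At $0.30, counting the DHF adds 30M / 0.30 = 100M HIVE-equivalent.
high = yearly_new_hive(HIVE, HBD, DHF, price=0.30, count_dhf=True)
# If the price halves, the same DHF balance now adds 200M HIVE-equivalent,
# so more HIVE gets printed, which pressures the price further down.
low = yearly_new_hive(HIVE, HBD, DHF, price=0.15, count_dhf=True)
# With the DHF excluded (the HF28 change), the drop has far less effect.
low_fixed = yearly_new_hive(HIVE, HBD, DHF, price=0.15, count_dhf=False)

print(low > high)       # price drop -> more inflation -> spiral
print(low_fixed < low)  # excluding the DHF dampens the loop
```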
Community Features
This year has been focused on a lot of in-depth reworks. Thanks to big advancements in agentic coding (AI), many tasks that I've been putting off for years because they would take too long are now possible and being tackled.
For instance I've migrated most of the community code from Python to PgSQL, which led to massive improvements, doubling the speed at which we process those transactions by moving all of that logic into optimized SQL functions that run directly in the database instead of round-tripping through Python.
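The shape of that migration can be sketched like this, with SQLite standing in for PostgreSQL and invented table names: instead of fetching rows into Python and writing results back one by one, the same logic runs as a single set-based SQL statement inside the database.

```python
# Illustration of moving row-by-row Python logic into one set-based SQL
# statement. SQLite stands in for PostgreSQL; the schema is made up.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE subs (community TEXT, account TEXT)")
db.execute("CREATE TABLE stats (community TEXT PRIMARY KEY, subscribers INT)")
db.executemany("INSERT INTO subs VALUES (?, ?)",
               [("hive-111", "alice"), ("hive-111", "bob"),
                ("hive-222", "carol")])

# Old style: pull everything into Python, count there, write back per row.
counts = {}
for community, _account in db.execute("SELECT community, account FROM subs"):
    counts[community] = counts.get(community, 0) + 1
for community, n in counts.items():
    db.execute("INSERT OR REPLACE INTO stats VALUES (?, ?)", (community, n))

# New style: one statement, no Python round-trips, the database does the work.
db.execute("DELETE FROM stats")
db.execute("""
    INSERT INTO stats
    SELECT community, COUNT(*) FROM subs GROUP BY community
""")

print(sorted(db.execute("SELECT * FROM stats")))
```

The second form is what the optimized SQL functions do at scale: the database engine plans one pass over the data instead of paying per-row query overhead.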
Another big rework completely changed the way notifications are processed for communities, enabling a lot of UX improvements for users and moderators. You now know when your post gets muted (and why!). When your role or title changes, you get notified. Users can now flag a post for moderators to review if they think it broke the rules.
There's also a bunch of features that are ready to ship but haven't been included in the latest release yet:
In line with the better UX for communities, I've added a complete moderation API to let community owners actually manage their communities properly. You'll be able to pull up any user and see their full history - how many times they've been flagged, muted, what actions were taken and by whom, with full date-based pagination so you can browse through everything. There's also a moderation stats endpoint that gives you an overview of your team's and users' activity. The idea is to give community owners real tools to make informed decisions instead of just guessing whether someone deserves a permanent ban or just a warning.
On the notification side, several users have surfaced the need for the ability to "subscribe" to a post and get notified of every reply. Right now if you make a post and a conversation starts in the comments, you have no way to follow it without manually checking back. With this feature, you'll be able to subscribe to any post or comment and receive a notification for every new reply in that thread. It works as a simple custom_json operation on-chain, so any frontend can integrate it easily. There's a cap of 16 subscriptions per user to keep things reasonable, and unsubscribing is just as straightforward.
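To give a feel for how simple the on-chain side is, here is a sketch of the subscription operation as a custom_json payload. The operation id and json shape here are my illustrative guesses, not the released spec:

```python
# Sketch of a post-subscription custom_json op. The id ("notify") and the
# payload shape are assumptions for illustration; check the released spec.
import json

MAX_SUBSCRIPTIONS = 16  # per-user cap mentioned above

def subscribe_op(account, author, permlink, subscriptions):
    """Build a custom_json op subscribing `account` to replies on a post."""
    if len(subscriptions) >= MAX_SUBSCRIPTIONS:
        raise ValueError("subscription cap reached, unsubscribe first")
    payload = ["subscribe_post", {"author": author, "permlink": permlink}]
    return {
        "required_auths": [],
        "required_posting_auths": [account],  # posting authority is enough
        "id": "notify",                       # hypothetical operation id
        "json": json.dumps(payload),
    }

op = subscribe_op("alice", "howo", "my-hive-story", subscriptions=[])
print(op["id"], json.loads(op["json"])[0])
```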
For You Feed
Finally, my most recent work tackles one of Hive's biggest usability gaps: content discovery. Right now, finding content you actually care about means manually jumping between communities, trending pages, curator feeds, and reblogs. That was fine in the Myspace era, but it's not how people expect social media to work in 2026.
I'm building a "for you" feed - a single unified feed that pulls posts from all your sources (communities, follows, trending, tags) and ranks them based on your actual behavior. If you start commenting on gardening posts, more gardening shows up. If you interact with a specific author daily, their new posts surface first. If your close mutual connections are all engaging with a post from someone you don't follow, it gets surfaced as a discovery candidate.
Under the hood, every post gets scored on multiple signals: how much you interact with the author, comment velocity, payout quality, freshness, social proof from mutual connections, and community/tag affinity. There are also diversity rules to prevent any single author from dominating and to ensure a healthy mix of followed content and new discoveries.
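A minimal sketch of the ranking idea, with invented signal names and weights (the real scorer uses more signals and ongoing tuning): each post gets a weighted score, then a diversity pass caps how many slots a single author can take.

```python
# Toy ranker: weighted signal score plus a per-author diversity cap.
# Signal names and weights are illustrative, not the production values.

WEIGHTS = {
    "author_affinity": 3.0,   # how much you interact with this author
    "comment_velocity": 2.0,  # replies per hour
    "social_proof": 2.5,      # mutual connections engaging with the post
    "freshness": 1.5,         # decays with post age
}
MAX_PER_AUTHOR = 2  # diversity rule: one author can't dominate the feed

def score(post):
    return sum(w * post["signals"].get(k, 0.0) for k, w in WEIGHTS.items())

def rank_feed(posts, limit=10):
    per_author, feed = {}, []
    for post in sorted(posts, key=score, reverse=True):
        n = per_author.get(post["author"], 0)
        if n >= MAX_PER_AUTHOR:
            continue  # skip: author already filled their quota
        per_author[post["author"]] = n + 1
        feed.append(post["permlink"])
        if len(feed) == limit:
            break
    return feed

posts = [
    {"author": "gardener", "permlink": f"g{i}",
     "signals": {"author_affinity": 1.0, "freshness": 1.0 - 0.1 * i}}
    for i in range(4)
] + [{"author": "newcomer", "permlink": "n0",
      "signals": {"social_proof": 1.0, "freshness": 1.0}}]

print(rank_feed(posts))  # ['g0', 'g1', 'n0']
```

Even though all four gardening posts outscore the newcomer's, the cap lets the discovery candidate through, which is the "healthy mix" behavior described above.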
This is still a work in progress - it's computationally heavy and because we want to stay as decentralized as possible, we can't just throw more hardware at it. But the foundation is there and I'm building it up piece by piece. I'm also exploring extending this to post-level recommendations: think "if you liked this post, check out this one."
If you're interested in the full technical breakdown, I wrote about it here: @howo/content-discovery-feeds-and-decentralized-social-media
The small tasks™
Then there's all the smaller stuff that doesn't always make it into changelogs but matters just as much: fixing bugs that users and Dapp developers report, speeding up slow queries, plugging gaps in input validation, keeping the devportal docs accurate and up to date, squashing UI annoyances on Block Explorer and Condenser (dark mode glitches, broken pickers, missing translations), cleaning up old code nobody needs anymore (dead MySQL references, deprecated tables, leftover files), tweaking APIs when Dapp teams ask for something reasonable (extra filter params, cleaner response formats, consistent errors), making tests less flaky. None of it is glamorous, but it's what keeps things running smoothly and saves everyone else headaches down the line.
A word on the current spending debate
There has been a lot of discussion lately about DHF spending, inflation and price, and I want to address it directly rather than pretend it isn't happening.
I am reducing my proposal by approximately 14%, going from 350 to 300 HBD/day. I think some belt-tightening is healthy for the ecosystem, and I want to lead by example rather than just talk about it.
That said, I think it's misguided to focus all the pressure on reducing spending. We are living through a moment where AI is changing the economics of software development.
That can be used in one of two ways: we can go at the same pace for a third of the cost, or we can go ten times faster for the same cost. Every serious competitor in this space is choosing the second option. Beyond the promise of better inflation numbers, I don't see a concrete plan for what Hive does with the savings. I'd much rather see us continue to use all the resources available, move fast, and take a real shot at growing and succeeding than limp along into irrelevance while our developers quietly migrate to other ecosystems.
What's next
In the short term, the priority is getting the features that are already done into production: the moderation API and post subscriptions. These are built, tested, and waiting for a release.
The for-you feed is the big ongoing project and will likely last a while as I fine-tune it with feedback and monitor performance impacts. The foundation is solid and I'm actively working on it. I'll also be looking into expanding the algorithm to post-level suggestions (think "if you liked this, check this out") to improve content virality, keep users on front ends longer, and make our chain more valuable.
Payment infrastructure for HBD
One thing I've been thinking about a lot is how to make HBD actually useful beyond our ecosystem. We have a stablecoin with zero transfer fees, 3-second settlement, and a 14% savings rate. That's a genuinely compelling product. But the infrastructure for actually using HBD for real-world payments and subscriptions doesn't exist at the protocol level. Every project that wants to build on this has to start from scratch.
I want to change that by building the core infrastructure that makes payment-based applications possible on Hive. This isn't about building a product. It's about building the rails that products can run on. Concretely:
A HAF application for subscription and payment state tracking. Right now, if you want to know "does account X have an active recurrent transfer to account Y of at least 5 HBD per month?", you have to scan the chain yourself. I'm building a dedicated HAF app that indexes all recurrent transfer relationships and exposes them through clean APIs. Any frontend or dapp can query subscriber lists, verify access rights, and pull payment analytics without building their own indexer. This is the same kind of infrastructure work I've done with communities: moving complex state tracking into optimized SQL and making it available to everyone.
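The kind of question this index answers, sketched in plain Python over an in-memory stand-in (field names and the API shape are hypothetical, the real app exposes this as SQL-backed endpoints):

```python
# In-memory stand-in for the HAF index of recurrent transfer relationships.
# Field names and the query API are hypothetical illustrations.

transfers = [
    {"from": "alice", "to": "creator1", "amount": 5.0, "asset": "HBD",
     "recurrence_hours": 720, "remaining_executions": 11},
    {"from": "bob", "to": "creator1", "amount": 1.0, "asset": "HBD",
     "recurrence_hours": 720, "remaining_executions": 0},  # lapsed
]

def has_active_subscription(frm, to, min_hbd):
    """Does `frm` have an active recurrent HBD transfer to `to` of at
    least `min_hbd` per execution?"""
    return any(
        t["from"] == frm and t["to"] == to and t["asset"] == "HBD"
        and t["amount"] >= min_hbd and t["remaining_executions"] > 0
        for t in transfers
    )

def subscriber_list(to, min_hbd=0.0):
    """All accounts with an active recurrent HBD transfer to `to`."""
    return sorted({t["from"] for t in transfers
                   if t["to"] == to and t["remaining_executions"] > 0
                   and t["amount"] >= min_hbd})

print(has_active_subscription("alice", "creator1", 5.0))  # True
print(has_active_subscription("bob", "creator1", 1.0))    # False (lapsed)
print(subscriber_list("creator1"))                        # ['alice']
```

Today every dapp that wants this answer has to replay the chain itself; the HAF app keeps this state current so the check becomes a single query.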
A standardized payment request protocol. There's currently no standard way for a merchant or creator to say "here's an invoice, pay this amount, here's your receipt." I'll be defining custom_json schemas for payment requests, confirmations, and receipts, the kind of boring-but-essential plumbing that every payment application needs. For transactions that need buyer protection, the standard will integrate with Hive's existing escrow operations, so you get built-in dispute resolution without reinventing the wheel. This builds directly on the recurrent transfer and escrow infrastructure already in the protocol that I've been working towards over the past years.
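As a rough sketch of what those schemas could look like (operation ids and field names here are draft guesses on my part, not a finalized spec): a request carries an id the receipt echoes back, so an indexer can match payments to invoices.

```python
# Sketch of payment-request / receipt custom_json payloads. Operation ids
# and field names are illustrative guesses, not a finalized standard.
import json
import uuid

def payment_request(merchant, amount_hbd, memo):
    return {
        "id": "payment_request",          # hypothetical custom_json id
        "json": json.dumps({
            "request_id": str(uuid.uuid4()),
            "merchant": merchant,
            "amount": f"{amount_hbd:.3f} HBD",
            "memo": memo,
            "escrow": False,              # flip on for buyer protection
        }),
    }

def receipt(request, payer, trx_id):
    """Acknowledge a paid request; echoes request_id so indexers can match."""
    body = json.loads(request["json"])
    return {
        "id": "payment_receipt",          # hypothetical custom_json id
        "json": json.dumps({
            "request_id": body["request_id"],
            "payer": payer,
            "trx_id": trx_id,
        }),
    }

req = payment_request("shop.example", 5.0, "order #42")
rcp = receipt(req, "alice", "abc123")
print(json.loads(rcp["json"])["request_id"]
      == json.loads(req["json"])["request_id"])  # True
```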
A content encryption standard for gated content. This is the part I'm most excited about. Hive has all the building blocks for a decentralized Patreon/Substack: recurrent transfers for billing, custom_json for metadata, and memo keys for encryption. The missing piece is a standard way to encrypt premium content so that only paying subscribers can read it. Since everything on Hive is on-chain and public, Hivemind-level gating isn't enough. You need real cryptographic enforcement. I'm designing an encryption scheme that uses our existing memo key infrastructure (ECDH shared secrets) to distribute content decryption keys to subscribers. Monthly key rotation means cancelled subscribers lose access to new content, and the whole thing works without any hard fork. It's custom_json operations, a HAF indexer, and a reference encryption library that frontends can integrate.
The key distribution can be handled by multiple independent delegates. No single service controls access. A delegate is just a process that watches the HAF database for new subscribers and posts wrapped keys. If one goes down, others pick up the slack. It's a similar redundancy model to how we run API nodes today.
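The flow can be illustrated with a toy example. The real scheme uses ECDH on the chain's curve via memo keys; since that needs an external crypto library, this sketch substitutes classic Diffie-Hellman over integers to show the same shape: both sides derive the same shared secret, the creator wraps the month's content key with it, and monthly rotation means old wrapped keys don't unlock new content. These parameters are deliberately toy-sized; do not use them for real cryptography.

```python
# Toy illustration of the key-distribution flow. Classic Diffie-Hellman over
# a small prime stands in for ECDH on the memo key; NOT real cryptography.
import hashlib

P, G = 2**127 - 1, 5  # toy group parameters, illustration only

def shared_secret(my_priv, their_pub):
    return pow(their_pub, my_priv, P)

def wrap(key: bytes, secret: int, month: str) -> bytes:
    # Derive a wrapping pad from the shared secret and the rotation period;
    # XOR-wrapping is symmetric, so the same call also unwraps.
    pad = hashlib.sha256(f"{secret}:{month}".encode()).digest()
    return bytes(a ^ b for a, b in zip(key, pad))

creator_priv, subscriber_priv = 123456789, 987654321
creator_pub = pow(G, creator_priv, P)
subscriber_pub = pow(G, subscriber_priv, P)

content_key = hashlib.sha256(b"2026-03 premium content key").digest()

# Creator wraps the month's key for the subscriber and posts it on-chain.
wrapped = wrap(content_key,
               shared_secret(creator_priv, subscriber_pub), "2026-03")
# Subscriber derives the same secret from the creator's public key.
unwrapped = wrap(wrapped,
                 shared_secret(subscriber_priv, creator_pub), "2026-03")

print(unwrapped == content_key)  # True: both sides agree on the key
# Next month uses a fresh pad, so lapsed subscribers stop getting new keys.
next_month = wrap(content_key,
                  shared_secret(creator_priv, subscriber_pub), "2026-04")
print(next_month != wrapped)  # True
```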
Community applications and gated communities. Communities today are fully open: anyone can subscribe and post. There's no concept of "apply to join" where moderators vet users before they get in. I want to add a proper application workflow that opens up an entire category of community types Hive currently can't support. Users submit an application (with community-defined questions), moderators review through the dashboard I already built, and approval either makes them a member outright or activates a recurring membership payment. Communities can be configured as open, application-required, or application-plus-payment-required.
This ties directly into the rest of the payment stack: membership fees are just recurrent transfers in HBD, member-only posts use the same encryption standard as creator subscriptions, and revenue splits across moderators and contributors use the existing beneficiary infrastructure. The result is something genuinely new: professional communities (verified credentials, alumni networks, expert groups), paid niche communities (trading insights, premium hobby groups), and vetted discussion spaces, all natively on Hive with built-in billing and access control. It also fixes the cold-start problem of individual creator subscriptions: a community with 10 contributors and 500 paying members is easier to bootstrap than 10 creators each trying to find 50 subscribers. Most of the infrastructure is already there (community tables, roles, moderation API, notifications, recurrent transfers). What's new is the application workflow, community-level membership gating, and the glue that connects it all.
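The application workflow above reduces to a small state machine. This sketch uses invented state and mode names to show the three community configurations; the real implementation would live in Hivemind and be driven by custom_json operations.

```python
# Toy state machine for the community application workflow described above.
# State and mode names are invented for illustration.

MODES = {"open", "application", "application_plus_payment"}

def apply_to_join(community, account, answers):
    if community["mode"] == "open":
        return {"account": account, "state": "member"}  # no vetting needed
    return {"account": account, "state": "pending", "answers": answers}

def review(community, application, approve):
    if not approve:
        return {**application, "state": "rejected"}
    if community["mode"] == "application_plus_payment":
        # Approval activates a recurring membership payment before entry.
        return {**application, "state": "awaiting_payment"}
    return {**application, "state": "member"}

club = {"name": "hive-333", "mode": "application_plus_payment"}
app = apply_to_join(club, "alice", {"why": "alumni of the program"})
app = review(club, app, approve=True)
print(app["state"])  # awaiting_payment
```

Once the membership payment (a plain recurrent transfer in HBD) starts executing, the account flips to member; cancelling it would lapse the membership the same way a lapsed creator subscription does.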
Why this matters for token demand: every subscription payment, membership fee, and merchant transaction is in HBD. HBD demand means HIVE demand through the conversion mechanism. Creators and community treasuries holding HBD in savings at 14% means supply locked up rather than sold. It's not speculative, it's mechanical. And the infrastructure I'm building is general-purpose: it enables creator subscriptions, paid communities, merchant payments, SaaS billing, or anything else that needs "has this person paid?" logic on-chain.
To be clear about what's in scope and what isn't: I'm building the protocol layer and infrastructure. The consumer-facing product, the pretty frontend, the fiat on-ramp, the creator marketing: that's for Dapp teams. My job is to make sure the rails are there and that they work. That said, I will build a demo Patreon/Substack clone to show investors and builders what's possible.
Ongoing work
I'll also be spending time on the reputation system. It's been broken for a while, and there are ways to make it a relevant metric again. Speaking of reputation, I'll be tackling issues around certain users spamming the chain and improving the experience for normal users so they can effectively mute them into nonexistence. This has already started with some of my API changes, but there is more work to do.
There is also a lot of optimization work waiting to be done, especially now that making large overhauls is so much cheaper. Very much in line with what I've done with communities, there are plenty of opportunities to speed things up significantly.
On top of that, AI has gotten incredibly good at finding vulnerabilities, as we've seen in recent developments. This is both a good and a bad thing: it's now a race to leverage those tools to find and fix vulnerabilities before they are exploited. The Hive ecosystem is no different, and I've already started this work with input validation hardening and XSS fixes on Condenser and HiveD. Expect more of this going forward.
Beyond that, I'll keep doing what I've always done: fixing bugs, responding to feedback from users and Dapp teams, keeping the devportal and docs up to date, and hosting the monthly core dev meetings. The work is rarely glamorous but it's what keeps the chain healthy and the developer experience good.
If you've followed my work over the years, you know what you're voting for. More of the same, but faster, and with a clear focus on making HBD useful enough to drive real demand for HIVE. Note that this list isn't exhaustive. I'll definitely be building more, but priorities shift quickly, especially now with AI, and we adjust month to month with the rest of the core team. For instance, once the light accounts specification is finalized, I'll likely lend a hand on the implementation side.
Voting
Here is an easy link to vote on the proposal:
https://peakd.com/proposals/371
https://ecency.com/proposals/371
You can view all proposals on:
https://wallet.hive.blog/proposals
https://ecency.com/proposals
https://peakd.com/proposals
(Make sure to vote on the upcoming proposal and not the old one!)
Closing words
If you have any questions, please feel free to ask them in the comments!