As part of Utopian's development moderation team, and to help the team establish a baseline for the current review time - one of our Key Performance Indicators (KPIs) - I thought it best to take a closer look at the moderation (review) time for the development category.
Outline
- Monthly Contributions and Rejection Rate
- Reject Rate per Moderator
- Monthly Average Review Time
- January
- February
- March
- Was There Moderation Abuse?
- Conclusion and Target KPI
Scope of Analysis
The data extracted covers January 19 through March 8, 2018.
1. Monthly Contributions and Rejection Rate
Based on the left chart, from Jan. 19 to Mar. 8, 2018, a total of 839 contributions were reviewed by 22 moderators.
701 (83.55%) of these were accepted while the remaining 138 (16.45%) were rejected.
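As a quick sanity check, the percentages follow directly from the raw counts in the chart (839 reviewed, 701 accepted):

```python
# Recompute the acceptance/rejection split from the chart's raw counts.
total = 839
accepted = 701
rejected = total - accepted  # 138

accept_rate = accepted / total * 100
reject_rate = rejected / total * 100

print(f"accepted: {accepted} ({accept_rate:.2f}%)")  # 701 (83.55%)
print(f"rejected: {rejected} ({reject_rate:.2f}%)")  # 138 (16.45%)
```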
2. Reject Rate per Moderator
There were 5 moderators who rejected (flagged) 100% of what they reviewed, while 8 moderators accepted 100%. In between, the moderator who rejected the most did so at a 53% rate.
It's easy to accept, but requires guts to reject.

How about sorting this by the number of reviewed contributions?
, the top reviewer, flagged at 22%, while flagged at 21%.
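The per-moderator reject rate is a simple grouped count over the review log. A minimal sketch of the computation, using a made-up sample in place of the Utopian MongoDB export described at the end of the post:

```python
from collections import defaultdict

# Hypothetical review log: (moderator, accepted_flag).
reviews = [
    ("mod_a", True), ("mod_a", False), ("mod_a", True), ("mod_a", True),
    ("mod_b", False), ("mod_b", False),
    ("mod_c", True),
]

counts = defaultdict(lambda: [0, 0])  # moderator -> [reviewed, rejected]
for moderator, accepted in reviews:
    counts[moderator][0] += 1
    if not accepted:
        counts[moderator][1] += 1

# Sort by number of reviewed contributions, most active first.
for moderator, (reviewed, rejected) in sorted(
        counts.items(), key=lambda kv: -kv[1][0]):
    print(f"{moderator}: rejected {rejected / reviewed:.0%} of {reviewed}")
```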
Let's look into the details of how long it took to review these contributions.
3. Monthly Average Review Time
3.1. January
For January, the average review time was around 8 hours. This came from 12 moderators reviewing 236 contributions.
The quickest of these reviewers were and , reviewing contributions within the first 2 hours, while the moderator who reviewed the problematic, perhaps non-trivial, contributions was , taking an average of 14 hours.
Initially, I thought the longer review times were bad, but once I became a moderator I understood why some contributions take time to review: the hard part is deciding whether or not to accept a contribution.
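For clarity, "review time" here means the gap between a contribution's creation and the moment a moderator accepted or rejected it, averaged over the month. A small sketch with made-up timestamps:

```python
from datetime import datetime

# (created, reviewed) timestamp pairs; the sample values are made up.
pairs = [
    ("2018-01-20T10:00:00", "2018-01-20T12:00:00"),  # 2 h
    ("2018-01-21T08:00:00", "2018-01-21T22:00:00"),  # 14 h
    ("2018-01-22T00:00:00", "2018-01-22T08:00:00"),  # 8 h
]

def hours_between(created, reviewed):
    fmt = "%Y-%m-%dT%H:%M:%S"
    delta = datetime.strptime(reviewed, fmt) - datetime.strptime(created, fmt)
    return delta.total_seconds() / 3600

avg = sum(hours_between(c, r) for c, r in pairs) / len(pairs)
print(f"average review time: {avg:.1f} hours")  # 8.0 hours
```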
3.2. February
For February, the average review time increased to 32 hours. This came from 21 moderators reviewing 475 contributions.
The quickest of these reviewers was , who approved one contribution within an hour, while the moderator who took on the oldest contributions was , approving/rejecting a contribution after 3 days.
3.3. March
For March, the average review time again increased to 40 hours. This came from 13 moderators reviewing 128 contributions.
The quickest of these reviewers was , who approved one contribution within 5 hours of its creation, while the moderator who took on the challenge of reviewing the oldest contribution was , approving/rejecting it after 3.5 days.
From January to March, review time kept increasing, peaking this month at 40 hours.
4. Was There Moderation Abuse?
One of the things moderators or supervisors have to check for is moderation abuse. I am familiar with, or at least somewhat know, the people behind the development moderators and see them as moderators with character, and I hope they take no offense at me looking into this data.
Because I had difficulty choosing and learning a visualization tool that could help detect abuse (a single moderator approving the same account multiple times), I picked the top 3 contributors and checked which moderators approved those authors' contributions, and how often.
The top 3 contributors subjected to analysis were , , and .
Based on the chart above, 6 moderators approved 's 47 contributions.
For , 4 moderators approved his 27 contributions.
And for , 8 moderators approved his 27 contributions.
Perhaps this is a good indicator of NO abuse: an account being approved by multiple moderators, not just a single one.
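The heuristic above boils down to counting the distinct moderators who approved each author. A minimal sketch, with a hypothetical approval log in place of the real export:

```python
from collections import defaultdict

# Hypothetical approvals: (author, approving_moderator). An author whose
# contributions are only ever approved by one moderator is worth a closer look.
approvals = [
    ("author_x", "mod_a"), ("author_x", "mod_b"), ("author_x", "mod_c"),
    ("author_y", "mod_a"), ("author_y", "mod_a"), ("author_y", "mod_a"),
]

mods_per_author = defaultdict(set)
for author, moderator in approvals:
    mods_per_author[author].add(moderator)

for author, mods in sorted(mods_per_author.items()):
    status = "worth a closer look" if len(mods) == 1 else "looks fine"
    print(f"{author}: {len(mods)} distinct moderator(s) -> {status}")
```

A single-moderator author is not proof of abuse on its own (a small category may simply have one active moderator), but it narrows down where to look.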
5. Conclusion and Target KPI
Based on the data presented above, the review time of the development category increased month over month: from 8 hours in January to 32 hours in February, and now 40 hours for March. We also saw that the oldest reviewed contributions waited more than 3 days - beyond the 2-day (48-hour) review time Utopian indicates when a contribution is submitted for review.
We also learned that the approval rate of the development category currently stands at around 84%, the same figure as in 's analysis.
and I may have used different sources, but we came up with the same figures - the beauty of analysis done correctly.
And lastly, we've also looked at a potential visual tool for detecting moderation abuse. The abuse check done here was meant as a trial, to be reused in succeeding analyses of potential moderation abuse. We also found a potential indicator of NO abuse: multiple moderators approving an author's contributions.
Tools
- Power BI for charts
- Utopian.io local DB (MongoDB) for the data
- Studio 3T MongoDB for exporting data from local DB
{ "json_metadata.type": "development", $and: [ { "created": { $gte: "2018-01-19" } } ] }
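The query selects development-category documents created on or after January 19, 2018 (the `$and` wrapper around the single date condition is redundant but harmless). Because the `created` field is stored as an ISO-8601 string, the `$gte` comparison works as plain string comparison. A pure-Python equivalent of the same filter, applied to an exported list of documents (the sample documents are made up):

```python
# Sample exported documents; field names follow the Mongo query above.
docs = [
    {"json_metadata": {"type": "development"}, "created": "2018-01-25"},
    {"json_metadata": {"type": "development"}, "created": "2018-01-10"},
    {"json_metadata": {"type": "tutorials"},  "created": "2018-02-01"},
]

# ISO-8601 date strings sort correctly as plain strings, so the $gte
# comparison against "2018-01-19" is a simple string comparison.
selected = [
    d for d in docs
    if d["json_metadata"]["type"] == "development"
    and d["created"] >= "2018-01-19"
]

print(len(selected))  # 1
```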
I am part of a Blockchain Business Intelligence community. We all post under the tag #BlockchainBI. If you have an analysis you would like carried out on Steemit or Blockchain data, please do contact me or any of the #BlockchainBI team and we will do our best to help you...
You can find #blockchainbi on discord https://discordapp.com/invite/JN7Yv7j
Posted on Utopian.io - Rewarding Open Source Contributors