VRChat Security
Today, VRChat's developers made a new announcement about the security of VRChat and what they are doing to protect it. They announced a new system they have been working on, called a trust system. I strongly disagree with this choice.
I'll share the original post by the VRChat devs, and explain why I disagree. I would also love to hear what everyone else thinks about a system like this within a social game.
Trust Systems — Content Gating
Hello, VRChat! We’ve been working on some new “Trust” systems to help make VRChat a friendlier place. These systems will be used to help gate various features until users have proven themselves to be friendly members of the community. One of the first parts of the Trust system is called “Content Gating”. This system is designed to reduce abusive or annoying behavior involving avatars or other content.

Here’s generally how it works. When a user first creates a new VRChat account, they will be unable to upload custom content like worlds or avatars. After spending some time in the app and having positive interactions with other users, they will eventually receive in-app and email notifications that their account has access to world and avatar creation capability. This time may vary from user to user depending on various factors.
If the new user chooses to spend time in VRChat behaving badly or maliciously against other users, they may lose the capability to upload content. They will receive a notification in-app and via email that they have lost access to content uploading. If they spend more time in the app and follow the Community Guidelines, then they will eventually regain access to these systems. Again, this time may vary depending on various factors.
If a user does not have access to content uploading and they attempt to upload something via the SDK, they will receive an error. If they aren’t running the latest SDK, this error may not appear properly, but they still will be unable to upload content.
When the system launches, all VRChat account users will receive notifications indicating their status in the Content Gating system. If you believe there has been some error with the system, please feel free to contact us — however, VRChat will not help you sidestep or bypass the Content Gating system.
As an aside, VRChat staff and developers will not divulge the details of how any of the Trust systems work to any user. Development of the Trust systems will be an ongoing effort so that we can make VRChat a better place for all of our users.
Finally, as always, ensure that you are utilizing our Social features to defend yourself against users that you do not wish to interact with. Mute, Block, Vote Kick, and Safe Mode are the best ways to keep yourself safe and secure.
For anyone who wants a TLDR version:
They are basically adding a system that will keep you from uploading any content (worlds and custom avatars) until you are considered "trusted" in the VRChat world. Trusted, as in not blocked by people, not kicked, and not doing anything they decide to consider "misbehaving". So you get punished just for being a new user.
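To make that concrete, here is a rough sketch in Python of what a gate like this could look like under the hood. To be clear, this is purely my own guess: VRChat has not published any thresholds, penalties, or event names, or even said whether a numeric score exists, so every value below is invented for illustration.

```python
from dataclasses import dataclass

# Every number and event here is invented for illustration; VRChat has
# not said how trust is scored, or whether a numeric score even exists.
UPLOAD_THRESHOLD = 100
POSITIVE_SESSION = 10   # hypothetical reward: a session with no complaints
BLOCKED_BY_USER = -25   # hypothetical penalty: another user blocks you
VOTE_KICKED = -40       # hypothetical penalty: you get vote-kicked

@dataclass
class Account:
    trust: int = 0

    @property
    def can_upload(self) -> bool:
        # "Content Gating": worlds/avatars can only be uploaded once the
        # (hypothetical) trust score clears the threshold.
        return self.trust >= UPLOAD_THRESHOLD

def apply_event(account: Account, delta: int) -> None:
    """Apply a social signal and notify the user if their gate status flips."""
    before = account.can_upload
    account.trust += delta
    if account.can_upload != before:
        # Per the announcement, status changes trigger in-app/email notices.
        status = "granted" if account.can_upload else "revoked"
        print(f"Upload access {status} (trust={account.trust})")

# The abuse case I worry about: an account in good standing gets
# mass-blocked after a disagreement and loses access without breaking
# a single rule, and with no appeal process.
veteran = Account(trust=110)
for _ in range(5):            # five strangers block them out of spite
    apply_event(veteran, BLOCKED_BY_USER)
print(veteran.can_upload)     # False -- trust is now -15
```

Whatever shape the real system takes, it has to map social signals like blocks and kicks onto a yes/no upload gate, and that mapping is exactly where the abuse scenario in my response below breaks it.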
Here was my response to the post.
This seems like a stupid, quickly thrown-together way to deal with the long-term issue of security. The outline had better show an intuitive trust system that won't punish users who haven't actually done anything "wrong". Like others have said, these systems can often be easily abused, and they are usually very flawed.
For example: what if someone (maybe a streamer with lots of followers) simply disagrees with another user's opinion and sics a crowd on that user, who hasn't really broken any rules, to block them (or "report" them, if that's going to be an option)? Suddenly, that user is punished for having an opinion, not for breaking any rules. There is no system where they can appeal to a mod, or even present proof to defend themselves.
I’m curious to see what you present to us, and I hope you are thinking of better long-term options for the security of VRChat.
I also added a comment under my post:
I would also like to add: how does this help against hackers, broken SDKs, and some of the new hackers who even block the option to report them, block them, mute them, or interact with them at all? Their names don't even appear in the room list. Like I said, this seems like a Band-Aid on a larger wound.
Here are some other comments:
I agree with both sides.
If they manage to make an intuitive trust system that doesn't allow for abuse, I would love to see it better our VRChat experience. On the other hand, if it's the same old trust system with the usual exploits, they might as well trash it and go back to the drawing board. I've been kicked from rooms before just for joining them. I've also seen friends kick each other as a joke. Now they might lose the ability to upload avatars because of something like that? How can the system determine why you were kicked, and that it wasn't just because you joined a Knuckles room without a Knuckles avatar or something? There are sooo many flaws.
What do you think about this system?