- Jun 10, 2017
> A few? There are literally hundreds of patreon creators that violate at least one of these in their games.

And there are [link]. So, yes, hundreds is still "few". Even if there were 860 who violated the Terms of Use, it would still be only 10% of them.

> When the rules are enforced at extreme randomness, [...]

Add to those 8,686 creators the [link] who create adult animations, the [link] who do adult photography, the [link] who make adult videos, the [link] who create adult comics, and the [link] who make adult drawings or paintings, and you get 55,176 creators. And there are still thousands of creators that fall into the other adult categories, which easily leads to 70,000 creators, or more, just for the adult content.
And of course, this is without counting the hundreds of thousands of creators who do not make adult content, but still have to comply with the Terms of Use.
In the end, there are [link], and therefore that many that need to be reviewed. By the way, globally one creator out of five falls under an adult category, so the famous "Patreon don't like porn" is pure bullshit.
I'm not really sure you understand how many people would be needed to review them all, especially since their content evolves over time, which means you need to regularly go back to their pages.
Let's assume that one person can review 10 creators per day. With 8 working hours per day, that's a reasonable guess: some content, like games or videos, would need more than an hour to review, while content like photography or drawings would need only a few minutes.
Even if each creator were only reviewed once every six months, you would still need 225 people who would do nothing else with their days than this.
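The headcount arithmetic above can be sketched as follows. The review rate comes from the post; the number of working days in six months and the creator total are my assumptions (the actual figure sits behind the hidden link), chosen so that the result matches the post's 225 reviewers.

```python
import math

def reviewers_needed(creators, reviews_per_day=10, working_days=130):
    """People needed so that every creator is reviewed once in the period.

    reviews_per_day: reviews one person can do in a day (from the post).
    working_days: working days in six months (~26 weeks * 5, an assumption).
    """
    return math.ceil(creators / (reviews_per_day * working_days))

# 225 reviewers * 10 reviews/day * 130 days = 292,500 creators covered,
# so the post's figure implies a creator base of roughly that size.
print(reviewers_needed(292_500))  # -> 225
# Covering only the ~70,000 adult creators would still take dozens of people:
print(reviewers_needed(70_000))   # -> 54
```

The same formula makes it easy to test other cadences: halving the review interval doubles the headcount.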
And of course, with a review only every six months, you would still claim that Patreon makes no effort to enforce its rules.
Now, if you want, I'm sure they could train an AI to do this. Each creator could then be reviewed once a month without real problem.
At least, "without real problem" for the AI, because for the creators it would be another story, given how unreliable AI is when it comes to content moderation. It wouldn't be a handful of creators banned from time to time for actual violations of the Terms of Use, but dozens suspended each day due to false positives. And I'm not sure that's a better solution...
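A quick sketch of why even a small error rate produces "dozens per day". The 70,000 adult creators come from the estimate above; the monthly review cadence is from the post; the 1% false-positive rate is purely an illustrative assumption on my part.

```python
creators = 70_000               # adult creators, from the estimate above
reviews_per_month = creators    # each creator auto-reviewed once a month
false_positive_rate = 0.01      # assumed 1% error rate -- illustrative only

# Creators wrongly flagged on an average day
wrongly_flagged_per_day = reviews_per_month * false_positive_rate / 30
print(round(wrongly_flagged_per_day, 1))  # -> 23.3
```

So even an unusually accurate classifier at 1% false positives would wrongly flag roughly two dozen creators every single day; a more realistic error rate would only make it worse.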