As it becomes increasingly obvious a sea change is occurring at YouTube with respect to how the company conducts business and governs its user base, it’s time we had a meaningful conversation about the use of third-party content aggregation platforms and the long-term effects of putting too many eggs into the same basket.
Only a few generations have been lucky enough to witness the birth of the World Wide Web (and mass commercialization of the Internet proper) and still have the privilege of living a reasonable number of years on both sides of that flashbulb moment in history. Mine is one of them: together, we’ve grown with it, nurtured it, augmented our lives with it, watched it evolve — and we’ve drawn incredible benefit from the technological revolution that followed. Today all manner of computer systems cross paths with our lives hundreds of times a day, and it rarely elicits a thought.
We’ve become so intimately tied to our technology that invisible design has become an exquisitely refined, and generally expected, norm. Where once the sharing of content on the Web was an intellectually expensive and fairly time-consuming undertaking — often requiring an individual to learn various back-end technologies and programming languages as well as visual design and its attendant software — nowadays, most people rely on a multitude of turn-key solutions that do much of the thinking and heavy lifting for us, offering decent integration with very little downtime.
Then we have a problem.
As interest in the near-effortless sharing phenomenon has grown and given rise to social networking websites, massive resources and unhealthy levels of dependence have gone into propping up an ever-fattening handful of too-big-to-fail Web corporations, all of which do business under an astoundingly wide umbrella of disclaimers and indemnities that shield them from most forms of liability (and certainly don’t put them on any kind of imaginable level playing field with the end user). Studies have noted that users rarely read the Terms of Service before clicking “I Accept.”
As any seasoned Web host knows, user-generated content (UGC) is notoriously unpredictable in terms of subject matter, and content policing is a nightmare in every sense of the word.
Having all of those restrictions and rules in the ToS is a hedge against the inevitable conflict that arises when some random idiot decides to upload offensive or illegal material. It’s much less legally perilous to the site operator, not to mention easier to address the source of the material in general, when the network implements a consistent usage policy with well-defined boundaries.
And yes, it’s also true that under a benevolent ecosystem, where the sysadmins are mostly friendly, have the time to address questionable or flagged material, and apply common sense, things generally turn out well for the user base. On a basic site that’s limited to a subset of content types (such as a bulletin board), a community of thousands can be managed by a moderation team of five to fifty individuals; in certain cases, with the right software or a particularly well-disciplined audience, even a single moderator can pull it off.
This is not true of social media, which carries a general audience and behaves as an entirely different beast. Once a user-generated content network begins undergoing the kind of exponential growth that produced platforms like YouTube, Twitter, and Facebook, the volume of incoming data becomes so vast that any in-house moderation team and technological measures are pushed far beyond their capacity and processing limits. No team of humans can sift through that much data, and since AI is still not a fully practical solution, attempts to employ machine algorithms in place of humans can produce unpredictable outcomes.
For those who aren’t big on following tech news, yes, this has happened, and yes, it’s affected many of the big-name social media platforms.
After getting their pride trampled by an influx of violent video uploaders, propagandists, and troll farms, Twitter and Facebook have both turned substantial resources toward hiring more moderators in an effort to stem the tide of low-quality content that’s become a problem for those networks in recent years.
Google, too, has sought legions of content reviewers in the wake of an ever-expanding spiral of negative press and advertiser withdrawals stemming from poor implementation of machine algorithms and uneven enforcement of its community guidelines by human moderators. Meanwhile, adding panic and chaos to YouTube’s situation is the fact that the newer, stricter algorithms and moderation systems have caused additional problems by being unpredictable and heavy-handed, potentially drawing even more unwanted attention to the original problem.
It’s been argued that while some companies have made efforts to increase moderation resources in an attempt to stabilize the ecosystem, the longer trend may be turning toward the de-emphasis of user-created content, with most of the major video streaming platforms already inking deals to build up their inventories of professionally produced and paid-premium content.
Somewhere in the middle of that hot mess, this court case (PDF) and its subsequent dismissal (PDF) come in. If someone asked me to come up with a symbol that represents the unpleasant state of the video platform situation at present, I think we’ve got ourselves a winner.
Below is a half-hour video by Leonard French, a lawyer who periodically blogs about public interest cases on the Internet, in which he sums up his take on the case:
In short, after losing nearly all video revenue during a YouTube crackdown that inflicted widespread collateral damage on many creators through bad algorithms and oppressive moderation, ZombieGoBoom LLC filed an attempted class action in hopes of remedying the lost earnings and content management issues. Unfortunately for them, YouTube’s Terms of Service agreement precludes any sort of ‘right’ to have ads served on videos, and for that and other reasons the suit was dismissed with prejudice.
The real tragedy of the ZombieGoBoom lawsuit is that it illuminates the humiliating lack of bargaining power YouTubers possess. If even big-name artists can’t get a reasonable callback or a speedy resolution in the course of dealing with the company and its platform moderators, everyone has a serious problem on their hands.
Currently, there is no ‘strong arm of the union’ to back up all of the beleaguered artists who are getting smacked around for no good reason. There’s no one who can organize a strike, or stand up to the toxicity of the half-baked censorship. Either live with the Terms of Service as they stand, or show yourself the door? That’s not a healthy work environment, especially when you factor in the sheer randomness and instability surrounding much of the enforcement that’s been going on.
Recently, one of YouTube’s more well-known creators, Joerg Sprave, put forward an answer to this dilemma. He wants to create just such a union, and he seems serious about making it work. Given the tailspin the platform is in, I hope it’s not too late. Either way, his message is worth spreading. Artists need someone on their side, too.
When media platforms achieve the size and reach that Facebook, Twitter, and YouTube have, that power comes with a duty to protect the user base from undue disruption, conflict, and drama.
If a company can’t / won’t / doesn’t show its users through clear actions that it cares, there’s eventually going to be a crisis of confidence, and people will start jumping ship. In YouTube’s case, I’ve witnessed this happening since about 2013 as more creators take their revenue streams off-platform by enrolling with Patreon, Paypal, MMO gaming promoters, private T-shirt shops, brand sponsorships, and more.
And guess what? It’s not letting up.
If anything, Adpocalypse was akin to some insane manager drenching the entire office in gasoline and threatening to light a match. It scared the shit out of creators, and even more of them are now working with off-platform income streams.
When someone threatens to destroy your livelihood, the last thing any reasonable person does is sit down, shut up, and do nothing at all.
Hopefully the good folks at ZombieGoBoom have already made the leap. I applaud their efforts to stand up for themselves and others in court, but clearly, it’s not just a matter of wage losses at this point: we need to change the platform itself, and level the playing field so that creators’ voices matter.
There’s a fear lurking at the back of my mind that the current exodus from user-generated video networks could drive a return to the kind of fragmentation we saw during the early days of the Web, when the kinds of fertile content creation platforms we have today simply didn’t exist. Audiences were small, reach was limited, and monetization was a nightmare.
Modern social media systems and user-generated content may have their drawbacks, but on balance they’ve contributed far more good to the world through widespread interconnection and the democratization of technology.
It would be best if we learned from that, and improved what we have today. There’s no good reason to tear it all down and reinvent the wheel, just like there’s no good reason to smack innocent creators around as if for sport.