Video Platform Go Boom: Perspectives on the Adpocalypse

As it becomes increasingly obvious that a sea change is occurring at YouTube with respect to how the company conducts business and governs its user base, it's time we had a meaningful conversation about the use of third-party content aggregation platforms and the long-term effects of putting too many eggs in one basket.

Only a few generations have been lucky enough to witness the birth of the World Wide Web (and the mass commercialization of the Internet proper) and still have the privilege of living a reasonable number of years on both sides of that flashbulb moment in history. Mine is one of them: together, we've grown with it, nurtured it, augmented our lives with it, watched it evolve, and drawn incredible benefit from the technological revolution that followed. Today all manner of computer systems cross paths with our lives hundreds of times a day, rarely eliciting a thought.

We've become so intimately tied to our technology that invisible design has become an exquisitely refined, and generally expected, norm. Where once sharing content on the Web was an intellectually expensive and fairly time-consuming undertaking, one that often required an individual to learn various back-end technologies and programming languages as well as visual design and its attendant software, nowadays most people rely on a multitude of turn-key solutions that do much of the thinking and heavy lifting for us, offering decent integration with very little downtime.

Well, at least until that service changes the rules, limits its features, crashes, or liquidates its assets.

Then we have a problem.

As interest in the near-effortless sharing phenomenon has grown and given rise to social networking websites, massive resources and unhealthy levels of dependence have gone into propping up an ever-fattening handful of too-big-to-fail Web corporations, all of which do business under an astoundingly wide umbrella of disclaimers and indemnities that shield them from most forms of liability (and certainly don't put them on any kind of imaginable level playing field with the end user). Studies have noted that users rarely read the Terms of Service before clicking "I Accept."

As any seasoned Web host knows, user-generated content (UGC) is notoriously unpredictable in terms of subject matter, and content policing is a nightmare in every sense of the word.

Having all of those restrictions and rules in the ToS is a hedge against the inevitable conflict that arises when some random idiot decides to upload offensive or illegal material. It's much less legally perilous to the site operator, not to mention easier to address the source of the material in general, when the network implements a consistent usage policy with well-defined boundaries.

And yes, it's also true that under a benevolent ecosystem, where the sysadmins are mostly friendly and have the necessary time to address questionable or flagged material, and common sense is used, things generally turn out well for the user base. On a basic site that's limited to a subset of content types (such as a bulletin board), a community of thousands can be managed by a moderation team of five to fifty individuals, or, in certain cases with the right software or a particularly well-disciplined audience, even a single moderator can pull it off.

This is not true of social media, which carries a general audience and behaves as an entirely different beast. Once a user-generated content network begins undergoing the kind of exponential growth that produced platforms like YouTube, Twitter, and Facebook, the volume of incoming data becomes so vast that any in-house moderation team and technological measures are pushed far beyond their capacity and processing limits. There's no way humans can sift through that much data, and since AI is still not a fully practical solution, attempts to employ machine algorithms in place of humans can produce unpredictable outcomes.
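To see why human review alone can't keep pace, consider a back-of-envelope sketch. The upload rate below is an assumption (figures on the order of 400 hours of video uploaded per minute were widely reported for YouTube around this period); the shift length and review speed are illustrative guesses, not platform data.

```python
# Back-of-envelope estimate: headcount needed to watch every upload once.
# UPLOAD_HOURS_PER_MINUTE is an assumed figure, not an official statistic.

UPLOAD_HOURS_PER_MINUTE = 400   # assumed hours of video uploaded per minute
SHIFT_HOURS = 8                 # assumed working hours per moderator per day
REVIEW_SPEED = 1.0              # hours of video reviewed per hour worked (real-time)

# Total hours of new video arriving in a single day.
video_hours_per_day = UPLOAD_HOURS_PER_MINUTE * 60 * 24

# Moderators required just to view each upload once, end to end.
moderators_needed = video_hours_per_day / (SHIFT_HOURS * REVIEW_SPEED)

print(f"{video_hours_per_day:,} hours of new video per day")
print(f"~{moderators_needed:,.0f} full-time moderators just to watch it once")
```

Under these assumptions the math works out to 576,000 hours of new video per day, or roughly 72,000 full-time moderators doing nothing but watching, before any time is spent on judgment calls, appeals, or context-gathering. That gap is the whole reason platforms reach for algorithms.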

For those who aren't big on following tech news: yes, this has happened, and yes, it's affected many of the big-name social media platforms.

After getting their pride trampled by an influx of violent video uploaders, propagandists, and troll farms, Twitter and Facebook have both turned substantial resources toward hiring more moderators in an effort to stave off the tide of low-quality content that's become a problem for those networks in recent years.

Google, too, has sought legions of content reviewers in the wake of an ever-expanding spiral of negative press and advertiser withdrawals stemming from poor implementation of machine algorithms and uneven enforcement of its community guidelines by human moderators. Meanwhile, adding panic and chaos to YouTube's situation is the fact that the newer, stricter algorithms and moderation systems have caused additional problems by being unpredictable and heavy-handed, potentially drawing even more unwanted attention to the original problem.

It's been argued that while some companies have made efforts to increase moderation resources in an attempt to stabilize the ecosystem, the longer trend may be turning toward the de-emphasis of user-created content, with most of the major video streaming platforms already inking deals to build up their inventories of professionally produced, paid-premium content.

Somewhere in the middle of that hot mess, this court case (PDF) and its subsequent dismissal (PDF) come in. If someone asked me to come up with a symbol that represents the unpleasant state of the video platform situation at present, I think we've got ourselves a winner.

Below is a half-hour video by Leonard French, a lawyer who periodically blogs about public interest cases on the Internet, in which he sums up his take on the case:

In short, after losing nearly all video revenue during a YouTube crackdown whose bad algorithms and oppressive moderation inflicted widespread collateral damage on many creators, ZombieGoBoom LLC filed an attempted class action in hopes of recovering its lost earnings and remedying its content management issues. Unfortunately for them, YouTube's Terms of Service agreement precludes any sort of 'right' to have ads served on videos, so for that and other reasons, the suit was dismissed with prejudice.

The real tragedy of the ZombieGoBoom lawsuit is that it illuminates the humiliating lack of bargaining power YouTubers possess. If even big-name artists can't get a reasonable callback or a speedy resolution in the course of dealing with the company and its platform moderators, everyone has a serious problem on their hands.

Currently, there is no 'strong arm of the union' to back up all of the beleaguered artists who are getting smacked around for no good reason. There's no one who can organize a strike, or stand up to the toxicity of the half-baked censorship. Either live with the Terms of Service as they stand, or show yourself the door? That's not a healthy work environment, especially when you factor in the sheer randomness and instability surrounding much of the enforcement that's been going around.

Recently, one of YouTube's more well-known creators, Joerg Sprave, put forward an answer to this dilemma. He does want to create this union, and he seems serious about making it work. Given the tailspin the platform is in, I hope it's not too late. Either way, his message is worth spreading. Artists need someone on their side, too.


When media platforms achieve the size and reach that Facebook, Twitter, and YouTube have, that power comes with a duty to protect the user base from undue disruption, conflict, and drama.

If a company can't / won't / doesn't show its users through clear actions that it cares, there's eventually going to be a crisis of confidence, and people will start jumping ship. In YouTube's case, I've witnessed this happening since about 2013 as more creators take their revenue streams off-platform by enrolling with Patreon, PayPal, MMO gaming promoters, private T-shirt shops, brand sponsorships, and more.

And guess what? It's not letting up.

If anything, Adpocalypse was akin to some insane manager drenching the entire office in gasoline and threatening to light a match. It scared the shit out of creators, and even more of them are now working with off-platform income streams.

When someone threatens to destroy your livelihood, the last thing any reasonable person does is sit down, shut up, and do nothing.

Hopefully the good folks at ZombieGoBoom have already made the leap. I applaud their efforts to stand up for themselves and others in court, but clearly, it's not just a matter of wage losses at this point: we need to change the platform itself, and level the playing field so that creators' voices matter.

There's a fear lurking at the back of my mind that the current exodus from user-generated video networks could drive a return to the kind of fragmentation we saw during the early days of the Web, when the fertile content creation platforms we have today simply didn't exist. Audiences were small, reach was limited, and monetization was a nightmare.

Modern social media systems and user-generated content may have their drawbacks, but in the bigger picture they've contributed far more good to the world through widespread interconnection and the democratization of technology.

It would be best if we learned from that and improved what we have today. There's no good reason to tear it all down and reinvent the wheel, just as there's no good reason to smack innocent creators around as if for sport.
