The Sea-Change at YouTube

It’s time we had a conversation about censorship.

Recently, a mass exodus of major advertisers occurred at YouTube, throwing the platform’s ecosystem into disarray. As noted by YouTubers and mainstream media outlets alike, the precipitating event seems to have been a small number of government and corporate ads appearing alongside racist hate videos on a very small number of channels. The issue was brought to the attention of governments and corporations in a high-profile manner, and from there, industry brass decided to pull all advertising off the YouTube platform, citing the desire not to be associated with harmful content.

As various media outlets have reported, it’s an odd narrative to follow, given that this problem has existed for many, many years; until the middle of 2016, it rarely made the news. Furthermore, despite the historical efforts made by media companies (especially Google) to stamp out racist and other extremist content, the issue remains difficult to address owing to the sheer volume of data being uploaded at any given time.

In YouTube’s case, at least 300 hours of video are uploaded each minute (though some put that number as high as 400 hrs/min). If we go with the lowest estimate, that’s still 18,000 hours of video in an hour, 432,000 hours of video in a day, or 12.96 million hours in a 30-day month. These numbers are definitely not in Google’s favour, and despite valiant efforts to screen user-generated content, Internet media companies as a rule face a never-ending, uphill battle when it comes to managing these enormous volumes of user-generated content.
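Those figures follow directly from the conservative 300 hours-per-minute estimate; as a quick back-of-envelope check:

```python
# Back-of-envelope check of the upload-volume figures above,
# assuming the conservative estimate of 300 hours of video per minute.
HOURS_PER_MINUTE = 300

per_hour = HOURS_PER_MINUTE * 60   # 18,000 hours uploaded per hour
per_day = per_hour * 24            # 432,000 hours per day
per_month = per_day * 30           # 12,960,000 hours in a 30-day month

print(f"{per_hour:,} h/hr, {per_day:,} h/day, {per_month:,} h/month")
```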

Similar to the ongoing situation at Facebook (and its implications for that network’s 1.2 billion daily users), the logistics are impossible when it comes to setting up a purely human intervention as a solution to harmful content. There’s no practical way for Google, or any ultra-high-volume media company for that matter, to retain sufficient human staffing to individually review each piece of user-generated content that comes in the door. As a result, industry-standard practice includes the use of software algorithms as gatekeepers and the automation of most issues related to policy enforcement and content management.

For the most part, this functions quite well. The fact that people don’t encounter more offensive material on a given platform than they presently do remains an impressive testament to this technological achievement. Most people who haven’t worked for an ISP or a user-created media company aren’t aware of the vast complex of systems running 24/7 behind the scenes, responsible for the majority of the user experience, never mind the innumerable roles and responsibilities filled by humans in company InfoSec, LEO Liaison, Member Safety, and other departments. When things are working normally, 99% of back-end operations are invisible to end-users. Generally speaking, it’s only when something breaks or goes horribly wrong that people sit up and take notice.

At the time of this writing, the world stands in the shadow of the 2016 US election and its news cycle, during which major controversies concerning fake news and racism whipped corporations, governments, and the public alike into a massive frenzy, changing the online rules of engagement practically overnight. As a result, people are not only taking aim at legitimately harmful material like propaganda and racism; the excessively sudden and sloppy deployment of censorship has also produced a backlash, owing to the phenomenal amount of unintended, innocent by-catch.

Five years ago, this shit wouldn’t fly. If the current political anxiety didn’t exist today, we’d see corporations and governments exercising more restraint, which would take the form of finding an integrated technological solution to address the actual issue, rather than scapegoating and uprooting the vast sanctuaries of free expression we’ve worked so hard to cultivate.

I’m talking specifically about the YouTube advertiser boycott here. Don’t get me wrong, boycotts can achieve admirable results when done for the right reasons and when organized and executed in good faith, but I don’t think we’re looking at noble motivations or good faith in this situation. Everything about it, quite frankly, stinks.

Millions of people rely on YouTube as a communications medium to network with others, and thousands more use it for employment; neither group is using it for illicit purposes. Why should even one of these individuals bear any part of the punishment levied by the ad industry and media companies over the actions of a small number of bad actors? This is what we call collateral damage, pure and simple, and the clumsy approach taken in this case didn’t help, either. The innately complicated and sensitive nature of operating a media company demands nuance and caution to get results that a majority of its creators and consumers can live with. This is why the ‘ad-pocalypse,’ its application as policy, and its handling since application have all come across as crass and irresponsible.

Massive shocks to the revenue system also aren’t good for a media company that might stand to re-circulate some of that ad money for staffing and R&D needs, which in turn can pay dividends toward more effective content management. See where I’m going with this? Over months or years, a sufficient income loss could actually make it more difficult to advance long-term, integrated solutions for tackling the problem of racist extremism. What happens in that case? We end up seeing much less sophisticated (and less effective) solutions deployed.

Case in point: the latest changes to YouTube. Google has been augmenting its TOS with more extreme measures and enforcement, and tightening its filtering algorithms, in order to save face with advertisers. Judging by the response from the ad industry, that effort came far too late to change their minds; meanwhile, everyone is stuck with a new problem: the stricter “advertiser friendly” rule has become financially disastrous for many high-income content creators. The majority of these problems seem to have arisen from the staggering amount of accidental by-catch produced by the beefed-up filtering algorithms, though some blame has also been placed on overzealous enforcement of various site policies in the effort to stay ‘advertiser friendly.’ There’s been no shortage of channel operators and content creators willing to share their personal experiences with the mass demonetization, censorship, and accidental deletion of videos that has been taking place.
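The by-catch dynamic is a familiar trade-off in automated filtering: lowering the score threshold at which a classifier acts catches more genuinely harmful uploads, but sweeps in more harmless ones too. A toy sketch (the scores and labels here are invented for illustration, not real YouTube data or YouTube’s actual classifier):

```python
# Toy illustration of filtering "by-catch". Scores and labels are
# invented; this is not real YouTube data or YouTube's actual classifier.
videos = [
    # (title, classifier_score, actually_harmful)
    ("extremist recruitment clip", 0.92, True),
    ("political commentary",       0.71, False),
    ("gun safety demo",            0.64, False),
    ("makeup review",              0.55, False),
    ("cat video",                  0.10, False),
]

def flagged(threshold):
    """Titles the filter would act on at a given score threshold."""
    return [title for title, score, _ in videos if score >= threshold]

def by_catch(threshold):
    """Harmless titles swept up at that threshold."""
    return [title for title, score, harmful in videos
            if score >= threshold and not harmful]

# A cautious threshold acts only on the genuinely harmful upload...
print(by_catch(0.90))   # []
# ...while a "beefed-up" threshold sweeps in three harmless channels.
print(by_catch(0.50))
```

Tightening the threshold from 0.90 to 0.50 here triples the number of videos acted on while catching zero additional harmful uploads, which is the shape of the problem creators have been describing.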

Initially, the LGBTQ community bore the brunt of this by-catch when the issue first became known in the second half of 2016. Following a protracted stream of responses by YouTubers and coverage by mainstream media outlets, Google apologized and attempted to correct the error, but the controversy only spread as algorithm and policy issues worsened and began to affect an increasingly wide user base: LGBTQ channels, gun culture, makeup reviews, various political commentators, and many other people and organizations from all walks of life.

Problems encountered by YouTube users while trying to appeal wrongful demonetization and censorship have brought the system under further fire, as a recent bug made it impossible to lodge an appeal at all. Despite another sheepish Google apology, the damage has been done, and the YouTube user base has clearly become upset.

Many unintended consequences have come as a result of the recent disruptions, but perhaps none are as visible and telling as the number of high-profile YouTubers who have begun focusing their monetization efforts off-site. Have you noticed your favourite YouTubers asking for support on Patreon? What about directing viewers to off-site vanity domains or channel-related consumer products sites? It’s not hard to read between the lines here: the writing is on the wall. If you haven’t noticed it yet, you will before 2017 is over. These changes are bellwethers for the way creators intend to structure their incomes over the next five to ten years. It’s their way of voting with their wallets.

While it remains to be seen how serious (or not) an effect the migration of income exchange systems could have on YouTube and its advertisers, the message it communicates is clear: people are getting scared, and they don’t want all their eggs in one basket. This trend actually began a couple of years ago, but the Patreon phenomenon in particular exploded when YouTube clamped down and began mass demonetization of videos in 2016.

What happens to platform loyalty if the revenue streams fall into decline? For that matter, what happens to user loyalty when a network’s terms of service become less than tolerable?

With YouTube noting 50% year-over-year growth in channels earning six-figure incomes, it seems especially unfortunate to lose so many advertisers right now, not to mention all the antipathy and bad press its overreaction to the ad-pocalypse has managed to rack up. Can such growth and creator incentives continue? Will the YouTube ecosystem be sustainable once the current controversy dies down?

While I hope Google finds a way to smooth things over, it will not be acceptable if this comes at the cost of the user base, or at the cost of basic freedom of expression. YouTube was founded on openness, sharing, free speech and expression, and unreasonable demands that content now be ‘advertiser friendly’ or else face loss of revenue or censorship are exactly that: unreasonable. It’s not all that much different from a dictatorial government suddenly moving in and shaking things up, telling people what they can and can’t do.

Because of their place in the online revenue cycle, advertisers by default share a significant portion of the moral and fiscal responsibility to support free expression and play the long game when it comes to sustaining and aiding a high-quality media product such as YouTube. This includes being willing to do the appropriate technical consultations, reading, and negotiation when dealing with the more unpleasant aspects of Internet life, in order to understand what’s going on and how to deal with it in a way that yields productive results.

If advertisers would rather follow transient political winds and permit themselves to be endlessly distracted by the superficiality of the news cycle, things might eventually take an uglier turn. In that scenario, they might find they have no one but themselves to blame once subscribers start looking elsewhere and set up a democratized, free-form system of collaboration that is less ordered and less easily monetized. Social media demographics are indeed in flux, and should the worst happen, it wouldn’t be the first time a major network has collapsed. It generally takes a number of years, though.

Either way, imposing arbitrary sanctions from afar without the consent of the majority is a guaranteed loser, both in terms of loyalty and stability. It’s brand suicide because it alienates everyone. It’s fuel for the cycle of uncertainty that’s caused people to start looking for other platforms and revenue streams.

To save the platform from the current controversy, the best approach would be for all sides to calm down and resume working within the system to modify and improve upon what already exists.

The small amount of racist material that’s running the current news cycle at a fever pitch is best handled by an integrated approach combining highly nuanced human intervention with machine algorithms. It will take effort, and it will certainly take time, but it’s crucial that every effort be made along the way to prevent innocent subscribers and creators from falling into the crosshairs of over-eager enforcement efforts as we try to sort this out.
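One shape such an integrated approach could take, sketched under assumptions (the thresholds and routing below are mine for illustration, not anything YouTube has published), is machine scoring that auto-actions only near-certain cases and routes the ambiguous middle to human reviewers:

```python
# Minimal sketch of a human-in-the-loop triage pipeline. The thresholds
# and routing are assumptions for illustration, not YouTube's actual design.
AUTO_REMOVE_ABOVE = 0.95   # act automatically only on near-certain cases
AUTO_PASS_BELOW   = 0.20   # clearly benign content is left alone

def triage(score: float) -> str:
    """Route content by classifier score instead of auto-actioning everything."""
    if score >= AUTO_REMOVE_ABOVE:
        return "auto-remove"
    if score <= AUTO_PASS_BELOW:
        return "publish"
    # The ambiguous middle goes to a human, so borderline creators aren't
    # demonetized or deleted by the algorithm alone.
    return "human-review"

for s in (0.99, 0.60, 0.05):
    print(s, "->", triage(s))
```

The design choice is simply that the algorithm is allowed to be decisive only where it is nearly certain; everywhere else it queues work for people rather than punishing creators outright.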
