YouTube and some of its advertisers apparently have no problem making money off videos of the WDBJ shooting incident

When it comes to making money, management at YouTube apparently has no shame

It’s no secret that YouTube slaps advertising on pretty much anything without regard for subject matter or ownership, but making money off of last week’s on-air murder of WDBJ-TV reporter Alison Parker and her cameraman Adam Ward is a new low.  A source tipped me off to the fact that a number of opportunistic (and shameless) YouTube “partners” have uploaded and monetized clips of both the station’s live broadcast and the video taken (and uploaded to Twitter) by the deranged murderer as he executed the two journalists during a televised live shot for the morning news.

While there has been an ongoing debate among news organizations about how to handle the disturbing footage, there should be no debate as to whether this footage is monetization-worthy.  Earlier this year, YouTube (and its advertisers) were embarrassed by reports of advertisements appearing on terrorist recruiting videos.  Now this.

The ads appear as sidebar ads, pre-roll ads, and overlay ads.  It wouldn’t be difficult for YouTube to prevent this type of disturbing video from being uploaded in the first place, much less monetized.  After all, YouTube brags about what a great job its Content ID program does keeping infringing content off the site.  Why not use it to block this type of upload?  Can’t YouTube use its own technology to safeguard advertisers?

YouTube monetizes anything

While the debate as to whether these clips are newsworthy will continue, are videos depicting the cold-blooded murder of two people really ad-worthy?  Where are the advertisers in all this?  Are they even aware of where their ads appear?  They are culpable in this fiasco too.  When ads were placed on ISIS videos, several advertisers expressed their displeasure with YouTube and pledged to take action.  With this latest revelation, it appears their words may have simply been spin control.  After all, we’ve heard time and time again how the ad industry is concerned about “brand integrity” online.  Perhaps the industry should look at the consistent lack of “integrity” in YouTube’s monetization practices?

As for YouTube itself, company representatives have defended its hands-off approach in the past.  When called out for the ads on ISIS recruiting videos earlier this year, a spokesman tried to justify YouTube’s approach in a statement to NBC News:

“YouTube has clear policies prohibiting content intended to incite violence, and we remove videos violating these policies when flagged by our users. We also have stringent advertising guidelines, and work to prevent ads appearing against any video once we determine that the content is not appropriate for our advertising partners,” a YouTube spokesperson said Tuesday in a statement to NBC News. YouTube videos are frequently preceded by ads that are picked at random by an algorithm. That means often neither YouTube nor the advertiser will know what ads are playing before which videos.
WDBJ Shooting videos make money for YouTube and its “Partners”

YouTube boasts its monetized videos provide “Advertiser Friendly Content”

YouTube purports to require that partner-monetized videos provide “advertiser friendly content.”  What exactly is that?  Well, this is how YouTube explains its standards for its “Partner Program”:

Even though content may be acceptable for YouTube under our policies, not all of it is appropriate for Google advertising. Google has principles around what we monetize that we expect our content creators who want to monetize to comply with. Advertisers also have their own standards and requirements on the type of content that meets their individual needs. [emphasis added] Learn more below about how YouTube defines “advertiser-friendly” content and how we prevent ads from serving against videos that do not meet this criteria.

In short, advertiser-friendly content is appropriate for all audiences, from our youngest to older viewers. It is content that has little to no inappropriate and/or mature content in the video stream, thumbnail, or metadata such as video title. If there may be inappropriate content, the context is usually newsworthy or comedic where the creator’s intent is to inform or entertain, and not offend or shock.

Content that YouTube considers to be inappropriate for advertising includes but is not limited to:

  • Sexually suggestive content, including partial nudity and sexual humor

  • Violence, including display of serious injury and events related to violent extremism

  • Inappropriate language, including harassment, profanity and vulgar language

  • Promotion of drugs and regulated substances, including selling, use and abuse of such items

  • Controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown

If any of the above describes any portion of your video, then it may not be approved for monetization. In cases where monetization is approved, your video may not be eligible for all of the ad formats we offer. YouTube reserves the right to not monetize a video, as well as suspend monetization feature on channels who repeatedly submit videos that violate our policies.

The implication here is that some sort of quality control is happening.  The Partner Program information continues with this disingenuous gem:

How do we qualify content as “advertiser-friendly”?

YouTube relies on sophisticated technology and our policy enforcement processes when determining if a video is suitable for advertising. [emphasis added] We have trained systems that automatically check various features of a video – from the video title, metadata, and visual imagery – and makes a decision on how appropriate this video is for general advertising.

In conjunction with these automated checks, we also depend on our user community to flag inappropriate videos to us for our review. Depending on the nature of the policy violation, videos can be removed from the site or age-restricted. Monetization is disabled on age-restricted videos and Google will immediately stop serving ads on these videos.

Sophisticated technology?  Huh?  Did YouTube’s “sophisticated technology” deem video depicting the murder of two innocent people suitable? The implication is that some sort of quality control is happening, but that’s not at all the case.  Crap uploaded by “partners” (aka scammers) routinely gets monetized on YouTube without any sort of approval process. Whether it’s videos for ISIS or those promoting peeping Toms, it’s only when someone flags it or publishes a story that YouTube takes action, and even then it’s too little, too late.

Money over morals is the YouTube mantra

Of course, one of the videos (with advertising) that I saw had attracted more than 600,000 views.  Eyeballs mean money for YouTube and for the partner who uploaded the video, never mind that he didn’t own the rights to it.  Apparently money matters more than ethics.

Where are the advertisers in all this?  YouTube implies that they have their own “standards” that must be met.  Do these clips showing the murder of two people on live TV qualify? Do Celebrity Cruises, Hitachi, NFL GamePass, SolarCity, Book of Mormon Musical, Sprint, Save the Children, PayPal, Honda, Flir, Claritin and other major brands really want their products slapped onto these videos?

WDBJ officials could probably get some of these videos taken down, but I imagine they have better things to do (like mourn their colleagues) than send DMCA notices to YouTube.

I can’t imagine what it must be like for the family and friends of these victims to know that the murder of their loved ones has become a money-making opportunity for the likes of YouTube/Google.  It’s beyond shameful and there’s absolutely no excuse. YouTube needs to clean up its act, and if company officials won’t make it happen, advertisers had better demand accountability.