Quote Originally Posted by RandBlade
YouTube and Google, punched in their bottom line, promise more urgent action: http://www.bbc.co.uk/news/technology-42110068

A regulatory investigation would barely have got off the ground by now, if that.
But RandBlade, I thought there was no issue here? That there's always iffy content on the Internets, and it's no one's problem but the parents' if some of that iffy content comes up next to kids' videos on YouTube?

So why are you now praising YouTube and their advertisers for taking action against the thing I said was a problem and you've said isn't?

To answer the other dubious points in this thread I couldn't be bothered with before:

a) Yes, YouTube does have human moderators, but nowhere near enough, which is why they have to rely so heavily on algorithms to police their platform. If you'd been aware of the recent kerfuffle about their bots going around demonetizing more or less everything as unsuitable for advertisers, which was itself a reaction to the last time they got in trouble over the kind of content they have on their platform (hate speech), you'd understand this.
b) Yes, parents have a responsibility to monitor what media their children consume, but if something is advertised as kid-friendly then I think parents should be able to take it in good faith that the thing is, in fact, kid-friendly, and that the people who marketed it as such have a duty of care to ensure that it is so. If a parent bought their child a bag of crisps and it turned out to contain a black widow spider or a rattlesnake or something, I think they'd have a right to be annoyed, and 'well, parents should monitor what their kids eat' wouldn't cut it as a defense.