Should Adults Control Kid-Created Content?
Fourteen-year-old Adora Svitak wishes Facebook would show a pop-up window that read, “Are you going to regret this later?” before allowing people to post their updates.
It’s that kind of long-term vision that’s missing from much of how kids act online and how they’re being educated about using social media. And because adults are navigating the same uncharted waters alongside — or in many cases, far behind — their kids, what’s considered common sense at the time might not be enough of a filter.
Svitak is already a fairly savvy social media user herself, having launched her own Facebook brand, website, and even TED talks. She and her peers are pushing boundaries on sites like Tumblr, posting videos on YouTube and creating their own blogs — and getting a lot of traction.
Cases in point: teenager Rebecca Black’s “Friday,” last year’s viral YouTube music video (more than 32 million views), and the Kony 2012 video, whose 90 million views were propelled by kids passing it along to each other.
“We can have tremendous influence on the cultural landscape,” Svitak said at the recent Big Tent event in San Jose. “We are co-creators of the world we live in. We’re not just watching the screen in front of us. Whether it’s good or bad, you can’t argue it’s influential.”
She’s got a point there — kids’ influence can be powerful, especially with the help of social media sites like YouTube and Twitter. But unlike the kids who create the content that goes on those sites, the companies that host the content are forced to weigh in on whether it’s “good” or “bad,” or more pointedly, what they should do about it.
Victoria Grand, director of communications and policy at YouTube, said company staff is constantly searching for questionable content and deciding what action to take. For example, a spate of videos created by girls who ask the viewing public, “Am I pretty or ugly?” has been circulating for about four years, and YouTube must contend with whether, as a private company, it should take action to remove the videos.
“Is this a disturbing teenage trend? Should a private company take away the right of a girl to ask the world if she’s pretty? Is this hate speech territory? Do we make it go away?” she said, listing the litany of questions the company must contend with each day. The answer is not always clear. After some investigation, Grand said one of the girls posting the “Am I pretty or ugly” video was a 21-year-old art student embarking on an art project.
Other examples: the cinnamon challenge, in which people upload videos of themselves eating a spoonful of cinnamon, along with their subsequent reactions, which are, as Grand puts it, “just repulsive.” So far, 30,000 such YouTube videos have been tagged, and Grand said physicians have contacted YouTube imploring the company to take those videos down because eating raw cinnamon can be bad for the respiratory system. Again, the question YouTube must contend with: how dangerous is this?
And what should YouTube do about what Grand calls “self-injury videos,” especially those that show girls cutting themselves? “The cutting videos are really interesting,” Grand said. “In large part, they’re public service announcements by fellow teenagers. They’re saying, ‘Don’t do it,’ or they’re documenting it in a neutral way that says, ‘This is what’s happening.’ Only a fraction is promoting self injury.”
In that light, Grand said the videos reflect authentic voices that can be helpful to other girls considering cutting themselves. At the same time, though, she said “the very act of cutting triggers additional cutting.”
The same premise applies to videos showing people smoking. YouTube has been asked to remove videos of people smoking, because “when you see images of people smoking, it leads to more smoking,” Grand said.
Though companies can — and in some cases, should — edit online content that might lead to dangerous behavior, the more proactive approach is educating kids about possible dangers, said Amanda Lenhart, a senior research specialist at the Pew Research Center’s Internet & American Life Project, where she directs research on young adults, teens, children and families. But these are global companies with vast reach, and it may be impossible to manage these issues on a global scale.
“There’s certainly a side of it [that can be approached with] advocacy and work with kids,” she said. “On the other hand, we have to protect free speech. Those are the real tensions behind the debate.”
Educating kids about the issues underlying the content is what Lenhart calls the “middle ground.”
“It can be incredibly localized,” she said. “It starts at the user, and it doesn’t require tech innovation or regulation. But it requires a lot of work on the part of parents and end users.”
But as Svitak pointed out, though kids’ influence is powerful, especially with the help of the vast online megaphone, kids don’t always understand the repercussions of their online behavior. And because this is still very much new territory for a lot of adults, education around these issues doesn’t always flow from parent to child. In that case, who’s teaching whom? And what are the appropriate models?
For its part, YouTube wants to draw youth to teach youth about these issues, Grand said.
“But it’s like feeding people spinach,” she said. “How do you rally youth around issues like flagging inappropriate content and privacy controls? It’s hard… They have a limited attention span. We have breakthroughs sometimes with the teen safety community, but those blog posts don’t get nearly enough awareness. It’s hard to cut through the noise.”
Technology can actually help in this realm — at least to some extent. For the “Am I Ugly” videos, images can be blurred, and comment settings can be adjusted to protect kids. The “safety mode” scans videos for flesh tones, she said, but as a result, videos of babies are deleted. And what happens to artistic videos that include nudity?
“People say we should come up with some kind of predictive algorithm: ‘You’re Google — figure it out!'” she said. “But algorithms can’t do most of this work. With things like nudity, an algorithm doesn’t know if it’s surgery, or if it’s a breast cancer announcement.”
YouTube does organize videos for review, and those that rank high on flesh tones are reviewed faster by YouTube staff. Factors like the flagger’s reputation, the number of times the video has been flagged, and how “hot” it is in terms of virality all help YouTube prioritize videos for review.
“But humans need to see it,” she said. And that takes time.
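Grand's description amounts to a priority-scoring heuristic: combine signals like flagger reputation, flag counts, and virality into a score, then surface the highest-scoring videos to human reviewers first. As a purely illustrative sketch — the signal names, weights, and scoring formula here are invented for explanation, not YouTube's actual system:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class FlaggedVideo:
    # Negated score so the min-heap pops the highest-priority video first.
    neg_priority: float
    video_id: str = field(compare=False)

def review_priority(flag_count: int, flagger_reputation: float,
                    views_per_hour: float, flesh_tone_score: float) -> float:
    """Hypothetical composite score; the weights are arbitrary assumptions,
    not real moderation parameters."""
    return (1.5 * flag_count
            + 2.0 * flagger_reputation
            + 0.01 * views_per_hour   # virality signal
            + 3.0 * flesh_tone_score)  # content-based signal

queue: list[FlaggedVideo] = []
# A heavily flagged, fast-spreading video vs. a single low-reputation flag.
heapq.heappush(queue, FlaggedVideo(-review_priority(12, 0.9, 5000, 0.2), "viral_clip"))
heapq.heappush(queue, FlaggedVideo(-review_priority(1, 0.3, 40, 0.1), "quiet_clip"))

next_up = heapq.heappop(queue).video_id  # the higher-scoring video reviewed first
```

The point of the sketch is the pipeline shape Grand describes: automation only orders the queue, and, as she says, "humans need to see it" before any action is taken.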
THE STOVE APPROACH
Fourteen-year-old Svitak offers some advice to adults: think long-term. At her school, she says all devices must be turned off — no blogs, no email, and no access to websites.
“This is a short-term approach,” she said, adding that she prefers the “touch-the-stove approach.” When she was younger, she learned quickly to stay away from the stove after she touched it — but in her effort to exert independence and push boundaries, she didn’t get badly burned.
“If you never teach the child to cross the street, they won’t know how to do it,” she said. “We need to emphasize the long-term approach with education — not just block everything, but teach kids how to evaluate. Then they won’t post inappropriate content.”