
DESPITE YOUTUBE’S AGE RESTRICTIONS, YOUNG CHILDREN CAN EASILY SEE INAPPROPRIATE CONTENT

Violent and sexual images are turning up in videos built around characters aimed at children.

The YouTube video begins with the popular British children’s cartoon character Peppa Pig introducing herself and her family, but there are signs of trouble almost immediately. Nine seconds in, Peppa’s mother opens her mouth and shouts, “Smoke weed!”

The video, a doctored version of a real Peppa episode, only gets worse from there. Over five minutes there are racist and homophobic outbursts and slurs, culminating with Peppa and her parents donning sunglasses and smoking marijuana while the rapper Snoop Dogg dances nearby. Since it was uploaded in 2015, the altered video, which carries no age restriction, has been viewed more than 1.5 million times.

After years of pledging to police inappropriate content, YouTube continues to deliver violent imagery, drug references, sexually suggestive footage and profane, racist language in clips that reach children at a troubling pace, say researchers, parents and consumer groups.

They say YouTube’s recommendation algorithm cannot reliably sort content for appropriate ages, and that its autoplay feature, on by default, serves up a nearly endless stream of videos that can confuse, upset and overstimulate young minds. Although many parents try to monitor what their kids watch, the sheer volume of YouTube content that many children consume makes that a struggle, especially when a single short clip can carry a variety of jarring messages.

“YouTube is the No. 1 problem for parents today,” said James Steyer, chief executive of Common Sense Media, a San Francisco-based nonprofit that advocates for families and schools. “Kids constantly stumble onto completely inappropriate content because it is all driven by algorithms.”

Steyer and others say YouTube exemplifies the tectonic shift in children’s programming over the past generation, away from the world of openly available, government-regulated television aired at set times. The increasingly dominant online world offers unregulated content that can be uploaded by virtually anyone, almost anywhere in the world, and can reach children at any time, depending on the vigilance of their parents.

YouTube has long said that its main platform is not aimed at children, and it created the YouTube Kids app in 2015 to meet the demand for content for a younger audience.

“Protecting kids and families is a top priority for us,” YouTube said in response to questions about inappropriate children’s content. “Because YouTube is not for children, we’ve invested heavily in creating the YouTube Kids app, a destination made specifically for kids, and we do not allow users under the age of 13 to create or own accounts on YouTube. When we identify an account belonging to someone underage, we terminate it; we terminate thousands of accounts every week as part of that process.”

However, parents, consumer groups and pediatricians say children overwhelmingly watch YouTube itself rather than the Kids app. Kids watch YouTube on mobile devices, their own and their parents’, as well as on Internet-enabled TVs and in browsers on laptops. Through a browser, YouTube does not require users to sign in, except to view age-restricted videos, meaning there is no practical barrier to children watching most of the videos on the service.

Age-restricted videos require users to sign in to an account before viewing so the service can establish that they are at least 18 years old. That step may deter some children, although experts say many kids lie about their age to create accounts on YouTube and other services. YouTube said its “review team” applies age restrictions to content containing vulgar language, nudity, violence or dangerous activities when such videos are brought to the company’s attention. Age-restricted content cannot carry advertising or be monetized by its creators.

In a Pew Research Center survey of parents of children under 12 conducted last year, more than 80% said their children watch YouTube, and 34% said their children watch it regularly. In its 2017 annual survey, the market-research firm Smarty Pants named YouTube “the No. 1 brand for kids,” calling it “the most powerful brand in kids’ lives.” The survey distinguished it from the YouTube Kids app, which was less popular among the 6-to-12-year-olds surveyed.

Researchers say YouTube’s content-recommendation algorithms pose particular problems for children, who often sit in front of a screen for long stretches and watch whatever plays automatically. Content that adults would find plainly inappropriate can mesmerize young children too immature to turn away from unsettling words or images.

The problems can be especially acute when children search for popular and seemingly innocuous terms such as “Spider-Man,” “superhero” and “Elsa.” Elsa, the blond princess who starred in the 2013 movie Frozen, has generated so much inappropriate content on YouTube that critics group such videos under the shorthand “Elsagate.” Disney did not respond to requests for comment.

Critics also say the problem is not visibly improving. Typing “Peppa” into YouTube’s search engine, for example, surfaced at least one video that researchers classified as “disturbing” among the recommendations 70% of the time, according to a study published in January that was based on 2,500 YouTube videos.

Researchers examining those search terms found clips showing a scantily clad Elsa straddling another partially nude cartoon character; Spider-Man squirming under the covers with a life-size Rapunzel doll; and Peppa facing a monster with knives for hands that, amid dialogue laced with profanity, slices open the heads of several characters in bloody attacks. (YouTube said none of these videos violated its policies, and none had appeared on the YouTube Kids app; the one showing the violent monster carried an age restriction.)

The researchers also found that young children had a 45% chance of seeing at least one “disturbing” or otherwise inappropriate video within 10 clips, a run that often amounts to about an hour of viewing. Some of the videos were crude animations, and some featured costumed actors. Others, including the Peppa marijuana video, were real children’s videos that had been doctored, with new words and images spliced into the original.

Peppa Pig, a British animated television series for preschoolers that debuted in 2004 and has an international audience, is a particularly popular target for those creating fake alternative versions of original episodes or entirely new videos based on the show’s characters. Entertainment One, which produces Peppa Pig, declined to comment.

The researchers, based at the Cyprus University of Technology, reviewed videos featuring several characters popular among young children and found that the fake videos often contain violence, profanity and sexually charged behavior.

“As a father, I can say this happens more often than I’m comfortable with,” said Jeremy Blackburn, a computer science professor at the University of Alabama at Birmingham, a co-author of the study and the father of three. “The problem is that kids can be served, can be recommended, things that are not just garbage but inappropriate, just bad things.”

A promised cleanup that has fallen short

Leonore Reiser, a biologist who lives in Oakland, Calif., said her 9-year-old son recently reported seeing disturbing videos on YouTube. When she reviewed his viewing history, Reiser found one called “Two Girls Want a Man.”

It showed two young women in swimsuits competing for a man’s attention at a public swimming pool and, later in the video, the same man fondling one woman’s breasts. Reiser determined that her son, while watching YouTube unsupervised, had searched for a profanity, which caused the video to surface as a recommended option.

When The Washington Post watched the clip, the videos that YouTube’s algorithm recommended next, in a panel on the right side of the screen under the heading “Up Next,” included many with scenes of sexual intimacy that stopped just short of nudity. Most carried no age restriction.

Although she was unhappy with the content her son found through that search, Reiser was more upset by the lyrics of the rap videos that often accompany the football and basketball highlights he frequently watches. “Frankly, I’m less concerned about the curse words than about men talking about b*s, hoes and violence,” Reiser said. “That’s what really gets my goat: violence against women.”

YouTube has long pledged to clean up inappropriate content on its platform amid a series of controversies involving violent extremism, hateful online conspiracy theories and disturbing children’s content. The company has hired thousands of human moderators and has sought to refine its artificial-intelligence systems to better identify and remove videos that violate its community guidelines.

But those efforts have fallen short for many kinds of objectionable content, according to researchers, parents and advocates. In recent weeks, a video blogger documented how pedophiles trawled children’s videos and shared sexually suggestive timestamps, likening the activity to a “soft-core pedophile ring,” and a Florida pediatrician discovered that a clip explaining how to commit suicide had been spliced into children’s videos on YouTube and YouTube Kids.

These controversies have prompted some advertisers, most recently including Disney and Nestlé, to abandon YouTube. But the persistence of the problems has led some who study the platform to conclude that YouTube’s almost incomprehensible scale, with 500 hours of new video uploaded every minute, makes it so difficult to police that parents should simply keep their children from watching it.

James Bridle, a prominent critic, artist and author who used a November 2017 Medium post to call attention to the disturbing content YouTube was serving children, said the fixes the company has attempted have not worked. The algorithm, he said, continues to identify and deliver the most extreme and stimulating content.

“It seems pretty obvious that this is just a kind of madness,” said Bridle, author of “New Dark Age: Technology and the End of the Future.” “The more extreme the video, the more clicks … Now imagine that applied to young children.”

He was more direct in a TED talk last year: “If you have young children, keep them away from YouTube.”

A seemingly bottomless well of videos

YouTube’s recommendation algorithm uses machine learning and artificial intelligence to study what users are watching and suggest other videos. With autoplay enabled, as it is by default, the service keeps streaming videos with similar themes and features indefinitely.
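To make that dynamic concrete, here is a minimal, purely illustrative Python sketch of a greedy autoplay loop, assuming a toy video catalog and invented “predicted watch time” scores; none of the names or numbers come from YouTube’s actual system, which is far more complex and not public.

```python
# Illustrative only: a toy autoplay loop that always queues the related
# video a model predicts will be watched longest. All data is invented.

VIDEOS = {
    "nursery_rhymes": ["counting_song", "parody_clip"],
    "counting_song":  ["nursery_rhymes", "parody_clip"],
    "parody_clip":    ["shock_video", "counting_song"],
    "shock_video":    ["parody_clip", "shock_video_2"],
    "shock_video_2":  ["shock_video", "parody_clip"],
}

# Hypothetical engagement scores: edgier clips often hold attention longer,
# the dynamic critics say pulls autoplay toward extreme content.
PREDICTED_WATCH_TIME = {
    "nursery_rhymes": 40, "counting_song": 35, "parody_clip": 55,
    "shock_video": 80, "shock_video_2": 85,
}

def autoplay(start: str, steps: int) -> list[str]:
    """Greedily follow the related video with the highest predicted watch time."""
    playlist = [start]
    current = start
    for _ in range(steps):
        current = max(VIDEOS[current], key=PREDICTED_WATCH_TIME.__getitem__)
        playlist.append(current)
    return playlist

print(autoplay("nursery_rhymes", 4))
# ['nursery_rhymes', 'parody_clip', 'shock_video', 'shock_video_2', 'shock_video']
```

Even this toy version drifts from a nursery rhyme to the highest-scoring “shock” clips within a few steps, which is the feedback loop researchers describe.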

Guillaume Chaslot, a former YouTube engineer who left the company in 2013 and now leads the watchdog group AlgoTransparency, said YouTube will struggle to stem the flow of inappropriate content as long as its artificial intelligence is trained to maximize “watch time,” a metric that fuels the company’s advertising-based business model.

The result, he said, is a recommendation system that prioritizes stimulating the user above all else. In the recent scandal involving suspected pedophiles who listed timestamps of provocative images in video comment sections, Chaslot said, the recommendation algorithm helped the content spread. “The best way for (the algorithm) to meet its goal is to find the pedophile videos and serve them to the people most likely to become pedophiles,” he said. “That’s what the best artificial intelligence in the world will do.”

Federal privacy law further complicates the picture. Sites aimed at children under 13 are barred from collecting most kinds of personal data about their users without parental permission. YouTube, which profits from data-driven advertising, avoids that restriction by maintaining that the service should not be used by children at all. If YouTube explicitly recommended certain content for children, that position would become untenable, Chaslot said.

Last year, several consumer groups filed a complaint with federal regulators alleging that YouTube knew children regularly used the site, despite its policies, and thus routinely violated the privacy law.

Chaslot argued that to make YouTube safer for kids, the company should prioritize something other than watch time, ideally a system in which parents rate videos based on their educational value or age-appropriateness, rather than simply on what kids click or what autoplay serves them. YouTube says it has reduced the emphasis on watch time in its recommendation algorithm.
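As a rough sketch of the alternative Chaslot describes, the hypothetical example below re-ranks the same kind of candidate list by a parent-assigned educational rating instead of predicted watch time; the video names, scores and ratings are all invented for illustration.

```python
# Illustrative only: re-rank recommendation candidates by a hypothetical
# parent-assigned educational rating (1-5) instead of predicted watch time.

candidates = [
    # (video_id, predicted_watch_time_seconds, parent_rating)
    ("shock_video",    80, 1.2),
    ("parody_clip",    55, 1.8),
    ("nursery_rhymes", 40, 4.7),
    ("counting_song",  35, 4.5),
]

by_watch_time = sorted(candidates, key=lambda v: v[1], reverse=True)
by_parent_rating = sorted(candidates, key=lambda v: v[2], reverse=True)

print([v[0] for v in by_watch_time])     # engagement-first ordering
# ['shock_video', 'parody_clip', 'nursery_rhymes', 'counting_song']
print([v[0] for v in by_parent_rating])  # value-first ordering
# ['nursery_rhymes', 'counting_song', 'parody_clip', 'shock_video']
```

Swapping the objective inverts the ordering: the clips that hold attention longest drop to the bottom once the ranking optimizes for something parents value.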

Pediatrician Jenny Radesky, a University of Michigan researcher who studies the effects of advanced technology on children and families, said kids have trouble making sense of some videos, especially when characters they admire behave inappropriately. That challenges their emerging sense of right and wrong and creates confusion, and it can lead some children to imitate the behavior they see, Radesky said.

And the pressure on parents is serious as they try to shield their children from disturbing images and messages that are always as close as the nearest mobile device. Many parents say constant supervision is impractical and that content controls are limited and hard to use.

“It’s difficult. It puts a big share of the burden on parents to monitor and supervise their kids,” Radesky said. “You need a design solution.”

Senator Edward Markey, a Massachusetts Democrat, said he would soon introduce legislation to treat children’s online content much like traditional children’s television. He favors age-appropriate labeling of videos to help families make viewing decisions, and he wants to scrutinize design features, such as autoplay, that foster particularly intense or even compulsive consumption.

“There is extremely strong evidence that YouTube’s algorithms are surfacing content that is not age-appropriate, and that’s not right,” Markey said. “YouTube is where kids go, so we have to solve the problem … We need to set rules to protect children.”

Reiser, the Oakland mother whose 9-year-old found inappropriate videos, said the recent discoveries have pushed her to all but bar the platform from her home. “Because of all the creepy, weird stuff he’s finding, we’re actually watching YouTube less. I deleted it from the TV, and I’m deleting it from my iPhone.”

