YouTube is riding high on the success of YouTube Shorts, the short-form video offering it introduced in September 2020 to compete with TikTok. In June, Shorts had more than 1.5 billion monthly users. And last month, YouTube announced new monetization opportunities for the format, a move indicating that Shorts is becoming big business for YouTube and a viable revenue source for creators.
However, reports of users being shown transphobic Shorts are spreading across other social media platforms. Users say transphobic videos are showing up in their feeds among seemingly unrelated content. The largest callout has come from one of the platform's oldest, most respected, and most prolific creators, Hank Green.
Shorts is the default tab in the YouTube app, which means it's the first thing you see when you open the app. Users have been describing how being confronted with that kind of content makes them feel.
YouTube has historically struggled to moderate hate and misinformation. But some users are shocked by just how much transphobic content is recommended on Shorts. One Redditor claims, "A solid half of the videos recommended to me are intensely transphobic. I hit 'don’t recommend me this channel,' but it doesn’t seem to work."
Like TikTok's, the Shorts recommendation algorithm feeds content to the user in an endless scroll. A user might choose not to click on a transphobic long-form YouTube video, but they aren't given that choice when spoon-fed a 15-second video in the Shorts tab.
Mashable has reached out to YouTube for comment.