Being part of a dynamic, ever-evolving ecosystem means staying constantly vigilant about protecting users from threats.
This is a mission that YouTube takes very seriously, says Tarek Amin, director of YouTube MENA. “With so many incredible stories to share, and billions of people using YouTube to learn new skills and be entertained, YouTube’s top priority is to protect the community from harmful content.”
“We’re constantly thinking about how to make YouTube the best storytelling tool for everyone to share their stories,” he told Khaleej Times. “YouTube is a place of community, collaboration and commerce. Every minute, 500 hours of video are uploaded to YouTube, ranging from study tips to powerful stories about people in the Middle East, North Africa and beyond. Content creators on the platform are able to earn a living and build a business on YouTube that contributes to the wider community.”
In order to protect the community, Amin shared how the platform follows the ‘4 Rs’ and ensures strict enforcement of its Community Guidelines. The 4 Rs refer to remove, raise, reward, and reduce. First, content that violates YouTube’s policies is removed as quickly as possible and authoritative, high quality voices are raised, especially when people look for breaking news and information. YouTube also makes sure that trusted, eligible media partners, creators and artists are rewarded, and the spread of what is referred to as borderline content – content that comes close to, but is not violative of YouTube’s community guidelines – is reduced.
In addition, YouTube’s Community Guidelines clearly list what is, and is not, allowed on YouTube and YouTube Kids. These policies address all types of content including videos, comments, links, thumbnails and ads. Amin explained that they are constantly evolving to ensure adaptability in the face of new realities, and to pre-empt trends such as the policies launched to address medical misinformation related to Covid-19.
“Content is flagged either by our automated systems or by users,” Amin pointed out. “Through a combination of trained teams made up of thousands of people from around the world and YouTube’s machine learning systems, content that has been flagged is reviewed. If the content violates YouTube’s guidelines, it is removed; content that may not be appropriate for all audiences is age-restricted; and content that is not found to be violative is left up.”
Such a system helps ensure wider coverage and effectiveness: machines are better at finding content at scale, while human reviewers decide which content should be removed based on YouTube’s guidelines and the context. Ads, meanwhile, are governed by a separate set of strict policies, which prohibit certain types of ads from running on the platform, particularly those linking to sites that may be unsafe due to malware downloads, spam or adult content. If an advertiser fails to comply with these policies, YouTube immediately revokes their ability to run ads on the platform and reserves the right to disable their Google Ads account, barring them from running any ads at all.
According to YouTube’s latest quarterly Community Guidelines Enforcement report, between October and December 2021, over 3.7 million videos were removed for violating YouTube’s community guidelines. Out of these videos, 38 per cent had 1-10 views and almost 32 per cent had no views at all, which shows that YouTube’s machine learning systems took action against violative videos before their impact was felt.
When it comes to child safety, Amin revealed that YouTube accepts a lower level of accuracy when reviewing flagged videos so that as much violative content as possible is removed as quickly as possible. “This means that a higher amount of content that does not violate our policies is also removed. In the same time period, over 1.1 million videos were removed due to child safety concerns, including videos that could be potentially harmful to children, such as dares and challenges, or even innocently posted content that might be a target for predators.”
Looking back, Amin said that YouTube’s policies have come a long way over the years, and that the world and people’s expectations are ever-changing. “YouTube’s policies continue to evolve, pre-empt and adapt in the face of new challenges and our community plays an important role by flagging content. Regardless of the circumstances, the commitment to protecting our community from harmful content on both YouTube and YouTube Kids is unwavering.”
rohma@khaleejtimes.com