Yubo has grown as a social media platform since it was founded in 2015. As it’s grown, the platform has set up multiple safety measures to help ensure that it remains a safe place for users to interact. The following is a sampling of its core safety features.
Terms Define What’s Allowed
The foundation of Yubo’s safety features is a basic set of terms and conditions. These aren’t fancy, but there are good reasons they’re a standard practice among social media platforms.
The terms and conditions first delineate what's not appropriate to post on Yubo. The platform takes a conservative approach to content, prohibiting partial nudity, full nudity, weapons, violence, blood, and drugs.
The terms and conditions also serve as the foundation for any disciplinary action the platform takes against users. Should the platform's safety specialists determine that a user has posted inappropriate content, the terms are the basis on which the specialists decide whether to warn, suspend, or expel the user.
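The escalation described above can be pictured as a simple policy function. This is a hypothetical sketch, not Yubo's actual system; the three-step ladder and its thresholds are illustrative assumptions.

```python
# Hypothetical sketch: escalating enforcement tied to the terms.
# Repeat violations move a user from a warning to suspension to expulsion.
# The step thresholds are assumptions, not documented Yubo policy.

ACTIONS = ["warn", "suspend", "expel"]

def next_action(prior_violations: int) -> str:
    """Pick the disciplinary action for a newly confirmed violation,
    based on how many prior violations the user already has."""
    return ACTIONS[min(prior_violations, len(ACTIONS) - 1)]
```

A first offense would draw a warning, a second a suspension, and anything beyond that expulsion.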
Users Are Classified as Adults or Minors
When users initially sign up, they must complete an age verification process. This involves uploading a short video and a photo, which are used to confirm the user’s self-reported age.
Artificial intelligence can confirm nearly anyone's age from the photo, with 98.9 percent accuracy. The video is used only to confirm that the photo actually depicts the user who's signing up.
The primary purpose of this age verification is to confirm that a user is indeed old enough to legally use social media. Yubo secondarily uses the verification and user-provided age to classify the user’s account. Adults and minors have different settings for allowed content.
Classifying accounts as either adult or minor helps ensure that younger users see only content appropriate for their age.
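The classification step can be sketched in a few lines. This is a minimal illustration under assumed thresholds (13 as the minimum signup age and 18 as the adult cutoff); Yubo's actual rules may differ.

```python
# Hypothetical sketch: classifying an account by verified age.
# MINIMUM_AGE and ADULT_AGE are assumptions for illustration only.

MINIMUM_AGE = 13
ADULT_AGE = 18

def classify_account(verified_age: int) -> str:
    """Return the account class used to select content settings."""
    if verified_age < MINIMUM_AGE:
        return "rejected"  # too young to use the platform at all
    if verified_age < ADULT_AGE:
        return "minor"     # stricter content settings apply
    return "adult"
```

The point of the sketch is simply that one verified number drives two decisions: whether the user may join at all, and which content settings their account receives.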
Age verification was an industry first when Yubo debuted it in 2019. The feature was initially applied only to suspicious accounts, then piloted for accounts of users aged 13 or 14. It has since been rolled out to verify and classify all new accounts. Instagram intends to follow suit with age verification for all new accounts by December 2022.
Content is Monitored in Real Time
Among social media platforms, Yubo especially needs a way to monitor content as it's posted. The platform primarily features live streams among small groups, which can't simply be reviewed after the fact; they must be checked as they're streamed.
To address this, Yubo has developed a two-stage monitoring process that's constantly active. Artificial intelligence takes a screenshot every second and analyzes it for indications of content that violates the terms and conditions.
Any content that might be in violation is forwarded to a safety team, which then manually reviews the content and takes action if appropriate. The action could be removing the content and/or penalizing the user account that posted it.
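The two-stage flow described above can be sketched as an automated screen followed by a human review queue. This is a hypothetical illustration: the label-based check stands in for a real image model, and the class names are invented for the example.

```python
# Hypothetical sketch of two-stage moderation: stage 1 (automated) flags
# possibly violating frames; only flagged frames reach stage 2 (a human
# review queue). The keyword check is a stand-in for a real classifier.
from dataclasses import dataclass, field


@dataclass
class ReviewQueue:
    """Holds frames awaiting manual review by the safety team."""
    items: list = field(default_factory=list)

    def submit(self, frame_id: str, reason: str) -> None:
        self.items.append((frame_id, reason))


def possible_violation(frame: dict) -> str:
    """Stage 1: return a reason string if the frame may violate the
    terms, else an empty string. A real system would run an image model."""
    banned = {"nudity", "weapon", "violence", "blood", "drugs"}
    found = banned & set(frame.get("labels", []))
    return ", ".join(sorted(found))


def screen_frame(frame: dict, queue: ReviewQueue) -> bool:
    """Forward suspicious frames to human review; return True if flagged."""
    reason = possible_violation(frame)
    if reason:
        queue.submit(frame["id"], reason)
        return True
    return False
```

The design point is that automation handles the volume (one screenshot per second per stream) while humans make the final call, so the model only needs to be a high-recall filter, not a perfect judge.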
A Trio of Safety Features
Together, these three safety features form a solid system that helps ensure Yubo remains a place where everyone can feel secure.