Post by account_disabled on Feb 13, 2024 23:19:18 GMT -10
TikTok, the well-known short-video platform, is facing criticism over its safety protocols for users under 13. Is TikTok accessible to children under 13? What risks does that entail?

TikTok without CSR?

A recent investigation by The Guardian found that moderators were instructed to allow minors to remain on the platform on the grounds that their parents were monitoring their accounts. This finding raises questions about the effectiveness of the measures TikTok has implemented to protect its younger audience, and it highlights the need for a thorough examination of the platform's moderation practices. Here is everything you need to know.

TikTok for minors: what is the impact?

Although the impact varies between users and depends on factors such as age, emotional maturity, parental support and how the platform is used, TikTok's addictive tendency means it can affect minors more strongly. As a note from Education 3.0 points out, the dangers of TikTok for minors go further: young people may also face risks to their personal data, as the platform is known for its data processing and storage practices. These are precisely the reasons why TikTok is not suitable for children under thirteen.

TikTok for children under 13

The Guardian's investigation, however, revealed a particular case.
An underage user was allowed to stay on the platform because the account claimed to be supervised by parents. Internal communications show that moderators asked a quality analyst whether the account should be banned and were told that if the bio mentioned parental supervision, the account could remain. TikTok denies these allegations, calling them incorrect or based on misunderstandings; a spokesperson stated that TikTok's policies are applied uniformly and that children under 13 are not allowed on the platform.

Background and consequences

This revelation follows previous clashes with regulators over the handling of under-18 accounts, including significant fines in Ireland and the United Kingdom. For its part, TikTok has affirmed its commitment to the safety of underage users, stating on its website that it is "deeply committed to ensuring that TikTok is a safe and positive experience for people under the age of 18." However, The Guardian's investigation found that some accounts potentially belonging to minors had received internal labels granting them preferential treatment, such as the "lead creator" designation. This raises questions about the consistency and effectiveness of TikTok's moderation practices. The UK Children's Code, designed to protect children's data online, states that processing a child's personal data is lawful only if the child is at least 13 years old.
However, the research suggests that TikTok may not be enforcing these rules effectively.

Processing of minors' data

Under the rules of the Children's Code, the data of minors under 13 should not be processed, as doing so would violate the legal protections intended for them. TikTok is also subject to Ofcom regulation in the UK, which is being incorporated into the recently introduced Online Safety Act. The law requires technology platforms to set out measures in their terms of service to prevent access by minors and to apply those measures consistently. TikTok claims to have more than 6,000 moderators in Europe who apply the platform's guidelines equally, but the evidence presented by The Guardian casts doubt on the effectiveness of these measures and highlights the need for greater oversight of the platform's moderation practices.

Is TikTok taking insufficient measures?

Although TikTok has introduced measures since early 2023 to address minors' use of its app, questions remain about their effectiveness. The initiatives, which include timers and the removal of notifications to regulate usage time, appear insufficient when it comes to safeguarding the privacy and data of younger users. Ultimately, the debate over TikTok's insufficient measures underscores the importance of a more rigorous approach to protecting minors in online environments.