Kid-safe immersive tech could spark over-censorship: report

September 4, 2024

Efforts to protect children's safety in the two-dimensional realm of online social media may backfire in the 3D world of augmented and virtual reality, according to a report released Tuesday by the Information Technology and Innovation Foundation, a Washington, D.C., technology think tank.

Legislative efforts like the Kids Online Safety and Privacy Act (KOPSA), which has passed the US Senate and is now before the House of Representatives, could lead to harmful censorship of AR/VR content, the report maintains.

If KOPSA becomes law, AR/VR platforms may be forced to implement the same procedures as traditional social media platforms, the report explained. Because the bill gives the FTC authority to deem content on these platforms harmful, it continued, the agency could censor content on AR/VR platforms, or the platforms themselves could censor content to avoid liability, potentially including content relevant to children's education, entertainment, and identity.

"One of our concerns with KOPSA is that it opens the door to potential over-censorship by giving the FTC (Federal Trade Commission) the power to decide what qualifies as harmful," said policy analyst Alex Ambrose, the report's author.

"It's another avenue for a political party to determine what counts as harmful," Ambrose told TechNewsWorld. "The FTC could say content on topics like environmental protection, global warming, and climate change is anxiety-inducing, so anything related to climate change has to be removed because it could cause anxiety in children."

Over-censorship can be avoided

Andy Lulham, COO of VerifyMy, an age and content verification provider based in London, acknowledged that fears of censorship loom over discussions of online regulation. "But I strongly believe this fear, while understandable, is largely misplaced," he told TechNewsWorld. "Well-crafted government regulation is not the enemy of free expression, but its guardian in the digital age."

Lulham maintained that the key lies in how regulation is crafted. "Blanket, heavy-handed regulations risk tipping the scale toward excessive censorship," he said. "However, I envision a more nuanced, principles-based regulatory framework that can enhance online freedom while protecting vulnerable users. We have seen examples of such a balanced approach in privacy regulations such as GDPR."

The GDPR (General Data Protection Regulation), in effect since 2018, is a comprehensive European Union data protection law that regulates how companies collect, store, and use the personal data of EU residents.

"I strongly believe that regulations should focus on mandating strong safeguards and processes rather than dictating specific content decisions," Lulham continued. "This approach shifts responsibility to platforms to build comprehensive trust and safety strategies, encouraging innovation rather than creating a culture of fear."

He stressed that transparency will be a key driver of effective regulation. "Mandating detailed transparency reports can hold platforms accountable without heavy-handed content policing," he explained. "This not only helps prevent overreach but also builds public confidence in both the platforms and the regulatory framework."

"Furthermore," he added, "I advocate regulations that require clear, accessible appeals processes for content removal decisions.
This safety valve can help correct inevitable mistakes and prevent unwarranted censorship."

"Critics may argue that any regulation will inevitably lead to some censorship," Lulham admitted. "However, I contend that the greater threat to free expression comes from unregulated spaces where vulnerable users are silenced through abuse and harassment. Well-designed regulations can create a more level playing field, amplifying diverse voices that might otherwise be drowned out."

The good, the bad and the ugly of AR/VR

The ITIF report notes that conversations about online safety often overlook AR/VR technology. Immersive technology strengthens social connection and stimulates creativity and imagination, it explains, and play, imagination, and creativity are all essential to children's development.

The report acknowledges, however, that adequately addressing the risks children face with immersive technology is a challenge. Most existing immersive technology is not designed for children under 13, it continues. Children explore spaces designed for adults, which exposes them to age-inappropriate content and can foster habits and behaviors harmful to their emotional and social development.

Addressing these risks will require a combination of market innovation and thoughtful policymaking, it adds. Companies' design decisions, content moderation practices, parental control tools, and trust and safety strategies will largely shape the safety landscape of the metaverse. The report recognizes, however, that some safety risks warrant public policy intervention.

Policymakers are already discussing children's safety on "2D" platforms like social media, leading to regulations that could affect AR/VR technology, ITIF noted. Before enacting such regulations, the report advises, policymakers should consider AR/VR developers' ongoing safety efforts and ensure those tools can remain effective. Where safety tools fall short, it continued, policymakers should focus on targeted interventions against proven harms, not hypothetical risks.

"Most online services work to remove harmful content, but the sheer volume of content online means some of it will inevitably fall through the cracks," Ambrose said. "The problems we see on platforms today, such as violence, vandalism, and the spread of harmful content and misinformation, will only continue on immersive platforms."

"The metaverse is going to thrive on massive amounts of data, so we can assume these problems will be widespread, perhaps even more widespread than what we see today," Ambrose added.

Safety by design

Lulham agreed with the report's contention that companies' design decisions will shape the safety environment of the metaverse.

"In my view, the decisions companies make about online safety will be critical to creating a safe digital environment for children," he said. "The current landscape is fraught with risk, and I believe companies have both the responsibility and the power to reshape it."

He maintained that user interface design is the first line of defense for protecting children. "Companies that prioritize intuitive, age-appropriate design can fundamentally change how children interact with online platforms," he explained. "By creating interfaces that naturally guide users toward safer behaviors and educate them along the way, we can significantly reduce harmful encounters."

Content moderation is at a critical juncture, he added. "The volume of content demands a paradigm shift in our approach," he observed.
"While AI-powered tools are essential, they are not a panacea. I argue that the future lies in a hybrid approach, combining advanced AI with human oversight to navigate the fine line between protection and censorship."

Parental control tools are often overlooked but important, he maintained. They should be not mere add-ons but core features designed with the same attention as the main platform. "I envision a future where these tools are so intuitive and effective that they become an integral part of family digital life," he said.

He asserted that trust and safety strategies will separate thriving platforms from weak ones. "Companies will set the gold standard by adopting a holistic approach, integrating robust age verification, real-time monitoring, and transparent reporting," he declared. "Regular engagement with child protection experts and policymakers will be non-negotiable for companies serious about protecting young users."

"In summary," he continued, "I see the future of online safety for children as one where 'safety by design' is not just a buzzword but the fundamental principle driving all aspects of platform development."

The report notes that children play an important role in driving market adoption of immersive technology. Creating a safe environment for all users of AR/VR technology while allowing innovation to flourish will be a complex challenge, it concludes, and parents, companies, and regulators all have roles to play in balancing privacy and safety concerns while delivering engaging, innovative immersive experiences.