Meta, the parent company of social media platforms Facebook and Instagram, has unveiled its strategy for safeguarding young users online.
Meta argues that Apple and Google, which operate the dominant mobile app stores, should empower parents by requiring their approval before children can install apps popular with younger users.
The announcement comes just six months after Australia’s eSafety Commissioner, Julie Inman Grant, gave major tech companies a deadline to devise measures for protecting children and young people from online harms.
Apple countered Meta’s move by accusing the company of abandoning its online security commitments.
This is a pivotal moment in which major tech platforms are being called on to significantly strengthen online safety. Instead, two opposing camps appear locked in a heated debate over what constitutes the most effective way to protect children online, with each side convinced its approach is the only viable one.
The Australian Government’s eSafety Commissioner has set out expectations that online platforms collaborate to ensure safety across their respective networks. Doing so could deliver substantial benefits for users of all ages. The announcement from Meta, and Apple’s response to it, suggest the leading tech companies may squander this vital opportunity.
What’s Meta’s proposal?
Many parents are well aware of the potential online harms facing children and young people.
Research suggests children typically first encounter pornography online at around 13 years of age. Australian authorities have recently begun a trial of “age assurance” technology aimed at preventing minors from accessing online pornography.
Some argue that children under 16 should be barred from using social media altogether.
Under Meta’s proposed policy, children under the age of 16 would be unable to download apps onto their mobile devices without prior parental consent.
Existing parental control apps already allow parents to monitor and restrict their children’s device use, including requiring approval for app installations.
In effect, Meta’s proposal would make such features mandatory when a phone is set up for someone under 16, rather than leaving them as optional choices.
Is it a good suggestion?
Meta’s proposal has one significant advantage: it verifies a user’s age at the earliest possible opportunity – when a phone is first set up – rather than every time a child or teenager tries to install an app or access a website.
If mobile operating system providers such as Apple and Google implemented comprehensive child safety features across all their devices, young people could be considerably safer online.
However, the proposal places the responsibility for online safety squarely on the shoulders of Apple and Google.
This conveniently deflects attention from Meta’s own obligations to protect its users. While Meta has built some safety features into apps such as Instagram, much more could be done.
A better approach
Meta and companies behind popular social media platforms like Snapchat and TikTok should collaborate with Apple and Google to develop comprehensive child protection measures.
When parents approve their child’s request to join Instagram or Snapchat, the mobile platform should notify the app that it is being used by a minor. The app should then use this information to apply stronger safety settings for that user.
For example, accounts could default to private, with parental consent required to make them publicly visible. Platforms could also enable enhanced safety features – such as automatically detecting and blurring explicit images, and notifying parents when potentially inappropriate photos are shared – to ensure a safer online environment for minors.
For young users, these settings should be switched on by default rather than merely available as options, with parental consent required to disable them.
There’s no reason these kinds of options need to be limited to social media apps.
Web browsers such as Google Chrome and Apple Safari could also integrate with on-device youth safety features and parental controls, for example by blocking access to adult content websites.
To strengthen safety across the digital ecosystem, Apple, Google and major app developers such as Meta should collaborate on a unified standard for platform safety features, including a standardised interface that allows apps to use these capabilities seamlessly.
Authorities could set minimum standards for the safety features Google’s and Apple’s platforms must provide, along with guidelines on how developers should implement them in their apps. Apps that fail to meet these requirements could face enforcement from the major app stores, which already vet apps for safety and security before permitting their listing.
Achieving this will require significant collaboration and coordination among all parties involved. Unfortunately, the latest announcements suggest the major tech companies remain far from working together.