By Michael R. Sisak
Australia will require social media platforms to act to prevent online harm to users
Australia plans to require social media platforms to act to prevent online harms to users such as bullying, predatory behavior and algorithms pushing destructive content, the government said Thursday.
"The Digital Duty of Care will place the onus on digital platforms to proactively keep Australians safe and better prevent online harms," Communications Minister Michelle Rowland said in a statement.
The proposed changes to the Online Safety Act were announced ahead of world-first legislation, to be introduced to Parliament next week, that would ban children younger than 16 from platforms including X, Instagram, Facebook and TikTok.
Critics have argued that removing children from social media would reduce incentives for platforms to provide safer online environments.
Social media has been blamed for an increase in children taking their own lives and developing eating disorders due to bullying and exposure to negative body images.
Rowland said making tech companies legally responsible for keeping Australians safe was an approach already adopted by Britain and the European Union.
Digital businesses would be required to take reasonable steps to prevent foreseeable harms on their platforms and services. The duty of care framework would be underpinned by risk assessment and risk mitigation, and informed by safety-by-design principles, the minister said.
Legislating a duty of care would mean services can't "set and forget." Instead, their obligations would mean they need to continually identify and mitigate potential risks, as technology and service offerings change and evolve, she said.
The categories of harm in the legislation include harm to young people and mental well-being, promotion of harmful practices and...