Facebook parent company Meta has revealed that it scrapes the photos and data of its Australian users to train its AI, without providing them with an opt-out option.
Meta AI launched in 2015 and was rolled out as a generative AI chatbot on Facebook, Instagram, Messenger, and WhatsApp in April this year, allowing users to ask questions and hold conversations with the chatbot.
The company previously revealed that it would be using its users’ data to train the AI, but it explicitly stated that it does “not use the content of … private messages with friends and family to train [its] AIs”.
Despite offering an opt-out to its European users, the company revealed that it scrapes the data of all public Facebook accounts.
Meta’s global privacy director, Melinda Claybaugh, initially rejected the claim that the company scrapes Australian data to build its AI under questioning from Labor Senator Tony Sheldon.
However, after further questioning by Greens Senator David Shoebridge, Claybaugh confirmed that data from public accounts was being scraped.
“The truth of the matter is that unless you have consciously set those posts to private since 2007, Meta has just decided that you will scrape all of the photos and all of the texts from every public post on Instagram or Facebook since 2007, unless there was a conscious decision to set them on private. That’s the reality, isn’t it?” asked Senator Shoebridge, to which Claybaugh responded “correct”.
Claybaugh did then confirm that accounts belonging to users under 18 would not be scraped. However, when Senator Sheldon asked whether public photos of his children posted on his own account would be scraped, Claybaugh said yes.
The Meta privacy director also admitted that no opt-out option is available to Australian users, and that the only mechanism available locally is to set accounts to private.
Claybaugh also said that the opt-out option was offered in the EU in response to uncertainty around EU privacy laws.
“In Europe, there is an ongoing legal question around what is the interpretation of existing privacy law with respect to AI training,” said Claybaugh.
“We have paused launching our AI products in Europe while there is a lack of certainty. So you are correct that we are offering an opt-out to users in Europe. I will say that the ongoing conversation in Europe is the direct result of the existing regulatory landscape.”
The discussion comes as Australia explores mandatory guardrails and legislation to manage the development of AI tools.
In a release by Minister for Industry and Science Ed Husic on 5 September, the government announced two new initiatives to make AI use safer.
“Australians want stronger protections on AI, we’ve heard that, we’ve listened,” said Minister Husic.
“Australians know AI can do great things, but people want to know there are protections in place if things go off the rails. From today, we’re starting to put those protections in place.”
The first is the new Voluntary AI Safety Standard, which takes effect immediately. The standard aims to provide businesses using high-risk AI with “practical guidance” so they can “implement best practice” to protect themselves and others.
The standard will be updated as the technology and global standards develop.
The second announcement is the introduction of new guardrails to guide AI use and development.
According to the Tech Council, generative AI alone could bolster the Australian economy by $45 billion to $115 billion by 2030.
However, while businesses are keen to adopt the technology, the government has faced repeated calls to implement AI regulation to ensure that its use within business is productive and safe.
In response, the government has announced a Proposals Paper for Introducing Mandatory Guardrails for AI in High-Risk Settings, which was informed by a government-appointed AI expert group.
The paper sets out a proposed definition of high-risk AI and introduces 10 proposed mandatory guardrails, along with three regulatory options to enforce them.