A real estate listing advertisement that mentioned fake schools has highlighted how widely artificial intelligence is being used across the Australian real estate industry – and the potential risks of relying on it.
An LJ Hooker branch this week admitted to using ChatGPT to write a listing for a rental in regional New South Wales, which erroneously advertised the house as being near two schools that didn’t exist.
The real estate agent who generated the ad apologised – but claimed “all estate agents” are using AI.
Leanne Pilkington, president of the Real Estate Institute of Australia, told Guardian Australia the industry was embracing AI technology for “routine tasks” so agents could spend more time on “meaningful client interactions”.
“AI is increasingly being used for a range of tasks, including listings, blogs, email responses, and market reports,” she said. “Agents need to be aware that tools like ChatGPT can generate inaccurate information, so it should never be relied upon 100%.”
Australia’s largest real estate agency network, the Ray White Group, confirmed it was using AI to generate copy and enhance photos of properties.
The company’s chief systems officer, Jason Alford, said it was exploring the technology as a way to “enhance agency productivity”.
“Ultimately people always need to check their copy or customer-facing marketing materials for accuracy,” he said. “Virtual styling is also very popular but agents must always let people know the images are digitally enhanced.”
Raine & Horne said its agencies could use “certain ChatGPT functions”.
The widespread use of AI in the real estate industry raises serious questions about ethics, regulation and the potential for publishing misleading advertising, including by promoting features that do not exist.
The Australian Competition and Consumer Commission said businesses that used AI models to generate advertising material should ensure it is checked for any false, misleading or inaccurate information.
“Under the Australian consumer law, businesses should not engage in false or misleading conduct in the supply of goods or services; this includes in advertising properties for sale or rent,” a spokesperson for the watchdog said.
LJ Hooker had advertised a four-bedroom home for rent in Farley, in the NSW Hunter region, on its own website, as well as Domain and realestate.com.au, as being close to the “excellent educational facilities” Farley high school and Farley primary school.
Neither school exists. LJ Hooker did not disclose on the property listing that it had been generated by AI.
The agency corrected the ad shortly after being contacted by Guardian Australia on Monday and removed references to schools.
The principal of the LJ Hooker Edensor Park branch, Patrick Huynh, said the ad had been generated by ChatGPT and that it was a mistake no one had checked it closely enough before it was published.
“I don’t know any real estate agent that doesn’t use AI,” he said. “Most people use ChatGPT now. We have to use AI to help with [producing ads quickly].”
Tim McKibbin, chief executive of the Real Estate Institute of NSW (REINSW), said the mistake demonstrated why AI should only be used to “assist” real estate agents.
“AI is clearly going to be a part of our life going forward,” he said. “But it doesn’t replace us … it is a tool to assist us to do the work that we do.”
McKibbin said he was in Broken Hill on Tuesday giving a presentation to local real estate agents about REINSW’s use of AI.
The industry was now using AI widely, including to analyse data, and the technology was already so advanced it was “almost science fiction”, he said.
“There’s going to be some huge challenges for us in this space,” McKibbin said. “We need to have a look at the regulatory controls. There’s going to be some ethical questions to be asked.”
Domain and realestate.com.au declined to comment.