Canberra | Monday, May 6, 2024

Political dangers of a brave new AI world

Cartoon: Paul Dorin

“Tailored messages directed to an individual by employing AI will become more and more sophisticated. AI has the ability to scour the internet, to build a clear profile on individuals and ‘rapidly produce targeted campaign emails, texts or videos’,” writes political columnist MICHAEL MOORE

ALDOUS Huxley’s novel “Brave New World” and George Orwell’s “1984” describe a range of techniques governments use to monitor citizens. 


However, their worlds will seem relatively benign compared with the scale of information that artificial intelligence (AI) is set to allow politicians to monitor, collect and store. 

Journalists Klepper and Swenson pointed out in an Associated Press article that “computer engineers and tech-inclined political scientists have warned for years that cheap, powerful artificial intelligence tools would soon allow anyone to create fake images, video and audio that was realistic enough to fool voters and perhaps sway an election”.

The article examined the likely impact on the 2024 American election, warning about fake images and the use of AI to deliberately mislead voters. 

What are the chances that AI will also play a part in the next ACT election, to be held just a few weeks earlier, on October 19, 2024? 

The warnings Klepper and Swenson flag ought also to be heeded by the people of the ACT. Political campaigns in the US have been much more blatant in the way they mislead. 

In Australia, and particularly in the electorates in Canberra, the framing of messages and the use of “spin” to put the best foot forward are more common techniques than blatant lies.

At the last federal election, Senator David Pocock was falsely portrayed as a Greens candidate. The banners set up on the side of the road showed the candidate pulling back his shirt, Superman style, to reveal the Greens’ logo. This simple technique illustrates that attempts to mislead the public are seen as a normal part of campaigning.

Tailored messages directed to an individual by employing AI will become more and more sophisticated. A myriad of personal information is available thanks to memberships of organisations, the use of cards to gain loyalty “points” and, most importantly, liberal use of social media platforms. AI has the ability to scour the internet, to build a clear profile on individuals and “rapidly produce targeted campaign emails, texts or videos”.

It is these profiles that allow misleading personalised messages. Even in the current environment, most people using social media will have been exposed to tailored messages suggesting the purchase of particular items. Imagine how the same approach could be used to tailor political messages to address individual concerns.

Such messages, designed to appeal to the values of the individual, do not have to be based on truth. A more sinister approach, based on fear and confusion, is described by Klepper and Swenson, where “AI is used to create synthetic media for the purposes of confusing voters, slandering a candidate or even inciting violence”.

They provide further examples from people with AI expertise, including “automated robocall messages, in a candidate’s voice, instructing voters to cast ballots on the wrong date; audio recordings of a candidate supposedly confessing to a crime or expressing racist views; video footage showing someone giving a speech or interview they never gave”. 

Additionally, there have already been examples of “fake images designed to look like local news reports, falsely claiming a candidate dropped out of the race”.

These techniques are American-based, and some would not work so well in Australia, where voting is compulsory. Additionally, many foreign groups and governments would see an advantage in interfering with an election in the US. Potential ACT candidates do not have to be so concerned about this aspect.

There have been some attempts to legislate for more honest campaigns and to have AI-generated material identified (or “watermarked”). A search of the Australian Electoral Commission’s (AEC) website for the words “artificial intelligence” returns zero results. However, this does not mean the issue is not being considered. It should be expected that the AEC’s Electoral Integrity Assurance Taskforce would have AI on its agenda.

The federal, state and territory electoral commissions would also benefit from sharing each other’s work.

There is Buckley’s chance that political parties will not use AI as part of their campaign toolkit. Politicians are always looking for an edge. Our democracy needs enough vigilance to keep the genie in the bottle for as long as possible – at least until appropriate legislation is understood and put in place.

Michael Moore is a former member of the ACT Legislative Assembly and an independent minister for health. He has been a political columnist with “CityNews” since 2006.

 

