ACS Forum Wrap up – Ethics in Artificial Intelligence and I.T.

Last night I attended the Australian Computer Society forum discussing “Ethics in Artificial Intelligence and IT” at the SAHMRI building in Adelaide.

Chris Radbone (ACS) led a provocative discussion with a panel of thought leaders: Dr Anisha Fernando, Kirsten Wahlstrom, Sarah Jamieson and Owen Churches.

The group discussed some examples of where AI has already gone wrong, e.g. judicial sentencing algorithms in the US, which were shown to be biased against people of colour – overestimating their likelihood of re-offending and contributing to heavier sentences – and the HR algorithms at Amazon, which were shown to be sexist in how they selected potential candidates for jobs. Do we need a secondary AI to keep the primary one honest in these systems? Is it possible to embed fairness into the code of this AI technology? Interestingly, students in the US are now being taught ethics as part of their Computer Science degrees.

Ethical use of ‘Smart Dust’ technology was raised. If you haven’t seen these, Google them – miniature sensors that can record almost any type of information and are very difficult to detect. The ‘Sophia’ project by Hanson Robotics was mentioned, which saw a robot gain citizenship in Saudi Arabia (this reeks of Skynet!). With Facebook’s launch of its blockchain currency (Libra), it now has the power to become a ‘country’, with a government presiding over 2.6 billion users. All hail King Zuckerberg!

A comment was made about big data, and how governments typically use it to gain insight into what could be considered lower social demographics – think the public health system, corrections departments and child protection.

It definitely makes you wonder how government legislation can possibly keep up with technology; this is why impartial bodies such as the Australian Computer Society are needed to provide guidance to leaders.

The quote of the night for me came from Owen Churches, when asked what he would say to kids now to guide them for the future: “Google’s top search result may not be the most suitable for you, technology is not neutral, and data is not objective.”