

Staying in control

By Peter McMahon - posted Tuesday, 20 October 2015


Recently a group of high-powered researchers and businesspeople called for a rethink of our ever-growing reliance on robotic technologies. More specifically, they objected to the development of 'killer robots' that could decide for themselves whether or not to kill human beings.

In an open letter presented to the International Joint Conference on Artificial Intelligence in Buenos Aires, over a thousand of the world's experts on AI warned of a 'military artificial intelligence arms race' and called for a ban on 'offensive autonomous weapons'.

The group included Elon Musk of Tesla, Steve Wozniak, co-founder of Apple, Demis Hassabis, chief executive of Google DeepMind, and super-brain Stephen Hawking. It would be hard to find a better informed and generally more intellectually capable group of people on the planet.


The most familiar forms of such autonomous weapons are the drones increasingly used around the world, but the principles apply equally to ground weapons and to those used on and under the oceans. Although the proposed ban applies to military weaponry, such military advances tend to find their way quickly into broader security-related activities and general policing. The question of how much control to cede to the machine on the spot is now a very urgent one.

As for the air, where drones mostly operate, it has been mooted that the next generation of fighters will be the last flown by human pilots because of the limits human frailty places on manoeuvrability. At a certain point, human pilots simply cannot take the extra g-forces and black out. Given this, in a few decades the world's major air forces may be entirely robotic.

Weapons with AI capability have been called the third revolution in warfare, following gunpowder and nuclear arms, but this is somewhat misleading. Gunpowder and nuclear arms are both explosives that radically increased firepower, whereas AI weaponry transforms the character of warfare itself. It removes the main constraints on the practice of warfare: the desire for self-preservation and basic compassion. Furthermore, unlike nuclear weapons, robotic systems are relatively cheap, removing an important constraint on their use.

There is, of course, another practical factor of real concern: operators of such weaponry may not be able to maintain contact and control. AI weapons are by definition complex digital systems, and such systems are inherently vulnerable to disruption. There is a growing focus on what has been called cyber-warfare, which is all about disrupting digital systems through some form of hacking. Not only could such weapons be made to malfunction, they could possibly even be taken over and turned on the attackers.

Given enough intent it is possible to ban specific weapons, although the record is poor, starting with the Church's attempts to ban the crossbow in medieval times. The harsh reality of history is that if a weapon is decisive it will be used, at least as a threat. Recently a ban was effected on laser weapons designed to blind, another product of fast-developing new technology. But that is a marginal weapon with very limited impact, whereas AI weaponry is a game changer.

As serious as it is, the issue of autonomous weaponry is but one aspect of a much wider problem. The underlying issue here is that humanity is now facing a fundamental challenge: our technology, in its latest digital form, is on the verge of being able to take over most of our essential modes of living.


The world financial system is no longer fully understood by any human and is now basically run by algorithmic programs. This is due to the sheer complexity of huge numbers of constantly interacting units of money in a system many times the size of the real economy of goods and services. Algorithmic trading systems can act instantaneously according to set criteria, shifting billions at near light speed and leaving human traders far behind. Ultimately, of course, the application of such algorithmic systems only adds to the overall complexity.

Our main industrial production systems are also increasingly run by computer systems, with the actual work done by robots. Most jobs that don't require an actual human presence (unlike trades such as plumbing and electrical work, or perhaps hairdressing) are in the process of being automated. The shift in retail, for instance, is now well advanced. Self-service in supermarkets and Internet shopping mean we can get what we need, or just want, without interacting with a live human being at all.

Even jobs that currently require high skill levels, such as journalism, law, and some medical and middle-managerial work, are being done by 'intelligent systems'. Some medical and counselling systems already work better than the human variety. Computer systems remember everything and don't exhibit the frailties of often over-worked or otherwise stressed human beings.




About the Author

Dr Peter McMahon has worked in a number of jobs including in politics at local, state and federal level. He has also taught Australian studies, politics and political economy at university level, and until recently he taught sustainable development at Murdoch University. He has been published in various newspapers, journals and magazines in Australia and has written a short history of economic development and sustainability in Western Australia. His book Global Control: Information Technology and Globalisation was published in the UK in 2002. He is now an independent researcher and writer on issues related to global change.

Other articles by this Author

All articles by Peter McMahon

This work is licensed under a Creative Commons License.
