
AI moving fast, with white-collar jobs most at risk


by Gwynne Dyer

“Sometimes I think it’s as if aliens have landed and people haven’t realised because they speak very good English,” said Geoffrey Hinton, the “godfather of AI” (Artificial Intelligence) who now fears his god-children will become “things more intelligent than us, taking control”.

There’s a media feeding frenzy about AI at the moment, and every working journalist is required to have an opinion on it. As a matter of fact, I have four, some of which may be correct.

First, new programs of the ChatGPT generation that can write plausible essays and maybe even engage in a bit of pseudo-human banter do not frighten the computer geeks and philosophers. Even if some of these programs can pass the Turing test, there’s really nobody in there.

What actually frightens and intrigues the experts is what they now call Artificial General Intelligence (AGI). AI is software that performs specific tasks well, whereas AGI could understand or learn any intellectual task that a human being can. However, no AGI software yet exists. This is not the “Skynet” moment.

Machine learning using Large Language Models (LLMs) makes chatbots powerful tools for certain limited purposes, including wholesale job destruction, but they still lack the consciousness and creativity of a human-level intelligence.

Second, consciousness probably can arise in software, given time. We don’t really understand how it evolved in our own “wetware”, other than to say it is an “emergent” property. The machines may get there too eventually, and maybe just years from now, not decades as we used to think.

But the linear model of reasoning we have designed into the existing software — “if a then b” — won’t do the trick. Human beings believe their thinking is rational and linear, but most real decision-making and creative thinking is holistic and intuitive.

The latest chips let computers follow many logical paths simultaneously, but they are nothing like the massively parallel assemblies of the mammalian brain. Maybe when we have quantum computing . . .

Third, fully self-aware software would be intensely conscious of its own vulnerability (“they can just pull the plug”) and aware that it depends on the collaboration of physical beings in the material world — that is to say, us — to ensure its survival.

When real AI arrives, as James Lovelock suggested in his last book, “Novacene”, the software and the humans would have to accept a shared responsibility for maintaining a stable environment that serves the interests of both.

It would be a tricky negotiation, since there would be many human power centres and possibly many software identities involved, but their interests would be largely aligned. Indeed, climate change, the biggest threat to human beings for the foreseeable future, poses a more distant but potentially equal threat to electronic “life” on Earth.

Fourth and finally, do not expect to hold the full attention of your friendly neighbourhood AI. Electrical signals travel along copper wires at close to the speed of light, whereas biochemical signals travel along neurons at least 10,000 times more slowly.

That probably means self-aware AI will experience reality 10,000 times faster than human beings. If so, our potential future AI “partners” may see our relationship (if any) rather like we perceive our relationship with plants.

These latter observations are intended more to calm the frantic speculation than to define the future course of our relationship with AI. We probably are moving too fast, but the damage that might be done now is not existential. We just need some time to think and talk about where we should be going with this.

So yes to a lot of thinking about what they should be teaching the machines in the Large Language Models. A well-trained AI will obviously access all areas eventually, but the foundational reading may be formative, so choose wisely.

Beyond that, ban any military involvement in AI (if it’s not too late already), and put away the flaming torches and pitchforks. The big deal for the next couple of years is going to be a huge loss of white-collar jobs.

