By Mushfiq M, MehtA+ AI/Machine Learning Research Bootcamp alum
In part 2 of a three-part series on Artificial Intelligence (AI), we talk about the history of AI. If you would like to learn more about artificial intelligence, check out the AI camps MehtA+ offers at https://mehtaplus.com/.
We are currently amidst the Fourth Industrial Revolution.
In the first industrial revolution, the one we are most familiar with, humans mechanized processes for the first time after discovering that coal could be used as a source of power. In the second industrial revolution, humans harnessed gas, oil and electricity, which transformed the communication and transportation industries.
The third industrial revolution brought forth the rise of computers and new energy sources such as nuclear power. And here we are now in the fourth industrial revolution, where we will be using Artificial Intelligence (AI) to revolutionize the way machines interact with the physical world.
Governments and the tech giants of the world are heavily investing in artificial intelligence. From autocompleting emails to instant translations, from recognizing faces to showing relevant ads, from recommending products to analyzing health data, AI is being used at Meta, Google, Amazon, Microsoft and Apple, among other companies.
It might be surprising to learn that the concept of artificial intelligence is nothing new. Yes, the AI sector has undergone incredible changes in the last three decades. But the concept of AI has been around since ancient Greece!
In one ancient Greek myth, Hephaestus, the god of the forge and craftsmanship in Greek mythology, created a giant warrior named Talos entirely out of bronze to protect the island of Crete against invaders. This story illustrates perhaps one of the earliest conceptions of AI and robotics! However, serious work toward making AI a reality did not begin until after the Second World War.
During World War II, there was a need for rapid technological advancements to combat the enemy. During this time, neurologist Grey Walter conceptualised intelligent machines by building cybernetic tortoises (cybernetics is the field focused on automatic control systems in machines and living things). These robots were equipped with photoreceptors and moved toward light like a tortoise. Walter did not believe that the brain was very complex and felt that it could be modeled by a few switches.
Unlike Walter, British mathematician Alan Turing believed the brain was more complex. Around 1948, Turing started formulating the idea of the imitation game, now known as the ‘Turing test’, whose main objective is to test a machine’s ability to exhibit behavior indistinguishable from that of a human.
The imitation game is as follows — there are two rooms next to each other. In one, there is a man and in the other, there is a machine. Outside the rooms is an interrogator who asks questions of both the man and the machine. The interrogator does not know which room holds the man and which the machine. No voices can be used — only printed answers are given to the interrogator.
If the interrogator cannot tell the machine and the man apart, then the machine is a ‘thinking’ machine, Turing posited in his seminal paper, Computing Machinery and Intelligence. We now call the attempt for a machine to replicate human intelligence ‘artificial intelligence’, a term first used by American computer and cognitive scientist John McCarthy.
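The blind, text-only setup Turing described can be sketched in code. The snippet below is a purely illustrative toy, not an actual Turing test: the “human” and “machine” are just canned answer tables, and the names `human_answer`, `machine_answer` and `imitation_game` are hypothetical. What it captures is the structure of the game — the two respondents sit behind anonymous labels, and only their printed answers reach the interrogator.

```python
import random

# Toy imitation-game setup (illustrative only): two hidden respondents
# answer through the same text channel, labeled A and B.
def human_answer(question):
    # Canned stand-in for the human in one room.
    return {"2+2?": "four, I think", "favorite color?": "blue"}.get(question, "not sure")

def machine_answer(question):
    # Canned stand-in for the machine in the other room.
    return {"2+2?": "4", "favorite color?": "I do not have one"}.get(question, "unknown")

def imitation_game(questions, seed=0):
    """Return (transcript, hidden identities) for one round of the game."""
    rng = random.Random(seed)
    respondents = [("human", human_answer), ("machine", machine_answer)]
    rng.shuffle(respondents)  # the interrogator cannot tell which room is which
    transcript = {}
    for label, (_, answer) in zip("AB", respondents):
        # Only the printed answers reach the interrogator -- no voices, no names.
        transcript[label] = [answer(q) for q in questions]
    hidden = {label: name for label, (name, _) in zip("AB", respondents)}
    return transcript, hidden

transcript, identities = imitation_game(["2+2?", "favorite color?"])
```

If the interrogator, reading only `transcript`, guesses the contents of `identities` no better than chance, the machine has won this toy version of the game — which is exactly the criterion Turing proposed.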
It has been more than 70 years since Turing’s paper, and while machines still cannot truly feel, chatbots such as ChatGPT have passed versions of the Turing test, fooling humans into thinking they are conversing with a human being and not a machine.
The history of artificial intelligence is a fascinating and ongoing story of human ingenuity and technological progress. From the early days of AI research to the modern era of machine learning and deep learning, we have seen incredible advancements in this field.
Despite the many challenges and setbacks along the way, researchers and developers have persevered and made significant strides in creating intelligent machines that can perform tasks once thought impossible. And as AI continues to evolve and improve, we can only imagine the many new possibilities and innovations that lie ahead.