Artificial Intelligence
- Hashtag Kalakar
- Oct 27
By Aashna Sinha
Alan Turing, the father of artificial intelligence, predicted, “at the end of the century [the 1900s] the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.” Though the man was called mad in his time, we now see how true his words were.
Ada Lovelace, the first person to recognise that a computer had applications beyond calculation, stated what is now known as the Lovelace Objection: “The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform.” The progress of AI has since overcome this objection.
Artificial intelligence can be traced back to ancient myths and to thinkers of the Medieval and Modern Ages. Talos of Crete, a bronze automaton from Greek mythology, is often considered the earliest conception of a machine with a human-like mind. AI has also been a muse to many writers. From Mary Shelley’s Frankenstein to the works of numerous contemporary authors, many have used the idea of humanoid mechanical figures. A common theme in most of these writings is the fear of the machine gaining a consciousness of its own.
Computers have become a necessity rather than a convenience in today’s world. While AI is a testament to the brilliant minds who created it and a tool that makes the daily lives of countless people easier, it has its downside. It eases our struggles and quickens the pace of our work, but it also numbs our minds. Individuals blindly depend on its efficiency and accuracy without gaining any actual knowledge from it. It is a device of much help yet little benefit. Will innovation be our undoing?
In recent years, the obsession with AI imitating humans has taken the world by storm, leading to a fear of machines taking over. This fear is most marked when it comes to generative AI. The world is obsessed with what AI can produce. It has made itself a part of our fast-paced lives and fascinated us all with its outputs. Large language models (LLMs) like ChatGPT and Meta AI have shown great promise in giving human-like responses and constructing complex sentences.
Many people now resort to machines not just for tasks like calculation or analysis, but also for creative undertakings like writing and drawing. While AI’s prowess in imaginative domains is passable at best, that doesn’t stop many from relying on it to write essays and stories for academic purposes or to create art. It lacks the originality and the personal signature that humans possess. It can show you a painting by Van Gogh or Picasso, but it cannot conceive an original idea that could be called its own style. AI’s stories lack the deep and raw emotion that writers are able to experience and describe. It cannot match the sensitivity of Dostoevsky, the bureaucratic surrealism of Kafka or the empowered young women of Austen without skirting the fine line that divides inspiration from plagiarism.
Many teenagers and young adults with a sound understanding of technology are starting their own businesses and startups with the assistance of AI. But as they do so, they replace man with machine. Professionals like accountants, cashiers, telemarketers, receptionists and paralegals are among the most exposed: while AI hasn’t yet taken over their jobs entirely, these roles are at high risk of being replaced by machines in the near future. Researchers from Stanford's Digital Economy Lab report that, since the widespread adoption of generative AI in late 2022, early-career workers (ages 22–25) in the most AI-exposed occupations have experienced a thirteen percent relative decline in employment.
Stephen Hawking wrote in his book Brief Answers to the Big Questions, "We may face an intelligence explosion that ultimately results in machines whose intelligence exceeds ours by more than ours exceeds that of snails." Though that day hasn’t come yet, with AI’s rapid progress it doesn’t seem too far away. This AI revolution is a paradigm shift in the timeline of human existence, and if not taken forward with moderation and control, it can spell disaster for the human race.