“Artificial Intelligence is about replacing human decision making with more sophisticated technologies.” (Falguni Desai)

What artificial intelligence is and why it matters
Artificial intelligence (AI) makes it possible for machines to learn from experience, adjust to new inputs, and perform human-like tasks. Most AI examples that you hear about today (from chess-playing computers to self-driving cars) rely heavily on deep learning and natural language processing (NLP). Using these technologies, computers can be trained to accomplish specific tasks by processing large amounts of data and recognizing patterns.
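As a purely illustrative sketch of what "training on data" means, the short Python snippet below (assuming the widely used scikit-learn library is installed, and using an invented toy dataset) shows a model picking up a pattern from labelled examples rather than from hand-written rules:

from sklearn.tree import DecisionTreeClassifier

# Hypothetical training examples: [hours of daily use, error count] per device,
# labelled 1 if the device later failed and 0 if it kept working.
X = [[1, 0], [2, 1], [3, 0], [40, 12], [42, 9], [55, 15]]
y = [0, 0, 0, 1, 1, 1]

model = DecisionTreeClassifier()
model.fit(X, y)                    # "training": the model infers the pattern itself

print(model.predict([[45, 11]]))   # -> [1]; the rule was learned from data, not coded by hand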
The term artificial intelligence was coined in 1956, but AI has become more popular today thanks to increased data volumes, advanced algorithms, and improvements in computing power and storage.
Early AI research in the 1950s explored topics like problem solving and symbolic methods. In the 1960s, the US Department of Defense took interest in this type of work and began training computers to mimic basic human reasoning. For example, the Defense Advanced Research Projects Agency (DARPA) completed street mapping projects in the 1970s, and in 2003 it produced intelligent personal assistants.
While Hollywood movies and science fiction novels depict AI as human-like robots that take over the world, the current evolution of AI technologies isn’t that scary or quite that smart. Instead, AI has evolved to provide many specific benefits in every industry. Here are some things you may not know about AI:
1. AI could make visiting a doctor a thing of the past: Visits to the doctor have changed very little in recent years: you visit the doctor, explain your symptoms, and are given a physical exam. AI could change this by allowing patients to describe their symptoms to an application that uses NLP to understand speech. The symptoms could then be compared against a database, which could potentially reduce misdiagnoses and speed up doctor’s appointments (a toy sketch of this kind of symptom matching appears after the list). Telemedicine, the use of telecommunication and information technology to provide clinical health care from a distance, has been used to overcome distance barriers and to improve access to medical services that would often not be consistently available in distant rural communities. It is also used to save lives in critical care and emergency situations.
2. AI could one day give us Minority Report-style predictive policing: Police departments currently analyze data to spot crime trends, but in the future AI applications could take that a step further by observing crime statistics in real time. Computers could also act as digital police by recommending where and when to deploy officers.
3. AI goes way back to ancient Greece: Even though most of us think of robots that can speak and reason like humans when AI is mentioned, the idea of a man-like machine can trace its roots back as far as the ancient Greeks. According to History Extra, Hephaestus (Vulcan to the Romans) was the blacksmith of Olympus who created lifelike metal automatons.
4. AI is powered by graphics processing units, not central processing units: Graphics processing units (GPUs) generate the graphics for your games console, but they are also known for their amazing number-crunching capabilities. AI applications are trained using large amounts of data rather than being explicitly programmed like traditional computer programs. The time taken to complete this training phase is significantly reduced by using GPUs, because they can quickly perform many calculations in parallel in a way that central processing units (CPUs) can’t match (a rough illustration of this kind of bulk arithmetic appears after the list). Many of the big players in AI, such as Google, Microsoft, Facebook and IBM, use high-end GPUs like Nvidia’s Tesla chips for AI applications instead of traditional CPUs.
5. AI must learn to understand our languages: One of the main goals in AI is to get computers to process and understand our natural languages. The research field devoted to this is NLP. It means that computers must be able to understand human languages such as English, Spanish and German, and work out exactly what is being communicated. Computers must also learn to talk and generate their own speech.
6. AI research formally began in the 1950s: Although research into artificial intelligence began before this date, it was formalized as an academic research topic at the Dartmouth Conference in 1956. Many of those who attended the conference went on to lead the field for decades.
7. Alan Turing had a major influence on AI: Alan Turing published a paper in 1950 that focused on machine intelligence. In it, he posed the question "Can machines think?" and proposed a test called the "Imitation Game", inspired by a party game in which an interrogator had to work out the gender of two hidden players. This test, now known as the Turing Test, is still used to this day as computers attempt to fool human judges into thinking they are also human.
8. Driverless cars won't work without AI: Driverless cars are beginning to transform how we get from one place to another. They are certainly one of the biggest technological achievements of the 21st century, and they would not be possible without the progress made in AI.
9. Investors are pouring a great deal of money into AI: AI is a hot market for startups at the moment, and it shows no signs of slowing down. So far, $5.4 billion has been invested in AI start-ups. Companies to look out for include robotics company Autonomous, team productivity software maker Crux, and AI social news aggregator Zero Slant.
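To make item 1 a little more concrete, here is a toy sketch in plain Python of how free-text symptoms might be matched against known conditions. The three-entry condition "database" is invented for illustration; a real system would use proper NLP models and clinically validated data.

import re
from collections import Counter

# Invented, illustrative condition "database": condition -> symptom keywords.
CONDITION_DB = {
    "common cold": {"cough", "sneezing", "sore", "throat", "runny", "nose"},
    "migraine": {"headache", "nausea", "light", "sensitivity"},
    "flu": {"fever", "chills", "cough", "aches", "fatigue"},
}

def rank_conditions(patient_text):
    """Order conditions by how many of their keywords appear in the patient's description."""
    words = set(re.findall(r"[a-z]+", patient_text.lower()))
    scores = Counter({name: len(words & keywords) for name, keywords in CONDITION_DB.items()})
    return scores.most_common()

print(rank_conditions("I have a fever, chills and a bad cough"))
# [('flu', 3), ('common cold', 1), ('migraine', 0)]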
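And to illustrate the number-crunching point in item 4: training a model boils down to enormous amounts of matrix arithmetic that can be done in parallel. The sketch below uses NumPy on the CPU as a stand-in for a GPU, simply to show how much faster one bulk, parallel-friendly operation is than working through the numbers one at a time.

import time
import numpy as np

n = 200
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# Multiply the matrices one element at a time, the naive sequential way.
start = time.perf_counter()
c_loop = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        for k in range(n):
            c_loop[i, j] += a[i, k] * b[k, j]
loop_time = time.perf_counter() - start

# The same computation as a single bulk matrix multiplication.
start = time.perf_counter()
c_fast = a @ b
fast_time = time.perf_counter() - start

print(f"element-by-element: {loop_time:.2f}s   single matrix multiply: {fast_time:.4f}s")
print("results match:", np.allclose(c_loop, c_fast))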
“Just as electricity transformed almost everything 100 years ago, today I actually have a hard time thinking of an industry that I don’t think AI (Artificial Intelligence) will transform in the next several years.” (Andrew Ng)[i]
[i] Sources used:
· “9 Things You Probably Don’t Know about Artificial Intelligence” by Baz Edwards
· “Artificial Intelligence” by SAS
· “Benefits & Risks of Artificial Intelligence” by the Future of Life Institute
· “Central Processing Unit”, “Telemedicine” and “GPU (disambiguation)” from Wikipedia