Machine-learning and artificial intelligence algorithms used in sophisticated applications such as autonomous cars are not foolproof and can be manipulated by introducing small errors into their inputs, Indian Institute of Science (IISc) researchers have warned, as Gadgets Now reports.
Machine-learning and AI software are trained on initial sets of data, such as images of cats, and learn to identify feline images more accurately as more such data are fed in. A common example is Google returning better results as more people search for the same information.
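The training loop described above can be sketched in a few lines. The snippet below is purely illustrative (a tiny PyTorch classifier fitted to random stand-in tensors rather than real photos) and is not code from the IISc work:

```python
# Minimal sketch of supervised image-classifier training, assuming
# PyTorch is installed. Random tensors stand in for labelled photos
# so the script is self-contained.
import torch
import torch.nn as nn

# Tiny convolutional classifier with two output classes
# ("cat" vs "not cat").
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in "dataset": 32 random 32x32 RGB images with random labels.
images = torch.randn(32, 3, 32, 32)
labels = torch.randint(0, 2, (32,))

# The model improves only through the examples it sees: each pass
# nudges the weights to reduce classification error on the batch.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss = {loss.item():.4f}")
```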
Use of AI applications is becoming mainstream in areas such as healthcare, payments processing, crowd monitoring by drones, and facial recognition in offices and airports.
“If your data input is not clear and vetted, the AI machine could throw up surprising results, and that could end up being hazardous. In autonomous driving, the AI engine should be trained properly on all road signs. If the input sign is different, then it could change the course of the vehicle, leading to a catastrophe,” R Venkatesha Babu, Associate Professor at IISc’s Department of Computational Sciences, told ET.
“The system also needs to have enough cyber security measures to prevent hackers from intruding and altering inputs,” he said.
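The vetting Babu calls for can take many forms. One minimal, hypothetical illustration is to integrity-check training files against a trusted manifest before they ever reach the learner; the file name and manifest entry below are invented for the sketch:

```python
# Hypothetical sketch: reject training inputs whose SHA-256 digest
# does not match a trusted manifest, so tampered files never reach
# the model.
import hashlib
from pathlib import Path

# Manifest mapping file name -> expected SHA-256 digest. The entry
# here is a placeholder; a real manifest is produced when the data
# set is curated and vetted.
TRUSTED = {
    "stop_sign_001.png": hashlib.sha256(b"original stop-sign bytes").hexdigest(),
}

def is_vetted(path: Path) -> bool:
    """Return True only if the file's digest matches the manifest."""
    expected = TRUSTED.get(path.name)
    if expected is None:
        return False  # unknown files are rejected outright
    return hashlib.sha256(path.read_bytes()).hexdigest() == expected

# Usage: feed the model only files that pass the check.
# clean = [p for p in Path("data").glob("*.png") if is_vetted(p)]
```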
Babu and his students Konda Reddy Mopuri and Aditya Ganeshan, in a paper published in the prestigious IEEE Transactions on Pattern Analysis and Machine Intelligence, have demonstrated how errors introduced into machine-learning algorithms can throw up wildly wrong results, such as labelling a missile an African chameleon, or a banana a custard apple.
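The perturbations in the paper are crafted with the authors' own method; the sketch below instead uses the textbook fast gradient sign method (FGSM), a simpler, closely related attack, to show the same failure mode. The model, image, and label are stand-ins, not the setup from the paper:

```python
# Sketch of the fast gradient sign method (FGSM): a barely visible,
# deliberately crafted change to the input can flip a classifier's
# prediction. Assumes PyTorch is installed.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in linear classifier over 32x32 RGB inputs, 10 classes.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
loss_fn = nn.CrossEntropyLoss()

image = torch.rand(1, 3, 32, 32, requires_grad=True)  # stand-in input
label = torch.tensor([3])  # pretend ground-truth class

# The gradient of the loss w.r.t. the input pixels reveals which tiny
# per-pixel changes hurt the classifier the most.
loss = loss_fn(model(image), label)
loss.backward()

epsilon = 0.05  # perturbation budget, far below what a human notices
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

with torch.no_grad():
    print("clean prediction:    ", model(image).argmax(1).item())
    print("perturbed prediction:", model(adversarial).argmax(1).item())
```

With a trained model the perturbed prediction typically changes even though the two images look identical to a person; on this random stand-in the outputs merely demonstrate the mechanics.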
More Work on AI Needed
Analysts say the research throws light
on the “hype of AI” and that more work is needed to improve its
efficiency and security. “If the technologies can be confused so easily,
we are in trouble. It is like having computers that can easily be
hacked; the first generations had practically no security,” said Vivek Wadhwa, a Distinguished Fellow at Carnegie Mellon University’s College
of Engineering.
Source: Gadgets Now