Tuesday, May 22, 2018

Neural network? Machine Learning? Here's all you need to know about AI | Technology, In Other News - Deccan Chronicle

AI is an umbrella term for a range of computer algorithms and approaches that allow machines to sense, reason, act and adapt like humans, as Deccan Chronicle reports.

These human-like capabilities include apps that recognise your face in photos, robots that can navigate hotels and factory floors, and devices capable of holding (somewhat) natural conversations with you.
Photo: Pixabay

Artificial intelligence encapsulates a broad set of computer science techniques for perception, logic and learning. One approach to AI is machine learning – programs whose performance improves over time as they are fed more data. Deep learning is among the most promising approaches to machine learning. It uses algorithms based on neural networks – a way to connect inputs and outputs based on a model of how we think the brain works – that find the best way to solve a problem themselves, rather than having a programmer or scientist write the solution by hand. Training is how deep learning applications are “programmed” – feeding them more input and tuning them. Inference is how they run, to perform analysis or make decisions...
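
To make the idea of "connecting inputs and outputs" a little more concrete, here is a minimal sketch in Python (not from the article): a single artificial neuron is just a handful of weighted connections from inputs to an output, passed through a nonlinearity. The feature values and random seed are made up for illustration.

```python
# A minimal sketch of one artificial neuron: weighted connections from
# inputs to an output, squashed by a nonlinearity. Values are illustrative.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
weights = rng.normal(size=3)   # the "connections" that training would adjust
bias = 0.0

def forward(features):
    # Combine the inputs according to the current weights, then map to 0..1.
    return sigmoid(features @ weights + bias)

print(forward(np.array([0.5, -1.2, 3.0])))  # an untrained prediction
```

Real deep learning models stack many layers of such neurons, but the basic picture – inputs flowing through adjustable connections to produce outputs – is the same.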

Training and Inference
There are two more quick concepts worth noting: training and inference. Training is the part of machine learning in which you’re building your algorithm, shaping it with data to do what you want it to do. “Training is the process by which our system finds patterns in data,” wrote the Intel AI team. “During training, we pass data through the neural network, error-correct after each sample and iterate until the best network parametrization is achieved. After the network has been trained, the resulting architecture can be used for inference.”
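
The Intel description above maps almost directly onto a basic training loop. The sketch below is a hypothetical illustration, not Intel's code: a single neuron learns a simple OR-like rule by passing each sample through the network, measuring the error, nudging the weights, and iterating. The dataset and learning rate are made up for the example.

```python
# A rough sketch of "pass data through the network, error-correct after each
# sample and iterate", using one neuron and per-sample gradient updates.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dataset: inputs and the 0/1 labels we want the network to reproduce.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 1.0])          # a simple OR-like rule

weights = np.zeros(2)
bias = 0.0
learning_rate = 0.5

for epoch in range(1000):                    # iterate over the data many times
    for features, target in zip(X, y):
        prediction = sigmoid(features @ weights + bias)
        error = prediction - target          # error-correct after each sample
        weights -= learning_rate * error * features
        bias -= learning_rate * error

print(weights, bias)                         # the trained parametrization
```

Once the loop finishes, the weights and bias are frozen – that trained parametrization is what gets handed over for inference.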

And then there’s inference, which fits its dictionary definition to the letter: “The act or process of deriving logical conclusions from premises known or assumed to be true.” In the software analogy, training is writing the program, while inference is using it.

“Inference is the process of using the trained model to make predictions about data we have not previously seen,” wrote those savvy Intel folks. This is where the function that a consumer might see – Aier’s camera assessing the health of your eyes, Bing answering your questions or a drone that auto-magically steers around an obstacle – actually occurs.
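
As a rough illustration of that distinction, the sketch below (with made-up weight values standing in for a finished training run) shows inference as nothing more than a forward pass over input the model has never seen: no labels, no error correction, no weight updates.

```python
# Inference sketch: run the trained weights forward on unseen data.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical parameters produced by a training run like the one sketched above.
trained_weights = np.array([4.2, 4.2])
trained_bias = -1.9

def infer(features):
    # Forward pass only: the model is used, not changed.
    return sigmoid(features @ trained_weights + trained_bias)

print(infer(np.array([0.9, 0.1])))   # prediction for a previously unseen input
```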
Read more...  

Source: Deccan Chronicle