Artificial Intelligence (AI) is everywhere, even if we don’t always recognize it. Our smartphones feature smart assistants such as Google Assistant, Microsoft’s Cortana, Amazon’s Alexa and Apple’s Siri. Some high-end mobile SoCs already include a neural network accelerator, like the Neural Engine in the A12 chip found in the iPhone Xs (including the Xs Max and Xr) or the AI Engine in the Snapdragon 855 found in the Samsung Galaxy S10+ (including the S10 and S10e). However, our smartphones and computers do not need neural-network-accelerator hardware to benefit from AI – it can run on a server, as in the case of Google Search, so you experience AI features even on the Intel Atoms found in cheap netbooks.
First, not everything called AI is true AI. If you play a game like Warcraft 3 against a computer-controlled player, that is not true AI – it is just a piece of code with predefined behaviour. Real AI has the capability to learn new things, or even to learn from its mistakes, as in the ancient game of Go, where Google’s AI defeated a human master. Such AI is extremely resource-demanding, so it has to run on Google’s servers. Our current technology is not advanced enough to change that, and we are pretty far from quantum computers integrated into small smartphones.
Today, thanks to machine learning, you can type a question like “how tall is Mount Everest” into a search engine such as Google, and Google Search will reply “8848 metres”. Also try typing “how tall is the tallest mountain in the world” – it will show you the exact answer, even though you never mention the words “Mount Everest” and even if your query is not grammatically correct. This applies to many search engines, such as the aforementioned Google Search or Microsoft Bing:
Microsoft’s Bing and its machine-learning algorithms are used to build vectors – essentially, long lists of numbers – that in some sense represent their input data, whether it be text on a webpage, images, sound, or videos. Bing captures billions of these vectors for all the different kinds of media that it indexes. To search the vectors, Microsoft uses an algorithm it calls SPTAG (“Space Partition Tree and Graph”). An input query is converted into a vector, and SPTAG is used to quickly find “approximate nearest neighbors” (ANN), which is to say, vectors that are similar to the input (source: Arstechnica).
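To make the “approximate nearest neighbors” idea concrete, here is a minimal sketch (the vectors and numbers are invented for illustration, and it finds exact neighbors by brute force with cosine similarity – a real ANN index such as SPTAG partitions the vector space with trees and graphs precisely so that it never has to scan every vector):

```python
import numpy as np

def nearest_neighbors(query, vectors, k=2):
    """Return indices of the k vectors most similar to the query,
    by cosine similarity (brute force, for illustration only)."""
    q = query / np.linalg.norm(query)
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    sims = v @ q                      # cosine similarity to every vector
    return np.argsort(-sims)[:k]      # indices of the k highest scores

# Toy "embeddings" – in a real system these come from a learned model.
vectors = np.array([
    [0.90, 0.10, 0.00],   # 0: "how tall is Mount Everest"
    [0.85, 0.15, 0.00],   # 1: "height of the tallest mountain"
    [0.00, 0.20, 0.90],   # 2: "best pizza near me"
])
query = np.array([0.88, 0.12, 0.05])  # vector for an Everest-like question

print(nearest_neighbors(query, vectors))  # the two mountain questions win
```

The point is that similar inputs end up as nearby vectors, so “closest vectors” stands in for “most similar meaning”.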
By comparing search queries like “how tall is mount everest”, “how tall is the highest mountain in the world” and “what’s the name of the tallest mountain in the world”, the machine-learning algorithm can learn to provide correct answers no matter how the question is phrased. Microsoft has released its SPTAG algorithm – used to answer questions in Bing – as MIT-licensed open source on GitHub. Developers can use this algorithm to search their own sets of vectors, and it is pretty fast – a single machine can handle 250 million vectors and answer 1,000 queries per second.
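Under that model, answering a differently phrased question boils down to embedding the query and returning the answer of the closest indexed question. A toy sketch of that pipeline (bag-of-words counts stand in for learned vectors, and the questions, answers and threshold are invented for illustration – this is not SPTAG itself):

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: bag-of-words counts (real systems use learned vectors)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(c * c for c in a.values()))
    nb = math.sqrt(sum(c * c for c in b.values()))
    return dot / (na * nb)

# Indexed questions, each mapped to an answer (hypothetical data).
index = {
    "how tall is mount everest": "8848 metres",
    "who wrote hamlet": "William Shakespeare",
}
index_vectors = {q: embed(q) for q in index}

def answer(query, threshold=0.3):
    """Return the answer of the most similar indexed question,
    or None if nothing is similar enough."""
    qv = embed(query)
    best = max(index_vectors, key=lambda q: cosine(qv, index_vectors[q]))
    if cosine(qv, index_vectors[best]) < threshold:
        return None
    return index[best]

print(answer("how tall is the highest mountain in the world"))
```

Even though the query never mentions “Everest”, it shares enough words with the indexed question to be its nearest neighbor, so the lookup returns the Everest answer.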
At various events, Microsoft CEO Satya Nadella has spoken about his desire to “democratize AI” and make it available to everybody – producing not simply a centralized, specialized tool that requires substantial expertise, but something that a wide range of developers, solving a wide range of problems, can use as part of their toolkit.
The release of SPTAG is an example of how Microsoft is putting those words into practice. The future of AI looks interesting, and with the advent of quantum computers, AI may play a significant role in human evolution.