Artificial intelligence: everyone is talking about it, everyone is excited about it, and everyone thinks they know what it is. But do you really? What is the definition of artificial intelligence, and how does it relate to machine learning and deep learning?
One of the hottest topics in computer science today is artificial intelligence, or AI for short. By now there is hardly anyone who has not heard of artificial intelligence, whether in the news on TV, in newspapers, around the Internet, or in science fiction movies. On the one hand, artificial intelligence sounds like a familiar, well-known concept; but if you stop for a moment and think, can you accurately define it?
True, it is a phrase that has reached all of us; there is hardly anyone who has not thought or talked about artificial intelligence at one point or another, or even researched the subject. But what exactly is artificial intelligence? Why is everyone talking about it, and why now? Can intelligence really be artificial?
So far we have raised many questions, and in this article I will answer all of them and more. It is important to note that the purpose of this article is to give a theoretical background on artificial intelligence, machine learning, and deep learning. Throughout the article I will explain the concepts mainly in a theoretical way, keeping technical detail to a minimum.
What is artificial intelligence?
Artificial intelligence is a fairly general concept. It encompasses many different and varied fields and approaches that began to develop, believe it or not, as early as the 1950s. The term refers to any method or technique for making a computer or machine display some level of intelligence.
In other words, artificial intelligence is an umbrella term covering a long list of tools, methods, and techniques designed to give computers capabilities similar to those of humans, for example facial recognition. As we said, artificial intelligence dates back to the 1950s; in its early years it consisted mainly of algorithms built around fixed rules and databases, meaning that every change in the data required a change to the system, a kind of manual "relearning".
Machine Learning – ML
In defining artificial intelligence (AI), we noted that it is a general concept covering many subfields. Machine Learning, abbreviated ML, is one of those fields and until recently was the most common. Machine learning is an approach that allows you to "teach" a computer to perform a wide variety of tasks.
Machine learning is a collective name for a family of algorithms that give a computer the ability to perform a variety of tasks that ordinary algorithms struggle with. When using machine learning algorithms, researchers and developers often perform a process called feature extraction. In this process, algorithms written by humans tell the computer which details to look for in the input. In facial recognition, for example, developers might instruct the computer to look for distinctive details such as eye shape and other facial features.
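Although this article stays mostly theoretical, a tiny sketch may make feature extraction concrete. The snippet below is purely illustrative: the image is fake, and the four features (brightness, contrast, edge strength) are an arbitrary example of the kind of details a human might decide are important.

```python
import numpy as np

def extract_features(image):
    """Hand-crafted feature extraction: reduce a grayscale image
    (a 2-D array of pixel intensities) to a handful of numbers
    that a classic ML algorithm can work with. The choice of
    features here is illustrative, decided by a human in advance."""
    return np.array([
        image.mean(),                           # overall brightness
        image.std(),                            # contrast
        np.abs(np.diff(image, axis=0)).mean(),  # vertical edge strength
        np.abs(np.diff(image, axis=1)).mean(),  # horizontal edge strength
    ])

# A fake 8x8 "image": the classic ML algorithm never sees the raw
# 64 pixels, only the 4 features a human decided to look for.
image = np.arange(64, dtype=float).reshape(8, 8)
features = extract_features(image)
print(features.shape)  # (4,)
```

The point is the shape change: 64 raw pixels go in, 4 human-chosen numbers come out, and only those numbers reach the learning algorithm.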
There are currently many algorithms under the machine learning umbrella, such as SVM and KNN. While they are excellent for some problems, such as classifying data into groups with a common denominator, these algorithms may be far less suitable for others. The KNN algorithm, for example, is not very effective when objects need to be identified in images.
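To show how simple such an algorithm can be, here is a minimal KNN sketch: classify a new point by majority vote among its k nearest training examples. The toy data and labels are invented for the example.

```python
import numpy as np
from collections import Counter

def knn_predict(train_X, train_y, point, k=3):
    """Classify `point` by a majority vote among its k nearest
    training examples (Euclidean distance)."""
    distances = np.linalg.norm(train_X - point, axis=1)
    nearest = np.argsort(distances)[:k]
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Two toy clusters: label "A" near the origin, label "B" near (5, 5).
train_X = np.array([[0, 0], [1, 0], [0, 1],
                    [5, 5], [6, 5], [5, 6]], dtype=float)
train_y = ["A", "A", "A", "B", "B", "B"]

print(knn_predict(train_X, train_y, np.array([0.5, 0.5])))  # A
print(knn_predict(train_X, train_y, np.array([5.5, 5.5])))  # B
```

This works well when distance in the raw feature space is meaningful, which is exactly why it breaks down on raw image pixels, where two photos of the same object can be far apart numerically.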
Deep Learning – DL
Deep Learning, abbreviated DL, is a subfield within machine learning. It differs from classical machine learning in one important and significant point: the feature extraction procedure. In deep learning there is no feature extraction by humans at all. While in classical machine learning the computer searches for specific information it was instructed to find, according to algorithms defined by humans, in deep learning the process is different.
In deep learning, the computer "learns by itself" which features to look for, and, interestingly, it also assigns different importance to different features. Deep learning algorithms learn from examples and tests. If the goal of an algorithm is to identify types of vehicles, the developers will feed it many pictures of vehicles and ask it to name the type of each one. The algorithm then receives the correct answers from the developers and compares them to its own. Finally, it makes the necessary adjustments to increase its chances of answering correctly. This process is repeated many times until a sufficient level of accuracy is reached.
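The predict-compare-adjust loop described above can be sketched in a few lines. This is a deliberately tiny stand-in (a single-neuron logistic model on invented toy data), not a real deep network, but the three steps are the same ones a deep learning framework repeats millions of times.

```python
import numpy as np

# Toy data: points in the unit square, labeled by whether x + y > 1.
# The labels play the role of the developers' "correct answers".
rng = np.random.default_rng(0)
X = rng.random((200, 2))
y = (X.sum(axis=1) > 1.0).astype(float)

w = np.zeros(2)   # the model's adjustable parameters
b = 0.0
lr = 0.5          # learning rate: how big each correction step is

for epoch in range(500):
    pred = 1 / (1 + np.exp(-(X @ w + b)))  # 1. the model gives its own answers
    error = pred - y                       # 2. compare to the correct answers
    w -= lr * (X.T @ error) / len(X)       # 3. adjust the parameters to raise
    b -= lr * error.mean()                 #    the chance of a correct answer

pred = 1 / (1 + np.exp(-(X @ w + b)))
accuracy = ((pred > 0.5) == (y > 0.5)).mean()
print(f"accuracy after training: {accuracy:.2f}")
```

Notice that no one told the model which combination of inputs matters; the repeated corrections push the weights toward a useful answer on their own.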
Interim summary: ML, AI and DL
Before moving on to the second part of the article, it is worth pausing for a short interim summary of everything said, or rather written, so far. There is a direct and clear relationship between the three concepts: artificial intelligence (AI) includes machine learning (ML), which in turn includes deep learning (DL). Until recently, deep learning was not very common compared to other machine learning algorithms, and there is a fairly simple explanation for this.
The explanation lies in deep learning's disadvantages. Deep learning requires huge amounts of data, which can be very hard to obtain, for the simple reason that there is not always enough accessible, well-organized data suitable for training, validating, and testing the algorithms. Developing deep learning algorithms can take many months, sometimes without success, and this is one of the researchers' biggest fears.
Many machine learning algorithms also come with substantial shortcomings; in fact, feature extraction itself is a difficult challenge. How do we define what is important? How, for example, do we tell the difference between a dog and a cat? Which parameters do we look at? These are exactly the questions you do not have to answer when working with deep learning, which, by the way, is becoming more and more popular over time, thanks to datasets that keep growing larger and more accessible and to technologies from companies such as Intel, NVIDIA, IBM, and others.
The technological development of processors, graphics cards, and storage components, alongside the collection of massive amounts of data, has made the field of deep learning far more attractive and relevant to companies. If in the past deep learning was seen as very limited, today it is extremely broad, and the focus on it is greater than ever.
One common question is: why now? As mentioned, artificial intelligence has been around since the 1950s, so when exactly did the "boom" and the great interest in it begin? To answer that, we go back in time to 2012 and a researcher named Alex Krizhevsky. In 2012, Krizhevsky introduced a neural network called AlexNet, whose uniqueness was that it was trained on graphics cards and achieved extremely high accuracy. The great advantage of graphics cards is their ability to process a great deal of information in parallel, on a scale that processors cannot match.
AlexNet was not only one of the first networks to leverage the capabilities of graphics cards; it also demonstrated a significant, impressive jump in the accuracy of its deep learning algorithm. AlexNet prevailed by a respectable margin over several competing algorithms of the time, and the enthusiasm surrounding it was a major factor in accelerating the field of deep learning.
Nvidia, which recognized the enormous potential of artificial intelligence combined with the enormous capacity of graphics cards, devoted huge research and development budgets to the field, several billion dollars a year to be exact. Over the years, Nvidia has developed hardware and software to improve the performance and accuracy of artificial intelligence while shortening training time and simplifying its complexity. Today the company offers many services, such as NGC and Quadro vWS, alongside hardware and architectures such as Volta, which was designed with artificial intelligence workloads in mind.
How artificial intelligence affects us today and how it will affect us tomorrow
Artificial intelligence influences us without our even noticing. For example, did you know that Google's search engine uses artificial intelligence? Have you ever wondered how Google manages to return suitable images from a verbal description? The personal assistants from Google and Apple, facial recognition, fingerprint recognition, object recognition in Google Lens, and scene recognition in built-in camera apps are all examples of artificial intelligence in use.
So far we have given only a small number of examples of how artificial intelligence has entered our lives and influenced them positively, but it holds enormous potential in virtually every area of life you can think of. Artificial intelligence may well be one of the most important inventions since electricity.
Think for a moment about an artificial intelligence that analyzes X-rays, or one that discovers new drugs, perhaps even drugs for terminal diseases such as cancer. Notice how significant and revolutionary these examples are, and we have not even mentioned autonomous vehicles or care robots. Artificial intelligence is an important tool for making the world a better place, and it would be hard to find anyone who disagrees.
Where is Israel in all this story?
Some of you are probably wondering where Israel, the startup nation, fits into the story of artificial intelligence, and the answer is: right at the center. Today, thousands of startups using artificial intelligence operate in Israel. At every event we attend, we are exposed to new areas and new ways in which artificial intelligence will affect us in the future; it is rare to see the same idea or the same company at two different events.
One of the biggest acquisitions made in Israel was Nvidia's purchase of the Israeli company Mellanox. Together, the two companies provide the tools and resources artificial intelligence developers need to keep moving forward: Nvidia's processing capabilities alongside Mellanox's remarkable connectivity allow researchers and developers to process and transfer huge amounts of information. I have had the opportunity to meet Nvidia's CEO, Mr. Jensen Huang, twice, and both times he was enthusiastic about Israel: about the number of startups, the ideas, the success, and the amazing things people here produce with artificial intelligence.
We think that one important point is often missed when talking or thinking about artificial intelligence. In this whole process we are, essentially, teaching computers, in other words materials like silicon and metal, to "think like humans", and that is amazing. Think about it for a moment: silicon and metal that have undergone chemical and physical processes are taught to "think" and perform actions that, until recently, only humans could do. This is exactly what makes AI so impressive.
In this article we have touched on three main concepts, AI, ML, and DL, each of which could fill entire articles hundreds of pages long. If you want to get into artificial intelligence and develop it yourself, you are more than welcome; today there are many sources of information and ways to learn the field, from courses offered by companies in the industry to courses on sites like Udemy and Coursera.