What Is Machine Learning?
Learn how machine learning is defined and how to distinguish between the various forms of machine learning.
Definition of machine learning
While there are many definitions of machine learning, this course uses the following: “The field of study that gives computers the capability to learn without being explicitly programmed.”
This definition is attributed to Arthur Samuel, an early pioneer in artificial intelligence who is credited with coining the term “machine learning” in 1959.
In practice, data scientists typically apply machine learning algorithms developed by others to enable computers to learn from data.
What is an algorithm?
An algorithm is a well-defined procedure or formula that takes input and produces output. That’s a bit abstract, so consider the following analogy: preparing a meal using a recipe.
Cooking recipes are like algorithms.
A typical recipe starts with a list of the raw ingredients. In the case of machine learning algorithms, the raw ingredients are data.
Next, a recipe provides instructions on preparing and processing the ingredients. Often, recipes have multiple preparation steps. Machine learning algorithms typically have many steps for preparing and processing the data.
After preparing and processing, a recipe provides instructions for cooking the ingredients (e.g., baking the dish for 45 minutes at 425°F). Similarly, machine learning algorithms have settings, often called hyperparameters, that control how the data is “cooked.”
Finally, following the recipe to completion yields the output: a finished meal. Similarly, a machine learning algorithm’s output is a predictive model.
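The recipe analogy can be sketched in code. The example below is a minimal, illustrative sketch (all function names are invented for this lesson, not part of any library): raw data plays the role of the ingredients, a preparation step cleans them, the training loop “cooks” them using settings such as the learning rate, and the output is a simple predictive model (a fitted line).

```python
# Recipe analogy, mapped onto a tiny machine learning algorithm:
#   ingredients   -> raw data
#   preparation   -> cleaning/processing the data
#   oven settings -> algorithm settings (hyperparameters)
#   finished meal -> a predictive model

def prepare(raw_points):
    """Preparation step: discard records with missing values."""
    return [(x, y) for x, y in raw_points if x is not None and y is not None]

def train(points, iterations=10000, learning_rate=0.01):
    """'Cooking' step: fit y = w*x + b by gradient descent.

    `iterations` and `learning_rate` play the role of oven
    temperature and cooking time: settings chosen by the cook.
    """
    w, b = 0.0, 0.0
    n = len(points)
    for _ in range(iterations):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in points) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in points) / n
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b
    return w, b  # the "meal": parameters of a predictive model

raw = [(1, 2), (2, 4), (None, 5), (3, 6)]   # one record is missing a value
w, b = train(prepare(raw))

def predict(x):
    return w * x + b

print(predict(4))  # close to 8 for this linearly related data
```

The final model here is just the pair of numbers `w` and `b`, which is the point of the analogy: the algorithm is the procedure, while the model is what the procedure produces from the data.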