Logical View

Learn about the logical view of the ongoing case study.

Here’s an overview of some of the classes shown in the previous chapter’s case study. An important omission from those definitions was the classify algorithm of the Hyperparameter class:

Class diagram overview

In the previous chapter, we avoided delving into the classification algorithm. This reflects a common design strategy, sometimes called “Hard Part, Do Later,” also called “Do The Easy Part First.” This strategy encourages following common design patterns where possible to isolate the hard part. In effect, the easy parts define a number of fences that enclose and constrain the novel and unknown parts.

The classification we’re doing is based on the k-nearest neighbors algorithm, k-NN. Given a set of known samples, and an unknown sample, we want to find neighbors near the unknown sample; the majority of the neighbors tell us how to classify the newcomer. This means k is usually an odd number, so the majority is easy to compute. We’ve been avoiding the question, “What do we mean by nearest?”
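To make the idea concrete, here’s a minimal sketch of a k-NN classifier written outside the case study’s classes. The Sample tuple, its two features, and the standalone classify() function are assumptions for illustration only; the case study’s Hyperparameter.classify() works with the full four-feature samples, and the choice of distance computation is exactly the open question taken up next.

```python
from collections import Counter
from math import hypot
from typing import NamedTuple


class Sample(NamedTuple):
    # Two features only, to keep the sketch small; the case study's
    # samples carry four measurements.
    x: float
    y: float
    species: str = ""


def classify(k: int, training: list[Sample], unknown: Sample) -> str:
    """Majority vote among the k training samples nearest to ``unknown``."""
    # Order the known samples by their distance from the unknown sample.
    nearest = sorted(
        training,
        key=lambda t: hypot(t.x - unknown.x, t.y - unknown.y),
    )
    # Tally the species of the k nearest neighbors; an odd k makes
    # a tied vote less likely.
    votes = Counter(t.species for t in nearest[:k])
    species, _ = votes.most_common(1)[0]
    return species
```

A call like classify(3, known_samples, Sample(5.1, 3.5)) would return whichever species is most common among the three known samples closest to the unknown one.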

Euclidean distance

In a conventional, two-dimensional geometric sense, we use the Euclidean distance between samples. Given an unknown sample located at $(u_x, u_y)$ ...