Welcome to the 21st part of our machine learning tutorial series and the next part in our Support Vector Machine section. In this tutorial, we're going to be covering some of the basics of vectors, which are integral to the concepts of the Support Vector Machine.
First, a vector has both a magnitude and a direction:
In the above example, vector A (denoted with an arrow above it) points toward [3,4]. Think of each "coordinate" as movement in that "dimension." In our case, we have two dimensions: we're moving 3 units in dimension 1 and 4 units in dimension 2. That's the direction; what's the magnitude? That's something we've already seen before. Euclidean distance, the norm, the magnitude: they're all the same thing, and, most importantly for us, the calculation is the same for each (the square root of the sum of the squared components).
In our case, the magnitude of A is 5. If you look at the graph as well, you might notice something else:
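As a quick sanity check, here's the magnitude calculation sketched in plain Python (using only the standard library), applied to the vector A = [3,4] from the example above:

```python
import math

# Vector A from the example above
A = [3, 4]

# Magnitude (Euclidean norm): square root of the summed squares
magnitude = math.sqrt(sum(v**2 for v in A))
print(magnitude)  # 5.0
```

The same one-liner works in any number of dimensions, since it just sums over however many components the vector has.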
Looks a whole lot like the Pythagorean Theorem for a triangle's hypotenuse! It is indeed the same formula, only we're possibly going to be working in many more dimensions, where there won't be a simple triangle anymore.
Simple enough. Next up: the dot product. What happens when we take the dot product of two vectors? Let's say we have two vectors, A and B, where A is [1,3] and B is [4,2]. We take each pair of matching components, multiply them together, and then sum the results. For example:
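Here's a minimal sketch of that dot product in plain Python, using the A = [1,3] and B = [4,2] vectors from above; the result should be (1*4) + (3*2) = 10:

```python
A = [1, 3]
B = [4, 2]

# Pair up matching components, multiply each pair, then sum:
# (1*4) + (3*2) = 4 + 6 = 10
dot_product = sum(a * b for a, b in zip(A, B))
print(dot_product)  # 10
```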
Alright, now that we have those concepts down, we're ready to move on to the Support Vector Machine itself. First up will be some of the assertions we, the scientists, are making about the machine.