Support Vector Machines (SVM) clearly explained: A python tutorial for classification problems with 3D plots
In this article I explain the core ideas behind SVMs and why and how to use them. Additionally, I show how to plot the support vectors and the decision boundaries in 2D and 3D.

Introduction
Everyone has heard about the famous and widely-used Support Vector Machines (SVMs). The original SVM algorithm was invented by Vladimir N. Vapnik and Alexey Ya. Chervonenkis in 1963.
SVMs are supervised machine learning models that are usually employed for classification (SVC — Support Vector Classification) or regression (SVR — Support Vector Regression) problems. Depending on the characteristics of the target variable (the one we wish to predict), our problem is a classification task if the target is discrete (e.g. class labels), or a regression task if the target is continuous (e.g. house prices).
SVMs are more commonly used for classification problems, and for this reason, in this article, I will focus only on SVC models.
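As a quick preview of the workflow covered later in the article, here is a minimal sketch of fitting an SVC with scikit-learn. It assumes scikit-learn is installed and uses the iris dataset purely as a toy example; the dataset, split, and default parameters are illustrative choices, not the article's specific setup.

```python
# Minimal sketch of a support vector classification workflow (illustrative only).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Load a small, well-known dataset with discrete class labels.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a support vector classifier with default settings (RBF kernel).
clf = SVC()
clf.fit(X_train, y_train)

print("Test accuracy:", clf.score(X_test, y_test))
# The fitted support vectors are available via clf.support_vectors_,
# which is what we will visualize in 2D and 3D later on.
```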