An intuitive overview of a perceptron with python implementation (PART 2: Animating the Learning Process)

Mohammad Abdin
6 min read · Jan 8, 2021


Learning process animated

In the previous article we modelled a perceptron classifier in Python from scratch and used the iris flower dataset as our sample data. In this part we will extend our model by adding an animatedFit() function that shows the separation line and how our perceptron gradually tweaks the weights so that the line separates our two classes.

Before we dive into the code, I recommend you read this article by Thomas Countz [2], which explains the algebra used to derive the slope and y-intercept of our linear equation (i.e. the separation line).
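In short: with a weight vector w = [w0, w1, w2], where w0 is the bias (the part 1 convention of storing the bias as the first weight), the decision boundary is the set of points where w0 + w1*x1 + w2*x2 = 0. Solving for x2 gives x2 = -(w1/w2)*x1 - (w0/w2), so the slope of the separation line is -w1/w2 and its y-intercept is -w0/w2.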

The full code for this project can be downloaded here. If you would like only the code from the first part of this series, which this part builds on, you can find it here.

Defining a function to plot the separation line

We first define a plot_data() function within our perceptron class. It takes in our features array and uses the np.amin() and np.amax() functions to set the start and end points of our line.

In the first two lines of the function we define two empty lists, ‘x’ and ‘y’, and use the weight array to compute the slope and y-intercept of our separation line. We then start a loop to populate the lists so that they can be plotted on our matplotlib figure.

np.linspace() essentially acts as a range, giving us evenly spaced ‘x’ values that are combined with the slope and y-intercept to compute our ‘y’ values.
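Here is a minimal sketch of what plot_data() can look like. It assumes the part 1 conventions of storing the weights as self.weights = [bias, w1, w2] and importing numpy as np; the exact attribute names are assumptions on my part.

def plot_data(self, X):
    x = []
    y = []
    # boundary w0 + w1*x1 + w2*x2 = 0  ->  x2 = -(w1/w2)*x1 - (w0/w2)
    slope = -self.weights[1] / self.weights[2]
    intercept = -self.weights[0] / self.weights[2]
    # evenly spaced x values across the range of the first feature
    for i in np.linspace(np.amin(X[:, 0]), np.amax(X[:, 0])):
        x.append(i)
        y.append(slope * i + intercept)
    return x, y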

Now to test it out, we can train our model using the fit() function and then call plot_data() to plot our separation line:
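The snippet below is an illustrative usage example; the constructor arguments are assumptions based on part 1, and X holds the petal length and petal width features for the first two iris classes.

perceptron = Perceptron(learning_rate=0.01, n_iter=10)   # illustrative hyperparameters
perceptron.fit(X, y)

line_x, line_y = perceptron.plot_data(X)
plt.scatter(X[:, 0], X[:, 1], c=y)      # training points
plt.plot(line_x, line_y, 'k-')          # separation line
plt.xlabel('petal length')
plt.ylabel('petal width')
plt.show()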

Linear separation (petal width vs petal length)

Usually you will only need the final separation line to analyze a model, but what if we want to see the whole learning process by plotting this line at each iteration on the same graph, as shown in the first gif in this article?

Animating the learning process

For that we can use the FuncAnimation class provided by matplotlib. I’ll quickly go over how animations are built with this library, but feel free to check out matplotlib’s documentation and examples as well. First, we need the import:

from matplotlib.animation import FuncAnimation

Now we can define a FuncAnimation object as:

anim = FuncAnimation(fig, animate, init_func=init_plot, fargs=(X, y,), frames=50, interval=200, blit=True)

This object takes a pyplot figure ‘fig’, uses ‘init_func’ to initialize the plot data (in our case a straight line), and then iteratively calls the ‘animate’ function passed as the second parameter to update the values of that line. It does this 50 times, as indicated by the ‘frames’ parameter, maintaining a 200 millisecond ‘interval’ between frames. ‘fargs’ is used when we want to pass extra parameters to the animate function.

Before we start animating we need to make an adjustment to our fit() function. Since we want the separation line for each iteration, we cannot call fit() inside our animate() function, because that would fully fit the data on every frame (i.e. each frame would already have the optimized weights). Instead we define a new function step_fit() that only runs the inner loop, leaving our animate function to handle the outer loop:
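A minimal sketch of step_fit() is shown below: it performs a single pass over the training data, i.e. one iteration of the original fit() loop. It assumes the part 1 conventions of a predict() method, a self.learning_rate attribute, and weights stored as [bias, w1, w2]; those names are assumptions on my part.

def step_fit(self, X, y):
    # one pass over the data: the inner loop of fit(), so the weights move one step per call
    for xi, target in zip(X, y):
        update = self.learning_rate * (target - self.predict(xi))
        self.weights[1:] += update * xi   # feature weights
        self.weights[0] += update         # bias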

Now that we have step_fit() and plot_data(), we can start building our animation functions, namely init_plot() and animate().

Our initialization function is straightforward: we just set our line with empty x and y values, to be filled in by the animate function. We will define self.line later once all our components are complete, so for now you can ignore the ‘undefined’ error:
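A minimal sketch, assuming self.line will be the Line2D object created later in our animatedFit() function:

def init_plot(self):
    # start with an empty line; animate() will fill it in frame by frame
    self.line.set_data([], [])
    return self.line,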

Next we define our animate function. In it we call step_fit(), then run plot_data() to update our separation line according to the weight update from that iteration.

The function takes three parameters: ‘i’ is the frame number, passed automatically by our FuncAnimation object, while ‘X’ and ‘y’ are the features and labels of our training data, respectively, which are passed on to step_fit() to update the weights.
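A minimal sketch of animate(), following the same assumed names as the sketches above:

def animate(self, i, X, y):
    self.step_fit(X, y)                    # one weight-update pass for this frame
    line_x, line_y = self.plot_data(X)     # recompute the boundary with the new weights
    self.line.set_data(line_x, line_y)
    return self.line,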

Now that we have all the components we can define one last function that integrates them all:

The weights are randomly initialized here; we can’t initialize them in our new step_fit() function because that would reset them to random values on every call of animate(). We also initialize our figure: setting the required labels, defining the axes, plotting the training data, and defining our separation line.

We use the ‘fargs’ parameter to tell our object to pass these variables, namely our training data, to the animate() function.

Note that when we pass a function as a parameter, as with animate and init_plot, we don’t include the parentheses, because we’re passing a reference to the function rather than actually calling it. Also, we pass the n_iter value as the number of frames simply because each frame is an iteration.

Finally, we use anim.save() to start the whole process and save the resulting animation as a gif file in the same directory as the Python file.
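Below is a minimal sketch of how such an animatedFit() method can be put together. The axis limits, labels, and output filename are illustrative, and it assumes matplotlib.pyplot is imported as plt and numpy as np, along with the FuncAnimation import shown above.

def animatedFit(self, X, y):
    # random weight initialization (part 1 convention: bias first)
    rgen = np.random.RandomState(1)
    self.weights = rgen.normal(loc=0.0, scale=0.01, size=1 + X.shape[1])

    # figure setup: labels, axes, training data, and the line to animate
    fig, ax = plt.subplots()
    ax.set_xlim(np.amin(X[:, 0]) - 1, np.amax(X[:, 0]) + 1)
    ax.set_ylim(np.amin(X[:, 1]) - 1, np.amax(X[:, 1]) + 1)
    ax.set_xlabel('petal length')
    ax.set_ylabel('petal width')
    ax.scatter(X[:, 0], X[:, 1], c=y)
    self.line, = ax.plot([], [], 'k-')

    anim = FuncAnimation(fig, self.animate, init_func=self.init_plot,
                         fargs=(X, y), frames=self.n_iter, interval=200,
                         blit=True)
    anim.save('perceptron_learning.gif', writer='pillow')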

Putting it all together

The model is now ready, so let’s test it out:
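As an illustrative run (the hyperparameter values are assumptions, chosen to reflect the lower learning rate and higher iteration count discussed below, and X and y are prepared exactly as in part 1):

perceptron = Perceptron(learning_rate=0.001, n_iter=50)
perceptron.animatedFit(X, y)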

Learning process animated

Note that we decreased the learning rate and increased the number of iterations. This is because, as we saw in the previous article, our model converges in fewer than 10 iterations on this simple dataset, so if we tried to animate only 10 frames we would barely get a look at what’s actually happening. Playing around with these values and observing the resulting animation can give you a better idea of their purpose.

Summary & Conclusions:

In this article we covered how some basic linear algebra can be used to derive a linear separation line for a 2-dimensional feature classification problem using the optimized weights. We then briefly went over how matplotlib animations work, allowing us to create interesting graphs and giving us a powerful method for visualizing our models.

Finally, we tweaked our previous perceptron model to get a visual representation of the learning process and further understand what’s happening at each iteration. Keep in mind that this model comes with limitations, as it only works on 2-dimensional, linearly separable classification problems, but the main goal of this series is to build a solid understanding of the ins and outs of a perceptron, while learning interesting things along the way, such as matplotlib’s animations in this story.

This being only my second article, I would really appreciate any feedback, input, or questions. My goal is to provide value for my readers while filling my own knowledge gaps as I work on these mini projects. I hope you found this insightful, and happy coding!

References

[1] S. Raschka, “Chapter 2: Training Simple Machine Learning Algorithms for Classification,” Python Machine Learning (pp. 63–83), Packt Publishing, 2015.

[2] T. Countz, “Calculate the Decision Boundary of a Single Perceptron — Visualizing Linear Separability,” Learning Machine Learning Journal #5, 2018. https://medium.com/@thomascountz/calculate-the-decision-boundary-of-a-single-perceptron-visualizing-linear-separability-c4d77099ef38

[3] P. Pandey, “Animations with Matplotlib,” Towards Data Science, 2019. https://towardsdatascience.com/animations-with-matplotlib-d96375c5442c
