Your Turn!
Now it’s your turn. Download this document for a recap of the blocks we covered in this lesson and some hands-on exercises for you to explore.
What You Have Learned in This Lesson
In this lesson we demonstrated how neural networks can serve as an alternative, complementary way to program.
Neural Networks as a Different Approach to Programming
Instead of writing functions ourselves, we built them with the help of a neural network that we configured. The neural network was trained with the generate _ for _ in _ block from the "Neural Networks" library.
For the data set in the block’s third input slot, we were able to generate a binary-classification predicate with the structure "is _ tag?"
In our case, we made the is _ forgery? block to predict whether a bank note sample is a forgery.
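To make the idea concrete, here is a minimal Python sketch of what such a binary-classification predicate does under the hood: a trained model produces a score between 0 and 1, and the predicate reports True when the score crosses 0.5. The weights and bias below are invented for illustration, not the values the lesson's network actually learned.

```python
import math

def sigmoid(x):
    """Squash any real number into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Invented weights for a single-neuron stand-in "model" (illustrative only).
WEIGHTS = [0.8, -1.2, 0.5, -0.3]
BIAS = 0.1

def is_forgery(sample):
    """Return True if the model scores the sample above 0.5."""
    score = sigmoid(sum(w * x for w, x in zip(WEIGHTS, sample)) + BIAS)
    return score > 0.5
```

The real block wraps a full multi-layer network rather than a single neuron, but the final step is the same: a numeric score turned into a yes/no answer.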

The generate _ block takes care of housekeeping details such as the number of epochs used to train the network. The epoch count can be set manually to a specific value or kept on "auto", meaning the network trains for at least 20 epochs and then checks after each epoch whether the results are still improving.
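A rough sketch of that "auto" policy in Python: train for at least 20 epochs, then keep going only while the validation error is still improving. Here train_one_epoch and validation_error are hypothetical stand-ins for the library's internals, not its real API.

```python
def train_auto(train_one_epoch, validation_error, min_epochs=20):
    """Run epochs until the minimum is reached AND results stop improving."""
    best = float("inf")
    epoch = 0
    while True:
        train_one_epoch()
        epoch += 1
        err = validation_error()
        if epoch >= min_epochs and err >= best:
            break                     # no longer improving: stop
        best = min(best, err)
    return epoch, best
```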
It also automatically partitions the data into a training set and a validation set. By default, the data is shuffled, the first 80% of the shuffled data set is used for training, and the remaining records are kept for validating the trained network.
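The shuffle-then-split step can be sketched in a few lines; the 80/20 ratio below matches the default described above.

```python
import random

def train_validation_split(records, train_fraction=0.8, seed=None):
    """Shuffle the records, keep the first fraction for training."""
    shuffled = records[:]                  # copy so the input stays intact
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]
```

Holding out data the model never saw during training is what lets the validation error honestly measure whether the network generalizes.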
Finally, you can select the number of hidden layers and the number of neurons in each of those hidden layers.
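To see how those settings shape the network, here is a small sketch in plain Python: for layer sizes [4, 8, 8, 1] (4 inputs, two hidden layers of 8 neurons, 1 output), each consecutive pair of sizes defines one weight matrix. The sizes, the random weights, and the tanh activation are all illustrative choices, not the library's actual defaults.

```python
import math
import random

def build_network(layer_sizes, seed=0):
    """Return one weight matrix (rows x cols) per pair of adjacent layers."""
    rnd = random.Random(seed)
    return [[[rnd.uniform(-1, 1) for _ in range(n)] for _ in range(m)]
            for m, n in zip(layer_sizes, layer_sizes[1:])]

def forward(layers, x):
    """Pass an input vector through every layer of the network."""
    for weights in layers:
        x = [math.tanh(sum(xi * row[j] for xi, row in zip(x, weights)))
             for j in range(len(weights[0]))]
    return x
```

More hidden layers and neurons mean more weights to learn, which can capture more complex patterns but also needs more data and training time.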
The resulting binary classifier is _ a forgery? is based on the neural network. If you right-click it and choose "edit" to open its block definition, you can see that it contains the weights of that network.

Additionally, we could create multi-class classifiers like the species of _ reporter, which lets us distinguish the three iris species I. setosa, I. versicolor, and I. virginica.
The result of a multi-class classification is an output vector with one value for each class. These values are converted into a probability for each class by passing them through the so-called "softmax" function. The class with the highest probability is the model’s final prediction.
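The softmax step described above can be sketched directly: the raw output values are exponentiated and normalised so they sum to 1, and the largest probability picks the predicted class. The class names are the iris species from the lesson; subtracting the maximum first is a standard trick for numerical stability.

```python
import math

CLASSES = ["I. setosa", "I. versicolor", "I. virginica"]

def softmax(values):
    """Turn a raw output vector into probabilities that sum to 1."""
    shifted = [v - max(values) for v in values]   # numerical stability
    exps = [math.exp(v) for v in shifted]
    total = sum(exps)
    return [e / total for e in exps]

def predict(output_vector):
    """Report the class whose softmax probability is highest."""
    probs = softmax(output_vector)
    return CLASSES[probs.index(max(probs))]
```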

Neural networks offer an alternative approach to traditional programming: instead of identifying patterns and explicitly formulating rules to classify those patterns, we can train a model on data to automatically find and learn those patterns.
In this case, the implementation of the algorithm isn’t the whole program: only when it is paired with the right training data, yielding well-working weights, can the algorithm learn to solve the problem.
However, keep in mind that you’re always working with probabilities: a model’s predictions can contain biases and are only as good as the training data it was given.