Saturday, May 23, 2020

Deep Learning With Python, by Jason Brownlee


  • In this post, I share my feedback on another excellent book, "Deep Learning With Python" by Jason Brownlee. I would recommend this book if you want to make progress on deep learning through real examples rather than complex math.



  • You will need a minimum of 15 days to read the book and run through the different examples provided with it.
  • This book helped me progress in my understanding of important concepts such as the Multilayer Perceptron, the use of the Keras library, Convolutional Neural Networks, and Recurrent Neural Networks.
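To make the Multilayer Perceptron concrete, here is a minimal sketch of a single forward pass written in plain NumPy (the layer sizes and random weights are my own illustrative choices, not an example from the book):

```python
import numpy as np

def relu(x):
    # rectified linear activation for the hidden layer
    return np.maximum(0.0, x)

def sigmoid(x):
    # squashes the output into (0, 1), as for binary classification
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# 4 input features -> 8 hidden units -> 1 output unit
W1 = rng.normal(size=(4, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)

x = rng.normal(size=(1, 4))          # one input sample
hidden = relu(x @ W1 + b1)           # hidden layer activations
output = sigmoid(hidden @ W2 + b2)   # probability-like prediction
```

In Keras, the same two-layer structure is what a `Sequential` model with two `Dense` layers builds, with training handled for you.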

    • I particularly appreciated the chapter on predicting sentiments. It is about word representations, which are part of Natural Language Processing (NLP). Jason references an interesting paper about the model used: "Learning Word Vectors for Sentiment Analysis".
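The core idea behind word representations is that each word maps to a row of a learned embedding matrix. A minimal sketch in NumPy (the tiny vocabulary, the 3-dimensional embeddings, and the random values are hypothetical; in a real model like Keras's `Embedding` layer the values are learned during training):

```python
import numpy as np

vocab = ["bad", "good", "movie", "great"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# one embedding row per word; random here, learned in practice
rng = np.random.default_rng(1)
embeddings = rng.normal(size=(len(vocab), 3))

def embed(word):
    # an embedding lookup is just selecting a row of the matrix
    return embeddings[word_to_index[word]]

vector = embed("good")
```

Words with similar sentiment end up with nearby vectors after training, which is what makes these representations useful for sentiment prediction.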
    • The further I progressed through the book, the more I needed to run the code on an external computer, as some programs take more than an hour. For example, the classification of the CIFAR-10 images ran in one hour on my iMac, whereas it took only six minutes on Google Colab using a GPU. The Colab platform is easy to access, and you can use your own data files stored in Google Drive. Here is the link explaining how to upload your own files.
    • This kind of experience helps in understanding the notion of tensors: a generalization of matrices, represented as n-dimensional arrays. A vector is a one-dimensional or first-order tensor, and a matrix is a two-dimensional or second-order tensor. Python, together with the NumPy and pandas libraries, is perfect for working with matrices.
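The vector/matrix/tensor hierarchy is easy to verify in NumPy, where the order of a tensor is the array's `ndim`:

```python
import numpy as np

vector = np.array([1, 2, 3])                # first-order tensor
matrix = np.array([[1, 2], [3, 4]])         # second-order tensor
tensor = np.arange(24).reshape(2, 3, 4)     # third-order tensor

print(vector.ndim, matrix.ndim, tensor.ndim)  # 1 2 3
```

A batch of color images, for example, is a fourth-order tensor: (samples, height, width, channels).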
    • I learned the different techniques for using Keras, the library that wraps TensorFlow. The KerasClassifier wrapper class takes a function that creates and returns your neural network model.
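The key point of that design is that the wrapper receives a build function rather than a model instance, so it can build a fresh model whenever it needs one (for example, once per cross-validation fold). A dependency-free stand-in illustrating the pattern (this `ModelWrapper` is my own toy class, not the real Keras API, and `create_model` returns a plain dict instead of a compiled Keras model):

```python
class ModelWrapper:
    """Toy stand-in for a scikit-learn-style wrapper such as KerasClassifier."""

    def __init__(self, build_fn, **build_params):
        self.build_fn = build_fn          # function that creates the model
        self.build_params = build_params  # forwarded to build_fn
        self.model = None

    def fit(self, X, y):
        # a fresh model is built on every fit, e.g. once per CV fold
        self.model = self.build_fn(**self.build_params)
        return self

def create_model(hidden_units=8):
    # in the book this would build and compile a Keras Sequential model;
    # a dict keeps this sketch dependency-free
    return {"hidden_units": hidden_units}

wrapper = ModelWrapper(build_fn=create_model, hidden_units=16)
wrapper.fit(X=None, y=None)
```

With the real wrapper, this is what lets you plug a Keras model into scikit-learn utilities such as `cross_val_score` and `GridSearchCV`.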
    • I was also made aware of how to reduce overfitting and tune hyperparameters such as batch size, number of epochs, learning rate decay, and momentum, and how to choose between a time-based and a drop-based learning rate schedule. All these notions are clearly explained and demonstrated with code examples.
    • While reading the book, I also subscribed to Jason's email list, as there are always related subjects that helped me understand topics such as discretization, time series prediction, whitening, Principal Component Analysis (PCA), and Zero-phase Component Analysis (ZCA).
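PCA whitening, for instance, can be sketched in a few lines of NumPy (the synthetic 2-D data below is my own illustration): the data is rotated onto its principal components and each component is rescaled to unit variance, so the whitened covariance is the identity.

```python
import numpy as np

rng = np.random.default_rng(42)
# 500 correlated 2-D samples built from a random linear mix
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 0.0], [1.5, 0.5]])
X = X - X.mean(axis=0)  # center the data first

# eigendecomposition of the covariance matrix
cov = X.T @ X / len(X)
eigvals, eigvecs = np.linalg.eigh(cov)

# PCA whitening: rotate onto principal axes, then rescale to unit variance
X_white = X @ eigvecs / np.sqrt(eigvals)
```

ZCA whitening adds one more step, rotating back with `X_white @ eigvecs.T`, which keeps the whitened data closer to the original orientation, a property often preferred for images.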
    • I am using the Anaconda distribution, which comes with Spyder, a Python editor. This environment is perfect for me as it is stable and provides up-to-date libraries. Setting up Anaconda is explained in the annex of the book.

    • I will certainly explore the Google Seedbank for other examples of deep learning.
