Having just finished the specialization, I want to share my thoughts on how I felt about the whole journey.
Before I start, I want to mention my experience and knowledge in deep learning prior to taking the specialization.
I had watched the lecture videos of CS231n, the Stanford computer vision and deep learning course. This was my entry into the field of deep learning. To be honest, I couldn't understand much of the content. Nevertheless, I continued just because of how cool everything was (and also because it was a Stanford course). After going through the lectures, I had a vague idea of what deep learning is. After consulting some friends, I decided to code up a neural network, and after that a CNN, using Keras. But before I could do that, the hellfire of installing TensorFlow awaited me!
Finally, everything was set up and I created a digit classifier on the MNIST data set: the "Hello World" of deep learning.
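For anyone curious, a minimal Keras sketch of such a classifier looks roughly like this (assuming the tf.keras API; the layer sizes are my own illustrative choices, not anything prescribed):

```python
import numpy as np
from tensorflow import keras

# A small fully connected classifier for 28x28 MNIST digit images.
model = keras.Sequential([
    keras.layers.Input(shape=(28, 28)),
    keras.layers.Flatten(),                        # 28x28 image -> 784-vector
    keras.layers.Dense(128, activation="relu"),    # one hidden layer
    keras.layers.Dense(10, activation="softmax"),  # one probability per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# With the real data you would then call something like:
# (x_train, y_train), _ = keras.datasets.mnist.load_data()
# model.fit(x_train / 255.0, y_train, epochs=5)
```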
Doing this made me realize two things:
- I was subconsciously treating the neural network as a black box.
- I was just randomly tuning the hyperparameters and hoping the results would improve.
This is not a good thing! After searching online and reading many articles, I came to a realization: I needed to cement my fundamentals PROPERLY!
This led me to the Deep Learning Specialization!
Phew! I hope I didn't bore you with my past! Now, let's move on to the part this blog is actually about!
First, an Overview
The specialization consists of 5 courses. Each course is designed to be completed in under a month.
Course 1: Neural Networks and Deep Learning
Course 2: Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
Course 3: Structuring Machine Learning Projects
Course 4: Convolutional Neural Networks
Course 5: Sequence Models
Course 1: Neural Networks and Deep Learning

This course introduces neural networks. First, logistic regression is explained with a "neural network mindset", and then we dive into neural networks themselves. You are first taught about "shallow" neural networks and then "deep"er ones. Many key concepts are taught along the way, including activation functions and gradient descent. In the programming assignments you get to code up a neural network from scratch using plain Python and NumPy. Of course, this means implementing backpropagation yourself as well. Fun, right?
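To give a taste of what those assignments feel like, here is a small sketch of my own (not the course's actual code; names like W1 and A1 just follow the course's notation, and the data is made up): a shallow network trained with backpropagation and gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((2, 200))            # 2 features, 200 examples
Y = (X[0:1] * X[1:2] > 0).astype(float)      # XOR-like toy labels

# small random initialization; weight shapes are (units, inputs)
W1, b1 = rng.standard_normal((4, 2)) * 0.1, np.zeros((4, 1))
W2, b2 = rng.standard_normal((1, 4)) * 0.1, np.zeros((1, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(2000):
    # forward pass
    A1 = np.tanh(W1 @ X + b1)
    A2 = sigmoid(W2 @ A1 + b2)
    losses.append(float(-(Y * np.log(A2) + (1 - Y) * np.log(1 - A2)).mean()))
    # backward pass (binary cross-entropy loss)
    m = X.shape[1]
    dZ2 = A2 - Y
    dW2, db2 = dZ2 @ A1.T / m, dZ2.mean(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)       # tanh'(z) = 1 - tanh(z)^2
    dW1, db1 = dZ1 @ X.T / m, dZ1.mean(axis=1, keepdims=True)
    # gradient descent step
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= 0.5 * g
```

Writing out the backward pass by hand like this is exactly what makes the dimensions and gradient flow stick.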
Course 2: Improving Deep Neural Networks

This course, as the title suggests, teaches how you can improve your neural networks. Key concepts like regularization, dropout, data pre-processing, different optimization algorithms and batch normalization are taught. The course also teaches how to debug your neural networks.
The best thing is, you get to practically implement all of these yourself in the programming assignments. You even get to use TensorFlow in one of the assignments.
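One of those concepts, "inverted" dropout, is simple enough to sketch in a few lines of NumPy (the keep probability of 0.8 here is just an example value):

```python
import numpy as np

def dropout_forward(A, keep_prob, rng):
    """Inverted dropout: zero each unit with probability 1 - keep_prob,
    then divide by keep_prob so the expected activation is unchanged."""
    mask = rng.random(A.shape) < keep_prob
    return A * mask / keep_prob, mask

rng = np.random.default_rng(0)
A = np.ones((4, 1000))                       # pretend activations
A_drop, mask = dropout_forward(A, keep_prob=0.8, rng=rng)
# roughly 80% of units survive, and the mean activation stays near 1.0
```

The scaling by keep_prob is the detail the course emphasizes: it means no extra correction is needed at test time.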
Course 3: Structuring Machine Learning Projects

This is a pretty small course, but it teaches really important concepts for when you are working on a practical problem in the real world. You learn how to set up your evaluation metric and how to compare the performance of your models. You also learn about different types of learning and when each of them should be preferred. Sadly, there aren't any programming assignments in this course, but the quizzes are really good and make you think a lot. They simulate working on a big real-life machine learning problem. Pretty cool in my opinion!
Course 4: Convolutional Neural Networks

This course is huge! There is a lot of knowledge packed into this one. It starts off with the workings of a CNN and the different layers involved in a typical CNN model. You also learn about different CNN architectures that work really well. From classification, the course moves on to object detection. In the last week you get to know how face verification and recognition work, and also about neural style transfer, which in my opinion is one of the coolest (or should I say, most artistic) things done in deep learning! In the programming assignments, you get to implement a CNN from scratch in plain Python and NumPy and implement neural style transfer, just to name a few.
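To give a flavor of the "from scratch" part: the core of a convolutional layer is just a sliding window of multiply-and-sum. A naive single-channel sketch (my own toy example, not the assignment's code):

```python
import numpy as np

def conv2d_valid(x, w):
    """Naive 'valid' 2-D convolution (technically cross-correlation,
    as in most deep learning libraries) of image x with kernel w."""
    H, W = x.shape
    kH, kW = w.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + kH, j:j + kW] * w).sum()
    return out

# a vertical-edge detector firing on the dark-to-bright transition
image = np.tile([0.0, 0.0, 1.0, 1.0], (4, 1))  # 4x4 image, left half dark
edges = conv2d_valid(image, np.array([[-1.0, 1.0]]))
```

The real assignments add channels, padding and strides on top, but this loop is the heart of it.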
Course 5: Sequence Models

This course introduces RNNs, LSTMs and GRUs. You learn about attention, word embeddings, natural language processing and even speech recognition. The programming assignments are super fun and interactive! You get to implement an RNN and an LSTM from scratch, make your own jazz music and even have a neural network write a sonnet just for you!
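For reference, a single step of the basic RNN cell from those assignments boils down to one line of math (names like Wax and Waa follow the course's notation; the shapes here are illustrative):

```python
import numpy as np

def rnn_cell_forward(xt, a_prev, Wax, Waa, ba):
    """One time step: a_t = tanh(Waa @ a_prev + Wax @ x_t + b_a)."""
    return np.tanh(Waa @ a_prev + Wax @ xt + ba)

rng = np.random.default_rng(0)
n_a, n_x, m = 5, 3, 10                 # hidden units, input size, batch size
a = np.zeros((n_a, m))                 # initial hidden state
Wax = rng.standard_normal((n_a, n_x))  # input-to-hidden weights
Waa = rng.standard_normal((n_a, n_a))  # hidden-to-hidden weights
ba = np.zeros((n_a, 1))

# unroll the cell over 4 time steps of made-up input
for t in range(4):
    a = rnn_cell_forward(rng.standard_normal((n_x, m)), a, Wax, Waa, ba)
```

The same hidden state is threaded through every step, which is exactly where the vanishing-gradient issues that motivate LSTMs and GRUs come from.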
What I liked

The theory was explained really well. I now have a really good understanding of how dimensions match and gradients flow in a neural network. The programming assignments gave me valuable hands-on experience working with different types of models. In my opinion, most of my learning happened through the programming assignments, and this was also one of the reasons I took the specialization.

The forums are really nice and welcoming. You're bound to get an answer there, either from one of the mentors or a fellow classmate. After finishing a week, I would go to that week's forum, read through some of the interesting posts and even try answering some myself. This also helped me clear any doubts I might have had. I even got invited to become a mentor for the first course in the specialization.

Another great point was that, instead of hour-long lecture videos, there were many short videos. This allowed me to take breaks and digest the material before moving on. I am the type of person who doesn't like stopping in the middle of a video, and understanding everything taught in a one-hour lecture is not always easy!

Also, the transition between the courses was really nice. Course 1 was very easy to grasp, and the courses progressively became tougher and more challenging. This forced me to come out of my comfort zone, and like the saying goes, "You can't learn anything if you don't get challenged"!

I would definitely recommend this specialization to anyone wanting to start deep learning, or just wanting a quick refresher on the basics.
What could be improved

Of course, the specialization isn't perfect. Some of the programming assignments have problems, especially in Course 5 (I guess this is because it was just released). Because of this, it becomes a pain trying to find out why you aren't passing an assignment even though your output matches the expected output!

Some more advanced topics could have been introduced. For example, R-CNN and its improvements could have been discussed in the object detection week instead of only the YOLO algorithm. Also, I find it sad that they didn't continue the "Heroes of Deep Learning" interviews. Watching them after completing each course gave me inspiration and hope.

Another thing: in the programming assignments, we should have been asked to implement more than what is currently demanded. Much of the code in the assignments is already written, and we are only asked to write the code for some of the key concepts. This helps you finish the assignments quickly, but reduces their learning potential.

It would also be nice if there were readable lecture notes for each week. They would make it much quicker to revise the concepts than watching the videos all over again. Finally, I found the speed of the videos to be very slow, but watching at 2x solved the issue for me.
Advice to anyone planning to or already taking the specialization
GO THROUGH EVERYTHING!

GET 100% SCORE!

Trust me, following the above two lines will help you. Let me elaborate:
Go through all of the content. Even if you believe you already know some things, go through them anyway. It will be a quick refresher, and you might just learn something new. Also, don't leave out the (OPTIONAL) material. Many of the advanced topics are given only there; there are optional videos and also optional parts in the assignments. Just go through them! It's not going to take too much time anyway. In fact, if any of the course creators are reading this, I request you to please remove the (OPTIONAL) labels.
My second piece of advice: try to get full marks in the assignments and quizzes. Don't just do the courses to pass them and get a certificate. Extract all you can from them. There isn't any limit to how many times you can re-take a quiz or re-submit an assignment, so do it until you get 100%.
I would like to express my gratitude to the creators of the specialization, Andrew Ng, Younes Bensouda Mourri and Kian Katanforoosh, for creating such a wonderful series.
Professor Andrew Ng, thank you so much again for explaining the content so clearly in the lecture videos and for giving insights into how deep learning is applied in the real world. I really liked when you showed how Baidu uses face recognition for employee entry!