Data Science: Deep Learning and Neural Networks in Python
What you’ll learn
- Learn how Deep Learning REALLY works (not just some diagrams and magical black box code)
- Learn how a neural network is built from basic building blocks (the neuron)
- Code a neural network from scratch in Python and Numpy
- Code a neural network using Google’s TensorFlow
- Describe different types of neural networks and the types of problems they are used for
- Derive the backpropagation rule from first principles
- Create a neural network with an output that has K > 2 classes using softmax
- Describe the various terms related to neural networks, such as “activation”, “backpropagation”, and “feedforward”
- Install TensorFlow
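To give a taste of the softmax outcome listed above, here is a minimal sketch of the function in Numpy. This is my own illustration of the standard definition, not code from the course:

```python
import numpy as np

def softmax(a):
    """Convert raw scores (logits) into probabilities over K classes.

    Subtracting the row-wise max first is a standard numerical-stability
    trick; it does not change the result.
    """
    e = np.exp(a - a.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Scores for 2 samples over K = 3 classes.
logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 0.5, 0.5]])
probs = softmax(logits)
print(probs.round(3))
# Each row sums to 1; equal scores give equal probabilities (1/3 each).
```

Note that for K = 2 this reduces to the sigmoid used in binary logistic regression, which is why the course treats softmax as the natural multiclass extension.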
This course will get you started in building your FIRST artificial neural network using deep learning techniques. Following my previous course on logistic regression, we take this basic building block, and build full-on non-linear neural networks right out of the gate using Python and Numpy. All the materials for this course are FREE.
We extend the previous binary classification model to multiple classes using the softmax function, and we derive the very important training method called “backpropagation” using first principles. I show you how to code backpropagation in Numpy, first “the slow way”, and then “the fast way” using Numpy features.
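To illustrate what “the slow way” versus “the fast way” means in practice, here is a hedged Numpy sketch of the output-layer gradient of the softmax cross-entropy cost, which the standard derivation gives as X^T (P - T). This is my own toy example, not the course’s actual code:

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, K = 5, 3, 4          # samples, input features, classes

X = rng.normal(size=(N, D))            # inputs
T = np.eye(K)[rng.integers(0, K, N)]   # one-hot targets
W = rng.normal(size=(D, K))            # output-layer weights

def softmax(a):
    e = np.exp(a - a.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

P = softmax(X @ W)   # predicted class probabilities

# "Slow way": the gradient written out element by element,
# exactly as it falls out of the derivation on paper.
dW_slow = np.zeros((D, K))
for n in range(N):
    for d in range(D):
        for k in range(K):
            dW_slow[d, k] += X[n, d] * (P[n, k] - T[n, k])

# "Fast way": the same triple sum expressed as one matrix product.
dW_fast = X.T @ (P - T)

print(np.allclose(dW_slow, dW_fast))  # True
```

The two versions compute the same gradient; the vectorized form simply pushes the loops down into optimized Numpy routines, which is the point of coding it “the slow way” first and then “the fast way”.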
Next, we implement a neural network using Google’s new TensorFlow library.
You should take this course if you are interested in starting your journey toward becoming a master at deep learning, or if you are interested in machine learning and data science in general. We go beyond basic models like logistic regression and linear regression and I show you something that automatically learns features.
This course provides you with many practical examples so that you can really see how deep learning can be used on anything. Throughout the course, we’ll do a course project, which will show you how to predict user actions on a website given user data like whether or not that user is on a mobile device, the number of products they viewed, how long they stayed on your site, whether or not they are a returning visitor, and what time of day they visited.
Another project at the end of the course shows you how you can use deep learning for facial expression recognition. Imagine being able to predict someone’s emotions just based on a picture!
After getting your feet wet with the fundamentals, I provide a brief overview of some of the newest developments in neural networks – slightly modified architectures and what they are used for.
NOTE:
If you already know about softmax and backpropagation, and you want to skip over the theory and speed things up using more advanced techniques along with GPU-optimization, check out my follow-up course on this topic, Data Science: Practical Deep Learning Concepts in Theano and TensorFlow.
I have other courses that cover more advanced topics, such as Convolutional Neural Networks, Restricted Boltzmann Machines, Autoencoders, and more! But you want to be very comfortable with the material in this course before moving on to more advanced subjects.
This course focuses on “how to build and understand”, not just “how to use”. Anyone can learn to use an API in 15 minutes after reading some documentation. It’s not about “remembering facts”, it’s about “seeing for yourself” via experimentation. It will teach you how to visualize what’s happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
“If you can’t implement it, you don’t understand it”
- Or as the great physicist Richard Feynman said: “What I cannot create, I do not understand”.
- My courses are the ONLY courses where you will learn how to implement machine learning algorithms from scratch
- Other courses will teach you how to plug your data into a library, but do you really need help with 3 lines of code?
- After doing the same thing with 10 datasets, you realize you didn’t learn 10 things. You learned 1 thing, and just repeated the same 3 lines of code 10 times…
Suggested Prerequisites:
- calculus (taking derivatives)
- matrix arithmetic
- probability
- Python coding: if/else, loops, lists, dicts, sets
- Numpy coding: matrix and vector operations, loading a CSV file
- Be familiar with basic linear models such as linear regression and logistic regression
WHAT ORDER SHOULD I TAKE YOUR COURSES IN?:
- Check out the lecture “Machine Learning and AI Prerequisite Roadmap” (available in the FAQ of any of my courses, including the free Numpy course)
Who this course is for:
- Students interested in machine learning – you’ll get all the tidbits you need to do well in a neural networks course
- Professionals who want to use neural networks in their machine learning and data science pipeline. Be able to apply more powerful models, and know their drawbacks.
12 reviews for Data Science: Deep Learning and Neural Networks in Python
Frederick Zhang –
This is going to be the longest Udemy course review I’ve written thus far. The executive summary: I am very glad I took this course.
The course is the third in the Deep Learning track after the Linear and Logistic Regression courses, so you should either have taken them or be familiar with the material before attempting this one. The core topic of this course is backpropagation, which is how gradient descent gets applied to a neural network’s weights. However, you should not underestimate the math involved; it gets very complicated. If you are familiar with undergrad calculus, particularly taking derivatives with the Chain Rule, you won’t be learning too much new math. That does not mean the math you’ll be doing is simple or easy. It is neither! You will be deriving the gradient updates for the Sigmoid and Softmax activation functions, and the instructor insists you do them by hand — calculations that can take days to get right. It’s a whole different ball game from deriving the linear regression equations.
A matter of uncertainty I’ve had for a while that this course cleared up for me is this: Sigmoid is used for binary classification, whereas Softmax is used for multiclass classification (K = 3+).
If you have a job interview in a month, I’d recommend you review at least this course and maybe even Logistic Regression, because unless you have an eidetic memory or manually derive the Softmax gradient on a weekly basis, there is no way you will remember how to do any of the math. Personally, though, I cannot imagine any real-world scenario in which you would actually need to, unless you’re implementing your own ANN from scratch. What matters most is understanding what it means and how it works. That is what I learned from this course, and I’m very grateful that the instructor took the time to flesh out the math for us.
With that said, if there is one component I feel is missing from the course, it’s a walkthrough of the Softmax derivation with a sample dataset. The instructor was kind enough to supply some supplemental materials in the appendix lectures, which are indeed very helpful, but I was hoping for an actual walkthrough along the lines of the Feedforward in Slow Motion lectures, in which we applied the Sigmoid derivation to a small dataset. Given that backprop is the crux of the course and the foundation of more complicated neural networks, such as CNNs and RNNs, I feel it is super important to get the concept really drilled in with actual numbers and not just k primes and y primes. Even so, I’ll probably need to review this course if I ever plan on interviewing for any Deep Learning position.
Again, to sum up, this is a very good course — perhaps the most math-intensive of any AI/ML course I’ve ever taken. If you’re serious about learning how to build ANNs from the ground up, this course is for you. You’re probably going to feel very stupid, but you’ll be glad you took the time and stuck with it.
Robert Ledang –
Course content is alright.
Unfortunately, a lot of time is spent saying what the course is not about, over and over again. It gets very repetitive after a while when it’s the same thing in every video.
Anthony Withrow –
I like the detailed explanations. I appreciated deriving the equations for backpropagation; I feel like I have a clear understanding of what is going on. The pace was comfortable.
Shubham Mali –
Good, but the programming part is a little fast.
Ilija Simonovic –
This course is amazing! Backpropagation is derived from scratch, and every detail is clearly explained. The math is translated into Numpy code in a very straightforward way. Multiclass classification, binary classification, and regression are all thoroughly analyzed in terms of theory, math, and code. I strongly recommend this course to anyone who wants to understand how neural networks actually work and how they learn from data.
Rajesh Madan –
It was great! I loved the advanced content. Now I have to build some confidence to put backpropagation on my resume.
Alex Maestas –
Detailed teaching of backpropagation. I found it easy and intuitive to follow. Good work.
Tehan Seneviratne –
I am a university student and took this course to better understand the content; however, the material is not that helpful, and he goes over core concepts way too fast, building no intuition behind them. No effort to educate. I took the course because of the good reviews on it, so I guess this is subjective; it might be useful to take the 30-day money-back guarantee into account.
Dusan Perovic –
This course is a perfect match for professionals who want to enter the world of deep learning from an explanatory angle. The best thing is the instructor’s approach to the content of the course and his precise way of explaining each topic, including every step in both the math and the programming. This course will definitely provide some initial knowledge about how neural networks work and what they are used for. The only thing is that this basic knowledge should be deepened with further research and practice by the professionals who take the course. Finally, special gratitude to the instructor for forcing me to review my calculus from university. That never hurts; on the contrary, it is only useful.
Goh Yong Yu Jeffrey –
This trainer is pretty blunt about his expectations for this course, which is pretty satisfying to me. No 1–2 hours wasted on teaching how to set up Python or a development environment in general. Halfway through the course, the trainer is able to deep-dive into deep learning and neural networks, as per his claims. If anything is negative, it would be that his GitHub repo consists of multiple courses all in one repository, which is messy to me.
Henrik Bachmann –
Exquisite breakdown and buildup of the backpropagation algorithm. Lazy is an amazing teacher and doesn’t hold back. He is very knowledgeable, and explains everything in a step-by-step process that anyone can follow. If you want to understand deep learning on a deep level, this is a good place to start.
Jim Hargreaves –
This course teaches deep learning fundamentals well, but most importantly it teaches you how to THINK like a data scientist!
Definitely do it if you want to learn how deep learning works at its core.