Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) are two layer types commonly used to build recurrent neural networks in Keras. This video introduces these two layer types as a foundation for Natural Language Processing (NLP) and time series prediction.
Code for This Video:
Course Homepage: https://sites.wustl.edu/jeffheaton/t81-558/
Support Me on Patreon: https://www.patreon.com/jeffheaton
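As a minimal sketch of the idea in the video: in Keras, LSTM and GRU layers are drop-in replacements for one another, since both accept a sequence input of shape (timesteps, features). The layer sizes and input shape below are illustrative assumptions, not taken from the video.

```python
import numpy as np
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import LSTM, GRU, Dense

# Illustrative shapes: 10 timesteps, 3 features per timestep (assumptions).
TIMESTEPS, FEATURES = 10, 3

def build_model(recurrent_layer):
    # `recurrent_layer` is either LSTM or GRU; both take the same arguments here.
    return Sequential([
        Input(shape=(TIMESTEPS, FEATURES)),
        recurrent_layer(32),   # 32 recurrent units
        Dense(1),              # single regression output
    ])

lstm_model = build_model(LSTM)
gru_model = build_model(GRU)

# Both models map a batch of sequences to one value per sequence.
x = np.random.rand(4, TIMESTEPS, FEATURES).astype("float32")
print(lstm_model.predict(x, verbose=0).shape)  # (4, 1)
```

GRUs have fewer parameters than LSTMs (no separate cell state), so they often train faster; which one performs better is problem-dependent.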
In this lesson, you will learn multi-step time series prediction using an RNN with LSTM layers, applied to household power consumption. We will predict the coming week's power consumption from the power consumption of past weeks.
Download the working file: https://github.com/laxmimerit/Multi-Step-Time-Series-Prediction-using-RNN-LSTM-for-household-power-consumption
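Multi-step prediction first requires framing the raw daily series as supervised learning pairs: each input is a window of past weeks and each target is the following week. The sketch below shows one common way to do this windowing; the four-week history length and the function name are my assumptions, not taken from the notebook.

```python
import numpy as np

DAYS_PER_WEEK = 7
N_IN_WEEKS = 4  # history length in weeks (illustrative assumption)

def make_windows(series, n_in_weeks=N_IN_WEEKS):
    """Split a 1-D daily series into (X, y) pairs:
    X = the past n_in_weeks of daily values, y = the next 7 daily values.
    Windows advance one week at a time so each target week is predicted once."""
    n_in = n_in_weeks * DAYS_PER_WEEK
    X, y = [], []
    for start in range(0, len(series) - n_in - DAYS_PER_WEEK + 1, DAYS_PER_WEEK):
        X.append(series[start:start + n_in])
        y.append(series[start + n_in:start + n_in + DAYS_PER_WEEK])
    return np.array(X), np.array(y)

daily = np.arange(70, dtype=float)  # 10 weeks of fake daily readings
X, y = make_windows(daily)
print(X.shape, y.shape)  # (6, 28) (6, 7)
```

After this step, X is reshaped to (samples, timesteps, 1) and fed to an LSTM whose output layer has 7 units, one per predicted day.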
Learn Complete Data Science with these 5 video series.
1. Python for Beginners
2. Machine Learning for Beginners
3. Feature Selection in Machine Learning
4. Deep Learning with TensorFlow 2.0 and Keras
5. Natural Language Processing (NLP) Tutorials
The working code is linked in the description of each video; you can download the Jupyter notebooks from GitHub.
Please Like and Subscribe to show your support.