# Sequence Models (deeplearning.ai)

## Deep learning by Andrew Ng

Posted by CHENEY WANG on November 19, 2018

# Sequence Models

## Normal LSTM

### Four gates:

The most important parts of an LSTM are its four gates, which together handle forgetting old information from memory and adding new information to it.

#### Forget gate layer

It looks at $h_{t-1}$ and $x_t$, and outputs a number between 0 and 1 for each number in the cell state $C_{t-1}$. A 1 represents "completely keep this" while a 0 represents "completely get rid of this."
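A minimal NumPy sketch of the forget gate, assuming illustrative dimensions and randomly initialized weights (the names `W_f`, `b_f`, and the sizes are hypothetical, not from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes: hidden state of 4 units, input of 3 features.
hidden, inputs = 4, 3
rng = np.random.default_rng(0)
W_f = rng.standard_normal((hidden, hidden + inputs))  # forget-gate weights
b_f = np.zeros(hidden)                                # forget-gate bias

h_prev = rng.standard_normal(hidden)  # h_{t-1}
x_t = rng.standard_normal(inputs)     # x_t

# f_t = sigmoid(W_f · [h_{t-1}, x_t] + b_f); every entry lies in (0, 1),
# one "keep/forget" factor per entry of the cell state C_{t-1}.
f_t = sigmoid(W_f @ np.concatenate([h_prev, x_t]) + b_f)
```

Because the sigmoid squashes each entry into (0, 1), multiplying `f_t` elementwise into the old cell state smoothly interpolates between keeping and discarding each memory slot.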

#### Input gate layer

A sigmoid layer called the "input gate layer" decides which values we'll update. Next, a tanh layer creates a vector of new candidate values, $\tilde{C}_t$, that could be added to the state. In the next step, we'll combine these two to create an update to the state.
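The two layers and the resulting state update can be sketched as follows; as above, the weight names and dimensions are illustrative assumptions, not the course's notation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

hidden, inputs = 4, 3
rng = np.random.default_rng(1)
concat = np.concatenate([rng.standard_normal(hidden),   # h_{t-1}
                         rng.standard_normal(inputs)])  # x_t
C_prev = rng.standard_normal(hidden)                    # old cell state C_{t-1}
f_t = sigmoid(rng.standard_normal((hidden, hidden + inputs)) @ concat)  # forget gate

# Input gate layer: a sigmoid deciding which entries to update.
W_i = rng.standard_normal((hidden, hidden + inputs))
i_t = sigmoid(W_i @ concat)

# Candidate values from a tanh layer, each in (-1, 1).
W_c = rng.standard_normal((hidden, hidden + inputs))
C_tilde = np.tanh(W_c @ concat)

# Combine the two with the forget gate to update the state:
# C_t = f_t * C_{t-1} + i_t * C̃_t
C_t = f_t * C_prev + i_t * C_tilde
```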

#### Output gate Layer

First, we run a sigmoid layer which decides what parts of the cell state we’re going to output. Then, we put the cell state through tanh (to push the values to be between −1 and 1) and multiply it by the output of the sigmoid gate, so that we only output the parts we decided to.
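These two steps, as a sketch under the same illustrative assumptions (random weights, hypothetical sizes):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

hidden, inputs = 4, 3
rng = np.random.default_rng(2)
concat = np.concatenate([rng.standard_normal(hidden),   # h_{t-1}
                         rng.standard_normal(inputs)])  # x_t
C_t = rng.standard_normal(hidden)  # updated cell state from the previous step

# Output gate: a sigmoid deciding which parts of the cell state to expose.
W_o = rng.standard_normal((hidden, hidden + inputs))
o_t = sigmoid(W_o @ concat)

# h_t = o_t * tanh(C_t): tanh pushes the state into (-1, 1), and the
# gate masks it so only the chosen parts are output.
h_t = o_t * np.tanh(C_t)
```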

What I've described so far is the core concept of a normal LSTM, but not all LSTMs are the same as the above.