Explain the Long Short-Term Memory (LSTM) algorithm in brief

Question

With a code snippet in R or Python


Answer (1)

  1. When we arrange our calendar for the day, we prioritize our appointments, right? If we need to make space for something important, we know which meeting can be cancelled to accommodate it.

    It turns out that a vanilla RNN does not do this. To add new information, it transforms the existing information completely by applying a function. The information is therefore modified as a whole: there is no distinction between 'important' and 'not so important' information.

    LSTMs, on the other hand, make small modifications to the information through multiplications and additions. In an LSTM, information flows through a mechanism known as the cell state, which lets the network selectively remember or forget things. The information at a particular cell state has three different dependencies, listed below (a small NumPy sketch of one step follows the list):

    The previous cell state (i.e. the information that was present in the memory after the previous time step)
    The previous hidden state (i.e. the output of the previous cell)
    The input at the current time step (i.e. the new information that is being fed in at that moment)
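
    To make these three dependencies concrete, below is a minimal NumPy sketch of a single LSTM step. The function name lstm_step and the per-gate weight matrices are illustrative, not taken from any particular library; the forget, input, and output gates that appear here are explained in the next section.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x_t, h_prev, c_prev, W, U, b):
        # Unpack per-gate parameters: forget, input, output, candidate.
        W_f, W_i, W_o, W_c = W
        U_f, U_i, U_o, U_c = U
        b_f, b_i, b_o, b_c = b

        f_t = sigmoid(W_f @ x_t + U_f @ h_prev + b_f)    # forget gate (reset)
        i_t = sigmoid(W_i @ x_t + U_i @ h_prev + b_i)    # input gate (write)
        o_t = sigmoid(W_o @ x_t + U_o @ h_prev + b_o)    # output gate (read)
        c_hat = np.tanh(W_c @ x_t + U_c @ h_prev + b_c)  # candidate content

        # A small multiplicative/additive edit rather than a full rewrite:
        c_t = f_t * c_prev + i_t * c_hat
        h_t = o_t * np.tanh(c_t)  # hidden state, i.e. the cell's output
        return h_t, c_t

    Note that the new cell state c_t depends on exactly the three quantities above: the previous cell state c_prev, the previous hidden state h_prev, and the current input x_t.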

    LSTM introduces long-term memory into recurrent neural networks. It mitigates the vanishing gradient problem, in which the network stops learning because the weight updates become vanishingly small as gradients are propagated back through many time steps. It does this by using a series of 'gates', contained in memory blocks that are connected through layers.

    There are three types of gates within a unit:

    Input Gate: Scales the input to the cell (write)
    Output Gate: Scales the output from the cell (read)
    Forget Gate: Scales the old cell value (reset)
    Each gate acts like a switch controlling these read/write/reset operations, which is how the long-term memory function is incorporated into the model. A minimal Python example is shown below.
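
    Since the question asks for a snippet in R or Python, here is a minimal Python sketch using Keras. It assumes TensorFlow 2.x is installed; the toy data, the 32-unit layer size, and the training settings are illustrative, not prescriptive.

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    # Toy memory task: predict the sum of each input sequence.
    # 100 sequences, 10 time steps, 1 feature per step.
    X = np.random.rand(100, 10, 1)
    y = X.sum(axis=1)

    model = Sequential([
        LSTM(32, input_shape=(10, 1)),  # 32 LSTM units read the sequence
        Dense(1),                       # single regression output
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=10, batch_size=16, verbose=0)

    print(model.predict(X[:1]))  # prediction for the first sequence

    The LSTM layer maintains the cell state and all three gates internally; only the layer size and the input shape need to be specified.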
