GRU
Gated Recurrent Unit (Cho et al., 2014). A streamlined alternative to the LSTM with fewer parameters: it uses two gates (reset and update) in place of the LSTM's three (input, forget, output), and merges the cell state and hidden state into a single vector. Often matches LSTM accuracy at lower computational cost.
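A minimal sketch of one GRU step in NumPy, to make the two-gate structure concrete. Names (`gru_cell`, the `params` tuple layout) are illustrative, not from any library; the final interpolation here follows the `h = (1 - z) * h_prev + z * h_tilde` convention, though some formulations swap the roles of `z` and `1 - z`.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU step: two gates, one merged hidden state."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)              # update gate: how much to rewrite
    r = sigmoid(Wr @ x + Ur @ h_prev + br)              # reset gate: how much past to use
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)  # candidate state
    return (1 - z) * h_prev + z * h_tilde               # blend old state with candidate

# Tiny usage example with random weights (hypothetical sizes).
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
params = tuple(
    p
    for _ in range(3)  # one (W, U, b) triple each for z, r, h_tilde
    for p in (0.1 * rng.normal(size=(n_hid, n_in)),
              0.1 * rng.normal(size=(n_hid, n_hid)),
              np.zeros(n_hid))
)
h = np.zeros(n_hid)
for t in range(5):
    h = gru_cell(rng.normal(size=n_in), h, params)
```

Because `z` lies in (0, 1) and the candidate is a tanh output, each step is a convex combination that keeps the hidden state bounded.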
Related
- LSTM (more complex alternative)
- Vanilla RNN (simpler but limited)