00 Abstract - PAI-yoonsung/lstm-paper GitHub Wiki
Abstract
Long Short-Term Memory Recurrent Neural Networks (LSTM-RNN) are among the most powerful dynamic classifiers publicly known. The network itself and its related learning algorithms are reasonably well documented, enough to get a general idea of how they work.
This paper sheds more light on how LSTM-RNNs evolved and why they work impressively well, focusing on the early, ground-breaking publications.
We significantly improved the documentation and fixed a number of errors and inconsistencies that accumulated in previous publications. To support understanding, we also revised and unified the notation used.