Technical Articles
Conference report
- 2017
  - Highlights of EMNLP 2017: Exciting Datasets, Return of the Clusters, and More!
    http://blog.aylien.com/highlights-emnlp-2017-exciting-datasets-return-clusters/
  - Four deep learning trends from ACL 2017
    http://www.abigailsee.com/2017/08/30/four-deep-learning-trends-from-acl-2017-part-1.html
Neural networks
- NN basics and tips
  - Notes on Backpropagation (in Japanese)
    http://qiita.com/Ugo-Nama/items/04814a13c9ea84978a4c
  - Optimizer: On Gradient Methods in Deep Learning (in Japanese)
    http://qiita.com/tokkuman/items/1944c00415d129ca0ee9
  - Deep Learning: For Those Unsure About Hyperparameter Settings (in Japanese)
    http://s0sem0y.hatenablog.com/entry/2016/11/13/035443
  - CS231n: Neural Networks Part 3: Learning and Evaluation
    http://cs231n.github.io/neural-networks-3/
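
To make the optimizer articles above concrete, here is a minimal NumPy sketch of SGD with momentum, one of the gradient methods they cover. The helper name `sgd_momentum` and the toy quadratic objective are illustrative, not taken from any of the linked posts.

```python
import numpy as np

def sgd_momentum(grad_fn, w, lr=0.01, mu=0.9, steps=200):
    """SGD with momentum: v <- mu * v - lr * grad(w); w <- w + v."""
    v = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)       # gradient of the loss at the current weights
        v = mu * v - lr * g  # velocity accumulates a decaying sum of gradients
        w = w + v            # parameter update
    return w

# Toy example: minimize f(w) = ||w||^2, whose gradient is 2w.
print(sgd_momentum(lambda w: 2 * w, np.array([3.0, -2.0])))  # approaches [0, 0]
```
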
- NN for NLP
  - Deep Learning for NLP Best Practices
    http://ruder.io/deep-learning-nlp-best-practices/
- CNN
  - Understanding Convolutional Neural Networks for NLP (in Japanese)
    http://tkengo.github.io/blog/2016/03/11/understanding-convolutional-neural-networks-for-nlp/
  - Convolutional Methods for Text
    https://medium.com/@TalPerry/convolutional-methods-for-text-d5260fd5675f
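
As a companion to the CNN-for-text articles above, a minimal NumPy sketch of the core operation they describe: sliding a filter over token embeddings and max-pooling over time. Bias and nonlinearity are omitted, and `conv_max_pool` is a hypothetical helper, not code from the posts.

```python
import numpy as np

def conv_max_pool(E, F):
    """Slide filter F (k, d) over token embeddings E (T, d) and
    max-pool over time, yielding one scalar feature per filter."""
    k = F.shape[0]
    feats = [np.sum(E[t:t + k] * F) for t in range(E.shape[0] - k + 1)]
    return max(feats)

rng = np.random.default_rng(0)
E = rng.normal(size=(10, 8))   # a 10-token sentence with 8-dim embeddings
F = rng.normal(size=(3, 8))    # one trigram filter
print(conv_max_pool(E, F))     # real models use many filters of several widths
```
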
- RNN
  - A Beginner’s Guide to Recurrent Networks and LSTMs
    https://deeplearning4j.org/lstm.html
  - How to Prepare Sequence Prediction for Truncated Backpropagation Through Time in Keras
    http://machinelearningmastery.com/truncated-backpropagation-through-time-in-keras/
  - RNN Training Tips and Tricks
    https://medium.com/towards-data-science/rnn-training-tips-and-tricks-2bf687e67527
  - Tips for Training Recurrent Neural Networks
    http://danijar.com/tips-for-training-recurrent-neural-networks/
  - Non-Zero Initial States for Recurrent Neural Networks
    https://r2rt.com/non-zero-initial-states-for-recurrent-neural-networks.html
  - Attention and Augmented Recurrent Neural Networks
    https://distill.pub/2016/augmented-rnns/
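
The truncated-BPTT article above is about limiting how many time steps gradients flow through. Here is a minimal, framework-free sketch of the data preparation it discusses, cutting one long sequence into fixed-length windows; `truncate_sequence` is an illustrative helper, not the article's code.

```python
import numpy as np

def truncate_sequence(x, y, window):
    """Cut a long sequence into non-overlapping windows so that
    backpropagation through time spans at most `window` steps."""
    xs, ys = [], []
    for start in range(0, len(x) - window + 1, window):
        xs.append(x[start:start + window])
        ys.append(y[start + window - 1])  # target for the last step of each window
    return np.stack(xs), np.stack(ys)

x = np.arange(12, dtype=np.float32)  # toy sequence of length 12
y = x + 1                            # next-value targets
X, Y = truncate_sequence(x, y, window=4)
print(X.shape, Y.shape)              # (3, 4) (3,)
```
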
- LSTM
  - Overview of LSTM Networks (in Japanese)
    http://qiita.com/KojiOhki/items/89cd7b69a8a6239d67ca
  - Understanding LSTM, with Recent Trends (in Japanese)
    http://qiita.com/t_Signull/items/21b82be280b46f467d1b
  - A Beginner’s Guide to Recurrent Networks and LSTMs (in Japanese)
    https://deeplearning4j.org/ja/lstm
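
To complement the LSTM overviews above, a minimal NumPy sketch of a single LSTM step using the standard gate equations; stacking the four gates in one weight matrix is one common convention, and all names here are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step; W stacks the input, forget, cell, and output gates."""
    H = h.shape[0]
    z = W @ np.concatenate([x, h]) + b  # (4H,) pre-activations
    i = sigmoid(z[0:H])                 # input gate
    f = sigmoid(z[H:2 * H])             # forget gate
    g = np.tanh(z[2 * H:3 * H])         # candidate cell state
    o = sigmoid(z[3 * H:4 * H])         # output gate
    c = f * c + i * g                   # new cell state
    h = o * np.tanh(c)                  # new hidden state
    return h, c

D, H = 3, 4
rng = np.random.default_rng(0)
W, b = rng.normal(size=(4 * H, D + H)), np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
h, c = lstm_step(rng.normal(size=D), h, c, W, b)
print(h.shape, c.shape)                 # (4,) (4,)
```
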
- GAN
  - Fantastic GANs and where to find them (Guim Perarnau)
    http://guimperarnau.com/blog/2017/03/Fantastic-GANs-and-where-to-find-them
- Other models
  - Paper Explained: Memory Networks (MemNN) (in Japanese)
    http://deeplearning.hatenablog.com/entry/memory_networks
  - How to Train a GAN? Tips and tricks to make GANs work
    https://github.com/soumith/ganhacks
- Visualization
  - Effective Use of t-SNE for Visualizing High-Dimensional Data (in Japanese)
    https://deepage.net/machine_learning/2017/03/08/tsne.html
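
The t-SNE article above is largely about choosing hyperparameters sensibly. Here is a minimal sketch assuming scikit-learn is available; perplexity, roughly the effective number of neighbors, is the knob such guides spend the most time on.

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))  # 200 points in 50 dimensions

# Typical perplexity values are 5-50; always try several, since
# t-SNE plots can change qualitatively with this setting.
emb = TSNE(n_components=2, perplexity=30, init="pca",
           random_state=0).fit_transform(X)
print(emb.shape)                # (200, 2), ready to scatter-plot
```
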
NLP
- Fundamental NLP Tasks
  - Natural Language Processing: Introduction to Syntactic Parsing
    http://disi.unitn.it/moschitti/Teaching-slides/NLP-IR/NLP-Parsing.pdf
  - Named Entity Recognition and the Road to Deep Learning
    http://nlp.town/blog/ner-and-the-road-to-deep-learning/
- Word Embedding
  - Word embeddings in 2017: Trends and future directions
    http://ruder.io/word-embeddings-2017/
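
As a minimal companion to the word-embedding survey above, a gensim skip-gram sketch (assuming gensim >= 4, where the dimensionality parameter is `vector_size`); the toy corpus is obviously far too small for meaningful vectors.

```python
from gensim.models import Word2Vec

sentences = [["natural", "language", "processing"],
             ["language", "models", "learn", "embeddings"],
             ["word", "embeddings", "capture", "similarity"]]

# sg=1 selects skip-gram; window controls the context size.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)
print(model.wv["language"].shape)             # (50,)
print(model.wv.most_similar("language", topn=2))
```
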
NMT
- [Part 1] Deep Learning for Natural Language Processing: Theory of Neural Machine Translation (in Japanese)
  http://deeplearning.hatenablog.com/entry/neural_machine_translation_theory
Multi-task learning
- An Overview of Multi-Task Learning in Deep Neural Networks
  http://sebastianruder.com/multi-task/
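
Ruder's overview above centers on hard parameter sharing: one shared encoder feeding task-specific heads. A minimal PyTorch sketch of that pattern (assuming PyTorch; the layer sizes and task names are made up).

```python
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    """Hard parameter sharing: a shared encoder plus one head per task."""
    def __init__(self, d_in=16, d_hid=32, n_classes_a=3, n_classes_b=5):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(d_in, d_hid), nn.ReLU())
        self.head_a = nn.Linear(d_hid, n_classes_a)  # e.g. a POS-tagging head
        self.head_b = nn.Linear(d_hid, n_classes_b)  # e.g. a chunking head

    def forward(self, x):
        h = self.shared(x)  # features reused by both tasks
        return self.head_a(h), self.head_b(h)

net = MultiTaskNet()
ya, yb = net(torch.randn(8, 16))
print(ya.shape, yb.shape)  # torch.Size([8, 3]) torch.Size([8, 5])
```
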