
Welcome to the Document Abstractive Summarization Project Wiki Page!

Introduction

Methodology

Related work

Experiments

Datasets

References

Abstractive Sentence Summarization

[1] Rush, A. M., Chopra, S. & Weston, J. (2015). A Neural Attention Model for Abstractive Sentence Summarization. EMNLP, pp. 379-389. The Association for Computational Linguistics (Download)

[2] Chopra, S., Auli, M. & Rush, A. M. (2016). Abstractive Sentence Summarization with Attentive Recurrent Neural Networks. NAACL (Download)

[3] Nallapati, R., Xiang, B. & Zhou, B. (2016). Sequence-to-sequence RNNs for Text Summarization. ICLR Workshop (Download)

Hierarchical Attention Network

[4] Yang, Z., Yang, D., Dyer, C., He, X., Smola, A. & Hovy, E. (2016). Hierarchical Attention Networks for Document Classification. NAACL (Download)

[5] Nallapati, R., Zhou, B., dos Santos, C., Gülçehre, Ç. & Xiang, B. (2016). Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond. arXiv (Download)

Pointer Network

[6] Vinyals, O., Fortunato, M. & Jaitly, N. (2015). Pointer Networks. NIPS (Download)

Sequence-to-sequence

[7] Sutskever, I., Vinyals, O. & Le, Q. V. (2014). Sequence to Sequence Learning with Neural Networks. NIPS (Download)

[8] Cho, K., van Merriënboer, B., Gülçehre, Ç., Bahdanau, D., Bougares, F., Schwenk, H. & Bengio, Y. (2014). Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. EMNLP (Download)
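For readers new to this line of work, the sketch below shows the basic encoder-decoder pattern introduced in [7] and [8], which the summarization papers above build on. It is a minimal illustration in PyTorch; all class, variable, and parameter names are ours for illustration and are not taken from any cited paper's implementation.

```python
# Minimal encoder-decoder (seq2seq) sketch in PyTorch.
# Illustrative only; not the implementation of any cited paper.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src, tgt):
        # Encode the source sequence; keep only the final hidden state.
        _, h = self.encoder(self.embed(src))
        # Decode conditioned on the encoder state
        # (teacher forcing: gold target tokens are fed as decoder inputs).
        dec_out, _ = self.decoder(self.embed(tgt), h)
        return self.out(dec_out)  # (batch, tgt_len, vocab_size) logits

# Toy usage: 2 source sequences of length 5, target sequences of length 4.
model = Seq2Seq(vocab_size=1000)
src = torch.randint(0, 1000, (2, 5))
tgt = torch.randint(0, 1000, (2, 4))
print(model(src, tgt).shape)  # torch.Size([2, 4, 1000])
```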

Advanced Sequence-to-sequence

[9] Meng, F., Lu, Z., Tu, Z., Li, H. & Liu, Q. (2016). A Deep Memory-based Architecture for Sequence-to-sequence Learning. ICLR Workshop (Download)

[10] Gu, J., Lu, Z., Li, H. & Li, V. O. K. (2016). Incorporating Copying Mechanism in Sequence-to-sequence Learning. arXiv (Download)