
Seq2Seq: Sequence to Sequence Learning with Neural Networks

by 오서영 2025. 3. 14.

Original paper: https://arxiv.org/abs/1409.3215

 


Source: https://ffighting.net/deep-learning-paper-review/deep-learning-paper-guide/deep-learning-paper-guide/

 

Brief summary of the paper:

This paper proposes the Seq2Seq (Sequence-to-Sequence) model, a method for learning to map an input sequence to an output sequence.
Unlike a conventional RNN setup that predicts a single output value, Seq2Seq can take a variable-length input and generate a variable-length output (see the encoder-decoder sketch below).
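
To make the encoder-decoder idea concrete, here is a minimal PyTorch sketch. It is an illustration for this post, not the paper's exact setup: the paper uses 4-layer deep LSTMs with 1000 cells per layer and reverses the source sentences, while the vocabulary sizes, dimensions, and module names below are placeholder assumptions. The encoder compresses the variable-length source into a fixed-size LSTM state, and the decoder generates the variable-length target conditioned on that state.

```python
# Minimal Seq2Seq encoder-decoder sketch in PyTorch (illustrative only;
# the paper's actual model is a 4-layer LSTM with 1000 cells per layer).
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB = 8000, 8000   # assumed vocabulary sizes
EMB, HID = 256, 512                 # assumed embedding / hidden sizes

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(SRC_VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HID, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids. Only the final (h, c) state is
        # returned: a fixed-size summary of the whole variable-length input.
        _, (h, c) = self.lstm(self.emb(src))
        return (h, c)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(TGT_VOCAB, EMB)
        self.lstm = nn.LSTM(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, tgt, state):
        # tgt: (batch, tgt_len) previous target tokens (teacher forcing).
        # The decoder LSTM is initialized with the encoder's final state.
        h, state = self.lstm(self.emb(tgt), state)
        return self.out(h), state

enc, dec = Encoder(), Decoder()
src = torch.randint(0, SRC_VOCAB, (2, 7))   # batch of source sentences
tgt = torch.randint(0, TGT_VOCAB, (2, 5))   # batch of (shifted) target sentences
logits, _ = dec(tgt, enc(src))
print(logits.shape)                          # torch.Size([2, 5, 8000])
```

The key design choice is that the only link between input and output is the encoder's final (h, c) state. Because everything must pass through this fixed-size bottleneck, the paper finds that reversing the source sentence helps: it introduces short-term dependencies between source and target words that make optimization easier (see Section 3.3 below).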


Abstract

 

1. Introduction

 

2. The Model

3. Experiments


3.1 Dataset Details


3.2 Decoding and Rescoring


3.3 Reversing the Source Sentences


3.4 Training Details


3.5 Parallelization


3.6 Experimental Results


3.7 Performance on Long Sentences


3.8 Model Analysis

 

4. Related Work

 

5. Conclusion