A systematic review on sequence-to-sequence learning with neural network and its models
Date
2021
Abstract
We present a systematic literature review of sequence-to-sequence learning
with neural networks and its models. The primary aim of this report is to
deepen understanding of sequence-to-sequence neural networks and to identify
the best approaches to implementing them. Three models are most commonly
used in sequence-to-sequence applications: recurrent neural networks (RNN),
connectionist temporal classification (CTC), and the attention model. Our
survey procedure used the research questions to derive keywords, which were
then used to search academic directories for peer-reviewed papers, articles,
and books. Initial searches returned 790 papers and scholarly works;
applying the selection criteria and the PRISMA methodology reduced the
number of papers reviewed to 16. Each of the 16 articles was categorized by
its contribution to each research question and analyzed accordingly.
Finally, the papers underwent a quality appraisal, with scores ranging from
83.3% to 100%. The proposed systematic review enabled us to collect,
evaluate, analyze, and explore different approaches to implementing
sequence-to-sequence neural network models, and it identified their most
common uses in machine learning. The methodology we followed demonstrates
the potential of applying these models to real-world applications.
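To make the third model family named above concrete, the following is a
minimal sketch of one dot-product attention step, as used in the decoder of
an attention-based sequence-to-sequence model. It is an illustration only,
not a method from any of the reviewed papers; the function names, shapes,
and the use of NumPy are our own assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def dot_product_attention(decoder_state, encoder_states):
    """One attention step: score each encoder hidden state against the
    current decoder state, normalize the scores into weights, and return
    the weighted context vector together with the attention weights."""
    scores = encoder_states @ decoder_state   # shape (T,): one score per time step
    weights = softmax(scores)                 # shape (T,): a distribution over steps
    context = weights @ encoder_states        # shape (d,): weighted sum of states
    return context, weights

# Toy example (hypothetical sizes): 4 encoder time steps, hidden size 3.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
context, weights = dot_product_attention(dec, enc)
```

At each decoding step the attention weights form a probability distribution
over the encoder time steps, so the context vector is a convex combination
of the encoder states; this is the mechanism that lets the decoder focus on
different parts of the input sequence.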