A systematic review on sequence-to-sequence learning with neural network and its models

dc.contributor.author: Yousuf, Hana
dc.contributor.author: Lahzi, Michael
dc.contributor.author: A. Salloum, Said
dc.contributor.author: Shaalan, Khaled
dc.date.accessioned: 2025-02-11T04:23:14Z
dc.date.available: 2025-02-11T04:23:14Z
dc.date.issued: 2021
dc.description.abstract: We present a systematic literature review on sequence-to-sequence learning with neural networks and its models. The primary aim of this report is to deepen understanding of sequence-to-sequence neural networks and to identify the best approaches for implementing them. Three models are most commonly used in sequence-to-sequence applications: recurrent neural networks (RNN), connectionist temporal classification (CTC), and the attention model. To conduct this survey, we derived keywords from our research questions and used them to search for peer-reviewed papers, articles, and books in academic directories. Initial searches returned 790 papers and scholarly works; after applying selection criteria and the PRISMA methodology, the number of papers reviewed was reduced to 16. Each of the 16 articles was categorized by its contribution to each research question and then analyzed. Finally, the selected papers underwent a quality appraisal, with scores ranging from 83.3% to 100%. The proposed systematic review enabled us to collect, evaluate, analyze, and explore different approaches to implementing sequence-to-sequence neural network models, and it identified their most common uses in machine learning. The methodology we followed demonstrates the potential of applying these models to real-world applications.
dc.identifier.uri: https://bspace.buid.ac.ae/handle/1234/2789
dc.language.iso: en
dc.title: A systematic review on sequence-to-sequence learning with neural network and its models
dc.type: Article
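The abstract names the attention model as one of the three components most commonly used in sequence-to-sequence networks. As a minimal illustrative sketch only (toy dimensions, plain NumPy, not drawn from any of the 16 reviewed papers), dot-product attention scores each encoder state against the current decoder state and blends them into a context vector:

```python
import numpy as np

# Hypothetical toy setup: a 4-step source sequence with hidden size 3.
np.random.seed(0)
enc_states = np.random.randn(4, 3)  # one encoder hidden state per source step
dec_state = np.random.randn(3)      # current decoder hidden state

scores = enc_states @ dec_state                  # alignment score per source step
weights = np.exp(scores) / np.exp(scores).sum()  # softmax attention weights
context = weights @ enc_states                   # weighted sum of encoder states

# weights is a probability distribution over the 4 source steps;
# context has the same size as a single encoder hidden state.
print(weights.shape, context.shape)
```

The context vector would then be fed into the decoder at each output step; RNN- and CTC-based variants differ in how the alignment between input and output sequences is obtained.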