1. Skansi, S. (2018). Introduction to Deep Learning: From Logical Calculus to Artificial Intelligence. Springer.
2. Graves, A. (2013). Generating sequences with recurrent neural networks. arXiv preprint arXiv:1308.0850.
3. Islam, M. S., Mousumi, S. S., Abujar, S., and Hossain, S. A. (2019). Sequence-to-sequence Bangla sentence generation with LSTM recurrent neural networks. Procedia Computer Science, 152, 51-58.
4. Sutskever, I., Martens, J., and Hinton, G. E. (2011). Generating text with recurrent neural networks. In Proceedings of the 28th International Conference on Machine Learning (ICML).
5. Kuo, C. Y., and Chien, J. T. (2018). Markov recurrent neural networks. In 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP) (pp. 1-6). IEEE.
6. Hanga, K. M., Kovalchuk, Y., and Gaber, M. M. (2020). A graph-based approach to interpreting recurrent neural networks in process mining. IEEE Access, 8, 172923-172938.
7. Ritter, G. X., and Urcid, G. (2021). Introduction to Lattice Algebra: With Applications in AI, Pattern Recognition, Image Analysis, and Biomimetic Neural Networks. CRC Press.
8. Liu, B., and Lane, I. (2016). Attention-based recurrent neural network models for joint intent detection and slot filling. In Proceedings of Interspeech 2016. DOI: 10.21437/Interspeech.2016-1352.
9. Xu, K., Wu, L., Wang, Z., Feng, Y., Witbrock, M., and Sheinin, V. (2018). Graph2Seq: Graph to sequence learning with attention-based neural networks. arXiv preprint arXiv:1804.00823. https://doi.org/10.48550/arXiv.1804.00823
10. Abonyi, J., Károly, R., and Dörgő, G. (2021). Event-tree based sequence mining using LSTM deep-learning model. Complexity, 2021.