[1] S. Subramanian, S. Rajeswar, F. Dutil, C. Pal and A. Courville, "Adversarial Generation of Natural Language," Proc. of the 2nd Workshop on Representation Learning for NLP, pp. 241–251, 2017.
[2] R. Yan, H. Jiang, M. Lapata, S.-D. Lin, X. Lv and X. Li, "i, Poet: Automatic Chinese Poetry Composition through a Generative Summarization Framework under Constrained Optimization," Proc. of the 23rd Int. Joint Conf. on Artificial Intelligence, pp. 2197–2203, 2013.
[3] A. Das and B. Gambäck, "Poetic Machine: Computational Creativity for Automatic Poetry Generation in Bengali," Proc. of the Int. Conf. on Computational Creativity (ICCC), pp. 230–238, 2014.
[4] H. G. Oliveira and A. Cardoso, "Poetry Generation with PoeTryMe," Computational Creativity Research: Towards Creative Machines, Atlantis Thinking Machines Book Series (ATLANTISTM), vol. 7, pp. 243–266, Springer, 2015.
[5] M. Ghazvininejad, X. Shi, Y. Choi and K. Knight, "Generating Topical Poetry," Proc. of the 2016 Conf. on Empirical Methods in Natural Language Processing, pp. 1183–1191, Austin, Texas, 2016.
[6] M. Ghazvininejad, X. Shi, J. Priyadarshi and K. Knight, "Hafez: An Interactive Poetry Generation System," Proc. of ACL 2017, System Demonstrations, pp. 43–48, Vancouver, Canada, 2017.
[7] D. Singh, M. Ackerman and R. Y. Pérez, "A Ballad of the Mexicas: Automated Lyrical Narrative Writing," Proc. of the 8th Int. Conf. on Computational Creativity (ICCC), [Online], Available: http://ilitia.cua.uam.mx:8080/jspui/handle/123456789/442, 2017.
[8] L. Xu, L. Jiang, C. Qin, Z. Wang and D. Du, "How Images Inspire Poems: Generating Classical Chinese Poetry from Images with Memory Networks," Proc. of the 32nd AAAI Conf. on Artificial Intelligence, vol. 32, Article No. 689, pp. 5618–5625, 2018.
[9] A. Zugarini, S. Melacci and M. Maggini, "Neural Poetry: Learning to Generate Poems Using Syllables," Proc. of the Int. Conf. on Artificial Neural Networks, pp. 313–325, DOI: 10.1007/978-3-030-30490-4_26, Springer, 2019.
[10] T. Van de Cruys, "Automatic Poetry Generation from Prosaic Text," Proc. of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 2471–2480, DOI: 10.18653/v1/2020.acl-main.223, 2020.
[11] C.-L. Zhou, W. You and X. Ding, "Genetic Algorithm and Its Implementation of Automatic Generation of Chinese Songci," Journal of Software, vol. 21, no. 3, pp. 427–437, 2010.
[12] J. He, M. Zhou and L. Jiang, "Generating Chinese Classical Poems with Statistical Machine Translation Models," Proc. of the 26th AAAI Conference on Artificial Intelligence, vol. 26, no. 1, pp. 1650–1656, DOI: 10.1609/aaai.v26i1.8344, 2012.
[13] Q. Wang, T. Luo and D. Wang, "Can Machine Generate Traditional Chinese Poetry? A Feigenbaum Test," Proc. of the Int. Conf. on Brain Inspired Cognitive Systems, Part of the Lecture Notes in Artificial Intelligence Book Series (LNAI), vol. 10023, pp. 34–46, Springer, 2016.
[14] J. Zhang, Y. Feng, D. Wang, Y. Wang, A. Abel, S. Zhang and A. Zhang, "Flexible and Creative Chinese Poetry Generation Using Neural Memory," arXiv: 1705.03773, DOI: 10.48550/arXiv.1705.03773, 2017.
[15] Z. Wang, W. He, H. Wu, H. Wu, W. Li, H. Wang and E. Chen, "Chinese Poetry Generation with Planning Based Neural Network," arXiv: 1610.09889, DOI: 10.48550/arXiv.1610.09889, 2016.
[16] X. Yang, X. Lin, S. Suo and M. Li, "Generating Thematic Chinese Poetry Using Conditional Variational Auto-encoders with Hybrid Decoders," arXiv: 1711.07632, DOI: 10.48550/arXiv.1711.07632, 2017.
[17] Z. N. Abdel-Malek, Towards a New Theory of Arabic Prosody, 5th Edn., ISSN: 0258-3976, Tajdid Online Forum for Facilitating Arabic Studies, 2019.
[18] T. Brown, B. Mann, N. Ryder et al., "Language Models Are Few-shot Learners," Advances in Neural Information Processing Systems, vol. 33, pp. 1877–1901, 2020.
[19] M. E. G. Beheitt and M. B. H. Hmida, "Automatic Arabic Poem Generation with GPT-2," Proc. of the 14th Int. Conf. on Agents and Artificial Intelligence (ICAART 2022), vol. 2, pp. 366–374, 2022.
[20] X. Yi, M. Sun, R. Li and Z. Yang, "Chinese Poetry Generation with a Working Memory Model," Proc. of the 27th Int. Joint Conference on Artificial Intelligence (IJCAI'18), pp. 4553–4559, 2018.
[21] Z. Liu, Z. Fu, J. Cao, G. de Melo, Y.-C. Tam, C. Niu and J. Zhou, "Rhetorically Controlled Encoder-Decoder for Modern Chinese Poetry Generation," Proc. of the 57th Annual Meeting of the Association for Computational Linguistics, pp. 1992–2001, Florence, Italy, 2019.
[22] L. Deng, J. Wang, H. Liang et al., "An Iterative Polishing Framework Based on Quality Aware Masked Language Model for Chinese Poetry Generation," Proc. of the AAAI Conference on Artificial Intelligence, pp. 7643–7650, 2020.
[23] J. Devlin, M.-W. Chang, K. Lee and K. Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding," Proc. of the 2019 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, vol. 1 (Long and Short Papers), pp. 4171–4186, 2019.
[24] L. Shen, X. Guo and M. Chen, "Compose Like Humans: Jointly Improving the Coherence and Novelty for Modern Chinese Poetry Generation," Proc. of the 2020 IEEE Int. Joint Conf. on Neural Networks (IJCNN), pp. 1–8, Glasgow, UK, 2020.
[25] D. P. Kingma and J. Ba, "Adam: A Method for Stochastic Optimization," arXiv: 1412.6980, 2014.
[26] J. H. Lau, T. Cohn, T. Baldwin, J. Brooke and A. Hammond, "Deep-speare: A Joint Neural Model of Poetic Language, Meter and Rhyme," Proc. of the 56th Annual Meeting of the Association for Computational Linguistics, vol. 1: Long Papers, pp. 1948–1958, 2018.
[27] D. Bahdanau, K. H. Cho and Y. Bengio, "Neural Machine Translation by Jointly Learning to Align and Translate," Proc. of the 3rd Int. Conf. on Learning Representations (ICLR 2015), arXiv: 1409.0473, 2015.
[28] M. C. Santillan and A. P. Azcarraga, "Poem Generation Using Transformers and Doc2Vec Embeddings," Proc. of the IEEE Int. Joint Conf. on Neural Networks (IJCNN), pp. 1–7, Glasgow, UK, 2020.
[29] A. Vaswani, N. Shazeer, N. Parmar et al., "Attention Is All You Need," arXiv: 1706.03762, DOI: 10.48550/arXiv.1706.03762, 2017.
[30] Q. Le and T. Mikolov, "Distributed Representations of Sentences and Documents," Proc. of the Int. Conf. on Machine Learning (PMLR), pp. 1188–1196, arXiv: 1405.4053, 2014.
[31] B. Bena and J. Kalita, "Introducing Aspects of Creativity in Automatic Poetry Generation," arXiv: 2002.02511, DOI: 10.48550/arXiv.2002.02511, 2020.
[32] A. Radford, J. Wu, R. Child, D. Luan, D. Amodei and I. Sutskever, "Language Models Are Unsupervised Multitask Learners," OpenAI blog, vol. 1, no. 8, p. 9, 2019.
[33] M. H. Moghadam and B. Panahbehagh, "Creating a New Persian Poet Based on Machine Learning," arXiv: 1810.06898, DOI: 10.48550/arXiv.1810.06898, 2018.
[34] S. Talafha and B. Rekabdar, "Arabic Poem Generation with Hierarchical Recurrent Attentional Network," Proc. of the IEEE 13th Int. Conf. on Semantic Comp. (ICSC), pp. 316–323, Newport Beach, USA, 2019.
[35] S. Talafha et al., "Poetry Generation Model via Deep Learning Incorporating Extended Phonetic and Semantic Embeddings," Proc. of the IEEE 15th Int. Conf. on Semantic Comp., pp. 48–55, USA, 2021.
[36] A. Hakami, R. Alqarni, M. Almutairi and A. Alhothali, "Arabic Poems Generation Using LSTM, Markov-LSTM and Pre-trained GPT-2 Models," Computer Science & Information Technology (CS&IT), vol. 11, no. 15, pp. 139–147, 2021.
[37] T. Mikolov, I. Sutskever, K. Chen, G. Corrado and J. Dean, "Distributed Representations of Words and Phrases and Their Compositionality," arXiv: 1310.4546, DOI: 10.48550/arXiv.1310.4546, 2013.
[38] A. Radford, K. Narasimhan, T. Salimans et al., "Improving Language Understanding by Generative Pre-training," [Online], Available: https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf, 2018.
[39] H. Robbins and S. Monro, "A Stochastic Approximation Method," The Annals of Mathematical Statistics, vol. 22, no. 3, pp. 400–407, 1951.
[40] Y. Zhu, R. Kiros, R. Zemel et al., "Aligning Books and Movies: Towards Story-like Visual Explanations by Watching Movies and Reading Books," Proc. of the IEEE Int. Conf. on Computer Vision (ICCV), pp. 19–27, DOI: 10.1109/ICCV.2015.11, 2015.
[41] R. Sennrich, B. Haddow and A. Birch, "Neural Machine Translation of Rare Words with Subword Units," arXiv: 1508.07909, DOI: 10.48550/arXiv.1508.07909, 2015.
[42] T. B. Brown, B. Mann, N. Ryder et al., "Language Models Are Few-shot Learners," arXiv: 2005.14165, DOI: 10.48550/arXiv.2005.14165, 2020.
[43] M. Abbas and K. Smaili, "Comparison of Topic Identification Methods for Arabic Language," Proc. of Int. Conf. on Recent Advances in Natural Lang. Process. (RANLP), pp. 14–17, Borovets, Bulgaria, 2005.
[44] M. Abbas, K. Smaïli and D. Berkani, "Evaluation of Topic Identification Methods on Arabic Corpora," Journal of Digital Information Management, vol. 9, no. 5, pp. 185–192, 2011.
[45] K. Papineni, S. Roukos, T. Ward and W.-J. Zhu, "BLEU: A Method for Automatic Evaluation of Machine Translation," Proc. of the 40th Annual Meeting of the Association for Computational Linguistics (ACL), pp. 311–318, Philadelphia, USA, 2002.
[46] J. Li, Y. Song, H. Zhang, D. Chen, S. Shi, D. Zhao and R. Yan, "Generating Classical Chinese Poems via Conditional Variational Autoencoder and Adversarial Training," Proc. of the 2018 Conf. on Empirical Methods in Natural Language Processing, pp. 3890–3900, Brussels, Belgium, 2018.
[47] R. Yan, "i, Poet: Automatic Poetry Composition through Recurrent Neural Networks with Iterative Polishing Schema," Proc. of the 25th Int. Joint Conf. on Artificial Intell. (IJCAI-16), pp. 2238–2244, 2016.
[48] X. Zhang and M. Lapata, "Chinese Poetry Generation with Recurrent Neural Networks," Proc. of the 2014 Conf. on Empirical Methods in Natural Lang. Process. (EMNLP), pp. 670–680, Doha, Qatar, 2014.
[49] K. Krippendorff, Content Analysis: An Introduction to Its Methodology, 3rd Edition, Sage Publications, 2013.