Paper
11 July 2024
Chinese fable generation based on large scale pretrained language models
Rong Jing, Fucheng Wan
Proceedings Volume 13210, Third International Symposium on Computer Applications and Information Systems (ISCAIS 2024); 132101V (2024) https://doi.org/10.1117/12.3034882
Event: Third International Symposium on Computer Applications and Information Systems (ISCAIS 2024), 2024, Wuhan, China
Abstract
Story generation is an important task in natural language processing, and text generation has received extensive attention from researchers in recent years. Existing methods struggle to generate complete, logically fluent texts with a low repetition rate from a given set of outline words. To address this problem, we apply instruction tuning to the latest GLM model on an outline-conditioned story generation dataset to generate Chinese fables, and we compare it against fine-tuning other large models under different parameter settings. The experimental results show that the fables generated with instruction tuning are more logically coherent and have a lower repetition rate. Our model improves over the baseline by 1.5%, 2.43%, 1.51%, 3.25%, and 2.09% on the BLEU-2, distinct-1, distinct-2, distinct-3, and distinct-4 metrics, respectively.
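The abstract does not specify the prompt template used for instruction tuning, but outline-conditioned instruction data is commonly built by turning the outline words into a natural-language instruction paired with the reference story. Below is a minimal, hypothetical sketch in the widely used Alpaca-style format; the field names, prompt wording, and sample outline are illustrative assumptions, not the paper's actual setup.

```python
# Hypothetical sketch of an outline-conditioned instruction example.
# The "instruction"/"input"/"output" fields follow the common
# Alpaca-style format; the paper's real template is not given.
def build_example(outline_words, fable):
    instruction = (
        "Write a coherent Chinese fable that uses all of the following "
        "outline words: " + ", ".join(outline_words)
    )
    return {"instruction": instruction, "input": "", "output": fable}

# Example usage with a made-up outline and reference fable.
sample = build_example(["狐狸", "乌鸦", "奶酪"],
                       "从前，一只乌鸦叼着一块奶酪，停在树枝上……")
print(sample["instruction"])
```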
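The distinct-n metrics reported above are conventionally computed as the ratio of unique n-grams to all n-grams in the generated text, so higher values indicate less repetition. The abstract does not state the tokenization used; the sketch below assumes a pre-tokenized sequence.

```python
# distinct-n: ratio of unique n-grams to total n-grams;
# higher values mean lower repetition in generated text.
def distinct_n(tokens, n):
    if len(tokens) < n:
        return 0.0
    ngrams = [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    return len(set(ngrams)) / len(ngrams)

# A repetitive sequence scores lower than a varied one.
print(distinct_n(list("ababa"), 2))  # 0.5: 2 unique bigrams out of 4
print(distinct_n(list("abcde"), 2))  # 1.0: all 4 bigrams unique
```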
© 2024 Published by SPIE. Downloading of the abstract is permitted for personal use only.
Rong Jing and Fucheng Wan "Chinese fable generation based on large scale pretrained language models", Proc. SPIE 13210, Third International Symposium on Computer Applications and Information Systems (ISCAIS 2024), 132101V (11 July 2024); https://doi.org/10.1117/12.3034882
KEYWORDS: Data modeling, Transformers, Logic, Autoregressive models