Pre-Trained Language Models for Text Generation: A Survey
Research Abstract & Technology Focus
Text generation aims to produce plausible and readable text in human language from input data. The resurgence of deep learning has greatly advanced this field, in particular with the help of neural generation models based on pre-trained language models (PLMs). Text generation based on PLMs is viewed as a promising approach in both academia and industry. In this article, we provide a survey of the utilization of PLMs in text generation. We begin by introducing two key aspects of applying PLMs to text generation: (1) how to design an effective PLM to serve as the generation model; and (2) how to effectively optimize PLMs given the reference text and ensure that the generated texts satisfy special text properties. We then present the major challenges that arise in these aspects, as well as possible solutions for them. We also include a summary of various useful resources and typical text generation applications based on PLMs. Finally, we highlight future research directions that could further improve PLMs for text generation. This comprehensive survey is intended to help researchers interested in text generation problems learn the core concepts, main techniques, and latest developments in this area based on PLMs.
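To make the setting concrete, the sketch below illustrates the autoregressive decoding loop that PLM-based generators such as GPT-2 rely on. This toy example is not from the survey: a hard-coded bigram table stands in for the neural model's next-token distribution, and greedy decoding picks the most probable continuation at each step.

```python
# Toy illustration of autoregressive text generation (assumption: this
# stands in for a PLM's next-token distribution; real systems use a
# neural network such as GPT-2 instead of a bigram table).

BIGRAM_PROBS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.7, "dog": 0.3},
    "cat": {"sat": 0.8, "</s>": 0.2},
    "dog": {"sat": 0.6, "</s>": 0.4},
    "sat": {"</s>": 1.0},
}

def greedy_generate(start="<s>", max_len=10):
    """Greedy autoregressive decoding: at each step, emit the most
    probable next token given the current one, stopping at </s>."""
    token, output = start, []
    for _ in range(max_len):
        next_token = max(BIGRAM_PROBS[token], key=BIGRAM_PROBS[token].get)
        if next_token == "</s>":
            break
        output.append(next_token)
        token = next_token
    return " ".join(output)

print(greedy_generate())  # → "the cat sat"
```

In a real PLM the bigram lookup is replaced by a forward pass over the full token history, and greedy selection is often replaced by sampling or beam search, but the generation loop has the same shape.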
Correlated Market Trend: Business Models
Bridging academia to market: 60-day public search velocity mapped to the core technology of this paper. The dashed line represents the 7-day moving average.
Commercial Realization
Startups and Open Source tools heavily associated with the concepts explored in this paper.
- GitHub: alvinunreal/awesome-opensource-ai
- GitHub: Arthur-Ficial/apfel
- Product Hunt: Ollang DX
- Product Hunt: Google Gemma 4