Conference of the European Chapter of the Association for Computational Linguistics

Breaking Writer's Block: Low-cost Fine-tuning of Natural Language Generation Models



Abstract

It is standard procedure these days to solve Information Extraction tasks by fine-tuning large pre-trained language models. This is not the case for generation tasks, which rely on a variety of techniques for controlled language generation. In this paper, we describe a system that fine-tunes a natural language generation model for the problem of solving Writer's Block. The fine-tuning changes the conditioning to include the right context in addition to the left context, as well as an optional list of entities, the size, the genre, and a summary of the paragraph that the human author wishes to generate. Our proposed fine-tuning obtains excellent results, even with a small number of epochs and a total cost of USD 150. The system can be accessed as a web service, and all the code is released. A video showcasing the interface and the model is also available.
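The conditioning scheme described above can be sketched as a function that assembles the model input from both contexts and the optional metadata. This is a minimal, hypothetical illustration: the field tags, separators, and function name below are assumptions for clarity, not the authors' actual serialization format.

```python
def build_conditioning(left_context, right_context, entities=None,
                       size=None, genre=None, summary=None):
    """Assemble one conditioning string from left/right context and
    optional metadata (entities, size, genre, summary).
    Tags like [LEFT] are illustrative, not the paper's actual format."""
    parts = []
    if entities:
        parts.append("[ENTITIES] " + ", ".join(entities))
    if size:
        parts.append(f"[SIZE] {size}")
    if genre:
        parts.append(f"[GENRE] {genre}")
    if summary:
        parts.append(f"[SUMMARY] {summary}")
    # Both sides of the gap are always present; the model fills the middle.
    parts.append("[LEFT] " + left_context)
    parts.append("[RIGHT] " + right_context)
    return "\n".join(parts)

prompt = build_conditioning(
    left_context="The knight rode into the valley.",
    right_context="By dawn, the castle gates stood open.",
    entities=["knight", "castle"],
    size="short",
    genre="fantasy",
)
```

During fine-tuning, each training example would pair such a conditioning string with the target paragraph, so the model learns to generate text consistent with both surrounding contexts and the requested attributes.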

