Conference on the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

SC-LSTM: Learning Task-Specific Representations in Multi-Task Learning for Sequence Labeling

Abstract

Multi-task learning (MTL) has been studied recently for sequence labeling. Typically, auxiliary tasks are selected specifically in order to improve the performance of a target task. Jointly learning multiple tasks in a way that benefits all of them simultaneously can increase the utility of MTL. In order to do so, we propose a new LSTM cell which contains both shared parameters that can learn from all tasks, and task-specific parameters that can learn task-specific information. We name it a Shared-Cell Long-Short Term Memory (SC-LSTM). Experimental results on three sequence labeling benchmarks (named-entity recognition, text chunking, and part-of-speech tagging) demonstrate the effectiveness of our SC-LSTM cell.
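To make the shared/task-specific split concrete, below is a minimal PyTorch sketch of an LSTM cell that maintains one set of shared gate projections trained by all tasks and one task-specific set per task, combined additively inside the gates. The additive combination rule and all names here are illustrative assumptions, not the paper's exact SC-LSTM equations.

import torch
import torch.nn as nn

class SCLSTMCell(nn.Module):
    """Illustrative shared-cell LSTM step; the paper's exact formulation may differ."""
    def __init__(self, input_size, hidden_size, num_tasks):
        super().__init__()
        # Shared parameters: gate projections updated by every task.
        self.shared = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        # Task-specific parameters: a separate set of gate projections per task.
        self.task_specific = nn.ModuleList(
            nn.Linear(input_size + hidden_size, 4 * hidden_size)
            for _ in range(num_tasks)
        )

    def forward(self, x, state, task_id):
        h, c = state
        z = torch.cat([x, h], dim=-1)
        # Assumption: shared and task-specific pre-activations are summed
        # before the standard LSTM gate nonlinearities.
        i, f, g, o = (self.shared(z) + self.task_specific[task_id](z)).chunk(4, dim=-1)
        c_next = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h_next = torch.sigmoid(o) * torch.tanh(c_next)
        return h_next, c_next

# One step for a batch of 32 token embeddings on task 0 (e.g. NER);
# passing a different task_id routes through that task's parameters instead.
cell = SCLSTMCell(input_size=100, hidden_size=128, num_tasks=3)
x = torch.randn(32, 100)
h = c = torch.zeros(32, 128)
h, c = cell(x, (h, c), task_id=0)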
