JMLR: Workshop and Conference Proceedings

Students Need More Attention: BERT-based Attention Model for Small Data with Application to Automatic Patient Message Triage


Abstract

Small and imbalanced datasets, commonly seen in healthcare, represent a challenge when training classifiers based on deep learning models. So motivated, we propose a novel framework based on BioBERT (Bidirectional Encoder Representations from Transformers for Biomedical Text Mining). Specifically, (i) we introduce Label Embeddings for Self-Attention in each layer of BERT, which we call LESA-BERT, and (ii) by distilling LESA-BERT to smaller variants, we aim to reduce overfitting and model size when working with small datasets. As an application, our framework is used to build a model for patient portal message triage that classifies the urgency of a message into three categories: non-urgent, medium, and urgent. Experiments demonstrate that our approach outperforms several strong baseline classifiers by a significant margin of 4.3% in terms of macro F1 score.
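The abstract itself gives no implementation details, but the core idea (i) can be illustrated: self-attention operates over the token sequence with trainable label embeddings prepended, so labels and tokens attend to each other at every layer. Below is a minimal single-head NumPy sketch under the assumption that one embedding per class is concatenated before attention; the function name, identity Q/K/V projections, and dimensions are illustrative, not the paper's actual code.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_with_label_embeddings(tokens, label_emb):
    """Single-head self-attention over [label embeddings; tokens].

    tokens:    (seq_len, d)    token hidden states
    label_emb: (num_labels, d) trainable label embeddings (one per class)
    Returns updated hidden states of shape (num_labels + seq_len, d).
    """
    x = np.concatenate([label_emb, tokens], axis=0)  # prepend label positions
    d = x.shape[-1]
    # identity projections for the sketch (a real model learns W_q, W_k, W_v)
    q, k, v = x, x, x
    attn = softmax(q @ k.T / np.sqrt(d))  # labels and tokens attend jointly
    return attn @ v

rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))   # 5 tokens, hidden size 8
labels = rng.normal(size=(3, 8))   # 3 urgency classes: non-urgent/medium/urgent
out = self_attention_with_label_embeddings(tokens, labels)
print(out.shape)  # (8, 8): 3 label positions + 5 token positions
```

The updated label positions can then feed a classification head, so the label representations are refined by the message content at each layer rather than only at the output.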
