Annual Meeting of the Association for Computational Linguistics

Improving Textual Network Embedding with Global Attention via Optimal Transport

Abstract

Constituting highly informative network embeddings is an important tool for network analysis. It encodes network topology, along with other useful side information, into low-dimensional node-based feature representations that can be exploited by statistical modeling. This work focuses on learning context-aware network embeddings augmented with text data. We reformulate the network-embedding problem and present two novel strategies to improve over traditional attention mechanisms: (i) a content-aware sparse attention module based on optimal transport, and (ii) a high-level attention parsing module. Our approach yields naturally sparse and self-normalized relational inference. It can capture long-term interactions between sequences, thus addressing the challenges faced by existing textual network embedding schemes. Extensive experiments demonstrate that our model consistently outperforms alternative state-of-the-art methods.
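
The abstract does not spell out the mechanism behind the optimal-transport attention module, so the following is only a minimal NumPy sketch of the general idea: entropy-regularized optimal transport (the Sinkhorn algorithm) computed between two token sequences yields a coupling matrix that can serve as an attention map. The function name `sinkhorn_attention` and all parameter choices below are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def sinkhorn_attention(cost, reg=0.1, n_iters=50):
    """Entropy-regularized optimal-transport plan between two token
    sequences, usable as a sparse, self-normalized attention map.

    cost:  (m, n) pairwise cost matrix, e.g. 1 - cosine similarity
           between the two sequences' word embeddings.
    reg:   entropic regularization; smaller values give sparser plans.
    """
    m, n = cost.shape
    a = np.full(m, 1.0 / m)          # uniform mass on source tokens
    b = np.full(n, 1.0 / n)          # uniform mass on target tokens
    K = np.exp(-cost / reg)          # Gibbs kernel
    u = np.ones(m)
    for _ in range(n_iters):         # Sinkhorn fixed-point iterations
        v = b / (K.T @ u)
        u = a / (K @ v)
    # Transport plan: rows sum to a, columns to b (marginal constraints).
    return u[:, None] * K * v[None, :]

# Toy usage: couple a 4-token sequence with a 6-token sequence.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))          # hypothetical word embeddings
y = rng.normal(size=(6, 8))
x /= np.linalg.norm(x, axis=1, keepdims=True)
y /= np.linalg.norm(y, axis=1, keepdims=True)
plan = sinkhorn_attention(1.0 - x @ y.T)
print(plan.sum())                    # ~1.0: total transported mass
```

This sketch also illustrates the two properties the abstract claims: the marginal constraints make the plan self-normalized without a softmax, and as `reg` shrinks the plan concentrates mass on a few token pairs, which is one sense in which transport-based attention is "naturally sparse".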
