Representation Learning for the Semantic Web


Abstract

In the past years, learning vector space embeddings has rapidly gained attention, first in the natural language processing community with the advent of word2vec, and more recently also in the Semantic Web community, e.g., with the adaptations RDF2vec or node2vec, as well as the RESCAL, HolE, and Trans* family. Their properties - the representation of entities in a dense vector space, the proximity of semantically related entities, and the preservation of the direction of semantic relations - make them interesting for many applications. Consequently, the field of embedding learning has recently gained considerable uptake in the Semantic Web community. There are various ways of creating such embeddings, ranging from applying the word2vec paradigm to sequences derived from graphs, to translation learning, to tensor factorization. These methods differ in many aspects, such as the strategies used, the embedding target (e.g., nodes, relations, classes), their scalability on different types of input datasets, and the characteristics of the resulting embeddings.
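The translation-learning idea behind the Trans* family mentioned in the abstract can be sketched in a few lines: a relation is modeled as a translation vector, so for a true triple (head, relation, tail) the embeddings should satisfy head + relation ≈ tail. The toy entities, relation name, dimensionality, and training loop below are illustrative assumptions, not taken from the article; this is a minimal TransE-style sketch, not any paper's reference implementation.

```python
import numpy as np

# Toy knowledge graph (hypothetical example data).
rng = np.random.default_rng(0)
dim = 8
entities = ["Berlin", "Germany", "Paris", "France"]
relations = ["capitalOf"]

# Random dense embeddings for entities and relations.
E = {e: rng.normal(size=dim) for e in entities}
R = {r: rng.normal(size=dim) for r in relations}

def score(h, r, t):
    """Negative L2 distance of h + r from t; higher = more plausible triple."""
    return -np.linalg.norm(E[h] + R[r] - E[t])

def train_step(h, r, t, t_bad, lr=0.05, margin=1.0):
    """One SGD step on the margin loss: push the true triple (h, r, t)
    at least `margin` closer than the corrupted triple (h, r, t_bad)."""
    pos = E[h] + R[r] - E[t]
    neg = E[h] + R[r] - E[t_bad]
    loss = max(0.0, margin + np.linalg.norm(pos) - np.linalg.norm(neg))
    if loss > 0:
        g_pos = pos / (np.linalg.norm(pos) + 1e-9)  # gradient direction of ||pos||
        g_neg = neg / (np.linalg.norm(neg) + 1e-9)  # gradient direction of ||neg||
        E[h] -= lr * (g_pos - g_neg)
        R[r] -= lr * (g_pos - g_neg)
        E[t] += lr * g_pos
        E[t_bad] -= lr * g_neg
    return loss

# Train on two true triples, each corrupted with the other's tail.
for _ in range(400):
    train_step("Berlin", "capitalOf", "Germany", "France")
    train_step("Paris", "capitalOf", "France", "Germany")

# After training, true tails should outscore corrupted ones, and the learned
# relation vector points "in the same direction" for both capital pairs -
# the preservation of relation direction the abstract refers to.
```

Tensor-factorization methods such as RESCAL instead score triples bilinearly (e.g., hᵀ Wᵣ t), and RDF2vec replaces the arithmetic entirely by running word2vec over random-walk sequences extracted from the graph.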

Bibliographic details

  • Source
    Journal of Web Semantics | 2020, Issue 3 | pp. 100570.1-100570.2 (2 pages)
  • Author affiliations

    University of Mannheim, Germany;

    Siemens Corporate Technology and Ludwig Maximilian University of Munich, Germany;

    Tsinghua University, China;

  • Format: PDF
  • Language: English

