Building scalable N-gram language models using maximum likelihood maximum entropy N-gram models
Abstract
The present invention is an n-gram language modeler that significantly reduces the memory requirement and convergence time of language-modeling systems and methods. The invention assigns each n-gram to one of "n" non-intersecting classes. A count is determined for each n-gram, representing the number of times that n-gram occurred in the training data. The n-grams are separated into classes, and complement counts are determined. From these counts and complement counts, one factor per class is computed using an iterative scaling algorithm. The language model probability, i.e., the probability that a word occurs given the occurrence of the previous two words, is then computed from these factors.
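The class-factor idea described above can be illustrated with a highly simplified sketch. This is not the patented method itself: the toy data, the uniform prior, and the class assignments are all hypothetical, and the sketch only shows the core mechanic of iterative scaling when each n-gram belongs to exactly one class, so that one multiplicative factor per class is tuned until the model's class masses match the empirical ones.

```python
# Toy sketch of iterative scaling with non-intersecting classes.
# Each event (an n-gram) belongs to exactly one class; the model is
#     p(x) = factor[class(x)] * prior(x) / Z
# and iterative scaling rescales each class factor so the model's
# expected class mass matches the empirical class mass.

from collections import Counter

# Hypothetical toy data: (n-gram, observed count, class id).
events = [
    (("the", "cat", "sat"), 4, 0),
    (("the", "cat", "ran"), 2, 0),
    (("a", "dog", "sat"),   3, 1),
    (("a", "dog", "ran"),   1, 1),
    (("we", "go", "now"),   2, 2),
]

total = sum(c for _, c, _ in events)
empirical = Counter()
for _, c, cls in events:
    empirical[cls] += c / total            # empirical class mass

factors = {cls: 1.0 for cls in empirical}  # one factor per class
prior = {ng: 1.0 for ng, _, _ in events}   # uniform prior (an assumption)

def model_probs():
    unnorm = {ng: factors[cls] * prior[ng] for ng, _, cls in events}
    z = sum(unnorm.values())
    return {ng: p / z for ng, p in unnorm.items()}

for _ in range(20):                        # iterative scaling loop
    p = model_probs()
    expected = Counter()
    for ng, _, cls in events:
        expected[cls] += p[ng]
    for cls in factors:                    # rescale each class factor
        factors[cls] *= empirical[cls] / expected[cls]

p = model_probs()
class_mass = Counter()
for ng, _, cls in events:
    class_mass[cls] += p[ng]
# class_mass now matches the empirical class distribution
```

Because the classes are non-intersecting, each event activates exactly one factor, and the rescaling step matches the empirical class masses very quickly; the real invention applies this machinery to counts and complement counts over full n-gram classes.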