Invention Grant
US09536518B2 Unsupervised training method, training apparatus, and training program for an N-gram language model based upon recognition reliability
Legal Status: Granted (in force)
- Patent Title: Unsupervised training method, training apparatus, and training program for an N-gram language model based upon recognition reliability
- Patent Title (Chinese, translated): Unsupervised training method, training apparatus, and training program for an N-gram language model based on recognition reliability
- Application No.: US14643316
- Application Date: 2015-03-10
- Publication No.: US09536518B2
- Publication Date: 2017-01-03
- Inventors: Nobuyasu Itoh, Gakuto Kurata, Masafumi Nishimura
- Applicant: International Business Machines Corporation
- Applicant Address: Armonk, NY, US
- Assignee: International Business Machines Corporation
- Current Assignee: International Business Machines Corporation
- Current Assignee Address: Armonk, NY, US
- Agent: Scott S. Dobson
- Priority: JP2014-065470, 2014-03-27
- Main IPC: G10L15/06
- IPC: G10L15/06 ; G10L15/183 ; G10L15/197 ; G10L15/18

Abstract:
A computer-based, unsupervised training method for an N-gram language model includes reading, by a computer, recognition results obtained as a result of speech recognition of speech data; acquiring, by the computer, a reliability for each of the read recognition results; referring, by the computer, to the recognition results and the acquired reliabilities to select N-gram entries; and training, by the computer, the N-gram language model for the selected one or more of the N-gram entries using all of the recognition results.
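To make the abstract's procedure concrete, the following is a minimal sketch of reliability-filtered N-gram training. It is not the patented implementation; the input format (a list of token sequences paired with a per-utterance confidence score), the function name `train_reliability_filtered_ngram`, and the fixed selection threshold are all assumptions made for illustration.

```python
from collections import Counter


def train_reliability_filtered_ngram(recognition_results, n=3, threshold=0.8):
    """Sketch of unsupervised N-gram training with reliability-based selection.

    recognition_results: list of (tokens, reliability) pairs, where `tokens`
    is a recognized word sequence and `reliability` is an assumed
    per-utterance confidence score in [0, 1].
    """
    # Step 1-2: read the recognition results and their reliabilities, and
    # select N-gram entries that occur in high-reliability hypotheses.
    selected = set()
    for tokens, reliability in recognition_results:
        if reliability >= threshold:
            for i in range(len(tokens) - n + 1):
                selected.add(tuple(tokens[i:i + n]))

    # Step 3: accumulate counts over ALL recognition results, but keep
    # only the selected entries in the model.
    ngram_counts = Counter()
    context_counts = Counter()
    for tokens, _ in recognition_results:
        for i in range(len(tokens) - n + 1):
            ngram = tuple(tokens[i:i + n])
            if ngram in selected:
                ngram_counts[ngram] += 1
                context_counts[ngram[:-1]] += 1

    # Maximum-likelihood estimates restricted to the selected entries
    # (no smoothing, for brevity).
    return {
        ngram: count / context_counts[ngram[:-1]]
        for ngram, count in ngram_counts.items()
    }


# Example usage with toy recognition hypotheses (hypothetical data).
results = [(["the", "cat", "sat"], 0.93), (["the", "cat", "sad"], 0.41)]
model = train_reliability_filtered_ngram(results, n=2, threshold=0.8)
```

In this sketch, selection and counting are deliberately decoupled: only high-reliability hypotheses decide which entries enter the model, while every hypothesis contributes to their counts, mirroring the "select, then train on all recognition results" structure described in the abstract.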
Public/Granted literature
- US20150279353A1 UNSUPERVISED TRAINING METHOD, TRAINING APPARATUS, AND TRAINING PROGRAM FOR N-GRAM LANGUAGE MODEL. Publication date: 2015-10-01