V. Mitra, H. Nam, C. Espy-Wilson, E. Saltzman, and L. Goldstein, “Recognizing articulatory gestures from speech for robust speech recognition,” The Journal of the Acoustical Society of America, vol. 131, no. 3, pp. 2270–2287, 2012.
Abstract
Studies have shown that supplementary articulatory information can help to improve the recognition rate of automatic speech recognition systems. Unfortunately, articulatory information is not directly observable, necessitating its estimation from the speech signal. This study describes a system that recognizes articulatory gestures from speech, and uses the recognized gestures in a speech recognition system. Speech gestures are constriction actions produced by distinct constrictors of the vocal tract, and are associated with corresponding sets of dynamic parameters (target position, stiffness). Once a given gesture is activated, it generates tract-variable time function(s) that represent trajectories of the degree and/or location coordinates of the constriction…
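The abstract's account of a gesture driving a tract-variable time function is usually formalized, in the Task Dynamics tradition the authors build on, as a critically damped second-order point attractor: once activated, the constriction variable relaxes toward the gesture's target at a rate set by its stiffness. The sketch below is an illustrative simulation under that assumption; the function name, parameter values, and Euler integration are our own, not taken from the paper.

```python
import math

def gesture_trajectory(x0, target, stiffness, dt=0.001, steps=5000):
    """Simulate one tract variable under a critically damped
    point-attractor: x is driven toward the gesture's target,
    with damping fixed at 2*sqrt(stiffness) so the trajectory
    settles without oscillation (illustrative sketch only)."""
    damping = 2.0 * math.sqrt(stiffness)  # critical damping
    x, v = x0, 0.0
    traj = []
    for _ in range(steps):
        a = -stiffness * (x - target) - damping * v  # restoring force + damping
        v += a * dt  # simple Euler integration
        x += v * dt
        traj.append(x)
    return traj

# e.g., a constriction-degree variable closing from 10 mm toward 2 mm
traj = gesture_trajectory(x0=10.0, target=2.0, stiffness=100.0)
```

Higher stiffness yields faster movement toward the target; the target and stiffness pair are exactly the "dynamic parameters" the abstract associates with each gesture.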