Dr. Boxing CHEN
Machine Intelligence Lab
DAMO Academy of Alibaba Group
With the significant performance improvements in automatic speech recognition and machine translation, speech translation is moving from science fiction into real life. There are two paradigms for speech translation: cascaded and end-to-end systems. Due to the size of the available training data, cascaded speech translation systems still outperform end-to-end systems. However, cascaded systems also suffer from error propagation. ASR errors, punctuation restoration, language informality, and disfluency are the main challenges for speech translation. We invest effort in all components of the cascaded speech translation system, achieving improvements for each component in the pipeline and therefore better overall speech translation performance. We also present several scenarios at Alibaba where we apply speech translation: technical lectures, communication between buyers and sellers, overseas travel, etc.
Boxing Chen is a Senior Algorithm Expert at the Machine Intelligence Lab, DAMO Academy of Alibaba Group. He works on natural language processing, focusing mainly on machine translation. Prior to Alibaba, he was a Research Officer at the National Research Council Canada (NRC), a Senior Research Fellow at the Institute for Infocomm Research in Singapore, a Postdoc at FBK-IRST in Italy, and a Postdoc at the University of Grenoble in France. He received his PhD degree from the Chinese Academy of Sciences and his Bachelor's degree from Peking University in China. He has co-authored more than 50 papers in NLP conferences and journals. He received the Best Paper Award at MT Summit 2013 and a Best Paper Award nomination at ACL 2013. His teams ranked first in WMT 2018 on five translation tasks and six quality estimation sub-tasks; first in the WMT 2017 Russian-to-English translation task; first in the NIST 2012 OpenMT Chinese-to-English translation task; and first in the IWSLT 2007 Chinese-to-English and the IWSLT 2005 Chinese-to-English and Japanese-to-English translation tasks.