Title: Optimal Beam Search for Machine Translation
Speaker: Alexander Rush (Columbia)
Time: 2:15pm-3:30pm, May 3, Friday
Place: Room 6496, CUNY Graduate Center. 5th Ave & 34th St.


Beam search and cube pruning have become the de facto inference
algorithms for phrase-based and syntactic translation, respectively. In
practice the algorithms often find near-optimal solutions very quickly.
However, the effectiveness of both algorithms is theoretically
surprising, since neither provides a guarantee of optimality or even a
bound on approximation error.
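To make the pruning behavior concrete, here is a minimal generic beam-search sketch (the function names, the toy successor function, and the scoring function are all illustrative, not the speaker's decoder). The key point is the truncation step: only the top-scoring hypotheses survive each round, which is why plain beam search carries no optimality guarantee.

```python
def beam_search(start, expand, score, beam_size, max_steps):
    """Keep the `beam_size` highest-scoring partial hypotheses each step.

    `expand(hyp)` yields successor hypotheses; `score(hyp)` returns a
    higher-is-better score (e.g., a log probability).
    """
    beam = [start]
    for _ in range(max_steps):
        candidates = [nxt for hyp in beam for nxt in expand(hyp)]
        if not candidates:
            break
        candidates.sort(key=score, reverse=True)
        beam = candidates[:beam_size]  # prune: this is where optimality is lost
    return max(beam, key=score)

# Toy example: build length-3 strings over {'a', 'b'}, scoring +1 per 'a'.
expand = lambda s: [s + 'a', s + 'b'] if len(s) < 3 else []
score = lambda s: s.count('a')
best = beam_search('', expand, score, beam_size=2, max_steps=3)  # 'aaa'
```

With a large enough beam the search is exact; the interest of the work described below is recovering such guarantees without giving up the small beams that make the method fast in practice.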

In this talk, we describe new beam-search algorithms for finding optimal
translations in phrase-based and syntax-based models. These algorithms
piggyback on the practical efficiency of beam search and cube pruning,
while extending them to provide additional guarantees. The algorithms
have the following benefits: (1) they provide formal guarantees about
approximation gap and certificates of optimality, (2) they are simple
and extend off-the-shelf beam search decoders, and (3) preliminary
results show that they are significantly faster than other exact methods
for translation decoding.  This is joint work with Yin-Wen Chang and
Michael Collins.


Alexander Rush is a Ph.D. candidate in Computer Science at the
Massachusetts Institute of Technology studying with Professor Michael
Collins. He received his A.B. in Computer Science from Harvard
University in 2007. Before starting graduate study, he worked as lead
engineer on the platform/API team at Facebook. His research interest is
in formally sound but empirically fast inference methods for natural
language processing, with a focus on models for syntactic parsing,
translation, and speech. He has received best paper awards from EMNLP
2010 and NAACL 2012, the latter for work completed as an intern at
Google Research. Currently he is the sole teaching assistant for
Columbia's first MOOC, NLP on Coursera, with over 30,000 registered
students.