mlcow



Good reads

Word embeddings & Language modeling:

  • Yoshua Bengio, Réjean Ducharme, Pascal Vincent, and Christian Janvin. A neural probabilistic language model. J. Mach. Learn. Res., 3:1137–1155, March 2003. URL: www.jmlr.org/papers/volume3/bengio03a/bengio03a.pdf.
      • One of the earliest works on neural word embeddings.
  • Ronan Collobert, Jason Weston, Léon Bottou, Michael Karlen, Koray Kavukcuoglu, and Pavel P. Kuksa. Natural language processing (almost) from scratch. CoRR, 2011. URL: http://arxiv.org/abs/1103.0398, arXiv:1103.0398.
  • Graham Neubig. Neural machine translation and sequence-to-sequence models: A tutorial. CoRR, 2017. URL: http://arxiv.org/abs/1703.01619, arXiv:1703.01619.
      • A detailed tutorial on neural machine translation.
      • Introduces a range of language modeling techniques: n-gram models, feed-forward neural networks, RNNs, LSTMs, and attention-based models.
  • Sébastien Jean, Kyunghyun Cho, Roland Memisevic, and Yoshua Bengio. On using very large target vocabulary for neural machine translation. CoRR, 2014. URL: http://arxiv.org/abs/1412.2007, arXiv:1412.2007.
  • Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient estimation of word representations in vector space. CoRR, 2013. URL: http://arxiv.org/abs/1301.3781, arXiv:1301.3781.
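The simplest technique on this list, the n-gram language model (covered in the Neubig tutorial above), can be sketched in a few lines. This is an illustrative example, not code from any of the cited papers; the function names and the add-one smoothing choice are mine.

```python
from collections import Counter

def train_bigram_lm(corpus):
    """Count unigrams and bigrams from a list of tokenized sentences,
    padding each sentence with <s> and </s> markers."""
    unigrams = Counter()
    bigrams = Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens[:-1])          # contexts (everything but </s>)
        bigrams.update(zip(tokens[:-1], tokens[1:]))
    return unigrams, bigrams

def prob(unigrams, bigrams, prev, word, vocab_size):
    """P(word | prev) with add-one (Laplace) smoothing, so unseen
    bigrams still get nonzero probability."""
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

# Toy usage: "cat" is more likely than an unseen word after "the".
unis, bis = train_bigram_lm([["the", "cat", "sat"], ["the", "dog", "sat"]])
V = 6  # vocabulary size including the </s> marker
p_cat = prob(unis, bis, "the", "cat", V)    # (1 + 1) / (2 + 6) = 0.25
p_fish = prob(unis, bis, "the", "fish", V)  # (0 + 1) / (2 + 6) = 0.125
```

The neural models in the papers above replace these sparse counts with learned distributed representations, which is exactly the gap Bengio et al. (2003) set out to close.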

NN

Search