Deep Learning for NLP: PyTorch vs TensorFlow

Abstract

In this talk, I will discuss some of the best practices and latest trends in natural language processing (NLP) research. The main goal is to provide a comprehensive comparison of two deep learning frameworks, PyTorch and TensorFlow, when they are used for NLP tasks such as sentiment analysis and emotion recognition from textual data. I will cover how to implement and train widely used models, such as neural word embeddings and long short-term memory (LSTM) networks, for sentence classification. I will also discuss challenges and opportunities in deep learning for NLP research, together with the advantages and disadvantages of using PyTorch and TensorFlow.
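
To give a taste of the material covered, below is a minimal sketch of the kind of LSTM sentence classifier discussed in the talk, written in PyTorch. The layer sizes, vocabulary size, and number of classes are illustrative assumptions, not the exact models presented.

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Minimal LSTM sentence classifier: word embeddings -> LSTM -> linear layer."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)   # neural word embeddings
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)      # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)      # final hidden state for each sentence
        return self.fc(hidden[-1])                # class logits: (batch, num_classes)

# Hypothetical usage: a batch of 32 sentences, each padded/truncated to 20 token ids.
model = LSTMClassifier(vocab_size=10_000)
logits = model(torch.randint(0, 10_000, (32, 20)))
```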

Description

The objective of this talk is to discuss the main differences between two popular deep learning frameworks, [PyTorch](http://pytorch.org/) and [TensorFlow](https://www.tensorflow.org/). The emphasis will be on how to use these frameworks to build and train deep learning models for natural language processing (NLP) tasks, such as sentiment analysis and emotion recognition from text. Specifically, convolutional neural networks (CNNs), word2vec embeddings, and long short-term memory (LSTM) networks will be implemented to perform sentence classification on sentiment and emotion datasets. The advantages and disadvantages of each framework will be highlighted, covering topics such as data flow graphs, documentation, community, computational speed, and available resources. Future trends and best practices in deep learning for NLP will also be discussed.
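
As a rough illustration of the comparison, here is the same sentence classifier sketched above expressed with TensorFlow's Keras API. This is only an assumed, minimal sketch (hyperparameters and preprocessing are placeholders); the exact code and the graph-execution details discussed in the talk depend on the framework version.

```python
import tensorflow as tf

# Roughly the same classifier as the PyTorch sketch:
# embeddings -> LSTM -> dense layer producing class logits.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=10_000, output_dim=100),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(2),
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
# Hypothetical usage, assuming padded_token_ids and labels are prepared elsewhere:
# model.fit(padded_token_ids, labels, epochs=3)
```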

Speaker

Elvis Saravia

Doctoral Researcher @ NTHU (Machine Learning / AI), web developer, founder, designer, blogger, and open-source advocate.