
Visual Tools for Debugging Neural Language Models
Xin Rong and Eytan Adar

While neural language models are powerful, they require careful hyperparameter configuration and large amounts of training data to perform well. When a model under development or training fails to produce the expected output, it is difficult to debug, because the model is often a black box to its user. In this work, we discuss a set of visual tools designed to support the debugging process of neural language models. These are implemented in our LAnguage Model Visual Inspector (LAMVI) system, an interactive visual environment for exploring and debugging word embedding models.
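To make the debugging problem concrete, the sketch below shows the kind of ad-hoc, manual inspection a developer might otherwise perform when a word embedding model misbehaves; it is not the LAMVI interface itself, and the corpus, hyperparameters, and use of the gensim library are illustrative assumptions only.

```python
# A minimal sketch (not the LAMVI interface): manually spot-checking a
# word2vec model with gensim. Corpus and hyperparameters are illustrative.
from gensim.models import Word2Vec

corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# Train a small skip-gram model (gensim >= 4.0 uses `vector_size`).
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Inspect the embedding: do the nearest neighbors of a query word make sense?
for word, score in model.wv.most_similar("cat", topn=3):
    print(f"{word}\t{score:.3f}")
```

Checks like this quickly become tedious as the vocabulary and hyperparameter space grow, which is the gap an interactive visual environment such as LAMVI aims to fill.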


PDF (834Kb), International Conference on Machine Learning (ICML) Workshop on Visualization for Deep Learning (2016), New York, June 23, 2016 (Best Paper Award)

Code and demo on GitHub