Debugging Neural Nets for NLP

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. In Neural Networks, Tuning is Paramount!
  3. A Typical Situation
  4. Possible Causes
  5. Identifying Training Time Problems
  6. Is My Model Too Weak? Your model needs to be big enough to learn. Model size depends on the task: for language modeling, at least 512 nodes; for natural language analysis, 128 or so may do. Multiple …
  7. Be Careful of Deep Models
  8. Trouble w/ Optimization
  9. Reminder: Optimizers
  10. Initialization
  11. Bucketing/Sorting: If we use sentences of different lengths, too much padding can result in slow training. To remedy this, sort sentences so similarly-lengthed sentences are in the same … (see the batching sketch after this list)
  12. Debugging Decoding
  13. Beam Search
  14. Debugging Search
  15. Look At Your Data!
  16. Symptoms of Overfitting
  17. Reminder: Dev-driven Learning Rate Decay. Start w/ a high learning rate, then decay it when you start overfitting the development set (the newbob learning rate schedule); see the schedule sketch after this list.
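
Item 11 above describes grouping similarly-lengthed sentences into the same batch so that padding (and the wasted computation it causes) stays small. Below is a minimal, illustrative Python sketch of that idea; the function names (`length_bucketed_batches`, `pad_batch`) and the toy token-id data are assumptions made for illustration, not code from the lecture.

```python
import random

def length_bucketed_batches(sentences, batch_size):
    """Group sentences of similar length into the same batch to cut padding.

    `sentences` is assumed to be a list of token-id lists; this batching
    scheme is an illustration, not the exact one used in the lecture.
    """
    # Sort by length so neighbouring sentences have similar lengths.
    by_length = sorted(sentences, key=len)

    # Slice the sorted list into batches; each batch now needs little padding.
    batches = [by_length[i:i + batch_size]
               for i in range(0, len(by_length), batch_size)]

    # Shuffle the order of the batches so training still sees varied lengths
    # from step to step (only the batch order is randomized, not the contents).
    random.shuffle(batches)
    return batches

def pad_batch(batch, pad_id=0):
    """Right-pad every sentence in a batch to the batch's maximum length."""
    max_len = max(len(s) for s in batch)
    return [s + [pad_id] * (max_len - len(s)) for s in batch]

# Example with toy token-id sequences.
data = [[1, 2], [3, 4, 5, 6], [7], [8, 9, 10], [11, 12, 13, 14, 15], [16, 17]]
for batch in length_bucketed_batches(data, batch_size=2):
    print(pad_batch(batch))
```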
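
Item 17 describes dev-driven ("newbob") learning rate decay: hold the learning rate while the development loss keeps improving, and decay it once the model starts overfitting the dev set. The sketch below shows one way that loop could look; `train_epoch` and `eval_dev_loss` are hypothetical callbacks standing in for your actual training and evaluation code, and the halving factor is only an example.

```python
def newbob_schedule(initial_lr, decay_factor, train_epoch, eval_dev_loss,
                    max_epochs=20):
    """Dev-driven learning rate decay in the spirit of the newbob schedule:
    keep the rate fixed while the dev loss improves, multiply it by
    `decay_factor` once the model starts overfitting the development set."""
    lr = initial_lr
    best_dev = float("inf")
    for epoch in range(max_epochs):
        train_epoch(lr)                  # one pass over the training data
        dev_loss = eval_dev_loss()       # measure held-out performance
        if dev_loss < best_dev:
            best_dev = dev_loss          # still improving: keep the high rate
        else:
            lr *= decay_factor           # dev loss got worse: decay the rate
        print(f"epoch {epoch}: dev loss {dev_loss:.4f}, next lr {lr:.5f}")
    return lr

# Toy usage with a simulated dev-loss curve that eventually stops improving.
losses = iter([2.0, 1.5, 1.2, 1.25, 1.24, 1.3])
newbob_schedule(initial_lr=0.1, decay_factor=0.5,
                train_epoch=lambda lr: None,
                eval_dev_loss=lambda: next(losses),
                max_epochs=6)
```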
