Common causes of nans during training of neural networks

I came across this phenomenon several times. Here are my observations:

Gradient blow up

Reason: large gradients throw the learning process off-track.

What you should expect: Looking at the runtime log, examine the loss values per iteration. You'll notice that the loss starts to grow significantly from iteration to iteration; eventually the loss …
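The pattern above — per-iteration loss climbing until it overflows to NaN — can be caught early by monitoring the loss inside the training loop. Below is a minimal sketch of such a monitor; the function name `check_loss` and the `growth_factor` threshold are illustrative choices, not part of any particular framework.

```python
import math

def check_loss(loss_history, iteration, loss, growth_factor=10.0):
    """Detect a likely gradient blow-up from the per-iteration loss.

    Raises if the loss is already NaN/inf, and warns if it has grown
    far beyond the best loss seen so far (a sign training went off-track).
    """
    if math.isnan(loss) or math.isinf(loss):
        raise RuntimeError(
            f"Loss became {loss} at iteration {iteration}: likely gradient blow-up"
        )
    if loss_history and loss > growth_factor * min(loss_history):
        print(
            f"Warning (iteration {iteration}): loss grew from "
            f"{min(loss_history):.4g} to {loss:.4g}"
        )
    loss_history.append(loss)

# Example: a healthy step, then a blown-up step.
history = []
check_loss(history, 0, 1.0)
check_loss(history, 1, 0.8)
```

Calling it with a NaN loss raises immediately, so the run stops at the first bad iteration instead of silently producing NaN weights.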
