
[Deep Learning][PyTorch][Original] Fixing NaN loss when training CRNN on higher versions of PyTorch

2022-06-24 11:14:00 FL1623863129

Recently I have been studying the various PyTorch implementations of CRNN, and I found that most of them have training problems. The typical issue is that the loss becomes NaN after training for a few epochs, and there are plenty of such projects on GitHub. I am using PyTorch 1.7.0, and I found a good solution. The remedies suggested online, such as changing the learning rate or gradient clipping, did nothing when I tried them. Then one project happened to train successfully, and when I looked into why it worked, it turned out to be how CTCLoss was set up: in higher versions of PyTorch, you need to pass an extra parameter when initializing CTCLoss.

from torch.nn import CTCLoss

ctc_loss = CTCLoss(zero_infinity=True)  # zero out infinite losses and their gradients instead of letting them become NaN

With this change the loss no longer becomes NaN, and testing shows that the model's predictions are also normal, so the approach seems to work. If you run into this kind of problem, give it a try, and if you find it useful you can leave a comment below.
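
For context, here is a minimal sketch of how the loss is computed with this setting in a CRNN-style training step. The tensor shapes, batch size, number of classes, and label lengths below are made-up values for illustration and are not from the original project; only the zero_infinity=True argument is the actual fix described above.

import torch
from torch.nn import CTCLoss

# Assumed example dimensions (for illustration only):
# T = time steps output by the CRNN, N = batch size, C = number of classes including the blank at index 0
T, N, C = 26, 4, 37

ctc_loss = CTCLoss(zero_infinity=True)

# Fake network output: log-probabilities of shape (T, N, C)
log_probs = torch.randn(T, N, C).log_softmax(2).detach().requires_grad_()

# Fake padded targets of shape (N, S) with labels in 1..C-1 (0 is reserved for the blank)
S = 10
targets = torch.randint(1, C, (N, S), dtype=torch.long)
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.full((N,), S, dtype=torch.long)

loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
loss.backward()  # with zero_infinity=True, an infinite loss contributes 0 instead of producing NaN gradients
print(loss.item())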


Copyright notice
This article was created by [FL1623863129]. Please include a link to the original when reposting. Thanks.
https://yzsam.com/2022/175/202206240946112075.html