RuntimeError fix: avoiding a second backward pass and direct access to freed saved tensors
Problem description:
When training a model with PyTorch, the following error occurs:
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
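The error message itself points at the fix: by default, PyTorch frees the intermediate tensors of the computation graph after the first call to `.backward()`, so a second backward pass through the same graph fails. A minimal sketch (the tensor values here are illustrative, not from the original post) shows how passing `retain_graph=True` on the first call keeps the graph alive for a second pass:

```python
import torch

# Build a small graph: y depends on x, so backward() saves intermediates.
x = torch.tensor([2.0], requires_grad=True)
y = x ** 2

# First backward: retain_graph=True keeps the saved tensors alive.
# Without it, a second y.backward() would raise the RuntimeError above.
y.backward(retain_graph=True)

# Second backward through the same graph now succeeds;
# gradients accumulate in x.grad (2x + 2x = 4.0 + 4.0 = 8.0).
y.backward()
print(x.grad)  # tensor([8.])
```

Note that `retain_graph=True` increases memory usage, since the saved tensors are not freed; if the second backward call is unintentional (for example, a loss reused across iterations), restructuring the training loop to rebuild the graph each step is usually the better fix.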
Traceback
Author: 海洋之心