To freeze all layers except the ones you want to train, toggle requires_grad by parameter name:

for name, param in model.named_parameters():
    if [LAYER_NAME] in name:
        param.requires_grad = True
    else:
        param.requires_grad = False
Example (train only the parameters whose name contains 'rec_head'):

for name, param in model.named_parameters():
    if 'rec_head' in name:
        param.requires_grad = True
    else:
        param.requires_grad = False
If the optimizer raises a grad error (e.g. complaining about optimizing a parameter that doesn't require gradients), pass only the trainable parameters to it:

optimizer = torch.optim.Adam([param for param in model.parameters() if param.requires_grad], lr=cfg.train_cfg.lr)
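Putting the two steps together, here is a minimal runnable sketch. The toy model and its 'rec_head' submodule are hypothetical stand-ins for your actual model:

```python
import torch
import torch.nn as nn

# Toy model; 'rec_head' is a hypothetical submodule name used for illustration.
model = nn.Sequential()
model.add_module('backbone', nn.Linear(8, 8))
model.add_module('rec_head', nn.Linear(8, 2))

# Freeze everything except parameters whose name contains 'rec_head'.
for name, param in model.named_parameters():
    param.requires_grad = 'rec_head' in name

# Build the optimizer over the trainable parameters only,
# so frozen parameters are never handed to it.
optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3
)

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the rec_head weight and bias remain trainable
```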
https://gist.github.com/L0SG/2f6d81e4ad119c4f798ab81fa8d62d3f
https://github.com/pytorch/pytorch/issues/679