[BUG] Solve the unpickling error in weight loading #2000
Conversation
Codecov Report
✅ All modified and coverable lines are covered by tests.

```
@@      Coverage Diff      @@
##        main   #2000  +/- ##
==============================
  Coverage    ?   86.99%
==============================
  Files       ?      160
  Lines       ?     9480
  Branches    ?        0
==============================
  Hits        ?     8247
  Misses      ?     1233
  Partials    ?        0
```
hm, not yet, right?
Yes, this PR is incomplete until we find a way to pass `weights_only`.
would overriding …
Two resolved review threads (now outdated) on pytorch_forecasting/models/temporal_fusion_transformer/tuning.py
```python
try:
    return super().lr_find(*args, **kwargs)
finally:
    torch.load = original_load
```
Doing this feels dangerous, because it has side effects on everything!
I would not do this. Is there a way we can override only in a single call, or override a method in a class?
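One way to scope the override (a minimal sketch, not necessarily what this PR adopts; the helper name `force_weights_only_false` is hypothetical) would be a context manager that patches `torch.load` and restores it on exit, so the patch only lives for the duration of the guarded call:

```python
import functools
from contextlib import contextmanager

import torch


@contextmanager
def force_weights_only_false():
    """Temporarily make torch.load default to weights_only=False,
    restoring the original function on exit, even on error."""
    original_load = torch.load

    @functools.wraps(original_load)
    def patched_load(*args, **kwargs):
        # Only inject the default; an explicit weights_only argument wins.
        kwargs.setdefault("weights_only", False)
        return original_load(*args, **kwargs)

    torch.load = patched_load
    try:
        yield
    finally:
        torch.load = original_load
```

Note that even this is process-global while the `with` block runs; it only bounds the lifetime of the patch, which is why a true call-level injection is cleaner.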
I have wrapped `trainer.fit` here, then.
Wrapping around `trainer.fit` didn't work: `lr_find` calls `trainer._checkpoint_connector.restore` after `fit`, which overrode the `weights_only` param set by `fit`. So I have now wrapped `load_checkpoint` instead (it's deeper in the traceback, close to `torch.load`, but still in the lightning layer). I think this should solve the issue?
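For comparison, a fully call-level injection is also possible through Lightning's public `CheckpointIO` plugin interface. The sketch below is hypothetical (the class name is mine, and this PR wraps `load_checkpoint` inside the tuning module rather than via a plugin), assuming `lightning.pytorch.plugins.io.TorchCheckpointIO` and its `load_checkpoint(path, map_location=None)` signature:

```python
import torch
from lightning.pytorch import Trainer
from lightning.pytorch.plugins.io import TorchCheckpointIO


class WeightsOnlyFalseCheckpointIO(TorchCheckpointIO):
    """Hypothetical plugin that loads checkpoints with weights_only=False
    at the call site, leaving the global torch.load untouched."""

    def load_checkpoint(self, path, map_location=None):
        return torch.load(path, map_location=map_location, weights_only=False)


# Usage: the override applies only to this trainer's checkpoint loads.
trainer = Trainer(plugins=[WeightsOnlyFalseCheckpointIO()])
```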
fkiraly left a comment
Interesting find that this works!
Though I do not agree with the pattern - overriding the torch installation is imo a bad idea, as it can have unpredictable side effects: it would override `torch.load` for any and every call.
Is there a way to avoid doing that and instead inject the override at call level?
There is an `UnpicklingError` in weight loading of `pytorch-forecasting` v1 because the weights are not loaded correctly. The issue seems to be related to `torch` and is causing failures in the CI/CD pipeline.
The issue surfaced after the latest `lightning` update, where they exposed the `weights_only` param and let `torch` handle it itself.
Fixes #1998
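For context, the underlying behavior change can be illustrated as follows (a minimal sketch; the checkpoint path is a placeholder, and this assumes `torch` >= 2.6, where the `torch.load` default flipped to `weights_only=True`):

```python
import torch

# With weights_only=True (the new default), checkpoints that pickle
# arbitrary Python objects - as Lightning checkpoints typically do -
# raise pickle.UnpicklingError on load.
ckpt = torch.load("epoch=0.ckpt")  # may raise UnpicklingError

# Opting back in to full unpickling restores the old behavior; only do
# this for checkpoints from a trusted source.
ckpt = torch.load("epoch=0.ckpt", weights_only=False)
```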