docs or notebook: better explain learning rate schedulers #275

Open
benjamin-work opened this issue Jul 6, 2018 · 2 comments · May be fixed by #1074

Comments

@benjamin-work (Contributor)

skorch has pretty good support for learning rate schedulers, but unfortunately they are not well documented yet. I would like to see either a dedicated section in the docs or a notebook that illustrates their usage. With the new addition of simulating the learning rate (see #269), it should be easy to include some plots.
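For illustration, here is a rough sketch (not from the original thread) of what such a docs example could cover; it assumes skorch's LRScheduler callback and the simulate() method referenced in #269, whose exact signature may differ:

```python
# Hypothetical sketch of a docs/notebook example; assumes skorch's LRScheduler
# callback and the simulate() method from #269 (exact signature may differ).
import matplotlib.pyplot as plt
from skorch.callbacks import LRScheduler

# Wrap a torch.optim.lr_scheduler policy (here StepLR) in skorch's callback.
lr_scheduler = LRScheduler(policy='StepLR', step_size=7, gamma=0.5)

# Simulate the learning rate over 30 epochs without training, then plot it.
lrs = lr_scheduler.simulate(steps=30, initial_lr=0.1)
plt.plot(lrs)
plt.xlabel('epoch')
plt.ylabel('learning rate')
plt.show()
```

In a full example, the same lr_scheduler instance would also be passed to the net (e.g. callbacks=[lr_scheduler] on a NeuralNet) so that the plotted schedule is the one actually applied during training.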

@ParagEkbote commented Oct 28, 2024

Hi, could you please assign this issue to me? I have some experience using optimizers such as Adam and AdamW for fine-tuning with LoRA, and I would love to contribute a dedicated notebook to the docs. 👍

cc: @thomasjpfan , @BenjaminBossan

@BenjaminBossan (Collaborator)

@ParagEkbote Done, thanks.

@ParagEkbote linked a pull request on Nov 28, 2024 that will close this issue