Refactor Optimizer Config API #45

@bhavnicksm

Description

Currently, every optimizer comes with its own config class that manages that optimizer's hyperparameters.

This design was chosen for the following reasons:

  • A lot of hyperparameters need to transfer properly through inheritance, which is essential for composing optimizers. By keeping the hyperparameters in a dataclass, a subclass only needs to declare the additional parameters it introduces, along with useful defaults, and inherits the rest (see the sketch after this list).
  • Ease of transfer and storage: all optimizer hyperparameters live in one place and can be converted to a dict or YAML, saved, and loaded again. This makes these optimizers easier to replicate; hyperparameters need to be saved just like the optimizer state is saved.
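
A minimal sketch of the dataclass idea described above; the class names (`BaseConfig`, `MomentumConfig`) are hypothetical, and the YAML round-trip assumes PyYAML is available:

```python
from dataclasses import dataclass, asdict

import yaml  # assumed dependency for the YAML round-trip


@dataclass
class BaseConfig:
    lr: float = 1e-3
    weight_decay: float = 0.0


@dataclass
class MomentumConfig(BaseConfig):
    # A subclass only declares the hyperparameters it adds; lr and
    # weight_decay are inherited from BaseConfig with their defaults.
    momentum: float = 0.9
    nesterov: bool = False


config = MomentumConfig(lr=3e-4)

# All hyperparameters live in one object and can be dumped and reloaded
# alongside the optimizer state.
with open("optimizer_config.yaml", "w") as f:
    yaml.safe_dump(asdict(config), f)

with open("optimizer_config.yaml") as f:
    restored = MomentumConfig(**yaml.safe_load(f))

assert restored == config
```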

This diverges from the PyTorch-style optimizer API, where the optimizer class itself is constructed with all of its hyperparameters passed in directly.

This issue is about finding a reasonable middle ground between the two: allow small, immediate changes to hyperparameters on the spot, directly on the optimizer object, while still keeping the parameters inheritable through configs.

This adds significant overhead and makes things a bit more complex, so it should ideally be abstracted away in BaseConfig and BaseOptimizer, along with the utility functions that convert between the different formats.
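
A minimal sketch of what such a middle ground could look like, assuming hypothetical `BaseConfig`/`BaseOptimizer` classes that hold the conversion utilities; none of these names are taken from the existing code:

```python
from dataclasses import dataclass, asdict, replace


@dataclass
class BaseConfig:
    lr: float = 1e-3
    weight_decay: float = 0.0

    def to_dict(self) -> dict:
        return asdict(self)

    @classmethod
    def from_dict(cls, d: dict) -> "BaseConfig":
        return cls(**d)


class BaseOptimizer:
    config_cls = BaseConfig

    def __init__(self, params, config=None, **overrides):
        # Start from the inheritable config, then merge PyTorch-style
        # keyword overrides made "on the spot" (e.g. lr=1e-4).
        base = config if config is not None else self.config_cls()
        self.config = replace(base, **overrides) if overrides else base
        self.params = list(params)


params = []  # stand-in for model.parameters()

# Config-driven construction ...
opt_a = BaseOptimizer(params, BaseConfig(lr=3e-4))
# ... or PyTorch-style keyword overrides on top of the defaults.
opt_b = BaseOptimizer(params, lr=1e-4, weight_decay=0.01)
```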


Labels

enhancement (New feature or request) · wontfix (This will not be worked on)
