
Releases: kozistr/pytorch_optimizer

pytorch-optimizer v3.0.0

21 May 09:02
eda736f

Change Log

The major version is updated! (v2.12.0 -> v3.0.0) (#164)

Many optimizers, learning rate schedulers, and objective functions are included in pytorch-optimizer.
Currently, pytorch-optimizer supports 67 optimizers (+ bitsandbytes), 11 LR schedulers, and 13 loss functions, and sees about 4 ~ 50K downloads / month (with a peak of 75K downloads / month)!

The reason for bumping the major version from v2 to v3 is that this seems like a good time to ship the recent implementations (the last release was about 7 months ago), and I plan to pivot toward new concepts such as training utilities while maintaining the original features (e.g. the optimizers).
Richer test cases, benchmarks, and examples are also planned!

Finally, thanks for using pytorch-optimizer, and feel free to make any requests :)

Feature

Fix

  • Fix SRMM to allow operation beyond memory_length. (#227)

Dependency

  • Drop Python 3.7 support officially. (#221)
  • Update bitsandbytes to 0.43.0. (#228)

Docs

  • Add missing parameters in Ranger21 optimizer document. (#214, #215)
  • Fix WSAM optimizer paper link. (#219)

Contributions

Thanks to @sdbds and @i404788.

Diff

pytorch-optimizer v2.12.0

07 Oct 06:06
14b6b58

Change Log

Feature

  • Support bitsandbytes optimizers. (#211)
    • You can now install it with pip3 install pytorch-optimizer[bitsandbytes].
    • 8 bnb optimizers are supported (see the sketch after this list):
      • bnb_adagrad8bit, bnb_adam8bit, bnb_adamw8bit, bnb_lion8bit, bnb_lamb8bit, bnb_lars8bit, bnb_rmsprop8bit, bnb_sgd8bit.
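
A minimal sketch of selecting one of the bnb optimizers by name. It assumes that pytorch_optimizer's load_optimizer helper resolves the bnb_* names listed above; the model and hyperparameters are illustrative only:

```python
import torch
from torch import nn

from pytorch_optimizer import load_optimizer

# bitsandbytes 8-bit optimizers generally expect parameters on a CUDA device.
device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = nn.Linear(128, 10).to(device)

# Assumption: load_optimizer resolves the bnb_* names registered by the
# bitsandbytes extra (installed via pip3 install pytorch-optimizer[bitsandbytes]).
optimizer = load_optimizer('bnb_adamw8bit')(model.parameters(), lr=1e-3)

# standard training step
loss = model(torch.randn(4, 128, device=device)).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```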

Docs

Diff

2.11.2...2.12.0

pytorch-optimizer v2.11.2

02 Sep 05:57
25618d7

Change Log

Feature

Fix

  • Fix the Lookahead optimizer. (#200, #201, #202)
    • It now works with PyTorch Lightning, which expects your optimizer to be a subclass of torch.optim.Optimizer (see the sketch after this list).
  • Fix the default rectify to False in the AdaBelief optimizer. (#203)
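
A minimal sketch of the affected usage, assuming the Lookahead(optimizer, k, alpha) wrapper and the AdaBelief rectify flag exposed by pytorch_optimizer; after the fix, the wrapped optimizer passes the isinstance check against torch.optim.Optimizer that PyTorch Lightning relies on:

```python
import torch
from torch import nn

from pytorch_optimizer import AdaBelief, Lookahead

model = nn.Linear(16, 2)

# rectify now defaults to False; pass rectify=True explicitly if you still
# want the rectified (RAdam-style) update.
base_optimizer = AdaBelief(model.parameters(), lr=1e-3, rectify=True)

# Lookahead wraps a base optimizer; after the fix the wrapper is a subclass
# of torch.optim.Optimizer, so frameworks like PyTorch Lightning accept it.
optimizer = Lookahead(base_optimizer, k=5, alpha=0.5)
assert isinstance(optimizer, torch.optim.Optimizer)
```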

Test

  • Add DynamicLossScaler test case

Docs

  • Highlight the code blocks
  • Fix pepy badges

Contributions

Thanks to @georg-wolflein.

Diff

2.11.1...2.11.2

pytorch-optimizer v2.11.1

19 Jul 12:00
aaaf303

pytorch-optimizer v2.11.0

27 Jun 13:48
42df19c

pytorch-optimizer v2.10.1

13 Jun 06:37
baa65c4

Change Log

Feature

Fix

  • perturb wasn't multiplied by -step_size in the SWATS optimizer. (#179)
  • The Chebyshev step had a size of T, while the permutation has a size of 2^T. (#168, #181)

Diff

2.10.0...2.10.1

pytorch-optimizer v2.10.0

07 Jun 14:18
95eee86

Change Log

Feature

Diff

2.9.1...2.10.0

Contributions

Thanks to @i404788.

pytorch-optimizer v2.9.1

19 May 12:09
9427d3c

Change Log

Fix

  • Fix weight decay in Ranger21. (#170)

Diff

2.9.0...2.9.1

pytorch-optimizer v2.9.0

06 May 08:07
4dbfc23

Change Log

Feature

Docs

  • Fix the readthedocs build issue. (#156)
  • Move citations into a table. (#156)

Refactor

  • Refactor the validation logic. (#149, #150)
  • Rename the amsbound and amsgrad terms to ams_bound (see the sketch after this list). (#149)
  • Return the gradient instead of the parameter in AGC. (#149)
  • Refactor duplicated logic (e.g. rectified step size, AMSBound, AdamD, AdaNorm, weight decay) into re-usable functions. (#150)
  • Move pytorch_optimizer.experimental under pytorch_optimizer.*.experimental.
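
A minimal sketch of the renamed flag, assuming AdaBound exposes the unified ams_bound parameter after this refactor; the model and hyperparameters are illustrative only:

```python
from torch import nn

from pytorch_optimizer import AdaBound

model = nn.Linear(16, 2)

# The former amsbound / amsgrad flags are unified under `ams_bound`
# (parameter name assumed from the rename above).
optimizer = AdaBound(model.parameters(), lr=1e-3, final_lr=0.1, ams_bound=True)
```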

Diff

2.8.0...2.9.0

pytorch-optimizer v2.8.0

29 Apr 08:51
cdfe807