checkout to version v0.1.1
laekov committed Mar 1, 2021
1 parent 53b5b8c commit 0c3aa2c
Showing 2 changed files with 24 additions and 3 deletions.
21 changes: 21 additions & 0 deletions doc/release-note.md
@@ -1,3 +1,24 @@
## v0.1.1

### Distributed

- Broadcast data-parallel parameters before training.
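The idea behind this fix can be sketched in plain Python (a toy simulation, not FastMoE's actual code; a real implementation would call `torch.distributed.broadcast` from rank 0 over the data-parallel process group):

```python
# Toy simulation of broadcasting data-parallel parameters from rank 0.
# Each dict stands in for one process's model state; the real code would
# use torch.distributed.broadcast instead of in-process copying.

def broadcast_dp_params(rank_params):
    """Overwrite every rank's parameters with rank 0's copy, so all
    data-parallel replicas start training from identical weights."""
    source = rank_params[0]
    for rank in range(1, len(rank_params)):
        for name, value in source.items():
            rank_params[rank][name] = value
    return rank_params

# Ranks start with diverged weights (e.g. from independent random init).
params = [{"w": 0.1}, {"w": 0.9}, {"w": 0.5}]
broadcast_dp_params(params)
print(all(p["w"] == 0.1 for p in params))  # every replica now matches rank 0
```

Without this synchronization, independently initialized replicas would average gradients over inconsistent weights and silently diverge.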

### Megatron adaption

- Initialize `FMoELinear` parameters with a different seed on each model-parallel rank, even when Megatron uses the same random seed in every process.
- Use the proper communicator for model parallelism and data parallelism.
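A minimal sketch of the seeding idea behind the first bullet (the rank-offset scheme and function name here are illustrative, not FastMoE's exact formula): derive a distinct seed per model-parallel rank from the shared global seed, so expert weights differ across ranks even though Megatron seeds every process identically.

```python
import random

def init_expert_weights(global_seed, mp_rank, n):
    # Offset the shared seed by the model-parallel rank (an illustrative
    # scheme) so each rank draws a different random stream for its experts.
    rng = random.Random(global_seed + mp_rank)
    return [rng.uniform(-0.1, 0.1) for _ in range(n)]

w0 = init_expert_weights(1234, mp_rank=0, n=4)
w1 = init_expert_weights(1234, mp_rank=1, n=4)
assert w0 != w1                                        # ranks differ
assert w0 == init_expert_weights(1234, mp_rank=0, n=4)  # still reproducible
```

If every rank used the identical seed, the experts sharded across model-parallel ranks would all be initialized to the same values, defeating the point of having distinct experts.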

### Transformer-XL example

- Improve scripts.

### Misc

- Add logo and Slack workspace link.
- Add documentation in Chinese.
- Add figures explaining how FastMoE works.

## v0.1.0

### Functions
6 changes: 3 additions & 3 deletions setup.py
@@ -15,9 +15,9 @@

if __name__ == '__main__':
setuptools.setup(
-        name='fmoe',
-        version='0.1.0',
-        description='An efficient Mixture-of-Experts impl. for PyTorch',
+        name='fastmoe',
+        version='0.1.1',
+        description='An efficient Mixture-of-Experts system for PyTorch',
author='Jiaao He, Jiezhong Qiu and Aohan Zeng',
author_email='[email protected]',
license='Apache-2',
