ColossalAI/colossalai/engine

Latest commit: e4685832f8 by Frank Lee, 3 years ago
[engine] fixed bug in gradient accumulation dataloader to keep the last step (#1030)
| Name | Last commit | Last update |
| --- | --- | --- |
| gradient_accumulation | [engine] fixed bug in gradient accumulation dataloader to keep the last step (#1030) | 3 years ago |
| gradient_handler | [doc] improved docstring and assertion messages for the engine module (#871) | 3 years ago |
| ophooks | [refactor] moving memtracer to gemini (#801) | 3 years ago |
| paramhooks | [doc] improved docstring and assertion messages for the engine module (#871) | 3 years ago |
| schedule | [pipelinable] use pipelinable to support GPT model. (#903) | 3 years ago |
| __init__.py | fix format (#580) | 3 years ago |
| _base_engine.py | [doc] improved docstring and assertion messages for the engine module (#871) | 3 years ago |
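The commit referenced by the `gradient_accumulation` entry (#1030) points at a common pitfall in gradient-accumulation dataloaders: if the wrapper truncates the batch stream to a whole number of accumulation windows, the trailing batches are silently dropped. Below is a minimal, hypothetical PyTorch-style sketch of the idea, not ColossalAI's actual implementation; the class name `GradAccumLoader` and its interface are assumptions for illustration only.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset


class GradAccumLoader:
    """Group batches into accumulation windows (hypothetical illustration).

    Instead of truncating to an exact multiple of `accum_size` (which
    would silently drop trailing batches), the last, possibly shorter
    window is kept -- the behaviour the #1030 fix describes.
    """

    def __init__(self, dataloader, accum_size):
        self.dataloader = dataloader
        self.accum_size = accum_size
        # Ceiling division counts the final partial window as a step.
        self.steps = (len(dataloader) + accum_size - 1) // accum_size

    def __len__(self):
        return self.steps

    def __iter__(self):
        window = []
        for batch in self.dataloader:
            window.append(batch)
            if len(window) == self.accum_size:
                yield window
                window = []
        if window:  # keep the last, shorter window instead of dropping it
            yield window


# Usage: accumulate gradients over each window, step once per window.
dataset = TensorDataset(torch.randn(10, 4), torch.randn(10, 1))
loader = GradAccumLoader(DataLoader(dataset, batch_size=3), accum_size=2)
model = torch.nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for window in loader:
    opt.zero_grad()
    for x, y in window:
        loss = torch.nn.functional.mse_loss(model(x), y)
        # Normalize by the actual window size so the shorter last
        # window is not over-weighted.
        (loss / len(window)).backward()
    opt.step()
```

Normalizing by `len(window)` rather than a fixed `accum_size` is what keeps the final partial window statistically consistent with the full ones.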