ColossalAI/colossalai/nn/parallel/layers/cache_embedding
Latest commit: [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699) by Jiarui Fang (21962e1593), 2 years ago
__init__.py                                          [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)  2 years ago
base_embedding.py                                    [FAW] export FAW in _ops (#1438)                                   2 years ago
cache_mgr.py                                         [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)  2 years ago
cached_embedding.py                                  [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)  2 years ago
copyer.py                                            [NFC] polish doc style for ColoTensor (#1457)                      2 years ago
embedding_config.py                                  [embedding] polish parallel embedding tablewise (#1545)            2 years ago
parallel_cached_embedding.py                         [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)  2 years ago
parallel_cached_embedding_tablewise.py               [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)  2 years ago
parallel_cached_embedding_tablewise_split_cache.py   [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)  2 years ago
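The file names above (`cache_mgr.py`, `cached_embedding.py`, `copyer.py`) and the FreqAwareEmbedding -> CachedEmbedding rename suggest a frequency-aware cache that keeps hot embedding rows resident while the full table lives in slower memory. The following is a minimal illustrative sketch of that idea only, not ColossalAI's actual API; the class name `CachedEmbeddingSketch`, its methods, and the least-frequently-used eviction policy are assumptions for demonstration.

```python
# Illustrative sketch of a frequency-aware row cache for an embedding
# table. NOT ColossalAI's implementation: names and the LFU eviction
# policy here are assumptions chosen for clarity.
class CachedEmbeddingSketch:
    def __init__(self, num_rows: int, dim: int, cache_rows: int):
        # Full table in "slow" memory (stands in for host-side weights).
        self.table = [[float(r)] * dim for r in range(num_rows)]
        self.cache_rows = cache_rows
        self.cache = {}  # row id -> row vector ("fast"/device copy)
        self.freq = {}   # row id -> access count

    def lookup(self, row: int):
        self.freq[row] = self.freq.get(row, 0) + 1
        if row not in self.cache:
            if len(self.cache) >= self.cache_rows:
                # Evict the least frequently used cached row.
                victim = min(self.cache, key=lambda r: self.freq[r])
                del self.cache[victim]
            # Simulate copying the row into the fast cache.
            self.cache[row] = self.table[row]
        return self.cache[row]


emb = CachedEmbeddingSketch(num_rows=10, dim=4, cache_rows=2)
emb.lookup(0)
emb.lookup(0)  # row 0 becomes "hot"
emb.lookup(1)
emb.lookup(2)  # cache is full, so the colder of rows 0/1 is evicted
```

After this sequence the cache holds at most two rows, and the frequently accessed row 0 survives eviction while row 1 does not.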