Releases · Lightning-AI/litgpt
v0.4.0
What's Changed
- Set litdata < 0.2.6 by @carmocca in #1400
- Remove per-file CLIs by @carmocca in #1397
- Simillar -> Similar by @rasbt in #1405
- LitData: set `iterate_over_all` to False for `CombinedStreamingDataset` by @Andrei-Aksionov in #1404
- Allow multiline prompts by @rasbt in #1279
- Explain dataset options by @rasbt in #1407
- Support `no_sync` with Thunder FSDP by @carmocca in #1414
- Minimal Python example by @rasbt in #1410
- Fix bug where LitData doesn't use seed by @bradfordlynch in #1425
- Add prompt style mapping for llama3 by @davmacario in #1406
- Simplify code by @rasbt in #1429
- OptimizerArgs by @rasbt in #1409
- Fix optimizer init with fused=True by @carmocca in #1434
- Fix learning rate calculation in pretrain by @rasbt in #1435
- Align readme by @rasbt in #1438
- Pin litdata by @rasbt in #1440
- Fix README.md alignment by @rasbt in #1439
- Update README.md for one last time by @rasbt in #1442
- A more centered look by @rasbt in #1449
- New CLI by @rasbt in #1437
- Update error message by @rasbt in #1453
- Explain how to list all available models by @rasbt in #1455
- Detect tensor cores by @rasbt in #1456
- Check checkpoint_dir and add `checkpoints` to path by @rasbt in #1454
- Add MicroLlama training support by @keeeeenw in #1457
- Streaming for serving with chat's generate function by @rasbt in #1426
- Fix sequence length bug by @rasbt in #1462
- Add `lr_warmup_steps`, `max_steps` values validation by @shenxiangzhuang in #1460
- Fix issue where path in merge_lora is overwritten by @rasbt in #1465
- Option to skip expensive final validation by @rasbt in #1372
- Allow batch size "auto" setting in evaluate by @rasbt in #1469
- Warn users when there is a bnb mismatch by @rasbt in #1468
- Allow batch argument with batch recomputation by @rasbt in #1470
- LitGPT Python API draft by @rasbt in #1459 (see the sketch after this list)
- Bump version for PyPI release by @rasbt in #1476
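
The Python API draft (#1459) and the minimal Python example (#1410) point toward using LitGPT programmatically rather than through the CLI. The snippet below is a minimal sketch of that style of usage; it assumes the `LLM.load`/`generate` interface documented in later LitGPT releases, so the exact names in the 0.4.0 draft may differ.

```python
from litgpt import LLM

# Load a checkpoint by model name or checkpoint directory
# (assumes the weights were fetched beforehand, e.g. with `litgpt download`).
llm = LLM.load("microsoft/phi-2")

# Generate a completion for a single prompt.
text = llm.generate("What do llamas eat?", max_new_tokens=50)
print(text)
```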
New Contributors
- @bradfordlynch made their first contribution in #1425
- @davmacario made their first contribution in #1406
- @keeeeenw made their first contribution in #1457
Full Changelog: v0.3.1...v0.4.0
Development release 0.4.0.dev0
What's Changed
- Streaming for serving with chat's generate function by @rasbt in #1426
- Add MicroLlama training support by @keeeeenw in #1457
- Check checkpoint_dir and add `checkpoints` to path by @rasbt in #1454
- Detect tensor cores by @rasbt in #1456
- Explain how to list all available models by @rasbt in #1455
- Update error message by @rasbt in #1453
- New CLI by @rasbt in #1437
- A more centered look by @rasbt in #1449
- Update README.md for one last time by @rasbt in #1442
- Fix README.md alignment by @rasbt in #1439
- Pin litdata by @rasbt in #1440
- Align readme by @rasbt in #1438
- Fix learning rate calculation in pretrain by @rasbt in #1435
- Fix optimizer init with fused=True by @carmocca in #1434
- OptimizerArgs by @rasbt in #1409
- Simplify code by @rasbt in #1429
- Add prompt style mapping for llama3 by @davmacario in #1406
- Fix bug where LitData doesn't use seed by @bradfordlynch in #1425
- Minimal Python example by @rasbt in #1410
- Support `no_sync` with Thunder FSDP by @carmocca in #1414
- Explain dataset options by @rasbt in #1407
- Allow multiline prompts by @rasbt in #1279
- LitData: set `iterate_over_all` to False for `CombinedStreamingDataset` by @Andrei-Aksionov in #1404
- Simillar -> Similar by @rasbt in #1405
- Remove per-file CLIs by @carmocca in #1397
- Set litdata < 0.2.6 by @carmocca in #1400
Full Changelog: View on GitHub