Posted by manitcor to Machine Learning - Learning/Language Models @lemmy.intai.tech

MPT-30B: Raising the bar for open-source foundation models

Link: www.mosaicml.com

Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on NVIDIA H100 Tensor Core GPUs.
