- Includes the base model plus three fine-tuned variants: Instruct, Chat, and StoryWriter (a minimal loading sketch follows this list).
- The StoryWriter model supports a 65k-token context window, twice GPT-4's (and has even been run out to 84k tokens).
- Trained on open-source code in addition to text.
- Matches LLaMA-7B quality on benchmarks.
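The post doesn't show how to actually run the checkpoints, so here is a minimal sketch using Hugging Face `transformers`. It assumes the weights are published on the Hub under the `mosaicml` organization (e.g. `mosaicml/mpt-7b`) and that MPT's custom modeling code is pulled in via `trust_remote_code`; neither detail comes from the post itself.

```python
# Minimal sketch: load MPT-7B and generate text.
# Assumption: the base checkpoint lives at "mosaicml/mpt-7b" on the
# Hugging Face Hub; swap in e.g. "mosaicml/mpt-7b-storywriter" for the
# long-context variant if it is published under that name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "mosaicml/mpt-7b"  # assumed Hub repo name
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,   # halve memory vs. fp32
    trust_remote_code=True,       # MPT ships its own modeling code
)

inputs = tokenizer("MosaicML is", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```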
Official announcement:
https://mosaicml.com/blog/mpt-7b
----------------------
MosaicML
Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs
Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on…
----------------------
via AI News - Telegram Channel