Mistral’s new Codestral Mamba to aid longer code generation

The company tested Codestral Mamba on in-context retrieval capabilities of up to 256k tokens, twice the amount seen in OpenAI's GPT-4o, and found its 7B version performing better than open-source models in several benchmarking tests, such as HumanEval, MBPP, Spider, and CruxE.

The larger 22B-parameter version of the new model also performed significantly better than CodeLlama-34B, except on the CruxE benchmark.


While the 7B version is available under the Apache 2.0 license, the larger 22B version is available under a commercial license for self-deployment or a community license for testing purposes.
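For readers who want to self-deploy the openly licensed 7B model, a minimal sketch along these lines should work with the Hugging Face transformers library. The model ID mistralai/Mamba-Codestral-7B-v0.1 and the need for a recent transformers release with Mamba-2 support are assumptions, not details from this article:

```python
# Minimal sketch: load the 7B model locally and ask it to complete a function.
# Assumes the weights are published on Hugging Face as
# "mistralai/Mamba-Codestral-7B-v0.1" and that your transformers version
# supports the Mamba-2 architecture (both are assumptions).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mamba-Codestral-7B-v0.1"  # assumed model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Prompt the model with the start of a function and let it generate the body.
prompt = "def fibonacci(n: int) -> int:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```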

