v24.5 (2024-09-13)
✨ Highlights
- Mojo and MAX are magical! We've created a new package and virtual environment manager, `magic`, for MAX and Mojo.
- New Llama 3.1 pipeline built with the new MAX Graph Python API.
- We're introducing not one, but two new Python APIs in this release: the MAX Graph API and the MAX Driver API.
⭐️ New
- Added `repeat_interleave` graph op.
- Added caching for MAX graph models. This means that graph compilation is cached and the executable model is retrieved from the cache on the second and subsequent runs. Note that the model cache is architecture-specific and isn't portable across different targets.
- Support for Python 3.12.
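As a rough illustration of what the new `repeat_interleave` op computes, here is a minimal pure-Python stand-in. This is not the MAX op's actual signature; it just sketches the conventional repeat-interleave semantics (as in `torch.repeat_interleave`), where each element is repeated in place rather than the whole sequence being tiled.

```python
def repeat_interleave(values, repeats):
    """Repeat each element `repeats` times, preserving order.

    Pure-Python sketch of the conventional semantics; the MAX graph op
    operates on tensors and its parameters may differ.
    """
    return [v for v in values for _ in range(repeats)]

# Each element is repeated in place...
print(repeat_interleave([1, 2, 3], 2))  # [1, 1, 2, 2, 3, 3]
# ...unlike tiling, which would give [1, 2, 3, 1, 2, 3].
```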
MAX Graph Python API
This Python API will ultimately provide the same low-level programming interface for high-performance inference graphs as the Mojo API. Like the Mojo API, it's an API for graph building only; it does not implement support for training.
You can take a look at how the API works in the MAX Graph Python API reference.
MAX Driver Python API
The MAX Driver API allows you to interact with devices (such as CPUs and GPUs) and allocate memory directly onto them. With this API, you interact with this memory as tensors.
Note that this API is still under development, with support for non-host devices, such as GPUs, planned for a future release.
To learn more, check out the MAX Driver Python API reference.
MAX C API
New APIs for adding torch metadata libraries:
- `M_setTorchMetadataLibraryPath`
- `M_setTorchMetadataLibraryPtr`
🦋 Changed
MAX Engine performance
- Compared to v24.4, MAX Engine v24.5 generates tokens for Llama an average of 15%-48% faster.
MAX C API
Simplified the API for adding torch library paths. It now takes only one path per API call, but can be called multiple times to add paths to the config:
- `M_setTorchLibraries` → `M_setTorchLibraryPath`
⚠️ Deprecated
- The `max` command line tool is no longer supported and will be removed in a future release.
❌ Removed
- Dropped support for Ubuntu 20.04. If you're using Ubuntu, we currently support Ubuntu 22.04 LTS only.
- Dropped support for Python 3.8.
- Removed built-in PyTorch libraries from the max package. See the FAQ for information on supported torch versions.