Ollama Now Powered by MLX on Apple Silicon in Preview

Ollama has released a preview version that leverages Apple’s MLX machine learning framework. The update aims to deliver the fastest performance yet for running Ollama on Apple Silicon devices, accelerating local AI model execution.

Source: Ollama