Notice: This resource is provided by a third-party author. Please review the code with AI tools or manually before use to ensure security and compatibility.
C++ · alibaba/MNN

MNN

MNN: A blazing-fast, lightweight inference engine battle-tested by Alibaba, powering high-performance on-device LLMs and Edge AI.

Score: 93.8/100
Stars: 14.5K · Forks: 2.2K
View on GitHub · Homepage →

Similar Projects

deeplake

Score: 86

Database for AI. Store Vectors, Images, Texts, Videos, etc. Use with LLMs/LangChain. Store, query, version, & visualize any AI data. Stream data in real-time to PyTorch/TensorFlow. https://activeloop.ai

C++ · Stars: 9.0K

serving

Score: 77

A flexible, high-performance serving system for machine learning models

C++ · Stars: 6.3K

LeanCopilot

Score: 79

LLMs as Copilots for Theorem Proving in Lean

C++ · Stars: 1.2K

model_server

Score: 85

A scalable inference server for models optimized with OpenVINO™

C++ · Stars: 836