BlinkDL/RWKV-LM (Python)

RWKV (pronounced "RwaKuv") is an RNN with strong LLM performance that can also be trained directly like a GPT transformer (parallelizable). The current version is RWKV-7 "Goose". It combines the best of RNNs and transformers: strong performance, linear-time inference, constant memory (no KV cache), fast training, unbounded context length (ctx_len), and free sentence embeddings.
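The "constant space, no KV cache" claim can be illustrated with a toy recurrence: instead of storing every past key and value, the model folds each token into a fixed-size state. The sketch below is a simplified linear-attention-style update, not the actual RWKV-7 kernel; the dimensions and the per-channel decay `w` are illustrative assumptions.

```python
import numpy as np

d = 8                     # head dimension (illustrative, not RWKV's real size)
rng = np.random.default_rng(0)
w = np.exp(-np.exp(rng.normal(size=d)))   # per-channel decay in (0, 1), an assumption

# Fixed-size recurrent state: O(d^2) memory, independent of sequence length.
state = np.zeros((d, d))

def step(state, k, v, q):
    """One token of recurrent decoding: decay the state, add new info, read it out."""
    state = w[:, None] * state + np.outer(k, v)  # decay old memory, write key-value outer product
    out = q @ state                              # readout: query mixes the stored memory
    return state, out

# Process 100 tokens; unlike a transformer's KV cache, memory never grows with T.
T = 100
for _ in range(T):
    k, v, q = rng.normal(size=(3, d))
    state, out = step(state, k, v, q)

print(state.shape, out.shape)
```

Because each step touches only the `(d, d)` state, decoding cost per token is constant, which is the property the description refers to as linear time and constant space.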

Score: 85.1/100
Stars: 14.4K · Forks: 993
View on GitHub

Similar Projects

text-generation-inference (Score: 75)

Large Language Model Text Generation Inference

Python · 10.8K stars

ChatRWKV (Score: 68)

ChatRWKV is like ChatGPT but powered by the RWKV (100% RNN) language model, and open source.

Python · 9.5K stars

LMFlow (Score: 81)

An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.

Python · 8.5K stars

xTuring (Score: 87)

Build, personalize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6

Python · 2.7K stars