# Letta + local LLMs

See https://letta.readme.io/docs/local_llm for documentation on running Letta with custom LLM backends.
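As a rough sketch of what the linked docs cover, the usual flow is: start a local inference backend (e.g. Ollama), tell Letta where it lives, then run an agent. The model name, prompts, and exact flag names below are illustrative assumptions — consult the documentation above for the authoritative steps for your Letta version and backend.

```shell
# Hedged sketch: running Letta against a local Ollama backend.
# Steps and prompt wording may differ by Letta version; see the docs link above.

# 1. Start the local backend (Ollama assumed installed).
#    The model tag here is an example, not a requirement.
ollama pull dolphin2.2-mistral:7b-q6_K
ollama serve &

# 2. Point Letta at the backend via the interactive configurator.
letta configure
#   backend type: ollama
#   endpoint:     http://localhost:11434
#   model:        dolphin2.2-mistral:7b-q6_K

# 3. Chat with an agent through the local model.
letta run
```

The same pattern applies to the other backends in this directory (koboldcpp, llamacpp, lmstudio, vllm, webui): start the server, then supply its endpoint and type during configuration.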