letta/local_llm

Subdirectories:
    grammars
    koboldcpp
    llamacpp
    llm_chat_completion_wrappers
    lmstudio
    ollama
    settings
    vllm
    webui

Files:
    __init__.py
    chat_completion_proxy.py
    constants.py
    function_parser.py
    json_parser.py
    README.md
    utils.py

Letta + local LLMs

See https://letta.readme.io/docs/local_llm for documentation on running Letta with custom LLM backends.
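For orientation, here is a minimal sketch of what pointing Letta at one of these local backends can look like in code. It assumes the `LLMConfig` schema from `letta/schemas/llm_config.py`; the exact field names, the example model tag, and the endpoint URL are illustrative and may differ across versions, so treat the linked documentation as the authoritative reference.

```python
# Sketch only: wiring Letta to a local Ollama server via LLMConfig.
# The import path and field names follow letta/schemas/llm_config.py and may
# change between versions -- see the docs link above for current usage.
from letta.schemas.llm_config import LLMConfig

local_llm_config = LLMConfig(
    model="dolphin2.2-mistral:7b-q6_K",       # illustrative model tag served by the backend
    model_endpoint_type="ollama",             # backend type; maps to the subdirectories above
    model_endpoint="http://localhost:11434",  # URL of the local inference server
    model_wrapper="chatml",                   # prompt formatter from llm_chat_completion_wrappers/
    context_window=8192,                      # should not exceed the backend's context length
)
```

The `model_endpoint_type` values correspond to the backend-specific subdirectories in this package (ollama, llamacpp, koboldcpp, lmstudio, vllm, webui).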