letta/local_llm

grammars/
koboldcpp/
llamacpp/
llm_chat_completion_wrappers/
lmstudio/
ollama/
settings/
vllm/
webui/
__init__.py
chat_completion_proxy.py
constants.py
function_parser.py
json_parser.py
README.md
utils.py

Letta + local LLMs

See https://letta.readme.io/docs/local_llm for documentation on running Letta with custom LLM backends.
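For orientation only, the sketch below shows the kind of request a locally hosted, OpenAI-compatible chat-completions backend (for example vLLM or LM Studio) accepts. It is not Letta's own configuration or API; the endpoint URL, port, and model name are placeholders, and the supported setup is described at the docs link above.

```python
# Illustrative only: a raw chat-completion call against a local,
# OpenAI-compatible server. URL, port, and model name are placeholders;
# configure Letta itself per the documentation linked above.
import requests

LOCAL_ENDPOINT = "http://localhost:8000/v1/chat/completions"  # assumed local server

payload = {
    "model": "local-model",  # whatever model the local backend has loaded
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello from a local backend!"},
    ],
    "temperature": 0.7,
}

response = requests.post(LOCAL_ENDPOINT, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```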