diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md
index 6988a2e8b..ff63f2ac0 100644
--- a/.github/ISSUE_TEMPLATE/bug_report.md
+++ b/.github/ISSUE_TEMPLATE/bug_report.md
@@ -24,7 +24,7 @@ If applicable, add screenshots to help explain your problem.
 Add any other context about the problem here.
 
 **Letta Config**
-Please attach your `~/.letta/config` file or copy past it below.
+Please attach your `~/.letta/config` file or copy paste it below.
 
 ---
 
diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md
index 69acf6ac1..8035af383 100644
--- a/.github/pull_request_template.md
+++ b/.github/pull_request_template.md
@@ -8,7 +8,7 @@ How can we test your PR during review? What commands should we run? What outcome
 Have you tested the latest commit on the PR? If so please provide outputs from your tests.
 
 **Related issues or PRs**
-Please link any related GitHub [issues](https://github.com/cpacker/Letta/issues) or [PRs](https://github.com/cpacker/Letta/pulls).
+Please link any related GitHub [issues](https://github.com/letta-ai/letta/issues) or [PRs](https://github.com/letta-ai/letta/pulls).
 
 **Is your PR over 500 lines of code?**
 If so, please break up your PR into multiple smaller PRs so that we can review them quickly, or provide justification for its length.
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index a55d87a19..0d8f16f7a 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -112,14 +112,14 @@ poetry run black . -l 140
 
 You're almost there! It's time to share your brilliance with the world. 🌍
 
-1. Visit [Letta](https://github.com/cpacker/letta).
+1. Visit [Letta](https://github.com/letta-ai/letta).
 2. Click "New Pull Request" button.
 3. Choose the base branch (`main`) and the compare branch (your feature branch).
 4. Whip up a catchy title and describe your changes in the description. 🪄
 
 ## 6. 🔍 Review and Approval
 
-The maintainers, will take a look and might suggest some cool upgrades or ask for more details. Once they give the thumbs up, your creation becomes part of Letta!
+The maintainers will take a look and might suggest some cool upgrades or ask for more details. Once they give the thumbs up, your creation becomes part of Letta!
 
 ## 7. 📜 Code of Conduct
diff --git a/README.md b/README.md
index b73622f6a..9ccb2a505 100644
--- a/README.md
+++ b/README.md
@@ -85,7 +85,7 @@ Once the Letta server is running, you can access it via port `8283` (e.g. sendin
 > [!NOTE]
 > The Letta ADE is a graphical user interface for creating, deploying, interacting and observing with your Letta agents.
 >
-> For example, if you're running a Letta server to power an end-user application (such as a customer support chatbot), you can use the ADE to test, debug, and observe the agents in your server. You can also use the ADE as a general chat interface to interacting with your Letta agents.
+> For example, if you're running a Letta server to power an end-user application (such as a customer support chatbot), you can use the ADE to test, debug, and observe the agents in your server. You can also use the ADE as a general chat interface to interact with your Letta agents.
@@ -139,7 +139,7 @@ No, you can install Letta using `pip` (via `pip install -U letta`), as well as f
 Letta gives your agents persistence (they live indefinitely) by storing all your agent data in a database. Letta is designed to be used with a [PostgreSQL](https://en.wikipedia.org/wiki/PostgreSQL) (the world's most popular database), however, it is not possible to install PostgreSQL via `pip`, so the `pip` install of Letta defaults to using [SQLite](https://www.sqlite.org/). If you have a PostgreSQL instance running on your own computer, you can still connect Letta (installed via `pip`) to PostgreSQL by setting the environment variable `LETTA_PG_URI`.
 
-**Database migrations are not officially supported for Letta when using SQLite**, so you would like to ensure that if you're able to upgrade to the latest Letta version and migrate your Letta agents data, make sure that you're using PostgreSQL as your Letta database backend. Full compatability table below:
+**Database migrations are not officially supported for Letta when using SQLite**, so if you would like to ensure that you're able to upgrade to the latest Letta version and migrate your Letta agents data, make sure that you're using PostgreSQL as your Letta database backend. Full compatability table below:
 
 | Installation method | Start server command | Database backend | Data migrations supported? |
 |---|---|---|---|
@@ -211,7 +211,7 @@ Hit enter to begin (will request first Letta message)
 
 ## ⚡ Quickstart (pip)
 
 > [!WARNING]
-> **Database migrations are not officially support with `SQLite`**
+> **Database migrations are not officially supported with `SQLite`**
 >
 > When you install Letta with `pip`, the default database backend is `SQLite` (you can still use an external `postgres` service with your `pip` install of Letta by setting `LETTA_PG_URI`).
 >
@@ -221,7 +221,7 @@ Hit enter to begin (will request first Letta message)
 View instructions for installing with pip
 
-You can also install Letta with `pip`, will default to using `SQLite` for the database backends (whereas Docker will default to using `postgres`).
+You can also install Letta with `pip`, which will default to using `SQLite` for the database backends (whereas Docker will default to using `postgres`).
 
 ### Step 1 - Install Letta using `pip`
 ```sh
@@ -295,7 +295,7 @@ Letta is an open source project built by over a hundred contributors. There are
 * **Contribute to the project**: Interested in contributing? Start by reading our [Contribution Guidelines](https://github.com/cpacker/MemGPT/tree/main/CONTRIBUTING.md).
 * **Ask a question**: Join our community on [Discord](https://discord.gg/letta) and direct your questions to the `#support` channel.
-* **Report ssues or suggest features**: Have an issue or a feature request? Please submit them through our [GitHub Issues page](https://github.com/cpacker/MemGPT/issues).
+* **Report issues or suggest features**: Have an issue or a feature request? Please submit them through our [GitHub Issues page](https://github.com/cpacker/MemGPT/issues).
 * **Explore the roadmap**: Curious about future developments? View and comment on our [project roadmap](https://github.com/cpacker/MemGPT/issues/1533).
 * **Join community events**: Stay updated with the [event calendar](https://lu.ma/berkeley-llm-meetup) or follow our [Twitter account](https://twitter.com/Letta_AI).
diff --git a/examples/docs/rest_client.py b/examples/docs/rest_client.py
index d0db8c4d8..c61577c2d 100644
--- a/examples/docs/rest_client.py
+++ b/examples/docs/rest_client.py
@@ -31,7 +31,7 @@ def main():
     # Send a message to the agent
     print(f"Created agent: {agent_state.name} with ID {str(agent_state.id)}")
     response = client.user_message(agent_id=agent_state.id, message="Whats my name?")
-    print(f"Recieved response:", response.messages)
+    print(f"Received response:", response.messages)
 
     # Delete agent
     client.delete_agent(agent_id=agent_state.id)
diff --git a/examples/notebooks/Customizing memory management.ipynb b/examples/notebooks/Customizing memory management.ipynb
index 28ca47e78..64ceb8eb4 100644
--- a/examples/notebooks/Customizing memory management.ipynb
+++ b/examples/notebooks/Customizing memory management.ipynb
@@ -499,7 +499,7 @@
     "response = client.send_message(\n",
     " agent_id=task_agent_state.id, \n",
     " role=\"user\", \n",
-    " message=\"Add 'start calling me Charles' and 'tell me a haiku about my name' as two seperate tasks.\"\n",
+    " message=\"Add 'start calling me Charles' and 'tell me a haiku about my name' as two separate tasks.\"\n",
     ")\n",
     "response"
    ]
diff --git a/examples/personal_assistant_demo/README.md b/examples/personal_assistant_demo/README.md
index e97497f90..bc3adf434 100644
--- a/examples/personal_assistant_demo/README.md
+++ b/examples/personal_assistant_demo/README.md
@@ -261,7 +261,7 @@ soon! 🙌",
 
 Then inside WhatsApp (or SMS if you used Twilio SMS):
 
-image
+image
 
 Then I sent a dummy email:
 ```
@@ -276,4 +276,4 @@ whatever time works best for you
 
 Follow-up inside WhatsApp:
 
-image
+image
diff --git a/examples/resend_example/README.md b/examples/resend_example/README.md
index 038c6d6ee..1f04a4aa3 100644
--- a/examples/resend_example/README.md
+++ b/examples/resend_example/README.md
@@ -51,15 +51,15 @@ def send_email(self, description: str):
 To create the tool in the dev portal, simply navigate to the tool creator tab, create a new tool called `send_email`, and copy-paste the above code into the code block area and press "Create Tool".
 
-image
+image
 
 Once you've created the tool, create a new agent and make sure to select `send_email` as an enabled tool.
 
-image
+image
 
 Now your agent should be able to call the `send_email` function when needed:
 
-image
+image
 
 ## Option 2 (CLI)
 
@@ -85,8 +85,8 @@ Create an agent with that preset (disable `--stream` if you're not using a strea
 letta run --preset resend_preset --persona sam_pov --human cs_phd --stream
 ```
 
-image
+image
 
 Waiting in our inbox:
 
-image
+image
diff --git a/letta/agent.py b/letta/agent.py
index 341b25fda..3e4d24432 100644
--- a/letta/agent.py
+++ b/letta/agent.py
@@ -1290,7 +1290,7 @@ class Agent(BaseAgent):
         # NOTE: a bit of a hack - we pull the timestamp from the message created_by
         memory_edit_timestamp = self._messages[0].created_at
 
-        # update memory (TODO: potentially update recall/archival stats seperately)
+        # update memory (TODO: potentially update recall/archival stats separately)
         new_system_message_str = compile_system_message(
             agent_id=self.agent_state.id,
             system_prompt=self.agent_state.system,
diff --git a/letta/local_llm/function_parser.py b/letta/local_llm/function_parser.py
index 18031bc4f..6dd788da2 100644
--- a/letta/local_llm/function_parser.py
+++ b/letta/local_llm/function_parser.py
@@ -32,7 +32,7 @@ def heartbeat_correction(message_history, new_message):
 
     If the last message in the stack is a user message and the new message is an assistant func call, fix the heartbeat
 
-    See: https://github.com/cpacker/Letta/issues/601
+    See: https://github.com/letta-ai/letta/issues/601
     """
     if len(message_history) < 1:
         return None
diff --git a/letta/server/server.py b/letta/server/server.py
index 3eb3207a2..4a48b2a2e 100644
--- a/letta/server/server.py
+++ b/letta/server/server.py
@@ -1056,7 +1056,7 @@ class SyncServer(Server):
                 config_copy[k] = server_utils.shorten_key_middle(v, chars_each_side=5)
 
             return config_copy
 
-        # TODO: do we need a seperate server config?
+        # TODO: do we need a separate server config?
         base_config = vars(self.config)
         clean_base_config = clean_keys(base_config)
diff --git a/letta/server/ws_api/server.py b/letta/server/ws_api/server.py
index 975bd0d20..e2408ddaa 100644
--- a/letta/server/ws_api/server.py
+++ b/letta/server/ws_api/server.py
@@ -33,7 +33,7 @@ class WebSocketServer:
         self.initialize_server()
 
         # Can play with ping_interval and ping_timeout
         # See: https://websockets.readthedocs.io/en/stable/topics/timeouts.html
-        # and https://github.com/cpacker/Letta/issues/471
+        # and https://github.com/letta-ai/letta/issues/471
         async with websockets.serve(self.handle_client, self.host, self.port):
             await asyncio.Future()  # Run forever
diff --git a/letta/settings.py b/letta/settings.py
index 8da6b3aac..20a0c1c50 100644
--- a/letta/settings.py
+++ b/letta/settings.py
@@ -91,7 +91,7 @@ class Settings(BaseSettings):
             return f"postgresql+pg8000://letta:letta@localhost:5432/letta"
 
     # add this property to avoid being returned the default
-    # reference: https://github.com/cpacker/Letta/issues/1362
+    # reference: https://github.com/letta-ai/letta/issues/1362
     @property
     def letta_pg_uri_no_default(self) -> str:
         if self.pg_uri:
diff --git a/tests/test_client.py b/tests/test_client.py
index 526559b79..f37fe8626 100644
--- a/tests/test_client.py
+++ b/tests/test_client.py
@@ -340,7 +340,7 @@ def test_messages(client: Union[LocalClient, RESTClient], agent: AgentState):
 
 def test_send_system_message(client: Union[LocalClient, RESTClient], agent: AgentState):
     """Important unit test since the Letta API exposes sending system messages, but some backends don't natively support it (eg Anthropic)"""
-    send_system_message_response = client.send_message(agent_id=agent.id, message="Event occured: The user just logged off.", role="system")
+    send_system_message_response = client.send_message(agent_id=agent.id, message="Event occurred: The user just logged off.", role="system")
 
     assert send_system_message_response, "Sending message failed"
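Taken together, the client calls touched in this diff (`user_message`, `send_message` with an explicit `role`, and `delete_agent`) outline the messaging surface of the Letta client against a locally running server on port `8283`. The snippet below is a minimal sketch only: the `RESTClient` import path, its constructor arguments, and the agent ID are assumptions that do not appear in the diff; only the method names and their keyword arguments come from the changed files.

```python
# Minimal sketch, not part of the diff above: it strings together the client
# calls exercised by examples/docs/rest_client.py and tests/test_client.py.
# The import path, constructor arguments, and agent ID below are assumptions;
# only the method names and their keyword arguments appear in the diff.
from letta import RESTClient  # assumed import location (tests type-hint Union[LocalClient, RESTClient])

# Assumed constructor: point the client at a locally running Letta server (README: port 8283)
client = RESTClient(base_url="http://localhost:8283")

agent_id = "agent-123"  # hypothetical ID of an agent that already exists on the server

# Regular user message, as in examples/docs/rest_client.py
response = client.user_message(agent_id=agent_id, message="Whats my name?")
print("Received response:", response.messages)

# System message: the case tests/test_client.py covers, since some model
# backends (e.g. Anthropic) don't natively support system messages
system_response = client.send_message(
    agent_id=agent_id,
    role="system",
    message="Event occurred: The user just logged off.",
)
assert system_response, "Sending message failed"

# Clean up, as the example script does
client.delete_agent(agent_id=agent_id)
```

Whether the server behind these calls uses the SQLite default or PostgreSQL (selected via the `LETTA_PG_URI` environment variable, e.g. a `postgresql+pg8000://...` URI like the default in `letta/settings.py`) does not change this client-side code.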