diff --git a/burr/tracking/server/s3/deployment/Dockerfile b/burr/tracking/server/s3/deployment/Dockerfile
index 4a2683076..875c972ba 100644
--- a/burr/tracking/server/s3/deployment/Dockerfile
+++ b/burr/tracking/server/s3/deployment/Dockerfile
@@ -37,8 +37,8 @@ RUN apt-get update && apt-get install -y \
# Install the dependencies
# TODO -- use the right version
-#RUN pip install "git+https://github.com/dagworks-inc/burr.git@tracker-s3#egg=burr[tracking-server-s3]"
-RUN pip install "burr[tracking-server-s3]>=0.29.0"
+#RUN pip install "git+https://github.com/apache/burr.git@tracker-s3#egg=apache-burr[tracking-server-s3]"
+RUN pip install "apache-burr[tracking-server-s3]>=0.29.0"
# Copy the nginx config file
COPY nginx.conf /etc/nginx/nginx.conf
diff --git a/docs/getting_started/install.rst b/docs/getting_started/install.rst
index 0696c0eda..0f716c146 100644
--- a/docs/getting_started/install.rst
+++ b/docs/getting_started/install.rst
@@ -38,7 +38,7 @@ Instead, please install manually using the following command:
.. code-block:: bash
- poetry add loguru "burr[tracking-client,tracking-server,streamlit,graphviz,hamilton]"
+ poetry add loguru "apache-burr[tracking-client,tracking-server,streamlit,graphviz,hamilton]"
This is just the kitchen sink for getting started -- remember, burr is dependency-free/pure python!
diff --git a/examples/conversational-rag/graph_db_example/notebook.ipynb b/examples/conversational-rag/graph_db_example/notebook.ipynb
index 7625cbe28..872165939 100644
--- a/examples/conversational-rag/graph_db_example/notebook.ipynb
+++ b/examples/conversational-rag/graph_db_example/notebook.ipynb
@@ -15,7 +15,7 @@
"metadata": {},
"source": [
"# install requirements\n",
- "!pip install falkordb openai burr[graphviz] "
+ "!pip install falkordb openai apache-burr[graphviz] "
],
"outputs": []
},
diff --git a/examples/conversational-rag/simple_example/README.md b/examples/conversational-rag/simple_example/README.md
index 8e3297351..c620b5576 100644
--- a/examples/conversational-rag/simple_example/README.md
+++ b/examples/conversational-rag/simple_example/README.md
@@ -56,7 +56,7 @@ You'll then have a text terminal where you can interact. Type exit to stop.

# Video Walkthrough via Notebook
-Open the notebook
+Open the notebook
diff --git a/examples/custom-serde/README.md b/examples/custom-serde/README.md
index 41283b7d9..15c1b9bfd 100644
--- a/examples/custom-serde/README.md
+++ b/examples/custom-serde/README.md
@@ -38,7 +38,7 @@ pip install jupyter
jupyter notebook
```
-and running the notebook. Or
+and running the notebook. Or
.
diff --git a/examples/email-assistant/README.md b/examples/email-assistant/README.md
index c6ba36687..3f24756ba 100644
--- a/examples/email-assistant/README.md
+++ b/examples/email-assistant/README.md
@@ -48,7 +48,7 @@ Note we will be adding two things to this demo:
1. An integration with the burr web app
2. a standalone server example with a walkthrough
-Open the notebook
+Open the notebook
diff --git a/examples/hamilton-integration/notebook.ipynb b/examples/hamilton-integration/notebook.ipynb
index e4e061a57..f83a9b2af 100644
--- a/examples/hamilton-integration/notebook.ipynb
+++ b/examples/hamilton-integration/notebook.ipynb
@@ -24,14 +24,14 @@
"source": [
"# execute this cell to install dependencies in the current environment\n",
"# useful when using Google Colab notebooks\n",
- "%pip install sf-hamilton[visualization] requests openai lancedb burr[start,opentelemetry] pydantic pyarrow opentelemetry-instrumentation-openai opentelemetry-instrumentation-lancedb"
+ "%pip install sf-hamilton[visualization] requests openai lancedb apache-burr[start,opentelemetry] pydantic pyarrow opentelemetry-instrumentation-openai opentelemetry-instrumentation-lancedb"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
- "# Use the 2-layer approach for a maintainable RAG system [](https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/hamilton-integration/notebook.ipynb) [](https://github.com/apache/burr/blob/main/examples/hamilton-integration/notebook.ipynb)\n",
+ "# Use the 2-layer approach for a maintainable RAG system [](https://colab.research.google.com/github/apache/burr/blob/main/examples/hamilton-integration/notebook.ipynb) [](https://github.com/apache/burr/blob/main/examples/hamilton-integration/notebook.ipynb)\n",
"\n",
"Ready-made solutions can get you started with GenAI, but building reliable product features with retrieval augmented generation (RAG) and LLM agents inevitably requires custom code. This post shares the 2-layer approach to build a maintainable RAG application that will evolve with your needs. To illustrate these ideas, we will show how a typical RAG project might evolve.\n",
"\n",
diff --git a/examples/hello-world-counter/README.md b/examples/hello-world-counter/README.md
index 82b7f8d54..3901d7528 100644
--- a/examples/hello-world-counter/README.md
+++ b/examples/hello-world-counter/README.md
@@ -26,7 +26,7 @@ We have three files:
- [application.py](application.py) -- This contains a mainline to run the counter as well as a function to export the counter (for later use)
- [requirements.txt](requirements.txt) -- Just the requirements. All this needs is Burr/Streamlit
- [streamlit_app.py](streamlit_app.py) -- This contains a simple Streamlit app to interact with the counter.
-- [notebook.ipynb](notebook.ipynb) -- A notebook that shows the counter app too. Open the notebook
+- [notebook.ipynb](notebook.ipynb) -- A notebook that shows the counter app too. Open the notebook
diff --git a/examples/image-telephone/README.md b/examples/image-telephone/README.md
index ea72ab784..a382620cb 100644
--- a/examples/image-telephone/README.md
+++ b/examples/image-telephone/README.md
@@ -44,7 +44,7 @@ We recommend starting with the notebook.
### notebook.ipynb
You can use [notebook.ipynb](./notebook.ipynb) to run things. Or
-
+
diff --git a/examples/llm-adventure-game/README.md b/examples/llm-adventure-game/README.md
index 5de172c39..f221565cf 100644
--- a/examples/llm-adventure-game/README.md
+++ b/examples/llm-adventure-game/README.md
@@ -26,7 +26,7 @@ How to run:
OPENAI_API_KEY= python application.py
```
-Open the notebook
+Open the notebook
diff --git a/examples/multi-agent-collaboration/README.md b/examples/multi-agent-collaboration/README.md
index 97e63922b..41c7587b7 100644
--- a/examples/multi-agent-collaboration/README.md
+++ b/examples/multi-agent-collaboration/README.md
@@ -64,7 +64,7 @@ export TAVILY_API_KEY=YOUR_KEY
To run the example, you can do:
Run the notebook:
-
+
@@ -77,7 +77,7 @@ Application run:
or
Run the notebook:
-
+
diff --git a/examples/multi-agent-collaboration/hamilton/README.md b/examples/multi-agent-collaboration/hamilton/README.md
index 333be8772..123fc9179 100644
--- a/examples/multi-agent-collaboration/hamilton/README.md
+++ b/examples/multi-agent-collaboration/hamilton/README.md
@@ -52,7 +52,7 @@ export TAVILY_API_KEY=YOUR_KEY
```
Run the notebook:
-
+
or do it manually:
diff --git a/examples/multi-agent-collaboration/hamilton/notebook.ipynb b/examples/multi-agent-collaboration/hamilton/notebook.ipynb
index e91d2d5fc..975d6735e 100644
--- a/examples/multi-agent-collaboration/hamilton/notebook.ipynb
+++ b/examples/multi-agent-collaboration/hamilton/notebook.ipynb
@@ -43,7 +43,7 @@
},
"outputs": [],
"source": [
- "# %pip install -U burr[start] langchain-community langchain-core langchain-experimental openai sf-hamilton[visualization]"
+ "# %pip install -U apache-burr[start] langchain-community langchain-core langchain-experimental openai sf-hamilton[visualization]"
]
},
{
diff --git a/examples/multi-agent-collaboration/lcel/README.md b/examples/multi-agent-collaboration/lcel/README.md
index 8ea404412..de558266d 100644
--- a/examples/multi-agent-collaboration/lcel/README.md
+++ b/examples/multi-agent-collaboration/lcel/README.md
@@ -44,7 +44,7 @@ export TAVILY_API_KEY=YOUR_KEY
```
Run the notebook:
-
+
or do it manually:
diff --git a/examples/multi-agent-collaboration/lcel/notebook.ipynb b/examples/multi-agent-collaboration/lcel/notebook.ipynb
index 93e8d0036..8a616ba5b 100644
--- a/examples/multi-agent-collaboration/lcel/notebook.ipynb
+++ b/examples/multi-agent-collaboration/lcel/notebook.ipynb
@@ -42,7 +42,7 @@
}
},
"source": [
- "# %pip install -U burr[start] langchain-community langchain-core langchain-experimental openai"
+ "# %pip install -U apache-burr[start] langchain-community langchain-core langchain-experimental openai"
],
"outputs": []
},
diff --git a/examples/multi-modal-chatbot/README.md b/examples/multi-modal-chatbot/README.md
index 046086c0a..4c62fce18 100644
--- a/examples/multi-modal-chatbot/README.md
+++ b/examples/multi-modal-chatbot/README.md
@@ -45,7 +45,7 @@ We have a few files:
- [application.py](application.py) -- This contains a mainline to generate the graph portrayal.
- [requirements.txt](requirements.txt) -- Just the requirements. All this needs is Burr/Streamlit/openai
- [simple_streamlit_app.py](simple_streamlit_app.py) -- This contains a more sophisticated Streamlit app to interact with.
-- [notebook.ipynb](notebook.ipynb) -- A notebook that helps exercise things.
+- [notebook.ipynb](notebook.ipynb) -- A notebook that helps exercise things.
- [streamlit_app.py](streamlit_app.py) -- This contains a simple Streamlit app to interact with that is more
diff --git a/examples/opentelemetry/notebook.ipynb b/examples/opentelemetry/notebook.ipynb
index 49a4757b9..a6b6c2bb8 100644
--- a/examples/opentelemetry/notebook.ipynb
+++ b/examples/opentelemetry/notebook.ipynb
@@ -18,7 +18,7 @@
"outputs": [],
"source": [
"%%capture\n",
- "!pip install \"burr[start]\"\n",
+ "!pip install \"apache-burr[start]\"\n",
"!pip install opentelemetry-instrumentation-openai"
]
},
diff --git a/examples/other-examples/cowsay/README.md b/examples/other-examples/cowsay/README.md
index cb95339d6..748782545 100644
--- a/examples/other-examples/cowsay/README.md
+++ b/examples/other-examples/cowsay/README.md
@@ -26,7 +26,7 @@ We have three files:
- [application.py](application.py) -- This contains a mainline to run the cowsay app as well as a function to export the app (for later use)
- [requirements.txt](requirements.txt) -- Just the requirements. All this needs is Burr/Streamlit/cowsay
- [streamlit_app.py](streamlit_app.py) -- This contains a simple Streamlit app to interact with the cow
-- [notebook.ipynb](notebook.ipynb) -- A notebook that helps show things.
+- [notebook.ipynb](notebook.ipynb) -- A notebook that helps show things.
diff --git a/examples/parallelism/README.md b/examples/parallelism/README.md
index 884a3e551..b437eaabf 100644
--- a/examples/parallelism/README.md
+++ b/examples/parallelism/README.md
@@ -21,7 +21,7 @@
In this example we go over Burr's parallelism capabilities. It is based on the documentation (https://burr.apache.org/concepts/parallelism/), demonstrating the `MapStates` capabilities.
-See [the notebook](./notebook.ipynb) for the full example. Or
+See [the notebook](./notebook.ipynb) for the full example. Or
diff --git a/examples/parallelism/notebook.ipynb b/examples/parallelism/notebook.ipynb
index 8465295fb..7781ba19b 100644
--- a/examples/parallelism/notebook.ipynb
+++ b/examples/parallelism/notebook.ipynb
@@ -13,7 +13,7 @@
"id": "f4b744ec-ce8d-4e6b-b818-d86f6a869028",
"metadata": {},
"source": [
- "\n",
+ "\n",
"\n",
" \n",
"or view source\n",
@@ -29,7 +29,7 @@
"outputs": [],
"source": [
"# install some dependencies and a few more\n",
- "%pip install \"burr[start,opentelemetry]\" opentelemetry-instrumentation-openai openai anthropic"
+ "%pip install \"apache-burr[start,opentelemetry]\" opentelemetry-instrumentation-openai openai anthropic"
]
},
{
diff --git a/examples/simple-chatbot-intro/README.md b/examples/simple-chatbot-intro/README.md
index f541973ef..1541fdbc7 100644
--- a/examples/simple-chatbot-intro/README.md
+++ b/examples/simple-chatbot-intro/README.md
@@ -31,6 +31,6 @@ Run the notebook:
jupyter notebook
```
-Then open `notebook.ipynb` and run the cells. Or
+Then open `notebook.ipynb` and run the cells. Or
diff --git a/examples/simple-chatbot-intro/notebook.ipynb b/examples/simple-chatbot-intro/notebook.ipynb
index 9b475aa35..723fdcdf8 100644
--- a/examples/simple-chatbot-intro/notebook.ipynb
+++ b/examples/simple-chatbot-intro/notebook.ipynb
@@ -25,7 +25,7 @@
},
"outputs": [],
"source": [
- "!pip install \"burr[start]\""
+ "!pip install \"apache-burr[start]\""
]
},
{
diff --git a/examples/streaming-overview/README.md b/examples/streaming-overview/README.md
index bf3282a4e..ab74c30ae 100644
--- a/examples/streaming-overview/README.md
+++ b/examples/streaming-overview/README.md
@@ -38,6 +38,6 @@ which demonstrates how to use streaming async. We have not hooked this up
to a streamlit application yet, but that should be trivial.
## Notebook
-The notebook also shows how things work.
+The notebook also shows how things work.
diff --git a/examples/tracing-and-spans/burr_otel_demo.ipynb b/examples/tracing-and-spans/burr_otel_demo.ipynb
index 95d0958cb..6af503520 100644
--- a/examples/tracing-and-spans/burr_otel_demo.ipynb
+++ b/examples/tracing-and-spans/burr_otel_demo.ipynb
@@ -91,7 +91,7 @@
"outputs": [],
"source": [
"# install libs if you need them\n",
- "!pip install \"burr[start]\" openai opentelemetry-instrumentation-openai"
+ "!pip install \"apache-burr[start]\" openai opentelemetry-instrumentation-openai"
]
},
{