penn-cnt/Pioneer_Chat_Interface

Setup Instructions: Local Web App (for Development)

Python 3.11+ and Docker are required.

  1. Ensure the ENV variable in .env is set to "LOCAL".

    Set OPENAI_API_KEY to your OpenAI API key.

    Optionally, change the DB_PASSWORD.
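
    For reference, step 1 might produce a .env fragment like the following. The variable names come from this README; the values are placeholders, and your .env may define additional variables as well:

```shell
# Example .env fragment for local development (placeholder values)
ENV=LOCAL
OPENAI_API_KEY=sk-your-key-here   # your OpenAI API key
DB_PASSWORD=changeme              # optional: override the default password
```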

  2. In your terminal, run the following commands:

    python3 -m venv .venv
    source .venv/bin/activate
    pip install -r requirements.txt
    docker-compose -f docker-compose-db.yml up -d
    chmod +x startup.sh
    ./startup.sh init-db
    
  3. In your web browser, go to the chat interface address: http://0.0.0.0:8000.

  • To update the web app with new changes, press CTRL+C in the terminal to end the process, then re-run ./startup.sh.
  • To reinitialize the database, run ./startup.sh with the init-db flag: ./startup.sh init-db.

Setup Instructions: Dockerized Web App (for Production)

Docker is required. Note that the fully Dockerized version currently does not perform as well as the development setup above and is not recommended.

  1. Ensure the ENV variable in .env is set to "DOCKER".

    Set OPENAI_API_KEY to your OpenAI API key.

    Optionally, change the DB_PASSWORD.

  2. In your terminal, run the following command:

    INIT_DB=true docker-compose up --build

  3. In your web browser, go to the chat interface address: http://0.0.0.0:8000.

  • To update the web app with new changes, run docker-compose down, then docker-compose up.

Interacting with the Chat Interface

  1. Simply type a message to the LLM in the bottom box, then press ENTER or click the send icon.
  • The system prompt for the LLM can be customized by modifying the system_prompt.txt file.

  • The LLM, parameters, and models used for classification tasks can be customized in the .env file.

    Moderation Info: moderation results for the message. Class scores are dependent, each on a scale of 0-1.

    Topic Classification Info: topic classification results for the message. Class scores are independent, each on a scale of 0-1.

  • The MESSAGE_HISTORY_LIMIT variable in the .env file can be tuned to adjust the "memory" of the LLM. It sets the number of message exchanges kept in the context window; since each exchange is one user message plus one response, the total number of messages retained is 2x this value.
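
  The effect of this limit can be sketched as follows (illustrative only — the app's actual implementation may differ; trim_history is a hypothetical helper, not part of this repository):

```python
# Sketch of how MESSAGE_HISTORY_LIMIT bounds the context window.
# Each exchange is one user message plus one assistant message, so a
# limit of N exchanges keeps the last 2*N messages.

def trim_history(messages, limit):
    """Keep only the most recent `limit` exchanges (2 * limit messages)."""
    return messages[-2 * limit:] if limit > 0 else []

# Ten messages = five exchanges; a limit of 3 keeps the last 6 messages.
history = [
    {"role": "user" if i % 2 == 0 else "assistant", "content": f"msg {i}"}
    for i in range(10)
]
trimmed = trim_history(history, 3)
assert len(trimmed) == 6
assert trimmed[0]["content"] == "msg 4"  # oldest retained message
```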

Viewing Data in the Database

User information and chat history are stored in a PostgreSQL database.

  1. Open a database viewer (such as pgAdmin).

  2. Set the connection parameters using the following .env variables:

    Hostname/address: DB_HOST_LOCAL

    Port: DB_HOST_PORT

    Username: DB_USER

    Password: DB_PASSWORD

  3. Connect to the database.

  • The users table contains user profile data.
  • The message_history table contains chat message history data.
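
  The same tables can also be queried programmatically. A minimal sketch using the .env variables above — note that the database name is not listed in this README, so the value below is a placeholder (check docker-compose-db.yml for the actual name), and the psycopg2 usage is an assumption, not part of this repository:

```python
# Sketch: assembling connection parameters from the .env variables.
# The dbname below is a PLACEHOLDER -- this README does not define it.
import os

def db_params():
    """Build a connection-parameter dict from the .env variables."""
    return {
        "host": os.environ.get("DB_HOST_LOCAL", "localhost"),
        "port": int(os.environ.get("DB_HOST_PORT", "5432")),
        "user": os.environ.get("DB_USER", "postgres"),
        "password": os.environ.get("DB_PASSWORD", ""),
        "dbname": "pioneer",  # placeholder: see docker-compose-db.yml
    }

# With psycopg2 installed, a connection would then look like:
# import psycopg2
# conn = psycopg2.connect(**db_params())
# cur = conn.cursor()
# cur.execute("SELECT * FROM message_history;")  # column names are
# print(cur.fetchall())                          # not listed in this README
```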

Chat Interface

About

Patient-LLM Chat Interface for the Pioneer Project
