# CF_Ai_Worker assignment app

A full-stack AI chat application built with Cloudflare Workers, Hono, D1, and Vue 3. The AI identifies as a professional assistant that is secretly a Yellow Maine Coon cat.


## Prerequisites

- Node.js (v18 or higher)
- npm
- Cloudflare Account (for AI model access)

Run `npx wrangler login` in your terminal to authenticate.

## Project Structure

```
cf_ai_worker/
├── backend/                # Cloudflare Worker & AI logic
│   ├── src/
│   │   └── index.ts        # Entry point: Hono API routes & ChatWorkflow class
│   ├── schema.sql          # D1 database table definitions
│   ├── wrangler.toml       # Cloudflare configuration (D1, Workflows, AI bindings)
│   └── package.json
│
├── frontend/               # Vue.js web application
│   ├── src/
│   │   ├── components/
│   │   │   └── chat.vue    # Main chat component (UI & API fetching)
│   │   ├── App.vue         # Root Vue component (layout & footer)
│   │   └── main.js
│   ├── .env                # Local environment variables (VITE_API_ENDPOINT)
│   ├── index.html
│   └── package.json
│
└── README.md
```

## Setup Instructions

### 1. Backend Setup (Cloudflare Worker)

Navigate to the backend directory and install dependencies:

```sh
cd backend
npm install
```

Initialize the local database. Even in local development, you must create the D1 table structure:

```sh
npx wrangler d1 execute chat-memory --local --file=./schema.sql
```

Start the backend server:

```sh
npx wrangler dev
```

The server should now be running at http://127.0.0.1:8787.

### 2. Frontend Setup (Vue.js)

Open a new terminal, navigate to the frontend directory, and install dependencies:

```sh
cd frontend
npm install
```

Configure environment variables by creating a `.env` file in the `frontend/` root folder:

```
VITE_API_ENDPOINT=http://127.0.0.1:8787
```

Start the frontend app:

```sh
npm run dev
```

The app should now be accessible at http://localhost:5173.
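With the server running, the frontend talks to the Worker using the `VITE_API_ENDPOINT` value. A minimal sketch of how the chat component might build that request; the `/api/chat` path and the request body shape are assumptions for illustration, not taken from the actual source:

```typescript
interface ChatRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

// Hypothetical helper: assemble the fetch arguments for a chat message.
function buildChatRequest(apiBase: string, userId: string, message: string): ChatRequest {
  return {
    // Strip a trailing slash so the base URL and path join cleanly.
    url: `${apiBase.replace(/\/$/, "")}/api/chat`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ userId, message }),
    },
  };
}

// In chat.vue this would be consumed roughly as:
//   const { url, init } = buildChatRequest(import.meta.env.VITE_API_ENDPOINT, userId, text);
//   const res = await fetch(url, init);
```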

## Key Features

- **Hono Framework**: lightweight, edge-native routing and CORS handling.
- **Cloudflare Workflows**: orchestrates the multi-step process (fetch history -> AI inference -> save to DB).
- **D1 Database**: persistent storage of user chat history to give the AI "memory."
- **Llama 3.3 70B**: AI model used via Workers AI.
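The "memory" amounts to prepending the stored D1 history and a persona system prompt to every inference call. A minimal sketch of that step; the exact prompt text and message shape are assumptions, modeled on the Workers AI chat format:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical helper: combine the persona prompt, saved history rows, and
// the new user message into the messages array for the model call.
function buildMessages(history: ChatMessage[], userMessage: string): ChatMessage[] {
  const system: ChatMessage = {
    role: "system",
    content: "You are a professional assistant who is secretly a yellow Maine Coon cat.",
  };
  return [system, ...history, { role: "user", content: userMessage }];
}

// Inside the workflow step, the Worker would then call something like:
//   const result = await env.AI.run("@cf/meta/llama-3.3-70b-instruct-fp8-fast", {
//     messages: buildMessages(history, userMessage),
//   });
```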

## How I would improve this if it were an actual product

Login and OAuth, of course. Currently the app uses a randomly generated ID saved in localStorage to identify the user and load their history; it's an easy approach I like to use for demos.

Caching recent responses to repeated queries, for example with Redis.
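As an illustration of the idea, here is an in-memory stand-in for what Redis would do; the class name and TTL are assumptions:

```typescript
// Minimal in-memory TTL cache illustrating the caching idea; in production
// this role would be played by Redis (or Cloudflare's Cache API).
class ResponseCache {
  private entries = new Map<string, { value: string; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  // Return the cached response, evicting it if the TTL has passed.
  get(prompt: string): string | undefined {
    const hit = this.entries.get(prompt);
    if (!hit) return undefined;
    if (Date.now() > hit.expiresAt) {
      this.entries.delete(prompt);
      return undefined;
    }
    return hit.value;
  }

  set(prompt: string, response: string): void {
    this.entries.set(prompt, { value: response, expiresAt: Date.now() + this.ttlMs });
  }
}

// Usage: check the cache before calling the model, store the answer after:
//   const cached = cache.get(userMessage);
//   if (cached) return cached;
```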

Load balancing.

More models, e.g. vision and speech-to-text models, for voice interaction and the ability to send and analyze pictures.

Additional tables for useful data and for the AI's personality (just an idea).
