⚡ Bolt: field officer visit blockchain with O(1) verify #596
base: main
Changes from all commits
d348fdb, a135c83, 8dfcb14, 8ea6c15
@@ -0,0 +1,30 @@
+/home/jules/.pyenv/versions/3.12.13/lib/python3.12/site-packages/apscheduler/__init__.py:1: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
+  from pkg_resources import get_distribution, DistributionNotFound
+2026-03-26 15:17:56,481 - backend.adaptive_weights - INFO - Adaptive weights loaded/reloaded.
+2026-03-26 15:17:56,515 - backend.rag_service - INFO - Loaded 5 civic policies for RAG.
+/home/jules/.pyenv/versions/3.12.13/lib/python3.12/site-packages/pydub/utils.py:170: RuntimeWarning: Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work
+  warn("Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work", RuntimeWarning)
+INFO:     Started server process [16333]
+INFO:     Waiting for application startup.
+2026-03-26 15:17:56,898 - backend.main - INFO - Shared HTTP Client initialized.
+2026-03-26 15:17:56,898 - backend.main - INFO - Starting database initialization...
+2026-03-26 15:17:56,910 - backend.main - INFO - Base.metadata.create_all completed.
+2026-03-26 15:17:56,910 - backend.main - INFO - Database initialized successfully (migrations skipped for local dev).
+2026-03-26 15:17:56,910 - backend.main - INFO - Initializing grievance service...
+2026-03-26 15:17:56,910 - backend.main - INFO - Grievance service initialization skipped for local dev.
+2026-03-26 15:17:56,911 - backend.main - INFO - Scheduler skipped for local development
+INFO:     Application startup complete.
+INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
+2026-03-26 15:17:56,912 - backend.main - INFO - AI services initialized successfully.
+2026-03-26 15:17:56,913 - backend.main - INFO - Maharashtra data pre-loaded successfully.
+2026-03-26 15:17:56,914 - backend.main - INFO - Telegram bot initialization skipped for local testing.
+Starting server on port 8000
+🤖 AI Service Type: GEMINI
+INFO:     127.0.0.1:41058 - "GET /health HTTP/1.1" 200 OK
+INFO:     Shutting down
+INFO:     Waiting for application shutdown.
+2026-03-26 15:18:11,570 - backend.main - INFO - Shared HTTP Client closed.
+2026-03-26 15:18:11,570 - root - INFO - Bot thread is not initialized
+2026-03-26 15:18:11,570 - backend.main - INFO - Telegram bot thread stopped.
+INFO:     Application shutdown complete.
+INFO:     Finished server process [16333]
@@ -104,21 +104,38 @@ def generate_visit_hash(visit_data: dict) -> str:
         HMAC-SHA256 hash of visit data
     """
     try:
-        # Normalize check_in_time to ISO format string for determinism
+        # Normalize check_in_time for determinism
         check_in_time = visit_data.get('check_in_time')
         if isinstance(check_in_time, datetime):
-            check_in_time_str = check_in_time.isoformat()
+            # Normalize to UTC and format consistently without timezone string
+            # This ensures consistency even if DB strips timezone info
+            if check_in_time.tzinfo:
+                check_in_time = check_in_time.astimezone(timezone.utc).replace(tzinfo=None)
+            check_in_time_str = check_in_time.strftime('%Y-%m-%dT%H:%M:%S')
         elif isinstance(check_in_time, str):
+            # Try to parse and re-format for normalization
+            try:
+                # Handle ISO format with Z or +00:00
+                ts = check_in_time.replace('Z', '+00:00')
+                dt = datetime.fromisoformat(ts)
+                if dt.tzinfo:
+                    dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
+                check_in_time_str = dt.strftime('%Y-%m-%dT%H:%M:%S')
+            except Exception:
|
Comment on lines
+110
to
+124
|
||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Suggested change:
-            # Normalize to UTC and format consistently without timezone string
-            # This ensures consistency even if DB strips timezone info
-            if check_in_time.tzinfo:
-                check_in_time = check_in_time.astimezone(timezone.utc).replace(tzinfo=None)
-            check_in_time_str = check_in_time.strftime('%Y-%m-%dT%H:%M:%S')
-        elif isinstance(check_in_time, str):
-            # Try to parse and re-format for normalization
-            try:
-                # Handle ISO format with Z or +00:00
-                ts = check_in_time.replace('Z', '+00:00')
-                dt = datetime.fromisoformat(ts)
-                if dt.tzinfo:
-                    dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
-                check_in_time_str = dt.strftime('%Y-%m-%dT%H:%M:%S')
-            except Exception:
+            # Normalize to a canonical UTC ISO8601 representation with microseconds.
+            # If naive, treat as UTC; if aware, convert to UTC.
+            if check_in_time.tzinfo is None:
+                dt_utc = check_in_time.replace(tzinfo=timezone.utc)
+            else:
+                dt_utc = check_in_time.astimezone(timezone.utc)
+            check_in_time_str = dt_utc.isoformat(timespec="microseconds").replace("+00:00", "Z")
+        elif isinstance(check_in_time, str):
+            # Try to parse and re-format for normalization using the same UTC ISO8601 format.
+            try:
+                # Handle ISO format with Z or +00:00
+                ts = check_in_time.replace("Z", "+00:00")
+                dt = datetime.fromisoformat(ts)
+                if dt.tzinfo is None:
+                    dt_utc = dt.replace(tzinfo=timezone.utc)
+                else:
+                    dt_utc = dt.astimezone(timezone.utc)
+                check_in_time_str = dt_utc.isoformat(timespec="microseconds").replace("+00:00", "Z")
+            except Exception:
+                # Fall back to the original string if parsing fails to avoid data loss.
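The suggested normalization can be sketched as a standalone helper. This is a minimal sketch of the proposed behavior; the function name `normalize_check_in_time` is illustrative and not part of the PR:

```python
from datetime import datetime, timezone


def normalize_check_in_time(value):
    """Render a datetime or ISO8601 string as canonical UTC ISO8601 with microseconds.

    Naive datetimes are treated as UTC; aware ones are converted to UTC.
    Unparseable strings are returned unchanged to avoid data loss.
    """
    if isinstance(value, datetime):
        dt = value if value.tzinfo else value.replace(tzinfo=timezone.utc)
        return dt.astimezone(timezone.utc).isoformat(timespec="microseconds").replace("+00:00", "Z")
    if isinstance(value, str):
        try:
            # Handle ISO format with trailing Z or +00:00 offset.
            dt = datetime.fromisoformat(value.replace("Z", "+00:00"))
            dt = dt if dt.tzinfo else dt.replace(tzinfo=timezone.utc)
            return dt.astimezone(timezone.utc).isoformat(timespec="microseconds").replace("+00:00", "Z")
        except ValueError:
            return value
    return str(value)
```

A naive datetime, an aware one in any offset, and a `Z`-suffixed string all collapse to the same canonical form, which is what makes the hash deterministic across DB round-trips.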
@@ -1,24 +1,28 @@
-fastapi
-uvicorn
-python-dotenv
-sqlalchemy
-python-telegram-bot
-google-generativeai
-python-multipart
-psycopg2-binary
-huggingface-hub
-httpx
-pywebpush
-Pillow
-firebase-functions
-firebase-admin
-a2wsgi
-python-jose[cryptography]
-passlib[bcrypt]
-bcrypt<4.0.0
-SpeechRecognition
-pydub
+fastapi==0.111.0
+uvicorn==0.42.0
+python-dotenv==1.2.2
+sqlalchemy==2.0.48
+python-telegram-bot==22.7
+google-generativeai==0.8.6
+python-multipart==0.0.22
+psycopg2-binary==2.9.11
+huggingface-hub==0.36.2
+httpx>=0.27.2,<0.29.0
+pywebpush==2.3.0
+Pillow==12.1.1
+firebase-functions==0.5.0
+firebase-admin==6.8.0
+a2wsgi==1.10.10
+python-jose[cryptography]==3.5.0
+passlib[bcrypt]==1.7.4
+bcrypt==3.2.2
+SpeechRecognition==3.15.2
+pydub==0.25.1
+googletrans==4.0.2
-langdetect
-numpy
-scikit-learn
+langdetect==1.0.9
+numpy==2.4.3
+scikit-learn==1.8.0
+python-magic==0.4.27
+joblib==1.5.3
+pytest==9.0.2
+urllib3<2.0.0
@@ -20,16 +20,19 @@
     OfficerCheckOutRequest,
     FieldOfficerVisitResponse,
     PublicFieldOfficerVisitResponse,
+    BlockchainVerificationResponse,
     VisitHistoryResponse,
     VisitStatsResponse,
     VisitImageUploadResponse
 )
 from backend.geofencing_service import (
     is_within_geofence,
     generate_visit_hash,
     verify_visit_integrity,
     calculate_visit_metrics,
     get_geofencing_service
 )
+from backend.cache import visit_last_hash_cache

 logger = logging.getLogger(__name__)

@@ -92,6 +95,15 @@ def officer_check_in(request: OfficerCheckInRequest, db: Session = Depends(get_d
         radius_meters=request.geofence_radius_meters or 100.0
     )

+    # Blockchain feature: calculate integrity hash for the visit
+    # Performance Boost: Use thread-safe cache to eliminate DB query for last hash
+    prev_hash = visit_last_hash_cache.get("last_hash")
+    if prev_hash is None:
+        # Cache miss: Fetch only the last hash from DB
+        prev_visit = db.query(FieldOfficerVisit.visit_hash).order_by(FieldOfficerVisit.id.desc()).first()
+        prev_hash = prev_visit[0] if prev_visit and prev_visit[0] else ""
+        visit_last_hash_cache.set(data=prev_hash, key="last_hash")
Comment on lines +98 to +106:
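The commented lines follow a read-through pattern: check the cache, fall back to a single DB query on a miss, then populate the cache. A minimal sketch of that pattern, assuming a simple lock-based cache (`SimpleLastHashCache` is a hypothetical stand-in for `backend.cache.visit_last_hash_cache`, whose real implementation is not shown in this diff):

```python
import threading


class SimpleLastHashCache:
    """Minimal thread-safe key/value cache; illustrative stand-in for visit_last_hash_cache."""

    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}

    def get(self, key):
        with self._lock:
            return self._data.get(key)

    def set(self, data, key):
        with self._lock:
            self._data[key] = data


def get_prev_hash(cache, fetch_from_db):
    """Read-through lookup mirroring the check-in logic above."""
    prev_hash = cache.get("last_hash")
    if prev_hash is None:
        # Cache miss: hit the database exactly once, then populate the cache.
        prev_hash = fetch_from_db() or ""
        cache.set(data=prev_hash, key="last_hash")
    return prev_hash
```

Subsequent check-ins then read the previous hash without touching the database, which is where the claimed DB-query elimination comes from.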
     # Create visit record
     check_in_time = datetime.now(timezone.utc)
@@ -101,10 +113,11 @@ def officer_check_in(request: OfficerCheckInRequest, db: Session = Depends(get_d
         'check_in_latitude': request.check_in_latitude,
         'check_in_longitude': request.check_in_longitude,
         'check_in_time': check_in_time.isoformat(),
-        'visit_notes': request.visit_notes or ''
+        'visit_notes': request.visit_notes or '',
+        'previous_visit_hash': prev_hash
     }

-    # Generate immutable hash
+    # Generate immutable hash with chaining
     visit_hash = generate_visit_hash(visit_data)

     new_visit = FieldOfficerVisit(
@@ -123,13 +136,17 @@ def officer_check_in(request: OfficerCheckInRequest, db: Session = Depends(get_d
         visit_notes=request.visit_notes,
         status='checked_in',
         visit_hash=visit_hash,
+        previous_visit_hash=prev_hash,
         is_public=True
     )

     db.add(new_visit)
     db.commit()
     db.refresh(new_visit)

+    # Update cache after successful commit
+    visit_last_hash_cache.set(data=visit_hash, key="last_hash")

     logger.info(
         f"Officer {request.officer_name} checked in at issue {request.issue_id}. "
         f"Distance: {distance:.2f}m, Within fence: {within_fence}"
@@ -155,6 +172,8 @@ def officer_check_in(request: OfficerCheckInRequest, db: Session = Depends(get_d
         status=new_visit.status,
         verified_by=new_visit.verified_by,
         verified_at=new_visit.verified_at,
+        visit_hash=new_visit.visit_hash,
+        previous_visit_hash=new_visit.previous_visit_hash,
         is_public=new_visit.is_public,
         created_at=new_visit.created_at
     )
@@ -230,6 +249,8 @@ def officer_check_out(request: OfficerCheckOutRequest, db: Session = Depends(get
         status=visit.status,
         verified_by=visit.verified_by,
         verified_at=visit.verified_at,
+        visit_hash=visit.visit_hash,
+        previous_visit_hash=visit.previous_visit_hash,
         is_public=visit.is_public,
         created_at=visit.created_at
     )
@@ -484,3 +505,65 @@ def verify_visit(
     except Exception as e:
         logger.error(f"Error verifying visit {visit_id}: {e}", exc_info=True)
         raise HTTPException(status_code=500, detail="Verification failed")

+@router.get("/field-officer/{visit_id}/blockchain-verify", response_model=BlockchainVerificationResponse)

Suggested change:
-@router.get("/field-officer/{visit_id}/blockchain-verify", response_model=BlockchainVerificationResponse)
+@router.get("/field-officer/visit/{visit_id}/blockchain-verify", response_model=BlockchainVerificationResponse)
Copilot AI (Mar 26, 2026):
BlockchainVerificationResponse's field descriptions in backend/schemas.py are issue-specific ("issue integrity", "previous issue's hash"), but this endpoint uses it for visits. Either introduce a visit-specific response schema (preferred) or update the existing schema/docs so OpenAPI accurately describes the visit verification payload.
Copilot AI (Mar 26, 2026):
In the prev_exists failure branch, computed_hash is set to visit.visit_hash, which makes the response misleading (it no longer reflects the computed value for the payload). Compute the hash once (e.g., computed = generate_visit_hash(visit_data)) and reuse it for both is_valid and the response so clients can see what was actually recomputed.
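The compute-once fix this comment asks for can be sketched as follows. The endpoint's actual code is not in the visible diff, so the response fields and the `prev_exists` flag are assumptions taken from the comment, and `generate_visit_hash` here is an illustrative stand-in for the real service function:

```python
import hashlib
import hmac
import json
from types import SimpleNamespace


def generate_visit_hash(visit_data: dict) -> str:
    # Illustrative stand-in for backend.geofencing_service.generate_visit_hash.
    payload = json.dumps(visit_data, sort_keys=True, separators=(",", ":")).encode()
    return hmac.new(b"demo-secret", payload, hashlib.sha256).hexdigest()


def blockchain_verify(visit, prev_exists: bool, visit_data: dict) -> dict:
    # Compute the hash exactly once and reuse it for both the validity check
    # and the response, so computed_hash always shows what was recomputed --
    # never a copy of the stored value.
    computed = generate_visit_hash(visit_data)
    is_valid = prev_exists and hmac.compare_digest(computed, visit.visit_hash)
    return {
        "is_valid": is_valid,
        "stored_hash": visit.visit_hash,
        "computed_hash": computed,
    }


visit = SimpleNamespace(visit_hash=generate_visit_hash({"id": 1}))
result = blockchain_verify(visit, prev_exists=False, visit_data={"id": 1})
```

Even when `prev_exists` forces `is_valid` to False, the client still sees the genuinely recomputed hash and can compare it to the stored one.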
P2: Do not commit runtime log output files; remove this generated artifact from the PR and ignore it in VCS.