
Release v1.0.7 (#394)

Closed
ViktorSvertoka wants to merge 26 commits into main from develop

Conversation


ViktorSvertoka (Member) commented Mar 8, 2026

Summary by CodeRabbit

  • New Features

    • Blog platform migration to database with multi-language support (UK/EN/PL)
    • Google Pay payment option for Monobank checkout
    • Cloudinary-hosted blog images
  • Bug Fixes

    • Resolved 500 errors on blog posts
    • Fixed blog rendering layout issues
    • Improved order processing and payment status handling
    • Enhanced Nova Poshta shipping integration stability
  • Improvements

    • Optimized blog queries and added database indexes
    • Expanded error messaging and translation coverage across checkout and authentication flows
    • Removed Sanity CMS runtime dependency

ViktorSvertoka and others added 25 commits March 3, 2026 22:52
… migration (#390)

* fix(blog): resolve 500 error on blog post pages

* fix(blog): remove unsafe GROQ coalesce fallbacks and revert CDN

* feat(blog): add DB schema indexes + Sanity => PostgreSQL migration script

- Add FK indexes on blog_posts.author_id and blog_post_categories.category_id
- Add one-time migration script: fetches Sanity data via REST API,
  re-uploads images to Cloudinary, converts Portable Text → Tiptap JSON,
  inserts into 7 blog tables (4 categories, 3 authors, 21 posts)
- Drizzle migration 0028 for index changes

Closes #384, #385
… to PostgreSQL (#391)


* feat(blog): replace Sanity CMS with Drizzle query layer for all blog routes

Swap every blog page, API route, and header component from Sanity GROQ
queries to Drizzle ORM against PostgreSQL. Adds Tiptap JSON renderer,
shared text extraction, and typed query layer for posts/authors/categories.
Fixes runtime crash where /api/blog-author returned Portable Text bio
objects that React tried to render as children.

* fix: resolve eslint formatting issues

* fix(blog): enable proper Next image optimization and harden blog category handling
ViktorSvertoka self-assigned this Mar 8, 2026
ViktorSvertoka requested a review from AM1007 as a code owner March 8, 2026 08:02
ViktorSvertoka added the bug (Something isn't working), enhancement (New feature or request), and performance (Performance and efficiency optimizations without functional changes) labels Mar 8, 2026
ViktorSvertoka added the refactor (Code restructuring without functional changes) label Mar 8, 2026

vercel bot commented Mar 8, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

1 Skipped Deployment
Project: devlovers-net · Deployment: Ignored · Actions: Preview · Updated (UTC): Mar 8, 2026 8:25am



coderabbitai bot commented Mar 8, 2026

📝 Walkthrough


Version 1.0.7 introduces a comprehensive blog platform migration from Sanity CMS to PostgreSQL with Drizzle ORM, adds Monobank Google Pay payment support with new checkout flows, refactors Nova Poshta warehouse caching to use city references, enhances authentication form validation with internationalization, and expands webhook handling for wallet attribution tracking across multiple payment providers.

Changes

Cohort / File(s) | Summary
Blog Platform Migration (DB Schema)
frontend/db/schema/blog.ts, frontend/drizzle/0027_material_bloodscream.sql, frontend/drizzle/0028_bitter_wrecking_crew.sql
New comprehensive blog schema with tables for categories, authors, posts, translations, and post-category relationships; includes indexes on author_id and category_id for query performance; all tables use UUIDs and locale-based translations.
Blog Data Fetching Layer
frontend/db/queries/blog/blog-posts.ts, frontend/db/queries/blog/blog-categories.ts, frontend/db/queries/blog/blog-authors.ts
New query helpers replacing Sanity GROQ calls with database-backed functions; getBlogPosts, getBlogPostBySlug, getBlogPostsByCategory support locale-aware translations; getCachedBlogCategories provides memoized access with 7-day TTL.
Blog Content Rendering & Components
frontend/components/blog/BlogPostRenderer.tsx, frontend/app/[locale]/blog/[slug]/PostDetails.tsx, frontend/app/[locale]/blog/[slug]/page.tsx
Replaces portable-text rendering with new BlogPostRenderer component; PostDetails refactored to use DB query helpers and server-side rendering; page.tsx changed to force-dynamic with DB-backed metadata generation.
Blog UI Components & Types
frontend/components/blog/BlogFilters.tsx, frontend/components/blog/BlogCard.tsx, frontend/components/blog/BlogHeaderSearch.tsx, frontend/components/blog/BlogGrid.tsx, frontend/components/blog/BlogCategoryLinks.tsx
Updated blog data model from Sanity portable-text structure to plain-text/Tiptap JSON; Post type now uses id/slug strings and plain-text body; categories nested as objects with slug/title; removed BlogNavLinks component; extractPlainText utility added.
Blog Content Migration Script
frontend/db/seed-blog-migration.ts
One-time idempotent migration script that fetches data from Sanity REST API, transforms portable-text to Tiptap JSON, uploads images to Cloudinary, and seeds all blog tables with locale translations; handles category, author, and post migrations with image re-hosting.
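The Portable Text to Tiptap conversion can be sketched for the simplest case, plain paragraph blocks; the real script also handles marks, images, and other node types, so treat this as an illustrative assumption rather than the migration's actual transform:

```typescript
// Sketch: map Sanity Portable Text paragraph blocks onto Tiptap's JSON
// document shape ({ type: 'doc', content: [...] }).
interface PortableTextSpan {
  _type: 'span';
  text: string;
}

interface PortableTextBlock {
  _type: 'block';
  children: PortableTextSpan[];
}

interface TiptapDoc {
  type: 'doc';
  content: Array<{
    type: 'paragraph';
    content: Array<{ type: 'text'; text: string }>;
  }>;
}

function portableTextToTiptap(blocks: PortableTextBlock[]): TiptapDoc {
  return {
    type: 'doc',
    content: blocks.map(block => ({
      type: 'paragraph',
      // Each Portable Text span becomes a Tiptap text node.
      content: block.children.map(span => ({ type: 'text', text: span.text })),
    })),
  };
}
```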
Blog API Routes & Search
frontend/app/api/blog-author/route.ts, frontend/app/api/blog-search/route.ts, frontend/app/[locale]/blog/page.tsx, frontend/app/[locale]/blog/category/[category]/page.tsx
Replaced GROQ queries with database query helpers; blog-search endpoint now maps DB posts to simplified item structure; category page uses new getBlogPostsByCategory helper; all routes optimized with parallel data loading via Promise.all.
Monobank Google Pay Checkout
frontend/app/[locale]/shop/cart/CartPageClient.tsx, frontend/app/[locale]/shop/cart/page.tsx
Added monobankGooglePayEnabled prop and payment method state management; new resolveDefaultMethodForProvider and resolveMonobankGooglePayEnabled helpers; enhanced city lookup with normalization and parsing; conditional UI for monobank_google_pay option; idempotency tracking includes payment method.
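A hedged sketch of what `resolveMonobankGooglePayEnabled` might look like if it reads a boolean feature flag; the env variable name and accepted values below are assumptions, and the real helper may also consult provider config or order currency:

```typescript
// Hypothetical feature-flag parse: treat '1' and 'true' (case-insensitive)
// as enabled, everything else (including absence) as disabled.
function resolveMonobankGooglePayEnabled(
  env: Record<string, string | undefined>
): boolean {
  const raw = env.MONOBANK_GOOGLE_PAY_ENABLED?.trim().toLowerCase();
  return raw === '1' || raw === 'true';
}
```

Passing the resolved boolean down as a `monobankGooglePayEnabled` prop keeps the server the single source of truth and lets the client render the option conditionally.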
Monobank Google Pay API Routes
frontend/app/api/shop/orders/[id]/payment/monobank/google-pay/config/route.ts, frontend/app/api/shop/orders/[id]/payment/monobank/google-pay/submit/route.ts, frontend/app/api/shop/orders/[id]/payment/monobank/invoice/route.ts
New endpoints for Google Pay configuration, payment submission with idempotency, and invoice generation; all validate order payability, enforce currency/provider constraints, and return structured responses with status tokens and redirect URLs.
Monobank Google Pay Return Flow
frontend/app/[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx, frontend/app/[locale]/shop/checkout/return/monobank/MonobankReturnStatus.tsx, frontend/app/[locale]/shop/checkout/return/monobank/page.tsx
New client-side Google Pay handler with script loading, readiness detection, and submission logic; MonobankReturnStatus polls order status endpoint and navigates on payment completion; return page handles status token parsing and order lookup.
Monobank Wallet Service & Utilities
frontend/app/api/shop/orders/[id]/payment/monobank/_shared.ts, frontend/lib/services/orders/monobank-wallet.ts, frontend/lib/psp/monobank.ts
New shared utilities for Monobank payment validation, order state checks, and wallet payment support; comprehensive wallet payment service with idempotency, conflict detection, and PSP integration; added PSP_UPSTREAM error code for 429 rate limiting.
Payment Method Validation & Checkout
frontend/lib/shop/payments.ts, frontend/lib/validation/shop.ts, frontend/app/api/shop/checkout/route.ts
New PaymentMethod type (stripe_card, monobank_invoice, monobank_google_pay); validators for method-provider-currency combinations; checkout route extended to parse paymentMethod and enforce validation rules; idempotency hashing updated to include payment method.
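The method-provider-currency rule can be sketched as follows; the exact constraint set (for instance, Monobank methods being limited to UAH) is an assumption based on the summary above, not a statement of the project's actual rules:

```typescript
// Sketch of the PaymentMethod type and a combination validator.
type PaymentMethod = 'stripe_card' | 'monobank_invoice' | 'monobank_google_pay';
type PspProvider = 'stripe' | 'monobank';

function isValidMethodCombination(
  method: PaymentMethod,
  provider: PspProvider,
  currency: string
): boolean {
  // Assumed rules: card payments go through Stripe; both Monobank
  // methods require the Monobank provider and UAH.
  if (method === 'stripe_card') return provider === 'stripe';
  return provider === 'monobank' && currency === 'UAH';
}
```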
Order Service Enhancements
frontend/lib/services/orders/checkout.ts, frontend/lib/services/orders/_shared.ts
Extended order creation to support explicit payment method selection and storage in pspPaymentMethod/pspMetadata; added backfill logic for payment method data; improved idempotency contract to consider payment method.
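One way an idempotency contract can "consider the payment method" is to fold it into the request fingerprint, so the same idempotency key with a different method reads as a conflict rather than a replay; the field names and canonicalization below are illustrative assumptions:

```typescript
import { createHash } from 'node:crypto';

// Hypothetical checkout request shape; only the fields that should
// participate in the idempotency fingerprint.
interface CheckoutRequest {
  cartId: string;
  amountMinor: number;
  currency: string;
  paymentMethod: string;
}

function idempotencyFingerprint(req: CheckoutRequest): string {
  // Canonical, order-stable serialization so equal requests hash equally.
  const canonical = [
    req.cartId,
    req.amountMinor,
    req.currency,
    req.paymentMethod,
  ].join('|');
  return createHash('sha256').update(canonical).digest('hex');
}
```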
Stripe Webhook Wallet Attribution
frontend/app/api/shop/webhooks/stripe/route.ts
Added wallet attribution detection for Apple Pay and Google Pay; new resolveStripeWalletType helper; wallet metadata conditionally included in PSP metadata and payment events; refund handling enhanced to detect and skip id-only refund charges.
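Stripe reports wallet usage on the charge at `payment_method_details.card.wallet.type`; a sketch of a `resolveStripeWalletType` helper built on that field (the exact return shape and null-handling are assumptions about this codebase):

```typescript
// Narrow Stripe's wallet type string to the two wallets this PR tracks.
type StripeWalletType = 'apple_pay' | 'google_pay';

interface StripeChargeLike {
  payment_method_details?: {
    card?: { wallet?: { type?: string } | null } | null;
  } | null;
}

function resolveStripeWalletType(
  charge: StripeChargeLike
): StripeWalletType | null {
  const type = charge.payment_method_details?.card?.wallet?.type;
  return type === 'apple_pay' || type === 'google_pay' ? type : null;
}
```

Returning `null` for non-wallet card charges lets the webhook include wallet metadata only conditionally, as the summary describes.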
Monobank Webhook Processing
frontend/lib/services/orders/monobank-webhook.ts
Extended to track wallet attribution from payment attempt metadata; WalletAttribution type added (provider: monobank, type: google_pay); wallet data propagated through webhook processing and stored in order PSP metadata and payment events.
Nova Poshta City Reference Refactor
frontend/lib/services/shop/shipping/nova-poshta-catalog.ts, frontend/lib/services/shop/shipping/nova-poshta-client.ts, frontend/db/schema/shop.ts, frontend/drizzle/0029_shop_np_warehouses_city_ref_fk.sql, frontend/drizzle/0030_shop_np_warehouses_city_ref_fk_backfill.sql
Replaced settlementRef with cityRef throughout warehouse caching and retrieval; updated DB schema FK to point cityRef to npCities.ref; enhanced NP client with fallback city search and improved pagination retry logic; all API calls and caching use city_ref.
Shopping Cart & Shipping
frontend/app/api/shop/shipping/np/warehouses/route.ts, frontend/app/api/shop/internal/shipping/np/sync/route.ts
Updated warehouse and sync routes to use cityRef parameter; added debug payload builder for NP errors in non-production; warehouse sync cache calls changed from cacheWarehousesBySettlement to cacheWarehousesByCityRef.
Authentication Form Enhancements
frontend/components/auth/LoginForm.tsx, frontend/components/auth/ResetPasswordForm.tsx, frontend/components/auth/SignupForm.tsx, frontend/components/auth/fields/EmailField.tsx, frontend/components/auth/fields/NameField.tsx, frontend/components/auth/fields/PasswordField.tsx
Simplified error handling to use generic translated messages; added client-side validation with localized constraint messages; password field now imports and uses PASSWORD_MIN_LEN/PASSWORD_MAX_BYTES constants.
Header Component Type Updates
frontend/components/header/AppChrome.tsx, frontend/components/header/UnifiedHeader.tsx, frontend/components/header/DesktopNav.tsx, frontend/components/header/MainSwitcher.tsx, frontend/components/header/AppMobileMenu.tsx, frontend/components/header/MobileActions.tsx
Updated blogCategories type across all header components from { _id, title } to { id, slug, title }; removed slug generation logic (slugify); all category links now use pre-computed slug property.
Dashboard Profile Updates
frontend/components/dashboard/ProfileCard.tsx
Added localized validation messages and error handling; introduced form validation helpers for name and password constraints using i18n keys; improved UX feedback with client-side custom validity messages.
Blog Utilities
frontend/lib/blog/image.ts, frontend/lib/blog/text.ts
Removed isSanityAssetUrl check; shouldBypassImageOptimization now returns false unconditionally (Cloudinary-hosted images); new extractPlainText utility for Tiptap JSON to plain-text conversion.
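A minimal sketch of an `extractPlainText` over Tiptap JSON: walk the node tree and join text leaves. The real utility may use different separators between blocks, so the space-joining here is an assumption:

```typescript
// Tiptap JSON nodes either carry text directly (text leaves) or nest
// further nodes under `content` (doc, paragraph, heading, ...).
interface TiptapNode {
  type: string;
  text?: string;
  content?: TiptapNode[];
}

function extractPlainText(node: TiptapNode): string {
  if (node.text) return node.text;
  return (node.content ?? [])
    .map(extractPlainText)
    .filter(Boolean)
    .join(' ');
}
```

This is the kind of helper that lets search endpoints and card previews operate on plain text without rendering the full Tiptap document.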
Internationalization
frontend/messages/en.json, frontend/messages/uk.json, frontend/messages/pl.json
Added localization keys for Monobank Google Pay (labels, hints, fallback states), city/warehouse selection guidance, auth error messages (emailAlreadyInUse, serverError), password validation constraints, and dashboard profile saving feedback.
Drizzle Configuration & Migration Tracking
frontend/drizzle.config.ts, frontend/drizzle/meta/_journal.json
Refactored config as named const export; added journal entries for all new blog and shop migrations (indices, city_ref FK changes, backfill).
Middleware & Routing
frontend/proxy.ts
Added early return to bypass Next.js handling for /.well-known paths; updated matcher pattern to exclude well-known paths from routing logic.
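The `/.well-known` bypass reduces to a path predicate; a sketch under the assumption that the middleware returns early for these paths before any locale routing (the real `proxy.ts` also excludes them via its matcher pattern):

```typescript
// True for /.well-known itself and anything beneath it, e.g.
// /.well-known/apple-app-site-association; false for look-alikes.
function shouldBypassProxy(pathname: string): boolean {
  return pathname === '/.well-known' || pathname.startsWith('/.well-known/');
}
```

Checking both the exact path and the `/`-suffixed prefix avoids accidentally matching unrelated paths that merely start with the same characters.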
Test Suites (Blog)
Various test files with blog-related assertions
Updated test data models and assertions to reflect new blog schema (id/slug strings, plain-text body, nested categories); adjusted snapshots and query results accordingly.
Test Suites (Monobank Google Pay)
frontend/lib/tests/shop/monobank-google-pay-*.test.ts
New comprehensive test suites for Google Pay config, submit, and return flows; verify request/response shapes, feature flag enforcement, idempotency, error handling, and PSP interaction; mock environment and database state.
Test Suites (Monobank Wallet & Webhook)
frontend/lib/tests/shop/monobank-wallet-service.test.ts, frontend/lib/tests/shop/monobank-webhook-apply.test.ts
New tests validating wallet payment service idempotency, conflict detection, PSP calls, and wallet attribution propagation in webhook processing; expanded webhook tests to include wallet metadata tracking.
Test Suites (Nova Poshta)
frontend/lib/tests/shop/nova-poshta-client-network-failure.test.ts, frontend/lib/tests/shop/shipping-np-warehouses-route-p2.test.ts
Updated warehouse tests to use cityRef variants; mock getWarehousesByCityRef; validate CityRef-based error messages.
Test Suites (Checkout & Payment)
frontend/lib/tests/shop/checkout-monobank-*.test.ts, frontend/lib/tests/shop/stripe-webhook-*.test.ts
Extended checkout tests to validate payment method handling and idempotency with different methods; stripe webhook tests enhanced to verify wallet attribution storage and retrieval.
Version & Package Updates
frontend/package.json, studio/package.json
Bumped version from 1.0.6 to 1.0.7 in both packages.
Minor Import/Refactoring
Various files with reordering/cleanup
Multiple files had import reordering, unused import removals (e.g., crypto, groq, client, ne, sql), and minor format normalization without logic changes; includes files like quiz.ts, status/route.ts, orders files, shipping files, returns.ts, etc.
NP Mock Server
frontend/scripts/np-mock-server.mjs
Enhanced mock payloads to include cityRef and additional NP settlement/warehouse fields; added fallback settlement search logic with improved city data structures.

Sequence Diagram(s)

sequenceDiagram
    participant Client as Browser Client
    participant CartPage as CartPageClient
    participant ConfigAPI as GET /config<br/>(Google Pay Config)
    participant SubmitAPI as POST /submit<br/>(Payment Submit)
    participant Monobank as Monobank PSP
    participant ReturnPage as Return Status Page
    participant StatusAPI as GET /status<br/>(Order Status)

    Client->>CartPage: Select Monobank Google Pay
    CartPage->>ConfigAPI: Fetch Google Pay config<br/>(order, merchant ID, amount)
    ConfigAPI-->>CartPage: PaymentDataRequest

    CartPage->>CartPage: Load Google Pay SDK<br/>Show payment button
    Client->>CartPage: Tap "Pay with Google Pay"
    CartPage->>CartPage: Google Pay dialog opens<br/>User authenticates
    CartPage->>CartPage: Extract payment token

    CartPage->>SubmitAPI: POST with gToken<br/>+ idempotency key
    SubmitAPI->>SubmitAPI: Validate order payable<br/>Enforce currency/method
    SubmitAPI->>Monobank: submitWalletPayment(token)
    Monobank-->>SubmitAPI: Invoice ID & redirect URL<br/>(or 429/timeout)
    SubmitAPI->>SubmitAPI: Persist attempt result<br/>(submitted/unknown/rejected)
    SubmitAPI-->>CartPage: { redirectUrl, statusToken }

    CartPage->>ReturnPage: Redirect to return page<br/>with status token
    ReturnPage->>StatusAPI: Poll /status?view=lite<br/>+ statusToken
    StatusAPI-->>ReturnPage: Current payment status
    ReturnPage->>ReturnPage: Check: paid?<br/>Terminal state?
    alt Payment succeeded
        ReturnPage->>Client: Navigate to success
    else Still pending
        ReturnPage->>StatusAPI: Re-poll after delay
    else Terminal non-paid
        ReturnPage->>Client: Navigate to error
    end
sequenceDiagram
    participant Sanity as Sanity CMS<br/>(source)
    participant Migration as Migration Script
    participant CloudAPI as Cloudinary API
    participant DB as PostgreSQL
    participant DBSchema as blog_* Tables

    Migration->>Sanity: Fetch all categories<br/>(REST API)
    Sanity-->>Migration: Categories array
    Migration->>DB: Insert into blog_categories<br/>(id, slug, displayOrder)<br/>for each locale: translation
    DB-->>DBSchema: Rows inserted

    Migration->>Sanity: Fetch all authors<br/>with portable text bio
    Sanity-->>Migration: Authors array
    Migration->>CloudAPI: Upload author images<br/>to Cloudinary
    CloudAPI-->>Migration: Cloudinary URLs
    Migration->>DB: Insert into blog_authors<br/>+ blogAuthorTranslations<br/>with Cloudinary imageUrl
    DB-->>DBSchema: Rows inserted

    Migration->>Sanity: Fetch all posts<br/>with portable text body
    Sanity-->>Migration: Posts array
    Migration->>CloudAPI: Upload main images<br/>to Cloudinary
    CloudAPI-->>Migration: Cloudinary URLs
    Migration->>Migration: Convert portable text<br/>to Tiptap JSON<br/>per locale
    Migration->>DB: Insert blog_posts<br/>(id, slug, authorId, mainImage)
    DB-->>DBSchema: Rows inserted
    Migration->>DB: Insert blog_post_translations<br/>(title, body in Tiptap JSON)<br/>per locale
    DB-->>DBSchema: Translations inserted
    Migration->>DB: Insert blog_post_categories<br/>(postId, categoryId) junctions
    DB-->>DBSchema: Junctions inserted
    Migration-->>Migration: Idempotent:<br/>onConflictDoNothing

Estimated code review effort

🎯 5 (Critical) | ⏱️ ~120 minutes

Possibly related PRs

  • PR #390: Introduces the same blog database schema, Drizzle migrations (indices, blog tables), and the Sanity-to-Postgres migration script; directly overlaps on DB layer and migration infrastructure.
  • PR #392: Implements identical Monobank Google Pay flows, Nova Poshta city_ref refactoring, and wallet service/webhook enhancements; code-level overlap on payment routes, CartPageClient, and shipping catalog updates.
  • PR #368: Modifies the same Nova Poshta client, warehouse caching, and city/settlement reference logic; overlaps on nova-poshta-catalog.ts and related warehouse/city handling.

Suggested reviewers

  • AM1007

Poem

🐰 Blog hops from Sanity to SQL so spry,
Google Pay buttons make wallets sing high,
Cities and categories all get new names,
Tiptap and drizzle play payment games! 🎉


coderabbitai bot left a comment

Actionable comments posted: 12

Note

Due to the large number of review comments, Critical, Major severity comments were prioritized as inline comments.

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
frontend/components/blog/BlogHeaderSearch.tsx (1)

143-149: ⚠️ Potential issue | 🟡 Minor

Only the toggle-button close path clears the query.

Outside click, Escape, submit, and result selection still close via setOpen(false), so the stale query comes back on the next open. A single closeSearch() helper would keep the behavior consistent.

🧹 Proposed cleanup
+  const closeSearch = () => {
+    setOpen(false);
+    setValue('');
+  };
+
   const submit = (event?: React.FormEvent) => {
     if (event) event.preventDefault();
     const query = value.trim();
     router.push(query ? `/blog?search=${encodeURIComponent(query)}` : '/blog');
-    setOpen(false);
+    closeSearch();
   };
@@
       if (!containerRef.current.contains(event.target as Node)) {
-        setOpen(false);
+        closeSearch();
       }
@@
-                if (event.key === 'Escape') setOpen(false);
+                if (event.key === 'Escape') closeSearch();
@@
-                    setOpen(false);
+                    closeSearch();

Also applies to: 181-183, 203-213

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/components/blog/BlogHeaderSearch.tsx` around lines 143 - 149, Create
a single closeSearch() helper that encapsulates the "close" behavior (e.g.,
setOpen(false); setValue(''); and any stopLoading/cleanup needed) and replace
all direct setOpen(false) calls (including the toggle-button branch inside the
onClick that currently calls setValue('') only on the toggle path, and other
handlers that close on outside click, Escape, submit, and result selection) with
calls to closeSearch(); keep the existing open path logic in the toggle (when
next is true call startLoading()) so only the closing branch uses closeSearch(),
and update references in handlers that currently use setOpen(false) to use
closeSearch() for consistent clearing of the query.
frontend/app/[locale]/blog/category/[category]/page.tsx (1)

72-84: ⚠️ Potential issue | 🟠 Major

Don't drop the first post when it has no hero image.

The hero section is skipped unless featuredPost.mainImage exists, but the grid always renders posts.slice(1). In that case the first post disappears entirely.

Suggested fix
-          <BlogCategoryGrid posts={posts.slice(1)} />
+          <BlogCategoryGrid
+            posts={featuredPost?.mainImage ? posts.slice(1) : posts}
+          />

Also applies to: 126-127

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/`[locale]/blog/category/[category]/page.tsx around lines 72 -
84, The hero block currently only renders when featuredPost.mainImage exists but
the grid always uses posts.slice(1), which drops the first post when it has no
hero image; change the grid source to conditionally slice based on whether the
featured post has a hero (e.g., derive a variable like visiblePosts =
featuredPost?.mainImage ? posts.slice(1) : posts) and use visiblePosts in the
grid rendering (also update the same logic referenced around the other
occurrence at lines 126-127), or alternatively only set featuredPost when it has
a mainImage so posts.slice(1) is only used when appropriate; adjust references
to featuredPost, posts, posts.slice(1), and featuredPost.mainImage accordingly.
🟡 Minor comments (10)
frontend/components/auth/fields/NameField.tsx-30-48 (1)

30-48: ⚠️ Potential issue | 🟡 Minor

Fallback values may produce confusing validation messages.

In the tooShort handler, NAME_MAX_LEN: maxLength ?? minLength would display the same value for both min and max if maxLength is undefined (e.g., "Name must be 2-2 characters"). Similarly, NAME_MIN_LEN: minLength ?? 0 in the tooLong handler could show "0" as the minimum.

Consider either requiring both minLength and maxLength when the component is used, or providing sensible default constants (similar to how PasswordField imports PASSWORD_MIN_LEN).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/components/auth/fields/NameField.tsx` around lines 30 - 48, The
validation messages in NameField's tooShort and tooLong branches use weak
fallbacks that can produce confusing ranges (e.g., NAME_MAX_LEN: maxLength ??
minLength and NAME_MIN_LEN: minLength ?? 0); update the logic in the NameField
component so it either enforces that both minLength and maxLength props are
provided (update prop typing/validation) or replace the ad-hoc fallbacks with
explicit sensible constants (e.g., NAME_MIN_LEN_DEFAULT and
NAME_MAX_LEN_DEFAULT) and use those constants in the tooShort and tooLong
handlers instead of maxLength ?? minLength or minLength ?? 0 to ensure
meaningful messages.
frontend/messages/uk.json-1169-1172 (1)

1169-1172: ⚠️ Potential issue | 🟡 Minor

Match the password copy to the actual byte-based limit.

Lines 1169 and 1172 say the max is measured in символів, but the placeholders/key names show the validation is byte-based. That will mislead users with non-ASCII passwords because they can exceed the byte cap before reaching that many characters.

✏️ Suggested wording fix
-        "passwordRequirements": "{PASSWORD_MIN_LEN}-{PASSWORD_MAX_BYTES} символів, принаймні одна велика літера та принаймні один спеціальний символ",
+        "passwordRequirements": "{PASSWORD_MIN_LEN}-{PASSWORD_MAX_BYTES} байтів, принаймні одна велика літера та принаймні один спеціальний символ",
...
-        "passwordTooLongBytes": "Пароль не повинен перевищувати {PASSWORD_MAX_BYTES} символів",
+        "passwordTooLongBytes": "Пароль не повинен перевищувати {PASSWORD_MAX_BYTES} байтів",
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/messages/uk.json` around lines 1169 - 1172, The Ukrainian
translation uses "символів" (characters) for the max password limit while the
validation enforces a byte-based cap, which is misleading for non-ASCII input;
update the message strings passwordRequirements and passwordTooLongBytes (and
any related invalidPassword usage of {passwordRequirementsText}) to explicitly
say the limit is in bytes (e.g., replace "символів" with "байтів" or similar
byte-specific wording) so the UI matches the byte-based validation.
frontend/drizzle/0030_shop_np_warehouses_city_ref_fk_backfill.sql-12-20 (1)

12-20: ⚠️ Potential issue | 🟡 Minor

Scope the FK existence check to np_warehouses.

Checking only pg_constraint.conname is not enough in Postgres because constraint names are not globally unique. If another table/schema already has np_warehouses_city_ref_np_cities_ref_fk, this migration will skip the ALTER TABLE and leave np_warehouses without the foreign key.

🛠️ Safer existence check
   IF NOT EXISTS (
     SELECT 1
     FROM pg_constraint
-    WHERE conname = 'np_warehouses_city_ref_np_cities_ref_fk'
+    WHERE conname = 'np_warehouses_city_ref_np_cities_ref_fk'
+      AND conrelid = 'np_warehouses'::regclass
   ) THEN
     ALTER TABLE np_warehouses
       ADD CONSTRAINT np_warehouses_city_ref_np_cities_ref_fk
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/drizzle/0030_shop_np_warehouses_city_ref_fk_backfill.sql` around
lines 12 - 20, The FK existence check is too broad because pg_constraint.conname
is not unique; change the check so it confirms the constraint belongs to the
np_warehouses relation (not just a matching name). Update the IF NOT EXISTS
query that looks for 'np_warehouses_city_ref_np_cities_ref_fk' to also restrict
to the np_warehouses relation (for example by checking conrelid =
'np_warehouses'::regclass or by joining pg_constraint to pg_class/pg_namespace
and matching relname = 'np_warehouses' and the correct schema) before running
ALTER TABLE np_warehouses ADD CONSTRAINT np_warehouses_city_ref_np_cities_ref_fk
FOREIGN KEY (city_ref) REFERENCES np_cities(ref) ON DELETE SET NULL.
frontend/components/blog/BlogCard.tsx-48-52 (1)

48-52: ⚠️ Potential issue | 🟡 Minor

Growth is being mislabeled as Career here.

getCategoryLabel() already has a growth branch, but this override forces legitimate Growth posts to render under the wrong category.

✂️ Suggested fix
-  const rawCategory =
-    post.categories?.[0].title === 'Growth'
-      ? 'Career'
-      : post.categories?.[0].title;
+  const rawCategory = post.categories?.[0].title;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/components/blog/BlogCard.tsx` around lines 48 - 52, The code
incorrectly maps a post with title 'Growth' to 'Career' before calling
getCategoryLabel; remove that hardcoded override so rawCategory is simply the
first category title (or undefined) and then call getCategoryLabel(rawCategory)
— i.e., stop transforming post.categories?.[0].title to 'Career' in the
rawCategory assignment and let getCategoryLabel handle 'Growth' itself (refer to
rawCategory, post.categories, and getCategoryLabel to locate the change).
frontend/lib/tests/shop/monobank-google-pay-submit-route.test.ts-319-325 (1)

319-325: ⚠️ Potential issue | 🟡 Minor

Make the replay request body deterministic.

req() generates a fresh gToken on every call, so this case never exercises “same key + same payload”. That can hide regressions if the submit route later becomes payload-aware.

🎯 Suggested fix
       const requestKey = 'mono_submit_replay_key_0001';
+      const gToken = `token_${crypto.randomUUID()}`;
       const req = () =>
         makeSubmitRequest({
           orderId,
           idempotencyKey: requestKey,
-          body: JSON.stringify({ gToken: `token_${crypto.randomUUID()}` }),
+          body: JSON.stringify({ gToken }),
         });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/lib/tests/shop/monobank-google-pay-submit-route.test.ts` around
lines 319 - 325, The test's req() creates a new gToken each call so it doesn't
exercise "same key + same payload"; instead generate a deterministic token once
and reuse it inside req() (e.g., const fixedGToken = 'token_<fixed>' or capture
crypto.randomUUID() into a const) and pass that to makeSubmitRequest so
requestKey and the payload remain identical across calls; update references to
requestKey, req(), and makeSubmitRequest to use the fixedGToken.
frontend/app/[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx-123-133 (1)

123-133: ⚠️ Potential issue | 🟡 Minor

Existing script handling may not resolve correctly if script is still loading.

When an existing script element is found, the code attaches load and error listeners. However, if the script has already loaded before these listeners are attached, the promise will never resolve.

🐛 Proposed fix to handle already-loaded scripts
     if (existing) {
+      // If script already loaded, resolve immediately
+      if (window.google?.payments?.api?.PaymentsClient) {
+        resolve();
+        return;
+      }
       existing.addEventListener('load', () => resolve(), { once: true });
       existing.addEventListener(
         'error',
         () => {
           googlePayScriptPromise = null;
           reject(new Error('google_pay_script_failed'));
         },
         { once: true }
       );
       return;
     }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@frontend/app/`[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx
around lines 123 - 133, When an existing script element is reused, it may
already be loaded so attaching listeners can never fire; update the logic around
the existing element handling in MonobankGooglePayClient.tsx so that before
adding 'load'/'error' listeners you check if the script has already finished
(e.g., existing.readyState === 'complete' or
existing.getAttribute('data-loaded') === 'true') and call resolve() immediately
in that case, and also ensure that when you create a new script you set a marker
(e.g., script.dataset.loaded = 'true' in the script's 'load' handler) so future
checks detect it; keep using googlePayScriptPromise and preserve the existing
error handling that nulls googlePayScriptPromise and rejects on 'error'.
frontend/app/[locale]/shop/checkout/payment/monobank/[orderId]/page.tsx-121-134 (1)

121-134: ⚠️ Potential issue | 🟡 Minor

Error state missing navigation option.

Unlike the not_found state (lines 101-119), the generic error state doesn't provide a "Back to Cart" link, leaving users without a clear way to navigate away.

🐛 Add navigation link to error state
   if (loadState === 'error' || !order) {
     return (
       <main className="mx-auto max-w-4xl px-4 py-16 sm:px-6 lg:px-8">
         <section className="border-border bg-card rounded-lg border p-8 text-center">
           <h1 className="text-foreground text-2xl font-bold">
             {t('errors.unableToLoad')}
           </h1>
           <p className="text-muted-foreground mt-2 text-sm">
             {t('errors.tryAgainLater')}
           </p>
+          <div className="mt-6 flex justify-center">
+            <Link href="/shop/cart" className={SHOP_OUTLINE_BTN}>
+              {t('actions.backToCart')}
+            </Link>
+          </div>
         </section>
       </main>
     );
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/[locale]/shop/checkout/payment/monobank/[orderId]/page.tsx`
around lines 121 - 134, The error rendering block (when loadState === 'error' ||
!order) is missing a "Back to Cart" navigation option; update that JSX to
include a link/button back to the cart similar to the not_found case by
importing/using the same Link component and adding an anchor/Button with
href="/cart" (or the same cart route used elsewhere) and localized label via
t(...) (e.g., t('cart.backToCart') or the same key used in not_found); ensure
the new element is styled/positioned like the not_found link so users can
navigate away from the error screen.
frontend/app/[locale]/blog/category/[category]/page.tsx-15-23 (1)

15-23: ⚠️ Potential issue | 🟡 Minor

Map growth to the growth label.

This currently renders the growth slug with the career label, so the breadcrumb and page title are wrong for that category.

Suggested fix
-  if (key === 'growth') return t('categories.career');
+  if (key === 'growth') return t('categories.growth');
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/[locale]/blog/category/[category]/page.tsx` around lines 15 -
23, getCategoryLabel currently maps the 'growth' slug to the 'categories.career'
label causing wrong breadcrumb/title; update the mapping in getCategoryLabel so
that when key === 'growth' it returns t('categories.growth') instead of
t('categories.career'), leave the other branch checks (tech, career, insights,
news) and the final fallback returning categoryName unchanged.
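The suggested fix can also be expressed as a table lookup, which makes a wrong mapping like `growth → categories.career` harder to reintroduce. A minimal sketch, assuming the slugs and label keys quoted in the comment (`tech`, `career`, `insights`, `news`, `growth`) and stubbing `t()`:

```typescript
// One source of truth for slug → translation key; the growth entry is the
// corrected mapping (it was previously pointing at categories.career).
const CATEGORY_LABEL_KEYS: Record<string, string> = {
  tech: 'categories.tech',
  career: 'categories.career',
  insights: 'categories.insights',
  news: 'categories.news',
  growth: 'categories.growth',
};

function getCategoryLabel(
  key: string,
  categoryName: string,
  t: (k: string) => string
): string {
  const labelKey = CATEGORY_LABEL_KEYS[key];
  // Unknown slugs fall back to the raw category name, as in the original.
  return labelKey ? t(labelKey) : categoryName;
}
```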
frontend/app/api/shop/orders/[id]/payment/monobank/google-pay/submit/route.ts-136-144 (1)

136-144: ⚠️ Potential issue | 🟡 Minor

Return the right error for malformed idempotency keys.

parseIdempotencyKey() distinguishes missing vs invalid headers, but this response always says the header is required. That makes client-side failures harder to diagnose.

Suggested fix
     return noStoreJson(
       {
         code: parsedIdempotency.code,
-        message: 'Idempotency-Key header is required.',
+        message:
+          parsedIdempotency.code === 'MISSING_IDEMPOTENCY_KEY'
+            ? 'Idempotency-Key header is required.'
+            : 'Idempotency-Key header is invalid.',
       },
       400
     );
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@frontend/app/api/shop/orders/[id]/payment/monobank/google-pay/submit/route.ts`
around lines 136 - 144, The current branch always responds "Idempotency-Key
header is required." when parseIdempotencyKey(request) returns !ok; instead
examine parsedIdempotency.code (from parseIdempotencyKey) and return noStoreJson
with the appropriate HTTP status and message reflecting whether the header is
missing or malformed—use parsedIdempotency.code and a message like
"Idempotency-Key header is required." for missing and "Invalid Idempotency-Key
header." (or the specific error text) for invalid keys so callers can
distinguish the two; update the failing branch that references parsedIdempotency
and noStoreJson accordingly.
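The message selection can be isolated as a pure helper so the route stays thin. A sketch under assumptions: `MISSING_IDEMPOTENCY_KEY` is the code quoted in the diff, but the invalid-key code name here is hypothetical, not taken from `parseIdempotencyKey()`:

```typescript
type IdemError = { ok: false; code: string };

// Derive the user-facing message from the parser's error code so missing vs
// malformed Idempotency-Key headers are distinguishable to clients.
function idempotencyErrorMessage(parsed: IdemError): string {
  return parsed.code === 'MISSING_IDEMPOTENCY_KEY'
    ? 'Idempotency-Key header is required.'
    : 'Idempotency-Key header is invalid.';
}
```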
frontend/app/[locale]/blog/category/[category]/page.tsx-88-90 (1)

88-90: ⚠️ Potential issue | 🟡 Minor

Use the requested category badge here.

On a category page, featuredPost.categories[0] is not stable: multi-category posts can show whichever category happens to come first. That can make the hero badge disagree with the current page.

Suggested fix
-                {featuredPost.categories?.[0]?.title && (
+                {categoryDisplay && (
                   <div className="text-sm font-medium text-gray-900 dark:text-gray-100">
-                    {featuredPost.categories[0].title}
+                    {categoryDisplay}
                   </div>
                 )}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/[locale]/blog/category/[category]/page.tsx` around lines 88 -
90, The hero badge currently uses featuredPost.categories[0] which can mismatch
the current page category; update the page component to select the category from
featuredPost.categories that matches the current route/category param (e.g.
compare each category.slug or category.title to params.category or the route
param) and render that match instead, falling back to featuredPost.categories[0]
if no match is found; reference featuredPost, featuredPost.categories, and the
route param (params.category or similar) when implementing the change.
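The match-then-fallback selection described above is a small pure function. A sketch assuming the `{ slug, title }` category shape implied by the review (the real `featuredPost.categories` entries may carry more fields):

```typescript
type Category = { slug: string; title: string };

// Prefer the category matching the current route param; fall back to the
// first category so multi-category posts still render a badge.
function pickBadgeCategory(
  categories: Category[] | undefined,
  routeCategory: string
): Category | undefined {
  if (!categories?.length) return undefined;
  return categories.find(c => c.slug === routeCategory) ?? categories[0];
}
```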
🧹 Nitpick comments (24)
frontend/lib/tests/shop/shipping-retention-phase7.test.ts (1)

91-105: Extract the duplicated snapshot shape into one shared type.

The same inline intersection type appears in both tests. Hoisting it to a ShippingSnapshot alias will keep these assertions from drifting the next time the retained payload changes.

♻️ Proposed refactor
+type ShippingSnapshot = Record<string, unknown> & {
+  piiRedacted?: boolean;
+  recipient?: {
+    fullName?: string;
+    phone?: string;
+    email?: string;
+  };
+  selection?: {
+    cityRef?: string;
+    warehouseRef?: string;
+  };
+};
+
 ...
-      const snapshot = (row?.shippingAddress ?? {}) as Record<
-        string,
-        unknown
-      > & {
-        piiRedacted?: boolean;
-        recipient?: {
-          fullName?: string;
-          phone?: string;
-          email?: string;
-        };
-        selection?: {
-          cityRef?: string;
-          warehouseRef?: string;
-        };
-      };
+      const snapshot = (row?.shippingAddress ?? {}) as ShippingSnapshot;
 ...
-      const snapshot = (row?.shippingAddress ?? {}) as Record<
-        string,
-        unknown
-      > & {
-        piiRedacted?: boolean;
-        recipient?: {
-          fullName?: string;
-          phone?: string;
-          email?: string;
-        };
-        selection?: {
-          cityRef?: string;
-          warehouseRef?: string;
-        };
-      };
+      const snapshot = (row?.shippingAddress ?? {}) as ShippingSnapshot;

Also applies to: 139-153

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/lib/tests/shop/shipping-retention-phase7.test.ts` around lines 91 -
105, Extract the repeated inline intersection type used for snapshot into a
single shared type alias (e.g., ShippingSnapshot) and replace the inline type
annotations on the snapshot variables in both occurrences (the snapshot variable
declared around the current `snapshot = (row?.shippingAddress ?? {}) as ...` and
the similar declaration later) with that alias; ensure the alias includes the
base Record<string, unknown> plus piiRedacted, recipient, and selection shapes
so both tests reference the same ShippingSnapshot type.
frontend/lib/tests/shop/checkout-currency-policy.test.ts (1)

119-122: Use the shared helper instead of duplicating IP derivation logic.

This file defines a local makeTestClientIp function, but order-items-snapshot-immutable.test.ts imports deriveTestIpFromIdemKey from @/lib/tests/helpers/ip for the same purpose. Reuse the shared helper to avoid duplication and ensure consistent behavior across tests.

♻️ Proposed refactor
 import crypto from 'crypto';
 import { inArray } from 'drizzle-orm';
 import { NextRequest } from 'next/server';
 import { afterAll, beforeAll, describe, expect, it, vi } from 'vitest';
+
+import { deriveTestIpFromIdemKey } from '@/lib/tests/helpers/ip';

Then remove the local function and update the usage:

-function makeTestClientIp(seed: string): string {
-  const digest = crypto.createHash('sha256').update(seed).digest();
-  return `${(digest[0] % 223) + 1}.${digest[1]}.${digest[2]}.${(digest[3] % 254) + 1}`;
-}
-
 function makeCheckoutRequest(
   payload: unknown,
   opts: { idempotencyKey: string; acceptLanguage: string }
 ) {
   const headers = new Headers({
     'Content-Type': 'application/json',
     'Idempotency-Key': opts.idempotencyKey,
     'Accept-Language': opts.acceptLanguage,
-    'X-Forwarded-For': makeTestClientIp(opts.idempotencyKey),
+    'X-Forwarded-For': deriveTestIpFromIdemKey(opts.idempotencyKey),
     Origin: 'http://localhost:3000',
   });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/lib/tests/shop/checkout-currency-policy.test.ts` around lines 119 -
122, Replace the locally defined makeTestClientIp with the shared helper
deriveTestIpFromIdemKey from "@/lib/tests/helpers/ip": remove the local
makeTestClientIp function, add an import for deriveTestIpFromIdemKey, and update
all calls that used makeTestClientIp(seed) to call deriveTestIpFromIdemKey(seed)
so tests reuse the same IP derivation logic as
order-items-snapshot-immutable.test.ts.
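For reference, the shared helper would look like the local `makeTestClientIp` shown in the diff; the exact body of `deriveTestIpFromIdemKey` in `@/lib/tests/helpers/ip` is assumed here, not copied from it. Hashing the key keeps the IP deterministic per test while the modulo arithmetic keeps the first and last octets non-zero:

```typescript
import { createHash } from 'crypto';

// Map a seed string to a stable, well-formed IPv4 address for test headers:
// first octet in 1..223, last octet in 1..254.
function deriveTestIpFromIdemKey(seed: string): string {
  const d = createHash('sha256').update(seed).digest();
  return `${(d[0] % 223) + 1}.${d[1]}.${d[2]}.${(d[3] % 254) + 1}`;
}
```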
frontend/proxy.ts (1)

88-90: Minor redundancy: both early return and matcher exclusion for .well-known.

The matcher on line 119 already excludes .well-known paths, so the middleware won't be invoked for these requests. The early return on lines 88-90 would never be reached in practice. While this defense-in-depth approach is harmless, you could simplify by removing the early return if the matcher exclusion is sufficient.

♻️ Optional: Remove redundant early return
 export function proxy(req: NextRequest) {
-  if (req.nextUrl.pathname.startsWith('/.well-known/')) {
-    return NextResponse.next();
-  }
-
   if (req.nextUrl.pathname === '/') {

Also applies to: 116-120

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/proxy.ts` around lines 88 - 90, The early-return check that tests
req.nextUrl.pathname.startsWith('/.well-known/') and returns NextResponse.next()
is redundant because the middleware matcher already excludes `.well-known`
paths; remove that conditional (the block containing the startsWith check and
its return) from frontend/proxy.ts so the middleware is simplified and relies on
the matcher exclusion instead (ensure you update both occurrences noted around
the existing conditional/matcher code).
frontend/components/dashboard/ProfileCard.tsx (1)

326-340: Consider using PASSWORD_MIN_LEN constant for consistency.

The minLength of 8 is hardcoded both in the input attribute (line 326) and the validation message (line 337). Other password fields in this PR use imported constants from @/lib/auth/signup-constraints. Using the constant here would ensure consistency if the policy changes.

♻️ Suggested refactor using constant

Import at the top:

import { PASSWORD_MIN_LEN } from '@/lib/auth/signup-constraints';

Then update the input:

                       <input
                         type="password"
                         name="newPassword"
                         placeholder={t('newPassword')}
-                        minLength={8}
+                        minLength={PASSWORD_MIN_LEN}
                         className="..."
                         required
                         onInvalid={e => {
                           if (e.currentTarget.validity.valueMissing) {
                             e.currentTarget.setCustomValidity(
                               tFields('validation.required')
                             );
                           } else if (e.currentTarget.validity.tooShort) {
                             e.currentTarget.setCustomValidity(
                               tFields('validation.passwordTooShort', {
-                                minLength: 8,
+                                minLength: PASSWORD_MIN_LEN,
                               })
                             );
                           }
                         }}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/components/dashboard/ProfileCard.tsx` around lines 326 - 340,
Replace the hardcoded minimum password length (8) in ProfileCard component with
the shared constant: import PASSWORD_MIN_LEN from
'@/lib/auth/signup-constraints' and use it for the input prop
(minLength={PASSWORD_MIN_LEN}) and in the onInvalid handler where
tFields('validation.passwordTooShort', { minLength: 8 }) is called (change 8 to
PASSWORD_MIN_LEN); ensure the import is added at the top of the file and both
occurrences (the JSX minLength prop and the validation message payload)
reference PASSWORD_MIN_LEN so the component stays consistent with other password
fields.
frontend/components/auth/ResetPasswordForm.tsx (1)

78-83: Consider differentiating error types for better UX.

Mapping all 400 errors to a single resetFailed message may reduce user clarity. Password reset failures can have distinct causes (expired token, invalid token, password policy violation) that users could act upon differently. If the API returns distinguishable error codes, consider parsing and displaying specific messages.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/components/auth/ResetPasswordForm.tsx` around lines 78 - 83, In
ResetPasswordForm's error branch that currently checks if (!res.ok) and calls
setError with a generic message, parse the response body (await res.json()) and
inspect a machine-readable field (e.g., code, error or message) returned by the
API; map known codes like "TOKEN_EXPIRED", "INVALID_TOKEN",
"PASSWORD_POLICY_VIOLATION" to distinct i18n keys (e.g.,
t('errors.tokenExpired'), t('errors.invalidToken'), t('errors.passwordPolicy'))
and call setError with the specific message, falling back to the existing
t('errors.resetFailed') for unknown 4xx codes and t('errors.serverError') for
non-4xx responses, preserving the current behavior if the body is absent or
unparseable.
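The code-to-key mapping suggested above can be kept in one table with the existing keys as fallbacks. A sketch where the error codes and i18n keys are assumptions taken from this comment, not confirmed API values:

```typescript
// Hypothetical machine-readable codes → i18n keys; extend as the API grows.
const RESET_ERROR_KEYS: Record<string, string> = {
  TOKEN_EXPIRED: 'errors.tokenExpired',
  INVALID_TOKEN: 'errors.invalidToken',
  PASSWORD_POLICY_VIOLATION: 'errors.passwordPolicy',
};

function resetErrorKey(status: number, code: string | undefined): string {
  if (code && RESET_ERROR_KEYS[code]) return RESET_ERROR_KEYS[code];
  // Unknown 4xx codes keep the current generic message; everything else is
  // treated as a server error, preserving today's behavior.
  return status >= 400 && status < 500
    ? 'errors.resetFailed'
    : 'errors.serverError';
}
```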
frontend/app/api/shop/checkout/route.ts (1)

111-116: Consider extracting isMonobankGooglePayEnabled to a shared utility.

This function is also defined in frontend/app/api/shop/orders/[id]/payment/monobank/_shared.ts (as mentioned in the config route's imports). Having two identical implementations risks divergence if one is updated without the other.

♻️ Suggested extraction

Move to a shared location (e.g., @/lib/env/monobank.ts alongside isMonobankEnabled):

// `@/lib/env/monobank.ts`
export function isMonobankGooglePayEnabled(): boolean {
  const raw = (process.env.SHOP_MONOBANK_GPAY_ENABLED ?? '')
    .trim()
    .toLowerCase();
  return raw === 'true' || raw === '1' || raw === 'yes' || raw === 'on';
}

Then import from the shared location in both routes.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/api/shop/checkout/route.ts` around lines 111 - 116, Duplicate
implementation of isMonobankGooglePayEnabled risks divergence; extract it to a
shared module (e.g., create "@/lib/env/monobank.ts" exporting
isMonobankGooglePayEnabled alongside existing isMonobankEnabled), update
frontend/app/api/shop/checkout/route.ts to import isMonobankGooglePayEnabled
from that shared module, and update
frontend/app/api/shop/orders/[id]/payment/monobank/_shared.ts to import the same
function instead of redefining it so both places use the single shared
implementation.
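The shared module could also split the flag parsing from the env lookup so the truthy-string logic is testable without mutating `process.env`. A sketch of that factoring (the boolean logic is copied from the snippet above; callers would pass `process.env` as the argument):

```typescript
// Accepts 'true' | '1' | 'yes' | 'on' (case/whitespace-insensitive) as truthy.
function parseBooleanFlag(raw: string | undefined): boolean {
  const v = (raw ?? '').trim().toLowerCase();
  return v === 'true' || v === '1' || v === 'yes' || v === 'on';
}

function isMonobankGooglePayEnabled(
  env: Record<string, string | undefined>
): boolean {
  return parseBooleanFlag(env.SHOP_MONOBANK_GPAY_ENABLED);
}
```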
frontend/app/api/shop/orders/[id]/payment/monobank/_shared.ts (1)

63-70: Consider using a shared constant for known payment methods.

parseKnownMethod duplicates the payment method strings. If PaymentMethod is a union type, consider deriving this validation from the type definition or a shared constant to avoid drift.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/api/shop/orders/[id]/payment/monobank/_shared.ts` around lines
63 - 70, parseKnownMethod duplicates literal payment method strings; replace the
hard-coded checks with a single source of truth by importing or defining a
shared constant (e.g., KNOWN_PAYMENT_METHODS =
['stripe_card','monobank_invoice','monobank_google_pay']) or an enum that
mirrors the PaymentMethod union, then validate by normalizing the input and
checking membership (includes) and returning the matched constant value; update
parseKnownMethod to use that shared constant (or derive from PaymentMethod
export) so literals aren’t duplicated across the codebase.
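Deriving the union from one `as const` array is the usual way to get this single source of truth in TypeScript. A sketch using the method names listed in this comment (assumed, not verified against the real `PaymentMethod` type):

```typescript
// One constant drives both the runtime check and the compile-time union,
// so parseKnownMethod cannot drift from the type.
const KNOWN_PAYMENT_METHODS = [
  'stripe_card',
  'monobank_invoice',
  'monobank_google_pay',
] as const;

type PaymentMethod = (typeof KNOWN_PAYMENT_METHODS)[number];

function parseKnownMethod(raw: unknown): PaymentMethod | null {
  if (typeof raw !== 'string') return null;
  const v = raw.trim().toLowerCase();
  return (KNOWN_PAYMENT_METHODS as readonly string[]).includes(v)
    ? (v as PaymentMethod)
    : null;
}
```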
frontend/db/queries/blog/blog-authors.ts (1)

41-44: Consider using Drizzle's eq + and instead of raw SQL for the join condition.

Using sql template for the join condition works but is less type-safe. Consider:

.leftJoin(
  blogAuthorTranslations,
  and(
    eq(blogAuthorTranslations.authorId, blogAuthors.id),
    eq(blogAuthorTranslations.locale, locale)
  )
)

This provides better type checking and consistency with other queries.

♻️ Suggested refactor
+import { and, eq, sql } from 'drizzle-orm';
-import { eq, sql } from 'drizzle-orm';
 
 // ...
 
     .leftJoin(
       blogAuthorTranslations,
-      sql`${blogAuthorTranslations.authorId} = ${blogAuthors.id} AND ${blogAuthorTranslations.locale} = ${locale}`
+      and(
+        eq(blogAuthorTranslations.authorId, blogAuthors.id),
+        eq(blogAuthorTranslations.locale, locale)
+      )
     )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/db/queries/blog/blog-authors.ts` around lines 41 - 44, Replace the
raw SQL join condition in the .leftJoin call with Drizzle's typed helpers: use
and(...) combining eq(blogAuthorTranslations.authorId, blogAuthors.id) and
eq(blogAuthorTranslations.locale, locale) so the join condition is type-safe and
consistent; locate the .leftJoin invocation that currently uses
sql`${blogAuthorTranslations.authorId} = ${blogAuthors.id} AND
${blogAuthorTranslations.locale} = ${locale}` and swap it to the and/eq form,
importing eq and and if not already imported.
frontend/drizzle/0027_material_bloodscream.sql (1)

54-68: Consider adding an index for blog post listing queries.

Queries filtering by is_published and ordering by published_at are common for blog listings. Consider adding a composite index:

CREATE INDEX "blog_posts_published_idx" ON "blog_posts" ("is_published", "published_at" DESC) WHERE "is_published" = true;

This partial index would optimize published post queries.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/drizzle/0027_material_bloodscream.sql` around lines 54 - 68, Add a
partial composite index to optimize listing queries that filter by is_published
and order by published_at; specifically, create an index named
blog_posts_published_idx on the blog_posts table covering (is_published,
published_at DESC) with a WHERE clause restricting to is_published = true so
published-post queries use the index efficiently. Update migration
0027_material_bloodscream.sql to include this CREATE INDEX statement after the
blog_posts table definition and ensure the index name matches
blog_posts_published_idx.
frontend/lib/services/orders/checkout.ts (1)

754-759: Duplicate function: isMonobankGooglePayEnabled exists in both files.

This function is duplicated in frontend/app/api/shop/orders/[id]/payment/monobank/_shared.ts (lines 27-32). Consider extracting it to a shared utility module (e.g., @/lib/env/monobank or @/lib/shop/payments) to maintain DRY.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/lib/services/orders/checkout.ts` around lines 754 - 759, The
function isMonobankGooglePayEnabled is duplicated; extract it into a single
shared utility (e.g., create an exported function isMonobankGooglePayEnabled in
a new shared module like monobank env/util) and replace both local definitions
with an import from that module; update the call sites to use the imported
function and ensure the new module reads process.env the same way and is
exported for reuse (preserve the exact boolean logic and name to avoid breaking
references).
frontend/components/header/MainSwitcher.tsx (1)

54-58: Prefer a shared BlogCategory type here.

This { id, slug, title } shape is now repeated across several header components. Pulling it into one shared type will make the next category-schema change a single edit instead of a coordinated sweep.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/components/header/MainSwitcher.tsx` around lines 54 - 58, Replace
the inline category shape in MainSwitcherProps with the shared BlogCategory
type: change blogCategories?: Array<{ id: string; slug: string; title: string }>
to blogCategories?: BlogCategory[] and import or reference the existing
BlogCategory type (the centralized type that defines id, slug, title); update
the import at the top of this file and ensure MainSwitcherProps and any uses of
blogCategories in the MainSwitcher component compile against the shared
BlogCategory type.
frontend/app/[locale]/blog/page.tsx (1)

6-7: Reuse the cached categories loader on the blog index too.

/[locale]/layout.tsx already fetches getCachedBlogCategories(locale) for this request. Calling the uncached getBlogCategories(locale) here means the blog landing page does the same categories query twice.

♻️ Proposed change
-import { getBlogCategories } from '@/db/queries/blog/blog-categories';
+import { getCachedBlogCategories } from '@/db/queries/blog/blog-categories';
 import { getBlogPosts } from '@/db/queries/blog/blog-posts';
@@
   const [posts, categories] = await Promise.all([
     getBlogPosts(locale),
-    getBlogCategories(locale),
+    getCachedBlogCategories(locale),
   ]);

Also applies to: 33-36

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/[locale]/blog/page.tsx` around lines 6 - 7, The blog index is
calling the uncached getBlogCategories(locale) twice; replace those uses with
the cached loader getCachedBlogCategories(locale) that /[locale]/layout.tsx
already runs. Update the imports in page.tsx to import getCachedBlogCategories
instead of getBlogCategories, remove the unused uncached import, and call
getCachedBlogCategories wherever categories are fetched (including the other
occurrence referenced around lines 33-36) so the categories query is not
duplicated.
frontend/app/[locale]/blog/[slug]/page.tsx (1)

7-16: This route fetches the same post twice per request and unnecessarily disables caching.

generateMetadata() calls getBlogPostBySlug() for the title (line 14), and PostDetails calls it again to render the post (line 56). Since no post data is passed between them, both queries execute. Combined with force-dynamic on line 7—which disables ISR unless truly needed—every blog view pays for two database reads and loses caching benefits.

Suggested fix: Pass the post data from Page to PostDetails and fetch once, or use React.cache() to deduplicate the queries. Also verify whether force-dynamic is justified by request-time data requirements (e.g., cookies(), headers(), draftMode()); none were detected.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/[locale]/blog/[slug]/page.tsx` around lines 7 - 16,
generateMetadata currently calls getBlogPostBySlug and Page/PostDetails call it
again, causing double fetches and force-dynamic disables caching; fix by
fetching the post once in the route component (e.g., Page) and pass that post
object as a prop into PostDetails (stop PostDetails from calling
getBlogPostBySlug), or alternatively wrap getBlogPostBySlug with React.cache to
deduplicate calls across generateMetadata and rendering; also remove or justify
the export const dynamic = 'force-dynamic' if request-time signals (cookies(),
headers(), draftMode()) are not required so ISR/caching can be re-enabled.
frontend/db/seed-blog-migration.ts (2)

32-33: Hardcoded Sanity API URL.

The Sanity API URL is hardcoded. For a one-time migration script this is acceptable, but consider using an environment variable for flexibility between environments.

📝 Use environment variable
-const SANITY_API =
-  'https://6y9ive6v.api.sanity.io/v2025-11-29/data/query/production';
+const SANITY_API = process.env.SANITY_API_URL || 
+  'https://6y9ive6v.api.sanity.io/v2025-11-29/data/query/production';
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/db/seed-blog-migration.ts` around lines 32 - 33, The SANITY_API
constant is hardcoded; update the migration to read the URL from an environment
variable (e.g., process.env.SANITY_API) with the current hardcoded URL as a
sensible default fallback, and validate that the resulting value is present
before proceeding. Modify the symbol SANITY_API to be initialized from the env
var (and load .env if you rely on dotenv), and add a clear error/log and exit if
no API URL is available so the migration fails fast with a helpful message.

76-86: Consider adding timeout and retry for Sanity API calls.

The sanityFetch function has no timeout or retry logic. For a one-time migration this may be acceptable, but large datasets or network issues could cause the script to hang.

♻️ Add timeout with AbortController
 async function sanityFetch<T>(query: string): Promise<T> {
   const url = `${SANITY_API}?query=${encodeURIComponent(query)}`;
-  const res = await fetch(url);
+  const controller = new AbortController();
+  const timeoutId = setTimeout(() => controller.abort(), 30_000);
+  
+  let res: Response;
+  try {
+    res = await fetch(url, { signal: controller.signal });
+  } finally {
+    clearTimeout(timeoutId);
+  }

   if (!res.ok) {
     throw new Error(`Sanity API ${res.status}: ${await res.text()}`);
   }

   const json = (await res.json()) as { result: T };
   return json.result;
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/db/seed-blog-migration.ts` around lines 76 - 86, The sanityFetch
function lacks timeout and retry logic; update sanityFetch(query: string) to use
an AbortController with a configurable timeout (e.g., 10s) to abort the fetch if
it exceeds the timeout and ensure the signal is passed to fetch, then add simple
retry with exponential backoff and a maxAttempts parameter (e.g., 3 attempts)
that retries only on network/timeouts or 5xx responses, and ensure the
controller is cleaned up between attempts and the final error includes the last
response/status or timeout reason.
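The retry policy is easiest to verify when the schedule and the retry decision are pure helpers, with the network call wired in separately. A sketch under assumed parameters (500 ms base delay, 3 attempts; the script could pick different values):

```typescript
// Exponential backoff: attempt is 1-based, so delays go 500, 1000, 2000, ...
function backoffDelayMs(attempt: number, baseMs = 500): number {
  return baseMs * 2 ** (attempt - 1);
}

// status === null models a network error or abort/timeout; 5xx responses are
// retryable, 4xx are not, and the attempt cap always wins.
function shouldRetry(
  status: number | null,
  attempt: number,
  maxAttempts = 3
): boolean {
  if (attempt >= maxAttempts) return false;
  return status === null || status >= 500;
}
```

The migration loop would then call `fetch` with an `AbortController` signal, consult `shouldRetry`, and `await` the `backoffDelayMs` delay between attempts.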
frontend/app/[locale]/shop/checkout/payment/monobank/[orderId]/page.tsx (1)

26-32: Static metadata may not be optimal for payment pages.

The metadata is statically defined. That is fine for SEO on a payment page, but consider whether generateMetadata would help by including order-specific information or preventing indexing.

📝 Consider adding robots meta to prevent indexing
 export const metadata: Metadata = {
   title: 'Monobank Google Pay | DevLovers',
   description: 'Complete your payment with Google Pay via Monobank.',
+  robots: {
+    index: false,
+    follow: false,
+  },
 };
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app/[locale]/shop/checkout/payment/monobank/[orderId]/page.tsx`
around lines 26 - 32, Replace the static Metadata object with a generateMetadata
function that reads the orderId (from page params) and returns dynamic,
order-specific metadata (e.g., title including orderId or masked identifier) and
a robots meta tag to prevent indexing (noindex, nofollow) for payment pages;
keep dynamic = 'force-dynamic' and revalidate = 0 as needed, and ensure the
function returns the same Metadata shape used currently (reference symbols:
metadata, generateMetadata, dynamic, revalidate, params/orderId).
frontend/app/[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx (3)

456-468: Using innerHTML to clear button container may cause issues.

Setting innerHTML = '' on lines 457 and 467 to manage the Google Pay button is a common pattern but can be problematic with React's virtual DOM. Consider using a ref-based approach with removeChild or letting React handle cleanup.

♻️ Proposed alternative using DOM methods
     const host = buttonHostRef.current;
-    host.innerHTML = '';
+    while (host.firstChild) {
+      host.removeChild(host.firstChild);
+    }
     const button = paymentsClientRef.current.createButton({
       onClick: onGooglePayClick,
       buttonType: 'pay',
       buttonColor: 'black',
       buttonSizeMode: 'fill',
     });
     host.appendChild(button);

     return () => {
-      host.innerHTML = '';
+      while (host.firstChild) {
+        host.removeChild(host.firstChild);
+      }
     };
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@frontend/app/[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx`
around lines 456 - 468, The code uses host.innerHTML = '' to clear the Google
Pay button which can clash with React; instead store the created button element
(from paymentsClientRef.current.createButton) in a local or ref (e.g., buttonRef
or a closure variable) and on mount append that element to
buttonHostRef.current, and in the cleanup remove it via
buttonHostRef.current.removeChild(button) (checking host and button exist)
rather than resetting innerHTML; update the block that references buttonHostRef,
paymentsClientRef, and onGooglePayClick to use this append/remove approach and
guard against missing refs.

376-379: Magic number for redirect timeout.

The 800ms timeout before redirecting on fallback lacks documentation explaining its purpose.

📝 Add constant with documentation
+// Brief delay before redirect to allow UI message to be visible
+const FALLBACK_REDIRECT_DELAY_MS = 800;

// ... in onGooglePayClick:
-        pendingRedirectTimeoutRef.current = window.setTimeout(() => {
+        pendingRedirectTimeoutRef.current = window.setTimeout(() => {
           pendingRedirectTimeoutRef.current = null;
           goToPending();
-        }, 800);
+        }, FALLBACK_REDIRECT_DELAY_MS);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@frontend/app/[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx`
around lines 376 - 379, Replace the magic literal 800 used in window.setTimeout
with a named constant (e.g., PENDING_REDIRECT_TIMEOUT_MS) declared near the top
of the MonobankGooglePayClient component/file and add a brief doc comment
explaining the purpose of the delay and any considerations (UX smoothing,
waiting for animations, or race conditions) so future readers understand why
goToPending is deferred; then change the call that sets
pendingRedirectTimeoutRef (the window.setTimeout in the pending redirect
fallback) to use that constant instead of the raw number.

70-70: Module-level mutable state may cause issues in concurrent requests.

The googlePayScriptPromise variable is a module-level singleton. In Next.js with server components and React strict mode, this shared state could lead to subtle bugs if multiple component instances race to load the script. While this pattern is common for script loading, consider whether the existing script element check on lines 119-134 sufficiently handles all race conditions.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`frontend/app/[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx`
at line 70, the module-level mutable googlePayScriptPromise can race across
concurrent renders; instead attach the promise to a safe runtime global (e.g.,
window.__googlePayScriptPromise) or compute a new per-render promise by first
checking for an existing script element via document.querySelector (the existing
script element check) and only creating/appending a script + promise if none
exists; update any loader function that currently reads/writes
googlePayScriptPromise to use the window-scoped key (guarded by typeof window)
or a local promise derived from the DOM check so you avoid cross-request module
state races while keeping idempotent script loading.
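The window-scoped loader the prompt describes can be reduced to a small helper. This is an illustrative sketch, not the component's real code: the global key `__googlePayScriptPromise` and the injected `appendScript` callback are assumptions (injection keeps the idempotency testable outside a browser).

```typescript
type GlobalWithGPay = typeof globalThis & {
  __googlePayScriptPromise?: Promise<void>;
};

// Caches the in-flight load on globalThis instead of module state, so every
// component instance (and every module copy) shares one script load.
function loadGooglePayScriptOnce(
  appendScript: () => Promise<void> // stand-in for the real DOM append
): Promise<void> {
  const g = globalThis as GlobalWithGPay;
  if (!g.__googlePayScriptPromise) {
    g.__googlePayScriptPromise = appendScript().catch(err => {
      // Clear the cached promise on failure so a later call can retry.
      g.__googlePayScriptPromise = undefined;
      throw err;
    });
  }
  return g.__googlePayScriptPromise;
}
```

In the component, the existing `document.querySelector` check for an already-appended script tag would still run inside `appendScript` as a second guard.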
frontend/app/[locale]/shop/checkout/return/monobank/page.tsx (1)

32-54: Code duplication with MonobankGooglePayPage.

The helper functions getStringParam, parseStatusToken, and shouldClearCart are duplicated between this file and [orderId]/page.tsx. Consider extracting these to a shared utility module.

♻️ Extract shared helpers

Create a shared utility file:

// frontend/lib/shop/checkout-params.ts
export type SearchParams = Record<string, string | string[] | undefined>;

export function getStringParam(params: SearchParams | undefined, key: string): string {
  const raw = params?.[key];
  if (!raw) return '';
  if (Array.isArray(raw)) return raw[0] ?? '';
  return raw;
}

export function parseStatusToken(params: SearchParams | undefined): string | null {
  const value = getStringParam(params, 'statusToken').trim();
  return value.length ? value : null;
}

export function shouldClearCart(params: SearchParams | undefined): boolean {
  const value = getStringParam(params, 'clearCart');
  return value === '1' || value === 'true';
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/app/[locale]/shop/checkout/return/monobank/page.tsx` around lines
32-54, extract the duplicated helpers getStringParam, parseStatusToken, and
shouldClearCart into a shared utility (e.g.,
frontend/lib/shop/checkout-params.ts) that exports a SearchParams type and those
three functions (accepting SearchParams | undefined), then replace the local
implementations in frontend/app/[locale]/shop/checkout/return/monobank/page.tsx
and frontend/app/[locale]/shop/checkout/[orderId]/page.tsx with imports from
that module; keep parseOrderId/local orderIdParamSchema usage in-place (only
import getStringParam if you need it there) and update all affected import
statements.
frontend/lib/services/shop/shipping/nova-poshta-client.ts (1)

452-454: Consider documenting the limit constants.

The LIMIT_PRIMARY = 500 and LIMIT_FALLBACK = 200 values appear to be empirically determined based on NP API behavior. A brief comment explaining why these specific values were chosen would aid future maintenance.

📝 Add documentation for limit constants
 export async function getWarehousesByCityRef(
   cityRef: string
 ): Promise<NovaPoshtaWarehouse[]> {
   const MAX_PAGES = 50;
+  // NP API may return empty results with large page sizes for some cities;
+  // fallback to smaller limit if first page is empty
   const LIMIT_PRIMARY = 500;
   const LIMIT_FALLBACK = 200;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/lib/services/shop/shipping/nova-poshta-client.ts` around lines
452-454, add brief inline comments above the constants MAX_PAGES, LIMIT_PRIMARY, and
LIMIT_FALLBACK explaining why the numeric values were chosen (e.g., empirically
determined based on Nova Poshta API response/timeout/size behavior), what each
constant controls (primary request page size vs fallback page size), and when to
use/adjust them (e.g., if API changes or timeouts increase). Reference the
symbols LIMIT_PRIMARY and LIMIT_FALLBACK in the comment and mention that
LIMIT_FALLBACK is a conservative value used when the primary chunk
(LIMIT_PRIMARY) triggers API truncation/timeouts or excessive response sizes;
also note MAX_PAGES’ role limiting total pages to avoid infinite loops or
excessive requests.
frontend/lib/psp/monobank.ts (1)

442-443: Timestamp conversion threshold may cause issues in distant future.

The check value < 1e11 to distinguish seconds from milliseconds will start treating second-precision timestamps as milliseconds once they exceed 1e11 (around year 5138), and already misreads millisecond timestamps from before ~1973 as seconds. While unlikely to be relevant, consider using a more robust check.

📝 More robust timestamp detection
 function parseTimestampMs(value: unknown): number | null {
   if (typeof value === 'number' && Number.isFinite(value)) {
-    const ms = value < 1e11 ? value * 1000 : value;
+    // Timestamps before year 2001 in ms would be < 1e12, and after 2001 in seconds would be > 1e9
+    // Use 1e12 as threshold: values below are seconds, above are milliseconds
+    const ms = value < 1e12 ? value * 1000 : value;
     return Number.isFinite(ms) ? ms : null;
   }

Note: Using 1e12 as the threshold works because:

  • Unix seconds since 2001-09-09 are > 1e9 and < 1e10
  • Unix milliseconds are > 1e12 after 2001
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/lib/psp/monobank.ts` around lines 442-443, the current conversion
uses the fragile check `value < 1e11` to decide seconds vs milliseconds; update
the threshold to a more robust magnitude test (e.g., use `value < 1e12`), so in
the `ms` assignment (the expression `const ms = value < 1e11 ? value * 1000 :
value;`) replace `1e11` with `1e12` (or an equivalent magnitude-based check) to
correctly distinguish seconds from milliseconds far into the future and keep the
`Number.isFinite(ms) ? ms : null` behavior unchanged.
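Spot checks pin down the behavior the suggested 1e12 threshold gives. This is a sketch of the helper as proposed, not the file's exact code; the trade-off is that millisecond values from before September 2001 would be misread as seconds.

```typescript
// Values below 1e12 are treated as Unix seconds (safe until roughly year
// 33658); values at or above 1e12 are treated as milliseconds (any instant
// from September 2001 onward).
function parseTimestampMs(value: unknown): number | null {
  if (typeof value === 'number' && Number.isFinite(value)) {
    const ms = value < 1e12 ? value * 1000 : value;
    return Number.isFinite(ms) ? ms : null;
  }
  return null;
}
```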
frontend/app/[locale]/blog/[slug]/PostDetails.tsx (1)

55-58: Limit the recommendations query to the cards you actually render.

getBlogPosts(locale) makes every post page load the full localized catalog and then discard almost all of it. A dedicated query that excludes slugParam, limits the result set, and selects only card fields will scale much better and reduce TTFB.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/app/[locale]/blog/[slug]/PostDetails.tsx` around lines 55-58,
replace the full-catalog call to getBlogPosts(locale) with a targeted query that
excludes the current slug and only selects the card fields you render: add a new
data-access function (e.g., getRecommendedPosts(locale, excludeSlug, limit))
that filters out slugParam, projects only the card properties, and returns a
bounded number (match the number of cards rendered); then update the Promise.all
call to await getBlogPostBySlug(slugParam, locale) and
getRecommendedPosts(locale, slugParam, N) instead of getBlogPosts(locale).
Ensure identifiers referenced: getBlogPostBySlug, getBlogPosts (to be replaced),
slugParam, and the new getRecommendedPosts function.
frontend/components/blog/BlogFilters.tsx (1)

393-396: Precompute body search text instead of extracting it inside the filter loop.

extractPlainText(post.body) now runs for every post on every search update. Memoizing a normalized search blob once per posts change will keep the client-side search path much cheaper as the catalog grows.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/components/blog/BlogFilters.tsx` around lines 393-396, the current
filter loop calls extractPlainText(post.body) and normalizeSearchText for every
post on each search change; instead, compute a normalized body blob once when
posts change (e.g., in a useMemo keyed on posts) and store it alongside each
post (or in a map keyed by post id) so the filter uses that precomputed
normalizedBody; update the filtering code that references
extractPlainText(post.body) and normalizeSearchText to use the precomputed value
(keep searchQueryNormalized logic the same) to avoid repeated extraction and
normalization during every search update.
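One way to shape that memo, sketched with minimal stand-ins: the `Post` type, `normalizeSearchText`, and the injected `extractPlainText` below are illustrative, not the component's real helpers.

```typescript
type Post = { id: string; title: string; body: unknown };

const normalizeSearchText = (s: string) => s.toLowerCase().trim();

// Build the normalized search blob once per posts array; in the component
// this would live in a useMemo keyed on `posts`.
function buildSearchIndex(
  posts: Post[],
  extractPlainText: (body: unknown) => string
): Map<string, string> {
  const index = new Map<string, string>();
  for (const post of posts) {
    index.set(
      post.id,
      normalizeSearchText(`${post.title} ${extractPlainText(post.body)}`)
    );
  }
  return index;
}

// The per-keystroke filter is now a cheap Map lookup plus includes().
function filterPosts(
  posts: Post[],
  index: Map<string, string>,
  query: string
): Post[] {
  const q = normalizeSearchText(query);
  if (!q) return posts;
  return posts.filter(p => (index.get(p.id) ?? '').includes(q));
}
```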

Comment on lines +591 to 595
if (!response.ok || parsed.available === false) {
  if (!cancelled) {
    setCityOptions([]);
    setCityLookupFailed(true);
  }
  return;

⚠️ Potential issue | 🟠 Major

Reset city lookup results as soon as the query changes.

cityOptions survives both query edits and failed requests, so stale suggestions can stay selectable for a different cityQuery. That can attach the wrong cityRef to checkout.

💡 Suggested fix
 onChange={event => {
+  const nextQuery = event.target.value;
   clearCheckoutUiErrors();
   setCityLookupFailed(false);
-  setCityQuery(event.target.value);
+  setCityOptions([]);
+  setCitiesLoading(nextQuery.trim().length >= 2);
+  setCityQuery(nextQuery);
   setSelectedCityRef(null);
   setSelectedCityName(null);
 }}
 if (!response.ok || parsed.available === false) {
   if (!cancelled) {
+    setCityOptions([]);
     setCityLookupFailed(true);
   }
   return;
 }
 ...
 } catch {
   if (!cancelled) {
+    setCityOptions([]);
     setCityLookupFailed(true);
   }
 }

Also applies to: 616-619, 1423-1429

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/app/[locale]/shop/cart/CartPageClient.tsx` around lines 591-595,
the city suggestion state (cityOptions) isn't cleared when the cityQuery changes
or when a lookup fails, allowing stale options to be selected; update the lookup
logic in CartPageClient (the fetch/lookup handlers around the city lookup code
paths and any other branches noted) to call setCityOptions([]) whenever
cityQuery changes and also whenever a request fails or parsed.available ===
false (i.e., alongside setCityLookupFailed(true)); ensure the same change is
applied to the other similar blocks that use setCityLookupFailed (the other
lookup branches referenced) so cityOptions is always reset on query edits,
cancellations, and failed responses to avoid attaching the wrong cityRef.

Comment on lines +121 to +124
const payload = await response.json().catch(() => null);
if (!response.ok) return null;

return parseStatusPayload(payload);

⚠️ Potential issue | 🟠 Major

Stop retrying permanent status failures.

fetchStatus() collapses every non-2xx into null, and the caller always schedules another poll. A missing or expired statusToken will sit in a 401/403 loop forever instead of surfacing a terminal error and stopping the timer.

Also applies to: 136-140

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`frontend/app/[locale]/shop/checkout/return/monobank/MonobankReturnStatus.tsx`
around lines 121-124, fetchStatus() currently maps every non-2xx response to
null so the caller keeps polling forever; modify the response handling in
fetchStatus() (the block that awaits response.json() and returns
parseStatusPayload(payload)) to detect permanent failures (at least HTTP 401 and
403) and return/propagate a terminal error or distinct terminal status instead
of null (or throw a specific Error) so the caller can stop the timer; apply the
same change to the similar response handling around parseStatusPayload in the
other block (lines 136-140) so missing/expired statusToken cases are surfaced as
terminal failures rather than retried.
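The retry/terminal split can be made explicit with a tiny classifier. A sketch only: which codes count as terminal here (401, 403, 404, 410) is an assumption to align with the status endpoint's actual contract.

```typescript
type PollDecision = 'retry' | 'terminal';

// Terminal statuses stop the polling timer; everything else keeps the
// existing retry schedule (5xx, 429, transient network failures).
function classifyStatusResponse(status: number): PollDecision {
  if (status === 401 || status === 403 || status === 404 || status === 410) {
    return 'terminal';
  }
  return 'retry';
}
```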

Comment on lines 5 to +17
 export async function GET(request: Request) {
   const { searchParams } = new URL(request.url);
   const locale = searchParams.get('locale') || 'en';
-  const items = await client
-    .withConfig({ useCdn: false })
-    .fetch(searchQuery, { locale });
-  return NextResponse.json(items || []);
+  const posts = await getBlogPosts(locale);
+
+  const items = posts.map(p => ({
+    id: p.id,
+    title: p.title,
+    body: p.body,
+    slug: p.slug,
+  }));
+
+  return NextResponse.json(items);

⚠️ Potential issue | 🟠 Major

Add error handling for the database query.

The getBlogPosts call can fail (database connection issues, query errors, etc.), but there's no try-catch to return a proper error response. This will result in unhandled rejections or generic 500 errors without useful context.

🛡️ Proposed fix
 export async function GET(request: Request) {
   const { searchParams } = new URL(request.url);
   const locale = searchParams.get('locale') || 'en';
-  const posts = await getBlogPosts(locale);
 
-  const items = posts.map(p => ({
-    id: p.id,
-    title: p.title,
-    body: p.body,
-    slug: p.slug,
-  }));
+  try {
+    const posts = await getBlogPosts(locale);
 
-  return NextResponse.json(items);
+    const items = posts.map(p => ({
+      id: p.id,
+      title: p.title,
+      body: p.body,
+      slug: p.slug,
+    }));
+
+    return NextResponse.json(items);
+  } catch (error) {
+    console.error('Failed to fetch blog posts:', error);
+    return NextResponse.json(
+      { error: 'Failed to fetch blog posts' },
+      { status: 500 }
+    );
+  }
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/app/api/blog-search/route.ts` around lines 5-17, wrap the
getBlogPosts call inside GET in a try-catch to handle DB/query failures: call
getBlogPosts(locale) inside try, map posts to items if successful, and in catch
log the error (e.g., console.error or your logger) and return a
NextResponse.json error payload with an appropriate status (e.g., 500) and an
informative message including the error details; ensure the function names
referenced are GET, getBlogPosts and the response uses NextResponse.json so
callers get a controlled error response instead of an unhandled rejection.

Comment on lines +50 to +57
merchantInfo: {
  merchantName: args.merchantName,
},
transactionInfo: {
  totalPriceStatus: 'FINAL' as const,
  totalPrice: formatMinorToDecimalString(args.totalAmountMinor),
  currencyCode: 'UAH',
},

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

Google Pay PaymentDataRequest merchantInfo merchantId required

💡 Result:

In the Google Pay API for Web PaymentDataRequest, merchantInfo.merchantId is required only when your PaymentsClient is initialized with environment: 'PRODUCTION'. In TEST, it’s not required. [1][2]

  • What it is: a Google merchant identifier issued after you register in the Google Pay & Wallet Console. [1][2]
  • Common failure in production: omitting it triggers errors like “merchantId must be set!” and can also relate to your domain not being associated with that merchant ID in the console. [2]
  • Not the same as gatewayMerchantId (that one is provided/defined by your PSP/payment gateway as part of tokenization). [2]

Sources:
[1] Google Pay API (Web) – Request objects (merchantId requirement / PRODUCTION note) (developers.google.com)
[2] Google Pay API (Web) – Troubleshooting (merchantId must be set, console/domain association) (developers.google.com)


🏁 Script executed:

cat -n frontend/app/api/shop/orders/[id]/payment/monobank/google-pay/config/route.ts

Repository: DevLoversTeam/devlovers.net



🏁 Script executed:

#!/bin/bash
# Search for PaymentsClient initialization to determine if PRODUCTION or TEST environment
rg -i "paymentsclient|environment.*production|environment.*test" --max-count=20

# Search for MONO_GOOGLE_PAY_MERCHANT_ID env var (Google merchant ID, distinct from gateway merchant ID)
rg "MONO_GOOGLE_PAY_MERCHANT_ID"

# Search for any Google Pay documentation or comments mentioning merchantId
rg -i "google.*pay.*merchantid|merchantid.*google" --max-count=20



🏁 Script executed:

# Quick check: view the MonobankGooglePayClient to confirm environment setup
sed -n '1,50p' frontend/app/[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx



🏁 Script executed:

# Confirm the environment initialization line
grep -n "environment.*NODE_ENV" frontend/app/[locale]/shop/checkout/payment/monobank/MonobankGooglePayClient.tsx



Add merchantId to merchantInfo for production deployments.

The merchantInfo object (lines 50–52) only includes merchantName, but when the application is deployed to production, the PaymentsClient is initialized with environment: 'PRODUCTION' (line 182 of MonobankGooglePayClient.tsx). Google Pay's API requires merchantId in the PaymentDataRequest for production environments—omitting it will cause a "merchantId must be set!" error.

You need to:

  1. Add a MONO_GOOGLE_PAY_MERCHANT_ID environment variable for your Google merchant ID (obtained from the Google Pay & Wallet Console—distinct from MONO_GOOGLE_PAY_GATEWAY_MERCHANT_ID)
  2. Validate and pass it to buildGooglePaySkeleton
  3. Include it in the merchantInfo object
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`frontend/app/api/shop/orders/[id]/payment/monobank/google-pay/config/route.ts`
around lines 50-57, the PaymentDataRequest currently sets merchantInfo only
with merchantName; add a production-ready merchantId by wiring a new env var
MONO_GOOGLE_PAY_MERCHANT_ID into the Google Pay build path: validate
process.env.MONO_GOOGLE_PAY_MERCHANT_ID (or equivalent args input) before
calling buildGooglePaySkeleton, add it to the buildGooglePaySkeleton
signature/args if needed, and include merchantId: <merchantIdValue> in the
merchantInfo object returned by buildGooglePaySkeleton so PaymentsClient in
MonobankGooglePayClient.tsx (which may run with environment: 'PRODUCTION')
receives a merchantId.
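A sketch of what that wiring could look like; the env var name MONO_GOOGLE_PAY_MERCHANT_ID comes from the review above, while `buildMerchantInfo` and its error message are hypothetical.

```typescript
type MerchantInfo = { merchantName: string; merchantId?: string };

// Fails fast while building the config instead of letting Google Pay reject
// the PaymentDataRequest with "merchantId must be set!" in production.
function buildMerchantInfo(
  merchantName: string,
  isProduction: boolean,
  merchantId: string | undefined // e.g. process.env.MONO_GOOGLE_PAY_MERCHANT_ID
): MerchantInfo {
  if (!isProduction) return { merchantName };
  const id = merchantId?.trim();
  if (!id) {
    throw new Error('MONO_GOOGLE_PAY_MERCHANT_ID is required in production');
  }
  return { merchantName, merchantId: id };
}
```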

Comment on lines +189 to +225
const maxBytes = getMonobankGooglePayMaxBodyBytes();
const contentLength = Number.parseInt(
  request.headers.get('content-length') ?? '',
  10
);
if (Number.isFinite(contentLength) && contentLength > maxBytes) {
  return noStoreJson(
    { code: 'PAYLOAD_TOO_LARGE', message: 'Request payload is too large.' },
    413
  );
}

let rawBodyBytes: Buffer;
try {
  rawBodyBytes = Buffer.from(await request.arrayBuffer());
} catch {
  return noStoreJson(
    { code: 'INVALID_PAYLOAD', message: 'Invalid request body.' },
    400
  );
}

const parsedPayload = parseSubmitPayload(rawBodyBytes, maxBytes);
if (!parsedPayload.ok) {
  const status = parsedPayload.status;
  const code = parsedPayload.code;
  return noStoreJson(
    {
      code,
      message:
        code === 'PAYLOAD_TOO_LARGE'
          ? 'Request payload is too large.'
          : 'Invalid payload.',
    },
    status
  );
}

⚠️ Potential issue | 🟠 Major

The max-body guard still buffers the full request.

If content-length is absent or wrong, await request.arrayBuffer() loads the entire body before parseSubmitPayload() enforces maxBytes. A large payload can still consume arbitrary memory on the request path.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`frontend/app/api/shop/orders/[id]/payment/monobank/google-pay/submit/route.ts`
around lines 189-225, the current code calls request.arrayBuffer(), which
buffers the entire body before parseSubmitPayload can enforce maxBytes; change
to stream-read the request body using request.body or request.stream/getReader
and manually accumulate bytes up to getMonobankGooglePayMaxBodyBytes() so you
can early-abort and return the same
noStoreJson({code:'PAYLOAD_TOO_LARGE',...},413) if the accumulated size exceeds
maxBytes; still preserve the existing fast-path check against content-length
(using Number.parseInt) but treat absent or invalid content-length as unknown
and rely on the streaming accumulator to enforce limits, and pass the
truncated/complete Buffer to parseSubmitPayload only after confirming it did not
exceed maxBytes while handling stream errors to return the same INVALID_PAYLOAD
response.
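The byte-capped accumulation step of such a streaming read can be isolated into a small helper. A sketch under the assumptions above: in the route, a `request.body?.getReader()` loop would call `push` per chunk and answer 413 as soon as it returns false.

```typescript
// Incremental accumulator: push() rejects a chunk once accepting it would
// exceed the cap, letting the read loop abort early instead of buffering the
// whole oversized payload.
class LimitedAccumulator {
  private parts: Uint8Array[] = [];
  private total = 0;

  constructor(private readonly maxBytes: number) {}

  // Returns false when the cap would be exceeded; caller should stop reading.
  push(chunk: Uint8Array): boolean {
    if (this.total + chunk.byteLength > this.maxBytes) return false;
    this.total += chunk.byteLength;
    this.parts.push(chunk);
    return true;
  }

  // Joins the accepted chunks into one body buffer.
  concat(): Uint8Array {
    const out = new Uint8Array(this.total);
    let offset = 0;
    for (const part of this.parts) {
      out.set(part, offset);
      offset += part.byteLength;
    }
    return out;
  }
}
```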

'use client';

-import { useParams, useRouter, useSearchParams } from 'next/navigation';
+import { useParams, useSearchParams } from 'next/navigation';

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Locate QuizzesSection call sites and inspect nearby JSX for Suspense wrappers.
rg -n -C8 --type=tsx '<QuizzesSection\b|<Suspense\b|QuizzesSection' .



🏁 Script executed:

#!/bin/bash
# Find tsx files and search for QuizzesSection usage and Suspense wrappers
echo "=== Searching for QuizzesSection imports and usages ==="
rg -g "*.tsx" -n "QuizzesSection|Suspense" .

echo ""
echo "=== Checking QuizzesSection.tsx file for useSearchParams usage ==="
cat frontend/components/quiz/QuizzesSection.tsx



🏁 Script executed:

#!/bin/bash
cat frontend/app/[locale]/quizzes/page.tsx



Wrap QuizzesSection in a Suspense boundary or pass the category via props.

Adding useSearchParams() in a Client Component causes the entire route to bail out to client-side rendering. Since the parent page uses revalidate = 300 (ISR), this defeats static rendering and forces dynamic rendering on every request. Either wrap QuizzesSection in <Suspense> to isolate the bailout, or pass the category as a prop from the server instead of reading it from the URL search params.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/components/quiz/QuizzesSection.tsx` at line 3, QuizzesSection
currently imports and uses useSearchParams which forces the component to be a
Client Component and bails the parent route to dynamic rendering; change this by
either (A) moving URL-reading logic to the server and passing the selected
category into QuizzesSection via a prop (e.g., add a category: string prop to
QuizzesSection and remove useSearchParams usage inside it), or (B) keep
QuizzesSection as a client component but isolate the client-only behavior by
rendering it inside a React.Suspense boundary from the parent server component
(wrap <QuizzesSection ... /> in <Suspense fallback={...}> so the parent page can
still use revalidate = 300). Ensure you update the QuizzesSection signature and
its callers accordingly if you choose the prop approach.

Comment on lines +16 to +33
const rows = await db
  .select({
    id: blogCategories.id,
    slug: blogCategories.slug,
    title: blogCategoryTranslations.title,
  })
  .from(blogCategories)
  .leftJoin(
    blogCategoryTranslations,
    sql`${blogCategoryTranslations.categoryId} = ${blogCategories.id} AND ${blogCategoryTranslations.locale} = ${locale}`
  )
  .orderBy(blogCategories.displayOrder);

return rows.map(row => ({
  id: row.id,
  slug: row.slug,
  title: row.title ?? '',
}));

⚠️ Potential issue | 🟠 Major

Don't surface blank category labels on missing translations.

If a translation row is missing during the migration, title: '' gives you empty nav/filter chips. Fall back to a stable label (slug or a default-locale title) or drop untranslated rows instead.

🩹 Minimal fallback
   return rows.map(row => ({
     id: row.id,
     slug: row.slug,
-    title: row.title ?? '',
+    title: row.title ?? row.slug,
   }));
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/db/queries/blog/blog-categories.ts` around lines 16-33, the
current mapping in the query result maps missing translations to title: '', which
causes blank labels; update the post-query mapping (the rows.map block that
constructs { id, slug, title }) to provide a stable fallback instead of an empty
string: use row.title ?? row.slug (or row.title ?? row.slug.toString()) or
filter out rows with no translation depending on desired behavior. Locate the
select/join using blogCategories, blogCategoryTranslations and the locale join
condition and change the mapping to either skip rows where row.title is null or
set title to row.slug (or another chosen stable label) so UI chips never show
empty labels.

Comment on lines +10 to +13
ALTER TABLE np_warehouses
  ADD CONSTRAINT np_warehouses_city_ref_np_cities_ref_fk
  FOREIGN KEY (city_ref) REFERENCES np_cities(ref)
  ON DELETE SET NULL;

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
fd '^00(29|30)_.*\.sql$' frontend/drizzle | sort | while read -r f; do
  echo "=== $f ==="
  sed -n '1,200p' "$f"
  echo
done



Defer FK validation until after the backfill completes.

Adding the constraint without NOT VALID in 0029_* will fail immediately if any rows have invalid city_ref values. Since 0030_* cleans those values up, the constraint should be marked NOT VALID here and validated only after the backfill step.

🔧 Proposed fix

In 0029_shop_np_warehouses_city_ref_fk.sql, add NOT VALID:

    ALTER TABLE np_warehouses
      ADD CONSTRAINT np_warehouses_city_ref_np_cities_ref_fk
      FOREIGN KEY (city_ref) REFERENCES np_cities(ref)
      ON DELETE SET NULL
+     NOT VALID;

In 0030_shop_np_warehouses_city_ref_fk_backfill.sql, validate after the cleanup:

ALTER TABLE np_warehouses
  VALIDATE CONSTRAINT np_warehouses_city_ref_np_cities_ref_fk;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/drizzle/0029_shop_np_warehouses_city_ref_fk.sql` around lines
10-13, add the FK as NOT VALID in the migration that creates it so the ALTER TABLE
in 0029_shop_np_warehouses_city_ref_fk.sql does not fail on existing bad rows:
modify the ALTER TABLE that adds constraint
np_warehouses_city_ref_np_cities_ref_fk on table np_warehouses (FOREIGN KEY
(city_ref) REFERENCES np_cities(ref)) to include NOT VALID, and then in the
subsequent backfill migration (0030_shop_np_warehouses_city_ref_fk_backfill.sql)
run ALTER TABLE np_warehouses VALIDATE CONSTRAINT
np_warehouses_city_ref_np_cities_ref_fk after cleaning up bad city_ref values.

Comment on lines +5 to +6
if (Array.isArray(node.content)) {
  return node.content.map(extractPlainText).join(' ');

⚠️ Potential issue | 🟠 Major

Don't inject spaces between every sibling node.

Inline marks split text into separate child nodes, so join(' ') turns Hello **world**! into Hello world !. That will leak into excerpts and can break exact-match search. Collect child text without a separator and add whitespace only at block boundaries.

💡 Proposed fix
-// Extracts plain text from Tiptap JSON nodes
-export function extractPlainText(node: any): string {
-  if (!node) return '';
-  if (typeof node.text === 'string') return node.text;
-  if (Array.isArray(node.content)) {
-    return node.content.map(extractPlainText).join(' ');
-  }
-  return '';
-}
+type RichTextNode = {
+  type?: string;
+  text?: string;
+  content?: RichTextNode[];
+};
+
+const BLOCK_TYPES = new Set([
+  'paragraph',
+  'heading',
+  'blockquote',
+  'bulletList',
+  'orderedList',
+  'listItem',
+  'codeBlock',
+  'hardBreak',
+]);
+
+export function extractPlainText(node: RichTextNode | null | undefined): string {
+  const parts: string[] = [];
+  appendPlainText(node, parts);
+  return parts.join('').replace(/\s+/g, ' ').trim();
+}
+
+function appendPlainText(
+  node: RichTextNode | null | undefined,
+  parts: string[]
+): void {
+  if (!node) return;
+  if (typeof node.text === 'string') parts.push(node.text);
+  node.content?.forEach(child => appendPlainText(child, parts));
+  if (node.type && BLOCK_TYPES.has(node.type)) parts.push(' ');
+}

A regression test with mixed inline marks and trailing punctuation would lock this down.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `frontend/lib/blog/text.ts` around lines 5-6, the current extractPlainText
implementation joins child text with join(' '), which inserts spaces between
inline-mark-split nodes; change the mapping over node.content to concatenate
children without a separator (use join('') or equivalent) so inline marks don't
introduce extra spaces, and instead only add a single space when you are at
block boundaries (i.e., when node.type is a block-level node such as
paragraph/heading/listItem) after assembling the block's text; update
extractPlainText to detect block node types and append a space once per block,
and add a regression test exercising mixed inline marks with trailing
punctuation to prevent regressions.

Comment on lines +86 to +123
function readReplayResult(
  attempt: PaymentAttemptRow,
  reused: boolean
): MonobankWalletSubmitResult {
  const meta = asRecord(attempt.metadata);
  const wallet = readWalletMetadata(meta);

  const submitOutcome =
    wallet.submitOutcome === 'unknown' ? 'unknown' : 'submitted';
  const syncStatus =
    typeof wallet.syncStatus === 'string' && wallet.syncStatus.trim()
      ? wallet.syncStatus.trim()
      : null;
  const invoiceId =
    typeof attempt.providerPaymentIntentId === 'string' &&
    attempt.providerPaymentIntentId.trim()
      ? attempt.providerPaymentIntentId.trim()
      : typeof wallet.invoiceId === 'string' && wallet.invoiceId.trim()
        ? wallet.invoiceId.trim()
        : null;
  const redirectUrl =
    typeof wallet.redirectUrl === 'string' && wallet.redirectUrl.trim()
      ? wallet.redirectUrl.trim()
      : null;
  const providerModifiedAt =
    attempt.providerModifiedAt ??
    parseIsoDateOrNull(wallet.providerModifiedAt ?? null);

  return {
    attemptId: attempt.id,
    attemptNumber: attempt.attemptNumber,
    invoiceId,
    redirectUrl,
    outcome: submitOutcome,
    syncStatus,
    providerModifiedAt,
    reused,
  };

⚠️ Potential issue | 🟠 Major

Don't replay a creating attempt as submitted.

readReplayResult() maps every non-unknown wallet state to submitted. A duplicate request that lands after createCreatingAttempt() but before persistAttemptSubmitted() will therefore return a false success with no invoice or redirect URL while the PSP call is still in flight.

Suggested fix
```diff
-  const submitOutcome =
-    wallet.submitOutcome === 'unknown' ? 'unknown' : 'submitted';
+  const submitOutcome =
+    attempt.status === 'creating' || wallet.submitOutcome === 'creating'
+      ? 'unknown'
+      : wallet.submitOutcome === 'unknown'
+        ? 'unknown'
+        : 'submitted';
```
📝 Committable suggestion


Suggested change

```typescript
function readReplayResult(
  attempt: PaymentAttemptRow,
  reused: boolean
): MonobankWalletSubmitResult {
  const meta = asRecord(attempt.metadata);
  const wallet = readWalletMetadata(meta);

  const submitOutcome =
    attempt.status === 'creating' || wallet.submitOutcome === 'creating'
      ? 'unknown'
      : wallet.submitOutcome === 'unknown'
        ? 'unknown'
        : 'submitted';
  const syncStatus =
    typeof wallet.syncStatus === 'string' && wallet.syncStatus.trim()
      ? wallet.syncStatus.trim()
      : null;
  const invoiceId =
    typeof attempt.providerPaymentIntentId === 'string' &&
    attempt.providerPaymentIntentId.trim()
      ? attempt.providerPaymentIntentId.trim()
      : typeof wallet.invoiceId === 'string' && wallet.invoiceId.trim()
        ? wallet.invoiceId.trim()
        : null;
  const redirectUrl =
    typeof wallet.redirectUrl === 'string' && wallet.redirectUrl.trim()
      ? wallet.redirectUrl.trim()
      : null;
  const providerModifiedAt =
    attempt.providerModifiedAt ??
    parseIsoDateOrNull(wallet.providerModifiedAt ?? null);

  return {
    attemptId: attempt.id,
    attemptNumber: attempt.attemptNumber,
    invoiceId,
    redirectUrl,
    outcome: submitOutcome,
    syncStatus,
    providerModifiedAt,
    reused,
  };
}
```

(The previous version of the function is identical except for the `submitOutcome` mapping.)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/lib/services/orders/monobank-wallet.ts` around lines 86 - 123,
readReplayResult currently treats any wallet.submitOutcome !== 'unknown' as
'submitted', which causes attempts in the interim "creating" state to be
replayed as a false successful submit; update the mapping so only
wallet.submitOutcome === 'submitted' maps to 'submitted' and any other
non-'submitted' states (including 'creating') map to 'unknown' instead, leaving
invoiceId/redirectUrl null until persistAttemptSubmitted has run; modify the
logic in readReplayResult (referencing wallet.submitOutcome,
createCreatingAttempt, and persistAttemptSubmitted) accordingly.


Labels

- `bug`: Something isn't working
- `enhancement`: New feature or request
- `performance`: Performance and efficiency optimizations without functional changes
- `refactor`: Code restructuring without functional changes
