diff --git a/.gitignore b/.gitignore index 731afaf3..06b67966 100644 --- a/.gitignore +++ b/.gitignore @@ -55,6 +55,4 @@ monitoring/prometheus-*.yml monitoring/alertmanager.yml ### K6 data ### -k6/data/** k6/reports/** -k6/scripts/** diff --git a/docs/architecture/README.md b/docs/architecture/README.md index c89648fc..157619ca 100644 --- a/docs/architecture/README.md +++ b/docs/architecture/README.md @@ -1,21 +1,15 @@ # Architecture Documents -이 디렉터리는 프로젝트의 구조와 핵심 흐름을 정리하는 문서 모음이다. +이 디렉터리는 리팩터링이나 성능 개선 전에 빠르게 구조를 파악하기 위한 미니맵 문서 모음이다. -## 포함 대상 +## 문서 원칙 -- 시스템 개요 -- 도메인 관계 -- 대표 요청 흐름 -- 상태 전이 -- 외부 시스템 연결 구조 - -## 작성 기준 - -- 클래스 전체 나열보다 도메인과 흐름을 우선 정리한다. -- Mermaid 다이어그램은 핵심 구조와 흐름만 표현한다. -- 문서는 실제 리팩터링과 성능 개선 판단에 도움이 되는 수준으로 유지한다. -- 구현 상세보다 “어디가 핵심이고 어디가 위험한지”가 먼저 보이게 적는다. +- 목적은 현재 구조를 빠르게 이해하는 것이다. +- 문서는 책임, 외부 의존성, 핵심 기능 흐름만 남긴다. +- 도메인별로 `핵심 용어 사전(용어/정의)`을 유지해 의미 흔들림을 줄인다. +- 도메인 의미에 영향을 주는 선택은 `간결 의사결정 기록`에 누적한다. +- 핵심 기능은 중요도와 이해 난이도를 기준으로 2~3개만 고른다. +- 도메인 내부 구현 상세나 세부 설계 판단은 `MISSIONS.md` 또는 `docs/decisions`에서 다룬다. ## 템플릿 @@ -23,7 +17,8 @@ ## 현재 문서 -- [프로젝트 전체 구조 개요](overview.md) -- [Streak 도메인 구조](streak.md) -- [Word 도메인 구조](word.md) -- [Book 도메인 구조](content-book.md) +- [시스템 컨텍스트 다이어그램](overview.md) +- [Streak 도메인 미니맵](streak.md) +- [Word 도메인 미니맵](word.md) +- [Book 도메인 미니맵](content-book.md) +- [MongoDB 논리 ERD (dbdiagram.io용 DBML)](mongodb-logical-erd.dbml) diff --git a/docs/architecture/content-book.md b/docs/architecture/content-book.md index 8df73e83..2855bc19 100644 --- a/docs/architecture/content-book.md +++ b/docs/architecture/content-book.md @@ -1,33 +1,40 @@ -# Book 도메인 구조 +# Book 도메인 미니맵 -## 목적 +## 현재 시스템의 책임 +- 책, 챕터, 청크로 구성된 콘텐츠를 관리한다. +- 사용자에게 책 콘텐츠를 제공하고, 읽기 진행도를 계산·저장한다. +- 관리자가 새로운 책과 콘텐츠를 등록할 수 있도록 한다. -이 문서는 책 콘텐츠 도메인이 조회, 진행도, import, 이미지 처리를 어떻게 함께 다루는지 설명한다. +## 도메인 구조 +- 책은 여러 개의 챕터로 구성된다. +- 챕터는 여러 개의 청크로 구성된다. +- 청크는 텍스트, 이미지 등의 타입을 가질 수 있다. 
-## 범위 +## 핵심 용어 사전 -- `BookService` -- `ChapterService` -- `ProgressService` -- 책 import 및 이미지 처리 - -## 핵심 구성 요소 - -- `BooksController`, `BooksProgressController` -- `BookService` -- `ChapterService` -- `ProgressService` -- `BookRepository`, `ChapterRepository`, `ChunkRepository`, `BookProgressRepository` +| 용어 | 정의 | +| --- | --- | +| Book | 여러 챕터를 포함하는 최상위 학습 콘텐츠 | +| Chapter | Book 내부의 순차 학습 단위 | +| Chunk | Chapter 내부의 세부 학습 단위(텍스트/이미지 등) | +| BookProgress | 사용자별 책 학습 상태를 저장하는 엔티티 | +| normalizedProgress | 완료된 챕터 수를 기준으로 계산한 책 진행률(%) | +| maxReadChunkNumber | 챕터/청크 조합 위치를 전역 순서값으로 환산한 최대 도달 지표 | -## 구조 요약 +## 핵심 불변식 -Book 도메인은 사용자에게 보이는 조회 API와 운영성 있는 import 파이프라인이 함께 들어 있는 구조다. -책 기본 정보와 챕터/청크는 MongoDB에 저장되고, import 시에는 AI 결과 파일 다운로드, 이미지 이동, 썸네일 생성이 같이 수행된다. -사용자 진행도는 책 단위가 아니라 챕터와 청크를 기반으로 계산되며, 읽기 완료는 스트릭 갱신으로 이어진다. +1. 책 진행률(`normalizedProgress`)은 챕터 완료 기반으로 계산한다. +2. 상태 분류(`NOT_STARTED`, `IN_PROGRESS`, `COMPLETED`)는 `normalizedProgress`와 `isCompleted`만으로 판정한다. +3. `maxReadChunkNumber`는 챕터 우선 정렬 기준으로 계산한다. 비교 순서는 `(chapterNumber, chunkNumber)`이며, chapter가 더 크면 항상 더 큰 진행 위치로 본다. +4. `GET /progress`는 학습 상태를 변경하지 않는다. (progress 문서 생성/수정 금지, 검증 중) +5. GET API의 부작용은 분석 목적(`viewCount`, 읽기 세션 시작)으로만 허용한다. +6. 진행도 모델은 V3 단일 경로를 목표로 하며, fallback은 검증 완료 후 제거한다. -## Mermaid 다이어그램 +## 외부 시스템 의존성 -### 구조 관계 +- MongoDB: 책, 챕터, 청크, 진행도 저장 +- S3 / R2: 표지 이미지와 import 산출물 저장 +- StreakService: 읽기 완료 이후 스트릭 반영 ```mermaid flowchart TD @@ -58,7 +65,37 @@ flowchart TD ProgressService --> Streak ``` -### 대표 흐름: 진행도 업데이트와 스트릭 연결 +## 핵심 기능 + +- 책 목록 조회 +- 진행도 업데이트 +- 책 import 및 이미지 처리 + +## 핵심 기능 흐름 + +### 책 목록 조회 + +```mermaid +sequenceDiagram + participant Client + participant BooksController + participant BookService + participant BookRepository + participant BookProgressRepository + participant Mongo + + Client->>BooksController: GET /api/v1/books + BooksController->>BookService: getBooks(...) 
+ BookService->>BookRepository: find books + BookService->>BookProgressRepository: load user progress + BookRepository->>Mongo: query books + BookProgressRepository->>Mongo: query bookProgress + Mongo-->>BookService: books + progress + BookService-->>BooksController: BookResponse list + BooksController-->>Client: response +``` + +### 진행도 업데이트와 스트릭 연결 ```mermaid sequenceDiagram @@ -81,35 +118,29 @@ sequenceDiagram ProgressService-->>Client: ProgressResponse ``` -## 주요 흐름 설명 - -1. `BookService`는 책 조회와 import를 함께 담당한다. -2. 조회 시에는 사용자 진행도를 합쳐 `BookResponse`를 만들고, import 시에는 AI 결과 파일 다운로드와 이미지 후처리까지 수행한다. -3. `ChapterService`는 챕터 목록 조회와 탐색을 맡고, 챕터별 청크 수와 사용자 진행도를 조합해 응답을 만든다. -4. `ProgressService`는 청크 단위 요청을 챕터 단위 진행도와 책 완료 상태로 변환하고, 마지막 청크를 읽은 경우 `StreakService`를 호출한다. - -## 핵심 데이터 +## 핵심 기능 선정 기준 -- `Book` - - 책 메타데이터, 표지 이미지, 난이도, 챕터 수 -- `Chapter` - - 챕터 번호, 설명, 읽기 시간 -- `Chunk` - - 난이도별 세부 텍스트 조각 -- `BookProgress` - - 사용자별 현재 청크, 챕터별 진행도, 완료 여부 +1. 책 목록 조회는 사용자 트래픽과 성능 이슈가 가장 자주 모이는 진입점이다. +2. 진행도 업데이트는 `book`, `chapter`, `chunk`, `streak`를 함께 이해해야 한다. +3. import는 운영 기능이지만 파일 저장과 후처리가 함께 묶여 있어 읽기 진입점으로 가치가 있다. -## 이 도메인의 특징 +## 간결 의사결정 기록 -- 조회 응답이 단순 조회가 아니라 사용자 진행도와 이미지 URL 조합을 포함한다. -- import와 조회가 가까운 서비스에 있어 운영 기능과 사용자 API가 한 도메인 아래 모여 있다. -- 챕터 완료 판단은 청크 수 기준으로 계산되고, 책 완료 여부는 챕터 진행도 배열을 기반으로 계산된다. 
+| 날짜 | 결정 | 이유 | 영향 범위 | 상태 | +| --- | --- | --- | --- | --- | +| 2026-04-08 | Book 도메인은 책/챕터/청크 3계층 구조를 문서 표준으로 유지 | 기능 확장 시 공통 읽기 모델을 보존하기 위함 | BookService, ChapterService, ProgressService | 유지 | +| 2026-04-08 | 진행도 갱신과 스트릭 연동 흐름을 핵심 시나리오로 고정 | 교차 도메인 영향이 가장 큰 지점이기 때문 | ProgressService, StreakService | 유지 | +| 2026-04-08 | `normalizedProgress`를 챕터 완료 기반으로 통일 | 청크만으로는 책 진행 의미를 안정적으로 표현하기 어려움 | ProgressService, BookService | 유지 | +| 2026-04-08 | 상태 판정은 `normalizedProgress` + `isCompleted`로 단일화 | 상태 분류 기준 다중화로 인한 의미 흔들림 방지 | BookRepositoryImpl, BookService, ProgressService | 유지 | +| 2026-04-08 | `maxReadChunkNumber`를 정식 필드로 유지하고 챕터 우선 `(chapterNumber, chunkNumber)` 순서로 계산 | 챕터 경계를 보존한 진행 위치 비교를 위해서 | BookProgress, ProgressService, DTO | 유지 | +| 2026-04-08 | `GET /progress`에서 progress 미존재 시 0% 조회 정책 검증 | 조회 API의 상태 변경 부작용 제거 필요 | BooksProgressController, ProgressService | 검증 중 | +| 2026-04-08 | GET API는 분석성 부작용만 허용 | 사용자 학습 상태와 운영 지표를 분리하기 위함 | ChapterService, ReadingSessionService | 유지 | +| 2026-04-08 | V3 단일화 전환 가능성 사전 검증 후 fallback 제거 | 데이터 호환성 리스크를 통제하기 위함 | ProgressService, ChapterService, ChapterRepositoryImpl | 검증 중 | -## 개선 포인트 +## 검증 필요 항목 -- `BookService`는 import, 이미지 처리, 조회 응답 조립이 함께 있어 책임 분리가 가능하다. -- `ProgressService`는 검증, 진행도 계산, 읽기 완료 처리, 스트릭 연계를 한 번에 수행한다. -- `ChapterService`는 조회 응답 조립과 view count 증가, backward compatibility 로직이 같이 들어 있다. +- `GET /progress`에서 progress 문서가 없는 경우에도 DB 변경 없이 0% 응답 가능한지 확인 +- V3 단일화 시 기존 데이터/조회 경로에서 fallback 제거해도 회귀가 없는지 확인 ## 참고 코드 diff --git a/docs/architecture/mongodb-logical-erd.dbml b/docs/architecture/mongodb-logical-erd.dbml new file mode 100644 index 00000000..a72df4fa --- /dev/null +++ b/docs/architecture/mongodb-logical-erd.dbml @@ -0,0 +1,499 @@ +// llv-api MongoDB logical ERD for dbdiagram.io +// - Embedded arrays/objects/maps are modeled as `json`. +// - Refs are application-level logical relations; MongoDB does not enforce foreign keys. 
+// - Some polymorphic fields such as `contentId` are kept as notes instead of strict refs. + +Table users { + id varchar [pk] + username varchar [not null, unique] + password varchar + email varchar + displayName varchar + provider varchar + profileImageUrl varchar + role varchar + deleted boolean + createdAt datetime + deletedAt datetime +} + +Table refreshTokens { + id varchar [pk] + tokenId varchar [not null, unique] + userId varchar [not null] + expiresAt datetime [note: 'TTL index ttl_expires_at, expireAfter=0s'] +} + +Table userTickets { + id varchar [pk] + userId varchar [not null, unique] + balance int + version bigint + createdAt datetime + updatedAt datetime +} + +Table ticketTransactions { + id varchar [pk] + userId varchar [not null] + amount int [note: 'positive=grant, negative=use'] + description text + status varchar + reservationId varchar + createdAt datetime + + indexes { + (userId, createdAt) [name: 'userId_createdAt'] + } +} + +Table fcmTokens { + id varchar [pk] + userId varchar [not null] + deviceId varchar [not null] + fcmToken varchar [not null, unique] + platform varchar + countryCode varchar + appVersion varchar + osVersion varchar + createdAt datetime + updatedAt datetime [note: 'TTL index, expireAfter=90d'] + isActive boolean +} + +Table pushLogs { + id varchar [pk] + campaignId varchar [unique] + fcmMessageId varchar + campaignGroup varchar + userId varchar + sentAt datetime + sentSuccess boolean + openedAt datetime + createdAt datetime [note: 'TTL index, expireAfter=180d'] + version bigint +} + +Table appVersion { + id varchar [pk] + latestVersion varchar + minimumVersion varchar + updatedAt datetime +} + +Table contentBanners { + id varchar [pk] + countryCode varchar + contentId varchar [note: 'Polymorphic content ref: book/article/custom/feed depending on contentType'] + contentType varchar + contentTitle varchar + contentAuthor varchar + contentCoverImageUrl varchar + contentReadingTime int + subtitle varchar + title varchar + 
description text + displayOrder int + isActive boolean + createdAt datetime +} + +Table crawlingDsl { + id varchar [pk] + domain varchar [unique] + name varchar + contentType varchar + titleDsl text + contentDsl text + coverImageDsl text + accessUrl varchar + createdAt datetime + updatedAt datetime +} + +Table words { + id varchar [pk] + word varchar [not null] + sourceLanguageCode varchar [not null] + targetLanguageCode varchar [not null] + summary json + meanings json + relatedForms json + isEssential boolean + + indexes { + (word, sourceLanguageCode, targetLanguageCode) [unique, name: 'word_language_pair_idx'] + (word, targetLanguageCode) [name: 'word_target_language_idx'] + } +} + +Table word_variants { + id varchar [pk] + word varchar [not null] + originalForm varchar [not null, note: 'Logical ref to words.word'] + variantTypes json + + indexes { + (word, originalForm) [unique, name: 'word_original_idx'] + } +} + +Table invalidWords { + id varchar [pk] + word varchar [unique] + attemptedAt datetime + attemptCount int +} + +Table wordBookmarks { + id varchar [pk] + userId varchar [not null] + word varchar [not null, note: 'Bookmarked original-form string, not words.id'] + bookmarkedAt datetime + + indexes { + (userId, word) [unique, name: 'userId_word_unique'] + } +} + +Table userStudyReports { + id varchar [pk] + userId varchar [not null, unique] + currentStreak int + longestStreak int + lastCompletionDate date + streakStartDate date + lastLearningTimestamp datetime + availableFreezes int + totalReadingTimeSeconds bigint + completedContentIds json + preferredStudyHour int + preferredStudyHourUpdatedAt datetime + createdAt datetime + updatedAt datetime +} + +Table dailyCompletions { + id varchar [pk] + userId varchar [not null] + completionDate date + firstCompletionCount int + totalCompletionCount int + completedContents json [note: 'Embedded array: type, contentId, chapterId, completedAt, readingTime, category, difficultyLevel, streakStatus'] + streakCount 
int + streakStatus varchar + createdAt datetime + + indexes { + (userId, completionDate) [unique, name: 'idx_userId_completionDate'] + } +} + +Table freezeTransactions { + id varchar [pk] + userId varchar [not null] + amount int + description text + createdAt datetime +} + +Table books { + id varchar [pk] + title varchar + titleTranslations json + author varchar + coverImageUrl varchar + difficultyLevel varchar + chapterCount int + readingTime int + averageRating float + reviewCount int + viewCount int + tags json + createdAt datetime +} + +Table chapters { + id varchar [pk] + bookId varchar [not null] + chapterNumber int + title varchar + chapterImageUrl varchar + description text + readingTime int +} + +Table chunks { + id varchar [pk] + chapterId varchar [not null] + chunkNumber int + difficultyLevel varchar + type varchar + content text + description text + + indexes { + (chapterId, difficultyLevel, chunkNumber) [name: 'chapter_difficulty_chunk_idx'] + } +} + +Table bookProgress { + id varchar [pk] + userId varchar [not null] + bookId varchar [not null] + chapterId varchar + chunkId varchar + currentReadChapterNumber int + maxReadChapterNumber int + normalizedProgress float + maxNormalizedProgress float + currentDifficultyLevel varchar + chapterProgresses json [note: 'Embedded array: chapterNumber, progressPercentage, isCompleted, completedAt'] + isCompleted boolean + completedAt datetime + updatedAt datetime + + indexes { + (userId, bookId) [unique, name: 'idx_user_book_progress'] + } +} + +Table articles { + id varchar [pk] + title varchar + author varchar + coverImageUrl varchar + originUrl varchar + difficultyLevel varchar + readingTime int + averageRating float + reviewCount int + viewCount int + category varchar + tags json + targetLanguageCode json + createdAt datetime +} + +Table articleChunks { + id varchar [pk] + articleId varchar [not null] + chunkNumber int + difficultyLevel varchar + type varchar + content text + description text + + indexes { + 
(articleId, difficultyLevel, chunkNumber) [name: 'article_difficulty_chunk_idx'] + } +} + +Table articleProgress { + id varchar [pk] + userId varchar [not null] + articleId varchar [not null] + chunkId varchar + normalizedProgress float + maxNormalizedProgress float + currentDifficultyLevel varchar + isCompleted boolean + completedAt datetime + updatedAt datetime + + indexes { + (userId, articleId) [unique, name: 'idx_user_article_progress'] + } +} + +Table feedSources { + id varchar [pk] + url varchar [unique] + domain varchar + name varchar + coverImageDsl text + contentType varchar + category varchar + tags json + isActive boolean + createdAt datetime + updatedAt datetime +} + +Table feeds { + id varchar [pk] + contentType varchar + title varchar + url varchar [unique] + thumbnailUrl varchar + author varchar + description text + category varchar + tags json + sourceProvider varchar [note: 'Provider/domain string, not feedSources.id'] + publishedAt datetime + displayOrder int + viewCount int + avgReadTimeSeconds float + createdAt datetime + deleted boolean + deletedAt datetime +} + +Table contentRequests { + id varchar [pk] + userId varchar [not null] + title varchar [not null] + originalText text + contentType varchar + originAuthor varchar + targetDifficultyLevels json + originUrl varchar + originDomain varchar + coverImageUrl varchar + status varchar + progress int + createdAt datetime + completedAt datetime + deletedAt datetime + errorMessage text + resultCustomContentId varchar [note: 'Logical ref to customContents.id after generation'] + updatedAt datetime +} + +Table customContents { + id varchar [pk] + userId varchar [not null] + contentRequestId varchar [not null] + isDeleted boolean + title varchar [not null] + author varchar + coverImageUrl varchar + difficultyLevel varchar + targetDifficultyLevels json + readingTime int + averageRating float + reviewCount int + viewCount int + tags json + originUrl varchar + originDomain varchar + createdAt datetime + 
updatedAt datetime + deletedAt datetime +} + +Table customContentChunks { + id varchar [pk] + customContentId varchar [not null] + userId varchar [not null] + difficultyLevel varchar + chapterNum int + chunkNum int + type varchar + chunkText text + description text + isDeleted boolean + createdAt datetime + updatedAt datetime + deletedAt datetime + + indexes { + (customContentId, difficultyLevel, chapterNum, chunkNum) [name: 'custom_content_difficulty_chapter_chunk_idx'] + (userId, isDeleted, createdAt) [name: 'user_deleted_created_idx'] + } +} + +Table customProgress { + id varchar [pk] + userId varchar [not null] + customId varchar [not null] + chunkId varchar + normalizedProgress float + maxNormalizedProgress float + currentDifficultyLevel varchar + isCompleted boolean + completedAt datetime + updatedAt datetime + + indexes { + (userId, customId) [unique, name: 'idx_user_custom_progress'] + } +} + +Table userCustomContents { + id varchar [pk] + userId varchar [not null] + customContentId varchar [not null] + contentRequestId varchar [not null] + unlockedAt datetime + + indexes { + (userId, customContentId) [unique, name: 'user_content_idx'] + } +} + +Table contentAccessLogs { + id varchar [pk] + userId varchar [not null] + contentId varchar [note: 'Polymorphic content ref: book/article/custom/feed'] + contentType varchar + category varchar + readTimeSeconds int + accessedAt datetime + + indexes { + (userId, accessedAt) [name: 'user_accessed_idx'] + (userId, category) [name: 'user_category_idx'] + (userId, contentType) [name: 'user_content_type_idx'] + } +} + +Table userCategoryPreferences { + id varchar [pk] + userId varchar [not null, unique] + primaryCategory varchar + categoryScores json + rawAccessCounts json + tagScores json + totalAccessCount int + lastUpdatedAt datetime +} + +Ref: refreshTokens.userId > users.id +Ref: userTickets.userId - users.id +Ref: ticketTransactions.userId > users.id +Ref: fcmTokens.userId > users.id +Ref: pushLogs.userId > users.id 
+Ref: wordBookmarks.userId > users.id +Ref: userStudyReports.userId - users.id +Ref: dailyCompletions.userId > users.id +Ref: freezeTransactions.userId > users.id + +Ref: chapters.bookId > books.id +Ref: chunks.chapterId > chapters.id +Ref: bookProgress.userId > users.id +Ref: bookProgress.bookId > books.id +Ref: bookProgress.chapterId > chapters.id +Ref: bookProgress.chunkId > chunks.id + +Ref: articleChunks.articleId > articles.id +Ref: articleProgress.userId > users.id +Ref: articleProgress.articleId > articles.id +Ref: articleProgress.chunkId > articleChunks.id + +Ref: contentRequests.userId > users.id +Ref: contentRequests.resultCustomContentId > customContents.id +Ref: customContents.userId > users.id +Ref: customContents.contentRequestId > contentRequests.id +Ref: customContentChunks.customContentId > customContents.id +Ref: customContentChunks.userId > users.id +Ref: customProgress.userId > users.id +Ref: customProgress.customId > customContents.id +Ref: customProgress.chunkId > customContentChunks.id +Ref: userCustomContents.userId > users.id +Ref: userCustomContents.customContentId > customContents.id +Ref: userCustomContents.contentRequestId > contentRequests.id + +Ref: contentAccessLogs.userId > users.id +Ref: userCategoryPreferences.userId - users.id diff --git a/docs/architecture/overview.md b/docs/architecture/overview.md index 1d8630ad..de06ada5 100644 --- a/docs/architecture/overview.md +++ b/docs/architecture/overview.md @@ -1,31 +1,36 @@ -# 프로젝트 전체 구조 개요 +# 시스템 컨텍스트 다이어그램 -## 목적 +## 현재 시스템의 책임 -이 문서는 `llv-api`의 상위 구조를 한 장으로 설명하기 위한 문서다. -세부 구현보다 어떤 도메인이 핵심이고, 어떤 저장소와 외부 시스템이 붙어 있는지 빠르게 파악하는 데 초점을 둔다. +- 모바일 학습 앱의 API를 제공한다. +- 책 읽기, 단어 조회, 스트릭 유지 같은 핵심 학습 기능을 한 애플리케이션에서 처리한다. +- 추천, 알림, 크롤링, 파일 처리 같은 보조 기능도 함께 운영한다. 
-## 범위 +## 핵심 용어 사전 -- 주요 사용자 요청 경로 -- 핵심 도메인 묶음 -- 공통 인프라와 외부 시스템 -- 우선 문서화 대상 도메인 +| 용어 | 정의 | +| --- | --- | +| 학습 콘텐츠 | 사용자가 소비하는 책/아티클/커스텀 학습 단위 | +| 진행도 | 사용자의 현재 학습 위치를 수치화한 상태 | +| 스트릭 | 사용자 학습 연속성(일 단위 유지 상태) | +| 보조 기능 | 추천/알림/크롤링처럼 핵심 학습 흐름을 지원하는 기능 | -## 핵심 구성 요소 +## 외부 시스템 의존성 -- API 진입점: `controller` -- 핵심 도메인: `streak`, `word`, `content/book` -- 보조 도메인: `content/recommendation`, `fcm`, `crawling` -- 공통 인프라: MongoDB, Redis, S3/R2, Spring AI, FCM +- MongoDB: 주요 도메인 데이터와 로그 저장 +- Redis: 읽기 세션과 짧은 상태 관리 +- S3 / R2: 이미지와 파일 저장 +- AI Model: 단어 분석과 생성 요청 +- FCM: 푸시 알림 발송 +- External Content Sites: 크롤링 대상 -## 구조 요약 +## 핵심 기능 -이 프로젝트는 하나의 Spring Boot 애플리케이션 안에 학습 콘텐츠, 단어 분석, 스트릭, 추천, 알림, 크롤링, 파일 처리 기능을 함께 두고 있다. -데이터 저장은 MongoDB를 중심으로 하고, Redis는 읽기 세션과 rate limit 같은 짧은 상태 관리에 사용한다. -외부 연동은 AI 모델, FCM, S3/R2, 크롤링 대상 사이트가 중심이며, 도메인 서비스가 이 인프라를 직접 조합하는 구조가 많다. +- 책 읽기와 진행도 반영 +- 단어 조회와 AI 기반 보완 +- 스트릭 계산과 보상/알림 처리 -## Mermaid 다이어그램 +## 시스템 컨텍스트 다이어그램 ```mermaid flowchart TD @@ -73,62 +78,14 @@ flowchart TD Crawl --> External ``` -## 주요 흐름 설명 +## 핵심 기능 선정 기준 -1. `book`, `word`, `streak` 요청은 각각 전용 서비스로 들어가지만, 실제 사용자 학습 흐름에서는 서로 연결된다. -2. `content/book`의 읽기 완료는 `streak` 갱신과 이어지고, 읽기 로그는 `content/recommendation`에서 선호도 집계에 사용된다. -3. `word`는 MongoDB 캐시와 AI 호출을 조합해 결과를 만들고, `streak`는 Redis 읽기 세션과 MongoDB 리포트를 함께 사용한다. -4. `crawling`과 `feed`는 외부 사이트 구조 변화에 영향을 많이 받는 별도 리스크 영역이다. +1. 실제 사용자 요청이 자주 통과하는 기능이다. +2. 외부 의존성이나 도메인 결합이 있어 이해 난이도가 높다. +3. 리팩터링이나 성능 개선 시 영향 범위가 큰 영역이다. -## 핵심 도메인 +## 간결 의사결정 기록 -### `streak` - -- 사용자 학습 연속성, 프리즈, 보상, 알림을 담당한다. -- Redis 기반 읽기 세션과 MongoDB 기반 누적 리포트를 함께 사용한다. -- 스케줄러와 알림이 얽혀 있어 구조적으로 가장 복잡한 영역 중 하나다. - -### `word` - -- 단어 조회, 원형/변형 매핑, AI 분석, 유효하지 않은 단어 차단을 담당한다. -- 캐시와 AI 호출, 응답 검증이 한 흐름에 들어가 있어 비용과 안정성 측면에서 중요하다. - -### `content/book` - -- 책 조회, 챕터/청크, 진행도, 이미지 처리, 가져오기(import)까지 맡는다. -- 조회 성능과 진행도 계산, 다른 도메인과의 연결 지점이 함께 모여 있다. - -## 공통 인프라 - -### MongoDB - -- 주요 도메인 엔티티와 로그, 추천 데이터를 저장한다. 
-- 도메인 서비스는 Mongo 문서 구조를 직접 전제로 동작하는 경우가 많다. - -### Redis - -- `streak` 읽기 세션과 `common/ratelimit` 같은 짧은 상태 관리에 사용된다. - -### S3 / R2 - -- 책 이미지와 AI 생성 결과 파일 처리를 담당한다. -- `content/book`는 import 이후 이미지 이동과 썸네일 생성까지 이어진다. - -### AI / FCM / External Sites - -- AI는 `word` 분석의 핵심 의존성이다. -- FCM은 `streak`, `notification` 쪽에서 사용된다. -- 외부 사이트는 `crawling`, `feed` 영역의 가장 큰 불안정 요소다. - -## 현재 문서화 우선순위 - -- [Streak 도메인 구조](streak.md) -- [Word 도메인 구조](word.md) -- [Book 도메인 구조](content-book.md) - -## 개선 포인트 - -- `streak`는 상태 계산, 보상, 통계, 알림 관련 책임이 큰 서비스에 집중돼 있다. -- `word`는 캐시 정책과 AI 실패 처리, 응답 검증이 서비스 흐름 안에 함께 들어가 있다. -- `content/book`는 조회, import, 이미지 처리, 진행도 계산이 서로 가까이 있어 변경 영향 범위가 넓다. -- 외부 의존성이 큰 `crawling`, `feed`는 이후 안정성 문서에서 별도로 다루는 편이 맞다. +| 날짜 | 결정 | 이유 | 영향 범위 | 상태 | +| --- | --- | --- | --- | --- | +| 2026-04-08 | 시스템 문서는 미니맵 중심으로 유지 | 전체 구조를 빠르게 파악하기 위한 목적 우선 | docs/architecture/* | 유지 | diff --git a/docs/architecture/streak.md b/docs/architecture/streak.md index 5d5fce5d..3f98a037 100644 --- a/docs/architecture/streak.md +++ b/docs/architecture/streak.md @@ -1,33 +1,33 @@ -# Streak 도메인 구조 +# Streak 도메인 미니맵 -## 목적 +## 현재 시스템의 책임 -이 문서는 스트릭 도메인이 어떻게 학습 완료, 보상, 프리즈, 알림을 처리하는지 설명한다. +- 사용자 학습 연속일 수를 계산한다. +- 읽기 세션과 학습 시간을 관리한다. +- 프리즈, 보상, 완료 기록을 갱신한다. +- 보호 알림과 관련 스케줄 작업을 수행한다. -## 범위 +## 도메인 구조 -- `StreakService` -- `ReadingSessionService` -- 스트릭 관련 스케줄러 -- `UserStudyReport`, `DailyCompletion`, `FreezeTransaction` +- 스트릭은 사용자별 누적 리포트와 일자별 완료 기록으로 상태를 계산한다. +- 읽기 세션은 Redis에 짧게 저장되고, 확정된 상태는 MongoDB에 반영된다. +- 프리즈와 보상 기록은 별도 트랜잭션/이력 데이터로 관리된다. 
-## 핵심 구성 요소 +## 핵심 용어 사전 -- `StreakController` -- `StreakService` -- `ReadingSessionService` -- `StreakProtectionScheduler` -- `UserStudyReportRepository`, `DailyCompletionRepository`, `FreezeTransactionRepository` +| 용어 | 정의 | +| --- | --- | +| 스트릭 | 학습 완료를 일 단위로 누적한 연속 기록 | +| 읽기 세션 | 학습 시작 시점부터 종료/완료 처리 전까지의 임시 상태 | +| 프리즈 | 스트릭을 하루 보호하는 소모성 보호 자원 | +| 완료 기록 | 특정 날짜 학습 완료 여부를 확정한 데이터 | -## 구조 요약 +## 외부 시스템 의존성 -스트릭 도메인은 사용자의 학습 연속성을 계산하는 핵심 서비스다. -읽기 세션은 Redis에 짧게 저장하고, 실제 누적 리포트와 완료 기록은 MongoDB에 저장한다. -책 읽기 완료나 다른 콘텐츠 완료 흐름에서 `StreakService`를 호출해 스트릭을 갱신하고, 스케줄러는 밤 시간대에 보호 알림을 보낸다. - -## Mermaid 다이어그램 - -### 구조 관계 +- MongoDB: 누적 리포트와 완료 기록 저장 +- Redis: 읽기 세션과 짧은 상태 저장 +- FCM: 보호 알림 발송 +- content/book: 읽기 완료 이벤트가 유입되는 주요 호출 지점 ```mermaid flowchart TD @@ -52,7 +52,15 @@ flowchart TD Scheduler --> FCM ``` -### 대표 흐름: 읽기 완료 후 스트릭 갱신 +## 핵심 기능 + +- 읽기 완료 후 스트릭 갱신 +- 읽기 세션 관리 +- 보호 알림 스케줄링 + +## 핵심 기능 흐름 + +### 읽기 완료 후 스트릭 갱신 ```mermaid sequenceDiagram @@ -72,41 +80,18 @@ sequenceDiagram ProgressService-->>Client: progress response ``` -### 상태 관점 - -```mermaid -stateDiagram-v2 - [*] --> Active - Active --> CompletedToday: 오늘 학습 완료 - Active --> AtRisk: 학습 없이 하루 종료 - AtRisk --> Protected: freeze로 스트릭 보호 - AtRisk --> Reset: 보호 수단 없음 - Protected --> Active: 다음 학습일에 연속 유지 - CompletedToday --> Active: 다음 날짜로 이동 - Reset --> Active: 새 스트릭 시작 -``` - -## 주요 흐름 설명 - -1. 사용자가 학습을 시작하면 `ReadingSessionService`가 Redis에 읽기 세션을 저장한다. -2. 읽기 완료 시 `ProgressService` 같은 상위 도메인이 `StreakService`를 호출해 학습 시간, 스트릭, 완료 콘텐츠를 갱신한다. -3. `StreakService`는 오늘/어제 상태, 누락 일수, 프리즈 사용 여부를 계산하고 보상 지급 여부도 함께 판단한다. -4. `StreakProtectionScheduler`는 밤 9시에 오늘 미완료 사용자를 찾아 FCM 보호 알림을 보낸다. - -## 핵심 데이터 +## 핵심 기능 선정 기준 -- `UserStudyReport` - - 현재 스트릭, 최장 스트릭, 사용 가능 프리즈, 총 학습 시간 등 누적 상태 -- `DailyCompletion` - - 일자별 완료 상태 -- `FreezeTransaction` - - 프리즈 지급/사용 내역 +1. 스트릭 갱신은 다른 학습 도메인에서 공통으로 호출하는 핵심 교차 지점이다. +2. `리딩 세션 -> 누적 상태 -> 알림`으로 이어지는 도메인 구조를 같이 이해해야 한다. +3. Redis, MongoDB, FCM이 함께 등장해 의존성 파악 가치가 크다. +4. 
세션, 누적 상태, 스케줄러가 모두 연결돼 있어 처음 읽는 난이도가 높다. -## 개선 포인트 +## 간결 의사결정 기록 -- `StreakService`에 상태 계산, 보상 지급, 통계 응답 조립이 많이 모여 있어 분리 여지가 크다. -- Redis 세션 검증, 읽기 시간 계산, 콘텐츠 완료 처리 경계가 다른 도메인과 섞여 있다. -- 스케줄러 알림 정책과 도메인 규칙이 점점 가까워지면 테스트 경계가 흐려질 수 있다. +| 날짜 | 결정 | 이유 | 영향 범위 | 상태 | +| --- | --- | --- | --- | --- | +| 2026-04-08 | 세션 상태는 Redis, 확정 상태는 MongoDB로 분리 | 짧은 상태와 영속 상태의 책임을 분리해 운영 단순화 | ReadingSessionService, StreakService | 유지 | ## 참고 코드 diff --git a/docs/architecture/word.md b/docs/architecture/word.md index 132f364c..c71f42ee 100644 --- a/docs/architecture/word.md +++ b/docs/architecture/word.md @@ -1,32 +1,31 @@ -# Word 도메인 구조 +# Word 도메인 미니맵 -## 목적 +## 현재 시스템의 책임 -이 문서는 단어 조회와 AI 분석 흐름이 어떻게 결합돼 있는지 설명한다. +- 단어 조회 API를 제공한다. +- 입력 단어와 원형 단어의 관계를 관리한다. +- 단어 데이터가 없을 때 AI 분석으로 보완한다. +- 실패한 단어를 차단 캐시로 관리한다. -## 범위 +## 도메인 구조 -- `WordService` -- `WordAiService` -- `WordVariant`, `InvalidWord`, `Word` -- 단어 조회 및 생성 흐름 +- 원형 단어 본문은 `Word`로 저장된다. +- 입력 단어와 원형 단어의 연결은 `WordVariant`로 관리된다. +- 반복 실패 단어는 `InvalidWord`에 저장해 재시도를 줄인다. -## 핵심 구성 요소 +## 핵심 용어 사전 -- `WordsController` -- `WordService` -- `WordAiService` -- `WordVariantRepository`, `WordRepository`, `InvalidWordRepository` +| 용어 | 정의 | +| --- | --- | +| 원형 단어 | 실제 사전/학습 기준이 되는 canonical 단어 | +| 입력 단어 | 사용자가 검색한 원문 입력값 | +| variant | 입력 단어와 원형 단어를 연결하는 매핑 엔티티 | +| invalid word | 반복 실패 단어를 차단하기 위한 캐시 데이터 | -## 구조 요약 +## 외부 시스템 의존성 -Word 도메인은 사용자가 입력한 단어를 바로 조회하지 않고, 먼저 원형/변형 관계를 확인한 뒤 필요한 경우 AI 분석으로 보완한다. -MongoDB에는 단어 본문, 변형 형태, 실패 캐시를 따로 저장하고, AI 결과는 검증과 필터링을 거친 뒤 저장한다. -즉 이 도메인은 조회 API처럼 보이지만 실제로는 캐시, 분석, 검증, 저장이 한 흐름에 묶인 구조다. 
- -## Mermaid 다이어그램 - -### 구조 관계 +- MongoDB: 단어 본문, variant, invalid cache 저장 +- AI Model: 새 단어 분석과 생성 요청 ```mermaid flowchart TD @@ -52,7 +51,15 @@ flowchart TD InvalidRepo --> Mongo ``` -### 대표 흐름: 단어 조회 및 생성 +## 핵심 기능 + +- 단어 조회 +- variant 기반 원형 매핑 +- AI 기반 신규 단어 생성 + +## 핵심 기능 흐름 + +### 단어 조회 및 생성 ```mermaid sequenceDiagram @@ -80,33 +87,18 @@ sequenceDiagram WordService-->>Client: WordSearchResponse ``` -## 주요 흐름 설명 - -1. 먼저 `WordVariantRepository`에서 입력 단어가 이미 다른 원형에 연결된 변형인지 확인한다. -2. 데이터가 없으면 `InvalidWordRepository`를 확인해 반복 실패 단어를 빠르게 차단한다. -3. AI 호출이 필요하면 `WordAiService`가 강한 프롬프트, Bean schema, validation, enum 필터링, homograph 병합을 적용한다. -4. 성공 결과는 `Word`와 `WordVariant`로 나눠 저장하고, 이후 요청에서는 캐시처럼 재사용한다. - -## 핵심 데이터 - -- `Word` - - 원형 단어, 번역, 의미, 활용형 정보 -- `WordVariant` - - 입력 단어와 원형 단어 연결 -- `InvalidWord` - - 반복 실패한 단어에 대한 차단 캐시 - -## 이 도메인의 특징 +## 핵심 기능 선정 기준 -- AI 응답을 그대로 신뢰하지 않고 validation과 enum 정리를 한 번 더 거친다. -- 같은 원형으로 합쳐야 하는 homograph/variant 처리를 서비스 쪽에서 보정한다. -- 실패한 단어를 `InvalidWord`로 캐시해 불필요한 재호출을 줄인다. +1. 조회처럼 보이지만 캐시, 저장, AI 호출이 함께 묶여 있어 흐름이 길다. +2. `Word`, `WordVariant`, `InvalidWord`의 역할을 같이 이해해야 실제 동작을 읽을 수 있다. +3. 외부 AI 의존성이 있어 실패 경로까지 같이 파악해야 한다. +4. variant와 invalid cache가 조회 흐름 초반에 분기점 역할을 한다. -## 개선 포인트 +## 간결 의사결정 기록 -- `WordService`가 캐시 판단, 예외 전략, 저장 규칙까지 많이 알고 있어 책임이 크다. -- `WordAiService`는 프롬프트, 비용 로깅, 응답 검증을 함께 갖고 있어 분리 후보가 될 수 있다. -- AI 실패 정책과 사용자 응답 정책을 더 명확히 나누면 테스트가 쉬워질 수 있다. +| 날짜 | 결정 | 이유 | 영향 범위 | 상태 | +| --- | --- | --- | --- | --- | +| 2026-04-08 | `Word/Variant/Invalid` 3분할 구조 유지 | 조회 성능, 정합성, 실패 재시도 제어를 분리하기 위함 | WordService, 관련 Repository | 유지 | ## 참고 코드 diff --git a/docs/templates/architecture-template.md b/docs/templates/architecture-template.md index 44dc6d1d..d1f261ce 100644 --- a/docs/templates/architecture-template.md +++ b/docs/templates/architecture-template.md @@ -4,45 +4,64 @@ 짧고 명확한 문서 제목 -## 목적 +## 현재 시스템의 책임 -이 문서가 어떤 구조나 흐름을 설명하기 위한 것인지 적는다. 
+- 이 도메인이나 시스템이 담당하는 핵심 책임 2~4개 +- 조회, 저장, 계산, 외부 연동 중 무엇이 중심인지 +- 다른 도메인과 구분되는 역할이 무엇인지 -## 범위 +## 도메인 구조 -어떤 도메인, 기능, 요청 흐름을 다루는지 적는다. +- 이 도메인을 구성하는 핵심 개체나 하위 구조를 짧게 적는다 +- 1:N 관계나 상위/하위 개념이 있다면 여기서 먼저 정리한다 +- 읽는 사람이 도메인 내부 shape를 빠르게 잡을 수 있으면 충분하다 -## 핵심 구성 요소 +## 핵심 용어 사전 -- 구성 요소 1 -- 구성 요소 2 -- 구성 요소 3 +용어 혼동으로 의미가 흔들리지 않도록, 최소한의 도메인 용어만 관리한다. +각 항목은 `용어`와 `정의`만 유지한다. -## 구조 요약 +예시: -현재 구조를 짧게 설명한다. +| 용어 | 정의 | +| --- | --- | +| 진행률 | 사용자의 현재 학습 위치를 퍼센트로 표현한 값 | +| 완료 | 도메인이 정의한 완료 조건을 만족한 상태 | +| 세션 | 특정 사용자 학습 행위의 추적 단위 | -## Mermaid 다이어그램 +## 외부 시스템 의존성 -필요한 경우 아래 예시 중 하나를 복사해서 사용한다. +- 사용하는 저장소, 캐시, 메시징, 외부 API +- 의존 시스템이 없다면 생략 가능 +- 가능하면 "왜 붙는지"를 한 줄로 적는다 +- 거시적인 관점에서 system context diagram을 포함한다. -### 시스템/도메인 관계 예시 +예시: -```mermaid -flowchart TD - Client[Client] - Api[Spring API] - Mongo[MongoDB] - Redis[Redis] - External[External Services] - - Client --> Api - Api --> Mongo - Api --> Redis - Api --> External -``` +- MongoDB: 핵심 도메인 데이터 저장 +- Redis: 짧은 상태 또는 세션 저장 +- S3 / R2: 파일 저장 +- FCM: 알림 발송 +- AI Model: 분석 또는 생성 요청 + +## 핵심 기능 + +- 핵심 기능은 2~3개만 고른다 +- 각 기능은 중요도 + 복잡도/이해 난이도를 기준으로 선정한다 +- 이름만 봐도 읽기 시작점을 알 수 있게 쓴다 + +예시: -### 요청 흐름 예시 +- 책 목록 조회 +- 진행도 업데이트 +- 단어 조회 및 생성 + +## 핵심 기능 흐름 + +핵심 기능마다 시퀀스 다이어그램 또는 데이터 흐름 중 하나만 둔다. +기능당 다이어그램 1개면 충분하다. 
+ +### 기능 흐름 예시 ```mermaid sequenceDiagram @@ -50,41 +69,53 @@ sequenceDiagram participant Controller participant Service participant Repository - participant Mongo + participant External Client->>Controller: Request - Controller->>Service: Call use case - Service->>Repository: Query or save - Repository->>Mongo: Access data - Mongo-->>Repository: Result - Repository-->>Service: Result - Service-->>Controller: Response DTO - Controller-->>Client: HTTP Response + Controller->>Service: Use case call + Service->>Repository: Load or save + Service->>External: Optional dependency call + Repository-->>Service: Data + Service-->>Controller: Result + Controller-->>Client: Response ``` -### 상태 전이 예시 +### 데이터 흐름 예시 ```mermaid -stateDiagram-v2 - [*] --> Pending - Pending --> InProgress - InProgress --> Completed - InProgress --> Failed +flowchart TD + Client[Client] + Controller[Controller] + Service[Service] + Mongo[(MongoDB)] + External[External System] + + Client --> Controller + Controller --> Service + Service --> Mongo + Service --> External ``` -## 주요 흐름 설명 +## 핵심 기능 선정 기준 -다이어그램만으로 부족한 핵심 흐름을 짧게 설명한다. +각 기능을 왜 핵심으로 봤는지 짧게 적는다. 1. 요청이 어디서 시작되는가 -2. 어떤 서비스가 핵심 규칙을 담당하는가 -3. 어떤 저장소나 외부 시스템에 의존하는가 +2. 어느 서비스나 도메인이 중심 책임을 갖는가 +3. 도메인 구조를 이해해야 읽을 수 있는 포인트가 있는가 +4. 어떤 저장소나 외부 시스템과 결합되는가 +5. 왜 중요하거나 복잡한가 + +## 의사결정 기록 + +아키텍처/도메인 의미에 영향을 주는 결정은 이 문서에 간결하게 남긴다. +복잡한 별도 문서로 분리하기 전에, 아래 형식으로 먼저 누적한다. -## 개선 포인트 +| 날짜 | 결정 | 이유 | 영향 범위 | 상태 | +| --- | --- | --- | --- | --- | +| 2026-04-08 | 진행률 계산 기준을 chapter completion으로 통일 | API 간 의미 불일치 제거 | ProgressService, BookService | 유지 | -- 현재 구조의 문제 -- 리팩터링 후보 -- 성능 또는 안정성 리스크 +상태 예시: `유지`, `대체 예정`, `폐기` ## 참고 코드 diff --git a/k6/README.md b/k6/README.md index f70740f1..0483cadb 100644 --- a/k6/README.md +++ b/k6/README.md @@ -1,40 +1,131 @@ # K6 Performance Testing -## 디렉토리 구조 -``` +`k6` 디렉토리는 정량 부하 테스트를 `부하 프로필` 중심으로 관리한다. +요청 이름보다 `baseline`, `load`, `stress`, `mixed-load` 같은 실험 목적이 먼저 보이도록 두고, 실제 엔드포인트는 환경변수로 주입한다. 
+
+## 현재 구조
+
+```text
+k6/
-├── docker-compose.yml # K6 실행 환경
-├── scripts/ # 테스트 스크립트
-│ ├── smoke-test.js # 기본 연결 테스트
-│ ├── load-test.js # 부하 테스트
-│ └── stress-test.js # 스트레스 테스트
-├── data/ # 테스트 데이터 파일
-├── reports/ # 테스트 결과 리포트
-└── README.md
+├── docker-compose.yml
+├── README.md
+├── seed/
+│ └── ...
+└── scripts/
+ ├── baseline.js
+ ├── load.js
+ ├── stress.js
+ ├── mixed-load.js
+ └── common/
+ ├── endpoints.js
+ ├── http.js
+ ├── profiles.js
+ └── summary.js
+```
+
+## 전제 조건
+
+- 애플리케이션이 로컬에서 실행 중이어야 한다.
+- rate limit 영향 없이 k6를 실행하려면 앱을 `local,local-k6` 프로필로 실행한다.
+  - 예: `./gradlew bootRun --args='--spring.profiles.active=local,local-k6'`
+- MongoDB 에는 seed 데이터가 들어 있어야 한다.
+- 테스트용 사용자는 `X-Test-Username` 으로 인증 가능해야 한다.
+- 시드 생성은 [seed/README.md](seed/README.md)를 따른다.
+- 기본 `BASE_URL` 은 `http://host.docker.internal:8080` 이다.
+- 시계열 결과 저장이 필요하면 먼저 `influxdb` 컨테이너를 실행한다.
+
+## 기본 엔드포인트 이름
+
+- `books.default_list`
+- `books.progress_filter`
+- `books.pagination`
+
+새 엔드포인트를 추가할 때는 `k6/scripts/common/endpoints.js` 에 등록한다.
+
+## 실행 예시
+
+```bash
+docker compose -f k6/docker-compose.yml up -d influxdb
```
 
-## 테스트 실행
+### Baseline
+낮은 부하로 기준 응답시간과 기본 안정성을 확인한다.
 
-### 기본 연결 테스트 (Smoke Test)
 ```bash
-docker-compose run --rm k6 run /scripts/smoke-test.js
+docker compose -f k6/docker-compose.yml run --rm \
+  -e ENDPOINT_NAME=books.default_list \
+  -e TEST_USERNAME=k6seed-user-02 \
+  k6 run /scripts/baseline.js
 ```
 
-### 부하 테스트 (Load Test)
+### Sustained Load
+일반적인 목표 부하를 일정 시간 유지하면서 지속 성능을 본다.
+
 ```bash
-docker-compose run --rm k6 run /scripts/load-test.js
+docker compose -f k6/docker-compose.yml run --rm \
+  -e ENDPOINT_NAME=books.progress_filter \
+  -e TEST_USERNAME=k6seed-user-02 \
+  -e TARGET_VUS=20 \
+  k6 run /scripts/load.js
 ```
 
-### 커스텀 설정으로 실행
+### Stress
+일반 부하보다 더 높은 요청량으로 밀어 한계 구간과 급격한 성능 저하 지점을 찾는다.
+ ```bash -docker-compose run --rm k6 run /scripts/smoke-test.js --vus 10 --duration 1m +docker compose -f k6/docker-compose.yml run --rm \ + -e ENDPOINT_NAME=books.pagination \ + -e TEST_USERNAME=k6seed-user-02 \ + -e TARGET_VUS=20 \ + -e STRESS_TARGET_VUS=80 \ + k6 run /scripts/stress.js ``` +### Mixed Load +여러 요청을 비율대로 섞어서 실제 사용 패턴에 가까운 혼합 부하를 본다. + +```bash +docker compose -f k6/docker-compose.yml run --rm \ + -e ENDPOINT_NAMES=books.default_list,books.progress_filter,books.pagination \ + -e ENDPOINT_WEIGHTS=7,2,1 \ + -e TEST_USERNAME=k6seed-user-02 \ + k6 run /scripts/mixed-load.js +``` + +### Custom Endpoint +등록되지 않은 엔드포인트를 직접 지정해서 같은 부하 프로필로 측정한다. + +```bash +docker compose -f k6/docker-compose.yml run --rm \ + -e ENDPOINT_PATH=/api/v1/books \ + -e ENDPOINT_TAG=books \ + -e ENDPOINT_EXPECTS_ARRAY_AT=content \ + -e TEST_USERNAME=k6seed-user-02 \ + k6 run /scripts/load.js +``` + +## 자주 바꾸는 환경변수 + +- `BASE_URL` +- `TEST_USERNAME` +- `ENDPOINT_NAME` +- `ENDPOINT_NAMES` +- `ENDPOINT_WEIGHTS` +- `TARGET_VUS` +- `STRESS_TARGET_VUS` +- `THINK_TIME` + +부하 프로필별 세부 duration 은 `baseline.js`, `load.js`, `stress.js` 가 참조하는 `k6/scripts/common/profiles.js` 환경변수로 조정할 수 있다. + +## 관리 원칙 + +- 테스트 파일은 요청 중심이 아니라 부하 프로필 중심으로 둔다. +- 엔드포인트는 `ENDPOINT_NAME` 또는 `ENDPOINT_PATH` 로 주입한다. +- 여러 요청을 묶고 싶을 때는 `mixed-load.js` 에서 엔드포인트 모듈을 조합한다. +- 대조군 비교 시에는 같은 seed prefix, 같은 사용자, 같은 정렬 기준을 고정한다. 
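`ENDPOINT_WEIGHTS`가 실제 혼합 비율로 이어지는 과정은 `k6/scripts/common/endpoints.js`의 `selectWeightedEndpoint`가 담당한다. 난수가 아니라 반복 횟수(`__ITER`) 기반이라, 같은 설정이면 항상 같은 혼합 비율이 재현된다:

```javascript
// selectWeightedEndpoint from k6/scripts/common/endpoints.js: the iteration
// counter is reduced modulo the total weight, so weights 7,2,1 produce an
// exact 7:2:1 mix every 10 iterations rather than a random approximation.
function selectWeightedEndpoint(endpoints, weights, iteration) {
  const totalWeight = weights.reduce((sum, value) => sum + value, 0);
  let cursor = iteration % totalWeight;

  for (let index = 0; index < endpoints.length; index += 1) {
    cursor -= weights[index];
    if (cursor < 0) {
      return endpoints[index];
    }
  }

  return endpoints[endpoints.length - 1];
}

// One full cycle of 10 iterations with weights 7,2,1.
const counts = { a: 0, b: 0, c: 0 };
for (let iter = 0; iter < 10; iter += 1) {
  counts[selectWeightedEndpoint(['a', 'b', 'c'], [7, 2, 1], iter)] += 1;
}
// counts → { a: 7, b: 2, c: 1 }
```

덕분에 대조군 비교에서 요청 혼합 자체가 변수로 끼어들지 않는다.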
+ ## 결과 확인 -- 콘솔에서 실시간 확인 -- `/reports` 폴더에 JSON 결과 저장 -- Grafana 대시보드 연동 가능 -## 네트워크 설정 -- `host.docker.internal:8080`로 로컬 API 접근 -- 운영 환경 테스트 시 URL 변경 필요 \ No newline at end of file +- 콘솔 실시간 출력 +- `/reports` 아래 JSON 결과 +- 필요하면 InfluxDB 데이터를 monitoring Grafana에서 조회 diff --git a/k6/docker-compose.yml b/k6/docker-compose.yml index 12f64d8a..508db609 100644 --- a/k6/docker-compose.yml +++ b/k6/docker-compose.yml @@ -10,18 +10,6 @@ services: volumes: - influxdb-data:/var/lib/influxdb - grafana: - image: grafana/grafana:latest - ports: - - "3000:3000" - environment: - - GF_SECURITY_ADMIN_PASSWORD=admin - - GF_SECURITY_ADMIN_USER=admin - volumes: - - grafana-data:/var/lib/grafana - depends_on: - - influxdb - k6: image: grafana/k6:latest volumes: @@ -35,9 +23,7 @@ services: volumes: influxdb-data: - grafana-data: # 사용법: -# docker-compose up -d influxdb grafana # 모니터링 스택 시작 -# docker-compose run --rm k6 run /scripts/smoke-test.js # 테스트 실행 -# Grafana: http://localhost:3000 (admin/admin) \ No newline at end of file +# docker compose -f k6/docker-compose.yml up -d influxdb # k6 결과 저장소 시작 +# docker compose -f k6/docker-compose.yml run --rm k6 run /scripts/baseline.js diff --git a/k6/scripts/baseline.js b/k6/scripts/baseline.js new file mode 100644 index 00000000..6f6a9bad --- /dev/null +++ b/k6/scripts/baseline.js @@ -0,0 +1,19 @@ +import { requestEndpoint, getSharedTestInfo } from './common/http.js'; +import { resolveSingleEndpoint } from './common/endpoints.js'; +import { createProfileOptions } from './common/profiles.js'; +import { createMetrics, createSummaryHandler } from './common/summary.js'; + +const endpoint = resolveSingleEndpoint(); +const metrics = createMetrics('baseline'); + +export const options = createProfileOptions('baseline', metrics.prefix); + +export default function () { + requestEndpoint(endpoint, metrics, __ITER); +} + +export const handleSummary = createSummaryHandler(metrics, { + profileName: 'baseline', + endpointName: endpoint.name, + 
...getSharedTestInfo(), +}); diff --git a/k6/scripts/common/endpoints.js b/k6/scripts/common/endpoints.js new file mode 100644 index 00000000..95661316 --- /dev/null +++ b/k6/scripts/common/endpoints.js @@ -0,0 +1,159 @@ +function splitCsv(value, fallback) { + return (value || fallback) + .split(',') + .map((item) => item.trim()) + .filter(Boolean); +} + +function splitNumberCsv(value, fallback) { + return splitCsv(value, fallback) + .map((item) => Number(item)) + .filter((item) => Number.isFinite(item) && item > 0); +} + +function getByPath(target, path) { + if (!path) { + return target; + } + + return path.split('.').reduce((current, key) => current?.[key], target); +} + +function validateArrayResponse(response, path = 'content') { + try { + const body = response.json(); + return Array.isArray(getByPath(body, path)); + } catch (error) { + return false; + } +} + +const DEFAULT_LIMIT = Number(__ENV.DEFAULT_LIMIT || 20); +const DEFAULT_LANGUAGE_CODE = __ENV.LANGUAGE_CODE || 'EN'; +const DEFAULT_SORT_BY = __ENV.SORT_BY || 'created_at'; +const DEFAULT_PAGE = Number(__ENV.PAGE || 1); +const PROGRESS_FILTERS = splitCsv(__ENV.PROGRESS_FILTERS, 'NOT_STARTED,IN_PROGRESS,COMPLETED'); +const PAGINATION_LIMITS = splitNumberCsv(__ENV.PAGINATION_LIMITS, '10,20,50,100'); + +function buildBooksBaseQuery() { + return { + languageCode: DEFAULT_LANGUAGE_CODE, + sortBy: DEFAULT_SORT_BY, + page: DEFAULT_PAGE, + limit: DEFAULT_LIMIT, + }; +} + +const endpointCatalog = { + 'books.default_list': { + name: 'books.default_list', + tag: 'books', + buildRequest: () => ({ + path: '/api/v1/books', + query: buildBooksBaseQuery(), + variant: 'default_list', + }), + validate: (response) => validateArrayResponse(response, 'data'), + }, + 'books.progress_filter': { + name: 'books.progress_filter', + tag: 'books', + buildRequest: ({ iteration }) => { + const progress = PROGRESS_FILTERS[iteration % PROGRESS_FILTERS.length]; + return { + path: '/api/v1/books', + query: { + ...buildBooksBaseQuery(), + 
progress, + }, + variant: `progress_${progress.toLowerCase()}`, + }; + }, + validate: (response) => validateArrayResponse(response, 'data'), + }, + 'books.pagination': { + name: 'books.pagination', + tag: 'books', + buildRequest: ({ iteration }) => { + const limit = PAGINATION_LIMITS[iteration % PAGINATION_LIMITS.length]; + return { + path: '/api/v1/books', + query: { + ...buildBooksBaseQuery(), + limit, + }, + variant: `pagination_limit_${limit}`, + }; + }, + validate: (response) => validateArrayResponse(response, 'data'), + }, +}; + +function createCustomEndpoint() { + const path = __ENV.ENDPOINT_PATH; + const tag = __ENV.ENDPOINT_TAG || 'custom'; + const variant = __ENV.ENDPOINT_VARIANT || tag; + const arrayPath = __ENV.ENDPOINT_EXPECTS_ARRAY_AT || ''; + + return { + name: 'custom.endpoint', + tag, + buildRequest: () => ({ + path, + variant, + query: {}, + }), + validate: (response) => (arrayPath ? validateArrayResponse(response, arrayPath) : true), + }; +} + +export function resolveEndpointByName(name) { + const endpoint = endpointCatalog[name]; + + if (!endpoint) { + throw new Error(`Unknown endpoint: ${name}`); + } + + return endpoint; +} + +export function resolveSingleEndpoint() { + if (__ENV.ENDPOINT_PATH) { + return createCustomEndpoint(); + } + + return resolveEndpointByName(__ENV.ENDPOINT_NAME || 'books.default_list'); +} + +export function resolveEndpointSet() { + if (__ENV.ENDPOINT_PATH) { + return [createCustomEndpoint()]; + } + + return splitCsv(__ENV.ENDPOINT_NAMES, 'books.default_list,books.progress_filter,books.pagination') + .map(resolveEndpointByName); +} + +export function resolveWeights(count) { + const weights = splitNumberCsv(__ENV.ENDPOINT_WEIGHTS, ''); + + if (weights.length === count) { + return weights; + } + + return Array.from({ length: count }, () => 1); +} + +export function selectWeightedEndpoint(endpoints, weights, iteration) { + const totalWeight = weights.reduce((sum, value) => sum + value, 0); + let cursor = iteration % 
totalWeight; + + for (let index = 0; index < endpoints.length; index += 1) { + cursor -= weights[index]; + if (cursor < 0) { + return endpoints[index]; + } + } + + return endpoints[endpoints.length - 1]; +} diff --git a/k6/scripts/common/http.js b/k6/scripts/common/http.js new file mode 100644 index 00000000..966a2d97 --- /dev/null +++ b/k6/scripts/common/http.js @@ -0,0 +1,71 @@ +import http from 'k6/http'; +import { check, sleep } from 'k6'; + +const BASE_URL = (__ENV.BASE_URL || 'http://host.docker.internal:8080').replace(/\/$/, ''); +const TEST_USERNAME = __ENV.TEST_USERNAME || ''; +const AUTH_TOKEN = __ENV.AUTH_TOKEN || ''; +const THINK_TIME = Number(__ENV.THINK_TIME || 1); + +export function buildHeaders() { + const headers = { + Accept: 'application/json', + }; + + if (TEST_USERNAME) { + headers['X-Test-Username'] = TEST_USERNAME; + } + + if (AUTH_TOKEN) { + headers.Authorization = `Bearer ${AUTH_TOKEN}`; + } + + return headers; +} + +export function getSharedTestInfo() { + return { + baseUrl: BASE_URL, + testUsername: TEST_USERNAME || null, + thinkTime: THINK_TIME, + }; +} + +export function buildUrl(path, query = {}) { + const normalizedPath = path.startsWith('http') + ? path + : `${BASE_URL}${path.startsWith('/') ? path : `/${path}`}`; + const queryString = Object.entries(query) + .filter(([, value]) => value !== undefined && value !== null && value !== '') + .map(([key, value]) => `${encodeURIComponent(key)}=${encodeURIComponent(value)}`) + .join('&'); + + if (!queryString) { + return normalizedPath; + } + + return `${normalizedPath}${normalizedPath.includes('?') ? 
'&' : '?'}${queryString}`; +} + +export function requestEndpoint(endpoint, metrics, iteration) { + const request = endpoint.buildRequest({ iteration }); + const url = buildUrl(request.path, request.query); + const response = http.get(url, { + headers: buildHeaders(), + tags: { + endpoint: endpoint.tag, + variant: request.variant, + }, + }); + + const success = check(response, { + [`${endpoint.tag} status is 200`]: (res) => res.status === 200, + [`${endpoint.tag} response shape is valid`]: (res) => endpoint.validate(res), + }); + + metrics.success.add(success); + metrics.duration.add(response.timings.duration); + + sleep(THINK_TIME); + + return response; +} diff --git a/k6/scripts/common/profiles.js b/k6/scripts/common/profiles.js new file mode 100644 index 00000000..96d30d37 --- /dev/null +++ b/k6/scripts/common/profiles.js @@ -0,0 +1,56 @@ +function positiveNumber(value, fallback) { + const parsed = Number(value); + return Number.isFinite(parsed) && parsed > 0 ? parsed : fallback; +} + +function buildRampingScenario(stages, exec = 'default') { + return { + executor: 'ramping-vus', + exec, + stages, + gracefulRampDown: '5s', + }; +} + +const BASELINE_VUS = positiveNumber(__ENV.BASELINE_VUS, 5); +const TARGET_VUS = positiveNumber(__ENV.TARGET_VUS, 20); +const STRESS_TARGET_VUS = positiveNumber(__ENV.STRESS_TARGET_VUS, 60); + +export function createProfileScenario(profileName, exec = 'default') { + switch (profileName) { + case 'baseline': + return buildRampingScenario([ + { duration: __ENV.BASELINE_RAMP_UP_DURATION || '15s', target: BASELINE_VUS }, + { duration: __ENV.BASELINE_STEADY_DURATION || '45s', target: BASELINE_VUS }, + { duration: __ENV.BASELINE_RAMP_DOWN_DURATION || '10s', target: 0 }, + ], exec); + case 'stress': + return buildRampingScenario([ + { duration: __ENV.STRESS_RAMP_UP_DURATION || '20s', target: TARGET_VUS }, + { duration: __ENV.STRESS_STEP_DURATION || '30s', target: STRESS_TARGET_VUS }, + { duration: __ENV.STRESS_STEADY_DURATION || '30s', 
target: STRESS_TARGET_VUS }, + { duration: __ENV.STRESS_RAMP_DOWN_DURATION || '15s', target: 0 }, + ], exec); + case 'load': + default: + return buildRampingScenario([ + { duration: __ENV.RAMP_UP_DURATION || '30s', target: TARGET_VUS }, + { duration: __ENV.STEADY_DURATION || '1m', target: TARGET_VUS }, + { duration: __ENV.RAMP_DOWN_DURATION || '10s', target: 0 }, + ], exec); + } +} + +export function createProfileOptions(profileName, metricPrefix, exec = 'default') { + return { + scenarios: { + [profileName]: createProfileScenario(profileName, exec), + }, + thresholds: { + http_req_failed: ['rate<0.05'], + http_req_duration: ['p(95)<1000'], + [`${metricPrefix}_success`]: ['rate>0.95'], + [`${metricPrefix}_duration`]: ['p(95)<1000'], + }, + }; +} diff --git a/k6/scripts/common/summary.js b/k6/scripts/common/summary.js new file mode 100644 index 00000000..94087478 --- /dev/null +++ b/k6/scripts/common/summary.js @@ -0,0 +1,47 @@ +import { Rate, Trend } from 'k6/metrics'; + +export function createMetrics(metricPrefix) { + return { + prefix: metricPrefix, + success: new Rate(`${metricPrefix}_success`), + duration: new Trend(`${metricPrefix}_duration`, true), + }; +} + +function metricSnapshot(metric) { + const values = metric?.values || {}; + return { + avg: Math.round(values.avg || 0), + min: Math.round(values.min || 0), + max: Math.round(values.max || 0), + p95: Math.round(values['p(95)'] || 0), + p99: Math.round(values['p(99)'] || 0), + rate: Math.round((values.rate || 0) * 10000) / 100, + count: values.count || 0, + }; +} + +export function createSummaryHandler(metrics, metadata) { + return function handleSummary(data) { + const timestamp = new Date().toISOString().replace(/[:.]/g, '-'); + const analysis = { + test_info: metadata, + metric_name: metrics.prefix, + scenario_metrics: { + success: metricSnapshot(data.metrics[`${metrics.prefix}_success`]), + duration: metricSnapshot(data.metrics[`${metrics.prefix}_duration`]), + }, + transport_metrics: { + 
http_req_duration: metricSnapshot(data.metrics.http_req_duration), + http_req_failed: metricSnapshot(data.metrics.http_req_failed), + http_reqs: metricSnapshot(data.metrics.http_reqs), + data_received: metricSnapshot(data.metrics.data_received), + }, + }; + + return { + [`/reports/${metrics.prefix}-test-${timestamp}.json`]: JSON.stringify(data, null, 2), + [`/reports/${metrics.prefix}-analysis-${timestamp}.json`]: JSON.stringify(analysis, null, 2), + }; + }; +} diff --git a/k6/scripts/load.js b/k6/scripts/load.js new file mode 100644 index 00000000..7ca42dd1 --- /dev/null +++ b/k6/scripts/load.js @@ -0,0 +1,19 @@ +import { requestEndpoint, getSharedTestInfo } from './common/http.js'; +import { resolveSingleEndpoint } from './common/endpoints.js'; +import { createProfileOptions } from './common/profiles.js'; +import { createMetrics, createSummaryHandler } from './common/summary.js'; + +const endpoint = resolveSingleEndpoint(); +const metrics = createMetrics('load'); + +export const options = createProfileOptions('load', metrics.prefix); + +export default function () { + requestEndpoint(endpoint, metrics, __ITER); +} + +export const handleSummary = createSummaryHandler(metrics, { + profileName: 'load', + endpointName: endpoint.name, + ...getSharedTestInfo(), +}); diff --git a/k6/scripts/mixed-load.js b/k6/scripts/mixed-load.js new file mode 100644 index 00000000..6241aa93 --- /dev/null +++ b/k6/scripts/mixed-load.js @@ -0,0 +1,22 @@ +import { requestEndpoint, getSharedTestInfo } from './common/http.js'; +import { resolveEndpointSet, resolveWeights, selectWeightedEndpoint } from './common/endpoints.js'; +import { createProfileOptions } from './common/profiles.js'; +import { createMetrics, createSummaryHandler } from './common/summary.js'; + +const endpoints = resolveEndpointSet(); +const weights = resolveWeights(endpoints.length); +const metrics = createMetrics('mixed_load'); + +export const options = createProfileOptions('load', metrics.prefix); + +export default 
function () {
+  const endpoint = selectWeightedEndpoint(endpoints, weights, __ITER);
+  requestEndpoint(endpoint, metrics, __ITER);
+}
+
+export const handleSummary = createSummaryHandler(metrics, {
+  profileName: 'load',
+  endpointNames: endpoints.map((endpoint) => endpoint.name),
+  weights,
+  ...getSharedTestInfo(),
+});
diff --git a/k6/scripts/stress.js b/k6/scripts/stress.js
new file mode 100644
index 00000000..02667a1c
--- /dev/null
+++ b/k6/scripts/stress.js
@@ -0,0 +1,19 @@
+import { requestEndpoint, getSharedTestInfo } from './common/http.js';
+import { resolveSingleEndpoint } from './common/endpoints.js';
+import { createProfileOptions } from './common/profiles.js';
+import { createMetrics, createSummaryHandler } from './common/summary.js';
+
+const endpoint = resolveSingleEndpoint();
+const metrics = createMetrics('stress');
+
+export const options = createProfileOptions('stress', metrics.prefix);
+
+export default function () {
+  requestEndpoint(endpoint, metrics, __ITER);
+}
+
+export const handleSummary = createSummaryHandler(metrics, {
+  profileName: 'stress',
+  endpointName: endpoint.name,
+  ...getSharedTestInfo(),
+});
diff --git a/k6/seed/README.md b/k6/seed/README.md
new file mode 100644
index 00000000..449a18ea
--- /dev/null
+++ b/k6/seed/README.md
@@ -0,0 +1,72 @@
+# K6 Seed Scripts
+
+이 디렉토리에는 로컬 `k6` 성능 실험용 시드 생성 스크립트를 둔다.
+도메인 구조가 바로 드러나도록 `k6/seed/<도메인>/<세부 영역>/...` 형태를 기본 원칙으로 사용한다.
+
+## 디렉토리 원칙
+
+- `k6/seed/content/book/...`
+- `k6/seed/streak/...`
+- `k6/seed/word/...`
+
+생성 결과물이나 임시 데이터는 `k6/data`에 두고, 버전 관리 대상은 `k6/seed` 아래에 둔다.
+
+## 포함된 스크립트
+
+- `content/book/seed-books-content.mongosh.js`
+  - `user`, `books`, `chapters`, `chunks`, `bookProgress` 컬렉션에 재실행 가능한 업서트 시드를 넣는다.
+  - 기본 분포는 `short / medium / long` 책 구성을 섞고, 챕터 수와 청크 수를 현실적인 범위로 퍼뜨린다.
+  - 같은 콘텐츠 그래프를 기준으로 `NOT_STARTED`, `IN_PROGRESS`, `COMPLETED` 상태가 사용자별로 자연스럽게 섞인 `bookProgress`를 함께 생성한다.
+
+## 실행
+
+로컬 MongoDB 컨테이너가 떠 있는 상태에서 실행한다.
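실행 명령에 앞서 한 가지 짚어두면, 스크립트가 재실행 가능한 이유는 모든 문서를 결정적 `_id` 기반 업서트로 쓰기 때문이다. `seed-books-content.mongosh.js`의 `bulkWrite` 연산 형태를 단순화하면 다음과 같다(`toUpsertOps`라는 헬퍼 이름은 설명용 가정이다).

```javascript
// Why re-running the seed is safe: each document is written as an upsert
// keyed by a deterministic _id, so a second run overwrites the same documents
// instead of duplicating them. This mirrors the bulkWrite op shape built by
// upsertBooks/upsertChunks in seed-books-content.mongosh.js.
function toUpsertOps(documents) {
  return documents.map((doc) => {
    const { id, ...rest } = doc; // the script promotes `id` to the Mongo _id
    return {
      updateOne: {
        filter: { _id: id },
        update: { $set: rest },
        upsert: true,
      },
    };
  });
}

const ops = toUpsertOps([{ id: 'k6seed-book-0001', title: 'Sample Book' }]);
```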
+ +```bash +docker compose exec -T mongo mongosh llv_api_local < k6/seed/content/book/seed-books-content.mongosh.js +``` + +환경변수를 조정해서 규모를 바꿀 수 있다. + +```bash +SEED_PREFIX=k6seed \ +BOOK_COUNT=180 \ +USER_COUNT=4 \ +EXTRA_DIFFICULTY_RATIO=0.25 \ +docker compose exec -T mongo mongosh llv_api_local < k6/seed/content/book/seed-books-content.mongosh.js +``` + +같은 prefix 데이터만 지우고 다시 만들고 싶으면 `RESET_EXISTING=true` 를 함께 준다. + +```bash +RESET_EXISTING=true \ +docker compose exec -T mongo mongosh llv_api_local < k6/seed/content/book/seed-books-content.mongosh.js +``` + +## 기본 특성 + +- 책 수 기본값: `240` +- 사용자 수 기본값: `4` +- 책 길이 분포: + - `short`: 20% + - `medium`: 60% + - `long`: 20% +- 기본 난이도 기준 챕터당 평균 청크 수는 약 `30개`를 목표로 둔다. +- 프로필별 청크 범위: + - `short`: 18~24 + - `medium`: 26~34 + - `long`: 34~42 +- 각 책은 기본 난이도 1개를 갖고, 일부는 인접 난이도 청크를 추가로 가진다. +- 이미지 청크는 소량만 섞어서 응답 shape 를 단조롭게 만들지 않는다. +- `bookProgress`는 사용자 프로필별로 분포를 다르게 준다. + - `mostly-unread` + - `balanced` + - `active-reader` + - `completion-heavy` +- `NOT_STARTED`는 progress 문서를 만들지 않는 방식으로 표현한다. +- `IN_PROGRESS`는 `chapterProgresses`와 `maxReadChunkNumber`를 함께 채워서 현재 필터와 V3 응답 계산이 모두 자연스럽게 동작하게 한다. + +## 주의 + +- 현재 스크립트는 `books` 콘텐츠 그래프와 `bookProgress`를 함께 만든다. +- 이미 같은 prefix로 넣은 시드를 다시 깔끔하게 만들고 싶으면 `RESET_EXISTING=true`로 재실행한다. diff --git a/k6/seed/content/book/seed-books-content.mongosh.js b/k6/seed/content/book/seed-books-content.mongosh.js new file mode 100644 index 00000000..3b72cf06 --- /dev/null +++ b/k6/seed/content/book/seed-books-content.mongosh.js @@ -0,0 +1,728 @@ +/* + * K6 content seed for local MongoDB. + * + * Usage: + * docker compose exec -T mongo mongosh llv_api_local < k6/seed/content/book/seed-books-content.mongosh.js + * + * Optional env vars: + * SEED_PREFIX=k6seed + * BOOK_COUNT=240 + * USER_COUNT=4 + * EXTRA_DIFFICULTY_RATIO=0.3 + * RESET_EXISTING=false + */ + +function seedBooksContent() { + const env = typeof process !== 'undefined' ? 
process.env : {}; + + const config = { + seedPrefix: env.SEED_PREFIX || 'k6seed', + bookCount: positiveInt(env.BOOK_COUNT, 240), + userCount: positiveInt(env.USER_COUNT, 4), + extraDifficultyRatio: boundedNumber(env.EXTRA_DIFFICULTY_RATIO, 0.3, 0, 1), + resetExisting: parseBoolean(env.RESET_EXISTING || 'false'), + now: new Date(), + }; + + const random = createRandom(config.seedPrefix); + const databaseName = db.getName(); + + const collections = { + users: db.getCollection('user'), + books: db.getCollection('books'), + chapters: db.getCollection('chapters'), + chunks: db.getCollection('chunks'), + bookProgresses: db.getCollection('bookProgress'), + }; + + print(`[seed] Database: ${databaseName}`); + print(`[seed] Prefix: ${config.seedPrefix}`); + print(`[seed] Books: ${config.bookCount}, Users: ${config.userCount}`); + + if (config.resetExisting) { + resetExistingSeed(collections, config.seedPrefix); + } + + const users = buildUsers(config); + const content = buildContentGraph(config, random); + const progressSeed = buildBookProgresses(config, users, content.bookCatalog, createRandom(`${config.seedPrefix}:progress`)); + print(`[seed] Prepared bookProgress documents: ${progressSeed.bookProgresses.length}`); + + upsertUsers(collections.users, users); + upsertBooks(collections.books, content.books); + upsertChapters(collections.chapters, content.chapters); + upsertChunks(collections.chunks, content.chunks); + print('[seed] Chunk upserts completed'); + upsertBookProgresses(collections.bookProgresses, progressSeed.bookProgresses); + print('[seed] Book progress upserts completed'); + + print('[seed] Completed successfully'); + printjson({ + users: users.length, + books: content.books.length, + chapters: content.chapters.length, + chunks: content.chunks.length, + bookProgresses: progressSeed.bookProgresses.length, + progressProfiles: progressSeed.summaryByUser, + }); + + if (typeof quit === 'function') { + quit(0); + } +} + +function resetExistingSeed(collections, 
seedPrefix) { + const idRegex = new RegExp(`^${escapeRegex(seedPrefix)}-`); + const usernameRegex = new RegExp(`^${escapeRegex(seedPrefix)}-user-`); + + const deletedProgresses = collections.bookProgresses.deleteMany({ + $or: [ + { _id: idRegex }, + { id: idRegex }, + { userId: idRegex }, + { bookId: idRegex }, + ], + }).deletedCount; + const deletedChunks = collections.chunks.deleteMany({ + $or: [ + { _id: idRegex }, + { id: idRegex }, + ], + }).deletedCount; + const deletedChapters = collections.chapters.deleteMany({ + $or: [ + { _id: idRegex }, + { id: idRegex }, + ], + }).deletedCount; + const deletedBooks = collections.books.deleteMany({ + $or: [ + { _id: idRegex }, + { id: idRegex }, + ], + }).deletedCount; + const deletedUsers = collections.users.deleteMany({ + $or: [ + { _id: idRegex }, + { username: usernameRegex }, + ], + }).deletedCount; + + print(`[seed] Reset existing seed docs: users=${deletedUsers}, books=${deletedBooks}, chapters=${deletedChapters}, chunks=${deletedChunks}, bookProgresses=${deletedProgresses}`); +} + +function buildUsers(config) { + const users = []; + + for (let index = 1; index <= config.userCount; index += 1) { + const username = `${config.seedPrefix}-user-${pad(index, 2)}`; + + users.push({ + id: `${config.seedPrefix}-user-doc-${pad(index, 2)}`, + username, + email: `${username}@example.local`, + displayName: `K6 Seed User ${index}`, + provider: 'test', + profileImageUrl: `https://static.linglevel.local/profiles/${config.seedPrefix}/${pad(index, 2)}.png`, + role: 'USER', + deleted: false, + createdAt: daysAgo(index * 7), + deletedAt: null, + }); + } + + return users; +} + +function buildContentGraph(config, random) { + const books = []; + const chapters = []; + const chunks = []; + const bookCatalog = []; + + for (let bookIndex = 1; bookIndex <= config.bookCount; bookIndex += 1) { + const profile = pickBookProfile(random); + const bookId = `${config.seedPrefix}-book-${pad(bookIndex, 4)}`; + const createdAt = 
daysAgo(randomInt(random, 0, 365)); + const titleSeed = buildTitleSeed(bookIndex); + const primaryDifficulty = pickPrimaryDifficulty(random); + const difficultyLevels = buildDifficultyLevels(primaryDifficulty, config.extraDifficultyRatio, random); + const bookChapterCatalog = []; + + let totalReadingTime = 0; + const chapterCount = randomInt(random, profile.chapterRange.min, profile.chapterRange.max); + + for (let chapterNumber = 1; chapterNumber <= chapterCount; chapterNumber += 1) { + const chapterId = `${bookId}-chapter-${pad(chapterNumber, 2)}`; + const chunkPlan = buildChunkPlan(profile, chapterNumber, random); + const chapterReadingTime = estimateChapterReadingTime(chunkPlan.primaryChunkCount, difficultyLevels.length); + const chapterCatalog = { + id: chapterId, + chapterNumber, + chunkIdsByDifficulty: {}, + }; + + chapters.push({ + id: chapterId, + bookId, + chapterNumber, + title: `Chapter ${chapterNumber}. ${buildChapterTitle(titleSeed.baseNoun, chapterNumber)}`, + chapterImageUrl: chapterNumber % 5 === 0 + ? 
`https://static.linglevel.local/books/${bookId}/chapters/${pad(chapterNumber, 2)}.jpg` + : null, + description: `${titleSeed.baseAdjective} events unfold around ${titleSeed.baseNoun.toLowerCase()} in chapter ${chapterNumber}.`, + readingTime: chapterReadingTime, + }); + + totalReadingTime += chapterReadingTime; + + for (let levelIndex = 0; levelIndex < difficultyLevels.length; levelIndex += 1) { + const difficultyLevel = difficultyLevels[levelIndex]; + const chunkCount = adjustChunkCountByLevel(chunkPlan.primaryChunkCount, levelIndex, random); + + for (let chunkNumber = 1; chunkNumber <= chunkCount; chunkNumber += 1) { + const chunkId = `${chapterId}-${difficultyLevel.toLowerCase()}-chunk-${pad(chunkNumber, 2)}`; + const isImage = shouldCreateImageChunk(chunkNumber, chapterNumber, random); + + if (!chapterCatalog.chunkIdsByDifficulty[difficultyLevel]) { + chapterCatalog.chunkIdsByDifficulty[difficultyLevel] = []; + } + chapterCatalog.chunkIdsByDifficulty[difficultyLevel].push(chunkId); + + chunks.push({ + id: chunkId, + chapterId, + chunkNumber, + difficultyLevel, + type: isImage ? 'IMAGE' : 'TEXT', + content: isImage + ? `https://static.linglevel.local/books/${bookId}/chapters/${pad(chapterNumber, 2)}/images/${difficultyLevel.toLowerCase()}-${pad(chunkNumber, 2)}.jpg` + : buildChunkText(titleSeed, chapterNumber, chunkNumber, difficultyLevel), + description: isImage + ? 
`${titleSeed.baseAdjective} illustration for chapter ${chapterNumber}, chunk ${chunkNumber}.` + : null, + }); + } + } + + bookChapterCatalog.push(chapterCatalog); + } + + books.push({ + id: bookId, + title: `${titleSeed.baseAdjective} ${titleSeed.baseNoun}`, + titleTranslations: { + ko: `${titleSeed.baseNounKo}의 ${titleSeed.baseAdjectiveKo}`, + ja: `${titleSeed.baseAdjectiveJa} ${titleSeed.baseNounJa}`, + }, + author: buildAuthorName(bookIndex), + coverImageUrl: `https://static.linglevel.local/books/${bookId}/cover-small.jpg`, + difficultyLevel: primaryDifficulty, + chapterCount, + readingTime: totalReadingTime, + averageRating: buildAverageRating(random), + reviewCount: buildReviewCount(profile, random), + viewCount: buildViewCount(profile, random), + tags: buildTags(profile, random), + createdAt, + }); + + bookCatalog.push({ + id: bookId, + primaryDifficulty, + chapterCount, + chapters: bookChapterCatalog, + }); + } + + return { books, chapters, chunks, bookCatalog }; +} + +function upsertUsers(collection, users) { + collection.bulkWrite( + users.map((user) => ({ + updateOne: { + filter: { _id: user.id }, + update: { $set: toPersistedDocument(user) }, + upsert: true, + }, + })), + { ordered: false } + ); +} + +function upsertBooks(collection, books) { + collection.bulkWrite( + books.map((book) => ({ + updateOne: { + filter: { _id: book.id }, + update: { $set: toPersistedDocument(book) }, + upsert: true, + }, + })), + { ordered: false } + ); +} + +function upsertChapters(collection, chapters) { + collection.bulkWrite( + chapters.map((chapter) => ({ + updateOne: { + filter: { _id: chapter.id }, + update: { $set: toPersistedDocument(chapter) }, + upsert: true, + }, + })), + { ordered: false } + ); +} + +function upsertChunks(collection, chunks) { + const batchSize = 1000; + + for (let index = 0; index < chunks.length; index += batchSize) { + const batch = chunks.slice(index, index + batchSize); + + collection.bulkWrite( + batch.map((chunk) => ({ + updateOne: { + 
filter: { _id: chunk.id }, + update: { $set: toPersistedDocument(chunk) }, + upsert: true, + }, + })), + { ordered: false } + ); + } +} + +function upsertBookProgresses(collection, bookProgresses) { + if (bookProgresses.length === 0) { + return; + } + + collection.bulkWrite( + bookProgresses.map((progress) => ({ + updateOne: { + filter: { _id: progress.id }, + update: { $set: toPersistedDocument(progress) }, + upsert: true, + }, + })), + { ordered: false } + ); +} + +function buildBookProgresses(config, users, bookCatalog, random) { + const progressProfiles = [ + { + name: 'mostly-unread', + weights: { NOT_STARTED: 0.72, IN_PROGRESS: 0.18, COMPLETED: 0.10 }, + }, + { + name: 'balanced', + weights: { NOT_STARTED: 0.45, IN_PROGRESS: 0.35, COMPLETED: 0.20 }, + }, + { + name: 'active-reader', + weights: { NOT_STARTED: 0.25, IN_PROGRESS: 0.45, COMPLETED: 0.30 }, + }, + { + name: 'completion-heavy', + weights: { NOT_STARTED: 0.12, IN_PROGRESS: 0.28, COMPLETED: 0.60 }, + }, + ]; + + const bookProgresses = []; + const summaryByUser = []; + + users.forEach((user, userIndex) => { + const profile = progressProfiles[userIndex % progressProfiles.length]; + const counts = { + NOT_STARTED: 0, + IN_PROGRESS: 0, + COMPLETED: 0, + }; + + bookCatalog.forEach((bookEntry, bookIndex) => { + const status = pickWeightedProgressStatus(profile.weights, random); + counts[status] += 1; + + if (status === 'NOT_STARTED') { + return; + } + + const progressId = `${config.seedPrefix}-book-progress-${pad(userIndex + 1, 2)}-${pad(bookIndex + 1, 4)}`; + + if (status === 'COMPLETED') { + bookProgresses.push(buildCompletedBookProgress(progressId, user, bookEntry, random)); + return; + } + + bookProgresses.push(buildInProgressBookProgress(progressId, user, bookEntry, random)); + }); + + summaryByUser.push({ + username: user.username, + profile: profile.name, + counts, + }); + }); + + return { bookProgresses, summaryByUser }; +} + +function toPersistedDocument(entity) { + const document = 
Object.assign({}, entity); + delete document.id; + return document; +} + +function buildCompletedBookProgress(progressId, user, bookEntry, random) { + const completedAgo = randomInt(random, 7, 90); + const completedAt = daysAgo(completedAgo); + const updatedAt = daysAgo(randomInt(random, 0, completedAgo)); + const lastChapter = bookEntry.chapters[bookEntry.chapters.length - 1]; + const lastChunkIds = getChunkIdsForProgress(lastChapter, bookEntry.primaryDifficulty); + const lastChunkNumber = lastChunkIds.length; + + return { + id: progressId, + userId: user.id, + bookId: bookEntry.id, + chapterId: lastChapter.id, + chunkId: lastChunkIds[lastChunkIds.length - 1], + currentReadChapterNumber: bookEntry.chapterCount, + maxReadChapterNumber: bookEntry.chapterCount, + currentReadChunkNumber: lastChunkNumber, + maxReadChunkNumber: encodeChapterFirstPosition(bookEntry.chapterCount, lastChunkNumber), + normalizedProgress: 100, + maxNormalizedProgress: 100, + currentDifficultyLevel: bookEntry.primaryDifficulty, + chapterProgresses: bookEntry.chapters.map((chapter, index) => ({ + chapterNumber: chapter.chapterNumber, + progressPercentage: 100, + isCompleted: true, + completedAt: daysAgo(completedAgo + (bookEntry.chapterCount - index - 1)), + })), + isCompleted: true, + completedAt, + updatedAt, + }; +} + +function buildInProgressBookProgress(progressId, user, bookEntry, random) { + const chapterCount = bookEntry.chapterCount; + const minimumCompletedChapters = chapterCount >= 5 ? 
Math.floor(chapterCount * 0.2) : 0; + const maximumCompletedChapters = Math.max( + minimumCompletedChapters, + Math.min(chapterCount - 1, Math.floor(chapterCount * 0.75)) + ); + const completedChapterCount = randomInt(random, minimumCompletedChapters, maximumCompletedChapters); + const currentChapterNumber = Math.min(chapterCount, completedChapterCount + 1); + const currentChapter = bookEntry.chapters[currentChapterNumber - 1]; + const currentChunkIds = getChunkIdsForProgress(currentChapter, bookEntry.primaryDifficulty); + const minimumChunkNumber = Math.max(1, Math.floor(currentChunkIds.length * 0.25)); + const maximumChunkNumber = Math.max( + minimumChunkNumber, + Math.min(currentChunkIds.length - 1, Math.ceil(currentChunkIds.length * 0.85)) + ); + const currentReadChunkNumber = randomInt(random, minimumChunkNumber, maximumChunkNumber); + const currentChunkId = currentChunkIds[currentReadChunkNumber - 1]; + const currentChapterProgress = roundToOneDecimal((currentReadChunkNumber * 100) / currentChunkIds.length); + const updatedAt = daysAgo(randomInt(random, 0, 21)); + + return { + id: progressId, + userId: user.id, + bookId: bookEntry.id, + chapterId: currentChapter.id, + chunkId: currentChunkId, + currentReadChapterNumber: currentChapterNumber, + maxReadChapterNumber: currentChapterNumber, + currentReadChunkNumber, + maxReadChunkNumber: encodeChapterFirstPosition(currentChapterNumber, currentReadChunkNumber), + normalizedProgress: roundToOneDecimal((completedChapterCount * 100) / chapterCount), + maxNormalizedProgress: roundToOneDecimal((completedChapterCount * 100) / chapterCount), + currentDifficultyLevel: bookEntry.primaryDifficulty, + chapterProgresses: buildInProgressChapterProgresses( + bookEntry.chapters, + completedChapterCount, + currentChapterNumber, + currentChapterProgress, + random + ), + isCompleted: false, + completedAt: null, + updatedAt, + }; +} + +function buildInProgressChapterProgresses(chapters, completedChapterCount, currentChapterNumber, 
currentChapterProgress, random) {
+  const chapterProgresses = [];
+
+  for (let index = 0; index < completedChapterCount; index += 1) {
+    chapterProgresses.push({
+      chapterNumber: chapters[index].chapterNumber,
+      progressPercentage: 100,
+      isCompleted: true,
+      completedAt: daysAgo(randomInt(random, 2, 45)),
+    });
+  }
+
+  chapterProgresses.push({
+    chapterNumber: currentChapterNumber,
+    progressPercentage: currentChapterProgress,
+    isCompleted: false,
+    completedAt: null,
+  });
+
+  return chapterProgresses;
+}
+
+function pickWeightedProgressStatus(weights, random) {
+  const value = random();
+
+  if (value < weights.NOT_STARTED) {
+    return 'NOT_STARTED';
+  }
+
+  if (value < weights.NOT_STARTED + weights.IN_PROGRESS) {
+    return 'IN_PROGRESS';
+  }
+
+  return 'COMPLETED';
+}
+
+function getChunkIdsForProgress(chapter, difficultyLevel) {
+  const chunkIds = chapter.chunkIdsByDifficulty[difficultyLevel] || [];
+
+  if (chunkIds.length === 0) {
+    throw new Error(`No chunks found for chapter=${chapter.id}, difficulty=${difficultyLevel}`);
+  }
+
+  return chunkIds;
+}
+
+function encodeChapterFirstPosition(chapterNumber, chunkNumber) {
+  const safeChapterNumber = Math.max(1, Number(chapterNumber) || 1);
+  const safeChunkNumber = Math.max(0, Number(chunkNumber) || 0);
+  return (safeChapterNumber * 65536) + safeChunkNumber;
+}
+
+function pickBookProfile(random) {
+  const value = random();
+
+  if (value < 0.2) {
+    return {
+      name: 'short',
+      chapterRange: { min: 6, max: 8 },
+      chunkRange: { min: 18, max: 24 },
+      tags: ['starter', 'dialogue', 'daily-life', 'school'],
+    };
+  }
+
+  if (value < 0.8) {
+    return {
+      name: 'medium',
+      chapterRange: { min: 10, max: 15 },
+      chunkRange: { min: 26, max: 34 },
+      tags: ['classic', 'growth', 'friendship', 'mystery', 'travel'],
+    };
+  }
+
+  return {
+    name: 'long',
+    chapterRange: { min: 20, max: 30 },
+    chunkRange: { min: 34, max: 42 },
+    tags: ['epic', 'history', 'adventure', 'war', 'politics'],
+  };
+}
+
+function buildChunkPlan(profile,
chapterNumber, random) {
+  const primaryChunkCount = randomInt(random, profile.chunkRange.min, profile.chunkRange.max);
+  const chapterWeight = chapterNumber % 7 === 0 ? 1 : 0;
+
+  return {
+    primaryChunkCount: primaryChunkCount + chapterWeight,
+  };
+}
+
+function pickPrimaryDifficulty(random) {
+  const levels = ['A2', 'B1', 'B2', 'C1'];
+  return levels[randomInt(random, 0, levels.length - 1)];
+}
+
+function buildDifficultyLevels(primaryDifficulty, extraDifficultyRatio, random) {
+  const order = ['A0', 'A1', 'A2', 'B1', 'B2', 'C1', 'C2'];
+  const primaryIndex = order.indexOf(primaryDifficulty);
+  const levels = [primaryDifficulty];
+
+  if (random() >= extraDifficultyRatio) {
+    return levels;
+  }
+
+  const candidates = [];
+
+  if (primaryIndex > 0) {
+    candidates.push(order[primaryIndex - 1]);
+  }
+  if (primaryIndex < order.length - 1) {
+    candidates.push(order[primaryIndex + 1]);
+  }
+
+  if (candidates.length > 0) {
+    levels.push(candidates[randomInt(random, 0, candidates.length - 1)]);
+  }
+
+  return levels;
+}
+
+function adjustChunkCountByLevel(primaryChunkCount, levelIndex, random) {
+  if (levelIndex === 0) {
+    return primaryChunkCount;
+  }
+
+  return Math.max(3, primaryChunkCount + randomInt(random, -1, 1));
+}
+
+function shouldCreateImageChunk(chunkNumber, chapterNumber, random) {
+  if ((chapterNumber + chunkNumber) % 11 === 0) {
+    return true;
+  }
+
+  return random() < 0.03;
+}
+
+function buildChunkText(titleSeed, chapterNumber, chunkNumber, difficultyLevel) {
+  const sentence = `${titleSeed.baseAdjective} ${titleSeed.baseNoun.toLowerCase()} moves through chapter ${chapterNumber}, section ${chunkNumber}, at ${difficultyLevel} pace.`;
+  return [
+    sentence,
+    'The character observes small details, reacts to change, and keeps the scene moving with clear narrative beats.',
+    'This placeholder text is intentionally stable so local k6 comparisons focus on query cost rather than random payload drift.',
+  ].join(' ');
+}
+
+function buildTitleSeed(bookIndex) {
+  const adjectives = ['Silent', 'Hidden', 'Burning', 'Golden', 'Fading', 'Northern', 'Restless', 'Last'];
+  const nouns = ['Garden', 'Harbor', 'Compass', 'Archive', 'Forest', 'Letters', 'Skyline', 'Bridge'];
+  const adjectivesKo = ['조용한', '숨겨진', '타오르는', '황금빛', '희미한', '북쪽의', '불안한', '마지막'];
+  const nounsKo = ['정원', '항구', '나침반', '기록보관소', '숲', '편지', '스카이라인', '다리'];
+  const adjectivesJa = ['静かな', '隠された', '燃える', '黄金の', '薄れる', '北の', '落ち着かない', '最後の'];
+  const nounsJa = ['庭', '港', '羅針盤', '記録庫', '森', '手紙', 'スカイライン', '橋'];
+
+  const adjectiveIndex = bookIndex % adjectives.length;
+  const nounIndex = Math.floor(bookIndex / adjectives.length) % nouns.length;
+
+  return {
+    baseAdjective: adjectives[adjectiveIndex],
+    baseNoun: nouns[nounIndex],
+    baseAdjectiveKo: adjectivesKo[adjectiveIndex],
+    baseNounKo: nounsKo[nounIndex],
+    baseAdjectiveJa: adjectivesJa[adjectiveIndex],
+    baseNounJa: nounsJa[nounIndex],
+  };
+}
+
+function buildChapterTitle(baseNoun, chapterNumber) {
+  const patterns = ['Arrival', 'Signal', 'Detour', 'Witness', 'Crossing', 'Turn', 'Distance', 'Echo'];
+  return `${patterns[(chapterNumber - 1) % patterns.length]} of the ${baseNoun}`;
+}
+
+function buildAuthorName(bookIndex) {
+  const firstNames = ['Mina', 'Elias', 'Harper', 'Jun', 'Noah', 'Sora', 'Lena', 'Theo'];
+  const lastNames = ['Park', 'Rivera', 'Bennett', 'Tanaka', 'Kim', 'Silva', 'Walker', 'Ito'];
+
+  return `${firstNames[bookIndex % firstNames.length]} ${lastNames[Math.floor(bookIndex / firstNames.length) % lastNames.length]}`;
+}
+
+function buildAverageRating(random) {
+  return Number((3.4 + random() * 1.4).toFixed(1));
+}
+
+function buildReviewCount(profile, random) {
+  const multiplier = profile.name === 'long' ? 1.4 : profile.name === 'short' ? 0.7 : 1;
+  return Math.round((20 + random() * 180) * multiplier);
+}
+
+function buildViewCount(profile, random) {
+  const base = profile.name === 'long' ? 600 : profile.name === 'short' ?
80 : 250;
+  const heavyTail = Math.pow(random(), 0.35);
+  return Math.round(base + heavyTail * 6000);
+}
+
+function buildTags(profile, random) {
+  const tags = [];
+  const candidates = profile.tags.slice();
+  const tagCount = randomInt(random, 1, Math.min(3, candidates.length));
+
+  while (tags.length < tagCount) {
+    const index = randomInt(random, 0, candidates.length - 1);
+    tags.push(candidates.splice(index, 1)[0]);
+  }
+
+  return tags;
+}
+
+function estimateChapterReadingTime(primaryChunkCount, difficultyCount) {
+  return Math.max(3, Math.round((primaryChunkCount * (difficultyCount > 1 ? 1.15 : 1)) * 1.1));
+}
+
+function createRandom(seedString) {
+  let seed = 0;
+
+  for (let index = 0; index < seedString.length; index += 1) {
+    seed = (seed * 31 + seedString.charCodeAt(index)) >>> 0;
+  }
+
+  return function next() {
+    seed = (seed + 0x6D2B79F5) >>> 0;
+    let value = seed;
+    value = Math.imul(value ^ (value >>> 15), value | 1);
+    value ^= value + Math.imul(value ^ (value >>> 7), value | 61);
+    return ((value ^ (value >>> 14)) >>> 0) / 4294967296;
+  };
+}
+
+function randomInt(random, min, max) {
+  return Math.floor(random() * (max - min + 1)) + min;
+}
+
+function positiveInt(value, fallback) {
+  const parsed = Number(value);
+  return Number.isInteger(parsed) && parsed > 0 ?
parsed : fallback;
+}
+
+function boundedNumber(value, fallback, min, max) {
+  const parsed = Number(value);
+
+  if (!Number.isFinite(parsed)) {
+    return fallback;
+  }
+
+  return Math.min(max, Math.max(min, parsed));
+}
+
+function roundToOneDecimal(value) {
+  return Math.round(value * 10) / 10;
+}
+
+function parseBoolean(value) {
+  return ['1', 'true', 'yes', 'on'].includes(String(value).toLowerCase());
+}
+
+function pad(value, length) {
+  return String(value).padStart(length, '0');
+}
+
+function daysAgo(days) {
+  const date = new Date();
+  date.setUTCDate(date.getUTCDate() - days);
+  return date;
+}
+
+function escapeRegex(value) {
+  return value.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
+}
+
+seedBooksContent();
diff --git a/monitoring/docker-compose.monitoring-local.yml b/monitoring/docker-compose.monitoring-local.yml
index 839cefef..78ee96c4 100644
--- a/monitoring/docker-compose.monitoring-local.yml
+++ b/monitoring/docker-compose.monitoring-local.yml
@@ -3,24 +3,64 @@ services:
     image: prom/prometheus:latest
     container_name: prometheus-local
     ports:
-      - "9090:9090"
+      - "${PROMETHEUS_PORT:-9090}:9090"
     env_file:
-      - path: .env.monitoring.local
+      - ../.env.local
     volumes:
-      - ./prometheus-local.yml:/etc/prometheus/prometheus.yml
+      - ./prometheus.yml:/etc/prometheus/prometheus.yml.tmpl:ro
+      - ./alert-rules.yml:/etc/prometheus/alert-rules.yml:ro
       - ./data/prometheus-data-local:/prometheus
+    entrypoint: /bin/sh
+    command:
+      - -ec
+      - |
+        TARGET="$${APP_METRICS_TARGET:-host.docker.internal:8080}";
+        KEY="$${IMPORT_API_KEY:-}";
+        ESC_TARGET="$$(printf '%s' "$$TARGET" | sed 's/[&|]/\\&/g')";
+        ESC_KEY="$$(printf '%s' "$$KEY" | sed 's/[&|]/\\&/g')";
+        sed -e "s|__APP_METRICS_TARGET__|$${ESC_TARGET}|g" \
+            -e "s|__IMPORT_API_KEY__|$${ESC_KEY}|g" \
+            /etc/prometheus/prometheus.yml.tmpl > /tmp/prometheus.yml;
+        exec prometheus --config.file=/tmp/prometheus.yml

   grafana:
     image: grafana/grafana:latest
     container_name: grafana-local
     ports:
-      - "3000:3000"
-    env_file:
-      - path:
.env.monitoring.local
+      - "${MONITORING_GRAFANA_PORT:-3000}:3000"
     environment:
       - GF_SECURITY_ADMIN_USER=admin
       - GF_SECURITY_ADMIN_PASSWORD=password
     volumes:
       - ./data/grafana-data-local:/var/lib/grafana
     depends_on:
-      - prometheus
\ No newline at end of file
+      - prometheus
+
+  loki:
+    profiles: ["logs"]
+    image: grafana/loki:latest
+    container_name: loki-local
+    ports:
+      - "${LOKI_PORT:-3100}:3100"
+    volumes:
+      - ./loki-config.yml:/etc/loki/local-config.yaml:ro
+      - ./data/loki-data-local:/loki
+    command:
+      - "-config.file=/etc/loki/local-config.yaml"
+
+  promtail:
+    profiles: ["logs"]
+    image: grafana/promtail:latest
+    container_name: promtail-local
+    environment:
+      - LOKI_PUSH_URL=http://loki:3100/loki/api/v1/push
+    volumes:
+      - ./promtail-config.yml:/etc/promtail/config.yml:ro
+      - /var/log:/var/log:ro
+      - /var/lib/docker/containers:/var/lib/docker/containers:ro
+      - /var/run/docker.sock:/var/run/docker.sock
+    command:
+      - "-config.file=/etc/promtail/config.yml"
+      - "-config.expand-env=true"
+    depends_on:
+      - loki
diff --git a/monitoring/docker-compose.monitoring-prod.yml b/monitoring/docker-compose.monitoring-prod.yml
index 9ccf67bc..f221ead1 100644
--- a/monitoring/docker-compose.monitoring-prod.yml
+++ b/monitoring/docker-compose.monitoring-prod.yml
@@ -11,10 +11,23 @@ services:
       - "9090"
     env_file:
       - path: .env.monitoring.prod
+        required: false
     volumes:
-      - ./prometheus-prod.yml:/etc/prometheus/prometheus.yml
-      - ./alert-rules.yml:/etc/prometheus/alert-rules.yml
+      - ./prometheus.yml:/etc/prometheus/prometheus.yml.tmpl:ro
+      - ./alert-rules.yml:/etc/prometheus/alert-rules.yml:ro
       - ./data/prometheus-data-prod:/prometheus
+    entrypoint: /bin/sh
+    command:
+      - -ec
+      - |
+        TARGET="$${APP_METRICS_TARGET:-app:8080}";
+        KEY="$${IMPORT_API_KEY:-}";
+        ESC_TARGET="$$(printf '%s' "$$TARGET" | sed 's/[&|]/\\&/g')";
+        ESC_KEY="$$(printf '%s' "$$KEY" | sed 's/[&|]/\\&/g')";
+        sed -e "s|__APP_METRICS_TARGET__|$${ESC_TARGET}|g" \
+            -e "s|__IMPORT_API_KEY__|$${ESC_KEY}|g" \
            /etc/prometheus/prometheus.yml.tmpl > /tmp/prometheus.yml;
+        exec prometheus --config.file=/tmp/prometheus.yml

   alertmanager:
     image: prom/alertmanager:latest
@@ -23,6 +36,7 @@ services:
       - "9093"
     env_file:
       - path: .env.monitoring.prod
+        required: false
     volumes:
       - ./alertmanager.yml:/etc/alertmanager/alertmanager.yml
       - ./data/alertmanager-data-prod:/alertmanager
@@ -34,6 +48,7 @@ services:
       - "3000"
     env_file:
       - path: .env.monitoring.prod
+        required: false
     environment:
       - GF_SECURITY_ADMIN_USER=admin
       - GF_SECURITY_ADMIN_PASSWORD=password
@@ -50,6 +65,7 @@ services:
       - "3100"
     env_file:
       - path: .env.monitoring.prod
+        required: false
     volumes:
       - ./loki-config.yml:/etc/loki/local-config.yaml
       - ./data/loki-data-prod:/loki
@@ -60,11 +76,16 @@ services:
     container_name: promtail-prod
     env_file:
       - path: .env.monitoring.prod
+        required: false
+    environment:
+      - LOKI_PUSH_URL=http://loki:3100/loki/api/v1/push
     volumes:
-      - ./promtail-config.yml:/etc/promtail/config.yml
+      - ./promtail-config.yml:/etc/promtail/config.yml:ro
       - /var/log:/var/log:ro
       - /var/lib/docker/containers:/var/lib/docker/containers:ro
       - /var/run/docker.sock:/var/run/docker.sock
-    command: -config.file=/etc/promtail/config.yml
+    command:
+      - "-config.file=/etc/promtail/config.yml"
+      - "-config.expand-env=true"
     depends_on:
-      - loki
\ No newline at end of file
+      - loki
diff --git a/monitoring/loki-config.yml b/monitoring/loki-config.yml
index a6e8d699..68622759 100644
--- a/monitoring/loki-config.yml
+++ b/monitoring/loki-config.yml
@@ -22,4 +22,8 @@ schema_config:
       schema: v13
       index:
         prefix: index_
-        period: 24h
\ No newline at end of file
+        period: 24h
+
+limits_config:
+  ingestion_rate_mb: 16
+  ingestion_burst_size_mb: 32
diff --git a/monitoring/prometheus.yml b/monitoring/prometheus.yml
new file mode 100644
index 00000000..8c25c55a
--- /dev/null
+++ b/monitoring/prometheus.yml
@@ -0,0 +1,15 @@
+global:
+  scrape_interval: 15s
+  evaluation_interval: 15s
+
+rule_files:
+  - /etc/prometheus/alert-rules.yml
+
+scrape_configs:
+  - job_name: "llv-api"
+    metrics_path: /actuator/prometheus
+    static_configs:
+      - targets: ["__APP_METRICS_TARGET__"]
+    authorization:
+      type: llvk
+      credentials: "__IMPORT_API_KEY__"
diff --git a/monitoring/promtail-config.yml b/monitoring/promtail-config.yml
index ff0367fc..5b8d8c24 100644
--- a/monitoring/promtail-config.yml
+++ b/monitoring/promtail-config.yml
@@ -6,7 +6,7 @@ positions:
   filename: /tmp/positions.yaml

 clients:
-  - url: http://loki-prod:3100/loki/api/v1/push
+  - url: ${LOKI_PUSH_URL}

 scrape_configs:
   # Docker containers logs
@@ -76,4 +76,4 @@ scrape_configs:
           source: timestamp
     - labels:
         level:
-        logger:
\ No newline at end of file
+        logger:
diff --git a/src/main/java/com/linglevel/api/common/ratelimit/config/RateLimitProperties.java b/src/main/java/com/linglevel/api/common/ratelimit/config/RateLimitProperties.java
index 71f65295..cd054c71 100644
--- a/src/main/java/com/linglevel/api/common/ratelimit/config/RateLimitProperties.java
+++ b/src/main/java/com/linglevel/api/common/ratelimit/config/RateLimitProperties.java
@@ -11,6 +11,8 @@
 @ConfigurationProperties(prefix = "rate.limit")
 public class RateLimitProperties {

+    private boolean enabled = true;
+
     private int capacity;

     private Refill refill = new Refill();
diff --git a/src/main/java/com/linglevel/api/common/ratelimit/filter/RateLimitFilter.java b/src/main/java/com/linglevel/api/common/ratelimit/filter/RateLimitFilter.java
index 8e6af6f6..c67754a5 100644
--- a/src/main/java/com/linglevel/api/common/ratelimit/filter/RateLimitFilter.java
+++ b/src/main/java/com/linglevel/api/common/ratelimit/filter/RateLimitFilter.java
@@ -34,6 +34,11 @@ public class RateLimitFilter implements Filter {
     public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
             throws IOException, ServletException {

+        if (!rateLimitProperties.isEnabled()) {
+            chain.doFilter(request, response);
+            return;
+        }
+
         HttpServletRequest httpRequest = (HttpServletRequest) request;
        HttpServletResponse httpResponse = (HttpServletResponse) response;
diff --git a/src/main/java/com/linglevel/api/content/book/dto/ProgressResponse.java b/src/main/java/com/linglevel/api/content/book/dto/ProgressResponse.java
index f4cb77ab..ea4442de 100644
--- a/src/main/java/com/linglevel/api/content/book/dto/ProgressResponse.java
+++ b/src/main/java/com/linglevel/api/content/book/dto/ProgressResponse.java
@@ -39,7 +39,7 @@ public class ProgressResponse {
     @Schema(description = "최대 읽은 챕터 번호", example = "3")
     private Integer maxReadChapterNumber;

-    @Schema(description = "최대 읽은 청크 번호", example = "8")
+    @Schema(description = "챕터 우선 정렬 기준의 최대 도달 청크 위치값", example = "65544")
     private Integer maxReadChunkNumber;

     @Schema(description = "완료 여부", example = "false")
@@ -59,4 +59,4 @@ public class ProgressResponse {
     @Schema(description = "업데이트 일시", example = "2024-01-15T10:30:00Z")
     private Instant updatedAt;
-}
\ No newline at end of file
+}
diff --git a/src/main/java/com/linglevel/api/content/book/entity/BookProgress.java b/src/main/java/com/linglevel/api/content/book/entity/BookProgress.java
index 3ada537e..2b2fdce0 100644
--- a/src/main/java/com/linglevel/api/content/book/entity/BookProgress.java
+++ b/src/main/java/com/linglevel/api/content/book/entity/BookProgress.java
@@ -37,6 +37,12 @@ public class BookProgress {

     private Integer maxReadChapterNumber;

+    /**
+     * 챕터 우선 정렬 기준의 최대 도달 청크 위치값.
+     * 비교 순서는 (chapterNumber, chunkNumber)이며 chapter가 우선한다.
+     */
+    private Integer maxReadChunkNumber;
+
     // V2 Progress Fields
     private Double normalizedProgress;
diff --git a/src/main/java/com/linglevel/api/content/book/repository/BookProgressRepository.java b/src/main/java/com/linglevel/api/content/book/repository/BookProgressRepository.java
index 15ecbf3d..076c9551 100644
--- a/src/main/java/com/linglevel/api/content/book/repository/BookProgressRepository.java
+++ b/src/main/java/com/linglevel/api/content/book/repository/BookProgressRepository.java
@@ -10,6 +10,7 @@
 public interface BookProgressRepository extends MongoRepository<BookProgress, String> {
     Optional<BookProgress> findByUserIdAndBookId(String UserId, String bookId);
+    List<BookProgress> findByUserIdAndBookIdIn(String userId, List<String> bookIds);
     Page<BookProgress> findAllByUserId(String userId, Pageable pageable);
     List<BookProgress> findAllByUserId(String userId);
     List<BookProgress> findByBookId(String bookId);
diff --git a/src/main/java/com/linglevel/api/content/book/repository/BookRepositoryImpl.java b/src/main/java/com/linglevel/api/content/book/repository/BookRepositoryImpl.java
index a3a64258..c0a59d0a 100644
--- a/src/main/java/com/linglevel/api/content/book/repository/BookRepositoryImpl.java
+++ b/src/main/java/com/linglevel/api/content/book/repository/BookRepositoryImpl.java
@@ -14,7 +14,9 @@
 import org.springframework.util.StringUtils;

 import java.util.Arrays;
+import java.util.HashSet;
 import java.util.List;
+import java.util.Set;

 /**
  * Book Repository 커스텀 구현체
@@ -94,6 +96,8 @@ private void applyProgressFilter(Query query, ProgressStatus progress, String us
         List<String> bookIds = getBookIdsByProgress(userId, progress);
         if (!bookIds.isEmpty()) {
             query.addCriteria(Criteria.where("id").in(bookIds));
+        } else {
+            query.addCriteria(Criteria.where("_id").is(null));
         }
     }
@@ -128,12 +132,13 @@ private List<String> getNotStartedBookIds(String userId) {
                 .map(Book::getId)
                 .toList();

-        // 진도가 있는 책 ID 조회
-        List<String> progressBookIds = findProgressBookIds(userId);
+        // 시작한 책(진행 중/완료) ID 조회
+        List<String> startedBookIds = findStartedBookIds(userId);
+        Set<String> startedBookIdSet = new
HashSet<>(startedBookIds);

-        // 진도가 없는 책만 반환
+        // 시작하지 않은 책(완료/부분 읽기/완료 챕터 진행률이 없는 책)만 반환
         return allBookIds.stream()
-                .filter(bookId -> !progressBookIds.contains(bookId))
+                .filter(bookId -> !startedBookIdSet.contains(bookId))
                 .toList();
     }
@@ -144,7 +149,10 @@ private List<String> getInProgressBookIds(String userId) {
         Query query = new Query();
         query.addCriteria(Criteria.where("userId").is(userId));
         query.addCriteria(Criteria.where("isCompleted").is(false));
-        query.addCriteria(Criteria.where("maxReadChunkNumber").gt(0));
+        query.addCriteria(new Criteria().orOperator(
+                Criteria.where("normalizedProgress").gt(0),
+                partiallyReadChapterCriteria()
+        ));

         return findBookIdsFromProgress(query);
     }
@@ -163,13 +171,25 @@ private List<String> getCompletedBookIds(String userId) {

     /**
      * 특정 사용자의 모든 진도 책 ID 조회
      */
-    private List<String> findProgressBookIds(String userId) {
+    private List<String> findStartedBookIds(String userId) {
         Query query = new Query();
         query.addCriteria(Criteria.where("userId").is(userId));
+        query.addCriteria(new Criteria().orOperator(
+                Criteria.where("isCompleted").is(true),
+                Criteria.where("normalizedProgress").gt(0),
+                partiallyReadChapterCriteria()
+        ));

         return findBookIdsFromProgress(query);
     }

+    private Criteria partiallyReadChapterCriteria() {
+        return Criteria.where("chapterProgresses").elemMatch(
+                Criteria.where("isCompleted").is(false)
+                        .and("progressPercentage").gt(0)
+        );
+    }
+
     /**
      * Progress 컬렉션에서 bookId 추출
      */
diff --git a/src/main/java/com/linglevel/api/content/book/repository/ChapterRepositoryImpl.java b/src/main/java/com/linglevel/api/content/book/repository/ChapterRepositoryImpl.java
index 1ab49f98..5c4113fb 100644
--- a/src/main/java/com/linglevel/api/content/book/repository/ChapterRepositoryImpl.java
+++ b/src/main/java/com/linglevel/api/content/book/repository/ChapterRepositoryImpl.java
@@ -65,7 +65,7 @@ private void applyProgressFilter(Query query, ProgressStatus progress, String bo
         BookProgress bookProgress = bookProgressRepository.findByUserIdAndBookId(userId,
bookId)
                 .orElse(null);

-        List<Integer> chapterNumbers = getChapterNumbersByProgress(bookProgress, progress);
+        List<Integer> chapterNumbers = getChapterNumbersByProgress(bookId, bookProgress, progress);

         if (chapterNumbers == null) {
             // null이면 필터링하지 않음 (모든 챕터 반환)
@@ -83,10 +83,10 @@
     /**
      * 진도 상태별 챕터 번호 목록 조회
      */
-    private List<Integer> getChapterNumbersByProgress(BookProgress bookProgress, ProgressStatus progressStatus) {
+    private List<Integer> getChapterNumbersByProgress(String bookId, BookProgress bookProgress, ProgressStatus progressStatus) {
         // 모든 챕터 번호 조회
         List<Chapter> allChapters = mongoTemplate.find(
-                Query.query(Criteria.where("bookId").is(bookProgress.getBookId())),
+                Query.query(Criteria.where("bookId").is(bookId)),
                 Chapter.class
         );
         List<Integer> allChapterNumbers = allChapters.stream().map(Chapter::getChapterNumber).toList();
@@ -95,38 +95,27 @@
             return progressStatus == ProgressStatus.NOT_STARTED ?
allChapterNumbers : List.of();
         }

-        // [FIX] V3 마이그레이션된 데이터(chapterProgresses)를 기준으로 필터링
-        if (bookProgress.getChapterProgresses() != null && !bookProgress.getChapterProgresses().isEmpty()) {
-            Map<Integer, BookProgress.ChapterProgressInfo> progressInfoMap = bookProgress.getChapterProgresses().stream()
-                    .collect(Collectors.toMap(BookProgress.ChapterProgressInfo::getChapterNumber, Function.identity()));
-
-            return allChapterNumbers.stream()
-                    .filter(chapterNumber -> {
-                        BookProgress.ChapterProgressInfo info = progressInfoMap.get(chapterNumber);
-                        boolean isCompleted = info != null && Boolean.TRUE.equals(info.getIsCompleted());
-                        boolean inProgress = info != null && !isCompleted && info.getProgressPercentage() != null && info.getProgressPercentage() > 0;
-
-                        return switch (progressStatus) {
-                            case COMPLETED -> isCompleted;
-                            case IN_PROGRESS -> inProgress;
-                            case NOT_STARTED -> !isCompleted && !inProgress;
-                        };
-                    })
-                    .toList();
-        }
-
-        // [FALLBACK] 마이그레이션되지 않은 옛날 데이터 기준
-        Integer currentChapterNumber = bookProgress.getCurrentReadChapterNumber() != null
-                ? bookProgress.getCurrentReadChapterNumber() : 0;
+        Map<Integer, BookProgress.ChapterProgressInfo> progressInfoMap =
+                bookProgress.getChapterProgresses() == null
+                        ?
Map.of()
+                        : bookProgress.getChapterProgresses().stream()
+                                .collect(Collectors.toMap(BookProgress.ChapterProgressInfo::getChapterNumber, Function.identity()));

         return allChapterNumbers.stream()
                 .filter(chapterNumber -> {
+                    BookProgress.ChapterProgressInfo info = progressInfoMap.get(chapterNumber);
+                    boolean isCompleted = info != null && Boolean.TRUE.equals(info.getIsCompleted());
+                    boolean inProgress = info != null
+                            && !isCompleted
+                            && info.getProgressPercentage() != null
+                            && info.getProgressPercentage() > 0;
+
                     return switch (progressStatus) {
-                        case NOT_STARTED -> chapterNumber > currentChapterNumber;
-                        case IN_PROGRESS -> chapterNumber.equals(currentChapterNumber) && currentChapterNumber > 0;
-                        case COMPLETED -> chapterNumber < currentChapterNumber;
+                        case COMPLETED -> isCompleted;
+                        case IN_PROGRESS -> inProgress;
+                        case NOT_STARTED -> !isCompleted && !inProgress;
                     };
                 })
                 .toList();
     }
-}
\ No newline at end of file
+}
diff --git a/src/main/java/com/linglevel/api/content/book/service/BookService.java b/src/main/java/com/linglevel/api/content/book/service/BookService.java
index ca6f9b51..a09e29d7 100644
--- a/src/main/java/com/linglevel/api/content/book/service/BookService.java
+++ b/src/main/java/com/linglevel/api/content/book/service/BookService.java
@@ -28,7 +28,9 @@
 import org.springframework.util.StringUtils;

 import java.time.Instant;
+import java.util.HashMap;
 import java.util.List;
+import java.util.Map;
 import java.util.stream.Collectors;

 @Service
@@ -125,10 +127,12 @@ public PageResponse<BookResponse> getBooks(GetBooksRequest request, String userI
         // QueryDSL Custom Repository를 사용하여 필터링 + 페이지네이션 통합 처리
         Page<Book> bookPage = bookRepository.findBooksWithFilters(request, userId, pageable);

+        List<Book> books = bookPage.getContent();
+        Map<String, BookProgress> progressMap = getProgressMap(userId, books);
         LanguageCode languageCode = request.getLanguageCode();
-        List<BookResponse> bookResponses = bookPage.getContent().stream()
-                .map(book -> convertToBookResponse(book, userId, languageCode))
+        List<BookResponse> bookResponses =
books.stream()
+                .map(book -> convertToBookResponse(book, progressMap.get(book.getId()), languageCode))
                 .collect(Collectors.toList());

         return new PageResponse<>(bookResponses, bookPage);
@@ -138,7 +142,11 @@ public BookResponse getBook(String bookId, String userId, LanguageCode languageC
         Book book = bookRepository.findById(bookId)
                 .orElseThrow(() -> new BooksException(BooksErrorCode.BOOK_NOT_FOUND));

-        return convertToBookResponse(book, userId, languageCode);
+        BookProgress progress = userId == null
+                ? null
+                : bookProgressRepository.findByUserIdAndBookId(userId, book.getId()).orElse(null);
+
+        return convertToBookResponse(book, progress, languageCode);
     }

     public boolean existsById(String bookId) {
@@ -179,29 +187,44 @@ private List<BookResponse> filterByProgress(List<BookResponse> bookResponses, Pr
                 .collect(Collectors.toList());
     }

-    private BookResponse convertToBookResponse(Book book, String userId, LanguageCode languageCode) {
+    private Map<String, BookProgress> getProgressMap(String userId, List<Book> books) {
+        if (userId == null || books.isEmpty()) {
+            return Map.of();
+        }
+
+        List<String> bookIds = books.stream().map(Book::getId).toList();
+        List<BookProgress> progresses = bookProgressRepository.findByUserIdAndBookIdIn(userId, bookIds);
+        if (progresses == null || progresses.isEmpty()) {
+            return Map.of();
+        }
+
+        Map<String, BookProgress> progressMap = new HashMap<>();
+        for (BookProgress progress : progresses) {
+            if (progress.getBookId() != null) {
+                // unique index(userId, bookId) 기준으로 bookId당 1건만 유지
+                progressMap.putIfAbsent(progress.getBookId(), progress);
+            }
+        }
+        return progressMap;
+    }
+
+    private BookResponse convertToBookResponse(Book book, BookProgress progress, LanguageCode languageCode) {
         // 진도 정보 조회
         int currentReadChapterNumber = 0;
         double progressPercentage = 0.0;
         boolean isCompleted = false;

-        if (userId != null) {
-            BookProgress progress = bookProgressRepository
-                    .findByUserIdAndBookId(userId, book.getId())
-                    .orElse(null);
+        if (progress != null) {
+            currentReadChapterNumber = progress.getCurrentReadChapterNumber() != null
+                    ?
progress.getCurrentReadChapterNumber() : 0;

-            if (progress != null) {
-                currentReadChapterNumber = progress.getCurrentReadChapterNumber() != null
-                        ? progress.getCurrentReadChapterNumber() : 0;
+            // 진행률은 저장된 normalizedProgress를 단일 소스로 사용한다.
+            progressPercentage = progress.getNormalizedProgress() != null
+                    ? progress.getNormalizedProgress()
+                    : 0.0;

-                // 진행률 계산
-                if (book.getChapterCount() != null && book.getChapterCount() > 0) {
-                    progressPercentage = (double) currentReadChapterNumber / book.getChapterCount() * 100.0;
-                }
-
-                // DB에 저장된 완료 여부 사용
-                isCompleted = progress.getIsCompleted() != null ? progress.getIsCompleted() : false;
-            }
+            // DB에 저장된 완료 여부 사용
+            isCompleted = progress.getIsCompleted() != null ? progress.getIsCompleted() : false;
         }

         // 언어 코드에 따라 title 선택
@@ -251,4 +274,4 @@ private String selectTitleByLanguage(Book book, LanguageCode languageCode) {
     }

-}
\ No newline at end of file
+}
diff --git a/src/main/java/com/linglevel/api/content/book/service/ChapterService.java b/src/main/java/com/linglevel/api/content/book/service/ChapterService.java
index 38ac6188..200e9a4b 100644
--- a/src/main/java/com/linglevel/api/content/book/service/ChapterService.java
+++ b/src/main/java/com/linglevel/api/content/book/service/ChapterService.java
@@ -6,7 +6,6 @@
 import com.linglevel.api.content.book.dto.GetChaptersRequest;
 import com.linglevel.api.content.book.entity.Book;
 import com.linglevel.api.content.book.entity.Chapter;
-import com.linglevel.api.content.book.entity.Chunk;
 import com.linglevel.api.content.book.exception.BooksException;
 import com.linglevel.api.content.book.exception.BooksErrorCode;
 import com.linglevel.api.content.book.repository.BookRepository;
@@ -65,10 +64,6 @@ public PageResponse<ChapterResponse> getChapters(String bookId, GetChaptersReque
                 .flatMap(id -> bookProgressRepository.findByUserIdAndBookId(id, bookId))
                 .orElse(null);

-        Chunk progressChunk = (bookProgress != null && bookProgress.getChunkId() != null)
-                ?
chunkRepository.findById(bookProgress.getChunkId()).orElse(null)
-                : null;
-
         Map<String, Map<DifficultyLevel, Long>> chunkCountsMap = chunkRepository.findChunkCountsByChapterIds(chapterIds)
                 .stream()
                 .collect(Collectors.groupingBy(
                         ChunkCountByLevelDto::getChapterId,
                         Collectors.toMap(ChunkCountByLevelDto::getDifficultyLevel, ChunkCountByLevelDto::getCount)
                 ));

@@ -77,7 +72,7 @@
         List<ChapterResponse> chapterResponses = chapters.stream()
-                .map(chapter -> convertToChapterResponse(chapter, book, bookProgress, progressChunk, chunkCountsMap))
+                .map(chapter -> convertToChapterResponse(chapter, book, bookProgress, chunkCountsMap))
                 .collect(Collectors.toList());

         return new PageResponse<>(chapterResponses, chapterPage);
@@ -97,10 +92,6 @@ public ChapterResponse getChapter(String bookId, String chapterId, String userId
                 .flatMap(id -> bookProgressRepository.findByUserIdAndBookId(id, bookId))
                 .orElse(null);

-        Chunk progressChunk = (bookProgress != null && bookProgress.getChunkId() != null)
-                ? chunkRepository.findById(bookProgress.getChunkId()).orElse(null)
-                : null;
-
         Map<String, Map<DifficultyLevel, Long>> chunkCountsMap = chunkRepository.findChunkCountsByChapterIds(Collections.singletonList(chapterId))
                 .stream()
                 .collect(Collectors.groupingBy(
                         ChunkCountByLevelDto::getChapterId,
                         Collectors.toMap(ChunkCountByLevelDto::getDifficultyLevel, ChunkCountByLevelDto::getCount)
                 ));
@@ -108,7 +99,7 @@
-        return convertToChapterResponse(chapter, book, bookProgress, progressChunk, chunkCountsMap);
+        return convertToChapterResponse(chapter, book, bookProgress, chunkCountsMap);
     }

     public boolean existsById(String chapterId) {
@@ -153,7 +144,7 @@ public ChapterNavigationResponse getChapterNavigation(String bookId, String chap
                 .build();
     }

-    private ChapterResponse convertToChapterResponse(Chapter chapter, Book book, BookProgress bookProgress, Chunk progressChunk, Map<String, Map<DifficultyLevel, Long>> chunkCountsMap) {
+    private ChapterResponse convertToChapterResponse(Chapter chapter, Book book, BookProgress bookProgress, Map<String, Map<DifficultyLevel, Long>> chunkCountsMap) {
         int currentReadChunkNumber = 0;
         double progressPercentage = 0.0;
         DifficultyLevel
currentDifficultyLevel = book.getDifficultyLevel(); // Fallback: Book's difficulty
@@ -173,7 +164,6 @@ private ChapterResponse convertToChapterResponse(Chapter chapter, Book book, Boo
                 : null;

         if (chapterProgressInfo != null) {
-            // 배열에서 찾은 경우
             progressPercentage = chapterProgressInfo.getProgressPercentage() != null
                     ? chapterProgressInfo.getProgressPercentage() : 0.0;
             isCompleted = Boolean.TRUE.equals(chapterProgressInfo.getIsCompleted());
@@ -182,30 +172,6 @@
             long totalChunksForLevel = chunkCountsMap.getOrDefault(chapter.getId(), Collections.emptyMap())
                     .getOrDefault(currentDifficultyLevel, 0L);
             currentReadChunkNumber = (int) Math.ceil(progressPercentage * totalChunksForLevel / 100.0);
-
-        } else {
-            // [FALLBACK] 기존 로직 사용 (backward compatibility)
-            Integer progressChapterNumber = bookProgress.getCurrentReadChapterNumber() != null
-                    ? bookProgress.getCurrentReadChapterNumber() : 0;
-
-            Integer progressChunkNumber = (progressChunk != null && progressChunk.getChunkNumber() != null)
-                    ?
progressChunk.getChunkNumber() : 0;
-
-            long totalChunksForLevel = chunkCountsMap.getOrDefault(chapter.getId(), Collections.emptyMap())
-                    .getOrDefault(currentDifficultyLevel, 0L);
-
-            if (chapter.getChapterNumber() < progressChapterNumber) {
-                currentReadChunkNumber = (int) totalChunksForLevel;
-                progressPercentage = 100.0;
-            } else if (chapter.getChapterNumber().equals(progressChapterNumber)) {
-                currentReadChunkNumber = progressChunkNumber;
-                if (totalChunksForLevel > 0) {
-                    progressPercentage = (double) progressChunkNumber / totalChunksForLevel * 100.0;
-                }
-            } else {
-                currentReadChunkNumber = 0;
-                progressPercentage = 0.0;
-            }
         }
     }
diff --git a/src/main/java/com/linglevel/api/content/book/service/ProgressService.java b/src/main/java/com/linglevel/api/content/book/service/ProgressService.java
index 9b88d274..b76552a6 100644
--- a/src/main/java/com/linglevel/api/content/book/service/ProgressService.java
+++ b/src/main/java/com/linglevel/api/content/book/service/ProgressService.java
@@ -11,7 +11,6 @@
 import com.linglevel.api.content.book.repository.ChapterRepository;
 import com.linglevel.api.content.book.repository.ChunkRepository;
 import com.linglevel.api.content.common.ContentType;
-import com.linglevel.api.content.common.service.ProgressCalculationService;
 import com.linglevel.api.content.common.service.ReadingCompletionService;
 import com.linglevel.api.streak.service.StreakService;
 import lombok.RequiredArgsConstructor;
@@ -25,13 +24,15 @@
 @RequiredArgsConstructor
 @Slf4j
 public class ProgressService {
+    private static final int CHAPTER_POSITION_SHIFT = 16;
+    private static final int CHAPTER_NUMBER_MAX = 0x7FFF; // 32767
+    private static final int CHUNK_NUMBER_MAX = 0xFFFF; // 65535

     private final BookService bookService;
     private final ChapterService chapterService;
     private final ChunkService chunkService;
     private final BookProgressRepository bookProgressRepository;
     private final ChunkRepository chunkRepository;
-    private final ProgressCalculationService
progressCalculationService; private final ReadingCompletionService readingCompletionService; private final StreakService streakService; private final ChapterRepository chapterRepository; @@ -106,6 +107,12 @@ public ProgressResponse updateProgress(String bookId, ProgressUpdateRequest requ bookProgress.setMaxReadChapterNumber(currentChapterNum); } + int currentChunkPosition = toChapterFirstPosition(chapter.getChapterNumber(), chunk.getChunkNumber()); + Integer maxChunkPosition = bookProgress.getMaxReadChunkNumber(); + if (maxChunkPosition == null || currentChunkPosition > maxChunkPosition) { + bookProgress.setMaxReadChunkNumber(currentChunkPosition); + } + // maxNormalizedProgress는 완료된 챕터 기반으로 설정 bookProgress.setMaxNormalizedProgress(bookProgress_normalizedProgress); @@ -250,37 +257,9 @@ public ProgressResponse getProgress(String bookId, String userId) { throw new BooksException(BooksErrorCode.BOOK_NOT_FOUND); } - BookProgress bookProgress = bookProgressRepository.findByUserIdAndBookId(userId, bookId) - .orElseGet(() -> initializeProgress(userId, bookId)); - - return convertToProgressResponse(bookProgress, false); - } - - private BookProgress initializeProgress(String userId, String bookId) { - Chapter firstChapter = chapterService.findFirstByBookId(bookId); - Chunk firstChunk = chunkService.findFirstByChapterId(firstChapter.getId()); - - BookProgress newProgress = new BookProgress(); - newProgress.setUserId(userId); - newProgress.setBookId(bookId); - newProgress.setChapterId(firstChapter.getId()); - newProgress.setChunkId(firstChunk.getId()); - newProgress.setCurrentReadChapterNumber(firstChapter.getChapterNumber()); - newProgress.setMaxReadChapterNumber(firstChapter.getChapterNumber()); - - // [V2_CORE] V2 필드: 초기 진행률 계산 - long totalChunks = chunkRepository.countByChapterIdAndDifficultyLevel( - firstChapter.getId(), firstChunk.getDifficultyLevel() - ); - double initialProgress = progressCalculationService.calculateNormalizedProgress( - firstChunk.getChunkNumber(), 
totalChunks - ); - - newProgress.setNormalizedProgress(initialProgress); - newProgress.setMaxNormalizedProgress(initialProgress); - newProgress.setCurrentDifficultyLevel(firstChunk.getDifficultyLevel()); - - return bookProgressRepository.save(newProgress); + return bookProgressRepository.findByUserIdAndBookId(userId, bookId) + .map(progress -> convertToProgressResponse(progress, false)) + .orElseGet(() -> createNotStartedProgressResponse(userId, bookId)); } @Transactional @@ -320,6 +299,7 @@ private ProgressResponse convertToProgressResponse(BookProgress progress, boolea .currentReadChapterNumber(progress.getCurrentReadChapterNumber()) .currentReadChunkNumber(chunk.getChunkNumber()) .maxReadChapterNumber(progress.getMaxReadChapterNumber()) + .maxReadChunkNumber(progress.getMaxReadChunkNumber()) .isCompleted(progress.getIsCompleted()) .currentDifficultyLevel(progress.getCurrentDifficultyLevel()) .normalizedProgress(progress.getNormalizedProgress()) @@ -328,4 +308,29 @@ private ProgressResponse convertToProgressResponse(BookProgress progress, boolea .updatedAt(progress.getUpdatedAt()) .build(); } -} \ No newline at end of file + + private ProgressResponse createNotStartedProgressResponse(String userId, String bookId) { + return ProgressResponse.builder() + .userId(userId) + .bookId(bookId) + .currentReadChapterNumber(0) + .currentReadChunkNumber(0) + .maxReadChapterNumber(0) + .maxReadChunkNumber(0) + .isCompleted(false) + .normalizedProgress(0.0) + .maxNormalizedProgress(0.0) + .streakUpdated(false) + .build(); + } + + private int toChapterFirstPosition(Integer chapterNumber, Integer chunkNumber) { + if (chapterNumber == null || chapterNumber <= 0 || chunkNumber == null || chunkNumber <= 0) { + throw new BooksException(BooksErrorCode.INVALID_CHUNK_NUMBER); + } + if (chapterNumber > CHAPTER_NUMBER_MAX || chunkNumber > CHUNK_NUMBER_MAX) { + throw new BooksException(BooksErrorCode.INVALID_CHUNK_NUMBER); + } + return (chapterNumber << CHAPTER_POSITION_SHIFT) | 
chunkNumber; + } +} diff --git a/src/main/resources/application-local-k6.properties b/src/main/resources/application-local-k6.properties new file mode 100644 index 00000000..3ea3c7ce --- /dev/null +++ b/src/main/resources/application-local-k6.properties @@ -0,0 +1,4 @@ +# k6 전용 로컬 프로필 +# 실행 예: --spring.profiles.active=local,local-k6 + +rate.limit.enabled=false diff --git a/src/test/java/com/linglevel/api/content/book/repository/BookRepositoryImplTest.java b/src/test/java/com/linglevel/api/content/book/repository/BookRepositoryImplTest.java new file mode 100644 index 00000000..313f6302 --- /dev/null +++ b/src/test/java/com/linglevel/api/content/book/repository/BookRepositoryImplTest.java @@ -0,0 +1,198 @@ +package com.linglevel.api.content.book.repository; + +import com.linglevel.api.common.AbstractDatabaseTest; +import com.linglevel.api.content.book.dto.GetBooksRequest; +import com.linglevel.api.content.book.entity.Book; +import com.linglevel.api.content.common.DifficultyLevel; +import com.linglevel.api.content.common.ProgressStatus; +import org.bson.Document; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.DisplayName; +import org.junit.jupiter.api.Test; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.boot.test.autoconfigure.data.mongo.DataMongoTest; +import org.springframework.context.annotation.Import; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.PageRequest; +import org.springframework.data.domain.Pageable; +import org.springframework.data.domain.Sort; +import org.springframework.data.mongodb.core.MongoTemplate; + +import java.time.Instant; +import java.util.List; + +import static org.assertj.core.api.Assertions.assertThat; + +@DataMongoTest +@Import(BookRepositoryImpl.class) +class BookRepositoryImplTest extends AbstractDatabaseTest { + + @Autowired + private BookRepository bookRepository; + + @Autowired + private BookProgressRepository 
bookProgressRepository;
+
+    @Autowired
+    private MongoTemplate mongoTemplate;
+
+    private static final String USER_ID = "user-1";
+
+    @BeforeEach
+    void setUp() {
+        bookProgressRepository.deleteAll();
+        bookRepository.deleteAll();
+
+        bookRepository.saveAll(List.of(
+            createBook("book-1", "Alpha", Instant.parse("2026-01-01T00:00:00Z")),
+            createBook("book-2", "Beta", Instant.parse("2026-01-02T00:00:00Z")),
+            createBook("book-3", "Gamma", Instant.parse("2026-01-03T00:00:00Z"))
+        ));
+
+        mongoTemplate.insert(createProgressDocument("book-2", false, 40.0), "bookProgress");
+        mongoTemplate.insert(createProgressDocument("book-3", true, 100.0), "bookProgress");
+    }
+
+    @Test
+    @DisplayName("The NOT_STARTED filter returns books not yet started (no document, or normalizedProgress 0)")
+    void findBooksWithFilters_returnsNotStartedBooks() {
+        GetBooksRequest request = GetBooksRequest.builder()
+            .progress(ProgressStatus.NOT_STARTED)
+            .build();
+
+        Page<Book> result = bookRepository.findBooksWithFilters(request, USER_ID, defaultPageable());
+
+        assertThat(result.getContent()).extracting(Book::getId).containsExactly("book-1");
+        assertThat(result.getTotalElements()).isEqualTo(1);
+    }
+
+    @Test
+    @DisplayName("The IN_PROGRESS filter returns books that have been started but not completed")
+    void findBooksWithFilters_returnsInProgressBooks() {
+        GetBooksRequest request = GetBooksRequest.builder()
+            .progress(ProgressStatus.IN_PROGRESS)
+            .build();
+
+        Page<Book> result = bookRepository.findBooksWithFilters(request, USER_ID, defaultPageable());
+
+        assertThat(result.getContent()).extracting(Book::getId).containsExactly("book-2");
+        assertThat(result.getTotalElements()).isEqualTo(1);
+    }
+
+    @Test
+    @DisplayName("A partial read is classified as IN_PROGRESS even when normalizedProgress is 0")
+    void findBooksWithFilters_includesPartialReadAsInProgress() {
+        bookProgressRepository.deleteAll();
+        mongoTemplate.insert(createPartialInProgressDocument("book-1", 1, 2, 20.0), "bookProgress");
+
+        GetBooksRequest request = GetBooksRequest.builder()
+            .progress(ProgressStatus.IN_PROGRESS)
.build();
+
+        Page<Book> result = bookRepository.findBooksWithFilters(request, USER_ID, defaultPageable());
+
+        assertThat(result.getContent()).extracting(Book::getId).containsExactly("book-1");
+        assertThat(result.getTotalElements()).isEqualTo(1);
+    }
+
+    @Test
+    @DisplayName("The COMPLETED filter returns only completed books")
+    void findBooksWithFilters_returnsCompletedBooks() {
+        GetBooksRequest request = GetBooksRequest.builder()
+            .progress(ProgressStatus.COMPLETED)
+            .build();
+
+        Page<Book> result = bookRepository.findBooksWithFilters(request, USER_ID, defaultPageable());
+
+        assertThat(result.getContent()).extracting(Book::getId).containsExactly("book-3");
+        assertThat(result.getTotalElements()).isEqualTo(1);
+    }
+
+    @Test
+    @DisplayName("Returns an empty page when no progress matches the filter")
+    void findBooksWithFilters_returnsEmptyPageWhenNoProgressMatch() {
+        bookProgressRepository.deleteAll();
+        mongoTemplate.insert(createProgressDocument("book-1", false, 0.0), "bookProgress");
+
+        GetBooksRequest request = GetBooksRequest.builder()
+            .progress(ProgressStatus.IN_PROGRESS)
+            .build();
+
+        Page<Book> result = bookRepository.findBooksWithFilters(request, USER_ID, defaultPageable());
+
+        assertThat(result.getContent()).isEmpty();
+        assertThat(result.getTotalElements()).isZero();
+    }
+
+    @Test
+    @DisplayName("A book with normalizedProgress 0 that is not completed is classified as NOT_STARTED")
+    void findBooksWithFilters_includesZeroProgressAsNotStarted() {
+        mongoTemplate.insert(createProgressDocument("book-1", false, 0.0), "bookProgress");
+
+        GetBooksRequest request = GetBooksRequest.builder()
+            .progress(ProgressStatus.NOT_STARTED)
+            .build();
+
+        Page<Book> result = bookRepository.findBooksWithFilters(request, USER_ID, defaultPageable());
+
+        assertThat(result.getContent()).extracting(Book::getId).containsExactly("book-1");
+        assertThat(result.getTotalElements()).isEqualTo(1);
+    }
+
+    @Test
+    @DisplayName("Partial-read data is excluded from NOT_STARTED")
+    void findBooksWithFilters_excludesPartialReadFromNotStarted() {
+        bookProgressRepository.deleteAll();
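The partial-read fixtures in these tests store `maxReadChunkNumber` using a chapter-first position encoding. A standalone sketch of that encoding, mirroring the 16-bit shift and range guards added to `ProgressService` (class and method names here are illustrative, not from the PR):

```java
// Sketch of the chapter-first position encoding behind maxReadChunkNumber.
// position = (chapterNumber << 16) | chunkNumber, so any chunk in a later
// chapter always compares greater than every chunk of an earlier chapter.
public class ChapterFirstPosition {
    static final int SHIFT = 16;
    static final int CHAPTER_MAX = 0x7FFF; // 32767, matches CHAPTER_NUMBER_MAX
    static final int CHUNK_MAX = 0xFFFF;   // 65535, matches CHUNK_NUMBER_MAX

    static int encode(int chapterNumber, int chunkNumber) {
        if (chapterNumber <= 0 || chunkNumber <= 0
                || chapterNumber > CHAPTER_MAX || chunkNumber > CHUNK_MAX) {
            throw new IllegalArgumentException("position out of range");
        }
        return (chapterNumber << SHIFT) | chunkNumber;
    }

    public static void main(String[] args) {
        int ch1chunk2 = encode(1, 2); // 65538, same as 1 * 65536 + 2 in the test helper
        int ch2chunk1 = encode(2, 1); // 131073
        assert ch1chunk2 == 65538;
        assert ch2chunk1 > ch1chunk2; // chapter number dominates the comparison
        System.out.println(ch1chunk2 + " " + ch2chunk1);
    }
}
```

This is why the test fixture computes `chapterNumber * 65536 + chunkNumber`: multiplying by 65536 is the same as shifting left by 16, so both sides of the codebase agree on ordering.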
mongoTemplate.insert(createPartialInProgressDocument("book-1", 1, 2, 20.0), "bookProgress");
+
+        GetBooksRequest request = GetBooksRequest.builder()
+            .progress(ProgressStatus.NOT_STARTED)
+            .build();
+
+        Page<Book> result = bookRepository.findBooksWithFilters(request, USER_ID, defaultPageable());
+
+        assertThat(result.getContent()).extracting(Book::getId)
+            .containsExactly("book-2", "book-3");
+        assertThat(result.getTotalElements()).isEqualTo(2);
+    }
+
+    private Pageable defaultPageable() {
+        return PageRequest.of(0, 10, Sort.by(Sort.Direction.ASC, "createdAt"));
+    }
+
+    private Book createBook(String id, String title, Instant createdAt) {
+        Book book = new Book();
+        book.setId(id);
+        book.setTitle(title);
+        book.setAuthor("Author");
+        book.setDifficultyLevel(DifficultyLevel.A1);
+        book.setChapterCount(10);
+        book.setCreatedAt(createdAt);
+        return book;
+    }
+
+    private Document createProgressDocument(String bookId, boolean isCompleted, double normalizedProgress) {
+        return new Document("userId", USER_ID)
+            .append("bookId", bookId)
+            .append("isCompleted", isCompleted)
+            .append("normalizedProgress", normalizedProgress);
+    }
+
+    private Document createPartialInProgressDocument(
+        String bookId,
+        int chapterNumber,
+        int chunkNumber,
+        double progressPercentage
+    ) {
+        // Chapter-first encoding: chapterNumber * 65536 + chunkNumber, i.e. (chapterNumber << 16) | chunkNumber
+        int encodedPosition = chapterNumber * 65536 + chunkNumber;
+        Document chapterProgress = new Document("chapterNumber", chapterNumber)
+            .append("progressPercentage", progressPercentage)
+            .append("isCompleted", false)
+            .append("completedAt", null);
+
+        return createProgressDocument(bookId, false, 0.0)
+            .append("maxReadChapterNumber", chapterNumber)
+            .append("maxReadChunkNumber", encodedPosition)
+            .append("chapterProgresses", List.of(chapterProgress));
+    }
+}
diff --git a/src/test/java/com/linglevel/api/content/book/repository/ChapterRepositoryImplTest.java b/src/test/java/com/linglevel/api/content/book/repository/ChapterRepositoryImplTest.java
new file mode 100644
index 00000000..8237ed7b
---
/dev/null +++ b/src/test/java/com/linglevel/api/content/book/repository/ChapterRepositoryImplTest.java @@ -0,0 +1,140 @@ +package com.linglevel.api.content.book.repository; + +import com.linglevel.api.common.AbstractDatabaseTest; +import com.linglevel.api.content.book.dto.GetChaptersRequest; +import com.linglevel.api.content.book.entity.BookProgress; +import com.linglevel.api.content.book.entity.Chapter; +import com.linglevel.api.content.common.ProgressStatus; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.DisplayName; +import org.junit.jupiter.api.Test; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.boot.test.autoconfigure.data.mongo.DataMongoTest; +import org.springframework.context.annotation.Import; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.PageRequest; +import org.springframework.data.domain.Pageable; +import org.springframework.data.domain.Sort; + +import java.util.List; + +import static org.assertj.core.api.Assertions.assertThat; + +@DataMongoTest +@Import(ChapterRepositoryImpl.class) +class ChapterRepositoryImplTest extends AbstractDatabaseTest { + + @Autowired + private ChapterRepository chapterRepository; + + @Autowired + private BookProgressRepository bookProgressRepository; + + private static final String BOOK_ID = "book-1"; + private static final String USER_ID = "user-1"; + + @BeforeEach + void setUp() { + bookProgressRepository.deleteAll(); + chapterRepository.deleteAll(); + + chapterRepository.saveAll(List.of( + createChapter(1, "Chapter 1"), + createChapter(2, "Chapter 2"), + createChapter(3, "Chapter 3") + )); + } + + @Test + @DisplayName("진도 정보가 없으면 NOT_STARTED 필터는 모든 챕터를 반환한다") + void findChaptersWithFilters_returnsAllChaptersWhenNoProgress() { + GetChaptersRequest request = GetChaptersRequest.builder() + .progress(ProgressStatus.NOT_STARTED) + .build(); + + Page result = chapterRepository.findChaptersWithFilters(BOOK_ID, request, 
USER_ID, defaultPageable());
+
+        assertThat(result.getContent()).extracting(Chapter::getChapterNumber).containsExactly(1, 2, 3);
+        assertThat(result.getTotalElements()).isEqualTo(3);
+    }
+
+    @Test
+    @DisplayName("Distinguishes IN_PROGRESS and COMPLETED based on V3 chapterProgresses")
+    void findChaptersWithFilters_usesV3ChapterProgresses() {
+        BookProgress progress = new BookProgress();
+        progress.setUserId(USER_ID);
+        progress.setBookId(BOOK_ID);
+        progress.setChapterProgresses(List.of(
+            BookProgress.ChapterProgressInfo.builder()
+                .chapterNumber(1)
+                .progressPercentage(100.0)
+                .isCompleted(true)
+                .build(),
+            BookProgress.ChapterProgressInfo.builder()
+                .chapterNumber(2)
+                .progressPercentage(50.0)
+                .isCompleted(false)
+                .build()
+        ));
+        bookProgressRepository.save(progress);
+
+        GetChaptersRequest inProgressRequest = GetChaptersRequest.builder()
+            .progress(ProgressStatus.IN_PROGRESS)
+            .build();
+        GetChaptersRequest completedRequest = GetChaptersRequest.builder()
+            .progress(ProgressStatus.COMPLETED)
+            .build();
+        GetChaptersRequest notStartedRequest = GetChaptersRequest.builder()
+            .progress(ProgressStatus.NOT_STARTED)
+            .build();
+
+        Page<Chapter> inProgress = chapterRepository.findChaptersWithFilters(BOOK_ID, inProgressRequest, USER_ID, defaultPageable());
+        Page<Chapter> completed = chapterRepository.findChaptersWithFilters(BOOK_ID, completedRequest, USER_ID, defaultPageable());
+        Page<Chapter> notStarted = chapterRepository.findChaptersWithFilters(BOOK_ID, notStartedRequest, USER_ID, defaultPageable());
+
+        assertThat(inProgress.getContent()).extracting(Chapter::getChapterNumber).containsExactly(2);
+        assertThat(completed.getContent()).extracting(Chapter::getChapterNumber).containsExactly(1);
+        assertThat(notStarted.getContent()).extracting(Chapter::getChapterNumber).containsExactly(3);
+    }
+
+    @Test
+    @DisplayName("Treats all chapters as NOT_STARTED when no V3 data exists")
+    void findChaptersWithFilters_treatsMissingV3DataAsNotStarted() {
+        BookProgress progress = new BookProgress();
+        progress.setUserId(USER_ID);
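The V3 classification rule these chapter tests pin down can be restated independently. This is a hypothetical helper (the real filtering happens inside `ChapterRepositoryImpl`'s query), and it only covers the cases the tests exercise; an entry with 0% progress is not covered by the fixtures above:

```java
import java.util.Optional;

// Hypothetical restatement of the V3 chapter-status rule: a chapter is
// COMPLETED when its ChapterProgressInfo has isCompleted = true,
// IN_PROGRESS when an entry exists but is not completed, and NOT_STARTED
// when no entry exists at all (legacy fields like currentReadChapterNumber
// are ignored in V3-only filtering).
public class ChapterStatusRule {
    enum Status { NOT_STARTED, IN_PROGRESS, COMPLETED }

    record ChapterProgressInfo(int chapterNumber, double progressPercentage, boolean isCompleted) {}

    static Status classify(Optional<ChapterProgressInfo> info) {
        if (info.isEmpty()) {
            return Status.NOT_STARTED; // missing V3 data => not started
        }
        return info.get().isCompleted() ? Status.COMPLETED : Status.IN_PROGRESS;
    }

    public static void main(String[] args) {
        // Mirrors the fixtures: chapter 1 (100%, completed), chapter 2 (50%), chapter 3 (no entry)
        assert classify(Optional.of(new ChapterProgressInfo(1, 100.0, true))) == Status.COMPLETED;
        assert classify(Optional.of(new ChapterProgressInfo(2, 50.0, false))) == Status.IN_PROGRESS;
        assert classify(Optional.empty()) == Status.NOT_STARTED;
        System.out.println("ok");
    }
}
```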
+        progress.setBookId(BOOK_ID);
+        progress.setCurrentReadChapterNumber(2); // legacy field only (ignored in V3-only filtering)
+        bookProgressRepository.save(progress);
+
+        GetChaptersRequest completedRequest = GetChaptersRequest.builder()
+            .progress(ProgressStatus.COMPLETED)
+            .build();
+        GetChaptersRequest inProgressRequest = GetChaptersRequest.builder()
+            .progress(ProgressStatus.IN_PROGRESS)
+            .build();
+        GetChaptersRequest notStartedRequest = GetChaptersRequest.builder()
+            .progress(ProgressStatus.NOT_STARTED)
+            .build();
+
+        Page<Chapter> completed = chapterRepository.findChaptersWithFilters(BOOK_ID, completedRequest, USER_ID, defaultPageable());
+        Page<Chapter> inProgress = chapterRepository.findChaptersWithFilters(BOOK_ID, inProgressRequest, USER_ID, defaultPageable());
+        Page<Chapter> notStarted = chapterRepository.findChaptersWithFilters(BOOK_ID, notStartedRequest, USER_ID, defaultPageable());
+
+        assertThat(completed.getContent()).isEmpty();
+        assertThat(inProgress.getContent()).isEmpty();
+        assertThat(notStarted.getContent()).extracting(Chapter::getChapterNumber).containsExactly(1, 2, 3);
+    }
+
+    private Pageable defaultPageable() {
+        return PageRequest.of(0, 10, Sort.by(Sort.Direction.ASC, "chapterNumber"));
+    }
+
+    private Chapter createChapter(int chapterNumber, String title) {
+        Chapter chapter = new Chapter();
+        chapter.setId("chapter-" + chapterNumber);
+        chapter.setBookId(BOOK_ID);
+        chapter.setChapterNumber(chapterNumber);
+        chapter.setTitle(title);
+        return chapter;
+    }
+}
diff --git a/src/test/java/com/linglevel/api/content/book/service/BookImportServiceTest.java b/src/test/java/com/linglevel/api/content/book/service/BookImportServiceTest.java
new file mode 100644
index 00000000..29783983
--- /dev/null
+++ b/src/test/java/com/linglevel/api/content/book/service/BookImportServiceTest.java
@@ -0,0 +1,276 @@
+package com.linglevel.api.content.book.service;
+
+import com.linglevel.api.content.book.dto.BookImportData;
+import com.linglevel.api.content.book.entity.Chapter;
+import com.linglevel.api.content.book.entity.Chunk; +import com.linglevel.api.content.book.repository.ChapterRepository; +import com.linglevel.api.content.book.repository.ChunkRepository; +import com.linglevel.api.content.common.ChunkType; +import com.linglevel.api.content.common.DifficultyLevel; +import com.linglevel.api.s3.service.S3UrlService; +import com.linglevel.api.s3.strategy.BookPathStrategy; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.DisplayName; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import org.mockito.ArgumentMatchers; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; + +import java.util.List; +import java.util.stream.StreamSupport; + +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNull; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class BookImportServiceTest { + @Mock + private ChapterRepository chapterRepository; + + @Mock + private ChunkRepository chunkRepository; + + @Mock + private S3UrlService s3UrlService; + + @Mock + private BookPathStrategy bookPathStrategy; + + @InjectMocks + private BookImportService bookImportService; + + private BookImportData bookImportData; + private BookImportData.ChapterMetadata chapterMetadata; + + @BeforeEach + void setUp() { + bookImportData = new BookImportData(); + chapterMetadata = new BookImportData.ChapterMetadata(); + BookImportData.TextLevelData textLevelData = new BookImportData.TextLevelData(); + BookImportData.ChapterData chapterData = new BookImportData.ChapterData(); + BookImportData.ChunkData textChunkData = new BookImportData.ChunkData(); + BookImportData.ChunkData imageChunkData = new BookImportData.ChunkData(); + + textChunkData.setChunkNum(1); + textChunkData.setChunkText("내용"); + 
textChunkData.setIsImage(false);
+        imageChunkData.setChunkNum(2);
+        imageChunkData.setChunkText("주소");
+        imageChunkData.setIsImage(true);
+        imageChunkData.setDescription("이미지 설명");
+
+        chapterData.setChapterNum(1);
+        chapterData.setChunks(List.of(textChunkData, imageChunkData));
+
+        textLevelData.setTextLevel("a1");
+        textLevelData.setChapters(List.of(chapterData));
+
+        chapterMetadata.setChapterNum(1);
+        chapterMetadata.setTitle("제목");
+        chapterMetadata.setSummary("요약");
+
+        bookImportData.setChapterMetadata(List.of(chapterMetadata));
+        bookImportData.setLeveledResults(List.of(textLevelData));
+    }
+
+    @Test
+    @DisplayName("Converts chapter metadata into Chapter entities and saves them.")
+    void importChapters() {
+        // given
+        @SuppressWarnings("unchecked")
+        ArgumentCaptor<Iterable<Chapter>> captor = ArgumentCaptor.forClass((Class) Iterable.class);
+
+        when(chapterRepository.saveAll(ArgumentMatchers.anyList())).thenAnswer(invocation -> invocation.getArgument(0));
+
+        // when
+        List<Chapter> chapters = bookImportService.createChaptersFromMetadata(bookImportData, "bookId");
+
+        // then
+        verify(chapterRepository).saveAll(captor.capture());
+
+        List<Chapter> savedChapters = StreamSupport.stream(captor.getValue().spliterator(), false).toList();
+
+        assertEquals(1, savedChapters.size());
+
+        Chapter savedChapter = savedChapters.get(0);
+        assertEquals("bookId", savedChapter.getBookId());
+        assertEquals(1, savedChapter.getChapterNumber());
+        assertEquals("제목", savedChapter.getTitle());
+        assertEquals("요약", savedChapter.getDescription());
+        assertEquals(0, savedChapter.getReadingTime());
+        assertEquals(savedChapters, chapters);
+    }
+
+    @Test
+    @DisplayName("Converts leveled results into text and image Chunk entities and saves them.")
+    void importChunks() {
+        // given
+        Chapter savedChapter = new Chapter();
+        savedChapter.setId("chapter-1");
+        List<Chapter> chapters = List.of(savedChapter);
+
+        when(s3UrlService.buildImageUrl("bookId", "주소", bookPathStrategy))
+            .thenReturn("https://cdn.example.com/image.png");
+
+        @SuppressWarnings("unchecked")
+        ArgumentCaptor<Iterable<Chunk>>
captor = + ArgumentCaptor.forClass((Class) Iterable.class); + + // when + bookImportService.createChunksFromLeveledResults(bookImportData, chapters, "bookId"); + + // then + verify(chunkRepository).saveAll(captor.capture()); + + List savedChunks = + StreamSupport.stream(captor.getValue().spliterator(), false).toList(); + + assertEquals(2, savedChunks.size()); + + Chunk textChunk = savedChunks.get(0); + assertEquals("chapter-1", textChunk.getChapterId()); + assertEquals(1, textChunk.getChunkNumber()); + assertEquals(DifficultyLevel.A1, textChunk.getDifficultyLevel()); + assertEquals(ChunkType.TEXT, textChunk.getType()); + assertEquals("내용", textChunk.getContent()); + assertNull(textChunk.getDescription()); + + Chunk imageChunk = savedChunks.get(1); + assertEquals("chapter-1", imageChunk.getChapterId()); + assertEquals(2, imageChunk.getChunkNumber()); + assertEquals(DifficultyLevel.A1, imageChunk.getDifficultyLevel()); + assertEquals(ChunkType.IMAGE, imageChunk.getType()); + assertEquals("https://cdn.example.com/image.png", imageChunk.getContent()); + assertEquals("이미지 설명", imageChunk.getDescription()); + + verify(s3UrlService).buildImageUrl("bookId", "주소", bookPathStrategy); + } + + @Test + @DisplayName("여러 chapter metadata가 주어지면 chapterNumber를 1부터 순차 증가시켜 저장한다.") + void importChapters_assignSequentialChapterNumbers() { + // given + BookImportData.ChapterMetadata secondMetadata = new BookImportData.ChapterMetadata(); + secondMetadata.setChapterNum(2); + secondMetadata.setTitle("두번째 제목"); + secondMetadata.setSummary("두번째 요약"); + bookImportData.setChapterMetadata(List.of(chapterMetadata, secondMetadata)); + + @SuppressWarnings("unchecked") + ArgumentCaptor> captor = ArgumentCaptor.forClass((Class) Iterable.class); + + when(chapterRepository.saveAll(ArgumentMatchers.anyList())).thenAnswer(invocation -> invocation.getArgument(0)); + + // when + bookImportService.createChaptersFromMetadata(bookImportData, "bookId"); + + // then + 
verify(chapterRepository).saveAll(captor.capture());
+
+        List<Chapter> savedChapters = StreamSupport.stream(captor.getValue().spliterator(), false).toList();
+
+        assertEquals(2, savedChapters.size());
+        assertEquals(1, savedChapters.get(0).getChapterNumber());
+        assertEquals("제목", savedChapters.get(0).getTitle());
+        assertEquals(2, savedChapters.get(1).getChapterNumber());
+        assertEquals("두번째 제목", savedChapters.get(1).getTitle());
+    }
+
+    @Test
+    @DisplayName("When saving multiple chapters, chunkNumber restarts from 1 for each chapter.")
+    void importChunks_resetsChunkNumberPerChapter() {
+        // given
+        Chapter firstChapter = new Chapter();
+        firstChapter.setId("chapter-1");
+        Chapter secondChapter = new Chapter();
+        secondChapter.setId("chapter-2");
+        List<Chapter> chapters = List.of(firstChapter, secondChapter);
+
+        BookImportData.ChunkData firstTextChunk = createChunkData("첫 챕터 1", false, null);
+        BookImportData.ChunkData firstImageChunk = createChunkData("first.png", true, "첫 이미지");
+        BookImportData.ChunkData secondTextChunk = createChunkData("둘째 챕터 1", false, null);
+
+        BookImportData.ChapterData firstChapterData = createChapterData(List.of(firstTextChunk, firstImageChunk));
+        BookImportData.ChapterData secondChapterData = createChapterData(List.of(secondTextChunk));
+
+        BookImportData.TextLevelData textLevelData = new BookImportData.TextLevelData();
+        textLevelData.setTextLevel("a1");
+        textLevelData.setChapters(List.of(firstChapterData, secondChapterData));
+        bookImportData.setLeveledResults(List.of(textLevelData));
+
+        when(s3UrlService.buildImageUrl("bookId", "first.png", bookPathStrategy))
+            .thenReturn("https://cdn.example.com/first.png");
+
+        @SuppressWarnings("unchecked")
+        ArgumentCaptor<Iterable<Chunk>> captor = ArgumentCaptor.forClass((Class) Iterable.class);
+
+        // when
+        bookImportService.createChunksFromLeveledResults(bookImportData, chapters, "bookId");
+
+        // then
+        verify(chunkRepository).saveAll(captor.capture());
+
+        List<Chunk> savedChunks = StreamSupport.stream(captor.getValue().spliterator(), false).toList();
+        assertEquals(3, savedChunks.size());
+        assertEquals("chapter-1", savedChunks.get(0).getChapterId());
+        assertEquals(1, savedChunks.get(0).getChunkNumber());
+        assertEquals("chapter-1", savedChunks.get(1).getChapterId());
+        assertEquals(2, savedChunks.get(1).getChunkNumber());
+        assertEquals("chapter-2", savedChunks.get(2).getChapterId());
+        assertEquals(1, savedChunks.get(2).getChunkNumber());
+    }
+
+    @Test
+    @DisplayName("When the AI chapter count is smaller than savedChapters, the remaining chapters are skipped.")
+    void importChunks_skipsRemainingSavedChaptersWhenAiChaptersAreShorter() {
+        // given
+        Chapter firstChapter = new Chapter();
+        firstChapter.setId("chapter-1");
+        Chapter secondChapter = new Chapter();
+        secondChapter.setId("chapter-2");
+        List<Chapter> chapters = List.of(firstChapter, secondChapter);
+
+        BookImportData.ChunkData onlyChunk = createChunkData("첫 챕터만 저장", false, null);
+        BookImportData.ChapterData onlyChapterData = createChapterData(List.of(onlyChunk));
+
+        BookImportData.TextLevelData textLevelData = new BookImportData.TextLevelData();
+        textLevelData.setTextLevel("a1");
+        textLevelData.setChapters(List.of(onlyChapterData));
+        bookImportData.setLeveledResults(List.of(textLevelData));
+
+        @SuppressWarnings("unchecked")
+        ArgumentCaptor<Iterable<Chunk>> captor = ArgumentCaptor.forClass((Class) Iterable.class);
+
+        // when
+        bookImportService.createChunksFromLeveledResults(bookImportData, chapters, "bookId");
+
+        // then
+        verify(chunkRepository).saveAll(captor.capture());
+
+        List<Chunk> savedChunks = StreamSupport.stream(captor.getValue().spliterator(), false).toList();
+
+        assertEquals(1, savedChunks.size());
+        assertEquals("chapter-1", savedChunks.get(0).getChapterId());
+        assertEquals("첫 챕터만 저장", savedChunks.get(0).getContent());
+    }
+
+    private BookImportData.ChunkData createChunkData(String chunkText, boolean isImage, String description) {
+        BookImportData.ChunkData chunkData = new BookImportData.ChunkData();
+        chunkData.setChunkText(chunkText);
+        chunkData.setIsImage(isImage);
chunkData.setDescription(description); + return chunkData; + } + + private BookImportData.ChapterData createChapterData(List chunks) { + BookImportData.ChapterData chapterData = new BookImportData.ChapterData(); + chapterData.setChunks(chunks); + return chapterData; + } +} diff --git a/src/test/java/com/linglevel/api/content/book/service/BookReadingTimeServiceTest.java b/src/test/java/com/linglevel/api/content/book/service/BookReadingTimeServiceTest.java new file mode 100644 index 00000000..35dde96c --- /dev/null +++ b/src/test/java/com/linglevel/api/content/book/service/BookReadingTimeServiceTest.java @@ -0,0 +1,153 @@ +package com.linglevel.api.content.book.service; + +import com.linglevel.api.content.book.dto.BookImportData; +import com.linglevel.api.content.book.entity.Book; +import com.linglevel.api.content.book.entity.Chapter; +import com.linglevel.api.content.book.exception.BooksErrorCode; +import com.linglevel.api.content.book.exception.BooksException; +import com.linglevel.api.content.book.repository.BookRepository; +import com.linglevel.api.content.book.repository.ChapterRepository; +import com.linglevel.api.content.common.DifficultyLevel; +import com.linglevel.api.content.common.service.ReadingTimeService; +import org.junit.jupiter.api.DisplayName; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; + +import java.util.List; +import java.util.Optional; +import java.util.stream.StreamSupport; + +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.Mockito.never; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class BookReadingTimeServiceTest { + + @Mock + private BookRepository bookRepository; + + @Mock + 
private ChapterRepository chapterRepository; + + @Mock + private ReadingTimeService readingTimeService; + + @InjectMocks + private BookReadingTimeService bookReadingTimeService; + + @Test + @DisplayName("책이 없으면 BOOK_NOT_FOUND 예외를 던진다.") + void updateReadingTimes_throwsWhenBookNotFound() { + // given + BookImportData importData = new BookImportData(); + when(bookRepository.findById("missing-book")).thenReturn(Optional.empty()); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> bookReadingTimeService.updateReadingTimes("missing-book", importData) + ); + + // then + assertEquals(BooksErrorCode.BOOK_NOT_FOUND.getMessage(), exception.getMessage()); + verify(chapterRepository, never()).findByBookIdOrderByChapterNumber("missing-book"); + verify(chapterRepository, never()).saveAll(org.mockito.ArgumentMatchers.anyList()); + verify(bookRepository, never()).save(org.mockito.ArgumentMatchers.any(Book.class)); + } + + @Test + @DisplayName("책 난이도와 일치하는 leveled results를 사용해 chapter와 book readingTime을 저장한다.") + void updateReadingTimes_updatesChapterAndBookReadingTimes() { + // given + Book book = new Book(); + book.setId("book-1"); + book.setDifficultyLevel(DifficultyLevel.A1); + + Chapter firstChapter = new Chapter(); + firstChapter.setId("chapter-1"); + firstChapter.setBookId("book-1"); + firstChapter.setChapterNumber(1); + + Chapter secondChapter = new Chapter(); + secondChapter.setId("chapter-2"); + secondChapter.setBookId("book-1"); + secondChapter.setChapterNumber(2); + + BookImportData importData = createImportData(); + + when(bookRepository.findById("book-1")).thenReturn(Optional.of(book)); + when(chapterRepository.findByBookIdOrderByChapterNumber("book-1")) + .thenReturn(List.of(firstChapter, secondChapter)); + when(readingTimeService.calculateReadingTimeFromCharacters(5)).thenReturn(3); + when(readingTimeService.calculateReadingTimeFromCharacters(4)).thenReturn(2); + + @SuppressWarnings("unchecked") + ArgumentCaptor> chaptersCaptor = 
+            ArgumentCaptor.forClass((Class) Iterable.class);
+        ArgumentCaptor<Book> bookCaptor = ArgumentCaptor.forClass(Book.class);
+
+        // when
+        bookReadingTimeService.updateReadingTimes("book-1", importData);
+
+        // then
+        verify(chapterRepository).saveAll(chaptersCaptor.capture());
+        verify(bookRepository).save(bookCaptor.capture());
+
+        List<Chapter> savedChapters =
+            StreamSupport.stream(chaptersCaptor.getValue().spliterator(), false).toList();
+        Book savedBook = bookCaptor.getValue();
+
+        assertEquals(2, savedChapters.size());
+        assertEquals(3, savedChapters.get(0).getReadingTime());
+        assertEquals(2, savedChapters.get(1).getReadingTime());
+        assertEquals(5, savedBook.getReadingTime());
+
+        verify(readingTimeService).calculateReadingTimeFromCharacters(5);
+        verify(readingTimeService).calculateReadingTimeFromCharacters(4);
+    }
+
+    private BookImportData createImportData() {
+        BookImportData.ChunkData firstA1Chunk = createChunkData("abc");
+        BookImportData.ChunkData secondA1Chunk = createChunkData("de");
+        BookImportData.ChunkData chapterTwoA1Chunk = createChunkData("wxyz");
+        BookImportData.ChunkData ignoredB1Chunk = createChunkData("ignored-text");
+
+        BookImportData.ChapterData firstA1Chapter = createChapterData(1, List.of(firstA1Chunk, secondA1Chunk));
+        BookImportData.ChapterData secondA1Chapter = createChapterData(2, List.of(chapterTwoA1Chunk));
+        BookImportData.ChapterData ignoredB1Chapter = createChapterData(1, List.of(ignoredB1Chunk));
+
+        BookImportData.TextLevelData a1Level = createTextLevelData("a1", List.of(firstA1Chapter, secondA1Chapter));
+        BookImportData.TextLevelData b1Level = createTextLevelData("b1", List.of(ignoredB1Chapter));
+
+        BookImportData importData = new BookImportData();
+        importData.setLeveledResults(List.of(a1Level, b1Level));
+        return importData;
+    }
+
+    private BookImportData.TextLevelData createTextLevelData(String textLevel, List<BookImportData.ChapterData> chapters) {
+        BookImportData.TextLevelData levelData = new BookImportData.TextLevelData();
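The aggregation this test pins down is: each chapter's character count is turned into a per-chapter reading time, and the book's reading time is the sum of those values (3 + 2 = 5 in the fixture). A standalone sketch under stated assumptions — the chars-to-minutes function below is a stand-in for the mocked `ReadingTimeService`, and the class name is hypothetical:

```java
import java.util.List;
import java.util.function.IntUnaryOperator;

// Sketch of the reading-time aggregation verified above: per-chapter character
// counts -> per-chapter reading times -> book total as their sum.
public class ReadingTimeSketch {
    static int bookReadingTime(List<Integer> chapterCharCounts, IntUnaryOperator charsToMinutes) {
        return chapterCharCounts.stream()
            .mapToInt(charsToMinutes::applyAsInt)
            .sum();
    }

    public static void main(String[] args) {
        // Stub matching the mocked values in the test: 5 chars -> 3 min, 4 chars -> 2 min.
        IntUnaryOperator stub = chars -> chars == 5 ? 3 : 2;

        // Chapter 1: "abc" + "de" = 5 chars; chapter 2: "wxyz" = 4 chars.
        int total = bookReadingTime(List.of(5, 4), stub);

        assert total == 5; // matches savedBook.getReadingTime() in the test
        System.out.println(total);
    }
}
```

Note that only the level matching the book's difficulty (`a1`) contributes; the `b1` chapters in the fixture are ignored, which is why they never reach the character counts above.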
levelData.setTextLevel(textLevel); + levelData.setChapters(chapters); + return levelData; + } + + private BookImportData.ChapterData createChapterData(int chapterNum, List<BookImportData.ChunkData> chunks) { + BookImportData.ChapterData chapterData = new BookImportData.ChapterData(); + chapterData.setChapterNum(chapterNum); + chapterData.setChunks(chunks); + return chapterData; + } + + private BookImportData.ChunkData createChunkData(String chunkText) { + BookImportData.ChunkData chunkData = new BookImportData.ChunkData(); + chunkData.setChunkText(chunkText); + return chunkData; + } +} diff --git a/src/test/java/com/linglevel/api/content/book/service/BookServiceTest.java b/src/test/java/com/linglevel/api/content/book/service/BookServiceTest.java index 559bc683..94d5b8a3 100644 --- a/src/test/java/com/linglevel/api/content/book/service/BookServiceTest.java +++ b/src/test/java/com/linglevel/api/content/book/service/BookServiceTest.java @@ -1,20 +1,30 @@ package com.linglevel.api.content.book.service; import com.linglevel.api.common.dto.PageResponse; -import com.linglevel.api.content.book.dto.BookResponse; -import com.linglevel.api.content.book.dto.GetBooksRequest; +import com.linglevel.api.content.book.dto.*; import com.linglevel.api.content.book.entity.Book; import com.linglevel.api.content.book.entity.BookProgress; +import com.linglevel.api.content.book.entity.Chapter; +import com.linglevel.api.content.book.exception.BooksErrorCode; +import com.linglevel.api.content.book.exception.BooksException; import com.linglevel.api.content.book.repository.BookProgressRepository; import com.linglevel.api.content.book.repository.BookRepository; import com.linglevel.api.content.common.DifficultyLevel; import com.linglevel.api.content.common.ProgressStatus; +import com.linglevel.api.content.common.TitleTranslations; +import com.linglevel.api.i18n.LanguageCode; +import com.linglevel.api.s3.service.ImageResizeService; +import com.linglevel.api.s3.service.S3AiService; +import 
com.linglevel.api.s3.service.S3TransferService; +import com.linglevel.api.s3.service.S3UrlService; +import com.linglevel.api.s3.strategy.BookPathStrategy; import com.linglevel.api.user.entity.User; import com.linglevel.api.user.entity.UserRole; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.DisplayName; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; import org.mockito.InjectMocks; import org.mockito.Mock; import org.mockito.junit.jupiter.MockitoExtension; @@ -28,7 +38,10 @@ import java.util.Optional; import static org.assertj.core.api.Assertions.assertThat; -import static org.mockito.ArgumentMatchers.*; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; +import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; @ExtendWith(MockitoExtension.class) @@ -40,6 +53,27 @@ class BookServiceTest { @Mock private BookProgressRepository bookProgressRepository; + @Mock + private S3AiService s3AiService; + + @Mock + private S3TransferService s3TransferService; + + @Mock + private S3UrlService s3UrlService; + + @Mock + private BookPathStrategy bookPathStrategy; + + @Mock + private ImageResizeService imageResizeService; + + @Mock + private BookReadingTimeService bookReadingTimeService; + + @Mock + private BookImportService bookImportService; + @InjectMocks private BookService bookService; @@ -56,6 +90,176 @@ void setUp() { testUser.setCreatedAt(LocalDateTime.now()); } + @Test + @DisplayName("importBook는 책 저장, 이미지 처리, 챕터/청크 import, reading time 갱신을 순서대로 수행한다") + void importBook_orchestratesImportFlow() { + // given + BookImportRequest request = new BookImportRequest(); + request.setId("request-1"); + + BookImportData importData = createImportData(); + Chapter savedChapter = new Chapter(); + savedChapter.setId("chapter-1"); + List<Chapter> savedChapters = 
List.of(savedChapter); + + when(s3AiService.downloadJsonFile("request-1", BookImportData.class, bookPathStrategy)) + .thenReturn(importData); + when(s3UrlService.getCoverImageUrl("request-1", bookPathStrategy)) + .thenReturn("https://cdn/request-cover.jpg"); + when(s3UrlService.getCoverImageUrl("saved-book-id", bookPathStrategy)) + .thenReturn("https://cdn/original-cover.jpg"); + when(bookPathStrategy.generateCoverImagePath("saved-book-id")) + .thenReturn("literature/saved-book-id/images/cover.jpg"); + when(imageResizeService.createSmallImage("literature/saved-book-id/images/cover.jpg")) + .thenReturn("https://cdn/small-cover.webp"); + when(bookRepository.save(any(Book.class))) + .thenAnswer(invocation -> { + Book book = invocation.getArgument(0); + if (book.getId() == null) { + book.setId("saved-book-id"); + } + return book; + }); + when(bookImportService.createChaptersFromMetadata(importData, "saved-book-id")) + .thenReturn(savedChapters); + + ArgumentCaptor<Book> bookCaptor = ArgumentCaptor.forClass(Book.class); + + // when + BookImportResponse response = bookService.importBook(request); + + // then + verify(bookRepository, org.mockito.Mockito.times(2)).save(bookCaptor.capture()); + List<Book> savedBooks = bookCaptor.getAllValues(); + Book finalSavedBook = savedBooks.get(savedBooks.size() - 1); + + assertThat(response.getId()).isEqualTo("saved-book-id"); + assertThat(finalSavedBook.getId()).isEqualTo("saved-book-id"); + assertThat(finalSavedBook.getTitle()).isEqualTo("Imported title"); + assertThat(finalSavedBook.getDifficultyLevel()).isEqualTo(DifficultyLevel.A1); + assertThat(finalSavedBook.getChapterCount()).isEqualTo(2); + assertThat(finalSavedBook.getCoverImageUrl()).isEqualTo("https://cdn/small-cover.webp"); + + verify(s3TransferService).transferImagesFromAiToStatic("request-1", "saved-book-id", bookPathStrategy); + verify(bookImportService).createChaptersFromMetadata(importData, "saved-book-id"); + verify(bookImportService).createChunksFromLeveledResults(importData, 
savedChapters, "saved-book-id"); + verify(bookReadingTimeService).updateReadingTimes("saved-book-id", importData); + } + + @Test + @DisplayName("cover image 리사이즈가 실패하면 원본 cover URL을 유지한다") + void importBook_keepsOriginalCoverUrlWhenResizeFails() { + // given + BookImportRequest request = new BookImportRequest(); + request.setId("request-1"); + + BookImportData importData = createImportData(); + List<Chapter> savedChapters = List.of(new Chapter()); + + when(s3AiService.downloadJsonFile("request-1", BookImportData.class, bookPathStrategy)) + .thenReturn(importData); + when(s3UrlService.getCoverImageUrl("request-1", bookPathStrategy)) + .thenReturn("https://cdn/request-cover.jpg"); + when(s3UrlService.getCoverImageUrl("saved-book-id", bookPathStrategy)) + .thenReturn("https://cdn/original-cover.jpg"); + when(bookPathStrategy.generateCoverImagePath("saved-book-id")) + .thenReturn("literature/saved-book-id/images/cover.jpg"); + when(imageResizeService.createSmallImage("literature/saved-book-id/images/cover.jpg")) + .thenThrow(new RuntimeException("resize failed")); + when(bookRepository.save(any(Book.class))) + .thenAnswer(invocation -> { + Book book = invocation.getArgument(0); + if (book.getId() == null) { + book.setId("saved-book-id"); + } + return book; + }); + when(bookImportService.createChaptersFromMetadata(importData, "saved-book-id")) + .thenReturn(savedChapters); + + ArgumentCaptor<Book> bookCaptor = ArgumentCaptor.forClass(Book.class); + + // when + BookImportResponse response = bookService.importBook(request); + + // then + verify(bookRepository, org.mockito.Mockito.times(2)).save(bookCaptor.capture()); + List<Book> savedBooks = bookCaptor.getAllValues(); + Book finalSavedBook = savedBooks.get(savedBooks.size() - 1); + + assertThat(response.getId()).isEqualTo("saved-book-id"); + assertThat(finalSavedBook.getCoverImageUrl()).isEqualTo("https://cdn/original-cover.jpg"); + + verify(bookImportService).createChunksFromLeveledResults(importData, savedChapters, "saved-book-id"); + 
verify(bookReadingTimeService).updateReadingTimes("saved-book-id", importData); + } + + @Test + @DisplayName("단일 책 조회 시 요청 언어에 맞는 번역 제목을 우선 사용한다") + void getBook_selectsTranslatedTitleByLanguage() { + // given + Book book = createBook("Original title", "Author", List.of("tag1")); + book.setTitleTranslations(new TitleTranslations("번역 제목", "Original title")); + when(bookRepository.findById(book.getId())).thenReturn(Optional.of(book)); + + // when + BookResponse response = bookService.getBook(book.getId(), testUser.getId(), LanguageCode.KO); + + // then + assertThat(response.getTitle()).isEqualTo("번역 제목"); + } + + @Test + @DisplayName("번역 제목이 비어 있으면 원본 제목으로 fallback 한다") + void getBook_fallsBackToOriginalTitleWhenTranslationMissing() { + // given + Book book = createBook("Original title", "Author", List.of("tag1")); + book.setTitleTranslations(new TitleTranslations(null, "Original title")); + when(bookRepository.findById(book.getId())).thenReturn(Optional.of(book)); + + // when + BookResponse response = bookService.getBook(book.getId(), testUser.getId(), LanguageCode.KO); + + // then + assertThat(response.getTitle()).isEqualTo("Original title"); + } + + @Test + @DisplayName("단일 책 조회 시 책이 없으면 BOOK_NOT_FOUND 예외를 던진다") + void getBook_throwsWhenBookMissing() { + // given + when(bookRepository.findById("missing-book")).thenReturn(Optional.empty()); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> bookService.getBook("missing-book", testUser.getId(), LanguageCode.EN) + ); + + // then + assertThat(exception.getMessage()).isEqualTo(BooksErrorCode.BOOK_NOT_FOUND.getMessage()); + } + + @Test + @DisplayName("지원하지 않는 sortBy 값이면 INVALID_SORT_BY 예외를 던진다") + void getBooks_throwsWhenSortByInvalid() { + // given + GetBooksRequest request = GetBooksRequest.builder() + .sortBy("unknown-sort") + .page(1) + .limit(10) + .build(); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> bookService.getBooks(request, 
testUser.getId()) + ); + + // then + assertThat(exception.getMessage()).isEqualTo(BooksErrorCode.INVALID_SORT_BY.getMessage()); + } + @Test @DisplayName("진도 필터링과 페이지네이션이 함께 동작할 때 - IN_PROGRESS 필터") void testProgressFilterWithPagination_InProgress() { @@ -110,11 +314,7 @@ void testProgressFilterWithPagination_Completed() { when(bookRepository.findBooksWithFilters(any(), eq(testUser.getId()), any())) .thenReturn(bookPage); - for (Book book : books) { - BookProgress progress = createBookProgress(testUser.getId(), book.getId(), true); - when(bookProgressRepository.findByUserIdAndBookId(testUser.getId(), book.getId())) - .thenReturn(Optional.of(progress)); - } + mockBookProgress(books, true); PageResponse<BookResponse> response = bookService.getBooks(request, testUser.getId()); @@ -239,11 +439,7 @@ void testCombinedFiltersWithPagination() { when(bookRepository.findBooksWithFilters(any(), eq(testUser.getId()), any())) .thenReturn(bookPage); - for (Book book : books) { - BookProgress progress = createBookProgress(testUser.getId(), book.getId(), false); - when(bookProgressRepository.findByUserIdAndBookId(testUser.getId(), book.getId())) - .thenReturn(Optional.of(progress)); - } + mockBookProgress(books, false); PageResponse<BookResponse> response = bookService.getBooks(request, testUser.getId()); @@ -283,11 +479,13 @@ private Book createBook(String title, String author, List<String> tags) { } private void mockBookProgress(List<Book> books, boolean isCompleted) { - for (Book book : books) { - BookProgress progress = createBookProgress(testUser.getId(), book.getId(), isCompleted); - when(bookProgressRepository.findByUserIdAndBookId(testUser.getId(), book.getId())) - .thenReturn(Optional.of(progress)); - } + List<String> bookIds = books.stream().map(Book::getId).toList(); + List<BookProgress> progresses = books.stream() + .map(book -> createBookProgress(testUser.getId(), book.getId(), isCompleted)) + .toList(); + + when(bookProgressRepository.findByUserIdAndBookIdIn(testUser.getId(), bookIds)) + .thenReturn(progresses); } private 
BookProgress createBookProgress(String userId, String bookId, boolean isCompleted) { @@ -296,8 +494,37 @@ private BookProgress createBookProgress(String userId, String bookId, boolean is progress.setBookId(bookId); progress.setCurrentReadChapterNumber(isCompleted ? 20 : 10); progress.setMaxReadChapterNumber(isCompleted ? 20 : 10); + progress.setNormalizedProgress(isCompleted ? 100.0 : 50.0); + progress.setMaxNormalizedProgress(isCompleted ? 100.0 : 50.0); progress.setIsCompleted(isCompleted); progress.setUpdatedAt(Instant.now()); return progress; } -} \ No newline at end of file + + private BookImportData createImportData() { + BookImportData importData = new BookImportData(); + importData.setTitle("Imported title"); + importData.setTitleTranslations(new TitleTranslations("가져온 제목", "Imported title")); + importData.setAuthor("Imported author"); + importData.setOriginalTextLevel("a1"); + importData.setLeveledResults(List.of( + createTextLevelData("a1", 2), + createTextLevelData("b1", 1) + )); + return importData; + } + + private BookImportData.TextLevelData createTextLevelData(String level, int chapterCount) { + BookImportData.TextLevelData textLevelData = new BookImportData.TextLevelData(); + textLevelData.setTextLevel(level); + List<BookImportData.ChapterData> chapters = new java.util.ArrayList<>(); + for (int i = 1; i <= chapterCount; i++) { + BookImportData.ChapterData chapterData = new BookImportData.ChapterData(); + chapterData.setChapterNum(i); + chapterData.setChunks(List.of()); + chapters.add(chapterData); + } + textLevelData.setChapters(chapters); + return textLevelData; + } +} diff --git a/src/test/java/com/linglevel/api/content/book/service/ChapterServiceTest.java b/src/test/java/com/linglevel/api/content/book/service/ChapterServiceTest.java index 68371781..b22b3456 100644 --- a/src/test/java/com/linglevel/api/content/book/service/ChapterServiceTest.java +++ b/src/test/java/com/linglevel/api/content/book/service/ChapterServiceTest.java @@ -2,10 +2,14 @@ import 
com.linglevel.api.common.dto.PageResponse; import com.linglevel.api.content.book.dto.ChapterResponse; +import com.linglevel.api.content.book.dto.ChapterNavigationResponse; +import com.linglevel.api.content.book.dto.ChunkCountByLevelDto; import com.linglevel.api.content.book.dto.GetChaptersRequest; import com.linglevel.api.content.book.entity.Book; import com.linglevel.api.content.book.entity.BookProgress; import com.linglevel.api.content.book.entity.Chapter; +import com.linglevel.api.content.book.exception.BooksErrorCode; +import com.linglevel.api.content.book.exception.BooksException; import com.linglevel.api.content.book.repository.BookProgressRepository; import com.linglevel.api.content.book.repository.BookRepository; import com.linglevel.api.content.book.repository.ChapterRepository; @@ -32,7 +36,10 @@ import java.util.Optional; import static org.assertj.core.api.Assertions.assertThat; +import static org.junit.jupiter.api.Assertions.assertThrows; import static org.mockito.ArgumentMatchers.*; +import static org.mockito.Mockito.lenient; +import static org.mockito.Mockito.verify; import static org.mockito.Mockito.when; @ExtendWith(MockitoExtension.class) @@ -77,10 +84,10 @@ void setUp() { testBook.setChapterCount(10); testBook.setCreatedAt(Instant.now()); - when(bookService.findById(anyString())).thenReturn(testBook); + lenient().when(bookService.findById(anyString())).thenReturn(testBook); // Add stubs for the new repository methods called during refactoring - when(chunkRepository.findChunkCountsByChapterIds(anyList())).thenReturn(Collections.emptyList()); + lenient().when(chunkRepository.findChunkCountsByChapterIds(anyList())).thenReturn(Collections.emptyList()); } @Test @@ -107,11 +114,6 @@ void testProgressFilterWithPagination_NotStarted() { progress.setIsCompleted(false); progress.setUpdatedAt(Instant.now()); - com.linglevel.api.content.book.entity.Chunk mockChunk = new com.linglevel.api.content.book.entity.Chunk(); - mockChunk.setId("test-chunk-id"); - 
mockChunk.setChunkNumber(50); - when(chunkRepository.findById(anyString())).thenReturn(Optional.of(mockChunk)); - when(bookProgressRepository.findByUserIdAndBookId(testUser.getId(), testBook.getId())) .thenReturn(Optional.of(progress)); @@ -149,11 +151,6 @@ void testProgressFilterWithPagination_InProgress() { progress.setIsCompleted(false); progress.setUpdatedAt(Instant.now()); - com.linglevel.api.content.book.entity.Chunk mockChunk = new com.linglevel.api.content.book.entity.Chunk(); - mockChunk.setId("test-chunk-id"); - mockChunk.setChunkNumber(50); - when(chunkRepository.findById(anyString())).thenReturn(Optional.of(mockChunk)); - when(bookProgressRepository.findByUserIdAndBookId(testUser.getId(), testBook.getId())) .thenReturn(Optional.of(progress)); @@ -191,11 +188,6 @@ void testProgressFilterWithPagination_Completed() { progress.setIsCompleted(false); progress.setUpdatedAt(Instant.now()); - com.linglevel.api.content.book.entity.Chunk mockChunk = new com.linglevel.api.content.book.entity.Chunk(); - mockChunk.setId("test-chunk-id"); - mockChunk.setChunkNumber(50); - when(chunkRepository.findById(anyString())).thenReturn(Optional.of(mockChunk)); - when(bookProgressRepository.findByUserIdAndBookId(testUser.getId(), testBook.getId())) .thenReturn(Optional.of(progress)); @@ -236,6 +228,136 @@ void testNoProgress_NotStarted() { assertThat(response.getTotalCount()).isEqualTo(10); } + @Test + @DisplayName("단일 챕터 조회 시 V3 chapterProgresses 정보를 기준으로 응답을 계산한다") + void getChapter_usesV3ChapterProgressInfo() { + // given + Chapter chapter = createChapter(testBook.getId(), 2, "Chapter 2"); + + BookProgress progress = new BookProgress(); + progress.setUserId(testUser.getId()); + progress.setBookId(testBook.getId()); + progress.setCurrentDifficultyLevel(DifficultyLevel.B1); + progress.setChapterProgresses(List.of( + BookProgress.ChapterProgressInfo.builder() + .chapterNumber(2) + .progressPercentage(37.5) + .isCompleted(false) + .build() + )); + + 
when(chapterRepository.findById(chapter.getId())).thenReturn(Optional.of(chapter)); + when(bookProgressRepository.findByUserIdAndBookId(testUser.getId(), testBook.getId())) + .thenReturn(Optional.of(progress)); + when(chunkRepository.findChunkCountsByChapterIds(List.of(chapter.getId()))) + .thenReturn(List.of(new ChunkCountByLevelDto(chapter.getId(), DifficultyLevel.B1, 8L))); + + // when + ChapterResponse response = chapterService.getChapter(testBook.getId(), chapter.getId(), testUser.getId()); + + // then + assertThat(response.getId()).isEqualTo(chapter.getId()); + assertThat(response.getCurrentDifficultyLevel()).isEqualTo(DifficultyLevel.B1); + assertThat(response.getChunkCount()).isEqualTo(8); + assertThat(response.getProgressPercentage()).isEqualTo(37.5); + assertThat(response.getCurrentReadChunkNumber()).isEqualTo(3); + assertThat(response.getIsCompleted()).isFalse(); + } + + @Test + @DisplayName("단일 챕터 조회 시 V3 데이터가 없으면 해당 챕터를 NOT_STARTED로 계산한다") + void getChapter_returnsNotStartedWhenV3DataMissing() { + // given + Chapter chapter = createChapter(testBook.getId(), 2, "Chapter 2"); + + BookProgress progress = new BookProgress(); + progress.setUserId(testUser.getId()); + progress.setBookId(testBook.getId()); + progress.setCurrentDifficultyLevel(DifficultyLevel.B1); + progress.setChapterProgresses(null); + + when(chapterRepository.findById(chapter.getId())).thenReturn(Optional.of(chapter)); + when(bookProgressRepository.findByUserIdAndBookId(testUser.getId(), testBook.getId())) + .thenReturn(Optional.of(progress)); + when(chunkRepository.findChunkCountsByChapterIds(List.of(chapter.getId()))) + .thenReturn(List.of(new ChunkCountByLevelDto(chapter.getId(), DifficultyLevel.B1, 8L))); + + // when + ChapterResponse response = chapterService.getChapter(testBook.getId(), chapter.getId(), testUser.getId()); + + // then + assertThat(response.getId()).isEqualTo(chapter.getId()); + assertThat(response.getCurrentDifficultyLevel()).isEqualTo(DifficultyLevel.B1); + 
assertThat(response.getChunkCount()).isEqualTo(8); + assertThat(response.getProgressPercentage()).isEqualTo(0.0); + assertThat(response.getCurrentReadChunkNumber()).isEqualTo(0); + assertThat(response.getIsCompleted()).isFalse(); + } + + @Test + @DisplayName("챕터가 다른 책에 속하면 CHAPTER_NOT_FOUND_IN_BOOK 예외를 던진다") + void getChapter_throwsWhenChapterDoesNotBelongToBook() { + // given + Chapter anotherBookChapter = createChapter("another-book", 1, "Wrong Chapter"); + when(chapterRepository.findById(anotherBookChapter.getId())).thenReturn(Optional.of(anotherBookChapter)); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> chapterService.getChapter(testBook.getId(), anotherBookChapter.getId(), testUser.getId()) + ); + + // then + assertThat(exception.getMessage()).isEqualTo(BooksErrorCode.CHAPTER_NOT_FOUND_IN_BOOK.getMessage()); + } + + @Test + @DisplayName("챕터 네비게이션 조회 시 이전/다음 챕터 정보를 반환한다") + void getChapterNavigation_returnsPreviousAndNextChapter() { + // given + Chapter currentChapter = createChapter(testBook.getId(), 2, "Chapter 2"); + Chapter previousChapter = createChapter(testBook.getId(), 1, "Chapter 1"); + Chapter nextChapter = createChapter(testBook.getId(), 3, "Chapter 3"); + + when(bookService.existsById(testBook.getId())).thenReturn(true); + when(chapterRepository.findById(currentChapter.getId())).thenReturn(Optional.of(currentChapter)); + when(chapterRepository.findByBookIdAndChapterNumber(testBook.getId(), 1)).thenReturn(Optional.of(previousChapter)); + when(chapterRepository.findByBookIdAndChapterNumber(testBook.getId(), 3)).thenReturn(Optional.of(nextChapter)); + + // when + ChapterNavigationResponse response = chapterService.getChapterNavigation(testBook.getId(), currentChapter.getId()); + + // then + assertThat(response.getCurrentChapterId()).isEqualTo(currentChapter.getId()); + assertThat(response.getCurrentChapterNumber()).isEqualTo(2); + assertThat(response.getHasPreviousChapter()).isTrue(); + 
assertThat(response.getPreviousChapterId()).isEqualTo(previousChapter.getId()); + assertThat(response.getHasNextChapter()).isTrue(); + assertThat(response.getNextChapterId()).isEqualTo(nextChapter.getId()); + } + + @Test + @DisplayName("챕터 목록 조회 시 viewCount를 증가시킨다") + void getChapters_incrementsBookViewCount() { + // given + GetChaptersRequest request = GetChaptersRequest.builder() + .page(1) + .limit(2) + .build(); + + List<Chapter> chapters = createChapters(1, testBook.getId(), 1, "Chapter"); + Page<Chapter> chapterPage = new PageImpl<>(chapters, PageRequest.of(0, 2), 1); + + when(chapterRepository.findChaptersWithFilters(anyString(), any(), any(), any())) + .thenReturn(chapterPage); + + // when + chapterService.getChapters(testBook.getId(), request, testUser.getId()); + + // then + verify(bookRepository).incrementViewCount(testBook.getId()); + } + private List<Chapter> createChapters(int count, String bookId, int startNumber, String titlePrefix) { List<Chapter> chapters = new java.util.ArrayList<>(); for (int i = 0; i < count; i++) { @@ -254,4 +376,4 @@ private Chapter createChapter(String bookId, Integer chapterNumber, String title chapter.setReadingTime(30); return chapter; } -} \ No newline at end of file +} diff --git a/src/test/java/com/linglevel/api/content/book/service/ChunkServiceTest.java b/src/test/java/com/linglevel/api/content/book/service/ChunkServiceTest.java new file mode 100644 index 00000000..ac86b835 --- /dev/null +++ b/src/test/java/com/linglevel/api/content/book/service/ChunkServiceTest.java @@ -0,0 +1,228 @@ +package com.linglevel.api.content.book.service; + +import com.linglevel.api.common.dto.PageResponse; +import com.linglevel.api.content.book.dto.ChunkResponse; +import com.linglevel.api.content.book.dto.GetChunksRequest; +import com.linglevel.api.content.book.entity.Chapter; +import com.linglevel.api.content.book.entity.Chunk; +import com.linglevel.api.content.book.exception.BooksErrorCode; +import com.linglevel.api.content.book.exception.BooksException; +import 
com.linglevel.api.content.book.repository.ChapterRepository; +import com.linglevel.api.content.book.repository.ChunkRepository; +import com.linglevel.api.content.common.ChunkType; +import com.linglevel.api.content.common.DifficultyLevel; +import org.junit.jupiter.api.DisplayName; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.ArgumentCaptor; +import org.mockito.ArgumentMatchers; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.PageImpl; +import org.springframework.data.domain.Pageable; + +import java.util.List; +import java.util.Optional; + +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.when; + +@ExtendWith(MockitoExtension.class) +class ChunkServiceTest { + + @Mock + private ChunkRepository chunkRepository; + + @Mock + private ChapterRepository chapterRepository; + + @Mock + private BookService bookService; + + @InjectMocks + private ChunkService chunkService; + + @Test + @DisplayName("청크 목록 조회 시 페이지 정보와 ChunkResponse 매핑을 반환한다.") + void getChunks_returnsPagedChunkResponses() { + // given + GetChunksRequest request = GetChunksRequest.builder() + .difficultyLevel(DifficultyLevel.A1) + .page(1) + .limit(300) + .build(); + + Chapter chapter = createChapter("chapter-1", "book-1"); + Chunk firstChunk = createChunk("chunk-1", "chapter-1", 1, ChunkType.TEXT, "first", null); + Chunk secondChunk = createChunk("chunk-2", "chapter-1", 2, ChunkType.IMAGE, "https://cdn/image.png", "image"); + Page<Chunk> chunkPage = new PageImpl<>(List.of(firstChunk, secondChunk)); + + when(bookService.existsById("book-1")).thenReturn(true); + when(chapterRepository.findById("chapter-1")).thenReturn(Optional.of(chapter)); + + ArgumentCaptor<Pageable> 
pageableCaptor = ArgumentCaptor.forClass(Pageable.class); + when(chunkRepository.findByChapterIdAndDifficultyLevel( + ArgumentMatchers.eq("chapter-1"), + ArgumentMatchers.eq(DifficultyLevel.A1), + pageableCaptor.capture() + )).thenReturn(chunkPage); + + // when + PageResponse<ChunkResponse> response = chunkService.getChunks("book-1", "chapter-1", request, "user-1"); + + // then + assertEquals(2, response.getData().size()); + assertEquals("chunk-1", response.getData().get(0).getId()); + assertEquals(ChunkType.TEXT, response.getData().get(0).getType()); + assertEquals("https://cdn/image.png", response.getData().get(1).getContent()); + assertEquals("image", response.getData().get(1).getDescription()); + + assertEquals(0, pageableCaptor.getValue().getPageNumber()); + assertEquals(200, pageableCaptor.getValue().getPageSize()); + } + + @Test + @DisplayName("책이 없으면 BOOK_NOT_FOUND 예외를 던진다.") + void getChunks_throwsWhenBookNotFound() { + // given + GetChunksRequest request = GetChunksRequest.builder() + .difficultyLevel(DifficultyLevel.A1) + .build(); + when(bookService.existsById("missing-book")).thenReturn(false); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> chunkService.getChunks("missing-book", "chapter-1", request, "user-1") + ); + + // then + assertEquals(BooksErrorCode.BOOK_NOT_FOUND.getMessage(), exception.getMessage()); + } + + @Test + @DisplayName("챕터가 다른 책에 속하면 CHAPTER_NOT_FOUND_IN_BOOK 예외를 던진다.") + void getChunks_throwsWhenChapterDoesNotBelongToBook() { + // given + GetChunksRequest request = GetChunksRequest.builder() + .difficultyLevel(DifficultyLevel.A1) + .build(); + + when(bookService.existsById("book-1")).thenReturn(true); + when(chapterRepository.findById("chapter-1")) + .thenReturn(Optional.of(createChapter("chapter-1", "another-book"))); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> chunkService.getChunks("book-1", "chapter-1", request, "user-1") + ); + + // then + 
assertEquals(BooksErrorCode.CHAPTER_NOT_FOUND_IN_BOOK.getMessage(), exception.getMessage()); + } + + @Test + @DisplayName("단일 청크 조회 시 ChunkResponse로 변환해 반환한다.") + void getChunk_returnsChunkResponse() { + // given + Chapter chapter = createChapter("chapter-1", "book-1"); + Chunk chunk = createChunk("chunk-1", "chapter-1", 3, ChunkType.TEXT, "body", null); + + when(bookService.existsById("book-1")).thenReturn(true); + when(chapterRepository.findById("chapter-1")).thenReturn(Optional.of(chapter)); + when(chunkRepository.findById("chunk-1")).thenReturn(Optional.of(chunk)); + + // when + ChunkResponse response = chunkService.getChunk("book-1", "chapter-1", "chunk-1"); + + // then + assertEquals("chunk-1", response.getId()); + assertEquals(3, response.getChunkNumber()); + assertEquals(ChunkType.TEXT, response.getType()); + assertEquals("body", response.getContent()); + } + + @Test + @DisplayName("청크가 다른 챕터에 속하면 CHUNK_NOT_FOUND 예외를 던진다.") + void getChunk_throwsWhenChunkDoesNotBelongToChapter() { + // given + Chapter chapter = createChapter("chapter-1", "book-1"); + Chunk chunk = createChunk("chunk-1", "chapter-2", 1, ChunkType.TEXT, "body", null); + + when(bookService.existsById("book-1")).thenReturn(true); + when(chapterRepository.findById("chapter-1")).thenReturn(Optional.of(chapter)); + when(chunkRepository.findById("chunk-1")).thenReturn(Optional.of(chunk)); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> chunkService.getChunk("book-1", "chapter-1", "chunk-1") + ); + + // then + assertEquals(BooksErrorCode.CHUNK_NOT_FOUND.getMessage(), exception.getMessage()); + } + + @Test + @DisplayName("findById는 청크가 없으면 CHUNK_NOT_FOUND 예외를 던진다.") + void findById_throwsWhenChunkNotFound() { + // given + when(chunkRepository.findById("missing-chunk")).thenReturn(Optional.empty()); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> chunkService.findById("missing-chunk") + ); + + // then + 
assertEquals(BooksErrorCode.CHUNK_NOT_FOUND.getMessage(), exception.getMessage()); + } + + @Test + @DisplayName("findFirstByChapterId는 첫 번째 청크를 반환한다.") + void findFirstByChapterId_returnsFirstChunk() { + // given + Chunk chunk = createChunk("chunk-1", "chapter-1", 1, ChunkType.TEXT, "body", null); + when(chunkRepository.findFirstByChapterIdOrderByChunkNumberAsc("chapter-1")) + .thenReturn(Optional.of(chunk)); + + // when + Chunk result = chunkService.findFirstByChapterId("chapter-1"); + + // then + assertEquals("chunk-1", result.getId()); + assertEquals(1, result.getChunkNumber()); + } + + private Chapter createChapter(String chapterId, String bookId) { + Chapter chapter = new Chapter(); + chapter.setId(chapterId); + chapter.setBookId(bookId); + return chapter; + } + + private Chunk createChunk( + String chunkId, + String chapterId, + int chunkNumber, + ChunkType type, + String content, + String description + ) { + Chunk chunk = new Chunk(); + chunk.setId(chunkId); + chunk.setChapterId(chapterId); + chunk.setChunkNumber(chunkNumber); + chunk.setDifficultyLevel(DifficultyLevel.A1); + chunk.setType(type); + chunk.setContent(content); + chunk.setDescription(description); + return chunk; + } +} diff --git a/src/test/java/com/linglevel/api/content/book/service/ProgressServiceIntegrationTest.java b/src/test/java/com/linglevel/api/content/book/service/ProgressServiceIntegrationTest.java index 747afe21..881d1fe6 100644 --- a/src/test/java/com/linglevel/api/content/book/service/ProgressServiceIntegrationTest.java +++ b/src/test/java/com/linglevel/api/content/book/service/ProgressServiceIntegrationTest.java @@ -107,6 +107,8 @@ void updateProgress_ChapterCompletion_CallsBothMethods() { .thenReturn(10); // 총 10개 챕터 when(readingCompletionService.processReadingCompletion(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID, null)) .thenReturn(120L); + when(streakService.updateStreak(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID)) + .thenReturn(true); // when 
progressService.updateProgress(TEST_BOOK_ID, request, TEST_USER_ID); @@ -114,12 +116,12 @@ void updateProgress_ChapterCompletion_CallsBothMethods() { // then - 세 가지 메서드가 순서대로 호출됨 verify(streakService).addStudyTime(TEST_USER_ID, 120L); verify(streakService).updateStreak(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID); - verify(streakService).addCompletedContent(eq(TEST_USER_ID), eq(ContentType.BOOK), eq(TEST_CHAPTER_ID), anyBoolean()); + verify(streakService).addCompletedContent(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID, true); } @Test - @DisplayName("같은 날 두 번째 챕터 완료 시 - 스트릭은 false, 완료 기록은 정상") - void updateProgress_SecondChapterSameDay_OnlyCompletionRecorded() { + @DisplayName("updateStreak가 false를 반환하면 완료 기록에도 false를 전달한다") + void updateProgress_passesFalseWhenStreakIsNotUpdated() { // given ProgressUpdateRequest request = new ProgressUpdateRequest(); request.setChunkId(TEST_CHUNK_ID); @@ -135,19 +137,21 @@ void updateProgress_SecondChapterSameDay_OnlyCompletionRecorded() { .thenReturn(10); when(readingCompletionService.processReadingCompletion(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID, null)) .thenReturn(120L); + when(streakService.updateStreak(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID)) + .thenReturn(false); // when progressService.updateProgress(TEST_BOOK_ID, request, TEST_USER_ID); - // then - addCompletedContent는 여전히 호출됨 + // then verify(streakService).addStudyTime(TEST_USER_ID, 120L); verify(streakService).updateStreak(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID); - verify(streakService).addCompletedContent(eq(TEST_USER_ID), eq(ContentType.BOOK), eq(TEST_CHAPTER_ID), anyBoolean()); + verify(streakService).addCompletedContent(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID, false); } @Test - @DisplayName("세션 유효하지 않아도 학습 시간과 완료 기록은 정상 처리") - void updateProgress_InvalidSession_StudyTimeAndCompletionStillRecorded() { + @DisplayName("마지막 청크여도 읽기 시간이 30초 미만이면 스트릭 관련 메서드를 호출하지 않는다") + void updateProgress_shortReadTime_skipsStreakUpdates() { 
// given ProgressUpdateRequest request = new ProgressUpdateRequest(); request.setChunkId(TEST_CHUNK_ID); @@ -162,15 +166,15 @@ void updateProgress_InvalidSession_StudyTimeAndCompletionStillRecorded() { when(chapterRepository.countByBookId(TEST_BOOK_ID)) .thenReturn(10); when(readingCompletionService.processReadingCompletion(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID, null)) - .thenReturn(30L); // 짧은 시간 + .thenReturn(29L); // when progressService.updateProgress(TEST_BOOK_ID, request, TEST_USER_ID); - // then - 학습 시간과 완료 기록은 정상 처리됨 - verify(streakService).addStudyTime(TEST_USER_ID, 30L); - verify(streakService).updateStreak(TEST_USER_ID, ContentType.BOOK, TEST_CHAPTER_ID); - verify(streakService).addCompletedContent(eq(TEST_USER_ID), eq(ContentType.BOOK), eq(TEST_CHAPTER_ID), anyBoolean()); + // then + verify(streakService, never()).addStudyTime(any(), anyLong()); + verify(streakService, never()).updateStreak(any(), any(), any()); + verify(streakService, never()).addCompletedContent(any(), any(), any(), anyBoolean()); } @Test diff --git a/src/test/java/com/linglevel/api/content/book/service/ProgressServiceTest.java b/src/test/java/com/linglevel/api/content/book/service/ProgressServiceTest.java index 4f8b79d9..3e80ba8a 100644 --- a/src/test/java/com/linglevel/api/content/book/service/ProgressServiceTest.java +++ b/src/test/java/com/linglevel/api/content/book/service/ProgressServiceTest.java @@ -1,13 +1,16 @@ package com.linglevel.api.content.book.service; import com.linglevel.api.content.book.dto.ProgressUpdateRequest; +import com.linglevel.api.content.book.dto.ProgressResponse; import com.linglevel.api.content.book.entity.BookProgress; import com.linglevel.api.content.book.entity.Chapter; import com.linglevel.api.content.book.entity.Chunk; +import com.linglevel.api.content.book.exception.BooksErrorCode; +import com.linglevel.api.content.book.exception.BooksException; import com.linglevel.api.content.book.repository.BookProgressRepository; import 
com.linglevel.api.content.book.repository.ChapterRepository; import com.linglevel.api.content.book.repository.ChunkRepository; -import com.linglevel.api.content.common.service.ProgressCalculationService; +import com.linglevel.api.content.common.DifficultyLevel; import com.linglevel.api.content.common.service.ReadingCompletionService; import com.linglevel.api.streak.service.StreakService; import org.junit.jupiter.api.DisplayName; @@ -23,8 +26,12 @@ import java.util.Optional; import static org.assertj.core.api.Assertions.assertThat; +import static org.junit.jupiter.api.Assertions.assertThrows; import static org.mockito.ArgumentMatchers.any; +import static org.mockito.ArgumentMatchers.eq; +import static org.mockito.Mockito.never; import static org.mockito.Mockito.verify; +import static org.mockito.Mockito.verifyNoInteractions; import static org.mockito.Mockito.when; @ExtendWith(MockitoExtension.class) @@ -48,9 +55,6 @@ class ProgressServiceTest { @Mock private ChunkService chunkService; - @Mock - private ProgressCalculationService progressCalculationService; - @Mock private ReadingCompletionService readingCompletionService; @@ -113,5 +117,257 @@ void updateProgress_shouldLazyMigrate_forOldBookProgress() { // ensureMigrated initializes the list, and the subsequent logic adds the first progress info assertThat(savedProgress.getChapterProgresses()).hasSize(1); assertThat(savedProgress.getChapterProgresses().get(0).getChapterNumber()).isEqualTo(1); + assertThat(savedProgress.getMaxReadChunkNumber()).isEqualTo(chapterFirstPosition(1, 1)); + } + + @Test + @DisplayName("진도 정보가 없으면 문서를 생성하지 않고 0% 진도를 반환한다") + void getProgress_returnsZeroProgressWhenMissing() { + // given + String userId = "user-1"; + String bookId = "book-1"; + + when(bookService.existsById(bookId)).thenReturn(true); + when(bookProgressRepository.findByUserIdAndBookId(userId, bookId)).thenReturn(Optional.empty()); + + // when + ProgressResponse response = progressService.getProgress(bookId, userId); + + // 
then + verify(bookProgressRepository, never()).save(any(BookProgress.class)); + verifyNoInteractions(chapterService, chunkService, chunkRepository); + + assertThat(response.getId()).isNull(); + assertThat(response.getChapterId()).isNull(); + assertThat(response.getChunkId()).isNull(); + assertThat(response.getCurrentReadChapterNumber()).isEqualTo(0); + assertThat(response.getCurrentReadChunkNumber()).isEqualTo(0); + assertThat(response.getMaxReadChapterNumber()).isEqualTo(0); + assertThat(response.getMaxReadChunkNumber()).isEqualTo(0); + assertThat(response.getNormalizedProgress()).isEqualTo(0.0); + assertThat(response.getMaxNormalizedProgress()).isEqualTo(0.0); + assertThat(response.getIsCompleted()).isFalse(); + assertThat(response.getStreakUpdated()).isFalse(); + } + + @Test + @DisplayName("기존 챕터 진행률이 있으면 같은 챕터 항목을 업데이트하고 중복 추가하지 않는다") + void updateProgress_updatesExistingChapterProgressEntry() { + // given + String userId = "user-1"; + String bookId = "book-1"; + String chunkId = "chunk-3"; + String chapterId = "chapter-1"; + + BookProgress progress = new BookProgress(); + progress.setId("progress-1"); + progress.setUserId(userId); + progress.setBookId(bookId); + progress.setChapterProgresses(new ArrayList<>()); + progress.getChapterProgresses().add(BookProgress.ChapterProgressInfo.builder() + .chapterNumber(1) + .progressPercentage(20.0) + .isCompleted(false) + .build()); + + Chunk chunk = new Chunk(); + chunk.setId(chunkId); + chunk.setChapterId(chapterId); + chunk.setChunkNumber(3); + chunk.setDifficultyLevel(DifficultyLevel.A1); + + Chapter chapter = new Chapter(); + chapter.setId(chapterId); + chapter.setBookId(bookId); + chapter.setChapterNumber(1); + + ProgressUpdateRequest request = new ProgressUpdateRequest(); + request.setChunkId(chunkId); + + when(bookService.existsById(bookId)).thenReturn(true); + when(chunkService.findById(chunkId)).thenReturn(chunk); + when(chapterService.findById(chapterId)).thenReturn(chapter); + 
when(bookProgressRepository.findByUserIdAndBookId(userId, bookId)).thenReturn(Optional.of(progress)); + when(chunkRepository.countByChapterIdAndDifficultyLevel(chapterId, DifficultyLevel.A1)).thenReturn(5L); + when(chapterRepository.countByBookId(bookId)).thenReturn(10); + when(readingCompletionService.processReadingCompletion(userId, com.linglevel.api.content.common.ContentType.BOOK, chapterId, null)) + .thenReturn(null); + + // when + ProgressResponse response = progressService.updateProgress(bookId, request, userId); + + // then + verify(bookProgressRepository).save(bookProgressCaptor.capture()); + BookProgress saved = bookProgressCaptor.getValue(); + + assertThat(saved.getChapterProgresses()).hasSize(1); + assertThat(saved.getChapterProgresses().get(0).getChapterNumber()).isEqualTo(1); + assertThat(saved.getChapterProgresses().get(0).getProgressPercentage()).isEqualTo(60.0); + assertThat(saved.getChapterProgresses().get(0).getIsCompleted()).isFalse(); + assertThat(saved.getCurrentReadChapterNumber()).isEqualTo(1); + assertThat(saved.getChunkId()).isEqualTo(chunkId); + assertThat(saved.getMaxReadChunkNumber()).isEqualTo(chapterFirstPosition(1, 3)); + assertThat(response.getCurrentReadChunkNumber()).isEqualTo(3); + assertThat(response.getMaxReadChunkNumber()).isEqualTo(chapterFirstPosition(1, 3)); + assertThat(response.getStreakUpdated()).isFalse(); + } + + @Test + @DisplayName("마지막 남은 챕터를 완료하면 책 전체를 완료 상태로 저장하고 streakUpdated를 반영한다") + void updateProgress_marksBookCompletedWhenLastRemainingChapterFinishes() { + // given + String userId = "user-1"; + String bookId = "book-1"; + String chunkId = "chunk-4"; + String chapterId = "chapter-2"; + + BookProgress progress = new BookProgress(); + progress.setId("progress-1"); + progress.setUserId(userId); + progress.setBookId(bookId); + progress.setIsCompleted(false); + progress.setChapterProgresses(new ArrayList<>()); + progress.getChapterProgresses().add(BookProgress.ChapterProgressInfo.builder() + .chapterNumber(1) + 
.progressPercentage(100.0) + .isCompleted(true) + .build()); + + Chunk chunk = new Chunk(); + chunk.setId(chunkId); + chunk.setChapterId(chapterId); + chunk.setChunkNumber(4); + chunk.setDifficultyLevel(DifficultyLevel.A1); + + Chapter chapter = new Chapter(); + chapter.setId(chapterId); + chapter.setBookId(bookId); + chapter.setChapterNumber(2); + + ProgressUpdateRequest request = new ProgressUpdateRequest(); + request.setChunkId(chunkId); + + when(bookService.existsById(bookId)).thenReturn(true); + when(chunkService.findById(chunkId)).thenReturn(chunk); + when(chapterService.findById(chapterId)).thenReturn(chapter); + when(bookProgressRepository.findByUserIdAndBookId(userId, bookId)).thenReturn(Optional.of(progress)); + when(chunkRepository.countByChapterIdAndDifficultyLevel(chapterId, DifficultyLevel.A1)).thenReturn(4L); + when(chapterRepository.countByBookId(bookId)).thenReturn(2); + when(readingCompletionService.processReadingCompletion(userId, com.linglevel.api.content.common.ContentType.BOOK, chapterId, null)) + .thenReturn(45L); + when(streakService.updateStreak(userId, com.linglevel.api.content.common.ContentType.BOOK, chapterId)) + .thenReturn(true); + + // when + ProgressResponse response = progressService.updateProgress(bookId, request, userId); + + // then + verify(bookProgressRepository).save(bookProgressCaptor.capture()); + BookProgress saved = bookProgressCaptor.getValue(); + + assertThat(saved.getChapterProgresses()).hasSize(2); + assertThat(saved.getChapterProgresses().get(1).getChapterNumber()).isEqualTo(2); + assertThat(saved.getChapterProgresses().get(1).getProgressPercentage()).isEqualTo(100.0); + assertThat(saved.getChapterProgresses().get(1).getIsCompleted()).isTrue(); + assertThat(saved.getIsCompleted()).isTrue(); + assertThat(saved.getCompletedAt()).isNotNull(); + assertThat(saved.getMaxReadChunkNumber()).isEqualTo(chapterFirstPosition(2, 4)); + assertThat(response.getCurrentReadChunkNumber()).isEqualTo(4); + 
assertThat(response.getMaxReadChunkNumber()).isEqualTo(chapterFirstPosition(2, 4)); + assertThat(response.getStreakUpdated()).isTrue(); + } + + @Test + @DisplayName("maxReadChunkNumber는 챕터 우선 순서로 업데이트된다") + void updateProgress_updatesMaxReadChunkNumberByChapterPriority() { + // given + String userId = "user-1"; + String bookId = "book-1"; + String chunkId = "chunk-1"; + String chapterId = "chapter-2"; + + BookProgress progress = new BookProgress(); + progress.setId("progress-1"); + progress.setUserId(userId); + progress.setBookId(bookId); + progress.setMaxReadChunkNumber(chapterFirstPosition(1, 100)); + progress.setChapterProgresses(new ArrayList<>()); + + Chunk chunk = new Chunk(); + chunk.setId(chunkId); + chunk.setChapterId(chapterId); + chunk.setChunkNumber(1); + chunk.setDifficultyLevel(DifficultyLevel.A1); + + Chapter chapter = new Chapter(); + chapter.setId(chapterId); + chapter.setBookId(bookId); + chapter.setChapterNumber(2); + + ProgressUpdateRequest request = new ProgressUpdateRequest(); + request.setChunkId(chunkId); + + when(bookService.existsById(bookId)).thenReturn(true); + when(chunkService.findById(chunkId)).thenReturn(chunk); + when(chapterService.findById(chapterId)).thenReturn(chapter); + when(bookProgressRepository.findByUserIdAndBookId(userId, bookId)).thenReturn(Optional.of(progress)); + when(chunkRepository.countByChapterIdAndDifficultyLevel(chapterId, DifficultyLevel.A1)).thenReturn(10L); + when(chapterRepository.countByBookId(bookId)).thenReturn(5); + when(readingCompletionService.processReadingCompletion(userId, com.linglevel.api.content.common.ContentType.BOOK, chapterId, null)) + .thenReturn(null); + + // when + ProgressResponse response = progressService.updateProgress(bookId, request, userId); + + // then + verify(bookProgressRepository).save(bookProgressCaptor.capture()); + BookProgress saved = bookProgressCaptor.getValue(); + assertThat(saved.getMaxReadChunkNumber()).isEqualTo(chapterFirstPosition(2, 1)); + 
assertThat(response.getMaxReadChunkNumber()).isEqualTo(chapterFirstPosition(2, 1)); + } + + @Test + @DisplayName("deleteProgress는 기존 진도 정보를 삭제한다") + void deleteProgress_deletesExistingProgress() { + // given + String userId = "user-1"; + String bookId = "book-1"; + + BookProgress progress = new BookProgress(); + progress.setId("progress-1"); + + when(bookService.existsById(bookId)).thenReturn(true); + when(bookProgressRepository.findByUserIdAndBookId(userId, bookId)).thenReturn(Optional.of(progress)); + + // when + progressService.deleteProgress(bookId, userId); + + // then + verify(bookProgressRepository).delete(progress); + } + + @Test + @DisplayName("deleteProgress는 진도 정보가 없으면 PROGRESS_NOT_FOUND 예외를 던진다") + void deleteProgress_throwsWhenProgressMissing() { + // given + String userId = "user-1"; + String bookId = "book-1"; + + when(bookService.existsById(bookId)).thenReturn(true); + when(bookProgressRepository.findByUserIdAndBookId(userId, bookId)).thenReturn(Optional.empty()); + + // when + BooksException exception = assertThrows( + BooksException.class, + () -> progressService.deleteProgress(bookId, userId) + ); + + // then + assertThat(exception.getMessage()).isEqualTo(BooksErrorCode.PROGRESS_NOT_FOUND.getMessage()); + verify(bookProgressRepository, never()).delete(any(BookProgress.class)); + } + + private int chapterFirstPosition(int chapterNumber, int chunkNumber) { + return (chapterNumber << 16) | chunkNumber; } } diff --git a/src/test/java/com/linglevel/api/content/feed/service/Formula1EspnThumbnailTest.java b/src/test/java/com/linglevel/api/content/feed/service/Formula1EspnThumbnailTest.java index 548e67af..e9a70ca6 100644 --- a/src/test/java/com/linglevel/api/content/feed/service/Formula1EspnThumbnailTest.java +++ b/src/test/java/com/linglevel/api/content/feed/service/Formula1EspnThumbnailTest.java @@ -9,6 +9,7 @@ import org.jsoup.nodes.Document; import org.junit.jupiter.api.DisplayName; import org.junit.jupiter.api.Test; +import 
org.junit.jupiter.api.condition.DisabledIfEnvironmentVariable; import java.net.URL; import java.util.List; @@ -16,6 +17,11 @@ import static org.junit.jupiter.api.Assertions.*; @DisplayName("Formula1 & ESPN 썸네일 추출 DSL 테스트") +@DisabledIfEnvironmentVariable( + named = "CI", + matches = "true", + disabledReason = "외부 RSS/웹 페이지 의존 통합 테스트는 CI 환경에서 불안정하여 로컬에서만 실행" +) class Formula1EspnThumbnailTest { // 권장 DSL (다양한 사이트에서 작동) diff --git a/src/test/java/com/linglevel/api/content/feed/service/NewFeedSourcesTest.java b/src/test/java/com/linglevel/api/content/feed/service/NewFeedSourcesTest.java index 6647d5fc..546c865b 100644 --- a/src/test/java/com/linglevel/api/content/feed/service/NewFeedSourcesTest.java +++ b/src/test/java/com/linglevel/api/content/feed/service/NewFeedSourcesTest.java @@ -6,6 +6,7 @@ import com.rometools.rome.io.XmlReader; import org.junit.jupiter.api.DisplayName; import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.condition.DisabledIfEnvironmentVariable; import java.lang.reflect.Method; import java.net.URL; @@ -14,6 +15,11 @@ import static org.junit.jupiter.api.Assertions.*; @DisplayName("새로운 RSS Feed 소스 파싱 테스트 (Formula1, ESPN)") +@DisabledIfEnvironmentVariable( + named = "CI", + matches = "true", + disabledReason = "외부 RSS 의존 통합 테스트는 CI 환경에서 불안정하여 로컬에서만 실행" +) class NewFeedSourcesTest { @Test @@ -245,4 +251,4 @@ void testBothNewSourcesComparison() throws Exception { } } } -} \ No newline at end of file +}
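A note on the `chapterFirstPosition` helper these tests share: it packs `(chapterNumber, chunkNumber)` into a single `int` with the chapter number in the high 16 bits, so plain integer comparison reproduces the chapter-first ordering that `maxReadChunkNumber` relies on (any chunk of a later chapter outranks every chunk of an earlier one). A minimal standalone sketch — the helper mirrors the test code verbatim, while the class name and `main` harness are ours for illustration:

```java
public class ChapterPositionDemo {

    // Mirrors the test helper: chapter number in the high 16 bits,
    // chunk number in the low 16 bits. Natural int ordering is then
    // chapter-first, chunk-second.
    static int chapterFirstPosition(int chapterNumber, int chunkNumber) {
        return (chapterNumber << 16) | chunkNumber;
    }

    public static void main(String[] args) {
        // Chunk 100 of chapter 1 vs. chunk 1 of chapter 2:
        // the later chapter always wins, as invariant 3 requires.
        int lateInChapter1 = chapterFirstPosition(1, 100);
        int firstInChapter2 = chapterFirstPosition(2, 1);
        System.out.println(firstInChapter2 > lateInChapter1); // true
    }
}
```

This matches the `updateProgress_updatesMaxReadChunkNumberByChapterPriority` test above, where a stored max of `chapterFirstPosition(1, 100)` is overtaken by `chapterFirstPosition(2, 1)`. The 16-bit split assumes chunk numbers stay below 65536 per chapter; that bound is implicit in the tests, not stated in the source.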