diff --git a/.cursor/agents/robot-coordinator.md b/.cursor/agents/robot-coordinator.md
index 8226e585..09a0f2ea 100644
--- a/.cursor/agents/robot-coordinator.md
+++ b/.cursor/agents/robot-coordinator.md
@@ -15,9 +15,9 @@ You are a **Coordinator** for Java Enterprise Development. Your primary responsi
### Collaboration partners
- **[@robot-java-coder](robot-java-coder.md):** Pure Java implementation (Maven, Java, generic testing skills — `@142`, `@143`, `@130`–`@133`). Use when **Framework identification** yields plain Java, CLI-only, or a stack without a dedicated framework agent here.
-- **[@robot-spring-boot-coder](robot-spring-boot-coder.md):** Spring Boot implementation (controllers, REST, Spring Test slices, Spring Data/JDBC, Flyway migrations, etc. — `@301`, `@302`, `@311`–`@313`, `@321`–`@323`). Use when **Framework identification** yields **Spring Boot** as the application framework.
-- **[@robot-quarkus-coder](robot-quarkus-coder.md):** Quarkus implementation (Jakarta REST resources, CDI, Panache/JDBC, Flyway migrations, Quarkus tests — `@401`, `@402`, `@411`–`@413`, `@421`–`@423`). Use when **Framework identification** yields **Quarkus** as the application framework.
-- **[@robot-micronaut-coder](robot-micronaut-coder.md):** Micronaut implementation (`@Controller`, programmatic JDBC, Micronaut Data, Flyway migrations, `Micronaut.run`, CDI-style beans, Micronaut tests — `@501`, `@502`, `@511`–`@513`, `@521`–`@523`). Use when **Framework identification** yields **Micronaut** as the application framework.
+- **[@robot-spring-boot-coder](robot-spring-boot-coder.md):** Spring Boot implementation (controllers, REST, Spring Test slices, Spring Data/JDBC, Flyway migrations, Kafka messaging, MongoDB — `@301`, `@302`, `@311`–`@315`, `@321`–`@323`). Use when **Framework identification** yields **Spring Boot** as the application framework.
+- **[@robot-quarkus-coder](robot-quarkus-coder.md):** Quarkus implementation (Jakarta REST resources, CDI, Panache/JDBC, Flyway migrations, Kafka messaging, MongoDB, Quarkus tests — `@401`, `@402`, `@411`–`@415`, `@421`–`@423`). Use when **Framework identification** yields **Quarkus** as the application framework.
+- **[@robot-micronaut-coder](robot-micronaut-coder.md):** Micronaut implementation (`@Controller`, programmatic JDBC, Micronaut Data, Flyway migrations, Kafka messaging, MongoDB, `Micronaut.run`, CDI-style beans, Micronaut tests — `@501`, `@502`, `@511`–`@515`, `@521`–`@523`). Use when **Framework identification** yields **Micronaut** as the application framework.
- **Parallel column drives grouping:** The plan's task list table includes a **Parallel** column (or **Agent** if the plan uses that name). Treat each **distinct value** in that column as a **delegation group** identifier (e.g. `A1`, `A2`, `A3-timeout`, `A3-retry`, `A4`).
- **One logical developer per group:** For each distinct **Parallel** value, assign a **separate** instance of the **same** chosen implementation agent (`robot-java-coder`, `robot-spring-boot-coder`, `robot-quarkus-coder`, or `robot-micronaut-coder`) whose scope is **only** the rows for that value. Label every handoff, e.g. `Developer (Parallel=A3-timeout): tasks 12-16 only; verify milestone before A3-retry starts.`
@@ -36,9 +36,9 @@ When you analyze the task, **determine the target framework** from requirements
| Finding | Delegate to |
|---------|-------------|
-| Spring Boot is the chosen or evident stack (starters, Boot parent/BOM, Boot-specific tests or tasks) | [@robot-spring-boot-coder](robot-spring-boot-coder.md) |
-| Quarkus is the chosen or evident stack (quarkus-bom, quarkus-maven-plugin, `@QuarkusTest`, Dev Services, or Quarkus-specific tasks) | [@robot-quarkus-coder](robot-quarkus-coder.md) |
-| Micronaut is the chosen or evident stack (micronaut-parent / micronaut-maven-plugin, `io.micronaut` BOM, `@MicronautTest`, `Micronaut.run`, or Micronaut-specific tasks) | [@robot-micronaut-coder](robot-micronaut-coder.md) |
+| Spring Boot is the chosen or evident stack (starters, Boot parent/BOM, Boot-specific tests, Kafka with `spring-kafka`, or MongoDB with `spring-data-mongodb`) | [@robot-spring-boot-coder](robot-spring-boot-coder.md) |
+| Quarkus is the chosen or evident stack (quarkus-bom, quarkus-maven-plugin, `@QuarkusTest`, Dev Services, SmallRye Reactive Messaging, or Quarkus MongoDB Panache) | [@robot-quarkus-coder](robot-quarkus-coder.md) |
+| Micronaut is the chosen or evident stack (micronaut-parent / micronaut-maven-plugin, `io.micronaut` BOM, `@MicronautTest`, `Micronaut.run`, `micronaut-kafka`, or `micronaut-data-mongodb`) | [@robot-micronaut-coder](robot-micronaut-coder.md) |
| No Spring Boot, Quarkus, or Micronaut; plain Java, other framework not covered by a dedicated agent here, or requirements are framework-neutral | [@robot-java-coder](robot-java-coder.md) |
**If mixed or ambiguous:** Prefer **robot-spring-boot-coder** when **any** authoritative requirement document commits to Spring Boot; prefer **robot-quarkus-coder** when it commits to Quarkus; prefer **robot-micronaut-coder** when it commits to Micronaut; otherwise prefer **robot-java-coder** and state the ambiguity in the handoff so the implementer can align with `pom.xml` / ADRs.
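The framework-identification table above can be pictured as a small routing helper. A hedged, framework-free sketch: the marker strings and agent names follow the table, while `FrameworkRouter` itself is a hypothetical illustration, not part of the agent files.

```java
import java.util.List;
import java.util.Map;

// Hypothetical helper mirroring the "Framework identification" table:
// scan pom.xml text for a marker dependency and pick the matching coder agent.
public class FrameworkRouter {

    // Checked in order; the first marker found wins.
    private static final List<Map.Entry<String, String>> MARKERS = List.of(
            Map.entry("spring-boot-starter", "robot-spring-boot-coder"),
            Map.entry("quarkus-bom", "robot-quarkus-coder"),
            Map.entry("micronaut-parent", "robot-micronaut-coder"));

    public static String route(String pomXml) {
        for (Map.Entry<String, String> marker : MARKERS) {
            if (pomXml.contains(marker.getKey())) {
                return marker.getValue();
            }
        }
        // No framework marker found: fall back to the plain-Java coder.
        return "robot-java-coder";
    }
}
```

A mixed project would match whichever marker appears first, which is exactly why the ambiguity rule asks the coordinator to state the conflict in the handoff.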
diff --git a/.cursor/agents/robot-micronaut-coder.md b/.cursor/agents/robot-micronaut-coder.md
index eeed8001..0bdb6d51 100644
--- a/.cursor/agents/robot-micronaut-coder.md
+++ b/.cursor/agents/robot-micronaut-coder.md
@@ -11,6 +11,8 @@ You are an **Implementation Specialist** for Micronaut projects. You focus on wr
- Implement `@Controller` HTTP endpoints, `@Singleton` application services, and `@Factory` beans following Micronaut conventions.
- Configure Micronaut `application.yml` / `application.properties`, environments, and `@Requires` / `@ConfigurationProperties`.
- Apply **Micronaut Data** (`@MappedEntity`, repositories, `@Query`, transactions) for relational persistence, or **raw JDBC** (`DataSource`, `PreparedStatement`) when `@511-frameworks-micronaut-jdbc` fits better.
+- Integrate Apache Kafka producers and consumers using `@KafkaClient`, `@KafkaListener`, `@KafkaKey`, and `KafkaListenerExceptionHandler`.
+- Integrate MongoDB using Micronaut Data MongoDB (`@MappedEntity`, `@MongoRepository`, `@MongoFindQuery`).
- Write Micronaut tests (`@MicronautTest`, `@MockBean`, `HttpClient`, `TestPropertyProvider` with Testcontainers).
- Ensure secure coding practices for web APIs.
@@ -30,6 +32,8 @@ Apply guidance from these Skills when relevant:
- `@511-frameworks-micronaut-jdbc`: programmatic JDBC (DataSource, SQL, transactions)
- `@512-frameworks-micronaut-data`: Micronaut Data (repositories, entities, generated SQL)
- `@513-frameworks-micronaut-db-migrations-flyway`: Micronaut DB migrations (Flyway)
+- `@514-frameworks-micronaut-kafka`: Kafka messaging (@KafkaClient, @KafkaListener, retries, dead-letter routing)
+- `@515-frameworks-micronaut-mongodb`: MongoDB (@MongoRepository, @MappedEntity, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing strategies
diff --git a/.cursor/agents/robot-quarkus-coder.md b/.cursor/agents/robot-quarkus-coder.md
index 9b8ca2ec..00c74b2c 100644
--- a/.cursor/agents/robot-quarkus-coder.md
+++ b/.cursor/agents/robot-quarkus-coder.md
@@ -10,8 +10,10 @@ You are an **Implementation Specialist** for Quarkus projects. You focus on writ
- Implement Jakarta REST resources, CDI services, and repositories following Quarkus conventions.
- Configure Quarkus extensions, profiles (`%dev`, `%test`, `%prod`), and `application.properties`.
-- Apply Quarkus JDBC or Hibernate ORM Panache for persistence.
-- Write Quarkus tests (`@QuarkusTest`, `@QuarkusIntegrationTest`, REST Assured).
+- Apply Quarkus JDBC or Hibernate ORM Panache for relational persistence.
+- Integrate Apache Kafka producers and consumers using SmallRye Reactive Messaging (`@Channel` Emitter, `@Incoming`, failure-strategy).
+- Integrate MongoDB using Quarkus MongoDB Panache (`PanacheMongoEntity`, `PanacheMongoRepository`).
+- Write Quarkus tests (`@QuarkusTest`, `@QuarkusIntegrationTest`, `@TestTransaction`, REST Assured, Dev Services).
- Ensure secure coding practices for web APIs.
### Coding Standards
@@ -29,6 +31,8 @@ Apply guidance from these Skills when relevant:
- `@411-frameworks-quarkus-jdbc`: Quarkus JDBC
- `@412-frameworks-quarkus-panache`: Quarkus Panache
- `@413-frameworks-quarkus-db-migrations-flyway`: Quarkus DB migrations (Flyway)
+- `@414-frameworks-quarkus-kafka`: Kafka messaging (SmallRye Reactive Messaging, Emitter, @Incoming, failure strategies)
+- `@415-frameworks-quarkus-mongodb`: MongoDB (Panache Mongo entities, repositories, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing Strategies
diff --git a/.cursor/agents/robot-spring-boot-coder.md b/.cursor/agents/robot-spring-boot-coder.md
index cfbb2631..0126e621 100644
--- a/.cursor/agents/robot-spring-boot-coder.md
+++ b/.cursor/agents/robot-spring-boot-coder.md
@@ -10,8 +10,10 @@ You are an **Implementation Specialist** for Spring Boot projects. You focus on
- Implement REST controllers, services, and repositories following Spring Boot conventions.
- Configure Spring Boot auto-configuration, profiles, and `application.yml`.
-- Apply Spring Data JDBC for persistence.
-- Write Spring Test slices (`@WebMvcTest`, `@DataJdbcTest`, `@SpringBootTest`).
+- Apply Spring Data JDBC for relational persistence.
+- Integrate Apache Kafka producers and listeners using `spring-kafka` (typed templates, retries, dead-letter topics).
+- Integrate MongoDB using Spring Data MongoDB (documents, repositories, error handling).
+- Write Spring tests: slices (`@WebMvcTest`, `@DataJdbcTest`, `@DataMongoTest`) plus `@SpringBootTest` and `@EmbeddedKafka`.
- Ensure secure coding practices for web APIs.
### Coding Standards
@@ -29,6 +31,8 @@ Apply guidance from these Skills when relevant:
- `@311-frameworks-spring-jdbc`: Spring JDBC
- `@312-frameworks-spring-data-jdbc`: Spring Data JDBC
- `@313-frameworks-spring-db-migrations-flyway`: Flyway database migrations
+- `@314-frameworks-spring-kafka`: Kafka messaging (producers, listeners, retries, dead-letter topics)
+- `@315-frameworks-spring-mongodb`: MongoDB (document design, repositories, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing strategies
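The Kafka skills added across all three coder agents share one core idea: listener logic must tolerate redelivery from retries and rebalances. A minimal framework-free sketch of an idempotent handler, assuming events carry a unique id (`IdempotentHandler` and its names are illustrative, not from the skill files):

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Consumer;

// Remember processed event ids so redelivered events become no-ops.
// A real service would persist the id set (e.g. a processed-events table).
public class IdempotentHandler {

    private final Set<String> processed = ConcurrentHashMap.newKeySet();
    private final Consumer<String> delegate;

    public IdempotentHandler(Consumer<String> delegate) {
        this.delegate = delegate;
    }

    /** Returns true if the event was processed, false if it was a duplicate. */
    public boolean handle(String eventId, String payload) {
        if (!processed.add(eventId)) {
            return false; // already seen: skip the redelivery
        }
        delegate.accept(payload);
        return true;
    }
}
```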
diff --git a/.github/workflows/examples-build.yaml b/.github/workflows/examples-build.yaml
new file mode 100644
index 00000000..fe276b82
--- /dev/null
+++ b/.github/workflows/examples-build.yaml
@@ -0,0 +1,28 @@
+name: Examples Build
+
+on:
+ push:
+ paths:
+ - "examples/**"
+
+jobs:
+ examples:
+ name: Build Examples
+ runs-on: ubuntu-latest
+ strategy:
+ matrix:
+ example:
+ - { name: "Maven", path: "examples/maven-demo", goal: "verify" }
+ - { name: "Spring Boot Memory Leak", path: "examples/spring-boot-memory-leak-demo", goal: "package" }
+ - { name: "Spring Boot Performance Bottleneck", path: "examples/spring-boot-performance-bottleneck-demo", goal: "package" }
+ - { name: "Spring Boot", path: "examples/spring-boot-demo/implementation", goal: "verify -Pjacoco" }
+ steps:
+ - uses: actions/checkout@v6
+ with:
+        submodules: true # Fetch submodules (use "recursive" for nested submodules)
+ - uses: actions/setup-java@v5
+ with:
+ distribution: "graalvm" # See 'Supported distributions' for available options
+ java-version: "25"
+ - name: Build ${{ matrix.example.name }}
+ run: cd ${{ matrix.example.path }} && ./mvnw --batch-mode --no-transfer-progress ${{ matrix.example.goal }} --file pom.xml
diff --git a/.github/workflows/maven.yaml b/.github/workflows/maven.yaml
index 671357c5..a4ba9c88 100644
--- a/.github/workflows/maven.yaml
+++ b/.github/workflows/maven.yaml
@@ -52,27 +52,6 @@ jobs:
git log -1 --pretty=%B > /tmp/commit-msg.txt
pre-commit run conventional-pre-commit --hook-stage commit-msg --commit-msg-filename /tmp/commit-msg.txt
- examples:
- name: Build Examples
- runs-on: ubuntu-latest
- strategy:
- matrix:
- example:
- - { name: "Maven", path: "examples/maven-demo", goal: "verify" }
- - { name: "Spring Boot Memory Leak", path: "examples/spring-boot-memory-leak-demo", goal: "package" }
- - { name: "Spring Boot Performance Bottleneck", path: "examples/spring-boot-performance-bottleneck-demo", goal: "package" }
- - { name: "Spring Boot", path: "examples/spring-boot-demo/implementation", goal: "verify -Pjacoco" }
- steps:
- - uses: actions/checkout@v6
- with:
- submodules: true # Fetches all submodules
- - uses: actions/setup-java@v5
- with:
- distribution: 'graalvm' # See 'Supported distributions' for available options
- java-version: '25'
- - name: Build ${{ matrix.example.name }}
- run: cd ${{ matrix.example.path }} && ./mvnw --batch-mode --no-transfer-progress ${{ matrix.example.goal }} --file pom.xml
-
package-agent-artifacts:
name: Package Agents and Skills
runs-on: ubuntu-latest
diff --git a/skills-generator/src/main/resources/skill-indexes/314-skill.xml b/skills-generator/src/main/resources/skill-indexes/314-skill.xml
new file mode 100644
index 00000000..946211ef
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-indexes/314-skill.xml
@@ -0,0 +1,41 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Use when you need to design or implement Kafka messaging in Spring Boot — including topic design, producer/consumer implementation with Spring for Apache Kafka, retries and dead-letter topics, idempotency, and error handling. This should trigger for requests such as Add Kafka in Spring Boot; Review Spring Kafka consumers; Improve retries and DLT in Spring Kafka.
+
+
+ Spring Boot — Kafka messaging
+
+
+
+ Compile before messaging refactors; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` or `mvn compile` before applying any change
+ **SAFETY**: If compilation fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` or `mvn clean verify` after applying improvements
+ **BEFORE APPLYING**: Read the reference for detailed rules and examples
+
+
+
+
+
+ Add Kafka in Spring Boot
+ Review Spring Kafka consumers/producers
+ Improve retries, dead-letter topics, or idempotency in Spring Kafka
+
+
+
+
+ Read reference and assess project context: Read `references/314-frameworks-spring-kafka.md` and inspect current messaging setup before proposing changes.
+ Gather scope and decide target improvements: Identify reliability and throughput goals and define the minimum safe set of changes.
+ Apply framework-aligned changes: Implement/refactor Spring Kafka configuration, producer/consumer logic, and failure handling.
+ Run verification and report results: Execute build/tests and summarize what changed, what was verified, and follow-up actions.
+
+
diff --git a/skills-generator/src/main/resources/skill-indexes/315-skill.xml b/skills-generator/src/main/resources/skill-indexes/315-skill.xml
new file mode 100644
index 00000000..f0e6bfcc
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-indexes/315-skill.xml
@@ -0,0 +1,41 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Use when you need to design or implement MongoDB data access in Spring Boot — including document modeling, Spring Data Mongo repositories/templates, indexing, optimistic concurrency, and error handling. This should trigger for requests such as Add MongoDB in Spring Boot; Review Spring Data Mongo design; Improve error handling for Mongo writes.
+
+
+ Spring Boot — MongoDB
+
+
+
+ Compile before MongoDB refactors; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` or `mvn compile` before applying any change
+ **SAFETY**: If compilation fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` or `mvn clean verify` after applying improvements
+ **BEFORE APPLYING**: Read the reference for detailed rules and examples
+
+
+
+
+
+ Add MongoDB in Spring Boot
+ Review Spring Data Mongo repositories/documents
+ Improve duplicate key handling, retries, or optimistic locking in Mongo flows
+
+
+
+
+ Read reference and assess project context: Read `references/315-frameworks-spring-mongodb.md` and inspect persistence setup before proposing changes.
+ Gather scope and decide target improvements: Identify data model, consistency, and query requirements to define safe improvements.
+ Apply framework-aligned changes: Implement/refactor mappings, repositories, indexes, and failure handling policies.
+ Run verification and report results: Execute build/tests and summarize what changed, what was verified, and follow-up actions.
+
+
diff --git a/skills-generator/src/main/resources/skill-indexes/414-skill.xml b/skills-generator/src/main/resources/skill-indexes/414-skill.xml
new file mode 100644
index 00000000..64600576
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-indexes/414-skill.xml
@@ -0,0 +1,41 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Use when you need Kafka messaging in Quarkus with SmallRye Reactive Messaging — including channel/topic design, serialization, ack/failure strategies, retries/DLQ, and error handling. This should trigger for requests such as Add Kafka in Quarkus; Review Reactive Messaging consumers; Improve failure handling for Quarkus Kafka.
+
+
+ Quarkus — Kafka messaging
+
+
+
+ Compile before messaging refactors; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` or `mvn compile` before applying any change
+ **SAFETY**: If compilation fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` or `mvn clean verify` after applying improvements
+ **BEFORE APPLYING**: Read the reference for detailed rules and examples
+
+
+
+
+
+ Add Kafka in Quarkus
+ Review Quarkus Reactive Messaging consumers/producers
+ Improve retries, dead-letter handling, or idempotency in Quarkus Kafka
+
+
+
+
+ Read reference and assess project context: Read `references/414-frameworks-quarkus-kafka.md` and inspect current messaging setup before proposing changes.
+ Gather scope and decide target improvements: Identify delivery semantics and resilience goals to define safe improvements.
+ Apply framework-aligned changes: Implement/refactor channels, serializers, and failure strategies in Reactive Messaging.
+ Run verification and report results: Execute build/tests and summarize what changed, what was verified, and follow-up actions.
+
+
diff --git a/skills-generator/src/main/resources/skill-indexes/415-skill.xml b/skills-generator/src/main/resources/skill-indexes/415-skill.xml
new file mode 100644
index 00000000..4d9b7f87
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-indexes/415-skill.xml
@@ -0,0 +1,41 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Use when you need MongoDB persistence in Quarkus — including Panache Mongo entities/repositories, document design, indexes, transactions where applicable, and error handling. This should trigger for requests such as Add MongoDB in Quarkus; Review Quarkus Mongo Panache design; Improve Mongo error handling in Quarkus services.
+
+
+ Quarkus — MongoDB
+
+
+
+ Compile before MongoDB refactors; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` or `mvn compile` before applying any change
+ **SAFETY**: If compilation fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` or `mvn clean verify` after applying improvements
+ **BEFORE APPLYING**: Read the reference for detailed rules and examples
+
+
+
+
+
+ Add MongoDB in Quarkus
+ Review Quarkus Mongo Panache entities/repositories
+ Improve duplicate key handling, retry policy, or optimistic locking in Quarkus Mongo
+
+
+
+
+ Read reference and assess project context: Read `references/415-frameworks-quarkus-mongodb.md` and inspect persistence setup before proposing changes.
+ Gather scope and decide target improvements: Identify model/query consistency needs and define safe improvements.
+ Apply framework-aligned changes: Implement/refactor Panache Mongo mappings, repository access, and failure handling.
+ Run verification and report results: Execute build/tests and summarize what changed, what was verified, and follow-up actions.
+
+
diff --git a/skills-generator/src/main/resources/skill-indexes/514-skill.xml b/skills-generator/src/main/resources/skill-indexes/514-skill.xml
new file mode 100644
index 00000000..b6ff6954
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-indexes/514-skill.xml
@@ -0,0 +1,41 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Use when you need Kafka messaging in Micronaut — including @KafkaClient and @KafkaListener design, topic/partition strategy, serialization, retries and dead-letter processing, and error handling. This should trigger for requests such as Add Kafka in Micronaut; Review Micronaut Kafka listeners; Improve retry and failure handling for Micronaut Kafka.
+
+
+ Micronaut — Kafka messaging
+
+
+
+ Compile before messaging refactors; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` or `mvn compile` before applying any change
+ **SAFETY**: If compilation fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` or `mvn clean verify` after applying improvements
+ **BEFORE APPLYING**: Read the reference for detailed rules and examples
+
+
+
+
+
+ Add Kafka in Micronaut
+ Review Micronaut Kafka consumers/producers
+ Improve retries, dead-letter handling, or idempotency in Micronaut Kafka
+
+
+
+
+ Read reference and assess project context: Read `references/514-frameworks-micronaut-kafka.md` and inspect current messaging setup before proposing changes.
+ Gather scope and decide target improvements: Identify delivery guarantees and resilience requirements to define safe improvements.
+ Apply framework-aligned changes: Implement/refactor clients, listeners, and failure strategies in Micronaut Kafka.
+ Run verification and report results: Execute build/tests and summarize what changed, what was verified, and follow-up actions.
+
+
diff --git a/skills-generator/src/main/resources/skill-indexes/515-skill.xml b/skills-generator/src/main/resources/skill-indexes/515-skill.xml
new file mode 100644
index 00000000..74eeb420
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-indexes/515-skill.xml
@@ -0,0 +1,41 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Use when you need MongoDB persistence in Micronaut — including @MongoRepository design, document modeling, indexes, query patterns, and error handling. This should trigger for requests such as Add MongoDB in Micronaut; Review Micronaut Data Mongo design; Improve error handling for Micronaut Mongo operations.
+
+
+ Micronaut — MongoDB
+
+
+
+ Compile before MongoDB refactors; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` or `mvn compile` before applying any change
+ **SAFETY**: If compilation fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` or `mvn clean verify` after applying improvements
+ **BEFORE APPLYING**: Read the reference for detailed rules and examples
+
+
+
+
+
+ Add MongoDB in Micronaut
+ Review Micronaut Mongo entities/repositories
+ Improve duplicate key handling, retries, or optimistic locking in Micronaut Mongo
+
+
+
+
+ Read reference and assess project context: Read `references/515-frameworks-micronaut-mongodb.md` and inspect persistence setup before proposing changes.
+ Gather scope and decide target improvements: Identify model and consistency requirements and define safe improvements.
+ Apply framework-aligned changes: Implement/refactor documents, repositories, indexes, and error handling.
+ Run verification and report results: Execute build/tests and summarize what changed, what was verified, and follow-up actions.
+
+
diff --git a/skills-generator/src/main/resources/skill-references/314-frameworks-spring-kafka.xml b/skills-generator/src/main/resources/skill-references/314-frameworks-spring-kafka.xml
new file mode 100644
index 00000000..ab3f58d1
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-references/314-frameworks-spring-kafka.xml
@@ -0,0 +1,242 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Spring Boot — Kafka messaging
+ Use when you need Kafka with Spring Boot (`spring-kafka`) and want examples for design, implementation, and error handling with retries, dead-letter topics, and idempotent consumers.
+
+ You are a Senior software engineer with extensive experience in Spring Boot and Apache Kafka
+
+ Design resilient Kafka-based flows in Spring Boot with clear topic contracts, robust producer/consumer implementations, and predictable failure handling.
+
+
+ Compile first; stop on failure; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` before applying changes
+ **SAFETY**: If compile fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` after changes
+
+
+
+
+
+
+ Design
+ Topic versioning and key strategy
+
+
+ `.
+ ]]>
+
+
+
+
+
+
+ Map<String, Object> createEvent() {
+ return Map.of("type", "order", "payload", "...");
+ }
+}]]>
+
+
+
+
+
+ Implementation
+ Producer + listener with manual service boundary
+
+
+
+
+
+
+ private final KafkaTemplate<String, OrderCreatedEvent> template;
+
+ OrderEventPublisher(KafkaTemplate<String, OrderCreatedEvent> template) {
+ this.template = template;
+ }
+
+ void publish(OrderCreatedEvent event) {
+ template.send("orders.events.v1", event.orderId(), event);
+ }
+}
+
+@Component
+class OrderEventListener {
+ @KafkaListener(topics = "orders.events.v1", groupId = "billing-service")
+ void onEvent(OrderCreatedEvent event) {
+ // Delegate to application service; keep listener thin
+ // billingService.processOrder(event);
+ }
+}]]>
+
+
+
+
+
+
+
+
+ Error handling
+ Retries + dead-letter topic + idempotency
+
+
+
+
+
+
+ @Bean
+ DefaultErrorHandler errorHandler(KafkaTemplate<Object, Object> template) {
+ var recoverer = new DeadLetterPublishingRecoverer(template);
+ // Retry 3 times with 1 second backoff before sending to DLT
+ return new DefaultErrorHandler(recoverer, new FixedBackOff(1000L, 3));
+ }
+}]]>
+
+
+
+
+
+
+
+
+ Testing
+ Integration tests with Testcontainers Kafka
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ **ANALYZE** messaging code: producer/consumer implementations, topic naming, serialization, and error handling strategies
+ **CATEGORIZE** issues by impact (RELIABILITY, MAINTAINABILITY, PERFORMANCE)
+ **APPLY** Spring Kafka-aligned fixes: configure proper error handlers, use typed payloads, ensure idempotency
+ **IMPLEMENT** changes consistently across producer and consumer configurations
+ **EXPLAIN** trade-offs (e.g., at-least-once vs exactly-once delivery, retry backoff strategies)
+ **TEST** messaging behavior with Testcontainers Kafka integration tests (avoid relying only on `@EmbeddedKafka`)
+ **VALIDATE** with `./mvnw compile` before and `./mvnw clean verify` after changes
+
+
+
+
+ **BLOCKING SAFETY CHECK**: Run `./mvnw compile` before ANY refactoring
+ **POISON PILLS**: Always configure a Dead Letter Topic (DLT) or explicit error handler to prevent blocking the partition
+ **IDEMPOTENCY**: Ensure consumer logic is idempotent to handle retries and rebalances safely
+ **TRANSACTIONS**: If using Kafka transactions, ensure `transactional.id` is configured and align with database transactions if needed
+
+
+
\ No newline at end of file
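The `DefaultErrorHandler` plus `FixedBackOff(1000L, 3)` configuration above can be pictured without Spring: attempt the handler a bounded number of times, then route the record to a dead-letter sink. A hypothetical sketch (`RetryingDispatcher` is illustrative; real code would also sleep for the backoff interval between attempts):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Framework-free model of retry-then-dead-letter handling.
public class RetryingDispatcher {

    private final int maxAttempts;
    private final Consumer<String> handler;
    private final List<String> deadLetters = new ArrayList<>();

    public RetryingDispatcher(int maxAttempts, Consumer<String> handler) {
        this.maxAttempts = maxAttempts;
        this.handler = handler;
    }

    public void dispatch(String record) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                handler.accept(record);
                return; // success: record consumed
            } catch (RuntimeException e) {
                // a real dispatcher would wait for the backoff interval here
            }
        }
        deadLetters.add(record); // retries exhausted: route to the DLT
    }

    public List<String> deadLetters() {
        return deadLetters;
    }
}
```

This mirrors why the safety rules insist on a DLT: without the final sink, a poison pill would be retried forever and block the partition.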
diff --git a/skills-generator/src/main/resources/skill-references/315-frameworks-spring-mongodb.xml b/skills-generator/src/main/resources/skill-references/315-frameworks-spring-mongodb.xml
new file mode 100644
index 00000000..244efa92
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-references/315-frameworks-spring-mongodb.xml
@@ -0,0 +1,245 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Spring Boot — MongoDB
+ Use when you need MongoDB with Spring Data MongoDB and want examples for design, implementation, and error handling for robust document persistence.
+
+ You are a Senior software engineer with extensive experience in Spring Boot and MongoDB
+
+ Design clear document models, implement robust Spring Data Mongo repositories/services, and handle Mongo failures explicitly.
+
+
+ Compile first; stop on failure; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` before applying changes
+ **SAFETY**: If compile fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` after changes
+
+
+
+
+
+
+ Design
+ Document boundaries and indexes
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Implementation
+ Repository + service composition
+
+
+
+
+
+
+interface OrderRepository extends MongoRepository<OrderDocument, String> {
+ Optional<OrderDocument> findByOrderNumber(String orderNumber);
+}
+
+@Service
+class OrderService {
+ private final OrderRepository repository;
+
+ OrderService(OrderRepository repository) {
+ this.repository = repository;
+ }
+
+ OrderDocument create(OrderDocument doc) {
+ return repository.save(doc);
+ }
+}]]>
+
+
+
+
+
+
+
+
+ Error handling
+ Duplicate key and optimistic locking
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Testing
+ Repository tests with MongoDB Testcontainers
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ **ANALYZE** MongoDB code: document design, repository interfaces, indexing strategies, and error handling
+ **CATEGORIZE** issues by impact (CORRECTNESS, PERFORMANCE, DATA INTEGRITY)
+ **APPLY** Spring Data MongoDB-aligned fixes: use `@Document`, `@Id`, `@Version`, and proper index annotations
+ **IMPLEMENT** changes so schema and queries stay consistent
+ **EXPLAIN** trade-offs (e.g., embedding vs referencing documents, index overhead)
+ **TEST** repository behavior with `@DataMongoTest` and Testcontainers
+ **VALIDATE** with `./mvnw compile` before and `./mvnw clean verify` after changes
+
+
+
+
+ **BLOCKING SAFETY CHECK**: Run `./mvnw compile` before ANY refactoring
+ **INDEXING**: Ensure indexes are created (either via Spring Data auto-index creation in dev, or explicitly via scripts in prod)
+ **CONCURRENCY**: Use `@Version` to prevent lost updates in concurrent environments
+ **TRANSACTIONS**: MongoDB supports multi-document transactions; ensure replica sets are used if `@Transactional` is applied
+
+
+
\ No newline at end of file
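The `@Version` guidance above implies a retry loop around writes: when a concurrent update bumps the version, re-read and try again a bounded number of times. A framework-free sketch with a simulated conflict exception standing in for Spring's optimistic-locking failure (names are illustrative):

```java
import java.util.function.Supplier;

// Bounded retry for optimistic-locking conflicts.
public class OptimisticRetry {

    // Stand-in for the framework's optimistic-locking failure exception.
    public static class ConflictException extends RuntimeException {}

    /** Runs the write, retrying on version conflicts; maxAttempts must be >= 1. */
    public static <T> T withRetry(int maxAttempts, Supplier<T> write) {
        ConflictException last = null;
        for (int i = 0; i < maxAttempts; i++) {
            try {
                return write.get(); // the supplier re-reads current state and saves
            } catch (ConflictException e) {
                last = e; // version clashed with a concurrent writer: retry
            }
        }
        throw last; // still conflicting after maxAttempts: surface the failure
    }
}
```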
diff --git a/skills-generator/src/main/resources/skill-references/414-frameworks-quarkus-kafka.xml b/skills-generator/src/main/resources/skill-references/414-frameworks-quarkus-kafka.xml
new file mode 100644
index 00000000..cfd59b53
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-references/414-frameworks-quarkus-kafka.xml
@@ -0,0 +1,215 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Quarkus — Kafka messaging
+ Use when you need Kafka in Quarkus with SmallRye Reactive Messaging and want examples for design, implementation, and error handling.
+
+ You are a Senior software engineer with extensive experience in Quarkus and Kafka
+
+ Build resilient Quarkus Kafka messaging with explicit contracts, clear channel wiring, and controlled failure behavior using SmallRye Reactive Messaging.
+
+
+ Compile first; stop on failure; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` before applying changes
+ **SAFETY**: If compile fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` after changes
+
+
+
+
+
+
+ Design
+ Channel naming and event contract stability
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Implementation
+ Reactive Messaging producer and consumer
+
+
+
+
+
+
+ @Channel("orders-out") // channel name illustrative
+ Emitter<OrderCreatedEvent> emitter;
+
+ void publish(OrderCreatedEvent event) {
+ emitter.send(event);
+ }
+}
+
+@ApplicationScoped
+class OrderConsumer {
+ @Incoming("orders-in")
+ public Uni<Void> onMessage(OrderCreatedEvent event) {
+ // Delegate to app service, returning a Uni for reactive processing
+ return Uni.createFrom().voidItem();
+ }
+}]]>
+
+
+
+
+
+
+
+
+ Error handling
+ Nack strategy and dead-letter queue
+
+
+
+
+
+
+
+
+
+ public CompletionStage<Void> onMessage(Message<String> message) {
+ try {
+ // process(message.getPayload());
+ return message.ack();
+ } catch (Exception e) {
+ return message.ack(); // Bad: acknowledges failed records, losing data
+ }
+ }
+}]]>
+
+
+
+
+
+ Testing
+ @QuarkusTest with Kafka Testcontainers via test resource
+
+
+
+
+
+
+public class KafkaTestResource implements QuarkusTestResourceLifecycleManager {
+
+    private KafkaContainer kafka;
+
+    @Override
+    public Map<String, String> start() {
+        kafka = new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.6.1"));
+        kafka.start();
+        return Map.of("kafka.bootstrap.servers", kafka.getBootstrapServers());
+    }
+
+    @Override
+    public void stop() {
+        if (kafka != null) kafka.stop();
+    }
+}
+
+@QuarkusTest
+@QuarkusTestResource(KafkaTestResource.class)
+class OrderKafkaIT {
+ @Test
+ void shouldProcessIncomingMessage() {
+ // publish event and assert side effects
+ }
+}]]>
+
+
+
+
+
+
+
+
+
+ **ANALYZE** messaging code: `@Incoming`/`@Outgoing` usage, channel configuration, and failure strategies
+ **CATEGORIZE** issues by impact (RELIABILITY, MAINTAINABILITY, PERFORMANCE)
+ **APPLY** Quarkus Kafka-aligned fixes: configure DLQs, use typed payloads, ensure non-blocking processing
+ **IMPLEMENT** changes consistently across `application.properties` and Java code
+ **EXPLAIN** trade-offs (e.g., DLQ vs ignore, throttled vs latest commit strategies)
+ **TEST** messaging behavior with `@QuarkusTest` and Testcontainers Kafka (optionally through Dev Services where appropriate)
+ **VALIDATE** with `./mvnw compile` before and `./mvnw clean verify` after changes
+
+
+
+
+ **BLOCKING SAFETY CHECK**: Run `./mvnw compile` before ANY refactoring
+ **POISON PILLS**: Use `failure-strategy=dead-letter-queue` to prevent poison pills from blocking the consumer
+ **BLOCKING CODE**: Do not block the event loop in `@Incoming` methods; use `@Blocking` if synchronous processing is required
+ **ACKNOWLEDGEMENT**: Let the framework handle ACKs automatically when returning `Uni`/`CompletionStage`, or handle them explicitly but correctly on failure
+
+
+
\ No newline at end of file
diff --git a/skills-generator/src/main/resources/skill-references/415-frameworks-quarkus-mongodb.xml b/skills-generator/src/main/resources/skill-references/415-frameworks-quarkus-mongodb.xml
new file mode 100644
index 00000000..ef92e1cd
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-references/415-frameworks-quarkus-mongodb.xml
@@ -0,0 +1,243 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Quarkus — MongoDB
+ Use when you need MongoDB in Quarkus with Mongo Panache and want examples for design, implementation, and error handling.
+
+ You are a Senior software engineer with extensive experience in Quarkus and MongoDB
+
+ Apply Quarkus MongoDB patterns with clean document modeling, maintainable repositories/services, and robust failure handling using Panache.
+
+
+ Compile first; stop on failure; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` before applying changes
+ **SAFETY**: If compile fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` after changes
+
+
+
+
+
+
+ Design
+ Document schema and indexes
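The document examples assume a configured client and database; a minimal `application.properties` sketch with placeholder values:

```properties
# Placeholder connection settings for the examples below
quarkus.mongodb.connection-string=mongodb://localhost:27017
quarkus.mongodb.database=orders
```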
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Implementation
+ Panache repository with explicit query methods
+
+
+
+
+
+
+@ApplicationScoped
+public class OrderRepository implements PanacheMongoRepository<OrderDocument> {
+
+    public Optional<OrderDocument> findByOrderNumber(String orderNumber) {
+        return find("orderNumber", orderNumber).firstResultOptional();
+    }
+}]]>
+
+
+
+
+
+
+
+
+ Error handling
+ Duplicate keys and transient failures
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Testing
+ @QuarkusTest with MongoDB Testcontainers via test resource
+
+
+
+
+
+
+public class MongoTestResource implements QuarkusTestResourceLifecycleManager {
+
+    private MongoDBContainer mongo;
+
+    @Override
+    public Map<String, String> start() {
+        mongo = new MongoDBContainer(DockerImageName.parse("mongo:7.0"));
+        mongo.start();
+        return Map.of("quarkus.mongodb.connection-string", mongo.getReplicaSetUrl());
+    }
+
+    @Override
+    public void stop() {
+        if (mongo != null) mongo.stop();
+    }
+}
+
+@QuarkusTest
+@QuarkusTestResource(MongoTestResource.class)
+class OrderMongoIT {
+ @Test
+ void shouldPersistAndReadDocument() {
+ // verify repository round trip
+ }
+}]]>
+
+
+
+
+
+
+
+
+
+ **ANALYZE** MongoDB code: document design, repository interfaces, indexing strategies, and error handling
+ **CATEGORIZE** issues by impact (CORRECTNESS, PERFORMANCE, DATA INTEGRITY)
+ **APPLY** Quarkus Mongo Panache-aligned fixes: use `@MongoEntity`, `PanacheMongoRepository`, and proper exception handling
+ **IMPLEMENT** changes so schema and queries stay consistent
+ **EXPLAIN** trade-offs (e.g., active record vs repository pattern, embedding vs referencing)
+ **TEST** repository behavior with `@QuarkusTest` and MongoDB Testcontainers (optionally through Dev Services where appropriate)
+ **VALIDATE** with `./mvnw compile` before and `./mvnw clean verify` after changes
+
+
+
+
+ **BLOCKING SAFETY CHECK**: Run `./mvnw compile` before ANY refactoring
+ **INDEXING**: Ensure indexes are created explicitly, as Quarkus does not auto-create them by default in production
+ **INJECTION**: Avoid building raw JSON queries using string concatenation; use Panache's parameterized `find()` methods
+
+
+
\ No newline at end of file
diff --git a/skills-generator/src/main/resources/skill-references/514-frameworks-micronaut-kafka.xml b/skills-generator/src/main/resources/skill-references/514-frameworks-micronaut-kafka.xml
new file mode 100644
index 00000000..d8c05eca
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-references/514-frameworks-micronaut-kafka.xml
@@ -0,0 +1,218 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Micronaut — Kafka messaging
+ Use when you need Kafka in Micronaut and want examples for design, implementation, and error handling with @KafkaClient and @KafkaListener.
+
+ You are a Senior software engineer with extensive experience in Micronaut and Kafka
+
+ Design and implement reliable Micronaut Kafka flows with explicit contracts and resilient consumer error handling.
+
+
+ Compile first; stop on failure; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` before applying changes
+ **SAFETY**: If compile fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` after changes
+
+
+
+
+
+
+ Design
+ Topic and consumer-group strategy
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Implementation
+ @KafkaClient and @KafkaListener
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Error handling
+ Retry with backoff and dead-letter strategy
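The delay schedule behind a retry strategy is plain arithmetic; an illustrative, framework-free sketch of capped exponential backoff (not a Micronaut API — the method name and defaults are assumptions), useful when reasoning about how many attempts fit inside a consumer's processing window:

```java
public class BackoffDemo {

    // Illustrative only: capped exponential backoff, not part of micronaut-kafka
    static long backoffMillis(int attempt, long baseMillis, long maxMillis) {
        // attempt 0 -> base, attempt 1 -> 2 * base, ... capped at maxMillis
        long delay = baseMillis * (1L << Math.min(attempt, 20));
        return Math.min(delay, maxMillis);
    }

    public static void main(String[] args) {
        System.out.println(backoffMillis(0, 100, 5_000)); // 100
        System.out.println(backoffMillis(3, 100, 5_000)); // 800
        System.out.println(backoffMillis(10, 100, 5_000)); // capped at 5000
    }
}
```

Records that still fail after the last attempt should be routed to a dead-letter topic rather than retried forever.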
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Testing
+ @MicronautTest with Kafka Testcontainers
+
+
+
+
+
+
+@MicronautTest
+class OrderKafkaTest implements TestPropertyProvider {
+
+    static final KafkaContainer kafka =
+            new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.6.1"));
+
+    @Override
+    public Map<String, String> getProperties() {
+        kafka.start();
+        return Map.of("kafka.bootstrap.servers", kafka.getBootstrapServers());
+    }
+
+    @Test
+    void shouldConsumeEvent() {
+        // publish and verify processing
+    }
+}]]>
+
+
+
+
+
+
+
+
+
+ **ANALYZE** messaging code: `@KafkaClient`/`@KafkaListener` usage, topic naming, serialization, and error handling strategies
+ **CATEGORIZE** issues by impact (RELIABILITY, MAINTAINABILITY, PERFORMANCE)
+ **APPLY** Micronaut Kafka-aligned fixes: configure proper error strategies, use typed payloads, ensure idempotency
+ **IMPLEMENT** changes consistently across `application.yml` and Java code
+ **EXPLAIN** trade-offs (e.g., retry vs DLQ, consumer group isolation)
+ **TEST** messaging behavior with `@MicronautTest` and Testcontainers
+ **VALIDATE** with `./mvnw compile` before and `./mvnw clean verify` after changes
+
+
+
+
+ **BLOCKING SAFETY CHECK**: Run `./mvnw compile` before ANY refactoring
+ **POISON PILLS**: Use `RETRY_ON_ERROR` or explicit exception handlers to prevent poison pills from silently being acknowledged
+ **CONSUMER GROUPS**: Always specify a `groupId` in `@KafkaListener` to prevent random group generation and message loss on restart
+
+
+
\ No newline at end of file
diff --git a/skills-generator/src/main/resources/skill-references/515-frameworks-micronaut-mongodb.xml b/skills-generator/src/main/resources/skill-references/515-frameworks-micronaut-mongodb.xml
new file mode 100644
index 00000000..2b9b40c2
--- /dev/null
+++ b/skills-generator/src/main/resources/skill-references/515-frameworks-micronaut-mongodb.xml
@@ -0,0 +1,244 @@
+
+
+
+ Juan Antonio Breña Moral
+ 0.15.0-SNAPSHOT
+ Apache-2.0
+ Micronaut — MongoDB
+ Use when you need MongoDB in Micronaut with Micronaut Data and want examples for design, implementation, and error handling.
+
+ You are a Senior software engineer with extensive experience in Micronaut and MongoDB
+
+ Apply Micronaut MongoDB best practices with explicit document design, repository implementation, and safe handling of persistence failures.
+
+
+ Compile first; stop on failure; verify after changes.
+
+ **MANDATORY**: Run `./mvnw compile` before applying changes
+ **SAFETY**: If compile fails, stop immediately
+ **VERIFY**: Run `./mvnw clean verify` after changes
+
+
+
+
+
+
+ Design
+ Document shape and unique keys
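The repository examples assume a configured connection; a minimal `application.yml` sketch with a placeholder URI:

```yaml
# Placeholder URI; unique keys still need an explicit index on the collection
mongodb:
  uri: mongodb://localhost:27017/orders
```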
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Implementation
+ @MongoRepository with typed query methods
+
+
+
+
+
+
+@MongoRepository
+interface OrderRepository extends CrudRepository<OrderDocument, ObjectId> { // id type assumed to be ObjectId
+    Optional<OrderDocument> findByOrderNumber(String orderNumber);
+}
+
+@Singleton
+class OrderService {
+
+    private final OrderRepository repository;
+
+    OrderService(OrderRepository repository) {
+        this.repository = repository;
+    }
+
+    OrderDocument create(OrderDocument doc) {
+        return repository.save(doc);
+    }
+}]]>
+
+
+
+
+
+
+
+
+ Error handling
+ Duplicate key and transient timeout handling
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Testing
+ @MicronautTest with MongoDB Testcontainers
+
+
+
+
+
+
+@MicronautTest
+class OrderMongoTest implements TestPropertyProvider {
+
+    static final MongoDBContainer mongo = new MongoDBContainer(DockerImageName.parse("mongo:7.0"));
+
+    @Override
+    public Map<String, String> getProperties() {
+        mongo.start();
+        return Map.of("mongodb.uri", mongo.getReplicaSetUrl());
+    }
+
+    @Test
+    void shouldPersistDocument() {
+        // verify repository persistence/query behavior
+    }
+}]]>
+
+
+
+
+
+
+
+
+
+ **ANALYZE** MongoDB code: document design, repository interfaces, indexing strategies, and error handling
+ **CATEGORIZE** issues by impact (CORRECTNESS, PERFORMANCE, DATA INTEGRITY)
+ **APPLY** Micronaut Data MongoDB-aligned fixes: use `@MappedEntity`, `@MongoRepository`, and proper exception handling
+ **IMPLEMENT** changes so schema and queries stay consistent
+ **EXPLAIN** trade-offs (e.g., embedding vs referencing, index overhead)
+ **TEST** repository behavior with `@MicronautTest` and Testcontainers
+ **VALIDATE** with `./mvnw compile` before and `./mvnw clean verify` after changes
+
+
+
+
+ **BLOCKING SAFETY CHECK**: Run `./mvnw compile` before ANY refactoring
+ **INDEXING**: Ensure indexes are created explicitly, as Micronaut Data may not auto-create them in production
+ **INJECTION**: Avoid building raw JSON queries using string concatenation; use derived queries or parameterized `@Query`
+
+
+
\ No newline at end of file
diff --git a/skills-generator/src/main/resources/skill-references/assets/agents/robot-coordinator.md b/skills-generator/src/main/resources/skill-references/assets/agents/robot-coordinator.md
index 8226e585..09a0f2ea 100644
--- a/skills-generator/src/main/resources/skill-references/assets/agents/robot-coordinator.md
+++ b/skills-generator/src/main/resources/skill-references/assets/agents/robot-coordinator.md
@@ -15,9 +15,9 @@ You are a **Coordinator** for Java Enterprise Development. Your primary responsi
### Collaboration partners
- **[@robot-java-coder](robot-java-coder.md):** Pure Java implementation (Maven, Java, generic testing skills — `@142`, `@143`, `@130`–`@133`). Use when **Framework identification** yields plain Java, CLI-only, or a stack without a dedicated framework agent here.
-- **[@robot-spring-boot-coder](robot-spring-boot-coder.md):** Spring Boot implementation (controllers, REST, Spring Test slices, Spring Data/JDBC, Flyway migrations, etc. — `@301`, `@302`, `@311`–`@313`, `@321`–`@323`). Use when **Framework identification** yields **Spring Boot** as the application framework.
-- **[@robot-quarkus-coder](robot-quarkus-coder.md):** Quarkus implementation (Jakarta REST resources, CDI, Panache/JDBC, Flyway migrations, Quarkus tests — `@401`, `@402`, `@411`–`@413`, `@421`–`@423`). Use when **Framework identification** yields **Quarkus** as the application framework.
-- **[@robot-micronaut-coder](robot-micronaut-coder.md):** Micronaut implementation (`@Controller`, programmatic JDBC, Micronaut Data, Flyway migrations, `Micronaut.run`, CDI-style beans, Micronaut tests — `@501`, `@502`, `@511`–`@513`, `@521`–`@523`). Use when **Framework identification** yields **Micronaut** as the application framework.
+- **[@robot-spring-boot-coder](robot-spring-boot-coder.md):** Spring Boot implementation (controllers, REST, Spring Test slices, Spring Data/JDBC, Flyway migrations, Kafka messaging, MongoDB — `@301`, `@302`, `@311`–`@315`, `@321`–`@323`). Use when **Framework identification** yields **Spring Boot** as the application framework.
+- **[@robot-quarkus-coder](robot-quarkus-coder.md):** Quarkus implementation (Jakarta REST resources, CDI, Panache/JDBC, Flyway migrations, Kafka messaging, MongoDB, Quarkus tests — `@401`, `@402`, `@411`–`@415`, `@421`–`@423`). Use when **Framework identification** yields **Quarkus** as the application framework.
+- **[@robot-micronaut-coder](robot-micronaut-coder.md):** Micronaut implementation (`@Controller`, programmatic JDBC, Micronaut Data, Flyway migrations, Kafka messaging, MongoDB, `Micronaut.run`, CDI-style beans, Micronaut tests — `@501`, `@502`, `@511`–`@515`, `@521`–`@523`). Use when **Framework identification** yields **Micronaut** as the application framework.
- **Parallel column drives grouping:** The plan's task list table includes a **Parallel** column (or **Agent** if the plan uses that name). Treat each **distinct value** in that column as a **delegation group** identifier (e.g. `A1`, `A2`, `A3-timeout`, `A3-retry`, `A4`).
- **One logical developer per group:** For each distinct **Parallel** value, assign a **separate** instance of the **same** chosen implementation agent (`robot-java-coder`, `robot-spring-boot-coder`, `robot-quarkus-coder`, or `robot-micronaut-coder`) whose scope is **only** the rows for that value. Label every handoff, e.g. `Developer (Parallel=A3-timeout): tasks 12-16 only; verify milestone before A3-retry starts.`
@@ -36,9 +36,9 @@ When you analyze the task, **determine the target framework** from requirements
| Finding | Delegate to |
|---------|-------------|
-| Spring Boot is the chosen or evident stack (starters, Boot parent/BOM, Boot-specific tests or tasks) | [@robot-spring-boot-coder](robot-spring-boot-coder.md) |
-| Quarkus is the chosen or evident stack (quarkus-bom, quarkus-maven-plugin, `@QuarkusTest`, Dev Services, or Quarkus-specific tasks) | [@robot-quarkus-coder](robot-quarkus-coder.md) |
-| Micronaut is the chosen or evident stack (micronaut-parent / micronaut-maven-plugin, `io.micronaut` BOM, `@MicronautTest`, `Micronaut.run`, or Micronaut-specific tasks) | [@robot-micronaut-coder](robot-micronaut-coder.md) |
+| Spring Boot is the chosen or evident stack (starters, Boot parent/BOM, Boot-specific tests, Kafka with `spring-kafka`, or MongoDB with `spring-data-mongodb`) | [@robot-spring-boot-coder](robot-spring-boot-coder.md) |
+| Quarkus is the chosen or evident stack (quarkus-bom, quarkus-maven-plugin, `@QuarkusTest`, Dev Services, SmallRye Reactive Messaging, or Quarkus MongoDB Panache) | [@robot-quarkus-coder](robot-quarkus-coder.md) |
+| Micronaut is the chosen or evident stack (micronaut-parent / micronaut-maven-plugin, `io.micronaut` BOM, `@MicronautTest`, `Micronaut.run`, `micronaut-kafka`, or `micronaut-data-mongodb`) | [@robot-micronaut-coder](robot-micronaut-coder.md) |
| No Spring Boot, Quarkus, or Micronaut; plain Java, other framework not covered by a dedicated agent here, or requirements are framework-neutral | [@robot-java-coder](robot-java-coder.md) |
**If mixed or ambiguous:** Prefer **robot-spring-boot-coder** when **any** authoritative requirement document commits to Spring Boot; prefer **robot-quarkus-coder** when it commits to Quarkus; prefer **robot-micronaut-coder** when it commits to Micronaut; otherwise prefer **robot-java-coder** and state the ambiguity in the handoff so the implementer can align with `pom.xml` / ADRs.
diff --git a/skills-generator/src/main/resources/skill-references/assets/agents/robot-micronaut-coder.md b/skills-generator/src/main/resources/skill-references/assets/agents/robot-micronaut-coder.md
index eeed8001..0bdb6d51 100644
--- a/skills-generator/src/main/resources/skill-references/assets/agents/robot-micronaut-coder.md
+++ b/skills-generator/src/main/resources/skill-references/assets/agents/robot-micronaut-coder.md
@@ -11,6 +11,8 @@ You are an **Implementation Specialist** for Micronaut projects. You focus on wr
- Implement `@Controller` HTTP endpoints, `@Singleton` application services, and `@Factory` beans following Micronaut conventions.
- Configure Micronaut `application.yml` / `application.properties`, environments, and `@Requires` / `@ConfigurationProperties`.
- Apply **Micronaut Data** (`@MappedEntity`, repositories, `@Query`, transactions) for relational persistence, or **raw JDBC** (`DataSource`, `PreparedStatement`) when `@511-frameworks-micronaut-jdbc` fits better.
+- Integrate Apache Kafka producers and consumers using `@KafkaClient`, `@KafkaListener`, `@KafkaKey`, and `KafkaListenerExceptionHandler`.
+- Integrate MongoDB using Micronaut Data MongoDB (`@MappedEntity`, `@MongoRepository`, `@MongoFindQuery`).
- Write Micronaut tests (`@MicronautTest`, `@MockBean`, `HttpClient`, `TestPropertyProvider` with Testcontainers).
- Ensure secure coding practices for web APIs.
@@ -30,6 +32,8 @@ Apply guidance from these Skills when relevant:
- `@511-frameworks-micronaut-jdbc`: programmatic JDBC (DataSource, SQL, transactions)
- `@512-frameworks-micronaut-data`: Micronaut Data (repositories, entities, generated SQL)
- `@513-frameworks-micronaut-db-migrations-flyway`: Micronaut DB migrations (Flyway)
+- `@514-frameworks-micronaut-kafka`: Kafka messaging (@KafkaClient, @KafkaListener, retries, dead-letter routing)
+- `@515-frameworks-micronaut-mongodb`: MongoDB (@MongoRepository, @MappedEntity, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing strategies
diff --git a/skills-generator/src/main/resources/skill-references/assets/agents/robot-quarkus-coder.md b/skills-generator/src/main/resources/skill-references/assets/agents/robot-quarkus-coder.md
index 9b8ca2ec..00c74b2c 100644
--- a/skills-generator/src/main/resources/skill-references/assets/agents/robot-quarkus-coder.md
+++ b/skills-generator/src/main/resources/skill-references/assets/agents/robot-quarkus-coder.md
@@ -10,8 +10,10 @@ You are an **Implementation Specialist** for Quarkus projects. You focus on writ
- Implement Jakarta REST resources, CDI services, and repositories following Quarkus conventions.
- Configure Quarkus extensions, profiles (`%dev`, `%test`, `%prod`), and `application.properties`.
-- Apply Quarkus JDBC or Hibernate ORM Panache for persistence.
-- Write Quarkus tests (`@QuarkusTest`, `@QuarkusIntegrationTest`, REST Assured).
+- Apply Quarkus JDBC or Hibernate ORM Panache for relational persistence.
+- Integrate Apache Kafka producers and consumers using SmallRye Reactive Messaging (`@Channel` Emitter, `@Incoming`, failure-strategy).
+- Integrate MongoDB using Quarkus MongoDB Panache (`PanacheMongoEntity`, `PanacheMongoRepository`).
+- Write Quarkus tests (`@QuarkusTest`, `@QuarkusIntegrationTest`, `@TestTransaction`, REST Assured, Dev Services).
- Ensure secure coding practices for web APIs.
### Coding Standards
@@ -29,6 +31,8 @@ Apply guidance from these Skills when relevant:
- `@411-frameworks-quarkus-jdbc`: Quarkus JDBC
- `@412-frameworks-quarkus-panache`: Quarkus Panache
- `@413-frameworks-quarkus-db-migrations-flyway`: Quarkus DB migrations (Flyway)
+- `@414-frameworks-quarkus-kafka`: Kafka messaging (SmallRye Reactive Messaging, Emitter, @Incoming, failure strategies)
+- `@415-frameworks-quarkus-mongodb`: MongoDB (Panache Mongo entities, repositories, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing Strategies
diff --git a/skills-generator/src/main/resources/skill-references/assets/agents/robot-spring-boot-coder.md b/skills-generator/src/main/resources/skill-references/assets/agents/robot-spring-boot-coder.md
index cfbb2631..0126e621 100644
--- a/skills-generator/src/main/resources/skill-references/assets/agents/robot-spring-boot-coder.md
+++ b/skills-generator/src/main/resources/skill-references/assets/agents/robot-spring-boot-coder.md
@@ -10,8 +10,10 @@ You are an **Implementation Specialist** for Spring Boot projects. You focus on
- Implement REST controllers, services, and repositories following Spring Boot conventions.
- Configure Spring Boot auto-configuration, profiles, and `application.yml`.
-- Apply Spring Data JDBC for persistence.
-- Write Spring Test slices (`@WebMvcTest`, `@DataJdbcTest`, `@SpringBootTest`).
+- Apply Spring Data JDBC for relational persistence.
+- Integrate Apache Kafka producers and listeners using `spring-kafka` (typed templates, retries, dead-letter topics).
+- Integrate MongoDB using Spring Data MongoDB (documents, repositories, error handling).
+- Write Spring Test slices (`@WebMvcTest`, `@DataJdbcTest`, `@DataMongoTest`, `@SpringBootTest`, `@EmbeddedKafka`).
- Ensure secure coding practices for web APIs.
### Coding Standards
@@ -29,6 +31,8 @@ Apply guidance from these Skills when relevant:
- `@311-frameworks-spring-jdbc`: Spring JDBC
- `@312-frameworks-spring-data-jdbc`: Spring Data JDBC
- `@313-frameworks-spring-db-migrations-flyway`: Flyway database migrations
+- `@314-frameworks-spring-kafka`: Kafka messaging (producers, listeners, retries, dead-letter topics)
+- `@315-frameworks-spring-mongodb`: MongoDB (document design, repositories, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing strategies
diff --git a/skills-generator/src/main/resources/skills.xml b/skills-generator/src/main/resources/skills.xml
index 028635b8..917f1fb4 100644
--- a/skills-generator/src/main/resources/skills.xml
+++ b/skills-generator/src/main/resources/skills.xml
@@ -257,6 +257,16 @@
313-frameworks-spring-db-migrations-flyway
+
+
+ 314-frameworks-spring-kafka
+
+
+
+
+ 315-frameworks-spring-mongodb
+
+ 321-frameworks-spring-boot-testing-unit-tests
@@ -307,6 +317,16 @@
413-frameworks-quarkus-db-migrations-flyway
+
+
+ 414-frameworks-quarkus-kafka
+
+
+
+
+ 415-frameworks-quarkus-mongodb
+
+ 421-frameworks-quarkus-testing-unit-tests
@@ -357,6 +377,16 @@
513-frameworks-micronaut-db-migrations-flyway
+
+
+ 514-frameworks-micronaut-kafka
+
+
+
+
+ 515-frameworks-micronaut-mongodb
+
+ 521-frameworks-micronaut-testing-unit-tests
diff --git a/skills/003-agents-installation/references/003-agents-installation.md b/skills/003-agents-installation/references/003-agents-installation.md
index 46b75139..2c490c8e 100644
--- a/skills/003-agents-installation/references/003-agents-installation.md
+++ b/skills/003-agents-installation/references/003-agents-installation.md
@@ -98,9 +98,9 @@ You are a **Coordinator** for Java Enterprise Development. Your primary responsi
### Collaboration partners
- **[@robot-java-coder](robot-java-coder.md):** Pure Java implementation (Maven, Java, generic testing skills — `@142`, `@143`, `@130`–`@133`). Use when **Framework identification** yields plain Java, CLI-only, or a stack without a dedicated framework agent here.
-- **[@robot-spring-boot-coder](robot-spring-boot-coder.md):** Spring Boot implementation (controllers, REST, Spring Test slices, Spring Data/JDBC, Flyway migrations, etc. — `@301`, `@302`, `@311`–`@313`, `@321`–`@323`). Use when **Framework identification** yields **Spring Boot** as the application framework.
-- **[@robot-quarkus-coder](robot-quarkus-coder.md):** Quarkus implementation (Jakarta REST resources, CDI, Panache/JDBC, Flyway migrations, Quarkus tests — `@401`, `@402`, `@411`–`@413`, `@421`–`@423`). Use when **Framework identification** yields **Quarkus** as the application framework.
-- **[@robot-micronaut-coder](robot-micronaut-coder.md):** Micronaut implementation (`@Controller`, programmatic JDBC, Micronaut Data, Flyway migrations, `Micronaut.run`, CDI-style beans, Micronaut tests — `@501`, `@502`, `@511`–`@513`, `@521`–`@523`). Use when **Framework identification** yields **Micronaut** as the application framework.
+- **[@robot-spring-boot-coder](robot-spring-boot-coder.md):** Spring Boot implementation (controllers, REST, Spring Test slices, Spring Data/JDBC, Flyway migrations, Kafka messaging, MongoDB — `@301`, `@302`, `@311`–`@315`, `@321`–`@323`). Use when **Framework identification** yields **Spring Boot** as the application framework.
+- **[@robot-quarkus-coder](robot-quarkus-coder.md):** Quarkus implementation (Jakarta REST resources, CDI, Panache/JDBC, Flyway migrations, Kafka messaging, MongoDB, Quarkus tests — `@401`, `@402`, `@411`–`@415`, `@421`–`@423`). Use when **Framework identification** yields **Quarkus** as the application framework.
+- **[@robot-micronaut-coder](robot-micronaut-coder.md):** Micronaut implementation (`@Controller`, programmatic JDBC, Micronaut Data, Flyway migrations, Kafka messaging, MongoDB, `Micronaut.run`, CDI-style beans, Micronaut tests — `@501`, `@502`, `@511`–`@515`, `@521`–`@523`). Use when **Framework identification** yields **Micronaut** as the application framework.
- **Parallel column drives grouping:** The plan's task list table includes a **Parallel** column (or **Agent** if the plan uses that name). Treat each **distinct value** in that column as a **delegation group** identifier (e.g. `A1`, `A2`, `A3-timeout`, `A3-retry`, `A4`).
- **One logical developer per group:** For each distinct **Parallel** value, assign a **separate** instance of the **same** chosen implementation agent (`robot-java-coder`, `robot-spring-boot-coder`, `robot-quarkus-coder`, or `robot-micronaut-coder`) whose scope is **only** the rows for that value. Label every handoff, e.g. `Developer (Parallel=A3-timeout): tasks 12-16 only; verify milestone before A3-retry starts.`
@@ -119,9 +119,9 @@ When you analyze the task, **determine the target framework** from requirements
| Finding | Delegate to |
|---------|-------------|
-| Spring Boot is the chosen or evident stack (starters, Boot parent/BOM, Boot-specific tests or tasks) | [@robot-spring-boot-coder](robot-spring-boot-coder.md) |
-| Quarkus is the chosen or evident stack (quarkus-bom, quarkus-maven-plugin, `@QuarkusTest`, Dev Services, or Quarkus-specific tasks) | [@robot-quarkus-coder](robot-quarkus-coder.md) |
-| Micronaut is the chosen or evident stack (micronaut-parent / micronaut-maven-plugin, `io.micronaut` BOM, `@MicronautTest`, `Micronaut.run`, or Micronaut-specific tasks) | [@robot-micronaut-coder](robot-micronaut-coder.md) |
+| Spring Boot is the chosen or evident stack (starters, Boot parent/BOM, Boot-specific tests, Kafka with `spring-kafka`, or MongoDB with `spring-data-mongodb`) | [@robot-spring-boot-coder](robot-spring-boot-coder.md) |
+| Quarkus is the chosen or evident stack (quarkus-bom, quarkus-maven-plugin, `@QuarkusTest`, Dev Services, SmallRye Reactive Messaging, or Quarkus MongoDB Panache) | [@robot-quarkus-coder](robot-quarkus-coder.md) |
+| Micronaut is the chosen or evident stack (micronaut-parent / micronaut-maven-plugin, `io.micronaut` BOM, `@MicronautTest`, `Micronaut.run`, `micronaut-kafka`, or `micronaut-data-mongodb`) | [@robot-micronaut-coder](robot-micronaut-coder.md) |
| No Spring Boot, Quarkus, or Micronaut; plain Java, other framework not covered by a dedicated agent here, or requirements are framework-neutral | [@robot-java-coder](robot-java-coder.md) |
**If mixed or ambiguous:** Prefer **robot-spring-boot-coder** when **any** authoritative requirement document commits to Spring Boot; prefer **robot-quarkus-coder** when it commits to Quarkus; prefer **robot-micronaut-coder** when it commits to Micronaut; otherwise prefer **robot-java-coder** and state the ambiguity in the handoff so the implementer can align with `pom.xml` / ADRs.
@@ -248,6 +248,8 @@ You are an **Implementation Specialist** for Micronaut projects. You focus on wr
- Implement `@Controller` HTTP endpoints, `@Singleton` application services, and `@Factory` beans following Micronaut conventions.
- Configure Micronaut `application.yml` / `application.properties`, environments, and `@Requires` / `@ConfigurationProperties`.
- Apply **Micronaut Data** (`@MappedEntity`, repositories, `@Query`, transactions) for relational persistence, or **raw JDBC** (`DataSource`, `PreparedStatement`) when `@511-frameworks-micronaut-jdbc` fits better.
+- Integrate Apache Kafka producers and consumers using `@KafkaClient`, `@KafkaListener`, `@KafkaKey`, and `KafkaListenerExceptionHandler`.
+- Integrate MongoDB using Micronaut Data MongoDB (`@MappedEntity`, `@MongoRepository`, `@MongoFindQuery`).
- Write Micronaut tests (`@MicronautTest`, `@MockBean`, `HttpClient`, `TestPropertyProvider` with Testcontainers).
- Ensure secure coding practices for web APIs.
@@ -267,6 +269,8 @@ Apply guidance from these Skills when relevant:
- `@511-frameworks-micronaut-jdbc`: programmatic JDBC (DataSource, SQL, transactions)
- `@512-frameworks-micronaut-data`: Micronaut Data (repositories, entities, generated SQL)
- `@513-frameworks-micronaut-db-migrations-flyway`: Micronaut DB migrations (Flyway)
+- `@514-frameworks-micronaut-kafka`: Kafka messaging (@KafkaClient, @KafkaListener, retries, dead-letter routing)
+- `@515-frameworks-micronaut-mongodb`: MongoDB (@MongoRepository, @MappedEntity, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing strategies
@@ -302,8 +306,10 @@ You are an **Implementation Specialist** for Quarkus projects. You focus on writ
- Implement Jakarta REST resources, CDI services, and repositories following Quarkus conventions.
- Configure Quarkus extensions, profiles (`%dev`, `%test`, `%prod`), and `application.properties`.
-- Apply Quarkus JDBC or Hibernate ORM Panache for persistence.
-- Write Quarkus tests (`@QuarkusTest`, `@QuarkusIntegrationTest`, REST Assured).
+- Apply Quarkus JDBC or Hibernate ORM Panache for relational persistence.
+- Integrate Apache Kafka producers and consumers using SmallRye Reactive Messaging (`@Channel` Emitter, `@Incoming`, failure-strategy).
+- Integrate MongoDB using Quarkus MongoDB Panache (`PanacheMongoEntity`, `PanacheMongoRepository`).
+- Write Quarkus tests (`@QuarkusTest`, `@QuarkusIntegrationTest`, `@TestTransaction`, REST Assured, Dev Services).
- Ensure secure coding practices for web APIs.
### Coding Standards
@@ -321,6 +327,8 @@ Apply guidance from these Skills when relevant:
- `@411-frameworks-quarkus-jdbc`: Quarkus JDBC
- `@412-frameworks-quarkus-panache`: Quarkus Panache
- `@413-frameworks-quarkus-db-migrations-flyway`: Quarkus DB migrations (Flyway)
+- `@414-frameworks-quarkus-kafka`: Kafka messaging (SmallRye Reactive Messaging, Emitter, @Incoming, failure strategies)
+- `@415-frameworks-quarkus-mongodb`: MongoDB (Panache Mongo entities, repositories, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing Strategies
@@ -356,8 +364,10 @@ You are an **Implementation Specialist** for Spring Boot projects. You focus on
- Implement REST controllers, services, and repositories following Spring Boot conventions.
- Configure Spring Boot auto-configuration, profiles, and `application.yml`.
-- Apply Spring Data JDBC for persistence.
-- Write Spring Test slices (`@WebMvcTest`, `@DataJdbcTest`, `@SpringBootTest`).
+- Apply Spring Data JDBC for relational persistence.
+- Integrate Apache Kafka producers and listeners using `spring-kafka` (typed templates, retries, dead-letter topics).
+- Integrate MongoDB using Spring Data MongoDB (documents, repositories, error handling).
+- Write Spring Test slices (`@WebMvcTest`, `@DataJdbcTest`, `@DataMongoTest`, `@SpringBootTest`, `@EmbeddedKafka`).
- Ensure secure coding practices for web APIs.
### Coding Standards
@@ -375,6 +385,8 @@ Apply guidance from these Skills when relevant:
- `@311-frameworks-spring-jdbc`: Spring JDBC
- `@312-frameworks-spring-data-jdbc`: Spring Data JDBC
- `@313-frameworks-spring-db-migrations-flyway`: Flyway database migrations
+- `@314-frameworks-spring-kafka`: Kafka messaging (producers, listeners, retries, dead-letter topics)
+- `@315-frameworks-spring-mongodb`: MongoDB (document design, repositories, error handling)
- `@142-java-functional-programming`: Functional programming patterns
- `@143-java-functional-exception-handling`: Exception handling patterns
- `@130-java-testing-strategies`: Testing strategies
diff --git a/skills/314-frameworks-spring-kafka/SKILL.md b/skills/314-frameworks-spring-kafka/SKILL.md
new file mode 100644
index 00000000..31f3d9c3
--- /dev/null
+++ b/skills/314-frameworks-spring-kafka/SKILL.md
@@ -0,0 +1,48 @@
+---
+name: 314-frameworks-spring-kafka
+description: Use when you need to design or implement Kafka messaging in Spring Boot — including topic design, producer/consumer implementation with Spring for Apache Kafka, retries and dead-letter topics, idempotency, and error handling. This should trigger for requests such as Add Kafka in Spring Boot; Review Spring Kafka consumers; Improve retries and DLT in Spring Kafka. Part of cursor-rules-java project
+license: Apache-2.0
+metadata:
+ author: Juan Antonio Breña Moral
+ version: 0.15.0-SNAPSHOT
+---
+# Spring Boot — Kafka messaging
+
+Apply Spring Kafka guidance with concrete examples for design, implementation, and error handling.
+
+## Constraints
+
+Compile before messaging refactors; verify after changes.
+
+- **MANDATORY**: Run `./mvnw compile` or `mvn compile` before applying any change
+- **SAFETY**: If compilation fails, stop immediately
+- **VERIFY**: Run `./mvnw clean verify` or `mvn clean verify` after applying improvements
+- **BEFORE APPLYING**: Read the reference for detailed rules and examples
+
+## When to use this skill
+
+- Add Kafka in Spring Boot
+- Review Spring Kafka consumers/producers
+- Improve retries, dead-letter topics, or idempotency in Spring Kafka
+
+## Workflow
+
+1. **Read reference and assess project context**
+
+Read `references/314-frameworks-spring-kafka.md` and inspect current messaging setup before proposing changes.
+
+2. **Gather scope and decide target improvements**
+
+Identify reliability and throughput goals and define the minimum safe set of changes.
+
+3. **Apply framework-aligned changes**
+
+Implement/refactor Spring Kafka configuration, producer/consumer logic, and failure handling.
+
+4. **Run verification and report results**
+
+Execute build/tests and summarize what changed, what was verified, and follow-up actions.
+
+## Reference
+
+For detailed guidance, examples, and constraints, see [references/314-frameworks-spring-kafka.md](references/314-frameworks-spring-kafka.md).
diff --git a/skills/314-frameworks-spring-kafka/references/314-frameworks-spring-kafka.md b/skills/314-frameworks-spring-kafka/references/314-frameworks-spring-kafka.md
new file mode 100644
index 00000000..881885b2
--- /dev/null
+++ b/skills/314-frameworks-spring-kafka/references/314-frameworks-spring-kafka.md
@@ -0,0 +1,396 @@
+---
+name: 314-frameworks-spring-kafka
+description: Use when you need Kafka with Spring Boot (`spring-kafka`) — including Maven dependencies, topic and event schema design, typed KafkaTemplate producers, @KafkaListener consumers, retries with DefaultErrorHandler, dead-letter topics, idempotent consumers, and integration testing with @EmbeddedKafka. This should trigger for requests such as Add Kafka in Spring Boot; Review Spring Kafka consumers; Improve retries and DLT in Spring Kafka.
+license: Apache-2.0
+metadata:
+ author: Juan Antonio Breña Moral
+ version: 0.15.0-SNAPSHOT
+---
+# Spring Boot — Kafka messaging
+
+## Role
+
+You are a senior software engineer with extensive experience in Spring Boot and Apache Kafka.
+
+## Goal
+
+Design and implement reliable Kafka messaging in Spring Boot using `spring-kafka`. Prefer typed event records, keyed producers for ordered processing, and declarative error handling with dead-letter topics over silent exception swallowing. Keep listeners thin — delegate business logic to services. Guard consumers against poison messages and replay with idempotency on the eventId.
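+
+The eventId-based idempotency guard mentioned above can be sketched in plain Java. This is an illustrative in-memory sketch, not a spring-kafka API; a production store would be persistent (for example a database table with a unique constraint on `eventId`) so replays survive restarts:
+
+```java
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+
+// Illustrative de-duplication guard keyed on eventId (class name is hypothetical).
+final class ProcessedEventStore {
+
+    private final Set<String> seen = ConcurrentHashMap.newKeySet();
+
+    /** Returns true only the first time a given eventId is offered. */
+    boolean markIfFirst(String eventId) {
+        return seen.add(eventId);
+    }
+}
+```
+
+A listener would call `markIfFirst(event.eventId())` before doing any work and skip processing when it returns `false`.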
+
+**What is covered in this Skill?**
+
+- Maven `spring-kafka` dependency aligned with the Spring Boot BOM
+- Versioned event schemas as Java records with explicit `eventId`, `schemaVersion`, and aggregate key
+- Topic naming conventions (`domain.entity.operation.v{N}`)
+- `KafkaTemplate` typed producer with explicit key strategy
+- `@KafkaListener` consumer with explicit `groupId`, `topics`, and typed payload
+- `ConcurrentKafkaListenerContainerFactory` for concurrency and batch vs. single-record modes
+- `DefaultErrorHandler` with `FixedBackOff` / `ExponentialBackOff` and `DeadLetterPublishingRecoverer`
+- Idempotent consumer pattern using `eventId` de-duplication store
+- Kafka transactions for exactly-once producer semantics
+- Testing with `@EmbeddedKafka` and `EmbeddedKafkaBroker`
+
+**Scope:** Apply recommendations based on the reference rules and good/bad code examples.
+
+## Constraints
+
+Before applying any Kafka changes, ensure the project compiles. Compilation failure is a BLOCKING condition.
+
+- **MANDATORY**: Run `./mvnw compile` or `mvn compile` before applying any change
+- **SAFETY**: If compilation fails, stop immediately
+- **VERIFY**: Run `./mvnw clean verify` or `mvn clean verify` after applying improvements
+- **INJECTION**: Never build topic names or Kafka header values from untrusted user input
+- **BEFORE APPLYING**: Read the reference for detailed rules and good/bad patterns
+- **EDGE CASE**: If the user goal is ambiguous, stop and ask a clarifying question before editing files
+- **EDGE CASE**: If required context, files, or tools are missing, report the blocker explicitly
+
+## Examples
+
+### Table of contents
+
+- Example 1: Maven dependency
+- Example 2: Event schema design
+- Example 3: Producer implementation
+- Example 4: Consumer implementation
+- Example 5: Error handling and dead-letter topic
+- Example 6: Idempotent consumer
+- Example 7: Integration testing
+
+### Example 1: Maven dependency
+
+Title: Add spring-kafka via the Spring Boot BOM; never pin the version manually
+Description: Spring Boot manages the `spring-kafka` version via its BOM. Declaring an explicit version pins the library and can break compatibility with Spring Boot's `KafkaAutoConfiguration`. Omit the `<version>` element and let the BOM supply it; manage the version explicitly only in library modules that do not inherit the Spring Boot BOM.
+
+**Good example:**
+
+```xml
+<!-- Version is managed by the Spring Boot BOM -->
+<dependency>
+ <groupId>org.springframework.kafka</groupId>
+ <artifactId>spring-kafka</artifactId>
+</dependency>
+```
+
+**Bad example:**
+
+```xml
+<!-- Bad: hard-coded version overrides the Spring Boot BOM -->
+<dependency>
+ <groupId>org.springframework.kafka</groupId>
+ <artifactId>spring-kafka</artifactId>
+ <version>3.1.0</version>
+</dependency>
+```
+
+### Example 2: Event schema design
+
+Title: Versioned immutable records with eventId and aggregate key
+Description: Define each Kafka event as a Java record. Include a unique `eventId` (UUID) for consumer de-duplication, a `schemaVersion` string for forward compatibility, and the aggregate's natural key (e.g. `orderId`). Use the aggregate key as the Kafka message key so all events for the same aggregate land in the same partition, preserving ordering. Topic names follow the pattern `domain.entity.operation.v{N}` to allow parallel consumers on older and newer versions.
+
+**Good example:**
+
+```java
+import java.time.Instant;
+
+// Immutable event schema — one record per event type
+public record OrderCreatedEvent(
+ String eventId, // UUID; used for consumer-side de-duplication
+ String schemaVersion, // "v1"; bump when non-backward-compatible fields change
+ String orderId, // aggregate key → use as Kafka message key
+ String customerId,
+ Instant occurredAt
+) {
+ public static OrderCreatedEvent of(String orderId, String customerId) {
+ return new OrderCreatedEvent(
+ java.util.UUID.randomUUID().toString(),
+ "v1",
+ orderId,
+ customerId,
+ Instant.now()
+ );
+ }
+}
+// topic: orders.created.v1 key: orderId
+```
+
+**Bad example:**
+
+```java
+// Bad: generic map payload — no schema, no version, no stable key
+Map<String, Object> event = Map.of(
+ "type", "orderCreated",
+ "data", "..."
+);
+// topic: events key: null (random partition assignment)
+```
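+
+Why the key matters: Kafka's default partitioner hashes the serialized record key (murmur2) to choose a partition, so a stable aggregate key keeps all of an aggregate's events in order on one partition. A simplified plain-Java illustration of the idea (using `hashCode` instead of murmur2):
+
+```java
+// Simplified stand-in for key-based partitioning: the same key always maps
+// to the same partition, preserving per-aggregate ordering.
+// (Kafka itself hashes the serialized key with murmur2, not hashCode.)
+final class KeyPartitioning {
+
+    static int partitionFor(String key, int partitionCount) {
+        return Math.floorMod(key.hashCode(), partitionCount);
+    }
+}
+```
+
+With a `null` key there is no such anchor, so events for the same order may land on different partitions and be consumed out of order.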
+
+### Example 3: Producer implementation
+
+Title: Typed KafkaTemplate with explicit key; handle send failure
+Description: Inject a typed `KafkaTemplate<String, OrderCreatedEvent>` so the compiler enforces the key/value contract. Always pass the aggregate's key as the Kafka record key. Return the `CompletableFuture` to the caller (or await it at a command boundary) so broker-side send failures are observable. Do not throw from a `whenComplete` callback and assume the caller will see it.
+
+**Good example:**
+
+```java
+import org.springframework.kafka.core.KafkaTemplate;
+import org.springframework.kafka.support.SendResult;
+import org.springframework.stereotype.Service;
+import java.util.concurrent.CompletableFuture;
+import java.util.concurrent.CompletionException;
+
+@Service
+class OrderEventPublisher {
+
+ private static final String TOPIC = "orders.created.v1";
+
+ private final KafkaTemplate<String, OrderCreatedEvent> template;
+
+ OrderEventPublisher(KafkaTemplate<String, OrderCreatedEvent> template) {
+ this.template = template;
+ }
+
+ CompletableFuture<SendResult<String, OrderCreatedEvent>> publish(OrderCreatedEvent event) {
+ return template.send(TOPIC, event.orderId(), event)
+ .exceptionally(ex -> {
+ throw new CompletionException("Kafka send failed for order: " + event.orderId(), ex);
+ });
+ }
+}
+```
+
+**Bad example:**
+
+```java
+// Bad: raw KafkaTemplate loses type safety;
+// null key loses partition ordering; ignored CompletableFuture drops send errors
+@Service
+class OrderEventPublisher {
+ @Autowired
+ KafkaTemplate template;
+
+ void publish(OrderCreatedEvent event) {
+ template.send("events", null, event.toString()); // no key, serialized via toString()
+ // CompletableFuture discarded — broker errors silently lost
+ }
+}
+```
+
+### Example 4: Consumer implementation
+
+Title: @KafkaListener with typed payload and thin handler
+Description: Annotate the listener class with `@Component`. Declare `topics` and `groupId` on the `@KafkaListener` so the consumer group is explicit and testable. Accept the typed event record as the method parameter. Delegate all business logic to an application service — the listener method should only translate the Kafka message into a service call.
+
+**Good example:**
+
+```java
+import org.springframework.kafka.annotation.KafkaListener;
+import org.springframework.stereotype.Component;
+
+@Component
+class BillingEventListener {
+
+ private final BillingService billingService;
+
+ BillingEventListener(BillingService billingService) {
+ this.billingService = billingService;
+ }
+
+ @KafkaListener(topics = "orders.created.v1", groupId = "billing-service-v1")
+ void onOrderCreated(OrderCreatedEvent event) {
+ billingService.createInvoice(event.orderId(), event.customerId());
+ }
+}
+```
+
+**Bad example:**
+
+```java
+// Bad: without an explicit groupId the consumer group is implicit, which can
+// cause a re-read from the beginning; a String payload requires manual parsing;
+// business logic is mixed directly into the listener
+@KafkaListener(topics = "orders.created.v1")
+void onMessage(String rawJson) {
+ var orderId = rawJson.split(",")[0]; // fragile parsing
+ // 50 lines of business logic inline ...
+}
+```
+
+### Example 5: Error handling and dead-letter topic
+
+Title: DefaultErrorHandler with backoff and DeadLetterPublishingRecoverer
+Description: Configure a `DefaultErrorHandler` bean that retries with exponential back-off and then routes unrecoverable messages to a dead-letter topic via `DeadLetterPublishingRecoverer`. The recoverer automatically targets `{topic}.DLT` by default. Register the handler on the container factory so it applies to all listeners. Do not swallow exceptions inside the listener — let them propagate so the container's error handler can decide.
+
+**Good example:**
+
+```java
+import org.apache.kafka.common.TopicPartition;
+import org.springframework.context.annotation.Bean;
+import org.springframework.context.annotation.Configuration;
+import org.springframework.kafka.core.KafkaTemplate;
+import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
+import org.springframework.kafka.listener.DefaultErrorHandler;
+import org.springframework.util.backoff.ExponentialBackOff;
+
+@Configuration
+class KafkaErrorConfig {
+
+ @Bean
+ DefaultErrorHandler errorHandler(KafkaTemplate