diff --git a/.claude/skills/compile/SKILL.md b/.claude/skills/compile/SKILL.md
new file mode 100644
index 0000000000..caebde530a
--- /dev/null
+++ b/.claude/skills/compile/SKILL.md
@@ -0,0 +1,156 @@
+---
+name: compile
+description: Build the SkyWalking Java agent — full build, skip tests, single module, or plugin test scenarios
+user-invocable: true
+allowed-tools: Bash, Read, Glob, Grep
+---
+
+# Compile SkyWalking Java Agent
+
+Build the project based on user request. Detect what they want to build and run the appropriate command.
+
+## Prerequisites
+
+- JDK 17, 21, or 25 (JDK 8 is supported at runtime but JDK 17+ is needed to compile)
+- Maven is bundled as `./mvnw` (Maven wrapper)
+- Git submodules must be initialized for protocol definitions
+
+Check JDK version first:
+```bash
+java -version
+```
+
+If submodules are not initialized:
+```bash
+git submodule init && git submodule update
+```
+
+## Build Commands
+
+### Full build (with tests)
+```bash
+./mvnw clean package -Pall
+```
+
+### Full build (skip tests — recommended for development)
+```bash
+./mvnw clean package -Dmaven.test.skip=true
+```
+
+### CI build (with javadoc verification)
+```bash
+./mvnw clean verify install javadoc:javadoc -Dmaven.test.skip=true
+```
+
+### Build a single plugin module
+```bash
+./mvnw clean package -pl apm-sniffer/apm-sdk-plugin/{plugin-name} -am -Dmaven.test.skip=true
+```
+The `-am` flag builds required dependencies. Replace `{plugin-name}` with the actual plugin directory name.
+
+### Run checkstyle only
+```bash
+./mvnw checkstyle:check
+```
+
+### Run unit tests for a single module
+```bash
+./mvnw test -pl apm-sniffer/apm-sdk-plugin/{plugin-name}
+```
+
+### Build agent distribution only (after full build)
+The built agent is in the `skywalking-agent/` directory after a full build.
+
+### Run a plugin E2E test scenario
+
+The E2E test framework has a **two-phase build** (matching CI):
+
+**Phase 1 — Build agent + test tools + Docker images (one-time setup):**
+```bash
+# Build the agent (JDK 17+ required)
+./mvnw clean package -Dmaven.test.skip=true
+
+# Switch to JDK 8 to build test tools and Docker images
+# The test/plugin/pom.xml builds: runner-helper, agent-test-tools, JVM/Tomcat container images
+export JAVA_HOME=$(/usr/libexec/java_home -v 8)
+export PATH=$JAVA_HOME/bin:$PATH
+
+./mvnw --batch-mode -f test/plugin/pom.xml \
+ -Dmaven.test.skip \
+ -Dbase_image_java=eclipse-temurin:8-jdk \
+ -Dbase_image_tomcat=tomcat:8.5-jdk8-openjdk \
+ -Dcontainer_image_version=1.0.0 \
+ clean package
+```
+
+This builds `skywalking/agent-test-jvm:1.0.0` and `skywalking/agent-test-tomcat:1.0.0` Docker images,
+plus `test/plugin/dist/plugin-runner-helper.jar` and `test/plugin/agent-test-tools/dist/` (mock-collector, validator).
+
+**Phase 2 — Run test scenarios (per scenario):**
+```bash
+# Use JDK 8 (matching CI). JDK 17 works for runner-helper but JDK 8 matches CI exactly.
+export JAVA_HOME=$(/usr/libexec/java_home -v 8)
+export PATH=$JAVA_HOME/bin:$PATH
+
+# Run WITHOUT -f (reuses pre-built tools and images from Phase 1)
+bash ./test/plugin/run.sh --debug {scenario-name}
+```
+
+**IMPORTANT flags:**
+- `--debug` — keeps workspace with logs and `actualData.yaml` for inspection after test
+- `-f` (force) — rebuilds ALL test tools and Docker images from scratch. **Do NOT use** if Phase 1 already completed — it re-clones `skywalking-agent-test-tool` from GitHub and rebuilds everything, which is slow and may fail due to network issues.
+- Without `-f` — reuses existing tools/images. This is the normal way to run tests.
+
+**Key rules:**
+- Run scenarios **one at a time** — they share Docker ports (8080, etc.) and will conflict if run in parallel
+- JDK 8 test scenarios use `eclipse-temurin:8-jdk` base image
+- JDK 17 test scenarios (in `plugins-jdk17-test` workflows) use `eclipse-temurin:17-jdk` base image
+- After a test, check `test/plugin/workspace/{scenario}/{version}/data/actualData.yaml` vs `expectedData.yaml` for debugging failures
+- Check `test/plugin/workspace/{scenario}/{version}/logs/` for container logs
+
+**Diagnosing "startup script not exists" failures:**
+This error means the scenario ZIP wasn't built or copied into the container. The root cause is almost always a **silent Maven build failure** — `run.sh` uses `mvnw -q` (quiet mode) which hides errors. Common causes:
+1. **Maven Central network timeout** — downloading a new library version fails silently. The `mvnw clean package` exits non-zero but the `-q` flag hides the error, and `run.sh` continues with missing artifacts.
+2. **Docker Hub timeout** — pulling dependency images (mongo, mysql, kafka, zookeeper) fails with EOF/TLS errors.
+3. **Killed previous run** — if a prior parallel run was killed mid-execution, leftover state in `test/plugin/workspace/` can interfere. Always `rm -rf test/plugin/workspace/{scenario}` before rerunning.
+
+To debug: run the Maven build manually in the scenario directory with verbose output:
+```bash
+cd test/plugin/scenarios/{scenario-name}
+../../../../mvnw clean package -Dtest.framework.version={version} -Dmaven.test.skip=true
+```
+If this succeeds but `run.sh` fails, it's likely a transient Maven Central network issue. Pre-download dependencies first:
+```bash
+# Pre-warm Maven cache for all versions before running tests
+for v in $(grep -v '^#' test/plugin/scenarios/{scenario}/support-version.list | grep -v '^$'); do
+ cd test/plugin/scenarios/{scenario}
+ ../../../../mvnw dependency:resolve -Dtest.framework.version=$v -q
+ cd -
+done
+```
+
+**Pre-pulling Docker dependency images:**
+Scenarios with `dependencies:` in `configuration.yml` need external Docker images. Pre-pull them before running tests to avoid mid-test Docker Hub failures:
+```bash
+# Check what images a scenario needs
+grep "image:" test/plugin/scenarios/{scenario}/configuration.yml
+# Pull them
+docker pull {image:tag}
+```
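+The check-then-pull steps above can be combined into one loop. This is a hedged
+sketch: it fabricates a sample `configuration.yml` via `mktemp` so it runs
+standalone; point `conf` at a real `test/plugin/scenarios/{scenario}/configuration.yml`
+instead, and uncomment the `docker pull` line.
+```bash
+conf=$(mktemp)
+cat > "$conf" <<'EOF'
+dependencies:
+  mongodb:
+    image: mongo:4.4
+  zookeeper:
+    image: zookeeper:3.6
+EOF
+# Turn "image:" lines into bare "name:tag" refs
+images=$(grep "image:" "$conf" | sed 's/.*image:[[:space:]]*//')
+echo "$images"
+# for img in $images; do docker pull "$img"; done   # enable when Docker is running
+```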
+
+### Generate protobuf sources (needed before IDE import)
+```bash
+./mvnw compile -Dmaven.test.skip=true
+```
+Then mark `*/target/generated-sources/protobuf/java` and `*/target/generated-sources/protobuf/grpc-java` as generated source folders in your IDE.
+
+## Common Issues
+
+- **Submodule not initialized**: If proto files are missing, run `git submodule init && git submodule update`
+- **Wrong JDK version**: The agent build requires JDK 17+. The test tools build (`test/plugin/pom.xml`) works best with JDK 8. Check with `java -version`.
+- **Checkstyle failures**: Run `./mvnw checkstyle:check` to see violations. Common: star imports, unused imports, System.out.println, missing @Override.
+- **Test scenario Docker issues**: Ensure Docker daemon is running. Use `--debug` flag to inspect `actualData.yaml`.
+- **`run.sh -f` fails on agent-test-tools**: The `-f` flag clones `skywalking-agent-test-tool` from GitHub and rebuilds from source. If GitHub is slow or unreachable, this fails. Solution: run Phase 1 build separately (see above), then use `run.sh` without `-f`.
+- **Lombok errors in runner-helper on JDK 25**: The test framework uses Lombok 1.18.20 which doesn't support JDK 25. Use JDK 8 or JDK 17 for building and running test tools.
+- **"startup script not exists" inside container**: The scenario ZIP wasn't built or copied correctly. Check that `mvnw clean package` succeeds in the scenario directory and produces both a `.jar` and `.zip` in `target/`.
+- **Port conflicts**: Never run multiple E2E scenarios simultaneously — they all bind to the same Docker ports.
diff --git a/.claude/skills/new-plugin/SKILL.md b/.claude/skills/new-plugin/SKILL.md
new file mode 100644
index 0000000000..08dc73d0f1
--- /dev/null
+++ b/.claude/skills/new-plugin/SKILL.md
@@ -0,0 +1,1124 @@
+---
+name: new-plugin
+description: Develop a new SkyWalking Java agent plugin — instrumentation, interceptor, tracing/meter, tests, and all boilerplate
+user-invocable: true
+allowed-tools: Read, Write, Edit, Glob, Grep, Bash, Agent
+---
+
+# SkyWalking Java Agent Plugin Development
+
+Develop a new plugin for the Apache SkyWalking Java Agent. Ask the user what library/framework to instrument and what to observe (tracing, metrics, or both), then generate all required files.
+
+## Step 0 - Gather Requirements
+
+Ask the user:
+1. **Target library/framework** and version range (e.g., "Jedis 3.x-4.x", "Spring Kafka 2.7+")
+2. **Observation type**: tracing plugin, meter plugin, or both
+3. **Plugin category**: SDK plugin (default), bootstrap plugin, or optional plugin
+4. **Span type needed**: Entry (server/consumer), Exit (client/producer), Local (internal), or combination
+
+If the user already provided this info, skip asking.
+
+## Step 1 - Understand the Library and Identify Interception Points
+
+This is the most critical step. Do NOT jump to picking method names. Follow these phases in order.
+
+### Phase 1: Understand How the Library Is Used
+
+Read the target library's documentation, quickstart guides, or sample code. Understand the **user-facing API** — how developers create clients, make calls, and handle responses. This tells you:
+- What objects are long-lived (clients, connections, pools) vs. per-request (requests, commands)
+- Where configuration lives (server address, credentials, timeouts)
+- Whether the library is sync, async, or reactive
+- Whether it uses callbacks, futures, or blocking calls
+
+Example thought process for a Redis client:
+```
+User creates: RedisClient client = RedisClient.create("redis://localhost:6379");
+User connects: StatefulRedisConnection conn = client.connect();
+User executes: conn.sync().get("key"); // or conn.async().get("key")
+```
+This tells you: connection holds the server address, commands are executed on the connection.
+
+### Phase 2: Trace the Execution Flow
+
+Starting from the user-facing API, trace inward through the library source code to understand the execution workflow:
+1. What happens when the user calls the API method?
+2. Where does the request object get built?
+3. Where is the actual network I/O or dispatch?
+4. Where is the response available?
+5. For RPC/MQ: where are headers/metadata accessible for inject/extract?
+
+**Key question at each point:** What data is directly accessible as method arguments, return values, or fields on `this`? You want interception points where you can read the data you need **without reflection**.
+
+### Phase 3: Choose Interception Points
+
+Pick interception points based on these principles:
+
+**Principle 1: Data accessibility without reflection.**
+Choose methods where the information you need (peer address, operation name, request/response details, headers for inject/extract) is directly available as method arguments, return values, or accessible through the `this` object's public API. **Never use reflection to read private fields.** If the data is not accessible at one method, look at a different point in the execution flow.
+
+**Principle 2: Use `EnhancedInstance` dynamic field to propagate context inside the library.**
+This is the primary mechanism for passing data between interception points. The agent adds a dynamic field to every enhanced class via `EnhancedInstance`. Use it to:
+- Store server address (peer) at connection/client creation time, retrieve it at command execution time
+- Store request info at request-build time, retrieve it at send time
+- Pass span references from the initiating method to the completion callback
+
+**Do NOT use `Map` or other caches to store per-instance context.** Always use the dynamic field on the relevant `EnhancedInstance`. Maps introduce memory leaks and concurrency issues, and they are slower than the direct field access that `EnhancedInstance` provides.
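+The dynamic-field flow can be sketched with a self-contained stand-in. The
+`EnhancedInstance` interface below mirrors the real one in
+`org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance`; `FakeConnection`
+is a hypothetical library class as it looks after enhancement (at runtime the
+agent injects the field, so no stand-in is needed):
+```java
+// Local stand-in mirroring the real EnhancedInstance interface
+interface EnhancedInstance {
+    Object getSkyWalkingDynamicField();
+    void setSkyWalkingDynamicField(Object value);
+}
+
+// Hypothetical library connection class, as it looks after enhancement
+class FakeConnection implements EnhancedInstance {
+    private Object dynamicField;
+    @Override public Object getSkyWalkingDynamicField() { return dynamicField; }
+    @Override public void setSkyWalkingDynamicField(Object value) { dynamicField = value; }
+}
+
+public class DynamicFieldSketch {
+    public static void main(String[] args) {
+        FakeConnection conn = new FakeConnection();
+        // Constructor interceptor: store the peer read from constructor args
+        ((EnhancedInstance) conn).setSkyWalkingDynamicField("localhost:6379");
+        // Execute-method interceptor: read the peer back when creating the ExitSpan
+        String peer = (String) ((EnhancedInstance) conn).getSkyWalkingDynamicField();
+        System.out.println(peer);
+    }
+}
+```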
+
+**Principle 3: Intercept the minimal set of methods.**
+Prefer one well-chosen interception point over many surface-level ones. If a library has 20 command methods that all flow through a single `dispatch()` method internally, intercept `dispatch()` — not all 20.
+
+**Principle 4: Pick points where you can do inject/extract for cross-process propagation.**
+For RPC/HTTP/MQ plugins, you need to inject trace context into outgoing requests (ExitSpan) or extract from incoming requests (EntrySpan). The interception point MUST be where headers/metadata are writable (inject) or readable (extract). If headers are not accessible at the execution method, look for:
+- A request builder/decorator stage where headers can be added
+- A channel/transport layer where metadata is attached
+- A message properties object accessible from the method arguments
+
+**Principle 5: Consider the span lifecycle across threads.**
+If the library dispatches work asynchronously:
+- Identify where the work is submitted (original thread) and where the result arrives (callback/future thread)
+- You may need interception points in both threads
+- Use `EnhancedInstance` dynamic field on the task/callback/future object to carry the span or `ContextSnapshot` across the thread boundary
+- Use `prepareForAsync()` / `asyncFinish()` if the span must stay open across threads
+
+### Phase 4: Map Out the Interception Plan
+
+Before writing code, create a clear plan listing:
+
+| Target Class | Method/Constructor | What to Do | Data Available |
+|---|---|---|---|
+| `XxxClient` | constructor | Store peer address in dynamic field | host, port from args |
+| `XxxConnection` | `execute(Command)` | Create ExitSpan, inject carrier into command headers | command name, peer from dynamic field |
+| `XxxResponseHandler` | `onComplete(Response)` | Set response tags, stop span or asyncFinish | status code, error from args |
+
+For each interception point, verify:
+- [ ] The data I need is readable from method args, `this`, or `EnhancedInstance` dynamic field — no reflection needed
+- [ ] For inject: I can write headers/metadata through a public API on a method argument
+- [ ] For extract: I can read headers/metadata through a public API on a method argument
+- [ ] The `this` object (or a method argument) will be enhanced as `EnhancedInstance`, so I can use the dynamic field
+
+### Choosing Span Type
+
+| Scenario | Span Type | Requires |
+|----------|-----------|----------|
+| Receiving requests (HTTP server, MQ consumer, RPC provider) | EntrySpan | Extract ContextCarrier from incoming headers |
+| Making outgoing calls (HTTP client, DB, cache, MQ producer, RPC consumer) | ExitSpan | Peer address; inject ContextCarrier into outgoing headers (for RPC/HTTP/MQ) |
+| Internal processing (annotation-driven, local logic) | LocalSpan | Nothing extra |
+
+### For Meter Plugins
+
+Meter plugins follow the same understand-then-intercept process, but the goal is to find objects that expose numeric state:
+- **Gauges**: Intercept the creation of pool/executor/connection-manager objects. Register a `MeterFactory.gauge()` with a supplier lambda that calls the object's own getter methods (e.g., `pool.getActiveCount()`). Store the gauge reference in the dynamic field if needed.
+- **Counters**: Intercept execution methods and call `counter.increment()` on each invocation.
+- **Histograms**: Intercept methods where duration or size is computable (measure between before/after, or read from response).
+
+### Check Existing Plugins for Reference
+
+Before writing a new plugin, check if a similar library already has a plugin:
+```
+apm-sniffer/apm-sdk-plugin/ # 70+ standard plugins
+apm-sniffer/optional-plugins/ # Optional plugins
+apm-sniffer/bootstrap-plugins/ # JDK-level plugins
+```
+
+Similar libraries often share execution patterns. Study how an existing plugin for a similar library solved the same problems — especially how it chains dynamic fields across multiple interception points and where it does inject/extract.
+
+### Verify Against Actual Source Code — Never Speculate
+
+**This applies to both new plugin development AND extending existing plugins to newer library versions.**
+
+When assessing whether a plugin works with a new library version, or when choosing interception points for a new plugin, you MUST read the **actual source code** of the target library at the specific version. Do NOT rely on:
+- Version number assumptions ("it's still 4.x so it should be compatible")
+- Changelog summaries (they don't list every internal class rename or method removal)
+- General knowledge about the library's public API (plugins intercept internal classes, which change without notice)
+
+**What to verify for each intercepted class/method:**
+1. Does the target class still exist at the exact FQCN? (internal classes get renamed, extracted, or removed between minor versions)
+2. Does the intercepted method still exist with a compatible signature? (parameters may be added/removed/reordered)
+3. Do the witness classes still correctly distinguish versions? (a witness class that exists in both old and new versions won't prevent the plugin from loading on an incompatible version)
+4. Do the runtime APIs called by the interceptor still exist? (e.g., calling `cluster.getDescription()` will crash if that method was removed, even if the plugin loaded successfully)
+
+**How to verify — clone the library source locally:**
+```bash
+# Clone specific version tag to /tmp for easy source code inspection
+cd /tmp && git clone --depth 1 --branch {tag} https://github.com/{org}/{repo}.git {local-name}
+
+# Examples:
+git clone --depth 1 --branch r4.9.0 https://github.com/mongodb/mongo-java-driver.git mongo-4.9
+git clone --depth 1 --branch v2.4.13.RELEASE https://github.com/spring-projects/spring-kafka.git spring-kafka-2.4
+
+# Then grep/read the actual source files to check class/method existence
+grep -rn "getFilter\|getWriteRequests" /tmp/mongo-4.9/driver-core/src/main/com/mongodb/internal/operation/
+```
+
+This is faster and more reliable than fetching individual files via raw GitHub URLs. You can `grep`, `diff` between versions, and trace the full execution path.
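+A cheap way to spot removed classes is to diff the file lists of two cloned
+version trees. This sketch fabricates two tiny trees via `mktemp -d` so it runs
+standalone; substitute real clones (e.g. `/tmp/mongo-4.8` vs `/tmp/mongo-4.9`)
+and recurse with `find` in practice:
+```bash
+old=$(mktemp -d); new=$(mktemp -d)
+touch "$old/InsertOperation.java" "$old/FindOperation.java"
+touch "$new/FindOperation.java"
+ls "$old" | sort > /tmp/old-classes.txt
+ls "$new" | sort > /tmp/new-classes.txt
+# comm -23 keeps lines unique to the first (old) list: classes removed in new
+comm -23 /tmp/old-classes.txt /tmp/new-classes.txt
+```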
+
+**Also check for import-time class loading failures:**
+If a plugin helper class (not just the instrumentation class) imports a library class that was removed, the entire helper class will fail to load with `NoClassDefFoundError` at runtime. This silently breaks ALL functionality in that helper — not just the code paths using the removed class. Verify that every `import` statement in plugin support classes resolves to an existing class in the target version.
+
+**Real examples of why this matters:**
+- MongoDB driver 4.9 removed `InsertOperation`, `DeleteOperation`, `UpdateOperation` — `MongoOperationHelper` imported all three, causing the entire class to fail loading with `NoClassDefFoundError`, silently losing ALL `db.bind_vars` tags even for operations that still exist (like `FindOperation`, `AggregateOperation`)
+- MongoDB driver 4.11 removed `Cluster.getDescription()` — the plugin loads (witness classes pass) but crashes at runtime with `NoSuchMethodError`
+- Feign 12.2 moved `ReflectiveFeign$BuildTemplateByResolvingArgs` to `RequestTemplateFactoryResolver$BuildTemplateByResolvingArgs` — the path variable interception silently stops working
+- MariaDB 3.0 renamed every JDBC wrapper class (`MariaDbConnection` → `Connection`) — none of the plugin's `byName` matchers match anything
+
+**When extending `support-version.list` to add newer versions:**
+Before adding a version, verify that every class and method the plugin intercepts still exists in that version's source. A plugin test passing does not mean everything works — it only means the test scenario's specific code path exercised the intercepted methods. Missing interception points may go undetected if the test doesn't cover them.
+
+## Step 2 - Create Plugin Module
+
+### Directory Structure
+
+**SDK plugin** (most common):
+```
+apm-sniffer/apm-sdk-plugin/{framework}-{version}-plugin/
+ pom.xml
+ src/main/java/org/apache/skywalking/apm/plugin/{framework}/v{N}/
+ define/
+ {Target}Instrumentation.java # One per target class
+ {Target}Interceptor.java # One per interception concern
+ {Target}ConstructorInterceptor.java # If intercepting constructors
+ {PluginName}PluginConfig.java # If plugin needs configuration
+ src/main/resources/
+ skywalking-plugin.def # Plugin registration
+ src/test/java/org/apache/skywalking/apm/plugin/{framework}/v{N}/
+ {Target}InterceptorTest.java # Unit tests
+```
+
+**Bootstrap plugin** (for JDK classes):
+```
+apm-sniffer/bootstrap-plugins/{name}-plugin/
+ (same structure, but instrumentation class overrides isBootstrapInstrumentation)
+```
+
+**Optional plugin**:
+```
+apm-sniffer/optional-plugins/{name}-plugin/
+ (same structure as SDK plugin)
+```
+
+### pom.xml Template
+
+```xml
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+
+    <parent>
+        <groupId>org.apache.skywalking</groupId>
+        <artifactId>apm-sdk-plugin</artifactId>
+        <version>${revision}</version>
+    </parent>
+
+    <artifactId>{framework}-{version}-plugin</artifactId>
+    <packaging>jar</packaging>
+
+    <properties>
+        <target-library.version>X.Y.Z</target-library.version>
+    </properties>
+
+    <dependencies>
+        <!-- Target library classes are supplied by the application at runtime -->
+        <dependency>
+            <groupId>com.example</groupId>
+            <artifactId>target-library</artifactId>
+            <version>${target-library.version}</version>
+            <scope>provided</scope>
+        </dependency>
+    </dependencies>
+</project>
+```
+
+**CRITICAL dependency rules:**
+- Target library: **always `provided` scope** (supplied by the application at runtime)
+- `apm-agent-core`: inherited from parent POM as `provided`
+- `apm-util`: inherited from parent POM as `provided`
+- Never bundle target library classes into the plugin JAR
+- If the plugin needs a 3rd-party utility not already in agent-core, discuss with maintainers first
+
+### Register in Parent POM
+
+Add the new module to the parent `pom.xml`:
+```xml
+<modules>
+    ...
+    <module>{framework}-{version}-plugin</module>
+</modules>
+```
+
+## Step 3 - Implement Instrumentation Class (V2 API)
+
+**ALWAYS use V2 API for new plugins.** V1 is legacy.
+
+### Import Rules (Enforced by Checkstyle)
+
+Plugins may ONLY import from:
+- `java.*` - Java standard library
+- `org.apache.skywalking.*` - SkyWalking modules
+- `net.bytebuddy.*` - ByteBuddy (for matchers in instrumentation classes)
+
+**No other 3rd-party imports are allowed in instrumentation/activation files.** This is enforced by `apm-checkstyle/importControl.xml`. Interceptor classes CAN reference target library classes (they're loaded after the target library).
+
+### Class Matching
+
+**CRITICAL: NEVER use `.class` references in instrumentation definitions.** Always use string literals.
+
+```java
+// WRONG - breaks agent if class doesn't exist at runtime
+byName(SomeThirdPartyClass.class.getName())
+takesArgument(0, SomeThirdPartyClass.class)
+
+// CORRECT - safe string literals
+byName("com.example.SomeThirdPartyClass")
+takesArgumentWithType(0, "com.example.SomeThirdPartyClass")
+```
+
+Available ClassMatch types (from `org.apache.skywalking.apm.agent.core.plugin.match`):
+
+| Matcher | Usage | Performance |
+|---------|-------|-------------|
+| `NameMatch.byName(String)` | Exact class name | Best (HashMap lookup) |
+| `MultiClassNameMatch.byMultiClassMatch(String...)` | Multiple exact names | Good |
+| `HierarchyMatch.byHierarchyMatch(String...)` | Implements interface / extends class | Expensive - avoid unless necessary |
+| `ClassAnnotationMatch.byClassAnnotationMatch(String...)` | Has annotation(s) | Moderate |
+| `MethodAnnotationMatch.byMethodAnnotationMatch(String...)` | Has method with annotation | Moderate |
+| `PrefixMatch.nameStartsWith(String...)` | Class name prefix | Moderate |
+| `RegexMatch.byRegexMatch(String...)` | Regex on class name | Expensive |
+| `LogicalMatchOperation.and(match1, match2)` | AND composition | Depends on operands |
+| `LogicalMatchOperation.or(match1, match2)` | OR composition | Depends on operands |
+
+**Prefer `NameMatch.byName()` whenever possible.** It uses a fast HashMap lookup. All other matchers require linear scanning.
+
+### Method Matching (ByteBuddy ElementMatcher API)
+
+Common matchers from `net.bytebuddy.matcher.ElementMatchers`:
+
+```java
+// By name
+named("methodName")
+
+// By argument count
+takesArguments(2)
+takesArguments(0) // no-arg methods
+
+// By argument type (use SkyWalking's helper - string-based, safe)
+import static org.apache.skywalking.apm.agent.core.plugin.bytebuddy.ArgumentTypeNameMatch.takesArgumentWithType;
+takesArgumentWithType(0, "com.example.SomeType") // arg at index 0
+
+// By return type (SkyWalking helper)
+import static org.apache.skywalking.apm.agent.core.plugin.bytebuddy.ReturnTypeNameMatch.returnsWithType;
+returnsWithType("java.util.List")
+
+// By annotation (SkyWalking helper - string-based)
+import static org.apache.skywalking.apm.agent.core.plugin.bytebuddy.AnnotationTypeNameMatch.isAnnotatedWithType;
+isAnnotatedWithType("org.springframework.web.bind.annotation.RequestMapping")
+
+// Visibility
+isPublic()
+isPrivate()
+
+// Composition
+named("execute").and(takesArguments(1))
+named("method1").or(named("method2"))
+not(isDeclaredBy(Object.class))
+
+// Match any (use sparingly)
+any()
+```
+
+### Instrumentation Template - Instance Methods
+
+```java
+package org.apache.skywalking.apm.plugin.xxx.define;
+
+import net.bytebuddy.description.method.MethodDescription;
+import net.bytebuddy.matcher.ElementMatcher;
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.ConstructorInterceptPoint;
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.v2.InstanceMethodsInterceptV2Point;
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.v2.ClassInstanceMethodsEnhancePluginDefineV2;
+import org.apache.skywalking.apm.agent.core.plugin.match.ClassMatch;
+import org.apache.skywalking.apm.agent.core.plugin.match.NameMatch;
+
+import static net.bytebuddy.matcher.ElementMatchers.named;
+
+public class XxxInstrumentation extends ClassInstanceMethodsEnhancePluginDefineV2 {
+
+ private static final String ENHANCE_CLASS = "com.example.TargetClass";
+ private static final String INTERCEPTOR_CLASS = "org.apache.skywalking.apm.plugin.xxx.XxxInterceptor";
+
+ @Override
+ protected ClassMatch enhanceClass() {
+ return NameMatch.byName(ENHANCE_CLASS);
+ }
+
+ @Override
+ public ConstructorInterceptPoint[] getConstructorsInterceptPoints() {
+ return null; // null or empty array if not intercepting constructors
+ }
+
+ @Override
+ public InstanceMethodsInterceptV2Point[] getInstanceMethodsInterceptV2Points() {
+ return new InstanceMethodsInterceptV2Point[] {
+ new InstanceMethodsInterceptV2Point() {
+ @Override
+ public ElementMatcher<MethodDescription> getMethodsMatcher() {
+ return named("targetMethod");
+ }
+
+ @Override
+ public String getMethodsInterceptorV2() {
+ return INTERCEPTOR_CLASS;
+ }
+
+ @Override
+ public boolean isOverrideArgs() {
+ return false;
+ }
+ }
+ };
+ }
+}
+```
+
+### Instrumentation Template - Static Methods
+
+```java
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.StaticMethodsInterceptPoint;
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.v2.ClassStaticMethodsEnhancePluginDefineV2;
+
+public class XxxStaticInstrumentation extends ClassStaticMethodsEnhancePluginDefineV2 {
+ @Override
+ public StaticMethodsInterceptPoint[] getStaticMethodsInterceptPoints() {
+ return new StaticMethodsInterceptPoint[] {
+ new StaticMethodsInterceptPoint() {
+ @Override
+ public ElementMatcher<MethodDescription> getMethodsMatcher() {
+ return named("factoryMethod").and(takesArguments(2));
+ }
+
+ @Override
+ public String getMethodsInterceptor() {
+ return INTERCEPTOR_CLASS;
+ }
+
+ @Override
+ public boolean isOverrideArgs() {
+ return false;
+ }
+ }
+ };
+ }
+
+ @Override
+ protected ClassMatch enhanceClass() {
+ return NameMatch.byName(ENHANCE_CLASS);
+ }
+}
+```
+
+### Instrumentation Template - Both Instance + Static Methods
+
+Extend `ClassEnhancePluginDefineV2` and implement all four methods:
+- `enhanceClass()`
+- `getConstructorsInterceptPoints()`
+- `getInstanceMethodsInterceptV2Points()`
+- `getStaticMethodsInterceptPoints()`
+
+### Matching Interface/Abstract Class Implementations
+
+Use `HierarchyMatch` when you need to intercept all implementations of an interface:
+
+```java
+import org.apache.skywalking.apm.agent.core.plugin.match.HierarchyMatch;
+
+@Override
+protected ClassMatch enhanceClass() {
+ return HierarchyMatch.byHierarchyMatch("com.example.SomeInterface");
+}
+```
+
+**When to use HierarchyMatch:**
+- The library has an interface/abstract class with multiple implementations
+- You cannot enumerate all implementation class names
+- Example: intercepting all `javax.servlet.Servlet` implementations
+
+**Performance warning:** HierarchyMatch checks every loaded class against the hierarchy. Prefer `NameMatch` or `MultiClassNameMatch` if you know the concrete class names.
+
+**Combining with other matchers:**
+```java
+import org.apache.skywalking.apm.agent.core.plugin.match.logical.LogicalMatchOperation;
+
+@Override
+protected ClassMatch enhanceClass() {
+ return LogicalMatchOperation.and(
+ PrefixMatch.nameStartsWith("com.example"),
+ HierarchyMatch.byHierarchyMatch("java.lang.Runnable")
+ );
+}
+```
+
+### Witness Classes for Version Detection
+
+Override `witnessClasses()` or `witnessMethods()` to activate the plugin only for specific library versions:
+
+```java
+@Override
+protected String[] witnessClasses() {
+ // Plugin only loads if this class exists in the application
+ return new String[] {"com.example.VersionSpecificClass"};
+}
+
+@Override
+protected List<WitnessMethod> witnessMethods() {
+ return Collections.singletonList(
+ new WitnessMethod("com.example.SomeClass", ElementMatchers.named("methodAddedInV2"))
+ );
+}
+```
+
+### Bootstrap Plugin Override
+
+For bootstrap plugins (instrumenting JDK classes), add:
+```java
+@Override
+public boolean isBootstrapInstrumentation() {
+ return true;
+}
+```
+
+**Bootstrap plugin rules:**
+- Only for JDK core classes (java.*, javax.*, sun.*)
+- Minimal interception scope (performance-critical paths)
+- Extra care with class loading (bootstrap classloader visibility)
+- Test with `runningMode: with_bootstrap`
+
+## Step 4 - Implement Interceptor Class (V2 API)
+
+### Available Interceptor Interfaces
+
+| Interface | Use Case |
+|-----------|----------|
+| `InstanceMethodsAroundInterceptorV2` | Instance method interception |
+| `StaticMethodsAroundInterceptorV2` | Static method interception |
+| `InstanceConstructorInterceptor` | Constructor interception (shared V1/V2) |
+
+### Core APIs Available in Interceptors
+
+**ContextManager** - Central tracing API (ThreadLocal-based):
+
+**CRITICAL threading rule:** All span lifecycle APIs (`createEntrySpan`, `createExitSpan`, `createLocalSpan`, `activeSpan`, `stopSpan`) operate on a **per-thread context via ThreadLocal**. By default, `createXxxSpan` and `stopSpan` MUST be called in the **same thread**. There are only two ways to work across threads:
+1. **`ContextSnapshot` (capture/continued)** — snapshot the context in thread A, then `continued()` in thread B to link a NEW span in thread B back to the parent trace. Each thread manages its own span lifecycle independently.
+2. **Async mode (`prepareForAsync`/`asyncFinish`)** — keeps a single span alive beyond the creating thread. Call `prepareForAsync()` in the original thread (before `stopSpan`), then `asyncFinish()` from any thread when the async work completes. Between `prepareForAsync` and `asyncFinish`, you may call tag/log/error on the span from any thread, but you must NOT call `ContextManager.stopSpan()` for that span again.
+
+```java
+import org.apache.skywalking.apm.agent.core.context.ContextManager;
+
+// Create spans (must stopSpan in the SAME thread, unless async mode)
+AbstractSpan span = ContextManager.createEntrySpan(operationName, contextCarrier);
+AbstractSpan span = ContextManager.createLocalSpan(operationName);
+AbstractSpan span = ContextManager.createExitSpan(operationName, contextCarrier, remotePeer);
+AbstractSpan span = ContextManager.createExitSpan(operationName, remotePeer);
+
+// Span lifecycle (same thread as create, unless async mode)
+ContextManager.activeSpan(); // Get current span in THIS thread
+ContextManager.stopSpan(); // Stop current span in THIS thread
+ContextManager.isActive(); // Check if context exists in THIS thread
+
+// Cross-process propagation (inject/extract ContextCarrier into headers/metadata)
+ContextManager.inject(carrier); // Inject into outgoing carrier
+ContextManager.extract(carrier); // Extract from incoming carrier
+
+// Cross-thread propagation (ContextSnapshot — link spans across threads)
+ContextManager.capture(); // Capture snapshot in originating thread
+ContextManager.continued(snapshot); // Continue from snapshot in receiving thread
+
+// Trace metadata
+ContextManager.getGlobalTraceId();
+ContextManager.getSegmentId();
+ContextManager.getSpanId();
+```
+
+**AbstractSpan** - Span configuration:
+```java
+span.setComponent(ComponentsDefine.YOUR_COMPONENT); // Required for Entry/Exit
+span.setLayer(SpanLayer.HTTP); // Required for Entry/Exit
+span.setOperationName("GET:/api/users");
+span.setPeer("host:port"); // Required for Exit spans
+
+// Tags
+span.tag(Tags.URL, url);
+span.tag(Tags.HTTP_RESPONSE_STATUS_CODE, statusCode);
+span.tag(Tags.DB_TYPE, "sql");
+span.tag(Tags.DB_STATEMENT, sql);
+span.tag(Tags.ofKey("custom.key"), value);
+
+// Error handling
+span.errorOccurred();
+span.log(throwable);
+
+// Async support
+span.prepareForAsync(); // Must call in original thread
+span.asyncFinish(); // Call in async thread when done
+```
+
+**SpanLayer** values: `DB`, `RPC_FRAMEWORK`, `HTTP`, `MQ`, `CACHE`, `GEN_AI`
+
+**Standard Tags** (from `org.apache.skywalking.apm.agent.core.context.tag.Tags`):
+
+| Tag | Constant | Purpose |
+|---------------------|----------------------------------|--------------------------|
+| `url` | `Tags.URL` | Request URL |
+| `http.status_code` | `Tags.HTTP_RESPONSE_STATUS_CODE` | HTTP status (IntegerTag) |
+| `http.method` | `Tags.HTTP.METHOD` | HTTP method |
+| `db.type` | `Tags.DB_TYPE` | Database type |
+| `db.instance` | `Tags.DB_INSTANCE` | Database name |
+| `db.statement` | `Tags.DB_STATEMENT` | SQL/query |
+| `db.bind_variables` | `Tags.DB_BIND_VARIABLES` | Bound params |
+| `mq.queue` | `Tags.MQ_QUEUE` | Queue name |
+| `mq.topic` | `Tags.MQ_TOPIC` | Topic name |
+| `mq.broker` | `Tags.MQ_BROKER` | Broker address |
+| `cache.type` | `Tags.CACHE_TYPE` | Cache type |
+| `cache.op` | `Tags.CACHE_OP` | "read" or "write" |
+| `cache.cmd` | `Tags.CACHE_CMD` | Cache command |
+| `cache.key` | `Tags.CACHE_KEY` | Cache key |
+| Custom | `Tags.ofKey("key")` | Any custom tag |
+
+**EnhancedInstance** - Dynamic field for cross-interceptor data:
+```java
+// Store data (e.g., in constructor interceptor)
+objInst.setSkyWalkingDynamicField(connectionInfo);
+
+// Retrieve data (e.g., in method interceptor)
+ConnectionInfo info = (ConnectionInfo) objInst.getSkyWalkingDynamicField();
+```
+
+**Logging** - Agent-internal logging (NOT application logging):
+```java
+import org.apache.skywalking.apm.agent.core.logging.api.ILog;
+import org.apache.skywalking.apm.agent.core.logging.api.LogManager;
+
+private static final ILog LOGGER = LogManager.getLogger(MyInterceptor.class);
+LOGGER.info("message: {}", value);
+LOGGER.error("error", throwable);
+```
+
+**MeterFactory** - For meter plugins:
+```java
+import org.apache.skywalking.apm.toolkit.meter.MeterFactory;
+import org.apache.skywalking.apm.toolkit.meter.Counter;
+import org.apache.skywalking.apm.toolkit.meter.Gauge;
+import org.apache.skywalking.apm.toolkit.meter.Histogram;
+
+Counter counter = MeterFactory.counter("metric_name")
+ .tag("key", "value")
+ .mode(Counter.Mode.INCREMENT)
+ .build();
+counter.increment(1.0);
+
+Gauge gauge = MeterFactory.gauge("metric_name", () -> pool.getActiveCount())
+ .tag("pool_name", name)
+ .build();
+
+Histogram histogram = MeterFactory.histogram("metric_name")
+ .steps(Arrays.asList(10.0, 50.0, 100.0, 500.0))
+ .minValue(0)
+ .build();
+histogram.addValue(latencyMs);
+```
+
+### Interceptor Template - ExitSpan (Client/Producer)
+
+```java
+package org.apache.skywalking.apm.plugin.xxx;
+
+import java.lang.reflect.Method;
+import org.apache.skywalking.apm.agent.core.context.CarrierItem;
+import org.apache.skywalking.apm.agent.core.context.ContextCarrier;
+import org.apache.skywalking.apm.agent.core.context.ContextManager;
+import org.apache.skywalking.apm.agent.core.context.tag.Tags;
+import org.apache.skywalking.apm.agent.core.context.trace.AbstractSpan;
+import org.apache.skywalking.apm.agent.core.context.trace.SpanLayer;
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.v2.InstanceMethodsAroundInterceptorV2;
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.v2.MethodInvocationContext;
+import org.apache.skywalking.apm.network.trace.component.ComponentsDefine;
+
+public class XxxClientInterceptor implements InstanceMethodsAroundInterceptorV2 {
+
+ @Override
+ public void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments,
+ Class<?>[] argumentsTypes, MethodInvocationContext context) throws Throwable {
+ // 1. Build peer address from stored connection info
+ String remotePeer = (String) objInst.getSkyWalkingDynamicField();
+
+ // 2. Create ExitSpan with ContextCarrier for cross-process propagation
+ ContextCarrier contextCarrier = new ContextCarrier();
+ AbstractSpan span = ContextManager.createExitSpan("operation/name", contextCarrier, remotePeer);
+ span.setComponent(ComponentsDefine.YOUR_COMPONENT);
+ SpanLayer.asHttp(span); // or asDB, asMQ, asRPCFramework, asCache
+
+ // 3. Inject trace context into outgoing request headers
+ // The request object is typically one of the method arguments
+ CarrierItem next = contextCarrier.items();
+ while (next.hasNext()) {
+ next = next.next();
+ // Set header on the outgoing request:
+ // request.setHeader(next.getHeadKey(), next.getHeadValue());
+ }
+
+ // 4. Set tags
+ Tags.URL.set(span, url);
+
+ // 5. Store span in context for afterMethod
+ context.setContext(span);
+ }
+
+ @Override
+ public Object afterMethod(EnhancedInstance objInst, Method method, Object[] allArguments,
+ Class<?>[] argumentsTypes, Object ret, MethodInvocationContext context) throws Throwable {
+ // Check response status, set tags/errors
+ AbstractSpan span = (AbstractSpan) context.getContext();
+ // Example: Tags.HTTP_RESPONSE_STATUS_CODE.set(span, statusCode);
+ // if (statusCode >= 400) span.errorOccurred();
+
+ ContextManager.stopSpan();
+ return ret;
+ }
+
+ @Override
+ public void handleMethodException(EnhancedInstance objInst, Method method, Object[] allArguments,
+ Class<?>[] argumentsTypes, Throwable t, MethodInvocationContext context) {
+ ContextManager.activeSpan().log(t);
+ ContextManager.activeSpan().errorOccurred();
+ }
+}
+```
+
+### Interceptor Template - EntrySpan (Server/Consumer)
+
+```java
+public class XxxServerInterceptor implements InstanceMethodsAroundInterceptorV2 {
+
+ @Override
+ public void beforeMethod(EnhancedInstance objInst, Method method, Object[] allArguments,
+ Class<?>[] argumentsTypes, MethodInvocationContext context) throws Throwable {
+ // 1. Extract trace context from incoming request headers
+ ContextCarrier contextCarrier = new ContextCarrier();
+ CarrierItem next = contextCarrier.items();
+ while (next.hasNext()) {
+ next = next.next();
+ // Read header from incoming request:
+ // next.setHeadValue(request.getHeader(next.getHeadKey()));
+ }
+
+ // 2. Create EntrySpan (extracts context automatically)
+ AbstractSpan span = ContextManager.createEntrySpan("operation/name", contextCarrier);
+ span.setComponent(ComponentsDefine.YOUR_COMPONENT);
+ SpanLayer.asHttp(span); // or asMQ, asRPCFramework
+ span.setPeer(clientAddress); // Optional: client address
+
+ context.setContext(span);
+ }
+
+ @Override
+ public Object afterMethod(EnhancedInstance objInst, Method method, Object[] allArguments,
+ Class<?>[] argumentsTypes, Object ret, MethodInvocationContext context) throws Throwable {
+ ContextManager.stopSpan();
+ return ret;
+ }
+
+ @Override
+ public void handleMethodException(EnhancedInstance objInst, Method method, Object[] allArguments,
+ Class<?>[] argumentsTypes, Throwable t, MethodInvocationContext context) {
+ ContextManager.activeSpan().log(t);
+ ContextManager.activeSpan().errorOccurred();
+ }
+}
+```
+
+### Interceptor Template - Constructor (Store Connection Info)
+
+```java
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.InstanceConstructorInterceptor;
+
+public class XxxConstructorInterceptor implements InstanceConstructorInterceptor {
+
+ @Override
+ public void onConstruct(EnhancedInstance objInst, Object[] allArguments) {
+ // Store connection info for later use by method interceptors
+ String host = (String) allArguments[0];
+ int port = (int) allArguments[1];
+ objInst.setSkyWalkingDynamicField(host + ":" + port);
+ }
+}
+```
+
+### Cross-Thread Context Propagation (ContextSnapshot)
+
+Use `ContextSnapshot` when the library dispatches work to another thread and you want the new thread's spans to be linked to the parent trace. Each thread creates and stops its OWN spans — the snapshot only provides the link.
+
+```java
+// Thread A (originating thread) — create span, capture snapshot, stop span (all same thread)
+@Override
+public void beforeMethod(..., MethodInvocationContext context) {
+ AbstractSpan span = ContextManager.createLocalSpan("async/dispatch");
+ // Capture context snapshot BEFORE handing off to another thread
+ ContextSnapshot snapshot = ContextManager.capture();
+ // Store snapshot on the task object via EnhancedInstance dynamic field
+ ((EnhancedInstance) allArguments[0]).setSkyWalkingDynamicField(snapshot);
+ ContextManager.stopSpan(); // Stop span in THIS thread (same thread as create)
+}
+
+// Thread B (receiving thread) — create its OWN span, link to parent via continued()
+@Override
+public void beforeMethod(EnhancedInstance objInst, ...) {
+ ContextSnapshot snapshot = (ContextSnapshot) objInst.getSkyWalkingDynamicField();
+ if (snapshot != null) {
+ AbstractSpan span = ContextManager.createLocalSpan("async/execute");
+ ContextManager.continued(snapshot); // Link this span to the parent trace
+ }
+}
+
+@Override
+public Object afterMethod(...) {
+ if (ContextManager.isActive()) {
+ ContextManager.stopSpan(); // Stop span in THIS thread (same thread as create)
+ }
+ return ret;
+}
+```
+
+### Async Span Pattern (prepareForAsync / asyncFinish)
+
+Use this when a **single span** needs to stay open across thread boundaries — e.g., an ExitSpan created before an async call, finished when the callback fires in another thread. The key difference from ContextSnapshot: here one span lives across threads instead of each thread having its own span.
+
+```java
+// Thread A — create span, mark async, stop context (all same thread)
+@Override
+public void beforeMethod(..., MethodInvocationContext context) {
+ AbstractSpan span = ContextManager.createExitSpan("async/call", remotePeer);
+ span.setComponent(ComponentsDefine.YOUR_COMPONENT);
+ SpanLayer.asHttp(span);
+
+ span.prepareForAsync(); // Mark: this span will finish in another thread
+ ContextManager.stopSpan(); // Detach from THIS thread's context (required, same thread as create)
+
+ // Store span reference on the callback object's dynamic field
+ ((EnhancedInstance) callback).setSkyWalkingDynamicField(span);
+}
+
+// Thread B (callback/completion handler) — finish the async span
+@Override
+public void beforeMethod(EnhancedInstance objInst, ...) {
+ AbstractSpan span = (AbstractSpan) objInst.getSkyWalkingDynamicField();
+ if (span != null) {
+ // Add response info to the span (tag/log/error are thread-safe after prepareForAsync)
+ span.tag(Tags.HTTP_RESPONSE_STATUS_CODE, statusCode);
+ if (isError) span.errorOccurred();
+ span.asyncFinish(); // Must match prepareForAsync count
+ }
+}
+```
+
+### Plugin Configuration (Optional)
+
+```java
+import org.apache.skywalking.apm.agent.core.boot.PluginConfig;
+
+public class XxxPluginConfig {
+ public static class Plugin {
+ @PluginConfig(root = XxxPluginConfig.class)
+ public static class Xxx {
+ // Config key: plugin.xxx.trace_param
+ public static boolean TRACE_PARAM = false;
+ // Config key: plugin.xxx.max_length
+ public static int MAX_LENGTH = 256;
+ }
+ }
+}
+```
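The nested-class-to-key mapping is purely conventional: each nesting level name and the field name are lowercased and joined with dots (`Plugin.Xxx.TRACE_PARAM` → `plugin.xxx.trace_param`). A minimal sketch of that derivation — illustrative only, `ConfigKeySketch`/`configKey` are hypothetical helpers, not the agent's actual config resolver:

```java
// Hypothetical sketch of the config-key naming convention used by @PluginConfig
// classes. The real resolution lives in agent-core; this only illustrates the rule.
public class ConfigKeySketch {
    // e.g. classPath = {"Plugin", "Xxx"}, field = "TRACE_PARAM" -> "plugin.xxx.trace_param"
    public static String configKey(String[] classPath, String field) {
        StringBuilder key = new StringBuilder();
        for (String part : classPath) {
            key.append(part.toLowerCase()).append('.'); // each nesting level, lowercased
        }
        key.append(field.toLowerCase()); // field name, lowercased (underscores preserved)
        return key.toString();
    }
}
```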
+
+## Step 5 - Register Plugin
+
+Create `src/main/resources/skywalking-plugin.def`:
+```
+plugin-name=org.apache.skywalking.apm.plugin.xxx.define.XxxInstrumentation
+plugin-name=org.apache.skywalking.apm.plugin.xxx.define.XxxOtherInstrumentation
+```
+
+Format: `{plugin-id}={fully.qualified.InstrumentationClassName}` (one line per instrumentation class, all sharing the same plugin-id prefix).
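The format is simple enough to sketch: one `id=class` pair per non-comment line. The following illustrative parser is a hypothetical helper for understanding the format, not the agent's actual plugin loader:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the skywalking-plugin.def line format:
// "{plugin-id}={fully.qualified.InstrumentationClassName}", one pair per line.
public class PluginDefSketch {
    public static List<String[]> parse(String content) {
        List<String[]> entries = new ArrayList<>();
        for (String line : content.split("\n")) {
            line = line.trim();
            if (line.isEmpty() || line.startsWith("#")) {
                continue; // blank lines and comments are skipped
            }
            int eq = line.indexOf('=');
            if (eq < 0) {
                continue; // malformed line: no separator
            }
            entries.add(new String[] {line.substring(0, eq), line.substring(eq + 1)});
        }
        return entries;
    }
}
```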
+
+## Step 6 - Write Unit Tests
+
+### Test Setup Pattern
+
+```java
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.mockito.Mock;
+import org.mockito.junit.MockitoJUnit;
+import org.mockito.junit.MockitoRule;
+import org.apache.skywalking.apm.agent.core.plugin.interceptor.enhance.EnhancedInstance;
+import org.apache.skywalking.apm.agent.test.tools.AgentServiceRule;
+import org.apache.skywalking.apm.agent.test.tools.SegmentStorage;
+import org.apache.skywalking.apm.agent.test.tools.SegmentStoragePoint;
+import org.apache.skywalking.apm.agent.test.tools.TracingSegmentRunner;
+import org.apache.skywalking.apm.agent.test.helper.SegmentHelper;
+import org.apache.skywalking.apm.agent.core.context.trace.TraceSegment;
+import org.apache.skywalking.apm.agent.core.context.trace.AbstractTracingSpan;
+import java.util.List;
+import static org.hamcrest.CoreMatchers.is;
+import static org.junit.Assert.assertThat;
+
+@RunWith(TracingSegmentRunner.class)
+public class XxxInterceptorTest {
+
+ @SegmentStoragePoint
+ private SegmentStorage segmentStorage;
+
+ @Rule
+ public AgentServiceRule agentServiceRule = new AgentServiceRule();
+
+ @Rule
+ public MockitoRule rule = MockitoJUnit.rule();
+
+ @Mock
+ private EnhancedInstance enhancedInstance;
+
+ private XxxInterceptor interceptor;
+
+ @Before
+ public void setUp() {
+ interceptor = new XxxInterceptor();
+ // Setup mocks
+ }
+
+ @Test
+ public void testNormalRequest() throws Throwable {
+ // Arrange
+ Object[] allArguments = new Object[] { /* mock args */ };
+ Class[] argumentsTypes = new Class[] { /* arg types */ };
+
+ // Act
+ interceptor.beforeMethod(enhancedInstance, null, allArguments, argumentsTypes, null);
+ interceptor.afterMethod(enhancedInstance, null, allArguments, argumentsTypes, mockResponse, null);
+
+ // Assert spans
+ assertThat(segmentStorage.getTraceSegments().size(), is(1));
+ TraceSegment segment = segmentStorage.getTraceSegments().get(0);
+ List<AbstractTracingSpan> spans = SegmentHelper.getSpans(segment);
+ assertThat(spans.size(), is(1));
+ // Verify span properties...
+ }
+}
+```
+
+## Step 7 - Write E2E Plugin Tests
+
+### Test Scenario Structure
+
+```
+test/plugin/scenarios/{framework}-{version}-scenario/
+ bin/startup.sh
+ config/expectedData.yaml
+ src/main/java/org/apache/skywalking/apm/testcase/{framework}/
+ controller/CaseController.java # HTTP endpoints
+ pom.xml
+ configuration.yml
+ support-version.list
+```
+
+**When copying an existing scenario to create a new one**, update the scenario name in ALL of these files:
+- `pom.xml` — `artifactId`, `name`, `finalName`
+- `src/main/assembly/assembly.xml` — JAR filename reference
+- `bin/startup.sh` — JAR filename in java -jar command
+- `config/expectedData.yaml` — `serviceName` field AND `parentService` in refs (but NOT URL paths — those are the app context path)
+- `support-version.list` — new versions
+- For JDK 17+ scenarios: update `compiler.version` to `17`, `spring.boot.version` to `3.x`, change `javax.annotation` imports to `jakarta.annotation` in Java source
+- Add the scenario to the appropriate CI workflow (`plugins-test.*.yaml` for JDK 8, `plugins-jdk17-test.*.yaml` for JDK 17)
+
+### configuration.yml
+
+```yaml
+type: jvm
+entryService: http://localhost:8080/{scenario-name}/case/{endpoint}
+healthCheck: http://localhost:8080/{scenario-name}/case/healthCheck
+startScript: ./bin/startup.sh
+environment: []
+dependencies: {}
+```
+
+### expectedData.yaml for Tracing
+
+```yaml
+segmentItems:
+ - serviceName: {scenario-name}
+ segmentSize: ge 1
+ segments:
+ - segmentId: not null
+ spans:
+ - operationName: your/operation
+ parentSpanId: -1
+ spanId: 0
+ spanLayer: Http # Http, DB, RPCFramework, MQ, Cache
+ spanType: Exit # Entry, Exit, Local
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 2 # Must match ComponentsDefine ID
+ isError: false
+ peer: 'host:port'
+ skipAnalysis: 'false'
+ tags:
+ - {key: url, value: not null}
+ - {key: http.method, value: GET}
+ logs: []
+ refs: []
+```
+
+### expectedData.yaml for Meters
+
+```yaml
+meterItems:
+ - serviceName: {scenario-name}
+ meterSize: ge 1
+ meters:
+ - meterId:
+ name: your_counter_name
+ tags:
+ - {name: tag_key, value: tag_value}
+ singleValue: gt 0
+```
+
+### Assertion Operators
+
+| Operator | Meaning |
+|----------|---------|
+| `eq VALUE` | Equals |
+| `nq VALUE` | Not equals |
+| `ge VALUE` | Greater or equal |
+| `gt VALUE` | Greater than |
+| `not null` | Must be present |
+| `null` | Must be absent |
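The operator semantics can be sketched as follows — a hypothetical `matches` helper for illustration only, not the real validator inside the plugin-test framework (which also handles types beyond strings/longs):

```java
// Hypothetical sketch of the expectedData.yaml assertion operators.
// Numeric operators are assumed to compare long values; a bare literal means exact match.
public class OperatorSketch {
    public static boolean matches(String expected, String actual) {
        if (expected.equals("not null")) {
            return actual != null && !actual.isEmpty();
        }
        if (expected.equals("null")) {
            return actual == null || actual.isEmpty();
        }
        if (actual == null) {
            return false;
        }
        String[] parts = expected.split(" ", 2);
        if (parts.length == 2) {
            switch (parts[0]) {
                case "eq": return actual.equals(parts[1]);
                case "nq": return !actual.equals(parts[1]);
                case "ge": return Long.parseLong(actual) >= Long.parseLong(parts[1]);
                case "gt": return Long.parseLong(actual) > Long.parseLong(parts[1]);
            }
        }
        return expected.equals(actual); // plain literal: exact match
    }
}
```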
+
+### Running Tests
+
+```bash
+bash ./test/plugin/run.sh -f {scenario-name}
+bash ./test/plugin/run.sh --debug {scenario-name} # Save actualData.yaml for debugging
+```
+
+**IMPORTANT: Run E2E test scenarios ONE AT A TIME.** Multiple scenarios use the same Docker ports (8080, etc.) and will conflict if run in parallel. Always wait for one scenario to finish before starting the next.
+
+## Step 8 - Shading (Package Relocation)
+
+Plugins automatically inherit ByteBuddy shading from the parent POM:
+```xml
+<relocation>
+    <pattern>net.bytebuddy</pattern>
+    <shadedPattern>org.apache.skywalking.apm.dependencies.net.bytebuddy</shadedPattern>
+</relocation>
+```
+
+The agent-core module handles shading of all core dependencies:
+- `com.google.*` -> `org.apache.skywalking.apm.dependencies.com.google.*`
+- `io.grpc.*` -> `org.apache.skywalking.apm.dependencies.io.grpc.*`
+- `io.netty.*` -> `org.apache.skywalking.apm.dependencies.io.netty.*`
+- `org.slf4j.*` -> `org.apache.skywalking.apm.dependencies.org.slf4j.*`
+
+**Plugins should NOT add their own shade configurations** unless they need to bundle a library not in agent-core (rare, requires maintainer approval). If a plugin needs a reporter-level dependency, see `optional-reporter-plugins/kafka-reporter-plugin/pom.xml` for the pattern.
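The relocations above amount to a prefix rewrite of class names under the shaded packages. A minimal sketch of the effect (illustrative — `RelocationSketch` is hypothetical; the actual rewriting is done at build time by the maven-shade-plugin):

```java
// Hypothetical sketch of what shading relocation does to class names at build time.
public class RelocationSketch {
    private static final String SHADE_PREFIX = "org.apache.skywalking.apm.dependencies.";
    private static final String[] RELOCATED = {
        "net.bytebuddy.", "com.google.", "io.grpc.", "io.netty.", "org.slf4j."
    };

    public static String relocate(String className) {
        for (String pkg : RELOCATED) {
            if (className.startsWith(pkg)) {
                return SHADE_PREFIX + className; // shaded: prefix prepended
            }
        }
        return className; // not a shaded package: unchanged
    }
}
```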
+
+## Step 9 - Code Style Checklist
+
+Before submitting:
+- [ ] No `System.out.println` (use `ILog` from `LogManager`)
+- [ ] No `@author` tags (ASF policy)
+- [ ] No Chinese characters in source
+- [ ] No tab characters (use 4 spaces)
+- [ ] No star imports (`import xxx.*`)
+- [ ] No unused imports
+- [ ] `@Override` on all overridden methods
+- [ ] Apache 2.0 license header on all source files
+- [ ] File length under 3000 lines
+- [ ] Constants in `UPPER_SNAKE_CASE`
+- [ ] Types in `PascalCase`, variables in `camelCase`
+- [ ] Imports only from `java.*`, `org.apache.skywalking.*`, `net.bytebuddy.*` (in instrumentation files)
+- [ ] Target library dependencies in `provided` scope
+- [ ] Using V2 API for new plugins
+- [ ] String literals (not `.class` references) in instrumentation definitions
+- [ ] `skywalking-plugin.def` registered
+- [ ] Module added to parent POM
+
+## Step 10 - Update Documentation
+
+After verifying the plugin works (locally or via CI), update these documentation files:
+
+**1. `docs/en/setup/service-agent/java-agent/Supported-list.md`**
+Update the version range for the relevant entry. Format example:
+```
+ * [MongoDB Java Driver](https://github.com/mongodb/mongo-java-driver) 2.13-2.14, 3.4.0-3.12.7, 4.0.0-4.10.2
+```
+For new plugins, add a new bullet under the appropriate category.
+
+**2. `CHANGES.md`**
+Add a changelog entry under the current unreleased version section. Format:
+```
+* Extend {plugin-name} plugin to support {library} {version-range}.
+```
+Or for new plugins:
+```
+* Add {framework} {version} plugin.
+```
+
+**3. `test/plugin/scenarios/{scenario}/support-version.list`**
+Add verified versions. Only include the **latest patch version for each minor version** — do not list every patch release.
+
+The version list supports **extra Maven properties** per version line using comma-separated `key=value` pairs:
+```
+# Simple version (default pom properties)
+2.3.10.RELEASE
+
+# Version with overridden Maven properties
+2.7.14,spring.boot.version=2.5.15
+2.8.11,spring.boot.version=2.7.18
+3.1.4,spring.boot.version=3.2.12
+```
+
+This is useful when different framework versions need different dependency versions (e.g., spring-kafka minor versions require matching Spring Boot versions). The extra properties are passed as `-D` flags to Maven during the scenario build.
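The line format splits on commas: the first field is the framework version, each remaining `key=value` pair becomes a `-D` flag. A sketch of that split (illustrative — `VersionLineSketch` is a hypothetical helper; the real parsing is shell code in `test/plugin/run.sh`):

```java
// Hypothetical sketch of how a support-version.list line such as
// "2.7.14,spring.boot.version=2.5.15" splits into a version plus Maven -D flags.
public class VersionLineSketch {
    public static String[] toMavenArgs(String line) {
        String[] parts = line.split(",");
        String[] args = new String[parts.length];
        args[0] = parts[0].trim(); // the framework version itself
        for (int i = 1; i < parts.length; i++) {
            args[i] = "-D" + parts[i].trim(); // extra property -> Maven -D flag
        }
        return args;
    }
}
```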
+
+**IMPORTANT:** Maven `-D` overrides work for `<properties>` defaults and direct `${prop}` references, but do NOT override BOM versions resolved via `<dependencyManagement>` imports. If a BOM version needs to change, set the default in the pom to the highest needed version, not the lowest.
+
+**Spring Boot / Spring Kafka compatibility mapping** (for reference):
+- spring-kafka 2.3-2.6 → Spring Boot 2.3 (default)
+- spring-kafka 2.7 → Spring Boot 2.5 (2.6+ autoconfigure requires `CommonErrorHandler` from spring-kafka 2.8)
+- spring-kafka 2.8-2.9 → Spring Boot 2.7
+- spring-kafka 3.0 → Spring Boot 3.0
+- spring-kafka 3.1-3.3 → Spring Boot 3.2 (requires Spring Framework 6.1)
+
+**This step is mandatory.** Documentation updates are part of the PR requirements.
+
+## Quick Reference - Plugin Type Decision Tree
+
+```
+Is the target a JDK core class (java.*, sun.*)?
+ YES -> Bootstrap plugin (isBootstrapInstrumentation = true)
+ NO -> Is it commonly used and should be auto-activated?
+ YES -> SDK plugin (apm-sdk-plugin/)
+ NO -> Optional plugin (optional-plugins/)
+
+What span type?
+ Receiving requests (HTTP server, MQ consumer, RPC provider) -> EntrySpan
+ Making outgoing calls (HTTP client, DB, cache, MQ producer) -> ExitSpan
+ Internal processing (annotation, local logic) -> LocalSpan
+
+Need cross-process propagation?
+ YES -> Use ContextCarrier (inject on exit, extract on entry)
+ NO -> No carrier needed
+
+Need cross-thread propagation?
+ YES -> Use ContextSnapshot (capture + continued) OR prepareForAsync/asyncFinish
+ NO -> Standard span lifecycle
+
+Need metrics without traces?
+ YES -> Meter plugin (Counter/Gauge/Histogram via MeterFactory)
+
+Need both traces and metrics?
+ YES -> Separate interceptors or combine in same interceptor
+```
\ No newline at end of file
diff --git a/.github/workflows/plugins-jdk17-test.0.yaml b/.github/workflows/plugins-jdk17-test.0.yaml
index 74ae79f0fe..49b3675c01 100644
--- a/.github/workflows/plugins-jdk17-test.0.yaml
+++ b/.github/workflows/plugins-jdk17-test.0.yaml
@@ -81,6 +81,9 @@ jobs:
- jetty-10.x-scenario
- spring-ai-1.x-scenario
- spring-rabbitmq-scenario
+ - graphql-20plus-scenario
+ - spring-kafka-3.x-scenario
+ - undertow-2.3.x-scenario
steps:
- uses: actions/checkout@v2
with:
diff --git a/.github/workflows/plugins-test.3.yaml b/.github/workflows/plugins-test.3.yaml
index 3d5880fd9f..36ad8289a0 100644
--- a/.github/workflows/plugins-test.3.yaml
+++ b/.github/workflows/plugins-test.3.yaml
@@ -69,6 +69,7 @@ jobs:
case:
- aerospike-scenario
- mysql-scenario
+ - mysql-9.x-scenario
- undertow-scenario
- webflux-scenario
- zookeeper-scenario
diff --git a/CHANGES.md b/CHANGES.md
index e49c314aab..f55806ecda 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -9,6 +9,14 @@ Release Notes.
* Add Spring AI 1.x plugin and GenAI layer.
* Fix httpclient-5.x plugin injecting sw8 propagation headers into ClickHouse HTTP requests (port 8123), causing HTTP 400. Add `PROPAGATION_EXCLUDE_PORTS` config to skip tracing (including header injection) for specified ports in the classic client interceptor.
* Add Spring RabbitMQ 2.x - 4.x plugin.
+* Extend MySQL plugin to support MySQL Connector/J 8.4.0 and 9.x (9.0 -> 9.6).
+* Extend MariaDB plugin to support MariaDB Connector/J 2.7.x.
+* Extend MongoDB 4.x plugin to support MongoDB Java Driver 4.2 -> 4.10. Fix db.bind_vars extraction for driver 4.9+ where InsertOperation/DeleteOperation/UpdateOperation classes were removed.
+* Extend Feign plugin to support OpenFeign 10.x, 11.x, 12.1.
+* Extend Undertow plugin to support Undertow 2.1.x, 2.2.x, 2.3.x.
+* Extend GraphQL plugin to support graphql-java 18 -> 24 (20+ requires JDK 17).
+* Extend Spring Kafka plugin to support Spring Kafka 2.4 -> 2.9 and 3.0 -> 3.3.
+* Enhance test/plugin/run.sh to support extra Maven properties per version in support-version.list (format: version,key=value).
All issues and pull requests are [here](https://github.com/apache/skywalking/milestone/249?closed=1)
diff --git a/apm-sniffer/apm-sdk-plugin/CLAUDE.md b/apm-sniffer/apm-sdk-plugin/CLAUDE.md
index a0c73a2a04..eb88c1cbe1 100644
--- a/apm-sniffer/apm-sdk-plugin/CLAUDE.md
+++ b/apm-sniffer/apm-sdk-plugin/CLAUDE.md
@@ -292,6 +292,10 @@ dependencies: # External services (docker-compose
4.3.6
4.4.1
4.5.0
+
+# Optional: extra Maven properties per version (comma-separated key=value)
+# Useful when different versions need different dependency versions
+2.7.14,spring.boot.version=2.5.15
```
**expectedData.yaml:**
diff --git a/apm-sniffer/apm-sdk-plugin/mongodb-4.x-plugin/src/main/java/org/apache/skywalking/apm/plugin/mongodb/v4/support/LegacyOperationHelper.java b/apm-sniffer/apm-sdk-plugin/mongodb-4.x-plugin/src/main/java/org/apache/skywalking/apm/plugin/mongodb/v4/support/LegacyOperationHelper.java
new file mode 100644
index 0000000000..3c3a19a7b7
--- /dev/null
+++ b/apm-sniffer/apm-sdk-plugin/mongodb-4.x-plugin/src/main/java/org/apache/skywalking/apm/plugin/mongodb/v4/support/LegacyOperationHelper.java
@@ -0,0 +1,59 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package org.apache.skywalking.apm.plugin.mongodb.v4.support;
+
+import com.mongodb.internal.bulk.DeleteRequest;
+import com.mongodb.internal.bulk.InsertRequest;
+import com.mongodb.internal.bulk.UpdateRequest;
+import com.mongodb.internal.operation.DeleteOperation;
+import com.mongodb.internal.operation.InsertOperation;
+import com.mongodb.internal.operation.UpdateOperation;
+
+import java.util.List;
+
+/**
+ * Handles trace parameter extraction for legacy MongoDB driver versions (4.0 - 4.8).
+ * InsertOperation, DeleteOperation, and UpdateOperation were removed in driver 4.9.
+ * This class is only loaded when those classes exist (guarded by
+ * {@link MongoOperationHelper#HAS_LEGACY_WRITE_OPERATIONS}).
+ */
+@SuppressWarnings("deprecation")
+class LegacyOperationHelper {
+
+ private LegacyOperationHelper() {
+ }
+
+ /**
+ * Extract trace parameters from legacy write operation types.
+ * @return the trace parameter string, or null if obj is not a legacy write operation
+ */
+ static String getTraceParam(Object obj) {
+ if (obj instanceof DeleteOperation) {
+ List<DeleteRequest> writeRequestList = ((DeleteOperation) obj).getDeleteRequests();
+ return MongoOperationHelper.getFilter(writeRequestList);
+ } else if (obj instanceof InsertOperation) {
+ List<InsertRequest> writeRequestList = ((InsertOperation) obj).getInsertRequests();
+ return MongoOperationHelper.getFilter(writeRequestList);
+ } else if (obj instanceof UpdateOperation) {
+ List<UpdateRequest> writeRequestList = ((UpdateOperation) obj).getUpdateRequests();
+ return MongoOperationHelper.getFilter(writeRequestList);
+ }
+ return null;
+ }
+}
diff --git a/apm-sniffer/apm-sdk-plugin/mongodb-4.x-plugin/src/main/java/org/apache/skywalking/apm/plugin/mongodb/v4/support/MongoOperationHelper.java b/apm-sniffer/apm-sdk-plugin/mongodb-4.x-plugin/src/main/java/org/apache/skywalking/apm/plugin/mongodb/v4/support/MongoOperationHelper.java
index 8a561cc887..96cf84c985 100644
--- a/apm-sniffer/apm-sdk-plugin/mongodb-4.x-plugin/src/main/java/org/apache/skywalking/apm/plugin/mongodb/v4/support/MongoOperationHelper.java
+++ b/apm-sniffer/apm-sdk-plugin/mongodb-4.x-plugin/src/main/java/org/apache/skywalking/apm/plugin/mongodb/v4/support/MongoOperationHelper.java
@@ -27,18 +27,15 @@
import com.mongodb.internal.operation.CreateCollectionOperation;
import com.mongodb.internal.operation.CreateIndexesOperation;
import com.mongodb.internal.operation.CreateViewOperation;
-import com.mongodb.internal.operation.DeleteOperation;
import com.mongodb.internal.operation.DistinctOperation;
import com.mongodb.internal.operation.FindAndDeleteOperation;
import com.mongodb.internal.operation.FindAndReplaceOperation;
import com.mongodb.internal.operation.FindAndUpdateOperation;
import com.mongodb.internal.operation.FindOperation;
-import com.mongodb.internal.operation.InsertOperation;
import com.mongodb.internal.operation.ListCollectionsOperation;
import com.mongodb.internal.operation.MapReduceToCollectionOperation;
import com.mongodb.internal.operation.MapReduceWithInlineResultsOperation;
import com.mongodb.internal.operation.MixedBulkWriteOperation;
-import com.mongodb.internal.operation.UpdateOperation;
import org.bson.BsonDocument;
import java.util.List;
@@ -49,6 +46,21 @@
})
public class MongoOperationHelper {
+ // InsertOperation, DeleteOperation, UpdateOperation were removed in MongoDB driver 4.9.
+ // Use class existence check to determine which extraction path to use.
+ private static final boolean HAS_LEGACY_WRITE_OPERATIONS;
+
+ static {
+ boolean hasLegacy;
+ try {
+ Class.forName("com.mongodb.internal.operation.InsertOperation");
+ hasLegacy = true;
+ } catch (ClassNotFoundException e) {
+ hasLegacy = false;
+ }
+ HAS_LEGACY_WRITE_OPERATIONS = hasLegacy;
+ }
+
private MongoOperationHelper() {
}
@@ -69,22 +81,25 @@ public static String getTraceParam(Object obj) {
} else if (obj instanceof FindOperation) {
BsonDocument filter = ((FindOperation) obj).getFilter();
return limitFilter(filter.toString());
- } else if (obj instanceof ListCollectionsOperation) {
+ } else if (obj instanceof ListCollectionsOperation) {
BsonDocument filter = ((ListCollectionsOperation) obj).getFilter();
return limitFilter(filter.toString());
} else if (obj instanceof MapReduceWithInlineResultsOperation) {
BsonDocument filter = ((MapReduceWithInlineResultsOperation) obj).getFilter();
return limitFilter(filter.toString());
- } else if (obj instanceof DeleteOperation) {
- List writeRequestList = ((DeleteOperation) obj).getDeleteRequests();
- return getFilter(writeRequestList);
- } else if (obj instanceof InsertOperation) {
- List writeRequestList = ((InsertOperation) obj).getInsertRequests();
- return getFilter(writeRequestList);
- } else if (obj instanceof UpdateOperation) {
- List writeRequestList = ((UpdateOperation) obj).getUpdateRequests();
- return getFilter(writeRequestList);
- } else if (obj instanceof CreateCollectionOperation) {
+ } else if (HAS_LEGACY_WRITE_OPERATIONS) {
+ String result = LegacyOperationHelper.getTraceParam(obj);
+ if (result != null) {
+ return result;
+ }
+ return getCommonTraceParam(obj);
+ } else {
+ return getCommonTraceParam(obj);
+ }
+ }
+
+ private static String getCommonTraceParam(Object obj) {
+ if (obj instanceof CreateCollectionOperation) {
String filter = ((CreateCollectionOperation) obj).getCollectionName();
return limitFilter(filter);
} else if (obj instanceof CreateIndexesOperation) {
@@ -128,7 +143,7 @@ private static String getPipelines(List<BsonDocument> pipelines) {
return params.toString();
}
- private static String getFilter(List<? extends WriteRequest> writeRequestList) {
+ static String getFilter(List<? extends WriteRequest> writeRequestList) {
StringBuilder params = new StringBuilder();
for (WriteRequest request : writeRequestList) {
if (request instanceof InsertRequest) {
diff --git a/docs/en/setup/service-agent/java-agent/Claude-Skills.md b/docs/en/setup/service-agent/java-agent/Claude-Skills.md
new file mode 100644
index 0000000000..f851ee6c52
--- /dev/null
+++ b/docs/en/setup/service-agent/java-agent/Claude-Skills.md
@@ -0,0 +1,64 @@
+# Claude Code Skills
+
+[Claude Code](https://claude.ai/claude-code) is an AI-powered CLI tool by Anthropic. This project includes
+custom **skills** (`.claude/skills/`) that teach Claude Code how to work with the SkyWalking Java Agent codebase.
+
+Skills are reusable prompt templates that Claude Code can invoke via slash commands. They encode project-specific
+knowledge so that common development tasks can be performed consistently and correctly.
+
+## Available Skills
+
+### `/new-plugin` — Develop a New Plugin
+
+Guides the full lifecycle of creating a new SkyWalking Java agent plugin:
+
+1. **Gather requirements** — target library, observation type (tracing/meter), span types
+2. **Identify interception points** — understand library usage, trace execution flow, choose classes/methods
+3. **Create plugin module** — directory structure, pom.xml, dependencies (`provided` scope)
+4. **Implement instrumentation** — V2 API, class matching (ByteBuddy), method matching
+5. **Implement interceptors** — ContextManager spans, ContextCarrier inject/extract, EnhancedInstance dynamic fields
+6. **Register plugin** — `skywalking-plugin.def`
+7. **Write unit tests** — TracingSegmentRunner, SegmentStorage
+8. **Write E2E tests** — Docker-based scenarios, expectedData.yaml
+9. **Code style** — checkstyle compliance, import restrictions
+10. **Update documentation** — Supported-list.md, CHANGES.md
+
+Key principles encoded in this skill:
+- Always use **V2 API** (`ClassEnhancePluginDefineV2`, `InstanceMethodsAroundInterceptorV2`)
+- **Never use `.class` references** in instrumentation — always string literals
+- **Never use reflection** to access private fields — choose interception points with accessible data
+- **Never use Maps** to cache per-instance context — use `EnhancedInstance.setSkyWalkingDynamicField()`
+- **Verify actual source code** of target libraries — never speculate from version numbers
+- Span lifecycle APIs are **ThreadLocal-based** — create/stop in same thread unless async mode
+
+### `/compile` — Build the Project
+
+Runs the appropriate build command based on what you need:
+- Full build with or without tests
+- Single module build
+- Checkstyle check
+- Plugin E2E test scenarios
+- Protobuf source generation for IDE setup
+
+## How to Use
+
+1. Install [Claude Code](https://docs.anthropic.com/en/docs/claude-code/overview)
+2. Navigate to the `skywalking-java` repository root
+3. Run `claude` to start Claude Code
+4. Type `/new-plugin` or `/compile` to invoke a skill
+
+Skills can also be triggered implicitly — when you describe a task that matches a skill's purpose,
+Claude Code may suggest or invoke it automatically.
+
+## Project Context Files
+
+In addition to skills, the project includes `CLAUDE.md` files that provide codebase context:
+
+| File | Purpose |
+|------|---------|
+| `CLAUDE.md` (root) | Project overview, build system, architecture, conventions |
+| `apm-sniffer/apm-sdk-plugin/CLAUDE.md` | SDK plugin development guide (V2 API, class matching, testing) |
+| `apm-sniffer/bootstrap-plugins/CLAUDE.md` | Bootstrap plugin specifics (JDK class instrumentation) |
+
+These files are automatically loaded by Claude Code when working in the repository, providing it with
+the knowledge needed to assist with development tasks.
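One of the `/new-plugin` principles listed above — carry per-instance context via `EnhancedInstance.setSkyWalkingDynamicField()` rather than a global Map — can be illustrated with a small sketch. The `EnhancedInstance` interface is stubbed here so the snippet is self-contained, and the interceptor method names are illustrative, not the agent's real API:

```java
public class DynamicFieldSketch {
    /** Stub of the agent's EnhancedInstance interface (for a self-contained sketch). */
    interface EnhancedInstance {
        Object getSkyWalkingDynamicField();
        void setSkyWalkingDynamicField(Object value);
    }

    /** Per-instance context a plugin wants to carry between interceptors. */
    static class ConnectionInfo {
        final String peer;
        ConnectionInfo(String peer) { this.peer = peer; }
    }

    /** Stands in for a library class the agent has enhanced. */
    static class ClientStub implements EnhancedInstance {
        private Object dynamicField;
        public Object getSkyWalkingDynamicField() { return dynamicField; }
        public void setSkyWalkingDynamicField(Object value) { dynamicField = value; }
    }

    // A constructor interceptor attaches the context to the instance itself ...
    static void onConstruct(EnhancedInstance objInst, String peer) {
        objInst.setSkyWalkingDynamicField(new ConnectionInfo(peer));
    }

    // ... and a method interceptor reads it back. No Map keyed by the instance,
    // so nothing leaks when enhanced objects are garbage-collected.
    static String peerOf(EnhancedInstance objInst) {
        return ((ConnectionInfo) objInst.getSkyWalkingDynamicField()).peer;
    }
}
```

The dynamic field lives and dies with the enhanced object, which is why the skill forbids Map-based caches for the same purpose.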
diff --git a/docs/en/setup/service-agent/java-agent/Plugin-test.md b/docs/en/setup/service-agent/java-agent/Plugin-test.md
index e9dabb7515..a40eb82e35 100644
--- a/docs/en/setup/service-agent/java-agent/Plugin-test.md
+++ b/docs/en/setup/service-agent/java-agent/Plugin-test.md
@@ -101,6 +101,16 @@ File Name | Descriptions
`*` support-version.list format requires every line for a single version (contains only the last version number of each minor version). You may use `#` to comment out this version.
+Each version line supports optional extra Maven properties using comma-separated `key=value` pairs:
+```
+# Simple version
+2.3.10.RELEASE
+
+# Version with extra Maven properties (passed as -D flags)
+2.7.14,spring.boot.version=2.5.15
+```
+This allows different framework versions to use different dependency versions without creating separate test scenarios.
+
### configuration.yml
| Field | description
diff --git a/docs/en/setup/service-agent/java-agent/Supported-list.md b/docs/en/setup/service-agent/java-agent/Supported-list.md
index 1645c43b4e..bbc7da6940 100644
--- a/docs/en/setup/service-agent/java-agent/Supported-list.md
+++ b/docs/en/setup/service-agent/java-agent/Supported-list.md
@@ -16,7 +16,7 @@ metrics based on the tracing data.
* Resin 4 (Optional¹), See [SkySPM Plugin Repository](https://github.com/SkyAPM/java-plugin-extensions)
* [Jetty Server](http://www.eclipse.org/jetty/) 9.x -> 11.x
* [Spring WebFlux](https://docs.spring.io/spring/docs/current/spring-framework-reference/web-reactive.html) 5.x (Optional²) -> 6.x (Optional²)
- * [Undertow](http://undertow.io/) 1.3.0.Final -> 2.0.27.Final
+ * [Undertow](http://undertow.io/) 1.3.0.Final -> 2.3.18.Final
* [RESTEasy](https://resteasy.dev/) 3.1.0.Final -> 6.2.4.Final
* [Play Framework](https://www.playframework.com/) 2.6.x -> 2.8.x
* [Light4J Microservices Framework](https://doc.networknt.com/) 1.6.x -> 2.x
@@ -28,7 +28,7 @@ metrics based on the tracing data.
* [Netty HTTP](https://github.com/netty/netty) 4.1.x (Optional²)
* [Solon](https://github.com/opensolon/solon) 2.7.x -> 2.8.x
* HTTP Client
- * [Feign](https://github.com/OpenFeign/feign) 9.x
+ * [Feign](https://github.com/OpenFeign/feign) 9.x -> 12.1
* [Netflix Spring Cloud Feign](https://github.com/spring-cloud/spring-cloud-openfeign) 1.1.x -> 2.x
* [Okhttp](https://github.com/square/okhttp) 2.x -> 3.x -> 4.x
* [Apache httpcomponent HttpClient](http://hc.apache.org/) 2.0 -> 3.1, 4.2, 4.3, 5.0, 5.1
@@ -45,12 +45,12 @@ metrics based on the tracing data.
* [Spring Cloud Gateway](https://spring.io/projects/spring-cloud-gateway) 2.0.2.RELEASE -> 4.3.x (Optional²)
* [Apache ShenYu](https://shenyu.apache.org) (Rich protocol support: `HTTP`,`Spring Cloud`,`gRPC`,`Dubbo`,`SOFARPC`,`Motan`,`Tars`) 2.4.x (Optional²)
* JDBC
- * Mysql Driver 5.x, 6.x, 8.x
+ * Mysql Driver 5.x, 6.x, 8.x, 9.x
* Oracle Driver (Optional¹), See [SkySPM Plugin Repository](https://github.com/SkyAPM/java-plugin-extensions)
* H2 Driver 1.3.x -> 1.4.x
* [ShardingSphere](https://github.com/apache/shardingsphere) 3.0.0, 4.0.0, 4.0.1, 4.1.0, 4.1.1, 5.0.0
* PostgreSQL Driver 8.x, 9.x, 42.x
- * Mariadb Driver 2.x, 1.8
+ * Mariadb Driver 1.8, 2.x (2.0 -> 2.7)
* [InfluxDB](https://github.com/influxdata/influxdb-java) 2.5 -> 2.17
* [Mssql-Jtds](https://github.com/milesibastos/jTDS) 1.x
* [Mssql-jdbc](https://github.com/microsoft/mssql-jdbc) 6.x -> 8.x
@@ -77,7 +77,7 @@ metrics based on the tracing data.
* [RocketMQ](https://github.com/apache/rocketmq) 3.x-> 5.x
* [RocketMQ-gRPC](http://github.com/apache/rocketmq-clients) 5.x
* [Kafka](http://kafka.apache.org) 0.11.0.0 -> 3.9.1
- * [Spring-Kafka](https://github.com/spring-projects/spring-kafka) Spring Kafka Consumer 1.3.x -> 2.3.x (2.0.x and 2.1.x not tested and not recommended by [the official document](https://spring.io/projects/spring-kafka))
+ * [Spring-Kafka](https://github.com/spring-projects/spring-kafka) Spring Kafka Consumer 1.3.x -> 3.3.x (2.0.x and 2.1.x not tested and not recommended by [the official document](https://spring.io/projects/spring-kafka))
* [ActiveMQ](https://github.com/apache/activemq) 5.10.0 -> 5.15.4
* [RabbitMQ](https://www.rabbitmq.com/) 3.x-> 5.x
* [Spring-RabbitMQ](https://github.com/spring-projects/spring-amqp) 2.x -> 4.x
@@ -91,7 +91,7 @@ metrics based on the tracing data.
* [Jedis](https://github.com/xetorthio/jedis) 2.x-4.x
* [Redisson](https://github.com/redisson/redisson) Easy Java Redis client 3.5.0 -> 3.30.0
* [Lettuce](https://github.com/lettuce-io/lettuce-core) 5.x -> 6.7.1
- * [MongoDB Java Driver](https://github.com/mongodb/mongo-java-driver) 2.13-2.14, 3.4.0-3.12.7, 4.0.0-4.1.0
+ * [MongoDB Java Driver](https://github.com/mongodb/mongo-java-driver) 2.13-2.14, 3.4.0-3.12.7, 4.0.0-4.10.2
* Memcached Client
* [Spymemcached](https://github.com/couchbase/spymemcached) 2.x
* [Xmemcached](https://github.com/killme2008/xmemcached) 2.x
@@ -147,7 +147,7 @@ metrics based on the tracing data.
* Kotlin
* [Coroutine](https://kotlinlang.org/docs/coroutines-overview.html) 1.0.1 -> 1.3.x (Optional²)
* GraphQL
- * [Graphql](https://github.com/graphql-java) 8.0 -> 17.x
+ * [Graphql](https://github.com/graphql-java) 8.0 -> 24.x
* Pool
* [Apache Commons DBCP](https://github.com/apache/commons-dbcp) 2.x
* [Alibaba Druid](https://github.com/alibaba/druid) 1.x
diff --git a/docs/menu.yml b/docs/menu.yml
index 7ba331d586..edf1dfa77b 100644
--- a/docs/menu.yml
+++ b/docs/menu.yml
@@ -106,6 +106,8 @@ catalog:
path: "/en/setup/service-agent/java-agent/java-plugin-development-guide/"
- name: "Java Agent Performance Test"
path: "https://skyapmtest.github.io/Agent-Benchmarks/"
+ - name: "Claude Code Skills"
+ path: "/en/setup/service-agent/java-agent/Claude-Skills/"
- name: "FAQs"
catalog:
- name: "Why is java.ext.dirs not supported?"
diff --git a/test/plugin/run.sh b/test/plugin/run.sh
index 164eebfe6d..b530c78e98 100755
--- a/test/plugin/run.sh
+++ b/test/plugin/run.sh
@@ -231,8 +231,21 @@ ls "${jacoco_home}"/jacocoagent.jar || curl -Lso "${jacoco_home}"/jacocoagent.ja
ls "${jacoco_home}"/jacococli.jar || curl -Lso "${jacoco_home}"/jacococli.jar https://repo1.maven.org/maven2/org/jacoco/org.jacoco.cli/${jacoco_version}/org.jacoco.cli-${jacoco_version}-nodeps.jar
supported_versions=`grep -v -E "^$|^#" ${supported_version_file}`
-for version in ${supported_versions}
+for version_line in ${supported_versions}
do
+ # Support format: version[,key=value[,key=value...]]
+ # e.g., "2.7.14,spring.boot.version=2.7.19"
+ # First token is test.framework.version, rest are extra Maven properties.
+ version=$(echo "${version_line}" | cut -d',' -f1)
+ extra_props=""
+ remaining=$(echo "${version_line}" | cut -d',' -f2- -s)
+ if [[ -n "${remaining}" ]]; then
+ IFS=',' read -ra props <<< "${remaining}"
+ for prop in "${props[@]}"; do
+ extra_props="${extra_props} -D${prop}"
+ done
+ fi
+
testcase_name="${scenario_name}-${version}"
# testcase working directory, there are logs, data and packages.
@@ -245,7 +258,7 @@ do
cp ./config/expectedData.yaml ${case_work_base}/data
# echo "build ${testcase_name}"
- ${mvnw} -q --batch-mode clean package -Dtest.framework.version=${version} && \
+ ${mvnw} -q --batch-mode clean package -Dtest.framework.version=${version} ${extra_props} && \
mv ./target/${scenario_name}.* ${case_work_base}
java -jar \
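The `run.sh` change above splits each `support-version.list` line into a framework version plus extra `-D` properties. A sketch of the same parsing in Java, for clarity (class and method names are illustrative):

```java
public class VersionLineSketch {
    /**
     * Mirrors the shell logic in run.sh: the first comma-separated token is
     * test.framework.version; each remaining key=value token becomes a -D flag.
     * Returns {version, extraProps}.
     */
    static String[] parse(String versionLine) {
        String[] tokens = versionLine.split(",");
        StringBuilder extraProps = new StringBuilder();
        for (int i = 1; i < tokens.length; i++) {
            if (extraProps.length() > 0) {
                extraProps.append(' ');
            }
            extraProps.append("-D").append(tokens[i]);
        }
        return new String[] {tokens[0], extraProps.toString()};
    }
}
```

So `2.7.14,spring.boot.version=2.5.15` yields version `2.7.14` plus `-Dspring.boot.version=2.5.15`, while a plain `9.5.1` line yields no extra flags — matching the `cut`/`IFS` handling in the script.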
diff --git a/test/plugin/scenarios/feign-scenario/support-version.list b/test/plugin/scenarios/feign-scenario/support-version.list
index 54e90a79ca..eb3667e110 100644
--- a/test/plugin/scenarios/feign-scenario/support-version.list
+++ b/test/plugin/scenarios/feign-scenario/support-version.list
@@ -20,3 +20,6 @@
9.3.1
9.4.0
9.5.1
+10.12
+11.10
+12.1
diff --git a/test/plugin/scenarios/graphql-16plus-scenario/support-version.list b/test/plugin/scenarios/graphql-16plus-scenario/support-version.list
index c8a4778deb..3d9f037eba 100644
--- a/test/plugin/scenarios/graphql-16plus-scenario/support-version.list
+++ b/test/plugin/scenarios/graphql-16plus-scenario/support-version.list
@@ -15,6 +15,9 @@
# limitations under the License.
# lists your version here
+# graphql-java 20+ requires Java 11+, so only 16-19 are tested here (Java 8 scenario)
16.0
17.0
+18.7
+19.11
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/bin/startup.sh b/test/plugin/scenarios/graphql-20plus-scenario/bin/startup.sh
new file mode 100644
index 0000000000..aba240987a
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/bin/startup.sh
@@ -0,0 +1,21 @@
+#!/bin/bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+home="$(cd "$(dirname $0)"; pwd)"
+
+java -jar ${agent_opts} ${home}/../libs/graphql-20plus-scenario.jar &
\ No newline at end of file
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/config/expectedData.yaml b/test/plugin/scenarios/graphql-20plus-scenario/config/expectedData.yaml
new file mode 100644
index 0000000000..df3b3315ec
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/config/expectedData.yaml
@@ -0,0 +1,88 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+segmentItems:
+- serviceName: graphql-20plus-scenario
+ segmentSize: gt 1
+ segments:
+ - segmentId: not null
+ spans:
+ - operationName: user
+ parentSpanId: 0
+ spanId: 1
+ spanLayer: Unknown
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 92
+ isError: false
+ spanType: Local
+ peer: ''
+ skipAnalysis: false
+ tags:
+ - {key: x-le, value: '{"logic-span":true}'}
+ - operationName: users
+ parentSpanId: 0
+ spanId: 2
+ spanLayer: Unknown
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 92
+ isError: false
+ spanType: Local
+ peer: ''
+ skipAnalysis: false
+ tags:
+ - {key: x-le, value: '{"logic-span":true}'}
+ - operationName: user
+ parentSpanId: 0
+ spanId: 3
+ spanLayer: Unknown
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 92
+ isError: false
+ spanType: Local
+ peer: ''
+ skipAnalysis: false
+ tags:
+ - {key: x-le, value: '{"logic-span":true}'}
+ - operationName: users
+ parentSpanId: 0
+ spanId: 4
+ spanLayer: Unknown
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 92
+ isError: false
+ spanType: Local
+ peer: ''
+ skipAnalysis: false
+ tags:
+ - {key: x-le, value: '{"logic-span":true}'}
+ - operationName: GET:/graphql-scenario/case/graphql
+ parentSpanId: -1
+ spanId: 0
+ spanLayer: Http
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 1
+ isError: false
+ spanType: Entry
+ peer: ''
+ tags:
+ - {key: url, value: 'http://localhost:8080/graphql-scenario/case/graphql'}
+ - {key: http.method, value: GET}
+ - {key: http.status_code, value: '200'}
+ skipAnalysis: 'false'
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/configuration.yml b/test/plugin/scenarios/graphql-20plus-scenario/configuration.yml
new file mode 100644
index 0000000000..9032d2abff
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/configuration.yml
@@ -0,0 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+type: jvm
+entryService: http://localhost:8080/graphql-scenario/case/graphql
+healthCheck: http://localhost:8080/graphql-scenario/case/healthCheck
+startScript: ./bin/startup.sh
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/pom.xml b/test/plugin/scenarios/graphql-20plus-scenario/pom.xml
new file mode 100644
index 0000000000..d1daa8b2a4
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/pom.xml
@@ -0,0 +1,117 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one or more
+  ~ contributor license agreements.  See the NOTICE file distributed with
+  ~ this work for additional information regarding copyright ownership.
+  ~ The ASF licenses this file to You under the Apache License, Version 2.0
+  ~ (the "License"); you may not use this file except in compliance with
+  ~ the License.  You may obtain a copy of the License at
+  ~
+  ~     http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <groupId>org.apache.skywalking.apm.testcase</groupId>
+    <artifactId>graphql-20plus-scenario</artifactId>
+    <version>1.0.0</version>
+    <packaging>jar</packaging>
+
+    <modelVersion>4.0.0</modelVersion>
+
+    <name>skywalking-graphql-20plus-scenario</name>
+
+    <properties>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+        <compiler.version>17</compiler.version>
+        <maven-compiler-plugin.version>3.8.1</maven-compiler-plugin.version>
+        <spring.boot.version>3.0.13</spring.boot.version>
+        <test.framework.version>20.0</test.framework.version>
+        <lombok.version>1.18.30</lombok.version>
+        <war.name>graphql</war.name>
+    </properties>
+
+    <dependencyManagement>
+        <dependencies>
+            <dependency>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-dependencies</artifactId>
+                <version>${spring.boot.version}</version>
+                <type>pom</type>
+                <scope>import</scope>
+            </dependency>
+        </dependencies>
+    </dependencyManagement>
+
+    <dependencies>
+        <dependency>
+            <groupId>com.graphql-java</groupId>
+            <artifactId>graphql-java</artifactId>
+            <version>${test.framework.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.projectlombok</groupId>
+            <artifactId>lombok</artifactId>
+            <version>${lombok.version}</version>
+            <scope>provided</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-web</artifactId>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <finalName>graphql-20plus-scenario</finalName>
+        <plugins>
+            <plugin>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-maven-plugin</artifactId>
+                <version>${spring.boot.version}</version>
+                <executions>
+                    <execution>
+                        <goals>
+                            <goal>repackage</goal>
+                        </goals>
+                    </execution>
+                </executions>
+            </plugin>
+            <plugin>
+                <artifactId>maven-compiler-plugin</artifactId>
+                <version>${maven-compiler-plugin.version}</version>
+                <configuration>
+                    <source>${compiler.version}</source>
+                    <target>${compiler.version}</target>
+                    <encoding>${project.build.sourceEncoding}</encoding>
+                </configuration>
+            </plugin>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-assembly-plugin</artifactId>
+                <executions>
+                    <execution>
+                        <id>assemble</id>
+                        <phase>package</phase>
+                        <goals>
+                            <goal>single</goal>
+                        </goals>
+                        <configuration>
+                            <descriptors>
+                                <descriptor>src/main/assembly/assembly.xml</descriptor>
+                            </descriptors>
+                            <outputDirectory>./target/</outputDirectory>
+                        </configuration>
+                    </execution>
+                </executions>
+            </plugin>
+        </plugins>
+    </build>
+</project>
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/src/main/assembly/assembly.xml b/test/plugin/scenarios/graphql-20plus-scenario/src/main/assembly/assembly.xml
new file mode 100644
index 0000000000..8366461a93
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/src/main/assembly/assembly.xml
@@ -0,0 +1,41 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one or more
+  ~ contributor license agreements.  See the NOTICE file distributed with
+  ~ this work for additional information regarding copyright ownership.
+  ~ The ASF licenses this file to You under the Apache License, Version 2.0
+  ~ (the "License"); you may not use this file except in compliance with
+  ~ the License.  You may obtain a copy of the License at
+  ~
+  ~     http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2"
+          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+          xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2 http://maven.apache.org/xsd/assembly-1.1.2.xsd">
+    <formats>
+        <format>zip</format>
+    </formats>
+
+    <fileSets>
+        <fileSet>
+            <directory>./bin</directory>
+            <fileMode>0775</fileMode>
+        </fileSet>
+    </fileSets>
+
+    <files>
+        <file>
+            <source>${project.build.directory}/graphql-20plus-scenario.jar</source>
+            <outputDirectory>./libs</outputDirectory>
+            <fileMode>0775</fileMode>
+        </file>
+    </files>
+</assembly>
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/Application.java b/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/Application.java
new file mode 100644
index 0000000000..657ca45983
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/Application.java
@@ -0,0 +1,29 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.skywalking.apm.testcase.graphql;
+
+import org.springframework.boot.SpringApplication;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+
+@SpringBootApplication
+public class Application {
+
+ public static void main(String[] args) {
+ SpringApplication.run(Application.class, args);
+ }
+}
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/configuration/GraphSchema.java b/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/configuration/GraphSchema.java
new file mode 100644
index 0000000000..607fdf7222
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/configuration/GraphSchema.java
@@ -0,0 +1,120 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.skywalking.apm.testcase.graphql.configuration;
+
+import graphql.GraphQL;
+import graphql.schema.GraphQLFieldDefinition;
+import graphql.schema.GraphQLList;
+import graphql.schema.GraphQLOutputType;
+import graphql.schema.GraphQLSchema;
+import org.apache.skywalking.apm.testcase.graphql.data.User;
+import org.springframework.context.annotation.Bean;
+import org.springframework.stereotype.Component;
+
+import jakarta.annotation.PostConstruct;
+import java.util.ArrayList;
+import java.util.List;
+
+import static graphql.Scalars.GraphQLInt;
+import static graphql.Scalars.GraphQLString;
+import static graphql.schema.GraphQLArgument.newArgument;
+import static graphql.schema.GraphQLFieldDefinition.newFieldDefinition;
+import static graphql.schema.GraphQLObjectType.newObject;
+
+@Component
+public class GraphSchema {
+ private GraphQLSchema schema;
+ private GraphQLOutputType userType;
+
+ @PostConstruct
+ public void init() {
+ initOutputType();
+ schema = GraphQLSchema.newSchema().query(newObject()
+ .name("GraphQuery")
+ .field(createUsersField())
+ .field(createUserField())
+ .build()).build();
+ }
+
+ @Bean
+ public GraphQL graphQL() {
+ return new GraphQL.Builder(getSchema()).build();
+ }
+
+ private void initOutputType() {
+
+ userType = newObject()
+ .name("User")
+ .field(newFieldDefinition().name("id").type(GraphQLInt).build())
+ .field(newFieldDefinition().name("name").type(GraphQLString).build())
+ .build();
+ }
+
+ private GraphQLFieldDefinition createUserField() {
+ return GraphQLFieldDefinition.newFieldDefinition()
+ .name("user")
+ .argument(newArgument().name("id").type(GraphQLInt).build())
+ .type(userType)
+ .dataFetcher(environment -> {
+ int id = environment.getArgument("id");
+ try {
+ Thread.sleep(300L);
+ } catch (InterruptedException e) {
+ e.printStackTrace();
+ }
+ User user = new User();
+ user.setId(id);
+ user.setName("Name_" + id);
+ return user;
+ })
+ .build();
+ }
+
+ private GraphQLFieldDefinition createUsersField() {
+ return GraphQLFieldDefinition.newFieldDefinition()
+ .name("users")
+ .argument(newArgument().name("page").type(GraphQLInt).build())
+ .argument(newArgument().name("size").type(GraphQLInt).build())
+ .argument(newArgument().name("name").type(GraphQLString).build())
+ .type(new GraphQLList(userType))
+ .dataFetcher(environment -> {
+ int page = environment.getArgument("page");
+ int size = environment.getArgument("size");
+ String name = environment.getArgument("name");
+
+ try {
+ Thread.sleep(100L);
+ } catch (InterruptedException e) {
+ e.printStackTrace();
+ }
+ List<User> list = new ArrayList<>(size);
+ for (int i = 0; i < size; i++) {
+ User user = new User();
+ user.setId(i);
+ user.setName(name + "_" + page + "_" + i);
+ list.add(user);
+ }
+ return list;
+ })
+ .build();
+ }
+
+ public GraphQLSchema getSchema() {
+ return schema;
+ }
+}
\ No newline at end of file
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/controller/CaseController.java b/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/controller/CaseController.java
new file mode 100644
index 0000000000..a014af2ae4
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/controller/CaseController.java
@@ -0,0 +1,53 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package org.apache.skywalking.apm.testcase.graphql.controller;
+
+import graphql.ExecutionInput;
+import graphql.GraphQL;
+import org.springframework.web.bind.annotation.RequestMapping;
+import org.springframework.web.bind.annotation.ResponseBody;
+import org.springframework.web.bind.annotation.RestController;
+
+import jakarta.annotation.Resource;
+
+@RestController
+@RequestMapping("/case")
+public class CaseController {
+
+ private static final String SUCCESS = "Success";
+
+ @Resource
+ private GraphQL graphQL;
+
+ @RequestMapping("/healthCheck")
+ @ResponseBody
+ public String healthCheck() {
+ return SUCCESS;
+ }
+
+ @RequestMapping("/graphql")
+ @ResponseBody
+ public String graphql() {
+ String query = "{user(id:6) {id,name},users(page:2,size:5,name:\"john\") {id,name}}";
+ ExecutionInput executionInput = ExecutionInput.newExecutionInput().query(query).build();
+ graphQL.execute(executionInput);
+ graphQL.executeAsync(executionInput);
+ return SUCCESS;
+ }
+}
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/data/User.java b/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/data/User.java
new file mode 100644
index 0000000000..c23a2baf3f
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/src/main/java/org/apache/skywalking/apm/testcase/graphql/data/User.java
@@ -0,0 +1,26 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.skywalking.apm.testcase.graphql.data;
+
+import lombok.Data;
+
+@Data
+public class User {
+ private int id;
+ private String name;
+}
\ No newline at end of file
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/src/main/resources/application.yml b/test/plugin/scenarios/graphql-20plus-scenario/src/main/resources/application.yml
new file mode 100644
index 0000000000..3d9f580c4e
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/src/main/resources/application.yml
@@ -0,0 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+server:
+ port: 8080
+ servlet:
+ context-path: /graphql-scenario
\ No newline at end of file
diff --git a/test/plugin/scenarios/graphql-20plus-scenario/support-version.list b/test/plugin/scenarios/graphql-20plus-scenario/support-version.list
new file mode 100644
index 0000000000..0d28512b2e
--- /dev/null
+++ b/test/plugin/scenarios/graphql-20plus-scenario/support-version.list
@@ -0,0 +1,23 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# graphql-java 20+ requires Java 11+, tested with JDK 17
+
+20.9
+21.5
+22.4
+23.1
+24.1
diff --git a/test/plugin/scenarios/mariadb-scenario/support-version.list b/test/plugin/scenarios/mariadb-scenario/support-version.list
index 8d642a99e3..09e354a134 100644
--- a/test/plugin/scenarios/mariadb-scenario/support-version.list
+++ b/test/plugin/scenarios/mariadb-scenario/support-version.list
@@ -14,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
+2.7.12
2.6.0
2.5.4
2.4.4
diff --git a/test/plugin/scenarios/mongodb-4.x-scenario/support-version.list b/test/plugin/scenarios/mongodb-4.x-scenario/support-version.list
index 68dfcd9852..d48bca9d2d 100644
--- a/test/plugin/scenarios/mongodb-4.x-scenario/support-version.list
+++ b/test/plugin/scenarios/mongodb-4.x-scenario/support-version.list
@@ -15,4 +15,13 @@
# limitations under the License.
4.0.5
-4.1.0
\ No newline at end of file
+4.1.0
+4.2.3
+4.3.4
+4.4.2
+4.5.1
+4.6.1
+4.7.2
+4.8.2
+4.9.1
+4.10.2
\ No newline at end of file
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/bin/startup.sh b/test/plugin/scenarios/mysql-9.x-scenario/bin/startup.sh
new file mode 100644
index 0000000000..7e4ac0fd1d
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/bin/startup.sh
@@ -0,0 +1,21 @@
+#!/bin/bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+home="$(cd "$(dirname $0)"; pwd)"
+
+java -jar ${agent_opts} ${home}/../libs/mysql-9.x-scenario.jar &
\ No newline at end of file
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/config/expectedData.yaml b/test/plugin/scenarios/mysql-9.x-scenario/config/expectedData.yaml
new file mode 100644
index 0000000000..17f823f911
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/config/expectedData.yaml
@@ -0,0 +1,151 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+segmentItems:
+- serviceName: mysql-9.x-scenario
+  segmentSize: ge 2
+  segments:
+  - segmentId: not null
+    spans:
+    - operationName: Mysql/JDBC/PreparedStatement/execute
+      parentSpanId: 0
+      spanId: 1
+      tags:
+      - {key: db.type, value: Mysql}
+      - {key: db.instance, value: test}
+      - key: db.statement
+        value: "CREATE TABLE test_007(\nid VARCHAR(1) PRIMARY KEY, \nvalue VARCHAR(1)\
+          \ NOT NULL)"
+      logs: []
+      startTime: nq 0
+      endTime: nq 0
+      isError: false
+      spanLayer: Database
+      spanType: Exit
+      componentId: 33
+      peer: mysql-server:3306
+      skipAnalysis: 'false'
+    - operationName: Mysql/JDBC/PreparedStatement/execute
+      parentSpanId: 0
+      spanId: 2
+      tags:
+      - {key: db.type, value: Mysql}
+      - {key: db.instance, value: test}
+      - {key: db.statement, value: 'INSERT INTO test_007(id, value) VALUES(?,?)'}
+      logs: []
+      startTime: nq 0
+      endTime: nq 0
+      isError: false
+      spanLayer: Database
+      spanType: Exit
+      componentId: 33
+      peer: mysql-server:3306
+      skipAnalysis: 'false'
+    - operationName: Mysql/JDBC/Statement/execute
+      parentSpanId: 0
+      spanId: 3
+      tags:
+      - {key: db.type, value: Mysql}
+      - {key: db.instance, value: test}
+      - {key: db.statement, value: DROP table test_007}
+      logs: []
+      startTime: nq 0
+      endTime: nq 0
+      isError: false
+      spanLayer: Database
+      spanType: Exit
+      componentId: 33
+      peer: mysql-server:3306
+      skipAnalysis: 'false'
+    - operationName: Mysql/JDBC/Statement/execute
+      parentSpanId: 0
+      spanId: 4
+      tags:
+      - {key: db.type, value: Mysql}
+      - {key: db.instance, value: test}
+      - {key: db.statement, value: "create procedure testProcedure(IN id varchar(10)) \n begin \n select id; \n end"}
+      logs: []
+      startTime: nq 0
+      endTime: nq 0
+      isError: false
+      spanLayer: Database
+      spanType: Exit
+      componentId: 33
+      peer: mysql-server:3306
+      skipAnalysis: 'false'
+    # MySQL Connector 8.4+ no longer issues internal SHOW CREATE PROCEDURE query
+    - operationName: Mysql/JDBC/CallableStatement/execute
+      parentSpanId: 0
+      spanId: 5
+      tags:
+      - {key: db.type, value: Mysql}
+      - {key: db.instance, value: test}
+      - {key: db.statement, value: "call testProcedure( ? )"}
+      logs: []
+      startTime: nq 0
+      endTime: nq 0
+      isError: false
+      spanLayer: Database
+      spanType: Exit
+      componentId: 33
+      peer: mysql-server:3306
+      skipAnalysis: 'false'
+    - operationName: Mysql/JDBC/Statement/execute
+      parentSpanId: 0
+      spanId: 6
+      tags:
+      - {key: db.type, value: Mysql}
+      - {key: db.instance, value: test}
+      - {key: db.statement, value: "drop procedure testProcedure"}
+      logs: []
+      startTime: nq 0
+      endTime: nq 0
+      isError: false
+      spanLayer: Database
+      spanType: Exit
+      componentId: 33
+      peer: mysql-server:3306
+      skipAnalysis: 'false'
+    - operationName: Mysql/JDBC/Connection/close
+      parentSpanId: 0
+      spanId: 7
+      tags:
+      - {key: db.type, value: Mysql}
+      - {key: db.instance, value: test}
+      - {key: db.statement, value: ''}
+      logs: []
+      startTime: nq 0
+      endTime: nq 0
+      isError: false
+      spanLayer: Database
+      spanType: Exit
+      componentId: 33
+      peer: mysql-server:3306
+      skipAnalysis: 'false'
+    - operationName: GET:/mysql-scenario/case/mysql-scenario
+      parentSpanId: -1
+      spanId: 0
+      startTime: nq 0
+      endTime: nq 0
+      spanLayer: Http
+      isError: false
+      spanType: Entry
+      componentId: 1
+      tags:
+      - {key: url, value: 'http://localhost:8080/mysql-scenario/case/mysql-scenario'}
+      - {key: http.method, value: GET}
+      - {key: http.status_code, value: '200'}
+      logs: []
+      skipAnalysis: 'false'
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/configuration.yml b/test/plugin/scenarios/mysql-9.x-scenario/configuration.yml
new file mode 100644
index 0000000000..0d486ac74b
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/configuration.yml
@@ -0,0 +1,32 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+type: jvm
+entryService: http://localhost:8080/mysql-scenario/case/mysql-scenario
+healthCheck: http://localhost:8080/mysql-scenario/case/healthCheck
+startScript: ./bin/startup.sh
+environment:
+depends_on:
+  - mysql-server
+dependencies:
+  mysql-server:
+    image: mysql:8.0
+    hostname: mysql-server
+    expose:
+      - "3306"
+    environment:
+      - MYSQL_ROOT_PASSWORD=root
+      - MYSQL_DATABASE=test
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/pom.xml b/test/plugin/scenarios/mysql-9.x-scenario/pom.xml
new file mode 100644
index 0000000000..c99900594a
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/pom.xml
@@ -0,0 +1,123 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one or more
+  ~ contributor license agreements.  See the NOTICE file distributed with
+  ~ this work for additional information regarding copyright ownership.
+  ~ The ASF licenses this file to You under the Apache License, Version 2.0
+  ~ (the "License"); you may not use this file except in compliance with
+  ~ the License.  You may obtain a copy of the License at
+  ~
+  ~     http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <groupId>org.apache.skywalking.apm.testcase</groupId>
+    <artifactId>mysql-9.x-scenario</artifactId>
+    <version>1.0.0</version>
+    <packaging>jar</packaging>
+
+    <modelVersion>4.0.0</modelVersion>
+
+    <properties>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+        <compiler.version>1.8</compiler.version>
+        <maven-compiler-plugin.version>3.8.1</maven-compiler-plugin.version>
+
+        <test.framework.version>9.0.0</test.framework.version>
+        <docker.image.version>${test.framework.version}</docker.image.version>
+
+        <spring.boot.version>2.1.6.RELEASE</spring.boot.version>
+    </properties>
+
+    <name>skywalking-mysql-9.x-scenario</name>
+
+    <dependencyManagement>
+        <dependencies>
+            <dependency>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-dependencies</artifactId>
+                <version>${spring.boot.version}</version>
+                <type>pom</type>
+                <scope>import</scope>
+            </dependency>
+        </dependencies>
+    </dependencyManagement>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-web</artifactId>
+            <exclusions>
+                <exclusion>
+                    <groupId>org.springframework.boot</groupId>
+                    <artifactId>spring-boot-starter-logging</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-log4j2</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>com.mysql</groupId>
+            <artifactId>mysql-connector-j</artifactId>
+            <version>${test.framework.version}</version>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <finalName>mysql-9.x-scenario</finalName>
+        <plugins>
+            <plugin>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-maven-plugin</artifactId>
+                <version>${spring.boot.version}</version>
+                <executions>
+                    <execution>
+                        <goals>
+                            <goal>repackage</goal>
+                        </goals>
+                    </execution>
+                </executions>
+            </plugin>
+            <plugin>
+                <artifactId>maven-compiler-plugin</artifactId>
+                <version>${maven-compiler-plugin.version}</version>
+                <configuration>
+                    <source>${compiler.version}</source>
+                    <target>${compiler.version}</target>
+                    <encoding>${project.build.sourceEncoding}</encoding>
+                </configuration>
+            </plugin>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-assembly-plugin</artifactId>
+                <executions>
+                    <execution>
+                        <id>assemble</id>
+                        <phase>package</phase>
+                        <goals>
+                            <goal>single</goal>
+                        </goals>
+                        <configuration>
+                            <descriptors>
+                                <descriptor>src/main/assembly/assembly.xml</descriptor>
+                            </descriptors>
+                            <outputDirectory>./target/</outputDirectory>
+                        </configuration>
+                    </execution>
+                </executions>
+            </plugin>
+        </plugins>
+    </build>
+</project>
\ No newline at end of file
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/src/main/assembly/assembly.xml b/test/plugin/scenarios/mysql-9.x-scenario/src/main/assembly/assembly.xml
new file mode 100644
index 0000000000..9aae4e672b
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/src/main/assembly/assembly.xml
@@ -0,0 +1,41 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one or more
+  ~ contributor license agreements.  See the NOTICE file distributed with
+  ~ this work for additional information regarding copyright ownership.
+  ~ The ASF licenses this file to You under the Apache License, Version 2.0
+  ~ (the "License"); you may not use this file except in compliance with
+  ~ the License.  You may obtain a copy of the License at
+  ~
+  ~     http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.0.0"
+          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+          xsi:schemaLocation="http://maven.apache.org/ASSEMBLY/2.0.0 http://maven.apache.org/xsd/assembly-2.0.0.xsd">
+    <formats>
+        <format>zip</format>
+    </formats>
+
+    <fileSets>
+        <fileSet>
+            <directory>./bin</directory>
+            <fileMode>0775</fileMode>
+        </fileSet>
+    </fileSets>
+
+    <files>
+        <file>
+            <source>${project.build.directory}/mysql-9.x-scenario.jar</source>
+            <outputDirectory>./libs</outputDirectory>
+            <fileMode>0775</fileMode>
+        </file>
+    </files>
+</assembly>
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/Application.java b/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/Application.java
new file mode 100644
index 0000000000..b7d42e6fc2
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/Application.java
@@ -0,0 +1,34 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package org.apache.skywalking.apm.testcase.mysql;
+
+import org.springframework.boot.SpringApplication;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+
+@SpringBootApplication
+public class Application {
+
+    public static void main(String[] args) {
+        try {
+            SpringApplication.run(Application.class, args);
+        } catch (Exception e) {
+            // Never do this
+        }
+    }
+}
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/MysqlConfig.java b/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/MysqlConfig.java
new file mode 100644
index 0000000000..f30a6ebc37
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/MysqlConfig.java
@@ -0,0 +1,58 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package org.apache.skywalking.apm.testcase.mysql;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.Properties;
+import org.apache.logging.log4j.LogManager;
+import org.apache.logging.log4j.Logger;
+
+public class MysqlConfig {
+    private static final Logger LOGGER = LogManager.getLogger(MysqlConfig.class);
+    private static String URL;
+    private static String USER_NAME;
+    private static String PASSWORD;
+
+    static {
+        // ClassLoader.getResourceAsStream expects a path without a leading slash;
+        // "/jdbc.properties" would resolve to null here.
+        InputStream inputStream = MysqlConfig.class.getClassLoader().getResourceAsStream("jdbc.properties");
+        Properties properties = new Properties();
+        try {
+            properties.load(inputStream);
+        } catch (IOException e) {
+            LOGGER.error("Failed to load config", e);
+        }
+
+        URL = properties.getProperty("mysql.url");
+        USER_NAME = properties.getProperty("mysql.username");
+        PASSWORD = properties.getProperty("mysql.password");
+    }
+
+    public static String getUrl() {
+        return URL;
+    }
+
+    public static String getUserName() {
+        return USER_NAME;
+    }
+
+    public static String getPassword() {
+        return PASSWORD;
+    }
+}
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/SQLExecutor.java b/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/SQLExecutor.java
new file mode 100644
index 0000000000..74e3d12d6c
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/SQLExecutor.java
@@ -0,0 +1,88 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package org.apache.skywalking.apm.testcase.mysql;
+
+import java.sql.CallableStatement;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.PreparedStatement;
+import java.sql.SQLException;
+import java.sql.Statement;
+
+public class SQLExecutor implements AutoCloseable {
+    private Connection connection;
+
+    public SQLExecutor() throws SQLException {
+        try {
+            Class.forName("com.mysql.cj.jdbc.Driver");
+        } catch (ClassNotFoundException e) {
+            // ignored: JDBC 4+ drivers self-register through the service loader
+        }
+        connection = DriverManager.getConnection(MysqlConfig.getUrl(), MysqlConfig.getUserName(), MysqlConfig.getPassword());
+    }
+
+    public void createTable(String sql) throws SQLException {
+        PreparedStatement preparedStatement = connection.prepareStatement(sql);
+        preparedStatement.execute();
+        preparedStatement.close();
+    }
+
+    public void insertData(String sql, String id, String value) throws SQLException {
+        PreparedStatement preparedStatement = connection.prepareStatement(sql);
+        preparedStatement.setString(1, id);
+        preparedStatement.setString(2, value);
+        preparedStatement.execute();
+        preparedStatement.close();
+    }
+
+    public void dropTable(String sql) throws SQLException {
+        executeStatement(sql);
+    }
+
+    public void createProcedure(String sql) throws SQLException {
+        executeStatement(sql);
+    }
+
+    public void dropProcedure(String sql) throws SQLException {
+        executeStatement(sql);
+    }
+
+    public void callProcedure(String sql, String id) throws SQLException {
+        CallableStatement callableStatement = connection.prepareCall(sql);
+        callableStatement.setString(1, id);
+        callableStatement.execute();
+        callableStatement.close();
+    }
+
+    public void executeStatement(String sql) throws SQLException {
+        Statement statement = connection.createStatement();
+        statement.execute(sql);
+        statement.close();
+    }
+
+    public void closeConnection() throws SQLException {
+        if (this.connection != null) {
+            this.connection.close();
+        }
+    }
+
+    @Override
+    public void close() throws Exception {
+        closeConnection();
+    }
+}
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/controller/CaseController.java b/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/controller/CaseController.java
new file mode 100644
index 0000000000..626ef9a190
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/src/main/java/org/apache/skywalking/apm/testcase/mysql/controller/CaseController.java
@@ -0,0 +1,71 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package org.apache.skywalking.apm.testcase.mysql.controller;
+
+import org.apache.logging.log4j.LogManager;
+import org.apache.logging.log4j.Logger;
+import org.apache.skywalking.apm.testcase.mysql.SQLExecutor;
+import org.springframework.web.bind.annotation.RequestMapping;
+import org.springframework.web.bind.annotation.ResponseBody;
+import org.springframework.web.bind.annotation.RestController;
+
+@RestController
+@RequestMapping("/case")
+public class CaseController {
+
+    private static final Logger LOGGER = LogManager.getLogger(CaseController.class);
+
+    private static final String SUCCESS = "Success";
+
+    private static final String CREATE_TABLE_SQL = "CREATE TABLE test_007(\n" + "id VARCHAR(1) PRIMARY KEY, \n" + "value VARCHAR(1) NOT NULL)";
+    private static final String INSERT_DATA_SQL = "INSERT INTO test_007(id, value) VALUES(?,?)";
+    private static final String QUERY_DATA_SQL = "SELECT id, value FROM test_007 WHERE id=?";
+    private static final String DELETE_DATA_SQL = "DELETE FROM test_007 WHERE id=?";
+    private static final String DROP_TABLE_SQL = "DROP table test_007";
+    private static final String CREATE_PROCEDURE_SQL = "create procedure testProcedure(IN id varchar(10)) \n begin \n select id; \n end";
+    private static final String CALL_PROCEDURE_SQL = "call testProcedure( ? )";
+    private static final String DROP_PROCEDURE_SQL = "drop procedure testProcedure";
+
+    @RequestMapping("/mysql-scenario")
+    @ResponseBody
+    public String testcase() throws Exception {
+        try (SQLExecutor sqlExecute = new SQLExecutor()) {
+            sqlExecute.createTable(CREATE_TABLE_SQL);
+            sqlExecute.insertData(INSERT_DATA_SQL, "1", "1");
+            sqlExecute.dropTable(DROP_TABLE_SQL);
+            sqlExecute.createProcedure(CREATE_PROCEDURE_SQL);
+            sqlExecute.callProcedure(CALL_PROCEDURE_SQL, "nihao");
+            sqlExecute.dropProcedure(DROP_PROCEDURE_SQL);
+        } catch (Exception e) {
+            LOGGER.error("Failed to execute sql.", e);
+            throw e;
+        }
+        return SUCCESS;
+    }
+
+    @RequestMapping("/healthCheck")
+    @ResponseBody
+    public String healthCheck() throws Exception {
+        try (SQLExecutor sqlExecutor = new SQLExecutor()) {
+            // ignore
+        }
+        return SUCCESS;
+    }
+
+}
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/application.yaml b/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/application.yaml
new file mode 100644
index 0000000000..8dc5e28d15
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/application.yaml
@@ -0,0 +1,23 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+#
+server:
+  port: 8080
+  servlet:
+    context-path: /mysql-scenario
+logging:
+  config: classpath:log4j2.xml
\ No newline at end of file
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/jdbc.properties b/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/jdbc.properties
new file mode 100644
index 0000000000..7d999c28da
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/jdbc.properties
@@ -0,0 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+mysql.url=jdbc:mysql://mysql-server:3306/test?serverTimezone=UTC&useSSL=false&allowPublicKeyRetrieval=true
+mysql.username=root
+mysql.password=root
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/log4j2.xml b/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/log4j2.xml
new file mode 100644
index 0000000000..9849ed5a8a
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/src/main/resources/log4j2.xml
@@ -0,0 +1,30 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one or more
+  ~ contributor license agreements.  See the NOTICE file distributed with
+  ~ this work for additional information regarding copyright ownership.
+  ~ The ASF licenses this file to You under the Apache License, Version 2.0
+  ~ (the "License"); you may not use this file except in compliance with
+  ~ the License.  You may obtain a copy of the License at
+  ~
+  ~     http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+<Configuration status="WARN">
+    <Appenders>
+        <Console name="Console" target="SYSTEM_OUT">
+            <PatternLayout charset="UTF-8" pattern="[%d{yyyy-MM-dd HH:mm:ss:SSS}] [%p] - %l - %m%n"/>
+        </Console>
+    </Appenders>
+    <Loggers>
+        <Root level="info">
+            <AppenderRef ref="Console"/>
+        </Root>
+    </Loggers>
+</Configuration>
\ No newline at end of file
diff --git a/test/plugin/scenarios/mysql-9.x-scenario/support-version.list b/test/plugin/scenarios/mysql-9.x-scenario/support-version.list
new file mode 100644
index 0000000000..d0f41f5673
--- /dev/null
+++ b/test/plugin/scenarios/mysql-9.x-scenario/support-version.list
@@ -0,0 +1,25 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Uses com.mysql:mysql-connector-j artifact (8.0.31+ and 9.x)
+8.4.0
+9.0.0
+9.1.0
+9.2.0
+9.3.0
+9.4.0
+9.5.0
+9.6.0
diff --git a/test/plugin/scenarios/mysql-scenario/support-version.list b/test/plugin/scenarios/mysql-scenario/support-version.list
index 746fa92395..b0b1761ebb 100644
--- a/test/plugin/scenarios/mysql-scenario/support-version.list
+++ b/test/plugin/scenarios/mysql-scenario/support-version.list
@@ -14,6 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
+# 8.0.30+ requires allowPublicKeyRetrieval=true and serverTimezone=UTC
+# Those versions are tested in mysql-9.x-scenario with the fixed JDBC URL
8.0.19
8.0.15
6.0.6
diff --git a/test/plugin/scenarios/spring-kafka-1.3.x-scenario/configuration.yml b/test/plugin/scenarios/spring-kafka-1.3.x-scenario/configuration.yml
index 8d387e2d45..8ff824fa43 100644
--- a/test/plugin/scenarios/spring-kafka-1.3.x-scenario/configuration.yml
+++ b/test/plugin/scenarios/spring-kafka-1.3.x-scenario/configuration.yml
@@ -25,7 +25,7 @@ depends_on:
   - kafka-server
 dependencies:
   zookeeper-server:
-    image: zookeeper:3.4
+    image: zookeeper:3.7
     hostname: zookeeper-server
   kafka-server:
     image: bitnamilegacy/kafka:2.4.1
diff --git a/test/plugin/scenarios/spring-kafka-2.2.x-scenario/configuration.yml b/test/plugin/scenarios/spring-kafka-2.2.x-scenario/configuration.yml
index 164b29135a..3728f3a9cb 100644
--- a/test/plugin/scenarios/spring-kafka-2.2.x-scenario/configuration.yml
+++ b/test/plugin/scenarios/spring-kafka-2.2.x-scenario/configuration.yml
@@ -25,7 +25,7 @@ depends_on:
   - kafka-server
 dependencies:
   zookeeper-server:
-    image: zookeeper:3.4
+    image: zookeeper:3.7
     hostname: zookeeper-server
   kafka-server:
     image: bitnamilegacy/kafka:2.4.1
diff --git a/test/plugin/scenarios/spring-kafka-2.3.x-scenario/configuration.yml b/test/plugin/scenarios/spring-kafka-2.3.x-scenario/configuration.yml
index 295f967541..8948c60ab6 100644
--- a/test/plugin/scenarios/spring-kafka-2.3.x-scenario/configuration.yml
+++ b/test/plugin/scenarios/spring-kafka-2.3.x-scenario/configuration.yml
@@ -25,7 +25,7 @@ depends_on:
   - kafka-server
 dependencies:
   zookeeper-server:
-    image: zookeeper:3.4
+    image: zookeeper:3.7
     hostname: zookeeper-server
   kafka-server:
     image: bitnamilegacy/kafka:2.4.1
diff --git a/test/plugin/scenarios/spring-kafka-2.3.x-scenario/pom.xml b/test/plugin/scenarios/spring-kafka-2.3.x-scenario/pom.xml
index cb1b20edd5..822455c920 100644
--- a/test/plugin/scenarios/spring-kafka-2.3.x-scenario/pom.xml
+++ b/test/plugin/scenarios/spring-kafka-2.3.x-scenario/pom.xml
@@ -55,17 +55,8 @@
-        <dependency>
-            <groupId>org.apache.kafka</groupId>
-            <artifactId>kafka-clients</artifactId>
-            <version>${kafka-version}</version>
-            <exclusions>
-                <exclusion>
-                    <artifactId>slf4j-api</artifactId>
-                    <groupId>*</groupId>
-                </exclusion>
-            </exclusions>
-        </dependency>
+
         <dependency>
             <groupId>com.squareup.okhttp3</groupId>
             <artifactId>okhttp</artifactId>
diff --git a/test/plugin/scenarios/spring-kafka-2.3.x-scenario/support-version.list b/test/plugin/scenarios/spring-kafka-2.3.x-scenario/support-version.list
index 6859d9a3bb..126ffe549e 100644
--- a/test/plugin/scenarios/spring-kafka-2.3.x-scenario/support-version.list
+++ b/test/plugin/scenarios/spring-kafka-2.3.x-scenario/support-version.list
@@ -14,4 +14,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
+# Format: spring-kafka-version[,key=value] — extra properties passed to Maven.
+# Spring Boot autoconfigure must match the spring-kafka version's expected APIs.
+# Mapping: https://spring.io/projects/spring-kafka#overview (compatibility matrix)
2.3.10.RELEASE
+2.4.13.RELEASE
+2.5.17.RELEASE
+2.6.13
+2.7.14,spring.boot.version=2.5.15
+2.8.11,spring.boot.version=2.7.18
+2.9.13,spring.boot.version=2.7.18
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/bin/startup.sh b/test/plugin/scenarios/spring-kafka-3.x-scenario/bin/startup.sh
new file mode 100644
index 0000000000..5a10cdccba
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/bin/startup.sh
@@ -0,0 +1,21 @@
+#!/bin/bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+home="$(cd "$(dirname $0)"; pwd)"
+
+java -Dbootstrap.servers=${BOOTSTRAP_SERVERS} -jar ${agent_opts} "-Dskywalking.agent.service_name=spring-kafka-3.x-scenario" ${home}/../libs/spring-kafka-3.x-scenario.jar &
\ No newline at end of file
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/config/expectedData.yaml b/test/plugin/scenarios/spring-kafka-3.x-scenario/config/expectedData.yaml
new file mode 100644
index 0000000000..f58e28b78d
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/config/expectedData.yaml
@@ -0,0 +1,123 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+segmentItems:
+  - serviceName: spring-kafka-3.x-scenario
+    segmentSize: nq 0
+    segments:
+      - segmentId: not null
+        spans:
+          - operationName: Kafka/spring_test/Producer
+            parentSpanId: 0
+            spanId: 1
+            spanLayer: MQ
+            startTime: not null
+            endTime: not null
+            componentId: 40
+            isError: false
+            spanType: Exit
+            peer: kafka-server:9092
+            skipAnalysis: false
+            tags:
+              - {key: mq.broker, value: 'kafka-server:9092'}
+              - {key: mq.topic, value: spring_test}
+          - operationName: Kafka/spring_test/Producer
+            parentSpanId: 0
+            spanId: 2
+            spanLayer: MQ
+            startTime: not null
+            endTime: not null
+            componentId: 40
+            isError: false
+            spanType: Exit
+            peer: kafka-server:9092
+            skipAnalysis: false
+            tags:
+              - {key: mq.broker, value: 'kafka-server:9092'}
+              - {key: mq.topic, value: spring_test}
+          - operationName: GET:/spring-kafka-3.x-scenario/case/spring-kafka-case
+            parentSpanId: -1
+            spanId: 0
+            spanLayer: Http
+            startTime: not null
+            endTime: not null
+            componentId: 1
+            isError: false
+            spanType: Entry
+            peer: ''
+            skipAnalysis: false
+            tags:
+              - {key: url, value: 'http://localhost:8080/spring-kafka-3.x-scenario/case/spring-kafka-case'}
+              - {key: http.method, value: GET}
+              - {key: http.status_code, value: '200'}
+      - segmentId: not null
+        spans:
+          - operationName: GET:/spring-kafka-3.x-scenario/case/spring-kafka-consumer-ping
+            parentSpanId: -1
+            spanId: 0
+            spanLayer: Http
+            startTime: not null
+            endTime: not null
+            componentId: 1
+            isError: false
+            spanType: Entry
+            peer: ''
+            skipAnalysis: false
+            tags:
+              - {key: url, value: 'http://localhost:8080/spring-kafka-3.x-scenario/case/spring-kafka-consumer-ping'}
+              - {key: http.method, value: GET}
+              - {key: http.status_code, value: '200'}
+            refs:
+              - {parentEndpoint: 'Kafka/spring_test/Consumer/grop:spring_test', networkAddress: 'localhost:8080',
+                 refType: CrossProcess, parentSpanId: 1, parentTraceSegmentId: not null,
+                 parentServiceInstance: not null, parentService: spring-kafka-3.x-scenario,
+                 traceId: not null}
+      - segmentId: not null
+        spans:
+          - operationName: /spring-kafka-3.x-scenario/case/spring-kafka-consumer-ping
+            parentSpanId: 0
+            spanId: 1
+            spanLayer: Http
+            startTime: not null
+            endTime: not null
+            componentId: 12
+            isError: false
+            spanType: Exit
+            peer: localhost:8080
+            skipAnalysis: false
+            tags:
+              - {key: http.method, value: GET}
+              - {key: url, value: 'http://localhost:8080/spring-kafka-3.x-scenario/case/spring-kafka-consumer-ping'}
+              - {key: http.status_code, value: '200'}
+          - operationName: Kafka/spring_test/Consumer/grop:spring_test
+            parentSpanId: -1
+            spanId: 0
+            spanLayer: MQ
+            startTime: not null
+            endTime: not null
+            componentId: 41
+            isError: false
+            spanType: Entry
+            peer: kafka-server:9092
+            skipAnalysis: false
+            tags:
+              - {key: mq.broker, value: 'kafka-server:9092'}
+              - {key: mq.topic, value: spring_test}
+              - {key: transmission.latency, value: not null}
+            refs:
+              - {parentEndpoint: 'GET:/spring-kafka-3.x-scenario/case/spring-kafka-case', networkAddress: 'kafka-server:9092',
+                 refType: CrossProcess, parentSpanId: not null, parentTraceSegmentId: not null,
+                 parentServiceInstance: not null, parentService: spring-kafka-3.x-scenario,
+                 traceId: not null}
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/configuration.yml b/test/plugin/scenarios/spring-kafka-3.x-scenario/configuration.yml
new file mode 100644
index 0000000000..1397670aff
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/configuration.yml
@@ -0,0 +1,39 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+type: jvm
+entryService: http://localhost:8080/spring-kafka-3.x-scenario/case/spring-kafka-case
+healthCheck: http://localhost:8080/spring-kafka-3.x-scenario/case/healthCheck
+startScript: ./bin/startup.sh
+environment:
+ - BOOTSTRAP_SERVERS=kafka-server:9092
+depends_on:
+ - zookeeper-server
+ - kafka-server
+dependencies:
+ zookeeper-server:
+ image: zookeeper:3.7
+ hostname: zookeeper-server
+ kafka-server:
+ image: bitnamilegacy/kafka:2.4.1
+ hostname: kafka-server
+ environment:
+ - KAFKA_ZOOKEEPER_CONNECT=zookeeper-server:2181
+ - KAFKA_BROKER_ID=1
+ - ALLOW_PLAINTEXT_LISTENER=yes
+ - KAFKA_LISTENERS=PLAINTEXT://0.0.0.0:9092
+ depends_on:
+ - zookeeper-server
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/pom.xml b/test/plugin/scenarios/spring-kafka-3.x-scenario/pom.xml
new file mode 100644
index 0000000000..2e6a5200a4
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/pom.xml
@@ -0,0 +1,134 @@
+
+
+
+ 4.0.0
+
+ org.apache.skywalking
+ spring-kafka-3.x-scenario
+ 5.0.0
+
+
+ UTF-8
+ 17
+ 3.8.1
+ 3.0.0
+ 2.6.2
+ 3.2.12
+ 3.3.2
+ 3.0.0
+
+
+ skywalking-spring-kafka-3.x-scenario
+
+
+
+
+ org.springframework.boot
+ spring-boot-dependencies
+ ${spring.boot.version}
+ pom
+ import
+
+
+
+
+
+
+ org.springframework.boot
+ spring-boot-starter-web
+ ${spring.boot.version}
+
+
+ org.springframework.kafka
+ spring-kafka
+ ${test.framework.version}
+
+
+ org.slf4j
+ *
+
+
+
+
+
+ com.squareup.okhttp3
+ okhttp
+ ${okhttp-version}
+
+
+
+
+ spring-kafka-3.x-scenario
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+ ${spring.boot.version}
+
+
+
+ repackage
+
+
+
+
+
+ maven-compiler-plugin
+ ${maven-compiler-plugin.version}
+
+ ${compiler.version}
+ ${compiler.version}
+ ${project.build.sourceEncoding}
+
+
+
+ org.apache.maven.plugins
+ maven-assembly-plugin
+
+
+ assemble
+ package
+
+ single
+
+
+
+ src/main/assembly/assembly.xml
+
+ ./target/
+
+
+
+
+
+
+
+
+
+ spring-snapshots
+ https://repo.spring.io/snapshot
+
+
+ spring-milestones
+ https://repo.spring.io/milestone
+
+
+
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/assembly/assembly.xml b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/assembly/assembly.xml
new file mode 100644
index 0000000000..567e5690ed
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/assembly/assembly.xml
@@ -0,0 +1,41 @@
+
+
+
+
+ zip
+
+
+
+
+ ./bin
+ 0775
+
+
+
+
+
+ ${project.build.directory}/spring-kafka-3.x-scenario.jar
+ ./libs
+ 0775
+
+
+
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/java/test/apache/skywalking/apm/testcase/spring/kafka/Application.java b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/java/test/apache/skywalking/apm/testcase/spring/kafka/Application.java
new file mode 100644
index 0000000000..56eb4f0b97
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/java/test/apache/skywalking/apm/testcase/spring/kafka/Application.java
@@ -0,0 +1,30 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package test.apache.skywalking.apm.testcase.spring.kafka;
+
+import org.springframework.boot.SpringApplication;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+
+@SpringBootApplication
+public class Application {
+
+ public static void main(String[] args) {
+ SpringApplication.run(Application.class, args);
+ }
+}
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/java/test/apache/skywalking/apm/testcase/spring/kafka/controller/CaseController.java b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/java/test/apache/skywalking/apm/testcase/spring/kafka/controller/CaseController.java
new file mode 100644
index 0000000000..6044346ae9
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/java/test/apache/skywalking/apm/testcase/spring/kafka/controller/CaseController.java
@@ -0,0 +1,159 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package test.apache.skywalking.apm.testcase.spring.kafka.controller;
+
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.Response;
+import org.apache.kafka.clients.consumer.ConsumerConfig;
+import org.apache.kafka.clients.consumer.ConsumerRecord;
+import org.apache.kafka.clients.producer.ProducerConfig;
+import org.apache.kafka.common.serialization.Deserializer;
+import org.apache.kafka.common.serialization.StringDeserializer;
+import org.apache.kafka.common.serialization.StringSerializer;
+import org.springframework.beans.factory.annotation.Value;
+import org.springframework.context.annotation.PropertySource;
+import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
+import org.springframework.kafka.core.DefaultKafkaProducerFactory;
+import org.springframework.kafka.core.KafkaTemplate;
+import org.springframework.kafka.listener.AcknowledgingMessageListener;
+import org.springframework.kafka.listener.ContainerProperties;
+import org.springframework.kafka.listener.KafkaMessageListenerContainer;
+import org.springframework.kafka.support.Acknowledgment;
+import org.springframework.stereotype.Controller;
+import org.springframework.web.bind.annotation.RequestMapping;
+import org.springframework.web.bind.annotation.ResponseBody;
+
+import jakarta.annotation.PostConstruct;
+import java.util.Arrays;
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.concurrent.CountDownLatch;
+
+@Controller
+@RequestMapping("/case")
+@PropertySource("classpath:application.properties")
+public class CaseController {
+
+ private static final String SUCCESS = "Success";
+
+ @Value("${bootstrap.servers:127.0.0.1:9092}")
+ private String bootstrapServers;
+ private String topicName;
+ private KafkaTemplate<String, String> kafkaTemplate;
+ private KafkaTemplate<String, String> kafkaTemplate2;
+
+ private CountDownLatch latch;
+ private String helloWorld = "helloWorld";
+
+ @PostConstruct
+ private void setUp() {
+ topicName = "spring_test";
+ setUpProvider();
+ setUpAnotherProvider();
+ setUpConsumer();
+ }
+
+ private void setUpProvider() {
+ Map<String, Object> props = new HashMap<>();
+ props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
+ props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
+ props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
+ kafkaTemplate = new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(props));
+ try {
+ kafkaTemplate.send(topicName, "key", "ping").get();
+ kafkaTemplate.flush();
+ } catch (Exception e) {
+ e.printStackTrace();
+ }
+ }
+
+ private void setUpAnotherProvider() {
+ Map<String, Object> props = new HashMap<>();
+ // use a List value here to cover list-typed bootstrap server config
+ props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, Arrays.asList(bootstrapServers.split(",")));
+ props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
+ props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
+ kafkaTemplate2 = new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(props));
+ try {
+ kafkaTemplate2.send(topicName, "key", "ping").get();
+ kafkaTemplate2.flush();
+ } catch (Exception e) {
+ e.printStackTrace();
+ }
+ }
+
+ private void setUpConsumer() {
+ Map<String, Object> configs = new HashMap<>();
+ configs.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
+ configs.put(ConsumerConfig.GROUP_ID_CONFIG, "grop:" + topicName);
+ configs.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
+ Deserializer<String> stringDeserializer = new StringDeserializer();
+ DefaultKafkaConsumerFactory<String, String> factory = new DefaultKafkaConsumerFactory<>(configs, stringDeserializer, stringDeserializer);
+ ContainerProperties props = new ContainerProperties(topicName);
+ props.setMessageListener(new AcknowledgingMessageListener<String, String>() {
+ @Override
+ public void onMessage(ConsumerRecord<String, String> data, Acknowledgment acknowledgment) {
+ if (data.value().equals(helloWorld)) {
+ OkHttpClient client = new OkHttpClient.Builder().build();
+ Request request = new Request.Builder().url("http://localhost:8080/spring-kafka-3.x-scenario/case/spring-kafka-consumer-ping").build();
+ try (Response response = client.newCall(request).execute()) {
+ // response body is auto-closed; its content is not needed
+ } catch (IOException e) {
+ e.printStackTrace();
+ }
+ acknowledgment.acknowledge();
+ latch.countDown();
+ }
+ }
+ });
+ KafkaMessageListenerContainer<String, String> container = new KafkaMessageListenerContainer<>(factory, props);
+ container.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
+ container.start();
+ }
+
+ @RequestMapping("/spring-kafka-case")
+ @ResponseBody
+ public String springKafkaCase() throws Exception {
+ this.latch = new CountDownLatch(1);
+ kafkaTemplate.send(topicName, "key", helloWorld).get();
+ this.latch.await();
+ kafkaTemplate.flush();
+ this.latch = new CountDownLatch(1);
+ kafkaTemplate2.send(topicName, "key", helloWorld).get();
+ this.latch.await();
+ kafkaTemplate2.flush();
+ return SUCCESS;
+ }
+
+ @RequestMapping("/spring-kafka-consumer-ping")
+ @ResponseBody
+ public String springKafkaConsumerPing() {
+ return SUCCESS;
+ }
+
+ @RequestMapping("/healthCheck")
+ @ResponseBody
+ public String healthCheck() {
+ return SUCCESS;
+ }
+}
+
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/resources/application.properties b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/resources/application.properties
new file mode 100644
index 0000000000..09a297f5b4
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/resources/application.properties
@@ -0,0 +1,19 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+#
+server.port=8080
+server.servlet.context-path=/spring-kafka-3.x-scenario
\ No newline at end of file
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/resources/log4j2.xml b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/resources/log4j2.xml
new file mode 100644
index 0000000000..b5cda5ae8a
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/src/main/resources/log4j2.xml
@@ -0,0 +1,30 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/test/plugin/scenarios/spring-kafka-3.x-scenario/support-version.list b/test/plugin/scenarios/spring-kafka-3.x-scenario/support-version.list
new file mode 100644
index 0000000000..3f547d82e4
--- /dev/null
+++ b/test/plugin/scenarios/spring-kafka-3.x-scenario/support-version.list
@@ -0,0 +1,22 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Spring Kafka 3.x requires Spring Boot 3.x / JDK 17+.
+# This scenario uses the Spring Boot 3.1.12 BOM (Spring Framework 6.1).
+# spring-kafka 3.0.x is not tested here; it needs Spring Boot 3.0 and would be covered by a separate scenario if needed.
+3.1.4
+3.2.10
+3.3.7
diff --git a/test/plugin/scenarios/undertow-2.3.x-scenario/bin/startup.sh b/test/plugin/scenarios/undertow-2.3.x-scenario/bin/startup.sh
new file mode 100644
index 0000000000..e7e48dc3b2
--- /dev/null
+++ b/test/plugin/scenarios/undertow-2.3.x-scenario/bin/startup.sh
@@ -0,0 +1,21 @@
+#!/bin/bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+home="$(cd "$(dirname "$0")"; pwd)"
+
+java -jar ${agent_opts} "${home}/../libs/undertow-2.3.x-scenario.jar" &
\ No newline at end of file
diff --git a/test/plugin/scenarios/undertow-2.3.x-scenario/config/expectedData.yaml b/test/plugin/scenarios/undertow-2.3.x-scenario/config/expectedData.yaml
new file mode 100644
index 0000000000..b364557b9f
--- /dev/null
+++ b/test/plugin/scenarios/undertow-2.3.x-scenario/config/expectedData.yaml
@@ -0,0 +1,142 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+segmentItems:
+- serviceName: undertow-2.3.x-scenario
+ segmentSize: gt 5
+ segments:
+ - segmentId: not null
+ spans:
+ - operationName: GET:/undertow-scenario/case/undertow
+ parentSpanId: -1
+ spanId: 0
+ spanLayer: Http
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 84
+ isError: false
+ spanType: Entry
+ peer: ''
+ tags:
+ - {key: url, value: 'http://localhost:8080/undertow-scenario/case/undertow'}
+ - {key: http.method, value: GET}
+ - {key: http.status_code, value: '200'}
+ skipAnalysis: 'false'
+ - segmentId: not null
+ spans:
+ - operationName: GET:/undertow-routing-scenario/case/{context}
+ parentSpanId: -1
+ spanId: 0
+ spanLayer: Http
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 84
+ isError: false
+ spanType: Entry
+ peer: ''
+ tags:
+ - {key: url, value: 'http://localhost:8081/undertow-routing-scenario/case/undertow'}
+ - {key: http.method, value: GET}
+ - {key: http.status_code, value: '200'}
+ refs:
+ - {parentEndpoint: UndertowDispatch, networkAddress: 'localhost:8081', refType: CrossProcess,
+ parentSpanId: 1, parentTraceSegmentId: not null, parentServiceInstance: not
+ null, parentService: undertow-2.3.x-scenario, traceId: not null}
+ skipAnalysis: 'false'
+ - segmentId: not null
+ spans:
+ - operationName: GET:/undertow-scenario/case/undertow1
+ parentSpanId: -1
+ spanId: 0
+ spanLayer: Http
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 84
+ isError: false
+ spanType: Entry
+ peer: ''
+ tags:
+ - {key: url, value: 'http://localhost:8080/undertow-scenario/case/undertow1'}
+ - {key: http.method, value: GET}
+ - {key: http.status_code, value: '200'}
+ refs:
+ - {parentEndpoint: UndertowDispatch, networkAddress: 'localhost:8080', refType: CrossProcess,
+ parentSpanId: 1, parentTraceSegmentId: not null, parentServiceInstance: not
+ null, parentService: undertow-2.3.x-scenario, traceId: not null}
+ skipAnalysis: 'false'
+ - segmentId: not null
+ spans:
+ - operationName: /undertow-routing-scenario/case/undertow
+ parentSpanId: 0
+ spanId: 1
+ spanLayer: Http
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 2
+ isError: false
+ spanType: Exit
+ peer: localhost:8081
+ tags:
+ - {key: url, value: 'http://localhost:8081/undertow-routing-scenario/case/undertow?send=httpHandler'}
+ - {key: http.method, value: GET}
+ - {key: http.status_code, value: '200'}
+ skipAnalysis: 'false'
+ - operationName: UndertowDispatch
+ parentSpanId: -1
+ spanId: 0
+ spanLayer: Unknown
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 84
+ isError: false
+ spanType: Local
+ peer: ''
+ refs:
+ - {parentEndpoint: GET:/undertow-scenario/case/undertow, networkAddress: '', refType: CrossThread,
+ parentSpanId: 0, parentTraceSegmentId: not null, parentServiceInstance: not
+ null, parentService: not null, traceId: not null}
+ skipAnalysis: 'false'
+ - segmentId: not null
+ spans:
+ - operationName: /undertow-scenario/case/undertow1
+ parentSpanId: 0
+ spanId: 1
+ spanLayer: Http
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 2
+ isError: false
+ spanType: Exit
+ peer: localhost:8080
+ tags:
+ - {key: url, value: 'http://localhost:8080/undertow-scenario/case/undertow1?send=runnable'}
+ - {key: http.method, value: GET}
+ - {key: http.status_code, value: '200'}
+ skipAnalysis: 'false'
+ - operationName: UndertowDispatch
+ parentSpanId: -1
+ spanId: 0
+ spanLayer: Unknown
+ startTime: nq 0
+ endTime: nq 0
+ componentId: 84
+ isError: false
+ spanType: Local
+ peer: ''
+ refs:
+ - {parentEndpoint: 'GET:/undertow-routing-scenario/case/{context}', networkAddress: '',
+ refType: CrossThread, parentSpanId: 0, parentTraceSegmentId: not null, parentServiceInstance: not
+ null, parentService: undertow-2.3.x-scenario, traceId: not null}
+ skipAnalysis: 'false'
diff --git a/test/plugin/scenarios/undertow-2.3.x-scenario/configuration.yml b/test/plugin/scenarios/undertow-2.3.x-scenario/configuration.yml
new file mode 100644
index 0000000000..3994b092f8
--- /dev/null
+++ b/test/plugin/scenarios/undertow-2.3.x-scenario/configuration.yml
@@ -0,0 +1,20 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+type: jvm
+entryService: http://localhost:8080/undertow-scenario/case/undertow
+healthCheck: http://localhost:8080/undertow-scenario/case/healthCheck
+startScript: ./bin/startup.sh
diff --git a/test/plugin/scenarios/undertow-2.3.x-scenario/pom.xml b/test/plugin/scenarios/undertow-2.3.x-scenario/pom.xml
new file mode 100644
index 0000000000..736443dea5
--- /dev/null
+++ b/test/plugin/scenarios/undertow-2.3.x-scenario/pom.xml
@@ -0,0 +1,118 @@
+
+
+
+ 4.0.0
+
+ org.apache.skywalking
+ undertow-2.3.x-scenario
+ jar
+ 5.0.0
+
+
+ UTF-8
+ 11
+ 3.8.1
+ org.apache.skywalking.amp.testcase.undertow.Application
+
+ undertow
+ 2.3.0.Final
+ 2.1.6.RELEASE
+
+
+ skywalking-undertow-2.3.x-scenario
+
+
+
+ javax.servlet
+ javax.servlet-api
+ 3.1.0
+ provided
+
+
+ io.undertow
+ undertow-core
+ ${test.framework.version}
+
+
+
+ org.apache.httpcomponents
+ httpclient
+ 4.3
+
+
+
+
+ undertow-2.3.x-scenario
+
+
+ org.springframework.boot
+ spring-boot-maven-plugin
+ ${spring.boot.version}
+
+
+
+ repackage
+
+
+
+
+
+ maven-compiler-plugin
+ ${maven-compiler-plugin.version}
+
+ ${compiler.version}
+ ${compiler.version}
+ ${project.build.sourceEncoding}
+
+
+
+ org.apache.maven.plugins
+ maven-assembly-plugin
+
+
+ assemble
+ package
+
+ single
+
+
+
+ src/main/assembly/assembly.xml
+
+ ./target/
+
+
+
+
+
+
+
+
+
+ spring-snapshots
+ https://repo.spring.io/snapshot
+
+
+ spring-milestones
+ https://repo.spring.io/milestone
+
+
+
diff --git a/test/plugin/scenarios/undertow-2.3.x-scenario/src/main/assembly/assembly.xml b/test/plugin/scenarios/undertow-2.3.x-scenario/src/main/assembly/assembly.xml
new file mode 100644
index 0000000000..443828cfcf
--- /dev/null
+++ b/test/plugin/scenarios/undertow-2.3.x-scenario/src/main/assembly/assembly.xml
@@ -0,0 +1,41 @@
+
+
+
+
+ zip
+
+
+
+
+ ./bin
+ 0775
+
+
+
+
+
+ ${project.build.directory}/undertow-2.3.x-scenario.jar
+ ./libs
+ 0775
+
+
+
diff --git a/test/plugin/scenarios/undertow-2.3.x-scenario/src/main/java/org/apache/skywalking/amp/testcase/undertow/Application.java b/test/plugin/scenarios/undertow-2.3.x-scenario/src/main/java/org/apache/skywalking/amp/testcase/undertow/Application.java
new file mode 100644
index 0000000000..fd45a07d01
--- /dev/null
+++ b/test/plugin/scenarios/undertow-2.3.x-scenario/src/main/java/org/apache/skywalking/amp/testcase/undertow/Application.java
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ */
+
+package org.apache.skywalking.amp.testcase.undertow;
+
+import io.undertow.Undertow;
+import io.undertow.server.HttpHandler;
+import io.undertow.server.RoutingHandler;
+import io.undertow.util.Headers;
+import io.undertow.util.Methods;
+import java.io.IOException;
+import org.apache.http.HttpEntity;
+import org.apache.http.client.ResponseHandler;
+import org.apache.http.client.methods.HttpGet;
+import org.apache.http.impl.client.CloseableHttpClient;
+import org.apache.http.impl.client.HttpClients;
+import org.apache.http.util.EntityUtils;
+
+public class Application {
+
+ private static final String CASE_URL = "/undertow-scenario/case/undertow";
+
+ private static final String TEMPLATE = "/undertow-routing-scenario/case/{context}";
+
+ private static final String ROUTING_CASE_URL = "/undertow-routing-scenario/case/undertow";
+
+ public static void main(String[] args) throws InterruptedException {
+ new Thread(Application::undertowRouting).start();
+ undertow();
+ }
+
+ private static void undertow() {
+ Undertow server = Undertow.builder().addHttpListener(8080, "0.0.0.0").setHandler(exchange -> {
+ if (CASE_URL.equals(exchange.getRequestPath())) {
+ exchange.dispatch(() -> {
+ try {
+ visit("http://localhost:8081/undertow-routing-scenario/case/undertow?send=httpHandler");
+ } catch (IOException e) {
+ e.printStackTrace();
+ }
+ });
+ }
+ exchange.getResponseHeaders().put(Headers.CONTENT_TYPE, "text/plain");
+ exchange.getResponseSender().send("Success");
+ }).build();
+ Runtime.getRuntime().addShutdownHook(new Thread(server::stop));
+ server.start();
+ }
+
+ private static void undertowRouting() {
+ HttpHandler httpHandler = exchange -> {
+ if (ROUTING_CASE_URL.equals(exchange.getRequestPath())) {
+ exchange.dispatch(httpServerExchange -> visit("http://localhost:8080/undertow-scenario/case/undertow1?send=runnable"));
+ }
+ exchange.getResponseHeaders().put(Headers.CONTENT_TYPE, "text/plain");
+ exchange.getResponseSender().send("Success");
+ };
+ RoutingHandler handler = new RoutingHandler();
+ handler.add(Methods.GET, TEMPLATE, httpHandler);
+ handler.add(Methods.HEAD, TEMPLATE, httpHandler);
+ Undertow server = Undertow.builder().addHttpListener(8081, "0.0.0.0").setHandler(handler).build();
+ Runtime.getRuntime().addShutdownHook(new Thread(server::stop));
+ server.start();
+ }
+
+ private static void visit(String url) throws IOException {
+ CloseableHttpClient httpClient = HttpClients.createDefault();
+ try {
+ HttpGet httpget = new HttpGet(url);
+ ResponseHandler<String> responseHandler = response -> {
+ HttpEntity entity = response.getEntity();
+ return entity != null ? EntityUtils.toString(entity) : null;
+ };
+ httpClient.execute(httpget, responseHandler);
+ } finally {
+ httpClient.close();
+ }
+ }
+}
diff --git a/test/plugin/scenarios/undertow-2.3.x-scenario/support-version.list b/test/plugin/scenarios/undertow-2.3.x-scenario/support-version.list
new file mode 100644
index 0000000000..a3dca1ddcf
--- /dev/null
+++ b/test/plugin/scenarios/undertow-2.3.x-scenario/support-version.list
@@ -0,0 +1,18 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Undertow 2.3.x requires Java 11+, tested with JDK 17
+2.3.18.Final
diff --git a/test/plugin/scenarios/undertow-scenario/support-version.list b/test/plugin/scenarios/undertow-scenario/support-version.list
index 420cc76010..a6d77253e2 100644
--- a/test/plugin/scenarios/undertow-scenario/support-version.list
+++ b/test/plugin/scenarios/undertow-scenario/support-version.list
@@ -19,6 +19,9 @@
# 1.3.x 1.4.x: select the first, middle, and last version
# 2.0.x: when there are many releases in the same month, choose the last one
+# Undertow 2.3.x requires Java 11+, tested in undertow-2.3.x-scenario (JDK 17 workflow)
1.3.33.Final
1.4.27.Final
2.0.27.Final
+2.1.8.Final
+2.2.37.Final