MINIFICPP-2668 Move standard processor tests to modular docker tests #2061
base: MINIFICPP-2666
Conversation
Pull Request Overview
This PR migrates standard processor tests from the legacy docker integration test framework to the newer modular behave-based test framework. The changes consolidate testing infrastructure by removing duplicate processor implementations and container classes from the old framework.
Key changes:
- Adds new container implementations (TcpClientContainer, DiagSlave) to the modular test framework
- Updates 8 feature files to use the new framework's syntax and step definitions
- Removes deprecated processor and container classes from the old docker integration framework
- Extends the behave framework with new step definitions and helper methods
Reviewed Changes
Copilot reviewed 26 out of 26 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| tcp_client_container.py | New TCP client container for network listener tests |
| diag_slave_container.py | New DiagSlave container for Modbus testing |
| steps.py | Adds step definitions for PLC/Modbus and TCP client setup |
| *.feature files (8 files) | Migrated to new framework syntax with updated assertions |
| environment.py | Adds platform detection for x86/x64-only tests |
| core_steps.py | New steps for file creation and FIPS mode |
| checking_steps.py | New assertion steps for JSON and empty file validation |
| container.py | Enhanced with JSON validation and debug print statement |
| minifi_container.py | Adds FIPS mode support |
| Old framework files | Removes deprecated processor/container implementations |
Comments suppressed due to low confidence (1)
extensions/standard-processors/tests/features/defragtextflowfiles.feature:42
- The expected output appears to have changed from having one '<1>,' to two '<1>,<1>,' at the beginning. This change affects multiple test scenarios (lines 42 and 69). Ensure this change reflects the actual expected behavior and that corresponding tests verify this is correct, as this could indicate a change in how the DefragmentText processor handles end-of-message patterns.
```diff
  | input_one | input_two | pattern | pattern location | success_flow_files |
  | <1>cat%dog%mouse%<2>apple%banana%<3>English% | <1>Katze%Hund%Maus%<2>Apfel%Banane%<3>Deutsch% | <[0-9]+> | Start of Message | <1>cat%dog%mouse%,<1>Katze%Hund%Maus%,<2>apple%banana%,<2>Apfel%Banane% |
- | <1>cat%dog%mouse%<2>apple%banana%<3>English% | <1>Katze%Hund%Maus%<2>Apfel%Banane%<3>Deutsch% | <[0-9]+> | End of Message | <1>,cat%dog%mouse%<2>,Katze%Hund%Maus%<2>,apple%banana%<3>,Apfel%Banane%<3> |
+ | <1>cat%dog%mouse%<2>apple%banana%<3>English% | <1>Katze%Hund%Maus%<2>Apfel%Banane%<3>Deutsch% | <[0-9]+> | End of Message | <1>,<1>,cat%dog%mouse%<2>,Katze%Hund%Maus%<2>,apple%banana%<3>,Apfel%Banane%<3> |
```
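The new End of Message expectation can be reproduced with a small model of end-of-message splitting. This is a sketch for illustration only, not the DefragmentText processor's actual implementation; the helper name is hypothetical.

```python
import re

def split_end_of_message(text: str, pattern: str) -> list[str]:
    # Each pattern occurrence terminates a fragment; trailing text with no
    # closing pattern stays buffered and is not released as a success flow file.
    fragments, start = [], 0
    for match in re.finditer(pattern, text):
        fragments.append(text[start:match.end()])
        start = match.end()
    return fragments

one = split_end_of_message("<1>cat%dog%mouse%<2>apple%banana%<3>English%", r"<[0-9]+>")
two = split_end_of_message("<1>Katze%Hund%Maus%<2>Apfel%Banane%<3>Deutsch%", r"<[0-9]+>")

# Interleaving the two inputs' fragments yields the table's expected list,
# including a leading "<1>" fragment from each input:
interleaved = [frag for pair in zip(one, two) for frag in pair]
assert interleaved == ["<1>", "<1>", "cat%dog%mouse%<2>", "Katze%Hund%Maus%<2>",
                       "apple%banana%<3>", "Apfel%Banane%<3>"]
```

Under this model, the "<1>" at the start of each input forms its own (empty-bodied) fragment, which is why the expected output begins with two `<1>,` entries.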
What changed here?
I wasn't sure what changed here either, but when I checked, this should have been the right result, so maybe the checker function was wrong previously.
It would be nice to add a few negative tests if possible, and that should also validate that the test framework doesn't just accept any input or output.
I'm not sure adding additional tests just to test the test framework is a good way to go, especially in system tests, where setting up an environment is time- and resource-heavy. Maybe some unit tests could be added for the more complex test utilities.
In this case my guess is that the previous framework checked whether a specific output was present in the output file (e.g. `"output" in file_content`), and because the expected output was a substring of the actual output, the test case succeeded. Now we check for an exact match, which is why the check with the original expectation failed.
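The difference between the two checking strategies can be sketched in a few lines (hypothetical helper names, not the framework's actual code):

```python
def substring_check(expected: str, file_content: str) -> bool:
    # Old-style check: passes if the expected output appears anywhere in the file.
    return expected in file_content

def exact_check(expected: str, file_content: str) -> bool:
    # New-style check: passes only on an exact match.
    return expected == file_content

actual = "<1>,<1>,cat%dog%mouse%<2>"    # what the processor produced
old_expected = "<1>,cat%dog%mouse%<2>"  # old expectation, one "<1>," short

assert substring_check(old_expected, actual)    # old framework: passes
assert not exact_check(old_expected, actual)    # new framework: fails
```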
Negative tests are a good practice anyway, to see that the code fails when it's supposed to fail, and a partial validation of the test framework is just a happy side effect. But if it's too cumbersome to write, I'm not gonna insist.
That's true; I think I misunderstood your intention in the previous comment. We could add some negative tests in the future, although I would first check which error/invalid scenarios are already covered at the unit test level, and only add tests here for scenarios that are not possible or viable as unit tests; otherwise, maybe one or two for system-level validation if needed. But I think re-evaluating the DefragmentText processor test scenarios and extending its test suites is out of scope for this PR.
```python
cmd = (
    "/bin/sh -c 'apk add netcat-openbsd && "
    "echo TCP client container started; "
    "while true; do echo test_tcp_message | "
    f"nc minifi-primary-{test_context.scenario_id} 10254; "
    "sleep 1; done'"
)
```
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Is wrapping in `sh -c` necessary? I suspect it's not, because the command is given as a single string, meaning it has to go through a command-line interpreter before being executed.
Suggested change (dropping the `sh -c` wrapper):
```python
cmd = (
    "apk add netcat-openbsd && "
    "echo TCP client container started; "
    "while true; do"
    f" echo test_tcp_message | nc minifi-primary-'{test_context.scenario_id}' 10254; "
    " sleep 1;"
    "done"
)
```
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Unfortunately it fails without it. The wrapper is needed when passing a chained command; otherwise `apk` would try to interpret each parameter after `apk add` as a separate package to be installed.
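The point can be checked locally with Python's `subprocess` module, which mirrors how a container runtime treats an argument list versus a shell string (a sketch; assumes a typical Linux environment with `/bin/sh`):

```python
import subprocess

# Without a shell, "&&" is just another argument; echo prints it literally
# instead of chaining a second command.
plain = subprocess.run(["echo", "hello", "&&", "echo", "world"],
                       capture_output=True, text=True)
assert plain.stdout.strip() == "hello && echo world"

# Wrapped in sh -c, the interpreter parses the chain into two commands.
chained = subprocess.run(["/bin/sh", "-c", "echo hello && echo world"],
                         capture_output=True, text=True)
assert chained.stdout.strip().splitlines() == ["hello", "world"]
```

The same applies to `apk add foo && ...`: without an interpreter, `&&` and everything after it would be passed to `apk` as extra package names.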
https://issues.apache.org/jira/browse/MINIFICPP-2668
Depends on #2059
Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution, we ask you to ensure the following steps have been taken:
For all changes:
- Is there a JIRA ticket associated with this PR? Is it referenced in the commit message?
- Does your PR title start with MINIFICPP-XXXX, where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.
- Has your PR been rebased against the latest commit within the target branch (typically main)?
- Is your initial contribution a single, squashed commit?
For code changes:
For documentation related changes:
Note:
Please ensure that once the PR is submitted, you check GitHub Actions CI results for build issues and submit an update to your PR as soon as possible.