Deprecate LP batch solve across Python, server, and docs #915

anandhkb wants to merge 2 commits into NVIDIA:main
Conversation
anandhkb
commented
Feb 27, 2026
- Add DeprecationWarning to LP BatchSolve in solver.py
- Replace server batch path with sequential Solve() loop
- Add empty-list guard for batch LP_data (HTTP 400)
- Update cuopt_self_hosted client batch mode docs
- Update FAQ, lp-examples, batch_mode_example.sh
- Fix test_lp_solver: individual_solutions = [] (was [] * nb_solves)
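The server-side change in the bullets above (sequential solving plus an empty-list guard) can be sketched in miniature. This is a hypothetical stand-in, not cuOpt's actual server code: `solve_stub` and `solve_multiple_lps` are illustrative names, and the integer status codes stand in for real HTTP responses.

```python
def solve_stub(lp):
    # Hypothetical per-problem solver standing in for linear_programming.Solve().
    return {"problem": lp, "status": "Optimal"}

def solve_multiple_lps(lp_data):
    # Empty-list guard: reject an empty batch up front (maps to HTTP 400).
    if not lp_data:
        return 400, {"error": "LP_data must contain at least one problem"}
    # Sequential path: one Solve() call per problem, replacing BatchSolve.
    solutions = [solve_stub(item) for item in lp_data]
    return 200, {"solutions": solutions}

status, body = solve_multiple_lps([])
print(status)  # 400
status, body = solve_multiple_lps(["lp1", "lp2"])
print(status, len(body["solutions"]))  # 200 2
```

The guard-first shape keeps the error path out of the solve loop, mirroring the HTTP 400 behavior described in the PR summary.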
📝 Walkthrough

LP batch mode is deprecated: docs and examples updated to "Multiple LPs", BatchSolve now emits a DeprecationWarning, server processes multiple LPs by calling Solve per-item, CLI/client help text and warmstart validation updated, and tests assert deprecation and equivalence with sequential solves.
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes

🚥 Pre-merge checks: ❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py (1)
491-496: ⚠️ Potential issue | 🟠 Major

Strengthen batch-vs-sequential checks to include numerical equality.
This test currently validates only termination status. It should also compare objective/primal outputs to confirm numerical correctness equivalence.
Proposed assertion additions
```diff
 for i in range(nb_solves):
     assert (
         batch_solution[i].get_termination_status()
         == individual_solutions[i].get_termination_status()
     )
+    assert batch_solution[i].get_primal_objective() == pytest.approx(
+        individual_solutions[i].get_primal_objective(), rel=1e-6, abs=1e-8
+    )
+    assert np.array(batch_solution[i].get_primal_solution()) == pytest.approx(
+        np.array(individual_solutions[i].get_primal_solution()),
+        rel=1e-5,
+        abs=1e-7,
+    )
```

As per coding guidelines: "Write tests validating numerical correctness of optimization results (not just 'runs without error')".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py` around lines 491 - 496, The test only checks termination status; extend it to assert numerical equality between batch_solution and individual_solutions for each i in range(nb_solves) by comparing objective values and primal variable vectors: for each index use the solver result accessors (e.g., get_objective_value() or objective/obj property and get_primal_values() or primal/vars accessor on batch_solution[i] and individual_solutions[i]) and assert the objectives are equal within a small tolerance (pytest.approx or an absolute/relative tol) and the primal arrays match element-wise within tolerance; keep the existing termination-status assertion and add these numeric assertions in the same loop.
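The tolerance semantics in the suggested assertions (pass when within either the relative or the absolute bound) match the stdlib's `math.isclose`, so the core numerical-equality check can be sketched without pytest. The objective values below are made up for illustration:

```python
import math

batch_objectives = [12.500000001, -3.75]        # hypothetical batch results
sequential_objectives = [12.5, -3.7500000002]   # hypothetical sequential results

for b, s in zip(batch_objectives, sequential_objectives):
    # Same OR-combination of tolerances as pytest.approx(rel=1e-6, abs=1e-8)
    assert math.isclose(b, s, rel_tol=1e-6, abs_tol=1e-8)

print("objectives match within tolerance")
```

In a real test, `pytest.approx` is preferable because its failure messages show both values and the tolerance that was violated.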
🧹 Nitpick comments (3)
python/cuopt_server/cuopt_server/utils/linear_programming/solver.py (1)
358-362: Consider: wall-clock timing vs. engine solve time.

The timing logic uses `time.time()` around the `Solve` call, which measures wall-clock time including Python overhead. The original `BatchSolve` may have returned engine-level solve time. This could result in slightly different `total_solve_time` values compared to the deprecated batch path.

If precise engine-level timing is required, consider summing `i_sol.get_solve_time()` instead:

💡 Alternative timing approach

```diff
 for i_data in LP_data:
     i_warnings, data_model = create_data_model(i_data)
     warnings.extend(i_warnings)
-    solve_begin_time = time.time()
     i_sol = linear_programming.Solve(
         data_model, solver_settings=solver_settings
     )
-    total_solve_time += time.time() - solve_begin_time
+    total_solve_time += i_sol.get_solve_time()
     sol.append(i_sol)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@python/cuopt_server/cuopt_server/utils/linear_programming/solver.py` around lines 358 - 362: the current timing wraps the call to linear_programming.Solve with time.time() (solve_begin_time and total_solve_time), which measures wall-clock/Python overhead rather than the solver engine's internal time; change the accumulation to use the solver-reported time by calling i_sol.get_solve_time() (or equivalent on the Solve result) and add that to total_solve_time so timings match the original BatchSolve semantics and exclude Python overhead.

docs/cuopt/source/cuopt-server/examples/lp-examples.rst (2)
18-21: Section title inconsistent with deprecation messaging.

Line 5 states batch mode is deprecated, but the section title at line 20 still reads "Genric Example With Normal Mode and Batch Mode" without indicating deprecation. Additionally, "Genric" appears to be a typo for "Generic".
Consider updating the title to reflect deprecation:
📝 Suggested title update

```diff
-.. _generic-example-with-normal-and-batch-mode:
+.. _generic-example-with-normal-and-multiple-lps:

-Genric Example With Normal Mode and Batch Mode
------------------------------------------------
+Generic Example With Normal Mode and Multiple LPs (Batch Deprecated)
+---------------------------------------------------------------------
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/cuopt/source/cuopt-server/examples/lp-examples.rst` around lines 18 - 21, The section title anchored by "_generic-example-with-normal-and-batch-mode" contains a typo ("Genric") and fails to indicate that batch mode is deprecated; update the heading text "Genric Example With Normal Mode and Batch Mode" to correct the typo to "Generic" and explicitly mark batch mode as deprecated (e.g., "Generic Example With Normal Mode (Batch Mode deprecated)") so the title aligns with the deprecation note in the body.
405-408: CLI batch mode section lacks deprecation notice.

The CLI batch mode section describes sending multiple MPS files at once but doesn't mention that batch mode is deprecated, unlike the Python examples section (line 5). For consistency, consider adding a deprecation note here as well.
📝 Suggested deprecation notice

```diff
-In the case of batch mode, you can send a bunch of ``mps`` files at once, and acquire results. The batch mode works only for ``mps`` in the case of CLI:
+In the case of batch mode, you can send a bunch of ``mps`` files at once, and acquire results. The batch mode works only for ``mps`` in the case of CLI.
+
+.. note::
+   LP batch mode is deprecated. Multiple problems are now solved sequentially.

 .. note::
    Batch mode is not available for MILP problems.
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/cuopt/source/cuopt-server/examples/lp-examples.rst` around lines 405 - 408, Add a deprecation note to the CLI batch-mode paragraph that starts "In the case of batch mode, you can send a bunch of ``mps`` files at once" so it matches the Python examples deprecation: insert a short ".. deprecated::" directive (or equivalent deprecation note) immediately before or after that paragraph and keep the existing clarifying note "Batch mode is not available for MILP problems." so readers see both the deprecation and the current limitation.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@python/cuopt_self_hosted/cuopt_sh_client/cuopt_sh.py`:
- Line 375: Add a guard in solve() before calling get_LP_solve() that rejects
using warmstart_id when more than one LP problem is supplied: if
args.warmstart_id is set and the request contains multiple LP problems (same
condition used in the existing init_ids validation), raise/exit with a clear
error message. This mirrors the existing init_ids check and prevents forwarding
warmstart_id into get_LP_solve()/_send_request() when multiple problems are
present.
In `@python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py`:
- Around line 480-482: The test calls the deprecated BatchSolve but doesn't
assert that a DeprecationWarning is emitted; wrap the call to
solver.BatchSolve(...) in a pytest.warns(DeprecationWarning) context (or use
pytest.deprecated_call) to assert the deprecation contract, keeping the rest of
the assertions on batch_solution and solve_time unchanged; reference the
existing call to BatchSolve in
python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py to locate where to
add the pytest.warns assertion and ensure the test fails if the warning is not
emitted.
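The pytest.warns contract described above can also be exercised with the stdlib alone, which is useful outside a pytest run. Here `batch_solve_stub` is a hypothetical stand-in for the deprecated BatchSolve entry point:

```python
import warnings

def batch_solve_stub(problems):
    # Hypothetical stand-in for the deprecated BatchSolve entry point.
    warnings.warn("BatchSolve is deprecated", DeprecationWarning, stacklevel=2)
    return [f"solved:{p}" for p in problems]

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # don't let default filters swallow it
    out = batch_solve_stub(["lp1", "lp2"])

assert out == ["solved:lp1", "solved:lp2"]
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
print("DeprecationWarning asserted")
```

Inside a test, `pytest.warns(DeprecationWarning)` or `pytest.deprecated_call()` express the same check more concisely and fail loudly when the warning is missing.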
---
Outside diff comments:
In `@python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py`:
- Around line 491-496: The test only checks termination status; extend it to
assert numerical equality between batch_solution and individual_solutions for
each i in range(nb_solves) by comparing objective values and primal variable
vectors: for each index use the solver result accessors (e.g.,
get_objective_value() or objective/obj property and get_primal_values() or
primal/vars accessor on batch_solution[i] and individual_solutions[i]) and
assert the objectives are equal within a small tolerance (pytest.approx or an
absolute/relative tol) and the primal arrays match element-wise within
tolerance; keep the existing termination-status assertion and add these numeric
assertions in the same loop.
---
Nitpick comments:
In `@docs/cuopt/source/cuopt-server/examples/lp-examples.rst`:
- Around line 18-21: The section title anchored by
"_generic-example-with-normal-and-batch-mode" contains a typo ("Genric") and
fails to indicate that batch mode is deprecated; update the heading text "Genric
Example With Normal Mode and Batch Mode" to correct the typo to "Generic" and
explicitly mark batch mode as deprecated (e.g., "Generic Example With Normal
Mode (Batch Mode deprecated)") so the title aligns with the deprecation note in
the body.
- Around line 405-408: Add a deprecation note to the CLI batch-mode paragraph
that starts "In the case of batch mode, you can send a bunch of ``mps`` files at
once" so it matches the Python examples deprecation: insert a short "..
deprecated::" directive (or equivalent deprecation note) immediately before or
after that paragraph and keep the existing clarifying note "Batch mode is not
available for MILP problems." so readers see both the deprecation and the
current limitation.
In `@python/cuopt_server/cuopt_server/utils/linear_programming/solver.py`:
- Around line 358-362: The current timing wraps the call to
linear_programming.Solve with time.time() (solve_begin_time and
total_solve_time), which measures wall-clock/Python overhead rather than the
solver engine's internal time; change the accumulation to use the
solver-reported time by calling i_sol.get_solve_time() (or equivalent on the
Solve result) and add that to total_solve_time so timings match the original
BatchSolve semantics and exclude Python overhead.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (11)
- .github/skills/cuopt-lp-milp/resources/server_examples.md
- docs/cuopt/source/cuopt-server/examples/lp-examples.rst
- docs/cuopt/source/cuopt-server/examples/lp/examples/batch_mode_example.sh
- docs/cuopt/source/faq.rst
- python/cuopt/cuopt/linear_programming/solver/solver.py
- python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py
- python/cuopt_self_hosted/README.md
- python/cuopt_self_hosted/cuopt_sh_client/cuopt_self_host_client.py
- python/cuopt_self_hosted/cuopt_sh_client/cuopt_sh.py
- python/cuopt_server/cuopt_server/utils/linear_programming/solver.py
- python/cuopt_server/cuopt_server/webserver.py
- Add warmstart_id guard in cuopt_sh.py when multiple LP problems
- Fix lp-examples.rst: Genric → Generic, add deprecation to section title
- Add deprecation note to CLI batch mode section
- Fix lp-qp-features.rst references to renamed RST anchor
- Use i_sol.get_solve_time() in server batch path (engine timing)
- Fix error status serialization: use .name for JSON in batch error path
- Add numerical equality assertions in test_parser_and_batch_solver
- Remove pytest.warns for DeprecationWarning (fails with older installed pkg)
Actionable comments posted: 2
🧹 Nitpick comments (1)
docs/cuopt/source/cuopt-server/examples/lp-examples.rst (1)

405-406: Clarify phrasing for deprecated batch mode.

The text "In the case of batch mode, you can send a bunch of mps files at once" uses present tense and suggests batch mode is a current feature, which may confuse readers given that batch mode is deprecated. Consider rephrasing to clarify the deprecated status more clearly in this context.
📝 Suggested rephrasing

```diff
-In the case of batch mode, you can send a bunch of ``mps`` files at once, and acquire results. The batch mode works only for ``mps`` in the case of CLI.
+For solving multiple ``mps`` files, you can send them together (they will be solved sequentially). This multi-file mode works only for ``mps`` in the case of CLI.
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/cuopt/source/cuopt-server/examples/lp-examples.rst` around lines 405 - 406, The sentence "In the case of batch mode, you can send a bunch of ``mps`` files at once, and acquire results." incorrectly uses present tense and implies batch mode is current; update the wording to clearly indicate batch mode is deprecated (e.g., change to past or explicitly mention "deprecated") and clarify it only applied to CLI for ``mps`` files—edit the sentence in the paragraph that mentions batch mode so it explicitly states that batch mode was deprecated and therefore should not be relied upon.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/cuopt/source/cuopt-server/examples/lp-examples.rst`:
- Line 5: Remove inline deprecation notices from the example: delete the
parenthetical "(solved sequentially; batch mode is deprecated)", remove "(Batch
Deprecated)" from the section title, and remove the "LP batch mode is
deprecated" note block; instead use neutral wording that describes behavior
(e.g., "solved sequentially" or "batch behavior") and add a brief pointer to the
external documentation site (a short "See external docs for deprecation and
migration guidance") so all deprecation and migration details are maintained
off-repo per project guidelines.
In `@python/cuopt_server/cuopt_server/utils/linear_programming/solver.py`:
- Around line 351-360: The batch loop currently builds solver_settings once from
create_solver(LP_data[0], warmstart_data) and reuses it for all items, dropping
per-item solver_config; update the logic so for each i_data you call
create_solver(i_data, warmstart_data) (or merge per-item solver_config into a
base via create_solver) before invoking linear_programming.Solve; locate
create_solver, LP_data, warmstart_data, solver_settings, create_data_model and
linear_programming.Solve and ensure warnings from create_solver are appended per
item and that each Solve receives the correct per-item solver_settings.
---
Nitpick comments:
In `@docs/cuopt/source/cuopt-server/examples/lp-examples.rst`:
- Around line 405-406: The sentence "In the case of batch mode, you can send a
bunch of ``mps`` files at once, and acquire results." incorrectly uses present
tense and implies batch mode is current; update the wording to clearly indicate
batch mode is deprecated (e.g., change to past or explicitly mention
"deprecated") and clarify it only applied to CLI for ``mps`` files—edit the
sentence in the paragraph that mentions batch mode so it explicitly states that
batch mode was deprecated and therefore should not be relied upon.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- docs/cuopt/source/cuopt-server/examples/lp-examples.rst
- docs/cuopt/source/lp-qp-features.rst
- python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py
- python/cuopt_self_hosted/cuopt_sh_client/cuopt_sh.py
- python/cuopt_server/cuopt_server/utils/linear_programming/solver.py
🚧 Files skipped from review as they are similar to previous changes (1)
- python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py
```diff
 ===============================

-The following example showcases how to use the ``CuOptServiceSelfHostClient`` to solve a simple LP problem in normal mode and batch mode (where multiple problems are solved at once).
+The following example showcases how to use the ``CuOptServiceSelfHostClient`` to solve a simple LP problem in normal mode and with multiple problems (solved sequentially; batch mode is deprecated).
```
🧩 Analysis chain
🏁 Script executed:

```shell
# Check the file content to see the deprecation notices mentioned
cat -n docs/cuopt/source/cuopt-server/examples/lp-examples.rst | head -30
```

Repository: NVIDIA/cuopt
Length of output: 1520

🏁 Script executed:

```shell
# Check more of the file around lines 407-408
sed -n '400,415p' docs/cuopt/source/cuopt-server/examples/lp-examples.rst
```

Repository: NVIDIA/cuopt
Length of output: 716

🏁 Script executed:

```shell
# Search for any references to external docs or deprecation guidance
rg -i "deprecat" docs/cuopt/source/cuopt-server/examples/lp-examples.rst -A 2 -B 2
```

Repository: NVIDIA/cuopt
Length of output: 884
Remove or relocate deprecation notices to external documentation site.
This file has been updated with multiple deprecation notices added directly to the in-repo .rst documentation:
- Line 5: "(solved sequentially; batch mode is deprecated)"
- Line 20: Section title includes "(Batch Deprecated)"
- Line 407-408: Note block stating "LP batch mode is deprecated"
Per cuOpt project guidelines, deprecation notices and migration guidance for RAPIDS projects are maintained on the external docs site rather than in in-repo documentation. Remove these inline deprecation references and rely on external documentation to communicate deprecated functionality.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@docs/cuopt/source/cuopt-server/examples/lp-examples.rst` at line 5, Remove
inline deprecation notices from the example: delete the parenthetical "(solved
sequentially; batch mode is deprecated)", remove "(Batch Deprecated)" from the
section title, and remove the "LP batch mode is deprecated" note block; instead
use neutral wording that describes behavior (e.g., "solved sequentially" or
"batch behavior") and add a brief pointer to the external documentation site (a
short "See external docs for deprecation and migration guidance") so all
deprecation and migration details are maintained off-repo per project
guidelines.
```diff
 cswarnings, solver_settings = create_solver(
     LP_data[0], warmstart_data
 )
 warnings.extend(cswarnings)
-sol, total_solve_time = linear_programming.BatchSolve(
-    data_model_list, solver_settings
-)
+for i_data in LP_data:
+    i_warnings, data_model = create_data_model(i_data)
+    warnings.extend(i_warnings)
+    i_sol = linear_programming.Solve(
+        data_model, solver_settings=solver_settings
+    )
```
Batch solves currently ignore per-item solver settings after index 0.
At Line 351-353, create_solver is only built from LP_data[0], and Line 359 reuses it for every i_data. Any different solver_config in later batch entries is silently dropped.
🔧 Proposed fix
```diff
-cswarnings, solver_settings = create_solver(
-    LP_data[0], warmstart_data
-)
-warnings.extend(cswarnings)
 for i_data in LP_data:
     i_warnings, data_model = create_data_model(i_data)
     warnings.extend(i_warnings)
+    cswarnings, solver_settings = create_solver(
+        i_data, warmstart_data
+    )
+    warnings.extend(cswarnings)
     i_sol = linear_programming.Solve(
         data_model, solver_settings=solver_settings
     )
```

As per coding guidelines "**/*.{h,hpp,py}: ... maintain backward compatibility in Python and server APIs with deprecation warnings".
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
for i_data in LP_data:
    i_warnings, data_model = create_data_model(i_data)
    warnings.extend(i_warnings)
    cswarnings, solver_settings = create_solver(
        i_data, warmstart_data
    )
    warnings.extend(cswarnings)
    i_sol = linear_programming.Solve(
        data_model, solver_settings=solver_settings
    )
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@python/cuopt_server/cuopt_server/utils/linear_programming/solver.py` around
lines 351 - 360, The batch loop currently builds solver_settings once from
create_solver(LP_data[0], warmstart_data) and reuses it for all items, dropping
per-item solver_config; update the logic so for each i_data you call
create_solver(i_data, warmstart_data) (or merge per-item solver_config into a
base via create_solver) before invoking linear_programming.Solve; locate
create_solver, LP_data, warmstart_data, solver_settings, create_data_model and
linear_programming.Solve and ensure warnings from create_solver are appended per
item and that each Solve receives the correct per-item solver_settings.
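The settings-reuse bug flagged above can be reproduced in miniature: building a settings object once from the first item and reusing it drops per-item configuration, while rebuilding it per item does not. The names below are illustrative, not cuOpt's API:

```python
def create_settings(item):
    # Hypothetical per-item settings factory; real cuOpt uses create_solver().
    return {"time_limit": item.get("time_limit", 10)}

items = [{"name": "a", "time_limit": 5}, {"name": "b", "time_limit": 60}]

# Buggy pattern: settings built once from items[0], reused for every item.
shared = create_settings(items[0])
buggy = [(it["name"], shared["time_limit"]) for it in items]

# Fixed pattern: settings rebuilt from each item.
fixed = [(it["name"], create_settings(it)["time_limit"]) for it in items]

print(buggy)  # [('a', 5), ('b', 5)]  -- item b's 60s limit silently dropped
print(fixed)  # [('a', 5), ('b', 60)]
```

The same shape applies to the server loop: calling the factory inside the loop is what makes each Solve see its own solver_config.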
7004771 to 81719c3
Actionable comments posted: 1
🧹 Nitpick comments (1)
docs/cuopt/source/cuopt-server/examples/lp/examples/batch_mode_example.sh (1)

7-11: Make terminology fully consistent across the example.

These updated notes correctly describe sequential solving, but the script banner still says "Batch Mode" (Line 49). Renaming that banner to "Sequential LP solve" would avoid mixed messaging.

As per coding guidelines, `docs/**/*` should ensure "Clarity" and "Consistency … terminology match code".

Also applies to: 56-56
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@docs/cuopt/source/cuopt-server/examples/lp/examples/batch_mode_example.sh` around lines 7 - 11, Update the script's banner and any matching labels from "Batch Mode" to "Sequential LP solve" so the terminology matches the note about sequential solving; search for the literal "Batch Mode" string in batch_mode_example.sh (including the banner at the top and any other occurrences such as the secondary message around line 56) and replace them with "Sequential LP solve" to keep the docs' terminology consistent with the updated behavior.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py`:
- Around line 480-483: Update the misleading comment near the BatchSolve call in
test_lp_solver.py: either remove the sentence claiming "CI asserts it via
pytest.warns" or change it to point to the actual location that asserts the
DeprecationWarning (e.g., the test function or file that uses pytest.warns), and
ensure the comment references the deprecated method name BatchSolve and the
recommended replacement Solve so readers can find the related code paths; keep
the comment concise and accurate.
---
Nitpick comments:
In `@docs/cuopt/source/cuopt-server/examples/lp/examples/batch_mode_example.sh`:
- Around line 7-11: Update the script's banner and any matching labels from
"Batch Mode" to "Sequential LP solve" so the terminology matches the note about
sequential solving; search for the literal "Batch Mode" string in
batch_mode_example.sh (including the banner at the top and any other occurrences
such as the secondary message around line 56) and replace them with "Sequential
LP solve" to keep the docs' terminology consistent with the updated behavior.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (12)
- .github/skills/cuopt-lp-milp/resources/server_examples.md
- docs/cuopt/source/cuopt-server/examples/lp-examples.rst
- docs/cuopt/source/cuopt-server/examples/lp/examples/batch_mode_example.sh
- docs/cuopt/source/faq.rst
- docs/cuopt/source/lp-qp-features.rst
- python/cuopt/cuopt/linear_programming/solver/solver.py
- python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py
- python/cuopt_self_hosted/README.md
- python/cuopt_self_hosted/cuopt_sh_client/cuopt_self_host_client.py
- python/cuopt_self_hosted/cuopt_sh_client/cuopt_sh.py
- python/cuopt_server/cuopt_server/utils/linear_programming/solver.py
- python/cuopt_server/cuopt_server/webserver.py
🚧 Files skipped from review as they are similar to previous changes (7)
- docs/cuopt/source/faq.rst
- docs/cuopt/source/cuopt-server/examples/lp-examples.rst
- docs/cuopt/source/lp-qp-features.rst
- python/cuopt_self_hosted/README.md
- python/cuopt_server/cuopt_server/utils/linear_programming/solver.py
- .github/skills/cuopt-lp-milp/resources/server_examples.md
- python/cuopt_self_hosted/cuopt_sh_client/cuopt_self_host_client.py
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py`:
- Around line 480-482: The comment block above BatchSolve is duplicated and
includes an incomplete/truncated line; edit the comment that references
BatchSolve (the deprecation note) to remove the duplicate/truncated lines and
replace them with a single clear sentence such as "Call BatchSolve (deprecated;
use sequential Solve instead)" so that only one complete, non-redundant comment
remains immediately above the BatchSolve usage.
```python
# Call BatchSolve (deprecated; use sequential Solve instead)
# DeprecationWarning is emitted when running against a build with the
# Call BatchSolve (deprecated; use sequential Solve instead)
```
Remove the duplicated/truncated comment above BatchSolve.
Line 481 is incomplete, and Line 480/Line 482 duplicate the same sentence, which makes intent harder to read.
Suggested cleanup
```diff
-# Call BatchSolve (deprecated; use sequential Solve instead)
-# DeprecationWarning is emitted when running against a build with the
-# Call BatchSolve (deprecated; use sequential Solve instead)
+# Call BatchSolve (deprecated; use sequential Solve instead)
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
# Call BatchSolve (deprecated; use sequential Solve instead)
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@python/cuopt/cuopt/tests/linear_programming/test_lp_solver.py` around lines
480 - 482, The comment block above BatchSolve is duplicated and includes an
incomplete/truncated line; edit the comment that references BatchSolve (the
deprecation note) to remove the duplicate/truncated lines and replace them with
a single clear sentence such as "Call BatchSolve (deprecated; use sequential
Solve instead)" so that only one complete, non-redundant comment remains
immediately above the BatchSolve usage.
baec347 to 81719c3