
Add optimizer comparison example: HillClimbing vs BayesianOptimizer #241

Open

direkkakkar319-ops wants to merge 1 commit into hyperactive-project:main from direkkakkar319-ops:add-optimizer-comparision
Conversation

@direkkakkar319-ops

Description

Adds a new example optimizer_comparison_example.py to examples/gfo/ that demonstrates optimizer swapping — one of Hyperactive's core strengths. The same RandomForestClassifier experiment and search space is run with both HillClimbing and BayesianOptimizer, showing that only the optimizer line needs to change.

Related Issues

None — this is a proactive addition. No existing example in examples/gfo/ covers multiple optimizers side by side.

Type of Change

  • [BUG] - Bug fix (non-breaking change fixing an issue)
  • [ENH] - New feature (non-breaking change adding functionality)
  • [DOC] - Documentation changes
  • [MNT] - Maintenance

How was this solved?

Created a single example file that:

  1. Defines the experiment (SklearnCvExperiment with RandomForestClassifier + 5-fold CV on Iris) once
  2. Defines the search space once
  3. Runs HillClimbing and BayesianOptimizer separately using the exact same experiment and search space
  4. Prints a side-by-side comparison table of best parameters and CV scores

Added an optimizer comparison example (HillClimbing vs BayesianOptimizer) on RandomForest; no file in the existing examples/gfo/ folder covers this.
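The four steps above can be sketched library-free. The snippet below uses a toy objective and two hand-rolled optimizers as hypothetical stand-ins for Hyperactive's HillClimbing and BayesianOptimizer (whose actual import paths and signatures are not shown in this PR), to illustrate that the experiment and search space are defined once and only the optimizer changes between runs.

```python
import random

# 1. Define the "experiment" once: a toy objective standing in for the
#    PR's SklearnCvExperiment CV score (higher is better, peak at 120/6).
def experiment(params):
    return -(params["n_estimators"] - 120) ** 2 - 10 * (params["max_depth"] - 6) ** 2

# 2. Define the search space once.
search_space = {
    "n_estimators": list(range(10, 201, 10)),
    "max_depth": list(range(2, 11)),
}

def random_search(experiment, search_space, n_iter, seed=42):
    # Stand-in optimizer A: pure random sampling over the search space.
    rng = random.Random(seed)
    best = None
    for _ in range(n_iter):
        params = {k: rng.choice(v) for k, v in search_space.items()}
        score = experiment(params)
        if best is None or score > best[1]:
            best = (params, score)
    return best

def hill_climbing(experiment, search_space, n_iter, seed=42):
    # Stand-in optimizer B: step to a random neighbour when it scores better.
    rng = random.Random(seed)
    current = {k: rng.choice(v) for k, v in search_space.items()}
    best_score = experiment(current)
    for _ in range(n_iter):
        key = rng.choice(list(search_space))
        values = search_space[key]
        idx = values.index(current[key]) + rng.choice([-1, 1])
        if 0 <= idx < len(values):
            candidate = dict(current, **{key: values[idx]})
            score = experiment(candidate)
            if score > best_score:
                current, best_score = candidate, score
    return current, best_score

# 3./4. Run both optimizers on the exact same experiment and search space,
#       then print a side-by-side comparison.
if __name__ == "__main__":
    for name, optimizer in [("RandomSearch", random_search), ("HillClimbing", hill_climbing)]:
        params, score = optimizer(experiment, search_space, n_iter=30)
        print(f"{name:>12}: best_params={params}, score={score}")
```

The swap is a single line per run, which is the point the example file makes with the real Hyperactive optimizer classes.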

Checklist

  • PR title includes appropriate tag: [ENH]
  • Linked to related issue (if applicable)
  • Code passes make check (lint, format, isort)
  • Tests added/updated for changes (if applicable)
  • Documentation updated (if applicable)

Testing

Run the example directly to verify:
```bash
python examples/gfo/optimizer_comparison_example.py
```

Expected output: best parameters and CV scores for both optimizers printed, followed by a comparison table.

Additional Notes

  • n_iter=30 is used to show meaningful differences between optimizers
  • random_state=42 ensures reproducible results across runs
  • Every existing file in examples/gfo/ demonstrates a single optimizer only; this is the first cross-optimizer comparison example

@direkkakkar319-ops
Author

Hi @SimonBlanke, after reading the documentation and browsing the examples there, I noticed this example was missing, so I thought to add it. It should give users a better understanding of how to compare two algorithms.
