Conversation
Since log(A_ij) is constant, there's no need to include it in the linear objective, which allows for simplifying the loop dramatically (at least when the matrix has no zeros).
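As a quick sanity check of the no-zeros case (the variable names here are illustrative, not the package's): the double sum visits each α[i] exactly 2n times, so the whole linear part of the objective collapses to 2n * sum(α).

```julia
# With no zeros in A, sum(α[i] + α[j] for i in 1:n, j in 1:n) counts each α[i]
# n times as the row term and n times as the column term, i.e. 2n times total.
n = 3
α = [0.5, 1.2, -0.3]  # arbitrary values; the identity is linear in α
dense = sum(α[i] + α[j] for i in 1:n, j in 1:n)
@assert dense ≈ 2n * sum(α)
```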
Codecov Report ✅ All modified and coverable lines are covered by tests.

@@ Coverage Diff @@
##             main      #15      +/-   ##
==========================================
+ Coverage   92.84%   92.87%   +0.03%
==========================================
  Files           5        5
  Lines         852      856       +4
==========================================
+ Hits          791      795       +4
  Misses         61       61
model = JuMP.Model(HiGHS.Optimizer)
JuMP.set_silent(model)
@variable(model, α[1:n])
@objective(model, Min, sum(α[i] + α[j] - logA[i, j] for i in 1:n, j in 1:n if A[i, j] != 0))
You're right, this is a good simplification (although it's the constraints that make it slow), but we need to handle the case where A has zeros. I think precomputing sum(iszero, A; dims=1 or 2) might enable this?
I think we need both. It would be somewhat surprising to me if going from an O(mn)-size to an O(m+n)-size objective didn't speed things up a bit.
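For what it's worth, here is a pure-Julia sketch of the counting idea (variable names are illustrative, not the package's): each α[i] appears in the filtered double sum once per nonzero in row i and once per nonzero in column i, so precomputed nonzero counts give a linearly-sized objective even when A has zeros.

```julia
# Sketch: collapse the quadratically-many-term objective to O(n) terms
# when A may contain zeros. sum(α[i] + α[j] for (i, j) with A[i, j] != 0)
# counts α[i] once per nonzero in row i (the α[i] term) and once per
# nonzero in column i (the α[j] term).
A = [8 20 18;
     20 10 0;
     18 0 5]
n = size(A, 1)
r = vec(sum(!iszero, A; dims=2))  # nonzeros per row
c = vec(sum(!iszero, A; dims=1))  # nonzeros per column
α = [0.7, -0.2, 1.5]              # arbitrary values; the identity is linear in α
dense   = sum(α[i] + α[j] for i in 1:n, j in 1:n if A[i, j] != 0)
compact = sum((r[i] + c[i]) * α[i] for i in 1:n)
@assert dense ≈ compact
```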
Clearly I need more test coverage for this case:

julia> A = [
           8 20 18;
           20 10 0;
           18 0 5
       ];

julia> a = symcover_lmin(A)
3-element Vector{Float64}:
 6.324555320336757
 3.1622776601683795
 2.8460498941515415

julia> cover_lobjective(a, A)
2.0918640616783923

julia> a = symcover_lmin_bad(A)
3-element Vector{Float64}:
 8.04984471899924
 3.1622776601683795
 2.23606797749979

julia> cover_lobjective(a, A)
2.574290210922684

where
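For context, here is a hedged reconstruction of what cover_lobjective appears to compute, inferred from the numbers above (the cover condition a[i]*a[j] ≥ A[i, j] and the helper name are assumptions, not the package's documented API): the total log-slack log(a[i]*a[j]/A[i, j]) over the nonzero entries.

```julia
# Hypothetical sketch (inferred from the REPL session, not the package's
# actual implementation): total log-slack of the cover a over the
# nonzero entries of A.
cover_lobjective_sketch(a, A) =
    sum(log(a[i] * a[j] / A[i, j])
        for i in eachindex(a), j in eachindex(a) if A[i, j] != 0)

A = [8 20 18;
     20 10 0;
     18 0 5]
a_good = [6.324555320336757, 3.1622776601683795, 2.8460498941515415]
a_bad  = [8.04984471899924, 3.1622776601683795, 2.23606797749979]
cover_lobjective_sketch(a_good, A)  # ≈ 2.0918640616783923, as in the REPL session
cover_lobjective_sketch(a_bad, A)   # ≈ 2.574290210922684
```

On this reconstruction, the good cover's slack is concentrated in the diagonal entries (the off-diagonal constraints are tight), which is consistent with the two objective values above.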
Now this recovers the correct cover.

Thanks! This is great!