Conversation
Is defining a method on an internal struct really the best solution? How are we getting into this error in the first place? Maybe we're missing something else.
Also, CC @nhz2 who introduced this struct in JuliaLang/julia#60533 |
Base.cconvert(::Type{MPIPtr}, x::Union{Ptr{T}, Array{T}, Ref{T}}) where T = Base.cconvert(Ptr{T}, x)
Base.cconvert(::Type{MPIPtr}, x::SubArray{T}) where T = Base.cconvert(Ptr{T}, x)
function Base.unsafe_convert(::Type{MPIPtr}, x::MPIBuffertype{T}) where T
    ptr = Base.unsafe_convert(Ptr{T}, x)
    reinterpret(MPIPtr, ptr)
end
Yes, I think this code assumes that Base.cconvert(Ptr{T}, x) is a no-op, but that does not hold for a generic SubArray{T}.
One fix is to add a wrapper type, returned by Base.cconvert, that stores the cconverted subarray, like:
struct MPIBufferCConvertWrap{T, C}
    x::C
end
And then unwrap and call Base.unsafe_convert on x in Base.unsafe_convert(::Type{MPIPtr}, ::MPIBufferCConvertWrap{T, C}).
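A minimal runnable sketch of that wrapper approach, with MPIPtr mocked here as a pointer-sized primitive type (the real one is MPI.API.MPIPtr) and the wrapper name MPIBufferCConvertWrap being hypothetical; this assumes a 64-bit platform:

```julia
# Mock of MPI.API.MPIPtr: an opaque pointer-sized primitive type.
primitive type MPIPtr 64 end

# Wrapper returned by cconvert; it keeps whatever Base.cconvert(Ptr{T}, x)
# produced alive (GC-rooted) until unsafe_convert extracts the raw pointer.
struct MPIBufferCConvertWrap{T, C}
    x::C
end

function Base.cconvert(::Type{MPIPtr}, x::SubArray{T}) where T
    c = Base.cconvert(Ptr{T}, x)  # may be x itself or some Base-internal wrapper
    MPIBufferCConvertWrap{T, typeof(c)}(c)
end

# Unwrap, defer to the Ptr{T} conversion, and reinterpret the raw pointer.
Base.unsafe_convert(::Type{MPIPtr}, w::MPIBufferCConvertWrap{T}) where T =
    reinterpret(MPIPtr, Base.unsafe_convert(Ptr{T}, w.x))
```

This never assumes anything about the concrete type that Base.cconvert(Ptr{T}, x) returns, which is exactly what breaks on nightly.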
I am not sure I follow.
cconvert must return something that you can call unsafe_convert on and the user must guarantee that the value returned by cconvert is GC preserved across a use of the value produced by unsafe_convert.
Can you expand on what you think is going wrong? Or show an example that is currently broken?
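As a concrete illustration of that contract, using a plain Ptr rather than MPIPtr:

```julia
a = [10, 20, 30]

# cconvert may return `a` itself or some wrapper; its concrete type is an
# implementation detail of Base and may change between Julia versions.
c = Base.cconvert(Ptr{Int}, a)

# The caller must keep `c` rooted while the raw pointer is in use.
GC.@preserve c begin
    p = Base.unsafe_convert(Ptr{Int}, c)
    @assert unsafe_load(p) == 10
end
```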
> Can you expand on what you think is going wrong? Or show an example that is currently broken?
test_gather is failing on nightly:
ERROR: LoadError: MethodError: no method matching unsafe_convert(::Type{MPI.API.MPIPtr}, ::Base.OffsetCConvert{Char, MemoryRef{Char}})
The function `unsafe_convert` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  unsafe_convert(::Type{Cwstring}, ::Any)
   @ Base strings/cstring.jl:101
  unsafe_convert(::Type{MPI.API.MPIPtr}, ::MPI.API.SentinelPtr)
   @ MPI ~/work/MPI.jl/MPI.jl/src/api/api.jl:75
  unsafe_convert(::Type{MPI.API.MPIPtr}, ::String)
   @ MPI ~/work/MPI.jl/MPI.jl/src/buffers.jl:13
  ...

Stacktrace:
  [1] MPI_Gather(sendbuf::SubArray{Char, 1, Vector{Char}, Tuple{UnitRange{Int64}}, true}, sendcount::Int32, sendtype::MPI.Datatype, recvbuf::Nothing, recvcount::Int32, recvtype::MPI.Datatype, root::Int64, comm::MPI.Comm)
    @ MPI.API ~/work/MPI.jl/MPI.jl/src/api/generated_api.jl:414
  [2] Gather!(sendbuf::MPI.Buffer{SubArray{Char, 1, Vector{Char}, Tuple{UnitRange{Int64}}, true}}, recvbuf::UBuffer{Nothing}, root::Int64, comm::MPI.Comm)
    @ MPI ~/work/MPI.jl/MPI.jl/src/collective.jl:304
  [3] Gather!(sendbuf::SubArray{Char, 1, Vector{Char}, Tuple{UnitRange{Int64}}, true}, recvbuf::UBuffer{Nothing}, root::Int64, comm::MPI.Comm)
    @ MPI ~/work/MPI.jl/MPI.jl/src/collective.jl:308
  [4] Gather!(sendbuf::SubArray{Char, 1, Vector{Char}, Tuple{UnitRange{Int64}}, true}, recvbuf::Nothing, root::Int64, comm::MPI.Comm)
    @ MPI ~/work/MPI.jl/MPI.jl/src/collective.jl:312
  [5] Gather(sendbuf::SubArray{Char, 1, Vector{Char}, Tuple{UnitRange{Int64}}, true}, root::Int64, comm::MPI.Comm)
    @ MPI ~/work/MPI.jl/MPI.jl/src/collective.jl:338
  [6] Gather(sendbuf::SubArray{Char, 1, Vector{Char}, Tuple{UnitRange{Int64}}, true}, comm::MPI.Comm; root::Int64)
    @ MPI ~/work/MPI.jl/MPI.jl/src/collective.jl:336
  [7] kwcall(::@NamedTuple{root::Int64}, ::typeof(MPI.Gather), sendbuf::SubArray{Char, 1, Vector{Char}, Tuple{UnitRange{Int64}}, true}, comm::MPI.Comm)
    @ MPI ~/work/MPI.jl/MPI.jl/src/collective.jl:336
  [8] top-level scope
    @ ~/work/MPI.jl/MPI.jl/test/test_gather.jl:45
  [9] include(mod::Module, _path::String)
    @ Base ./Base.jl:309
 [10] exec_options(opts::Base.JLOptions)
    @ Base ./client.jl:344
 [11] _start()
    @ Base ./client.jl:585
in expression starting at /home/runner/work/MPI.jl/MPI.jl/test/test_gather.jl:11
Offending code is line 45 in 05c7f88.
Sure, that is the issue this PR is trying to fix, with the method definition below. But it sounded like @nhz2 had an issue with this approach.
In this case cconvert is returning an OffsetCConvert, but this might not always be the case. I think the only thing we know about the value returned by cconvert is that it can be used in a matching unsafe_convert.
The method Base.cconvert(::Type{MPIPtr}, x::SubArray{T}) where T = Base.cconvert(Ptr{T}, x) implies that cconvert(Ptr{T}, x) can be used in the matching unsafe_convert(::Type{MPIPtr}, ::typeof(cconvert(Ptr{T}, x))), but that matching unsafe_convert is missing.
I understand your point now; I think we can't define it generally.
Because for
Base.unsafe_convert(::Type{MPIPtr}, x) = Base.unsafe_convert(Ptr{T}, x)
we are missing Ptr{T}: MPIPtr carries no element type, so T is undefined on the right-hand side. We would need to add the type parameter to MPIPtr, i.e. MPIPtr{T}.
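For illustration only, a sketch of what a parametric pointer type would allow; MyMPIPtr is a hypothetical stand-in (MPI.API.MPIPtr itself is not parametric, and making it so would change the generated ccall signatures), and a 64-bit platform is assumed:

```julia
# Hypothetical parametric pointer-sized primitive type carrying the
# element type, analogous to how Ptr{T} itself is defined.
primitive type MyMPIPtr{T} 64 end

# With T available from the pointer type, the generic forwarding
# definitions become expressible:
Base.cconvert(::Type{MyMPIPtr{T}}, x) where T = Base.cconvert(Ptr{T}, x)
Base.unsafe_convert(::Type{MyMPIPtr{T}}, x) where T =
    reinterpret(MyMPIPtr{T}, Base.unsafe_convert(Ptr{T}, x))
```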
Closing in favour of #941