
AssertionError: has when differentiating DynamicExpressions on Enzyme v0.12 #2080

Closed

MilesCranmer opened this issue Nov 11, 2024 · 1 comment

@MilesCranmer (Contributor) commented Nov 11, 2024

I ran the example from DynamicExpressions.jl's documentation (https://ai.damtp.cam.ac.uk/dynamicexpressions/dev/eval/#Enzyme), which does a simple evaluation. This used to work fine on v0.12.(?), but on v0.12.36 it throws this error:

ERROR: AssertionError: has
Stacktrace:
  [1] shadow_alloc_rewrite(V::Ptr{LLVM.API.LLVMOpaqueValue}, gutils::Ptr{Nothing})
    @ Enzyme.Compiler ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:3084
  [2] EnzymeCreatePrimalAndGradient(logic::Enzyme.Logic, todiff::LLVM.Function, retType::Enzyme.API.CDIFFE_TYPE, constant_args::Vector{…}, TA::Enzyme.TypeAnalysis, returnValue::Bool, dretUsed::Bool, mode::Enzyme.API.CDerivativeMode, width::Int64, additionalArg::Ptr{…}, forceAnonymousTape::Bool, typeInfo::Enzyme.FnTypeInfo, uncacheable_args::Vector{…}, augmented::Ptr{…}, atomicAdd::Bool)
    @ Enzyme.API ~/.julia/packages/Enzyme/TiboG/src/api.jl:163
  [3] enzyme!(job::GPUCompiler.CompilerJob{…}, mod::LLVM.Module, primalf::LLVM.Function, TT::Type, mode::Enzyme.API.CDerivativeMode, width::Int64, parallel::Bool, actualRetType::Type, wrap::Bool, modifiedBetween::NTuple{…}, returnPrimal::Bool, expectedTapeType::Type, loweredArgs::Set{…}, boxedArgs::Set{…})
    @ Enzyme.Compiler ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:4168
  [4] codegen(output::Symbol, job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
    @ Enzyme.Compiler ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6438
  [5] codegen
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:5614 [inlined]
  [6] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
    @ Enzyme.Compiler ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7241
  [7] _thunk
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7241 [inlined]
  [8] cached_compilation
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7282 [inlined]
  [9] thunkbase(ctx::LLVM.Context, mi::Core.MethodInstance, ::Val{…}, ::Type{…}, ::Type{…}, tt::Type{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Val{…}, ::Type{…}, ::Val{…})
    @ Enzyme.Compiler ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7355
 [10] #s2080#19115
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7407 [inlined]
 [11] var"#s2080#19115"(FA::Any, A::Any, TT::Any, Mode::Any, ModifiedBetween::Any, width::Any, ReturnPrimal::Any, ShadowInit::Any, World::Any, ABI::Any, ErrIfFuncWritten::Any, ::Any, ::Any, ::Any, ::Any, tt::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any)
    @ Enzyme.Compiler ./none:0
 [12] (::Core.GeneratedFunctionStub)(::UInt64, ::LineNumberNode, ::Any, ::Vararg{Any})
    @ Core ./boot.jl:707
 [13] autodiff
    @ ~/.julia/packages/Enzyme/TiboG/src/Enzyme.jl:315 [inlined]
 [14] autodiff(::ReverseMode{false, FFIABI, false, false}, ::typeof(my_loss_function), ::Type{Active}, ::Const{Node{Float64}}, ::Duplicated{Matrix{Float64}}, ::Const{OperatorEnum{Tuple{typeof(+), typeof(-), typeof(*), typeof(/)}, Tuple{typeof(cos), typeof(sin)}}})
    @ Enzyme ~/.julia/packages/Enzyme/TiboG/src/Enzyme.jl:332
 [15] top-level scope
    @ REPL[13]:3
Some type information was truncated. Use `show(err)` to see complete types.

The MWE is as follows:

using DynamicExpressions
using Enzyme

operators = OperatorEnum(binary_operators=(+, -, *, /), unary_operators=(cos, sin))

x1 = Node{Float64}(feature=1)
x2 = Node{Float64}(feature=2)

tree = 0.5 * x1 + cos(x2 - 0.2)
X = [1.0 2.0 3.0; 4.0 5.0 6.0]  # 2x3 matrix (2 features, 3 data points stored as columns)

function my_loss_function(tree, X, operators)
    # Get the outputs
    y = tree(X, operators)
    # Sum them (so we can take a gradient, rather than a jacobian)
    return sum(y)
end


dX = begin
    storage = zero(X)
    autodiff(
        Reverse,
        my_loss_function,
        Active,
        ## Actual arguments to function:
        Const(tree),
        Duplicated(X, storage),
        Const(operators),
    )
    storage
end
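
For comparison, the gradient that autodiff should write into storage can be cross-checked without Enzyme at all. A minimal central finite-difference sketch, using only the definitions above (fd_gradient is a hypothetical helper, not part of either package):

# Central finite-difference approximation of d(sum(tree(X)))/dX, to compare
# against the Enzyme result in `storage`.
function fd_gradient(tree, X, operators; h=1e-6)
    dX_fd = zero(X)
    for i in eachindex(X)
        Xp = copy(X); Xp[i] += h
        Xm = copy(X); Xm[i] -= h
        dX_fd[i] = (my_loss_function(tree, Xp, operators) -
                    my_loss_function(tree, Xm, operators)) / (2h)
    end
    return dX_fd
end

fd_gradient(tree, X, operators)  # expected value of `storage`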
@wsmoses (Member) commented Nov 12, 2024

DynamicExpressions has been run on every commit since you added the integration [and it passes on main], so I'm not really sure what to tell you:

- DynamicExpressions

In any case, we aren't going to make a backport release of 0.12, so you should migrate to 0.13.
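
A minimal sketch of the migrated call, assuming the main user-facing change in 0.13 is that the differentiated function itself now carries an activity annotation (wrap it in Const); check the Enzyme 0.13 release notes for the full migration guide:

# Sketch of the same gradient under Enzyme v0.13 (assumption: v0.13 expects
# the function wrapped in an activity annotation such as Const).
storage = zero(X)
autodiff(
    Reverse,
    Const(my_loss_function),  # annotated function, per v0.13 conventions
    Active,
    Const(tree),
    Duplicated(X, storage),
    Const(operators),
)
storage  # holds d(sum(tree(X)))/dX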

@wsmoses closed this as not planned Nov 12, 2024