
Trixi forcing downgrades when using Enzyme.jl, Zygote.jl, SciMLSensitivity.jl #2157

Open
erny123 opened this issue Nov 12, 2024 · 10 comments
erny123 commented Nov 12, 2024

Xpost: https://discourse.julialang.org/t/scimlsensitivity-jl-precompile-error-because-of-enzyme/122561/9

Trying to follow along with the SciMLSensitivity.jl tutorial:

Parameter Estimation of Ordinary Differential Equations · SciMLSensitivity.jl

If anyone has any idea what's going on, I'd appreciate it.

Julia version: Version 1.11.1 (2024-10-16)

When looking for Enzyme.jl version:

] status Enzyme

Status `~/TRIXI/Project.toml`
⌅ [7da242da] Enzyme v0.11.20
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated`

and

status --outdated
Status `~/TRIXI/Project.toml`
⌅ [7da242da] Enzyme v0.11.20 (<v0.13.14): SciMLSensitivity
⌃ [7ed4a6bd] LinearSolve v2.22.1 (<v2.36.2)
⌅ [7f7a1694] Optimization v3.19.3 (<v4.0.5): OptimizationOptimJL, OptimizationOptimisers, OptimizationPolyalgorithms
⌃ [500b13db] OptimizationPolyalgorithms v0.1.2 (<v0.3.0)
⌃ [1dea7af3] OrdinaryDiffEq v6.66.0 (<v6.90.1)
⌃ [1ed8b502] SciMLSensitivity v7.51.0 (<v7.71.1)
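The package named after the `:` in each `⌅` line is the one whose compat bound blocks the upgrade. The same diagnosis can be run from the Pkg API; a minimal sketch (assumes Julia ≥ 1.9, which introduced `Pkg.why`):

```julia
using Pkg

# List only packages held back from their latest versions; add
# mode=Pkg.PKGMODE_MANIFEST to include transitive dependencies as well.
Pkg.status(outdated=true)

# Print every dependency chain that pulls Enzyme into this environment,
# which shows which direct dependency ultimately imposes the old bound.
Pkg.why("Enzyme")
```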

When using Pkg.add:

Pkg.add("OrdinaryDiffEq")
Pkg.add("Trixi")
Pkg.add("Plots")
Pkg.add("LinearSolve")
Pkg.add("LaTeXStrings")
Pkg.add("LinearAlgebra")
Pkg.add("Enzyme")
Pkg.add("Optimization")
Pkg.add("OptimizationPolyalgorithms")
Pkg.add("SciMLSensitivity")
Pkg.add("Zygote")

I get many errors, most of them due to Enzyme.jl:

0 dependencies successfully precompiled in 25 seconds. 496 already precompiled.
7 dependencies errored.
For a report of the errors see `julia> err`. To retry use `pkg> precompile`
   Resolving package versions...
    Updating `~/TRIXI/Project.toml`
  [37e2e46d] + LinearAlgebra v1.11.0
  No Changes to `~/TRIXI/Manifest.toml`
Precompiling project...
  ✗ Enzyme
  ✗ Enzyme → EnzymeSpecialFunctionsExt
  ✗ DiffEqBase → DiffEqBaseEnzymeExt
  ✗ Optimization → OptimizationEnzymeExt
  ✗ LinearSolve → LinearSolveEnzymeExt
  ✗ SparseDiffTools → SparseDiffToolsEnzymeExt
  ✗ SciMLSensitivity
  0 dependencies successfully precompiled in 22 seconds. 496 already precompiled.
  7 dependencies errored.
  For a report of the errors see `julia> err`. To retry use `pkg> precompile`

And then, when loading the packages:

using OrdinaryDiffEq
using Trixi
using Plots
using LaTeXStrings
using LinearSolve
using LinearAlgebra
using Optimization
using OptimizationPolyalgorithms 
using SciMLSensitivity
using Zygote
[ Info: Precompiling IJuliaExt [2f4121a4-3b3a-5ce6-9c5e-1f2673ce168a] 
[ Info: Precompiling SciMLSensitivity [1ed8b502-d754-442c-8d5d-10ac956f44a1] (cache misses: wrong dep version loaded (2))
ERROR: LoadError: InitError: type Nothing has no field major
Stacktrace:
  [1] getproperty
    @ ./Base.jl:49 [inlined]
  [2] __init__()
    @ GPUCompiler ~/.julia/packages/GPUCompiler/U36Ed/src/GPUCompiler.jl:64
  [3] run_module_init(mod::Module, i::Int64)
    @ Base ./loading.jl:1336
  [4] register_restored_modules(sv::Core.SimpleVector, pkg::Base.PkgId, path::String)
    @ Base ./loading.jl:1324
  [5] _include_from_serialized(pkg::Base.PkgId, path::String, ocachepath::String, depmods::Vector{Any}, ignore_native::Nothing; register::Bool)
    @ Base ./loading.jl:1213
  [6] _include_from_serialized (repeats 2 times)
    @ ./loading.jl:1169 [inlined]
  [7] _require_search_from_serialized(pkg::Base.PkgId, sourcepath::String, build_id::UInt128, stalecheck::Bool; reasons::Dict{String, Int64}, DEPOT_PATH::Vector{String})
    @ Base ./loading.jl:1985
  [8] _require(pkg::Base.PkgId, env::String)
    @ Base ./loading.jl:2450
  [9] __require_prelocked(uuidkey::Base.PkgId, env::String)
    @ Base ./loading.jl:2315
 [10] #invoke_in_world#3
    @ ./essentials.jl:1089 [inlined]
 [11] invoke_in_world
    @ ./essentials.jl:1086 [inlined]
 [12] _require_prelocked(uuidkey::Base.PkgId, env::String)
    @ Base ./loading.jl:2302
 [13] macro expansion
    @ ./loading.jl:2241 [inlined]
 [14] macro expansion
    @ ./lock.jl:273 [inlined]
 [15] __require(into::Module, mod::Symbol)
    @ Base ./loading.jl:2198
 [16] #invoke_in_world#3
    @ ./essentials.jl:1089 [inlined]
 [17] invoke_in_world
    @ ./essentials.jl:1086 [inlined]
 [18] require(into::Module, mod::Symbol)
    @ Base ./loading.jl:2191
 [19] include(mod::Module, _path::String)
    @ Base ./Base.jl:557
 [20] include(x::String)
    @ Enzyme ~/.julia/packages/Enzyme/l4FS0/src/Enzyme.jl:1
 [21] top-level scope
    @ ~/.julia/packages/Enzyme/l4FS0/src/Enzyme.jl:43
 [22] include
    @ ./Base.jl:557 [inlined]
 [23] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::String)
    @ Base ./loading.jl:2790
 [24] top-level scope
    @ stdin:5
during initialization of module GPUCompiler
in expression starting at /home/evaldez/.julia/packages/Enzyme/l4FS0/src/typetree.jl:6
in expression starting at /home/evaldez/.julia/packages/Enzyme/l4FS0/src/Enzyme.jl:1
in expression starting at stdin:5
ERROR: LoadError: Failed to precompile Enzyme [7da242da-08ed-463a-9acd-ee780be4f1d9] to "/home/evaldez/.julia/compiled/v1.11/Enzyme/jl_oI4hIh".

Additionally, removing Trixi.jl

] rm Trixi
] update

allows everything to update properly.

Then:

] add Trixi

gives the previous downgrades.
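The same bisection can be done without touching the main project by using a throwaway environment; a sketch of the Pkg REPL session (the temporary-environment prompt name is illustrative):

```
(@v1.11) pkg> activate --temp
(jl_tmp) pkg> add Enzyme             # resolves to the latest Enzyme in isolation
(jl_tmp) pkg> add Trixi              # if Enzyme is downgraded now, Trixi's compat is the trigger
(jl_tmp) pkg> status --outdated -m   # names the package imposing each bound
```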


sloede commented Nov 13, 2024

Maybe related to #1789 #2061 #2150 etc.?

@ChrisRackauckas

Why is this upstream? Isn't it Trixi that's setting the bound?


ranocha commented Nov 13, 2024

Since it's related to upstream changes that force us to update quite a lot of stuff here. When that's done, we can look for the upstream label to check similar/related issues.

@ChrisRackauckas

I thought all of the static array issues were closed other than SciML/RecursiveArrayTools.jl#400. What's a case I could look at?


ranocha commented Nov 14, 2024

Right now, it needs some work on our side, like #2150 and StartUpDG.jl updates. As far as I know, SciML/RecursiveArrayTools.jl#400 is the only issue there that we are currently aware of.


huiyuxie commented Nov 14, 2024

@erny123 Thank you for reporting this issue; I can confirm it on my end.

We can simplify the issue further and find that it’s a problem between Trixi.jl and Enzyme.jl - see

(@v1.11) pkg> st
Status `C:\Users\huiyu\.julia\environments\v1.11\Project.toml`
  [a7f1ee26] Trixi v0.9.6

(@v1.11) pkg> add Enzyme
   Resolving package versions...
    Updating `C:\Users\huiyu\.julia\environments\v1.11\Project.toml`
⌃ [7da242da] + Enzyme v0.11.20
    Updating `C:\Users\huiyu\.julia\environments\v1.11\Manifest.toml`
⌃ [7da242da] + Enzyme v0.11.20
⌅ [61eb1bfa] + GPUCompiler v0.25.0
⌅ [929cbde3] + LLVM v6.6.3
  [d8793406] + ObjectFile v0.4.2
  [6c6a2e73] + Scratch v1.2.1
  [53d494c1] + StructIO v0.3.1
⌅ [7cc45869] + Enzyme_jll v0.0.102+0
⌅ [dad2f222] + LLVMExtra_jll v0.0.29+0
        Info Packages marked with ⌃ and ⌅ have new versions available. Those with ⌃ may be upgradable, but those with ⌅ are restricted by compatibility constraints from upgrading. To see why use `status --outdated -m`
Precompiling project...
  ✗ Enzyme
  ✗ Enzyme → EnzymeSpecialFunctionsExt
  0 dependencies successfully precompiled in 9 seconds. 238 already precompiled.
  2 dependencies errored.
  For a report of the errors see `julia> err`. To retry use `pkg> precompile`

And it is essentially a bug in an old version of GPUCompiler.jl (a dependency of Enzyme.jl), see JuliaGPU/CUDA.jl#2520, and we can also confirm this from your stacktrace:

ERROR: LoadError: InitError: type Nothing has no field major
Stacktrace:
  [1] getproperty
    @ ./Base.jl:49 [inlined]
  [2] __init__()
    @ GPUCompiler ~/.julia/packages/GPUCompiler/U36Ed/src/GPUCompiler.jl:64

One straightforward way would be to update this dependency, but unfortunately, our current restriction on DiffEqBase.jl prevents it from being updated (or it would cause method overwriting warnings). We force this restriction on DiffEqBase.jl as a workaround for SciML dropping support for arrays of SVectors, see #1789.
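For reference, such a restriction lives in the `[compat]` section of a package's Project.toml; a hypothetical sketch (the version number below is made up, not Trixi.jl's actual bound):

```toml
[compat]
# Pinning DiffEqBase like this forces the resolver to downgrade anything
# (Enzyme, SciMLSensitivity, ...) that requires a newer DiffEqBase.
DiffEqBase = "=6.143.0"
```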

Since it's related to upstream changes that force us to update quite a lot of stuff here.

Exactly, and the upstream is not handling releases well enough; they roll them out too quickly without thorough testing. We found that the updates are error-prone, so we need to fix their bugs first. Also, as the upstream provider, they should ensure backward compatibility in new versions (at least for several years) instead of directly dropping support for older versions.


huiyuxie commented Nov 15, 2024

What's a case I could look at?

Please take a look at this new issue SciML/RecursiveArrayTools.jl#410 if you have the bandwidth. Thanks.

@huiyuxie

Update: The given project setup can be configured with no errors on Julia 1.10 (I tried it via the long-term support release: https://julialang.org/downloads/#long_term_support_release)

@erny123 Workaround: downgrade your Julia version to 1.10 and try configuring all the packages again if this is urgent
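If juliaup manages the Julia installation, the 1.10 LTS can be tried side by side without replacing the current default; a sketch (channel names as documented by juliaup):

```sh
juliaup add 1.10        # install the 1.10 LTS channel alongside 1.11
julia +1.10             # start a one-off 1.10 session for testing
juliaup default 1.10    # or make 1.10 the default permanently
```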

@ChrisRackauckas

ChrisRackauckas commented Nov 20, 2024

#410 is patched, I'll look at the ArrayPartition performance bit on my next no-wifi flight.


erny123 commented Nov 21, 2024

@erny123 WAR: Downgrade your Julia version to 1.10 and try configure all the packages again if you are urgent

I'll give it a shot just to confirm.
