
Running @memoize a second time silently does nothing, even if the second call refers to nonexistent names #66

bzinberg opened this issue Dec 15, 2020 · 8 comments

@bzinberg
Contributor

julia> @memoize LRU(maxsize=10) f(x) = x + 1
f (generic function with 1 method)

julia> @memoize NonexistentCacheType f(x) = x + 1
f (generic function with 1 method)

julia> f(3)
4

I would expect an error (or a loud warning) on that second @memoize call. At the very least, I'd want a prominent warning in the documentation.

@willow-ahrens

willow-ahrens commented Dec 30, 2020

Currently, Memoize uses only one cache per function. It stores that cache in a global variable in the module that expands the macro. If the global variable is already defined, Memoize doesn't construct another cache and never evaluates the cache expression you supply, so only the first cache you give for a function is ever used. I'm not sure this will remain the case in the future, though; consider the approach in #59, where each method gets a separate cache.
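
Here is a conceptual sketch of that "reuse the existing global cache" behavior, for illustration only (this is not Memoize's actual source; CACHES and cache_for are made-up names). The cache constructor you pass is only evaluated if no cache exists yet for that function name:

julia> const CACHES = Dict{Symbol,Any}();

julia> cache_for(fname::Symbol, make_cache) = get!(make_cache, CACHES, fname);

julia> cache_for(:f, () -> Dict())                  # first use: the constructor runs
Dict{Any, Any}()

julia> cache_for(:f, () -> error("never reached"))  # second use: constructor silently ignored
Dict{Any, Any}()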

@rikhuijzer
Contributor

rikhuijzer commented Dec 30, 2020

julia> @memoize NonexistentCacheType f(x) = x + 1
ERROR: LoadError: UndefVarError: NonexistentCacheType not defined
Stacktrace:
 [1] top-level scope at none:1
 [2] eval(::Module, ::Any) at ./boot.jl:331
 [3] @memoize(::LineNumberNode, ::Module, ::Vararg{Any,N} where N) at /home/rik/.julia/packages/Memoize/12ANR/src/Memoize.jl:56
in expression starting at REPL[8]:1

Pretty big warning, I'd say. Parts of it are also shown in red in the REPL.

@cstjean
Collaborator

cstjean commented Dec 30, 2020

I'd be curious to hear what people think of per-method caches. To me it makes more sense, and it supports my org's use case, but it is a breaking change. @peterahrens, thank you for pointing out the approach in #59.

@willow-ahrens

willow-ahrens commented Dec 30, 2020

@rikhuijzer It's true that if you define the NonexistentCacheType memo function first, you get the error. However, if the function already has a cache (if you use the LRU cache first, for instance), I can reproduce that no warning is given because the macro ignores the cache type entirely.

@cstjean I'd also like to hear the opinion of other community members on this. Speaking for myself, I think that per-method caching is the most correct semantic, especially because Julia semantics already allow us to directly call different methods, and Memoize currently returns only the most specific result. For instance, consider

julia> f(x) = 1
julia> f(x::Bool) = 2
julia> @memoize g(x) = 1
julia> @memoize g(x::Bool) = 2

julia> f(true)
2

julia> invoke(f, Tuple{Integer}, true)
1

julia> g(true)
2

julia> invoke(g, Tuple{Integer}, true)
2

@rikhuijzer
Contributor

@rikhuijzer It's true that if you define the NonexistentCacheType memo function first, you get the error. However, if the function already has a cache (if you use the LRU cache first, for instance), I can reproduce that no warning is given because the macro ignores the cache type entirely.

Aha. Check. Got it.

julia> using Memoize, LRUCache

julia> @memoize LRU(maxsize=10) f(x) = x
f (generic function with 1 method)

julia> f(3)
3

julia> f(3)
3

julia> @memoize NonexistentCacheType f(x) = x + 1
f (generic function with 1 method)

julia> f(3)
4

julia> f(3)
4

@maxkapur

maxkapur commented May 25, 2024

I would also be interested in per-method caches. The workaround I came up with (using the OP's notation) was to make f an unmemoized wrapper that dispatches to memoized inner helpers, f(x::Int) = f_int(x), f(x::Float64) = f_float(x), and so on, which is kind of clunky; see the sketch below.
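
For concreteness, a minimal sketch of that wrapper pattern (f_int and f_float are illustrative names, not anything provided by Memoize.jl). Only the inner helpers are memoized, so each concrete argument type effectively gets its own cache:

julia> using Memoize

julia> @memoize f_int(x::Int) = x + 1;

julia> @memoize f_float(x::Float64) = x + 1.0;

julia> f(x::Int) = f_int(x);          # unmemoized dispatch layer
julia> f(x::Float64) = f_float(x);

julia> f(3), f(3.0)
(4, 4.0)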

Example of where this gotcha comes up in real life

Here's an example where I try to create a strongly-typed dict for each cache and get unexpected results:

julia> using Memoize

julia> @memoize Dict{Tuple{Float64}, Float64} my_logistic(x::Float64) = 1 / (1 + exp(-x))
my_logistic (generic function with 1 method)

julia> @memoize Dict{Tuple{Float64, Float64}, Float64} my_logistic(x::Float64, scale::Float64) = 1 / (1 + exp(-x / scale))
my_logistic (generic function with 2 methods)

julia> my_logistic(2.5)
0.9241418199787566

julia> my_logistic(2.5, 1.0)
ERROR: MethodError: Cannot `convert` an object of type 
  Tuple{Float64,Float64} to an object of type 
  Tuple{Float64}

Closest candidates are:
  convert(::Type{T}, ::Tuple{Vararg{Any, N}}) where {N, T<:Tuple}
   @ Base essentials.jl:457
  convert(::Type{T}, ::T) where T<:Tuple
   @ Base essentials.jl:456
  convert(::Type{T}, ::T) where T
   @ Base Base.jl:84
  ...

Stacktrace:
 [1] _tuple_error(T::Type, x::Tuple{Float64, Float64})
   @ Base ./essentials.jl:454
 [2] convert(::Type{Tuple{Float64}}, x::Tuple{Float64, Float64})
   @ Base ./essentials.jl:461
 [3] get!(default::Function, h::Dict{Tuple{Float64}, Float64}, key0::Tuple{Float64, Float64})
   @ Base ./dict.jl:465
 [4] my_logistic(x::Float64, scale::Float64)
   @ Main ~/gits/Memoize.jl/src/Memoize.jl:61
 [5] top-level scope
   @ REPL[5]:1

Is this package still being maintained?

@willow-ahrens

I think that this package is what you want: https://github.com/willow-ahrens/MemoizedMethods.jl

I'm not currently using that package, but if you run into any issues and want to file a PR, I would happily review!

@cstjean
Collaborator

cstjean commented May 29, 2024

Is this package still being maintained?

Barely. I'll merge interesting PRs, but I'm a bit down that caching is so complicated in Julia (e.g. #59, or maintaining type stability), and I don't have any particular vision for fixing these issues.
