Here’s a utility function you can use to cache the return value of any function. It’s useful for caching large queries you run multiple times, when the underlying data shouldn’t change more often than once per load of SB.
```lua
-- cache the value a function returns
local _call_cache = {}
local cache_fn_retval = function(name, fn)
  print('CACHE(prepare):', name)
  -- build a function
  return function(...)
    -- temp variable that's local to the lambda
    local _name = name
    local key = _name .. '(' .. table.concat({...}, ',') .. ')'
    local ret = _call_cache[key]
    -- check if we have cached the result of the function call
    if ret == nil then
      -- run the function and cache its return value
      ret = fn(...)
      _call_cache[key] = ret
      print('CACHE(miss): ', key)
    else
      print('CACHE(hit): ', key)
    end
    -- return the result of the function
    return ret
  end
end

-- short hand for making a cached fn and putting it on an object
local cache_fn_to_object = function(obj, name, fn)
  obj[name] = cache_fn_retval(name, fn)
end
```
Feel free to remove the comments if you use this; there are quite a few, since it’s somewhat unusual Lua code.
## Examples
### Example 1: query caching
```lua
-- Get all tasks
local query_all_tasks = cache_fn_retval('query_all_tasks', function()
  return query [[ from index.tag 'task' ]]
end)
```
### Example 2: argument handling
```lua
local add_numbers = cache_fn_retval('add_numbers', function(a, b)
  return a + b
end)

add_numbers(1, 1) -- caches the result of 1+1
add_numbers(1, 1) -- retrieves the cached value from the previous call
add_numbers(1, 2) -- caches 1+2 (the cache key takes the arguments into account)
add_numbers(2, 1) -- caches 2+1
add_numbers(1, 2) -- uses the cached value for 1+2
```
### Example 3: `cache_fn_to_object()`
You may want to use `cache_fn_to_object()` instead of `cache_fn_retval()`, since it saves you from typing the function’s name twice: once as the cache name and once in the assignment.
```lua
Lib = {}
-- ...
cache_fn_to_object(Lib, 'add_numbers', function(a, b)
  return a + b
end)
```
## A note about key generation
The method for generating cache keys can run into collisions if you pass mixed data types into your functions. For example, if a cached function is called `myfunc`, the following calls are indistinguishable:

```lua
myfunc(1, 123)
myfunc('1', 123)
myfunc(1, '123')
myfunc('1', '123')
```

All of these map to the key `myfunc(1,123)`, and the cache will return the same value for each of them, even if the function would have returned something different. This shouldn’t happen in most cases, but it’s worth knowing about.
A way to fix this would be to embed type information for each argument into the cache keys, though I suspect that would make key generation slower. I haven’t measured it, so maybe the impact would be negligible.
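As a rough sketch of that idea, a key builder could prefix each argument with its Lua type (`typed_key` is a hypothetical helper I made up for illustration, not part of the snippet above):

```lua
-- Build a collision-resistant cache key by tagging each argument
-- with its type, so "number:1" and "string:1" stay distinct.
-- (typed_key is a hypothetical helper, not part of the original code.)
local typed_key = function(name, ...)
  local parts = {}
  for i = 1, select('#', ...) do
    local v = select(i, ...)
    parts[#parts + 1] = type(v) .. ':' .. tostring(v)
  end
  return name .. '(' .. table.concat(parts, ',') .. ')'
end

print(typed_key('myfunc', 1, 123))   -- myfunc(number:1,number:123)
print(typed_key('myfunc', '1', 123)) -- myfunc(string:1,number:123)
```

Swapping this in for the `table.concat({...}, ',')` line would also let the cache handle booleans and `nil` arguments, which `table.concat` rejects.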
## Potential improvements
A way to make this better would be to explore cache invalidation strategies, like setting `_call_cache[key] = nil` when pages are created or other events fire. I haven’t needed this yet, so I haven’t looked into how to do it, but it would be a nice extension in a more full-fledged implementation.
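As one possible direction (a sketch only; `invalidate_fn` is a hypothetical helper I haven’t wired into any event system), invalidation could be as simple as clearing every cached key that belongs to a given function name:

```lua
local _call_cache = {}

-- Drop every cached entry whose key was produced for `name`,
-- i.e. keys of the form "name(arg1,arg2,...)".
-- (invalidate_fn is a hypothetical helper, not part of the code above.)
local invalidate_fn = function(name)
  local prefix = name .. '('
  for key in pairs(_call_cache) do
    if key:sub(1, #prefix) == prefix then
      -- setting a field to nil during pairs() traversal is allowed in Lua
      _call_cache[key] = nil
    end
  end
end

-- e.g. after a page is created, forget all cached task queries:
_call_cache['query_all_tasks()'] = { 'stale result' }
invalidate_fn('query_all_tasks')
print(_call_cache['query_all_tasks()']) -- nil
```

An event handler could then call `invalidate_fn` for each cached query that the event might affect.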
Enjoy!