deal with erroneous/outdated cached data #75
https://martinfowler.com/bliki/TwoHardThings.html But more positively, my own use case for a cache was very specific: to act as a snapshot of CATMAID state at a given time, associated with a full run of a single markdown analysis. However, initial attempts to share even across analyses failed. And none of those were writing to the CATMAID database. I think you probably want to define what your target use cases are and then either cache the most expensive stuff or try to cater for read-only situations completely.
What seemed to be the problem when sharing it across analyses, @jefferis?
So I'm not quite sure yet, but I think it was one of two things: caching the login object, or annotation requests where somehow different POST requests were being aliased to the same hash value (which defines the on-disk response).
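For illustration, here is a minimal sketch of how that kind of aliasing can arise and be avoided, assuming the cache key is derived by hashing the request; the function name and URL below are hypothetical, not pymaid's actual implementation:

```python
import hashlib
import json

def cache_key(url, post_data=None):
    """Derive a cache key for a request (hypothetical helper).

    If only the URL were hashed, two POST requests with different payloads
    would alias to the same on-disk response. Including a canonical dump of
    the POST payload keeps the keys distinct.
    """
    payload = json.dumps(post_data, sort_keys=True) if post_data else ""
    return hashlib.sha256((url + payload).encode("utf-8")).hexdigest()

# Distinct annotation queries now map to distinct cache keys:
k1 = cache_key("https://example.org/catmaid/annotations", {"annotation": "A"})
k2 = cache_key("https://example.org/catmaid/annotations", {"annotation": "B"})
assert k1 != k2
```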
Made some improvements to the caching. From commit 5c3c8fd:
The main problem here is how to deal with cases where an existing cache is the reason for failure. That's because we don't know what the function requested from the server. A possible solution is to (a) let the cache keep a log of requests and (b) let the decorator track the log while the function runs; upon failure, delete all cached URLs that have been requested as per the log (sketched below).
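A rough sketch of that (a)+(b) scheme; the names here are hypothetical, not pymaid's actual API:

```python
import functools

class LoggingCache:
    """Toy cache that records every URL it is asked for (assumption (a))."""

    def __init__(self):
        self._store = {}
        self.request_log = []

    def get(self, url):
        self.request_log.append(url)  # log each request as it happens
        return self._store.get(url)

    def put(self, url, response):
        self._store[url] = response

    def purge(self, urls):
        for url in urls:
            self._store.pop(url, None)

def purge_cache_on_failure(cache):
    """Decorator implementing (b): if the wrapped function raises, drop all
    cache entries it touched during this call, then re-raise."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = len(cache.request_log)
            try:
                return func(*args, **kwargs)
            except Exception:
                cache.purge(cache.request_log[start:])
                raise
        return wrapper
    return decorator
```

On the next call after a failure, the purged URLs miss the cache and are fetched fresh from the server, so a stale entry cannot cause the same failure twice.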
Examples:

2. `CatmaidNeuron.reload()` also uses cached data. Is this working as intended?

Possible solutions:

1. A `@no_cache` decorator.
2. Allow methods (e.g. `.reload`) to overwrite the `caching=True` decorator.
3. A `clear_cached_response()` analogous to `get_cached_response()`.
4. A `pymaid.clear_cache()` function, and refer to it in the "Cached data used" message (see the sketch after this list).
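A hedged sketch of what solutions 3 and 4 might look like, assuming the cache is a simple URL-to-response mapping held on the connection object; the class layout and attribute names are assumptions, not pymaid's source:

```python
class CatmaidConnection:
    """Hypothetical connection object holding an on-disk/in-memory cache."""

    def __init__(self):
        self._cache = {}  # assumed layout: url -> cached server response

    def get_cached_response(self, url):
        """Return a cached response, or None on a cache miss."""
        return self._cache.get(url)

    def clear_cached_response(self, url):
        """Solution 3: drop a single cached response, mirroring
        get_cached_response()."""
        self._cache.pop(url, None)

    def clear_cache(self):
        """Solution 4: drop all cached responses, e.g. after a
        'Cached data used' message suggests the data may be stale."""
        self._cache.clear()
```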