#387 Cache Digests
The cache_digests gem (also included in Rails 4) will automatically add a digest to the fragment cache key based on the template. If a template changes the cache will auto-expire. But watch out for the gotchas!
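For context, a minimal usage sketch (the project model and view path are made up): the cache call looks like ordinary fragment caching, and the gem appends an MD5 of the template to the generated key.

    <%# app/views/projects/show.html.erb %>
    <% cache @project do %>
      <%= render @project.tasks %>
    <% end %>
    <%# The generated fragment key ends up looking roughly like:
        views/projects/1-20130101120000/<md5-of-template>
        (the record's cache_key plus the template's digest) %>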
So that's what 37signals calls Russian Doll Caching?
Yeah, I believe that's what they're doing.
http://37signals.com/svn/posts/3112-how-basecamp-next-got-to-be-so-damn-fast-without-using-much-client-side-ui
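Roughly, the nested ("Russian doll") pattern looks like this (made-up project/task names; the association needs touch: true so an updated task also expires the outer project fragment):

    <%# app/views/projects/show.html.erb %>
    <% cache project do %>
      <h1><%= project.name %></h1>
      <%= render project.tasks %>
    <% end %>

    <%# app/views/tasks/_task.html.erb %>
    <% cache task do %>
      <li><%= task.name %></li>
    <% end %>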
Would there be any gotchas when doing this with sensitive links that show up or remain hidden depending on whether a user is an admin or not?
The SVN blog discusses that. The short answer is you can't use conditionals that can change when the template itself doesn't. Impacts system design a bit.
You could include the things that can change in the cache key, like
<% cache [post, user.admin?] do %>
but then you lose some of the automatic cache invalidation that cache digests provide when nesting cache fragments.
I'm fairly new to all of this, but yes, I believe caching the template renders any ERB conditionals contained within the cached portion useless (except for the first time the cache is created). You can get around this using JS and HTML5 data attributes:
First, create your 'sensitive link' - in this case, we'll wrap it in a div named delete-btn - and use CSS to HIDE the link. Then, assign a data attribute to the div (sketched below).
Of course, this assumes the User model has an admin attribute.
Then, it's as easy as fetching the admin attribute using JS and writing a quick if statement that checks if the admin data attribute is true. If so, unhide the link.
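Something like the following might be what's being described (a rough sketch with hypothetical markup, assuming jQuery and a current_user helper; the admin flag is rendered in the uncached layout, since anything inside the cached fragment gets frozen at first render):

    <%# layout (not cached): expose the flag as a data attribute %>
    <body data-admin="<%= current_user.try(:admin?) %>">

    <%# inside the cached fragment: always render the link, but hide it %>
    <div class="delete-btn" style="display: none">
      <%= link_to "Delete", task_path(task), method: :delete %>
    </div>

    <script>
      // Unhide the link only when the data attribute says the user is an admin.
      if ($("body").data("admin") === true) { $(".delete-btn").show(); }
    </script>

As the replies below point out, this only hides the markup; authorization still has to be enforced server-side.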
Angelo
This seems highly insecure. What if I simply inspect your page in my browser and alter the HTML?
I suppose you would be validating credentials in controllers as well though...
On the other hand what if it's secure data? Are there not valid cases where the HTML should never be delivered for a subset of users?
I'm wondering what the purpose of caching an individual task is? It seems as though caching the project view may be sufficient.
I understand the benefit of caching an individual task is that when a project is re-rendered (after a task changes), most tasks will still be cached. But does rendering a task really take that long?
I think the answer to my question lies around how we avoid hitting the database when using fragment caching. I haven't fully thought it through, though.
Rendering the ERB can be fairly intensive and time consuming as well.
Using this method gets you down to <100ms page load times.
Also, this is exactly why people use client-side frameworks like Backbone, Ember, Spine, etc.: instead of having the server spend the time rendering it, it just sends the JSON data to the client's browser, where the framework figures out how to render it.
@cpuguy83 You're absolutely right, that's one big advantage of client-side MVC frameworks.
But after spending a year battling with Backbone, I'm just not a big fan. Backbone and JavaScript are a lot of fun, but I find sticking mainly to Ruby/Rails to be where I'm most productive.
By sticking with mostly Rails, you can avoid having to test/debug lots of JS, devise a strategy for sharing views between Rails and JS, duplicate logic between Backbone models and Rails models, etc.
I think caching + utilizing Rails UJS + RJS (Rails JS templates) is a winning combo for many apps. That's why cache_digests is sweet :-)
Yep!
I'm even starting to wonder if N+1 issues can be mitigated with fragment caching instead of always having the server do the include in the action.
I don't understand the purpose of this gem at all.
Templates don't (or, at least, shouldn't) change in production, so there are no changes to detect.
Templates do change in development, however this gem requires extra code (explicit partial and collection arguments to render) and developer behavior changes (frequent server restarts).
I've wondered for years why there isn't better (any?) support for testing caching. Wouldn't it be better to have that than to jump through these hoops to test caching in development mode?
Because when it comes time to push to production you would either have to update all your versions in each render call or manually clear your cache (something like Rails.cache.clear).
Using cache_digests you not only get auto-expiration of what should expire, but also get to keep all your other cached items as well.
It just makes deployment much simpler.
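To illustrate (hypothetical view code): without digests you would bump a version string in every cache call whenever the template changed; with cache_digests the template's MD5 is part of the key, so editing the template alone expires the fragment.

    <%# before: a manual version that must be edited on every template change %>
    <% cache ['v3', project] do %>
      ...
    <% end %>

    <%# with cache_digests: the template digest is appended automatically %>
    <% cache project do %>
      ...
    <% end %>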
OK, so the purpose is to reuse cached fragments between production deploys. In Capistrano terms, store them in shared, rather than current. That makes sense.
It still seems absurd to jump through those hoops in order to test that caching works in development mode. Testing gets great attention in the Ruby community, so I'm surprised we don't have better coverage over this. I took a stab at it three years ago with my Banker plug-in, but haven't maintained it. Perhaps I should turn it into a gem and modernize it.
But there are no hoops.
You install this gem and it does the work for you.
Can someone tell me how this helps? Isn't the controller, where the heavy lifting (DB calls etc.) is done, being called on each request anyway?
Hi Luke,
This is actually the problem caching is trying to solve. Instead of reading from your DB and rendering partials, Rails is smart enough to look for a cached fragment, which, if it exists, will be used for the response.
Only if information critical to the request has changed will a new version be created.
Hope this helps.
cache_digests and fragment caching are one piece of the puzzle.
Rendering out templates is actually pretty heavy on the server as well. Using this method will get you closer (performance-wise) to a full client-side JS solution, which takes all the fun out of using Rails.
If your controller actions are heavy, then consider caching the underlying data as well.
In various places in my app I'm using:
1. Just fragment caching
2. Fragment caching + action caching
3. Model caching + fragment caching
I'm not too fond of action caching, but it does speed things up tremendously if you can use it.
I'm interested in #3, model caching + fragment caching. Why would you need to use both? I just saw the model caching video, and it looks like it solves almost all the caching problems to me, tradeoff being more memory for memcache. Can you give me an example of how you're using both model/fragment caching?
How do you force the cache to expire? For example, when the list of items is updated?
You aren't caching the lists themselves, just the individual objects.
When you set up your associations you use the :touch => true option so the related object gets touched, thus automatically expiring its cache.
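In model terms that looks something like this (hypothetical Task/Project models):

    class Task < ActiveRecord::Base
      # Updating a task bumps the parent project's updated_at, which changes
      # the project's cache_key and so expires the project's cached fragment.
      belongs_to :project, touch: true
    end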
So how would you cache json? I think I am missing something obvious.
This is specifically for fragment caching.
JSON would need to be cached by directly calling
Rails.cache.fetch
in your model or controller.
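For example, something along these lines in a controller (names are made up):

    def show
      pet = Pet.find(params[:id])
      # Cache the serialized JSON, keyed off the record's cache_key so it
      # expires automatically when the record is updated or touched.
      json = Rails.cache.fetch([pet, :json]) { pet.to_json }
      render json: json
    end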
I had a problem using fragment caching when rendering the partial in response to JavaScript, i.e. in index.js.erb, using
<%= j render @collection %>
I was getting a response, but the JavaScript to append the response to the page just didn't fire.
In order to formulate the proper cache key for some template so that we can check Memcached, does that not require 1+ DB calls (more for nested templates that use some db objects) to fetch the updated_at value on every request?
If so, doesn't this/these mandatory call(s) somewhat offset the performance benefits a bit?
If not, is it implied that the cache_digests gem creates static HTML files specific to every call and somehow tags them with specific ETags (though that method would yield no benefit for users viewing the page for the first time)?
No, the updated_at timestamp is coming from the object you've already pulled... specifically, this is generated using the cache_key method on your object. Then it generates an MD5 of your template to see if it's changed and adds that into the key being stored in memcache.
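In other words (illustrative values):

    post = Post.find(5)
    post.cache_key
    # => "posts/5-20130101120000"  (model name, id, updated_at timestamp)

    # cache_digests then appends an MD5 of the template, so the stored
    # fragment key ends up looking like:
    #   views/posts/5-20130101120000/<md5-of-template>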
It should be noted that this gem isn't aware of I18n. Because of this, in multilingual applications you should always include the locale in the cache key (as sketched below) or write a helper for it. And there are no plans to support I18n in the future. More info.
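Presumably the snippet referred to is something along these lines, adding the current locale to the cache key by hand:

    <% cache [I18n.locale, post] do %>
      <%= render post %>
    <% end %>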
Thanks for the tip.
+1
Is there a way to use this for pages that contain mostly static content, that is only updated whenever the app is updated?
I have some static pages (tutorial, help, contact, about, etc.) that use embedded Ruby for some things, but should otherwise be cached.
Use HTTP caching: http://railscasts.com/episodes/321-http-caching
Or page caching.
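For mostly static pages, HTTP caching can be as simple as setting cache headers in the controller (hypothetical action; see the linked episode for the full picture):

    class PagesController < ApplicationController
      def about
        # Let browsers and proxies cache the response for an hour.
        expires_in 1.hour, public: true
      end
    end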
Hey Everyone!
I have a few very basic questions about cache digests in Rails 4.0.
Do I need to include
gem 'cache_digests'
in the Gemfile for a Rails 4.0 project? I didn't find a cache_digests gem in my project's rvm gemset, nor did I put
gem 'cache_digests'
in my Gemfile. However,
<% cache xxx do %>
actually worked. It wrote to the cache and read from it, like
Read fragment views/pets/67-20130627183100171326000/1e699c80fb4b885994ecf5d1b8e61933 (0.1ms)
There are no
cache_digests:nested_dependencies
and
cache_digests:dependencies
tasks if I didn't include
gem 'cache_digests'
in the Gemfile? Is that right? Any help's appreciated!
Will these two interfere if m is the same instance but it is used in different partials?