GitHub User: cpuguy83
It still doesn't get around having to wrap your code in a cache block, but check out:
It just lets you call:
cache_digests is for generating MD5 digests of your fragments and inserting them into the cache key for the object you are caching; this one is for caching in the model itself.
Check out https://github.com/cpuguy83/pack_rat for some cache helpers in your model
This episode makes me a little sad.
It's almost easier to set up Faye to handle this.
No, the updated_at timestamp is coming from the object you've already pulled... specifically this is generated using the cache_key method on your object. Then it's generating an MD5 of your template to see if it's changed and adding that into the key being stored in memcache.
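Roughly, the key is the model's cache_key (class, id, updated_at) plus an MD5 of the template source. A plain-Ruby sketch of that composition (the method and names here are made up for illustration, not Rails' actual implementation):

```ruby
require 'digest/md5'

# Hypothetical sketch: combine a model-style cache_key (class name, id,
# updated_at) with an MD5 digest of the template source. If the record is
# touched OR the template file changes, the key changes, and the old
# memcached entry is simply never read again.
def fragment_cache_key(klass, id, updated_at, template_source)
  model_key = "#{klass.downcase}s/#{id}-#{updated_at.to_i}"
  template_digest = Digest::MD5.hexdigest(template_source)
  "views/#{model_key}/#{template_digest}"
end

key1 = fragment_cache_key('Post', 5, Time.at(100), '<h1><%= post.title %></h1>')
key2 = fragment_cache_key('Post', 5, Time.at(100), '<h2><%= post.title %></h2>')
key1 == key2  # => false: editing the template expires the fragment
```

Note how neither change requires deleting anything from memcache; stale entries just stop being looked up.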
I've messed with both.
It should be a fairly simple transition, assuming you aren't using things that have been deprecated in Rails 3 (which would be removed in Rails 4)... and of course assuming you are at least on Rails 3.1. The Asset Pipeline is a fairly large hurdle.
You must run 'CREATE EXTENSION hstore' first.
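One way to do that is from a migration (a sketch; assumes PostgreSQL and a Rails version with up/down migrations):

```ruby
# Hypothetical migration enabling the hstore extension before any
# hstore columns are added. Requires a Postgres user with sufficient
# privileges to create extensions.
class EnableHstore < ActiveRecord::Migration
  def up
    execute 'CREATE EXTENSION IF NOT EXISTS hstore'
  end

  def down
    execute 'DROP EXTENSION hstore'
  end
end
```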
And what is wrong with that?
If the business logic is related to that model class and only that model class, then it should go there.
A fat model is not a bad model, so long as your model isn't fat because it's including functionality that belongs to another class.
cache_digests and fragment caching is 1 piece of the puzzle.
Rendering out templates is actually pretty heavy on the server as well. Using this method will get you closer (performance-wise) to using a full client side JS solution, which takes all the fun out of using Rails.
If your controller actions are heavy then consider caching the underlying data as well.
In various places in my app I'm using:
1. Just fragment caching
2. Fragment caching + action caching
3. Model caching + fragment caching
I'm not too fond of action caching, but it does speed things up tremendously if you can use it.
This is specifically for fragment caching.
JSON would need to get cached by directly calling Rails.cache.fetch in your model or controller.
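The semantics of fetch are: return the cached value on a hit, otherwise run the block, store its result, and return it. A plain-Ruby stand-in for that behavior (a hypothetical MiniCache, not Rails' implementation — in a real app you'd call Rails.cache.fetch with a key built from the record's cache_key):

```ruby
require 'json'

# Minimal in-memory stand-in for Rails.cache.fetch semantics:
# hit -> return the stored value; miss -> run the block, store, return.
class MiniCache
  def initialize; @store = {}; end

  def fetch(key)
    return @store[key] if @store.key?(key)
    @store[key] = yield
  end
end

cache = MiniCache.new
calls = 0
json = cache.fetch('posts/1-100/json') { calls += 1; JSON.generate(title: 'Hello') }
cache.fetch('posts/1-100/json') { calls += 1; JSON.generate(title: 'Hello') }
calls  # => 1: the JSON was generated once and served from cache after
```

Since the record's cache_key embeds updated_at, touching the record gives you a fresh key (and a cache miss) automatically.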
But there are no hoops.
You install this gem and it does the work for you.
You aren't caching the lists themselves, just the individual objects.
When you set up your associations you pass the ":touch => true" option so the related object gets touched, thus automatically expiring the cache on it.
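In Rails that's just `belongs_to :article, :touch => true` on the child model. The effect can be sketched in plain Ruby (hypothetical classes, not ActiveRecord):

```ruby
# Sketch of what :touch => true does: saving the child bumps the parent's
# updated_at, which changes the parent's cache_key and so silently expires
# its fragment cache entry (the old key is never looked up again).
class Article
  attr_accessor :updated_at

  def initialize; @updated_at = Time.at(0); end

  def cache_key
    "articles/1-#{updated_at.to_i}"
  end
end

class Comment
  def initialize(article); @article = article; end

  def save
    @article.updated_at = Time.now  # the "touch"
  end
end

article = Article.new
old_key = article.cache_key
Comment.new(article).save
article.cache_key == old_key  # => false: the cached fragment is now stale
```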
I'm even starting to wonder if N+1 issues can be mitigated with fragment caching instead of always having the server do the include in the action.
Also, this is exactly why people use client-side frameworks like Backbone, Ember, Spine, etc.... instead of having the server spend the time rendering it, it just sends the json data to the client's browser where the framework figures out how to render it.
Because when it comes time to push to production you would either have to update all your versions in each render call or manually clear your cache (something like Rails.cache.clear).
Using cache_digests you not only get auto-expiration of what should expire, but also get to keep all your other cached items as well.
It just makes deployment much simpler.
Rendering the ERB can be fairly intensive and time consuming as well.
Using this method gets you down to sub-100ms page load times.
I would imagine Rails 4 will handle it a lot better once it's released.
The great thing about this is you get the benefits of schemaless data with your schema'd data... and you get to keep using AR.
Nice bit to know about calling "my" to get the original context.
I'd started creating variables before the where clause so I could call them; I think "my" is better.
Love Squeel. Love that I don't have to mess with ARel.
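For anyone else wondering: inside a Squeel where{} block, bare names become column stubs, and my{} re-enters the calling context. A sketch along the lines of the Squeel README (model and column names illustrative; requires the squeel gem):

```ruby
class Person < ActiveRecord::Base
  def find_siblings
    # parent_id and id on the left are Squeel column stubs;
    # my{parent_id} / my{id} evaluate against this Person instance.
    Person.where{(parent_id == my{parent_id}) & (id != my{id})}
  end
end
```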