
Mark Allen's Profile

GitHub User: TSMMark

Site: http://vydia.com

Comments by Mark Allen


Great cast!

I don't really like overriding the .json MIME type to always respond with datatable-formatted JSON. I suggest aliasing the "text/json" MIME type to a new type called something like "datatable". This leaves .json functioning as it normally would:

Mime::Type.register_alias "text/json", :datatable

respond_to do |format|
  format.json       { render json: @products }
  format.datatable  { render json: ProductsDatatable.new(view_context) }
end

Then in the view:

data-source="<%= products_url(format: "datatable") %>"

I'm wondering about the best way to handle very large data imports that can take minutes to parse. The app I'm working on needs to handle potentially huge imports of tens or hundreds of thousands of rows.
I'm thinking it makes sense to store the CSV temporarily on upload, do all the parsing via cron, and somehow notify the user that the import is processing.
I'm pretty new to Rails, and I'm not sure of the best way to execute this.
Anyone have a suggestion?