
Mark Allen's Profile

GitHub User: TSMMark

Site: http://vydia.com

Comments by Mark Allen


Great cast!
I don't really like overriding the .json mime type to always respond with the DataTables-formatted JSON.
Instead, I suggest registering an alias for the "text/json" mime type under a new name such as "datatable".
This leaves .json functioning as it normally would:

config/initializers/mime_types.rb
Mime::Type.register_alias "text/json", :datatable
products_controller.rb
respond_to do |format|
  format.html
  format.json       { render json: @products }
  format.datatable  { render json: ProductsDatatable.new(view_context) }
end
app/views/products/index.html.erb
data-source="<%= products_url(format: "datatable") %>"
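For context, here is a minimal sketch of what the ProductsDatatable class referenced in the controller might look like. The class name comes from the snippet above, but the body below is hypothetical: it just shows the JSON shape (sEcho, iTotalRecords, aaData, etc.) that the legacy DataTables server-side protocol expects; a real implementation would page and sort actual Product records via the view context's params.

```ruby
# Hypothetical sketch of the ProductsDatatable class used above.
# It wraps the view context and emits the JSON shape DataTables expects.
class ProductsDatatable
  def initialize(view)
    @view = view # gives access to params and view helpers in a real app
  end

  def as_json(options = {})
    {
      sEcho: 0,                        # echoed back from params in a real app
      iTotalRecords: data.size,        # total rows before filtering
      iTotalDisplayRecords: data.size, # rows after filtering
      aaData: data                     # one array per table row
    }
  end

  private

  # Placeholder rows; a real implementation would query, sort, and
  # paginate Product records based on the DataTables request params.
  def data
    [["Widget", "$10.00"], ["Gadget", "$25.00"]]
  end
end
```

Because the controller calls render json: on the instance, Rails invokes as_json and serializes the resulting hash.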

I'm wondering about the best way to handle very large data imports that can take minutes to parse. The app I'm working on needs to handle potentially huge imports of tens or hundreds of thousands of rows.
I'm thinking it makes sense to store the CSV temporarily on upload, do all the parsing via cron, and somehow notify the user that the import is still processing.
I'm pretty new to Rails, and I'm not sure of the best way to execute this.
Anyone have a suggestion?
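One possible shape for the cron-driven approach described above (all names here are hypothetical, not from the episode): the upload action saves the raw CSV to disk along with a "pending" status record, and a cron-invoked task parses pending files in batches so a huge import never loads every row into memory at once.

```ruby
require "csv"

# Parse a CSV file in fixed-size batches and yield each batch to the
# caller (e.g. for a bulk insert). Streaming with CSV.foreach keeps
# memory flat even for hundreds of thousands of rows.
def each_csv_batch(path, batch_size: 1000)
  batch = []
  CSV.foreach(path, headers: true) do |row|
    batch << row.to_h
    if batch.size >= batch_size
      yield batch
      batch = []
    end
  end
  yield batch unless batch.empty? # flush the final partial batch
end
```

A cron job (or a background-job library such as Delayed Job or Resque) could call this for each pending import, then flip the import's status to "done" so the UI can tell the user when processing has finished.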