I have a channel model with 2 associations, "contents" and "subscriptions".
In the channel index, the user can order the channels by number of subscriptions or number of approved contents.
In development everything seems to work properly (judging by observation of the results; it may still be malfunctioning and I simply don't have enough data to notice), but in staging the results are random: sometimes the ordering is right, sometimes it isn't.
At first I wasn't using delta indexes and thought the problem could be there so every time I approve a content I call:
Delayed::Job.enqueue(DelayedRake.new("ts:index"), queue: "sphinx")
Since the subscriptions don't have indexes, I don't reindex every time I create one (should I do it?)
Then I started using delta indexes in the channel and I still get the same problems:
ThinkingSphinx::Index.define :channel, with: :active_record, delta: true do
  # fields
  indexes :name, sortable: true
  indexes description

  # attributes
  has created_at, sortable: true
  has approved, type: :boolean
  has public, type: :boolean

  join subscriptions
  has "COUNT(subscriptions.id)", as: :subscription_count, type: :integer, sortable: true

  join contents.approved
  has "COUNT(contents.id)", as: :content_count, type: :integer, sortable: true
end
And here is the search call in the controller:
def index
  if params[:order_by].present?
    @channels = Channel.search params[:search],
                               order: "#{params[:order_by]} DESC",
                               page: params[:page], per_page: 6
  else
    @channels = Channel.search params[:search],
                               order: :name,
                               page: params[:page], per_page: 6
  end
end
Summarising, my questions would be:
1. Are my channel indexes well formed?
2. Should subscriptions be indexed as well, or is it enough to join them in my channel index?
3. Should I run a reindex after I create a subscription / approve a content, or does the delta index on the channel deal with that, since those two associations are joined in my channel index?
Your index looks fine, but if you're using deltas (and I think that's the wisest approach here, to have the data up-to-date), then you want to fire deltas for the related channels when a subscription or content is created/edited/deleted. This is covered in the documentation (see the "Deltas and Associations" section), but you'd be looking at something like this in both Subscription and Content:
after_save :set_channel_delta_flag
after_destroy :set_channel_delta_flag

# ...

private

def set_channel_delta_flag
  channel.update_attributes :delta => true
end
Given you're using Delayed Job, I'd recommend investigating ts-delayed-delta to ensure delta updates are happening out of your normal HTTP request flow. And I highly recommend not running a full index after every change - that has the potential of getting quite slow quite quickly (and adding to the server load unnecessarily).
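As a rough sketch of wiring that in (assuming the ts-delayed-delta gem is in your Gemfile; check the gem's README, since the exact delta class name depends on your Thinking Sphinx version), the index definition swaps `delta: true` for the delayed delta class:

```ruby
ThinkingSphinx::Index.define :channel, with: :active_record,
  delta: ThinkingSphinx::Deltas::DelayedDelta do
  # ... same fields, attributes and joins as above ...
end
```

With that in place, setting the delta flag in your callbacks enqueues a Delayed Job instead of reindexing inside the request.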
I have an events table and a sessions table; Event has_many :sessions. Now I want to move the time_zone column from the sessions table to the events table. How do I do this with migrations, and how do I move the existing time_zone values from the sessions table over to the events table?
First, you need to be sure that all sessions associated with the same event have the same time zone. You can check this with:
Session.group(:event_id).distinct.count(:time_zone)
This will return a hash mapping each event_id to the number of distinct time zones among its sessions. This number should always be one.
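To sanity-check that result in plain Ruby (with a hypothetical hash standing in for the query result):

```ruby
# Hypothetical result of the grouped count: event_id => number of
# distinct time zones among that event's sessions.
counts = { 1 => 1, 2 => 1, 3 => 2 }

# Events whose sessions disagree on time zone must be fixed before migrating.
conflicting = counts.select { |_event_id, zones| zones > 1 }.keys
puts conflicting.inspect  # => [3]
```

Only once this list is empty is it safe to copy a single time zone per event.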
Second, I recommend that you first add events.time_zone and start using it and remove sessions.time_zone in a separate migration after the new code has been in production for some time and is proved to work.
Third, the migration to add events.time_zone should look like this (I added some comments for clarity):
class AddTimeZoneToEvents < ActiveRecord::Migration
  class Event < ActiveRecord::Base; end
  class Session < ActiveRecord::Base; end

  def up
    # Add a NULLable time_zone column to events. Even if the column should be
    # non-NULLable, we first allow NULLs and will set the appropriate values
    # in the next step.
    add_column :events, :time_zone, :string

    # Ensure the new column is visible.
    Event.reset_column_information

    # Iterate over events in batches. Use #update_columns to set the newly
    # added time_zone without modifying updated_at. If you want to update
    # updated_at you have at least two options:
    #
    # 1. Set it to the time at which the migration is run. In this case, just
    #    replace #update_columns with #update!
    # 2. Set it to the maximum of `events.updated_at` and
    #    `sessions.updated_at`.
    #
    # Also, if your database is huge you may consider a different query to
    # perform the update (it also depends on your database).
    Event.find_each do |event|
      session = Session.where(event_id: event.id).last
      # Skip events that have no sessions; they keep a NULL time_zone.
      event.update_columns(time_zone: session.time_zone) if session
    end

    # If events don't always need to have time zone information then
    # you can remove the line below.
    change_column_null :events, :time_zone, false
  end

  def down
    remove_column :events, :time_zone
  end
end
Note that I redefined the models inside the migration. It's crucial to do so because:
The original models may have callbacks and validations; the bare redefinitions guarantee the migration never triggers them (you could also skip them explicitly, but that extra precaution buys you nothing over a bare model).
If you remove the original models 6 months down the road, a migration that referenced them would stop working; the local definitions keep it self-contained.
Once you're sure your changes work as expected you can remove sessions.time_zone. If something goes awry you can simply roll back the above migration and restore a working version easily.
You can simply use the following migration.
class Test < ActiveRecord::Migration
  def change
    add_column :events, :time_zone, :string
    Event.reset_column_information
    Event.all.each do |e|
      e.update_attributes(time_zone: e.sessions.last.time_zone)
    end
    remove_column :sessions, :time_zone
  end
end
I'm a newbie with RealmSwift and I'm creating a chat-like application using Swift 3.0 with RealmSwift as the backend database. Inserting chats into Realm works fine, but the problem is when I fetch the records:
let newChat = uiRealm.objects(Chats.self).filter(
    "(from_id == %@ OR from_id == %@) AND (to_id == %@ OR to_id == %@)",
    signUser!.user_id, selectedList.user_id,
    signUser!.user_id, selectedList.user_id
).sorted(byProperty: "id", ascending: true)
I don't know how to limit the conversation to the last 30 records. The code above fetches records from the Chats table, filtered by the signed-in user's ID and the recipient's ID. Also, when I list all the records (more than 150 messages) for a particular chat, scrolling up through the table view stutters or hangs for a while. So please give me some idea of how to limit it to the last 30 records and stop the table view from hanging. Thanks in advance.
Like I wrote in the Realm documentation, because Realm Results objects are lazily-loaded, it doesn't matter if you query for all of the objects and then simply load the ones you need.
If you want to line it up to a table view, you could create an auxiliary method that maps the last 30 results to a 0-30 index range, which would be easier to then pass straight to the table view's data source:
func chat(atIndex index: Int) -> Chats {
    // Map table view rows 0..29 onto the last 30 results
    // (clamped so it still works when there are fewer than 30 chats).
    let mappedIndex = max(0, newChat.count - 30) + index
    return newChat[mappedIndex]
}
If you've already successfully queried and started accessing these objects (i.e. the query itself didn't hang), I'm not sure why the table view would hang after the fact. You could try running the Time Profiler in Instruments to track down exactly what's causing the main thread to be blocked.
I currently have the following models: MinorCategory > Product > Review
On a view, I show the 12 MinorCategories that have the most reviews. This view is very slow to respond, and I think it is a problem with how I do the query.
Here is my current code:
class MinorCategory < ActiveRecord::Base
  has_many :products
  has_many :reviews, through: :products
  ...

  def count_reviews
    self.reviews.count
  end
  ...
end
class Review < ActiveRecord::Base
  belongs_to :product, touch: true
  ...
end
class HomeController < ApplicationController
  def index
    @categories = MinorCategory.all.sort_by(&:count_reviews).reverse.take(12)
  end
end
So that is basically it. In the view itself I iterate over @categories and display a few things, but the query in the controller seems to be the slow part. From Skylight:
SELECT COUNT(*) FROM "reviews" INNER JOIN "products" ON "reviews"."product_id" = "products"."id" WHERE "products"."minor_category_id" = ? ... avg 472ms
I am not good with sql or active record, and still pretty new to Ruby on Rails. I've spent a couple hours trying other methods, but I can not get them to work so I thought I would check here.
Thank you in advance to anybody that has a moment.
You need some basic SQL knowledge to better understand how database queries work, and how to take advantage of a DBMS. Using ActiveRecord is not an excuse to not learn some SQL.
That said, your query is very inefficient because you don't use the power of the database at all. It's a waste of resources both on the Ruby environment and on the database environment.
The only database query is
MinorCategory.all
which extracts all the records. This is insanely expensive, especially if you have a large number of categories.
Moreover, calling self.reviews.count once per category is highly inefficient because it triggers the N+1 query problem.
Last but not least, the sorting and limiting is made in the Ruby environment, whereas you should really do it in the database.
You can easily obtain a more efficient query by taking advantage of the database's computation capabilities. You will need to join the tables together (reviews reach minor_categories through products). The query should look like:
SELECT
minor_categories.*, COUNT(reviews.id) AS reviews_count
FROM
"minor_categories"
INNER JOIN "products" ON "products"."minor_category_id" = "minor_categories"."id"
INNER JOIN "reviews" ON "reviews"."product_id" = "products"."id"
GROUP BY
minor_categories.id
ORDER BY
reviews_count DESC
LIMIT 12
which in ActiveRecord translates as (the :reviews has_many :through association generates both joins for you):
categories = MinorCategory
  .select('minor_categories.*, COUNT(reviews.id) AS reviews_count')
  .joins(:reviews)
  .group('minor_categories.id')
  .order('reviews_count DESC')
  .limit(12)
You can access each category's count via reviews_count:
# take a category
category = categories[0]
category.reviews_count
Another approach that doesn't require a JOIN would be to cache the counter in the category table.
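A minimal sketch of that counter-cache approach, assuming you add a `reviews_count` integer column (default 0) to `minor_categories` via a migration. The callbacks below are hand-rolled because Rails' built-in `counter_cache` option only handles a direct `belongs_to`, not a grandparent reached through another model:

```ruby
class Review < ActiveRecord::Base
  belongs_to :product, touch: true

  # Keep the cached count on the grandparent category in sync.
  after_create  { product.minor_category.increment!(:reviews_count) }
  after_destroy { product.minor_category.decrement!(:reviews_count) }
end
```

The homepage query then needs no JOIN or GROUP BY at all: `MinorCategory.order(reviews_count: :desc).limit(12)`.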
I was hoping this would be simple, but in essence I would like to count the number of Users who have certain attributes through a join table in Rails 4.
I have a table called Views which holds three columns:
t.references :viewer
t.references :viewed
t.boolean :sentmessage, default: false
I then have the fields referenced as:
belongs_to :viewer, :class_name => "User"
belongs_to :viewed, :class_name => "User"
Each user record is then associated with a number of other records like Stats, Questions and a number of others. I'm interested in effectively counting how many viewers of a viewed record are Male or Female (and other search fields) which is data all held in User.stat.gender.name etc.
I'm trying to use a group statement but have no idea how to drill down and count the number of Males etc. I've tried:
@results = View.where(viewed: 63).group("viewer.stat.gender")
But this is so wrong it's frightening.
Any help to do this would be appreciated.
I worked it out finally. For anyone else who is interested:
View.where(viewed_id: 63).joins(viewer: { stat: :gender }).group("genders.name").count
I didn't realise what an INNER JOIN was, but after some research and some trial and error I can now show information about the users who have visited.
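The grouped count comes back as a plain hash of gender name to count; the same shape can be sketched in plain Ruby (hypothetical data):

```ruby
# Hypothetical gender names of the viewers returned by the JOIN.
genders = ["Male", "Male", "Female", "Male"]

# Same shape as .group("genders.name").count: gender name => number of viewers.
counts = genders.group_by(&:itself).transform_values(&:count)
puts counts.inspect
```

Here `counts` maps "Male" to 3 and "Female" to 1, just as the grouped database query maps each gender name to its viewer count.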
In my SQLite table, I have more than 5 years of price and quantity records for a Product. Now I am trying to render the data in Highcharts. For the filtration part, I am adding 4 radio buttons: Day, Week, Month and Year.
If the user clicks Day, the graph should render day-wise data. If the user clicks Week, the graph should render weekly aggregated values, and so on.
I can't figure out how to implement this. I am trying to put the filter code in the MODEL.
Please suggest how to proceed.
This solution is quick and dirty, and might have some performance issues. I'll suggest potential optimizations down below.
Assuming a Product model with the following schema,
# Table name: products
#
# id :integer not null, primary key
# name :string
# amount :float
# quantity :integer
# created_at :datetime not null
# updated_at :datetime not null
in models/product.rb:
class Product < ActiveRecord::Base
  def self.group_by_day
    # queries the database
    self.all.group_by { |p| p.created_at.beginning_of_day }
    # if you're using a different date field, replace 'created_at' with that field.
  end

  def self.quantity_by_day
    # returns an array of hashes, each containing the date and quantity
    quantity_array = []
    self.group_by_day.each do |day, products|
      quantity_array << {
        date: day.to_s(:db),
        quantity: products.map(&:quantity).sum
      }
    end
    quantity_array
  end

  def self.group_by_week
    self.all.group_by { |p| p.created_at.beginning_of_week }
  end

  def self.quantity_by_week
    # ...
  end

  def self.group_by_month
    self.all.group_by { |p| p.created_at.beginning_of_month }
  end

  def self.quantity_by_month
    # ...
  end
end
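The grouping logic itself can be checked in plain Ruby with stand-in objects (Structs here instead of ActiveRecord models):

```ruby
require 'date'

# Stand-in for a Product row with just the fields the grouping needs.
Item = Struct.new(:created_at, :quantity)

items = [
  Item.new(Date.new(2017, 1, 2), 3),
  Item.new(Date.new(2017, 1, 2), 4),
  Item.new(Date.new(2017, 1, 3), 5)
]

# Same shape as Product.quantity_by_day: one hash per day, quantities summed.
quantity_by_day = items.group_by(&:created_at).map do |day, group|
  { date: day.to_s, quantity: group.map(&:quantity).sum }
end

puts quantity_by_day.inspect
```

This produces one entry per day: quantity 7 for 2017-01-02 and 5 for 2017-01-03, which is exactly the series shape Highcharts expects.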
Product.quantity_by_day.to_json will give you the data you can use for the chart. Here's a JSFiddle of a Highcharts chart using the data (edit: fixed an issue that kept the X-axis dates from displaying).
As I mentioned earlier, querying all the products and grouping them in Ruby is inefficient. I would suggest:
Using PostgreSQL instead of SQLite, then using the groupdate gem, which groups the records in SQL and is a lot faster. SQLite does not have good date support, which is why we fall back to Ruby's group_by here.
Also, see this SO question about grouping by week/month in SQL.
Lastly, showing your Products table schema would help us come up with a more specific answer.