feat(rollup): add models and write path [1/3] (#13796)

## PR#1: Reporting events rollup — model and write path

Reporting queries currently hit the `reporting_events` table directly.
This works, but the table grows linearly with event volume, and
aggregation queries (counts, averages over date ranges) get
progressively slower as accounts age.

This PR introduces a pre-aggregated `reporting_events_rollups` table
that stores daily per-metric, per-dimension (account/agent/inbox)
totals. The write path is intentionally decoupled from the read path —
rollup rows are written inline from the event listener via upsert, and a
backfill service exists to rebuild historical data from raw events.
Nothing reads from this table yet.
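For intuition on why pre-aggregation helps (plain-Ruby sketch, not code from this PR): averages are additive over per-day `(count, sum)` pairs, so a date-range average only needs one row per day per metric instead of a scan over every raw event.

```ruby
# Hypothetical in-memory rollup rows: one (count, sum_value) pair per day.
rollup_rows = [
  { date: '2026-03-01', count: 4, sum_value: 400.0 },
  { date: '2026-03-02', count: 6, sum_value: 800.0 }
]

# A date-range average reduces to summing the pairs, then dividing once.
total_count = rollup_rows.sum { |r| r[:count] }
total_sum   = rollup_rows.sum { |r| r[:sum_value] }
avg = total_count.zero? ? 0.0 : total_sum / total_count
# avg == 120.0
```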

The write path activates when an account has a `reporting_timezone` set
(a new account setting). The `reporting_events_rollup` feature flag
controls only the future read path, not writes, so rollup data
accumulates silently once a timezone is configured. A `MetricRegistry`
maps raw event names to rollup column semantics in one place, keeping
the write and (future) read paths aligned.
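Conceptually, the registry is a fan-out from one raw event name to one or more rollup metrics. A simplified sketch (the real mapping lives in `ReportingEvents::EventMetricRegistry` in this PR, with per-metric payload kinds):

```ruby
# Simplified fan-out: one raw event can feed several rollup metrics.
EVENT_TO_METRICS = {
  'conversation_resolved' => [:resolutions_count, :resolution_time],
  'first_response'        => [:first_response]
}.freeze

metrics = EVENT_TO_METRICS.fetch('conversation_resolved', [])
# => [:resolutions_count, :resolution_time]
```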

### What changed

- Migration for `reporting_events_rollups` with a unique composite index
for upsert
- `ReportingEventsRollup` model
- `reporting_timezone` account setting with IANA timezone validation
- `MetricRegistry` — single source of truth for event-to-metric mappings
- `RollupService` — real-time upsert from event listener
- `BackfillService` — rebuilds rollups for a given account + date from
raw events
- Rake tasks for interactive backfill and timezone setup
- `reporting_events_rollup` feature flag (disabled by default)

### How to test

1. Set a `reporting_timezone` on an account
(`Account.first.update!(reporting_timezone: 'Asia/Kolkata')`)
2. Resolve a conversation or trigger a first response
3. Check `ReportingEventsRollup.where(account_id: ...)` — rows should
appear
4. Run backfill: `bundle exec rake reporting_events_rollup:backfill` and
verify historical data populates
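Behind the scenes, repeated events on the same day merge additively into a single rollup row keyed by the unique index. A plain-Ruby stand-in for the Postgres `ON CONFLICT` additive update (illustrative only, not code from this PR):

```ruby
# In-memory stand-in for the additive upsert: on key conflict,
# counts and sums are added rather than overwritten.
def upsert(rollups, key, count:, sum_value:)
  row = rollups[key]
  row[:count] += count
  row[:sum_value] += sum_value
end

rollups = Hash.new { |h, k| h[k] = { count: 0, sum_value: 0.0 } }

key = [1, '2026-03-19', 'account', 1, 'resolution_time']
upsert(rollups, key, count: 1, sum_value: 300.0)
upsert(rollups, key, count: 1, sum_value: 180.0)
rollups[key] # => { count: 2, sum_value: 480.0 }
```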

---------

Co-authored-by: Muhsin Keloth <muhsinkeramam@gmail.com>
Author: Shivam Mishra
Date: 2026-03-19 13:12:36 +05:30 (committed by GitHub)
Parent: 654fcd43f2
Commit: 9967101b48
20 changed files with 2013 additions and 1 deletion


@@ -1,4 +1,10 @@
module TimezoneHelper
def timezone_name_from_params(timezone, offset)
return timezone if timezone.present? && ActiveSupport::TimeZone[timezone].present?
timezone_name_from_offset(offset)
end
# ActiveSupport::TimeZone is not aware of the current time, so ActiveSupport::TimeZone[offset]
# would return the timezone without considering daylight saving time. To get the correct timezone,
# this method uses zone.now.utc_offset for comparison as referenced in the issues below


@@ -21,6 +21,7 @@ class ReportingEventListener < BaseListener
create_bot_resolved_event(conversation, reporting_event)
reporting_event.save!
safe_rollup(reporting_event)
end
def first_reply_created(event)
@@ -42,6 +43,7 @@ class ReportingEventListener < BaseListener
)
reporting_event.save!
safe_rollup(reporting_event)
end
def reply_created(event)
@@ -66,13 +68,16 @@ class ReportingEventListener < BaseListener
event_end_time: message.created_at
)
reporting_event.save!
safe_rollup(reporting_event)
end
def conversation_bot_handoff(event)
conversation = extract_conversation_and_account(event)[0]
event_end_time = event.timestamp
# check if a conversation_bot_handoff event exists for this conversation
# Best-effort guard: raw report reads count bot handoffs with DISTINCT conversation_id,
# while rollup counts assume one conversation_bot_handoff event per conversation.
# That uniqueness is not currently enforced at the database level.
bot_handoff_event = ReportingEvent.find_by(conversation_id: conversation.id, name: 'conversation_bot_handoff')
return if bot_handoff_event.present?
@@ -90,6 +95,7 @@ class ReportingEventListener < BaseListener
event_end_time: event_end_time
)
reporting_event.save!
safe_rollup(reporting_event)
end
def conversation_captain_inference_resolved(event)
@@ -166,5 +172,16 @@ class ReportingEventListener < BaseListener
bot_resolved_event = reporting_event.dup
bot_resolved_event.name = 'conversation_bot_resolved'
bot_resolved_event.save!
safe_rollup(bot_resolved_event)
end
def safe_rollup(reporting_event)
# Rollups are derived from the raw reporting event. If a transient rollup write
# failure bubbled out of here, Sidekiq would retry the dispatcher job and could
# insert the same raw event again. Swallowing the error can temporarily under-report
# rollups, but the source event is preserved and rollup data can be rebuilt or
# re-applied later.
ReportingEvents::RollupService.perform(reporting_event)
rescue StandardError => e
ChatwootExceptionTracker.new(e, account: reporting_event.account).capture_exception
end
end
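The rescue-and-report pattern in `safe_rollup` — the secondary write never raises into the primary path — can be illustrated in plain Ruby (hypothetical `safe_side_effect` helper; the real code reports via `ChatwootExceptionTracker`):

```ruby
# Error isolation: a failing secondary write is captured and swallowed
# so the primary path continues.
def safe_side_effect(results)
  yield
rescue StandardError => e
  results << "captured: #{e.message}" # stand-in for exception tracking
end

results = []
safe_side_effect(results) { raise 'transient rollup failure' }
results << 'primary path continues'
results # => ["captured: transient rollup failure", "primary path continues"]
```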


@@ -85,11 +85,13 @@ class Account < ApplicationRecord
validates_with JsonSchemaValidator,
schema: SETTINGS_PARAMS_SCHEMA,
attribute_resolver: ->(record) { record.settings }
validate :validate_reporting_timezone
store_accessor :settings, :auto_resolve_after, :auto_resolve_message, :auto_resolve_ignore_waiting
store_accessor :settings, :audio_transcriptions, :auto_resolve_label
store_accessor :settings, :captain_models, :captain_features
store_accessor :settings, :reporting_timezone
store_accessor :settings, :keep_pending_on_bot_failure
store_accessor :settings, :captain_auto_resolve_mode
include AccountCaptainAutoResolve
@@ -215,6 +217,12 @@ class Account < ApplicationRecord
# method overridden in enterprise module
end
def validate_reporting_timezone
return if reporting_timezone.blank? || ActiveSupport::TimeZone[reporting_timezone].present?
errors.add(:reporting_timezone, I18n.t('errors.account.reporting_timezone.invalid'))
end
def remove_account_sequences
ActiveRecord::Base.connection.exec_query("drop sequence IF EXISTS camp_dpid_seq_#{id}")
ActiveRecord::Base.connection.exec_query("drop sequence IF EXISTS conv_dpid_seq_#{id}")


@@ -0,0 +1,48 @@
# == Schema Information
#
# Table name: reporting_events_rollups
#
# id :bigint not null, primary key
# count :bigint default(0), not null
# date :date not null
# dimension_id :bigint not null
# dimension_type :string not null
# metric :string not null
# sum_value :float default(0.0), not null
# sum_value_business_hours :float default(0.0), not null
# created_at :datetime not null
# updated_at :datetime not null
# account_id :integer not null
#
# Indexes
#
# index_rollup_summary (account_id,dimension_type,date)
# index_rollup_timeseries (account_id,metric,date)
# index_rollup_unique_key (account_id,date,dimension_type,dimension_id,metric) UNIQUE
#
class ReportingEventsRollup < ApplicationRecord
belongs_to :account
# Store string values directly in the database for better readability and debugging
enum :dimension_type, %w[account agent inbox team].index_by(&:itself)
enum :metric, %w[
resolutions_count
first_response
resolution_time
reply_time
bot_resolutions_count
bot_handoffs_count
].index_by(&:itself)
validates :account_id, presence: true
validates :date, presence: true
validates :dimension_type, presence: true
validates :dimension_id, presence: true
validates :metric, presence: true
validates :count, numericality: { greater_than_or_equal_to: 0 }
scope :for_date_range, ->(start_date, end_date) { where(date: start_date..end_date) }
scope :for_dimension, ->(type, id) { where(dimension_type: type, dimension_id: id) }
scope :for_metric, ->(metric) { where(metric: metric) }
end


@@ -0,0 +1,142 @@
# frozen_string_literal: true
class ReportingEvents::BackfillService
DIMENSIONS = [
{ type: 'account', group_column: nil },
{ type: 'agent', group_column: :user_id },
{ type: 'inbox', group_column: :inbox_id }
].freeze
# TODO: Move this to EventMetricRegistry when we expand distinct-counting support.
# The live path already guards uniqueness in ReportingEventListener#conversation_bot_handoff,
# but historical duplicates can exist since it's not enforced at the DB level.
# These events are queried per-dimension (not group-then-sum) because COUNT(DISTINCT) is not additive.
DISTINCT_COUNT_EVENTS = %w[conversation_bot_handoff].freeze
DISTINCT_COUNT_SQL = Arel.sql('COUNT(DISTINCT conversation_id)')
def self.backfill_date(account, date)
new(account, date).perform
end
def initialize(account, date)
@account = account
@date = date
end
def perform
start_utc, end_utc = date_boundaries_in_utc
rollup_rows = build_rollup_rows(start_utc, end_utc)
ReportingEventsRollup.transaction do
delete_existing_rollups
bulk_insert_rollups(rollup_rows) if rollup_rows.any?
end
end
private
def delete_existing_rollups
ReportingEventsRollup.where(account_id: @account.id, date: @date).delete_all
end
def date_boundaries_in_utc
tz = ActiveSupport::TimeZone[@account.reporting_timezone]
start_in_tz = tz.parse(@date.to_s)
end_in_tz = tz.parse((@date + 1.day).to_s)
[start_in_tz.utc, end_in_tz.utc]
end
def build_rollup_rows(start_utc, end_utc)
aggregates = build_aggregates(start_utc, end_utc)
aggregates.map do |(dimension_type, dimension_id, metric), data|
{
account_id: @account.id,
date: @date,
dimension_type: dimension_type,
dimension_id: dimension_id,
metric: metric,
count: data[:count],
sum_value: data[:sum_value],
sum_value_business_hours: data[:sum_value_business_hours],
created_at: Time.current,
updated_at: Time.current
}
end
end
def build_aggregates(start_utc, end_utc)
aggregates = Hash.new { |h, k| h[k] = { count: 0, sum_value: 0.0, sum_value_business_hours: 0.0 } }
standard_names = ReportingEvents::EventMetricRegistry.event_names - DISTINCT_COUNT_EVENTS
base = @account.reporting_events.where(created_at: start_utc...end_utc)
DIMENSIONS.each do |dimension|
aggregate_standard_events(aggregates, base.where(name: standard_names), dimension)
aggregate_distinct_events(aggregates, base.where(name: DISTINCT_COUNT_EVENTS), dimension)
end
aggregates
end
def aggregate_standard_events(aggregates, scope, dimension)
group_cols, selects = dimension_groups_and_selects(dimension)
scope.group(*group_cols).pluck(*selects).each do |row|
event_name, dimension_id, count, sum_value, sum_value_business_hours = unpack_row(row, dimension)
next if dimension_id.nil?
accumulate_metrics(aggregates, dimension[:type], dimension_id, event_name,
{ count: count, sum_value: sum_value, sum_value_business_hours: sum_value_business_hours })
end
end
def accumulate_metrics(aggregates, dimension_type, dimension_id, event_name, values)
ReportingEvents::EventMetricRegistry.metrics_for_aggregate(event_name, **values).each do |metric, metric_data|
key = [dimension_type, dimension_id, metric]
aggregates[key][:count] += metric_data[:count]
aggregates[key][:sum_value] += metric_data[:sum_value].to_f
aggregates[key][:sum_value_business_hours] += metric_data[:sum_value_business_hours].to_f
end
end
def aggregate_distinct_events(aggregates, scope, dimension)
return if DISTINCT_COUNT_EVENTS.empty?
group_cols = dimension[:group_column] ? [:name, dimension[:group_column]] : [:name]
scope.group(*group_cols).pluck(*group_cols, DISTINCT_COUNT_SQL).each do |row|
event_name, dimension_id, count = dimension[:group_column] ? row : [row[0], @account.id, row[1]]
next if dimension_id.nil?
accumulate_metrics(aggregates, dimension[:type], dimension_id, event_name,
{ count: count, sum_value: 0, sum_value_business_hours: 0 })
end
end
def dimension_groups_and_selects(dimension)
agg_selects = [Arel.sql('COUNT(*)'), Arel.sql('COALESCE(SUM(value), 0)'), Arel.sql('COALESCE(SUM(value_in_business_hours), 0)')]
if dimension[:group_column]
[[:name, dimension[:group_column]], [:name, dimension[:group_column], *agg_selects]]
else
[[:name], [:name, *agg_selects]]
end
end
def unpack_row(row, dimension)
if dimension[:group_column]
# [name, dimension_id, count, sum_value, sum_value_business_hours]
row
else
# [name, count, sum_value, sum_value_business_hours] → inject account id
[row[0], @account.id, row[1], row[2], row[3]]
end
end
def bulk_insert_rollups(rollup_rows)
# rubocop:disable Rails/SkipsModelValidations
ReportingEventsRollup.insert_all(rollup_rows)
# rubocop:enable Rails/SkipsModelValidations
end
end
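The non-additivity that motivates `DISTINCT_COUNT_EVENTS` being queried per-dimension can be demonstrated with plain Ruby over illustrative data (not from this PR):

```ruby
# Distinct counts are not additive across groups: the same conversation
# can appear under multiple inboxes, so summing per-inbox DISTINCT counts
# over-counts the account-level DISTINCT count.
events = [
  { conversation_id: 1, inbox_id: 10 },
  { conversation_id: 1, inbox_id: 20 }, # same conversation, different inbox
  { conversation_id: 2, inbox_id: 10 }
]

per_inbox = events.group_by { |e| e[:inbox_id] }
                  .transform_values { |rows| rows.map { |e| e[:conversation_id] }.uniq.size }
# => { 10 => 2, 20 => 1 }

summed  = per_inbox.values.sum                             # 3
account = events.map { |e| e[:conversation_id] }.uniq.size # 2
# summed != account, so each dimension needs its own DISTINCT query.
```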


@@ -0,0 +1,79 @@
module ReportingEvents::EventMetricRegistry
# Describes one rollup metric emitted by a raw reporting event.
# rollup_metric: metric name stored in reporting_events_rollups.
# payload_kind: whether the emitted row carries only a count or a duration payload.
Metric = Data.define(:rollup_metric, :payload_kind)
EVENTS = {
conversation_resolved: [
Metric.new(rollup_metric: :resolutions_count, payload_kind: :count),
Metric.new(rollup_metric: :resolution_time, payload_kind: :duration)
].freeze,
first_response: [
Metric.new(rollup_metric: :first_response, payload_kind: :duration)
].freeze,
reply_time: [
Metric.new(rollup_metric: :reply_time, payload_kind: :duration)
].freeze,
conversation_bot_resolved: [
Metric.new(rollup_metric: :bot_resolutions_count, payload_kind: :count)
].freeze,
conversation_bot_handoff: [
Metric.new(rollup_metric: :bot_handoffs_count, payload_kind: :count)
].freeze
}.freeze
module_function
def event_names
EVENTS.keys.map(&:to_s)
end
def metrics_for(event)
return {} if event.blank?
metrics_for_aggregate(
event.name,
count: 1,
sum_value: event.try(:value),
sum_value_business_hours: event.try(:value_in_business_hours)
)
end
def metrics_for_aggregate(event_name, count:, sum_value:, sum_value_business_hours:)
return {} if event_name.blank?
values = {
count: count.to_i,
sum_value: sum_value.to_f,
sum_value_business_hours: sum_value_business_hours.to_f
}
EVENTS.fetch(event_name.to_sym, []).to_h do |metric|
[metric.rollup_metric, metric_values(metric.payload_kind, values)]
end
end
private_class_method def metric_values(payload_kind, values)
case payload_kind
when :count
count_values(values[:count])
when :duration
duration_values(values)
else
raise ArgumentError, "Unknown metric payload kind: #{payload_kind.inspect}"
end
end
private_class_method def count_values(count)
{ count: count, sum_value: 0, sum_value_business_hours: 0 }
end
private_class_method def duration_values(values)
{
count: values[:count],
sum_value: values[:sum_value],
sum_value_business_hours: values[:sum_value_business_hours]
}
end
end


@@ -0,0 +1,107 @@
# Raw reporting events and rollup rows do not share a single metric namespace; this registry keeps write and read paths aligned.
# TODO: Split this into separate registries for raw event mappings and report metric definitions.
module ReportingEvents::MetricRegistry
# Maps report summary response keys to the metric definitions they read from.
SUMMARY_METRICS = {
resolutions_count: :resolved_conversations_count,
avg_resolution_time: :avg_resolution_time,
avg_first_response_time: :avg_first_response_time,
reply_time: :avg_reply_time
}.freeze
# Expands each raw reporting event into the rollup metric payloads persisted for aggregation.
EVENT_METRICS = {
'conversation_resolved' => lambda do |values|
{
resolutions_count: count_metric(values[:count]),
resolution_time: duration_metric(values)
}
end,
'first_response' => ->(values) { { first_response: duration_metric(values) } },
'reply_time' => ->(values) { { reply_time: duration_metric(values) } },
'conversation_bot_resolved' => ->(values) { { bot_resolutions_count: count_metric(values[:count]) } },
'conversation_bot_handoff' => ->(values) { { bot_handoffs_count: count_metric(values[:count]) } }
}.freeze
# Describes which report metrics are supported and how each one is sourced and aggregated.
REPORT_METRICS = {
conversations_count: { aggregate: :count }.freeze,
incoming_messages_count: { aggregate: :count }.freeze,
outgoing_messages_count: { aggregate: :count }.freeze,
avg_first_response_time: { raw_event_name: :first_response, rollup_metric: :first_response, aggregate: :average }.freeze,
avg_resolution_time: { raw_event_name: :conversation_resolved, rollup_metric: :resolution_time, aggregate: :average }.freeze,
reply_time: { raw_event_name: :reply_time, rollup_metric: :reply_time, aggregate: :average }.freeze,
resolutions_count: { raw_event_name: :conversation_resolved, rollup_metric: :resolutions_count, aggregate: :count }.freeze,
bot_resolutions_count: { raw_event_name: :conversation_bot_resolved, rollup_metric: :bot_resolutions_count, aggregate: :count }.freeze,
bot_handoffs_count: { raw_event_name: :conversation_bot_handoff, rollup_metric: :bot_handoffs_count, aggregate: :count,
raw_count_strategy: :distinct_conversation }.freeze
}.freeze
module_function
def event_metrics_for(event)
return {} if event.blank?
return {} unless EVENT_METRICS.key?(event.name.to_s)
event_metrics_for_aggregate(
event.name,
count: 1,
sum_value: event.try(:value),
sum_value_business_hours: event.try(:value_in_business_hours)
)
end
def event_metrics_for_aggregate(event_name, count:, sum_value:, sum_value_business_hours:)
values = {
count: count.to_i,
sum_value: sum_value.to_f,
sum_value_business_hours: sum_value_business_hours.to_f
}
EVENT_METRICS[event_name.to_s]&.call(values) || {}
end
def report_metric(metric)
return if metric.blank?
REPORT_METRICS[metric.to_sym]
end
def supported_metric?(metric)
report_metric(metric).present?
end
def aggregate_for(metric)
report_metric(metric)&.dig(:aggregate)
end
def rollup_supported_metric?(metric)
rollup_metric_for(metric).present?
end
def rollup_metric_for(metric)
report_metric(metric)&.dig(:rollup_metric)
end
def raw_event_name_for(metric)
report_metric(metric)&.dig(:raw_event_name)
end
def summary_metrics
SUMMARY_METRICS.map do |metric_name, summary_key|
report_metric(metric_name).merge(metric_name: metric_name, summary_key: summary_key)
end
end
private_class_method def count_metric(count)
{ count: count, sum_value: 0, sum_value_business_hours: 0 }
end
private_class_method def duration_metric(values)
{
count: values[:count],
sum_value: values[:sum_value],
sum_value_business_hours: values[:sum_value_business_hours]
}
end
end


@@ -0,0 +1,81 @@
class ReportingEvents::RollupService
def self.perform(reporting_event)
new(reporting_event).perform
end
def initialize(reporting_event)
@reporting_event = reporting_event
@account = reporting_event.account
end
def perform
return unless rollup_enabled?
rows = build_rollup_rows
return if rows.empty?
upsert_rollups(rows)
end
private
# NOTE: This is intentionally not gated by the reporting_events_rollup feature flag.
# Rollup data is collected for all accounts with a valid reporting timezone (soft toggle).
# The feature flag only controls the read path — whether reports query rollups or raw events.
def rollup_enabled?
@account.reporting_timezone.present? && ActiveSupport::TimeZone[@account.reporting_timezone].present?
end
def event_date
@event_date ||= @reporting_event.created_at.in_time_zone(@account.reporting_timezone).to_date
end
def dimensions
{
account: @account.id,
agent: @reporting_event.user_id,
inbox: @reporting_event.inbox_id
}
end
def build_rollup_rows
event_metrics = ReportingEvents::EventMetricRegistry.metrics_for(@reporting_event)
dimensions.each_with_object([]) do |(dimension_type, dimension_id), rows|
next if dimension_id.nil?
event_metrics.each do |metric, metric_data|
rows << rollup_attributes(dimension_type, dimension_id, metric, metric_data)
end
end
end
def upsert_rollups(rows)
# rubocop:disable Rails/SkipsModelValidations
ReportingEventsRollup.upsert_all(
rows,
unique_by: [:account_id, :date, :dimension_type, :dimension_id, :metric],
on_duplicate: upsert_on_duplicate_sql
)
# rubocop:enable Rails/SkipsModelValidations
end
def rollup_attributes(dimension_type, dimension_id, metric, metric_data)
{
account_id: @account.id, date: event_date,
dimension_type: dimension_type, dimension_id: dimension_id, metric: metric,
count: metric_data[:count], sum_value: metric_data[:sum_value].to_f,
sum_value_business_hours: metric_data[:sum_value_business_hours].to_f,
created_at: Time.current, updated_at: Time.current
}
end
def upsert_on_duplicate_sql
Arel.sql(
'count = reporting_events_rollups.count + EXCLUDED.count, ' \
'sum_value = reporting_events_rollups.sum_value + EXCLUDED.sum_value, ' \
'sum_value_business_hours = reporting_events_rollups.sum_value_business_hours + EXCLUDED.sum_value_business_hours, ' \
'updated_at = EXCLUDED.updated_at'
)
end
end
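The date bucketing in `event_date` — an event's UTC timestamp maps to a calendar day in the account timezone — can be sanity-checked with stdlib `Time` and a fixed offset (sketch only; the service uses `ActiveSupport::TimeZone`, which also handles DST transitions):

```ruby
require 'date'

# A calendar day in a +05:30 zone spans a half-open UTC interval
# starting 5h30m before UTC midnight.
offset = '+05:30'
date = Date.new(2026, 3, 19)

start_utc = Time.new(date.year, date.month, date.day, 0, 0, 0, offset).utc
next_day  = date + 1
end_utc   = Time.new(next_day.year, next_day.month, next_day.day, 0, 0, 0, offset).utc
# start_utc is 2026-03-18 18:30:00 UTC; the interval is exactly 24h wide.
```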


@@ -48,6 +48,9 @@ en:
inbox_deletetion_response: Your inbox deletion request will be processed in some time.
errors:
account:
reporting_timezone:
invalid: is not a valid timezone
validations:
presence: must not be blank
webhook:


@@ -0,0 +1,37 @@
class CreateReportingEventsRollup < ActiveRecord::Migration[7.1]
def change
create_rollups_table
add_rollups_indexes
end
private
def create_rollups_table
create_table :reporting_events_rollups do |t|
t.integer :account_id, null: false
t.date :date, null: false
t.string :dimension_type, null: false
t.bigint :dimension_id, null: false
t.string :metric, null: false
t.bigint :count, default: 0, null: false
t.float :sum_value, default: 0.0, null: false
t.float :sum_value_business_hours, default: 0.0, null: false
t.timestamps
end
end
def add_rollups_indexes
add_index :reporting_events_rollups,
[:account_id, :date, :dimension_type, :dimension_id, :metric],
unique: true, name: 'index_rollup_unique_key'
add_index :reporting_events_rollups,
[:account_id, :metric, :date],
name: 'index_rollup_timeseries'
add_index :reporting_events_rollups,
[:account_id, :dimension_type, :date],
name: 'index_rollup_summary'
end
end


@@ -1124,6 +1124,22 @@ ActiveRecord::Schema[7.1].define(version: 2026_02_26_153427) do
t.index ["user_id"], name: "index_reporting_events_on_user_id"
end
create_table "reporting_events_rollups", force: :cascade do |t|
t.integer "account_id", null: false
t.date "date", null: false
t.string "dimension_type", null: false
t.bigint "dimension_id", null: false
t.string "metric", null: false
t.bigint "count", default: 0, null: false
t.float "sum_value", default: 0.0, null: false
t.float "sum_value_business_hours", default: 0.0, null: false
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
t.index ["account_id", "date", "dimension_type", "dimension_id", "metric"], name: "index_rollup_unique_key", unique: true
t.index ["account_id", "dimension_type", "date"], name: "index_rollup_summary"
t.index ["account_id", "metric", "date"], name: "index_rollup_timeseries"
end
create_table "sla_events", force: :cascade do |t|
t.bigint "applied_sla_id", null: false
t.bigint "conversation_id", null: false


@@ -0,0 +1,268 @@
# frozen_string_literal: true
namespace :reporting_events_rollup do
desc 'Backfill rollup table from historical reporting events'
task backfill: :environment do
ReportingEventsRollupBackfill.new.run
end
end
class ReportingEventsRollupBackfill # rubocop:disable Metrics/ClassLength
def run
print_header
account = prompt_account
timezone = resolve_timezone(account)
first_event, last_event = discover_events(account)
start_date, end_date, total_days = resolve_date_range(account, timezone, first_event, last_event)
dry_run = prompt_dry_run?
print_plan(account, timezone, start_date, end_date, total_days, first_event, last_event, dry_run)
return if dry_run
confirm_and_execute(account, start_date, end_date, total_days)
end
private
def print_header
puts ''
puts color('=' * 70, :cyan)
puts color('Reporting Events Rollup Backfill', :bold, :cyan)
puts color('=' * 70, :cyan)
puts color('Plan:', :bold, :yellow)
puts '1. Ensure account.reporting_timezone is set before running this task.'
puts '2. Wait for the current day to end in that account timezone.'
puts '3. Run backfill for closed days only (today is skipped by default).'
puts '4. Verify parity, then enable reporting_events_rollup read path.'
puts ''
puts color('Note:', :bold, :yellow)
puts '- This task always uses account.reporting_timezone.'
puts '- Default range is first event day -> yesterday (in account timezone).'
puts ''
end
def prompt_account
print 'Enter Account ID: '
account_id = $stdin.gets.chomp
abort color('Error: Account ID is required', :red, :bold) if account_id.blank?
account = Account.find_by(id: account_id)
abort color("Error: Account with ID #{account_id} not found", :red, :bold) unless account
puts color("Found account: #{account.name}", :gray)
puts ''
account
end
def resolve_timezone(account)
timezone = account.reporting_timezone
abort color("Error: Account #{account.id} must have reporting_timezone set", :red, :bold) if timezone.blank?
abort color("Error: Account #{account.id} has invalid reporting_timezone '#{timezone}'", :red, :bold) if ActiveSupport::TimeZone[timezone].blank?
puts color("Using account reporting timezone: #{timezone}", :gray)
puts ''
timezone
end
def discover_events(account)
first_event = account.reporting_events.order(:created_at).first
last_event = account.reporting_events.order(:created_at).last
if first_event.nil?
puts ''
puts "No reporting events found for account #{account.id}"
puts 'Nothing to backfill.'
exit(0)
end
[first_event, last_event]
end
def resolve_date_range(account, timezone, first_event, last_event)
dates = discovered_dates(timezone, first_event, last_event)
print_discovered_date_range(account, dates)
build_date_range(dates)
end
def prompt_dry_run?
print 'Dry run? (y/N): '
input = $stdin.gets.chomp.downcase
puts ''
%w[y yes].include?(input)
end
# rubocop:disable Metrics/ParameterLists
def print_plan(account, timezone, start_date, end_date, total_days, first_event, last_event, dry_run)
zone = ActiveSupport::TimeZone[timezone]
print_plan_summary(account, timezone, start_date, end_date, total_days, zone, first_event, last_event, dry_run)
return unless dry_run
puts color("DRY RUN MODE: Would process #{total_days} days", :yellow, :bold)
puts "Would use account reporting_timezone '#{timezone}'"
puts 'Run without dry run to execute backfill'
end
# rubocop:enable Metrics/ParameterLists
def print_plan_summary(account, timezone, start_date, end_date, total_days, zone, first_event, last_event, dry_run) # rubocop:disable Metrics/ParameterLists
puts color('=' * 70, :cyan)
puts color('Backfill Plan Summary', :bold, :cyan)
puts color('=' * 70, :cyan)
puts "Account: #{account.name} (ID: #{account.id})"
puts "Timezone: #{timezone}"
puts "Date Range: #{start_date} to #{end_date} (#{total_days} days)"
puts "First Event: #{format_event_time(first_event, zone)}"
puts "Last Event: #{format_event_time(last_event, zone)}"
puts "Dry Run: #{dry_run ? 'YES (no data will be written)' : 'NO'}"
puts color('=' * 70, :cyan)
puts ''
end
def format_event_time(event, zone)
event.created_at.in_time_zone(zone).strftime('%Y-%m-%d %H:%M:%S %Z')
end
def discovered_dates(timezone, first_event, last_event)
tz = ActiveSupport::TimeZone[timezone]
discovered_start = first_event.created_at.in_time_zone(tz).to_date
discovered_end = last_event.created_at.in_time_zone(tz).to_date
{
discovered_start: discovered_start,
discovered_end: discovered_end,
discovered_days: (discovered_end - discovered_start).to_i + 1,
default_end: [discovered_end, Time.current.in_time_zone(tz).to_date - 1.day].min
}
end
def print_discovered_date_range(account, dates)
message = "Discovered date range: #{dates[:discovered_start]} to #{dates[:discovered_end]} " \
"(#{dates[:discovered_days]} days) [Account: #{account.name}]"
puts color(message, :gray)
puts color("Default end date (excluding today): #{dates[:default_end]}", :gray)
puts ''
end
def build_date_range(dates)
start_date = dates[:discovered_start]
end_date = dates[:default_end]
total_days = (end_date - start_date).to_i + 1
abort_no_closed_days if total_days <= 0
[start_date, end_date, total_days]
end
def abort_no_closed_days
puts 'No closed days available to backfill in the default range.'
exit(0)
end
def confirm_and_execute(account, start_date, end_date, total_days)
if total_days > 730
puts color("WARNING: Large backfill detected (#{total_days} days / #{(total_days / 365.0).round(1)} years)", :yellow, :bold)
puts ''
end
print 'Proceed with backfill? (y/N): '
confirm = $stdin.gets.chomp.downcase
abort 'Backfill cancelled' unless %w[y yes].include?(confirm)
puts ''
execute_backfill(account, start_date, end_date, total_days)
end
def execute_backfill(account, start_date, end_date, total_days)
puts 'Processing dates...'
puts ''
start_time = Time.current
days_processed = 0
(start_date..end_date).each do |date|
ReportingEvents::BackfillService.backfill_date(account, date)
days_processed += 1
percentage = (days_processed.to_f / total_days * 100).round(1)
print "\r#{date} | #{days_processed}/#{total_days} days | #{percentage}% "
$stdout.flush
end
print_success(account, days_processed, total_days, Time.current - start_time)
rescue StandardError => e
print_failure(e, days_processed, total_days)
else
prompt_enable_rollup_read_path(account)
end
def print_success(account, days_processed, _total_days, elapsed_time)
puts "\n\n"
puts color('=' * 70, :green)
puts color('BACKFILL COMPLETE', :bold, :green)
puts color('=' * 70, :green)
puts "Total Days Processed: #{days_processed}"
puts "Total Time: #{elapsed_time.round(2)} seconds"
puts "Average per Day: #{(elapsed_time / days_processed).round(3)} seconds"
puts ''
puts 'Next steps:'
puts '1. Verify parity before enabling the reporting_events_rollup read path.'
puts '2. Verify rollups in database:'
puts " ReportingEventsRollup.where(account_id: #{account.id}).count"
puts '3. Test reports to compare rollup vs raw performance'
puts color('=' * 70, :green)
end
def prompt_enable_rollup_read_path(account)
if account.feature_enabled?(:report_rollup)
puts color('report_rollup is already enabled for this account.', :yellow, :bold)
return
end
print 'Enable report_rollup read path now? Only do this after parity verification. (y/N): '
confirm = $stdin.gets.to_s.chomp.downcase
puts ''
return unless %w[y yes].include?(confirm)
account.enable_features!('report_rollup')
puts color("Enabled report_rollup for account #{account.id}", :green, :bold)
end
def print_failure(error, days_processed, total_days)
puts "\n\n"
puts color('=' * 70, :red)
puts color('BACKFILL FAILED', :bold, :red)
puts color('=' * 70, :red)
print_error_details(error)
print_progress(days_processed, total_days)
exit(1)
end
def print_error_details(error)
puts color("Error: #{error.class.name} - #{error.message}", :red, :bold)
puts ''
puts 'Stack trace:'
puts error.backtrace.first(10).map { |line| " #{line}" }.join("\n")
puts ''
end
def print_progress(days_processed, total_days)
percentage = (days_processed.to_f / total_days * 100).round(1)
puts "Processed: #{days_processed}/#{total_days} days (#{percentage}%)"
puts color('=' * 70, :red)
end
ANSI_COLORS = {
reset: "\e[0m",
bold: "\e[1m",
red: "\e[31m",
green: "\e[32m",
yellow: "\e[33m",
cyan: "\e[36m",
gray: "\e[90m"
}.freeze
def color(text, *styles)
return text unless $stdout.tty?
codes = styles.filter_map { |style| ANSI_COLORS[style] }.join
"#{codes}#{text}#{ANSI_COLORS[:reset]}"
end
end


@@ -0,0 +1,196 @@
# frozen_string_literal: true
namespace :reporting_events_rollup do
desc 'Interactively set account.reporting_timezone and show recommended backfill run times'
task set_timezone: :environment do
ReportingEventsRollupTimezoneSetup.new.run
end
end
class ReportingEventsRollupTimezoneSetup
def run
print_header
account = prompt_account
print_current_timezone(account)
timezone = prompt_timezone
confirm_and_update(account, timezone)
print_next_steps(account, timezone)
end
private
def print_header
puts ''
puts color('=' * 70, :cyan)
puts color('Reporting Events Rollup Timezone Setup', :bold, :cyan)
puts color('=' * 70, :cyan)
puts color('Help:', :bold, :yellow)
puts '1. This task writes a valid account.reporting_timezone.'
puts '2. Backfill uses this timezone and skips today by default.'
puts '3. Run backfill only after the account timezone day closes.'
puts ''
end
def prompt_account
print 'Enter Account ID: '
account_id = $stdin.gets.chomp
abort color('Error: Account ID is required', :red, :bold) if account_id.blank?
account = Account.find_by(id: account_id)
abort color("Error: Account with ID #{account_id} not found", :red, :bold) unless account
puts color("Found account: #{account.name}", :gray)
puts ''
account
end
def print_current_timezone(account)
current_timezone = account.reporting_timezone.presence || '(not set)'
puts color("Current reporting_timezone: #{current_timezone}", :gray)
puts ''
end
def prompt_timezone
loop do
print 'Enter UTC offset to pick timezone (e.g., +5:30, -8, 0): '
offset_input = $stdin.gets.chomp
abort color('Error: UTC offset is required', :red, :bold) if offset_input.blank?
matching_zones = find_matching_zones(offset_input)
abort color("Error: No timezones found for offset '#{offset_input}'", :red, :bold) if matching_zones.empty?
display_matching_zones(matching_zones, offset_input)
timezone = select_timezone(matching_zones)
return timezone if timezone.present?
end
end
def find_matching_zones(offset_input)
total_seconds = utc_offset_in_seconds(offset_input)
return [] unless total_seconds
ActiveSupport::TimeZone.all.select { |tz| tz.utc_offset == total_seconds }
end
def utc_offset_in_seconds(offset_input)
normalized = offset_input.strip
return unless normalized.match?(/\A[+-]?\d{1,2}(:\d{2})?\z/)
sign = normalized.start_with?('-') ? -1 : 1
raw = normalized.delete_prefix('+').delete_prefix('-')
hours_part, minutes_part = raw.split(':', 2)
hours = Integer(hours_part, 10)
minutes = Integer(minutes_part || '0', 10)
return unless minutes.between?(0, 59)
total_minutes = (hours * 60) + minutes
return if total_minutes > max_utc_offset_minutes(sign)
sign * total_minutes * 60
rescue ArgumentError
nil
end
def max_utc_offset_minutes(sign)
sign.negative? ? 12 * 60 : 14 * 60
end
def display_matching_zones(zones, offset_input)
puts ''
puts color("Timezones matching UTC#{offset_input}:", :yellow, :bold)
puts ''
zones.each_with_index do |tz, index|
puts " #{index + 1}. #{tz.name} (#{tz.tzinfo.identifier})"
end
puts ' 0. Re-enter UTC offset'
puts ''
end
def select_timezone(zones)
print "Select timezone (1-#{zones.size}, 0 to go back): "
selection = $stdin.gets.chomp.to_i
return if selection.zero?
abort color('Error: Invalid selection', :red, :bold) if selection < 1 || selection > zones.size
timezone = zones[selection - 1].tzinfo.identifier
puts color("Selected timezone: #{timezone}", :gray)
puts ''
timezone
end
def confirm_and_update(account, timezone)
print "Update account #{account.id} reporting_timezone to '#{timezone}'? (y/N): "
confirm = $stdin.gets.chomp.downcase
abort 'Timezone setup cancelled' unless %w[y yes].include?(confirm)
account.update!(reporting_timezone: timezone)
puts ''
puts color("Updated reporting_timezone for account '#{account.name}' to '#{timezone}'", :green, :bold)
puts ''
end
def print_next_steps(account, timezone)
run_times = recommended_run_times(timezone)
print_next_steps_header
print_next_steps_schedule(timezone, run_times)
print_next_steps_backfill(account)
puts color('=' * 70, :green)
end
def print_next_steps_header
puts color('=' * 70, :green)
puts color('Next Steps', :bold, :green)
puts color('=' * 70, :green)
end
def print_next_steps_schedule(timezone, run_times)
puts "1. Wait for today's day-boundary to pass in #{timezone}."
puts '2. Recommended earliest backfill start time:'
puts " - #{timezone}: #{format_time(run_times[:account_tz])}"
puts " - UTC: #{format_time(run_times[:utc])}"
puts " - IST: #{format_time(run_times[:ist])}"
puts "    - Pacific Time (PT): #{format_time(run_times[:pct])}"
end
def print_next_steps_backfill(account)
puts '3. Run backfill:'
puts ' bundle exec rake reporting_events_rollup:backfill'
puts "4. Backfill will use account.reporting_timezone and skip today by default for account #{account.id}."
end
def recommended_run_times(timezone)
account_zone = ActiveSupport::TimeZone[timezone]
next_day = Time.current.in_time_zone(account_zone).to_date + 1.day
account_time = account_zone.parse(next_day.to_s) + 30.minutes
{
account_tz: account_time,
utc: account_time.in_time_zone('UTC'),
ist: account_time.in_time_zone('Asia/Kolkata'),
pct: account_time.in_time_zone('Pacific Time (US & Canada)')
}
end
def format_time(time)
time.strftime('%Y-%m-%d %H:%M:%S %Z')
end
ANSI_COLORS = {
reset: "\e[0m",
bold: "\e[1m",
red: "\e[31m",
green: "\e[32m",
yellow: "\e[33m",
cyan: "\e[36m",
gray: "\e[90m"
}.freeze
def color(text, *styles)
return text unless $stdout.tty?
codes = styles.filter_map { |style| ANSI_COLORS[style] }.join
"#{codes}#{text}#{ANSI_COLORS[:reset]}"
end
end

@@ -0,0 +1,12 @@
FactoryBot.define do
factory :reporting_events_rollup do
account
date { Date.current }
dimension_type { 'account' }
dimension_id { 1 }
metric { 'first_response' }
count { 1 }
sum_value { 100.0 }
sum_value_business_hours { 50.0 }
end
end

@@ -18,6 +18,23 @@ describe ReportingEventListener do
expect(account.reporting_events.where(name: 'conversation_resolved').count).to be 1
end
context 'when rollup creation fails' do
let(:event) { Events::Base.new('conversation.resolved', Time.zone.now, conversation: conversation) }
let(:error) { StandardError.new('rollup failed') }
let(:exception_tracker) { instance_double(ChatwootExceptionTracker, capture_exception: true) }
before do
allow(ReportingEvents::RollupService).to receive(:perform).and_raise(error)
allow(ChatwootExceptionTracker).to receive(:new).and_return(exception_tracker)
end
it 'captures the error without interrupting raw event creation' do
expect { listener.conversation_resolved(event) }.not_to raise_error
expect(ChatwootExceptionTracker).to have_received(:new).with(error, account: account)
expect(account.reporting_events.where(name: 'conversation_resolved').count).to be 1
end
end
context 'when business hours enabled for inbox' do
let(:created_at) { Time.zone.parse('March 20, 2022 00:00') }
let(:updated_at) { Time.zone.parse('March 26, 2022 23:59') }

@@ -255,6 +255,21 @@ RSpec.describe Account do
expect(described_class.with_auto_resolve.pluck(:id)).not_to include(account.id)
end
end
context 'when reporting_timezone is set' do
it 'allows valid timezone names' do
account.reporting_timezone = 'America/New_York'
expect(account).to be_valid
end
it 'rejects invalid timezone names' do
account.reporting_timezone = 'Invalid/Timezone'
expect(account).not_to be_valid
expect(account.errors[:reporting_timezone]).to include(I18n.t('errors.account.reporting_timezone.invalid'))
end
end
end
describe 'captain_preferences' do

@@ -0,0 +1,182 @@
require 'rails_helper'
RSpec.describe ReportingEventsRollup do
describe 'associations' do
it { is_expected.to belong_to(:account) }
end
describe 'enums' do
describe 'dimension_type enum' do
it 'stores dimension_type as string values' do
rollup = build(:reporting_events_rollup, dimension_type: 'account')
expect(rollup.dimension_type).to eq('account')
end
it 'has account dimension type' do
rollup = build(:reporting_events_rollup, dimension_type: 'account')
expect(rollup.account?).to be true
end
it 'has agent dimension type' do
rollup = build(:reporting_events_rollup, dimension_type: 'agent')
expect(rollup.agent?).to be true
end
it 'has inbox dimension type' do
rollup = build(:reporting_events_rollup, dimension_type: 'inbox')
expect(rollup.inbox?).to be true
end
it 'has team dimension type' do
rollup = build(:reporting_events_rollup, dimension_type: 'team')
expect(rollup.team?).to be true
end
end
describe 'metric enum' do
it 'stores metric as string values' do
rollup = build(:reporting_events_rollup, metric: 'first_response')
expect(rollup.metric).to eq('first_response')
end
it 'has resolutions_count metric' do
rollup = build(:reporting_events_rollup, metric: 'resolutions_count')
expect(rollup.resolutions_count?).to be true
end
it 'has first_response metric' do
rollup = build(:reporting_events_rollup, metric: 'first_response')
expect(rollup.first_response?).to be true
end
it 'has resolution_time metric' do
rollup = build(:reporting_events_rollup, metric: 'resolution_time')
expect(rollup.resolution_time?).to be true
end
it 'has reply_time metric' do
rollup = build(:reporting_events_rollup, metric: 'reply_time')
expect(rollup.reply_time?).to be true
end
it 'has bot_resolutions_count metric' do
rollup = build(:reporting_events_rollup, metric: 'bot_resolutions_count')
expect(rollup.bot_resolutions_count?).to be true
end
it 'has bot_handoffs_count metric' do
rollup = build(:reporting_events_rollup, metric: 'bot_handoffs_count')
expect(rollup.bot_handoffs_count?).to be true
end
end
end
describe 'validations' do
it { is_expected.to validate_presence_of(:account_id) }
it { is_expected.to validate_presence_of(:date) }
it { is_expected.to validate_presence_of(:dimension_type) }
it { is_expected.to validate_presence_of(:dimension_id) }
it { is_expected.to validate_presence_of(:metric) }
it { is_expected.to validate_numericality_of(:count).is_greater_than_or_equal_to(0) }
end
describe 'scopes' do
let(:account) { create(:account) }
let(:other_account) { create(:account) }
describe '.for_date_range' do
it 'filters by date range' do
create(:reporting_events_rollup, account: account, date: '2026-02-08'.to_date)
create(:reporting_events_rollup, account: account, date: '2026-02-10'.to_date)
create(:reporting_events_rollup, account: account, date: '2026-02-12'.to_date)
results = described_class.for_date_range('2026-02-10'.to_date, '2026-02-11'.to_date)
expect(results.count).to eq(1)
expect(results.first.date).to eq('2026-02-10'.to_date)
end
it 'returns empty array when no records match' do
create(:reporting_events_rollup, account: account, date: '2026-02-08'.to_date)
results = described_class.for_date_range('2026-02-20'.to_date, '2026-02-25'.to_date)
expect(results).to be_empty
end
end
describe '.for_dimension' do
it 'filters by dimension type and id' do
create(:reporting_events_rollup, account: account, dimension_type: 'account', dimension_id: 1)
create(:reporting_events_rollup, account: account, dimension_type: 'agent', dimension_id: 1)
create(:reporting_events_rollup, account: account, dimension_type: 'agent', dimension_id: 2)
results = described_class.for_dimension('agent', 1)
expect(results.count).to eq(1)
expect(results.first.dimension_type).to eq('agent')
expect(results.first.dimension_id).to eq(1)
end
it 'returns empty array when no records match' do
create(:reporting_events_rollup, account: account, dimension_type: 'account', dimension_id: 1)
results = described_class.for_dimension('agent', 1)
expect(results).to be_empty
end
end
describe '.for_metric' do
it 'filters by metric' do
create(:reporting_events_rollup, account: account, metric: 'first_response', dimension_id: 1)
create(:reporting_events_rollup, account: account, metric: 'resolution_time', dimension_id: 2)
create(:reporting_events_rollup, account: account, metric: 'first_response', dimension_id: 3)
results = described_class.for_metric('first_response')
expect(results.count).to eq(2)
expect(results.all? { |r| r.metric == 'first_response' }).to be true
end
it 'returns empty array when no records match' do
create(:reporting_events_rollup, account: account, metric: 'first_response')
results = described_class.for_metric('resolution_time')
expect(results).to be_empty
end
end
end
describe 'database schema' do
let(:account) { create(:account) }
let(:rollup) do
create(:reporting_events_rollup,
account: account,
date: '2026-02-10'.to_date,
dimension_type: 'account',
metric: 'first_response')
end
it 'has all required columns' do
expect(rollup).to have_attributes(
account_id: account.id,
date: '2026-02-10'.to_date,
dimension_type: 'account',
dimension_id: 1,
metric: 'first_response',
count: 1,
sum_value: 100.0,
sum_value_business_hours: 50.0
)
end
it 'stores enum values as strings in database' do
rollup
db_record = described_class.find(rollup.id)
expect(db_record.dimension_type).to eq('account')
expect(db_record.metric).to eq('first_response')
end
end
end

@@ -0,0 +1,207 @@
require 'rails_helper'
describe ReportingEvents::BackfillService do
describe '.backfill_date' do
let(:account) { create(:account, reporting_timezone: 'America/New_York') }
let(:date) { Date.new(2026, 2, 11) }
let(:user) { create(:user, account: account) }
let(:inbox) { create(:inbox, account: account) }
let(:conversation) { create(:conversation, account: account, inbox: inbox, assignee: user) }
it 'treats nil metric values as zero during backfill' do
reporting_event = create_backfill_event(
name: 'first_response', value: 100, value_in_business_hours: 50,
user: user, inbox: inbox, conversation: conversation, created_at: Time.utc(2026, 2, 11, 15)
)
# Simulate a legacy row that already exists in the database with nil metrics.
# rubocop:disable Rails/SkipsModelValidations
reporting_event.update_columns(value: nil, value_in_business_hours: nil)
# rubocop:enable Rails/SkipsModelValidations
expect { described_class.backfill_date(account, date) }.not_to raise_error
rollup = find_rollup('account', account.id, 'first_response')
expect(rollup.count).to eq(1)
expect(rollup.sum_value).to eq(0)
expect(rollup.sum_value_business_hours).to eq(0)
end
context 'when replacing rows fails atomically' do
before do
create(
:reporting_events_rollup,
account: account, date: date, dimension_type: 'account', dimension_id: account.id,
metric: 'first_response', count: 7, sum_value: 700, sum_value_business_hours: 350
)
end
it 'preserves existing rollups when building replacement rows fails' do
service = described_class.new(account, date)
allow(service).to receive(:build_rollup_rows).and_raise(StandardError, 'boom')
expect { service.perform }.to raise_error(StandardError, 'boom')
rollup = find_rollup('account', account.id, 'first_response')
expect(rollup.count).to eq(7)
expect(rollup.sum_value).to eq(700)
expect(rollup.sum_value_business_hours).to eq(350)
end
it 'preserves existing rollups when bulk insert fails' do
create_backfill_event(
name: 'first_response', value: 100, value_in_business_hours: 50,
user: user, inbox: inbox, conversation: conversation, created_at: Time.utc(2026, 2, 11, 15)
)
service = described_class.new(account, date)
allow(service).to receive(:bulk_insert_rollups).and_raise(StandardError, 'boom')
expect { service.perform }.to raise_error(StandardError, 'boom')
rollup = find_rollup('account', account.id, 'first_response')
expect(rollup.count).to eq(7)
expect(rollup.sum_value).to eq(700)
expect(rollup.sum_value_business_hours).to eq(350)
end
end
context 'when aggregating grouped rows' do
let(:second_user) { create(:user, account: account) }
let(:second_inbox) { create(:inbox, account: account) }
let(:second_conversation) { create(:conversation, account: account, inbox: second_inbox, assignee: second_user) }
before do
create_backfill_event(name: 'first_response', value: 100, value_in_business_hours: 60, user: user,
inbox: inbox, conversation: conversation, created_at: Time.utc(2026, 2, 11, 14))
create_backfill_event(name: 'first_response', value: 40, value_in_business_hours: 20, user: user,
inbox: inbox, conversation: conversation, created_at: Time.utc(2026, 2, 11, 15))
create_backfill_event(name: 'conversation_resolved', value: 200, value_in_business_hours: 80, user: second_user,
inbox: second_inbox, conversation: second_conversation, created_at: Time.utc(2026, 2, 11, 16))
create_backfill_event(name: 'reply_time', value: 500, value_in_business_hours: 300, user: user,
inbox: inbox, conversation: conversation, created_at: Time.utc(2026, 2, 12, 5))
described_class.backfill_date(account, date)
end
it 'does not instantiate reporting events' do
reporting_event_instantiations = count_reporting_event_instantiations do
described_class.backfill_date(account, date)
end
expect(reporting_event_instantiations).to eq(0)
end
it 'creates the expected number of rollup rows' do
rollups = ReportingEventsRollup.where(account_id: account.id, date: date)
# 3 dimensions × 3 metrics (first_response, resolutions_count, resolution_time) = 9 rows.
# The reply_time event at 2026-02-12 05:00 UTC is 2026-02-12 00:00 in America/New_York, so it falls outside the backfilled date.
expect(rollups.count).to eq(9)
end
it 'aggregates first_response at the account dimension' do
account_first_response = find_rollup('account', account.id, 'first_response')
expect(account_first_response.count).to eq(2)
expect(account_first_response.sum_value).to eq(140)
expect(account_first_response.sum_value_business_hours).to eq(80)
end
it 'aggregates first_response at the agent dimension' do
agent_first_response = find_rollup('agent', user.id, 'first_response')
expect(agent_first_response.count).to eq(2)
expect(agent_first_response.sum_value).to eq(140)
expect(agent_first_response.sum_value_business_hours).to eq(80)
end
it 'aggregates resolution_time at the agent dimension' do
agent_resolution_time = find_rollup('agent', second_user.id, 'resolution_time')
expect(agent_resolution_time.count).to eq(1)
expect(agent_resolution_time.sum_value).to eq(200)
expect(agent_resolution_time.sum_value_business_hours).to eq(80)
end
it 'aggregates first_response at the inbox dimension' do
inbox_first_response = find_rollup('inbox', inbox.id, 'first_response')
expect(inbox_first_response.count).to eq(2)
expect(inbox_first_response.sum_value).to eq(140)
expect(inbox_first_response.sum_value_business_hours).to eq(80)
end
it 'aggregates resolution_time at the inbox dimension' do
inbox_resolution_time = find_rollup('inbox', second_inbox.id, 'resolution_time')
expect(inbox_resolution_time.count).to eq(1)
expect(inbox_resolution_time.sum_value).to eq(200)
expect(inbox_resolution_time.sum_value_business_hours).to eq(80)
end
end
context 'when deduplicating distinct-count events' do
let(:second_user) { create(:user, account: account) }
let(:second_inbox) { create(:inbox, account: account) }
let(:conversation_b) { create(:conversation, account: account, inbox: inbox, assignee: user) }
let(:conversation_c) { create(:conversation, account: account, inbox: second_inbox, assignee: second_user) }
before do
# Two events for the same conversation — should count as 1
create_backfill_event(name: 'conversation_bot_handoff', value: 0, value_in_business_hours: 0, user: user,
inbox: inbox, conversation: conversation, created_at: Time.utc(2026, 2, 11, 14))
create_backfill_event(name: 'conversation_bot_handoff', value: 0, value_in_business_hours: 0, user: user,
inbox: inbox, conversation: conversation, created_at: Time.utc(2026, 2, 11, 15))
# Different conversation, same agent/inbox
create_backfill_event(name: 'conversation_bot_handoff', value: 0, value_in_business_hours: 0, user: user,
inbox: inbox, conversation: conversation_b, created_at: Time.utc(2026, 2, 11, 16))
# Different agent/inbox
create_backfill_event(name: 'conversation_bot_handoff', value: 0, value_in_business_hours: 0, user: second_user,
inbox: second_inbox, conversation: conversation_c, created_at: Time.utc(2026, 2, 11, 17))
described_class.backfill_date(account, date)
end
it 'creates the expected number of rollup rows' do
rollups = ReportingEventsRollup.where(account_id: account.id, date: date)
expect(rollups.count).to eq(5)
end
it 'counts 3 distinct conversations at the account dimension' do
expect(find_rollup('account', account.id, 'bot_handoffs_count').count).to eq(3)
end
it 'counts distinct conversations per agent' do
expect(find_rollup('agent', user.id, 'bot_handoffs_count').count).to eq(2)
expect(find_rollup('agent', second_user.id, 'bot_handoffs_count').count).to eq(1)
end
it 'counts distinct conversations per inbox' do
expect(find_rollup('inbox', inbox.id, 'bot_handoffs_count').count).to eq(2)
expect(find_rollup('inbox', second_inbox.id, 'bot_handoffs_count').count).to eq(1)
end
end
def create_backfill_event(**attributes)
create(
:reporting_event,
account: account,
**attributes
)
end
def find_rollup(dimension_type, dimension_id, metric)
ReportingEventsRollup.find_by!(
account_id: account.id,
date: date,
dimension_type: dimension_type,
dimension_id: dimension_id,
metric: metric
)
end
def count_reporting_event_instantiations(&)
instantiation_count = 0
subscriber = lambda do |_name, _start, _finish, _id, payload|
next unless payload[:class_name] == 'ReportingEvent'
instantiation_count += payload[:record_count]
end
ActiveSupport::Notifications.subscribed(subscriber, 'instantiation.active_record', &)
instantiation_count
end
end
end

@@ -0,0 +1,125 @@
require 'rails_helper'
RSpec.describe ReportingEvents::EventMetricRegistry do
describe '.event_names' do
it 'returns the supported raw event names' do
expect(described_class.event_names).to eq(
%w[
conversation_resolved
first_response
reply_time
conversation_bot_resolved
conversation_bot_handoff
]
)
end
end
describe '.metrics_for' do
it 'returns the emitted rollup metrics for conversation_resolved' do
event = instance_double(ReportingEvent, name: 'conversation_resolved', value: 120, value_in_business_hours: 45)
expect(described_class.metrics_for(event)).to eq(
resolutions_count: {
count: 1,
sum_value: 0,
sum_value_business_hours: 0
},
resolution_time: {
count: 1,
sum_value: 120.0,
sum_value_business_hours: 45.0
}
)
end
it 'returns the emitted rollup metrics for first_response' do
event = instance_double(ReportingEvent, name: 'first_response', value: 80, value_in_business_hours: 20)
expect(described_class.metrics_for(event)).to eq(
first_response: {
count: 1,
sum_value: 80.0,
sum_value_business_hours: 20.0
}
)
end
it 'returns the emitted rollup metrics for reply_time' do
event = instance_double(ReportingEvent, name: 'reply_time', value: 40, value_in_business_hours: 15)
expect(described_class.metrics_for(event)).to eq(
reply_time: {
count: 1,
sum_value: 40.0,
sum_value_business_hours: 15.0
}
)
end
it 'returns the emitted rollup metrics for conversation_bot_resolved' do
event = instance_double(ReportingEvent, name: 'conversation_bot_resolved')
expect(described_class.metrics_for(event)).to eq(
bot_resolutions_count: {
count: 1,
sum_value: 0,
sum_value_business_hours: 0
}
)
end
it 'returns the emitted rollup metrics for conversation_bot_handoff' do
event = instance_double(ReportingEvent, name: 'conversation_bot_handoff')
expect(described_class.metrics_for(event)).to eq(
bot_handoffs_count: {
count: 1,
sum_value: 0,
sum_value_business_hours: 0
}
)
end
it 'returns an empty hash for unsupported events' do
event = instance_double(ReportingEvent, name: 'conversation_created')
expect(described_class.metrics_for(event)).to eq({})
end
end
describe '.metrics_for_aggregate' do
it 'returns aggregated rollup metrics for conversation_resolved groups' do
expect(
described_class.metrics_for_aggregate(
'conversation_resolved',
count: 3,
sum_value: 420,
sum_value_business_hours: 210
)
).to eq(
resolutions_count: {
count: 3,
sum_value: 0,
sum_value_business_hours: 0
},
resolution_time: {
count: 3,
sum_value: 420.0,
sum_value_business_hours: 210.0
}
)
end
it 'returns an empty hash for unsupported grouped events' do
expect(
described_class.metrics_for_aggregate(
'conversation_created',
count: 2,
sum_value: 100,
sum_value_business_hours: 50
)
).to eq({})
end
end
end

@@ -0,0 +1,446 @@
require 'rails_helper'
describe ReportingEvents::RollupService do
describe '.perform' do
let(:account) { create(:account, reporting_timezone: 'America/New_York') }
let(:user) { create(:user, account: account) }
let(:inbox) { create(:inbox, account: account) }
let(:conversation) { create(:conversation, account: account, inbox: inbox, assignee: user) }
context 'when reporting_timezone is not set' do
before { account.update!(reporting_timezone: nil) }
it 'skips rollup creation' do
reporting_event = create(:reporting_event,
account: account,
name: 'conversation_resolved',
conversation: conversation)
expect do
described_class.perform(reporting_event)
end.not_to change(ReportingEventsRollup, :count)
end
end
context 'when reporting_timezone is invalid' do
it 'skips rollup creation' do
reporting_event = create(:reporting_event,
account: account,
name: 'conversation_resolved',
conversation: conversation)
allow(account).to receive(:reporting_timezone).and_return('Invalid/Zone')
expect do
described_class.perform(reporting_event)
end.not_to change(ReportingEventsRollup, :count)
end
end
context 'when reporting_timezone is set' do
describe 'conversation_resolved event' do
let(:rollup_event_created_at) { Time.utc(2026, 2, 12, 4, 0, 0) }
let(:rollup_write_time) { Time.utc(2026, 2, 12, 10, 0, 0) }
let(:reporting_event) do
create(:reporting_event,
account: account,
name: 'conversation_resolved',
value: 1000,
value_in_business_hours: 500,
user: user,
inbox: inbox,
conversation: conversation)
end
let(:expected_upsert_options) do
{
unique_by: %i[account_id date dimension_type dimension_id metric],
on_duplicate: 'count = reporting_events_rollups.count + EXCLUDED.count, ' \
'sum_value = reporting_events_rollups.sum_value + EXCLUDED.sum_value, ' \
'sum_value_business_hours = reporting_events_rollups.sum_value_business_hours + EXCLUDED.sum_value_business_hours, ' \
'updated_at = EXCLUDED.updated_at'
}
end
it 'creates rollup rows for account, agent, and inbox' do
described_class.perform(reporting_event)
# Account dimension
account_row = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
dimension_id: account.id,
metric: 'resolutions_count'
)
expect(account_row).to be_present
# Agent dimension
agent_row = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'agent',
dimension_id: user.id,
metric: 'resolutions_count'
)
expect(agent_row).to be_present
# Inbox dimension
inbox_row = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'inbox',
dimension_id: inbox.id,
metric: 'resolutions_count'
)
expect(inbox_row).to be_present
end
it 'creates correct metrics for conversation_resolved' do
described_class.perform(reporting_event)
resolutions_count = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
metric: 'resolutions_count'
)
expect(resolutions_count.count).to eq(1)
expect(resolutions_count.sum_value).to eq(0)
expect(resolutions_count.sum_value_business_hours).to eq(0)
resolution_time = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
metric: 'resolution_time'
)
expect(resolution_time.count).to eq(1)
expect(resolution_time.sum_value).to eq(1000)
expect(resolution_time.sum_value_business_hours).to eq(500)
end
it 'batches all rollup rows in a single upsert_all call' do
reporting_event.update!(created_at: rollup_event_created_at)
travel_to(rollup_write_time) do
captured_upsert = capture_upsert_all_call
described_class.perform(reporting_event)
expect(ReportingEventsRollup).to have_received(:upsert_all).once
expect(captured_upsert[:options][:unique_by]).to eq(expected_upsert_options[:unique_by])
expect(captured_upsert[:options][:on_duplicate].to_s.squish).to eq(expected_upsert_options[:on_duplicate])
expect(captured_upsert[:rows]).to match_array(expected_rollup_rows)
end
end
it 'respects account timezone for date bucketing' do
# Event created at 2026-02-11 22:00 UTC
# In EST (UTC-5) that's 2026-02-11 17:00 (same day)
reporting_event.update!(created_at: '2026-02-11 22:00:00 UTC')
described_class.perform(reporting_event)
rollup = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account'
)
expect(rollup.date).to eq('2026-02-11'.to_date)
end
it 'handles timezone boundary crossing' do
# Event created at 2026-02-12 04:00 UTC
# In EST (UTC-5) that's 2026-02-11 23:00 (previous day)
reporting_event.update!(created_at: '2026-02-12 04:00:00 UTC')
described_class.perform(reporting_event)
rollup = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account'
)
expect(rollup.date).to eq('2026-02-11'.to_date)
end
def capture_upsert_all_call
{}.tap do |captured_upsert|
allow(ReportingEventsRollup).to receive(:upsert_all) do |rows, **options|
captured_upsert[:rows] = rows
captured_upsert[:options] = options
end
end
end
def expected_rollup_rows
{
account: account.id,
agent: user.id,
inbox: inbox.id
}.flat_map do |dimension_type, dimension_id|
[
rollup_row(dimension_type, dimension_id, :resolutions_count, 0.0, 0.0),
rollup_row(dimension_type, dimension_id, :resolution_time, 1000.0, 500.0)
]
end
end
def rollup_row(dimension_type, dimension_id, metric, sum_value, sum_value_business_hours)
{
account_id: account.id,
date: Date.new(2026, 2, 11),
dimension_type: dimension_type,
dimension_id: dimension_id,
metric: metric,
count: 1,
sum_value: sum_value,
sum_value_business_hours: sum_value_business_hours,
created_at: rollup_write_time,
updated_at: rollup_write_time
}
end
end
describe 'first_response event' do
let(:reporting_event) do
create(:reporting_event,
account: account,
name: 'first_response',
value: 500,
value_in_business_hours: 300,
user: user,
inbox: inbox,
conversation: conversation)
end
it 'creates first_response metric' do
described_class.perform(reporting_event)
first_response = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
metric: 'first_response'
)
expect(first_response).to be_present
expect(first_response.count).to eq(1)
expect(first_response.sum_value).to eq(500)
expect(first_response.sum_value_business_hours).to eq(300)
end
end
describe 'reply_time event' do
let(:reporting_event) do
create(:reporting_event,
account: account,
name: 'reply_time',
value: 200,
value_in_business_hours: 100,
user: user,
inbox: inbox,
conversation: conversation)
end
it 'creates reply_time metric' do
described_class.perform(reporting_event)
reply_time = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
metric: 'reply_time'
)
expect(reply_time).to be_present
expect(reply_time.count).to eq(1)
expect(reply_time.sum_value).to eq(200)
expect(reply_time.sum_value_business_hours).to eq(100)
end
end
describe 'when metric values are nil' do
[
%w[conversation_resolved resolution_time],
%w[first_response first_response],
%w[reply_time reply_time]
].each do |event_name, metric|
it "treats nil values as zero for #{event_name}" do
reporting_event = create(:reporting_event,
account: account,
name: event_name,
value: 100,
value_in_business_hours: 50,
user: user,
inbox: inbox,
conversation: conversation)
reporting_event.assign_attributes(value: nil, value_in_business_hours: nil)
expect do
described_class.perform(reporting_event)
end.not_to raise_error
rollup = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
metric: metric
)
expect(rollup).to be_present
expect(rollup.count).to eq(1)
expect(rollup.sum_value).to eq(0)
expect(rollup.sum_value_business_hours).to eq(0)
end
end
end
describe 'conversation_bot_resolved event' do
let(:reporting_event) do
create(:reporting_event,
account: account,
name: 'conversation_bot_resolved',
user: user,
inbox: inbox,
conversation: conversation)
end
it 'creates bot_resolutions_count metric' do
described_class.perform(reporting_event)
bot_resolutions = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
metric: 'bot_resolutions_count'
)
expect(bot_resolutions).to be_present
expect(bot_resolutions.count).to eq(1)
expect(bot_resolutions.sum_value).to eq(0)
end
end
describe 'conversation_bot_handoff event' do
let(:reporting_event) do
create(:reporting_event,
account: account,
name: 'conversation_bot_handoff',
user: user,
inbox: inbox,
conversation: conversation)
end
it 'creates bot_handoffs_count metric' do
described_class.perform(reporting_event)
bot_handoffs = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
metric: 'bot_handoffs_count'
)
expect(bot_handoffs).to be_present
expect(bot_handoffs.count).to eq(1)
expect(bot_handoffs.sum_value).to eq(0)
end
end
describe 'dimension handling' do
let(:reporting_event) do
create(:reporting_event,
account: account,
name: 'conversation_resolved',
value: 1000,
value_in_business_hours: 500,
user: user,
inbox: inbox,
conversation: conversation)
end
it 'skips dimensions with nil IDs' do
# Create event without user (user_id will be nil)
reporting_event.update!(user_id: nil)
described_class.perform(reporting_event)
# Agent dimension should not be created
agent_rows = ReportingEventsRollup.where(
account_id: account.id,
dimension_type: 'agent'
)
expect(agent_rows).to be_empty
end
end
describe 'upsert behavior' do
let(:reporting_event) do
create(:reporting_event,
account: account,
name: 'conversation_resolved',
value: 1000,
value_in_business_hours: 500,
user: user,
inbox: inbox,
conversation: conversation)
end
it 'increments count and sums on duplicate entries' do
# First call
described_class.perform(reporting_event)
resolution_time = ReportingEventsRollup.find_by(
account_id: account.id,
dimension_type: 'account',
metric: 'resolution_time'
)
expect(resolution_time.count).to eq(1)
expect(resolution_time.sum_value).to eq(1000)
expect(resolution_time.sum_value_business_hours).to eq(500)
# Second call with same event
reporting_event2 = create(:reporting_event,
account: account,
name: 'conversation_resolved',
value: 500,
value_in_business_hours: 250,
user: user,
inbox: inbox,
conversation: conversation,
created_at: reporting_event.created_at)
described_class.perform(reporting_event2)
# Total should be incremented
resolution_time.reload
expect(resolution_time.count).to eq(2)
expect(resolution_time.sum_value).to eq(1500)
expect(resolution_time.sum_value_business_hours).to eq(750)
end
it 'does not create duplicate rollup rows' do
described_class.perform(reporting_event)
initial_count = ReportingEventsRollup.count
# Create another event with same date and dimensions
reporting_event2 = create(:reporting_event,
account: account,
name: 'conversation_resolved',
value: 500,
value_in_business_hours: 250,
user: user,
inbox: inbox,
conversation: conversation,
created_at: reporting_event.created_at)
described_class.perform(reporting_event2)
# Row count should remain the same
expect(ReportingEventsRollup.count).to eq(initial_count)
end
end
end
context 'when event name is unknown' do
let(:reporting_event) do
create(:reporting_event,
account: account,
name: 'unknown_event',
user: user,
inbox: inbox,
conversation: conversation)
end
it 'does not create any rollup rows' do
expect do
described_class.perform(reporting_event)
end.not_to change(ReportingEventsRollup, :count)
end
end
end
end