fix: search faqs in account language (#13428)

# Pull Request Template

## Description

Reply suggestions use `search_documentation`. While this is useful,
there is a subtle bug: a user's message may be in a different language
(say, Spanish) than the FAQs (English).
The query is then embedded in Spanish and compared against English
vectors, which results in poor retrieval and poor suggestions.


Fixes # (issue)
This PR fixes the above behaviour by making a small LLM call to
translate the query before searching in the `search_documentation` tool.
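In outline: detect the query's language, translate only when it differs from the account's FAQ language, then search. A minimal sketch with hypothetical helper names and a stubbed translation step (the real implementation is `Captain::Llm::TranslateQueryService` in the diff below):

```ruby
# Sketch of the translate-then-search flow. Names and the lookup-table
# "translation" are illustrative stand-ins, not the actual service.
def translate(query, to:)
  # Stand-in for the LLM call; the real code asks gpt-4.1-nano.
  { %w[hola en] => 'hello' }.fetch([query, to], query)
end

def search_faqs(query, account_language:, detected_language:, faqs:)
  # Translate only when the detected language differs from the FAQ language.
  query = translate(query, to: account_language) if detected_language != account_language
  faqs.select { |faq| faq.downcase.include?(query.downcase) }
end

search_faqs('hola', account_language: 'en', detected_language: 'es',
            faqs: ['hello world guide', 'billing faq'])
# => ["hello world guide"]
```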


## Type of change

- [x] Bug fix (non-breaking change which fixes an issue)

## How Has This Been Tested?


Before:
<img width="894" height="157" alt="image"
src="https://github.com/user-attachments/assets/83871ee5-511e-4432-8b99-39e803759f63"
/>

After:
<img width="1149" height="294" alt="image"
src="https://github.com/user-attachments/assets/f9617d7a-6d48-4ca1-ad1c-2181e16c1f3d"
/>


Test on Rails console:
<img width="2094" height="380" alt="image"
src="https://github.com/user-attachments/assets/159fdaa5-8808-49d2-be5d-304d69fa97f7"
/>


## Checklist:

- [x] My code follows the style guidelines of this project
- [x] I have performed a self-review of my code
- [x] I have commented on my code, particularly in hard-to-understand
areas
- [ ] I have made corresponding changes to the documentation
- [x] My changes generate no new warnings
- [x] I have added tests that prove my fix is effective or that my
feature works
- [x] New and existing unit tests pass locally with my changes
- [x] Any dependent changes have been merged and published in downstream
modules
This commit is contained in:
Aakash Bakhle
2026-02-09 17:25:11 +05:30
committed by GitHub
parent 67112647e8
commit bd732f1fa9
7 changed files with 80 additions and 16 deletions


@@ -197,6 +197,8 @@ gem 'ai-agents', '>= 0.7.0'
gem 'ruby_llm', '>= 1.8.2'
gem 'ruby_llm-schema'
gem 'cld3', '~> 3.7'
# OpenTelemetry for LLM observability
gem 'opentelemetry-sdk'
gem 'opentelemetry-exporter-otlp'


@@ -186,6 +186,7 @@ GEM
byebug (11.1.3)
childprocess (5.1.0)
logger (~> 1.5)
cld3 (3.7.0)
climate_control (1.2.0)
coderay (1.1.3)
commonmarker (0.23.10)
@@ -1037,6 +1038,7 @@ DEPENDENCIES
bullet
bundle-audit
byebug
cld3 (~> 3.7)
climate_control
commonmarker
csv-safe


@@ -0,0 +1,49 @@
class Captain::Llm::TranslateQueryService < Captain::BaseTaskService
  MODEL = 'gpt-4.1-nano'.freeze

  pattr_initialize [:account!]

  def translate(query, target_language:)
    return query if query_in_target_language?(query)

    messages = [
      { role: 'system', content: system_prompt(target_language) },
      { role: 'user', content: query }
    ]
    response = make_api_call(model: MODEL, messages: messages)
    return query if response[:error]

    response[:message].strip
  rescue StandardError => e
    Rails.logger.warn "TranslateQueryService failed: #{e.message}, falling back to original query"
    query
  end

  private

  def event_name
    'translate_query'
  end

  def query_in_target_language?(query)
    detector = CLD3::NNetLanguageIdentifier.new(0, 1000)
    result = detector.find_language(query)
    result.reliable? && result.language == account_language_code
  rescue StandardError
    false
  end

  def account_language_code
    account.locale&.split('_')&.first
  end

  def system_prompt(target_language)
    <<~SYSTEM_PROMPT_MESSAGE
      You are a helpful assistant that translates queries from one language to another.
      Translate the query to #{target_language}.
      Return just the translated query, no other text.
    SYSTEM_PROMPT_MESSAGE
  end
end
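For reference, `account_language_code` reduces a Rails locale string to the bare language code that the CLD3 comparison uses, nil-safely via the `&.` chain. A standalone sketch of that parsing:

```ruby
# Mirrors account_language_code: keep the part before the underscore,
# tolerating nil locales through safe navigation.
def account_language_code(locale)
  locale&.split('_')&.first
end

account_language_code('pt_BR') # => "pt"
account_language_code('en')    # => "en"
account_language_code(nil)     # => nil
```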


@@ -9,7 +9,11 @@ class Captain::Tools::SearchDocumentationService < Captain::Tools::BaseTool
def execute(query:)
Rails.logger.info { "#{self.class.name}: #{query}" }
responses = assistant.responses.approved.search(query)
translated_query = Captain::Llm::TranslateQueryService
.new(account: assistant.account)
.translate(query, target_language: assistant.account.locale_english_name)
responses = assistant.responses.approved.search(translated_query)
return 'No FAQs found for the given query' if responses.empty?


@@ -18,7 +18,11 @@ class Captain::Tools::SearchReplyDocumentationService < RubyLLM::Tool
def execute(query:)
Rails.logger.info { "#{self.class.name}: #{query}" }
responses = search_responses(query)
translated_query = Captain::Llm::TranslateQueryService
.new(account: @account)
.translate(query, target_language: @account.locale_english_name)
responses = search_responses(translated_query)
return 'No FAQs found for the given query' if responses.empty?
responses.map { |response| format_response(response) }.join
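One property worth noting: the `rescue` in `TranslateQueryService#translate` guarantees the tools above always receive some query back, so a failed LLM call degrades to the old untranslated behaviour instead of breaking search. A sketch of that failure mode, with a hypothetical `safe_translate` standing in for the service:

```ruby
# Any error in the translation step falls back to the original query,
# mirroring the rescue StandardError in TranslateQueryService#translate.
def safe_translate(query)
  yield(query)
rescue StandardError
  query
end

safe_translate('hola') { |_q| raise 'LLM unavailable' } # => "hola"
safe_translate('hola') { |_q| 'hello' }                 # => "hello"
```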


@@ -1,5 +1,6 @@
module Captain::ToolInstrumentation
extend ActiveSupport::Concern
include Integrations::LlmInstrumentationConstants
private
@@ -10,15 +11,10 @@ module Captain::ToolInstrumentation
response = nil
executed = false
tracer.in_span(params[:span_name]) do |span|
span.set_attribute('langfuse.user.id', params[:account_id].to_s) if params[:account_id]
span.set_attribute('langfuse.tags', [params[:feature_name]].to_json)
span.set_attribute('langfuse.observation.input', params[:messages].to_json)
set_tool_session_attributes(span, params)
response = yield
executed = true
# Output just the message for cleaner Langfuse display
span.set_attribute('langfuse.observation.output', response[:message] || response.to_json)
span.set_attribute(ATTR_LANGFUSE_OBSERVATION_OUTPUT, response[:message] || response.to_json)
end
response
rescue StandardError => e
@@ -26,17 +22,24 @@ module Captain::ToolInstrumentation
executed ? response : yield
end
def set_tool_session_attributes(span, params)
span.set_attribute(ATTR_LANGFUSE_USER_ID, params[:account_id].to_s) if params[:account_id]
span.set_attribute(ATTR_LANGFUSE_SESSION_ID, "#{params[:account_id]}_#{params[:conversation_id]}") if params[:conversation_id].present?
span.set_attribute(ATTR_LANGFUSE_TAGS, [params[:feature_name]].to_json)
span.set_attribute(ATTR_LANGFUSE_OBSERVATION_INPUT, params[:messages].to_json)
end
def record_generation(chat, message, model)
return unless ChatwootApp.otel_enabled?
return unless message.respond_to?(:role) && message.role.to_s == 'assistant'
tracer.in_span("llm.#{event_name}.generation") do |span|
span.set_attribute('gen_ai.system', 'openai')
span.set_attribute('gen_ai.request.model', model)
span.set_attribute('gen_ai.usage.input_tokens', message.input_tokens)
span.set_attribute('gen_ai.usage.output_tokens', message.output_tokens) if message.respond_to?(:output_tokens)
span.set_attribute('langfuse.observation.input', format_chat_messages(chat))
span.set_attribute('langfuse.observation.output', message.content.to_s) if message.respond_to?(:content)
span.set_attribute(ATTR_GEN_AI_PROVIDER, 'openai')
span.set_attribute(ATTR_GEN_AI_REQUEST_MODEL, model)
span.set_attribute(ATTR_GEN_AI_USAGE_INPUT_TOKENS, message.input_tokens)
span.set_attribute(ATTR_GEN_AI_USAGE_OUTPUT_TOKENS, message.output_tokens) if message.respond_to?(:output_tokens)
span.set_attribute(ATTR_LANGFUSE_OBSERVATION_INPUT, format_chat_messages(chat))
span.set_attribute(ATTR_LANGFUSE_OBSERVATION_OUTPUT, message.content.to_s) if message.respond_to?(:content)
end
rescue StandardError => e
Rails.logger.warn "Failed to record generation: #{e.message}"


@@ -33,7 +33,7 @@ General guidelines:
- Reply in the customer's language
{% if has_search_tool %}
**Important**: You have access to a `search_documentation` tool that can search the company's knowledge base for product details, policies, FAQs, and other information.
**Use the search_documentation tool first** to find relevant information before composing your reply. This ensures your response is accurate and based on actual company documentation.
{% endif %}