fix: Hardcoded 500 in AI API error response (#13005)

## Description

Fixes false New Relic alerts that were triggered because the AI API error response hardcoded a 500 error code regardless of the actual failure.

## Type of change


- [x] Bug fix (non-breaking change which fixes an issue)

## How Has This Been Tested?


Before:
<img width="776" height="666" alt="image"
src="https://github.com/user-attachments/assets/f086890d-eaf1-4e83-b383-fe3675b24159"
/>

The 500 was hardcoded. RubyLLM doesn't send any error codes, so I removed the `error_code` argument and now pass only the error message.
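A minimal sketch of the revised error-response builder (the wrapper class and usage below are illustrative only; the real method lives in `Integrations::LlmBaseService`):

```ruby
# Illustrative stand-in for Integrations::LlmBaseService after this PR.
class LlmErrorSketch
  def build_error_response_from_exception(error, messages)
    # No hardcoded error_code: 500 any more; only the message is passed on.
    { error: error.message, request_messages: messages }
  end
end

begin
  raise StandardError, 'API rate limit exceeded'
rescue StandardError => e
  response = LlmErrorSketch.new.build_error_response_from_exception(e, [{ role: 'user', content: 'hi' }])
  puts response.inspect
end
```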

Langfuse now receives only the error message:

<img width="883" height="700" alt="image"
src="https://github.com/user-attachments/assets/fc8c3907-b9a5-4c87-bfc6-8e05cfe9c8b0"
/>

Local logs now show only the error:
<img width="1434" height="200" alt="image"
src="https://github.com/user-attachments/assets/716c6371-78f0-47b8-88a4-03e4196c0e9a"
/>

A better follow-up fix would be to handle each error case individually and surface it to the user wherever necessary.
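One hypothetical shape for that follow-up: branch on the exception class instead of collapsing everything into a single message. The class names and actions below are illustrative only and not part of this PR:

```ruby
# Hypothetical per-case handling: map exception class names to actions.
class RateLimitError < StandardError; end
class AuthError < StandardError; end

HANDLERS = {
  'RateLimitError' => :show_retry_notice,
  'AuthError'      => :alert_admin
}.freeze

def handling_for(error)
  # Fall back to logging when the class is not explicitly handled.
  HANDLERS.fetch(error.class.name.split('::').last, :log_only)
end

puts handling_for(RateLimitError.new)
puts handling_for(StandardError.new)
```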

## Checklist:

- [x] My code follows the style guidelines of this project
- [x] I have performed a self-review of my code
- [x] I have commented on my code, particularly in hard-to-understand
areas
- [ ] I have made corresponding changes to the documentation
- [x] My changes generate no new warnings
- [x] I have added tests that prove my fix is effective or that my
feature works
- [x] New and existing unit tests pass locally with my changes
- [x] Any dependent changes have been merged and published in downstream
modules

---------

Co-authored-by: aakashb95 <aakash@chatwoot.com>
Commit 67dc21ea5f (parent efc3b5e7d4), authored by Aakash Bakhle on 2025-12-04 20:32:46 +05:30, committed via GitHub.
3 changed files with 6 additions and 10 deletions.


```diff
@@ -164,6 +164,6 @@ class Integrations::LlmBaseService
   end

   def build_error_response_from_exception(error, messages)
-    { error: error.message, error_code: 500, request_messages: messages }
+    { error: error.message, request_messages: messages }
   end
 end
```


```diff
@@ -32,10 +32,8 @@ module Integrations::LlmInstrumentationHelpers
     error = result[:error] || result['error']
     return if error.blank?

-    error_code = result[:error_code] || result['error_code']
     span.set_attribute(ATTR_GEN_AI_RESPONSE_ERROR, error.to_json)
-    span.set_attribute(ATTR_GEN_AI_RESPONSE_ERROR_CODE, error_code) if error_code
-    span.status = OpenTelemetry::Trace::Status.error("API Error: #{error_code}")
+    span.status = OpenTelemetry::Trace::Status.error(error.to_s.truncate(1000))
   end

   def set_metadata_attributes(span, params)
```
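The new span status caps the message with ActiveSupport's `String#truncate(1000)` so oversized errors stay bounded. A plain-Ruby stand-in showing the effect (assuming the default `'...'` omission):

```ruby
# Plain-Ruby equivalent of ActiveSupport's String#truncate, shown only to
# illustrate the `.truncate(1000)` call above.
def truncate_message(msg, limit = 1000, omission = '...')
  return msg if msg.length <= limit

  # Keep room for the omission marker so the result never exceeds `limit`.
  msg[0, limit - omission.length] + omission
end

puts truncate_message('x' * 2000).length
puts truncate_message('short error')
```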


```diff
@@ -200,17 +200,15 @@ RSpec.describe Integrations::LlmInstrumentation do
       result = instance.instrument_llm_call(params) do
         {
-          error: { message: 'API rate limit exceeded' },
-          error_code: 'rate_limit_exceeded'
+          error: 'API rate limit exceeded'
         }
       end

-      expect(result[:error_code]).to eq('rate_limit_exceeded')
+      expect(result[:error]).to eq('API rate limit exceeded')
       expect(mock_span).to have_received(:set_attribute)
-        .with('gen_ai.response.error', '{"message":"API rate limit exceeded"}')
-      expect(mock_span).to have_received(:set_attribute).with('gen_ai.response.error_code', 'rate_limit_exceeded')
+        .with('gen_ai.response.error', '"API rate limit exceeded"')
       expect(mock_span).to have_received(:status=).with(mock_status)
-      expect(OpenTelemetry::Trace::Status).to have_received(:error).with('API Error: rate_limit_exceeded')
+      expect(OpenTelemetry::Trace::Status).to have_received(:error).with('API rate limit exceeded')
     end
   end
```
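One subtlety in the updated expectation: the helper serializes the error with `to_json`, and a bare Ruby string gains surrounding double quotes when serialized, which is why the spec matches `'"API rate limit exceeded"'` rather than the raw string:

```ruby
require 'json'

# A bare string serializes to JSON with surrounding double quotes.
puts 'API rate limit exceeded'.to_json
# prints "API rate limit exceeded" (including the quotes)
```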