Migration Guide: https://chwt.app/v4/migration

This PR imports all the work related to Captain into the EE codebase. Captain represents the AI-based features in Chatwoot and includes the following key components:

- Assistant: An assistant has a persona and a product it is trained on. At the moment it is trained on data from websites; future integrations will cover Notion documents, PDFs, etc. This PR enables connecting an assistant to an inbox. The assistant will handle the conversation each time before transferring it to an agent.
- Copilot for Agents: While an agent is supporting a customer, Copilot can offer additional help, such as looking up data or fetching information from integrations.
- Conversation FAQ generator: When a conversation is resolved, the Captain integration identifies questions that were not in the knowledge base.
- CRM memory: Learns from conversations and identifies important information about the contact.

---------

Co-authored-by: Vishnu Narayanan <vishnu@chatwoot.com>
Co-authored-by: Sojan <sojan@pepalo.com>
Co-authored-by: iamsivin <iamsivin@gmail.com>
Co-authored-by: Sivin Varghese <64252451+iamsivin@users.noreply.github.com>
83 lines
2.4 KiB
Ruby
module Enterprise::Integrations::OpenaiProcessorService
  ALLOWED_EVENT_NAMES = %w[rephrase summarize reply_suggestion label_suggestion fix_spelling_grammar shorten expand
                           make_friendly make_formal simplify].freeze
  CACHEABLE_EVENTS = %w[label_suggestion].freeze

  def label_suggestion_message
    payload = label_suggestion_body
    return nil if payload.blank?

    response = make_api_call(payload)
    return response if response[:error].present?

    # LLMs are not deterministic, so this is a band-aid solution:
    # the response sometimes includes a "Labels:" prefix in some format,
    # and this strips it out.
    # TODO: fix this with a better prompt
    { message: response[:message] ? response[:message].gsub(/^(label|labels):/i, '') : '' }
  end

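The prefix-stripping `gsub` above is easy to verify in isolation. Below is a minimal standalone sketch; `clean_labels` is an illustrative helper name, not part of the service:

```ruby
# Demonstrates the same regex used in label_suggestion_message: it removes a
# leading "label:" or "labels:" (case-insensitive) at the start of any line,
# and falls back to an empty string when the message is nil.
def clean_labels(message)
  message ? message.gsub(/^(label|labels):/i, '') : ''
end

clean_labels('Labels: billing, refunds') # note: leading space after the prefix survives
clean_labels(nil)
clean_labels('no prefix here')
```

Note that any whitespace following the stripped prefix is preserved; the service does not trim it either.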
  private

  def labels_with_messages
    return nil unless valid_conversation?(conversation)

    labels = hook.account.labels.pluck(:title).join(', ')
    character_count = labels.length

    messages = init_messages_body(false)
    add_messages_until_token_limit(conversation, messages, false, character_count)

    return nil if messages.blank? || labels.blank?

    "Messages:\n#{messages}\nLabels:\n#{labels}"
  end

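The returned string is the user-message half of the prompt, so its exact shape matters to the label-suggestion prompt downstream. A small sketch of the same interpolation, where the labels array stands in for `hook.account.labels.pluck(:title)` and the transcript text is made up:

```ruby
# Reconstruct the "Messages:\n...\nLabels:\n..." prompt string built above.
labels = ['billing', 'refunds'].join(', ')
messages = 'Customer: My card was charged twice'
prompt = "Messages:\n#{messages}\nLabels:\n#{labels}"
```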
  def valid_conversation?(conversation)
    return false if conversation.nil?
    return false if conversation.messages.incoming.count < 3

    # Think, Mark, think: at this point the conversation is beyond saving
    return false if conversation.messages.count > 100

    # if there are more than 20 messages, only trigger this if the last message is from the client
    return false if conversation.messages.count > 20 && !conversation.messages.last.incoming?

    true
  end

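The guard logic is pure arithmetic over message counts, so it can be exercised without Rails. A dependency-free sketch, with plain structs standing in for the ActiveRecord `Conversation`/`Message` models and an array count standing in for the `messages.incoming` scope:

```ruby
# Illustrative stand-ins, not Chatwoot models.
Message = Struct.new(:incoming) do
  def incoming?
    incoming
  end
end

Conversation = Struct.new(:messages)

# Same guards as the service method: needs at least 3 incoming messages,
# bails on conversations over 100 messages, and for conversations over 20
# messages only triggers when the last message came from the client.
def valid_conversation?(conversation)
  return false if conversation.nil?
  return false if conversation.messages.count(&:incoming?) < 3
  return false if conversation.messages.count > 100
  return false if conversation.messages.count > 20 && !conversation.messages.last.incoming?

  true
end

valid_conversation?(Conversation.new([Message.new(true)] * 3))  # passes all guards
valid_conversation?(Conversation.new([Message.new(true)] * 2))  # too few incoming
```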
  def summarize_body
    {
      model: self.class::GPT_MODEL,
      messages: [
        { role: 'system',
          content: prompt_from_file('summary', enterprise: true) },
        { role: 'user', content: conversation_messages }
      ]
    }.to_json
  end

  def label_suggestion_body
    return unless label_suggestions_enabled?

    content = labels_with_messages
    return value_from_cache if content.blank?

    {
      model: self.class::GPT_MODEL,
      messages: [
        {
          role: 'system',
          content: prompt_from_file('label_suggestion', enterprise: true)
        },
        { role: 'user', content: content }
      ]
    }.to_json
  end

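Both `summarize_body` and `label_suggestion_body` serialize the same two-message chat-completion shape. A sketch of that payload builder; the model name and prompt text here are placeholders, not Chatwoot's real values:

```ruby
require 'json'

PLACEHOLDER_MODEL = 'gpt-4o-mini' # placeholder, not the service's GPT_MODEL

# Build the JSON request body: one system prompt plus one user message.
def chat_body(system_prompt, user_content)
  {
    model: PLACEHOLDER_MODEL,
    messages: [
      { role: 'system', content: system_prompt },
      { role: 'user', content: user_content }
    ]
  }.to_json
end

chat_body('Suggest labels for this conversation.', "Messages:\nhi\nLabels:\nbilling")
```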
  def label_suggestions_enabled?
    hook.settings['label_suggestion'].present?
  end
end
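The feature flag check relies on ActiveSupport's `present?`. Outside Rails that reduces to a nil/false/empty check; a sketch where `settings` is a plain Hash standing in for `hook.settings`:

```ruby
# Approximation of label_suggestions_enabled? without ActiveSupport:
# present? is false for nil, false, and anything empty.
def label_suggestions_enabled?(settings)
  value = settings['label_suggestion']
  !(value.nil? || value == false || (value.respond_to?(:empty?) && value.empty?))
end

label_suggestions_enabled?('label_suggestion' => true)
label_suggestions_enabled?({}) # flag unset
```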