Suggestion
Resolution: Unresolved
Issue Summary
Many customers are excited to utilize Atlassian Intelligence with a custom LLM provider.
Currently Atlassian Intelligence uses a diverse range of open-source, Atlassian-hosted LLMs, including models from the Llama series, Phi series, and Mixtral, alongside third-party hosted LLMs from OpenAI's GPT series of models, Anthropic's Claude series of models, and Google's Gemini series of models, to deliver the best outcomes for customers. Our features use dynamic routing to select the appropriate mix of models that can deliver the best experience and accuracy for each scenario.
From:
https://www.atlassian.com/trust/atlassian-intelligence
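The excerpt above describes dynamic routing across a mix of models. Purely as an illustration of that idea (the task categories and model names below are assumptions for the example, not Atlassian's actual routing logic), a per-scenario router can be sketched as a simple lookup:

# Toy sketch of per-scenario model routing; task and model names are
# assumptions for illustration only.
ROUTING_TABLE = {
    "summarize_page": "claude-3-5-sonnet",    # long-context summarization
    "generate_query": "gpt-4o",               # structured output
    "quick_reply":    "llama-3-8b-instruct",  # low-latency, self-hosted
}
DEFAULT_MODEL = "gpt-4o"

def route(task: str) -> str:
    """Pick the model for a task, falling back to a default."""
    return ROUTING_TABLE.get(task, DEFAULT_MODEL)

print(route("summarize_page"))  # -> claude-3-5-sonnet
print(route("unknown_task"))    # -> gpt-4o

Today this selection is managed entirely by Atlassian; the request below is for admins to be able to override it with a provider of their own.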
Some admins would like to use a single LLM of their choice, test different LLMs, or run a self-hosted internal LLM that they manage, in order to optimize performance and meet the security requirements of their industry.
Requested outcome:
Please provide a configuration that admins can manage to connect to the LLM of their choice, including a self-hosted LLM.
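For illustration only, the kind of configuration being asked for might look like the snippet below, assuming the customer-chosen or self-hosted model exposes an OpenAI-compatible API (as servers such as vLLM or Ollama do). The endpoint URL, environment variable names, and model name are hypothetical, and this is not an existing Atlassian setting:

# Minimal sketch of an admin-managed custom LLM configuration, assuming an
# OpenAI-compatible endpoint. All names and URLs below are hypothetical.
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ.get("CUSTOM_LLM_BASE_URL", "https://llm.internal.example.com/v1"),
    api_key=os.environ.get("CUSTOM_LLM_API_KEY", "not-needed-for-internal-endpoint"),
)

response = client.chat.completions.create(
    model=os.environ.get("CUSTOM_LLM_MODEL", "llama-3-70b-instruct"),
    messages=[{"role": "user", "content": "Summarize this Confluence page: ..."}],
)
print(response.choices[0].message.content)

The ask is essentially for an admin-managed equivalent of these few parameters (endpoint, credentials, model) within Atlassian Intelligence itself.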
relates to ENT-2371
Is there any update on this feature?