Atlassian Intelligence / AI-973

Allow admins to configure Atlassian Intelligence with custom LLM

    • Type: Suggestion
    • Resolution: Unresolved
    • Component: Admin Experience
    • Our product teams collect and evaluate feedback from a number of different sources. To learn more about how we use customer feedback in the planning process, check out our new feature policy.

      Issue Summary

      Many customers are eager to use Atlassian Intelligence with a custom LLM provider.

      Currently, Atlassian Intelligence uses a diverse range of open-source, Atlassian-hosted LLMs, including models from the Llama series, Phi series, and Mixtral, alongside third-party hosted LLMs from OpenAI's GPT series of models, Anthropic's Claude series of models, and Google's Gemini series of models, to deliver the best outcomes for customers. Our features use dynamic routing to select the appropriate mix of models that can deliver the best experience and accuracy for each scenario.

      From:
      https://www.atlassian.com/trust/atlassian-intelligence
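
      The dynamic routing described above can be pictured roughly as follows. This is a minimal illustrative sketch in TypeScript, assuming hypothetical model names, task categories, and a cost-based tie-break; it is not Atlassian's actual routing implementation.

      // Illustrative only: a hypothetical sketch of dynamic routing across a mix of models.
      // Model names, the "strengths" metadata, and the cost-based tie-break are assumptions.
      type Task = "summarize" | "generate" | "search" | "classify";

      interface ModelCandidate {
        name: string;                        // e.g. "llama-3-70b", "gpt-4o"
        hosted: "atlassian" | "third-party"; // where the model runs
        strengths: Task[];                   // tasks this model handles well (assumed)
        relativeCost: number;                // used as a tie-breaker between suitable models
      }

      const candidates: ModelCandidate[] = [
        { name: "llama-3-70b",       hosted: "atlassian",   strengths: ["summarize", "classify"], relativeCost: 1 },
        { name: "claude-3-5-sonnet", hosted: "third-party", strengths: ["summarize", "generate"], relativeCost: 3 },
        { name: "gpt-4o",            hosted: "third-party", strengths: ["generate", "search"],    relativeCost: 3 },
      ];

      // Choose the cheapest model that lists the task among its strengths,
      // falling back to the cheapest model overall if none match.
      function routeModel(task: Task, pool: ModelCandidate[] = candidates): ModelCandidate {
        const suitable = pool.filter(m => m.strengths.includes(task));
        const ranked = (suitable.length > 0 ? suitable : pool)
          .slice()
          .sort((a, b) => a.relativeCost - b.relativeCost);
        return ranked[0];
      }

      console.log(routeModel("summarize").name); // "llama-3-70b" with the sample data above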

      Some admins would like to use a single LLM of their choice, test different LLMs, or run a self-hosted internal LLM that they manage, in order to optimize performance and meet the security requirements of their industry.

      Requested outcome:

      Please provide a configuration that admins can manage to connect Atlassian Intelligence with the LLM of their choice, including a self-hosted LLM.
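
      To make the request concrete, the sketch below shows one shape such an admin-managed configuration could take. It is a hypothetical TypeScript illustration: the field names, endpoint URL, and secret-reference scheme are assumptions, since no such setting exists in Atlassian Intelligence today, which is what this suggestion asks for.

      // Hypothetical sketch of the admin-facing configuration this suggestion asks for.
      // Field names, the endpoint URL, and the secret-reference scheme are illustrative
      // assumptions; no such Atlassian setting exists today.
      interface CustomLlmConfig {
        provider: "self-hosted" | "openai" | "anthropic" | "google";
        endpoint: string;             // base URL of the model server
        model: string;                // model identifier to request
        apiKeySecretRef: string;      // reference to a stored secret, never the key itself
        requestTimeoutMs: number;     // request timeout in milliseconds
        allowDataRetention: boolean;  // whether the provider may retain prompts/completions
      }

      const exampleConfig: CustomLlmConfig = {
        provider: "self-hosted",
        endpoint: "https://llm.internal.example.com/v1",  // placeholder URL
        model: "llama-3-70b-instruct",
        apiKeySecretRef: "vault://ai/llm-gateway-key",    // placeholder secret reference
        requestTimeoutMs: 30000,
        allowDataRetention: false,
      };

      Referencing the API key as a stored secret rather than a literal value, and exposing a data-retention switch, would help self-hosted deployments meet the security requirements mentioned above.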

            Justin Fiore added a comment - Is there any update on this feature?

              Ashish Dobhal
              Nick Messer
              Votes: 7
              Watchers: 7
