Allow admins to configure Atlassian Intelligence with a custom LLM



      Issue Summary

      Many customers are eager to use Atlassian Intelligence with a custom LLM provider.

      Currently Atlassian Intelligence uses a diverse range of open-source, Atlassian-hosted LLMs, including models from the Llama series, Phi series, and Mixtral, alongside third-party hosted LLMs from OpenAI's GPT series of models, Anthropic's Claude series of models, and Google's Gemini series of models, to deliver the best outcomes for customers. Our features use dynamic routing to select the appropriate mix of models that can deliver the best experience and accuracy for each scenario.

      From:
      https://www.atlassian.com/trust/atlassian-intelligence

      Some admins would like to use a single LLM of their choice, test different LLMs, or run a self-hosted internal LLM that they manage, in order to optimize performance and meet the security requirements of their industry.

      Requested outcome:

      Please provide a configuration that admins can manage to connect Atlassian Intelligence to the LLM of their choice, including a self-hosted LLM.
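
      A minimal sketch of what such a configuration could look like: many self-hosted LLM deployments already expose an OpenAI-compatible endpoint, so an admin-facing setting might only need a base URL, a model name, and an API key. The endpoint, model name, and key below are hypothetical, and this is not an existing Atlassian Intelligence API; it only illustrates how a client reaches a custom or self-hosted model today.

      # Illustrative only: reaching a self-hosted, OpenAI-compatible LLM endpoint.
      # The URL, model name, and key are placeholders, not Atlassian settings.
      from openai import OpenAI

      CUSTOM_LLM_BASE_URL = "https://llm.internal.example.com/v1"  # hypothetical self-hosted gateway
      CUSTOM_LLM_MODEL = "internal-llama-3-70b"                    # hypothetical in-house model
      CUSTOM_LLM_API_KEY = "replace-with-internal-key"

      client = OpenAI(base_url=CUSTOM_LLM_BASE_URL, api_key=CUSTOM_LLM_API_KEY)

      # Example request against the custom endpoint, e.g. summarizing an issue.
      response = client.chat.completions.create(
          model=CUSTOM_LLM_MODEL,
          messages=[
              {"role": "system", "content": "Summarize the following Jira issue."},
              {"role": "user", "content": "Customers want to configure a custom LLM provider."},
          ],
      )
      print(response.choices[0].message.content)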

            Assignee:
            sgarg12
            Reporter:
            Nick Messer
            Votes:
            23
            Watchers:
            21

              Created:
              Updated: