
In our system architecture, a backend service relays data between the Azure OpenAI Service and the Confluence Macro. This backend service functions solely as a proxy: no data is stored within the function or in any other part of our backend infrastructure. Its only role is to pass requests and responses between the Azure OpenAI Service and the Confluence Macro.
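The stateless-proxy pattern described above can be sketched as follows. This is an illustrative sketch, not the actual backend implementation; the `relay` function and the payload shape are assumptions made for the example. The key property is that the function keeps no state and writes nothing anywhere: it hands the prompt to the upstream service and hands the completion straight back.

```python
def relay(payload: dict, send) -> dict:
    """Forward a Confluence Macro request upstream via `send` and
    return the completion unchanged.

    Nothing is persisted: the function holds no module-level state,
    writes to no store, and keeps no record of the payload between
    calls -- it is a pure pass-through.
    """
    response = send(payload)  # hand the prompt to the upstream service
    return response           # hand the completion back unchanged


# Usage with a stand-in for the upstream Azure OpenAI call:
def fake_azure(payload: dict) -> dict:
    return {"completion": "ok", "echo": payload["prompt"]}

result = relay({"prompt": "Hello"}, fake_azure)
```

Because `relay` has no side effects, auditing the "no data is stored" claim reduces to inspecting this single pass-through function.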

The Azure OpenAI Service, managed by Microsoft, operates within Microsoft's Azure environment. The service hosts OpenAI models and does not interact with any services directly operated by OpenAI, such as ChatGPT or the OpenAI API.

Prompts (inputs) and completions (outputs):

  • are NOT available to other customers.

  • are NOT available to OpenAI.

  • are NOT used to improve OpenAI models.

  • are NOT used to train, retrain, or improve Azure OpenAI Service foundation models.

  • are NOT used to improve any Microsoft or third-party products or services without your permission or instruction.

See the High Level Architecture (Cloud) page and the Configure AI Features documentation (https://communardo-products.atlassian.net/wiki/spaces/SPONLINE/pages/782172164/Confluence+Configure+AI+Features) for instructions on how to configure the AI features.
