Within our system architecture, a backend service facilitates data transfer between the OpenAI service and the Confluence Macro. This backend functions solely as a proxy, ensuring that no data is stored within the function or in any other part of our backend infrastructure. Its only role is to relay requests and responses between the OpenAI service and the Confluence Macro.
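The stateless-proxy pattern described above can be sketched as follows. This is an illustrative sketch only, not the product's actual implementation; the function name, endpoint, and header layout are assumptions (Azure OpenAI does accept an `api-key` request header):

```python
def build_forwarded_request(macro_payload: dict, azure_endpoint: str, api_key: str) -> dict:
    """Prepare a Confluence Macro request for relay to the Azure OpenAI Service.

    The payload is forwarded unchanged. Nothing is written to disk or to any
    datastore: the data exists in memory only for the lifetime of the request,
    which is what makes the proxy stateless.
    """
    return {
        "url": azure_endpoint,
        "headers": {
            "api-key": api_key,            # Azure OpenAI authenticates via an api-key header
            "Content-Type": "application/json",
        },
        "body": macro_payload,             # relayed verbatim, never persisted
    }


# The response travels back the same way: the proxy returns the completion
# to the Macro without logging or storing prompt or completion content.
request = build_forwarded_request(
    {"messages": [{"role": "user", "content": "Summarize this page"}]},
    "https://example.openai.azure.com/openai/deployments/demo/chat/completions",
    "<redacted>",
)
```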
The Azure OpenAI Service is operated and managed by Microsoft and runs entirely within Microsoft's Azure environment. The service hosts the OpenAI models itself and does not interact with any services operated directly by OpenAI, such as ChatGPT or the OpenAI API.
Prompts (inputs) and completions (outputs):
are NOT available to other customers.
are NOT available to OpenAI.
are NOT used to improve OpenAI models.
are NOT used to train, retrain, or improve Azure OpenAI Service foundation models.
are NOT used to improve any Microsoft or third-party products or services without your permission or instruction.
Important Notice: As stated above, we are committed to protecting the privacy of your data. Nevertheless, to further safeguard your privacy, we recommend refraining from sharing personal, sensitive, or confidential information when using AI services.
See the High Level Architecture (Cloud) page and the https://communardo-products.atlassian.net/wiki/spaces/SPONLINE/pages/782172164/Confluence+Configure+AI+Features documentation for instructions on configuring the AI features.