Browse to: http://localhost:50505. For details on using the Aspire Content Source Management page, please refer to Admin UI
Step 2. Add a new Content Source
For this step, follow the steps from the Configuration Tutorial of the connector of your choice; please refer to Connector list
Step 3. Add an Azure OpenAI Chat Completions application to the Workflow
To add an Azure OpenAI Chat Completions application, drag it from the Applications section to the desired Workflow event. This automatically opens the Azure OpenAI Chat Completions configuration for the application.
Step 3a. Specify Prompt Type
In the "General" configuration section, specify the type of prompt to use. Currently, the "User", "System", and "Assistant" types are supported.
Step 3b. Specify Prompt Text
In the "General" configuration section, specify the text for your prompt. It can be plain text or a Groovy script whose final output is text.
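For instance, a prompt can be built dynamically from document fields with a short Groovy script. The sketch below is only illustrative; the field name ("title") and the exact document accessor available to the script depend on your Aspire version and document schema:

```groovy
// Hypothetical example: build the prompt text from a document field.
// The script's final expression must evaluate to the prompt text (a String).
def title = doc.title?.getText() ?: "untitled document"
"Summarize the document titled '" + title + "' in two sentences."
```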
Step 3c. Add more prompts (optional)
Optionally, in the "General" section, you can add as many prompts as desired. Specify a prompt type and prompt text for each of them.
Step 3d. Configure the Endpoint
In the "Endpoint Configuration" section, specify the API version and Model to be used.
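As an illustration, an endpoint configuration might look like the following. The values shown are examples only; the API version must be one supported by your Azure OpenAI resource, and the model must match a deployment in that resource:

```
API Version:  2024-06-01
Model:        gpt-4o
```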
Step 3e. Configure Model Parameters
In the "Model Parameters" section, specify parameters to be used with the Model specified in the previous step.
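Typical Chat Completions model parameters include a sampling temperature and a completion-length limit. The names and defaults below are a hedged sketch of common Azure OpenAI chat parameters; the labels exposed in the Aspire UI may differ:

```
Temperature:        0.7     (sampling randomness, 0-2)
Max Tokens:         500     (upper bound on completion length)
Top P:              1.0     (nucleus sampling)
Frequency Penalty:  0.0
Presence Penalty:   0.0
```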
After this, press the Add button to finish the workflow.
Now the workflow application is ready to be used in your Aspire Seeds.
Output example
Here is an example of the output. The application adds a "chatCompletionsResponse" object to the document. This object contains the response content in a "content" field; a "responseProperties" field with response properties, including usage statistics; a "requestProperties" field with the corresponding request properties; and, finally, a "modelProperties" field with the properties with which the model was configured.
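The exact structure depends on your configuration; the following JSON is an illustrative sketch, with the nested field names and values invented for the example (the "..." placeholders stand for the request and model properties of your particular setup):

```json
{
  "chatCompletionsResponse": {
    "content": "This document describes ...",
    "responseProperties": {
      "usage": {
        "promptTokens": 42,
        "completionTokens": 18,
        "totalTokens": 60
      }
    },
    "requestProperties": { ... },
    "modelProperties": { ... }
  }
}
```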
For details on using the Workflow section, please refer to Workflow introduction.