[Screenshot: my chat playground on Azure AI Foundry]
When using Azure AI Foundry within a project, you choose a model from the model catalogue and try it out in the playground. Once you enter the playground, there is an option to upload a data source, which serves as the knowledge base that the model's responses are grounded on.
In the attached screenshot of my Azure AI Foundry chat playground, you can see a blue 'View code' button at the top left. When you click on it, you can see the generated code, which can be integrated into your current prompt and app.
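For reference, the generated snippet usually looks something like the sketch below. This is my own illustration, assuming the openai Python SDK against an Azure OpenAI deployment; the endpoint, deployment name, API key, and API version are placeholders you would replace with whatever 'View code' shows you:

```python
# Minimal sketch (assumption): calling an Azure OpenAI deployment with the openai SDK.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder endpoint
    api_key="<your-api-key>",                                    # placeholder key
    api_version="2024-02-15-preview",                            # placeholder API version
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # the deployment name shown in the playground
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response.choices[0].message.content)
```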
But when you try to use the endpoints to connect to the model locally from your project, you find that the data you grounded the model's responses on is not being used. One suggestion you may try comes from the official Microsoft documentation for Azure AI Foundry: Microsoft's Documentation for Azure AI Foundry
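One thing worth checking when the grounding data is missing: in the playground the data source is wired up for you, but from your own code you typically have to attach it to the request explicitly. Below is a hedged sketch (my assumption, not code taken from 'View code') of how an Azure OpenAI "on your data" style request can reference an Azure AI Search index through extra_body; the search endpoint, index name, and keys are placeholders, and the exact payload shape can differ between API versions:

```python
# Sketch (assumption): explicitly attaching an Azure AI Search data source so the
# responses stay grounded on your indexed data when calling the API yourself.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
    api_key="<azure-openai-key>",                                # placeholder
    api_version="2024-02-15-preview",                            # placeholder; payload shape varies by version
)

completion = client.chat.completions.create(
    model="<your-deployment-name>",  # placeholder deployment name
    messages=[{"role": "user", "content": "What does my uploaded data say about refunds?"}],
    extra_body={
        "data_sources": [
            {
                "type": "azure_search",
                "parameters": {
                    "endpoint": "https://<your-search-service>.search.windows.net",  # placeholder
                    "index_name": "<your-index-name>",                                # placeholder
                    "authentication": {
                        "type": "api_key",
                        "key": "<azure-ai-search-key>",  # placeholder
                    },
                },
            }
        ]
    },
)
print(completion.choices[0].message.content)
```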
According to the documentation, when you click the View code button, one of the lines of code should show the endpoint in the following format: https://<project-name>.<region>.inference.ai.azure.com/chat
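If your endpoint does come back in that format, a quick test like the sketch below, using the azure-ai-inference package, can help confirm it responds. This is just my assumption of how you would call it; the endpoint and key are placeholders, and whether the trailing /chat path is needed may depend on the SDK, which usually appends the route itself:

```python
# Sketch (assumption): quick check that an endpoint in the suggested format responds.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<project-name>.<region>.inference.ai.azure.com",  # base endpoint, per the suggested format
    credential=AzureKeyCredential("<your-key>"),                        # placeholder key
)

response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Hello!"),
    ],
)
print(response.choices[0].message.content)
```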
When I tried it in my own Azure AI Foundry, I got the endpoint in a different format, but if you are able to see the endpoint in the suggested format, it might solve your problem.
I would love to hear if I need to follow a different approach to get the correct endpoint format. Thanks!