AI Helper custom Base URL
Allow us to set a custom base URL for the AI Helper so we can point it at another OpenAI-compatible server, for example LocalAI ( https://localai.io/basics/getting_started/#clients ) or Azure AI.
That would let us use the AI Helper while sending the data to our own models and private OpenAI instances.
-
Chris Daniels commented
This is something our organization wants as well. We have our own internal AI service that supports OpenAI GPT, Llama, and Claude models, which we would like to use. Even though the tool promises to send only metadata, our company would not take any risk of exposing our data to the public in any way. I use Studio 3T on an almost daily basis, and this feature would greatly improve my ability to write aggregation queries. It would also help non-developers write reporting aggregation queries easily.
-
Ryan Mentzer commented
My organization uses a similar OpenAI-compatible internal API - the AI Helper would be significantly more powerful for us if we could point it there. Heck, I'd even take an option where 3T reads an OPENAI_BASE_URL environment variable before defaulting to https://api.openai.com/v1... this could get the feature out the door with very little work.
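To illustrate, the environment-variable fallback suggested above could be a few lines of Java (Studio 3T is a Java application). The class and method names here are hypothetical, not anything from Studio 3T's actual codebase; this is just a sketch of the resolution logic:

```java
// Hypothetical sketch of resolving the AI Helper's base URL:
// use OPENAI_BASE_URL if set and non-blank, otherwise fall back
// to the public OpenAI endpoint.
public class OpenAiBaseUrl {
    static final String DEFAULT_BASE_URL = "https://api.openai.com/v1";

    // envValue is the raw value of the OPENAI_BASE_URL variable (may be null).
    static String resolve(String envValue) {
        if (envValue != null && !envValue.isBlank()) {
            String url = envValue.trim();
            // Drop a trailing slash so later path joining stays consistent.
            return url.endsWith("/") ? url.substring(0, url.length() - 1) : url;
        }
        return DEFAULT_BASE_URL;
    }

    public static void main(String[] args) {
        // Prints the effective base URL for the current environment.
        System.out.println(resolve(System.getenv("OPENAI_BASE_URL")));
    }
}
```

Since LocalAI and similar servers expose the same API surface as OpenAI, swapping only the base URL like this should be enough for the existing AI Helper client code to keep working unchanged.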