Is your feature request related to a problem? Please describe.
We have LLM access, but only through our own Azure OpenAI deployment, so we cannot use the module as-is.
Describe the solution you'd like
Currently the module supports only direct OpenAI API calls; it should also support Azure OpenAI endpoints.
Describe alternatives you've considered
Add the ability to point the module at an alternative OpenAI-compatible LLM inference server.
Additional context
This module is a really great idea! Looking forward to testing and implementing it!
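A minimal sketch of what backend selection could look like, assuming the module reads its connection settings from the environment. All names here (the environment variables, `LLMConfig`, `config_from_env`) are illustrative, not the module's actual API:

```python
from dataclasses import dataclass


@dataclass
class LLMConfig:
    """Connection settings for an OpenAI-compatible chat endpoint.

    Hypothetical sketch: field and variable names are illustrative,
    not the module's real configuration schema.
    """
    base_url: str
    api_key: str
    model: str  # for Azure OpenAI this is the *deployment* name


def config_from_env(env: dict) -> LLMConfig:
    """Prefer Azure OpenAI settings when present, else fall back to OpenAI."""
    if "AZURE_OPENAI_ENDPOINT" in env:
        return LLMConfig(
            base_url=env["AZURE_OPENAI_ENDPOINT"].rstrip("/") + "/openai",
            api_key=env["AZURE_OPENAI_API_KEY"],
            model=env["AZURE_OPENAI_DEPLOYMENT"],
        )
    return LLMConfig(
        base_url="https://api.openai.com/v1",
        api_key=env["OPENAI_API_KEY"],
        model=env.get("OPENAI_MODEL", "gpt-4o"),
    )
```

With a scheme like this, existing OpenAI users keep working unchanged, while Azure users only need to set their own endpoint, key, and deployment name.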