I'm getting the error below. Any chance of getting support with this one?
WARNING - Judge ERROR for https://autocomplete.indeed.com/api/v0/initialLog?fetchOccupations=false: LLM request failed (gemini/gemini/gemini-3.0-flash): litellm.NotFoundError: VertexAIException - {
  "error": {
    "code": 404,
    "message": "models/gemini-3.0-flash is not found for API version v1beta, or is not supported for generateContent. Call ListModels to see the list of available models and their supported methods.",
    "status": "NOT_FOUND"
  }
}
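One thing that stands out in the log is the model string `gemini/gemini/gemini-3.0-flash`. litellm model strings take the form `<provider>/<model>`, so after stripping the provider prefix the model name still contains a second `gemini/` prefix, and the resulting `models/gemini-3.0-flash` lookup 404s. A minimal sketch of the split (assumption: the config concatenated a `gemini/` provider prefix onto an already-prefixed model name):

```python
# litellm routes on the first "/" of the model string: everything before it
# is the provider, everything after it is passed to the backend as the model.
model = "gemini/gemini/gemini-3.0-flash"  # string taken from the error log above
provider, _, name = model.partition("/")
print(provider)  # gemini
print(name)      # gemini/gemini-3.0-flash  <- provider repeated inside the model name
```

If that is the cause, dropping the duplicated prefix (and checking via the API's ListModels that the model name itself is valid for the key/region in use) may resolve the 404.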