This repository was archived by the owner on Dec 16, 2025. It is now read-only.

internal endpoint jsonPayload.endpoint points to projects/3972195257/ instead of my project #735

@cristobalufro


Description of the bug:

My Python script, which uses TextEmbeddingModel.from_pretrained with location='southamerica-east1', successfully generates embeddings with text-multilingual-embedding-002 in my alimetra-fc43f project. However, my Node.js Cloud Function, which uses PredictionServiceClient with the explicit endpoint projects/alimetra-fc43f/locations/southamerica-east1/publishers/google/models/text-multilingual-embedding-002 (and a task_type parameter), fails with 3 INVALID_ARGUMENT. Additionally, the prediction_access logs for the failed Node.js call show the internal jsonPayload.endpoint pointing to projects/3972195257/.... What is causing this discrepancy, and why does the Node.js client fail?
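For reference, the request in getEmbeddingsBatch is assembled roughly as follows (a simplified sketch; buildEmbeddingRequest is a hypothetical name for illustration, and the real code passes the result to predictionServiceClient.predict):

```javascript
// Hypothetical sketch of how the predict request is built.
function buildEmbeddingRequest(projectId, location, model, texts, taskType) {
  // Publisher-model resource path, as described above.
  const endpoint =
    `projects/${projectId}/locations/${location}` +
    `/publishers/google/models/${model}`;
  return {
    endpoint,
    // NOTE (unverified hypothesis): the Node.js PredictionServiceClient
    // expects each instance and `parameters` as protobuf Value objects
    // (e.g. built with @google-cloud/aiplatform's helpers.toValue), so
    // passing plain JSON like this may itself produce `3 INVALID_ARGUMENT`
    // with empty details.
    instances: texts.map((content) => ({ content })),
    parameters: taskType ? { task_type: taskType } : {},
  };
}
```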

Actual vs expected behavior:

Expected Behavior:

The Node.js Cloud Function (processSurveyAnalysisCore, specifically the getEmbeddingsBatch helper) constructs a request to the Vertex AI publisher model text-multilingual-embedding-002.
The endpoint in this request is correctly formatted as projects/alimetra-fc43f/locations/us-central1/publishers/google/models/text-multilingual-embedding-002 (or the specified VERTEX_AI_LOCATION).
The instances array contains valid text content, e.g., [{ "content": "manzana" }] or [{ "content": "hello world" }].
The parameters object includes { task_type: "RETRIEVAL_DOCUMENT" } (or is an empty object {}).
The predictionServiceClient.predict(request) call successfully contacts Vertex AI.
Vertex AI returns a valid response containing the embedding vectors for the provided text instances.
The aiplatform.googleapis.com%2Fprediction_access log (server-side Vertex AI log) should show the jsonPayload.endpoint reflecting the publisher model path under the context of project alimetra-fc43f.
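To make the expected endpoint shape concrete, here is a minimal check of the publisher-model path format described above (the regex is my own assumption, not taken from Vertex AI documentation):

```javascript
// Matches the expected publisher-model path shape, e.g.
// projects/<project>/locations/<region>/publishers/google/models/<model>
const PUBLISHER_MODEL_RE =
  /^projects\/[^/]+\/locations\/[^/]+\/publishers\/google\/models\/[^/]+$/;

function isPublisherModelPath(endpoint) {
  return PUBLISHER_MODEL_RE.test(endpoint);
}
```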
Actual Behavior:

The Node.js Cloud Function constructs the request to Vertex AI with the seemingly correct endpoint for project alimetra-fc43f and valid instances (e.g., [{ "content": "manzana" }] or [{ "content": "hello world" }]) and parameters (including task_type or {}).
The predictionServiceClient.predict(request) call fails.
The error caught in the Cloud Function's stderr log is consistently Error: 3 INVALID_ARGUMENT:, with error.details being an empty string. The gRPC stack trace points to an issue during the onReceiveStatus phase of the client call.
Crucially, the aiplatform.googleapis.com%2Fprediction_access log (server-side Vertex AI log) for the failed request shows:
logName: "projects/alimetra-fc43f/logs/aiplatform.googleapis.com%2Fprediction_access" (correctly attributing the call to the user's project).
resource.labels.resource_container: "alimetra-fc43f" (correctly identifying the user's project as the resource consuming the API).
BUT, jsonPayload.endpoint shows: "projects/3972195257/locations/us-central1/endpoints/text-multilingual-embedding-002" (where 3972195257 is not the user's project ID alimetra-fc43f). This path format (.../endpoints/...) also differs from the expected publisher model path format (.../publishers/google/models/...).
This same prediction_access log entry contains jsonPayload.error.code: 3.
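To make the mismatch explicit, the two path shapes seen above can be told apart mechanically (a sketch; the classification labels are my own, hypothetical names):

```javascript
// Distinguishes the publisher-model path the client sends from the
// .../endpoints/... path that the prediction_access log reports.
function classifyEndpointPath(endpoint) {
  const parts = endpoint.split("/");
  const project = parts[1]; // "projects/<project>/..."
  if (endpoint.includes("/publishers/google/models/")) {
    return { kind: "publisher-model", project };
  }
  if (endpoint.includes("/endpoints/")) {
    return { kind: "dedicated-endpoint", project };
  }
  return { kind: "unknown", project };
}
```

Running this on the logged value shows a dedicated-endpoint-style path under project 3972195257, while the request was built as a publisher-model path under alimetra-fc43f.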

Any other information you'd like to share?

No response
