
[Bug]: Still need to provide the S3 path even though it is defined in the warehouse config in the Helm chart #11353


Description

@echzhai

What happened

In the Helm chart values file:

warehouses:
    - name: warehouse1
      location: s3://bucket/prefix/nessiewarehouse1

Later, when starting PySpark, why do I still need to provide the full S3 path:
.config("spark.sql.catalog.nessie.warehouse", "s3://bucket/prefix/nessiewarehouse1")

Why can't I just configure it like this, using the warehouse name from the Helm chart?
.config("spark.sql.catalog.nessie.warehouse", "warehouse1")

How to reproduce it

Deploy Nessie with the Helm chart values file shown above.

Nessie server type (docker/uber-jar/built from source) and version

0.105.2

Client type (Ex: UI/Spark/pynessie ...) and version

No response

Additional information

No response
