Merged
Changes from all commits
Commits
42 commits
a4f69d6
Added support for layer-specific dimensions
brian-r-calder Jun 17, 2024
afa60bc
Bug-fixes for VR readers
brian-r-calder Jun 18, 2024
ca684a5
Update bag_vrrefinements.cpp
brian-r-calder Jun 25, 2024
78a1205
Adjustment to remove debugging code after initial tests.
brian-r-calder Jun 26, 2024
6fd3fb9
Merge branch 'master' into 109-LayerDescriptor-shape-parameters
selimnairb Nov 22, 2024
b45aecf
Fix tests broken by addressing #109.
selimnairb Nov 22, 2024
a3cbb27
CI: Enable examples building for test reporting workflow
selimnairb Dec 6, 2024
cea1c15
Modification to exception text for consistency
brian-r-calder Dec 21, 2024
c8b6515
Fix single-line if statement error; added braces.
selimnairb Jan 2, 2025
f9e186e
Fix single-line if statement error; added braces.
selimnairb Jan 2, 2025
7c75ebf
CI: test reporting: Make sure test output is reported to the console,…
selimnairb Jan 2, 2025
2cc6302
Support VR elements stored in either 1D or 2D arrays to be compatible…
selimnairb Jan 2, 2025
d49c72d
test: Restructure test 'test vr metadata write read' to only attempt …
selimnairb Jan 2, 2025
3b91e1e
SWIG: Python: first pass updating swig .i files to mirror changes to …
selimnairb Jan 2, 2025
39cee19
Tests: Python: Update tests to reflect layer dimension API fixes
selimnairb Jan 3, 2025
566324e
Store dimensions of layer descriptors as uint64_t internally (rather …
selimnairb Jan 3, 2025
65c33d2
LayerDescriptor: Return dimensions as uint32_t rather than the underl…
selimnairb Jan 3, 2025
98344af
Partial fix to update VR refinements and nodes to use 2D dataspace (a…
selimnairb Jan 4, 2025
72b166d
Merge branch 'master' into 109-LayerDescriptor-shape-parameters
selimnairb Jan 6, 2025
d7cd221
Update VR refinements and nodes to use 2D dataspace (as other impleme…
selimnairb Jan 6, 2025
27189ec
Ensure VR descriptors are consistent with shape of underlying H5 data…
selimnairb Jan 6, 2025
60a7a92
Remove unnecessary casts
selimnairb Jan 6, 2025
065acbd
Add explicit cast to uint32_t in LayerDescriptor::getDims()
selimnairb Jan 6, 2025
728ee2f
Fix wheel build on Windows
selimnairb Jan 7, 2025
772e7c9
VR: Add test_vr_bag to test reading an existing VR BAG file; Add exce…
selimnairb Jan 8, 2025
27a6687
test_vr_bag: Ensure BAG dimensions match VR metadata descriptor dimen…
selimnairb Jan 8, 2025
cd83fda
tests: Add example VR BAG from NBS archive
selimnairb Jan 9, 2025
0bac717
#109: Add skeleton test case for opening VR BAG produced by BAG 1.6.3
selimnairb Jan 2, 2026
043fa25
#109: Harmonize 1.6.3 VR BAG test with existing VR BAG compatibility …
selimnairb Jan 5, 2026
d4d3e55
#109: Add check for VR refinement array dimensions for test_bag_vrref…
selimnairb Jan 5, 2026
042fb10
#109: test_bag_vrrefinements: Close and re-open test BAG before check…
selimnairb Jan 6, 2026
0032e1e
Merge pull request #143 from OpenNavigationSurface/109-LayerDescripto…
selimnairb Jan 13, 2026
0bedc02
CI: Change download location of libxml2 dependency from GitLab to Gno…
selimnairb Jan 22, 2026
b5239c5
CI: Windows: Unzip libxml2 with tar instead of 7z since 7z is incapab…
selimnairb Jan 22, 2026
e2e53a9
CI: Update runner version since Ubuntu 20.04 has been sunsetted. Also…
selimnairb Jan 23, 2026
672028e
CI: Rollback attempt to more explicitly set Python version, which doe…
selimnairb Jan 23, 2026
79826f7
CI: Add setup-python action to matrix build
selimnairb Jan 23, 2026
a794ce8
CI: Remove redundant python-version 3.10
selimnairb Jan 23, 2026
55f0064
CI: Fix regression in file glob pattern for finding bagPy wheels
selimnairb Jan 23, 2026
0130e4c
#109: Remove commented out with previous, incorrect, behavior, leavin…
selimnairb Jan 30, 2026
a97d3e7
CI: Remove deprecated set-output command
selimnairb Jan 30, 2026
4d6a330
CI: Fix ignored paths to include all other actions workflows
selimnairb Jan 30, 2026
16 changes: 12 additions & 4 deletions .github/workflows/testmatrix.yml
@@ -9,6 +9,7 @@ on:
- '*.md'
- '*.svg'
- '*.png'
- .github/workflows/testreporting.yml
- .github/workflows/testwindows.yml
branches: [ "master" ]
pull_request:
@@ -19,6 +20,7 @@ on:
- '*.md'
- '*.svg'
- '*.png'
- .github/workflows/testreporting.yml
- .github/workflows/testwindows.yml
branches: [ "master" ]

@@ -32,16 +34,22 @@ jobs:
build:
strategy:
matrix:
python-version: ["3.9", "3.10", "3.11"]
os: [ubuntu-20.04]
python-version: ["3.10", "3.11", "3.12", "3.13"]
os: [ubuntu-22.04]
# The CMake configure and build commands are platform agnostic and should work equally well on Windows or Mac.
# You can convert this to a matrix build if you need cross-platform coverage.
# See: https://docs.github.com/en/free-pro-team@latest/actions/learn-github-actions/managing-complex-workflows#using-a-build-matrix
runs-on: ${{matrix.os}}

steps:
- name: Checkout
uses: actions/checkout@v4
uses: actions/checkout@v6

- name: Setup Python
uses: actions/setup-python@v6
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'

- name: Install dependencies
run: |
@@ -68,7 +76,7 @@ jobs:
run: |
sudo cmake --install build
source python-venv/bin/activate
python -m pip install ./wheel/bagPy-*.whl
python -m pip install ./wheel/bag[pP]y-*.whl
deactivate

- name: Tests
11 changes: 6 additions & 5 deletions .github/workflows/testreporting.yml
@@ -8,6 +8,7 @@ on:
- README.*
- '*.md'
- '*.svg'
- .github/workflows/testmatrix.yml
- .github/workflows/testwindows.yml
branches: [ "master" ]
pull_request:
@@ -50,15 +51,15 @@ jobs:
# Get current branch name to use it as dest directory
- name: Extract branch name
shell: bash
run: echo "##[set-output name=branch;]$(echo ${GITHUB_REF#refs/heads/})"
run: echo "branch=$(echo ${GITHUB_REF#refs/heads/})" >> $GITHUB_OUTPUT
id: extract_branch

- name: Prepare environment
id: coverage
run: |
# Output values to be used by other steps
echo "##[set-output name=path;]${BADGE_PATH}"
echo "##[set-output name=branch;]${BRANCH}"
echo "path=${BADGE_PATH}" >> $GITHUB_OUTPUT
echo "branch=${BRANCH}" >> $GITHUB_OUTPUT
env:
BADGE_PATH: ${{ steps.extract_branch.outputs.branch }}/coverage.svg
BRANCH: badges
@@ -79,7 +80,7 @@
run: |
export CC=${{env.CC}}
export CXX=${{env.CXX}}
cmake -G Ninja -DCMAKE_BUILD_TYPE=${{env.BUILD_TYPE}} -B build -S . -DCMAKE_INSTALL_PREFIX=/usr/local -DBAG_BUILD_TESTS:BOOL=ON -DBAG_CODE_COVERAGE:BOOL=ON
cmake -G Ninja -DCMAKE_BUILD_TYPE=${{env.BUILD_TYPE}} -B build -S . -DCMAKE_INSTALL_PREFIX=/usr/local -DBAG_BUILD_EXAMPLES:BOOL=ON -DBAG_BUILD_TESTS:BOOL=ON -DBAG_CODE_COVERAGE:BOOL=ON

- name: Build
# Build your program with the given configuration
@@ -90,7 +91,7 @@

- name: Run tests
run: |
BAG_SAMPLES_PATH=${{github.workspace}}/examples/sample-data ./build/tests/bag_tests_d -r junit -o build/tests/bag_tests-testreport.xml
BAG_SAMPLES_PATH=${{github.workspace}}/examples/sample-data ./build/tests/bag_tests_d -r junit | tee build/tests/bag_tests-testreport.xml

- name: Test Reporter
uses: mikepenz/action-junit-report@v5
8 changes: 4 additions & 4 deletions .github/workflows/testwindows.yml
@@ -46,7 +46,7 @@ jobs:
SDK: release-1911
MSVC_VER: 1920
ZLIB_URL: "https://github.com/madler/zlib/releases/download/v1.3/zlib13.zip"
LIBXML2_URL: "https://gitlab.gnome.org/GNOME/libxml2/-/archive/v2.13.5/libxml2-v2.13.5.zip"
LIBXML2_URL: "https://download.gnome.org/sources/libxml2/2.13/libxml2-2.13.9.tar.xz"
HDF5_URL: "https://github.com/HDFGroup/hdf5/archive/refs/tags/hdf5_1.14.5.zip"
CATCH2_URL: "https://github.com/catchorg/Catch2/archive/refs/tags/v3.4.0.zip"

@@ -102,7 +102,7 @@ jobs:
cd downloads
$env:ZLIB_ZIP="zlib.zip"
if(-Not (Test-Path -Path $env:ZLIB_ZIP -PathType Leaf)) { Invoke-WebRequest "$env:ZLIB_URL" -OutFile "$env:ZLIB_ZIP" }
$env:LIBXML2_ZIP="libxml2.zip"
$env:LIBXML2_ZIP="libxml2.tar.xz"
if(-Not (Test-Path -Path $env:LIBXML2_ZIP -PathType Leaf)) { Invoke-WebRequest "$env:LIBXML2_URL" -OutFile "$env:LIBXML2_ZIP" }
$env:HDF5_ZIP="hdf5.zip"
if(-Not (Test-Path -Path $env:HDF5_ZIP -PathType Leaf)) { Invoke-WebRequest "$env:HDF5_URL" -OutFile "$env:HDF5_ZIP" }
@@ -121,8 +121,8 @@ jobs:
cmake --build build --config Release --target install -- /nologo /verbosity:minimal
cd ..
# libxml2
exec { 7z x ..\downloads\$env:LIBXML2_ZIP }
cd libxml2-v2.13.5
exec { tar xf ..\downloads\$env:LIBXML2_ZIP }
cd libxml2-2.13.9
if(-Not (Test-Path -Path build)) { mkdir build }
cmake -B build -G $env:VS_VERSION -S . $env:CMAKE_INSTALL_PREFIX -DCMAKE_BUILD_TYPE=Release `
-DLIBXML2_WITH_ZLIB=ON -DLIBXML2_WITH_ICONV=OFF -DLIBXML2_WITH_LZMA=OFF -DLIBXML2_WITH_PYTHON=OFF
4 changes: 2 additions & 2 deletions api/bag.cpp
@@ -824,7 +824,7 @@ BagError bagGetErrorString(
strncpy(str, "Metadata One or more elements of the requested coverage are missing from the XML file", MAX_STR-1);
break;
case BAG_METADTA_INVLID_DIMENSIONS:
sprintf(str, "Metadata The number of dimensions is incorrect (not equal to %d)", RANK);
snprintf(str, MAX_STR, "Metadata The number of dimensions is incorrect (not equal to %d)", RANK);
break;
case BAG_METADTA_BUFFER_EXCEEDED:
strncpy(str, "Metadata supplied buffer is too large to be stored in the internal array", MAX_STR-1);
@@ -866,7 +866,7 @@ BagError bagGetErrorString(
strncpy(str, "HDF Bag is not an HDF5 File", MAX_STR-1);
break;
case BAG_HDF_RANK_INCOMPATIBLE:
sprintf(str, "HDF Bag's rank is incompatible with expected Rank of the Datasets: %d", RANK);
snprintf(str, MAX_STR, "HDF Bag's rank is incompatible with expected Rank of the Datasets: %d", RANK);
break;
case BAG_HDF_TYPE_NOT_FOUND:
strncpy(str, "HDF Bag surface Datatype parameter not available", MAX_STR-1);
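The two bag.cpp hunks above replace sprintf with snprintf so that a formatted error message can never overrun the caller-supplied buffer. A minimal sketch of the behavioral difference, assuming an illustrative 32-byte buffer in place of the library's MAX_STR-sized one:

#include <cstdio>

int main()
{
    char str[32];        // stand-in for the MAX_STR-sized buffer in bagGetErrorString()
    const int RANK = 2;  // illustrative value

    // snprintf writes at most sizeof(str) bytes, including the terminating
    // NUL, truncating the message if needed; sprintf would write past the end.
    snprintf(str, sizeof(str), "HDF Bag rank mismatch, expected: %d", RANK);

    puts(str);           // safely prints the (possibly truncated) message
    return 0;
}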
42 changes: 29 additions & 13 deletions api/bag_dataset.cpp
@@ -502,12 +502,12 @@ void Dataset::createDataset(

// Mandatory Layers
// Elevation
this->addLayer(SimpleLayer::create(*this, Elevation, chunkSize,
compressionLevel));
this->addLayer(SimpleLayer::create(*this, Elevation, m_pMetadata->rows(), m_pMetadata->columns(),
chunkSize, compressionLevel));

// Uncertainty
this->addLayer(SimpleLayer::create(*this, Uncertainty, chunkSize,
compressionLevel));
this->addLayer(SimpleLayer::create(*this, Uncertainty, m_pMetadata->rows(), m_pMetadata->columns(),
chunkSize, compressionLevel));
}

//! Create an optional simple layer.
@@ -546,8 +546,8 @@ Layer& Dataset::createSimpleLayer(
case Num_Soundings: //[[fallthrough]];
case Average_Elevation: //[[fallthrough]];
case Nominal_Elevation:
return this->addLayer(SimpleLayer::create(*this, type, chunkSize,
compressionLevel));
return this->addLayer(SimpleLayer::create(*this, type,
m_pMetadata->rows(), m_pMetadata->columns(), chunkSize, compressionLevel));
case Surface_Correction: //[[fallthrough]];
case Georef_Metadata: //[[fallthrough]];
default:
@@ -1073,10 +1073,17 @@ void Dataset::readDataset(
OpenMode openMode)
{
signal(SIGABRT, handleAbrt);
m_pH5file = std::unique_ptr<::H5::H5File, DeleteH5File>(new ::H5::H5File{
fileName.c_str(),
(openMode == BAG_OPEN_READONLY) ? H5F_ACC_RDONLY : H5F_ACC_RDWR},
DeleteH5File{});
try {
m_pH5file = std::unique_ptr<::H5::H5File, DeleteH5File>(new ::H5::H5File{
fileName.c_str(),
(openMode == BAG_OPEN_READONLY) ? H5F_ACC_RDONLY : H5F_ACC_RDWR},
DeleteH5File{});
}
catch( ::H5::FileIException& e )
{
std::cerr << "Unable to read BAG file, error was: " << e.getCDetailMsg() << std::endl;
e.printErrorStack();
}

m_pMetadata = std::make_unique<Metadata>(*this);

@@ -1103,7 +1110,10 @@

H5Dclose(id);

auto layerDesc = SimpleLayerDescriptor::open(*this, layerType);
// Pre-stage the layer-specific descriptor. Note that we don't need to specify the
// dimensions of the layer here, since they're set from the HDF5 dataset when it
// gets opened with SimpleLayer::open().
auto layerDesc = SimpleLayerDescriptor::open(*this, layerType, 0, 0);
this->addLayer(SimpleLayer::open(*this, *layerDesc));
}

@@ -1166,7 +1176,10 @@ void Dataset::readDataset(
}

{
auto descriptor = VRRefinementsDescriptor::open(*this);
// Pre-stage the layer-specific descriptor for the refinements; note that this
// doesn't have to have specific dimensions since they're set when the refinements
// layer is read in VRRefinements::open().
auto descriptor = VRRefinementsDescriptor::open(*this, 0, 0);
this->addLayer(VRRefinements::open(*this, *descriptor));
}

@@ -1176,7 +1189,10 @@
{
H5Dclose(id);

auto descriptor = VRNodeDescriptor::open(*this);
// Pre-stage the layer-specific descriptor for the nodes; note that this doesn't
// have to have specific dimensions since they're set when the nodes layer is
// read in VRNode::open().
auto descriptor = VRNodeDescriptor::open(*this, 0, 0);
this->addLayer(VRNode::open(*this, *descriptor));
}
}
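A recurring pattern in this file: layer descriptors are now pre-staged with placeholder (0, 0) dimensions, and the true shape is recovered from the rank-2 HDF5 dataspace when the layer is opened. A rough, self-contained sketch of that recovery step, with a hypothetical DescriptorSketch type standing in for the library's descriptor classes (only the HDF5 C++ calls are real API):

#include <H5Cpp.h>
#include <array>
#include <cstdint>

// Hypothetical stand-in for SimpleLayerDescriptor, VRRefinementsDescriptor, etc.
struct DescriptorSketch
{
    uint32_t rows = 0, cols = 0;  // pre-staged as (0, 0)
    void setDims(uint32_t r, uint32_t c) { rows = r; cols = c; }
};

void openLayerSketch(const H5::H5File& file, const char* path, DescriptorSketch& desc)
{
    const H5::DataSet dataSet = file.openDataSet(path);

    // Read the extent of the rank-2 dataspace the PR standardizes on.
    std::array<hsize_t, 2> dims{};
    dataSet.getSpace().getSimpleExtentDims(dims.data(), nullptr);

    // Overwrite the placeholder dimensions with the real shape from the file.
    desc.setDims(static_cast<uint32_t>(dims[0]), static_cast<uint32_t>(dims[1]));
}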
2 changes: 1 addition & 1 deletion api/bag_exceptions.h
@@ -441,7 +441,7 @@ struct BAG_API InvalidVRRefinementDimensions final : virtual std::exception
{
const char* what() const noexcept override
{
return "The variable resolution refinement layer is not 1 dimensional.";
return "The variable resolution refinement layer is inconsistent with specification.";
}
};

24 changes: 20 additions & 4 deletions api/bag_georefmetadatalayer.cpp
@@ -104,8 +104,13 @@ std::shared_ptr<GeorefMetadataLayer> GeorefMetadataLayer::create(
keyType != DT_UINT64)
throw InvalidKeyType{};

// The keys array should be the same dimensions as the mandatory elevation layer, so read
// from the file global descriptor, and set.
uint32_t rows = 0, cols = 0;
std::tie<uint32_t, uint32_t>(rows, cols) = dataset.getDescriptor().getDims();
auto pDescriptor = GeorefMetadataLayerDescriptor::create(dataset, name, profile, keyType,
definition, chunkSize, compressionLevel);
definition, rows, cols,
chunkSize, compressionLevel);

// Create the H5 Group to hold keys & values.
const auto& h5file = dataset.getH5file();
@@ -122,7 +127,8 @@
auto h5valueDataSet = GeorefMetadataLayer::createH5valueDataSet(dataset, *pDescriptor);

auto layer = std::make_shared<GeorefMetadataLayer>(dataset,
*pDescriptor, std::move(h5keyDataSet), std::move(h5vrKeyDataSet),
*pDescriptor, std::move(h5keyDataSet),
std::move(h5vrKeyDataSet),
std::move(h5valueDataSet));

layer->setValueTable(std::unique_ptr<ValueTable>(new ValueTable{*layer}));
@@ -150,6 +156,12 @@ std::shared_ptr<GeorefMetadataLayer> GeorefMetadataLayer::open(
new ::H5::DataSet{h5file.openDataSet(internalPath + COMPOUND_KEYS)},
DeleteH5dataSet{});

// The keys array has the dimensions of the layer, so we can read and reset the
// descriptor dimensions, in case they were inconsistent (or not set).
std::array<hsize_t, kRank> dims;
h5keyDataSet->getSpace().getSimpleExtentDims(dims.data(), nullptr);
descriptor.setDims(dims[0], dims[1]);

std::unique_ptr<::H5::DataSet, DeleteH5dataSet> h5vrKeyDataSet{};
if (dataset.getVRMetadata())
h5vrKeyDataSet = std::unique_ptr<::H5::DataSet, DeleteH5dataSet>(
@@ -161,7 +173,9 @@
DeleteH5dataSet{});

auto layer = std::make_shared<GeorefMetadataLayer>(dataset,
descriptor, std::move(h5keyDataSet), std::move(h5vrKeyDataSet),
descriptor,
std::move(h5keyDataSet),
std::move(h5vrKeyDataSet),
std::move(h5valueDataSet));

layer->setValueTable(std::unique_ptr<ValueTable>(new ValueTable{*layer}));
Expand All @@ -188,7 +202,9 @@ GeorefMetadataLayer::createH5keyDataSet(
std::unique_ptr<::H5::DataSet, DeleteH5dataSet> pH5dataSet;

{
// Use the dimensions from the descriptor.
// Use the dimensions from the descriptor. We could do this from the specific
// descriptor for the layer, too, which should mirror the size of the file global
// descriptor used here.
uint32_t dim0 = 0, dim1 = 0;
std::tie(dim0, dim1) = dataset.getDescriptor().getDims();
const std::array<hsize_t, kRank> fileDims{dim0, dim1};
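In create() above, the keys array is sized from the file-global descriptor by unpacking the tuple returned by getDims(). A small sketch of that std::tie unpacking idiom, with a hypothetical getDimsSketch() standing in for the descriptor's getDims():

#include <cstdint>
#include <tuple>

// Hypothetical stand-in for Descriptor::getDims(), returning (rows, cols).
std::tuple<uint32_t, uint32_t> getDimsSketch()
{
    return {512u, 1024u};  // illustrative dimensions
}

int main()
{
    uint32_t rows = 0, cols = 0;
    std::tie(rows, cols) = getDimsSketch();  // unpack into the two locals
    // rows == 512 and cols == 1024 at this point
    return 0;
}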
4 changes: 2 additions & 2 deletions api/bag_georefmetadatalayer.h
@@ -64,8 +64,8 @@ class BAG_API GeorefMetadataLayer final : public Layer
protected:
static std::shared_ptr<GeorefMetadataLayer> create(DataType keyType,
const std::string& name, GeorefMetadataProfile profile, Dataset& dataset,
const RecordDefinition& definition, uint64_t chunkSize,
int compressionLevel);
const RecordDefinition& definition,
uint64_t chunkSize, int compressionLevel);
static std::shared_ptr<GeorefMetadataLayer> open(Dataset& dataset,
GeorefMetadataLayerDescriptor& descriptor);

12 changes: 10 additions & 2 deletions api/bag_georefmetadatalayerdescriptor.cpp
@@ -35,10 +35,11 @@ GeorefMetadataLayerDescriptor::GeorefMetadataLayerDescriptor(
GeorefMetadataProfile profile,
DataType keyType,
RecordDefinition definition,
uint32_t rows, uint32_t cols,
uint64_t chunkSize,
int compressionLevel)
: LayerDescriptor(dataset.getNextId(), GEOREF_METADATA_PATH + name, name,
Georef_Metadata, chunkSize, compressionLevel)
Georef_Metadata, rows, cols, chunkSize, compressionLevel)
, m_pBagDataset(dataset.shared_from_this())
, m_profile(profile)
, m_keyType(keyType)
@@ -72,12 +73,14 @@ std::shared_ptr<GeorefMetadataLayerDescriptor> GeorefMetadataLayerDescriptor::cr
GeorefMetadataProfile profile,
DataType keyType,
RecordDefinition definition,
uint32_t rows, uint32_t cols,
uint64_t chunkSize,
int compressionLevel)
{
return std::shared_ptr<GeorefMetadataLayerDescriptor>(
new GeorefMetadataLayerDescriptor{dataset, name, profile, keyType,
std::move(definition), chunkSize, compressionLevel});
std::move(definition), rows, cols,
chunkSize, compressionLevel});
}

//! Open an existing georeferenced metadata layer descriptor.
@@ -165,8 +168,13 @@ std::shared_ptr<GeorefMetadataLayerDescriptor> GeorefMetadataLayerDescriptor::op
profile = UNKNOWN_METADATA_PROFILE;
}

std::array<hsize_t, 2> dims;
h5dataSet.getSpace().getSimpleExtentDims(dims.data(), nullptr);

return std::shared_ptr<GeorefMetadataLayerDescriptor>(
new GeorefMetadataLayerDescriptor{dataset, name, profile, keyType, definition,
static_cast<const uint32_t>(dims[0]),
static_cast<const uint32_t>(dims[1]),
chunkSize, compressionLevel});
}

16 changes: 9 additions & 7 deletions api/bag_georefmetadatalayerdescriptor.h
@@ -21,12 +21,13 @@ namespace BAG {
class BAG_API GeorefMetadataLayerDescriptor final : public LayerDescriptor
{
public:
static std::shared_ptr<GeorefMetadataLayerDescriptor> create(Dataset& dataset,
const std::string& name, GeorefMetadataProfile profile, DataType keyType,
RecordDefinition definition, uint64_t chunkSize,
int compressionLevel);
static std::shared_ptr<GeorefMetadataLayerDescriptor> open(Dataset& dataset,
const std::string& name);
static std::shared_ptr<GeorefMetadataLayerDescriptor>
create(Dataset& dataset,
const std::string& name, GeorefMetadataProfile profile, DataType keyType,
RecordDefinition definition, uint32_t rows, uint32_t cols,
uint64_t chunkSize, int compressionLevel);
static std::shared_ptr<GeorefMetadataLayerDescriptor>
open(Dataset& dataset, const std::string& name);

GeorefMetadataLayerDescriptor(const GeorefMetadataLayerDescriptor&) = delete;
GeorefMetadataLayerDescriptor(GeorefMetadataLayerDescriptor&&) = delete;
@@ -52,7 +53,8 @@ class BAG_API GeorefMetadataLayerDescriptor final : public LayerDescriptor

protected:
GeorefMetadataLayerDescriptor(Dataset& dataset, const std::string& name, GeorefMetadataProfile profile,
DataType keyType, RecordDefinition definition, uint64_t chunkSize,
DataType keyType, RecordDefinition definition,
uint32_t rows, uint32_t cols, uint64_t chunkSize,
int compressionLevel);

private:
4 changes: 4 additions & 0 deletions api/bag_interleavedlegacylayer.cpp
@@ -51,6 +51,10 @@ std::shared_ptr<InterleavedLegacyLayer> InterleavedLegacyLayer::open(
descriptor.setMinMax(std::get<1>(possibleMinMax),
std::get<2>(possibleMinMax));

std::array<hsize_t, 2> dims;
h5dataSet->getSpace().getSimpleExtentDims(dims.data(), nullptr);
descriptor.setDims(dims[0], dims[1]);

return std::make_shared<InterleavedLegacyLayer>(dataset,
descriptor, std::move(h5dataSet));
}