Optimize for case where size of append dimension is one #35

Description

@forman

We can possibly avoid rolling back chunk files if the size of the target dataset's append dimension is one.

In this case, the override

def __setitem__(self, key: str, value: bytes):
    old_value = self._store.get(key)
    self._store[key] = value
    if old_value is not None:
        self._rollback_cb("replace_file", key, old_value)
    else:
        self._rollback_cb("delete_file", key, None)

would simply fall back to super().__setitem__(key, value), because the chunk addressed by key obviously cannot exist yet.
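A rough sketch of how that shortcut could look, not taken from the actual code base: _append_dim_is_size_one is a hypothetical flag telling the store that the target's append dimension has size one, and whether a "delete_file" rollback record is still required on that path is left open here, as in the description above.

def __setitem__(self, key: str, value: bytes):
    if self._append_dim_is_size_one:
        # The chunk addressed by `key` cannot exist yet, so skip the
        # existence check (`self._store.get(key)`) and its filesystem cost.
        super().__setitem__(key, value)
        return
    old_value = self._store.get(key)
    self._store[key] = value
    if old_value is not None:
        self._rollback_cb("replace_file", key, old_value)
    else:
        self._rollback_cb("delete_file", key, None)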

A first step should be to verify that the statement old_value = self._store.get(key) actually performs a filesystem operation, e.g., a stat() call for a local POSIX filesystem. If it does not, this optimization is not applicable.
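One way to run that check, sketched here under the assumption of a zarr 2.x DirectoryStore on a local filesystem (the store class, path, and chunk key are placeholders, not taken from this issue), is to count low-level calls while invoking .get():

# Count filesystem-level calls triggered by `store.get(key)`.
# Assumes zarr 2.x; DirectoryStore stands in for the wrapped local store,
# and "0.0" is a placeholder chunk key.
import os
from unittest import mock

import zarr

store = zarr.DirectoryStore("target.zarr")

calls = {"stat": 0, "open": 0}
real_stat, real_open = os.stat, open

def counting_stat(*args, **kwargs):
    calls["stat"] += 1
    return real_stat(*args, **kwargs)

def counting_open(*args, **kwargs):
    calls["open"] += 1
    return real_open(*args, **kwargs)

with mock.patch("os.stat", counting_stat), mock.patch("builtins.open", counting_open):
    store.get("0.0")

# Non-zero counts confirm that .get() really touches the filesystem,
# which is exactly the cost the proposed shortcut would avoid.
print(calls)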

Labels: enhancement (New feature or request)