Merged
28 changes: 13 additions & 15 deletions docs/usage/snapshots.md
@@ -82,15 +82,15 @@ The comment is optional but useful for identifying the snapshot later.
=== "CLI"

```bash
-aimbat snapshot list <ID>         # for a specific event
-aimbat snapshot list --all-events # across all events
+aimbat snapshot list <ID>  # for a specific event
+aimbat snapshot list all   # across all events
```

=== "Shell"

```bash
-snapshot list       # uses the current event context
-snapshot list --all-events
+snapshot list <ID>  # uses the current event context
+snapshot list all   # across all events
```

The table shows the snapshot ID, date and time, comment, and number of
@@ -128,7 +128,7 @@ Before rolling back, it can be useful to see what a snapshot contains.
snapshot preview --matrix <SNAPSHOT_ID>
```

-`details` shows the event-level parameters (window, filter, min_ccnorm) as
+`details` shows the event-level parameters (window, filter, min_cc) as
they were when the snapshot was taken. `preview` builds the ICCS stack from
the snapshot's parameters and displays it — without modifying anything in
the database.
@@ -224,15 +224,13 @@ For archiving or scripting purposes, snapshot data can be exported to JSON:
=== "CLI"

```bash
-aimbat snapshot dump <ID>         # specific event
-aimbat snapshot dump --all-events # all events
+aimbat snapshot dump
```

=== "Shell"

```bash
-snapshot dump # uses the current event context
-snapshot dump --all-events
+snapshot dump
```

The output is a JSON object with five keys, all cross-referenced by
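Since the dump is plain JSON, it can be post-processed with the standard library alone. The sketch below is illustrative only: the helper name is invented, and it deliberately makes no assumption about what the dump's keys are called.

```python
import json


def summarize_dump(path: str) -> dict[str, object]:
    """Summarize a snapshot dump file: for each top-level key, report the
    number of records (for lists) or the value's type name otherwise."""
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    return {
        key: len(value) if isinstance(value, list) else type(value).__name__
        for key, value in data.items()
    }
```

Running this over a dump gives a quick per-key record count without hard-coding the schema.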
@@ -317,17 +315,17 @@ without opening individual snapshot records:
=== "CLI"

```bash
-aimbat snapshot quality list <ID>         # for a specific event
-aimbat snapshot quality list --all-events # across all events
-aimbat snapshot quality dump              # raw JSON export
+aimbat snapshot quality list <ID>  # for a specific event
+aimbat snapshot quality list all   # across all events
+aimbat snapshot quality dump       # raw JSON export
```

=== "Shell"

```bash
-snapshot quality list
-snapshot quality list --all-events
-snapshot quality dump
+snapshot quality list <ID>  # for a specific event
+snapshot quality list all   # across all events
+snapshot quality dump       # raw JSON export
```

The table shows per-snapshot aggregated ICCS correlation coefficients and, where
9 changes: 8 additions & 1 deletion src/aimbat/_cli/common/_parameters.py
@@ -232,7 +232,14 @@ def open_in_editor(initial_content: str) -> str:
tmp_path = tmp.name

try:
-        subprocess.run([*shlex.split(editor), tmp_path], check=False)
+        result = subprocess.run([*shlex.split(editor), tmp_path], check=False)
+        if result.returncode != 0:
+            from aimbat.logger import logger
+
+            logger.warning(
+                f"Editor '{editor}' exited with code {result.returncode}; discarding changes."
+            )
Review comment by Copilot AI (Mar 27, 2026):

> `open_in_editor()` logs that it is "discarding changes" when the editor exits non-zero, but it still reads and returns the temporary file content. This means changes are not actually discarded (and the warning is misleading). If the intent is to discard, return `initial_content` (or skip reading the file) when `returncode != 0`; otherwise adjust the warning text to reflect that the file is still read.
>
> Suggested change:
>
>             )
>     +       return initial_content
with open(tmp_path, encoding="utf-8") as f:
return f.read()
finally:
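The behaviour the review asks for (actually discarding edits when the editor exits non-zero) can be shown as a self-contained helper. This is a hypothetical sketch with an invented name, not AIMBAT's actual `open_in_editor` implementation.

```python
import shlex
import subprocess
import tempfile
from pathlib import Path


def open_in_editor_sketch(initial_content: str, editor: str = "vi") -> str:
    """Let the user edit `initial_content` in an external editor.

    If the editor exits non-zero, the edits are genuinely discarded and
    the original content is returned unchanged.
    """
    with tempfile.NamedTemporaryFile(
        "w", suffix=".txt", delete=False, encoding="utf-8"
    ) as tmp:
        tmp.write(initial_content)
        tmp_path = tmp.name
    try:
        result = subprocess.run([*shlex.split(editor), tmp_path], check=False)
        if result.returncode != 0:
            # Non-zero exit: skip reading the file, as the review suggests.
            return initial_content
        with open(tmp_path, encoding="utf-8") as f:
            return f.read()
    finally:
        Path(tmp_path).unlink(missing_ok=True)
```

On a POSIX system, `false` can stand in for a failing editor: `open_in_editor_sketch("abc", editor="false")` returns `"abc"` untouched.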
12 changes: 11 additions & 1 deletion src/aimbat/_cli/snapshot.py
@@ -125,7 +125,17 @@ def cli_snapshot_rollback(
*,
_: DebugParameter = DebugParameter(),
) -> None:
"""Rollback to snapshot."""
"""Restore saved parameters from a snapshot as the current live values.

Overwrites the current event and per-seismogram parameters for the event
with those recorded in the snapshot. Any ICCS runs or parameter changes
made after the snapshot was taken are undone. The snapshot itself is not
deleted — you can roll back to it again.

If the snapshot has MCCC quality data, the live quality metrics are also
restored from the best matching snapshot (same parameter hash, most recent
MCCC run).
"""
from sqlmodel import Session

from aimbat.core import rollback_to_snapshot
Expand Down
2 changes: 1 addition & 1 deletion src/aimbat/_cli/tool.py
@@ -136,7 +136,7 @@ def cli_pick_min_cc(
) -> None:
"""Interactively pick a new minimum cross-correlation for auto-selection.

-    Opens an interactive plot; click to set the cc threshold. Seismograms
+    Opens an interactive plot; scroll to set the cc threshold. Seismograms
whose cross-correlation with the stack falls below this value will be
automatically de-selected when running ICCS with `--autoselect`.
"""
10 changes: 6 additions & 4 deletions src/aimbat/_tui/help/tab-project.md
@@ -16,10 +16,12 @@ ICCS status:

- **● ICCS ready** — the event's seismograms are loaded in memory and
alignment can run. This is the normal working state.
-- **○ no ICCS** — the ICCS instance could not be built. Usually this means
-  a parameter combination is invalid (e.g. the time window is too wide)
-  or a waveform file is missing. Fix the problem and the status updates
-  automatically.
+- **○ no ICCS** — ICCS is built automatically in the background when you
+  select an event. If this status persists, the ICCS instance could not be
+  built — usually because a parameter combination is invalid or a waveform
+  file is missing. Press `p` to check the event parameters; the most common
+  cause is a time window longer than the available waveform data. Fix the
+  problem and the status updates automatically.

### Events table (top)

8 changes: 4 additions & 4 deletions src/aimbat/_tui/help/tab-seismograms.md
@@ -33,10 +33,10 @@ algorithm. It cross-correlates each selected seismogram against the current
stack waveform, adjusts the picks, rebuilds the stack, and repeats until
convergence. Only seismograms with `Select = ✓` contribute to the stack.

-The Stack CC column updates live as soon as the event is loaded — it shows
-how well each seismogram matches the current stack, even before you run
-alignment. After running ICCS (`a`), the picks (Δt) and Stack CC values are
-updated and written to the database immediately.
+The Stack CC column is recalculated each time the event is loaded or after
+any parameter change — it shows how well each seismogram matches the current
+stack, even before you run alignment. After running ICCS (`a`), the picks
+(Δt) and Stack CC values are updated and written to the database immediately.

### Seismogram plot (right panel)

4 changes: 2 additions & 2 deletions src/aimbat/_tui/help/tab-snapshots.md
@@ -35,8 +35,8 @@ needed. Each snapshot has its own note, which persists in the database.

## What a snapshot captures

-- **Event parameters** — time window (`t0`/`t1` window bounds), bandpass
-  filter settings, and Min CC threshold
+- **Event parameters** — time window (pre- and post-pick window lengths),
+  bandpass filter settings, and Min CC threshold
- **Per-seismogram parameters** — the `t1` pick, `select` flag, and `flip`
flag for every seismogram
- **Quality metrics** — ICCS correlation coefficients per seismogram (always captured);
17 changes: 14 additions & 3 deletions src/aimbat/core/_data.py
@@ -81,6 +81,15 @@ def _create_event(
logger.debug(
f"Using existing event {aimbat_event.time} instead of adding new one."
)
+        if (
+            new_aimbat_event.latitude != aimbat_event.latitude
+            or new_aimbat_event.longitude != aimbat_event.longitude
+            or new_aimbat_event.depth != aimbat_event.depth
+        ):
+            logger.warning(
+                f"Event at {aimbat_event.time} matched by time but has different "
+                f"location metadata in {datasource}. The existing record will be used."
+            )
return aimbat_event


@@ -136,7 +145,9 @@ def _process_datasource(
# Resolve event — use the provided UUID, extract from the source, or skip
if event_id is not None:
aimbat_event: AimbatEvent | None = session.get(AimbatEvent, event_id)
-        logger.debug(f"Using event {aimbat_event.time} (ID={event_id}).")  # type: ignore[union-attr]
+        if aimbat_event is None:
+            raise ValueError(f"No event found with ID={event_id}.")
+        logger.debug(f"Using event {aimbat_event.time} (ID={event_id}).")
elif supports_event_creation(datatype):
aimbat_event = _create_event(session, datasource, datatype)
else:
@@ -148,12 +159,12 @@

# Seismogram creation requires both a station and an event to link to
if aimbat_station is None:
-        raise NotImplementedError(
+        raise ValueError(
f"{datatype} does not support station creation. "
"Provide a station UUID via --use-station."
)
if aimbat_event is None:
-        raise NotImplementedError(
+        raise ValueError(
f"{datatype} does not support event creation. "
"Provide an event UUID via --use-event."
)
10 changes: 6 additions & 4 deletions src/aimbat/core/_event.py
@@ -8,7 +8,7 @@
from pydantic import TypeAdapter
from sqlalchemy.exc import NoResultFound
from sqlalchemy.orm import selectinload
-from sqlmodel import Session, select
+from sqlmodel import Session, col, select

from aimbat._types import EventParameter
from aimbat.logger import logger
@@ -51,7 +51,7 @@ def resolve_event(session: Session, event_id: UUID | None = None) -> AimbatEvent
NoResultFound: If no event_id is given.
"""
if event_id:
-        logger.debug(f"Resolving event by explicit ID: {event_id}")
+        logger.debug(f"Resolving event by explicit ID: {event_id}.")
event = session.get(AimbatEvent, event_id)
if event is None:
raise NoResultFound(f"No AimbatEvent found with id: {event_id}.")
@@ -108,7 +108,7 @@ def get_completed_events(session: Session) -> Sequence[AimbatEvent]:
statement = (
select(AimbatEvent)
.join(AimbatEventParameters)
-        .where(AimbatEventParameters.completed == 1)
+        .where(col(AimbatEventParameters.completed).is_(True))
)

return session.exec(statement).all()
@@ -157,7 +157,9 @@ def get_event_quality(session: Session, event_id: UUID) -> SeismogramQualityStat
event_id: UUID of the event.

Returns:
-        Aggregated seismogram quality statistics.
+        Aggregated seismogram quality statistics. The `mccc_rmse` field is
+        taken from the event-level quality record rather than the per-seismogram
+        records, and is `None` if MCCC has not been run.

Raises:
NoResultFound: If no event with the given ID is found.
14 changes: 11 additions & 3 deletions src/aimbat/core/_iccs.py
@@ -209,7 +209,9 @@ def _write_mccc_quality(

Upserts the event-level RMSE, clears MCCC fields for all seismograms in
the ICCS instance, then writes the per-seismogram metrics for the seismograms
-    that were actually used in the inversion.
+    that were actually used in the inversion. The `iccs_cc` field is preserved
+    when an existing quality row is found; seismograms with no prior quality row
+    will have `iccs_cc = NULL` until ICCS stats are written separately.

Uses its own short-lived session.

@@ -343,6 +345,7 @@ def build_iccs_from_snapshot(session: Session, snapshot_id: UUID) -> BoundICCS:
.selectinload(rel(AimbatEvent.seismograms))
.selectinload(rel(AimbatSeismogram.parameters)),
selectinload(rel(AimbatSnapshot.event_parameters_snapshot)),
+selectinload(rel(AimbatSnapshot.seismogram_parameters_snapshots)),
)
)
snapshot = session.exec(statement).one_or_none()
@@ -420,6 +423,9 @@ def validate_iccs_construction(
def _write_back_seismograms(session: Session, iccs: ICCS) -> None:
"""Write t1, flip, and select from ICCS seismograms back to the database.

+    Calls `session.commit()` after writing; any other pending changes on
+    `session` are also committed.

Args:
session: Database session.
iccs: ICCS instance whose seismograms carry UUIDs in their extra dict.
@@ -484,7 +490,7 @@
IccsResult from the algorithm run.
"""

-    logger.info(f"Running ICCS with {autoflip=}, {autoselect=}.")
+    logger.info(f"Running ICCS (autoflip={autoflip}, autoselect={autoselect}).")

result = iccs(autoflip=autoflip, autoselect=autoselect)
n_iter = len(result.convergence)
@@ -510,7 +516,9 @@
McccResult from the algorithm run.
"""

-    logger.info(f"Running MCCC for event {event.id} with {all_seismograms=}.")
+    logger.info(
+        f"Running MCCC for event {event.id} (all_seismograms={all_seismograms})."
+    )

result = iccs.run_mccc(
all_seismograms=all_seismograms,
37 changes: 21 additions & 16 deletions src/aimbat/core/_project.py
@@ -45,7 +45,7 @@ def create_project(engine: Engine) -> None:
# Import locally to ensure SQLModel registers all table metadata before create_all()
import aimbat.models # noqa: F401

-    logger.info(f"Creating new project in {engine.url}")
+    logger.info(f"Creating new project in {engine.url}.")

if _project_exists(engine):
raise RuntimeError(
@@ -64,7 +64,7 @@
CREATE TRIGGER IF NOT EXISTS event_modified_on_params_update
AFTER UPDATE ON aimbateventparameters
BEGIN
-        UPDATE aimbatevent SET last_modified = datetime('now')
+        UPDATE aimbatevent SET last_modified = strftime('%Y-%m-%d %H:%M:%f', 'now')
WHERE id = NEW.event_id;
END;
""")
@@ -310,18 +310,23 @@ def delete_project(engine: Engine) -> None:
RuntimeError: If unable to delete project.
"""

-    logger.info(f"Deleting project in {engine=}.")
+    logger.info(f"Deleting project at {engine.url}.")

-    if _project_exists(engine):
-        if engine.driver == "pysqlite":
-            database = engine.url.database
-            engine.dispose()
-            if database == ":memory:":
-                logger.info("Running database in memory, nothing to delete.")
-                return
-            elif database:
-                project_path = Path(database)
-                logger.info(f"Deleting project file: {project_path=}")
-                project_path.unlink()
-                return
-    raise RuntimeError("Unable to find/delete project.")
+    if not _project_exists(engine):
+        raise RuntimeError("No project found to delete.")
+
+    if engine.driver == "pysqlite":
+        database = engine.url.database
+        engine.dispose()
+        if database == ":memory:":
+            logger.info("Running database in memory, nothing to delete.")
+            return
+        elif database:
+            project_path = Path(database)
+            logger.info(f"Deleting project file: {project_path}.")
+            project_path.unlink()
+            return
+
+    raise RuntimeError(
+        f"Unable to delete project: unsupported engine driver '{engine.driver}'."
+    )