23 changes: 20 additions & 3 deletions docs/contact.md
Original file line number Diff line number Diff line change
@@ -1,5 +1,22 @@
# Contact Us
# About Us

Masato Onodera (Subaru Telescope, National Astronomical Observatory of Japan)
## Contact information

:material-email-outline: <monodera@naoj.org>
Please contact the PFS Operation Helpdesk at Subaru Telescope if you have questions, comments, or suggestions related to the target database.

:material-email-outline: <script language="JavaScript"><!--

var name1 = "pfs";
var name2 = "obs";
var name3 = "help";
var domain = "naoj.org";
document.write('<a href=\"mailto:' + name1 + '-' + name2 + '-' + name3 + '@' + domain + '\">');
document.write(name1 + '-' + name2 + '-' + name3 + '@' + domain + '</a>');
// --></script>

## Privacy policy

Please refer to the web pages listed below.

- [Subaru Telescope Privacy Statement / Access Logging](https://www.naoj.hawaii.edu/privacy/)

- [NAOJ Privacy Policy](https://www.nao.ac.jp/en/terms/privacy.html)
77 changes: 50 additions & 27 deletions docs/examples/obsproc.md
@@ -23,20 +23,29 @@
[schemacrawler]
SCHEMACRAWLERDIR = "<path to SchemaCrawler directory>"

# The following parameters for the uploader will be used to rsync as follows.
# $ rsync -avz -e ssh user@host:data_dir/????/??/????????-??????-{upload_id}
# user can be omitted or blank ("") if the user name is the same as the local user name or an alias is defined in ~/.ssh/config.
[uploader]
host = "<hostname of uploader>"
user = "<user name of uploader>"
data_dir = "<path to the data directory on the uploader>"
# (Optional) rsync download is obsolete. Use 'pfs-targetdb-cli transfer-targets-api' instead.
# # The following parameters for the uploader will be used to rsync as follows.
# # $ rsync -avz -e ssh user@host:data_dir/????/??/????????-??????-{upload_id}
# # user can be omitted or blank ("") if the user name is the same as the local user name or an alias is defined in ~/.ssh/config.
# [uploader]
# host = "<hostname of uploader>"
# user = "<user name of uploader>"
# data_dir = "<path to the data directory on the uploader>"


# Optional section for Web API access
# The following parameters are used to download data via Web API instead of rsync.
[webapi]
url = "<base URL of the Web API endpoint>" # e.g., "https://example.com/get-upload/"
api_key = "<API key for authentication>" # Optional: leave empty ("") for no authentication
verify_ssl = false # Optional: set to false to disable SSL certificate verification
```

## Working with filter names, proposal categories, and target types

### Insert to the `filter_name`, `proposal_category`, `pfs_arm`, and `target_type` tables
### Insert to the `filter_name`, `partner`, `proposal_category`, `pfs_arm`, and `target_type` tables

`filter_name`, `proposal_category`, `pfs_arm`, and `target_type` tables are expected to be very static and not frequently updated.
`filter_name`, `partner`, `proposal_category`, `pfs_arm`, and `target_type` tables are expected to be very static and not frequently updated.
Note that you will most likely skip this step, as these tables are already populated in the database.

The contents of CSV files to be inserted for these tables are as follows:
@@ -65,11 +74,6 @@
z_sdss,SDSS z filter
```

```csv title="proposal_categories.csv"
proposal_category_id,proposal_category_name,proposal_category_description
1,openuse,Subaru openuse proposal
```

```csv title="partner.csv"
partner_id,partner_name,partner_description
1,subaru,Subaru Telescope
@@ -78,6 +82,11 @@
4,uh,University of Hawaii
```

```csv title="proposal_categories.csv"
proposal_category_id,proposal_category_name,proposal_category_description
1,openuse,Subaru openuse proposal
```

```csv title="pfs_arm.csv"
name,description
b,"blue"
@@ -98,6 +107,8 @@
8,DCB,fiber goes to DCB/DCB2
9,HOME,cobra is going to home position
10,BLACKSPOT,cobra is going to black spot position
11,AFL,"The fiber is fed by all fiber lamp cable"
12,SCIENCE_MASKED,"The fiber is on a science target redacted for privacy"
```
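Before inserting, it can be worth sanity-checking such a CSV for duplicate primary keys and empty names. A small standard-library sketch (not part of the `pfs-targetdb-cli` tool) using an excerpt of the rows above:

```python
import csv
import io

# Excerpt of the target_types.csv content shown above.
TARGET_TYPES_CSV = """\
target_type_id,target_type_name,target_type_description
9,HOME,cobra is going to home position
10,BLACKSPOT,cobra is going to black spot position
11,AFL,The fiber is fed by all fiber lamp cable
12,SCIENCE_MASKED,The fiber is on a science target redacted for privacy
"""

def check_target_types(text: str) -> list[dict]:
    """Parse the CSV and verify ids are unique and names are non-empty."""
    rows = list(csv.DictReader(io.StringIO(text)))
    ids = [int(r["target_type_id"]) for r in rows]
    assert len(ids) == len(set(ids)), "duplicate target_type_id"
    assert all(r["target_type_name"] for r in rows), "empty target_type_name"
    return rows

rows = check_target_types(TARGET_TYPES_CSV)
```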

You can insert these data into the database using the following commands:
@@ -168,33 +179,42 @@

## Working with target lists

### Parse an allocation summary file
### (Obsolete) Parse an allocation summary file

!!! warning

Parsing an Excel file is obsolete. The output CSV files are now generated directly from the spreadsheet.

See the formats of `proposal.csv` and `input_catalogs.csv` in the following sections.

Suppose you have an Excel file named `pfs_allocation_summary.xlsx` with 2 sheets, `Proposals` and `Allocation`, as shown below.

**`Proposals` Sheet**:

| proposal_id | input_catalog_name | input_catalog_description | group_id | pi_first_name | pi_last_name | pi_middle_name | proposal_category_name | upload_id | n_obj | fiberhour_total | fiberhour_lr | fiberhour_mr | rot_total | rot_lr | rot_mr |
| ----------- | ------------------ | ------------------------- | -------- | ------------- | ------------ | -------------- | ---------------------- | ---------------- | ----- | --------------- | ------------ | ------------ | --------- | ------ | ------ |
| S99A-QT001 | pfs_example_1 | Example target list 1 | o99101 | Eiichi | Shibusawa | | openuse | d6e94eae259faf4e | 1572 | 379.5 | 379.5 | | 5.2 | 5.2 | |
| S99A-QT002 | pfs_example_2 | Example target list 2 | o99102 | Umeko | Tsuda | | openuse | 5f695375c60f34c7 | 9712 | 17504 | 17504 | | 15.83 | 15.83 | |
| S99A-QT003 | pfs_example_3 | Example target list 3 | o99103 | Shibasaburo | Kitasato | | openuse | ba59115da8084653 | 2047 | 395.25 | 395.25 | | 12.7 | 12.7 | |
| S99A-QT001 | pfs_example_1 | Example target list 1 | o99101 | Eiichi | Shibusawa | | openuse | d6e94eae259faf4e | 1572 | 379.5 | 379.5 | 0.0 | 5.2 | 5.2 | 0.0 |
| S99A-QT002 | pfs_example_2 | Example target list 2 | o99102 | Umeko | Tsuda | | openuse | 5f695375c60f34c7 | 9712 | 17504 | 17504 | 0.0 | 15.83 | 15.83 | 0.0 |
| S99A-QT003 | pfs_example_3 | Example target list 3 | o99103 | Shibasaburo | Kitasato | | openuse | ba59115da8084653 | 2047 | 395.25 | 395.25 | 0.0 | 12.7 | 12.7 | 0.0 |

**`Allocation` Sheet**:

| proposal_id | grade | rank | allocated_rot_total | allocated_rot_lr | allocated_rot_mr | allocated_time_total | allocated_time_lr | allocated_time_mr | n_ppc | allocation_rate_lr | allocation_rate_mr | completion_rate_lr | completion_rate_mr |
| ----------- | ----- | ---- | ------------------- | ---------------- | ---------------- | -------------------- | ----------------- | ----------------- | ----- | ------------------ | ------------------ | ------------------ | ------------------ |
| S99A-QT001 | A | 9 | 2.8 | | 2.8 | 284.25 | 0 | 284.25 | 9 | 0.749011858 | | 0.723 | |
| S99A-QT002 | B | 6.5 | 6.5 | 6.5 | | 8140.5 | 8140.5 | 0 | 21 | 0.465065128 | | 0.279 | |
| S99A-QT003 | B | 6 | 9.6 | 9.6 | | 350.25 | 350.25 | 0 | 31 | 0.886148008 | | 0.684 | |
| S99A-QT001 | A | 9 | 2.8 | 0.0 | 2.8 | 284.25 | 0 | 284.25 | 9 | 0.749011858 | | 0.723 | |
| S99A-QT002 | B | 6.5 | 6.5 | 6.5 | 0.0 | 8140.5 | 8140.5 | 0 | 21 | 0.465065128 | | 0.279 | |
| S99A-QT003 | B | 6 | 9.6 | 9.6 | 0.0 | 350.25 | 350.25 | 0 | 31 | 0.886148008 | | 0.684 | |
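Conceptually, parsing the allocation summary joins the two sheets on `proposal_id`. A simplified, standard-library-only sketch of that join (field names follow the tables above; the actual `parse-alloc` command handles many more columns and edge cases):

```python
# Toy rows standing in for the Proposals and Allocation sheets above.
proposals = [
    {"proposal_id": "S99A-QT001", "pi_last_name": "Shibusawa"},
    {"proposal_id": "S99A-QT002", "pi_last_name": "Tsuda"},
]
allocations = [
    {"proposal_id": "S99A-QT001", "grade": "A", "rank": 9},
    {"proposal_id": "S99A-QT002", "grade": "B", "rank": 6.5},
]

def join_on_proposal_id(props: list[dict], allocs: list[dict]) -> list[dict]:
    """Merge each proposal row with its allocation row, matched on proposal_id."""
    by_id = {a["proposal_id"]: a for a in allocs}
    return [
        {**p, **by_id[p["proposal_id"]]}
        for p in props
        if p["proposal_id"] in by_id
    ]

merged = join_on_proposal_id(proposals, allocations)
```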

Then, execute the following command to parse the Excel file and generate CSV files that can be used to insert data into the database:

```console
$ pfs-targetdb-cli parse-alloc pfs_allocation_summary.xlsx
```

The command will generate the following CSV files in the current directory.
### Generate CSV files for `proposal` and `input_catalog` tables

The command above will generate the following CSV files in the current directory.
It is also possible to directly create these CSV files from the time allocation information without using the Excel file.
Note that the order of the columns can be different from the examples below, but the content should be the same.

```csv title="proposal.csv"
proposal_id,group_id,pi_first_name,pi_last_name,pi_middle_name,rank,grade,allocated_time_total,allocated_time_lr,allocated_time_mr,proposal_category_name,is_too
@@ -222,9 +242,10 @@
### Transfer target lists from the uploader to local storage

You need to transfer the target lists from the uploader to the local storage.
The following command uses the Web API to download the target lists based on the `upload_id` values specified in the `input_catalogs.csv` file. You must be on a network that can access the uploader's Web API.

```console
$ pfs-targetdb-cli transfer-targets input_catalogs.csv -c db_config.toml
$ pfs-targetdb-cli transfer-targets-api input_catalogs.csv -c db_config.toml

```
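Under the hood, a Web-API transfer amounts to one authenticated request per `upload_id`. The sketch below only composes the request URL and headers; the endpoint shape and the bearer-token header are assumptions for illustration, not the documented API:

```python
from urllib.parse import urljoin

def build_download_request(base_url: str, upload_id: str, api_key: str = ""):
    """Compose the (assumed) per-upload download URL and auth headers."""
    # Ensure a trailing slash so urljoin appends rather than replaces.
    base = base_url if base_url.endswith("/") else base_url + "/"
    url = urljoin(base, upload_id)
    # Assumed auth scheme; an empty api_key means no authentication.
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    return url, headers

url, headers = build_download_request(
    "https://example.com/get-upload/", "d6e94eae259faf4e"
)
```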

At the end of the command, a summary will be shown as follows.
@@ -269,9 +290,11 @@

When there is a user-defined pointing list, it can be inserted into the `user_pointing` table in the `targetdb` database.

```bash
pfs-targetdb-cli insert -c dbconf.toml -t user_pointing pointing_list.ecsv \
--commit --upload_id "aabbccddeeffgghh"
```

Currently, you need to insert the custom pointing list one by one. We are planning to support batch insertion in the future.

```console
$ pfs-targetdb-cli insert-pointings ./input_catalogs.csv -c db_config.toml --commit

# or insert manually
$ pfs-targetdb-cli insert -c dbconf.toml -t user_pointing pointing_list.ecsv \
--commit --upload_id "aabbccddeeffgghh"
```
61 changes: 31 additions & 30 deletions docs/getting_started.md
@@ -2,7 +2,6 @@

This guide will help you get started with the PFS Target Database.


## Prerequisites

### PostgreSQL database
@@ -19,36 +18,32 @@
The Q3C extension is required for the database. You can install it by the following command:

Python and the following packages as well as their dependencies will be required for `targetdb`.
The dependencies are automatically installed when you install the `targetdb` package via `pip`.
Package versions shown here are those used for the development (as of April 2025).
Package versions shown here are those used for the development (as of January 2026).
Newer (and somewhat older) versions should also work.

| Package | Version |
|------------------------------------------------------------------------|--------:|
| [Python](https://www.python.org/) | 3.11.x |
| [SQLAlchemy](https://www.sqlalchemy.org/) | 2.0.x |
| [pandas](https://pandas.pydata.org/) | 2.2.3 |
| [NumPy](https://numpy.org) | 1.26.4 |
| [Astropy](https://www.astropy.org/) | 7.0.1 |
| [SQLAlchemy-Utils](https://sqlalchemy-utils.readthedocs.io/en/latest/) | 0.41.2 |
| [tabulate](https://pypi.org/project/tabulate/) | 0.9.0 |
| [alembic](https://alembic.sqlalchemy.org/en/latest/) | 1.13.1 |
| [pyarrow](https://arrow.apache.org/docs/python/) | 15.0.2 |
| [Typer](https://typer.tiangolo.com/) | 0.15.2 |
| [openpyxl](https://openpyxl.readthedocs.io/en/stable/) | 3.1.2 |

If you are using Python 3.10 or earlier, you may need to install the [tomli](https://github.com/hukkin/tomli) package.

| Package | Version |
|------------------------------------------|--------:|
| [tomli](https://github.com/hukkin/tomli) | 2.0.1 |

| Package | Version |
| ---------------------------------------------------------------------- | ------: |
| [Python](https://www.python.org/) | 3.12.12 |
| [alembic](https://alembic.sqlalchemy.org/en/latest/) | 1.18.0 |
| [Astropy](https://www.astropy.org/) | 7.20 |
| [loguru](https://loguru.readthedocs.io/) | 0.7.3 |
| [NumPy](https://numpy.org) | 2.4.0 |
| [openpyxl](https://openpyxl.readthedocs.io/en/stable/) | 3.1.5 |
| [pandas](https://pandas.pydata.org/) | 2.3.3 |
| [psycopg2-binary](https://www.psycopg.org/) | 2.9.11 |
| [pyarrow](https://arrow.apache.org/docs/python/) | 22.0.0 |
| [requests](https://requests.readthedocs.io/en/latest/) | 2.32.5 |
| [SQLAlchemy](https://www.sqlalchemy.org/) | 2.0.45 |
| [SQLAlchemy-Utils](https://sqlalchemy-utils.readthedocs.io/en/latest/) | 0.42.1 |
| [tabulate](https://pypi.org/project/tabulate/) | 0.9.0 |
| [Typer](https://typer.tiangolo.com/) | 0.21.1 |

For building the documentation, the following packages are required.

| Package | Version |
| --------------------------------------------------------------- | ------: |
| [MkDocs](https://www.mkdocs.org/) | 1.6.1 |
| [mkdocs-material](https://squidfunk.github.io/mkdocs-material/) | 9.6.11 |
| [mkdocs-material](https://squidfunk.github.io/mkdocs-material/) | 9.7.1 |

Additionally, the following tools may be useful for testing and development.

Expand All @@ -69,18 +64,13 @@ git clone https://github.com/Subaru-PFS/ets_target_database.git
# move to the directory
cd ets_target_database

# (optional but recommended) create a virtual environment and activate it
# Install with uv
uv sync

# (optional) create a virtual environment and activate it
# python3 -m venv .venv
# source .venv/bin/activate

# install the package
python3 -m pip install .

# You can also install the package in the editable mode
# python3 -m pip install -e .

# refresh the shell for command-line tools
hash -r
```

## Quick Start
@@ -115,6 +105,11 @@
pfs-targetdb-cli install-q3c -c dbconf.toml

# create tables in the database
pfs-targetdb-cli create-schema -c dbconf.toml

# when installed via uv, the command can be run as
# uv run pfs-targetdb-cli create-db -c dbconf.toml
# uv run pfs-targetdb-cli install-q3c -c dbconf.toml
# uv run pfs-targetdb-cli create-schema -c dbconf.toml
```

### Generate an ER diagram
@@ -158,8 +153,14 @@
The documentation can be built by the following command:
# Install the required packages for building the documentation
python3 -m pip install -e ".[doc]"

# or via uv
# uv sync --extra doc

# Build the documentation with MkDocs
mkdocs build

# or via uv
# uv run mkdocs build
```

The documentation will be generated in the `site` directory.
6 changes: 4 additions & 2 deletions docs/index.md
Original file line number Diff line number Diff line change
@@ -4,7 +4,7 @@
Welcome to the documentation for the PFS Target Database. This database is desig

## Overview

The PFS Target Database stores information about targets for science observations as well as calibration objects. The database also contains information about observation proposals and the filters used for quality assurance.

The database is implemented in PostgreSQL and built with Python/SQLAlchemy. It also includes a set of command-line tools intended to be useful for database management and science operations.

@@ -15,7 +15,9 @@
We hope this documentation will be a valuable resource as you work with the PFS
This package is distributed under the MIT License.

```
Copyright (c) 2021 Masato Onodera
MIT License

Copyright (c) 2023-2026 Subaru PFS Project

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
4 changes: 3 additions & 1 deletion examples/data/target_types.csv
Original file line number Diff line number Diff line change
@@ -8,4 +8,6 @@
target_type_id,target_type_name,target_type_description
7,SUNSS_DIFFUSE,the fiber goes to the SuNSS diffuse leg
8,DCB,fiber goes to DCB/DCB2
9,HOME,cobra is going to home position
10,BLACKSPOT,cobra is going to black spot position
11,AFL,"The fiber is fed by all fiber lamp cable"
12,SCIENCE_MASKED,"The fiber is on a science target redacted for privacy"