Test Development
API: astrohack_local_client
astrohack_local_client(cores=None, memory_limit=None, dask_local_dir=None, log_parms={}, worker_log_parms={})
- I will run astrohack_local_client with N cores and a memory_limit of M GB to create an instance of the astrohack Dask client. If the client is instantiated correctly, it spawns at the default address without any errors.
- I will run astrohack_local_client with N cores and a memory_limit of M GB to create an instance of the astrohack Dask client. The client is instantiated correctly: it spawns at the default address without any errors, and the messages are logged in the terminal.
- I will run astrohack_local_client with N cores and a memory_limit of M GB to create an instance of the astrohack Dask client. The client is instantiated correctly: it spawns at the default address without any errors, and Dask temporary files are written to dask_local_dir.
Test Cases
test_astrohack_local_client_default(cores=None, memory_limit=None, dask_local_dir=None, log_parms={}, worker_log_parms={})
test_astrohack_local_client_two_instances(cores=None, memory_limit=None, dask_local_dir=None, log_parms={}, worker_log_parms={})
test_astrohack_local_client_log_true(cores=None, memory_limit=None, dask_local_dir=dask_test_dir, log_parms={}, worker_log_parms={})
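A minimal pytest sketch of the default-client story follows; the import path is an assumption (it is not stated above), the core and memory values are placeholders for N and M, and scheduler_info() is standard dask.distributed.Client API.

```python
from astrohack.client import astrohack_local_client  # assumed import path


def test_astrohack_local_client_default():
    # N cores and an M GB memory limit; the values here are placeholders.
    client = astrohack_local_client(cores=2, memory_limit='8GB')
    try:
        # A correctly spawned client reports a scheduler address.
        assert client.scheduler_info()['address'] is not None
    finally:
        client.close()
```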
API: extract_pointing
extract_pointing(ms_name, point_name=None, parallel=False, overwrite=False)
- I will run extract_pointing to extract pointing data from the input measurement set. In return, I expect a pointing multiple dataset file as detailed in the documentation. This is done for the following: VLA, ALMA, OTF, ngVLA.
- As a user, I will sometimes specify the pointing name. In return, I expect a pointing astrohack file of the given name on disk.
- As a user, I will run extract_pointing for the data from (VLA, ALMA, ...) with the intent of not overwriting existing data. If data exists, it will not be overwritten.
- As a user, I will run extract_pointing for the data from (VLA, ALMA, ...) with the intent of overwriting existing data. If data exists, it will be overwritten.
Test Cases
extract_pointing(ms_name='J1924-2914.ms.calibrated.split.SPW3', parallel=False, overwrite=False)
extract_pointing(ms_name='J1924-2914.ms.calibrated.split.SPW3', point_name='pointing_test.point.zarr', overwrite=False)
extract_pointing(ms_name='J1924-2914.ms.calibrated.split.SPW3', parallel=False, overwrite=False)
extract_pointing(ms_name='J1924-2914.ms.calibrated.split.SPW3', parallel=False, overwrite=True)
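A hedged pytest sketch of the named-output story; the import path is an assumption, and the existence check is only one way to verify that the file landed on disk.

```python
import os

from astrohack import extract_pointing  # assumed import path


def test_extract_pointing_named_output():
    # Specifying point_name should produce a pointing file of that name.
    extract_pointing(
        ms_name='J1924-2914.ms.calibrated.split.SPW3',
        point_name='pointing_test.point.zarr',
        overwrite=False,
    )
    assert os.path.exists('pointing_test.point.zarr')
```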
API: extract_holog
extract_holog(ms_name, holog_obs_dict=None, ddi=None, baseline_average_distance=None, baseline_average_nearest=None, holog_name=None, data_column='CORRECTED_DATA', parallel=False, overwrite=False)
- As a user, I will sometimes specify the holography observation dictionary, which can be used to extract only a subset of the holography data. In return, I expect a holography mds object containing only the specified subset of data.
- As a user, I will run extract_holog and specify a subset of ddi values, using data from VLA/ALMA supported antennas, with the intention of extracting a subset of the holography data. The output should contain only the data defined by the ddi values.
- As a user, I will run extract_holog using the baseline average distance as a filter for the data from (VLA, ALMA, ...). In return, I expect data based only on the baselines within this average distance.
- As a user, I will run extract_holog using the nearest baseline average of data from (VLA, ALMA, ...). In return, I expect data based only on the nearest baselines.
- As a user, I will run extract_holog for the data from (VLA, ALMA, ...) with the intent of overwriting existing data. If data exists, it will be overwritten.
Test Cases
test_extract_holog_vla(ms_name='J1924-2914.ms.calibrated.split.SPW3', holog_obs_dict=test_holog_obs_dict, ddi=None, baseline_average_distance=None, baseline_average_nearest=None, holog_name=None, data_column='CORRECTED_DATA', parallel=False, overwrite=False)
test_extract_holog_vla(ms_name='J1924-2914.ms.calibrated.split.SPW3', holog_obs_dict=None, ddi=[list_of_relevant_ddi_values], baseline_average_distance=None, baseline_average_nearest=None, holog_name=None, data_column='CORRECTED_DATA', parallel=False, overwrite=False)
test_extract_holog_vla(ms_name='J1924-2914.ms.calibrated.split.SPW3', holog_obs_dict=test_holog_obs_dict, ddi=None, baseline_average_distance=baseline_average_distance, baseline_average_nearest=None, holog_name=None, data_column='CORRECTED_DATA', parallel=False, overwrite=False)
test_extract_holog_vla(ms_name='J1924-2914.ms.calibrated.split.SPW3', holog_obs_dict=test_holog_obs_dict, ddi=None, baseline_average_distance=None, baseline_average_nearest=number_of_nearest_antennas, holog_name=None, data_column='CORRECTED_DATA', parallel=False, overwrite=False)
test_extract_holog_vla(ms_name='J1924-2914.ms.calibrated.split.SPW3', holog_obs_dict=test_holog_obs_dict, ddi=None, baseline_average_distance=None, baseline_average_nearest=None, holog_name=None, data_column='CORRECTED_DATA', parallel=False, overwrite=True)
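A hedged sketch of the ddi-subset story; the import path, the explicit holog_name, and the placeholder ddi list are assumptions added for illustration.

```python
import os

from astrohack import extract_holog  # assumed import path


def test_extract_holog_ddi_subset():
    # Request a subset of ddi values; [0] is a placeholder list.
    extract_holog(
        ms_name='J1924-2914.ms.calibrated.split.SPW3',
        holog_name='test_subset.holog.zarr',
        ddi=[0],
        data_column='CORRECTED_DATA',
        parallel=False,
        overwrite=True,
    )
    # The output should exist; confirming it contains only ddi 0 would be
    # done by inspecting it with open_holog (see the dio section below).
    assert os.path.exists('test_subset.holog.zarr')
```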
API: holog
holog(holog_name, grid_size=None, cell_size=None, image_name=None, padding_factor=50, grid_interpolation_mode='linear', chan_average=True, chan_tolerance_factor=0.005, scan_average=True, ant_id=None, ddi=None, to_stokes=True, apply_mask=True, phase_fit=True, overwrite=False, parallel=False)
- I will run holog to process extracted holography data from (VLA or ALMA) and return an astrohack image file. I expect the astrohack image file to be created on disk.
- The holog function should calculate the correct grid and cell size when compared to known values in the test file. The known values are provided by a test JSON file available via the astrohack utility download function: astrohack.gdown_utils.download(file='holog_numerical_verification.json')
- I will run holog to process extracted holography data from (VLA, ALMA) and return an astrohack image file with a padding_factor of 50, which should affect the resolution of the resulting image.
- I will run holog to process extracted holography data and will apply a mask. I expect the returned image to be nulled outside of the mask radius.
- I will run holog to process extracted holography data with a user-specified antenna id. The resulting image will contain only the specified antenna id.
- I will run holog to process extracted holography data with a user-specified data description id (ddi). The resulting image will contain only the specified ddi.
- I will run holog to process extracted holography data specifying the chan_average option. I will confirm that the option was propagated into the analysis by checking the file metadata.
- I will run holog to process extracted holography data specifying the scan_average option. I will confirm that the option was propagated into the analysis by checking the file metadata.
- I will run holog to process extracted holography data specifying the grid_interpolation option. I will confirm that the option was propagated into the analysis by checking the file metadata.
- I will run holog to process extracted holography data specifying the chan_tolerance option. I will confirm that the option was propagated into the analysis by checking the file metadata.
- I will run holog to process extracted holography data specifying the to_stokes option. I will confirm that the option was propagated into the analysis by checking the file metadata and verify that the pol vector is equal to ['I', 'Q', 'U', 'V'].
- I will run holog to process extracted holography data specifying the overwrite option. I will confirm that the image file has been overwritten.
- I will run holog to process extracted holography data specifying the overwrite option as False. I will confirm that the image file has not been overwritten.
Test Cases
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', apply_mask=True, overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', ant_id=['ea25'], overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', ddi=[0], overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', chan_average=True, overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', scan_average=True, overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', grid_interpolation_mode='nearest', overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', chan_tolerance_factor=0.0049, overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', to_stokes=True, overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', overwrite=True, parallel=False)
holog(holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr', image_name='data/ea25_cal_small_after_fixed.split.image.zarr', overwrite=False, parallel=False)
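A hedged sketch of the metadata-propagation stories; the import paths and the assumption that input options are recorded in the image dataset's attributes are not specified above.

```python
from astrohack import holog  # assumed import path
from astrohack.dio import open_image  # assumed import path


def test_holog_chan_average_metadata():
    holog(
        holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr',
        image_name='data/ea25_cal_small_after_fixed.split.image.zarr',
        chan_average=True,
        overwrite=True,
        parallel=False,
    )
    image_mds = open_image('data/ea25_cal_small_after_fixed.split.image.zarr')
    # Assumed layout: the input options are recorded in the mds attributes.
    assert image_mds.attrs['chan_average'] is True
```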
API: combine
combine(image_name, combine_name=None, ant='all', ddi='all', weighted=False, parallel=False, overwrite=False)
- As a user, I will run combine and I will expect to get an image mds with a single DDI per antenna.
- As a user, I will run combine with a specified combine_name and I will expect the file to be created on disk with that name.
- As a user, I will run combine with a specified antenna id and I will expect the file created on disk to contain data from only that antenna id.
- As a user, I will run combine with a single specified DDI and I will expect the new file to have a single DDI that is a copy of the selected input DDI.
- As a user, I will run combine with weighted=True and I will expect the output image to have different properties from when combine is called with weighted=False.
- As a user, I will run combine specifying overwrite=True and I will expect the existing file on disk to be overwritten.
Test Cases
combine(image_name)
combine(image_name, combine_name='test_combine.combine.zarr')
combine(image_name, combine_name='test_combine.combine.zarr', ant='ea25')
combine(image_name, combine_name='test_combine.combine.zarr', ant='ea25', ddi=1)
combine(image_name, combine_name='test_combine.combine.zarr', ant='ea25', weighted=True)
combine(image_name, combine_name='test_combine.combine.zarr', ant='ea25', ddi='all', overwrite=True)
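A minimal sketch of the single-antenna story; the import path is an assumption, and image_name is taken to be the image file produced by the earlier holog step.

```python
import os

from astrohack import combine  # assumed import path

IMAGE_NAME = 'data/ea25_cal_small_after_fixed.split.image.zarr'


def test_combine_single_antenna():
    combine(IMAGE_NAME, combine_name='test_combine.combine.zarr', ant='ea25')
    # The named combine file should exist; checking that it holds only ea25
    # data would use open_image or similar (dataset layout assumed).
    assert os.path.exists('test_combine.combine.zarr')
```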
API: panel
panel(image_name, panel_name=None, clip_type='sigma', clip_level=3, panel_model=None, panel_margins=0.05, ant_id=None, ddi=None, parallel=False, overwrite=False)
- As a user, I will run panel with a specified panel_name and I will expect the file to be created on disk.
- As a user, I will run panel with a specified antenna id. I will expect the file created on disk to contain panel data only from that antenna id.
- As a user, I will run panel with a specified ddi and I will expect the file created on disk to contain panel data only from that ddi.
- As a user, I will run panel specifying overwrite=True and I will expect the existing file on disk to be overwritten.
- As a user, I will run panel specifying overwrite=False and I will expect the existing file on disk not to be overwritten.
- As a user, I will run panel specifying panel_margins=0.5 and I will expect panel to fail, as there will be no data inside a panel.
- As a user, I will run panel specifying clip_type='absolute' and clip_level=0 and I will expect panel to include all the data between the dish's innermost and outermost radii.
- As a user, I will run panel specifying clip_type='sigma' with clip_level=2 and clip_level=3, and I expect the mask to exclude more points with clip_level=3.
- As a user, I will run panel specifying clip_type='relative' with clip_level=1 and I expect the mask to contain no points.
- As a user, I will run panel specifying panel_model='mean' and I will expect the overall RMS to be worse than with the default panel_model=None.
Test Cases
panel(image_name, panel_name='test_panel_file.panel.zarr', parallel=False, overwrite=False)
panel(image_name, panel_name='test_panel_file.panel.zarr', ant_id=['ea25'], parallel=False, overwrite=False)
panel(image_name, panel_name='test_panel_file.panel.zarr', ddi=['ddi_0'], parallel=False, overwrite=False)
panel(image_name, panel_name='test_panel_file.panel.zarr', parallel=False, overwrite=True)
panel(image_name, panel_name='test_panel_file.panel.zarr', parallel=False, overwrite=False)
panel(image_name, panel_name='test_panel_file.panel.zarr', panel_margins=0.5)
panel(image_name, panel_name='test_panel_file.panel.zarr', clip_type='absolute', clip_level=0)
panel(image_name, panel_name='test_panel_file.panel.zarr', clip_type='sigma', clip_level=2)
panel(image_name, panel_name='test_panel_file.panel.zarr', clip_type='relative', clip_level=1)
panel(image_name, panel_name='test_panel_file.panel.zarr', panel_model='mean')
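A hedged sketch of the sigma clip_level story; the import path is an assumption, and the mask comparison itself is only outlined since the panel dataset layout is not specified above.

```python
import os

from astrohack import panel  # assumed import path

IMAGE_NAME = 'data/ea25_cal_small_after_fixed.split.image.zarr'


def test_panel_sigma_clip_levels():
    # Produce two panel files that differ only in clip_level.
    panel(IMAGE_NAME, panel_name='clip2.panel.zarr',
          clip_type='sigma', clip_level=2, overwrite=True)
    panel(IMAGE_NAME, panel_name='clip3.panel.zarr',
          clip_type='sigma', clip_level=3, overwrite=True)
    # Both runs should complete; comparing the masks (clip_level=3 should
    # exclude more points) requires opening the panel datasets, whose exact
    # layout is not specified here.
    assert os.path.exists('clip2.panel.zarr')
    assert os.path.exists('clip3.panel.zarr')
```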
API: extract_locit
extract_locit(cal_table, locit_name=None, ant_id=None, ddi=None, overwrite=False)
- As a user, I will run extract_locit with a specified cal_table and I will expect the file to be created on disk.
- As a user, I will run extract_locit with a specified antenna id. I will expect the file created on disk to contain phase gain data only from that antenna id.
- As a user, I will run extract_locit with a specified ddi and I will expect the file created on disk to contain phase gain data only from that ddi.
- As a user, I will run extract_locit specifying overwrite=True and I will expect the existing file on disk to be overwritten.
- As a user, I will run extract_locit specifying overwrite=False and I will expect the existing file on disk not to be overwritten.
Test Cases
extract_locit(cal_table, locit_name='test_extract.locit.zarr')
extract_locit(cal_table, locit_name='test_extract.locit.zarr', ant_id='ea17')
extract_locit(cal_table, locit_name='test_extract.locit.zarr', ddi=0)
extract_locit(cal_table, locit_name='test_extract.locit.zarr', overwrite=True)
extract_locit(cal_table, locit_name='test_extract.locit.zarr', overwrite=False)
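A hedged sketch of the overwrite=False story; the import path and cal table name are placeholders, and whether astrohack raises or merely logs when the file already exists is an assumption, so the check below uses the file's modification time.

```python
import os

from astrohack import extract_locit  # assumed import path

CAL_TABLE = 'locit-input-pha.cal'  # placeholder calibration table


def test_extract_locit_no_overwrite():
    extract_locit(CAL_TABLE, locit_name='test_extract.locit.zarr',
                  overwrite=True)
    mtime = os.path.getmtime('test_extract.locit.zarr')
    extract_locit(CAL_TABLE, locit_name='test_extract.locit.zarr',
                  overwrite=False)
    # With overwrite=False, the existing file should be left untouched.
    assert os.path.getmtime('test_extract.locit.zarr') == mtime
```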
API: locit
locit(locit_name, position_name=None, elevation_limit=10.0, polarization='both', fit_engine='linear algebra', fit_kterm=False, fit_slope=True, ant_id=None, ddi=None, combine_ddis=True, parallel=False, overwrite=False)
- As a user, I will run locit with a specified locit_name and I will expect the file to be created on disk.
- As a user, I will run locit with a specified antenna id. I will expect the file created on disk to contain delays and position solutions only from that antenna id.
- As a user, I will run locit with a specified DDI. I will expect the file created on disk to contain delays and position solutions only from that DDI.
- As a user, I will run locit specifying overwrite=True and I will expect the existing file on disk to be overwritten.
- As a user, I will run locit specifying overwrite=False and I will expect the existing file on disk not to be overwritten.
- As a user, I will run locit specifying elevation_limit=90 and I will expect locit to fail because there is no available data.
- As a user, I will run locit specifying polarization='R' and I will expect the file created on disk to contain only delays for the R polarization and position solutions derived only from the R polarization.
- As a user, I will run locit specifying fit_kterm=True and I will expect the file created on disk to contain a solution for the kterm.
- As a user, I will run locit specifying fit_slope=False and I will expect the file created on disk to contain no solution for the delay slope.
- As a user, I will run locit specifying combine_ddis=False and I will expect the file created on disk to contain delays and position solutions for all DDIs.
Test Cases
locit(locit_name, position_name='test_position_file.position.zarr')
locit(locit_name, position_name='test_position_file.position.zarr', ant_id='ea17')
locit(locit_name, position_name='test_position_file.position.zarr', ddi=0)
locit(locit_name, position_name='test_position_file.position.zarr', overwrite=True)
locit(locit_name, position_name='test_position_file.position.zarr', overwrite=False)
locit(locit_name, position_name='test_position_file.position.zarr', elevation_limit=90.0)
locit(locit_name, position_name='test_position_file.position.zarr', polarization='R')
locit(locit_name, position_name='test_position_file.position.zarr', fit_kterm=True)
locit(locit_name, position_name='test_position_file.position.zarr', fit_slope=False)
locit(locit_name, position_name='test_position_file.position.zarr', combine_ddis=False)
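A hedged sketch of the fit_kterm story; the import path is an assumption, and verifying the kterm solution itself would require opening the position file, whose layout is not specified above.

```python
import os

from astrohack import locit  # assumed import path


def test_locit_fit_kterm():
    locit('test_extract.locit.zarr',
          position_name='test_position_file.position.zarr',
          fit_kterm=True, overwrite=True)
    # The position file should exist; checking that it contains a kterm
    # solution per antenna requires opening it (dataset layout assumed).
    assert os.path.exists('test_position_file.position.zarr')
```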
API: aips_holog_to_astrohack
aips_holog_to_astrohack(amp_image, dev_image, telescope_name, holog_name, overwrite=False)
- As a user, I will convert FITS amplitude and deviation files from AIPS's HOLOG task and expect an astrohack-format .image.zarr file that can be read by panel.
Test Cases
aips_holog_to_astrohack(amp_image, dev_image, telescope_name, holog_name, overwrite=False)
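A minimal sketch of the conversion story; the import path and all file names are placeholders.

```python
from astrohack import aips_holog_to_astrohack  # assumed import path

aips_holog_to_astrohack(
    amp_image='aips_amp.fits',     # placeholder AIPS HOLOG amplitude FITS
    dev_image='aips_dev.fits',     # placeholder AIPS HOLOG deviation FITS
    telescope_name='VLA',
    holog_name='aips_convert.image.zarr',
    overwrite=True,
)
# The resulting .image.zarr can then be passed to panel.
```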
API: dio
open_holog(file)
open_image(file)
open_panel(file)
open_pointing(file)
fix_pointing_table(ms_name, reference_antenna)
- I will open a holog file to access the holography multiple dataset. I expect the function to return a holography multiple dataset.
- I will open an image file to access the image multiple dataset. I expect the function to return an image multiple dataset.
- I will open a panel file to access the panel multiple dataset. I expect the function to return a panel multiple dataset.
- I will open a pointing file to access the pointing multiple dataset. I expect the function to return a pointing multiple dataset.
- I will run fix_pointing_table(...) to zero out pointing values for a list of reference antennas.
Test Cases
test_dio_open_holog()
test_dio_open_image()
test_dio_open_panel()
test_dio_open_pointing()
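A minimal sketch tying the dio stories together; the import path is an assumption, and the file names are the outputs of the earlier pipeline steps.

```python
from astrohack.dio import (open_holog, open_image,  # assumed import path
                           open_panel, open_pointing)


def test_dio_open_all():
    mds_list = [
        open_holog('data/ea25_cal_small_after_fixed.split.holog.zarr'),
        open_image('data/ea25_cal_small_after_fixed.split.image.zarr'),
        open_panel('test_panel_file.panel.zarr'),
        open_pointing('pointing_test.point.zarr'),
    ]
    # Each accessor should hand back a (non-None) multiple dataset.
    assert all(mds is not None for mds in mds_list)
```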
pytest-benchmark is a pytest plugin that provides a fixture for performance evaluation of any function. Like asv, pytest-benchmark supports iteration control and result comparison, and existing open-source tools allow integration with asv. A minimal sketch of attaching one of the holog tests to the fixture follows.
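The benchmark fixture below is the standard pytest-benchmark interface; the astrohack import path is an assumption.

```python
from astrohack import holog  # assumed import path


def test_holog_benchmark(benchmark):
    # pytest-benchmark injects the `benchmark` fixture, which times the
    # wrapped callable over several rounds.
    benchmark(
        holog,
        holog_name='data/ea25_cal_small_after_fixed.split.holog.zarr',
        image_name='data/ea25_cal_small_after_fixed.split.image.zarr',
        overwrite=True,
        parallel=False,
    )
```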
airspeed-velocity (asv) is a framework for comprehensive benchmarking of python packages. A previous implementation (casabench) has already proven useful for performance characterization of casatasks, although due to the complexity of building CASA from source that package did not take full advantage of certain asv features. asv can be configured alongside an existing source tree or as a standalone repository, and is designed to apply a set of benchmark tests to the revision history of a project in order to characterize trends over the lifetime of development. It also has features that allow testing and analysis at a finer resolution, such as continuous integration mode, profiling, regression detection, and result tracking for visualization in convenient dashboards.
asv isolates benchmark executions in a virtual environment, into which the package is installed from source along with its dependencies. This allows specifying version lists for each dependency, so a matrix of supported versions with relevant performance implications can be exercised by the same tests over the history of the repository. For astroHACK, certain libraries are still under rapid development to improve their performance, so it is worth listing the specific dependency versions that we want to measure (a sketch of an asv benchmark suite follows the list):
- python (3.11 vs. previous)
- dask
- zarr
- numba
- numpy
- scipy
- ...
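The dependency list above would map onto the "matrix" setting of asv.conf.json; the benchmarks themselves live in a plain Python module that asv discovers by naming convention. A hedged sketch, with placeholder file names and an assumed import path:

```python
# benchmarks.py -- asv discovers classes whose methods are named time_* or
# peakmem_*; setup() runs before each benchmark.
from astrohack import holog  # assumed import path


class TimeHolog:
    def setup(self):
        # Placeholder input produced by an earlier extract_holog run.
        self.holog_name = 'data/ea25_cal_small_after_fixed.split.holog.zarr'

    def time_holog(self):
        # Wall-clock time of a full holog run.
        holog(holog_name=self.holog_name, image_name='bench.image.zarr',
              overwrite=True, parallel=False)

    def peakmem_holog(self):
        # Peak resident memory of the same run.
        holog(holog_name=self.holog_name, image_name='bench.image.zarr',
              overwrite=True, parallel=False)
```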
Performance testing requirements in astrohack can be satisfied using both of these tools, and the frameworks are flexible enough that the astroHACK benchmark implementation can evolve to meet future use cases. In the short term, existing tests can be attached to the pytest-benchmark fixture, quickly adding performance testing to the CI pipeline. Longer term, we want to collect and analyze performance results across a variety of environments (different python interpreters, dependency combinations, platforms, and hardware); systematically saving pytest-benchmark results and configuring airspeed-velocity to cover that parameter space can achieve this.
- As a user, I will follow a link from the documentation to visually examine the performance of a particular astroHACK function when it's run using different versions of a dependency in the version of python I am using on my particular OS.
- As a developer, I will push my changes to the origin and have the CI system automatically run the verification tests in benchmarking mode, so that I can look in the same place as other test results and roughly determine whether my commits have noticeably impacted the performance of some section of the code base.