Merged
7 changes: 3 additions & 4 deletions .github/workflows/main.yml
@@ -59,10 +59,9 @@ jobs:
pip3 install setuptools
pip3 install wheel
pip3 install pytest coverage pytest-cov
-pip3 install -r PyAutoConf/requirements.txt
-pip3 install -r PyAutoFit/requirements.txt
-pip3 install -r PyAutoFit/optional_requirements.txt
-pip3 install -r PyAutoFit/build_requirements.txt
+pip install ./PyAutoConf
+pip install ./PyAutoFit
+pip install ./PyAutoFit[optional]
- name: Run tests
run: |
export ROOT_DIR=`pwd`
14 changes: 1 addition & 13 deletions docs/conf.py
@@ -12,30 +12,18 @@
# documentation root, use os.path.abspath to make it absolute, like shown here.
#

-from pyprojroot import here
-
-workspace_path = str(here())
-
import os
import sys

sys.path.insert(0, os.path.abspath("."))

-clone_path = str(here())
-clone_path = os.path.split(clone_path)[0]
-
-sys.path.insert(
-    0,
-    os.path.abspath(clone_path),
-)
-
import autofit

# -- Project information -----------------------------------------------------

year = datetime.date.today().year
project = "PyAutoFit"
-copyright = "2022, James Nightingale, Richard Hayes"
+copyright = "2025, James Nightingale, Richard Hayes"
author = "James Nightingale, Richard Hayes"

# The full version, including alpha/beta/rc tags
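
Since ``conf.py`` already computes ``year = datetime.date.today().year``, the copyright string could be derived from it rather than bumped by hand each release; a small sketch of that alternative (an assumption on my part, not a change this PR makes):

```python
import datetime

# Build the Sphinx copyright string from the current year so it never goes
# stale (hypothetical tweak; the PR itself hardcodes "2025").
year = datetime.date.today().year
copyright = f"{year}, James Nightingale, Richard Hayes"
```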
64 changes: 51 additions & 13 deletions docs/overview/the_basics.rst
@@ -573,31 +573,69 @@ Multiple Datasets
Many model-fitting problems require multiple datasets to be fitted simultaneously in order to provide the best
constraints on the model.

-In **PyAutoFit**, all you have to do to fit multiple datasets is sum your ``Analysis`` classes together:
+In **PyAutoFit**, all you have to do to fit multiple datasets is combine them with the model via ``AnalysisFactor``
+objects.

.. code-block:: python

-    analysis_0 = Analysis(data=data_0, noise_map=noise_map_0)
-    analysis_1 = Analysis(data=data_1, noise_map=noise_map_1)
+    analysis_0 = Analysis(data=data, noise_map=noise_map)
+    analysis_1 = Analysis(data=data, noise_map=noise_map)

+    # This means the model is fitted to both datasets simultaneously.
+    analysis_list = [analysis_0, analysis_1]

-    analysis = analysis_0 + analysis_1
+    analysis_factor_list = []

-    # summing a list of analysis objects is also a valid API:
+    for analysis in analysis_list:

-    analysis = sum([analysis_0, analysis_1])
+        # The model can be customized here so that different model parameters are tied to each analysis.
+        model_analysis = model.copy()

-By summing analysis objects the log likelihood values computed by the ``log_likelihood_function`` of each individual
-analysis class are summed to give an overall log likelihood value that the non-linear search samples when model-fitting.
+        analysis_factor = af.AnalysisFactor(prior_model=model_analysis, analysis=analysis)

+        analysis_factor_list.append(analysis_factor)

All ``AnalysisFactor`` objects are combined into a ``FactorGraphModel``, which represents a global model fit to
multiple datasets using a graphical model structure.

The key outcomes of this setup are:

- The individual log likelihoods from each ``Analysis`` object are summed to form the total log likelihood
  evaluated during the model-fitting process.

- Results from all datasets are output to a unified directory, with subdirectories for visualizations
  from each analysis object, as defined by their ``visualize`` methods.

This is a basic use of **PyAutoFit**'s graphical modeling capabilities, which support advanced hierarchical
and probabilistic modeling for large, multi-dataset analyses.
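
The summed-likelihood behaviour described above can be illustrated with plain Python. The classes below are toy stand-ins, not PyAutoFit's API: each "analysis" owns one dataset, and the combined fit simply sums the per-dataset log likelihoods, which is the quantity a non-linear search would sample.

```python
import math

class ToyAnalysis:
    """Stand-in for an Analysis class: one dataset plus its noise level."""

    def __init__(self, data, noise):
        self.data = data
        self.noise = noise

    def log_likelihood(self, mu):
        # Gaussian log likelihood of this dataset given a constant model mu.
        norm = math.log(self.noise * math.sqrt(2.0 * math.pi))
        return sum(-0.5 * ((d - mu) / self.noise) ** 2 - norm for d in self.data)

def combined_log_likelihood(analyses, mu):
    # As in the factor graph fit, the total is the sum of per-dataset terms.
    return sum(a.log_likelihood(mu) for a in analyses)

analyses = [ToyAnalysis([1.0, 1.2], 0.1), ToyAnalysis([0.9, 1.1], 0.1)]
total = combined_log_likelihood(analyses, mu=1.05)
```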

To inspect the model, we combine the ``AnalysisFactor`` objects into the ``FactorGraphModel`` and print
``factor_graph.global_prior_model.info``.

.. code-block:: python

    factor_graph = af.FactorGraphModel(*analysis_factor_list)

    print(factor_graph.global_prior_model.info)

To fit multiple datasets, we pass the ``FactorGraphModel`` to a non-linear search.

Unlike single-dataset fitting, we now pass the ``factor_graph.global_prior_model`` as the model and
the ``factor_graph`` itself as the analysis object.

This structure enables simultaneous fitting of multiple datasets in a consistent and scalable way.

.. code-block:: python

    search = af.DynestyStatic(
        nlive=100,
    )

    result_list = search.fit(model=factor_graph.global_prior_model, analysis=factor_graph)

.. note::

    In the simple example above, instances of the same ``Analysis`` class (``analysis_0`` and ``analysis_1``) were
-    summed. However, different ``Analysis`` classes can also be summed together. This is useful when fitting different
-    datasets that each require a unique ``log_likelihood_function`` to be fitted simultaneously. For more detailed
-    information and a dedicated API for customizing how the model changes across different datasets, refer to
-    the [multiple datasets cookbook](https://pyautofit.readthedocs.io/en/latest/cookbooks/multiple_datasets.html).
+    combined. However, different ``Analysis`` classes can also be combined. This is useful when fitting different
+    datasets that each require a unique ``log_likelihood_function`` to be fitted simultaneously. For more detailed
+    information and a dedicated API for customizing how the model changes across different datasets, refer to
+    the `multiple datasets cookbook <https://pyautofit.readthedocs.io/en/latest/cookbooks/multiple_datasets.html>`_.

Wrap Up
-------
9 changes: 9 additions & 0 deletions pyproject.toml
@@ -74,6 +74,15 @@ optional=[
"ultranest==4.3.2",
"zeus-mcmc==2.5.4",
]
+docs=[
+    "sphinx",
+    "furo",
+    "myst-parser",
+    "sphinx_copybutton",
+    "sphinx_design",
+    "sphinx_inline_tabs",
+    "sphinx_autodoc_typehints"
+]

test = ["pytest"]
dev = ["pytest", "black"]
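
With the new ``docs`` extras group alongside the existing ``test`` and ``dev`` groups, the documentation toolchain installs straight from the package, which is what the ``readthedocs.yml`` change relies on. A sketch of the equivalent local commands (the ``docs/`` layout is assumed from ``readthedocs.yml``, and these commands are my illustration, not part of the PR):

```shell
# Install the package plus the Sphinx toolchain declared under
# [project.optional-dependencies] in pyproject.toml.
pip install ".[docs]"

# Build the docs locally, pointing at the docs/conf.py this repo uses.
sphinx-build -b html docs docs/_build/html
```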
Expand Down
7 changes: 5 additions & 2 deletions readthedocs.yml
@@ -1,13 +1,16 @@
version: 2

build:
-  os: ubuntu-20.04
+  os: ubuntu-22.04
  tools:
    python: "3.11"

python:
  install:
-    - requirements: docs/requirements.txt
+    - method: pip
+      path: .
+      extra_requirements:
+        - docs

sphinx:
  configuration: docs/conf.py