autofit.DynestyDynamic#

class DynestyDynamic[source]#

Bases: AbstractDynesty

A Dynesty non-linear search, using a dynamically changing number of live points.

For a full description of Dynesty, check out its GitHub and readthedocs webpages:

https://github.com/joshspeagle/dynesty https://dynesty.readthedocs.io/en/latest/index.html

Parameters:
  • name (Optional[str]) – The name of the search, which controls the last folder that results are output to.

  • path_prefix (Optional[str]) – The path of folders prefixing the name folder where results are output.

  • unique_tag (Optional[str]) – The name of a unique tag for this model-fit, which will be given a unique entry in the sqlite database and also acts as the folder after the path prefix and before the search name.

  • iterations_per_update (int) – The number of iterations performed between updates (e.g. outputting the latest model to hard-disk, visualization).

  • number_of_cores (int) – The number of cores over which sampling is performed using a Python multiprocessing Pool instance.

  • session – An SQLAlchemy session instance, used so that the results of the model-fit are written to an SQLite database.
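
A minimal usage sketch, assuming the standard autofit pattern of wrapping a model class with af.Model and fitting it with a user-defined af.Analysis. The Gaussian class, folder names, and parameter values below are illustrative assumptions, not part of the library:

```python
import autofit as af


class Gaussian:
    """Illustrative model component; any class whose __init__ arguments are free parameters works."""

    def __init__(self, centre=0.0, normalization=1.0, sigma=1.0):
        self.centre = centre
        self.normalization = normalization
        self.sigma = sigma


model = af.Model(Gaussian)

search = af.DynestyDynamic(
    name="example_search",         # controls the last folder results are output to
    path_prefix="path/to/output",  # folders prefixing the `name` folder
    unique_tag="dataset_1",        # unique database entry / folder between prefix and name
    iterations_per_update=5000,    # iterations between on-the-fly output and visualization
    number_of_cores=4,             # size of the multiprocessing Pool used for sampling
)

# `analysis` is a user-defined af.Analysis whose log_likelihood_function scores
# instances of `Gaussian`; its definition is omitted here.
# result = search.fit(model=model, analysis=analysis)
```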

Methods

check_model

check_pool

config_dict_test_mode_from

Returns a configuration dictionary for test mode meaning that the sampler terminates as quickly as possible.

copy_with_paths

exact_fit

fit

Fit a model, M, with some function f that takes instances of the class represented by model M and gives a score for their fitness.

fit_sequential

Fit multiple analyses contained within the analysis sequentially.

iterations_from

Returns the next number of iterations that a dynesty call will use and the total number of iterations that have been performed so far.

live_points_init_from

By default, dynesty live points are generated via the sampler's in-built initialization.

make_pool

Make the pool instance used to parallelize a NonLinearSearch alongside a set of unique ids for every process in the pool.

make_sneakier_pool

make_sneaky_pool

Create a pool for multiprocessing that uses sleight-of-hand to avoid copying the fitness function between processes multiple times.

optimise

Perform optimisation for expectation propagation.

output_search_internal

perform_update

Perform an update of the non-linear search's model-fitting results.

perform_visualization

Perform visualization of the non-linear search's model-fitting results.

plot_results

plot_start_point

Visualize the starting point of the non-linear search, using an instance of the model at the starting point of the maximum likelihood estimator.

post_fit_output

Cleans up the output folders after a completed non-linear search.

pre_fit_output

Outputs attributes of fit before the non-linear search begins.

read_uses_pool

If a Dynesty fit does not use a parallel pool, and is then resumed using one, this causes significant slow down.

result_via_completed_fit

Returns the result of the non-linear search of a completed model-fit.

run_search_internal

Run the Dynesty sampler, which could be either the static or dynamic sampler.

samples_from

Loads the samples of a non-linear search from its output files.

samples_info_from

samples_via_internal_from

Returns a Samples object from the dynesty internal results.

search_internal_from

Returns an instance of the Dynesty dynamic sampler set up using the input variables of this class.

start_resume_fit

write_uses_pool

If a Dynesty fit does not use a parallel pool, and is then resumed using one, this causes significant slow down.

Attributes

checkpoint_file

The path to the file used for checkpointing.

config_dict_run

A property that is only computed once per instance and then replaces itself with an ordinary attribute.

config_dict_search

A property that is only computed once per instance and then replaces itself with an ordinary attribute.

config_dict_settings

config_type

logger

Log 'msg % args' with severity 'DEBUG'.

name

number_live_points

paths

plotter_cls

samples_cls

search_internal

should_plot_start_point

timer

Returns the timer of the search, which is used to output information such as how long the search took and how much parallelization sped up the search time.

using_mpi

Whether or not the search is being performed using MPI for parallelisation.

search_internal_from(model, fitness, checkpoint_exists, pool, queue_size)[source]#

Returns an instance of the Dynesty dynamic sampler set up using the input variables of this class.

If no existing dynesty sampler exists on hard-disk (located via a checkpoint_file), a new instance is created with which sampling is performed. If one does exist, the dynesty restore() function is used to create the instance of the sampler.

Dynesty samplers with a multiprocessing pool may be created by inputting a dynesty Pool object; however, non-pooled instances can also be created by passing pool=None and queue_size=None (see the sketch after the parameter list below).

Parameters:
  • model – The model which generates instances for different points in parameter space.

  • fitness – An instance of the fitness class used to evaluate the likelihood of each model.

  • pool – A dynesty Pool object which performs likelihood evaluations over multiple CPUs.

  • queue_size – The number of CPUs over which multiprocessing is performed, which determines how many samples are stored in the dynesty queue.
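
A rough sketch of the create-or-restore logic this method describes, written directly against dynesty's public API (assuming dynesty ≥ 2.0, where the DynamicNestedSampler.restore() classmethod exists). The toy likelihood, prior transform, and checkpoint path are illustrative assumptions, not autofit code:

```python
import os

import numpy as np
from dynesty import DynamicNestedSampler


def log_likelihood(x):
    # Toy 2D Gaussian log-likelihood centred on the origin.
    return -0.5 * np.sum((x / 0.1) ** 2)


def prior_transform(u):
    # Map the unit cube to a uniform prior on [-5, 5] in each dimension.
    return 10.0 * u - 5.0


checkpoint_file = "dynesty_checkpoint.save"  # hypothetical checkpoint path

if os.path.exists(checkpoint_file):
    # Resume an existing run from its checkpoint, analogous to dynesty's restore().
    sampler = DynamicNestedSampler.restore(checkpoint_file)
else:
    # Create a new dynamic sampler; pool=None / queue_size=None gives a non-pooled run.
    sampler = DynamicNestedSampler(
        log_likelihood, prior_transform, ndim=2, pool=None, queue_size=None
    )

# In recent dynesty versions, sampling can then be run with checkpointing enabled, e.g.:
# sampler.run_nested(checkpoint_file=checkpoint_file)
```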