This repository provides the code to solve the Travelling Salesman Problem (TSP) with a Variational Quantum Circuit, and with a quantum-inspired classical machine learning model. Full documentation can be found on Read the Docs.
The work is described in more detail in an article
Note: Circuit 6 in this repo is re-numbered as Circuit 5 in the article.
Clone the repository to a suitable location on your computer using the following command:

```shell
git clone https://github.com/goldsmdn/TSP_VQC
```
Please see the installation instructions. Run the following commands from the `TSP_VQC` directory:

```shell
uv venv
source .venv/Scripts/activate
uv pip install numpy pytest graycode qiskit qiskit_aer torch
uv pip install sphinx sphinx_rtd_theme sphinx-autodoc-typehints matplotlib
uv pip install qiskit_ibm_runtime pandas seaborn
uv pip install ipykernel notebook pylatexenc
uv pip install torchviz graphviz
```
To run one of the notebooks, for example `manual_runs_ML.ipynb`, enter:

```shell
jupyter notebook manual_runs_ML.ipynb
```
Alternatively, you can run the notebooks in the VS Code development environment, setting the Python interpreter to Base.
An overview of the process is shown below. In summary:

- TSP networks are stored in the `networks` folder, either loaded from external sources or created automatically by `make_data.ipynb`.
- Runs can be executed manually with `manual_runs_ML.ipynb` for classical ML and `manual_runs_VQC.ipynb` for quantum. These provide an interactive environment for simple experiments. In manual executions the control parameters are read from the configuration data in `modules/config.py`.
- Most runs are executed automatically by `auto_runs.ipynb`.
- In all cases, results data is written to the `results.csv` file, and to sub-run-specific results files and graphs.
- Each execution creates a `run-id`, and each different set of configuration data creates a `sub-id`.
- Data is analysed by `show_results.ipynb`.
- Bespoke graphs are plotted in `plot_data.ipynb`.
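The aggregated results file can be explored directly with pandas. The sketch below is illustrative only: the column names `run_id`, `sub_id`, and `best_distance` are assumptions, not the actual schema of `results.csv` — check the file for the real column names.

```python
import io

import pandas as pd

# Illustrative CSV standing in for results.csv; the real file will have
# different (and more) columns. Each row is one recorded result.
csv_text = """run_id,sub_id,best_distance
1,1,12.5
1,1,11.0
1,2,14.2
"""
df = pd.read_csv(io.StringIO(csv_text))

# Average the best tour distance found for each (run_id, sub_id) pair,
# i.e. summarise each configuration within each run.
summary = df.groupby(["run_id", "sub_id"])["best_distance"].mean()
```

Grouping by the parent `run-id` and child `sub-id` mirrors how the repository organises results into runs and sub-runs.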
The following Jupyter notebooks are provided for data execution:

- `auto_runs.ipynb`: responsible for executing automatic runs, reading configuration data from `control_parameters.csv`
- `manual_runs_ML.ipynb`: responsible for executing manual runs of the classical ML model, reading configuration data from `modules/config.py`
- `manual_runs_VQC.ipynb`: responsible for executing manual runs of the quantum machine learning model, reading configuration data from `modules/config.py`
The following Jupyter notebook is provided to create networks for testing. The networks are stored in the `networks` folder.

- `make_data.ipynb`: responsible for setting up new networks
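The essence of creating a test network can be sketched as follows. This is a hypothetical illustration, not the actual logic of `make_data.ipynb`: it places cities uniformly at random in the unit square and builds the symmetric Euclidean distance matrix that a TSP instance needs.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Place n cities at random coordinates in the unit square.
n = 5
coords = rng.random((n, 2))

# Pairwise Euclidean distances via broadcasting: dist[i, j] is the
# distance between city i and city j.
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
```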
The following Jupyter notebooks are provided for data analysis:

- `show_results.ipynb`: responsible for analysing the results stored in the `result/results.csv` file
- `plot_data.ipynb`: responsible for creating bespoke graphs of individual runs, and plots the anomalous network with 42 locations
- `resource_requirements.ipynb`: calculates the number of qubits needed for each formulation
- `hot_start_analysis.ipynb`: compares the Hamming distance of the hot start binary string to the binary string of the optimum solution
- `monte_carlo.ipynb`: carries out Monte Carlo simulations by finding the best distance over a range of bit strings
- `bit_strings_ranked_by_distance.ipynb`: plots a graph of solution quality by ordered bit string
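The Hamming-distance comparison performed by `hot_start_analysis.ipynb` can be sketched in a few lines (the notebook's own helper functions may differ):

```python
def hamming_distance(a: str, b: str) -> int:
    """Number of positions at which two equal-length bit strings differ."""
    if len(a) != len(b):
        raise ValueError("bit strings must have equal length")
    return sum(x != y for x, y in zip(a, b))
```

A small Hamming distance between the hot start bit string and the optimum solution's bit string indicates that the hot start begins close to the optimum.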
The following modules are provided in the `modules` folder:

- `helper_functions_tsp.py`: general helper functions
- `graph_functions.py`: plots graphs
- `helper_ML_functions.py`: specific to the classical machine learning model
A full suite of over 70 unit test cases is provided and executed automatically with pytest on each push to the repository:

- `test_ML_functions.py`: unit test cases for classical machine learning
- `test_quantum_functions.py`: unit test cases for quantum machine learning
- `test_tsp_helper.py`: general unit test cases
The following object-oriented code is provided:

- `LRUCacheUnhashable.py`: handles caches of bit string evaluations
- `MyDataLogger.py`: handles logging of results data, including updating `results.txt`, and sub-run-specific data summaries and graphs. This module is object-oriented, with objects for a parent `run-id` and a child `sub-id`.
- `MyModel.py`: responsible for the classical machine learning PyTorch modules
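An LRU cache for unhashable keys might look like the sketch below: bit strings held as lists or arrays are converted to hashable tuples before lookup. This is a hypothetical illustration of the idea only; the real `LRUCacheUnhashable.py` implementation may differ.

```python
from collections import OrderedDict


class LRUCacheUnhashable:
    """Least-recently-used cache keyed by unhashable sequences of bits."""

    def __init__(self, maxsize: int = 128):
        self._store = OrderedDict()
        self._maxsize = maxsize

    def get(self, key):
        k = tuple(key)  # make the unhashable key hashable
        if k not in self._store:
            return None
        self._store.move_to_end(k)  # mark as most recently used
        return self._store[k]

    def put(self, key, value):
        k = tuple(key)
        self._store[k] = value
        self._store.move_to_end(k)
        if len(self._store) > self._maxsize:
            self._store.popitem(last=False)  # evict least recently used
```

Caching bit string evaluations this way avoids re-computing the tour distance for a bit string the optimiser has already visited.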
Contributions to the repository are very welcome. Please raise an issue if you have any problems, and feel free to contact me.
The optimiser is chosen by setting the constant `GRADIENT_TYPE`. For the quantum model, bespoke code is provided for two optimisers:

- `parameter_shift`: uses the fact that qubit rotations are trigonometric functions to find an analytical expression for the gradient. Please see the Pennylane documentation for a full description of parameter shift.
- `SPSA`: an optimisation algorithm invented by James C. Spall, especially useful for noisy cost functions and those for which the exact gradient is not available. Please see a blog for a description of the SPSA code that was modified.
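The two gradient estimators can be illustrated on a toy cost function. Here `f(θ) = sin(θ)` stands in for a one-parameter circuit expectation value; the repository's own cost function, shift values, and update loop will differ.

```python
import numpy as np


def f(theta):
    # Toy stand-in for a circuit expectation value; for a single qubit
    # rotation the expectation is a trigonometric function of the angle.
    return np.sin(theta)


def parameter_shift_grad(theta, shift=np.pi / 2):
    # Parameter shift rule: for trigonometric cost functions the exact
    # gradient is a finite difference of two shifted evaluations.
    return (f(theta + shift) - f(theta - shift)) / 2


def spsa_grad(theta, c=0.1, rng=np.random.default_rng(0)):
    # SPSA: a single random +/-1 perturbation gives a stochastic estimate
    # of the gradient from just two cost evaluations, however many
    # parameters there are -- useful when evaluations are noisy.
    delta = rng.choice([-1.0, 1.0])
    return (f(theta + c * delta) - f(theta - c * delta)) / (2 * c * delta)
```

For `f(θ) = sin(θ)` the parameter shift rule reproduces the analytical gradient `cos(θ)` exactly, while SPSA approximates it at lower cost per step.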
