Exponentially increasing cpu time while using QBSolv/D-Wave


I solved a simple QUBO with QBSolv and D-Wave's quantum annealer and it worked. However, I am confused about why the displayed "seconds of classic cpu time" increases so dramatically between outputs. My log file looked basically like this:

101000.... -5760.... Energy of solution 0 Number of Partitioned calls, 1 output sample 0.00300 seconds of classic cpu time 148 bits, find Min, SubMatrix= 140, -a o, timeout=2592000.0 sec

110100.... -5760.... Energy of solution 21 Number of Partitioned calls, 2 output sample 1437.91600 seconds of classic cpu time

Some additional information: I am working in my local Python IDE (PyCharm) and I used the following settings: number of reads = 1000, number of repeats = 5, verbosity = 1, solver_limit = 140. Can anyone tell me why the classic cpu time is getting longer and longer for every sample output? Is it because of the verbosity setting, or do I have to change the output settings?

Thank you in advance!

asked 2 years ago · 289 views
1 Answer

Hi, qbsolv is a classical-quantum hybrid algorithm, so as it runs you can expect both QPU and CPU resources to be used. The output printed when verbosity is greater than zero reports the *cumulative* CPU time used so far, not the time for that step alone. In other words, the "seconds of classic cpu time" shown for each sample output is the total CPU time across all partitioned calls up to that point, which is why the number keeps growing. Nothing is wrong with your settings. I hope this answers your question.
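If you want the CPU time spent in each partitioned call rather than the running total, you can difference consecutive cumulative values yourself. A minimal sketch (the first two numbers are taken from the log above; the third is an illustrative stand-in, not real output):

```python
# qbsolv's verbose output reports *cumulative* "seconds of classic cpu time".
# Differencing consecutive values recovers the per-call CPU time.
cumulative_cpu = [0.003, 1437.916, 2875.412]  # third value is hypothetical

# First entry is its own delta; each later entry is current minus previous.
per_call = [cumulative_cpu[0]] + [
    b - a for a, b in zip(cumulative_cpu, cumulative_cpu[1:])
]
print(per_call)  # per-call CPU seconds instead of a running total
```

With a differenced view like this, steady per-call times confirm that the apparent "exponential" growth is just accumulation.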

AWS
answered 2 years ago
