Exponentially increasing cpu time while using QBSolv/D-Wave


I solved a simple QUBO with QBSolv and D-Wave's quantum annealer and it worked. However, I am confused about why the displayed "seconds of classic cpu time" increases exponentially. My log file looked basically like this:

101000.... -5760.... Energy of solution 0 Number of Partitioned calls, 1 output sample 0.00300 seconds of classic cpu time 148 bits, find Min, SubMatrix= 140, -a o, timeout=2592000.0 sec

110100.... -5760.... Energy of solution 21 Number of Partitioned calls, 2 output sample 1437.91600 seconds of classic cpu time

Some additional information: I am working in my local Python IDE (PyCharm) and I used the following settings: number of reads = 1000, number of repeats = 5, verbosity = 1, solver_limit = 140. Can anyone tell me why the classic cpu time gets longer and longer with every output sample? Is it because of the verbosity setting, or do I have to change the output settings?

Thank you in advance!

Asked 2 years ago · Viewed 293 times
1 Answer

Hi, qbsolv is a classical-quantum hybrid algorithm, so as it runs you can expect both QPU and CPU resources to be used. The output printed when verbosity is set greater than zero reports the cumulative CPU time: the "seconds of classic cpu time" shown for each output sample is the total CPU time used by all steps up to that point, not the time taken by that step alone. So the number is expected to grow with every sample; nothing is wrong with your settings. I hope this answers your question.
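If you want the time spent in each individual partitioned call rather than the running total, you can difference consecutive cumulative readings. A minimal sketch (the two sample values are taken from the log above; the function name is just for illustration):

```python
def per_step_times(cumulative):
    """Convert cumulative CPU-time readings into per-step durations."""
    prev = 0.0
    steps = []
    for total in cumulative:
        steps.append(total - prev)  # this step's own CPU time
        prev = total
    return steps

# Cumulative "seconds of classic cpu time" values from the verbose log:
cumulative_cpu = [0.00300, 1437.91600]
print(per_step_times(cumulative_cpu))
```

The first entry equals the first reading, and each later entry is that sample's own CPU time.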

AWS
Answered 2 years ago
