Exponentially increasing CPU time while using QBSolv/D-Wave


I solved a simple QUBO with QBSolv and D-Wave's quantum annealer, and it worked. However, I am confused about why the displayed "seconds of classic cpu time" seems to increase exponentially from one output sample to the next. My log file looked basically like this:

101000.... -5760.... Energy of solution 0 Number of Partitioned calls, 1 output sample 0.00300 seconds of classic cpu time 148 bits, find Min, SubMatrix= 140, -a o, timeout=2592000.0 sec

110100.... -5760.... Energy of solution 21 Number of Partitioned calls, 2 output sample 1437.91600 seconds of classic cpu time

Some additional information: I am working in my local Python IDE (PyCharm) and used the following settings: number of reads = 1000, number of repeats = 5, verbosity = 1, solver_limit = 140 (roughly as in the sketch below). Can anyone tell me why the classic CPU time gets longer and longer with every output sample? Is it because of the verbosity setting, or do I need to change the output settings?
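For context, here is a minimal sketch of how a run with these settings might be set up, assuming the dwave-qbsolv package with a D-Wave QPU accessed through the Amazon Braket Ocean plugin; the QUBO, S3 bucket, and device ARN below are placeholders, not values from the actual run.

from dwave_qbsolv import QBSolv
from braket.ocean_plugin import BraketDWaveSampler
from dwave.system.composites import EmbeddingComposite

# Placeholder QUBO; the real problem in the question has 148 variables.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

# Placeholder Braket resources (bucket/prefix and device ARN are assumptions).
s3_folder = ("my-braket-bucket", "qbsolv-output")
device_arn = "arn:aws:braket:::device/qpu/d-wave/..."

# Embed each sub-QUBO (up to solver_limit variables) onto the QPU topology.
qpu_sampler = EmbeddingComposite(BraketDWaveSampler(s3_folder, device_arn))

# verbosity=1 prints one line per output sample, including the cumulative
# "seconds of classic cpu time" shown in the log above. The num_reads=1000
# per QPU call is configured on the sub-sampler side and is not shown here.
response = QBSolv().sample_qubo(
    Q,
    solver=qpu_sampler,
    solver_limit=140,  # maximum sub-QUBO size handed to the QPU
    num_repeats=5,     # additional partitioning passes after the last improvement
    verbosity=1,
)

print("samples=" + str(list(response.samples())))
print("energies=" + str(list(response.data_vectors["energy"])))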

Thank you in advance!

Asked 2 years ago
1 Answer

Hi, qbsolv is a classical-quantum hybrid algorithm, so as it runs you can expect both QPU and CPU resources to be used. The output printed when verbosity is set greater than zero includes the total CPU time used so far, so the "seconds of classic cpu time" shown for each output sample is the cumulative time across all steps up to that point, not the time spent in that step alone. I hope this answers your question.
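If you want the CPU time spent in each partitioned call on its own, you can take differences of consecutive cumulative values. A minimal sketch using the two readings from your log (the variable names are illustrative):

cumulative_cpu = [0.00300, 1437.91600]  # "seconds of classic cpu time", cumulative per output sample
per_step = [later - earlier for earlier, later in zip([0.0] + cumulative_cpu, cumulative_cpu)]
print(per_step)  # CPU time attributable to each step alone, e.g. [0.003, 1437.913]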

AWS
Answered 2 years ago
