
Run jobs in a batch

Important

The way batches are started within Qiskit Runtime has changed. By 1 April 2024, upgrade to qiskit-ibm-runtime version 0.20.0 or later. In addition, ensure you are using Qiskit version 0.45.0 or later. Starting 1 April, batch calls made in earlier versions of these packages will fail.

Batch mode can shorten processing time if all jobs can be provided at the outset. If you want to submit iterative jobs, use sessions instead. Using batch mode has these benefits:

  • The jobs' classical computation, such as compilation, is run in parallel. Thus, running multiple jobs in a batch is significantly faster than running them serially.
  • There is usually minimal delay between jobs, which can help avoid drift.
  • If you partition your workload into multiple jobs and run them in batch mode, you can get results from individual jobs, which makes them more flexible to work with. For example, if a job's results don't meet your expectations, you can cancel the remaining jobs. Also, if one job fails, you can re-submit it instead of re-running the entire workload. (A sketch of this pattern follows the note below.)
Note

When batching, jobs are not guaranteed to run in the order they are submitted. Also, while your batch jobs will run as closely together as possible, they don't get exclusive access to the backend. Therefore, your batch jobs might run in parallel with other users' jobs if there is enough processing capacity on the QPU. Additionally, QPU calibration jobs could run between the batched jobs.
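
The following sketch illustrates the cancel-and-resubmit pattern from the list above. It is a minimal example, assuming that backend is already defined and that pubs is a list of pub lists (one per job); the results_look_ok acceptance check is hypothetical and stands in for your own criteria.

from qiskit_ibm_runtime import Batch, SamplerV2 as Sampler
 
# A minimal sketch: `backend` and `pubs` are assumed to be defined elsewhere,
# and the acceptance check below is purely illustrative.
with Batch(backend=backend) as batch:
    sampler = Sampler()
    jobs = [sampler.run(job_pubs) for job_pubs in pubs]
 
    # If the first job's results don't meet expectations, cancel the rest.
    if not results_look_ok(jobs[0].result()):  # hypothetical acceptance check
        for job in jobs[1:]:
            job.cancel()
 
    # If an individual job failed, resubmit only that job instead of
    # re-running the entire workload.
    for idx, job in enumerate(jobs):
        if job.status() == "ERROR":  # RuntimeJobV2 reports status as a string
            jobs[idx] = sampler.run(pubs[idx])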

Figure 1: Batch execution. Five jobs, numbered 0 through 4 and consisting of a mix of Estimator and Sampler jobs, are submitted as a batch.

Open a batch

You can open a runtime batch by using the context manager with Batch(...) or by initializing the Batch class. When you start a batch, you must specify a QPU by passing a backend object. The batch starts when its first job begins execution.

Batch class

from qiskit_ibm_runtime import Batch, SamplerV2 as Sampler, EstimatorV2 as Estimator
 
backend = service.least_busy(operational=True, simulator=False)
batch = Batch(backend=backend)
estimator = Estimator(mode=batch)
sampler = Sampler(mode=batch)
batch.close()

Context manager

The context manager automatically opens and closes the batch.

from qiskit_ibm_runtime import Batch, SamplerV2 as Sampler, EstimatorV2 as Estimator
 
backend = service.least_busy(operational=True, simulator=False)
with Batch(backend=backend):
  estimator = Estimator()
  sampler = Sampler()

Batch length

You can define the batch's maximum time to live (TTL) with the max_time parameter, which should exceed the longest job's execution time. The timer starts when the batch starts. When the value is reached, the batch is closed: any jobs that are running finish, but any jobs that are still queued fail.

with Batch(backend=backend, max_time="25m"):
  ...

There is also an interactive time to live (ITTL) value that cannot be configured. If no batch jobs are queued within that window, the batch is temporarily deactivated.

To determine a batch's TTL or ITTL, follow the instructions in Determine batch details and look for the max_time or interactive_timeout value, respectively.

For more information about these settings, see Batch maximum execution time.


Close a batch

A batch automatically closes when it exits the context manager. When the batch context manager is exited, the batch is put into "In progress, not accepting new jobs" status. This means that the batch finishes processing all running or queued jobs until the maximum TTL value is reached. After all jobs are completed, the batch is immediately closed. You cannot submit jobs to a closed batch.

with Batch(backend=backend) as batch:
    estimator = Estimator()
    job1 = estimator.run(...)
    job2 = estimator.run(...)
 
# The batch is no longer accepting jobs, but the submitted jobs will run to completion.
result = job1.result()
result2 = job2.result()
Tip

If you are not using a context manager, manually close the batch. If you leave the batch open and submit more jobs to it later, the maximum TTL might be reached before the subsequent jobs start running, causing them to be canceled. You can close a batch as soon as you are done submitting jobs to it. When a batch is closed with batch.close(), it no longer accepts new jobs, but the jobs already submitted continue to run until completion and their results can be retrieved.

batch = Batch(backend=backend)
 
# If using qiskit-ibm-runtime earlier than 0.24.0, change `mode=` to `batch=`
estimator = Estimator(mode=batch)
job1 = estimator.run(...)
job2 = estimator.run(...)
print(f"Result1: {job1.result()}")
print(f"Result2: {job2.result()}")
 
# Manually close the batch. Running and queued jobs will run to completion.
batch.close()

Reconfigure jobs for parallel processing

There are multiple ways you can reconfigure your jobs to take advantage of the parallel processing provided by batching. The following example shows how you can partition a long list of circuits into multiple jobs and run them as a batch to take advantage of the parallel processing.

from qiskit_ibm_runtime import SamplerV2 as Sampler, Batch
 
# Partition the circuits into chunks of at most max_circuits circuits per job.
max_circuits = 100
all_partitioned_circuits = []
for i in range(0, len(circuits), max_circuits):
    all_partitioned_circuits.append(circuits[i : i + max_circuits])
jobs = []
 
# Submit one job per partition; the classical processing for these jobs
# runs in parallel within the batch.
with Batch(backend=backend):
    sampler = Sampler()
    for partitioned_circuits in all_partitioned_circuits:
        job = sampler.run(partitioned_circuits)
        jobs.append(job)
Caution

If you set backend=backend in a primitive, the program runs in job mode, even if it's inside a batch or session context. Passing backend=backend is deprecated as of qiskit-ibm-runtime 0.24.0. Use the mode parameter instead.
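
Once the partitioned jobs have been submitted, you can gather their results individually. A minimal sketch, assuming the jobs list built in the example above:

# Collect the results of the individual batch jobs; job.result() blocks
# until that job has finished.
results = [job.result() for job in jobs]
 
for idx, result in enumerate(results):
    print(f"Job {idx}: {result}")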


Determine batch details

For a comprehensive overview of a batch's configuration and status, including its interactive and max TTL, use the batch.details() method.

from qiskit_ibm_runtime import QiskitRuntimeService, Batch, SamplerV2 as Sampler
 
service = QiskitRuntimeService()
backend = service.least_busy(operational=True, simulator=False)
 
with Batch(backend=backend) as batch:
    sampler = Sampler()
    for partition in partitions:
        job = sampler.run(partition)
    print(batch.details())
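
If you only need the TTL values described in Batch length, you can read them from the returned details. A minimal sketch, assuming details() returns a dictionary containing the max_time and interactive_timeout entries mentioned above:

# Read the TTL values from the batch details.
details = batch.details()
print(f"Maximum TTL (max_time): {details['max_time']}")
print(f"Interactive TTL (interactive_timeout): {details['interactive_timeout']}")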