A Practical Guide to Packaging and Sharing Reproducible Quantum Experiments

Ava Thompson, Senior Quantum DevOps Engineer
2026-04-08
7 min read

Step-by-step instructions to package and share reproducible quantum experiments: code, notebooks, environment, containers, CI, and secure transfer.

Reproducibility matters. For quantum teams building experiments with sensitive hardware and evolving SDKs, delivering a reproducible package that other researchers or engineering teams can run end-to-end is both a technical and organizational challenge. This guide gives a step-by-step workflow for packaging code, datasets, environment specs, and notebooks so other teams can reproduce quantum experiments reliably — including templates and an example qbitshare workflow.

Why packaging matters for reproducible quantum experiments

Quantum experiments depend on classical control scripts, device drivers, cloud backends, randomized seeds, and datasets. Small changes to Python packages, CUDA drivers, or networked backends can make an experiment fail to reproduce. Packaging explicitly captures the complete experiment surface: code, environment, data, notebooks, and how to execute them. That surface is what you share when you want others to reproduce results or verify performance.

Core principles

  • Explicit environment specification: pin SDK versions, OS, and drivers.
  • Minimal but complete artifact set: include only what’s required to run the experiment end-to-end.
  • Deterministic execution: seeds, fixed inputs, and hardware abstraction where applicable.
  • Automated validation: CI that recreates the environment, runs smoke tests, and validates outputs.
  • Secure transfer and access control: encrypt datasets and use secure registries or qbitshare-style secure repositories.

What to include in an experiment package

A reproducible package should include:

  1. Source code and notebooks (with outputs cleared or archived).
  2. Environment spec (conda/venv, Dockerfile, or Nix expression).
  3. Pinned quantum SDK versions and version metadata (Qiskit/Cirq/PennyLane versions).
  4. Datasets and artifacts, or pointers to secure storage with checksums.
  5. Execution scripts (run_experiment.sh), sample config files, and secrets-handling guidelines.
  6. Automated tests and CI pipeline definitions.
  7. README with reproducibility checklist and troubleshooting steps.

Here’s a compact layout you can adopt for a quantum notebook repository or general experiment packaging:

experiment-name/
├─ README.md
├─ LICENSE
├─ data/
│  ├─ input/   # small sample inputs or checksum files
│  └─ external_links.txt
├─ notebooks/
│  └─ analysis.ipynb
├─ src/
│  └─ run_experiment.py
├─ environment/
│  ├─ Dockerfile
│  ├─ environment.yml    # conda
│  └─ requirements.txt   # pip pins
├─ tests/
│  └─ test_smoke.py
├─ ci/
│  └─ github-actions.yml
└─ metadata.json

Template files

Minimal metadata.json

{
  "name": "example-quantum-experiment",
  "version": "0.1.0",
  "sdk": {
    "qiskit": "==0.50.0",
    "pennylane": null
  },
  "hardware": "simulator|cloud-provider",
  "last-tested": "2026-04-01",
  "notes": "Seeded RNG, CPU-only run included"
}
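A small helper can sanity-check metadata.json before a release. This is an illustrative sketch, not part of the template: the validate_metadata function and its rules (required fields from the schema above, exact == pins) are assumptions you can adapt.

```python
import json

# Fields the metadata.json template above always carries.
REQUIRED_FIELDS = {"name", "version", "sdk", "hardware", "last-tested"}

def validate_metadata(raw: str) -> list[str]:
    """Return a list of problems found in a metadata.json payload."""
    problems = []
    meta = json.loads(raw)
    missing = REQUIRED_FIELDS - meta.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    # Every declared SDK should either be pinned exactly (==x.y.z)
    # or explicitly null, as in the template.
    for sdk, pin in meta.get("sdk", {}).items():
        if pin is not None and not pin.startswith("=="):
            problems.append(f"{sdk}: pin '{pin}' is not exact (use ==x.y.z)")
    return problems
```

Run it in CI against the committed metadata.json (for example, `validate_metadata(open("metadata.json").read())`) and fail the build on a non-empty result.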

Dockerfile (Dockerize quantum experiments)

FROM ubuntu:22.04

# system deps
RUN apt-get update && apt-get install -y --no-install-recommends \
    build-essential python3 python3-venv python3-pip git curl && \
    rm -rf /var/lib/apt/lists/*

# create app user
RUN useradd -ms /bin/bash app
WORKDIR /home/app
COPY environment/requirements.txt ./
RUN pip3 install --no-cache-dir -r requirements.txt
COPY --chown=app:app . .
USER app
CMD ["python3", "src/run_experiment.py"]

environment/requirements.txt

qiskit==0.50.0
numpy==1.26.4
scipy==1.12.0
pandas==2.2.0
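The tests/test_smoke.py referenced in the layout can start this small. The sample_shots function below is an illustrative stand-in for the real entry point in src/run_experiment.py; the point is to assert that a seeded run is deterministic and produces well-formed output.

```python
import random

SEED = 1234  # the same seed the top-level run script should use

def sample_shots(n: int, seed: int = SEED) -> list[int]:
    """Stand-in for an experiment run: deterministic given the seed."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

def test_seeded_run_is_deterministic():
    # Two runs with the same seed must agree bit-for-bit.
    assert sample_shots(100) == sample_shots(100)

def test_output_shape():
    # Output has the requested length and only valid measurement values.
    shots = sample_shots(10)
    assert len(shots) == 10 and set(shots) <= {0, 1}
```

Swap sample_shots for your actual runner once it exists; the two assertions stay the same.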

README checklist (excerpt)

  • Prereqs: Docker 24+, Git LFS (if large datasets), access token for secure registry.
  • Steps to run locally: git clone, docker build, docker run with mount for data.
  • CI badge and validated platform matrix (Ubuntu 22.04 / Python 3.11).
  • How to reproduce raw logs and figures and where to find checksums.
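The checksums mentioned in the checklist can be verified with a few lines of stdlib Python. The checksums.txt name and its `<sha256>  <relative-path>` line format are assumptions here, chosen to match sha256sum output.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 without loading it whole."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_checksums(manifest: Path) -> list[str]:
    """Check every '<sha256>  <relative-path>' line; return mismatched paths."""
    bad = []
    for line in manifest.read_text().splitlines():
        if not line.strip():
            continue
        expected, rel = line.split(maxsplit=1)
        if sha256_of(manifest.parent / rel) != expected:
            bad.append(rel)
    return bad
```

Recipients run this once after pulling data; a non-empty return value means the transfer or the external storage drifted.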

Practical: containerize and run

Containerization decouples your environment from the host and is the easiest way to share reproducible experiments. Steps:

  1. Pin Python packages in requirements.txt or environment.yml.
  2. Create a Dockerfile with the target OS and necessary drivers (add NVIDIA CUDA only if GPU experiments).
  3. Build and tag the image with semantic versioning: docker build -t registry.example.com/org/experiment:0.1.0 .
  4. Push to a private registry or qbitshare repository: docker push registry... (use short-lived tokens).
  5. Share image/artifact URLs and metadata.json so other teams can pull and run with the same seed/inputs.
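Deriving the image tag from metadata.json keeps steps 3-5 consistent: the tag can never drift from the released version. This tiny helper is a sketch; the registry default reuses the placeholder from the steps above.

```python
def image_tag(metadata: dict, registry: str = "registry.example.com/org") -> str:
    """Build a semver-tagged image name from the package metadata.

    metadata is the parsed metadata.json, so name and version come from
    one place and the docker tag always matches the release.
    """
    return f"{registry}/{metadata['name']}:{metadata['version']}"
```

A release script can then shell out to `docker build -t <tag> .` and `docker push <tag>` with the returned string.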

CI for quantum: automated validation

CI should recreate the environment, run smoke tests, and verify outputs. Example GitHub Actions workflow (kept as ci/github-actions.yml in the layout above; note that GitHub only executes workflows from .github/workflows/, so copy it there in a live repository):

name: CI
on: [push, pull_request]
jobs:
  build-and-test:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: Install deps
        run: pip install -r environment/requirements.txt
      - name: Run smoke tests
        run: pytest -q tests/test_smoke.py
      - name: Build Docker image
        run: docker build -t ${{ secrets.REGISTRY }}/org/experiment:${{ github.sha }} .

For larger experiments, CI can conditionally run heavy tests on a scheduled matrix, and run only simulated hardware tests for pull requests. If you integrate timing analysis into delivery pipelines, consider linking with deployment and hardware maintenance windows — see our article on Integrating Timing Analysis into Quantum Firmware Delivery Pipelines.

Secure research file transfer

Sharing datasets and images requires secure channels. Use:

  • Encrypted S3 buckets with presigned URLs and server-side encryption.
  • Secure registries (private Docker registries with role-based access).
  • scp/sftp with key-based auth for ad-hoc transfers (avoid emailing artifacts).
  • Enterprise secure repositories such as qbitshare or an internal equivalent with audit logs.

Document how to get credentials in your README or an onboarding doc, and avoid committing secrets to the repository. For secure workflow patterns and lessons, see our piece on Building Secure Workflows for Quantum Projects.

Reproducibility tips specific to quantum experiments

  • Seed every RNG and document seed usage in the top-level script.
  • Record exact backend (simulator vs cloud device) including device name and timestamp.
  • Include job IDs and raw device logs when possible; store these as artifacts in CI.
  • Abstract hardware access with a small wrapper so tests can use simulators deterministically.
  • Keep short sample datasets in the repo and store large sets externally with integrity hashes.
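The seeding, backend-recording, and hardware-abstraction tips above combine into one small wrapper. SimulatorBackend and run_and_record are hypothetical names for illustration; a real wrapper would put your SDK's cloud backend behind the same interface so tests stay deterministic.

```python
import random
import time
from dataclasses import dataclass

@dataclass
class SimulatorBackend:
    """Deterministic stand-in for a real device; tests should use this."""
    name: str = "local-simulator"
    seed: int = 1234

    def run(self, n_shots: int) -> list[int]:
        # Fresh seeded RNG per run: same seed, same shots, every time.
        rng = random.Random(self.seed)
        return [rng.randint(0, 1) for _ in range(n_shots)]

def run_and_record(backend, n_shots: int) -> dict:
    """Execute and capture the provenance the tips above call for."""
    results = backend.run(n_shots)
    return {
        "backend": backend.name,                 # exact backend name
        "seed": getattr(backend, "seed", None),  # documented seed usage
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "results": results,
    }
```

Dump the returned dict as a CI artifact next to job IDs and raw device logs; it is the record a reviewer needs to replay the run.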

Example qbitshare workflow (end-to-end)

Below is a compact workflow tailored for teams using qbitshare-style repositories and wanting to share quantum SDK examples and experiments.

  1. Developer packages experiment locally using the recommended repo layout and populates metadata.json with SDK pins.
  2. Run local smoke tests: python -m pytest tests/test_smoke.py
  3. Build and tag Docker image, push to private registry: docker build -t registry.example.com/org/exp:0.1.0 .; docker push ...
  4. Create a release in the qbitshare repository and attach metadata.json, Docker tag, and checksums for datasets.
  5. CI triggered on release executes the full validation matrix: simulator runs on Ubuntu, optional hardware smoke run scheduled as nightly job.
  6. Recipient team pulls container and runs reproducibility script: docker run --rm -v $(pwd)/data:/data registry.../exp:0.1.0 ./run_reproduce.sh
  7. If differences appear, CI artifacts and device logs are reviewed and a reproducibility issue is created with links to the qbitshare release and artifacts.

Troubleshooting & common pitfalls

  • Missing native libs: ensure Dockerfile installs system dependencies (e.g., libopenblas, CUDA drivers).
  • Hardware-only features: provide a simulator fall-back and clearly document required device access rights.
  • Large notebooks with outputs: commit cleared notebooks and store outputs as separate artifacts to avoid noisy diffs.
  • Package drift: add periodic CI jobs to report outdated dependencies and proactively test pin combinations.
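The package-drift check above can be a scheduled CI job of a dozen lines. find_drift is a sketch: it compares each == pin in requirements.txt against what is actually installed, and the version_of parameter exists only so the lookup can be faked in tests.

```python
from importlib import metadata

def find_drift(requirements: str, version_of=metadata.version) -> list[str]:
    """Compare '==' pins against installed versions; report any drift."""
    drift = []
    for line in requirements.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, pinned = line.split("==", 1)
        try:
            installed = version_of(name)
        except metadata.PackageNotFoundError:
            drift.append(f"{name}: not installed (pinned {pinned})")
            continue
        if installed != pinned:
            drift.append(f"{name}: installed {installed} != pinned {pinned}")
    return drift
```

A nightly job that runs this inside the built container and posts a non-empty report catches drift before a recipient does.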

Practical checklist before sharing

  1. All code and notebooks included; outputs archived separately.
  2. Environment pinned and Docker image built and pushed.
  3. Smoke tests pass locally and in CI.
  4. Secure storage and access instructions are documented.
  5. Metadata and checksums published with the release.

Further reading and next steps

This guide focuses on practical steps for packaging and sharing reproducible quantum experiments. For complementary workflow patterns and team collaboration practices, see our article on Building the Future: Quantum Team Collaboration Tools and Strategies, and for secure data planning see Preparing for the Next Wave of Quantum Data: Insights from Security Trends. If you rely heavily on command-line tooling, our guide on Command Line Power: Leveraging Linux for Quantum Development helps standardize developer environments.

Start small: package one notebook + one small dataset, add a Dockerfile and a CI smoke test. As your team gains confidence, automate more validation and expand to scheduled hardware tests. Using consistent packaging and qbitshare-style secure sharing will make your quantum experiments far more credible and reproducible across teams.
