WarpX
https://github.com/blast-warpx/warpx
# WarpX

WarpX is an advanced, highly-parallel **Particle-In-Cell (PIC)** simulation code designed for modeling electromagnetic and electrostatic plasma physics at exascale. It supports multiple field solver types (Maxwell's equations, Poisson's equation, Ampere's law coupled with Ohm's law), a variety of grid geometries (1D/2D/3D Cartesian and axisymmetric cylindrical RZ), and a rich set of multi-physics packages including ionization, collisions, fusion reactions, and quantum electrodynamics (QED). WarpX was awarded the 2022 ACM Gordon Bell Prize and is built on the AMReX adaptive mesh refinement framework, enabling it to scale to the world's largest supercomputers while also running on laptops.

The code is accessed primarily through its **PICMI Python interface** (`pywarpx.picmi`) — an interoperable standard for PIC input descriptions — or via native AMReX parameter input files. Python bindings allow users to inspect and modify simulation data (fields and particles) at runtime through callback hooks, enabling seamless integration with external analysis tools, AI/ML frameworks, and other simulation codes. WarpX runs on multi-core CPUs and NVIDIA/AMD/Intel GPUs, supports MPI+OpenMP parallelism with dynamic load balancing, and outputs data in the openPMD format for downstream analysis.

---

## Installation

### Install via conda-forge (CPU only)

The quickest way to get a working WarpX installation on Linux/macOS/Windows without GPU support.

```bash
mamba create -n warpx -c conda-forge warpx
mamba activate warpx

# verify installation
python3 -c "import pywarpx; print(pywarpx.__version__)"
```

### Install from source with pip (CPU/GPU)

Build WarpX with Python bindings from the GitHub source. Requires CMake and a C++17 compiler.
```bash
# install build-time dependencies (conda dev environment, with MPI)
conda create -n warpx-dev -c conda-forge blaspp boost ccache cmake compilers git \
    lapackpp "openpmd-api=*=mpi_mpich*" openpmd-viewer packaging pytest python \
    numpy pandas scipy yt "fftw=*=mpi_mpich*" pkg-config matplotlib mamba mpich \
    mpi4py ninja pip wheel
conda activate warpx-dev

# build and install WarpX Python package from source
python3 -m pip install -U pip build packaging setuptools wheel cmake
python3 -m pip wheel -v git+https://github.com/BLAST-WarpX/warpx.git
python3 -m pip install *.whl
```

### Build from source with CMake

Full control over build options including GPU compute backends, dimensionality, and optional physics.

```bash
git clone https://github.com/BLAST-WarpX/warpx.git $HOME/src/warpx
cd $HOME/src/warpx

# configure: 3D build with Python bindings, CUDA GPU support
cmake -S . -B build \
    -DWarpX_DIMS=3 \
    -DWarpX_PYTHON=ON \
    -DWarpX_COMPUTE=CUDA \
    -DWarpX_MPI=ON \
    -DWarpX_PSATD=ON

# compile using 8 parallel jobs
cmake --build build -j 8

# executables appear in build/bin/
# e.g., build/bin/warpx.3d.MPI.CUDA.DP.QED
```

### Install via Spack

```bash
spack mirror add rolling https://binaries.spack.io/develop
spack buildcache keys --install --trust

# install WarpX with Python bindings and MPI
spack install warpx +python
spack load warpx +python
```

---

## PICMI Simulation Setup

### `picmi.Simulation` — Central simulation object

The `Simulation` class is the top-level container for all simulation components: field solver, species, diagnostics, lasers, and applied fields. It drives the time advance via `step()`.
```python
#!/usr/bin/env python3
"""Minimal 3D laser-plasma acceleration simulation using PICMI."""
from pywarpx import picmi

c = picmi.constants.c
q_e = picmi.constants.q_e

# --- Grid
grid = picmi.Cartesian3DGrid(
    number_of_cells=[32, 32, 256],
    lower_bound=[-30e-6, -30e-6, -56e-6],
    upper_bound=[30e-6, 30e-6, 12e-6],
    lower_boundary_conditions=["periodic", "periodic", "dirichlet"],
    upper_boundary_conditions=["periodic", "periodic", "dirichlet"],
    lower_boundary_conditions_particles=["periodic", "periodic", "absorbing"],
    upper_boundary_conditions_particles=["periodic", "periodic", "absorbing"],
    moving_window_velocity=[0.0, 0.0, c],  # co-moving window
    warpx_max_grid_size=64,
    warpx_blocking_factor=32,
)

# --- Field solver (Yee FDTD, CFL=1)
solver = picmi.ElectromagneticSolver(grid=grid, method="Yee", cfl=1.0)

# --- Plasma electrons (uniform density slab)
plasma = picmi.Species(
    particle_type="electron",
    name="electrons",
    initial_distribution=picmi.UniformDistribution(
        density=2e23,
        lower_bound=[-20e-6, -20e-6, 0.0],
        upper_bound=[20e-6, 20e-6, None],  # None = extends to end of box
        fill_in=True,
    ),
)

# --- Drive laser (Gaussian pulse)
laser = picmi.GaussianLaser(
    wavelength=0.8e-6,
    waist=5e-6,
    duration=15e-15,
    focal_position=[0, 0, 100e-6 + 9e-6],
    centroid_position=[0, 0, 9e-6 - c * 30e-15],
    propagation_direction=[0, 0, 1],
    polarization_direction=[0, 1, 0],
    E0=16e12,
)
laser_antenna = picmi.LaserAntenna(
    position=[0.0, 0.0, 9e-6], normal_vector=[0, 0, 1]
)

# --- Diagnostics
field_diag = picmi.FieldDiagnostic(
    name="diag1",
    grid=grid,
    period=100,
    data_list=["B", "E", "J", "rho"],
    warpx_format="openpmd",
)
particle_diag = picmi.ParticleDiagnostic(
    name="diag1",
    period=100,
)

# --- Assemble and run
sim = picmi.Simulation(
    solver=solver,
    max_steps=100,
    verbose=1,
    particle_shape="cubic",
)
sim.add_species(
    plasma,
    layout=picmi.GriddedLayout(grid=grid, n_macroparticle_per_cell=[1, 1, 1]),
)
sim.add_laser(laser, injection_method=laser_antenna)
sim.add_diagnostic(field_diag)
sim.add_diagnostic(particle_diag)

# optional: dump an AMReX inputs file for running with the compiled binary
sim.write_input_file(file_name="inputs_3d_picmi")

# run interactively from Python (with optional Python extensions)
sim.step(100)
# Expected: simulation runs 100 steps; openPMD data written to ./diags/diag1/
```

---

## Grid Types

### `picmi.Cartesian3DGrid` / `picmi.Cartesian2DGrid` / `picmi.Cartesian1DGrid`

Define the computational domain, resolution, boundary conditions, and optional moving window for Cartesian geometries.

```python
from pywarpx import picmi

# 2D Cartesian grid with open (PML) boundaries and a moving window
grid_2d = picmi.Cartesian2DGrid(
    number_of_cells=[128, 512],
    lower_bound=[-200e-6, -200e-6],
    upper_bound=[200e-6, 200e-6],
    lower_boundary_conditions=["periodic", "open"],
    upper_boundary_conditions=["periodic", "open"],
    lower_boundary_conditions_particles=["periodic", "absorbing"],
    upper_boundary_conditions_particles=["periodic", "absorbing"],
    moving_window_velocity=[0.0, picmi.constants.c],
    warpx_max_grid_size=32,
)
```

### `picmi.CylindricalGrid`

Axisymmetric RZ geometry with support for multiple azimuthal modes (quasi-3D).

```python
from pywarpx import picmi

# Quasi-3D cylindrical grid (RZ) — common for laser-plasma acceleration
grid_rz = picmi.CylindricalGrid(
    number_of_cells=[64, 256],   # [nr, nz]
    n_azimuthal_modes=2,         # m=0 and m=1
    lower_bound=[0.0, -100e-6],  # [rmin, zmin]
    upper_bound=[50e-6, 50e-6],  # [rmax, zmax]
    lower_boundary_conditions=["none", "open"],
    upper_boundary_conditions=["dirichlet", "open"],
    lower_boundary_conditions_particles=["none", "absorbing"],
    upper_boundary_conditions_particles=["absorbing", "absorbing"],
    moving_window_velocity=[0.0, picmi.constants.c],
)
```

---

## Field Solvers

### `picmi.ElectromagneticSolver`

FDTD or PSATD solver for Maxwell's equations. Supports Yee, CKC (Cole-Karkkainen-Cowan), and PSATD methods.
```python
from pywarpx import picmi

# Yee FDTD with CFL = 0.99
yee_solver = picmi.ElectromagneticSolver(
    grid=grid,
    method="Yee",
    cfl=0.99,
    divE_cleaning=1,  # divergence cleaning
)

# Pseudo-spectral analytic time domain (PSATD) — suppresses NCI
psatd_solver = picmi.ElectromagneticSolver(
    grid=grid,
    method="PSATD",
    cfl=1.0,
    warpx_galilean_velocity=[0.0, 0.0, 0.99 * picmi.constants.c],  # Galilean frame
)
```

### `picmi.ElectrostaticSolver`

Poisson/Multigrid solver for purely electrostatic problems.

```python
from pywarpx import picmi

es_solver = picmi.ElectrostaticSolver(
    grid=grid,
    method="Multigrid",  # or "FFT"
    warpx_absolute_tolerance=1e-7,
)
```

---

## Particle Species

### `picmi.Species`

Defines a particle species (electrons, ions, photons, custom). Species can be named, carry custom real/integer attributes, and be initialised from one or more distribution functions.

```python
from pywarpx import picmi

# Background plasma electrons initialised from a Gaussian bunch
beam_dist = picmi.GaussianBunchDistribution(
    n_physical_particles=1e9,
    rms_bunch_size=[0.5e-6, 0.5e-6, 0.5e-6],
    rms_velocity=[0.01 * picmi.constants.c] * 3,
    centroid_position=[0.0, 0.0, -28e-6],
    centroid_velocity=[0.0, 0.0, 500.0 * picmi.constants.c],
)
beam = picmi.Species(
    particle_type="electron",
    name="beam",
    initial_distribution=beam_dist,
    warpx_save_previous_position=True,  # track x_old/y_old/z_old
    warpx_add_real_attributes={"energy0": "ux*ux + uy*uy + uz*uz"},  # custom attribute
)

# Uniform plasma electrons — density ramp via analytic function
plasma_dist = picmi.AnalyticDistribution(
    density_expression="n0 * (1 + tanh((z - z0) / Lramp)) / 2",
    momentum_expressions=["0", "0", "0"],
    lower_bound=[-50e-6, -50e-6, 0.0],
    upper_bound=[50e-6, 50e-6, None],
    fill_in=True,
    n0=1e23, z0=50e-6, Lramp=10e-6,  # user constants
)
plasma_e = picmi.Species(
    particle_type="electron",
    name="plasma_e",
    initial_distribution=plasma_dist,
)
```

---

## Particle Distributions

### `picmi.UniformDistribution`

Uniform density within a specified bounding box.

```python
from pywarpx import picmi

uniform = picmi.UniformDistribution(
    density=1e22,
    lower_bound=[-200e-6, -200e-6, 0.0],
    upper_bound=[200e-6, 200e-6, None],
    directed_velocity=[0.0, 0.0, 1.0e9],  # bulk momentum (m/s)
    fill_in=True,
)
```

### `picmi.GaussianBunchDistribution`

Gaussian phase-space distribution, useful for relativistic beams.

```python
from pywarpx import picmi

c = picmi.constants.c

bunch = picmi.GaussianBunchDistribution(
    n_physical_particles=1e10,
    rms_bunch_size=[1e-6, 1e-6, 2e-6],     # σ_x, σ_y, σ_z [m]
    rms_velocity=[0.01*c, 0.01*c, 0.1*c],  # σ_ux, σ_uy, σ_uz [m/s]
    centroid_position=[0.0, 0.0, -30e-6],
    centroid_velocity=[0.0, 0.0, 1000.0*c],
)
```

### `picmi.ParticleListDistribution`

Inject a small, manually specified list of macro-particles.

```python
from pywarpx import picmi

single_electron = picmi.ParticleListDistribution(
    x=0.0, y=0.0, z=-0.25,
    ux=0.5e10, uy=0.0, uz=1.0e10,
    weight=1,
)
```

---

## Laser Pulses

### `picmi.GaussianLaser` + `picmi.LaserAntenna`

Inject a Gaussian laser pulse through a planar antenna.

```python
from pywarpx import picmi

c = picmi.constants.c

laser = picmi.GaussianLaser(
    wavelength=0.8e-6,  # 800 nm
    waist=5e-6,         # 1/e^2 intensity radius at focus (m)
    duration=15e-15,    # FWHM duration (s)
    focal_position=[0, 0, 100e-6],
    centroid_position=[0, 0, 9e-6 - c * 30e-15],
    propagation_direction=[0, 0, 1],
    polarization_direction=[0, 1, 0],
    E0=16e12,           # peak electric field amplitude (V/m)
    fill_in=False,
)
antenna = picmi.LaserAntenna(
    position=[0.0, 0.0, 9e-6],
    normal_vector=[0, 0, 1],
)
sim.add_laser(laser, injection_method=antenna)
```

---

## Diagnostics

### `picmi.FieldDiagnostic`

Write field data (E, B, J, rho, phi, …) to disk at regular intervals.
```python
from pywarpx import picmi

field_diag = picmi.FieldDiagnostic(
    name="fields",
    grid=grid,
    period=50,  # write every 50 steps
    data_list=["Ex", "Ey", "Ez", "Bx", "By", "Bz",
               "Jx", "Jy", "Jz", "rho", "part_per_cell"],
    warpx_format="openpmd",  # output in openPMD (HDF5 or ADIOS2)
    warpx_openpmd_backend="h5",
)
sim.add_diagnostic(field_diag)
```

### `picmi.ParticleDiagnostic`

Write particle data (positions, momenta, weights, custom attributes) to disk.

```python
from pywarpx import picmi

particle_diag = picmi.ParticleDiagnostic(
    name="particles",
    period=100,
    species=[electrons, beam],
    data_list=["x", "y", "z", "ux", "uy", "uz", "weighting"],
    warpx_format="openpmd",
)
sim.add_diagnostic(particle_diag)
```

### `picmi.ParticleBoundaryScrapingDiagnostic`

Record particles absorbed at boundaries or embedded boundaries for post-processing.

```python
from pywarpx import picmi

scraping_diag = picmi.ParticleBoundaryScrapingDiagnostic(
    name="scraped",
    period=-1,  # -1 = write only at end of simulation
    species=[electrons, ions],
    warpx_format="openpmd",
)
sim.add_diagnostic(scraping_diag)
```

---

## Embedded Boundaries

### `picmi.EmbeddedBoundary`

Define conductor or dielectric objects inside the simulation domain as implicit surfaces. Used with the electrostatic solver to model physical objects such as antennas or spacecraft.

```python
from pywarpx import picmi

# Conducting sphere with fixed potential (electrostatic simulation)
sphere_eb = picmi.EmbeddedBoundary(
    implicit_function="-(x**2 + y**2 + z**2 - radius**2)",
    potential=1.0,  # boundary potential [V]
    radius=0.3277,  # user constant injected into the expression
)

sim = picmi.Simulation(
    solver=picmi.ElectrostaticSolver(grid=grid, method="Multigrid"),
    time_step_size=1.27e-8,
    max_steps=1000,
    warpx_embedded_boundary=sphere_eb,
)
```

---

## Python Callbacks

### `pywarpx.callbacks.installcallback` / decorator variants

Install user-defined Python functions to be called at specific points in the time-advance loop.
Available hook points include `afterstep`, `beforestep`, `afterEsolve`, `afterdeposition`, `particleinjection`, `poissonsolver`, and many more.

```python
from pywarpx import picmi
from pywarpx.callbacks import callfromafterstep, installcallback

# Method 1: decorator
@callfromafterstep
def monitor_energy():
    """Called automatically after every time step."""
    warpx = sim.extension.warpx
    t = warpx.gett_new(0)
    step = warpx.getistep(0)
    Ex = sim.fields.get("Efield_fp", dir="x", level=0)
    print(f"Step {step:05d}  t={t:.3e} s  max(Ex)={Ex.max():.3e} V/m")

# Method 2: explicit install
def inject_particles():
    """Inject new electrons from a custom source at every step."""
    electrons = sim.particles.get("electrons")
    electrons.add_particles(
        x=[0.0], y=[0.0], z=[5e-6],
        ux=[0.0], uy=[0.0], uz=[1e9],
        w=[1.0],
        unique_particles=True,
    )

installcallback("particleinjection", inject_particles)

sim.step(200)
# Expected: monitor_energy() and inject_particles() called each step;
# max(Ex) printed to stdout at every time step
```

---

## Field Data Access

### `sim.fields.get` — Read and write fields at runtime

Access any named `MultiFab` (field array) from a running simulation inside a callback. Supports both NumPy-like global indexing and high-performance local box-iteration for GPU-compatible code.
```python
from pywarpx import picmi
from pywarpx.callbacks import callfromafterstep
from pywarpx.LoadThirdParty import load_cupy

xp, _ = load_cupy()  # returns cupy on GPU, numpy on CPU

@callfromafterstep
def apply_custom_field_modification():
    # --- Get field objects (no data copy; direct reference)
    Ex = sim.fields.get("Efield_fp", dir="x", level=0)
    Jy = sim.fields.get("current_fp", dir=1, level=0)

    # --- Global numpy-like read (gathers data across MPI ranks — use for analysis)
    Ex_slice = Ex[:, 5, :]  # all x and z at y-index 5
    print("Ex max:", Ex[:, :, :].max())

    # --- Global write (scatters to matching boxes)
    Jy[5, 6:20, 8:30] = 0.0  # zero out a region of Jy

    # --- Get mesh coordinates (useful for plotting)
    x_coords = Ex.mesh("x")  # 1D array of x positions (m)
    z_coords = Ex.mesh("z")

    # --- High-performance: explicit loop over local boxes (GPU-friendly)
    ngv = Ex.n_grow_vect
    for mfi in Ex:
        Ex_arr = Ex.array(mfi).to_xp()  # numpy or cupy array (zero-copy)
        Ex_arr[()] *= 0.5               # scale field by 0.5 in-place

    # --- Allocate a new custom MultiFab in the WarpX registry
    Ex_copy = sim.fields.alloc_init(
        name="Ex_copy", dir=0, level=0,
        ba=Ex.box_array(), dm=Ex.dm(),
        ncomp=Ex.n_comp, ngrow=Ex.n_grow_vect,
        initial_value=0.0,
        redistribute=True, redistribute_on_remake=True,
    )
    Ex_copy.copymf(Ex, 0, 0, 1, Ex.n_grow_vect)

sim.step(10)
```

---

## Particle Data Access

### `sim.particles.get` — Read and write particle data at runtime

Access particle positions, momenta, weights, and custom attributes from a callback.
```python
from pywarpx import picmi
from pywarpx.callbacks import callfromafterstep
from pywarpx.LoadThirdParty import load_cupy

xp, _ = load_cupy()

@callfromafterstep
def analyze_and_modify_particles():
    electrons = sim.particles.get("electrons")

    # --- Pandas DataFrame (read-only copy; convenient for analysis)
    df = electrons.to_df(local=True)
    print("N electrons (local):", len(df))
    print("Mean ux:", df["ux"].mean())
    # NOTE: modifying df does NOT change simulation data

    # --- High-performance in-place modification via explicit box loop
    for pti in electrons.iterator(level=0):
        # pti['x'], pti['ux'], pti['w'] etc. are numpy/cupy arrays
        pti["ux"][:] += xp.random.normal(0, 1e5, len(pti["ux"][:]))  # thermal kick

    # --- Count total particles (global, across MPI)
    print("Total electrons:", electrons.total_number_of_particles(True, False))

    # --- Add new particles dynamically
    electrons.add_particles(
        x=xp.array([1e-6, 2e-6]),
        y=xp.array([0.0, 0.0]),
        z=xp.array([5e-6, 5e-6]),
        ux=xp.zeros(2), uy=xp.zeros(2), uz=xp.ones(2) * 1e9,
        w=xp.ones(2),
        unique_particles=True,
    )

sim.step(50)
```

---

## Particle Boundary Buffer

### `particle_containers.ParticleBoundaryBufferWrapper` — Access scraped particles

Retrieve particles that were absorbed at domain boundaries or embedded boundaries, enabling custom secondary emission or boundary physics.
```python
import numpy as np

from pywarpx import callbacks, particle_containers, picmi
from pywarpx.LoadThirdParty import load_cupy

xp, _ = load_cupy()

def secondary_emission():
    """Called after each step; injects secondary electrons when ions hit the EB."""
    buffer = particle_containers.ParticleBoundaryBufferWrapper()
    lev = 0
    n_scraped = buffer.get_particle_boundary_buffer_size("ions", "eb")
    electrons = sim.particles.get("electrons")
    if n_scraped > 0:
        # Extract attributes of particles scraped this step from the embedded boundary
        r = xp.concatenate(buffer.get_particle_scraped_this_step("ions", "eb", "r", lev))
        z = xp.concatenate(buffer.get_particle_scraped_this_step("ions", "eb", "z", lev))
        ux = xp.concatenate(buffer.get_particle_scraped_this_step("ions", "eb", "ux", lev))
        uz = xp.concatenate(buffer.get_particle_scraped_this_step("ions", "eb", "uz", lev))
        w = xp.concatenate(buffer.get_particle_scraped_this_step("ions", "eb", "w", lev))

        # Emit one secondary electron per scraped ion (simplified model)
        electrons.add_particles(
            x=r.get() if hasattr(r, "get") else r,
            y=np.zeros(len(r)),
            z=z.get() if hasattr(z, "get") else z,
            ux=np.zeros(len(r)),
            uy=np.zeros(len(r)),
            uz=np.zeros(len(r)),
            w=w.get() if hasattr(w, "get") else w,
        )
        print(f"Emitted {len(r)} secondary electrons from EB.")

callbacks.installafterstep(secondary_emission)
sim.step(10)
```

---

## Global WarpX Object

### `sim.extension.warpx` — Low-level simulation controls

Direct access to the C++ `WarpX` object for querying time step, physical time, and invoking advanced controls.
```python
from pywarpx import picmi
from pywarpx.callbacks import callfromafterstep

@callfromafterstep
def log_timestep():
    warpx = sim.extension.warpx
    step = warpx.getistep(lev=0)  # current integer step
    t = warpx.gett_new(lev=0)     # current physical time (s)
    dt = warpx.getdt(lev=0)       # current time step size (s)
    print(f"[Step {step:05d}] t = {t:.4e} s  dt = {dt:.4e} s")

    # Dynamically update electrostatic boundary potential
    if step == 50:
        warpx.set_potential_on_eb("2.5")  # set EB to 2.5 V at step 50

sim.step(100)
# Expected output at each step:
# [Step 00001] t = 1.2700e-08 s  dt = 1.2700e-08 s
# [Step 00002] t = 2.5400e-08 s  dt = 1.2700e-08 s
# ...
# (EB potential updated to 2.5 V at step 50)
```

---

## AMReX Input Files (C++ Parameter List)

### Running WarpX as a compiled binary with an `inputs` file

For production HPC runs without Python overhead. Parameters use AMReX ParmParse syntax. Physical constants (`q_e`, `m_e`, `clight`, …) and user-defined constants via `my_constants.*` can be used in all numeric fields.
```bash
# Run on 4 MPI ranks
mpirun -np 4 ./warpx.3d.MPI.OMP.DP inputs_3d_laser
```

Sample inputs file `inputs_3d_laser`:

```
stop_time = 3.0e-13
max_step = 500

amr.n_cell = 32 32 256
amr.max_level = 0
amr.blocking_factor = 32
amr.max_grid_size = 64

geometry.dims = 3
geometry.prob_lo = -30.e-6 -30.e-6 -56.e-6
geometry.prob_hi =  30.e-6  30.e-6  12.e-6

boundary.field_lo = periodic periodic pml
boundary.field_hi = periodic periodic pml

warpx.cfl = 1.0
warpx.do_moving_window = 1
warpx.moving_window_dir = z
warpx.moving_window_v = 1.0  # in units of c

particles.species_names = electrons
electrons.species_type = electron
electrons.injection_style = NUniformPerCell
electrons.num_particles_per_cell_each_dim = 1 1 1
electrons.profile = constant
electrons.density = 2.0e23
electrons.momentum_distribution_type = at_rest
electrons.zmin = 0.0

warpx.serialize_initial_conditions = 1
warpx.use_filter = 1
amrex.the_arena_is_managed = 0

# User-defined constants for use in expressions
my_constants.n0 = 2.0e23
my_constants.wp = "sqrt(n0 * q_e**2 / (epsilon0 * m_e))"

diagnostics.diags_names = diag1
diag1.intervals = 100
diag1.diag_type = Full
diag1.fields_to_plot = Ex Ey Ez Bx By Bz jx jy jz rho
diag1.format = openpmd
```

---

## Post-Processing with openPMD-viewer

### Load and visualise WarpX output using `openpmd_viewer`

```python
import matplotlib.pyplot as plt
from openpmd_viewer import OpenPMDTimeSeries

# Open an openPMD time series from WarpX field diagnostics
ts = OpenPMDTimeSeries('./diags/diag1/')
print("Available iterations:", ts.iterations)
print("Available fields:", ts.avail_fields)
print("Available particle species:", ts.avail_species)

# --- Load the Ez field at iteration 100
Ez, info = ts.get_field(field='Ez', iteration=100)
print("Ez shape:", Ez.shape, " axes:", info.axes)

plt.imshow(Ez, origin='lower', extent=info.imshow_extent,
           aspect='auto', cmap='RdBu_r')
plt.colorbar(label='Ez [V/m]')
plt.xlabel('z [m]'); plt.ylabel('x [m]')
it_index = list(ts.iterations).index(100)
plt.title('Ez at t = {:.2e} s'.format(ts.t[it_index]))
plt.show()

# --- Load particle data
z, ux = ts.get_particle(['z', 'ux'], species='electrons', iteration=100)
print(f"N electrons: {len(z)}  mean ux: {ux.mean():.3e}")

# --- Interactive Jupyter slider (browse all iterations)
# ts.slider()
```

---

## Summary

WarpX is used across a broad range of plasma physics applications: laser-wakefield acceleration (LWFA), plasma-wakefield acceleration (PWFA), spacecraft charging in space plasma environments, high-energy astrophysical plasma modeling, capacitive discharge and ion beam extraction, and fundamental studies of plasma instabilities. Its PICMI Python interface allows researchers to describe complex multi-species simulations in concise scripts that run identically from a laptop to a petascale supercomputer, while the callback and field/particle access APIs enable tight integration with machine learning frameworks (PyTorch, TensorFlow), analysis pipelines, and custom physics modules without recompiling the code.

For HPC deployment, WarpX input files can be generated from PICMI scripts (`sim.write_input_file()`), decoupling the setup logic from the compute job. Output is written in the openPMD standard (HDF5 or ADIOS2 backends) and consumed by `openpmd-viewer`, `openpmd-api`, yt, ParaView, VisIt, and VisualPIC.

Advanced numerical features — PSATD with Galilean/comoving frames to suppress the Numerical Cherenkov Instability, adaptive mesh refinement, embedded boundaries for complex geometry, and implicit time advance for low-frequency problems — are controlled through the same unified PICMI or parameter-list interface, making WarpX suitable as both a rapid-prototyping tool and a production-scale simulation engine.
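As a closing worked example, the plasma-frequency expression that appears in the `my_constants` block (`wp = sqrt(n0 * q_e**2 / (epsilon0 * m_e))`) can be evaluated directly in Python to sanity-check grid resolution before launching a run. This is a generic plasma-physics sketch using `scipy.constants`, not a WarpX API; the 30-cells-per-plasma-wavelength rule of thumb below is an assumption, not a WarpX requirement.

```python
import math
from scipy.constants import c, e, epsilon_0, m_e

def plasma_wavelength(n0: float) -> float:
    """Plasma wavelength lambda_p = 2*pi*c / omega_p for electron density n0 [m^-3]."""
    omega_p = math.sqrt(n0 * e**2 / (epsilon_0 * m_e))  # plasma frequency [rad/s]
    return 2.0 * math.pi * c / omega_p

# Density used throughout the examples above
n0 = 2.0e23  # [m^-3]
lam_p = plasma_wavelength(n0)

# Assumed rule of thumb: ~30 longitudinal cells per plasma wavelength
dz = lam_p / 30.0
print(f"lambda_p = {lam_p * 1e6:.1f} um -> suggested dz = {dz * 1e6:.2f} um")
```

For `n0 = 2e23 m^-3` this gives a plasma wavelength of roughly 75 µm. Note that the 3D LWFA example above uses a much finer longitudinal cell size than this estimate, because in laser-driven runs the 0.8 µm laser wavelength, not the plasma wavelength, sets the resolution requirement.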