### LASER Development Environment Setup

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/README.md

Guides users through setting up the development environment for the LASER project. It involves cloning the repository, installing development tools like uv and tox, creating and activating a virtual environment, and running tests or builds using tox.

```bash
git clone https://github.com/InstituteforDiseaseModeling/laser-core.git
```

```bash
uv tool install tox --with tox-uv
```

```bash
cd laser-core
```

```bash
uv venv
```

```bash
# Mac or Linux:
source .venv/bin/activate
# Windows:
.venv\Scripts\activate
```

```bash
tox
```

```bash
tox -e py310
```

--------------------------------

### Local Development Setup and Git Workflow

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/CONTRIBUTING.rst

Steps to fork the LASER repository, clone it locally, create a new branch for development, make changes, and push them to GitHub.

```git
git clone git@github.com:YOURGITHUBNAME/laser.git
```

```git
git checkout -b name-of-your-bugfix-or-feature
```

```git
git add .
git commit -m "Your detailed description of your changes."
```

```git
git push origin name-of-your-bugfix-or-feature
```

--------------------------------

### Create Singularity Container Definition File

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

A Singularity definition file (`.def`) specifying the base image, installation steps, environment variables, and metadata for building a container. It installs Python, pip, and the `idmlaser` package from a private repository.
```singularity
Bootstrap: docker
From: rockylinux:9

%post
    dnf -y install python3-pip
    dnf -y install gcc-c++
    dnf -y install sudo
    dnf -y install epel-release
    dnf clean all
    python3 -m pip install pip --upgrade
    python3 -m pip install idmlaser -i https://packages.idmod.org/api/pypi/pypi-production/simple

%runscript

%environment
    export INPUT_ROOT=Assets
    export HEADLESS=1

%labels
    Author jonathan.bloedow@gatesfoundation.org

%help
    Container for running LASER prototype on COMPS
```

--------------------------------

### Create Local Workspace for Input Files

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

Python command to initiate the creation of a local workspace for simulation input files. This utility guides the user through selecting directories and scenarios for setting up the input structure.

```shell
python3.9 -m idmlaser.utils.build_template_workspace
```

--------------------------------

### Build Singularity Image (SIF) Locally

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

Command to build a Singularity Image Format (SIF) file from a definition file. This process requires Singularity to be installed on the local system.

```shell
singularity build sifs/laser.sif laser.def
```

--------------------------------

### Publish idmlaser to PyPI

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

Command to upload a Python package distribution file to a specified PyPI repository. Requires the `twine` utility and the package's wheel file.

```shell
python3 -m twine upload --verbose --repository-url https://packages.idmod.org/api/pypi/idm-pypi-production/ dist/idmlaser-0.0.5-py3-none-any.whl
```

--------------------------------

### Install Taichi and Initialize GPU

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Notes-on-Performance

Installs the Taichi library using pip and initializes Taichi to use the GPU backend.
This is the first step before running any Taichi computations.

```shell
%pip install taichi
```

```python
import taichi as ti
import taichi.math as tm

ti.init(arch=ti.gpu)
```

--------------------------------

### Install Development Version

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/README.rst

Installs the in-development version of the laser package directly from its GitHub repository.

```Shell
pip install https://github.com/InstituteforDiseaseModeling/laser/archive/main.zip
```

--------------------------------

### Editable Install with Companion Extras

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Moving-Calibration-and-Location-files-into-Companion-Packages

Combines the editable installation of the main package with the installation of optional companion packages. This allows for local development of both the core model and its associated data/scripts.

```shell
pip3 install -e ".[lesotho]"
```

--------------------------------

### Create SIF on COMPS using idmtools

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

Python script using `idmtools` to programmatically build a Singularity image on the COMPS platform. It defines a `SingularityBuildWorkItem` specifying the definition file and desired image name, then runs it.
```python
from idmtools.core.platform_factory import Platform
from idmtools_platform_comps.utils.singularity_build import SingularityBuildWorkItem

if __name__ == '__main__':
    platform = Platform("CALCULON")
    sbi = SingularityBuildWorkItem(
        name="Create sif from Artifactory dockerfile",
        definition_file="my_local_laser.def",
        image_name="my_new_laser.sif",
    )
    sbi.tags = dict(my_key="my_value")
    sbi.run(wait_until_done=True, platform=platform)
    if sbi.succeeded:
        # Write ID file
        sbi.asset_collection.to_id_file("laser.id")
```

--------------------------------

### Run LASER Simulation on COMPS

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

Python script execution to run the LASER simulation on the COMPS platform. This command initiates the simulation process, which can take minutes depending on the scenario complexity and simulation duration.

```shell
python3.9 run_laser_on_comps.py
```

--------------------------------

### Install laser-core Package

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/README.md

Installs the laser-core Python package using pip. This command is used to get the core library for the LASER project, enabling its use in various modeling efforts.

```bash
python3 -m pip install laser-core
```

--------------------------------

### Launch Optuna Dashboard

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/calibration.rst

Starts the Optuna Dashboard server to visualize calibration results. It connects to the specified database storage.
```shell
python -c "import optuna_dashboard; optuna_dashboard.run_server('mysql+pymysql://optuna:superSecretPassword@127.0.0.1:3306/optunaDatabase')"
```

--------------------------------

### Installing Companion Package with Extras

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Moving-Calibration-and-Location-files-into-Companion-Packages

Demonstrates how to install the main LASER disease model package along with an optional companion package, such as for Lesotho calibration data. This leverages Python's extended dependency specification.

```shell
pip3 install laser-leprosy[lesotho]
```

--------------------------------

### Install laser-core Package

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/README.rst

Installs the stable version of the laser-core package from PyPI using pip.

```Shell
pip install laser-core
```

--------------------------------

### Setup Workspace with build_template_workspace

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Using-idmlaser-as-a-pip-installable-module

Initializes the workspace for the idmlaser module by running a utility script. This script prompts the user for a sandbox directory path and sets up the necessary environment.

```shell
python3 -m idmlaser.utils.build_template_workspace
Enter the sandbox directory path (default: /var/tmp/sandbox):
```

--------------------------------

### LASER LaserFrame.load_snapshot Example

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/eula.rst

Python example demonstrating how to load a LASER simulation snapshot using `LaserFrame.load_snapshot`. This retrieves the population frame, results matrix, and parameters from a saved file.
```Python
from laser.frame import LaserFrame

load_path = "/path/to/your/snapshot.lasersnap"

# Load the snapshot data
loaded_frame, loaded_results_r, loaded_parameters = LaserFrame.load_snapshot(load_path)

# After loading, adjust capacity if necessary.
# If not doing births, set capacity to the current count:
# loaded_frame.capacity = loaded_frame.count
# If doing births, capacity might be set based on projected growth, for example:
# loaded_frame.capacity = loaded_frame.count + projected_births

# Reconstruct components using init_from_file() if needed, for example:
# model_components = init_from_file(loaded_frame, loaded_results_r, loaded_parameters)

print("Snapshot loaded successfully.")
print(f"Loaded frame count: {loaded_frame.count}")
```

--------------------------------

### Editable Install of Main Package

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Moving-Calibration-and-Location-files-into-Companion-Packages

Standard command for installing a Python package in editable mode from its source directory. This is useful for local development and testing of the core model.

```shell
pip3 install -e .
```

--------------------------------

### Add Input Directory to Task Assets

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

Python code snippet demonstrating how to add a directory containing input files to a task's common assets. This ensures the simulation has access to necessary data, such as 'inputs_ew' for the England and Wales scenario.

```python
task.common_assets.add_directory('inputs_ew')
```

--------------------------------

### Install laser-core package

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/installation.rst

This command installs the laser-core package using pip, the Python package installer. Ensure you have Python and pip installed on your system.
```shell
pip install laser-core
```

--------------------------------

### Configure COMPS Resources with add_schedule_config

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

Illustrates how to specify computational resources for COMPS jobs using the `add_schedule_config` function. This includes setting parameters like `node_group` for resource allocation.

```python
task.add_schedule_config(node_group='your_node_group', other_resource='value')
```

--------------------------------

### Cloning Repositories for Local Development

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Moving-Calibration-and-Location-files-into-Companion-Packages

Illustrates the process of cloning both the main LASER disease model repository and its companion package repository. This setup is necessary for developing both components locally and linking them.

```shell
git clone https://github.com/InstituteforDiseaseModeling/laser-leprosy.git
git clone https://github.com/InstituteforDiseaseModeling/laser-leprosy-lesotho.git
```

--------------------------------

### Running Tests and Builds with Tox

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/CONTRIBUTING.rst

Commands to execute all project checks and documentation builds using tox, run specific test environments, or run tests in parallel.

```shell
tox
```

```shell
tox -e envname -- pytest -k test_myfeature
```

```shell
tox -p auto
```

--------------------------------

### GitHub Codespaces Setup for Julia

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Status

This entry provides a link to a GitHub repository containing a pre-configured GitHub Codespace environment specifically set up for working with Julia. This facilitates quick setup and development for Julia projects without manual environment configuration.
```APIDOC
GitHub Repository:
  URL: https://github.com/krosenfeld-IDM/julia_codespace
  Description: This repository provides a configuration for GitHub Codespaces to create a ready-to-use development environment for Julia. It includes necessary tools and dependencies for Julia development, allowing users to quickly start coding, running, and debugging Julia projects directly in their browser or VS Code.
  Key Features:
    - Pre-installed Julia version.
    - Common Julia packages might be pre-installed or easily installable.
    - VS Code integration with Julia extensions.
    - Environment configured for efficient development workflows.
  Usage:
    1. Navigate to the repository on GitHub.
    2. Click the "Use this template" or "Code" button and select "Open with Codespaces".
    3. A new Codespace environment will be created and launched.
    4. Begin developing Julia code within the provided environment.
```

--------------------------------

### LASER LaserFrame.save_snapshot Example

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/eula.rst

Python example showing how to save a LASER simulation snapshot using `LaserFrame.save_snapshot`. This function stores the active population, results, and parameters to a specified path.

```Python
from laser.frame import LaserFrame
from laser.properties import PropertySet
import pandas as pd

# Assume 'frame' is an instance of LaserFrame
# Assume 'results_r' is a pandas DataFrame (e.g., results.R)
# Assume 'parameters' is a PropertySet object

save_path = "/path/to/your/snapshot.lasersnap"

# Save the snapshot, including the active population, results, and parameters
LaserFrame.save_snapshot(save_path, frame, results_r=results_r, pars=parameters)

print(f"Snapshot saved to {save_path}")
```

--------------------------------

### Install idmlaser (User Mode)

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Using-idmlaser-as-a-pip-installable-module

Installs the idmlaser module using pip for general users.
It assumes the user's pip configuration points to the IDM Artifactory. This is the primary installation method for end-users.

```shell
pip3 install idmlaser
```

--------------------------------

### Install idmlaser (Developer Mode)

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Using-idmlaser-as-a-pip-installable-module

Installs the idmlaser module in developer mode from a Git repository. This involves cloning the repository, navigating into the cloned directory, and performing an editable installation using pip.

```shell
git clone -b jb_modulify --single-branch https://github.com/InstituteforDiseaseModeling/laser.git
cd laser
pip3 install -e .
```

--------------------------------

### Get Best Optuna Trial

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/calibration.rst

Uses the Optuna CLI to retrieve the best trial from a specified study. Requires the study name and the database storage URL.

```shell
optuna best-trial \
  --study-name=test_polio_calib \
  --storage "mysql+pymysql://optuna:superSecretPassword@localhost:3306/optunaDatabase"
```

--------------------------------

### Upload Assets to COMPS and Record Collection ID

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Running-on-COMPS

Python command to create an asset collection on the COMPS platform, typically uploading local files from a specified directory. The resulting Asset Collection ID is then saved to a file.

```shell
python3.9 -m COMPS create_asset_collection sifs/
```

--------------------------------

### Running Calibration from Companion Package

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Moving-Calibration-and-Location-files-into-Companion-Packages

Shows the command to execute a specific module, like a calibration script, from an installed companion package. This allows direct invocation of project-specific functionality.
```shell
python3 -m laser_leprosy_lesotho.calibrate
```

--------------------------------

### Main Simulation Execution Flow

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/eula.rst

Defines the main entry point for the simulation script. It uses `click` for command-line argument parsing, allowing users to resume a simulation from a snapshot file or start a new one. It orchestrates the initialization, seeding, result population, squashing, running, and saving/plotting of the simulation.

```python
import click

# RecoveredSquashModel is defined elsewhere in the script.

@click.command()
@click.option("--init-pop-file", type=click.Path(), default=None, help="Path to snapshot to resume from.")
@click.option("--output", type=click.Path(), default="model_output.h5")
def main(init_pop_file, output):
    if init_pop_file:
        model = RecoveredSquashModel.load(init_pop_file)
        model.run()
        model.plot()
    else:
        model = RecoveredSquashModel()
        model.initialize()
        model.seed_infections()
        model.populate_results()
        model.squash_recovered()
        model.save(output)
        print(f"Initial population saved to {output}")

if __name__ == "__main__":
    main()
```

--------------------------------

### Python Function with Docstring and Example

Source: https://github.com/institutefordiseasemodeling/laser/wiki/LASER-Modeling-in-R-(2)

Provides a Python function `calculate_force_of_infection` with a comprehensive docstring. The docstring includes parameter descriptions, return types, and a runnable example, mirroring R's emphasis on clear, example-driven documentation.

```Python
def calculate_force_of_infection(infected, susceptible, contact_rate):
    """
    Calculate the force of infection based on SIR dynamics.

    Parameters:
        infected (int): Number of infected individuals.
        susceptible (int): Number of susceptible individuals.
        contact_rate (float): Rate of contact between individuals.

    Returns:
        float: Force of infection.

    Example:
        >>> calculate_force_of_infection(10, 100, 0.05)
        0.5
    """
    # Minimal body (assumed, not in the original) so the doctest holds: 0.05 * 10 == 0.5
    return contact_rate * infected
```

--------------------------------

### SIR Model Simulation Setup and Run

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/spatialexample.rst

Demonstrates how to initialize and run the multi-node SIR simulation. It sets up parameters, creates the model instance, adds migration, transmission, and recovery components, runs the simulation, and saves/plots results.

```python
# Parameters
params = {
    "population_size": 1_000_000,
    "nodes": 20,
    "timesteps": 600,
    "initial_infected_fraction": 0.01,
    "transmission_rate": 0.25,
    "migration_rate": 0.001,
}

# Run simulation
model = MultiNodeSIRModel(params)
model.add_component(MigrationComponent(model))
model.add_component(TransmissionComponent(model))
model.add_component(RecoveryComponent(model))
model.run()
model.save_results("simulation_results.csv")
model.plot_results()
```

--------------------------------

### Run LASER Measles Examples

Source: https://github.com/institutefordiseasemodeling/laser/wiki/LASER-pip-package-structure

This command demonstrates how to execute example scenarios for the measles LASER model. It is used to generate runnable demos, including notebooks, for specific geographies and time periods.

```shell
python -m laser_measles.get_examples
```

--------------------------------

### Python Running the SIR Model Simulation

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/example.rst

Demonstrates the execution flow of an SIR model simulation. It involves initializing the `SIRModel` with parameters, adding necessary components like `IntrahostProgression` and `Transmission`, running the simulation, and finally plotting the results.
```python
# Initialize the model
sir_model = SIRModel(params)

# Initialize and add components
sir_model.add_component(IntrahostProgression(sir_model))
sir_model.add_component(Transmission(sir_model))

# Run the simulation
sir_model.run()

# Plot results
sir_model.plot_results()
```

--------------------------------

### Fetch API Schema

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Geospatial-Input-Data-&-Webservices

Retrieves the API schema from the base URL to guide user input. It makes a GET request to the root endpoint and expects a JSON response. Includes basic error handling for non-200 status codes.

```python
import requests

BASE_URL = "http://ipadvapp06.linux.idm.ctr:8080"

def fetch_schema():
    """Fetches the API schema to guide user input."""
    response = requests.get(BASE_URL + "/")
    if response.status_code == 200:
        return response.json()
    else:
        print("[ERROR] Failed to fetch API schema. Please check if the server is running.")
        exit(1)
```

--------------------------------

### LASER LaserFrame.squash Example

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/eula.rst

Illustrative Python code snippet demonstrating the use of `LaserFrame.squash` for population defragmentation, typically used within a custom squashing function.

```Python
from laser.frame import LaserFrame

# Assume 'frame' is an instance of LaserFrame
# Assume 'disease_state' is a column in the frame

# Example: Create a mask for active agents (e.g., not recovered)
# This mask should be computed BEFORE calling squash
active_mask = frame.disease_state != 2

# Call squash to defragment the frame based on the mask
# Only agents where active_mask is True will remain in the active portion
frame.squash(active_mask)

# After squashing, frame.count reflects the number of active agents.
# The data beyond frame.count is considered invalid or overwritten.
```

--------------------------------

### Install idmlaser with Artifactory URL

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Using-idmlaser-as-a-pip-installable-module

Installs the idmlaser module directly from the IDM Artifactory URL. This method is useful if the user cannot configure their pip.conf file.

```shell
pip3 install idmlaser -i https://packages.idmod.org/api/pypi/pypi-production/simple
```

--------------------------------

### Building a Template Workspace for IDMLaser

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Towards-a-LASER-Co‐Pilot

Command to create a template workspace, including essential configuration files like settings.py and demographics_settings.py, for first-time users or new simulation setups.

```bash
python3 -m idmlaser.utils.build_template_workspace
```

--------------------------------

### Configure Cloud Calibration

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/calibration.rst

Sets the database connection string in the cloud calibration configuration file. This example uses a MySQL database with Optuna for storing calibration results.

```python
"mysql+pymysql://optuna:superSecretPassword@localhost:3306/optunaDatabase"
```

--------------------------------

### Navigate to Sandbox Directory

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Using-idmlaser-as-a-pip-installable-module

Changes the current working directory to the sandbox directory created during the setup phase. This is a prerequisite for running module commands within the sandbox.

```shell
pushd /var/tmp/sandbox
```

--------------------------------

### pyproject.toml Configuration for Local Companion Packages

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Moving-Calibration-and-Location-files-into-Companion-Packages

Example of how to configure the `pyproject.toml` file to define optional dependencies that point to local companion packages using file paths.
This enables pip to correctly link local development versions.

```toml
[project.optional-dependencies]
nigeria = ["laser-polio-nigeria @ file://../laser-polio-nigeria"]
```

--------------------------------

### OpenMP Solution Prompting

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/performance.rst

Provides an example of how to prompt AI for generating OpenMP solutions, specifically requesting the inclusion of optimal pragmas for parallelization. This is useful when Numba alone is insufficient.

```text
"Can you generate an OpenMP solution with the best pragmas?"
```

--------------------------------

### Run Docker Container for Population Data Tool

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Geospatial-Input-Data-&-Webservices

Example command to execute a Docker container for population data processing. It mounts the current directory to `/data` for output and specifies the country (NGA) and administrative level (2) as arguments.

```bash
docker run -v $(pwd):/data pop_tool_image:latest tool.sh --iso NGA --adm 2
```

--------------------------------

### Launch Calibration Workers

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/calibration.rst

Starts multiple worker processes to perform the calibration tasks. This command is executed after configuring the calibration parameters.

```shell
python3 run_calib_workers.py
```

--------------------------------

### Python R-like Chaining with Pandas Example

Source: https://github.com/institutefordiseasemodeling/laser/wiki/LASER-Modeling-in-R-(2)

Provides an example of creating R-like data manipulation workflows in Python using the `pipe` method with pandas DataFrames. This facilitates chaining operations for data preparation and analysis, mimicking R's `%>%` operator.
```python
def prepare_population(data):
    return data.assign(susceptible=lambda x: x['total_population'] - x['infected'])

def calculate_infections(data, contact_rate):
    return data.assign(new_infections=lambda x: x['susceptible'] * contact_rate)

# R-like chaining in Python
# Assuming population_data is a pandas DataFrame
# result = (
#     population_data
#     .pipe(prepare_population)
#     .pipe(calculate_infections, contact_rate=0.05)
# )
```

--------------------------------

### Configure pip for IDM Artifactory

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Using-idmlaser-as-a-pip-installable-module

Shows how to configure the pip configuration file (~/.pip/pip.conf) to use the IDM Artifactory as the package index. This is a prerequisite for the standard user installation.

```shell
cat ~/.pip/pip.conf
[global]
index-url = https://packages.idmod.org/api/pypi/pypi-production/simple
```

--------------------------------

### Main Execution Flow

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Geospatial-Input-Data-&-Webservices

Orchestrates the data fetching process by first fetching the API schema, then entering a loop to get user input and fetch population data. Allows the user to perform multiple requests or exit.

```python
import requests
import io
import zipfile
import os

BASE_URL = "http://ipadvapp06.linux.idm.ctr:8080"

def fetch_schema():
    """Fetches the API schema to guide user input."""
    response = requests.get(BASE_URL + "/")
    if response.status_code == 200:
        return response.json()
    else:
        print("[ERROR] Failed to fetch API schema. Please check if the server is running.")
        exit(1)

def get_user_input(schema):
    """Prompts the user for ISO, admin level, and visualization option."""
    print("\n=== Population Data Request ===")
    iso_code = input("Enter 3-letter ISO country code (e.g., KEN, NGA, ETH): ").strip().upper()

    valid_admin_levels = list(range(4))  # API says valid levels are 0-3
    while True:
        try:
            adm_level = int(input(f"Enter administrative level {valid_admin_levels}: ").strip())
            if adm_level in valid_admin_levels:
                break
            else:
                print(f"[ERROR] Invalid admin level. Choose from {valid_admin_levels}.")
        except ValueError:
            print("[ERROR] Please enter a valid integer.")

    png_option = input("Include visualization (Y/N)? ").strip().lower()
    png_flag = "1" if png_option == "y" else "0"

    return iso_code, adm_level, png_flag

def fetch_population_data(iso, adm, png):
    """Fetches population data and processes response."""
    params = {"iso": iso.upper(), "adm": adm, "png": png}
    print(f"[INFO] Sending request to {BASE_URL}/popcsv with parameters: {params}")
    response = requests.get(BASE_URL + "/popcsv", params=params, stream=True)

    if response.status_code == 200:
        content_type = response.headers.get("Content-Type", "")
        if "application/zip" in content_type:
            # If ZIP file is returned
            zip_buffer = io.BytesIO(response.content)
            with zipfile.ZipFile(zip_buffer, 'r') as zip_ref:
                zip_ref.extractall("downloads")
                print("[SUCCESS] Files saved in 'downloads/' directory.")
                for file in zip_ref.namelist():
                    print(f" - {file}")
                # If PNG exists, display it
                if "population_map.png" in zip_ref.namelist():
                    try:
                        from PIL import Image
                        img_path = os.path.join("downloads", "population_map.png")
                        img = Image.open(img_path)
                        img.show()  # Opens the image using the default viewer
                    except ImportError:
                        print("[INFO] Install PIL (Pillow) to view images.")
        else:
            # If CSV file is returned
            csv_filename = f"downloads/{iso.lower()}_adm{adm}_population.csv"
            with open(csv_filename, "wb") as f:
                f.write(response.content)
            print(f"[SUCCESS] CSV file saved as '{csv_filename}'")
    else:
        print(f"[ERROR] Failed to fetch data. Status: {response.status_code}")
        print(response.text)

def main():
    """Main loop for user interaction."""
    schema = fetch_schema()
    while True:
        iso, adm, png = get_user_input(schema)
        fetch_population_data(iso, adm, png)
        repeat = input("\nWould you like to make another request? (Y/N): ").strip().lower()
        if repeat != "y":
            print("Exiting.")
            break

if __name__ == "__main__":
    main()
```

--------------------------------

### Python OO Core and FP Interface Example

Source: https://github.com/institutefordiseasemodeling/laser/wiki/LASER-Modeling-in-R-(2)

Demonstrates a Python class for a disease model (OO) and a function that uses it as an FP interface. This showcases encapsulating simulation logic within an object while providing a simpler, functional entry point for users.

```python
# OO Core
class DiseaseModel:
    def __init__(self, population_size, contact_matrix):
        self.population_size = population_size
        self.contact_matrix = contact_matrix
        self.state = None

    def run_timestep(self):
        # Update state
        pass

# FP Interface
def run_simulation(population_size, contact_matrix, timesteps):
    model = DiseaseModel(population_size, contact_matrix)
    for _ in range(timesteps):
        model.run_timestep()
    return model.state
```

--------------------------------

### OpenAI MCP Service Client/Server Interaction

Source: https://github.com/institutefordiseasemodeling/laser/wiki/MCP-Notes-&-Experiments

Notes on using FastMCP for building an OpenAI MCP service. While example server code runs out-of-the-box, significant challenges exist in establishing basic client/server handshaking for testing. Using ChatGPT to generate FastMCP server code is noted as inefficient.

```python
# Conceptual example of FastMCP client/server interaction
# This is illustrative and based on user notes, not direct code from text.
# Server-side (conceptual)
# from fastmcp.server import Server
# server = Server(port=8000)
# server.start()

# Client-side (conceptual)
# from fastmcp.client import Client
# client = Client(server_address="http://localhost:8000")
# try:
#     # Attempt to establish connection/handshake
#     response = client.send_request("some_method", payload={...})
#     print(response)
# except Exception as e:
#     print(f"Connection or handshake failed: {e}")

# Note: User reports significant pain points in getting basic handshaking working.
```

--------------------------------

### Python FP State Management Example

Source: https://github.com/institutefordiseasemodeling/laser/wiki/LASER-Modeling-in-R-(2)

Illustrates functional programming style in Python by showing a function that takes explicit state and returns a new state, avoiding in-place mutations. This promotes transparency and predictability in data processing workflows.

```python
def simulate_timestep(state, contact_matrix):
    # Compute next state based on inputs
    next_state = state.copy()  # Avoid mutation
    # Update logic
    return next_state
```

--------------------------------

### Load and Sample US Age Pyramid

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/examples/age_pyramid.ipynb

Loads a U.S. age pyramid from a CSV file using LASER Core, samples agent ages from the female population distribution, and converts bin indices to specific ages.

```python
america = load_pyramid_csv(Path.cwd() / "United States of America-2024.csv")
sampler = AliasedDistribution(america[:, FCOL])  # We'll use the female population in this example.
n_agents = 100_000
samples = sampler.sample(n_agents)  # Sample 100,000 people from the distribution.

# samples will be bin indices, so we need to convert them to ages.
bin_min_age_days = america[:, MINCOL] * 365  # minimum age for bin, in days (include this value)
bin_max_age_days = (america[:, MAXCOL] + 1) * 365  # maximum age for bin, in days (exclude this value)
mask = np.zeros(n_agents, dtype=bool)
ages = np.zeros(n_agents, dtype=np.int32)
for i in range(len(america)):  # for each possible bin value...
    mask[:] = samples == i  # ...find the agents that belong to this bin
    # ...and assign a random age, in days, within the bin
    ages[mask] = np.random.randint(bin_min_age_days[i], bin_max_age_days[i], mask.sum())
```

--------------------------------

### Python Simulation Parameters with PropertySet

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/example.rst

Defines simulation parameters using the `PropertySet` class. This includes key values such as population size, infection rate, and the total number of timesteps for the simulation.

```python
params = PropertySet({
    "population_size": 100_000,
    "infection_rate": 0.3,
    "timesteps": 160
})
```

--------------------------------

### Running the IDMLaser Measles Simulation

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Towards-a-LASER-Co‐Pilot

Command to run the measles simulation for the CCS (1 node) example. Requires a settings.py file. If it's the first run, a template workspace needs to be built.

```bash
python3 -m idmlaser.measles
```

--------------------------------

### Load and Sample Nigerian Age Pyramid

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/examples/age_pyramid.ipynb

Loads a Nigerian age pyramid from a CSV file using LASER Core, samples agent ages from the male population distribution, and converts bin indices to specific ages.
```python
from pathlib import Path

import numpy as np

from laser_core.demographics import AliasedDistribution
from laser_core.demographics import load_pyramid_csv

MCOL = 2
FCOL = 3
MINCOL = 0
MAXCOL = 1

nigeria = load_pyramid_csv(Path.cwd() / "Nigeria-2024.csv")
sampler = AliasedDistribution(nigeria[:, MCOL])  # We'll use the male population in this example.
n_agents = 100_000
samples = sampler.sample(n_agents)  # Sample 100,000 people from the distribution.
# samples will be bin indices, so we need to convert them to ages.
bin_min_age_days = nigeria[:, MINCOL] * 365  # minimum age for bin, in days (include this value)
bin_max_age_days = (nigeria[:, MAXCOL] + 1) * 365  # maximum age for bin, in days (exclude this value)
mask = np.zeros(n_agents, dtype=bool)
ages = np.zeros(n_agents, dtype=np.int32)
for i in range(len(nigeria)):  # for each possible bin value...
    mask[:] = samples == i  # ...find the agents that belong to this bin
    # ...and assign a random age, in days, within the bin
    ages[mask] = np.random.randint(bin_min_age_days[i], bin_max_age_days[i], mask.sum())
```

--------------------------------

### Python Singleton Pattern Implementation

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Consciously-Choosing-OOP‐ness

Provides an example of the Singleton design pattern in Python, ensuring a class has only one instance and offering a global access point. Useful for managing shared resources or configurations.
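As a quick, self-contained check of the pattern, a minimal variant (the class name `Config` and the omission of the incubation-period accessors are illustrative simplifications) shows that repeated construction yields one shared instance:

```python
class Config:
    """Minimal singleton sketch: at most one instance per process."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

a = Config()
b = Config()
print(a is b)  # True: both names refer to the single shared instance
```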
```python
class Singleton:
    _instance = None

    def __new__(cls, *args, **kwargs):
        if not cls._instance:
            cls._instance = super(Singleton, cls).__new__(cls, *args, **kwargs)
        return cls._instance

    def __init__(self):
        # Note: __init__ runs on every Singleton() call, so this resets the value.
        self.incubation_period = 7

    def get_incubation_period(self):
        return self.incubation_period

    def set_incubation_period(self, new_value):
        self.incubation_period = new_value
```

--------------------------------

### SIRModel Class Implementation

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/example.rst

Defines the core `SIRModel` class for simulating disease dynamics. It initializes the population using `LaserFrame`, sets up disease state and recovery timer properties, and includes methods for tracking results, running the simulation, and plotting outcomes. Requires `numpy`, `matplotlib.pyplot`, `laser_core.LaserFrame`, and `laser_core.PropertySet`.

```python
import numpy as np
import matplotlib.pyplot as plt

from laser_core import LaserFrame
from laser_core import PropertySet


class SIRModel:
    def __init__(self, params):
        # Model parameters
        self.params = params

        # Initialize the population LaserFrame
        self.population = LaserFrame(capacity=params.population_size, initial_count=params.population_size)

        # Add disease state property (0 = Susceptible, 1 = Infected, 2 = Recovered)
        self.population.add_scalar_property("disease_state", dtype=np.int32, default=0)

        # Add a recovery timer property (for intrahost progression, optional for timing)
        self.population.add_scalar_property("recovery_timer", dtype=np.int32, default=0)

        # Results tracking
        self.results = LaserFrame(capacity=1)  # number of nodes
        self.results.add_vector_property("S", length=params["timesteps"], dtype=np.float32)
        self.results.add_vector_property("I", length=params["timesteps"], dtype=np.float32)
        self.results.add_vector_property("R", length=params["timesteps"], dtype=np.float32)

        # Components
        self.components = []

    def add_component(self, component):
        self.components.append(component)

    def track_results(self, tick):
        susceptible = (self.population.disease_state == 0).sum()
        infected = (self.population.disease_state == 1).sum()
        recovered = (self.population.disease_state == 2).sum()
        total = self.population.count
        self.results.S[tick] = susceptible / total
        self.results.I[tick] = infected / total
        self.results.R[tick] = recovered / total

    def run(self):
        for tick in range(self.params.timesteps):
            for component in self.components:
                component.step()
            self.track_results(tick)

    def plot_results(self):
        plt.figure(figsize=(10, 6))
        plt.plot(self.results.S, label="Susceptible (S)", color="blue")
        plt.plot(self.results.I, label="Infected (I)", color="red")
        plt.plot(self.results.R, label="Recovered (R)", color="green")
        plt.title("SIR Model Dynamics with LASER Components")
        plt.xlabel("Time (Timesteps)")
        plt.ylabel("Fraction of Population")
        plt.legend()
        plt.grid()
        plt.savefig("gpt_sir.png")  # save before show(), which may close the figure
        plt.show()
```

--------------------------------

### Simulation Parameter Setup

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/docs/vdexample.rst

Defines simulation parameters using the PropertySet class. This includes setting the initial population size, crude birth rate (CBR), and the total number of simulation timesteps.

```python
params = PropertySet({
    "population_size": 100_000,
    "cbr": 15,  # Crude Birth Rate: 15 per 1000 per year
    "timesteps": 365 * 10  # Run for 10 years
})
```

--------------------------------

### Templates Example (C++)

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Consciously-Choosing-OOP‐ness

Shows C++ templates, allowing a 'Box' class to work with any data type 'T'. It includes methods to set and get the content of the box, providing compile-time type safety.
```C++
template <typename T>
class Box {
private:
    T content;

public:
    void setContent(T content) { this->content = content; }
    T getContent() { return content; }
};
```

--------------------------------

### Initialize KaplanMeierEstimator and Predict Death for Newborns

Source: https://github.com/institutefordiseasemodeling/laser/blob/main/examples/kmestimator.ipynb

Demonstrates initializing the KaplanMeierEstimator with cumulative death data and predicting the age at death for a population of newborns. It uses numpy for data handling and the laser_core library.

```python
import numpy as np

from laser_core.demographics import KaplanMeierEstimator

cumulative = np.array([
      8131,  9396, 10562, 11636, 12620, 13506, 14287, 14958, 15523, 15997,  # year 0..9
     16400, 16756, 17083, 17401, 17725, 18067, 18437, 18837, 19268, 19726,  # year 10..19
     20207, 20705, 21215, 21732, 22256, 22785, 23319, 23860, 24407, 24961,  # year 20..29
     25522, 26091, 26668, 27252, 27845, 28446, 29059, 29684, 30324, 30979,  # year 30..39
     31649, 32334, 33031, 33737, 34452, 35176, 35913, 36666, 37442, 38247,  # year 40..49
     39085, 39959, 40869, 41815, 42795, 43811, 44866, 45966, 47118, 48330,  # year 50..59
     49608, 50958, 52380, 53876, 55442, 57080, 58790, 60574, 62435, 64372,  # year 60..69
     66380, 68451, 70569, 72719, 74880, 77039, 79179, 81288, 83353, 85355,  # year 70..79
     87274, 89085, 90766, 92299, 93672, 94884, 95936, 96837, 97594, 98216,  # year 80..89
     98713, 99097, 99383, 99590, 99735, 99833, 99897, 99939, 99965, 99980,  # year 90..99
    100000,                                                                 # year 100+
], dtype=np.int32)

estimator = KaplanMeierEstimator(cumulative)
nagents = 100_000
dobs = np.zeros(nagents)  # dates of birth, newborns = 0
dods = estimator.predict_age_at_death(dobs, max_year=100)  # dates of death in days
```

```python
import matplotlib.pyplot as plt

histogram = np.zeros(cumulative.shape[0] + 1, np.int32)
yods = dods // 365  # years of death
for i in range(cumulative.shape[0] + 1):
    histogram[i] = np.sum(yods == i)

fig, ax1 = plt.subplots()

color = "tab:orange"
ax1.set_xlabel("Age")
ax1.set_ylabel("Cumulative Sampled Deaths", color=color)
ax1.plot(np.cumsum(histogram), color=color, marker="x")
ax1.tick_params(axis="y", labelcolor=color)

ax2 = ax1.twinx()
color = "tab:green"
ax2.set_ylabel("Input Cumulative Deaths", color=color)
ax2.plot(cumulative, color=color)
ax2.tick_params(axis="y", labelcolor=color)

fig.tight_layout()
plt.show()
```

--------------------------------

### PyCUDA SEIR Model Implementation

Source: https://github.com/institutefordiseasemodeling/laser/wiki/Status

An experimental implementation of the SEIR model using PyCUDA. The project notes indicate that PyCUDA setup and implementation can be technically demanding, potentially setting a high bar for user customization. Numba's CUDA support is considered a potential alternative for targeted performance improvements.

```python
# Conceptual PyCUDA kernel structure
# from pycuda import driver, gpuarray, compiler

# kernel_code = """
# __global__ void seir_kernel(float *S, float *I, float *R, float beta, float gamma, int N) {
#     int idx = threadIdx.x + blockIdx.x * blockDim.x;
#     if (idx < N) {
#         // SEIR model update logic for agent at index idx
#         // ... calculations ...
#     }
# }
# """

# Note: Actual PyCUDA usage involves compiling the kernel,
# allocating memory on the GPU, transferring data, launching the kernel,
# and retrieving results. This is a complex process.
```
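For orientation, the per-agent update such a kernel would perform can be sketched as a CPU reference in NumPy. The discrete-time SEIR transition rule, the integer state encoding, and the parameter values here are illustrative assumptions, not the project's actual model logic:

```python
import numpy as np

# Per-agent states, mirroring the kernel's one-thread-per-agent indexing:
# 0 = Susceptible, 1 = Exposed, 2 = Infectious, 3 = Recovered
def seir_step(state, beta, sigma, gamma, rng):
    i_frac = np.mean(state == 2)      # fraction of agents currently infectious
    draws = rng.random(state.size)    # one uniform draw per agent
    new_state = state.copy()          # transitions computed from the old state
    new_state[(state == 0) & (draws < beta * i_frac)] = 1  # S -> E (force of infection)
    new_state[(state == 1) & (draws < sigma)] = 2          # E -> I (incubation)
    new_state[(state == 2) & (draws < gamma)] = 3          # I -> R (recovery)
    return new_state

rng = np.random.default_rng(42)
state = np.zeros(100_000, dtype=np.int8)
state[:100] = 2                       # seed 100 infectious agents
for _ in range(10):
    state = seir_step(state, beta=0.3, sigma=0.2, gamma=0.1, rng=rng)
```

A GPU port, whether PyCUDA or Numba's `@cuda.jit`, would replace each boolean-mask assignment with a per-thread `if` on `idx`, as in the conceptual kernel above.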