# ShellFlow

ShellFlow is a minimal shell script orchestrator for mixed local and remote execution. It lets DevOps engineers write a single shell script with comment markers (`# @LOCAL` and `# @REMOTE <host>`) that define execution boundaries; ShellFlow runs each block in order, resolving remote targets from SSH configuration. The tool provides fail-fast execution, shared prelude support, and automatic context passing between blocks via the `SHELLFLOW_LAST_OUTPUT` environment variable.

The orchestrator is designed to be simple and direct, with a single-file Python implementation that reuses your existing `~/.ssh/config` for host definitions. Each block runs in a fresh shell environment, with shell options from the prelude copied into every block. ShellFlow validates remote targets against the SSH config before execution, so unknown hosts fail early with clear error messages rather than spawning SSH processes that are doomed to fail.

## CLI Commands

### Run a Script

Execute a shellflow script with local and remote blocks run sequentially, stopping on the first failure.

```bash
# Basic execution
shellflow run playbooks/deploy.sh

# With verbose output showing colored progress
shellflow run playbooks/deploy.sh --verbose
shellflow run playbooks/deploy.sh -v

# Using a custom SSH config file
shellflow run playbooks/deploy.sh --ssh-config ~/.ssh/config.work

# Check version
shellflow --version
```

### Run with UV (Development)

Execute shellflow without a global installation using the UV package manager.

```bash
# Run directly with uv
uv run shellflow run playbooks/hello.sh

# Run with verbose mode
uv run shellflow run playbooks/hello.sh -v
```

## Script Format

### Local Execution Block

Mark a block to run on the local machine using the `# @LOCAL` comment marker.
```bash
#!/bin/bash
set -euo pipefail

# @LOCAL
echo "Building application locally"
tar -czf app.tar.gz ./dist
ls -la app.tar.gz
```

### Remote Execution Block

Mark a block to run on a remote host using `# @REMOTE <host>`, where the host must match an entry in your SSH config.

```bash
#!/bin/bash
set -euo pipefail

# @REMOTE production-server
echo "Deploying to $(hostname)"
tar -xzf /tmp/app.tar.gz -C /var/www/html/
systemctl restart nginx
```

### Mixed Local and Remote Script

Combine local and remote blocks in a single script file for complete deployment workflows.

```bash
#!/bin/bash
set -euo pipefail

# @LOCAL
echo "Starting deployment..."
tar -czf app.tar.gz ./dist

# @REMOTE webserver
echo "Deploying to $(hostname)"
uname -a

# @LOCAL
echo "Previous block output: $SHELLFLOW_LAST_OUTPUT"
echo "Deployment complete!"
```

### Shared Prelude

Lines before the first marker are treated as a shared prelude and prepended to every executable block.

```bash
#!/bin/bash
set -euo pipefail
# Shared prelude above - applied to all blocks

# @LOCAL
echo "Prelude options (set -euo) are active here"

# @REMOTE production
echo "Prelude is also active on remote host"
```

## SSH Configuration

### Configure SSH Hosts

ShellFlow resolves remote targets from `~/.ssh/config`. The host alias in `@REMOTE` must match a `Host` entry.

```sshconfig
# ~/.ssh/config
Host production-server
    HostName 192.168.1.100
    User deploy
    Port 22
    IdentityFile ~/.ssh/id_ed25519

Host staging
    HostName 10.0.0.50
    User admin
    Port 2222
    IdentityFile ~/.ssh/staging_key
```

```bash
# Script using configured hosts
# @REMOTE production-server
hostname
uptime

# @REMOTE staging
echo "Deploying to staging"
```

### Custom SSH Config Path

Override the default SSH config path using the command line or an environment variable.
```bash
# Via command line
shellflow run deploy.sh --ssh-config ./custom_ssh_config

# Via environment variable
export SHELLFLOW_SSH_CONFIG=./custom_ssh_config
shellflow run deploy.sh
```

## Python API

### parse_script Function

Parse a shell script string into a list of executable Block objects.

```python
from shellflow import parse_script, Block

script_content = """#!/bin/bash
set -euo pipefail

# @LOCAL
echo "Building locally"
make build

# @REMOTE server1
echo "Deploying to server"
./deploy.sh
"""

blocks = parse_script(script_content)  # Returns a list of Block objects

for block in blocks:
    print(f"Target: {block.target}")
    print(f"Is Local: {block.is_local}")
    print(f"Is Remote: {block.is_remote}")
    print(f"Host: {block.host}")
    print(f"Commands: {block.commands}")
    print("---")

# Output:
# Target: LOCAL
# Is Local: True
# Is Remote: False
# Host: None
# Commands: ['#!/bin/bash', 'set -euo pipefail', 'echo "Building locally"', 'make build']
# ---
# Target: REMOTE:server1
# Is Local: False
# Is Remote: True
# Host: server1
# Commands: ['#!/bin/bash', 'set -euo pipefail', 'echo "Deploying to server"', './deploy.sh']
```

### run_script Function

Execute a list of blocks sequentially with fail-fast behavior.

```python
from shellflow import parse_script, run_script, RunResult

script = """# @LOCAL
echo "Step 1: Preparing"
mkdir -p /tmp/deploy

# @LOCAL
echo "Step 2: Building"
echo "build-artifact-123"

# @LOCAL
echo "Step 3: Previous output was: $SHELLFLOW_LAST_OUTPUT"
"""

blocks = parse_script(script)
result: RunResult = run_script(blocks, verbose=True)

print(f"Success: {result.success}")
print(f"Blocks executed: {result.blocks_executed}")
print(f"Error message: {result.error_message}")

# Access individual block results
for i, block_result in enumerate(result.block_results):
    print(f"Block {i+1}: exit_code={block_result.exit_code}, output={block_result.output[:50]}")
```

### execute_local Function

Execute a single local block with a custom execution context.
```python
from shellflow import Block, ExecutionContext, execute_local, ExecutionResult

# Create a block with commands
block = Block(
    target="LOCAL",
    commands=[
        'echo "Current user: $USER"',
        'echo "Custom var: $DEPLOY_ENV"',
        'pwd'
    ]
)

# Create an execution context with custom environment variables
context = ExecutionContext(
    env={"DEPLOY_ENV": "production", "VERSION": "1.2.3"},
    last_output="previous-block-output"
)

result: ExecutionResult = execute_local(block, context)

print(f"Success: {result.success}")
print(f"Exit code: {result.exit_code}")
print(f"Output: {result.output}")
print(f"Error: {result.error_message}")
```

### execute_remote Function

Execute a block on a remote host via SSH.

```python
from shellflow import Block, ExecutionContext, SSHConfig, execute_remote

# Create a remote block
block = Block(
    target="REMOTE:webserver",
    commands=[
        'hostname',
        'uptime',
        'df -h /'
    ]
)

# SSH configuration (typically read from ~/.ssh/config)
ssh_config = SSHConfig(
    host="webserver",
    hostname="192.168.1.100",
    user="deploy",
    port=22,
    identity_file="~/.ssh/deploy_key"
)

context = ExecutionContext()
result = execute_remote(block, context, ssh_config)

print(f"Success: {result.success}")
print(f"Output: {result.output}")
```

### read_ssh_config Function

Read the SSH configuration for a host from `~/.ssh/config`.

```python
from shellflow import read_ssh_config, SSHConfig

# Look up host configuration
config = read_ssh_config("production-server")

if config:
    print(f"Host: {config.host}")
    print(f"Hostname: {config.hostname}")
    print(f"User: {config.user}")
    print(f"Port: {config.port}")
    print(f"Identity File: {config.identity_file}")
else:
    print("Host not found in SSH config")
```

### Block Data Class

Represents a block of commands to execute, along with its target information.
```python
from shellflow import Block

# Local block
local_block = Block(
    target="LOCAL",
    commands=['echo "hello"', 'echo "world"']
)
print(f"Is local: {local_block.is_local}")    # True
print(f"Is remote: {local_block.is_remote}")  # False
print(f"Host: {local_block.host}")            # None

# Remote block
remote_block = Block(
    target="REMOTE:server1",
    commands=['hostname', 'uptime']
)
print(f"Is local: {remote_block.is_local}")    # False
print(f"Is remote: {remote_block.is_remote}")  # True
print(f"Host: {remote_block.host}")            # "server1"
```

### ExecutionContext Data Class

Context passed between block executions, carrying environment variables and the last block's output.

```python
from shellflow import ExecutionContext

# Create a context with custom variables
context = ExecutionContext(
    env={"APP_ENV": "production", "DEBUG": "false"},
    last_output="output-from-previous-block",
    success=True
)

# Convert to a shell environment (includes system env + custom vars)
shell_env = context.to_shell_env()
print(f"SHELLFLOW_LAST_OUTPUT: {shell_env['SHELLFLOW_LAST_OUTPUT']}")
print(f"APP_ENV: {shell_env['APP_ENV']}")
```

### Exception Handling

Handle ShellFlow-specific exceptions for parsing and execution errors.

```python
from shellflow import parse_script, run_script, ParseError, ExecutionError, ShellflowError

try:
    # This will raise ParseError - @REMOTE without a host
    script = """# @REMOTE
hostname
"""
    blocks = parse_script(script)
except ParseError as e:
    print(f"Parse error: {e}")

try:
    # Execute blocks that may fail
    script = """# @LOCAL
exit 1
"""
    blocks = parse_script(script)
    result = run_script(blocks)
    if not result.success:
        print(f"Execution failed: {result.error_message}")
except ShellflowError as e:
    print(f"ShellFlow error: {e}")
```

## Development Commands

### Testing and Validation

Run the test suite, linting, and type checking.
```bash
# Using Justfile commands
just test       # Run pytest unit tests
just bdd        # Run behave BDD tests
just test-all   # Run format, lint, test, and bdd
just lint       # Run ruff linter
just format     # Format code with ruff
just typecheck  # Type check with ty

# Direct commands
uv run pytest -q
uv run behave features
uv run ruff check .
uv run ty check src tests
```

### Building and Publishing

Build and publish the package to PyPI.

```bash
# Build the package
just build
uv build

# Publish to PyPI
just publish
uv publish

# GitHub Actions release (via tag push)
git tag v0.1.0
git push origin v0.1.0
```

## Summary

ShellFlow is ideal for DevOps automation scenarios where shell scripts need to execute across multiple environments in a defined sequence. Common use cases include deployment pipelines that build locally and deploy remotely, configuration management across server fleets, and any workflow requiring coordinated local and remote shell execution. The fail-fast behavior ensures that errors are caught early, preventing partial deployments or inconsistent states across environments.

Integration with existing infrastructure is straightforward, since ShellFlow reuses standard SSH configuration files and requires no additional host definitions. The shared prelude feature allows common shell options (like `set -euo pipefail`) to be defined once and applied to all blocks. Context passing via `SHELLFLOW_LAST_OUTPUT` enables blocks to communicate results, making it possible to chain operations where later blocks depend on earlier output. The Python API provides programmatic access for embedding ShellFlow in larger automation systems or building custom orchestration tools.