airflow tasks run <dag_id> <task_id> <execution_date_or_run_id>
Run a single task instance
Arguments
Name | Description |
---|---|
dag_id | The id of the DAG |
task_id | The id of the task |
execution_date_or_run_id | The execution_date of the DAG or run_id of the DAGRun |
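
For example, assuming a DAG named `example_dag` containing a task `extract` (placeholder names), the task can be run against a logical date or an existing DagRun:

```bash
# Run the 'extract' task of 'example_dag' for the 2024-01-01 logical date
# (dag id, task id, and date are placeholders; substitute your own)
airflow tasks run example_dag extract 2024-01-01

# Or reference an existing DagRun by its run_id instead of a date
airflow tasks run example_dag extract scheduled__2024-01-01T00:00:00+00:00
```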
Options
Name | Description |
---|---|
-h, --help | Show this help message and exit |
--cfg-path <cfg_path> | Path to config file to use instead of airflow.cfg |
-f, --force | Ignore previous task instance state, rerun regardless if task already succeeded/failed |
-A, --ignore-all-dependencies | Ignores all non-critical dependencies, including ignore_ti_state and ignore_task_deps |
-i, --ignore-dependencies | Ignore task-specific dependencies, e.g. upstream, depends_on_past, and retry delay dependencies |
-I, --ignore-depends-on-past | Ignore depends_on_past dependencies (but respect upstream dependencies) |
-N, --interactive | Do not capture standard output and error streams (useful for interactive debugging) |
-j, --job-id <job_id> | |
-l, --local | Run the task using the LocalExecutor |
--map-index <map_index> | Mapped task index |
-m, --mark-success | Mark jobs as succeeded without running them |
--no-shut-down-logging | |
-p, --pickle <pickle> | Serialized pickle object of the entire dag (used internally) |
--pool <pool> | Resource pool to use |
-r, --raw | |
--ship-dag | Pickles (serializes) the DAG and ships it to the worker |
-S, --subdir <subdir> | File location or directory from which to look for the dag. Defaults to '[AIRFLOW_HOME]/dags', where [AIRFLOW_HOME] is the value of the 'AIRFLOW_HOME' setting in 'airflow.cfg' |
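
A brief sketch combining some common options (the dag id, task id, date, and paths below are placeholders):

```bash
# Force a rerun with the LocalExecutor, ignoring all non-critical
# dependency checks such as prior task instance state
airflow tasks run example_dag extract 2024-01-01 --local --force --ignore-all-dependencies

# Target one expanded instance of a mapped task by its map index
airflow tasks run example_dag extract 2024-01-01 --map-index 2

# Look for the dag file somewhere other than [AIRFLOW_HOME]/dags
airflow tasks run example_dag extract 2024-01-01 --subdir /path/to/dags
```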