| Attribute | Value |
| --- | --- |
| dag_id | crawlers_hourly |
| duration | None |
| end_date | None |
| execution_date | 2025-09-02T12:35:10.606343+00:00 |
| executor_config | {} |
| generate_command | <function TaskInstance.generate_command at 0x7f0481183b70> |
| hostname |  |
| is_premature | False |
| job_id | None |
| key | ('crawlers_hourly', 'Wait', <Pendulum [2025-09-02T12:35:10.606343+00:00]>, 1) |
| log | <Logger airflow.task (INFO)> |
| log_filepath | /usr/local/airflow/logs/crawlers_hourly/Wait/2025-09-02T12:35:10.606343+00:00.log |
| log_url | http://localhost:8080/admin/airflow/log?dag_id=crawlers_hourly&task_id=Wait&execution_date=2025-09-02T12%3A35%3A10.606343%2B00%3A00 |
| logger | <Logger airflow.task (INFO)> |
| mark_success_url | http://localhost:8080/admin/airflow/success?task_id=Wait&dag_id=crawlers_hourly&execution_date=2025-09-02T12%3A35%3A10.606343%2B00%3A00&upstream=false&downstream=false |
| max_tries | 2 |
| metadata | MetaData(bind=None) |
| next_try_number | 1 |
| operator | None |
| pid | None |
| pool | general |
| previous_ti | <TaskInstance: crawlers_hourly.Wait 2025-08-30 21:59:10.115258+00:00 [None]> |
| priority_weight | 2 |
| queue | default |
| queued_dttm | None |
| raw | False |
| run_as_user | None |
| start_date | None |
| state | None |
| task | <Task(DummyOperator): Wait> |
| task_id | Wait |
| test_mode | False |
| try_number | 1 |
| unixname | airflow |
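
The table above is an alphabetical dump of the public attributes of a `TaskInstance` for the `Wait` task of the `crawlers_hourly` DAG; the table below dumps the same for the underlying operator (`ti.task`). The helper that produced these tables is not shown in the source, but here is a minimal sketch of how such a dump could be written, assuming Airflow 1.x (the `/admin/...` URLs above come from the 1.x webserver) and a callable that receives the task context:

```python
import inspect


def dump_attributes(obj):
    """Print an object's public, non-method attributes as a Markdown table."""
    print('| Attribute | Value |')
    print('| --- | --- |')
    for name in sorted(dir(obj)):
        if name.startswith('_'):
            continue  # skip private and dunder attributes
        try:
            value = getattr(obj, name)
        except Exception:
            continue  # some properties (e.g. previous_ti) need a DB session
        if inspect.ismethod(value):
            # Skip bound methods. Staticmethods such as generate_command are
            # plain functions, so they pass through, matching the table above.
            continue
        print('| {} | {} |'.format(name, value))


def debug_callable(**context):
    ti = context['ti']
    dump_attributes(ti)       # first table: the TaskInstance
    dump_attributes(ti.task)  # second table: the operator behind it
```

Run `debug_callable` from a `PythonOperator` with `provide_context=True` (required on Airflow 1.x) and both tables land in the task log.
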
| Attribute | Value |
| --- | --- |
| adhoc | False |
| dag | <DAG: crawlers_hourly> |
| dag_id | crawlers_hourly |
| depends_on_past | True |
| deps | {<TIDep(Previous Dagrun State)>, <TIDep(Trigger Rule)>, <TIDep(Not In Retry Period)>} |
| downstream_list | [<Task(DummyOperator): End>] |
| downstream_task_ids | {'End'} |
| email | ['airflow@airflow.com'] |
| email_on_failure | False |
| email_on_retry | False |
| end_date | None |
| execution_timeout | None |
| executor_config | {} |
| inlets | [] |
| lineage_data | None |
| log | <Logger airflow.task.operators (INFO)> |
| logger | <Logger airflow.task.operators (INFO)> |
| max_retry_delay | None |
| on_failure_callback | None |
| on_retry_callback | None |
| on_success_callback | None |
| outlets | [] |
| owner | airflow |
| params | {} |
| pool | general |
| priority_weight | 1 |
| priority_weight_total | 2 |
| queue | default |
| resources | {'cpus': {'_name': 'CPU', '_units_str': 'core(s)', '_qty': 1}, 'ram': {'_name': 'RAM', '_units_str': 'MB', '_qty': 512}, 'disk': {'_name': 'Disk', '_units_str': 'MB', '_qty': 512}, 'gpus': {'_name': 'GPU', '_units_str': 'gpu(s)', '_qty': 0}} |
| retries | 2 |
| retry_delay | 0:30:00 |
| retry_exponential_backoff | False |
| run_as_user | None |
| schedule_interval | 0 * * * * |
| sla | None |
| start_date | 2020-12-17T00:00:00+00:00 |
| task_concurrency | None |
| task_id | Wait |
| task_type | DummyOperator |
| template_ext | [] |
| template_fields | () |
| trigger_rule | all_success |
| ui_color | #e8f7e4 |
| ui_fgcolor | #000 |
| upstream_list | [<Task(DockerOperator): bcjobs-crawl>, <Task(DockerOperator): linkup-crawl>, <Task(DockerOperator): linkedin-crawl>, <Task(DockerOperator): simplyhired-crawl>] |
| upstream_task_ids | {'bcjobs-crawl', 'linkup-crawl', 'linkedin-crawl', 'simplyhired-crawl'} |
| wait_for_downstream | False |
| weight_rule | downstream |
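
Taken together, the two tables pin down most of the DAG's configuration. The following is a hedged reconstruction of a definition consistent with those values, not the author's actual code: the dag_id, schedule, start date, default arguments, and task ids come straight from the tables, while the DockerOperator image names are hypothetical placeholders (the real images and commands are not shown in the source).

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.docker_operator import DockerOperator
from airflow.operators.dummy_operator import DummyOperator

default_args = {
    'owner': 'airflow',                    # owner
    'email': ['airflow@airflow.com'],      # email
    'email_on_failure': False,
    'email_on_retry': False,
    'depends_on_past': True,
    'retries': 2,                          # retries (max_tries on the TI)
    'retry_delay': timedelta(minutes=30),  # retry_delay 0:30:00
    'pool': 'general',
}

dag = DAG(
    dag_id='crawlers_hourly',
    schedule_interval='0 * * * *',         # top of every hour
    start_date=datetime(2020, 12, 17),
    default_args=default_args,
)

wait = DummyOperator(task_id='Wait', dag=dag)  # trigger_rule defaults to all_success
end = DummyOperator(task_id='End', dag=dag)

for name in ('bcjobs-crawl', 'linkup-crawl', 'linkedin-crawl', 'simplyhired-crawl'):
    crawl = DockerOperator(
        task_id=name,
        image='example/{}:latest'.format(name),  # hypothetical image name
        dag=dag,
    )
    crawl >> wait  # Wait gates End on all four crawls succeeding

wait >> end
```

With depends_on_past=True in default_args, every task also waits on its own previous run, which is why the Previous Dagrun State dep appears in deps above; likewise, weight_rule downstream explains priority_weight_total 2 for Wait (its own weight of 1 plus End's).
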