status: stringclasses (1 value)
repo_name: stringclasses (13 values)
repo_url: stringclasses (13 values)
issue_id: int64 (1 to 104k)
updated_files: stringlengths (10 to 1.76k)
title: stringlengths (4 to 369)
body: stringlengths (0 to 254k)
issue_url: stringlengths (38 to 55)
pull_url: stringlengths (38 to 53)
before_fix_sha: stringlengths (40 to 40)
after_fix_sha: stringlengths (40 to 40)
report_datetime: unknown
language: stringclasses (5 values)
commit_datetime: unknown
closed
apache/airflow
https://github.com/apache/airflow
13,918
["airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", "kubernetes_tests/test_kubernetes_pod_operator.py", "kubernetes_tests/test_kubernetes_pod_operator_backcompat.py", "tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py"]
KubernetesPodOperator with pod_template_file = No Metadata & Wrong Pod Name
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** 1.15.15 **What happened**: If you use the **KubernetesPodOperator** with **LocalExecutor** and a **pod_template_file**, the created pod doesn't have metadata like: - dag_id - task_id - ... I want to have a ``privileged_escalation=True`` pod launched by a KubernetesPodOperator but without the KubernetesExecutor. Is that possible? **What you expected to happen**: Have the pod launched with privileged escalation, metadata and a correct pod-name override. **How to reproduce it**: * have a pod template file: **privileged_runner.yaml**: ```yaml apiVersion: v1 kind: Pod metadata: name: privileged-pod spec: containers: - name: base securityContext: allowPrivilegeEscalation: true privileged: true ``` * have a DAG file with a KubernetesPodOperator in it: **my-dag.py**: ```python ##=========================================================================================## ## CONFIGURATION from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator from airflow.operators.dummy_operator import DummyOperator from airflow.kubernetes.secret import Secret from kubernetes.client import models as k8s from airflow.models import Variable from datetime import datetime, timedelta from airflow import DAG env = Variable.get("process_env") namespace = Variable.get("namespace") default_args = { 'owner': 'airflow', 'depends_on_past': False, 'email_on_failure': False, 'email_on_retry': False, 'retries': 1, 'retry_delay': timedelta(minutes=5) } ##==============================## ## DAG definition dag = DAG( 'transfert-files-to-nexus', start_date=datetime.utcnow(), schedule_interval="0 2 * * *", default_args=default_args, max_active_runs=1 ) ##=========================================================================================## ## Task definitions start = DummyOperator(task_id='start', dag=dag) end = DummyOperator(task_id='end', dag=dag) transfertfile = KubernetesPodOperator(namespace=namespace, task_id="transfertfile", name="transfertfile", image="registrygitlab.fr/docker-images/python-runner:1.8.22", image_pull_secrets="registrygitlab-curie", pod_template_file="/opt/bitnami/airflow/dags/git-airflow-dags/privileged_runner.yaml", is_delete_operator_pod=False, get_logs=True, dag=dag) ## Task ordering start >> transfertfile >> end ``` **Anything else we need to know**: I know that we have to use the ``KubernetesExecutor`` in order to get the **metadata**, but even with the ``KubernetesExecutor``, using a **pod_template_file** with the ``KubernetesPodOperator`` makes no difference: with either ``LocalExecutor`` or ``KubernetesExecutor`` you end up with no correct pod-name override and no metadata.
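For context, a minimal hypothetical sketch (using the kubernetes client models; not the operator's actual code) of the reconciliation the reporter expects: the pod loaded from `privileged_runner.yaml` keeps its security context while the operator-generated name and labels are overlaid on top. The helper name `overlay_operator_metadata` and the label keys are illustrative assumptions.

```python
from kubernetes.client import models as k8s

# Pod as it would be loaded from privileged_runner.yaml
template_pod = k8s.V1Pod(
    metadata=k8s.V1ObjectMeta(name="privileged-pod"),
    spec=k8s.V1PodSpec(
        containers=[
            k8s.V1Container(
                name="base",
                security_context=k8s.V1SecurityContext(
                    allow_privilege_escalation=True,
                    privileged=True,
                ),
            )
        ]
    ),
)

def overlay_operator_metadata(pod: k8s.V1Pod, name: str, labels: dict) -> k8s.V1Pod:
    """Hypothetical helper: overlay operator-generated name/labels onto the template pod."""
    pod.metadata = pod.metadata or k8s.V1ObjectMeta()
    pod.metadata.name = name
    pod.metadata.labels = {**(pod.metadata.labels or {}), **labels}
    return pod

# Expected outcome: template securityContext preserved, task metadata attached.
pod = overlay_operator_metadata(
    template_pod,
    name="transfertfile",
    labels={"dag_id": "transfert-files-to-nexus", "task_id": "transfertfile"},
)
```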
https://github.com/apache/airflow/issues/13918
https://github.com/apache/airflow/pull/15492
def1e7c5841d89a60f8972a84b83fe362a6a878d
be421a6b07c2ae9167150b77dc1185a94812b358
"2021-01-26T20:27:09Z"
python
"2021-04-23T22:54:43Z"
closed
apache/airflow
https://github.com/apache/airflow
13,905
["setup.py"]
DockerOperator fails to pull an image
**Apache Airflow version**: 2.0 **Environment**: - **OS** (from /etc/os-release): Debian GNU/Linux 10 (buster) - **Kernel** (`uname -a`): Linux 37365fa0b59b 5.4.0-47-generic #51-Ubuntu SMP Fri Sep 4 19:50:52 UTC 2020 x86_64 GNU/Linux - **Others**: running inside a docker container, forked puckel/docker-airflow **What happened**: `DockerOperator` does not attempt to pull an image unless force_pull is set to True, instead displaying a misleading 404 error. **What you expected to happen**: `DockerOperator` should attempt to pull an image when it is not present locally. **How to reproduce it**: Make sure you don't have an image tagged `debian:buster-slim` present locally. ``` DockerOperator( task_id=f'try_to_pull_debian', image='debian:buster-slim', command=f'''echo hello''', force_pull=False ) ``` prints: `{taskinstance.py:1396} ERROR - 404 Client Error: Not Found ("No such image: ubuntu:latest")` This, on the other hand: ``` DockerOperator( task_id=f'try_to_pull_debian', image='debian:buster-slim', command=f'''echo hello''', force_pull=True ) ``` pulls the image and prints `{docker.py:263} INFO - hello` **Anything else we need to know**: I overrode `DockerOperator` to track down what I was doing wrong and found the following: When trying to run an image that's not present locally, `self.cli.images(name=self.image)` in the line: https://github.com/apache/airflow/blob/8723b1feb82339d7a4ba5b40a6c4d4bbb995a4f9/airflow/providers/docker/operators/docker.py#L286 returns a non-empty array even when the image has been deleted from the local machine. In fact, `self.cli.images` appears to return non-empty arrays even when supplied with nonsense image names. <details><summary>force_pull_false.log</summary> [2021-01-27 06:15:28,987] {__init__.py:124} DEBUG - Preparing lineage inlets and outlets [2021-01-27 06:15:28,987] {__init__.py:168} DEBUG - inlets: [], outlets: [] [2021-01-27 06:15:28,987] {config.py:21} DEBUG - Trying paths: ['/usr/local/airflow/.docker/config.json', '/usr/local/airflow/.dockercfg'] [2021-01-27 06:15:28,987] {config.py:25} DEBUG - Found file at path: /usr/local/airflow/.docker/config.json [2021-01-27 06:15:28,987] {auth.py:182} DEBUG - Found 'auths' section [2021-01-27 06:15:28,988] {auth.py:142} DEBUG - Found entry (registry='https://index.docker.io/v1/', username='xxxxxxx') [2021-01-27 06:15:29,015] {connectionpool.py:433} DEBUG - http://localhost:None "GET /version HTTP/1.1" 200 851 [2021-01-27 06:15:29,060] {connectionpool.py:433} DEBUG - http://localhost:None "GET /v1.41/images/json?filter=debian%3Abuster-slim&only_ids=0&all=0 HTTP/1.1" 200 None [2021-01-27 06:15:29,060] {docker.py:224} INFO - Starting docker container from image debian:buster-slim [2021-01-27 06:15:29,063] {connectionpool.py:433} DEBUG - http://localhost:None "POST /v1.41/containers/create HTTP/1.1" 404 48 [2021-01-27 06:15:29,063] {taskinstance.py:1396} ERROR - 404 Client Error: Not Found ("No such image: debian:buster-slim") Traceback (most recent call last): File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 261, in _raise_for_status response.raise_for_status() File "/usr/local/lib/python3.8/site-packages/requests/models.py", line 941, in raise_for_status raise HTTPError(http_error_msg, response=self) requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http+docker://localhost/v1.41/containers/create During handling of the above exception, another exception occurred: Traceback (most recent call last): File 
"/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1086, in _run_raw_task self._prepare_and_execute_task_with_callbacks(context, task) File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1260, in _prepare_and_execute_task_with_callbacks result = self._execute_task(context, task_copy) File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1300, in _execute_task result = task_copy.execute(context=context) File "/usr/local/lib/python3.8/site-packages/airflow/providers/docker/operators/docker.py", line 305, in execute return self._run_image() File "/usr/local/lib/python3.8/site-packages/airflow/providers/docker/operators/docker.py", line 231, in _run_image self.container = self.cli.create_container( File "/usr/local/lib/python3.8/site-packages/docker/api/container.py", line 427, in create_container return self.create_container_from_config(config, name) File "/usr/local/lib/python3.8/site-packages/docker/api/container.py", line 438, in create_container_from_config return self._result(res, True) File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 267, in _result self._raise_for_status(response) File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 263, in _raise_for_status raise create_api_error_from_http_exception(e) File "/usr/local/lib/python3.8/site-packages/docker/errors.py", line 31, in create_api_error_from_http_exception raise cls(e, response=response, explanation=explanation) docker.errors.ImageNotFound: 404 Client Error: Not Found ("No such image: debian:buster-slim") </details> <details><summary>force_pull_true.log</summary> [2021-01-27 06:17:01,811] {__init__.py:124} DEBUG - Preparing lineage inlets and outlets [2021-01-27 06:17:01,811] {__init__.py:168} DEBUG - inlets: [], outlets: [] [2021-01-27 06:17:01,811] {config.py:21} DEBUG - Trying paths: ['/usr/local/airflow/.docker/config.json', '/usr/local/airflow/.dockercfg'] [2021-01-27 06:17:01,811] {config.py:25} DEBUG - Found file at path: /usr/local/airflow/.docker/config.json [2021-01-27 06:17:01,811] {auth.py:182} DEBUG - Found 'auths' section [2021-01-27 06:17:01,812] {auth.py:142} DEBUG - Found entry (registry='https://index.docker.io/v1/', username='xxxxxxxxx') [2021-01-27 06:17:01,825] {connectionpool.py:433} DEBUG - http://localhost:None "GET /version HTTP/1.1" 200 851 [2021-01-27 06:17:01,826] {docker.py:287} INFO - Pulling docker image debian:buster-slim [2021-01-27 06:17:01,826] {auth.py:41} DEBUG - Looking for auth config [2021-01-27 06:17:01,826] {auth.py:242} DEBUG - Looking for auth entry for 'docker.io' [2021-01-27 06:17:01,826] {auth.py:250} DEBUG - Found 'https://index.docker.io/v1/' [2021-01-27 06:17:01,826] {auth.py:54} DEBUG - Found auth config [2021-01-27 06:17:04,399] {connectionpool.py:433} DEBUG - http://localhost:None "POST /v1.41/images/create?tag=buster-slim&fromImage=debian HTTP/1.1" 200 None [2021-01-27 06:17:04,400] {docker.py:301} INFO - buster-slim: Pulling from library/debian [2021-01-27 06:17:04,982] {docker.py:301} INFO - a076a628af6f: Pulling fs layer [2021-01-27 06:17:05,884] {docker.py:301} INFO - a076a628af6f: Downloading [2021-01-27 06:17:11,429] {docker.py:301} INFO - a076a628af6f: Verifying Checksum [2021-01-27 06:17:11,429] {docker.py:301} INFO - a076a628af6f: Download complete [2021-01-27 06:17:11,480] {docker.py:301} INFO - a076a628af6f: Extracting </details>
https://github.com/apache/airflow/issues/13905
https://github.com/apache/airflow/pull/15731
7933aaf07f5672503cfd83361b00fda9d4c281a3
41930fdebfaa7ed2c53e7861c77a83312ca9bdc4
"2021-01-26T05:49:03Z"
python
"2021-05-09T21:05:49Z"
closed
apache/airflow
https://github.com/apache/airflow
13,891
["airflow/api_connexion/endpoints/dag_run_endpoint.py", "airflow/migrations/versions/2c6edca13270_resource_based_permissions.py", "airflow/www/templates/airflow/dags.html", "airflow/www/views.py", "docs/apache-airflow/security/access-control.rst", "tests/api_connexion/endpoints/test_dag_run_endpoint.py", "tests/www/test_views.py"]
RBAC Granular DAG Permissions don't work as intended
Previous versions (before 2.0) allowed for granular can_edit DAG permissions so that different user groups can trigger different DAGs and access control is more specific. Since 2.0 it seems that this does not work as expected. How to reproduce: Create a copy of the VIEWER role, try adding it can dag edit on a specific DAG. **Expected Result:** user can trigger said DAG. **Actual Result:** user access is denied. It seems to be a new parameter was added: **can create on DAG runs** and without it the user cannot run DAGs, however, with it, the user can run all DAGs without limitations and I believe this is an unintended use.
https://github.com/apache/airflow/issues/13891
https://github.com/apache/airflow/pull/13922
568327f01a39d6f181dda62ef6a143f5096e6b97
629abfdbab23da24ca45996aaaa6e3aa094dd0de
"2021-01-25T13:55:12Z"
python
"2021-02-03T03:16:18Z"
closed
apache/airflow
https://github.com/apache/airflow
13,877
["airflow/migrations/versions/cf5dc11e79ad_drop_user_and_chart.py"]
Upgrading 1.10 sqlite database in 2.0 fails
While it is not an important case it might be annoying to users that if they used airflow 1.10 with sqlite, the migration to 2.0 will fail on dropping constraints in `known_event` table. It would be great to provide some more useful message then asking the user to remove the sqlite database. ``` [2021-01-24 08:38:42,015] {db.py:678} INFO - Creating tables INFO [alembic.runtime.migration] Context impl SQLiteImpl. INFO [alembic.runtime.migration] Will assume non-transactional DDL. INFO [alembic.runtime.migration] Running upgrade 03afc6b6f902 -> cf5dc11e79ad, drop_user_and_chart Traceback (most recent call last): File "/Users/vijayantsoni/.virtualenvs/airflow/bin/airflow", line 11, in <module> sys.exit(main()) File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main args.func(args) File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command return func(*args, **kwargs) File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/airflow/cli/commands/db_command.py", line 31, in initdb db.initdb() File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/airflow/utils/db.py", line 549, in initdb upgradedb() File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/airflow/utils/db.py", line 688, in upgradedb command.upgrade(config, 'heads') File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/command.py", line 294, in upgrade script.run_env() File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/script/base.py", line 481, in run_env util.load_python_file(self.dir, "env.py") File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/util/pyfiles.py", line 97, in load_python_file module = load_module_py(module_id, path) File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/util/compat.py", line 182, in load_module_py spec.loader.exec_module(module) File "<frozen importlib._bootstrap_external>", line 783, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/airflow/migrations/env.py", line 108, in <module> run_migrations_online() File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/airflow/migrations/env.py", line 102, in run_migrations_online context.run_migrations() File "<string>", line 8, in run_migrations File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/runtime/environment.py", line 813, in run_migrations self.get_context().run_migrations(**kw) File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/runtime/migration.py", line 560, in run_migrations step.migration_fn(**kw) File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/airflow/migrations/versions/cf5dc11e79ad_drop_user_and_chart.py", line 49, in upgrade op.drop_constraint('known_event_user_id_fkey', 'known_event') File "<string>", line 8, in drop_constraint File "<string>", line 3, in drop_constraint File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/operations/ops.py", line 148, in drop_constraint return operations.invoke(op) File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/operations/base.py", line 354, in invoke return fn(self, operation) File 
"/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/operations/toimpl.py", line 160, in drop_constraint operations.impl.drop_constraint( File "/Users/vijayantsoni/.virtualenvs/airflow/lib/python3.8/site-packages/alembic/ddl/sqlite.py", line 52, in drop_constraint raise NotImplementedError( NotImplementedError: No support for ALTER of constraints in SQLite dialectPlease refer to the batch mode feature which allows for SQLite migrations using a copy-and-move strategy. ```
https://github.com/apache/airflow/issues/13877
https://github.com/apache/airflow/pull/13921
df11a1d7dcc4e454b99a71805c133c3d15c197dc
7f45e62fdf1dd5df50f315a4ab605b619d4b848c
"2021-01-24T17:30:47Z"
python
"2021-01-29T19:37:10Z"
closed
apache/airflow
https://github.com/apache/airflow
13,843
["airflow/api_connexion/endpoints/log_endpoint.py", "tests/api_connexion/endpoints/test_log_endpoint.py"]
Task not found exception in get logs api
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)**: NA **Environment**: Docker - **OS**: CentOS Linux 7 (Core) - **Python version**: 3.6.8 **What happened**: Every time I call get_log api (https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html#operation/get_log) to get logs for a specific task instance that is not in the dag now, I get the TaskNotFound exception. ```Traceback (most recent call last): File "/usr/local/lib64/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app response = self.full_dispatch_request() File "/usr/local/lib64/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request rv = self.handle_user_exception(e) File "/usr/local/lib64/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception reraise(exc_type, exc_value, tb) File "/usr/local/lib64/python3.6/site-packages/flask/_compat.py", line 39, in reraise raise value File "/usr/local/lib64/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request rv = self.dispatch_request() File "/usr/local/lib64/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request return self.view_functions[rule.endpoint](**req.view_args) File "/usr/local/lib/python3.6/site-packages/connexion/decorators/decorator.py", line 48, in wrapper response = function(request) File "/usr/local/lib/python3.6/site-packages/connexion/decorators/uri_parsing.py", line 144, in wrapper response = function(request) File "/usr/local/lib/python3.6/site-packages/connexion/decorators/validation.py", line 384, in wrapper return function(request) File "/usr/local/lib/python3.6/site-packages/connexion/decorators/response.py", line 103, in wrapper response = function(request) File "/usr/local/lib/python3.6/site-packages/connexion/decorators/parameter.py", line 121, in wrapper return function(**kwargs) File "/usr/local/lib/python3.6/site-packages/airflow/api_connexion/security.py", line 47, in decorated return func(*args, **kwargs) File "/usr/local/lib/python3.6/site-packages/airflow/utils/session.py", line 65, in wrapper return func(*args, session=session, **kwargs) File "/usr/local/lib/python3.6/site-packages/airflow/api_connexion/endpoints/log_endpoint.py", line 74, in get_log ti.task = dag.get_task(ti.task_id) File "/usr/local/lib/python3.6/site-packages/airflow/models/dag.py", line 1527, in get_task raise TaskNotFound(f"Task {task_id} not found") airflow.exceptions.TaskNotFound: Task 0-1769e47c-5933-42f9-ac59-b59c7de13382 not found ``` **What you expected to happen**: Even if the task is not in the dag now I expect to get its log in a past run. **How to reproduce it**: Create a dag with a few tasks and run it. Then remove a task from the dag and try to get the log of the removed task in the past run using the api. **Anything else we need to know**: The problem is that in https://github.com/apache/airflow/blob/master/airflow/api_connexion/endpoints/log_endpoint.py at line 73 there is a call to get the task from current dag without catching the TaskNotFound exception.
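A minimal sketch of the guard the reporter proposes around `dag.get_task()` (illustrative; not the endpoint's actual code):

```python
from airflow.exceptions import TaskNotFound

def attach_task_if_present(ti, dag):
    """Attach the task object only if it still exists in the DAG; otherwise keep serving the historical log."""
    try:
        ti.task = dag.get_task(ti.task_id)
    except TaskNotFound:
        pass
    return ti
```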
https://github.com/apache/airflow/issues/13843
https://github.com/apache/airflow/pull/13872
f473ca7130f844bc59477674e641b42b80698bb7
dfbccd3b1f62738e0d5be15a9d9485976b4d8756
"2021-01-22T16:47:51Z"
python
"2021-01-24T13:49:27Z"
closed
apache/airflow
https://github.com/apache/airflow
13,805
["airflow/cli/commands/task_command.py"]
Could not get scheduler_job_id
**Apache Airflow version:** 2.0.0 **Kubernetes version (if you are using kubernetes) (use kubectl version):** 1.18.3 **Environment:** Cloud provider or hardware configuration: AWS **What happened:** When trying to run a DAG, it gets scheduled, but task is never run. When attempting to run task manually, it shows an error: ``` Something bad has happened. Please consider letting us know by creating a bug report using GitHub. Python version: 3.8.7 Airflow version: 2.0.0 Node: airflow-web-ffdd89d6-h98vj ------------------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app response = self.full_dispatch_request() File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request rv = self.handle_user_exception(e) File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception reraise(exc_type, exc_value, tb) File "/usr/local/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise raise value File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request rv = self.dispatch_request() File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request return self.view_functions[rule.endpoint](**req.view_args) File "/usr/local/lib/python3.8/site-packages/airflow/www/auth.py", line 34, in decorated return func(*args, **kwargs) File "/usr/local/lib/python3.8/site-packages/airflow/www/decorators.py", line 60, in wrapper return f(*args, **kwargs) File "/usr/local/lib/python3.8/site-packages/airflow/www/views.py", line 1366, in run executor.start() File "/usr/local/lib/python3.8/site-packages/airflow/executors/kubernetes_executor.py", line 493, in start raise AirflowException("Could not get scheduler_job_id") airflow.exceptions.AirflowException: Could not get scheduler_job_id ``` **What you expected to happen:** The task to be run successfully without **How to reproduce it:** Haven't pinpointed what causes the issue, besides an attempted upgrade from Airflow 1.10.14 to Airflow 2.0.0 **Anything else we need to know:** This error is encountered in an upgrade of Airflow from 1.10.14 to Airflow 2.0.0 EDIT: Formatted to fit the issue template
https://github.com/apache/airflow/issues/13805
https://github.com/apache/airflow/pull/16108
436e0d096700c344e7099693d9bf58e12658f9ed
cdc9f1a33854254607fa81265a323cf1eed6d6bb
"2021-01-21T10:09:05Z"
python
"2021-05-27T12:50:03Z"
closed
apache/airflow
https://github.com/apache/airflow
13,799
["airflow/migrations/versions/8646922c8a04_change_default_pool_slots_to_1.py", "airflow/models/taskinstance.py"]
Scheduler crashes when unpausing some dags with: TypeError: '>' not supported between instances of 'NoneType' and 'int'
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.15 **Environment**: - **Cloud provider or hardware configuration**: GKE - **OS** (e.g. from /etc/os-release): Ubuntu 18.04 **What happened**: I just migrated from 1.10.14 to 2.0.0. When I turn on some random dags, the scheduler crashes with the following error: ```python Traceback (most recent call last): File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1275, in _execute self._run_scheduler_loop() File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1377, in _run_scheduler_loop num_queued_tis = self._do_scheduling(session) File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1533, in _do_scheduling num_queued_tis = self._critical_section_execute_task_instances(session=session) File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1132, in _critical_section_execute_task_instances queued_tis = self._executable_task_instances_to_queued(max_tis, session=session) File "/usr/local/lib/python3.6/dist-packages/airflow/utils/session.py", line 62, in wrapper return func(*args, **kwargs) File "/usr/local/lib/python3.6/dist-packages/airflow/jobs/scheduler_job.py", line 1034, in _executable_task_instances_to_queued if task_instance.pool_slots > open_slots: TypeError: '>' not supported between instances of 'NoneType' and 'int' ``` **What you expected to happen**: I expected those dags would have their tasks scheduled without problems. **How to reproduce it**: Can't reproduce it yet. Still trying to figure out if this happens only with specific dags or not. **Anything else we need to know**: I couldn't find in which context `task_instance.pool_slots` could be None
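Assuming the NULLs come from rows migrated from 1.10 where `pool_slots` was never set, a hedged sketch of a backfill migration (not the actual Airflow migration) that keeps the scheduler comparison from ever seeing `None`:

```python
from alembic import op

def upgrade():
    # Give every existing task instance the default of one pool slot.
    op.execute("UPDATE task_instance SET pool_slots = 1 WHERE pool_slots IS NULL")
```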
https://github.com/apache/airflow/issues/13799
https://github.com/apache/airflow/pull/14406
c069e64920da780237a1e1bdd155319b007a2587
f763b7c3aa9cdac82b5d77e21e1840fbe931257a
"2021-01-20T22:08:00Z"
python
"2021-02-25T02:56:40Z"
closed
apache/airflow
https://github.com/apache/airflow
13,797
["airflow/sentry.py", "airflow/utils/session.py", "tests/utils/test_session.py"]
Sentry celery dag task run error
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A **Environment**: - **Cloud provider or hardware configuration**: AWS - **OS** (e.g. from /etc/os-release): Centos7 - **Kernel** (e.g. `uname -a`): Linux 3.10.0-693.5.2.el7.x86_64 - **Install tools**: celery==4.4.0, sentry-sdk==0.19.5 - **Others**: python 3.6.8 **What happened**: We see this in the sentry error logs randomly for all dag tasks: `TypeError in airflow.executors.celery_executor.execute_command` ``` TypeError: _run_mini_scheduler_on_child_tasks() got multiple values for argument 'session' File "airflow/sentry.py", line 159, in wrapper return func(task_instance, *args, session=session, **kwargs) ``` **What you expected to happen**: No error in sentry. **How to reproduce it**: Schedule or manually run a dag task such as PythonOperator. The error msg will appear when airflow runs dag task. The error will not appear in the airflow web server logs but only on Sentry server. **Anything else we need to know**: N/A
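The error suggests the wrapper injects `session` even when the wrapped call already received it positionally. A hedged, self-contained sketch (not Airflow's `sentry.py`) of a decorator that only injects a session when none was passed, by locating the parameter's positional index first:

```python
import functools
import inspect

def find_session_index(func):
    """Positional index of the 'session' parameter in func's signature, or None."""
    params = list(inspect.signature(func).parameters)
    return params.index("session") if "session" in params else None

def provide_session_once(session_factory):
    """Decorator factory: inject a new session only if the caller did not pass one."""
    def decorator(func):
        idx = find_session_index(func)

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # A session may arrive by keyword, or positionally at index `idx`.
            if "session" in kwargs or (idx is not None and len(args) > idx):
                return func(*args, **kwargs)
            session = session_factory()
            try:
                return func(*args, session=session, **kwargs)
            finally:
                session.close()

        return wrapper
    return decorator
```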
https://github.com/apache/airflow/issues/13797
https://github.com/apache/airflow/pull/13929
24aa3bf02a2f987a68d1ff5579cbb34e945fa92c
0e8698d3edb3712eba0514a39d1d30fbfeeaec09
"2021-01-20T19:39:49Z"
python
"2021-03-19T21:40:22Z"
closed
apache/airflow
https://github.com/apache/airflow
13,774
["airflow/providers/amazon/aws/operators/s3_copy_object.py"]
add acl_policy to S3CopyObjectOperator
<!-- Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions. Don't worry if they're not all applicable; just try to include what you can :-) If you need to include code snippets or logs, please put them in fenced code blocks. If they're super-long, please use the details tag like <details><summary>super-long log</summary> lots of stuff </details> Please delete these comment blocks before submitting the issue. --> **Description** <!-- A short description of your feature --> **Use case / motivation** <!-- What do you want to happen? Rather than telling us how you might implement this solution, try to take a step back and describe what you are trying to achieve. --> **Are you willing to submit a PR?** <!--- We accept contributions! --> **Related Issues** <!-- Is there currently another issue associated with this? -->
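The title asks for an `acl_policy` argument on S3CopyObjectOperator. As a point of reference, S3's CopyObject API already accepts an `ACL` parameter, so here is a hedged sketch of the requested behaviour using boto3 directly (function name and defaults are illustrative, not the operator's code):

```python
import boto3

def copy_with_acl(source_bucket, source_key, dest_bucket, dest_key,
                  acl_policy="bucket-owner-full-control"):
    """Copy an S3 object and apply a canned ACL to the destination object."""
    s3 = boto3.client("s3")
    s3.copy_object(
        Bucket=dest_bucket,
        Key=dest_key,
        CopySource={"Bucket": source_bucket, "Key": source_key},
        ACL=acl_policy,  # e.g. 'private', 'public-read', 'bucket-owner-full-control'
    )
```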
https://github.com/apache/airflow/issues/13774
https://github.com/apache/airflow/pull/13773
9923d606d2887c52390a30639fc1ee0d4000149c
29730d720066a4c16d524e905de8cdf07e8cd129
"2021-01-19T21:53:18Z"
python
"2021-01-20T15:16:25Z"
closed
apache/airflow
https://github.com/apache/airflow
13,761
["airflow/example_dags/tutorial.py", "airflow/models/baseoperator.py", "airflow/serialization/schema.json", "airflow/www/utils.py", "airflow/www/views.py", "docs/apache-airflow/concepts.rst", "tests/serialization/test_dag_serialization.py", "tests/www/test_utils.py"]
Markdown from doc_md is not being rendered in ui
**Apache Airflow version**: 1.10.14 **Environment**: - **Cloud provider or hardware configuration**: docker - **OS** (e.g. from /etc/os-release): apache/airflow:1.10.14-python3.8 - **Kernel** (e.g. `uname -a`): Linux host 5.4.0-62-generic #70-Ubuntu SMP Tue Jan 12 12:45:47 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux - **Install tools**: Docker version 19.03.8, build afacb8b7f0 - **Others**: **What happened**: I created a DAG and set the doc_md property on the object but it isn't being rendered in the UI. **What you expected to happen**: I expected the markdown to be rendered in the UI **How to reproduce it**: Created a new container using the `airflow:1.10.14`, I have tried the following images with the same results. - airflow:1.10.14:image-python3.8 - airflow:1.10.14:image-python3.7 - airflow:1.10.12:image-python3.7 - airflow:1.10.12:image-python3.7 ``` dag_docs = """ ## Pipeline #### Purpose This is a pipeline """ dag = DAG( 'etl-get_from_api', default_args=default_args, description='A simple dag', schedule_interval=timedelta(days=1), ) dag.doc_md = dag_docs ``` ![image](https://user-images.githubusercontent.com/29732449/105004686-6b77d680-5a88-11eb-9e34-c8dd38b3fd10.png) ![image](https://user-images.githubusercontent.com/29732449/105004748-7af71f80-5a88-11eb-811c-11bc6a351c71.png) I have also tried with using a doc-string to populate the doc_md as well as adding some text within the constructor. ``` dag = DAG( 'etl-get_from_api', default_args=default_args, description='A simple dag', schedule_interval=timedelta(days=1), doc_md = "some text" ) ``` All of the different permutations I've tried seem to have the same result. The only thing I can change is the description, that appears to show up correctly. **Anything else we need to know**: I have tried multiple browsers (Firefox and Chrome) and I have also done an inspect on from both the graph view and the tree view from within the dag but I can't find any of the text within the page at all.
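For reference, a consolidated minimal reproduction of the documented pattern (module docstring assigned to `doc_md`); the file layout and dates are assumptions:

```python
"""
## Pipeline
#### Purpose
This is a pipeline
"""
from datetime import datetime, timedelta

from airflow import DAG

dag = DAG(
    "etl-get_from_api",
    description="A simple dag",
    start_date=datetime(2021, 1, 1),
    schedule_interval=timedelta(days=1),
)
# Either assign the module docstring ...
dag.doc_md = __doc__
# ... or pass doc_md="some text" directly to the DAG constructor, as tried above.
```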
https://github.com/apache/airflow/issues/13761
https://github.com/apache/airflow/pull/15191
7c17bf0d1e828b454a6b2c7245ded275b313c792
e86f5ca8fa5ff22c1e1f48addc012919034c672f
"2021-01-19T08:10:12Z"
python
"2021-04-05T02:46:41Z"
closed
apache/airflow
https://github.com/apache/airflow
13,755
["airflow/config_templates/airflow_local_settings.py", "airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/providers/elasticsearch/log/es_task_handler.py", "tests/providers/elasticsearch/log/test_es_task_handler.py"]
Elasticsearch log retrieval fails when "host" field is not a string
**Apache Airflow version**: 2.0.0 **Kubernetes version:** 1.17.16 **OS** (e.g. from /etc/os-release): Ubuntu 18.4 **What happened**: Webserver gets exception when reading logs from Elasticsearch when "host" field in the log is not a string. Recent Filebeat template mapping creates host as an object with "host.name", "host.os" etc. ``` [2021-01-18 23:53:27,923] {app.py:1891} ERROR - Exception on /get_logs_with_metadata [GET] Traceback (most recent call last): File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app response = self.full_dispatch_request() File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1951, in full_dispatch_request rv = self.handle_user_exception(e) File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1820, in handle_user_exception reraise(exc_type, exc_value, tb) File "/usr/local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise raise value File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1949, in full_dispatch_request rv = self.dispatch_request() File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1935, in dispatch_request return self.view_functions[rule.endpoint](**req.view_args) File "/usr/local/lib/python3.7/site-packages/airflow/www/auth.py", line 34, in decorated return func(*args, **kwargs) File "/usr/local/lib/python3.7/site-packages/airflow/www/decorators.py", line 60, in wrapper return f(*args, **kwargs) File "/usr/local/lib/python3.7/site-packages/airflow/utils/session.py", line 65, in wrapper return func(*args, session=session, **kwargs) File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1054, in get_logs_with_metadata logs, metadata = task_log_reader.read_log_chunks(ti, try_number, metadata) File "/usr/local/lib/python3.7/site-packages/airflow/utils/log/log_reader.py", line 58, in read_log_chunks logs, metadatas = self.log_handler.read(ti, try_number, metadata=metadata) File "/usr/local/lib/python3.7/site-packages/airflow/utils/log/file_task_handler.py", line 217, in read log, metadata = self._read(task_instance, try_number_element, metadata) File "/usr/local/lib/python3.7/site-packages/airflow/providers/elasticsearch/log/es_task_handler.py", line 161, in _read logs_by_host = self._group_logs_by_host(logs) File "/usr/local/lib/python3.7/site-packages/airflow/providers/elasticsearch/log/es_task_handler.py", line 130, in _group_logs_by_host grouped_logs[key].append(log) TypeError: unhashable type: 'AttrDict' ``` **What you expected to happen**: Airflow Webserver successfully pulls the logs, replacing host value with default if needed. <!-- What do you think went wrong? --> The issue comes from this line. When "host" is a dictionary, it tries to insert it as a key to the `grouped_logs` dictionary, which throws `unhashable type: 'AttrDict'`. ``` def _group_logs_by_host(logs): grouped_logs = defaultdict(list) for log in logs: key = getattr(log, 'host', 'default_host') grouped_logs[key].append(log) # ---> fails when key is a dict ``` **How to reproduce it**: I don't know how to concisely write this and make it easy to read at the same time. 1- Configure Airflow to read logs from Elasticsearch ``` [elasticsearch] host = http://localhost:9200 write_stdout = True json_format = True ``` 2 - Load index template where host is an object [May need to add other fields to this template as well]. Filebeat adds this by default (and many more fields). 
``` PUT _template/filebeat-airflow { "order": 1, "index_patterns": [ "filebeat-airflow-*" ], "mappings": { "doc": { "properties": { "host": { "properties": { "name": { "type": "keyword", "ignore_above": 1024 }, "id": { "type": "keyword", "ignore_above": 1024 }, "architecture": { "type": "keyword", "ignore_above": 1024 }, "ip": { "type": "ip" }, "mac": { "type": "keyword", "ignore_above": 1024 } } } } } } } ``` 3 - Post sample log and fill in `log_id` field for a valid dag run. ``` curl -X POST -H 'Content-Type: application/json' -i 'http://localhost:9200/filebeat-airflow/_doc' --data '{"message": "test log message", "log_id": "<fill-in-with-valid-example>", "offset": "1"}' ``` 4 - Go to WebUI and try to view logs for dag_run. **Workaround:** Remove host field completely with filebeat. **Solution:** Do a type check if the extracted `host` field is a string, if not use the default value. **Solution2:** Make host field name configurable so that we can set it to be `host.name` instead of hardcoded `'host'`. If I have time I will submit the fix. I never submitted a commit before so I don't know how long it will take me to prepare a proper commit for this.
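A hedged sketch of the type check proposed in the "Solution" above (illustrative; not the provider's actual code), falling back to `host.name` and then the default when the field is not a string:

```python
from collections import defaultdict

def _group_logs_by_host(logs, default_host="default_host"):
    grouped_logs = defaultdict(list)
    for log in logs:
        key = getattr(log, "host", default_host)
        if not isinstance(key, str):
            # e.g. an AttrDict created by Filebeat's host.name / host.os mapping
            key = getattr(key, "name", None) or default_host
        grouped_logs[key].append(log)
    return grouped_logs
```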
https://github.com/apache/airflow/issues/13755
https://github.com/apache/airflow/pull/14625
86b9d3b1e8b2513aa3f614b9a8eba679cdfd25e0
5cd0bf733b839951c075c54e808a595ac923c4e8
"2021-01-19T04:08:57Z"
python
"2021-06-11T18:32:42Z"
closed
apache/airflow
https://github.com/apache/airflow
13,750
["airflow/sensors/sql.py", "tests/sensors/test_sql_sensor.py"]
Support Standard SQL in BigQuery Sensor
<!-- Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions. Don't worry if they're not all applicable; just try to include what you can :-) If you need to include code snippets or logs, please put them in fenced code blocks. If they're super-long, please use the details tag like <details><summary>super-long log</summary> lots of stuff </details> Please delete these comment blocks before submitting the issue. --> **Description** A SQL sensor that uses standard SQL, since the default one uses legacy SQL. **Use case / motivation** Currently (correct me if I am wrong!), the sql sensor only supports legacy sql. If I want to poke a BQ table, I do not think I can do that using standard sql right now. **Are you willing to submit a PR?** If the community approves of this idea, sure!
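As a possible interim workaround, a sketch under stated assumptions (that `PythonSensor`, `BigQueryHook(use_legacy_sql=False)` and its DB-API style `get_first()` are available in your provider version; the project/dataset/table name is hypothetical):

```python
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook
from airflow.sensors.python import PythonSensor

def _table_has_rows():
    hook = BigQueryHook(gcp_conn_id="google_cloud_default", use_legacy_sql=False)
    # Standard SQL query; adjust to whatever condition you want to poke for.
    row = hook.get_first("SELECT COUNT(*) FROM `my_project.my_dataset.my_table`")
    return bool(row and row[0])

wait_for_rows = PythonSensor(
    task_id="wait_for_rows",
    python_callable=_table_has_rows,
    poke_interval=300,
)
```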
https://github.com/apache/airflow/issues/13750
https://github.com/apache/airflow/pull/18431
83b51e53062dc596a630edd4bd01407a556f1aa6
314a4fe0050783ebb43b300c4c950667d1ddaa89
"2021-01-18T19:35:41Z"
python
"2021-11-26T15:04:23Z"
closed
apache/airflow
https://github.com/apache/airflow
13,746
["CONTRIBUTING.rst"]
Broken link on CONTRIBUTING.rst
Version: Airflow 2.0, or most current version. In CONTRIBUTING.rst under the section "How to rebase a PR" ([link to the docs section](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#id14) for reference), the "Resolve conflicts" link has an erroneous period at the end of the URL: [current incorrect link](https://www.jetbrains.com/help/idea/resolving-conflicts.html.) The link should be https://www.jetbrains.com/help/idea/resolving-conflicts.html - without the period. The link works without the period. Steps to reproduce: click on the Resolve conflicts link on the CONTRIBUTING.rst page in the documentation. I would like to submit a PR to fix this, if someone would like to assist me in the review 😄
https://github.com/apache/airflow/issues/13746
https://github.com/apache/airflow/pull/13748
85a3ce1a47e0b84bac518e87481e92d266edea31
b103a1dd0e22b67dcc8cb2a28a5afcdfb7554412
"2021-01-18T16:35:26Z"
python
"2021-01-18T18:29:25Z"
closed
apache/airflow
https://github.com/apache/airflow
13,744
["airflow/api_connexion/endpoints/connection_endpoint.py", "tests/api_connexion/endpoints/test_connection_endpoint.py"]
REST API Connection Endpoint doesn't return the extra field in response
<!-- Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions. Don't worry if they're not all applicable; just try to include what you can :-) If you need to include code snippets or logs, please put them in fenced code blocks. If they're super-long, please use the details tag like <details><summary>super-long log</summary> lots of stuff </details> Please delete these comment blocks before submitting the issue. --> <!-- IMPORTANT!!! PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE NEXT TO "SUBMIT NEW ISSUE" BUTTON!!! PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!! Please complete the next sections or the issue will be closed. These questions are the first thing we need to know to understand the context. --> **Apache Airflow version**: Apache Airflow: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: - **Cloud provider or hardware configuration**: - **OS** (e.g. from /etc/os-release): Distributor ID: Ubuntu Description: Ubuntu 18.04.5 LTS Release: 18.04 Codename: bionic - **Kernel** (e.g. `uname -a`): Linux Personal 5.4.0-62-generic #70~18.04.1-Ubuntu SMP Tue Jan 12 17:18:00 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux - **Install tools**: - **Others**: **What happened**: <!-- (please include exact error messages if you can) --> REST API doesn't return the **extra** field of the connection in the response. **What you expected to happen**: <!-- What do you think went wrong? --> It should return all the fields as shown in the documentation. ![Screenshot from 2021-01-18 20-10-09](https://user-images.githubusercontent.com/15157792/104928902-38224280-59c9-11eb-814a-3f359c0796f2.png) **How to reproduce it**: <!--- As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags. If you are using kubernetes, please attempt to recreate the issue using minikube or kind. ## Install minikube/kind - Minikube https://minikube.sigs.k8s.io/docs/start/ - Kind https://kind.sigs.k8s.io/docs/user/quick-start/ If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action You can include images using the .md style of ![alt text](http://url/to/img.png) To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file. ---> Create one connection with id **leads_ec2** and define values as shown in the screenshot. ![Screenshot from 2021-01-18 19-49-48](https://user-images.githubusercontent.com/15157792/104927763-d44b4a00-59c7-11eb-9139-da83d7098b3c.png) Now call the below API endpoint to get the connection details. And as shown in the screenshot it doesn't include the extra field in the response. **API Endpoint** : `http://localhost:8000/api/v1/connections/leads_ec2` ![Screenshot from 2021-01-18 19-50-07](https://user-images.githubusercontent.com/15157792/104928126-491e8400-59c8-11eb-80d5-84b52e812d8e.png) **How often does this problem occur? Once? Every time etc?**: <!-- How often does this problem occur? Once? Every time etc? Any relevant logs to include? Put them here in side a detail tag: <details><summary>x.log</summary> lots of stuff </details> --> Same for other connection_id. It doesn't return the extra field in the response.
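A small hedged check script for the behaviour described above (URL and credentials are assumptions; the field list follows the connection schema shown in the documentation screenshot):

```python
import requests

resp = requests.get(
    "http://localhost:8000/api/v1/connections/leads_ec2",
    auth=("admin", "admin"),  # hypothetical basic-auth credentials
)
resp.raise_for_status()
payload = resp.json()

expected_fields = {"connection_id", "conn_type", "host", "login", "schema", "port", "extra"}
print("missing fields:", expected_fields - payload.keys() or "none")
```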
https://github.com/apache/airflow/issues/13744
https://github.com/apache/airflow/pull/13885
31b956c6c22476d109c45c99d8a325c5c1e0fd45
adf7755eaa67bd924f6a4da0498bce804da1dd4b
"2021-01-18T14:42:08Z"
python
"2021-01-25T09:52:16Z"
closed
apache/airflow
https://github.com/apache/airflow
13,741
["airflow/stats.py", "tests/core/test_stats.py"]
Airflow 2.0 does not send metrics to statsD when Scheduler is run with Daemon mode
**Apache Airflow version**: 2.0.0 **Environment**: - **OS** (e.g. from /etc/os-release): Ubuntu 20.04 LTS - **Python version**: 3.8 - **Kernel** (e.g. `uname -a`): x86_64 x86_64 x86_64 GNU/Linux 5.4.0-58-generic #64-Ubuntu - **Install tools**: pip **What happened**: Airflow 2.0 does not send metrics to statsD. I configured Airflow following the official documentation (https://airflow.apache.org/docs/apache-airflow/stable/logging-monitoring/metrics.html) and this article: https://dstan.medium.com/run-airflow-statsd-grafana-locally-16b372c86524 (but I used port 8125). I turned on statsD: ```ini statsd_on = True statsd_host = localhost statsd_port = 8125 statsd_prefix = airflow ``` But I do not see airflow metrics at http://localhost:9102/metrics (the statsD metrics endpoint). --- P.S. I only see this error with Airflow 2.0; in version 1.10.13 everything is fine in the same environment. Thanks in advance.
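A quick diagnostic (plain Python, not Airflow code) to confirm whether the scheduler process emits anything at all on the StatsD port; run it instead of, or on a different port from, the statsd exporter:

```python
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 8125))  # same port as statsd_port above
print("listening for StatsD packets on udp/8125 ...")
while True:
    data, addr = sock.recvfrom(65535)
    print(addr, data.decode(errors="replace"))
```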
https://github.com/apache/airflow/issues/13741
https://github.com/apache/airflow/pull/14454
cfa1071eaf0672dbf2b2825c3fd6affaca68bdee
0aa597e2ffd71d3587f629c0a1cb3d904e07b6e6
"2021-01-18T12:26:52Z"
python
"2021-02-26T14:45:56Z"
closed
apache/airflow
https://github.com/apache/airflow
13,713
["airflow/www/static/css/main.css"]
Airflow web server UI bouncing horizontally at some viewport widths
**Apache Airflow version**: 2.0.0 **Environment**: Ubuntu 20.04 LTS, Python 3.8.6 via pyenv - **OS** (e.g. from /etc/os-release): 20.04.1 LTS (Focal Fossa) - **Kernel** (e.g. `uname -a`): Linux DESKTOP-QBFDUA0 4.19.104-microsoft-standard #1 SMP Wed Feb 19 06:37:35 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux - **Install tools**: Following steps in https://airflow.apache.org/docs/apache-airflow/stable/start.html **What happened**: I followed the quickstart here (https://airflow.apache.org/docs/apache-airflow/stable/start.html) to start Airflow on my machine. Then, I followed the tutorial here (https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html) to create my own DAG after disabling the example DAGs via the config file. The bouncing problem I'm reporting I actually noticed as soon as I launched Airflow. I'm just explaining what steps I took to get to what you see in the GIF below. When I opened the Airflow UI in my browser, it appeared to "bounce" left and right. This happened on multiple pages. It seemed to happen only at certain widths bigger than the mobile width. At a large width, it didn't happen. I captured a GIF to try to demonstrate it: ![airflow_bouncing_problem](https://user-images.githubusercontent.com/7719209/104796253-a45f3500-5782-11eb-8b8a-ffc5b24c90cc.gif) I didn't see any JS errors in the console in dev tools as this was happening. **What you expected to happen**: A bounce-free **Airflow experience**™️ **What do you think went wrong?**: CSS? I'm not qualified for this magical front end stuff tbh. **How to reproduce it**: Run the steps I described above on Ubuntu 20.04 LTS or a similar Linux operating system, using Python 3. **Anything else we need to know**: n/a **How often does this problem occur? Once? Every time etc?** Every time I launch the Airflow web server and scheduler and load it at `localhost:8080`.
https://github.com/apache/airflow/issues/13713
https://github.com/apache/airflow/pull/13857
b9eb51a0fb32cd660a5459d73d7323865b34dd99
f72be51aeca5edb5696a9feb2acb4ff8f6bcc658
"2021-01-16T03:43:42Z"
python
"2021-01-25T22:03:26Z"
closed
apache/airflow
https://github.com/apache/airflow
13,704
["airflow/operators/branch.py", "tests/operators/test_branch_operator.py"]
BaseBranchOperator should push to xcom by default
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Not relevant **Environment**: Not relevant **What happened**: BranchPythonOperator performs an xcom push by default since this is the behavior of PythonOperator. However BaseBranchOperator doesn't do an xcom push. Note: it's impossible to push to xcom manually because BaseBranchOperator has no return in its execute method, so even using `do_xcom_push=True` won't help: https://github.com/apache/airflow/blob/master/airflow/operators/branch.py#L52 **What you expected to happen**: BaseBranchOperator should push the branch it chooses to follow to xcom by default, or at least support the `do_xcom_push=True` parameter.
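A hedged sketch of the requested behaviour (not necessarily the merged fix), mirroring the linked `execute` method but returning the chosen branch so the operator's normal return-value XCom push applies:

```python
from airflow.models import BaseOperator
from airflow.models.skipmixin import SkipMixin

class XComPushingBranchOperator(BaseOperator, SkipMixin):
    """Illustrative base class: subclasses implement choose_branch()."""

    def choose_branch(self, context):
        raise NotImplementedError

    def execute(self, context):
        branches_to_execute = self.choose_branch(context)
        self.skip_all_except(context["ti"], branches_to_execute)
        # Returning the value lets the default do_xcom_push store the chosen branch.
        return branches_to_execute
```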
https://github.com/apache/airflow/issues/13704
https://github.com/apache/airflow/pull/13763
3fd5ef355556cf0ad7896bb570bbe4b2eabbf46e
3e257950990a6edd817c372036352f96d4f8a76b
"2021-01-15T17:51:00Z"
python
"2021-01-21T01:16:32Z"
closed
apache/airflow
https://github.com/apache/airflow
13,700
["airflow/models/dag.py", "tests/models/test_dag.py"]
Partial subset DAGs do not update task group's used_group_ids
**Apache Airflow version**: 2.0.0 **Environment**: - **Cloud provider or hardware configuration**: Docker container - **OS** (e.g. from /etc/os-release): Debian Stretch **What happened**: When working on some custom DAG override logic, I noticed that invoking `DAG.partial_subset` does not properly update the corresponding `_task_group.used_group_ids` on the returned subset DAG, such that adding back a task which was excluded during the `partial_subset` operation fails. **What you expected to happen**: Tasks that had already been added to the original DAG can be added again to the subset DAG returned by `DAG.partial_subset` **How to reproduce it**: Create any DAG with a single task called, e.g. `my-task`, then invoke `dag.partial_subset(['not-my-task'], False, False)` Note that the returned subset DAG's `_task_group.used_group_ids` still contains `my-task` even though it was not included in the subset DAG itself **Anything else we need to know**: I was able to work around this by adding logic to update the new partial subset DAG's `_task_group.used_group_ids` manually, but this should really be done as part of the `DAG.partial_subset` logic
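A hedged sketch of the workaround described above; it touches the private `_task_group.used_group_ids` attribute and mirrors the report, not the eventual fix, and the demo DAG is an assumption:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator

with DAG("subset_demo", start_date=datetime(2021, 1, 1)) as dag:
    DummyOperator(task_id="my-task")

subset = dag.partial_subset(["not-my-task"], False, False)

# Workaround: drop ids that are not actually present in the subset,
# so previously excluded tasks can be added back to it.
subset._task_group.used_group_ids = {
    group_id
    for group_id in subset._task_group.used_group_ids
    if group_id in subset.task_dict
}
```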
https://github.com/apache/airflow/issues/13700
https://github.com/apache/airflow/pull/15308
42a1ca8aab905a0eb1ffb3da30cef9c76830abff
1e425fe6459a39d93a9ada64278c35f7cf0eab06
"2021-01-15T14:47:54Z"
python
"2021-04-20T18:08:52Z"
closed
apache/airflow
https://github.com/apache/airflow
13,697
["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg"]
Email config section is incorrect
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): n/a **Environment**: This pertains to the docs - **Cloud provider or hardware configuration**: - **OS** (e.g. from /etc/os-release): - **Kernel** (e.g. `uname -a`): - **Install tools**: - **Others**: **What happened**: I see [here](https://airflow.apache.org/docs/apache-airflow/stable/howto/email-config.html#email-configuration) it says to set `subject_template` and `html_content_template` under the email header, but in the [configuration references](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#email) it doesn't show those two fields. Have they been removed for some reason?
https://github.com/apache/airflow/issues/13697
https://github.com/apache/airflow/pull/13709
74b2cd7364df192a8b53d4734e33b07e69864acc
1ab19b40fdea3d6399fcab4cd8855813e0d232cf
"2021-01-15T14:02:02Z"
python
"2021-01-16T01:11:35Z"
closed
apache/airflow
https://github.com/apache/airflow
13,685
["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"]
scheduler dies with "sqlalchemy.exc.IntegrityError: (MySQLdb._exceptions.IntegrityError) (1062, "Duplicate entry 'huge_demo13499411352-2021-01-15 01:04:00.000000' for key 'dag_run.dag_id'")"
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: - **Cloud provider or hardware configuration**: tencent cloud - **OS** (e.g. from /etc/os-release): centos7 - **Kernel** (e.g. `uname -a`): 3.10 - **Install tools**: - **Others**: Server version: 8.0.22 MySQL Community Server - GPL **What happened**: Scheduler died when I try to modify a dag's schedule_interval from "None" to "* */1 * * *"(I edited the dag file in the dag folder and saved it). A few minutes later I tried to start the scheduler again and it began to run. And the logs are as follows: ``` {2021-01-15 09:06:22,636} {{scheduler_job.py:1293}} ERROR - Exception when executing SchedulerJob._run_scheduler_loop Traceback (most recent call last): File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context self.dialect.do_execute( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 609, in do_execute cursor.execute(statement, parameters) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/cursors.py", line 209, in execute res = self._query(query) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/cursors.py", line 315, in _query db.query(q) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/connections.py", line 239, in query _mysql.connection.query(self, query) MySQLdb._exceptions.IntegrityError: (1062, "Duplicate entry 'huge_demo13499411352-2021-01-15 01:04:00.000000' for key 'dag_run.dag_id'") The above exception was the direct cause of the following exception: Traceback (most recent call last): File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1275, in _execute self._run_scheduler_loop() File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1377, in _run_scheduler_loop num_queued_tis = self._do_scheduling(session) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1474, in _do_scheduling self._create_dag_runs(query.all(), session) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1561, in _create_dag_runs dag.create_dagrun( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/utils/session.py", line 62, in wrapper return func(*args, **kwargs) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/models/dag.py", line 1807, in create_dagrun session.flush() File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/session.py", line 2540, in flush self._flush(objects) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/session.py", line 2682, in _flush transaction.rollback(_capture_exception=True) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 68, in __exit__ compat.raise_( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_ raise exception File 
"/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/session.py", line 2642, in _flush flush_context.execute() File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute rec.execute(self) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/unitofwork.py", line 586, in execute persistence.save_obj( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 239, in save_obj _emit_insert_statements( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 1135, in _emit_insert_statements result = cached_connections[connection].execute( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1011, in execute return meth(self, multiparams, params) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection return connection._execute_clauseelement(self, multiparams, params) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement ret = self._execute_context( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context self._handle_dbapi_exception( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception util.raise_( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_ raise exception File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context self.dialect.do_execute( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 609, in do_execute cursor.execute(statement, parameters) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/cursors.py", line 209, in execute res = self._query(query) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/cursors.py", line 315, in _query db.query(q) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/connections.py", line 239, in query _mysql.connection.query(self, query) sqlalchemy.exc.IntegrityError: (MySQLdb._exceptions.IntegrityError) (1062, "Duplicate entry 'huge_demo13499411352-2021-01-15 01:04:00.000000' for key 'dag_run.dag_id'") [SQL: INSERT INTO dag_run (dag_id, execution_date, start_date, end_date, state, run_id, creating_job_id, external_trigger, run_type, conf, last_scheduling_decision, dag_hash) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)] [parameters: ('huge_demo13499411352', datetime.datetime(2021, 1, 15, 1, 4), datetime.datetime(2021, 1, 15, 1, 6, 22, 629433), None, 'running', 'scheduled__2021-01-15T01:04:00+00:00', 71466, 0, <DagRunType.SCHEDULED: 'scheduled'>, b'\x80\x05}\x94.', None, '60078c379cdeecb9bc8844eed5aa9745')] (Background on this error at: http://sqlalche.me/e/13/gkpj) {2021-01-15 09:06:23,648} {{process_utils.py:95}} INFO - Sending Signals.SIGTERM to GPID 66351 
{2021-01-15 09:06:23,781} {{process_utils.py:61}} INFO - Process psutil.Process(pid=66351, status='terminated') (66351) terminated with exit code 0 {2021-01-15 09:06:23,781} {{scheduler_job.py:1296}} INFO - Exited execute loop ``` **What you expected to happen**: Scheduler should not die. **How to reproduce it**: I don't know how to reproduce it. **Anything else we need to know**: No
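A hedged sketch (not the scheduler's actual code) of the defensive handling one would expect around dag run creation: treat the duplicate-key insert as "this run already exists" instead of letting the scheduling loop crash:

```python
from sqlalchemy.exc import IntegrityError

def create_dagrun_safely(dag, session, **dagrun_kwargs):
    """Create a dag run, tolerating a concurrent/duplicate insert for the same execution_date."""
    try:
        run = dag.create_dagrun(session=session, **dagrun_kwargs)
        session.flush()
        return run
    except IntegrityError:
        session.rollback()
        return None  # the (dag_id, execution_date) row already exists
```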
https://github.com/apache/airflow/issues/13685
https://github.com/apache/airflow/pull/13920
05fbeb16bc40cd3a710804408d3ae84156b5aae6
594069ee061e9839b2b12aa43aa3a23e05beed86
"2021-01-15T01:20:15Z"
python
"2021-02-01T16:06:31Z"
closed
apache/airflow
https://github.com/apache/airflow
13,680
["chart/files/pod-template-file.kubernetes-helm-yaml"]
"dag_id could not be found" when running airflow on KubernetesExecutor
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): v1.19.4 **What happened**: I get this error when try to execute tasks using kubernetes ``` [2021-01-14 19:39:17,628] {dagbag.py:440} INFO - Filling up the DagBag from /opt/airflow/dags/repo/bash.py Traceback (most recent call last): File "/home/airflow/.local/bin/airflow", line 8, in <module> sys.exit(main()) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/__main__.py", line 40, in main args.func(args) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/cli_parser.py", line 48, in command return func(*args, **kwargs) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 89, in wrapper return f(*args, **kwargs) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/commands/task_command.py", line 216, in task_run dag = get_dag(args.subdir, args.dag_id) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 189, in get_dag 'parse.'.format(dag_id) airflow.exceptions.AirflowException: dag_id could not be found: bash. Either the dag did not exist or it failed to parse. ``` **What you expected to happen**: get executed and terminate **How to reproduce it**: deploy airflow helm chart using this values.yaml: ``` # Licensed to the Apache Software Foundation (ASF) under one # or more contributor license agreements. See the NOTICE file # distributed with this work for additional information # regarding copyright ownership. The ASF licenses this file # to you under the Apache License, Version 2.0 (the # "License"); you may not use this file except in compliance # with the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, # software distributed under the License is distributed on an # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. --- # Default values for airflow. # This is a YAML-formatted file. # Declare variables to be passed into your templates. # User and group of airflow user uid: 50000 gid: 50000 # Airflow home directory # Used for mount paths airflowHome: "/opt/airflow" # Default airflow repository -- overrides all the specific images below defaultAirflowRepository: apache/airflow # Default airflow tag to deploy defaultAirflowTag: 2.0.0 # Select certain nodes for airflow pods. nodeSelector: { } affinity: { } tolerations: [ ] # Add common labels to all objects and pods defined in this chart. 
labels: { } # Ingress configuration ingress: # Enable ingress resource enabled: false # Configs for the Ingress of the web Service web: # Annotations for the web Ingress annotations: { } # The path for the web Ingress path: "" # The hostname for the web Ingress host: "" # configs for web Ingress TLS tls: # Enable TLS termination for the web Ingress enabled: false # the name of a pre-created Secret containing a TLS private key and certificate secretName: "" # HTTP paths to add to the web Ingress before the default path precedingPaths: [ ] # Http paths to add to the web Ingress after the default path succeedingPaths: [ ] # Configs for the Ingress of the flower Service flower: # Annotations for the flower Ingress annotations: { } # The path for the flower Ingress path: "" # The hostname for the flower Ingress host: "" # configs for web Ingress TLS tls: # Enable TLS termination for the flower Ingress enabled: false # the name of a pre-created Secret containing a TLS private key and certificate secretName: "" # HTTP paths to add to the flower Ingress before the default path precedingPaths: [ ] # Http paths to add to the flower Ingress after the default path succeedingPaths: [ ] # Network policy configuration networkPolicies: # Enabled network policies enabled: false # Extra annotations to apply to all # Airflow pods airflowPodAnnotations: { } # Enable RBAC (default on most clusters these days) rbacEnabled: true # Airflow executor # Options: SequentialExecutor, LocalExecutor, CeleryExecutor, KubernetesExecutor executor: "KubernetesExecutor" # If this is true and using LocalExecutor/SequentialExecutor/KubernetesExecutor, the scheduler's # service account will have access to communicate with the api-server and launch pods. # If this is true and using the CeleryExecutor, the workers will be able to launch pods. allowPodLaunching: true # Images images: airflow: repository: ~ tag: ~ pullPolicy: IfNotPresent pod_template: repository: ~ tag: ~ pullPolicy: IfNotPresent flower: repository: ~ tag: ~ pullPolicy: IfNotPresent statsd: repository: apache/airflow tag: airflow-statsd-exporter-2020.09.05-v0.17.0 pullPolicy: IfNotPresent redis: repository: redis tag: 6-buster pullPolicy: IfNotPresent pgbouncer: repository: apache/airflow tag: airflow-pgbouncer-2020.09.05-1.14.0 pullPolicy: IfNotPresent pgbouncerExporter: repository: apache/airflow tag: airflow-pgbouncer-exporter-2020.09.25-0.5.0 pullPolicy: IfNotPresent gitSync: repository: k8s.gcr.io/git-sync tag: v3.1.6 pullPolicy: IfNotPresent # Environment variables for all airflow containers env: - name: "AIRFLOW__KUBERNETES__GIT_SYNC_RUN_AS_USER" value: "65533" # Secrets for all airflow containers secret: [ ] # - envName: "" # secretName: "" # secretKey: "" # Extra secrets that will be managed by the chart # (You can use them with extraEnv or extraEnvFrom or some of the extraVolumes values). # The format is "key/value" where # * key (can be templated) is the the name the secret that will be created # * value: an object with the standard 'data' or 'stringData' key (or both). # The value associated with those keys must be a string (can be templated) extraSecrets: { } # eg: # extraSecrets: # {{ .Release.Name }}-airflow-connections: # data: | # AIRFLOW_CONN_GCP: 'base64_encoded_gcp_conn_string' # AIRFLOW_CONN_AWS: 'base64_encoded_aws_conn_string' # stringData: | # AIRFLOW_CONN_OTHER: 'other_conn' # {{ .Release.Name }}-other-secret-name-suffix: | # data: | # ... 
# Extra ConfigMaps that will be managed by the chart # (You can use them with extraEnv or extraEnvFrom or some of the extraVolumes values). # The format is "key/value" where # * key (can be templated) is the the name the configmap that will be created # * value: an object with the standard 'data' key. # The value associated with this keys must be a string (can be templated) extraConfigMaps: { } # eg: # extraConfigMaps: # {{ .Release.Name }}-airflow-variables: # data: | # AIRFLOW_VAR_HELLO_MESSAGE: "Hi!" # AIRFLOW_VAR_KUBERNETES_NAMESPACE: "{{ .Release.Namespace }}" # Extra env 'items' that will be added to the definition of airflow containers # a string is expected (can be templated). extraEnv: ~ # eg: # extraEnv: | # - name: PLATFORM # value: FR # Extra envFrom 'items' that will be added to the definition of airflow containers # A string is expected (can be templated). extraEnvFrom: ~ # eg: # extraEnvFrom: | # - secretRef: # name: '{{ .Release.Name }}-airflow-connections' # - configMapRef: # name: '{{ .Release.Name }}-airflow-variables' # Airflow database config data: # If secret names are provided, use those secrets metadataSecretName: ~ resultBackendSecretName: ~ # Otherwise pass connection values in metadataConnection: user: postgres pass: postgres host: ~ port: 5432 db: postgres sslmode: disable resultBackendConnection: user: postgres pass: postgres host: ~ port: 5432 db: postgres sslmode: disable # Fernet key settings fernetKey: ~ fernetKeySecretName: ~ # In order to use kerberos you need to create secret containing the keytab file # The secret name should follow naming convention of the application where resources are # name {{ .Release-name }}-<POSTFIX>. In case of the keytab file, the postfix is "kerberos-keytab" # So if your release is named "my-release" the name of the secret should be "my-release-kerberos-keytab" # # The Keytab content should be available in the "kerberos.keytab" key of the secret. # # apiVersion: v1 # kind: Secret # data: # kerberos.keytab: <base64_encoded keytab file content> # type: Opaque # # # If you have such keytab file you can do it with similar # # kubectl create secret generic {{ .Release.name }}-kerberos-keytab --from-file=kerberos.keytab # kerberos: enabled: false ccacheMountPath: '/var/kerberos-ccache' ccacheFileName: 'cache' configPath: '/etc/krb5.conf' keytabPath: '/etc/airflow.keytab' principal: 'airflow@FOO.COM' reinitFrequency: 3600 config: | # This is an example config showing how you can use templating and how "example" config # might look like. It works with the test kerberos server that we are using during integration # testing at Apache Airflow (see `scripts/ci/docker-compose/integration-kerberos.yml` but in # order to make it production-ready you must replace it with your own configuration that # Matches your kerberos deployment. Administrators of your Kerberos instance should # provide the right configuration. [logging] default = "FILE:{{ template "airflow_logs_no_quote" . }}/kerberos_libs.log" kdc = "FILE:{{ template "airflow_logs_no_quote" . }}/kerberos_kdc.log" admin_server = "FILE:{{ template "airflow_logs_no_quote" . }}/kadmind.log" [libdefaults] default_realm = FOO.COM ticket_lifetime = 10h renew_lifetime = 7d forwardable = true [realms] FOO.COM = { kdc = kdc-server.foo.com admin_server = admin_server.foo.com } # Airflow Worker Config workers: # Number of airflow celery workers in StatefulSet replicas: 1 # Allow KEDA autoscaling. # Persistence.enabled must be set to false to use KEDA. 
keda: enabled: false namespaceLabels: { } # How often KEDA polls the airflow DB to report new scale requests to the HPA pollingInterval: 5 # How many seconds KEDA will wait before scaling to zero. # Note that HPA has a separate cooldown period for scale-downs cooldownPeriod: 30 # Maximum number of workers created by keda maxReplicaCount: 10 persistence: # Enable persistent volumes enabled: true # Volume size for worker StatefulSet size: 100Gi # If using a custom storageClass, pass name ref to all statefulSets here storageClassName: # Execute init container to chown log directory. # This is currently only needed in KinD, due to usage # of local-path provisioner. fixPermissions: false kerberosSidecar: # Enable kerberos sidecar enabled: false resources: { } # limits: # cpu: 100m # memory: 128Mi # requests: # cpu: 100m # memory: 128Mi # Grace period for tasks to finish after SIGTERM is sent from kubernetes terminationGracePeriodSeconds: 600 # This setting tells kubernetes that its ok to evict # when it wants to scale a node down. safeToEvict: true # Annotations to add to worker kubernetes service account. serviceAccountAnnotations: { } # Mount additional volumes into worker. extraVolumes: [ ] extraVolumeMounts: [ ] # Airflow scheduler settings scheduler: # Airflow 2.0 allows users to run multiple schedulers, # However this feature is only recommended for MySQL 8+ and Postgres replicas: 1 # Scheduler pod disruption budget podDisruptionBudget: enabled: false # PDB configuration config: maxUnavailable: 1 resources: { } # limits: # cpu: 100m # memory: 128Mi # requests: # cpu: 100m # memory: 128Mi # This setting can overwrite # podMutation settings. airflowLocalSettings: ~ # This setting tells kubernetes that its ok to evict # when it wants to scale a node down. safeToEvict: true # Annotations to add to scheduler kubernetes service account. serviceAccountAnnotations: { } # Mount additional volumes into scheduler. extraVolumes: [ ] extraVolumeMounts: [ ] # Airflow webserver settings webserver: allowPodLogReading: true livenessProbe: initialDelaySeconds: 15 timeoutSeconds: 30 failureThreshold: 20 periodSeconds: 5 readinessProbe: initialDelaySeconds: 15 timeoutSeconds: 30 failureThreshold: 20 periodSeconds: 5 # Number of webservers replicas: 1 # Additional network policies as needed extraNetworkPolicies: [ ] resources: { } # limits: # cpu: 100m # memory: 128Mi # requests: # cpu: 100m # memory: 128Mi # Create initial user. defaultUser: enabled: true role: Admin username: admin email: admin@example.com firstName: admin lastName: user password: admin # Mount additional volumes into webserver. extraVolumes: [ ] # - name: airflow-ui # emptyDir: { } extraVolumeMounts: [ ] # - name: airflow-ui # mountPath: /opt/airflow # This will be mounted into the Airflow Webserver as a custom # webserver_config.py. You can bake a webserver_config.py in to your image # instead webserverConfig: ~ # webserverConfig: | # from airflow import configuration as conf # # The SQLAlchemy connection string. # SQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN') # # Flask-WTF flag for CSRF # CSRF_ENABLED = True service: type: NodePort ## service annotations annotations: { } # Annotations to add to webserver kubernetes service account. 
serviceAccountAnnotations: { } # Flower settings flower: # Additional network policies as needed extraNetworkPolicies: [ ] resources: { } # limits: # cpu: 100m # memory: 128Mi # requests: # cpu: 100m # memory: 128Mi # A secret containing the connection secretName: ~ # Else, if username and password are set, create secret from username and password username: ~ password: ~ service: type: ClusterIP # Statsd settings statsd: enabled: true # Additional network policies as needed extraNetworkPolicies: [ ] resources: { } # limits: # cpu: 100m # memory: 128Mi # requests: # cpu: 100m # memory: 128Mi service: extraAnnotations: { } # Pgbouncer settings pgbouncer: # Enable pgbouncer enabled: false # Additional network policies as needed extraNetworkPolicies: [ ] # Pool sizes metadataPoolSize: 10 resultBackendPoolSize: 5 # Maximum clients that can connect to pgbouncer (higher = more file descriptors) maxClientConn: 100 # Pgbouner pod disruption budget podDisruptionBudget: enabled: false # PDB configuration config: maxUnavailable: 1 # Limit the resources to pgbouncerExported. # When you specify the resource request the scheduler uses this information to decide which node to place # the Pod on. When you specify a resource limit for a Container, the kubelet enforces those limits so # that the running container is not allowed to use more of that resource than the limit you set. # See: https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/ # Example: # # resource: # limits: # cpu: 100m # memory: 128Mi # requests: # cpu: 100m # memory: 128Mi resources: { } service: extraAnnotations: { } # https://www.pgbouncer.org/config.html verbose: 0 logDisconnections: 0 logConnections: 0 sslmode: "prefer" ciphers: "normal" ssl: ca: ~ cert: ~ key: ~ redis: terminationGracePeriodSeconds: 600 persistence: # Enable persistent volumes enabled: true # Volume size for worker StatefulSet size: 1Gi # If using a custom storageClass, pass name ref to all statefulSets here storageClassName: resources: { } # limits: # cpu: 100m # memory: 128Mi # requests: # cpu: 100m # memory: 128Mi # If set use as redis secret passwordSecretName: ~ brokerURLSecretName: ~ # Else, if password is set, create secret with it, # else generate a new one on install password: ~ # This setting tells kubernetes that its ok to evict # when it wants to scale a node down. safeToEvict: true # Auth secret for a private registry # This is used if pulling airflow images from a private registry registry: secretName: ~ # Example: # connection: # user: ~ # pass: ~ # host: ~ # email: ~ connection: { } # Elasticsearch logging configuration elasticsearch: # Enable elasticsearch task logging enabled: true # A secret containing the connection # secretName: ~ # Or an object representing the connection # Example: connection: # user: # pass: host: elasticsearch-master-headless.elk.svc.cluster.local port: 9200 # connection: {} # All ports used by chart ports: flowerUI: 5555 airflowUI: 8080 workerLogs: 8793 redisDB: 6379 statsdIngest: 9125 statsdScrape: 9102 pgbouncer: 6543 pgbouncerScrape: 9127 # Define any ResourceQuotas for namespace quotas: { } # Define default/max/min values for pods and containers in namespace limits: [ ] # This runs as a CronJob to cleanup old pods. 
cleanup: enabled: false # Run every 15 minutes schedule: "*/15 * * * *" # Configuration for postgresql subchart # Not recommended for production postgresql: enabled: true postgresqlPassword: postgres postgresqlUsername: postgres # Config settings to go into the mounted airflow.cfg # # Please note that these values are passed through the `tpl` function, so are # all subject to being rendered as go templates. If you need to include a # literal `{{` in a value, it must be expressed like this: # # a: '{{ "{{ not a template }}" }}' # # yamllint disable rule:line-length config: core: dags_folder: '{{ include "airflow_dags" . }}' load_examples: 'False' executor: '{{ .Values.executor }}' # For Airflow 1.10, backward compatibility colored_console_log: 'True' remote_logging: '{{- ternary "True" "False" .Values.elasticsearch.enabled }}' # Authentication backend used for the experimental API api: auth_backend: airflow.api.auth.backend.deny_all logging: remote_logging: '{{- ternary "True" "False" .Values.elasticsearch.enabled }}' colored_console_log: 'True' logging_level: INFO metrics: statsd_on: '{{ ternary "True" "False" .Values.statsd.enabled }}' statsd_port: 9125 statsd_prefix: airflow statsd_host: '{{ printf "%s-statsd" .Release.Name }}' webserver: enable_proxy_fix: 'True' expose_config: 'True' rbac: 'True' celery: default_queue: celery scheduler: scheduler_heartbeat_sec: 5 # For Airflow 1.10, backward compatibility statsd_on: '{{ ternary "True" "False" .Values.statsd.enabled }}' statsd_port: 9125 statsd_prefix: airflow statsd_host: '{{ printf "%s-statsd" .Release.Name }}' # Restart Scheduler every 41460 seconds (11 hours 31 minutes) # The odd time is chosen so it is not always restarting on the same "hour" boundary run_duration: 41460 elasticsearch: json_format: 'True' log_id_template: "{dag_id}_{task_id}_{execution_date}_{try_number}" elasticsearch_configs: max_retries: 3 timeout: 30 retry_timeout: 'True' kerberos: keytab: '{{ .Values.kerberos.keytabPath }}' reinit_frequency: '{{ .Values.kerberos.reinitFrequency }}' principal: '{{ .Values.kerberos.principal }}' ccache: '{{ .Values.kerberos.ccacheMountPath }}/{{ .Values.kerberos.ccacheFileName }}' kubernetes: namespace: '{{ .Release.Namespace }}' airflow_configmap: '{{ include "airflow_config" . }}' airflow_local_settings_configmap: '{{ include "airflow_config" . }}' pod_template_file: '{{ include "airflow_pod_template_file" . 
}}/pod_template_file.yaml' worker_container_repository: '{{ .Values.images.airflow.repository | default .Values.defaultAirflowRepository }}' worker_container_tag: '{{ .Values.images.airflow.tag | default .Values.defaultAirflowTag }}' delete_worker_pods: 'False' multi_namespace_mode: '{{ if .Values.multiNamespaceMode }}True{{ else }}False{{ end }}' # yamllint enable rule:line-length multiNamespaceMode: false podTemplate: # Git sync dags: persistence: # Enable persistent volume for storing dags enabled: false # Volume size for dags size: 1Gi # If using a custom storageClass, pass name here storageClassName: gp2 # access mode of the persistent volume accessMode: ReadWriteMany ## the name of an existing PVC to use existingClaim: "airflow-dags" gitSync: enabled: true repo: git@github.com:Tikna-inc/airflow.git branch: main rev: HEAD root: "/git" dest: "repo" depth: 1 maxFailures: 0 subPath: "" sshKeySecret: airflow-ssh-secret wait: 60 containerName: git-sync uid: 65533 ``` **and this is the dag with its tasks** ``` from datetime import timedelta import requests from airflow import DAG from airflow.operators.bash_operator import BashOperator from airflow.utils.dates import days_ago logging.getLogger().setLevel(level=logging.INFO) default_args = { 'owner': 'airflow', 'depends_on_past': False, 'email': ['airflow@example.com'], 'email_on_failure': False, 'email_on_retry': False, 'retries': 1, 'retry_delay': timedelta(minutes=5), } def get_active_customers(): requests.get("localhost:8080") dag = DAG( 'bash', default_args=default_args, description='A simple test DAG', schedule_interval='*/2 * * * *', start_date=days_ago(1), tags=['Test'], is_paused_upon_creation=False, catchup=False ) t1 = BashOperator( task_id='print_date', bash_command='mkdir ./itsMe', dag=dag ) t1 ``` This is airflow.cfg file ```cfg [api] auth_backend = airflow.api.auth.backend.deny_all [celery] default_queue = celery [core] colored_console_log = True dags_folder = /opt/airflow/dags/repo/ executor = KubernetesExecutor load_examples = False remote_logging = False [elasticsearch] json_format = True log_id_template = {dag_id}_{task_id}_{execution_date}_{try_number} [elasticsearch_configs] max_retries = 3 retry_timeout = True timeout = 30 [kerberos] ccache = /var/kerberos-ccache/cache keytab = /etc/airflow.keytab principal = airflow@FOO.COM reinit_frequency = 3600 [kubernetes] airflow_configmap = airflow-airflow-config airflow_local_settings_configmap = airflow-airflow-config dags_in_image = False delete_worker_pods = False multi_namespace_mode = False namespace = airflow pod_template_file = /opt/airflow/pod_templates/pod_template_file.yaml worker_container_repository = apache/airflow worker_container_tag = 2.0.0 [logging] colored_console_log = True logging_level = INFO remote_logging = False [metrics] statsd_host = airflow-statsd statsd_on = True statsd_port = 9125 statsd_prefix = airflow [scheduler] run_duration = 41460 scheduler_heartbeat_sec = 5 statsd_host = airflow-statsd statsd_on = True statsd_port = 9125 statsd_prefix = airflow [webserver] enable_proxy_fix = True expose_config = True ``` This is the pod yaml file for the new tasks ``` apiVersion: v1 kind: Pod metadata: annotations: dag_id: bash2 execution_date: "2021-01-14T20:16:00+00:00" kubernetes.io/psp: eks.privileged task_id: create_dir try_number: "2" labels: airflow-worker: "38" airflow_version: 2.0.0 dag_id: bash2 execution_date: 2021-01-14T20_16_00_plus_00_00 kubernetes_executor: "True" task_id: create_dir try_number: "2" name: sss3 namespace: airflow spec: 
containers: - args: - airflow - tasks - run - bash2 - create_dir - "2021-01-14T20:16:00+00:00" - --local - --pool - default_pool - --subdir - /opt/airflow/dags/repo/bash.py env: - name: AIRFLOW__CORE__EXECUTOR value: LocalExecutor - name: AIRFLOW__CORE__FERNET_KEY valueFrom: secretKeyRef: key: fernet-key name: airflow-fernet-key - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN valueFrom: secretKeyRef: key: connection name: airflow-airflow-metadata - name: AIRFLOW_CONN_AIRFLOW_DB valueFrom: secretKeyRef: key: connection name: airflow-airflow-metadata - name: AIRFLOW_IS_K8S_EXECUTOR_POD value: "True" image: apache/airflow:2.0.0 imagePullPolicy: IfNotPresent name: base resources: { } terminationMessagePath: /dev/termination-log terminationMessagePolicy: File volumeMounts: - mountPath: /opt/airflow/logs name: airflow-logs - mountPath: /opt/airflow/airflow.cfg name: config readOnly: true subPath: airflow.cfg - mountPath: /etc/git-secret/ssh name: git-sync-ssh-key subPath: ssh - mountPath: /opt/airflow/dags name: dags readOnly: true - mountPath: /var/run/secrets/kubernetes.io/serviceaccount name: airflow-worker-token-7sdtr readOnly: true dnsPolicy: ClusterFirst enableServiceLinks: true initContainers: - env: - name: GIT_SSH_KEY_FILE value: /etc/git-secret/ssh - name: GIT_SYNC_SSH value: "true" - name: GIT_KNOWN_HOSTS value: "false" - name: GIT_SYNC_REV value: HEAD - name: GIT_SYNC_BRANCH value: main - name: GIT_SYNC_REPO value: git@github.com:Tikna-inc/airflow.git - name: GIT_SYNC_DEPTH value: "1" - name: GIT_SYNC_ROOT value: /git - name: GIT_SYNC_DEST value: repo - name: GIT_SYNC_ADD_USER value: "true" - name: GIT_SYNC_WAIT value: "60" - name: GIT_SYNC_MAX_SYNC_FAILURES value: "0" - name: GIT_SYNC_ONE_TIME value: "true" image: k8s.gcr.io/git-sync:v3.1.6 imagePullPolicy: IfNotPresent name: git-sync resources: { } securityContext: runAsUser: 65533 terminationMessagePath: /dev/termination-log terminationMessagePolicy: File volumeMounts: - mountPath: /git name: dags - mountPath: /etc/git-secret/ssh name: git-sync-ssh-key readOnly: true subPath: gitSshKey - mountPath: /var/run/secrets/kubernetes.io/serviceaccount name: airflow-worker-token-7sdtr readOnly: true nodeName: ip-172-31-41-37.eu-south-1.compute.internal priority: 0 restartPolicy: Never schedulerName: default-scheduler securityContext: runAsUser: 50000 serviceAccount: airflow-worker serviceAccountName: airflow-worker terminationGracePeriodSeconds: 30 tolerations: - effect: NoExecute key: node.kubernetes.io/not-ready operator: Exists tolerationSeconds: 300 - effect: NoExecute key: node.kubernetes.io/unreachable operator: Exists tolerationSeconds: 300 volumes: - emptyDir: { } name: dags - name: git-sync-ssh-key secret: defaultMode: 288 secretName: airflow-ssh-secret - emptyDir: { } name: airflow-logs - configMap: defaultMode: 420 name: airflow-airflow-config name: config - name: airflow-worker-token-7sdtr secret: defaultMode: 420 secretName: airflow-worker-token-7sdtr ``` **-----------------------Important----------------------------** **Debugging** for debugging purpose I have changed the pod args rather than running the task, I ran it with ``` spec: containers: - args: - airflow - webserver ``` and tried to look for the Dags , and found None. It seems like gitSync is not working with the pods triggered by kubernetesExecutor. Any help please ???
https://github.com/apache/airflow/issues/13680
https://github.com/apache/airflow/pull/13826
3909232fafd09ac72b49010ecdfd6ea48f06d5cf
5f74219e6d400c4eae9134f6015c72430d6d549f
"2021-01-14T19:47:20Z"
python
"2021-02-04T19:01:46Z"
closed
apache/airflow
https://github.com/apache/airflow
13,679
["airflow/utils/db.py"]
SQL Syntax errors on startup
**Apache Airflow version**: 2.0.0 **What happened**: While investigating issues relating to task getting stuck, I saw this sql error in postgres logs. I am not entirely sure of what it impacts but I thought of letting you know. ``` ERROR: column "connection.password" must appear in the GROUP BY clause or be used in an aggregate function at character 8 STATEMENT: SELECT connection.password AS connection_password, connection.extra AS connection_extra, connection.id AS connection_id, connection.conn_id AS connection_conn_id, connection.conn_type AS connection_conn_type, connection.description AS connection_description, connection.host AS connection_host, connection.schema AS connection_schema, connection.login AS connection_login, connection.port AS connection_port, connection.is_encrypted AS connection_is_encrypted, connection.is_extra_encrypted AS connection_is_extra_encrypted, count(connection.conn_id) AS count_1 FROM connection GROUP BY connection.conn_id HAVING count(connection.conn_id) > 1 ERROR: current transaction is aborted, commands ignored until end of transaction block STATEMENT: SELECT connection.password AS connection_password, connection.extra AS connection_extra, connection.id AS connection_id, connection.conn_id AS connection_conn_id, connection.conn_type AS connection_conn_type, connection.description AS connection_description, connection.host AS connection_host, connection.schema AS connection_schema, connection.login AS connection_login, connection.port AS connection_port, connection.is_encrypted AS connection_is_encrypted, connection.is_extra_encrypted AS connection_is_extra_encrypted FROM connection WHERE connection.conn_type IS NULL ``` **How to reproduce it**: 1. Run `docker-compose run initdb` 2. Run `docker-compose run upgradedb` <details> <summary> Here's my docker-compose </summary> ``` version: "3.2" networks: airflow: services: postgres: container_name: af_postgres image: postgres:9.6 environment: - POSTGRES_USER=airflow - POSTGRES_DB=airflow - POSTGRES_PASSWORD=airflow volumes: - ./postgresql/data:/var/lib/postgresql/data command: > postgres -c listen_addresses=* -c logging_collector=on -c log_destination=stderr networks: - airflow initdb: container_name: af_initdb image: docker.io/apache/airflow:2.0.0-python3.7 environment: - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres:5432/airflow depends_on: - postgres entrypoint: /bin/bash command: -c "airflow db init" networks: - airflow upgradedb: container_name: af_upgradedb image: docker.io/apache/airflow:2.0.0-python3.7 environment: - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres:5432/airflow depends_on: - postgres entrypoint: /bin/bash command: -c "airflow db upgrade" networks: - airflow ``` </details> **Anything else we need to know**: Upon looking the code, I believe having `Connection.conn_id` [here](https://github.com/apache/airflow/blob/ab5f770bfcd8c690cbe4d0825896325aca0beeca/airflow/utils/db.py#L613) will resolve the sql syntax error.
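A minimal sketch of the suggested change, assuming the duplicate-connection check only needs the grouped column (the helper name is illustrative). Selecting just `Connection.conn_id` plus the aggregate keeps the SELECT list compatible with the GROUP BY clause on PostgreSQL:

```python
from sqlalchemy import func

from airflow.models import Connection


def find_duplicate_conn_ids(session):
    # Selecting only conn_id (plus the aggregate) avoids pulling every Connection
    # column into the SELECT list, which PostgreSQL rejects unless each column
    # also appears in GROUP BY or inside an aggregate function.
    return (
        session.query(Connection.conn_id, func.count(Connection.conn_id))
        .group_by(Connection.conn_id)
        .having(func.count(Connection.conn_id) > 1)
        .all()
    )
```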
https://github.com/apache/airflow/issues/13679
https://github.com/apache/airflow/pull/13783
1602ec97c8d5bc7a7a8b42e850ac6c7a7030e47d
b4c8a0406e88f330b38e8571b5b3ea399ff6fe7d
"2021-01-14T18:15:42Z"
python
"2021-01-20T07:23:28Z"
closed
apache/airflow
https://github.com/apache/airflow
13,677
["chart/templates/scheduler/scheduler-deployment.yaml"]
Airflow Scheduler Liveliness Probe does not support running multiple instances
## Topic Airflow with Kubernetes Executor ## Version Airflow 2.0.0 ## Description Hi, This code https://github.com/apache/airflow/blob/1d2977f6a4c67fa6174c79dcdc4e9ee3ce06f1b1/chart/templates/scheduler/scheduler-deployment.yaml#L138 causes scheduler pods to restart at random, because the liveness probe can hit a different scheduler's hostname when more than one scheduler replica is running. ### Solution Suggested change: ``` livenessProbe: exec: command: - python - '-Wignore' - '-c' - > import sys import os from airflow.jobs.scheduler_job import SchedulerJob from airflow.utils.session import provide_session from airflow.utils.state import State from airflow.utils.net import get_hostname @provide_session def all_running_jobs(session=None): return session.query(SchedulerJob).filter(SchedulerJob.state == State.RUNNING).all() os.environ['AIRFLOW__CORE__LOGGING_LEVEL'] = 'ERROR' os.environ['AIRFLOW__LOGGING__LOGGING_LEVEL'] = 'ERROR' all_active_schedulers = all_running_jobs() current_scheduler = get_hostname() for _job in all_active_schedulers: if _job.hostname == current_scheduler and _job.is_alive(): sys.exit(0) sys.exit(1) ```
https://github.com/apache/airflow/issues/13677
https://github.com/apache/airflow/pull/13705
808092928a66908f36aec585b881c5390d365130
2abfe1e1364a98e923a0967e4a989ccabf8bde54
"2021-01-14T17:23:03Z"
python
"2021-01-15T23:52:52Z"
closed
apache/airflow
https://github.com/apache/airflow
13,676
["airflow/api_connexion/endpoints/xcom_endpoint.py", "airflow/api_connexion/openapi/v1.yaml", "tests/api_connexion/endpoints/test_xcom_endpoint.py"]
API Endpoints - /xcomEntries/{xcom_key} - doesn't return value
**Apache Airflow version**: 2.0.0 **What happened**: Using the endpoint `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}` I got a response body without the `value` entry, like: ``` { "dag_id": "string", "execution_date": "string", "key": "string", "task_id": "string", "timestamp": "string" } ``` instead of: ``` { "dag_id": "string", "execution_date": "string", "key": "string", "task_id": "string", "timestamp": "string", "value": "string" } ``` A value for the given `key` does exist.
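A hedged sketch of how a value-bearing schema for the single-XCom endpoint could look; Airflow may already ship an equivalent class, and the module path, class name and field usage here are illustrative rather than confirmed:

```python
from marshmallow_sqlalchemy import auto_field

from airflow.api_connexion.schemas.xcom_schema import XComCollectionItemSchema


class XComWithValueSchema(XComCollectionItemSchema):
    """Same fields as a collection item, plus the XCom value itself."""

    # Whether the fix lands in the schema or in the endpoint handler that picks
    # the schema, the missing piece is serializing this column.
    value = auto_field()
```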
https://github.com/apache/airflow/issues/13676
https://github.com/apache/airflow/pull/13684
2fef2ab1bf0f8c727a503940c9c65fd5be208386
dc80fa4cbc070fc6e84fcc95799d185badebaa71
"2021-01-14T15:57:46Z"
python
"2021-01-15T10:18:44Z"
closed
apache/airflow
https://github.com/apache/airflow
13,668
["airflow/jobs/scheduler_job.py", "airflow/models/dag.py", "airflow/models/dagrun.py", "airflow/models/pool.py", "airflow/models/taskinstance.py", "airflow/utils/sqlalchemy.py", "tests/utils/test_sqlalchemy.py"]
scheduler dies with "MySQLdb._exceptions.OperationalError: (1213, 'Deadlock found when trying to get lock; try restarting transaction')"
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: - **Cloud provider or hardware configuration**: tencent cloud - **OS** (e.g. from /etc/os-release): centos7 - **Kernel** (e.g. `uname -a`): 3.10 - **Install tools**: - **Others**: Server version: 8.0.22 MySQL Community Server - GPL **What happened**: Scheduler dies when I try to restart it. And the logs are as follows: ``` {2021-01-14 13:29:05,424} {{scheduler_job.py:1293}} ERROR - Exception when executing SchedulerJob._run_scheduler_loop Traceback (most recent call last): File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context self.dialect.do_execute( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 609, in do_execute cursor.execute(statement, parameters) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/cursors.py", line 209, in execute res = self._query(query) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/cursors.py", line 315, in _query db.query(q) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/connections.py", line 239, in query _mysql.connection.query(self, query) MySQLdb._exceptions.OperationalError: (1213, 'Deadlock found when trying to get lock; try restarting transaction') The above exception was the direct cause of the following exception: Traceback (most recent call last): File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1275, in _execute self._run_scheduler_loop() File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1349, in _run_scheduler_loop self.adopt_or_reset_orphaned_tasks() File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/utils/session.py", line 65, in wrapper return func(*args, session=session, **kwargs) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1758, in adopt_or_reset_orphaned_tasks session.query(SchedulerJob) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 4063, in update update_op.exec_() File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 1697, in exec_ self._do_exec() File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 1895, in _do_exec self._execute_stmt(update_stmt) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py", line 1702, in _execute_stmt self.result = self.query._execute_crud(stmt, self.mapper) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3568, in _execute_crud return conn.execute(stmt, self._params) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1011, in execute return meth(self, multiparams, params) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection return 
connection._execute_clauseelement(self, multiparams, params) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement ret = self._execute_context( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context self._handle_dbapi_exception( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception util.raise_( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 182, in raise_ raise exception File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context self.dialect.do_execute( File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 609, in do_execute cursor.execute(statement, parameters) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/cursors.py", line 209, in execute res = self._query(query) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/cursors.py", line 315, in _query db.query(q) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/MySQLdb/connections.py", line 239, in query _mysql.connection.query(self, query) sqlalchemy.exc.OperationalError: (MySQLdb._exceptions.OperationalError) (1213, 'Deadlock found when trying to get lock; try restarting transaction') [SQL: UPDATE job SET state=%s WHERE job.state = %s AND job.latest_heartbeat < %s] [parameters: ('failed', 'running', datetime.datetime(2021, 1, 14, 5, 28, 35, 157941))] (Background on this error at: http://sqlalche.me/e/13/e3q8) {2021-01-14 13:29:06,435} {{process_utils.py:95}} INFO - Sending Signals.SIGTERM to GPID 6293 {2021-01-14 13:29:06,677} {{process_utils.py:61}} INFO - Process psutil.Process(pid=6318, status='terminated') (6318) terminated with exit code None {2021-01-14 13:29:06,767} {{process_utils.py:201}} INFO - Waiting up to 5 seconds for processes to exit... {2021-01-14 13:29:06,850} {{process_utils.py:61}} INFO - Process psutil.Process(pid=6320, status='terminated') (6320) terminated with exit code None {2021-01-14 13:29:06,850} {{process_utils.py:61}} INFO - Process psutil.Process(pid=6319, status='terminated') (6319) terminated with exit code None {2021-01-14 13:29:06,858} {{process_utils.py:61}} INFO - Process psutil.Process(pid=6321, status='terminated') (6321) terminated with exit code None {2021-01-14 13:29:06,864} {{process_utils.py:201}} INFO - Waiting up to 5 seconds for processes to exit... {2021-01-14 13:29:06,876} {{process_utils.py:61}} INFO - Process psutil.Process(pid=6293, status='terminated') (6293) terminated with exit code 0 {2021-01-14 13:29:06,876} {{scheduler_job.py:1296}} INFO - Exited execute loop ``` **What you expected to happen**: Schdeduler should not die. **How to reproduce it**: I don't know how to reproduce it **Anything else we need to know**: I just upgrade airflow from 1.10.14. Now I try to fix it temporarily by catching the exception in scheduler_job.py ```python for dag_run in dag_runs: try: self._schedule_dag_run(dag_run, active_runs_by_dag_id.get(dag_run.dag_id, set()), session) except Exception as e: self.log.exception(e) ```
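One common mitigation, sketched below, is to retry the conflicting transaction a bounded number of times when MySQL reports error 1213 instead of letting the scheduler loop die. This is purely illustrative (the helper name, error-code check and backoff are made up) and is not the actual Airflow patch:

```python
import time

from sqlalchemy.exc import OperationalError

MYSQL_DEADLOCK_ERRNO = 1213


def run_with_deadlock_retries(session, operation, max_attempts=3):
    """Run `operation(session)` and retry a few times on MySQL deadlocks."""
    for attempt in range(1, max_attempts + 1):
        try:
            result = operation(session)
            session.commit()
            return result
        except OperationalError as err:
            session.rollback()
            errno = getattr(err.orig, "args", [None])[0]
            if errno != MYSQL_DEADLOCK_ERRNO or attempt == max_attempts:
                raise
            time.sleep(0.5 * attempt)  # brief backoff before retrying
```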
https://github.com/apache/airflow/issues/13668
https://github.com/apache/airflow/pull/14031
019389d034700c53d218135ab01128ff8b325b1c
568327f01a39d6f181dda62ef6a143f5096e6b97
"2021-01-14T07:05:53Z"
python
"2021-02-03T02:55:27Z"
closed
apache/airflow
https://github.com/apache/airflow
13,667
["airflow/models/dagbag.py"]
scheduler dies with "TypeError: '>' not supported between instances of 'NoneType' and 'datetime.datetime'"
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: - **Cloud provider or hardware configuration**: tencent cloud - **OS** (e.g. from /etc/os-release): centos7 - **Kernel** (e.g. `uname -a`): 3.10 - **Install tools**: - **Others**: Server version: 8.0.22 MySQL Community Server - GPL **What happened**: Scheduler dies when I try to restart it. And the logs are as follows: ``` 2021-01-14 14:07:44,429} {{scheduler_job.py:1754}} INFO - Resetting orphaned tasks for active dag runs {2021-01-14 14:08:14,470} {{scheduler_job.py:1754}} INFO - Resetting orphaned tasks for active dag runs {2021-01-14 14:08:16,968} {{scheduler_job.py:1293}} ERROR - Exception when executing SchedulerJob._run_scheduler_loop Traceback (most recent call last): File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1275, in _execute self._run_scheduler_loop() File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1377, in _run_scheduler_loop num_queued_tis = self._do_scheduling(session) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1516, in _do_scheduling self._schedule_dag_run(dag_run, active_runs_by_dag_id.get(dag_run.dag_id, set()), session) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1629, in _schedule_dag_run dag = dag_run.dag = self.dagbag.get_dag(dag_run.dag_id, session=session) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/utils/session.py", line 62, in wrapper return func(*args, **kwargs) File "/home/app/.pyenv/versions/3.8.1/envs/airflow-py381/lib/python3.8/site-packages/airflow/models/dagbag.py", line 187, in get_dag if sd_last_updated_datetime > self.dags_last_fetched[dag_id]: TypeError: '>' not supported between instances of 'NoneType' and 'datetime.datetime' {2021-01-14 14:08:17,975} {{process_utils.py:95}} INFO - Sending Signals.SIGTERM to GPID 53178 {2021-01-14 14:08:18,212} {{process_utils.py:61}} INFO - Process psutil.Process(pid=58676, status='terminated') (58676) terminated with exit code None {2021-01-14 14:08:18,295} {{process_utils.py:201}} INFO - Waiting up to 5 seconds for processes to exit... {2021-01-14 14:08:18,345} {{process_utils.py:61}} INFO - Process psutil.Process(pid=53178, status='terminated') (53178) terminated with exit code 0 {2021-01-14 14:08:18,345} {{process_utils.py:61}} INFO - Process psutil.Process(pid=58677, status='terminated') (58677) terminated with exit code None {2021-01-14 14:08:18,346} {{process_utils.py:61}} INFO - Process psutil.Process(pid=58678, status='terminated') (58678) terminated with exit code None {2021-01-14 14:08:18,346} {{process_utils.py:61}} INFO - Process psutil.Process(pid=58708, status='terminated') (58708) terminated with exit code None {2021-01-14 14:08:18,346} {{scheduler_job.py:1296}} INFO - Exited execute loop ``` **What you expected to happen**: Schdeduler should not die. **How to reproduce it**: I don't know how to reproduce it **Anything else we need to know**: I just upgrade airflow from 1.10.14. 
For now I am working around it temporarily by catching the exception in scheduler_job.py: ```python for dag_run in dag_runs: try: self._schedule_dag_run(dag_run, active_runs_by_dag_id.get(dag_run.dag_id, set()), session) except Exception as e: self.log.exception(e) ```
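A small sketch of the guard that would avoid this crash, assuming the serialized DAG row can disappear between fetches so its last-updated timestamp comes back as None. The helper below is illustrative, not the actual patch:

```python
from datetime import datetime
from typing import Optional


def serialized_dag_needs_refresh(
    sd_last_updated: Optional[datetime], last_fetched: datetime
) -> bool:
    # A missing timestamp means the serialized DAG row is gone; treat that as
    # "nothing newer to load" instead of letting None reach the > comparison.
    if sd_last_updated is None:
        return False
    return sd_last_updated > last_fetched
```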
https://github.com/apache/airflow/issues/13667
https://github.com/apache/airflow/pull/13899
ffb472cf9e630bd70f51b74b0d0ea4ab98635572
8958d125cd4ac9e58d706d75be3eb88d591199cd
"2021-01-14T06:51:40Z"
python
"2021-01-26T13:32:33Z"
closed
apache/airflow
https://github.com/apache/airflow
13,659
["docs/apache-airflow/howto/define_extra_link.rst"]
Operator Extra Links not showing up on UI
<!-- Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions. Don't worry if they're not all applicable; just try to include what you can :-) If you need to include code snippets or logs, please put them in fenced code blocks. If they're super-long, please use the details tag like <details><summary>super-long log</summary> lots of stuff </details> Please delete these comment blocks before submitting the issue. --> <!-- IMPORTANT!!! PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE NEXT TO "SUBMIT NEW ISSUE" BUTTON!!! PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!! Please complete the next sections or the issue will be closed. These questions are the first thing we need to know to understand the context. --> **Apache Airflow version**: 2.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.18 **Environment**: - **Cloud provider or hardware configuration**: AWS - **OS** (e.g. from /etc/os-release): Linux - **Kernel** (e.g. `uname -a`): Linux - **Install tools**: - **Others**: **What happened**: Followed the Example Here: https://airflow.apache.org/docs/apache-airflow/stable/howto/define_extra_link.html#define-an-operator-extra-link, and was expecting Link to show up on UI but it does not :( ![image](https://user-images.githubusercontent.com/23406205/104509302-52f65080-559e-11eb-9459-2815f8bb5573.png) ``` class GoogleLink(BaseOperatorLink): name = "Google" def get_link(self, operator, dttm): return "https://www.google.com" class MyFirstOperator(BaseOperator): operator_extra_links = ( GoogleLink(), ) @apply_defaults def __init__(self, **kwargs): super().__init__(**kwargs) def execute(self, context): self.log.info("Hello World!") print(self.extra_links) ``` <!-- (please include exact error messages if you can) --> **What you expected to happen**: I expected a Button Link to show up in Task Instance Model <!-- What do you think went wrong? --> **How to reproduce it**: Follow Example here on Airflow 2.0 https://airflow.apache.org/docs/apache-airflow/stable/howto/define_extra_link.html#define-an-operator-extra-link <!--- As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags. If you are using kubernetes, please attempt to recreate the issue using minikube or kind. ## Install minikube/kind - Minikube https://minikube.sigs.k8s.io/docs/start/ - Kind https://kind.sigs.k8s.io/docs/user/quick-start/ If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action You can include images using the .md style of ![alt text](http://url/to/img.png) To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file. ---> **Anything else we need to know**: <!-- How often does this problem occur? Once? Every time etc? Any relevant logs to include? Put them here in side a detail tag: <details><summary>x.log</summary> lots of stuff </details> -->
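One thing worth checking, sketched below on the assumption that the webserver only sees serialized DAGs and therefore needs the extra-link class registered through a plugin before it can render the button. The plugin name is made up, and `GoogleLink` mirrors the class from the snippet above:

```python
from airflow.models.baseoperator import BaseOperatorLink
from airflow.plugins_manager import AirflowPlugin


class GoogleLink(BaseOperatorLink):
    name = "Google"

    def get_link(self, operator, dttm):
        return "https://www.google.com"


class MyExtraLinkPlugin(AirflowPlugin):
    # Registering the link class here is what lets the webserver render the
    # button when it only loads the serialized form of the DAG.
    name = "my_extra_link_plugin"
    operator_extra_links = [GoogleLink()]
```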
https://github.com/apache/airflow/issues/13659
https://github.com/apache/airflow/pull/13683
3558538883612a10e9ea3521bf864515b6e560c5
3d21082adc3bde63a15dad4db85b448ff695cfc6
"2021-01-13T20:55:43Z"
python
"2021-01-15T12:21:53Z"
closed
apache/airflow
https://github.com/apache/airflow
13,656
["airflow/www/static/js/connection_form.js"]
Password is unintendedly changed when editing a connection
**Apache Airflow version**: 2.0.0 **What happened**: When editing a connection - without changing the password - and saving the edited connection, a wrong password is saved. **What you expected to happen**: If I do not change the password in the UI, I expect that the password is not changed. **How to reproduce it**: - Create a new connection and save it (screenshots 1 + 2) - Edit the connection without editing the password, and save it again (screenshots 3 + 4) If you _do_ edit the password, the (new or old) password is saved correctly. *Screenshot 1* ![image](https://user-images.githubusercontent.com/46958547/104480468-e5a9e600-55c4-11eb-8810-b3f033a750a4.png) *Screenshot 2* ![image](https://user-images.githubusercontent.com/46958547/104480612-0e31e000-55c5-11eb-8aa7-6b9ecbb9a858.png) *Screenshot 3* ![image](https://user-images.githubusercontent.com/46958547/104480726-29045480-55c5-11eb-8c93-43c05ba859fe.png) *Screenshot 4* ![image](https://user-images.githubusercontent.com/46958547/104480770-39b4ca80-55c5-11eb-90d7-59be971ac53d.png) (I blurred out the full string in the unlikely case that the full string might contain information on my fernet key or something)
https://github.com/apache/airflow/issues/13656
https://github.com/apache/airflow/pull/15073
1627323a197bba2c4fbd71816a9a6bd3f78c1657
b4374d33b0e5d62c3510f1f5ac4a48e7f48cb203
"2021-01-13T16:34:22Z"
python
"2021-03-29T19:12:15Z"
closed
apache/airflow
https://github.com/apache/airflow
13,653
["airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/dag_schema.py", "tests/api_connexion/endpoints/test_dag_endpoint.py", "tests/api_connexion/schemas/test_dag_schema.py"]
API Endpoint for Airflow V1 - DAGs details
**Description** We need an endpoint in the Airflow V1 API to retrieve the details of an existing DAG, e.g. `GET /dags/{dag_id}/details` **Use case / motivation** We want to be able to retrieve/discover the parameters that a DAG accepts. We can see that you pass parameters when you execute a DAG via the conf object. We can also see that you explicitly declare the parameters a DAG accepts via the params argument when creating the DAG. However, we cannot find anything in either the REST API or the CLI that lets you retrieve this information from a DAG (note that we are not talking about a DAG run). It does not even look like the version 2 API supports this, although the OpenAPI spec mentions a dags/{dag_id}/details endpoint that is not documented. We found the related GitHub issue for this new endpoint and it is done, but it looks like the documentation has not been updated yet. Please can you: 1. Provide the response for the v2 details endpoint 2. Advise when the v2 documentation will be updated with the details endpoint. 3. Advise if there is a workaround for doing this on v1.1 **Related Issues** #8138
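For reference, a hedged example of calling the stable API's details endpoint where it is available; the host, credentials and dag_id are placeholders, and basic auth has to be enabled via the webserver's `[api] auth_backend` for this call to work:

```python
import requests

resp = requests.get(
    "http://localhost:8080/api/v1/dags/example_dag/details",  # placeholder host/dag
    auth=("admin", "admin"),  # placeholder credentials
)
resp.raise_for_status()
details = resp.json()
print(details.get("params"))  # the field the use case above wants to discover
```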
https://github.com/apache/airflow/issues/13653
https://github.com/apache/airflow/pull/13790
2c6c7fdb2308de98e142618836bdf414df9768c8
10b8ecc86f24739a38e56347dcc8dc60e3e43975
"2021-01-13T14:21:27Z"
python
"2021-01-21T15:42:19Z"
closed
apache/airflow
https://github.com/apache/airflow
13,638
["airflow/utils/log/file_task_handler.py", "tests/utils/test_log_handlers.py"]
Stable API task logs
<!-- Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions. Don't worry if they're not all applicable; just try to include what you can :-) If you need to include code snippets or logs, please put them in fenced code blocks. If they're super-long, please use the details tag like <details><summary>super-long log</summary> lots of stuff </details> Please delete these comment blocks before submitting the issue. --> <!-- IMPORTANT!!! PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE NEXT TO "SUBMIT NEW ISSUE" BUTTON!!! PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!! Please complete the next sections or the issue will be closed. These questions are the first thing we need to know to understand the context. --> **Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA **Environment**: - **Cloud provider or hardware configuration**: PC (docker-compose) - **OS** (e.g. from /etc/os-release): Linux mint 20 (for PC), Debian Buster in container - **Kernel** (e.g. `uname -a`): Linux 607a1bfeebd2 5.4.0-60-generic #67-Ubuntu SMP Tue Jan 5 18:31:36 UTC 2021 x86_64 GNU/Linux - **Install tools**: Poetry (so pipy) - **Others**: Using python 3.8.6, with Celery Executor, one worker Task did run properly **What happened**: I tried to get the logs of a task instance using the stable Rest API through the Swagger UI included in Airflow, and it crashed (got a stack trace) I got 500 error ``` engine-webserver_1 | 2021-01-12T16:45:18.465370280Z [2021-01-12 16:45:18,464] {app.py:1891} ERROR - Exception on /api/v1/dags/insert/dagRuns/manual__2021-01-12T15:05:59.560500+00:00/taskInstances/insert-db/logs/0 [GET] engine-webserver_1 | 2021-01-12T16:45:18.465391147Z Traceback (most recent call last): engine-webserver_1 | 2021-01-12T16:45:18.465394643Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app engine-webserver_1 | 2021-01-12T16:45:18.465397709Z response = self.full_dispatch_request() engine-webserver_1 | 2021-01-12T16:45:18.465400161Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request engine-webserver_1 | 2021-01-12T16:45:18.465402912Z rv = self.handle_user_exception(e) engine-webserver_1 | 2021-01-12T16:45:18.465405405Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception engine-webserver_1 | 2021-01-12T16:45:18.465407715Z reraise(exc_type, exc_value, tb) engine-webserver_1 | 2021-01-12T16:45:18.465409739Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise engine-webserver_1 | 2021-01-12T16:45:18.465412258Z raise value engine-webserver_1 | 2021-01-12T16:45:18.465414560Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request engine-webserver_1 | 2021-01-12T16:45:18.465425555Z rv = self.dispatch_request() engine-webserver_1 | 2021-01-12T16:45:18.465427999Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request engine-webserver_1 | 2021-01-12T16:45:18.465429697Z return self.view_functions[rule.endpoint](**req.view_args) engine-webserver_1 | 2021-01-12T16:45:18.465431146Z File 
"/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/decorator.py", line 48, in wrapper engine-webserver_1 | 2021-01-12T16:45:18.465433001Z response = function(request) engine-webserver_1 | 2021-01-12T16:45:18.465434308Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/uri_parsing.py", line 144, in wrapper engine-webserver_1 | 2021-01-12T16:45:18.465435841Z response = function(request) engine-webserver_1 | 2021-01-12T16:45:18.465437122Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/validation.py", line 384, in wrapper engine-webserver_1 | 2021-01-12T16:45:18.465438620Z return function(request) engine-webserver_1 | 2021-01-12T16:45:18.465440074Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/response.py", line 103, in wrapper engine-webserver_1 | 2021-01-12T16:45:18.465441667Z response = function(request) engine-webserver_1 | 2021-01-12T16:45:18.465443086Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/parameter.py", line 121, in wrapper engine-webserver_1 | 2021-01-12T16:45:18.465445345Z return function(**kwargs) engine-webserver_1 | 2021-01-12T16:45:18.465446713Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/airflow/api_connexion/security.py", line 47, in decorated engine-webserver_1 | 2021-01-12T16:45:18.465448202Z return func(*args, **kwargs) engine-webserver_1 | 2021-01-12T16:45:18.465449538Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/airflow/utils/session.py", line 65, in wrapper engine-webserver_1 | 2021-01-12T16:45:18.465451032Z return func(*args, session=session, **kwargs) engine-webserver_1 | 2021-01-12T16:45:18.465452504Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/airflow/api_connexion/endpoints/log_endpoint.py", line 81, in get_log engine-webserver_1 | 2021-01-12T16:45:18.465454135Z logs, metadata = task_log_reader.read_log_chunks(ti, task_try_number, metadata) engine-webserver_1 | 2021-01-12T16:45:18.465455658Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/airflow/utils/log/log_reader.py", line 58, in read_log_chunks engine-webserver_1 | 2021-01-12T16:45:18.465457226Z logs, metadatas = self.log_handler.read(ti, try_number, metadata=metadata) engine-webserver_1 | 2021-01-12T16:45:18.465458632Z ValueError: not enough values to unpack (expected 2, got 1) ``` <!-- (please include exact error messages if you can) --> **What you expected to happen**: I expected to get the logs of my task <!-- What do you think went wrong? --> **How to reproduce it**: I think it's everytime (at least on my side) <!--- As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags. If you are using kubernetes, please attempt to recreate the issue using minikube or kind. ## Install minikube/kind - Minikube https://minikube.sigs.k8s.io/docs/start/ - Kind https://kind.sigs.k8s.io/docs/user/quick-start/ If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action You can include images using the .md style of ![alt text](http://url/to/img.png) To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file. 
---> **Anything else we need to know**: Other stable API calls, such as getting the list of DAG runs or task instances, worked well. The logs also appear fine when I view the task in the web UI. EDIT: OK, my mistake, I put 0 as the try number instead of 1... So not a big bug, though I think a try number of 0 should return a 400 status response rather than a 500 crash. Should I keep this open?
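A minimal sketch of the validation suggested in the EDIT above; the helper name and error handling are illustrative only, not the actual `api_connexion` endpoint code. It shows how rejecting an out-of-range try number up front would turn the 500 into a client error.

```python
# Hypothetical validation sketch for a log endpoint (illustrative names).
def validate_try_number(task_try_number: int) -> None:
    if task_try_number < 1:
        # Surface this as a client error (HTTP 400) instead of letting the
        # log reader fail later with "not enough values to unpack".
        raise ValueError(f"try_number must be >= 1, got {task_try_number}")

validate_try_number(1)      # fine
# validate_try_number(0)    # would raise, i.e. the request should be rejected as 400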
https://github.com/apache/airflow/issues/13638
https://github.com/apache/airflow/pull/14001
32d2c25e2dd1fd069f51bdfdd79595f12047a867
2366f861ee97f50e2cff83d557a1ae97030febf9
"2021-01-12T17:10:25Z"
python
"2021-02-01T13:33:30Z"
closed
apache/airflow
https://github.com/apache/airflow
13,637
["UPDATING.md", "airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg"]
Scheduler takes 100% of CPU without task execution
Hi, running Airflow 2.0.0 with Python 3.6.9, the scheduler is consuming a lot of CPU time without executing any task:
```
  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
15758 oli       20   0   42252   3660   3124 R 100.0  0.0   0:00.06 top
16764 oli       20   0  590272  90648  15468 R 200.0  0.3   0:00.59 airflow schedul
16769 oli       20   0  588808  77236  13900 R 200.0  0.3   0:00.55 airflow schedul
    1 root      20   0    1088    548    516 S   0.0  0.0   0:13.28 init
   10 root      20   0     900     80     16 S   0.0  0.0   0:00.00 init
```
https://github.com/apache/airflow/issues/13637
https://github.com/apache/airflow/pull/13664
9536ad906f1591a5a0f82f69ba3bd214c4516c5b
e4b8ee63b04a25feb21a5766b1cc997aca9951a9
"2021-01-12T14:16:04Z"
python
"2021-01-14T13:08:12Z"
closed
apache/airflow
https://github.com/apache/airflow
13,634
["airflow/providers/segment/provider.yaml"]
Docs: Segment `external-doc-url` links to Dingtalk API
On file `master:airflow/providers/segment/provider.yaml`:
```
integrations:
  - integration-name: Segment
    external-doc-url: https://oapi.dingtalk.com
    tags: [service]
```
That is the API for Dingtalk, which is an unrelated Alibaba-owned service. The docs for Twilio Segment can be found at https://segment.com/docs/. I am not sure if this issue is the result of a problem somewhere else, but I identified it while adding integration logos. I am not sure all of the template information below is relevant, but it appears I must add it. **Apache Airflow version**: 2.0.0 **Environment**: GNU/Linux - **Cloud provider or hardware configuration**: - **OS** (e.g. from /etc/os-release): Ubuntu - **Kernel** (e.g. `uname -a`): 20.4 - **Install tools**: PIP **What happened**: `external-doc-url` points to the Dingtalk API **What you expected to happen**: `external-doc-url` should point to the Segment docs **How to reproduce it**: Observe the code at `master:airflow/providers/segment/provider.yaml`:
```
integrations:
  - integration-name: Segment
    external-doc-url: https://oapi.dingtalk.com
    tags: [service]
```
I'm new to Apache Airflow, so please bear with me 😄
https://github.com/apache/airflow/issues/13634
https://github.com/apache/airflow/pull/13645
189af54043a6aa6e7557bda6cf7cfca229d0efd2
548d082008c0c83f44020937f6ff19ca006b96cc
"2021-01-12T13:06:38Z"
python
"2021-01-13T12:07:17Z"
closed
apache/airflow
https://github.com/apache/airflow
13,629
["airflow/providers/apache/hive/hooks/hive.py"]
HiveCliHook kill method error
https://github.com/apache/airflow/blob/6c458f29c0eeadb1282e524e76fdd379d6436824/airflow/providers/apache/hive/hooks/hive.py#L464 It should be: ``` if hasattr(self, 'sub_process') ```
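A minimal self-contained sketch of the suggested guard; the class below is a stand-in, not the actual provider code. It illustrates why `kill()` must test for the attribute the hook actually sets: `kill()` can run before `run_cli()` ever created the subprocess.

```python
import subprocess
import time

class HiveCliHookSketch:
    """Stand-in illustrating why the hasattr guard must name 'sub_process'."""

    def run_cli(self, cmd):
        # the attribute only exists once a job has been started
        self.sub_process = subprocess.Popen(cmd)

    def kill(self):
        if hasattr(self, 'sub_process'):        # the suggested check
            if self.sub_process.poll() is None:
                print("Killing the Hive job")
                self.sub_process.terminate()
                time.sleep(60)
                self.sub_process.kill()
```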
https://github.com/apache/airflow/issues/13629
https://github.com/apache/airflow/pull/14542
45a0ac2e01c174754f4e6612c8e4d3125061d096
d9e4454c66051a9e8bb5b2f3814d46f29332b89d
"2021-01-12T07:06:21Z"
python
"2021-03-01T13:59:12Z"
closed
apache/airflow
https://github.com/apache/airflow
13,624
["airflow/www/templates/airflow/dags.html"]
Misleading dag pause info tooltip
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A **Environment**: N/A **What happened**: The UI tooltip is misleading and confuses the user. Tooltip says " use this toggle to pause the dag" which implies that if the toggle is set to **ON** the flow is paused, but in fact it's the reverse of that. Either the logic should be reversed so that if the toggle is on, the DAG is paused, or the wording should be changed to explicitly state the actual functionality of the "on state" of the toggle. something like "When this toggle is ON, the DAG will be executed at scheduled times, turn this toggle off to pause executions of this dag ". **What you expected to happen**: UI tooltip should be honest and clear about its function. **How to reproduce it**: open DAGs window of the airflow webserver in a supported browser, hold mouse over the (i) on the second cell from left on the top row. <img width="534" alt="Screen Shot 2021-01-11 at 12 27 18 PM" src="https://user-images.githubusercontent.com/14813957/104258476-7bfad200-5434-11eb-8152-443f05071e4b.png">
https://github.com/apache/airflow/issues/13624
https://github.com/apache/airflow/pull/13642
3d538636984302013969aa82a04d458d24866403
c4112e2e9deaa2e30e6fd05d43221023d0d7d40b
"2021-01-12T01:46:46Z"
python
"2021-01-12T19:14:31Z"
closed
apache/airflow
https://github.com/apache/airflow
13,602
["airflow/www/utils.py", "tests/www/test_utils.py"]
WebUI returns an error when logs that do not use a DAG list `None` as the DAG ID
<!-- Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions. Don't worry if they're not all applicable; just try to include what you can :-) If you need to include code snippets or logs, please put them in fenced code blocks. If they're super-long, please use the details tag like <details><summary>super-long log</summary> lots of stuff </details> Please delete these comment blocks before submitting the issue. --> <!-- IMPORTANT!!! PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE NEXT TO "SUBMIT NEW ISSUE" BUTTON!!! PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!! Please complete the next sections or the issue will be closed. These questions are the first thing we need to know to understand the context. --> **Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A **Environment**: - **Cloud provider or hardware configuration**: docker-compose - **OS** (e.g. from /etc/os-release): Docker `apache/airflow` `sha256:b4f957bef5a54ca0d781ae1431d8485f125f0b5d18f3bc7e0416c46e617db265` - **Kernel** (e.g. `uname -a`): Linux c697ae3a0397 5.4.0-58-generic #64~18.04.1-Ubuntu SMP Wed Dec 9 17:11:11 UTC 2020 x86_64 GNU/Linux - **Install tools**: docker - **Others**: **What happened**: When an event that does not include a DAG is logged in the UI, this event lists the DAG ID as "None". This "None" is treated as an actual DAG ID with a link, which throws an error if clicked. ``` Something bad has happened. Please consider letting us know by creating a bug report using GitHub. Python version: 3.6.12 Airflow version: 2.0.0 Node: 9097c882a712 ------------------------------------------------------------------------------- Traceback (most recent call last): File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app response = self.full_dispatch_request() File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request rv = self.handle_user_exception(e) File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception reraise(exc_type, exc_value, tb) File "/home/airflow/.local/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise raise value File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request rv = self.dispatch_request() File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request return self.view_functions[rule.endpoint](**req.view_args) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/auth.py", line 34, in decorated return func(*args, **kwargs) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/decorators.py", line 97, in view_func return f(*args, **kwargs) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/decorators.py", line 60, in wrapper return f(*args, **kwargs) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/session.py", line 65, in wrapper return func(*args, session=session, **kwargs) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/views.py", line 2028, in graph dag = current_app.dag_bag.get_dag(dag_id) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/session.py", line 65, in wrapper return func(*args, session=session, **kwargs) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/dagbag.py", line 171, in get_dag 
self._add_dag_from_db(dag_id=dag_id, session=session) File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/dagbag.py", line 227, in _add_dag_from_db raise SerializedDagNotFound(f"DAG '{dag_id}' not found in serialized_dag table") airflow.exceptions.SerializedDagNotFound: DAG 'None' not found in serialized_dag table ``` <!-- (please include exact error messages if you can) --> **What you expected to happen**: I expected `None` to not be a link or have it link to some sort of error page. Instead it throws an error. **How to reproduce it**: Run a CLI command such as `airflow dags list`, then go to `/log/list/` in the web UI, and click on the `None` *Dag Id* for the logged event for the command. ![image](https://user-images.githubusercontent.com/2805291/104140901-d7b85300-5381-11eb-87e3-25a22dc842ec.png) **Anything else we need to know**: This problem appears to occur every time.
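A hedged sketch of one way to render the Dag Id column safely; the helper below is illustrative (the real change lives in `airflow/www/utils.py`). When the audit-log row has no DAG, the cell should be plain text instead of a link.

```python
from markupsafe import Markup, escape

def dag_link(dag_id, base_url="/graph"):
    # Events logged outside any DAG (e.g. `airflow dags list`) have no dag_id,
    # so render plain text rather than a link that errors when followed.
    if not dag_id:
        return Markup("None")
    return Markup('<a href="{url}?dag_id={dag}">{dag}</a>').format(
        url=base_url, dag=escape(dag_id)
    )

print(dag_link(None))            # None
print(dag_link("example_dag"))   # <a href="/graph?dag_id=example_dag">example_dag</a>
```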
https://github.com/apache/airflow/issues/13602
https://github.com/apache/airflow/pull/13619
eb40eea81be95ecd0e71807145797b6d82375885
8ecdef3e50d3b83901d70a13794ae6afabc4964e
"2021-01-11T01:26:42Z"
python
"2021-01-12T10:16:01Z"
closed
apache/airflow
https://github.com/apache/airflow
13,597
["airflow/www/static/js/connection_form.js", "airflow/www/views.py"]
Extra field widgets of custom connections do not properly save data
**Apache Airflow version**: 2.0.0 **Environment**: Docker image `apache/airflow:2.0.0-python3.8` on Win10 with WSL **What happened**: I built a custom provider with a number of custom connections. This works: - The connections are properly registered - The UI does not show hidden fields as per `get_ui_field_behaviour` - The UI correctly relabels fields as per `get_ui_field_behaviour` - The UI correctly shows added widgets as per `get_connection_form_widgets` (well, mostly) What does not work: - The UI does not save values entered for additional widgets I used the [JDBC example](https://github.com/apache/airflow/blob/master/airflow/providers/jdbc/hooks/jdbc.py) to string myself along by copying it and pasting it as a hook into my custom provider package. (I did not install the JDBC provider package, unless it is installed in the image I use - but if I don't add it in my own provider package, I don't have the connection type in the UI, so I assume it is not). Curiously, The JDBC hook works just fine. I then created the following file: ```Python """ You find two child classes of DbApiHook in here. One is the exact copy of the JDBC provider hook, minus some irrelevant logic (I only care about the UI stuff here). The other is the exact same thing, except I added an "x" behind every occurance of "jdbc" in strings and names. """ from typing import Any, Dict, Optional from airflow.hooks.dbapi import DbApiHook class JdbcXHook(DbApiHook): """ Copy of JdbcHook below. Added an "x" at various places, including the class name. """ conn_name_attr = 'jdbcx_conn_id' # added x default_conn_name = 'jdbcx_default' # added x conn_type = 'jdbcx' # added x hook_name = 'JDBCx Connection' # added x supports_autocommit = True @staticmethod def get_connection_form_widgets() -> Dict[str, Any]: """Returns connection widgets to add to connection form""" from flask_appbuilder.fieldwidgets import BS3TextFieldWidget from flask_babel import lazy_gettext from wtforms import StringField # added an x in the keys return { "extra__jdbcx__drv_path": StringField(lazy_gettext('Driver Path'), widget=BS3TextFieldWidget()), "extra__jdbcx__drv_clsname": StringField( lazy_gettext('Driver Class'), widget=BS3TextFieldWidget() ), } @staticmethod def get_ui_field_behaviour() -> Dict: """Returns custom field behaviour""" return { "hidden_fields": ['port', 'schema', 'extra'], "relabeling": {'host': 'Connection URL'}, } class JdbcHook(DbApiHook): """ General hook for jdbc db access. JDBC URL, username and password will be taken from the predefined connection. Note that the whole JDBC URL must be specified in the "host" field in the DB. Raises an airflow error if the given connection id doesn't exist. 
""" conn_name_attr = 'jdbc_conn_id' default_conn_name = 'jdbc_default' conn_type = 'jdbc' hook_name = 'JDBC Connection plain' supports_autocommit = True @staticmethod def get_connection_form_widgets() -> Dict[str, Any]: """Returns connection widgets to add to connection form""" from flask_appbuilder.fieldwidgets import BS3TextFieldWidget from flask_babel import lazy_gettext from wtforms import StringField return { "extra__jdbc__drv_path": StringField(lazy_gettext('Driver Path'), widget=BS3TextFieldWidget()), "extra__jdbc__drv_clsname": StringField( lazy_gettext('Driver Class'), widget=BS3TextFieldWidget() ), } @staticmethod def get_ui_field_behaviour() -> Dict: """Returns custom field behaviour""" return { "hidden_fields": ['port', 'schema', 'extra'], "relabeling": {'host': 'Connection URL'}, } ``` **What you expected to happen**: After doing the above, I expected - Seeing both in the add connection UI - Being able to use both the same way **What actually happenes**: - I _do_ see both in the UI (Screenshot 1) - For some reason, the "normal" hook has BOTH extra fields - not just his own two? (Screenshot 2) - If I add the connection as in Screenshot 2, they are saved in the four fields (his own two + the two for the "x" hook) properly as shown in Screenshot 3 - If I seek to edit the connection again, they are also they - all four fields - with the correct values in the UI - If I add the connection for the "x" type as in Screenshot 4, it ostensibly saves it - with two fields as defined in the code - You can see in screenshot 5, that the extra is saved as an empty string?! - When trying to edit the connection in the UI, you also see that there is no data saved for two extra widgets?! - I added a few more screenshots of airflow providers CLI command results (note that the package `ewah` has a number of other custom hooks, and the issue above occurs for *all* of them) *Screenshot 1:* ![image](https://user-images.githubusercontent.com/46958547/104121824-9acc6c00-5341-11eb-821c-4bff40a0e7c7.png) *Screenshot 2:* ![image](https://user-images.githubusercontent.com/46958547/104121854-c94a4700-5341-11eb-8d3c-80b6380730d9.png) *Screenshot 3:* ![image](https://user-images.githubusercontent.com/46958547/104121912-247c3980-5342-11eb-8030-11c7348309f3.png) *Screenshot 4:* ![image](https://user-images.githubusercontent.com/46958547/104121944-5e4d4000-5342-11eb-83b7-870711ccd367.png) *Screenshot 5:* ![image](https://user-images.githubusercontent.com/46958547/104121971-82a91c80-5342-11eb-83b8-fee9386c0c4f.png) *Screenshot 6 - airflow providers behaviours:* ![image](https://user-images.githubusercontent.com/46958547/104122073-1c70c980-5343-11eb-88f6-6130e5de9e92.png) *Screenshot 7 - airflow providers get:* ![image](https://user-images.githubusercontent.com/46958547/104122092-41fdd300-5343-11eb-9bda-f6849812ba56.png) (Note: This error occurs with pre-installed providers as well) *Screenshot 8 - airflow providers hooks:* ![image](https://user-images.githubusercontent.com/46958547/104122109-65288280-5343-11eb-8322-dda73fef6649.png) *Screenshot 9 - aorflow providers list:* ![image](https://user-images.githubusercontent.com/46958547/104122191-c94b4680-5343-11eb-80cf-7f510d4b6e9a.png) *Screenshot 10 - airflow providers widgets:* ![image](https://user-images.githubusercontent.com/46958547/104122142-930dc700-5343-11eb-96be-dec43d87a59d.png) **How to reproduce it**: - create a custom provider package - add the code snippet pasted above somewhere - add the two classes to the `hook-class-names` list in the provider info - 
install the provider package - do what I described above
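For context, a hedged sketch of the saving step that appears to misbehave here; the function name is illustrative, not the actual webserver code. Form fields named `extra__<conn_type>__<field>` are supposed to be folded into the connection's `extra` JSON when the form is saved.

```python
import json

def build_extra(form_data: dict, conn_type: str) -> str:
    prefix = f"extra__{conn_type}__"
    extra = {k: v for k, v in form_data.items() if k.startswith(prefix) and v}
    return json.dumps(extra) if extra else ""

# Using the custom "jdbcx" type from the report:
form = {
    "conn_id": "my_jdbcx",
    "extra__jdbcx__drv_path": "/opt/drivers/foo.jar",
    "extra__jdbcx__drv_clsname": "com.example.Driver",
}
print(build_extra(form, "jdbcx"))   # both widget values end up in the extra JSON
```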
https://github.com/apache/airflow/issues/13597
https://github.com/apache/airflow/pull/13640
34eb203c5177bc9be91a9387d6a037f6fec9dba1
b007fc33d481f0f1341d1e1e4cba719a5fe6580d
"2021-01-10T12:00:44Z"
python
"2021-01-12T23:32:49Z"
closed
apache/airflow
https://github.com/apache/airflow
13,559
["airflow/models/taskinstance.py"]
Nested templated variables do not always render
**Apache Airflow version**: 1.10.14 and 1.10.8. **Environment**: Python 3.6 and Airflow 1.10.14 on sqllite, **What happened**: Nested jinja templates do not consistently render when running tasks. TI run rendering behavior also differs from airflow UI and airflow render cli. **What you expected to happen**: Airflow should render nested jinja templates consistently and completely across each interface. Coming from airflow 1.8.2, this used to be the case. <!-- What do you think went wrong? --> This regression may have been introduced in 1.10.6 with a refactor of BaseOperator templating functionality. https://github.com/apache/airflow/pull/5461 Whether or not a nested layer renders seems to differ based on which arg is being templated in an operator and perhaps order. Furthermore, it seems like the render cli and airflow ui each apply TI.render_templates() a second time, creating inconsistency in what nested templates get rendered. There may be bug in the way BaseOperator.render_template() observes/caches templated fields **How to reproduce it**: From the most basic airflow setup nested_template_bug.py ``` from datetime import datetime from airflow import DAG from airflow.operators.python_operator import PythonOperator with DAG("nested_template_bug", start_date=datetime(2021, 1, 1)) as dag: arg0 = 'level_0_{{task.task_id}}_{{ds}}' kwarg1 = 'level_1_{{task.op_args[0]}}' def print_fields(arg0, kwarg1): print(f'level 0 arg0: {arg0}') print(f'level 1 kwarg1: {kwarg1}') nested_render = PythonOperator( task_id='nested_render', python_callable=print_fields, op_args=[arg0, ], op_kwargs={ 'kwarg1': kwarg1, }, ) ``` ``` > airflow test c level 0 arg0: level_0_nested_render_2021-01-01 level 1 kwarg1: level_1_level_0_{{task.task_id}}_{{ds}} > airflow render nested_template_bug nested_render 2021-01-01 # ---------------------------------------------------------- # property: op_args # ---------------------------------------------------------- ['level_0_nested_render_2021-01-01'] # ---------------------------------------------------------- # property: op_kwargs # ---------------------------------------------------------- {'kwarg1': 'level_1_level_0_nested_render_2021-01-01'} ```
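A minimal illustration in plain Jinja (outside Airflow) of the behaviour reported above: one rendering pass substitutes the outer template but leaves the embedded template string unexpanded, while a second pass expands it, which is why interfaces that render twice show different results.

```python
from jinja2 import Template

ctx = {
    "task_id": "nested_render",
    "ds": "2021-01-01",
    "op_args": ["level_0_{{ task_id }}_{{ ds }}"],
}
kwarg1 = "level_1_{{ op_args[0] }}"

first_pass = Template(kwarg1).render(ctx)
print(first_pass)    # level_1_level_0_{{ task_id }}_{{ ds }}  (still templated)

second_pass = Template(first_pass).render(ctx)
print(second_pass)   # level_1_level_0_nested_render_2021-01-01
```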
https://github.com/apache/airflow/issues/13559
https://github.com/apache/airflow/pull/18516
b0a29776b32cbee657c9a6369d15278a999e927f
1ac63cd5e2533ce1df1ec1170418a09170998699
"2021-01-08T04:06:45Z"
python
"2021-09-28T15:30:58Z"
closed
apache/airflow
https://github.com/apache/airflow
13,535
["airflow/providers/docker/CHANGELOG.rst", "airflow/providers/docker/operators/docker.py", "tests/providers/docker/operators/test_docker.py"]
DockerOperator / XCOM : `TypeError: Object of type bytes is not JSON serializable`
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA **Environment**: * local Ubuntu 18.04 LTS * docker-compose version 1.25.3, build d4d1b42b * docker 20.10.1, build 831ebea **What happened**: When enabling XCom push for a DockerOperator, the following error is thrown after the task finishes successfully: `TypeError: Object of type bytes is not JSON serializable` **What you expected to happen**: * the error is not thrown * if xcom_all is True: the XCom contains all log lines * if xcom_all is False: the XCom contains the last log line **How to reproduce it**: see the docker-compose setup and README here: https://github.com/AlessioM/airflow-xcom-issue
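A hedged sketch of the kind of fix this needs; the helper name is illustrative, not the provider's actual code. The Docker client returns log lines as bytes, which the default JSON-backed XCom cannot serialize, so they have to be decoded before being pushed.

```python
def logs_for_xcom(raw_lines, xcom_all: bool):
    decoded = [
        line.decode("utf-8", errors="replace") if isinstance(line, bytes) else line
        for line in raw_lines
    ]
    return decoded if xcom_all else (decoded[-1] if decoded else None)

print(logs_for_xcom([b"step 1 done", b"step 2 done"], xcom_all=False))  # 'step 2 done'
print(logs_for_xcom([b"step 1 done", b"step 2 done"], xcom_all=True))   # both lines, as str
```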
https://github.com/apache/airflow/issues/13535
https://github.com/apache/airflow/pull/13536
2de7793881da0968dd357a54e8b2a99017891915
cd3307ff2147b170dc3feb5999edf5c8eebed4ba
"2021-01-07T09:22:20Z"
python
"2021-07-26T17:55:07Z"
closed
apache/airflow
https://github.com/apache/airflow
13,532
["airflow/providers/docker/operators/docker.py", "tests/providers/docker/operators/test_docker.py"]
In DockerOperator the parameter auto_remove doesn't work in
When setting up DockerOperator with auto_remove=True in Airflow 2.0.0, the container remains in the container list if it finished with 'Exited (1)'.
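A hedged, simplified sketch using docker-py (which DockerOperator wraps) of the behaviour the reporter expects: removing the container in a `finally` block so cleanup happens even when the container exits non-zero. This is illustrative, not the operator's actual implementation.

```python
import docker

def run_and_clean(image: str, command: str, auto_remove: bool = True) -> int:
    client = docker.from_env()
    container = client.containers.create(image, command=command)
    try:
        container.start()
        result = container.wait()          # e.g. {'StatusCode': 1} for 'Exited (1)'
        return result["StatusCode"]
    finally:
        if auto_remove:
            container.remove(force=True)   # clean up regardless of exit code
```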
https://github.com/apache/airflow/issues/13532
https://github.com/apache/airflow/pull/13993
8eddc8b5019890a712810b8e5b1185997adb9bf4
ba54afe58b7cbd3711aca23252027fbd034cca41
"2021-01-07T07:48:37Z"
python
"2021-01-31T19:23:45Z"
closed
apache/airflow
https://github.com/apache/airflow
13,531
["airflow/api_connexion/endpoints/task_instance_endpoint.py", "airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/task_instance_schema.py", "tests/api_connexion/endpoints/test_task_instance_endpoint.py"]
Airflow v1 REST List task instances api can not get `no_status` task instance
**Apache Airflow version**: 2.0 **Environment**: - **OS** ubuntu 18.04 - **Kernel** 5.4.0-47-generic **What happened**: When I use the list task instances REST API, I cannot get the instances whose state is `no_status`.
```
### Get Task Instances
POST {{baseUrl}}/dags/~/dagRuns/~/taskInstances/list
Authorization: Basic admin:xxx
Content-Type: application/json

{ "dag_ids": ["stop_dags"] }

or

{ "dag_ids": ["stop_dags"], "state": ["null"] }
```
**What you expected to happen**: All task instances, whatever their state, should be included when I don't specify any states. **How to reproduce it**: Use a REST test tool like Postman to call the API. **Anything else we need to know**: I cannot find a REST API that returns all DAG run task instances with a specific state; maybe the REST API should be extended. Thanks!
https://github.com/apache/airflow/issues/13531
https://github.com/apache/airflow/pull/19487
1e570229533c4bbf5d3c901d5db21261fa4b1137
f636060fd7b5eb8facd1acb10a731d4e03bc864a
"2021-01-07T07:19:52Z"
python
"2021-11-20T16:09:33Z"
closed
apache/airflow
https://github.com/apache/airflow
13,515
["airflow/providers/slack/ADDITIONAL_INFO.md", "airflow/providers/slack/BACKPORT_PROVIDER_README.md", "airflow/providers/slack/README.md", "airflow/providers/slack/hooks/slack.py", "docs/conf.py", "docs/spelling_wordlist.txt", "scripts/ci/images/ci_verify_prod_image.sh", "setup.py", "tests/providers/slack/hooks/test_slack.py"]
Update slackapiclient / slack_sdk to v3
Hello, Slack has released updates to its library and we can start using them. We especially like one change: > slack_sdk has no required dependencies. This means aiohttp is no longer automatically resolved. I've looked through the documentation and it doesn't look like a difficult task, but I think it's still worth testing. More info: https://slack.dev/python-slack-sdk/v3-migration/index.html#from-slackclient-2-x Best regards, Kamil Breguła
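For reference, the headline change in the migration guide is the package rename; a minimal before/after sketch (the token below is a placeholder):

```python
# old (slackclient 2.x):
#   from slack import WebClient
# new (slack_sdk 3.x):
from slack_sdk import WebClient

client = WebClient(token="xoxb-placeholder-token")
```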
https://github.com/apache/airflow/issues/13515
https://github.com/apache/airflow/pull/13745
dbd026227949a74e5995c8aef3c35bd80fc36389
283945001363d8f492fbd25f2765d39fa06d757a
"2021-01-06T12:56:13Z"
python
"2021-01-25T21:13:48Z"
closed
apache/airflow
https://github.com/apache/airflow
13,504
["airflow/jobs/scheduler_job.py", "airflow/models/dagbag.py", "tests/jobs/test_scheduler_job.py"]
Scheduler is unable to find serialized DAG in the serialized_dag table
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Not relevant **Environment**: - **Cloud provider or hardware configuration**: - **OS** (e.g. from /etc/os-release): CentOS Linux 7 (Core) - **Kernel** (e.g. `uname -a`): Linux us01odcres-jamuaar-0003 3.10.0-957.5.1.el7.x86_64 #1 SMP Fri Feb 1 14:54:57 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux - **Install tools**: PostgreSQL 12.2 - **Others**: **What happened**: I have 2 dag files say, dag1.py and dag2.py. dag1.py creates a static DAG i.e. once it's parsed it will create 1 specific DAG. dag2.py creates dynamic DAGs based on json files kept in an external location. The static DAG (generated from dag1.py) has a task in the later stage which generates json files and they get picked up by dag2.py which creates dynamic DAGs. The dynamic DAGs which get created are unpaused by default and get scheduled once. This whole process used to work fine with airflow 1.x where DAG serialization was not mandatory and was turned off by default. But with Airflow 2.0 I am getting the following exception occasionally when the dynamically generated DAGs try to get scheduled by the scheduler. ``` [2021-01-06 10:09:38,742] {scheduler_job.py:1293} ERROR - Exception when executing SchedulerJob._run_scheduler_loop Traceback (most recent call last): File "/global/packages/python/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1275, in _execute self._run_scheduler_loop() File "/global/packages/python/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1377, in _run_scheduler_loop num_queued_tis = self._do_scheduling(session) File "/global/packages/python/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1474, in _do_scheduling self._create_dag_runs(query.all(), session) File "/global/packages/python/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1557, in _create_dag_runs dag = self.dagbag.get_dag(dag_model.dag_id, session=session) File "/global/packages/python/lib/python3.7/site-packages/airflow/utils/session.py", line 62, in wrapper return func(*args, **kwargs) File "/global/packages/python/lib/python3.7/site-packages/airflow/models/dagbag.py", line 171, in get_dag self._add_dag_from_db(dag_id=dag_id, session=session) File "/global/packages/python/lib/python3.7/site-packages/airflow/models/dagbag.py", line 227, in _add_dag_from_db raise SerializedDagNotFound(f"DAG '{dag_id}' not found in serialized_dag table") airflow.exceptions.SerializedDagNotFound: DAG 'dynamic_dag_1' not found in serialized_dag table ``` When I checked the serialized_dag table manually, I am able to see the DAG entry there. I found the last_updated column value to be **2021-01-06 10:09:38.757076+05:30** Whereas the exception got logged at **[2021-01-06 10:09:38,742]** which is little before the last_updated time. I think this means that the Scheduler tried to look for the DAG entry in the serialized_dag table before DagFileProcessor created the entry. Is this right or something else can be going on here? **What you expected to happen**: Scheduler should start looking for the DAG entry in the serialized_dag table only after DagFileProcessor has added it. Here it seems that DagFileProcessor added the DAG entry in the **dag** table, scheduler immediately fetched this dag_id from it and tried to find the same in **serialized_dag** table even before DagFileProcessor could add that. **How to reproduce it**: It occurs occasionally and there is no well defined way to reproduce it. 
**Anything else we need to know**:
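A hedged sketch (not the actual patch) of a defensive pattern for the race described above: treat the missing `serialized_dag` row as transient and retry on the next scheduler loop instead of letting the exception break the loop.

```python
from airflow.exceptions import SerializedDagNotFound

def get_dag_or_skip(dagbag, dag_id, log):
    try:
        return dagbag.get_dag(dag_id)
    except SerializedDagNotFound:
        log.warning("DAG %s not in serialized_dag yet; retrying on the next loop", dag_id)
        return None
```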
https://github.com/apache/airflow/issues/13504
https://github.com/apache/airflow/pull/13893
283945001363d8f492fbd25f2765d39fa06d757a
b9eb51a0fb32cd660a5459d73d7323865b34dd99
"2021-01-06T07:57:27Z"
python
"2021-01-25T21:55:37Z"
closed
apache/airflow
https://github.com/apache/airflow
13,494
["airflow/providers/google/cloud/log/stackdriver_task_handler.py", "airflow/utils/log/log_reader.py", "tests/cli/commands/test_info_command.py", "tests/providers/google/cloud/log/test_stackdriver_task_handler.py", "tests/providers/google/cloud/log/test_stackdriver_task_handler_system.py"]
Unable to view StackDriver logs in Web UI
<!-- Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions. Don't worry if they're not all applicable; just try to include what you can :-) If you need to include code snippets or logs, please put them in fenced code blocks. If they're super-long, please use the details tag like <details><summary>super-long log</summary> lots of stuff </details> Please delete these comment blocks before submitting the issue. --> <!-- IMPORTANT!!! PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE NEXT TO "SUBMIT NEW ISSUE" BUTTON!!! PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!! Please complete the next sections or the issue will be closed. These questions are the first thing we need to know to understand the context. --> **Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16.15-gke.4901 **Environment**: - **Cloud provider or hardware configuration**: GKE - **OS** (e.g. from /etc/os-release): - **Kernel** (e.g. `uname -a`): - **Install tools**: Using the apache/airflow docker image - **Others**: Running 1 pod encapsulating 2 containers (1 x webserver and 1x scheduler) running in localexecutor mode **What happened**: I have remote logging configured for tasks to send the logs to StackDriver as per the below configuration. The logs get sent to Stackdriver okay and I can view them via the GCP console. However I cannot view them when browsing the UI. The UI shows a spinning wheel and I see requests in the network tab to `https://my_airflow_instance/get_logs_with_metadata?dag_id=XXX......` These requests take about 15 seconds to run before returning with HTTP 200 and something like this in the response body: ``` {"message":"","metadata":{"end_of_log":false,"next_page_token":"xxxxxxxxx"}} ``` So no actual log data **What you expected to happen**: I should see the logs in the Web UI **How to reproduce it**: Configure remote logging for StackDriver with the below config: ``` AIRFLOW__LOGGING__GOOGLE_KEY_PATH: "/var/run/secrets/airflow/secrets/google-cloud-platform/stackdriver/credentials.json" AIRFLOW__LOGGING__LOG_FORMAT: "[%(asctime)s] {{%(filename)s:%(lineno)d}} %(levelname)s - %(message)s" AIRFLOW__LOGGING__REMOTE_LOGGING: "True" AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: "stackdriver://airflow-tasks" ``` **Anything else we need to know**: <!-- How often does this problem occur? Once? Every time etc? Any relevant logs to include? Put them here in side a detail tag: <details><summary>x.log</summary> lots of stuff </details> -->
https://github.com/apache/airflow/issues/13494
https://github.com/apache/airflow/pull/13784
d65376c377341fa9d6da263e145e06880d4620a8
833e3383230e1f6f73f8022ddf439d3d531eff01
"2021-01-05T17:47:14Z"
python
"2021-02-02T17:38:25Z"
closed
apache/airflow
https://github.com/apache/airflow
13,464
["airflow/models/dagrun.py"]
Scheduler fails if task is removed at runtime
**Apache Airflow version**: 2.0.0, LocalExecutor **Environment**: Docker on Win10 with WSL, official Python 3.8 image **What happened**: When a DAG is running and I delete a task from the running DAG, the scheduler fails. When using Docker, upon automatic restart of the scheduler, the scheduler just fails again, perpetually. ![image](https://user-images.githubusercontent.com/46958547/103558919-a89f6e80-4eb5-11eb-8521-19b04ee3c690.png) Note: I don't _know_ if the task itself was running at the time, but I would guess it was. **What you expected to happen**: The scheduler should understand that the task is not part of the DAG anymore and not fail. **How to reproduce it**: - Create a DAG with multiple tasks - Let it run - While running, delete one of the tasks from the source code - See the scheduler break
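A hedged sketch of the behaviour the reporter expects; the helper and the state string are illustrative. When a task instance's task no longer exists in the re-parsed DAG, flag it as removed instead of letting the scheduler crash.

```python
def reconcile_task_instances(task_instances, dag, log):
    for ti in task_instances:
        if not dag.has_task(ti.task_id):
            log.warning(
                "Task %s no longer exists in DAG %s; marking it as removed",
                ti.task_id, dag.dag_id,
            )
            ti.state = "removed"   # illustrative; Airflow models this as State.REMOVED
```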
https://github.com/apache/airflow/issues/13464
https://github.com/apache/airflow/pull/14057
d45739f7ce0de183329d67fff88a9da3943a9280
eb78a8b86c6e372bbf4bfacb7628b154c16aa16b
"2021-01-04T16:54:58Z"
python
"2021-02-04T10:08:17Z"
closed
apache/airflow
https://github.com/apache/airflow
13,451
["airflow/providers/http/sensors/http.py", "tests/providers/http/sensors/test_http.py"]
Modify HttpSensor to continue poking if the response is not 404
**Description** As documented in the [HttpSensor](https://airflow.apache.org/docs/apache-airflow-providers-http/stable/_modules/airflow/providers/http/sensors/http.html), if the response to the HTTP call is an error other than "404", the task will fail. >HTTP Error codes other than 404 (like 403) or Connection Refused Error > would fail the sensor itself directly (no more poking). The code block that applies this behavior:
```
except AirflowException as exc:
    if str(exc).startswith("404"):
        return False
    raise exc
```
**Use case / motivation** I am working with an API that returns 500 for any error that happens internally (unauthorized, Not Acceptable, etc.) and I need the sensor to be able to continue poking even when the response is something other than 404. Another case is an API that sometimes returns 429 and makes the task fail (this could be worked around with a large poke interval). The first API has a bad design, but since we need to consume some services like this, I would like more flexibility when working with HttpSensor. **What do you want to happen** When creating an HttpSensor task, I would like to be able to pass a list of status codes; if the HTTP status code of the response matches one of them, the sensor returns False and continues poking. If no status codes are set, the default for returning False and continuing to poke remains 404, as it is now. **Are you willing to submit a PR?** Yep!
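A hypothetical sketch of the requested flexibility; the parameter name is invented here and is not necessarily what the eventual PR implemented. It lets callers list the status codes that should mean "keep poking" instead of failing the sensor.

```python
from airflow.exceptions import AirflowException

def should_keep_poking(exc: AirflowException, soft_error_codes=("404",)) -> bool:
    # The HTTP hook raises AirflowException messages prefixed with the status
    # code, which is what the current 404-only check relies on.
    return any(str(exc).startswith(code) for code in soft_error_codes)

# e.g. tolerate 404, 429 and 500 while poking:
# if should_keep_poking(exc, soft_error_codes=("404", "429", "500")):
#     return False
```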
https://github.com/apache/airflow/issues/13451
https://github.com/apache/airflow/pull/13499
7a742cb03375a57291242131a27ffd4903bfdbd8
1602ec97c8d5bc7a7a8b42e850ac6c7a7030e47d
"2021-01-03T17:10:29Z"
python
"2021-01-20T00:02:08Z"
closed
apache/airflow
https://github.com/apache/airflow
13,434
["airflow/models/dag.py", "tests/jobs/test_scheduler_job.py"]
Airflow 2.0.0 manual run causes scheduled run to skip
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A **Environment**: - **Cloud provider or hardware configuration**: local/aws - **OS** (e.g. from /etc/os-release): Ubuntu 18.04.5 LTS - **Kernel** (e.g. `uname -a`): 5.4.0-1032-aws - **Install tools**: pip - **Others**: **What happened**: I did a fresh Airflow 2.0.0 install. With this version, when I manually trigger a DAG, Airflow skips the next scheduled run. <!-- (please include exact error messages if you can) --> **What you expected to happen**: Manual runs do not interfere with the scheduled runs prior to Airflow 2. <!-- What do you think went wrong? --> **How to reproduce it**: Create a simple hourly DAG. After enabling it and the initial run, run it manually. It shall skip the next hour. Below is an example, where the manual run with execution time of 08:17 causes the scheduled run with execution time of 08:00 to skip. ![image](https://user-images.githubusercontent.com/26160471/103457719-c7193480-4d12-11eb-82cb-42efaedc9ef4.png)
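A hedged, self-contained illustration of the scheduling rule the reporter expects (not Airflow's actual implementation): the next scheduled run should be derived only from the latest scheduled execution date, so a manual 08:17 trigger cannot swallow the 08:00 slot.

```python
from datetime import datetime, timedelta

def next_scheduled(run_history, interval=timedelta(hours=1)):
    scheduled = [when for when, kind in run_history if kind == "scheduled"]
    return (max(scheduled) + interval) if scheduled else None

history = [
    (datetime(2021, 1, 2, 7, 0), "scheduled"),
    (datetime(2021, 1, 2, 8, 17), "manual"),
]
print(next_scheduled(history))   # 2021-01-02 08:00:00, the manual run is ignored
```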
https://github.com/apache/airflow/issues/13434
https://github.com/apache/airflow/pull/13963
8e0db6eae371856597dce0ccf8a920b0107965cd
de277c69e7909cf0d563bbd542166397523ebbe0
"2021-01-02T12:59:07Z"
python
"2021-01-30T12:02:53Z"
closed
apache/airflow
https://github.com/apache/airflow
13,414
["airflow/operators/trigger_dagrun.py", "tests/operators/test_trigger_dagrun.py"]
DAG raises error when passing non serializable JSON object via trigger
When passing a non serializable JSON object in a trigger, I get the following error below. The logs become unavailable. my code: ```py task_trigger_ad_attribution = TriggerDagRunOperator( task_id='trigger_ad_attribution', trigger_dag_id=AD_ATTRIBUTION_DAG_ID, conf={"message": "Triggered from display trigger", 'trigger_info': {'dag_id':DAG_ID, 'now':datetime.datetime.now(), }, 'trigger_date' : '{{execution_date}}' }, ) ``` ``` Ooops! Something bad has happened. Please consider letting us know by creating a bug report using GitHub. Python version: 3.6.9 Airflow version: 2.0.0 Node: henry-Inspiron-5566 ------------------------------------------------------------------------------- Traceback (most recent call last): File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app response = self.full_dispatch_request() File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request rv = self.handle_user_exception(e) File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception reraise(exc_type, exc_value, tb) File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise raise value File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request rv = self.dispatch_request() File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request return self.view_functions[rule.endpoint](**req.view_args) File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/airflow/www/auth.py", line 34, in decorated return func(*args, **kwargs) File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/airflow/www/decorators.py", line 97, in view_func return f(*args, **kwargs) File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/airflow/www/decorators.py", line 60, in wrapper return f(*args, **kwargs) File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/airflow/www/views.py", line 1997, in tree data = htmlsafe_json_dumps(data, separators=(',', ':')) File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/jinja2/utils.py", line 614, in htmlsafe_json_dumps dumper(obj, **kwargs) File "/usr/lib/python3.6/json/__init__.py", line 238, in dumps **kw).encode(obj) File "/usr/lib/python3.6/json/encoder.py", line 199, in encode chunks = self.iterencode(o, _one_shot=True) File "/usr/lib/python3.6/json/encoder.py", line 257, in iterencode return _iterencode(o, 0) File "/usr/lib/python3.6/json/encoder.py", line 180, in default o.__class__.__name__) TypeError: Object of type 'datetime' is not JSON serializable ```
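A hedged workaround sketch for the snippet above: make `conf` JSON-serializable yourself (for example by isoformatting the datetime) before handing it to TriggerDagRunOperator, since the tree view dumps the DagRun conf as JSON.

```python
import datetime
import json

conf = {
    "message": "Triggered from display trigger",
    "trigger_info": {
        "dag_id": "my_dag",                                # placeholder
        "now": datetime.datetime.utcnow().isoformat(),     # str instead of datetime
    },
    "trigger_date": "{{ execution_date }}",
}
json.dumps(conf)   # no longer raises, so the tree view can render the run
```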
https://github.com/apache/airflow/issues/13414
https://github.com/apache/airflow/pull/13964
862443f6d3669411abfb83082c29c2fad7fcf12d
b4885b25871ae7ede2028f81b0d88def3e22f23a
"2020-12-31T20:51:45Z"
python
"2021-01-29T16:24:46Z"
closed
apache/airflow
https://github.com/apache/airflow
13,376
["airflow/cli/commands/sync_perm_command.py", "tests/cli/commands/test_sync_perm_command.py"]
airflow sync-perm command does not sync DAG level Access Control
**Apache Airflow version**: 2.0.0 **What happened**: Running the sync-perm CLI command does not synchronize the permissions granted through a DAG via access_control. This is because of DAG serialization: when DAG serialization is enabled, the DagBag exhibits lazy-loading behaviour. **How to reproduce it**: 1. Add access_control to a DAG so that a new role has permission to see the DAG.
```
access_control={
    "test": {'can_dag_read'}
},
```
2. Run `airflow sync-perm`. 3. Log in as the new user and you will still not see any DAG. 4. If you refresh the DAG, the new user will be able to see the DAG after they refresh their page. **Expected behavior** When I run `airflow sync-perm`, I expect the role that has been granted read permission for the DAG to be able to see that DAG. This is also an issue with 1.10.x with DAG serialization enabled, so it would be good to backport the fix too.
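A hedged sketch of the idea behind the fix (names may differ from the merged change): with serialization enabled, the CLI has to load DAGs, including their `access_control`, from the database before re-syncing per-DAG permissions.

```python
from airflow.models import DagBag

def sync_dag_level_perms(appbuilder):
    dagbag = DagBag(read_dags_from_db=True)   # read serialized DAGs, not lazily from files
    dagbag.collect_dags_from_db()
    for dag in dagbag.dags.values():
        if dag.access_control:
            appbuilder.sm.sync_perm_for_dag(dag.dag_id, dag.access_control)
```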
https://github.com/apache/airflow/issues/13376
https://github.com/apache/airflow/pull/13377
d5cf993f81ea2c4b5abfcb75ef05a6f3783874f2
1b94346fbeca619f3084d05bdc5358836ed02318
"2020-12-29T23:33:44Z"
python
"2020-12-30T11:35:45Z"
closed
apache/airflow
https://github.com/apache/airflow
13,360
["airflow/providers/amazon/aws/transfers/mongo_to_s3.py", "tests/providers/amazon/aws/transfers/test_mongo_to_s3.py"]
Add 'mongo_collection' to template_fields in MongoToS3Operator
**Description** Make the `MongoToS3Operator` `mongo_collection` parameter templated. **Use case / motivation** This would allow passing a templated mongo collection from other tasks, such as a mongo collection used as a data destination by using `S3Hook`. For instance, we could use a templated mongo collection to write data for different dates into different collections by using `mycollection.{{ ds_nodash }}`. **Are you willing to submit a PR?** Yes. **Related Issues** N/A
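A minimal sketch of the requested change, shown on a stand-in operator rather than the actual provider source: listing the attribute in `template_fields` is what makes expressions like `mycollection.{{ ds_nodash }}` render at runtime.

```python
from airflow.models.baseoperator import BaseOperator

class MongoToS3LikeOperator(BaseOperator):
    # adding "mongo_collection" here is the essence of the feature request
    template_fields = ("s3_key", "mongo_query", "mongo_collection")

    def __init__(self, *, mongo_collection, mongo_query, s3_key, **kwargs):
        super().__init__(**kwargs)
        self.mongo_collection = mongo_collection
        self.mongo_query = mongo_query
        self.s3_key = s3_key
```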
https://github.com/apache/airflow/issues/13360
https://github.com/apache/airflow/pull/13361
e43688358320a5f20776c0d346c310a568a55049
f7a1334abe4417409498daad52c97d3f0eb95137
"2020-12-29T11:42:55Z"
python
"2021-01-02T10:32:07Z"
closed
apache/airflow
https://github.com/apache/airflow
13,331
["airflow/cli/cli_parser.py", "airflow/cli/commands/scheduler_command.py", "chart/templates/scheduler/scheduler-deployment.yaml", "tests/cli/commands/test_scheduler_command.py"]
Helm Chart uses unsupported commands for Airflow 2.0 - serve_logs
Hello, Our Helm Chart uses a command that was deleted in Airflow 2.0. We should decide what to do about it: add the command back, or delete the reference to it in the Helm Chart. https://github.com/apache/airflow/blob/d41c6a46b176a80e1cdb0bcc592f5a8baec21c41/chart/templates/scheduler/scheduler-deployment.yaml#L177-L197 Related PR: https://github.com/apache/airflow/pull/6843 Best regards, Kamil Breguła CC: @dstandish
https://github.com/apache/airflow/issues/13331
https://github.com/apache/airflow/pull/15557
053d903816464f699876109b50390636bf617eff
414bb20fad6c6a50c5a209f6d81f5ca3d679b083
"2020-12-27T23:00:54Z"
python
"2021-04-29T15:06:06Z"
closed
apache/airflow
https://github.com/apache/airflow
13,325
["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"]
max_tis_per_query=0 leads to nothing being scheduled in 2.0.0
After upgrading to airflow 2.0.0 it seems as if the scheduler isn't working anymore. Tasks hang on scheduled state, but no tasks get executed. I've tested this with sequential and celery executor. When using the celery executor no messages seem to arrive in RabbiyMq This is on local docker. Everything was working fine before upgrading. There don't seem to be any error messages, so I'm not completely sure if this is a bug or a misconfiguration on my end. Using python:3.7-slim-stretch Docker image. Regular setup that we're using is CeleryExecutor. Mysql version is 5.7 Any help would be greatly appreciated. **Python packages** alembic==1.4.3 altair==4.1.0 amazon-kclpy==1.5.0 amqp==2.6.1 apache-airflow==2.0.0 apache-airflow-providers-amazon==1.0.0 apache-airflow-providers-celery==1.0.0 apache-airflow-providers-ftp==1.0.0 apache-airflow-providers-http==1.0.0 apache-airflow-providers-imap==1.0.0 apache-airflow-providers-jdbc==1.0.0 apache-airflow-providers-mysql==1.0.0 apache-airflow-providers-sqlite==1.0.0 apache-airflow-upgrade-check==1.1.0 apispec==3.3.2 appdirs==1.4.4 argcomplete==1.12.2 argon2-cffi==20.1.0 asn1crypto==1.4.0 async-generator==1.10 attrs==20.3.0 azure-common==1.1.26 azure-core==1.9.0 azure-storage-blob==12.6.0 Babel==2.9.0 backcall==0.2.0 bcrypt==3.2.0 billiard==3.6.3.0 black==20.8b1 bleach==3.2.1 boa-str==1.1.0 boto==2.49.0 boto3==1.7.3 botocore==1.10.84 cached-property==1.5.2 cattrs==1.1.2 cbsodata==1.3.3 celery==4.4.2 certifi==2020.12.5 cffi==1.14.4 chardet==3.0.4 click==7.1.2 clickclick==20.10.2 cmdstanpy==0.9.5 colorama==0.4.4 colorlog==4.0.2 commonmark==0.9.1 connexion==2.7.0 convertdate==2.3.0 coverage==4.2 croniter==0.3.36 cryptography==3.3.1 cycler==0.10.0 Cython==0.29.21 decorator==4.4.2 defusedxml==0.6.0 dill==0.3.3 dnspython==2.0.0 docutils==0.14 email-validator==1.1.2 entrypoints==0.3 ephem==3.7.7.1 et-xmlfile==1.0.1 fbprophet==0.7.1 fire==0.3.1 Flask==1.1.2 Flask-AppBuilder==3.1.1 Flask-Babel==1.0.0 Flask-Bcrypt==0.7.1 Flask-Caching==1.9.0 Flask-JWT-Extended==3.25.0 Flask-Login==0.4.1 Flask-OpenID==1.2.5 Flask-SQLAlchemy==2.4.4 flask-swagger==0.2.13 Flask-WTF==0.14.3 flatten-json==0.1.7 flower==0.9.5 funcsigs==1.0.2 future==0.18.2 graphviz==0.15 great-expectations==0.13.2 gunicorn==19.10.0 holidays==0.10.4 humanize==3.2.0 idna==2.10 importlib-metadata==1.7.0 importlib-resources==1.5.0 inflection==0.5.1 ipykernel==5.4.2 ipython==7.19.0 ipython-genutils==0.2.0 ipywidgets==7.5.1 iso8601==0.1.13 isodate==0.6.0 itsdangerous==1.1.0 JayDeBeApi==1.2.3 jdcal==1.4.1 jedi==0.17.2 jellyfish==0.8.2 Jinja2==2.11.2 jmespath==0.10.0 joblib==1.0.0 JPype1==1.2.0 json-merge-patch==0.2 jsonpatch==1.28 jsonpointer==2.0 jsonschema==3.2.0 jupyter-client==6.1.7 jupyter-core==4.7.0 jupyterlab-pygments==0.1.2 kinesis-events==0.1.0 kiwisolver==1.3.1 kombu==4.6.11 korean-lunar-calendar==0.2.1 lazy-object-proxy==1.4.3 lockfile==0.12.2 LunarCalendar==0.0.9 Mako==1.1.3 Markdown==3.3.3 MarkupSafe==1.1.1 marshmallow==3.10.0 marshmallow-enum==1.5.1 marshmallow-oneofschema==2.0.1 marshmallow-sqlalchemy==0.23.1 matplotlib==3.3.3 mistune==0.8.4 mock==1.0.1 mockito==1.2.2 msrest==0.6.19 mypy-extensions==0.4.3 mysql-connector-python==8.0.18 mysqlclient==2.0.2 natsort==7.1.0 nbclient==0.5.1 nbconvert==6.0.7 nbformat==5.0.8 nest-asyncio==1.4.3 nose==1.3.7 notebook==6.1.5 numpy==1.19.4 oauthlib==3.1.0 openapi-spec-validator==0.2.9 openpyxl==3.0.5 oscrypto==1.2.1 packaging==20.8 pandas==1.1.5 pandocfilters==1.4.3 parso==0.7.1 pathspec==0.8.1 pendulum==2.1.2 pexpect==4.8.0 phonenumbers==8.12.15 
pickleshare==0.7.5 Pillow==8.0.1 prison==0.1.3 prometheus-client==0.8.0 prompt-toolkit==3.0.8 protobuf==3.14.0 psutil==5.8.0 ptyprocess==0.6.0 pyarrow==2.0.0 pycodestyle==2.6.0 pycparser==2.20 pycryptodomex==3.9.9 pydevd-pycharm==193.5233.109 Pygments==2.7.3 PyJWT==1.7.1 PyMeeus==0.3.7 pyodbc==4.0.30 pyOpenSSL==19.1.0 pyparsing==2.4.7 pyrsistent==0.17.3 pystan==2.19.1.1 python-crontab==2.5.1 python-daemon==2.2.4 python-dateutil==2.8.1 python-editor==1.0.4 python-nvd3==0.15.0 python-slugify==4.0.1 python3-openid==3.2.0 pytz==2019.3 pytzdata==2020.1 PyYAML==5.3.1 pyzmq==20.0.0 recordlinkage==0.14 regex==2020.11.13 requests==2.23.0 requests-oauthlib==1.3.0 rich==9.2.0 ruamel.yaml==0.16.12 ruamel.yaml.clib==0.2.2 s3transfer==0.1.13 scikit-learn==0.23.2 scipy==1.5.4 scriptinep3==0.3.1 Send2Trash==1.5.0 setproctitle==1.2.1 setuptools-git==1.2 shelljob==0.5.6 six==1.15.0 sklearn==0.0 snowflake-connector-python==2.3.7 snowflake-sqlalchemy==1.2.4 SQLAlchemy==1.3.22 SQLAlchemy-JSONField==1.0.0 SQLAlchemy-Utils==0.36.8 swagger-ui-bundle==0.0.8 tabulate==0.8.7 TagValidator==0.0.8 tenacity==6.2.0 termcolor==1.1.0 terminado==0.9.1 testpath==0.4.4 text-unidecode==1.3 threadpoolctl==2.1.0 thrift==0.13.0 toml==0.10.2 toolz==0.11.1 tornado==6.1 tqdm==4.54.1 traitlets==5.0.5 typed-ast==1.4.1 typing-extensions==3.7.4.3 tzlocal==1.5.1 unicodecsv==0.14.1 urllib3==1.24.2 validate-email==1.3 vine==1.3.0 watchtower==0.7.3 wcwidth==0.2.5 webencodings==0.5.1 Werkzeug==1.0.1 widgetsnbextension==3.5.1 wrapt==1.12.1 WTForms==2.3.1 xlrd==2.0.1 XlsxWriter==1.3.7 zipp==3.4.0 **Relevant config** ``` # The folder where your airflow pipelines live, most likely a # subfolder in a code repositories # This path must be absolute dags_folder = /usr/local/airflow/dags # The executor class that airflow should use. Choices include # SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor executor = CeleryExecutor # The SqlAlchemy connection string to the metadata database. # SqlAlchemy supports many different database engine, more information # their website sql_alchemy_conn = db+mysql://airflow:airflow@postgres/airflow # The SqlAlchemy pool size is the maximum number of database connections # in the pool. sql_alchemy_pool_size = 5 # The SqlAlchemy pool recycle is the number of seconds a connection # can be idle in the pool before it is invalidated. This config does # not apply to sqlite. sql_alchemy_pool_recycle = 3600 # The amount of parallelism as a setting to the executor. This defines # the max number of task instances that should run simultaneously # on this airflow installation parallelism = 32 # The number of task instances allowed to run concurrently by the scheduler dag_concurrency = 16 # Are DAGs paused by default at creation dags_are_paused_at_creation = True # When not using pools, tasks are run in the "default pool", # whose size is guided by this config element non_pooled_task_slot_count = 128 # The maximum number of active DAG runs per DAG max_active_runs_per_dag = 16 # How long before timing out a python file import while filling the DagBag dagbag_import_timeout = 60 # The class to use for running task instances in a subprocess task_runner = StandardTaskRunner # Whether to enable pickling for xcom (note that this is insecure and allows for # RCE exploits). This will be deprecated in Airflow 2.0 (be forced to False). 
enable_xcom_pickling = True # When a task is killed forcefully, this is the amount of time in seconds that # it has to cleanup after it is sent a SIGTERM, before it is SIGKILLED killed_task_cleanup_time = 60 # This flag decides whether to serialise DAGs and persist them in DB. If set to True, Webserver reads from DB instead of parsing DAG files store_dag_code = True # You can also update the following default configurations based on your needs min_serialized_dag_update_interval = 30 min_serialized_dag_fetch_interval = 10 [celery] # This section only applies if you are using the CeleryExecutor in # [core] section above # The app name that will be used by celery celery_app_name = airflow.executors.celery_executor # The concurrency that will be used when starting workers with the # "airflow worker" command. This defines the number of task instances that # a worker will take, so size up your workers based on the resources on # your worker box and the nature of your tasks worker_concurrency = 16 # When you start an airflow worker, airflow starts a tiny web server # subprocess to serve the workers local log files to the airflow main # web server, who then builds pages and sends them to users. This defines # the port on which the logs are served. It needs to be unused, and open # visible from the main web server to connect into the workers. worker_log_server_port = 8793 # The Celery broker URL. Celery supports RabbitMQ, Redis and experimentally # a sqlalchemy database. Refer to the Celery documentation for more # information. broker_url = amqp://amqp:5672/1 # Another key Celery setting result_backend = db+mysql://airflow:airflow@postgres/airflow # Celery Flower is a sweet UI for Celery. Airflow has a shortcut to start # it `airflow flower`. This defines the IP that Celery Flower runs on flower_host = 0.0.0.0 # This defines the port that Celery Flower runs on flower_port = 5555 # Default queue that tasks get assigned to and that worker listen on. default_queue = airflow # Import path for celery configuration options celery_config_options = airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG # No SSL ssl_active = False [scheduler] # Task instances listen for external kill signal (when you clear tasks # from the CLI or the UI), this defines the frequency at which they should # listen (in seconds). job_heartbeat_sec = 5 # The scheduler constantly tries to trigger new tasks (look at the # scheduler section in the docs for more information). This defines # how often the scheduler should run (in seconds). scheduler_heartbeat_sec = 5 # after how much time should the scheduler terminate in seconds # -1 indicates to run continuously (see also num_runs) run_duration = -1 # after how much time a new DAGs should be picked up from the filesystem min_file_process_interval = 60 use_row_level_locking=False dag_dir_list_interval = 300 # How often should stats be printed to the logs print_stats_interval = 30 child_process_log_directory = /usr/local/airflow/logs/scheduler # Local task jobs periodically heartbeat to the DB. If the job has # not heartbeat in this many seconds, the scheduler will mark the # associated task instance as failed and will re-schedule the task. scheduler_zombie_task_threshold = 300 # Turn off scheduler catchup by setting this to False. 
# Default behavior is unchanged and # Command Line Backfills still work, but the scheduler # will not do scheduler catchup if this is False, # however it can be set on a per DAG basis in the # DAG definition (catchup) catchup_by_default = True # This changes the batch size of queries in the scheduling main loop. # This depends on query length limits and how long you are willing to hold locks. # 0 for no limit max_tis_per_query = 0 # The scheduler can run multiple threads in parallel to schedule dags. # This defines how many threads will run. parsing_processes = 4 authenticate = False ```
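A hedged sketch of the guard implied by the title and by the config comment above ("0 for no limit"): a `max_tis_per_query` of 0 should fall back to the pool/parallelism-derived limit instead of literally querying for 0 task instances.

```python
def query_limit(max_tis_per_query: int, open_slots: int) -> int:
    if max_tis_per_query <= 0:      # 0 is documented as "no limit"
        return open_slots
    return min(max_tis_per_query, open_slots)

print(query_limit(0, 32))    # 32, not 0
print(query_limit(16, 32))   # 16
```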
https://github.com/apache/airflow/issues/13325
https://github.com/apache/airflow/pull/13512
b103a1dd0e22b67dcc8cb2a28a5afcdfb7554412
31d31adb58750d473593a9b13c23afcc9a0adf97
"2020-12-27T10:25:52Z"
python
"2021-01-18T21:24:37Z"
closed
apache/airflow
https://github.com/apache/airflow
13,306
["BREEZE.rst", "Dockerfile", "Dockerfile.ci", "scripts/ci/images/ci_verify_prod_image.sh", "scripts/ci/libraries/_initialization.sh", "setup.py"]
The "ldap" extra misses libldap dependency
The 'ldap' provider is missing the 'ldap' extra dependency (which adds the ldap3 pip dependency).
https://github.com/apache/airflow/issues/13306
https://github.com/apache/airflow/pull/13308
13a9747bf1d92020caa5d4dc825e096ce583f2df
d23ac9b235c5b30a5d2d3a3a7edf60e0085d68de
"2020-12-24T18:21:48Z"
python
"2020-12-28T16:07:00Z"
closed
apache/airflow
https://github.com/apache/airflow
13,295
["airflow/models/dag.py", "tests/models/test_dag.py"]
In triggered SubDag (schedule_interval=None), when clearing a successful Subdag, child tasks aren't run
**Apache Airflow version**: Airflow 2.0 **Environment**: Ubuntu 20.04 (WSL on Windows 10) - **OS** (e.g. from /etc/os-release): VERSION="20.04.1 LTS (Focal Fossa)" - **Kernel** (e.g. `uname -a`): Linux XXX 4.19.128-microsoft-standard #1 SMP Tue Jun 23 12:58:10 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux **What happened**: After successfully running a SUBDAG, clearing it (including downstream+recursive) doesn't trigger the inner tasks. Instead, the subdag is marked successful and the inner tasks all stay cleared and aren't re-run. **What you expected to happen**: Expected Clear with DownStream + Recursive to re-run all subdag tasks. <!-- What do you think went wrong? --> **How to reproduce it**: 1. Using a slightly modified version of https://airflow.apache.org/docs/apache-airflow/stable/concepts.html#subdags: ```python from airflow import DAG from airflow.example_dags.subdags.subdag import subdag from airflow.operators.dummy import DummyOperator from airflow.operators.subdag import SubDagOperator from airflow.utils.dates import days_ago def subdag(parent_dag_name, child_dag_name, args): dag_subdag = DAG( dag_id=f'{parent_dag_name}.{child_dag_name}', default_args=args, start_date=days_ago(2), schedule_interval=None, ) for i in range(5): DummyOperator( task_id='{}-task-{}'.format(child_dag_name, i + 1), default_args=args, dag=dag_subdag, ) return dag_subdag DAG_NAME = 'example_subdag_operator' args = { 'owner': 'airflow', } dag = DAG( dag_id=DAG_NAME, default_args=args, start_date=days_ago(2), schedule_interval=None, tags=['example'] ) start = DummyOperator( task_id='start', dag=dag, ) section_1 = SubDagOperator( task_id='section-1', subdag=subdag(DAG_NAME, 'section-1', args), dag=dag, ) some_other_task = DummyOperator( task_id='some-other-task', dag=dag, ) section_2 = SubDagOperator( task_id='section-2', subdag=subdag(DAG_NAME, 'section-2', args), dag=dag, ) end = DummyOperator( task_id='end', dag=dag, ) start >> section_1 >> some_other_task >> section_2 >> end ``` 2. Run the subdag fully. 3. Clear (with recursive/downstream) any of the SubDags. 4. The Subdag will be marked successful, but if you zoom into the subdag, you'll see all the child tasks were not run.
https://github.com/apache/airflow/issues/13295
https://github.com/apache/airflow/pull/14776
0b50e3228519138c9826bc8e98f0ab5dc40a268d
052163516bf91ab7bb53f4ec3c7b5621df515820
"2020-12-24T01:51:24Z"
python
"2021-03-18T10:38:52Z"
closed
apache/airflow
https://github.com/apache/airflow
13,262
["airflow/providers/google/cloud/hooks/dataflow.py", "tests/providers/google/cloud/hooks/test_dataflow.py"]
Dataflow Flex Template Operator
**Apache Airflow version**: 1.10.9 Composer Airflow Image **Environment**: - **Cloud provider or hardware configuration**: Cloud Composer **What happened**: Error logs indicate the operator appears to not recognize the job as Batch. [2020-12-22 16:28:53,445] {taskinstance.py:1135} ERROR - 'type' Traceback (most recent call last): File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 972, in _run_raw_task result = task_copy.execute(context=context) File "/usr/local/lib/airflow/airflow/providers/google/cloud/operators/dataflow.py", line 647, in execute on_new_job_id_callback=set_current_job_id, File "/usr/local/lib/airflow/airflow/providers/google/common/hooks/base_google.py", line 383, in inner_wrapper return func(self, *args, **kwargs) File "/usr/local/lib/airflow/airflow/providers/google/cloud/hooks/dataflow.py", line 804, in start_flex_template jobs_controller.wait_for_done() File "/usr/local/lib/airflow/airflow/providers/google/cloud/hooks/dataflow.py", line 348, in wait_for_done while self._jobs and not all(self._check_dataflow_job_state(job) for job in self._jobs): File "/usr/local/lib/airflow/airflow/providers/google/cloud/hooks/dataflow.py", line 348, in <genexpr> while self._jobs and not all(self._check_dataflow_job_state(job) for job in self._jobs): File "/usr/local/lib/airflow/airflow/providers/google/cloud/hooks/dataflow.py", line 321, in _check_dataflow_job_state wait_for_running = job['type'] == DataflowJobType.JOB_TYPE_STREAMING KeyError: 'type' I have specified: ``` with models.DAG( dag_id="pdc-test", start_date=days_ago(1), schedule_interval=None, ) as dag_flex_template: start_flex_template = DataflowStartFlexTemplateOperator( task_id="pdc-test", body={ "launchParameter": { "containerSpecGcsPath": GCS_FLEX_TEMPLATE_TEMPLATE_PATH, "jobName": DATAFLOW_FLEX_TEMPLATE_JOB_NAME, "parameters": { "stage": STAGE, "target": TARGET, "path": PATH, "filename": FILENAME, "column": "geometry" }, "environment": { "network": NETWORK, "subnetwork": SUBNETWORK, "machineType": "n1-standard-1", "numWorkers": "1", "maxWorkers": "1", "tempLocation": "gs://test-pipelines-work/batch", "workerZone": "northamerica-northeast1", "enableStreamingEngine": "false", "serviceAccountEmail": "<number>-compute@developer.gserviceaccount.com", "ipConfiguration": "WORKER_IP_PRIVATE" }, } }, location=LOCATION, project_id=GCP_PROJECT_ID ) ``` **What you expected to happen**: Expecting the dag to run. The operator appears not to handle the input as a batch-type Flex Template: DataflowJobType should be BATCH and not STREAMING. **How to reproduce it**: 1. Create a Batch Flex Template as per https://cloud.google.com/dataflow/docs/guides/templates/using-flex-templates 2. Point the code above to your registered template and invoke.
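The traceback above fails on `job['type']`, which raises `KeyError` when the Dataflow API response carries no `type` field. A minimal defensive sketch of that lookup (illustrative only, not necessarily the fix that was merged):

```python
# Hedged sketch: use a defensive lookup so a response without 'type' (as seen with this
# batch flex-template job) does not raise KeyError. The surrounding hook logic is omitted.
JOB_TYPE_STREAMING = "JOB_TYPE_STREAMING"  # stand-in for DataflowJobType.JOB_TYPE_STREAMING

def is_streaming_job(job: dict) -> bool:
    return job.get("type") == JOB_TYPE_STREAMING

print(is_streaming_job({"currentState": "JOB_STATE_DONE"}))  # False instead of KeyError
```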
https://github.com/apache/airflow/issues/13262
https://github.com/apache/airflow/pull/14914
7c2ed5394e12aa02ff280431b8d35af80d37b1f0
a7e144bec855f6ccf0fa5ae8447894195ffe170f
"2020-12-22T19:32:24Z"
python
"2021-03-23T18:48:42Z"
closed
apache/airflow
https://github.com/apache/airflow
13,254
["airflow/configuration.py", "tests/core/test_configuration.py"]
Import error when using custom backend and sql_alchemy_conn_secret
**Apache Airflow version**: 2.0.0 **Environment**: - **Cloud provider or hardware configuration**: N/A - **OS** (e.g. from /etc/os-release): custom Docker image (`FROM python:3.6`) and macOS Big Sur (11.0.1) - **Kernel** (e.g. `uname -a`): - `Linux xxx 4.14.174+ #1 SMP x86_64 GNU/Linux` - `Darwin xxx 20.1.0 Darwin Kernel Version 20.1.0 rRELEASE_X86_64 x86_64` - **Install tools**: - **Others**: **What happened**: I may have mixed 2 different issues here, but this is what happened to me. I'm trying to use Airflow with the `airflow.providers.google.cloud.secrets.secret_manager.CloudSecretManagerBackend` and a `sql_alchemy_conn_secret` too, however, I have a `NameError` exception when attempting to run either `airflow scheduler` or `airflow webserver`: ``` Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/local/lib/python3.6/site-packages/airflow/__init__.py", line 34, in <module> from airflow import settings File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 35, in <module> from airflow.configuration import AIRFLOW_HOME, WEBSERVER_CONFIG, conf # NOQA F401 File "/usr/local/lib/python3.6/site-packages/airflow/configuration.py", line 786, in <module> conf.read(AIRFLOW_CONFIG) File "/usr/local/lib/python3.6/site-packages/airflow/configuration.py", line 447, in read self._validate() File "/usr/local/lib/python3.6/site-packages/airflow/configuration.py", line 196, in _validate self._validate_config_dependencies() File "/usr/local/lib/python3.6/site-packages/airflow/configuration.py", line 224, in _validate_config_dependencies is_sqlite = "sqlite" in self.get('core', 'sql_alchemy_conn') File "/usr/local/lib/python3.6/site-packages/airflow/configuration.py", line 324, in get option = self._get_option_from_secrets(deprecated_key, deprecated_section, key, section) File "/usr/local/lib/python3.6/site-packages/airflow/configuration.py", line 342, in _get_option_from_secrets option = self._get_secret_option(section, key) File "/usr/local/lib/python3.6/site-packages/airflow/configuration.py", line 303, in _get_secret_option return _get_config_value_from_secret_backend(secrets_path) NameError: name '_get_config_value_from_secret_backend' is not defined ``` **What you expected to happen**: A proper import and configuration creation. **How to reproduce it**: `airflow.cfg`: ```ini [core] # ... sql_alchemy_conn_secret = some-key # ... [secrets] backend = airflow.providers.google.cloud.secrets.secret_manager.CloudSecretManagerBackend backend_kwargs = { ... } # ... ``` **Anything else we need to know**: Here is the workaround I have for the moment, not sure it works all the way, and probably doesn't cover all edge cases, tho it kinda works for my setup: Move `get_custom_secret_backend` before (for me it's actually below `_get_config_value_from_secret_backend`): https://github.com/apache/airflow/blob/cc87caa0ce0b31aa29df7bbe90bdcc2426d80ff1/airflow/configuration.py#L794 Then comment: https://github.com/apache/airflow/blob/cc87caa0ce0b31aa29df7bbe90bdcc2426d80ff1/airflow/configuration.py#L232-L236
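A stripped-down illustration of the failure mode (not Airflow's actual code): module-level code that runs at import time calls a helper defined further down the module, which is exactly the ordering problem the workaround above addresses by moving definitions around.

```python
# Minimal reproduction of the pattern behind the NameError (illustrative only).
def build_config():
    # Looked up at call time; fails if the call happens before helper() is defined below.
    return helper()

config = build_config()   # runs at module import -> NameError: name 'helper' is not defined

def helper():
    return "value"
```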
https://github.com/apache/airflow/issues/13254
https://github.com/apache/airflow/pull/13260
7a560ab6de7243e736b66599842b241ae60d1cda
69d6d0239f470ac75e23160bac63408350c1835a
"2020-12-22T14:08:30Z"
python
"2020-12-24T17:09:19Z"
closed
apache/airflow
https://github.com/apache/airflow
13,226
["UPDATING.md"]
Use of SQLAInterface in custom models in Plugins
We might need to add to Airflow 2.0 upgrade documentation the need to use `CustomSQLAInterface` instead of `SQLAInterface`. If you want to define your own appbuilder models you need to change the interface to a Custom one: Non-RBAC replace: ``` from flask_appbuilder.models.sqla.interface import SQLAInterface datamodel = SQLAInterface(your_data_model) ``` with RBAC (in 1.10): ``` from airflow.www_rbac.utils import CustomSQLAInterface datamodel = CustomSQLAInterface(your_data_model) ``` and in 2.0: ``` from airflow.www.utils import CustomSQLAInterface datamodel = CustomSQLAInterface(your_data_model) ```
https://github.com/apache/airflow/issues/13226
https://github.com/apache/airflow/pull/14478
0a969db2b025709505f8043721c83218a73bb84d
714a07542c2560b50d013d66f71ad9a209dd70b6
"2020-12-21T17:40:47Z"
python
"2021-03-03T00:29:54Z"
closed
apache/airflow
https://github.com/apache/airflow
13,225
["airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/task_instance_schema.py", "airflow/models/dag.py", "tests/api_connexion/endpoints/test_task_instance_endpoint.py", "tests/api_connexion/schemas/test_task_instance_schema.py"]
Clear Tasks via the stable REST API with task_id filter
**Description** I have noticed that the stable REST API doesn't have the ability to run a task (which is possible from the airflow web interface). I think it would be nice to have either: - Run a task - Run all failing tasks (rerun from the point of failure) available for integrations. **Use case / motivation** I would like the ability to identify the failing tasks on a specific DAG Run and rerun only them. I would like to do it remotely (non-interactive) using the REST API. I could write a script that runs only the failing tasks, but I couldn't find a way to "Run" a task when I have the task instance ID. **Are you willing to submit a PR?** Not at this stage **Related Issues**
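For reference, a hedged sketch of what clearing only the failed tasks of a DAG via the stable REST API could look like once a task-level filter exists; the endpoint path and payload field names (in particular `task_ids`) are assumptions to be checked against the API reference.

```python
# Hedged sketch: clear selected failed task instances through the stable REST API.
# Assumes basic auth is enabled and that the payload fields below exist as named.
import requests

resp = requests.post(
    "http://localhost:8080/api/v1/dags/my_dag/clearTaskInstances",
    auth=("admin", "admin"),
    json={
        "dry_run": False,
        "only_failed": True,
        "task_ids": ["my_failing_task"],   # the filter this issue asks for
    },
)
resp.raise_for_status()
print(resp.json())
```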
https://github.com/apache/airflow/issues/13225
https://github.com/apache/airflow/pull/14500
a265fd54792bb7638188eaf4f6332ae95d24899e
e150bbfe0a7474308ba7df9c89e699b77c45bb5c
"2020-12-21T17:38:56Z"
python
"2021-04-07T06:54:34Z"
closed
apache/airflow
https://github.com/apache/airflow
13,214
["airflow/migrations/versions/2c6edca13270_resource_based_permissions.py"]
Make migration logging consistent
**Apache Airflow version**: 2.0.0.dev **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: - **Cloud provider or hardware configuration**: - **OS** (e.g. from /etc/os-release): - **Kernel** (e.g. `uname -a`): - **Install tools**: - **Others**: **What happened**: When I run `airflow db reset -y` I got ``` INFO [alembic.runtime.migration] Running upgrade bef4f3d11e8b -> 98271e7606e2, Add scheduling_decision to DagRun and DAG INFO [alembic.runtime.migration] Running upgrade 98271e7606e2 -> 52d53670a240, fix_mssql_exec_date_rendered_task_instance_fields_for_MSSQL INFO [alembic.runtime.migration] Running upgrade 52d53670a240 -> 364159666cbd, Add creating_job_id to DagRun table INFO [alembic.runtime.migration] Running upgrade 364159666cbd -> 45ba3f1493b9, add-k8s-yaml-to-rendered-templates INFO [alembic.runtime.migration] Running upgrade 45ba3f1493b9 -> 849da589634d, Prefix DAG permissions. INFO [alembic.runtime.migration] Running upgrade 849da589634d -> 2c6edca13270, Resource based permissions. [2020-12-21 10:15:40,510] {manager.py:727} WARNING - No user yet created, use flask fab command to do it. [2020-12-21 10:15:41,964] {providers_manager.py:291} WARNING - Exception when importing 'airflow.providers.google.cloud.hooks.compute_ssh.ComputeEngineSSHHook' from 'apache-airflow-providers-google' package: No module named 'google.cloud.oslogin_v1' [2020-12-21 10:15:42,791] {providers_manager.py:291} WARNING - Exception when importing 'airflow.providers.google.cloud.hooks.compute_ssh.ComputeEngineSSHHook' from 'apache-airflow-providers-google' package: No module named 'google.cloud.oslogin_v1' [2020-12-21 10:15:47,157] {migration.py:515} INFO - Running upgrade 2c6edca13270 -> 61ec73d9401f, Add description field to connection [2020-12-21 10:15:47,160] {migration.py:515} INFO - Running upgrade 61ec73d9401f -> 64a7d6477aae, fix description field in connection to be text [2020-12-21 10:15:47,164] {migration.py:515} INFO - Running upgrade 64a7d6477aae -> e959f08ac86c, Change field in DagCode to MEDIUMTEXT for MySql [2020-12-21 10:15:47,381] {dagbag.py:440} INFO - Filling up the DagBag from /root/airflow/dags [2020-12-21 10:15:47,857] {dag.py:1813} INFO - Sync 29 DAGs [2020-12-21 10:15:47,870] {dag.py:1832} INFO - Creating ORM DAG for example_bash_operator [2020-12-21 10:15:47,871] {dag.py:1832} INFO - Creating ORM DAG for example_kubernetes_executor [2020-12-21 10:15:47,872] {dag.py:1832} INFO - Creating ORM DAG for example_xcom_args [2020-12-21 10:15:47,873] {dag.py:1832} INFO - Creating ORM DAG for tutorial [2020-12-21 10:15:47,873] {dag.py:1832} INFO - Creating ORM DAG for example_python_operator [2020-12-21 10:15:47,874] {dag.py:1832} INFO - Creating ORM DAG for example_xcom ``` **What you expected to happen**: I expect to see all migration logging to be formatted in the same style. I would also love to see no unrelated logs - this will make `db reset` easier to digest. **How to reproduce it**: Run `airflow db reset -y` **Anything else we need to know**: N/A
https://github.com/apache/airflow/issues/13214
https://github.com/apache/airflow/pull/13458
feb84057d34b2f64e3b5dcbaae2d3b18f5f564e4
43b2d3392224d8e0d6fb8ce8cdc6b0f0b0cc727b
"2020-12-21T10:21:14Z"
python
"2021-01-04T17:25:02Z"
closed
apache/airflow
https://github.com/apache/airflow
13,200
["airflow/utils/cli.py", "tests/utils/test_cli_util.py"]
CLI `airflow scheduler -D --pid <PIDFile>` fails silently if PIDFile given is a relative path
**Apache Airflow version**: 2.0.0 **Environment**: Linux & MacOS, venv - **OS** (e.g. from /etc/os-release): Ubuntu 18.04.3 LTS / MacOS 10.15.7 - **Kernel** (e.g. `uname -a`): - Linux *** 5.4.0-1029-aws #30~18.04.1-Ubuntu SMP Tue Oct 20 11:09:25 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux - Darwin *** 19.6.0 Darwin Kernel Version 19.6.0: Thu Oct 29 22:56:45 PDT 2020; root:xnu-6153.141.2.2~1/RELEASE_X86_64 x86_64 **What happened**: Say I'm in my home dir. Running the command `airflow scheduler -D --pid test.pid` (`test.pid` is a relative path) is supposed to start the scheduler in daemon mode, with the PID stored in the file `test.pid` (if it doesn't exist, it should be created). However, the scheduler is NOT started. This can be validated by running `ps aux | grep airflow | grep scheduler` (no process is shown). In the whole process, I don't see any error message. However, if I change the pid file path to an absolute path, i.e. `airflow scheduler -D --pid ${PWD}/test.pid`, it successfully starts the scheduler in daemon mode (can be validated via the method above). **What you expected to happen**: Even if the PID file path provided is a relative path, the scheduler should be started properly as well. **How to reproduce it**: Described above **Anything else we need to know**:
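One plausible way to handle this is to normalize the user-supplied path before the process daemonizes, since daemonization changes the working directory and a relative PID path then resolves somewhere else. A minimal sketch, with a hypothetical helper name:

```python
# Hedged sketch: resolve --pid to an absolute path before daemonizing.
import os

def normalize_pid_path(pid: str) -> str:   # hypothetical helper, not Airflow's API
    return os.path.abspath(pid)

print(normalize_pid_path("test.pid"))      # e.g. /home/user/test.pid
```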
https://github.com/apache/airflow/issues/13200
https://github.com/apache/airflow/pull/13232
aa00e9bcd4ec16f42338b30d29e87ccda8eecf82
93e4787b70a85cc5f13db5e55ef0c06629b45e6e
"2020-12-20T22:16:54Z"
python
"2020-12-22T22:18:38Z"
closed
apache/airflow
https://github.com/apache/airflow
13,192
["airflow/providers/google/cloud/operators/mlengine.py", "tests/providers/google/cloud/operators/test_mlengine.py"]
Generalize MLEngineStartTrainingJobOperator to custom images
**Description** The operator is arguably unnecessarily limited to AI Platform’s standard images. The only change that is required to lift this constraint is making `package_uris` and `training_python_module` optional with default values `[]` and `None`, respectively. Then, using `master_config`, one can supply `imageUri` and run any image of choice. **Use case / motivation** This will open up for running arbitrary images on AI Platform. **Are you willing to submit a PR?** If the above sounds reasonable, I can open pull requests.
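A hedged sketch of what a custom-image training job could look like once `package_uris` and `training_python_module` become optional; the `master_config`/`imageUri` field names follow the AI Platform training input format and, together with the operator arguments shown, should be verified against the provider documentation.

```python
# Hedged sketch: run an arbitrary container on AI Platform via master_config.
from airflow.providers.google.cloud.operators.mlengine import MLEngineStartTrainingJobOperator

training_op = MLEngineStartTrainingJobOperator(
    task_id="train_custom_image",
    project_id="my-project",                                  # placeholder
    job_id="custom_image_job_{{ ds_nodash }}",
    region="us-central1",
    scale_tier="CUSTOM",
    master_type="n1-standard-4",
    master_config={"imageUri": "gcr.io/my-project/trainer:latest"},
    package_uris=[],                                          # proposed default
    training_python_module=None,                              # proposed default
    training_args=[],
)
```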
https://github.com/apache/airflow/issues/13192
https://github.com/apache/airflow/pull/13318
6e1a6ff3c8a4f8f9bcf8b7601362359bfb2be6bf
f6518dd6a1217d906d863fe13dc37916efd78b3e
"2020-12-20T10:26:37Z"
python
"2021-01-02T10:34:04Z"
closed
apache/airflow
https://github.com/apache/airflow
13,181
["chart/templates/workers/worker-kedaautoscaler.yaml", "chart/tests/helm_template_generator.py", "chart/tests/test_keda.py"]
keda scaledobject not created even though keda enabled in helm config
In brand new cluster using k3d locally, I first installed keda: ```bash helm install keda \ --namespace keda kedacore/keda \ --version "v1.5.0" ``` Next, I installed airflow using this config: ```yaml executor: CeleryExecutor defaultAirflowTag: 2.0.0-python3.7 airflowVersion: 2.0.0 workers: keda: enabled: true persistence: enabled: false pgbouncer: enabled: true ``` I think this should create a scaled object `airflow-worker`. But it does not. @turbaszek and @dimberman you may have insight ...
https://github.com/apache/airflow/issues/13181
https://github.com/apache/airflow/pull/13183
4aba9c5a8b89d2827683fb4c84ac481c89ebc2b3
a9d562e1c3c16c98750c9e3be74347f882acb97a
"2020-12-19T08:30:36Z"
python
"2020-12-21T10:19:26Z"
closed
apache/airflow
https://github.com/apache/airflow
13,151
["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"]
Task Instances in the "removed" state prevent the scheduler from scheduling new tasks when max_active_runs is set
**Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: - **OS** (e.g. from /etc/os-release): Debian GNU/Linux 10 (buster) - **Kernel** (e.g. `uname -a`): Linux 6ae65b86e112 5.4.0-52-generic #57-Ubuntu SMP Thu Oct 15 10:57:00 UTC 2020 x86_64 GNU/Linux - **Others**: Python 3.8 **What happened**: After migrating one of our development Airflow instances from 1.10.14 to 2.0.0, the scheduler started to refuse to schedule tasks for a DAG that did not actually exceed its `max_active_runs`. When it did this the following error would be logged: ``` DAG <dag_name> already has 577 active runs, not queuing any tasks for run 2020-12-17 08:05:00+00:00 ``` A bit of digging revealed that this DAG had task instances associated with it that are in the `removed` state. As soon as I forced the task instances that are in the `removed` state into the `failed` state, the tasks would be scheduled. I believe the root cause of the issue is that [this filter](https://github.com/apache/airflow/blob/master/airflow/jobs/scheduler_job.py#L1506) does not filter out tasks that are in the `removed` state. **What you expected to happen**: I expected the task instances in the DAG to be scheduled, because the DAG did not actually exceed the number of `max_active_runs`. **How to reproduce it**: I think the best approach to reproduce it is as follows: 1. Create a DAG and set `max_active_runs` to 1. 2. Ensure the DAG has ran successfully a number of times, such that it has some history associated with it. 3. Set one historical task instance to the `removed` state (either by directly updating it in the DB, or deleting a task from a DAG before it has been able to execute). **Anything else we need to know**: The Airflow instance that I ran into this issue with contains about 3 years of task history, which means that we actually had quite a few task instances that are in the `removed` state, but there is no easy way to delete those from the Web UI. A work around is to set the tasks to `failed`, which will allow the scheduler to proceed.
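A minimal sketch of the workaround described above (flipping `removed` task instances to `failed` so the scheduler stops counting them); the DAG id is a placeholder, and this should be tried against a test database first.

```python
# Hedged sketch of the workaround: mark 'removed' task instances of one DAG as 'failed'.
from airflow.models import TaskInstance
from airflow.utils.session import create_session
from airflow.utils.state import State

with create_session() as session:
    session.query(TaskInstance).filter(
        TaskInstance.dag_id == "my_dag",              # placeholder DAG id
        TaskInstance.state == State.REMOVED,
    ).update({TaskInstance.state: State.FAILED}, synchronize_session=False)
```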
https://github.com/apache/airflow/issues/13151
https://github.com/apache/airflow/pull/13165
5cf2fbf12462de0a684ec4f631783850f7449059
ef8f414c20b3cd64e2226ec5c022e799a6e0af86
"2020-12-18T13:14:51Z"
python
"2020-12-19T12:09:50Z"
closed
apache/airflow
https://github.com/apache/airflow
13,142
["airflow/www/security.py", "docs/apache-airflow/security/webserver.rst", "tests/www/test_security.py"]
Error while attempting to disable login (setting AUTH_ROLE_PUBLIC = 'Admin')
# Error while attempting to disable login **Apache Airflow version**: 2.0.0 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: - **Cloud provider or hardware configuration**: mac-pro - **OS** (e.g. from /etc/os-release): osx - **Kernel** (e.g. `uname -a`): Darwin C02CW1JLMD6R 19.6.0 Darwin Kernel Version 19.6.0: Mon Aug 31 22:12:52 PDT 2020; root:xnu-6153.141.2~1/RELEASE_X86_64 x86_64 - **Install tools**: - **Others**: **What happened**: When setting in `webserver_config.py`, ```python AUTH_ROLE_PUBLIC = 'Admin' ``` Got error on webserver, when going to localhost:8080/home, ```log [2020-12-17 16:29:09,993] {app.py:1891} ERROR - Exception on /home [GET] Traceback (most recent call last): File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app response = self.full_dispatch_request() File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request rv = self.handle_user_exception(e) File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception reraise(exc_type, exc_value, tb) File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise raise value File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request rv = self.dispatch_request() File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request return self.view_functions[rule.endpoint](**req.view_args) File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/airflow/www/auth.py", line 34, in decorated return func(*args, **kwargs) File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/airflow/www/views.py", line 540, in index user_permissions = current_app.appbuilder.sm.get_all_permissions_views() File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/airflow/www/security.py", line 226, in get_all_permissions_views for role in self.get_user_roles(): File "/Users/Zshot0831/.local/share/virtualenvs/airflow_2-DimIlKMl/lib/python3.8/site-packages/airflow/www/security.py", line 219, in get_user_roles public_role = current_app.appbuilder.config.get('AUTH_ROLE_PUBLIC') AttributeError: 'AirflowAppBuilder' object has no attribute 'config' ``` **What you expected to happen**: Reached homepage without the need for authentication as admin. **How to reproduce it**: 1. Install airflow in a new environment (or to a new directory, set env AIRFLOW_HOME=[my new dir]) 2. Uncomment and change in webserver_config.py ```python AUTH_ROLE_PUBLIC = 'Admin' ``` 3. Start `airflow webserver` 4. Look in localhost:8080/home or localhost:8080 *webserver_config.py file** ```python # # Licensed to the Apache Software Foundation (ASF) under one # or more contributor license agreements. See the NOTICE file # distributed with this work for additional information # regarding copyright ownership. The ASF licenses this file # to you under the Apache License, Version 2.0 (the # "License"); you may not use this file except in compliance # with the License. 
You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, # software distributed under the License is distributed on an # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. """Default configuration for the Airflow webserver""" import os from flask_appbuilder.security.manager import AUTH_DB # from flask_appbuilder.security.manager import AUTH_LDAP # from flask_appbuilder.security.manager import AUTH_OAUTH # from flask_appbuilder.security.manager import AUTH_OID # from flask_appbuilder.security.manager import AUTH_REMOTE_USER basedir = os.path.abspath(os.path.dirname(__file__)) # Flask-WTF flag for CSRF WTF_CSRF_ENABLED = True # ---------------------------------------------------- # AUTHENTICATION CONFIG # ---------------------------------------------------- # For details on how to set up each of the following authentication, see # http://flask-appbuilder.readthedocs.io/en/latest/security.html# authentication-methods # for details. # The authentication type # AUTH_OID : Is for OpenID # AUTH_DB : Is for database # AUTH_LDAP : Is for LDAP # AUTH_REMOTE_USER : Is for using REMOTE_USER from web server # AUTH_OAUTH : Is for OAuth AUTH_TYPE = AUTH_DB # Uncomment to setup Full admin role name # AUTH_ROLE_ADMIN = 'Admin' # Uncomment to setup Public role name, no authentication needed AUTH_ROLE_PUBLIC = 'Admin' # Will allow user self registration # AUTH_USER_REGISTRATION = True # The default user self registration role # AUTH_USER_REGISTRATION_ROLE = "Public" # When using OAuth Auth, uncomment to setup provider(s) info # Google OAuth example: # OAUTH_PROVIDERS = [{ # 'name':'google', # 'token_key':'access_token', # 'icon':'fa-google', # 'remote_app': { # 'api_base_url':'https://www.googleapis.com/oauth2/v2/', # 'client_kwargs':{ # 'scope': 'email profile' # }, # 'access_token_url':'https://accounts.google.com/o/oauth2/token', # 'authorize_url':'https://accounts.google.com/o/oauth2/auth', # 'request_token_url': None, # 'client_id': GOOGLE_KEY, # 'client_secret': GOOGLE_SECRET_KEY, # } # }] # When using LDAP Auth, setup the ldap server # AUTH_LDAP_SERVER = "ldap://ldapserver.new" # When using OpenID Auth, uncomment to setup OpenID providers. # example for OpenID authentication # OPENID_PROVIDERS = [ # { 'name': 'Yahoo', 'url': 'https://me.yahoo.com' }, # { 'name': 'AOL', 'url': 'http://openid.aol.com/<username>' }, # { 'name': 'Flickr', 'url': 'http://www.flickr.com/<username>' }, # { 'name': 'MyOpenID', 'url': 'https://www.myopenid.com' }] # ---------------------------------------------------- # Theme CONFIG # ---------------------------------------------------- # Flask App Builder comes up with a number of predefined themes # that you can use for Apache Airflow. # http://flask-appbuilder.readthedocs.io/en/latest/customizing.html#changing-themes # Please make sure to remove "navbar_color" configuration from airflow.cfg # in order to fully utilize the theme. 
(or use that property in conjunction with theme) # APP_THEME = "bootstrap-theme.css" # default bootstrap # APP_THEME = "amelia.css" # APP_THEME = "cerulean.css" # APP_THEME = "cosmo.css" # APP_THEME = "cyborg.css" # APP_THEME = "darkly.css" # APP_THEME = "flatly.css" # APP_THEME = "journal.css" # APP_THEME = "lumen.css" # APP_THEME = "paper.css" # APP_THEME = "readable.css" # APP_THEME = "sandstone.css" # APP_THEME = "simplex.css" # APP_THEME = "slate.css" # APP_THEME = "solar.css" # APP_THEME = "spacelab.css" # APP_THEME = "superhero.css" # APP_THEME = "united.css" # APP_THEME = "yeti.css" ``` ![Screen Shot 2020-12-17 at 4 37 06 PM](https://user-images.githubusercontent.com/3694482/102546943-64733800-4086-11eb-8dd7-6a36b2b8ee82.png)
https://github.com/apache/airflow/issues/13142
https://github.com/apache/airflow/pull/13191
641f63c2c4d38094cb85389fb50f25345d622e23
4be27af04df047a9d1b95fca09eb25e88385f0a8
"2020-12-17T21:42:09Z"
python
"2020-12-28T06:37:26Z"
closed
apache/airflow
https://github.com/apache/airflow
13,132
["airflow/providers/microsoft/winrm/operators/winrm.py", "docs/spelling_wordlist.txt"]
Let user specify the decode encoding of stdout/stderr of WinRMOperator
**Description** Let the user specify the decode encoding used in WinRMOperator. **Use case / motivation** I'm trying to use winrm, but the task failed. After checking, I found https://github.com/apache/airflow/blob/master/airflow/providers/microsoft/winrm/operators/winrm.py#L117 ```python for line in stdout.decode('utf-8').splitlines(): self.log.info(line) for line in stderr.decode('utf-8').splitlines(): self.log.warning(line) ``` But my remote host's PowerShell default encoding is 'gb2312'. I tried the solution from https://stackoverflow.com/questions/40098771/changing-powershells-default-output-encoding-to-utf-8, i.e., putting `PSDefaultParameterValues['Out-File:Encoding'] = 'utf8'` in `$PROFILE`, but it doesn't work in the WinRMOperator case. An alternative would be to let the user set the decode encoding in the operator to avoid the error.
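A minimal sketch of what exposing the encoding could look like; the class and parameter names (`output_encoding`) are hypothetical, and the WinRM command execution itself is stubbed out with placeholder bytes.

```python
from airflow.models import BaseOperator


class WinRMOperatorSketch(BaseOperator):
    """Hedged sketch only: lets the caller choose the decode encoding (e.g. 'gb2312')."""

    def __init__(self, *, output_encoding: str = 'utf-8', **kwargs) -> None:
        super().__init__(**kwargs)
        self.output_encoding = output_encoding

    def execute(self, context):
        stdout, stderr = b"", b""   # placeholders for the bytes returned by the WinRM run
        for line in stdout.decode(self.output_encoding).splitlines():
            self.log.info(line)
        for line in stderr.decode(self.output_encoding).splitlines():
            self.log.warning(line)
```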
https://github.com/apache/airflow/issues/13132
https://github.com/apache/airflow/pull/13153
d9e4454c66051a9e8bb5b2f3814d46f29332b89d
a1d060c7f4e09c617f39e2b8df2a043bfeac9d82
"2020-12-17T11:24:41Z"
python
"2021-03-01T14:00:49Z"
closed
apache/airflow
https://github.com/apache/airflow
13,086
["airflow/models/baseoperator.py", "airflow/serialization/schema.json", "airflow/serialization/serialized_objects.py", "tests/serialization/test_dag_serialization.py"]
max_retry_delay should be a timedelta for type hinting
**Apache Airflow version**: master **What happened**: https://github.com/apache/airflow/blob/master/airflow/models/baseoperator.py#L356 --> the `max_retry_delay` type hint should be `timedelta`, not `datetime`
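For clarity, a one-line sketch of the annotation the issue asks for:

```python
# Sketch of the corrected annotation; everything else about the attribute is unchanged.
from datetime import timedelta
from typing import Optional

max_retry_delay: Optional[timedelta] = None   # previously hinted as datetime
```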
https://github.com/apache/airflow/issues/13086
https://github.com/apache/airflow/pull/14436
b16b9ee6894711a8af7143286189c4a3cc31d1c4
59c459fa2a6aafc133db4a89980fb3d3d0d25589
"2020-12-15T15:31:00Z"
python
"2021-02-26T11:42:00Z"
closed
apache/airflow
https://github.com/apache/airflow
13,081
["docs/apache-airflow/upgrading-to-2.rst"]
OAuth2 login process is not stateless
**Apache Airflow version**: 1.10.14 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Server Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.15-eks-ad4801", GitCommit:"ad4801fd44fe0f125c8d13f1b1d4827e8884476d", GitTreeState:"clean", BuildDate:"2020-10-20T23:27:12Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"} **Environment**: - **Cloud provider or hardware configuration**: AWS / EKS - **OS** (e.g. from /etc/os-release): N/A - **Kernel** (e.g. `uname -a`): N/A - **Install tools**: N/A - **Others**: N/A **What happened**: Cognito login does not work if second request is not handled by first pod receiving access_token headers. **What you expected to happen**: Logging in via Cognito OAuth2 mode / Code should work via any pod. **How to reproduce it**: Override `webserver_config.py` with the following code: ``` """Default configuration for the Airflow webserver""" import logging import os import json from airflow.configuration import conf from airflow.www_rbac.security import AirflowSecurityManager from flask_appbuilder.security.manager import AUTH_OAUTH log = logging.getLogger(__name__) basedir = os.path.abspath(os.path.dirname(__file__)) # The SQLAlchemy connection string. SQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN') # Flask-WTF flag for CSRF WTF_CSRF_ENABLED = True CSRF_ENABLED = True # ---------------------------------------------------- # AUTHENTICATION CONFIG # ---------------------------------------------------- # For details on how to set up each of the following authentication, see # http://flask-appbuilder.readthedocs.io/en/latest/security.html# authentication-methods # for details. # The authentication type AUTH_TYPE = AUTH_OAUTH SECRET_KEY = os.environ.get("FLASK_SECRET_KEY") OAUTH_PROVIDERS = [{ 'name': 'aws_cognito', 'whitelist': ['@ga.gov.au'], 'token_key': 'access_token', 'icon': 'fa-amazon', 'remote_app': { 'api_base_url': os.environ.get("OAUTH2_BASE_URL") + "/", 'client_kwargs': { 'scope': 'openid email aws.cognito.signin.user.admin' }, 'authorize_url': os.environ.get("OAUTH2_BASE_URL") + "/authorize", 'access_token_url': os.environ.get("OAUTH2_BASE_URL") + "/token", 'request_token_url': None, 'client_id': os.environ.get("COGNITO_CLIENT_ID"), 'client_secret': os.environ.get("COGNITO_CLIENT_SECRET"), } }] class CognitoAirflowSecurityManager(AirflowSecurityManager): def oauth_user_info(self, provider, resp): # log.info("Requesting user info from AWS Cognito: {0}".format(resp)) assert provider == "aws_cognito" # log.info("Requesting user info from AWS Cognito: {0}".format(resp)) me = self.appbuilder.sm.oauth_remotes[provider].get("userInfo") return { "username": me.json().get("username"), "email": me.json().get("email"), "first_name": me.json().get("given_name", ""), "last_name": me.json().get("family_name", ""), "id": me.json().get("sub", ""), } SECURITY_MANAGER_CLASS = CognitoAirflowSecurityManager ``` - Setup an airflow-app linked a to Cognito user pull and run multiple replicas of the airflow-web pod. - Login will start failing and work may be 1 in 9 attempts. **Anything else we need to know**: There are 3 possible work arounds using infrastructure changes instead of airflow-web code changes. - Use a single pod for airflow-web to avoid session issues - Make ALB sticky via ingress to give users the same pod consistently - Sharing the same secret key across all airflow-web pods using the environment
https://github.com/apache/airflow/issues/13081
https://github.com/apache/airflow/pull/13094
484f95f55cda4ca4fd3157135199623c9e37cc8a
872350bac5bebea09bd52d50734a3b7517af712c
"2020-12-15T06:41:18Z"
python
"2020-12-21T23:26:06Z"
closed
apache/airflow
https://github.com/apache/airflow
13,053
["airflow/utils/dot_renderer.py", "tests/utils/test_dot_renderer.py"]
CLI does not display TaskGroups
Hello, Airflow has the ability to [display a DAG in the CLI](http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/usage-cli.html#display-dags-structure) with the command `airflow dags show`, but unfortunately this command does not display Task Groups. It would be great if Task Groups were correctly marked in the diagrams. <img width="1268" alt="Screenshot 2020-12-14 at 02 28 58" src="https://user-images.githubusercontent.com/12058428/102030893-9f3e4d00-3db4-11eb-8c2d-f33e38d01997.png"> <img width="681" alt="Screenshot 2020-12-14 at 02 29 16" src="https://user-images.githubusercontent.com/12058428/102030898-a2d1d400-3db4-11eb-9b31-0cde70fea675.png"> Best regards, Kamil Breguła
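In DOT, grouped nodes are usually drawn with `cluster_*` subgraphs, so a hedged sketch of how a Task Group could be rendered in the CLI output (group and task names below are illustrative):

```python
# Hedged sketch: render a TaskGroup as a boxed 'cluster' subgraph with graphviz.
import graphviz

dot = graphviz.Digraph("example_dag")
dot.node("start")
with dot.subgraph(name="cluster_section_1") as sub:   # 'cluster_' prefix draws the box
    sub.attr(label="section-1")
    sub.node("section-1.task-1")
    sub.node("section-1.task-2")
dot.edge("start", "section-1.task-1")
print(dot.source)
```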
https://github.com/apache/airflow/issues/13053
https://github.com/apache/airflow/pull/14269
21f297425ae85ce89e21477d55b51d5560f47bf8
c71f707d24a9196d33b91a7a2a9e3384698e5193
"2020-12-14T01:34:50Z"
python
"2021-02-25T15:23:15Z"
closed
apache/airflow
https://github.com/apache/airflow
13,047
["airflow/utils/dag_processing.py", "tests/utils/test_dag_processing.py"]
Occasional "KeyError" in dag_processing
**Apache Airflow version**: 2.0.0rc2 **Environment**: Breeze with example dags, Python 3.8 postgres. - **OS** (e.g. from /etc/os-release): Linux - **Kernel** (e.g. `uname -a`): Breeze CI image - **Install tools**: Breeze: - **Executor**: LocalExecutor ``` ./breeze start-airflow --backend postgres --load-example-dags --load-default-connections --install-airflow-version 2.0.0rc2 --skip-mounting-local-sources --python 3.8 ``` **What happened**: When testing airflow logging I occasionally stumble upon "KeyError' from `dag_procesing.py`. I am not sure exactly when it happens. It's not always reproducible but it looks like it is when I restart scheduler and trigger 'example_bash_operator.py" it happens rather randomly (1/10 times more or less). It does not happen always when I triggere task manually. DAG gets correctly executed after triggering, but the log is there and warniing printed in the logs right after the DAG finishes execution. The error I see in scheduler's logs: ``` [2020-12-13 19:35:33,752] {dagbag.py:440} INFO - Filling up the DagBag from /usr/local/lib/python3.8/site-packages/airflow/example_dags/example_bash_operator.py Running <TaskInstance: example_bash_operator.run_after_loop 2020-12-13T19:35:30.648020+00:00 [queued]> on host 6611da4b1a27 [2020-12-13 19:35:34,517] {dagrun.py:444} INFO - Marking run <DagRun example_bash_operator @ 2020-12-13 19:35:30.648020+00:00: manual__2020-12-13T19:35:30.648020+00:00, externally triggered: True> successful [2020-12-13 19:35:34,523] {scheduler_job.py:1193} INFO - Executor reports execution of example_bash_operator.run_after_loop execution_date=2020-12-13 19:35:30.648020+00:00 exited with status success for try_number 1 Process ForkProcess-34: Traceback (most recent call last): File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap self.run() File "/usr/local/lib/python3.8/multiprocessing/process.py", line 108, in run self._target(*self._args, **self._kwargs) File "/usr/local/lib/python3.8/site-packages/airflow/utils/dag_processing.py", line 365, in _run_processor_manager processor_manager.start() File "/usr/local/lib/python3.8/site-packages/airflow/utils/dag_processing.py", line 596, in start return self._run_parsing_loop() File "/usr/local/lib/python3.8/site-packages/airflow/utils/dag_processing.py", line 659, in _run_parsing_loop self._processors.pop(processor.file_path) KeyError: '/usr/local/lib/python3.8/site-packages/airflow/example_dags/example_bash_operator.py' [2020-12-13 19:35:35,589] {dag_processing.py:396} WARNING - DagFileProcessorManager (PID=1029759) exited with exit code 1 - re-launching ``` **What you expected to happen**: No error in logs. **How to reproduce it**: ``` ./breeze start-airflow --backend postgres --load-example-dags --load-default-connections --install-airflow-version 2.0.0rc2 --skip-mounting-local-sources --python 3.8 ``` Login to the webserver, enable 'example_bash_operator", wait for it to execute. Trigger the example DAG several times (always wait for the end of execution. It happens randomly (for me around 1/10 tasks) **Anything else we need to know**:
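The failing line is `self._processors.pop(processor.file_path)`; a defensive variant with a default avoids the `KeyError` when the entry was already removed (illustrative only, not necessarily the merged fix):

```python
# Hedged sketch: pop with a default so a missing key is a no-op instead of a KeyError.
processors = {}   # stands in for DagFileProcessorManager._processors
file_path = "/usr/local/lib/python3.8/site-packages/airflow/example_dags/example_bash_operator.py"

processors.pop(file_path, None)   # safe even if file_path was already removed
```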
https://github.com/apache/airflow/issues/13047
https://github.com/apache/airflow/pull/13662
614b70805ade1946bb543b6815e304af1342ae06
32f59534cbdb8188e4c8f49d7dfbb4b915eaeb4d
"2020-12-13T19:53:22Z"
python
"2021-01-15T16:40:20Z"
closed
apache/airflow
https://github.com/apache/airflow
13,046
["airflow/utils/json.py"]
Installation of simplejson breaks airflow webserver 2.0.0rc2
Version 2.0.0rc2 To reproduce: 1. `pip install apache-airflow==2.0.0rc2` 2. `pip install simplejson` 3. run webserver 4. open in browser and observe following error Error: ``` [2020-12-13 11:37:28 -0800] [85061] [INFO] Starting gunicorn 19.10.0 [2020-12-13 11:37:28 -0800] [85061] [INFO] Listening at: http://0.0.0.0:8080 (85061) [2020-12-13 11:37:28 -0800] [85061] [INFO] Using worker: sync [2020-12-13 11:37:28 -0800] [85064] [INFO] Booting worker with pid: 85064 [2020-12-13 11:37:36,444] {app.py:1892} ERROR - Exception on /home [GET] Traceback (most recent call last): File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app ctx.push() File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/ctx.py", line 390, in push self.session = session_interface.open_session(self.app, self.request) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/sessions.py", line 340, in open_session s = self.get_signing_serializer(app) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/sessions.py", line 336, in get_signing_serializer signer_kwargs=signer_kwargs, File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/itsdangerous/serializer.py", line 95, in __init__ self.is_text_serializer = is_text_serializer(serializer) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/itsdangerous/serializer.py", line 13, in is_text_serializer return isinstance(serializer.dumps({}), text_type) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/json/tag.py", line 305, in dumps return dumps(self.tag(value), separators=(",", ":")) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/json/__init__.py", line 211, in dumps rv = _json.dumps(obj, **kwargs) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/simplejson/__init__.py", line 412, in dumps **kw).encode(obj) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/airflow/utils/json.py", line 36, in __init__ super().__init__(*args, **kwargs) TypeError: __init__() got an unexpected keyword argument 'encoding' [2020-12-13 11:37:36 -0800] [85064] [ERROR] Error handling request /home Traceback (most recent call last): File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app ctx.push() File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/ctx.py", line 390, in push self.session = session_interface.open_session(self.app, self.request) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/sessions.py", line 340, in open_session s = self.get_signing_serializer(app) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/sessions.py", line 336, in get_signing_serializer signer_kwargs=signer_kwargs, File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/itsdangerous/serializer.py", line 95, in __init__ self.is_text_serializer = is_text_serializer(serializer) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/itsdangerous/serializer.py", line 13, in is_text_serializer return isinstance(serializer.dumps({}), text_type) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/json/tag.py", line 305, in dumps return dumps(self.tag(value), separators=(",", ":")) File 
"/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/json/__init__.py", line 211, in dumps rv = _json.dumps(obj, **kwargs) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/simplejson/__init__.py", line 412, in dumps **kw).encode(obj) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/airflow/utils/json.py", line 36, in __init__ super().__init__(*args, **kwargs) TypeError: __init__() got an unexpected keyword argument 'encoding' During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/gunicorn/workers/sync.py", line 135, in handle self.handle_request(listener, req, client, addr) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/gunicorn/workers/sync.py", line 176, in handle_request respiter = self.wsgi(environ, resp.start_response) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 2464, in __call__ return self.wsgi_app(environ, start_response) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 2450, in wsgi_app response = self.handle_exception(e) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 1879, in handle_exception server_error = handler(server_error) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/airflow/www/views.py", line 372, in show_traceback else 'Error! Please contact server admin.', File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/templating.py", line 136, in render_template ctx.app.update_template_context(context) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 838, in update_template_context context.update(func()) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask_login/utils.py", line 368, in _user_context_processor return dict(current_user=_get_user()) File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask_login/utils.py", line 335, in _get_user current_app.login_manager._load_user() File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask_login/login_manager.py", line 346, in _load_user is_missing_user_id = 'user_id' not in session File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/werkzeug/local.py", line 379, in <lambda> __contains__ = lambda x, i: i in x._get_current_object() TypeError: argument of type 'NoneType' is not iterable 127.0.0.1 - - [13/Dec/2020:11:37:36 -0800] "GET /home HTTP/1.1" 500 0 "-" "-" ^C[2020-12-13 11:37:38,288] {webserver_command.py:430} INFO - Received signal: 2. Closing gunicorn. [2020-12-13 11:37:38 -0800] [85061] [INFO] Handling signal: int [2020-12-13 11:37:38 -0800] [85064] [INFO] Worker exiting (pid: 85064) [2020-12-13 11:37:38 -0800] [85061] [INFO] Shutting down: Master ```
https://github.com/apache/airflow/issues/13046
https://github.com/apache/airflow/pull/13050
1c1ef7ee693fead93e269dfd9774a72b6eed2e85
ea3d42a3b68f926ff5022e2786bd6c57e3308cd2
"2020-12-13T19:41:10Z"
python
"2020-12-14T12:41:12Z"
closed
apache/airflow
https://github.com/apache/airflow
13,027
["MANIFEST.in"]
No such file or directory: '/usr/local/lib/python3.9/site-packages/airflow/customized_form_field_behaviours.schema.json'
v2.0.0rc1 ``` airflow db init DB: sqlite:////Users/red/airflow/airflow.db [2020-12-12 00:33:02,036] {db.py:678} INFO - Creating tables INFO [alembic.runtime.migration] Context impl SQLiteImpl. INFO [alembic.runtime.migration] Will assume non-transactional DDL. Traceback (most recent call last): File "/usr/local/bin/airflow", line 8, in <module> sys.exit(main()) File "/usr/local/lib/python3.9/site-packages/airflow/__main__.py", line 40, in main args.func(args) File "/usr/local/lib/python3.9/site-packages/airflow/cli/cli_parser.py", line 48, in command return func(*args, **kwargs) File "/usr/local/lib/python3.9/site-packages/airflow/cli/commands/db_command.py", line 31, in initdb db.initdb() File "/usr/local/lib/python3.9/site-packages/airflow/utils/db.py", line 549, in initdb upgradedb() File "/usr/local/lib/python3.9/site-packages/airflow/utils/db.py", line 688, in upgradedb command.upgrade(config, 'heads') File "/usr/local/lib/python3.9/site-packages/alembic/command.py", line 298, in upgrade script.run_env() File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 489, in run_env util.load_python_file(self.dir, "env.py") File "/usr/local/lib/python3.9/site-packages/alembic/util/pyfiles.py", line 98, in load_python_file module = load_module_py(module_id, path) File "/usr/local/lib/python3.9/site-packages/alembic/util/compat.py", line 184, in load_module_py spec.loader.exec_module(module) File "<frozen importlib._bootstrap_external>", line 790, in exec_module File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed File "/usr/local/lib/python3.9/site-packages/airflow/migrations/env.py", line 108, in <module> run_migrations_online() File "/usr/local/lib/python3.9/site-packages/airflow/migrations/env.py", line 102, in run_migrations_online context.run_migrations() File "<string>", line 8, in run_migrations File "/usr/local/lib/python3.9/site-packages/alembic/runtime/environment.py", line 846, in run_migrations self.get_context().run_migrations(**kw) File "/usr/local/lib/python3.9/site-packages/alembic/runtime/migration.py", line 511, in run_migrations for step in self._migrations_fn(heads, self): File "/usr/local/lib/python3.9/site-packages/alembic/command.py", line 287, in upgrade return script._upgrade_revs(revision, rev) File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 364, in _upgrade_revs revs = list(revs) File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 777, in _iterate_revisions uppers = util.dedupe_tuple(self.get_revisions(upper)) File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 321, in get_revisions resolved_id, branch_label = self._resolve_revision_number(id_) File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 501, in _resolve_revision_number self._revision_map File "/usr/local/lib/python3.9/site-packages/alembic/util/langhelpers.py", line 230, in __get__ obj.__dict__[self.__name__] = result = self.fget(obj) File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 123, in _revision_map for revision in self._generator(): File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 112, in _load_revisions script = Script._from_filename(self, vers, file_) File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 906, in _from_filename module = util.load_python_file(dir_, filename) File "/usr/local/lib/python3.9/site-packages/alembic/util/pyfiles.py", line 98, in load_python_file 
module = load_module_py(module_id, path) File "/usr/local/lib/python3.9/site-packages/alembic/util/compat.py", line 184, in load_module_py spec.loader.exec_module(module) File "<frozen importlib._bootstrap_external>", line 790, in exec_module File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed File "/usr/local/lib/python3.9/site-packages/airflow/migrations/versions/2c6edca13270_resource_based_permissions.py", line 27, in <module> from airflow.www.app import create_app File "/usr/local/lib/python3.9/site-packages/airflow/www/app.py", line 38, in <module> from airflow.www.extensions.init_views import ( File "/usr/local/lib/python3.9/site-packages/airflow/www/extensions/init_views.py", line 29, in <module> from airflow.www.views import lazy_add_provider_discovered_options_to_connection_form File "/usr/local/lib/python3.9/site-packages/airflow/www/views.py", line 2836, in <module> class ConnectionFormWidget(FormWidget): File "/usr/local/lib/python3.9/site-packages/airflow/www/views.py", line 2839, in ConnectionFormWidget field_behaviours = json.dumps(ProvidersManager().field_behaviours) File "/usr/local/lib/python3.9/site-packages/airflow/providers_manager.py", line 111, in __init__ _create_customized_form_field_behaviours_schema_validator() File "/usr/local/lib/python3.9/site-packages/airflow/providers_manager.py", line 53, in _create_customized_form_field_behaviours_schema_validator importlib_resources.read_text('airflow', 'customized_form_field_behaviours.schema.json') File "/usr/local/Cellar/python@3.9/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 139, in read_text with open_text(package, resource, encoding, errors) as fp: File "/usr/local/Cellar/python@3.9/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 121, in open_text open_binary(package, resource), encoding=encoding, errors=errors) File "/usr/local/Cellar/python@3.9/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 91, in open_binary return reader.open_resource(resource) File "<frozen importlib._bootstrap_external>", line 995, in open_resource FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/lib/python3.9/site-packages/airflow/customized_form_field_behaviours.schema.json' ```
https://github.com/apache/airflow/issues/13027
https://github.com/apache/airflow/pull/13031
15fd1bc890aa1630ef16e7981408f8f994d30d97
baa68ca51f93b3cea18efc24a7540a0ddf89c03d
"2020-12-12T00:42:57Z"
python
"2020-12-12T09:21:43Z"
closed
apache/airflow
https://github.com/apache/airflow
12,969
["airflow/cli/cli_parser.py", "airflow/cli/commands/task_command.py", "airflow/executors/celery_executor.py", "airflow/executors/local_executor.py", "airflow/task/task_runner/standard_task_runner.py"]
S3 Remote Logging not working
<!-- Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions. Don't worry if they're not all applicable; just try to include what you can :-) If you need to include code snippets or logs, please put them in fenced code blocks. If they're super-long, please use the details tag like <details><summary>super-long log</summary> lots of stuff </details> Please delete these comment blocks before submitting the issue. --> <!-- IMPORTANT!!! PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE NEXT TO "SUBMIT NEW ISSUE" BUTTON!!! PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!! Please complete the next sections or the issue will be closed. These questions are the first thing we need to know to understand the context. --> **Apache Airflow version**: v2.0.0b3 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16.15 **Environment**: - **Cloud provider or hardware configuration**: AWS - **OS** (e.g. from /etc/os-release): - **Kernel** (e.g. `uname -a`): - **Install tools**: Custom Helm Chart - **Others**: **What happened**: S3 Remote Logging not working. Below is the stacktrace: ``` Running <TaskInstance: canary_dag.print_date 2020-12-09T19:46:17.200838+00:00 [queued]> on host canarydagprintdate-9fafada4409d4eafb5e6e9c7187810ae │ │ [2020-12-09 19:54:09,825] {s3_task_handler.py:183} ERROR - Could not verify previous log to append: 'NoneType' object is not callable │ │ Traceback (most recent call last): │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 179, in s3_write │ │ if append and self.s3_log_exists(remote_log_location): │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 141, in s3_log_exists │ │ return self.hook.check_for_key(remote_log_location) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper │ │ connection = self.get_connection(self.aws_conn_id) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection │ │ conn = Connection.get_connection_from_secrets(conn_id) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets │ │ conn = secrets_backend.get_connection(conn_id=conn_id) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper │ │ with create_session() as session: │ │ File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__ │ │ return next(self.gen) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session │ │ session = settings.Session() │ │ TypeError: 'NoneType' object is not callable │ │ [2020-12-09 19:54:09,826] {s3_task_handler.py:193} ERROR - Could not write logs to s3://my-favorite-airflow-logs/canary_dag/print_date/2020-12-09T19:46:17.200838+00:00/2.log │ │ Traceback (most recent call last): │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 190, in s3_write │ │ encrypt=conf.getboolean('logging', 'ENCRYPT_S3_LOGS'), │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper │ │ connection = self.get_connection(self.aws_conn_id) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, 
in get_connection │ │ conn = Connection.get_connection_from_secrets(conn_id) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets │ │ conn = secrets_backend.get_connection(conn_id=conn_id) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper │ │ with create_session() as session: │ │ File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__ │ │ return next(self.gen) │ │ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session │ │ session = settings.Session() │ │ TypeError: 'NoneType' object is not callable stream closed ``` **What you expected to happen** Able to see the task instance logs in the airflow UI being read from S3 remote location. **How to reproduce it**: Pulled the latest master and created an airflow image from the dockerfile mentioned in the repo.
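The traceback shows `settings.Session` is still `None` in the process that flushes task logs to S3, i.e. the ORM was never configured there. A hedged sketch of a defensive guard (not necessarily the fix that was merged):

```python
# Hedged sketch: make sure the ORM session factory exists in this process before the
# S3 task handler tries to look up the AWS connection.
from airflow import settings

if settings.Session is None:
    settings.configure_orm()   # (re)build the engine and session factory
```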
https://github.com/apache/airflow/issues/12969
https://github.com/apache/airflow/pull/13057
6bf9acb90fcb510223cadc1f41431ea5f57f0ca1
ab5f770bfcd8c690cbe4d0825896325aca0beeca
"2020-12-09T20:21:42Z"
python
"2020-12-14T16:28:01Z"
closed
apache/airflow
https://github.com/apache/airflow
12,912
["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"]
dagrun_timeout doesn't kill task instances on timeout
**Apache Airflow version**: 1.10.12 **What happened**: I created a DAG with dagrun_timeout=2 minutes. After 2 minutes the dagrun is marked as failed and the next one is started, but the task keeps running. **What you expected to happen**: The task is killed along with the dag run, as happens when you mark a dagrun failed manually. **How to reproduce it**: ``` dag = DAG(dag_id='platform.airflow-test', description='', schedule_interval="0 0 * * *", start_date=datetime(2020, 7, 1), max_active_runs=1, catchup=True, dagrun_timeout=timedelta(minutes=2)) run_this = BashOperator( task_id='run_after_loop', bash_command=' for((i=1;i<=600;i+=1)); do echo "Welcome $i times"; sleep 1; done', dag=dag, ) ```
https://github.com/apache/airflow/issues/12912
https://github.com/apache/airflow/pull/14321
9f37af25ae7eb85fa8dbb70b7dbb23bbd5505323
09327ba6b371aa68cf681747c73a7a0f4968c173
"2020-12-08T09:36:09Z"
python
"2021-03-05T00:45:06Z"
closed
apache/airflow
https://github.com/apache/airflow
12,909
[".github/workflows/scheduled_quarantined.yml"]
Quarantined Build is broken
It seems the script `./scripts/ci/tools/ci_check_if_tests_should_be_run.sh` was removed from the code between releases 1.10.12 & 1.10.13, and since then the Quarantined Build has been broken: https://github.com/apache/airflow/actions/runs/405827008 cc - @potiuk
https://github.com/apache/airflow/issues/12909
https://github.com/apache/airflow/pull/13288
c2bedd580c3dd0e971ac394be25e331ba9c1c932
c4809885ecd7ec1a92a1d8d0264234d86479bf24
"2020-12-08T05:29:46Z"
python
"2020-12-23T17:52:30Z"
closed
apache/airflow
https://github.com/apache/airflow
12,881
["Dockerfile", "Dockerfile.ci", "IMAGES.rst", "scripts/in_container/_in_container_utils.sh", "scripts/in_container/run_ci_tests.sh", "scripts/in_container/run_install_and_test_provider_packages.sh", "scripts/in_container/run_prepare_provider_readme.sh", "setup.py", "tests/providers/presto/hooks/test_presto.py"]
Snowflake python connector monkeypatches urllib and makes many services unusable.
Currently, when you run the Snowflake provider, it monkeypatches urllib in a way that is not compatible with other libraries (for example Presto SSL with Kerberos, Google, Amazon, Qubole and many others). This is not critical, as in 2.0 we have provider separation and the Snowflake code will not even be there until you choose the [snowflake] extra or install the provider manually. For now we decided to release but immediately yank the Snowflake provider! Additional links: * Issue: https://github.com/snowflakedb/snowflake-connector-python/issues/324 Offending code: * https://github.com/snowflakedb/snowflake-connector-python/blob/133d6215f7920d304c5f2d466bae38127c1b836d/src/snowflake/connector/network.py#L89-L92
https://github.com/apache/airflow/issues/12881
https://github.com/apache/airflow/pull/13654
821194beead51868ce360dfc096dbab91760cc37
6e90dfc38b1bf222f47acc2beb1a6c7ceccdc8dc
"2020-12-07T12:47:04Z"
python
"2021-01-16T11:52:56Z"
closed
apache/airflow
https://github.com/apache/airflow
12,877
["setup.cfg"]
ImportError: cannot import name '_Union' from 'typing' (/usr/lib/python3.9/typing.py)
**Apache Airflow version**: 1.10.3 **Environment**: - **OS** (e.g. from /etc/os-release): Arch Linux - **Kernel** (e.g. `uname -a`): Linux 5.9.11-arch2-1 #1 SMP PREEMPT Sat, 28 Nov 2020 02:07:22 +0000 x86_64 GNU/Linux - **Install tools**: pip 2.3.1 (with _--use-deprecated legacy-resolver_) - **Others**: python 3.9 **What happened**: ``` (env) ➜ project-airflow git:(feature-implementation) ✗ ./env/bin/airflow webserver Traceback (most recent call last): File "/home/user/dev/project-airflow/./env/bin/airflow", line 26, in <module> from airflow.bin.cli import CLIFactory File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/bin/cli.py", line 95, in <module> api_module = import_module(conf.get('cli', 'api_client')) # type: Any File "/usr/lib/python3.9/importlib/__init__.py", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/api/client/local_client.py", line 24, in <module> from airflow.api.common.experimental import delete_dag File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/api/common/experimental/delete_dag.py", line 26, in <module> from airflow.models.serialized_dag import SerializedDagModel File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/models/serialized_dag.py", line 35, in <module> from airflow.serialization.serialized_objects import SerializedDAG File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 28, in <module> import cattr File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/__init__.py", line 2, in <module> from .converters import Converter, UnstructureStrategy File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/converters.py", line 15, in <module> from ._compat import ( File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/_compat.py", line 87, in <module> from typing import _Union ImportError: cannot import name '_Union' from 'typing' (/usr/lib/python3.9/typing.py) ``` **How to reproduce it**: Try launch airflow webserver with python **3.9** **Anything else we need to know**: --
https://github.com/apache/airflow/issues/12877
https://github.com/apache/airflow/pull/13223
f95b1c9c95c059e85ad5676daaa191929785fee2
9c0a5df22230105eb3a571c040daaba3f9cadf37
"2020-12-07T10:19:45Z"
python
"2020-12-21T20:36:54Z"
closed
apache/airflow
https://github.com/apache/airflow
12,876
["airflow/models/baseoperator.py", "tests/core/test_core.py", "tests/models/test_baseoperator.py"]
Improve error message when template fields contain invalid entries
**Description** When a new operator is defined there are some issues with `template_fields` that produce somewhat misleading error messages: * `template_fields = ('myfield')`, which of course is incorrect because that is not a tuple; you need to use `('myfield',)` or `['myfield']` * `template_field = ['field_with_a_typo']` As I said, both give errors (at different stages) but the errors are a bit cryptic. For example: ``` [2020-12-07 09:50:45,088] {taskinstance.py:1150} ERROR - 'SFTPToS3Operator2' object has no attribute 's3_key' Traceback (most recent call last): File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 965, in _run_raw_task self.render_templates(context=context) File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1424, in render_templates self.task.render_template_fields(context) File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 719, in render_template_fields self._do_render_template_fields(self, self.template_fields, context, jinja_env, set()) File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 724, in _do_render_template_fields content = getattr(parent, attr_name) AttributeError: 'SFTPToS3Operator2' object has no attribute 's3_key' ``` where there is no mention that this is related to `template_fields` **Use case / motivation** In order to have a better experience developing plugins I would like: * A warning / error if a str is used for `template_fields = 'myfield'`. It's very unlikely that anyone wants to use `myfield` as the sequence `['m','y', 'f','i','e','l','d']`. * A more specific error message in `_run_raw_task` if `template_fields` contains attributes that are not present.
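The two pitfalls above are easy to reproduce with a minimal custom operator. The sketch below is illustrative only (the operator and field names are made up, not taken from the issue):

```python
from airflow.models.baseoperator import BaseOperator


class MyOperator(BaseOperator):
    # Wrong: ('my_field') is just the string 'my_field', so Jinja treats it as a
    # sequence of one-character field names ('m', 'y', '_', ...).
    # template_fields = ('my_field')

    # Wrong: a typo'd name only surfaces later as a cryptic AttributeError
    # raised while rendering templates.
    # template_fields = ['my_feild']

    # Correct: a real tuple whose entries match attributes set in __init__.
    template_fields = ('my_field',)

    def __init__(self, my_field: str, **kwargs):
        super().__init__(**kwargs)
        self.my_field = my_field

    def execute(self, context):
        self.log.info("my_field rendered to: %s", self.my_field)
```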
https://github.com/apache/airflow/issues/12876
https://github.com/apache/airflow/pull/21054
e97b72994f18e40e302ba8a14dbe73d34846a557
beb2a2081a800650bc9a5e602b7216166582f67f
"2020-12-07T10:06:14Z"
python
"2022-01-26T16:44:57Z"
closed
apache/airflow
https://github.com/apache/airflow
12,861
[".github/workflows/build-images-workflow-run.yml", ".github/workflows/ci.yml", "scripts/ci/tools/ci_free_space_on_ci.sh", "tests/cli/commands/test_jobs_command.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_taskinstance.py", "tests/test_utils/asserts.py"]
[QUARANTINE] TestSchedulerJob.test_scheduler_task_start_date
The test fails quite regularly - usually in one of many jobs: Example https://github.com/apache/airflow/pull/12850/checks?check_run_id=1506517544 ``` ______________ TestSchedulerJob.test_scheduler_task_start_date ________________ self = <tests.jobs.test_scheduler_job.TestSchedulerJob testMethod=test_scheduler_task_start_date> def test_scheduler_task_start_date(self): """ Test that the scheduler respects task start dates that are different from DAG start dates """ dagbag = DagBag(dag_folder=os.path.join(settings.DAGS_FOLDER, "no_dags.py"), include_examples=False) dag_id = 'test_task_start_date_scheduling' dag = self.dagbag.get_dag(dag_id) dag.is_paused_upon_creation = False dagbag.bag_dag(dag=dag, root_dag=dag) # Deactivate other dags in this file so the scheduler doesn't waste time processing them other_dag = self.dagbag.get_dag('test_start_date_scheduling') other_dag.is_paused_upon_creation = True dagbag.bag_dag(dag=other_dag, root_dag=other_dag) dagbag.sync_to_db() scheduler = SchedulerJob(executor=self.null_exec, subdir=dag.fileloc, num_runs=2) scheduler.run() session = settings.Session() tiq = session.query(TaskInstance).filter(TaskInstance.dag_id == dag_id) ti1s = tiq.filter(TaskInstance.task_id == 'dummy1').all() ti2s = tiq.filter(TaskInstance.task_id == 'dummy2').all() self.assertEqual(len(ti1s), 0) > self.assertEqual(len(ti2s), 2) E AssertionError: 1 != 2 tests/jobs/test_scheduler_job.py:2415: AssertionError ```
https://github.com/apache/airflow/issues/12861
https://github.com/apache/airflow/pull/14792
3f61df11e7e81abc0ac4495325ccb55cc1c88af4
45cf89ce51b203bdf4a2545c67449b67ac5e94f1
"2020-12-06T19:43:03Z"
python
"2021-03-18T13:01:10Z"
closed
apache/airflow
https://github.com/apache/airflow
12,852
["IMAGES.rst", "README.md"]
The README file in this repo has a bad link - [404:NotFound] "production-deployment.html"
**Apache Airflow version**: N/A **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A **Environment**: N/A - **Cloud provider or hardware configuration**: N/A - **OS** (e.g. from /etc/os-release): N/A - **Kernel** (e.g. `uname -a`): N/A - **Install tools**: N/A - **Others**: N/A **What happened**: The link under “Latest docs” gives Status code [404:NotFound] - Link: https://github.com/apache/airflow/blob/master/docs/production-deployment.html **What you expected to happen**: The link should point to an actual file. The closest named file I could find is “https://github.com/apache/airflow/blob/master/docs/apache-airflow/production-deployment.rst” But I was not sure if this is what the link should be pointing to or not(??) **How to reproduce it**: Click the link on the main page of this repo. **Anything else we need to know**: This bad link was found by a tool I recently created as part of a new experimental hobby project: https://github.com/MrCull/GitHub-Repo-ReadMe-Dead-Link-Finder Re-check this Repo via: http://githubreadmechecker.com/Home/Search?SingleRepoUri=https%3a%2f%2fgithub.com%2fapache%2fairflow Check all Repos for this GitHub account: http://githubreadmechecker.com/Home/Search?User=apache -- I (a human) verified that this link is broken and have manually logged this Issue (i.e. this Issue has not been created by a bot). If this has been in any way helpful then please consider giving the above Repo a Star. If you have any feedback on the information provided here, or on the tool itself, then please feel free to share your thoughts and pass on the feedback, or log an “Issue”.
https://github.com/apache/airflow/issues/12852
https://github.com/apache/airflow/pull/12854
a00f25011fc6c859b27b6c78b9201880cf6323ce
3663d1519eb867b6bb152b27b93033666993511a
"2020-12-06T13:53:21Z"
python
"2020-12-07T00:05:21Z"
closed
apache/airflow
https://github.com/apache/airflow
12,832
["dev/README_RELEASE_AIRFLOW.md", "dev/sign.sh"]
Source hash apache-airflow-1.10.13-bin.tar.gz.sha512 format is invalid
**Description** The sha512 checksum file for apache-airflow releases is in an unexpected format for python-based checksum modules. **Current file format:** apache-airflow-1.10.13rc1-bin.tar.gz: 36D641C0 F2AAEC4E BCE91BD2 66CE2BC6 AA2D995C 08C9B62A 0EA1CBEC 027E657B 8AF4B54E 6C3AD117 9634198D F6EA53F8 163711BA 95586B5B 7BCF7F4B 098A19E2 **Wanted formats** xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx amd64\apache-airflow-1.10.13-bin.tar.gz **Or** `xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx` **Use case / motivation** The Ansible and Salt Python libraries that consume checksums do not understand the format: ``` ID: airflow-archive-install Function: archive.extracted Name: /opt/apache-airflow-1.10.13/bin/ Result: False Comment: Attempt 1: Returned a result of "False", with the following comment: "Source hash https://github.com/apache/airflow/releases/download/1.10.13/apache-airflow-1.10.13-bin.tar.gz.sha512 format is invalid. The supported formats are: 1) a hash, 2) an expression in the format <hash_type>=<hash>, or 3) either a path to a local file containing hashes, or a URI of a remote hash file. Supported protocols for remote hash files are: salt, file, http, https, ftp, swift, s3. The hash may also not be of a valid length, the following are supported hash types and lengths: md5 (32), sha1 (40), sha224 (56), sha256 (64), sha384 (96), sha512 (128)." ......etc Started: 11:39:44.082079 Duration: 123506.098 ms ``` **Related Issues** No
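For illustration, here is a small hedged sketch (the file name is just an example) of producing the single-line `<hash>  <file>` layout shown under "Wanted formats", which these tools can parse:

```python
import hashlib


def sha512_line(path: str) -> str:
    """Return '<sha512-hex>  <path>', the same layout the `sha512sum` tool emits."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return f"{digest.hexdigest()}  {path}"


# Example usage (assumes the release artifact has been downloaded locally):
# print(sha512_line("apache-airflow-1.10.13-bin.tar.gz"))
```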
https://github.com/apache/airflow/issues/12832
https://github.com/apache/airflow/pull/12867
298c88a434325dd6df8f374057709022e0b0811f
a00f25011fc6c859b27b6c78b9201880cf6323ce
"2020-12-05T12:01:35Z"
python
"2020-12-06T23:46:06Z"
closed
apache/airflow
https://github.com/apache/airflow
12,827
["airflow/config_templates/default_webserver_config.py", "docs/apache-airflow/security/webserver.rst"]
Missing docs about webserver_config.py
Hello, We are missing documentation on the `webserver_config.py` file. I think it is worth answering the following questions in this guide: * What is this file? * What is this file for? * When and how should you edit this file? Best regards, Kamil Breguła
https://github.com/apache/airflow/issues/12827
https://github.com/apache/airflow/pull/13155
23a47879ababe76f6cf9034a2bae055b2a91bf1f
81fed8072d1462ab43818bb7757ade4b67982976
"2020-12-05T05:45:30Z"
python
"2020-12-20T01:21:49Z"
closed
apache/airflow
https://github.com/apache/airflow
12,807
["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/configuration.py", "airflow/models/baseoperator.py", "tests/core/test_configuration.py", "tests/models/test_baseoperator.py"]
add default weight_rule to airflow.cfg
**Description** It would be nice if the weight_rule default value could be managed by a global config option. Suggested config: ``` # Weighting method used for the effective total priority weight of the task. # Options are: { downstream | upstream | absolute }. The default is: default_weight_rule = downstream ``` **Use case / motivation** In some pipelines you really need to have absolute weight, and then you have to add a line to each task definition, which is annoying.
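To make the motivation concrete, here is a hedged sketch (DAG and task names are placeholders) of what currently has to be repeated on every task because there is no global default:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG("weight_rule_example", start_date=datetime(2020, 12, 1), schedule_interval=None) as dag:
    BashOperator(
        task_id="t1",
        bash_command="echo hello",
        weight_rule="absolute",  # must be set per task today; the request is a config-level default
    )
    BashOperator(
        task_id="t2",
        bash_command="echo world",
        weight_rule="absolute",
    )
```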
https://github.com/apache/airflow/issues/12807
https://github.com/apache/airflow/pull/18627
d0ffd31ba3a4e8cd27fb7305cc19c33cf637509f
d79f506213297dc0dc034d6df3226361b6f95d7a
"2020-12-04T10:02:56Z"
python
"2021-09-30T14:53:55Z"
closed
apache/airflow
https://github.com/apache/airflow
12,806
["airflow/cli/commands/db_command.py"]
'NoneType' object has no attribute 'wait' with airflow db shell on SQLite
**Apache Airflow version**: 2.0.0b3 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: - **Cloud provider or hardware configuration**: - **OS** (e.g. from /etc/os-release): Ubuntu 20.04.1 LTS - **Kernel** (e.g. `uname -a`): Linux airflowvm 5.4.0-56-generic #62-Ubuntu SMP Mon Nov 23 19:20:19 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux - **Install tools**: - **Others**: **What happened**: After connecting to SQLite with `airflow db shell` And exiting the shell with`.quit` I got the following error: ``` [2020-12-04 07:31:28,506] {process_utils.py:149} INFO - Executing cmd: sqlite3 /home/airflow/airflow/airflow.db SQLite version 3.31.1 2020-01-27 19:55:54 Enter ".help" for usage hints. sqlite> ; sqlite> .quit Traceback (most recent call last): File "/home/airflow/sandbox/bin/airflow", line 8, in <module> sys.exit(main()) File "/home/airflow/sandbox/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main args.func(args) File "/home/airflow/sandbox/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 50, in command return func(*args, **kwargs) File "/home/airflow/sandbox/lib/python3.8/site-packages/airflow/utils/cli.py", line 86, in wrapper return f(*args, **kwargs) File "/home/airflow/sandbox/lib/python3.8/site-packages/airflow/cli/commands/db_command.py", line 78, in shell execute_interactive(["sqlite3", url.database]).wait() AttributeError: 'NoneType' object has no attribute 'wait' ``` **What you expected to happen**: No error when exiting the session. Looks like this `execute_interactive(["sqlite3", url.database])` returns `None` **How to reproduce it**: ``` airflow db shell sqlite> .quit ``` **Anything else we need to know**: I love this new command :)
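Based on the traceback above, a minimal defensive sketch (not necessarily the actual patch that fixed this) would be to only call `.wait()` when `execute_interactive()` actually returns a process handle; the database path is taken from the log above:

```python
from airflow.utils.process_utils import execute_interactive

# execute_interactive() may hand control of the terminal to the child process
# and return None, so guard before calling .wait() on the result.
proc = execute_interactive(["sqlite3", "/home/airflow/airflow/airflow.db"])
if proc is not None:
    proc.wait()
```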
https://github.com/apache/airflow/issues/12806
https://github.com/apache/airflow/pull/13907
c2266aac489b126638b3403b7a1ff0d2a9368056
0d1c39ad2d1e8344af413041b3bb6834d1b56778
"2020-12-04T07:40:22Z"
python
"2021-01-26T12:23:38Z"
closed
apache/airflow
https://github.com/apache/airflow
12,796
["airflow/providers/http/sensors/http.py"]
Make headers templated in HttpSensor
**Description** Make HttpSensor `headers` parameter templated. **Use case / motivation** This would allow for passing data from other tasks, such as an API token, in the headers. **Related Issues** N/A
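A hedged sketch of how the requested behaviour could look once `headers` is templated (the connection id, endpoint and upstream `get_token` task are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.http.sensors.http import HttpSensor

with DAG("http_sensor_templated_headers", start_date=datetime(2020, 12, 1), schedule_interval=None) as dag:
    wait_for_api = HttpSensor(
        task_id="wait_for_api",
        http_conn_id="my_http_conn",
        endpoint="health",
        # The Jinja expression below only renders once `headers` is added to
        # template_fields, which is what this issue requests.
        headers={"Authorization": "Bearer {{ ti.xcom_pull(task_ids='get_token') }}"},
        poke_interval=60,
    )
```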
https://github.com/apache/airflow/issues/12796
https://github.com/apache/airflow/pull/12809
37afe55775676e2cb4cf6ed0cfc6c892855d6805
c1cd50465c5473bc817fded5eeb4c425a0529ae5
"2020-12-03T20:57:43Z"
python
"2020-12-05T00:59:52Z"
closed
apache/airflow
https://github.com/apache/airflow
12,785
["airflow/operators/python.py", "airflow/plugins_manager.py", "airflow/utils/python_virtualenv_script.jinja2", "tests/plugins/test_plugins_manager.py"]
Macros added through plugins can not be used within Jinja templates in Airflow 2.0
**Apache Airflow version**: 2.0.0b3 **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A **Environment**: - **OS** (e.g. from /etc/os-release): Debian GNU/Linux 10 (buster) - **Kernel** (e.g. `uname -a`): Linux 6ae65b86e112 5.4.0-52-generic #57-Ubuntu SMP Thu Oct 15 10:57:00 UTC 2020 x86_64 GNU/Linux - **Others**: Python 3.8 **What happened**: At JW Player we add additional macros to Airflow through a plugin. The definition of this plugin looks like the following (simplified): ``` from airflow.plugins_manager import AirflowPlugin from utils_plugin.macros.convert_image_tag import convert_image_tag class JwUtilsPlugin(AirflowPlugin): name = 'jw_utils' macros = [convert_image_tag] ``` `convert_image_tag` is a function that takes a string (a docker tag) as argument and resolves it to a SHA-256 hash that uniquely identifies an image by querying the docker registry. I.e. it is a function that takes a string as argument and returns a string. In Airflow 1.10.x we can successfully use this macro in our DAGs to resolve image tags to SHA-256 hashes, e.g. the following DAG will run an Alpine Image using a DockerOperator: ```python from datetime import datetime, timedelta from airflow import DAG try: from airflow.providers.docker.operators.docker import DockerOperator except ModuleNotFoundError: from airflow.operators.docker_operator import DockerOperator now = datetime.now() with DAG('test_dag', schedule_interval='*/15 * * * *', default_args={ 'owner': 'airflow', 'start_date': datetime.utcnow() - timedelta(hours=1), 'task_concurrency': 1, 'execution_timeout': timedelta(minutes=5) }, max_active_runs=1) as dag: task_sleep = DockerOperator( task_id='task_sleep', image=f"{{ macros.jw_utils.convert_image_tag('alpine') }}", command=['sleep', '10'] ) ``` This is in contrast to Airflow 2.0, if we attempt to use our custom macro here, then when Airflow attempts to render the task template it will error out with the following error: ``` [2020-12-03 12:54:43,666] {{taskinstance.py:1402}} ERROR - 'module object' has no attribute 'jw_utils' Traceback (most recent call last): File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1087, in _run_raw_task self._prepare_and_execute_task_with_callbacks(context, task) File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1224, in _prepare_and_execute_task_with_callbacks self.render_templates(context=context) File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1690, in render_templates self.task.render_template_fields(context) File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 857, in render_template_fields self._do_render_template_fields(self, self.template_fields, context, jinja_env, set()) File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 870, in _do_render_template_fields rendered_content = self.render_template(content, context, jinja_env, seen_oids) File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 907, in render_template return jinja_env.from_string(content).render(**context) File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 1090, in render self.environment.handle_exception() File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 832, in handle_exception reraise(*rewrite_traceback_stack(source=source)) File "/usr/local/lib/python3.8/site-packages/jinja2/_compat.py", line 28, in reraise raise 
value.with_traceback(tb) File "<template>", line 1, in top-level template code File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 471, in getattr return getattr(obj, attribute) jinja2.exceptions.UndefinedError: 'module object' has no attribute 'jw_utils' ``` **What you expected to happen**: I would have expected that the DAG definition from above would have worked in Airflow 2.0, like it would have functioned in Airflow 1.10.x. **How to reproduce it**: This bug can be reproduced by creating a plugin that adds a macro, and then attempting to use that macro in a DAG. **Anything else we need to know**: In order to better understand the issue, I did a bit of digging. The plugin that we extend Airflow's functionality with has its own suite of pytest testcases. Since we are in the process of preparing for a transition to Airflow 2.0 we are now running the unit tests for this plugin against both Airflow 1.10.x and Airflow 2.0.0b3. After reviewing how plugins were being loaded in Airflow, I've added the following testcase to mimic how plugins were being loaded and how [`get_template_context()`](https://github.com/apache/airflow/blob/2.0.0b3/airflow/models/taskinstance.py#L1481) in Airflow 2.0 ensures that plugins have been imported: ```python def test_macro_namespacing(is_airflow_1): """ Tests whether macros can be loaded from Airflow's namespace after loading plugins. """ from airflow import macros if not is_airflow_1: # In Airflow 2.x, we need to make sure we invoke integrate_macros_plugins(), otherwise # the namespace will not be created properly. from airflow.plugins_manager import integrate_macros_plugins integrate_macros_plugins() from utils_plugin.plugin import JwUtilsPlugin # After Airflow has loaded the plugins, the macros should be available as airflow.macros.jw_utils. macros_module = import_module(f"airflow.macros.{JwUtilsPlugin.name}") for macro in JwUtilsPlugin.macros: # Verify that macros have been registered correctly. assert hasattr(macros_module, macro.__name__) # However, in order for the module to actually be allowed to be used in templates, it must also exist on # airflow.macros. assert hasattr(macros, 'jw_utils') ``` This test case passes when being ran on Airflow 1.10, but surprisngly enough it fails on Airflow 2.x. Specifically it fails on the `assert hasattr(macros, 'jw_utils')` statement in Airflow 2.0. This statement tests whether the macros that we create through the `JwUtilsPlugin` have been properly added to `airflow.macros`. I thought it was strange for the test-case to fail on this module, given that the `import_module()` statement succeeded in Airflow 2.0. After this observation I started comparing the logic for registering macros in Airflow 1.10.x to the Airflow 2.0.0 implementation. While doing this I observed that the plugin loading mechanism in Airflow 1.10.x works because Airflow [automatically discovers](https://github.com/apache/airflow/blob/1.10.13/airflow/__init__.py#L104) all plugins through the `plugins_manager` module. When this happens it automatically [initializes plugin-macro modules](https://github.com/apache/airflow/blob/1.10.13/airflow/plugins_manager.py#L306) in the `airflow.macros` namespace. Notably, after the plugin's module has been initialized it will also automatically be registered on the `airflow.macros` module [by updating the dictionary](https://github.com/apache/airflow/blob/1.10.13/airflow/macros/__init__.py#L93) returned by `globals()`. 
This is in contrast to Airflow 2.0, where plugins are no longer loaded automatically. Instead they are being loaded lazily, i.e. they will be loaded on-demand whenever a function needs them. In order to load macros (or ensure that macros have been loaded), modules need to import the [`integrate_macros_plugins`](https://github.com/apache/airflow/blob/2.0.0b3/airflow/plugins_manager.py#L395) function from `airflow.plugins_manager`. When Airflow attempts to prepare a template context, prior to running a task, it properly imports this function and invokes it in [taskinstance.py](https://github.com/apache/airflow/blob/2.0.0b3/airflow/models/taskinstance.py#L1483). However, in contrast to the old 1.10.x implementation, this function does not update the symbol table of `airflow.macros`. The result of this is that the macros from the plugin _will in fact_ be imported, but because `airflow.macros` symbol table itself is not being updated, the macros that are being added by the plugins can not be used in the template rendering context. I believe this issue could be solved by ensuring that `integrate_macros_plugins` sets a reference to the `airflow.macros.jw_utils` as `jw_utils` on the `airflow.macros` module. Once that has been done I believe macros provided through plugins are functional again.
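A rough sketch of the workaround described above (assuming the `jw_utils` plugin from this report is installed; this is not the actual fix that landed): after `integrate_macros_plugins()` has created the `airflow.macros.jw_utils` module, also attach it to the parent `airflow.macros` module so Jinja's attribute lookup succeeds:

```python
import importlib

import airflow.macros
from airflow.plugins_manager import integrate_macros_plugins

integrate_macros_plugins()

# Make `macros.jw_utils.<macro>` resolvable from templates by updating the
# symbol table of airflow.macros, mirroring what 1.10.x did via globals().
plugin_macros = importlib.import_module("airflow.macros.jw_utils")
setattr(airflow.macros, "jw_utils", plugin_macros)
```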
https://github.com/apache/airflow/issues/12785
https://github.com/apache/airflow/pull/12788
f66a46db88da86b4a11c5ee142c09a5001c32c41
29d78489e76c292c2ca74cab02141c2bcff2aabc
"2020-12-03T14:52:01Z"
python
"2020-12-07T22:34:14Z"
closed
apache/airflow
https://github.com/apache/airflow
12,783
["airflow/models/baseoperator.py", "airflow/sensors/base_sensor_operator.py", "airflow/serialization/schema.json", "airflow/serialization/serialized_objects.py", "tests/serialization/test_dag_serialization.py"]
Sensors in reschedule mode are not rescheduled
**Apache Airflow version**: 2.0.0dev **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): **Environment**: ``` ./breeze --python=3.8 --backend=postgres --db-reset restart ``` **What happened**: Sensors in reschedule mode are not rescheduled by scheduler. **What you expected to happen**: Sensors in both poke and reschedule mode should work. **How to reproduce it**: ``` from airflow import DAG from airflow.sensors.base_sensor_operator import BaseSensorOperator from airflow.utils.dates import days_ago class DummySensor(BaseSensorOperator): def poke(self, context): return False with DAG( "other_dag", start_date=days_ago(1), schedule_interval="*/5 * * * *", catchup=False ) as dag3: DummySensor( task_id='wait-task', poke_interval=60 * 5, mode='reschedule' ) ``` Then: ``` root@053f6ca34e24: /opt/airflow# airflow dags unpause other_dag Dag: other_dag, paused: False root@053f6ca34e24: /opt/airflow# airflow scheduler ____________ _____________ ____ |__( )_________ __/__ /________ __ ____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / / ___ ___ | / _ / _ __/ _ / / /_/ /_ |/ |/ / _/_/ |_/_/ /_/ /_/ /_/ \____/____/|__/ [2020-12-03 14:18:58,404] {scheduler_job.py:1247} INFO - Starting the scheduler [2020-12-03 14:18:58,404] {scheduler_job.py:1252} INFO - Processing each file at most -1 times [2020-12-03 14:18:58,571] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 63835 [2020-12-03 14:18:58,576] {scheduler_job.py:1757} INFO - Resetting orphaned tasks for active dag runs [2020-12-03 14:18:58,660] {settings.py:52} INFO - Configured default timezone Timezone('UTC') [2020-12-03 14:18:58,916] {scheduler_job.py:944} INFO - 1 tasks up for execution: <TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]> [2020-12-03 14:18:58,920] {scheduler_job.py:973} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued [2020-12-03 14:18:58,921] {scheduler_job.py:1001} INFO - DAG other_dag has 0/16 running and queued tasks [2020-12-03 14:18:58,921] {scheduler_job.py:1066} INFO - Setting the following tasks to queued state: <TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]> [2020-12-03 14:18:58,925] {scheduler_job.py:1108} INFO - Sending TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 1 and queue default [2020-12-03 14:18:58,926] {base_executor.py:79} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'other_dag', 'wait-task', '2020-12-03T14:10:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/files/dags/the_old_issue.py'] [2020-12-03 14:18:58,935] {local_executor.py:80} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'other_dag', 'wait-task', '2020-12-03T14:10:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/files/dags/the_old_issue.py'] [2020-12-03 14:18:59,063] {dagbag.py:440} INFO - Filling up the DagBag from /files/dags/the_old_issue.py Running <TaskInstance: other_dag.wait-task 2020-12-03T14:10:00+00:00 [queued]> on host 053f6ca34e24 [2020-12-03 14:19:00,022] {scheduler_job.py:944} INFO - 1 tasks up for execution: <TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]> [2020-12-03 14:19:00,029] {scheduler_job.py:973} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued [2020-12-03 14:19:00,029] {scheduler_job.py:1001} INFO - 
DAG other_dag has 0/16 running and queued tasks [2020-12-03 14:19:00,029] {scheduler_job.py:1066} INFO - Setting the following tasks to queued state: <TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]> [2020-12-03 14:19:00,033] {scheduler_job.py:1108} INFO - Sending TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 1 and queue default [2020-12-03 14:19:00,033] {base_executor.py:82} ERROR - could not queue task TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1) [2020-12-03 14:19:00,038] {scheduler_job.py:1199} INFO - Executor reports execution of other_dag.wait-task execution_date=2020-12-03 14:10:00+00:00 exited with status success for try_number 1 [2020-12-03 14:19:00,045] {scheduler_job.py:1235} ERROR - Executor reports task instance <TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [queued]> finished (success) although the task says its queued. (Info: None) Was the task killed externally? [2020-12-03 14:19:01,173] {dagrun.py:429} ERROR - Marking run <DagRun other_dag @ 2020-12-03 14:10:00+00:00: scheduled__2020-12-03T14:10:00+00:00, externally triggered: False> failed ``` **Anything else we need to know**: Discovered when working on #10790 Thank @nathadfield for helping discover this issue!
https://github.com/apache/airflow/issues/12783
https://github.com/apache/airflow/pull/12858
75d8ff96b4e7736b177c3bb8e949653d6a501736
c045ff335eecb5c72aeab9e7f01973c18f678ff7
"2020-12-03T13:52:28Z"
python
"2020-12-06T21:55:53Z"
closed
apache/airflow
https://github.com/apache/airflow
12,780
["PULL_REQUEST_WORKFLOW.rst", "scripts/ci/selective_ci_checks.sh"]
K8S tests were not run on CLI change
In https://github.com/apache/airflow/pull/12725 selective checks did not run K8S tests.
https://github.com/apache/airflow/issues/12780
https://github.com/apache/airflow/pull/13305
e9d65bd4582b083914f2fc1213bea44cf41d1a08
e2bfac9fc874a6dd1eb52a067313f43ec94307e3
"2020-12-03T09:36:39Z"
python
"2020-12-24T14:48:57Z"
closed
apache/airflow
https://github.com/apache/airflow
12,776
["airflow/migrations/versions/4addfa1236f1_add_fractional_seconds_to_mysql_tables.py", "airflow/migrations/versions/d2ae31099d61_increase_text_size_for_mysql.py", "airflow/migrations/versions/e959f08ac86c_change_field_in_dagcode_to_mediumtext_.py", "airflow/models/dagcode.py"]
Update source_code field of dag_code table to MEDIUMTEXT
**Description** Update the source_code field of the dag_code table to MEDIUMTEXT. **Use case / motivation** Lots of DAGs exceed the 65K character limit, giving the error `"Data too long for column 'source_code' at row 1"` when the webserver is enabled to fetch dag_code from the db.
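A hedged sketch of what such a change could look like as an Alembic migration (revision bookkeeping omitted; the MySQL-only guard is an assumption, since other databases do not need MEDIUMTEXT):

```python
from alembic import op
from sqlalchemy.dialects import mysql


def upgrade():
    conn = op.get_bind()
    # Only MySQL has the 65K TEXT limit described above.
    if conn.dialect.name == "mysql":
        op.alter_column(
            table_name="dag_code",
            column_name="source_code",
            type_=mysql.MEDIUMTEXT,
            nullable=False,
        )
```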
https://github.com/apache/airflow/issues/12776
https://github.com/apache/airflow/pull/12890
b11551278a703e2e742969ac554908f16f235809
f66a46db88da86b4a11c5ee142c09a5001c32c41
"2020-12-03T08:42:01Z"
python
"2020-12-07T22:17:22Z"
closed
apache/airflow
https://github.com/apache/airflow
12,774
["CONTRIBUTING.rst", "docs/apache-airflow/cli-and-env-variables-ref.rst"]
Missing AIRFLOW__{SECTION}__{OPTION}__SECRET in environment variable reference
Hello, One env variable - `AIRFLOW__{SECTION}__{OPTION}__SECRET` - has not been added to our [environment variables reference](https://github.com/apache/airflow/blob/master/docs/apache-airflow/cli-and-env-variables-ref.rst). For more info, see: http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/howto/set-config.html Related: https://github.com/apache/airflow/issues/12773 https://github.com/apache/airflow/issues/12772
https://github.com/apache/airflow/issues/12774
https://github.com/apache/airflow/pull/12797
4da94b5a19eb547e86cebf074078ba6f03a51db1
292118e33971dfd68cb32a404a85c0d46d225b40
"2020-12-03T08:21:50Z"
python
"2020-12-04T00:58:17Z"
closed
apache/airflow
https://github.com/apache/airflow
12,773
["airflow/config_templates/config.yml", "docs/apache-airflow/configurations-ref.rst"]
Incomplete list of environment variables that override configuration
Hello, In our configuration reference docs, we provide information about the environment variables that affect the options. <img width="430" alt="Screenshot 2020-12-03 at 09 14 33" src="https://user-images.githubusercontent.com/12058428/100982181-07389c00-3548-11eb-9089-fe00c4b9367f.png"> Unfortunately, this list is not complete. Some configuration options can also be set using `AIRFLOW__{SECTION}__{OPTION}__SECRET` or `AIRFLOW__{SECTION}__{OPTION}__CMD` env variable. See: http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/howto/set-config.html
https://github.com/apache/airflow/issues/12773
https://github.com/apache/airflow/pull/12820
e82cf0d01d6c1e1ec65d8e1b70d65158947fccd2
c85f49454de63f5857bf477a240229a71f0e78ff
"2020-12-03T08:17:58Z"
python
"2020-12-05T06:00:18Z"
closed
apache/airflow
https://github.com/apache/airflow
12,772
["airflow/configuration.py", "docs/apache-airflow/configurations-ref.rst", "docs/conf.py", "tests/core/test_configuration.py"]
Missing docs for deprecated configuration options
Hello, In Airflow 2, we've moved some configuration options to the new section. We also changed the names of some of the configuration options. This is confusing for users who are familiar with the old option and section names. It would be great if we could add information to the documentation that points to the new name of the options. https://github.com/apache/airflow/blob/8f48f12128e0d985c6de2603902524859fecbca8/airflow/configuration.py#L139-L169 > 'The {old} option in [{section}] has been renamed to {new} > 'The {old_key} option in [{old_section}] has been moved to the {new_key} option in ' > '[{new_section}] Best regards, Kamil Breguła
https://github.com/apache/airflow/issues/12772
https://github.com/apache/airflow/pull/13883
810c15ed85d7bcde8d5b8bc44e1cbd4859e29d2e
65e49fc56f32b3e815fdf4a17be6b4e1c1e43c11
"2020-12-03T08:11:14Z"
python
"2021-01-27T12:06:35Z"
closed
apache/airflow
https://github.com/apache/airflow
12,769
["docs/apache-airflow/upgrading-to-2.rst"]
Documentation needed for DB upgrade as part of 2.0
Following up on the dev call on 30th of November, there was a clear desire expressed for documentation around the database upgrade process from Airflow 1.10.14 (or equivalent) to Airflow 2.0. Though the upgrade process is conceptually no different from a regular 1.10.x to a 1.10.x+1 release, the fact that there are significant known database changes may raise concerns in the minds of Airflow users as part of the upgrade. To ease their concerns, the following questions should ideally be answered as part of the documentation specifically either as part of the "Upgrading to 2.0 document" or linked from there. Q 1. Is there anything "special" which I need to be done to upgrade from 1.10.x to 2.0 with respect to the database? Ans. I don't believe so, other than the normal upgrade checks. Q 2. How long should I expect this database upgrade expected to take? Ans. I am not quite sure how to answer this since it depends on the data. We can possibly share sample times based on tested data sets. Q 3. Can I do something to reduce the database upgrade time? Ans. A couple of options here. One possibility is to recommend the maintenance DAGs to be run to archive / delete older task history, xcom data, and equivalent. Another possibility is to provide a script for them to run as part of the Airflow project distribution, possibly part of upgrade check scripts.
https://github.com/apache/airflow/issues/12769
https://github.com/apache/airflow/pull/13005
3fbc8e650dcd398bc2844b7b3d92748423c7611a
0ffd5fa3d87e78126807e6cdb4b1b29154047153
"2020-12-03T01:57:37Z"
python
"2020-12-11T16:20:35Z"
closed
apache/airflow
https://github.com/apache/airflow
12,757
["airflow/models/baseoperator.py", "tests/utils/test_task_group.py"]
Graph View is empty when Operator has multiline string in args (v2.0)
Airflow v2.0b3 Kubernetes v1.19.3 Discovered the issue while testing KubernetesPodOperator (haven't tested with other operators). If I create a multiline string using `""" """`, add some variables inside (Jinja templating), then use this string as an argument to KubernetesPodOperator: - In Graph View the DAG is not visible (just a gray area where the digraph should be); - in the browser's web console I see the following error: `Uncaught TypeError: node is undefined preProcessGraph http://localhost:8080/static/dist/dagre-d3.min.js:103 preProcessGraph http://localhost:8080/static/dist/dagre-d3.min.js:103 fn http://localhost:8080/static/dist/dagre-d3.min.js:103 call http://localhost:8080/static/dist/d3.min.js:3 draw http://localhost:8080/graph?dag_id=mydag&execution_date=mydate expand_group http://localhost:8080/graph?dag_id=mydag&execution_date=mydate <anonymous> http://localhost:8080/graph?dag_id=mydag&execution_date=mydate` Tree view works without issues in this case. The DAG succeeds.
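A minimal reproduction sketch based on the description above (image, namespace and task names are placeholders, not from the report):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator

with DAG("graph_view_repro", start_date=datetime(2020, 12, 1), schedule_interval=None) as dag:
    # A multiline, Jinja-templated string passed as an operator argument.
    multiline_arg = """
    echo "run date: {{ ds }}"
    echo "dag: {{ dag.dag_id }}"
    """
    KubernetesPodOperator(
        task_id="task_multiline",
        name="task_multiline",
        namespace="default",
        image="alpine:3.12",
        cmds=["sh", "-c"],
        arguments=[multiline_arg],
    )
```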
https://github.com/apache/airflow/issues/12757
https://github.com/apache/airflow/pull/12829
cd66450b4ee2a219ddc847970255e420ed679700
12ce5be77f64c335dce12c3586d2dc7b63491d34
"2020-12-02T14:59:06Z"
python
"2020-12-05T11:52:55Z"
closed
apache/airflow
https://github.com/apache/airflow
12,751
["chart/templates/flower/flower-service.yaml", "chart/templates/webserver/webserver-service.yaml", "chart/tests/test_flower.py", "chart/tests/test_webserver.py", "chart/values.schema.json", "chart/values.yaml"]
Helm Chart: Provide option to specify loadBalancerIP in webserver service
**Description** The current service type for `webserver` defaults to `ClusterIP`. I am able to change it to the `LoadBalancer` type, but I was not able to specify a static IP. So every time we reinstall the chart, it changes the assigned IP of the load balancer being provisioned to us.
https://github.com/apache/airflow/issues/12751
https://github.com/apache/airflow/pull/15972
bb43e06c75dd6cafc094813347f7a7b13cb9374e
9875f640ca19dabd846c17f4278ccc90e189ae8d
"2020-12-02T04:19:48Z"
python
"2021-05-21T23:06:09Z"
closed
apache/airflow
https://github.com/apache/airflow
12,748
["codecov.yml"]
Code Coverage is Broken
https://codecov.io/github/apache/airflow?branch=master CodeCov code-coverage is broken on Master. It wasn't great, but it was still useful for checking which sections lacked tests. cc @potiuk
https://github.com/apache/airflow/issues/12748
https://github.com/apache/airflow/pull/13092
0eb210df3e10b478a567291355bc269150c93ae5
ae98c074032861b07d6945a8f6f493b319dcc374
"2020-12-01T23:46:36Z"
python
"2020-12-15T21:33:39Z"
closed
apache/airflow
https://github.com/apache/airflow
12,744
["setup.cfg", "setup.py"]
Difference of extras Airflow 2.0 vs. Airflow 1.10
**Description** When airflow 2.0 is installed from PyPI, providers are not installed by default. In order to install them, you should add an appropriate extra. While this behavior is identical to Airflow 1.10 for those "providers" that required additional packages, there were a few "providers" that did not require any extras to function (for example http, ftp) - we have "http" and "ftp" extras for them now, but maybe some of those are popular enough to be included by default? We have to make a decision now: - [x] should all of them (or some of them) be included by default when you install Airflow? - [x] if we decide to exclude only some (or none), we should document them in UPGRADING_to_2_0 and in the UPDATING documentation. **Use case / motivation** We want people to get a familiar experience when installing airflow. While we provide the familiar mechanism (extras), people will run into slightly different configuration and installation behaviour; we can describe the differences, but maybe some of those providers are so popular that we should include them by default? **Related Issues** #12685 - where we discuss which of the extras should be included in the Production Image of 2.0. **Additional info** Here is the list of all "providers" that were present in 1.10 and had no additional dependencies - so basically they would work out-of-the-box in 1.10, but they need an appropriate "extra" in 2.0. * "apache.pig": [], * "apache.sqoop": [], * "dingding": [], * "discord": [], * "ftp": [], * "http": [], * "imap": [], * "openfaas": [], * "opsgenie": [], * "sqlite": [], Also here I appeal to the wisdom of the crowd: @ashb, @dimberman, @kaxil, @turbaszek, @mik-laj, @XD-DENG, @feluelle, @eladkal, @ryw, @vikramkoka, @KevinYang21 - let me know WDYT before I bring it to the devlist?
https://github.com/apache/airflow/issues/12744
https://github.com/apache/airflow/pull/12916
9b39f24780e85f859236672e9060b2fbeee81b36
e7c1771cba16e3f554d6de5f77c97e49b16f7bed
"2020-12-01T18:44:37Z"
python
"2020-12-08T15:22:47Z"
closed
apache/airflow
https://github.com/apache/airflow
12,726
["docs/apache-airflow/tutorial_taskflow_api.rst"]
Add classic operator in TaskFlow API tutorial
**Description** The TaskFlow API tutorial should add an example using a classic operator (example: EmailOperator) so that users know that it can be leveraged. Alternatively, it should add references on how to add dependencies (implicit or explicit) to classic operators. **Use case / motivation** It's not super clear how the TaskFlow API can be used with existing operators (e.g. PostgresOperator, EmailOperator...). Adding an example will help users get a picture of what can be done with this.
https://github.com/apache/airflow/issues/12726
https://github.com/apache/airflow/pull/19214
2fdcb8a89cd1aaf1a90657385a257e58926c21a9
2dfe85dcb4923f1c4cce8b1570561f11cf07c186
"2020-12-01T00:27:48Z"
python
"2021-10-29T16:44:50Z"
closed
apache/airflow
https://github.com/apache/airflow
12,722
["airflow/hooks/dbapi_hook.py", "tests/hooks/test_dbapi_hook.py"]
No Support for special Characters in Passwords for get_uri() method in DbApiHook
Hello, I have recently noticed that a lot of connections with certain special characters don't work with SqlAlchemy as it requires the passwords to be [urlencoded when they contain special characters](https://stackoverflow.com/questions/1423804/writing-a-connection-string-when-password-contains-special-characters). Would there be any impact to changing the code at line 81 to urlencode the password like `urllib.parse.quote_plus(conn.password)` to prevent login failures for special characters? https://github.com/apache/airflow/blob/dee304b222d355b03794aa063f39e3ee13997730/airflow/hooks/dbapi_hook.py#L72-L88 I initially caught this issue while using the OdbcHook method found here. https://github.com/apache/airflow/blob/dee304b222d355b03794aa063f39e3ee13997730/airflow/providers/odbc/hooks/odbc.py#L198-L204 Happy to create a feature request, I just want to confirm that this issue makes sense.
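A small illustration (with hypothetical credentials) of why the reporter suggests `quote_plus`: without it, special characters in the password corrupt the SQLAlchemy URI:

```python
from urllib.parse import quote_plus

password = "p@ss/word:with#chars"

broken_uri = f"postgresql://user:{password}@dbhost:5432/mydb"
fixed_uri = f"postgresql://user:{quote_plus(password)}@dbhost:5432/mydb"

print(broken_uri)  # '@', '/', ':' and '#' confuse the URI parser
print(fixed_uri)   # postgresql://user:p%40ss%2Fword%3Awith%23chars@dbhost:5432/mydb
```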
https://github.com/apache/airflow/issues/12722
https://github.com/apache/airflow/pull/12775
4fb312140fc15b46fa96e98ec0e3939d81109eb6
01707d71d9d184d4c5b9602c93c2e46c9010d711
"2020-11-30T20:23:33Z"
python
"2020-12-07T16:51:09Z"
closed
apache/airflow
https://github.com/apache/airflow
12,716
["chart/templates/dags-persistent-volume-claim.yaml", "chart/templates/scheduler/scheduler-deployment.yaml", "chart/values.yaml"]
Scheduler in helm chart does not work with persistent + gitsync
**Apache Airflow version**: 2.0.0dev Due to this: https://github.com/apache/airflow/blob/5e13c372860a28256bf6e572bf7349f3dd6b8b0c/chart/templates/scheduler/scheduler-deployment.yaml#L156-L164 Doing this: https://github.com/apache/airflow/tree/master/chart#mounting-dags-using-git-sync-side-car-with-persistence-enabled ``` helm upgrade airflow . \ --set dags.persistence.enabled=false \ --set dags.gitSync.enabled=true ``` will fail with: ``` Error: Deployment.apps "airflow4-scheduler" is invalid: spec.template.spec.containers[0].volumeMounts[4].mountPath: Invalid value: "/opt/airflow/dags": must be unique ``` The reason is that if both ``` --set dags.persistence.enabled=false --set dags.gitSync.enabled=true ``` are specified then the volume mount is specified two times in scheduler definition.
https://github.com/apache/airflow/issues/12716
https://github.com/apache/airflow/pull/12717
8c5594b02ffbfc631ebc2366dbde6d8c4e56d550
e7a2e3544f216c6fba8ea4b344ecb6c89158c032
"2020-11-30T11:21:00Z"
python
"2021-02-08T16:39:56Z"
closed
apache/airflow
https://github.com/apache/airflow
12,692
["airflow/plugins_manager.py", "airflow/providers_manager.py", "setup.cfg", "tests/plugins/test_plugins_manager.py", "tests/www/test_views.py"]
Provider discovery based on entry_points is rather brittle
The tests we run in CI have shown that provider discovery based on entry_points is rather brittle. Example here: https://github.com/apache/airflow/pull/12466/checks?check_run_id=1467792592#step:9:4452 This is not a problem with Airflow, but with PIP, which might silently upgrade some packages and cause a "version conflict" totally independently of the Airflow configuration and totally out of our control. Simply installing a .whl package on top of an existing airflow installation (as happened in the case above) might cause inconsistent requirements. In the case above, installing the .whl packages with all providers on top of an existing Airflow installation caused the requests package to be upgraded to 2.25.0, even though airflow has the right requirement set. In this case it was (correctly, taken from the "install_requires" section of airflow's setup.cfg): ``` Requirement.parse('requests<2.24.0,>=2.20.0'), {'apache-airflow'} ``` If you have a version conflict in your env, running entry_point.load() from a package that has this version conflict results in a `pkg_resources.VersionConflict` (or `pkg_resources.ContextualVersionConflict`) error rather than returning the entry_point. Or at least that's what I have observed so far. It's rather easy to reproduce: simply install requests > 2.24.0 in the current airflow and see what happens. So far I could not find a way to mitigate this problem, but @ashb - since you have more experience with it, maybe you can find a workaround for this? I think we have a few options: 1) We fail 'airflow' hard if there is any version conflict. We have a way to do this now after I've implemented #10854 (and after @ephraimbuddy finishes #12188) - we have a good, maintainable list of non-conflicting dependencies for Airflow and its providers, and we can keep that in the future thanks to pip-check. But I am afraid that will give a hard time to people who would like to install airflow with some custom dependencies (Tensorflow, for example, is notoriously difficult to sync with Airflow when it comes to dependency versions). However, this is the most "Proper" (TM) solution. 2) We find a workaround for the entry_point.load() VersionConflict exception. However, I think that might not be possible or easy, looking for example at this SO thread: https://stackoverflow.com/questions/52982603/python-entry-point-fails-for-dependency-conflict . The most upvoted (=1) answer there starts with "Welcome to the world of dependencies hell! I know no clean way to solve this" - which is not very encouraging. I also tried to figure it out from the docs and code of entry_point.load(), but to no avail. @ashb - maybe you can help here. 3) We go back to my original implementation where I read provider info from a provider.yaml embedded into the package. This has the disadvantage of being non-standard, but it works independently of version conflicts. WDYT?
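A hedged sketch of option 2 above: tolerating the exception at load time instead of letting it bubble up (the entry-point group name is illustrative):

```python
import pkg_resources

providers = {}
for entry_point in pkg_resources.iter_entry_points("apache_airflow_provider"):
    try:
        providers[entry_point.name] = entry_point.load()
    except pkg_resources.VersionConflict as err:  # also catches ContextualVersionConflict
        # Skip the conflicting provider instead of failing the whole discovery.
        print(f"Skipping provider entry point {entry_point.name}: {err}")
```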
https://github.com/apache/airflow/issues/12692
https://github.com/apache/airflow/pull/12694
850b74befe5e1827c84d02dd2c7c5e6aded3f841
7ef9aa7d545f11442b6ebb86590cd8ce5f98430b
"2020-11-28T18:11:18Z"
python
"2020-11-29T06:19:47Z"
closed
apache/airflow
https://github.com/apache/airflow
12,691
["airflow/www/templates/airflow/dag_details.html"]
add dagrun_timeout to the DAG Details screen in the UI
On the Details page there is no indication of the DAG's `dagrun_timeout`.
https://github.com/apache/airflow/issues/12691
https://github.com/apache/airflow/pull/14165
92f81da91cc337e18e5aa77d445d0a8ab7d32600
61b613359e2394869070b3ad94f64dfda3efac74
"2020-11-28T18:02:57Z"
python
"2021-02-10T20:25:04Z"