status | repo_name | repo_url | issue_id | updated_files | title | body | issue_url | pull_url | before_fix_sha | after_fix_sha | report_datetime | language | commit_datetime
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
closed | apache/airflow | https://github.com/apache/airflow | 21,380 | ["airflow/providers/databricks/hooks/databricks.py", "airflow/providers/databricks/operators/databricks.py", "tests/providers/databricks/hooks/test_databricks.py", "tests/providers/databricks/operators/test_databricks.py"] | Databricks: support for triggering jobs by name | ### Description
The DatabricksRunNowOperator supports triggering job runs by job ID. We would like to extend the operator to also support triggering jobs by name. This will likely require first making an API call to list jobs in order to find the appropriate job id.
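For illustration, a minimal sketch (not a concrete proposal for the operator's implementation) of how the name-to-id lookup could work against the Jobs API 2.1 `list` endpoint; host/token handling and pagination are simplified assumptions:
```python
import requests


def find_job_id_by_name(host: str, token: str, job_name: str) -> int:
    """Resolve a Databricks job name to its job_id (simplified, no pagination)."""
    response = requests.get(
        f"https://{host}/api/2.1/jobs/list",
        headers={"Authorization": f"Bearer {token}"},
    )
    response.raise_for_status()
    matches = [
        job["job_id"]
        for job in response.json().get("jobs", [])
        if job.get("settings", {}).get("name") == job_name
    ]
    if len(matches) != 1:
        raise ValueError(f"Expected exactly one job named {job_name!r}, found {len(matches)}")
    return matches[0]
```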
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21380 | https://github.com/apache/airflow/pull/21663 | 537c24433014d3d991713202df9c907e0f114d5d | a1845c68f9a04e61dd99ccc0a23d17a277babf57 | "2022-02-07T10:23:18Z" | python | "2022-02-26T21:55:30Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,348 | ["airflow/providers/amazon/aws/operators/glue.py"] | Status of testing Providers that were prepared on February 05, 2022 | ### Body
I have a kind request for all the contributors to the latest provider packages release.
Could you please help us to test the RC versions of the providers?
Let us know in the comments whether the issue is addressed.
Those are providers that require testing as there were some substantial changes introduced:
## Provider [amazon: 3.0.0rc1](https://pypi.org/project/apache-airflow-providers-amazon/3.0.0rc1)
- [ ] [Rename params to cloudformation_parameter in CloudFormation operators. (#20989)](https://github.com/apache/airflow/pull/20989): @potiuk
- [ ] [[SQSSensor] Add opt-in to disable auto-delete messages (#21159)](https://github.com/apache/airflow/pull/21159): @LaPetiteSouris
- [x] [Create a generic operator SqlToS3Operator and deprecate the MySqlToS3Operator. (#20807)](https://github.com/apache/airflow/pull/20807): @mariotaddeucci
- [ ] [Move some base_aws logging from info to debug level (#20858)](https://github.com/apache/airflow/pull/20858): @o-nikolas
- [ ] [Adds support for optional kwargs in the EKS Operators (#20819)](https://github.com/apache/airflow/pull/20819): @ferruzzi
- [ ] [AwsAthenaOperator: do not generate client_request_token if not provided (#20854)](https://github.com/apache/airflow/pull/20854): @XD-DENG
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [fix: cloudwatch logs fetch logic (#20814)](https://github.com/apache/airflow/pull/20814): @ayushchauhan0811
- [ ] [Alleviate import warning for `EmrClusterLink` in deprecated AWS module (#21195)](https://github.com/apache/airflow/pull/21195): @josh-fell
- [ ] [Rename amazon EMR hook name (#20767)](https://github.com/apache/airflow/pull/20767): @vinitpayal
- [ ] [Standardize AWS SQS classes names (#20732)](https://github.com/apache/airflow/pull/20732): @eladkal
- [ ] [Standardize AWS Batch naming (#20369)](https://github.com/apache/airflow/pull/20369): @ferruzzi
- [ ] [Standardize AWS Redshift naming (#20374)](https://github.com/apache/airflow/pull/20374): @ferruzzi
- [ ] [Standardize DynamoDB naming (#20360)](https://github.com/apache/airflow/pull/20360): @ferruzzi
- [ ] [Standardize AWS ECS naming (#20332)](https://github.com/apache/airflow/pull/20332): @ferruzzi
- [ ] [Refactor operator links to not create ad hoc TaskInstances (#21285)](https://github.com/apache/airflow/pull/21285): @josh-fell
## Provider [apache.druid: 2.3.0rc1](https://pypi.org/project/apache-airflow-providers-apache-druid/2.3.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [apache.hive: 2.2.0rc1](https://pypi.org/project/apache-airflow-providers-apache-hive/2.2.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [apache.spark: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-apache-spark/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [apache.sqoop: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-apache-sqoop/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [cncf.kubernetes: 3.0.2rc1](https://pypi.org/project/apache-airflow-providers-cncf-kubernetes/3.0.2rc1)
- [ ] [Add missed deprecations for cncf (#20031)](https://github.com/apache/airflow/pull/20031): @dimon222
## Provider [docker: 2.4.1rc1](https://pypi.org/project/apache-airflow-providers-docker/2.4.1rc1)
- [ ] [Fixes Docker xcom functionality (#21175)](https://github.com/apache/airflow/pull/21175): @ferruzzi
## Provider [exasol: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-exasol/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [google: 6.4.0rc1](https://pypi.org/project/apache-airflow-providers-google/6.4.0rc1)
- [ ] [[Part 1]: Add hook for integrating with Google Calendar (#20542)](https://github.com/apache/airflow/pull/20542): @rsg17
- [ ] [Add encoding parameter to `GCSToLocalFilesystemOperator` to fix #20901 (#20919)](https://github.com/apache/airflow/pull/20919): @danneaves-ee
- [ ] [batch as templated field in DataprocCreateBatchOperator (#20905)](https://github.com/apache/airflow/pull/20905): @wsmolak
- [ ] [Make timeout Optional for wait_for_operation (#20981)](https://github.com/apache/airflow/pull/20981): @MaksYermak
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [Cloudsql import links fix. (#21199)](https://github.com/apache/airflow/pull/21199): @subkanthi
- [ ] [Refactor operator links to not create ad hoc TaskInstances (#21285)](https://github.com/apache/airflow/pull/21285): @josh-fell
## Provider [http: 2.0.3rc1](https://pypi.org/project/apache-airflow-providers-http/2.0.3rc1)
- [ ] [Split out confusing path combination logic to separate method (#21247)](https://github.com/apache/airflow/pull/21247): @malthe
## Provider [imap: 2.2.0rc1](https://pypi.org/project/apache-airflow-providers-imap/2.2.0rc1)
- [ ] [Add "use_ssl" option to IMAP connection (#20441)](https://github.com/apache/airflow/pull/20441): @feluelle
## Provider [jdbc: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-jdbc/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [microsoft.azure: 3.6.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-azure/3.6.0rc1)
- [ ] [Refactor operator links to not create ad hoc TaskInstances (#21285)](https://github.com/apache/airflow/pull/21285): @josh-fell
## Provider [microsoft.mssql: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-mssql/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [microsoft.psrp: 1.1.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-psrp/1.1.0rc1)
- [x] [PSRP improvements (#19806)](https://github.com/apache/airflow/pull/19806): @malthe
## Provider [mysql: 2.2.0rc1](https://pypi.org/project/apache-airflow-providers-mysql/2.2.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [#20618](https://github.com/apache/airflow/pull/20618): @potiuk
## Provider [oracle: 2.2.0rc1](https://pypi.org/project/apache-airflow-providers-oracle/2.2.0rc1)
- [x] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [x] [Fix handling of Oracle bindvars in stored procedure call when parameters are not provided (#20720)](https://github.com/apache/airflow/pull/20720): @malthe
## Provider [postgres: 3.0.0rc1](https://pypi.org/project/apache-airflow-providers-postgres/3.0.0rc1)
- [ ] [Replaces the usage of postgres:// with postgresql:// (#21205)](https://github.com/apache/airflow/pull/21205): @potiuk
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [Remove `:type` lines now sphinx-autoapi supports typehints (#20951)](https://github.com/apache/airflow/pull/20951): @ashb
- [ ] [19489 - Pass client_encoding for postgres connections (#19827)](https://github.com/apache/airflow/pull/19827): @subkanthi
- [ ] [Amazon provider remove deprecation, second try (#19815)](https://github.com/apache/airflow/pull/19815): @uranusjr
## Provider [qubole: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-qubole/2.1.0rc1)
- [x] [Add Qubole how to documentation (#20058)](https://github.com/apache/airflow/pull/20058): @kazanzhy
## Provider [slack: 4.2.0rc1](https://pypi.org/project/apache-airflow-providers-slack/4.2.0rc1)
- [ ] [Return slack api call response in slack_hook (#21107)](https://github.com/apache/airflow/pull/21107): @pingzh
- [ ] [#20571](https://github.com/apache/airflow/pull/20571): @potiuk
## Provider [snowflake: 2.5.0rc1](https://pypi.org/project/apache-airflow-providers-snowflake/2.5.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [Fix #21096: Support boolean in extra__snowflake__insecure_mode (#21155)](https://github.com/apache/airflow/pull/21155): @mik-laj
## Provider [sqlite: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-sqlite/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [ssh: 2.4.0rc1](https://pypi.org/project/apache-airflow-providers-ssh/2.4.0rc1)
- [ ] [Add a retry with wait interval for SSH operator (#14489)](https://github.com/apache/airflow/issues/14489): @Gaurang033
- [ ] [Add banner_timeout feature to SSH Hook/Operator (#21262)](https://github.com/apache/airflow/pull/21262): @potiuk
## Provider [tableau: 2.1.4rc1](https://pypi.org/project/apache-airflow-providers-tableau/2.1.4rc1)
- [ ] [Squelch more deprecation warnings (#21003)](https://github.com/apache/airflow/pull/21003): @uranusjr
## Provider [vertica: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-vertica/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
### Committer
- [X] I acknowledge that I am a maintainer/committer of the Apache Airflow project. | https://github.com/apache/airflow/issues/21348 | https://github.com/apache/airflow/pull/21353 | 8da7af2bc0f27e6d926071439900ddb27f3ae6c1 | d1150182cb1f699e9877fc543322f3160ca80780 | "2022-02-05T20:59:27Z" | python | "2022-02-06T21:25:29Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,336 | ["airflow/www/templates/airflow/trigger.html", "airflow/www/views.py", "tests/www/views/test_views_trigger_dag.py"] | Override the dag run_id from within the ui | ### Description
It would be great to have the ability to override the generated run_ids like `scheduled__2022-01-27T14:00:00+00:00` so that it is easier to find specific DAG runs in the UI. I know the REST API allows you to specify a run id (see the sketch below), but it would be great if UI users could also specify a run_id, using for example the dag_run conf.
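For reference, a minimal sketch of how the REST API already allows this today (URL, credentials and dag id are placeholders, and the basic auth API backend is assumed to be enabled):
```python
import requests

response = requests.post(
    "http://localhost:8080/api/v1/dags/my_dag/dagRuns",
    auth=("admin", "admin"),  # assumes the basic_auth API backend
    json={"dag_run_id": "manual__my_descriptive_run_id", "conf": {}},
)
response.raise_for_status()
print(response.json()["dag_run_id"])
```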
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21336 | https://github.com/apache/airflow/pull/21851 | 340180423a687d8171413c0c305f2060f9722177 | 14a2d9d0078569988671116473b43f86aba1161b | "2022-02-04T21:10:04Z" | python | "2022-03-16T08:12:59Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,325 | ["airflow/providers/cncf/kubernetes/hooks/kubernetes.py", "airflow/providers/cncf/kubernetes/operators/spark_kubernetes.py", "tests/providers/apache/flink/operators/test_flink_kubernetes.py", "tests/providers/cncf/kubernetes/hooks/test_kubernetes_pod.py", "tests/providers/cncf/kubernetes/operators/test_spark_kubernetes.py"] | on_kill method for SparkKubernetesOperator | ### Description
In some cases Airflow sends `SIGTERM` to a running task. When that task is a `SparkKubernetesOperator`, the termination also needs to be propagated to the corresponding Spark pods/jobs.
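A rough sketch (not the provider's actual code) of what an `on_kill` implementation could do, assuming the spark-operator CRD `sparkoperator.k8s.io/v1beta2`; deleting the SparkApplication custom resource lets Kubernetes clean up the driver and executor pods:
```python
from kubernetes import client, config


def on_kill(namespace: str, application_name: str) -> None:
    # Delete the SparkApplication custom resource so its pods are terminated too.
    config.load_incluster_config()  # or config.load_kube_config() outside the cluster
    api = client.CustomObjectsApi()
    api.delete_namespaced_custom_object(
        group="sparkoperator.k8s.io",
        version="v1beta2",
        namespace=namespace,
        plural="sparkapplications",
        name=application_name,
    )
```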
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21325 | https://github.com/apache/airflow/pull/29977 | feab21362e2fee309990a89aea39031d94c5f5bd | 9a4f6748521c9c3b66d96598036be08fd94ccf89 | "2022-02-04T13:31:39Z" | python | "2023-03-14T22:31:30Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,321 | ["airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py", "airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py", "airflow/providers/amazon/aws/operators/ecs.py", "docs/apache-airflow-providers-amazon/operators/ecs.rst", "tests/providers/amazon/aws/operators/test_ecs.py"] | ECS Operator does not support launch type "EXTERNAL" | ### Description
You can run ECS tasks either on EC2 instances or via AWS Fargate, and these will run in AWS. With ECS Anywhere, you are now able to run the same ECS tasks on any host that has the ECS agent - on prem, in another cloud provider, etc. The control plane resides in ECS, but the execution of the task is managed by the ECS agent.
To launch tasks on hosts that are managed by ECS Anywhere, you need to specify a launch type of EXTERNAL. This is currently not supported by the ECS Operator. When you attempt to do this, you get an "unsupported launch type" error.
The current workaround is to use boto3 to create and run the task with the correct parameters (a rough sketch follows).
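A minimal sketch of that boto3 workaround (cluster, task definition and region are placeholders):
```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")
response = ecs.run_task(
    cluster="my-ecs-anywhere-cluster",
    taskDefinition="my-task-definition:1",
    launchType="EXTERNAL",  # the value the ECS Operator currently rejects
    count=1,
)
print(response["tasks"][0]["taskArn"])
```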
### Use case/motivation
The ability to run your task code to support hybrid and multi-cloud orchestration scenarios.
### Related issues
_No response_
### Are you willing to submit a PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21321 | https://github.com/apache/airflow/pull/22093 | 33ecca1b9ab99d9d15006df77757825c81c24f84 | e63f6e36d14a8cd2462e80f26fb4809ab8698380 | "2022-02-04T11:23:52Z" | python | "2022-03-11T07:25:11Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,302 | ["airflow/www/package.json", "airflow/www/static/js/graph.js", "airflow/www/static/js/tree/Tree.jsx", "airflow/www/static/js/tree/dagRuns/index.test.jsx", "airflow/www/static/js/tree/index.jsx", "airflow/www/static/js/tree/renderTaskRows.jsx", "airflow/www/static/js/tree/renderTaskRows.test.jsx", "airflow/www/static/js/tree/useTreeData.js", "airflow/www/static/js/tree/useTreeData.test.jsx", "airflow/www/yarn.lock"] | Pause auto-refresh when the document becomes hidden | ### Description
When running Airflow it can be common to leave some tabs of Airflow open but not active. I believe (but not 100% sure, if I am wrong I can close this issue) Airflow's auto-refresh keeps refreshing when the document becomes hidden (for example, you switched to another browser tab).
This is not desirable when you are running the Airflow services on the same machine and you have a long-running DAG (taking hours to run). It can cause your CPU utilization to ramp up in this scenario (which can be quite common for users, myself included):
1. You are running the Airflow services on your same machine
2. Your machine is not that powerful
3. You have a long-running DAG (taking hours to run)
4. You leave auto-refreshing page(s) of that DAG open for a long time (such as tree or graph) in hidden (or non-focused) tabs of your browser
- What can make this even worse is if you have multiple tabs like this open, you are multiplying the extra processing power to refresh the page at a short interval
5. You have not increased the default `auto_refresh_interval` of 3
### Use case/motivation
I am proposing the following improvements to the auto-refresh method to improve this situation:
1. When you change tabs in your browser, there is a JavaScript feature in modern browsers called the "Page Visibility API". It allows the use of listeners on a `visibilitychange` event to know when a document becomes visible or hidden. This can be used to pause auto-refresh when the document becomes hidden.
- Discussion on Stack Overflow: https://stackoverflow.com/questions/1060008/is-there-a-way-to-detect-if-a-browser-window-is-not-currently-active
- MDN: https://developer.mozilla.org/en-US/docs/Web/API/Page_Visibility_API
- W3C: https://www.w3.org/TR/page-visibility/
2. We should provide a message in the UI to alert the user that the auto-refreshing is paused until the page regains focus.
3. Lastly, the option to only auto-refresh if the document is visible should be a configurable setting.
Additionally, the older `onblur` and `onfocus` listeners on the entire document could be used too. That way, if a user switches to a different window while the page is still visible, the auto-refresh can pause (although this might not be desirable if you want to have Airflow open side-by-side with something else, so maybe this would be overboard).
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21302 | https://github.com/apache/airflow/pull/21904 | dfd9805a23b2d366f5c332f4cb4131462c5ba82e | 635fe533700f284da9aa04a38a5dae9ad6485454 | "2022-02-03T19:57:20Z" | python | "2022-03-08T18:31:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,201 | ["airflow/www/static/js/gantt.js", "airflow/www/static/js/graph.js", "airflow/www/static/js/task_instances.js", "airflow/www/views.py"] | Add Trigger Rule Display to Graph View | ### Description
This feature would introduce some visual addition(s) (e.g. tooltip) to the Graph View to display the trigger rule between tasks.
### Use case/motivation
This would add more detail to Graph View, providing more information visually about the relationships between upstream and downstream tasks.
### Related issues
https://github.com/apache/airflow/issues/19939
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21201 | https://github.com/apache/airflow/pull/26043 | bdc3d4da3e0fb11661cede149f2768acb2080d25 | f94176bc7b28b496c34974b6e2a21781a9afa221 | "2022-01-28T23:17:52Z" | python | "2022-08-31T19:51:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,188 | ["airflow/www/static/js/connection_form.js"] | "Test Connection" functionality in the UI doesn't consider custom form fields | ### Apache Airflow version
main (development)
### What happened
When creating a connection to Snowflake in the Airflow UI, I was hoping to test the connection prior to saving the connection. Unfortunately when I click on the Test button I receive the following error despite all fields in the connection form (especially Account) being provided.
![image](https://user-images.githubusercontent.com/48934154/151571943-3a6a9c51-5517-429e-9834-6015810d2e2e.png)
However, when I specify the connection parameters directly in the Extra field, the connection test is successful.
![image](https://user-images.githubusercontent.com/48934154/151571830-78a71d45-362a-4879-bdea-6f28545a795b.png)
### What you expected to happen
I would have expected that I could use the custom fields in the connection form to test the connection. While using the Extra field is a workaround for the Snowflake connection type, not all custom connection forms expose the Extra field (e.g. Azure Data Factory, Salesforce, etc.) which makes this workaround impossible when testing in the UI.
### How to reproduce
- Attempt to create a Snowflake connection (or other connection types that have the `test_connection()` method implemented which use custom fields for authentication).
- Click the Test button in the UI.
### Operating System
Debian GNU/Linux 10 (buster)
### Versions of Apache Airflow Providers
N/A - using `main` branch.
### Deployment
Other
### Deployment details
Testing with `main` using Breeze.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21188 | https://github.com/apache/airflow/pull/21330 | fc44836504129664edb81c510e6deb41a7e1126d | a9b8ac5e0dde1f1793687a035245fde73bd146d4 | "2022-01-28T15:19:40Z" | python | "2022-02-15T19:00:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,183 | ["airflow/sensors/external_task.py", "tests/sensors/test_external_task_sensor.py"] | Webserver "External DAG" button on ExternalTaskSensor not working when dag_id is templated | ### Apache Airflow version
2.0.2
### What happened
When an ExternalTaskSensor receives a templated dag_id, the web interface's "External DAG" button does not resolve the template, so the destination URL does not work (although the sensor correctly waits for the dag_id that the template refers to).
![image](https://user-images.githubusercontent.com/28935464/151545548-cf60193d-7803-414a-aab6-aecd2cf411b2.png)
![image](https://user-images.githubusercontent.com/28935464/151544881-9c989b68-975a-480c-b473-808da0a892be.png)
![image](https://user-images.githubusercontent.com/28935464/151545663-6982156b-47ef-4b42-a777-02a09811c382.png)
### What you expected to happen
The button's destination URL should point to the templated dag_id.
### How to reproduce
1. Create a DAG with an ExternalTaskSensor whose dag_id is templated
```python
@provide_session
def get_last_run_id(execution_date: datetime, session: Any, **context: Any) -> datetime:
    dag_id = context['task_instance'].xcom_pull('get_idemp', key='id_emp')
    while (last_run := get_last_dagrun(dag_id, session, include_externally_triggered=True)) is None:
        continue
    return last_run.execution_date


sensor = ExternalTaskSensor(
    task_id="wait_for_dag",
    external_dag_id="{{ ti.xcom_pull('get_idemp', key='id_emp') }}",
    external_task_id="update_load_registry",  # <-- last operator of the external DAG
    execution_date_fn=get_last_run_id,
)
```
2. Trigger the created DAG
3. Click on the ExternalTaskSensor operator
4. Click on the "External DAG" button
### Operating System
Debian GNU/Linux 10 (buster) (on KubernetesExecutor)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other 3rd-party Helm chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21183 | https://github.com/apache/airflow/pull/21192 | 6b88d432d959df73433528fe3d62194239f13edd | 8da7af2bc0f27e6d926071439900ddb27f3ae6c1 | "2022-01-28T12:18:30Z" | python | "2022-02-06T21:14:21Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,169 | ["airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", "airflow/providers/cncf/kubernetes/utils/pod_manager.py", "kubernetes_tests/test_kubernetes_pod_operator.py", "tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py"] | `is_delete_operator_pod=True` and `random_name_suffix=False` can cause KubernetesPodOperator to delete the wrong pod | ### Apache Airflow version
2.2.2
### What happened
When running multiple KubernetesPodOperators with `random_name_suffix=False` and `is_delete_operator_pod=True`, the following will happen:
1) The first task will create the Pod `my-pod`
2) The second task will attempt to create the pod, but fail with a 409 response from the API server (this is expected)
3) The second task will delete `my-pod`, because it has `is_delete_operator_pod=True` and the Pod name is the same for both tasks. This is unexpected and will cause the first task to fail as well.
I understand that this is a rare circumstance, but I think it's still worth fixing, as anyone using `random_name_suffix=False` in an otherwise default KubernetesPodOperator may end up with other pods being killed.
As a possible fix, we could [`find_pod`](https://github.com/apache/airflow/blob/684fe46158aa3d6cb2de245d29e20e487d8f2158/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py#L322) before deleting, to ensure that the pod being deleted has the appropriate `execution_date` label:
https://github.com/apache/airflow/blob/ad07923606262ef8a650dcead38183da6bbb5d7b/airflow/providers/cncf/kubernetes/utils/pod_launcher.py#L103-L112
Let me know if you have any other suggestions for how this could be fixed, or if this should just be considered expected behaviour when using fixed Kubernetes Pod IDs.
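For illustration only, a hedged sketch of such a guard using the Kubernetes client directly; the `execution_date` label key is an assumption about what the operator sets on its pods:
```python
from kubernetes import client


def delete_pod_if_ours(core_v1: client.CoreV1Api, name: str, namespace: str,
                       expected_execution_date: str) -> None:
    pod = core_v1.read_namespaced_pod(name=name, namespace=namespace)
    labels = pod.metadata.labels or {}
    if labels.get("execution_date") != expected_execution_date:
        # The pod with this name belongs to another task instance; do not delete it.
        return
    core_v1.delete_namespaced_pod(name=name, namespace=namespace)
```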
### What you expected to happen
The second task should be able to fail without deleting the pod from the first task.
### How to reproduce
Create a DAG with a single KubernetesPodOperator with `random_name_suffix=False` and `is_delete_operator_pod=True` and run it twice in parallel.
### Operating System
Debian GNU/Linux 10 (buster)
### Versions of Apache Airflow Providers
apache-airflow-providers-cncf-kubernetes=2.2.0
### Deployment
Other 3rd-party Helm chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21169 | https://github.com/apache/airflow/pull/22092 | 864cbc9cd843db94ca8bab2187b50de714fdb070 | 78ac48872bd02d1c08c6e55525f0bb4d6e983d32 | "2022-01-27T19:55:56Z" | python | "2022-06-21T15:51:20Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,164 | [".pre-commit-config.yaml", ".rat-excludes", "dev/breeze/autocomplete/Breeze2-complete-bash.sh", "dev/breeze/autocomplete/Breeze2-complete-fish.sh", "dev/breeze/autocomplete/Breeze2-complete-zsh.sh", "dev/breeze/setup.cfg", "dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/utils/run_utils.py"] | Breeze2 autocomplete requires `click-complete` to be installed | ### Apache Airflow version
main (development)
### What happened
When I set up autocomplete for Breeze2 on a "bare" system with no packages installed, autocompletion fails with this error:
```
ModuleNotFoundError: No module named 'click_completion'
Traceback (most recent call last):
File "/home/jarek/.pyenv/versions/3.7.9/bin/Breeze2", line 33, in <module>
sys.exit(load_entry_point('apache-airflow-breeze', 'console_scripts', 'Breeze2')())
File "/home/jarek/.pyenv/versions/3.7.9/bin/Breeze2", line 25, in importlib_load_entry_point
return next(matches).load()
File "/home/jarek/.local/lib/python3.7/site-packages/importlib_metadata/__init__.py", line 194, in load
module = import_module(match.group('module'))
File "/home/jarek/.pyenv/versions/3.7.9/lib/python3.7/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
File "<frozen importlib._bootstrap>", line 983, in _find_and_load
File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/jarek/code/airflow/dev/breeze/src/airflow_breeze/breeze.py", line 24, in <module>
import click_completion
ModuleNotFoundError: No module named 'click_completion'
```
It seems that the "autocomplete" feature of Breeze2 requires `click-completion` to be installed first. This is a small issue and a small prerequisite, but I think it is not handled by the current `setup-autocomplete`.
The same happens if you install Breeze2 with `pipx`.
### What you expected to happen
I expect the `click-completion` package to be installed automatically when `./Breeze2 setup-autocomplete` is executed (a sketch of how this could be done is below).
It should also be described in the documentation as a prerequisite.
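A hedged sketch of what `setup-autocomplete` could do before importing the module (whether installing on the fly is acceptable for Breeze is an open question):
```python
import importlib
import subprocess
import sys


def ensure_click_completion() -> None:
    try:
        importlib.import_module("click_completion")
    except ModuleNotFoundError:
        # Install into the same interpreter that runs Breeze2.
        subprocess.check_call([sys.executable, "-m", "pip", "install", "click-completion"])
```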
### How to reproduce
* make sure you have no packages installed in your Python "system environment" (for example: `pip list | xargs pip uninstall -y` )
* type ./Breeze2<TAB>
### Operating System
Linux Mint 20.1.3
### Versions of Apache Airflow Providers
Not relevant
### Deployment
Other
### Deployment details
No airflow deployment
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21164 | https://github.com/apache/airflow/pull/22695 | c758c76ac336c054fd17d4b878378aa893b7a979 | 4fb929a60966e9cecbbb435efd375adcc4fff9d7 | "2022-01-27T17:47:05Z" | python | "2022-04-03T18:52:48Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,163 | ["dev/breeze/src/airflow_breeze/breeze.py"] | `setup-autocomplete` in Breeze2 fails with "Permission denied" | ### Apache Airflow version
main (development)
### What happened
When I run "setup-autocomplete" in the new Breeze with zsh it fails with "permission denied" as it tries to access `/.zshrc`:
```
[jarek:~/code/airflow] static-check-breeze2+ ± ./Breeze2 setup-autocomplete
Installing zsh completion for local user
Activation command scripts are created in this autocompletion path: /home/jarek/code/airflow/.build/autocomplete/Breeze2-complete.zsh
Do you want to add the above autocompletion scripts to your zsh profile? [y/N]: y
This will modify the /.zshrc file
Traceback (most recent call last):
File "/home/jarek/code/airflow/.build/breeze2/venv/bin/Breeze2", line 33, in <module>
sys.exit(load_entry_point('apache-airflow-breeze', 'console_scripts', 'Breeze2')())
File "/home/jarek/code/airflow/.build/breeze2/venv/lib/python3.8/site-packages/click/core.py", line 1128, in __call__
return self.main(*args, **kwargs)
File "/home/jarek/code/airflow/.build/breeze2/venv/lib/python3.8/site-packages/click/core.py", line 1053, in main
rv = self.invoke(ctx)
File "/home/jarek/code/airflow/.build/breeze2/venv/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/jarek/code/airflow/.build/breeze2/venv/lib/python3.8/site-packages/click/core.py", line 1395, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/jarek/code/airflow/.build/breeze2/venv/lib/python3.8/site-packages/click/core.py", line 754, in invoke
return __callback(*args, **kwargs)
File "/home/jarek/code/airflow/dev/breeze/src/airflow_breeze/breeze.py", line 248, in setup_autocomplete
write_to_shell(command_to_execute, script_path, breeze_comment)
File "/home/jarek/code/airflow/dev/breeze/src/airflow_breeze/breeze.py", line 215, in write_to_shell
with open(script_path, 'a') as script_file:
PermissionError: [Errno 13] Permission denied: '/.zshrc'
[jarek:~/code/airflow] static-check-breeze2+ 6s 1 ±
```
### What you expected to happen
I expected the scripts in my `${HOME}` directory to be updated with auto-complete, but apparently it tries to update a file in the root (`/`) folder.
### How to reproduce
* Have zsh as your shell
* Run `./Breeze2 setup-autocomplete`
### Operating System
Linux Mint 20.1.3
### Versions of Apache Airflow Providers
- Not relevant -
### Deployment
Other
### Deployment details
No airflow - this is just a development environment.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21163 | https://github.com/apache/airflow/pull/21636 | ea21cb3320afcb8150f463ebc8d01798a5d34166 | 42469f0434f1377c9f9dfd78cbf033ef21e2e505 | "2022-01-27T17:23:13Z" | python | "2022-02-17T13:54:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,125 | ["airflow/providers/trino/hooks/trino.py", "tests/providers/trino/hooks/test_trino.py"] | apache-airflow-providers-trino issue when I use "run" method, query is not performed | ### Apache Airflow version
2.2.3 (latest released)
### What happened
When I use the Trino provider I use the `run` method of `TrinoHook`, because I need to run a `CREATE TABLE` query.
But these queries are not executed on the Trino server.
### What you expected to happen
I expected that queries started via the `run` method would be executed.
### How to reproduce
Create a DAG file with the Trino provider using this code and some `CREATE TABLE` HQL statement.
```
...
th = TrinoHook()

@task(task_id="create_table")
def create_table(ds=None, **kwargs):
    th.run("CREATE TABLE IF NOT EXISTS ...", handler=next_page)

cpt = create_table()
cpt
```
This code does not execute the statement, even though the expected POST request is sent to the Trino server.
### Operating System
Kubuntu, 20.04.3 LTS
### Versions of Apache Airflow Providers
apache-airflow-providers-trino=2.0.1
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
I think support for a handler needs to be added to the `run` method of the provider hook in the file airflow/providers/trino/hooks/trino.py.
That would give us the possibility to write a handler that can, for example, perform `fetchall()` on the returned cursor object.
Something like this:
```
def run(
    self,
    hql,
    autocommit: bool = False,
    parameters: Optional[dict] = None,
    handler=None
) -> None:
    """Execute the statement against Trino. Can be used to create views."""
    return super().run(sql=self._strip_sql(hql), parameters=parameters, handler=handler)
```
Because currently the code does not pass the handler to the DbApiHook parent method:
` return super().run(sql=self._strip_sql(hql), parameters=parameters)`
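With the proposed change, a handler could look like this (a sketch, assuming the handler receives the DB-API cursor as in `DbApiHook.run`; `th` is the TrinoHook instance from the example above):
```python
def fetch_all_handler(cursor):
    # Runs against the live cursor before it is closed.
    return cursor.fetchall()


th.run("SELECT 1", handler=fetch_all_handler)  # whether run() returns the result depends on the DbApiHook version
```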
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21125 | https://github.com/apache/airflow/pull/21479 | dc03000de80e672de661c84f5fbb916413211550 | 1884f2227d1e41d7bb37246ece4da5d871036c1f | "2022-01-26T12:43:19Z" | python | "2022-02-15T20:42:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,105 | ["BREEZE.rst", "CONTRIBUTING.rst", "CONTRIBUTORS_QUICK_START.rst", "dev/provider_packages/README.md", "docs/apache-airflow/installation/installing-from-pypi.rst", "scripts/tools/initialize_virtualenv.py"] | Breeze: Setting up local virtualenv | There shoudl be a comand that allows to set-up local virtualenv easly.
This involves:
* checking if airflow is installed in "${HOME}/airflow" and, if so, warning and suggesting to move it elsewhere (this is very bad because ${HOME}/airflow is by default where airflow stores all its files - logs/config etc.)
* cleaning "${HOME}/airflow" and regenerating all necessary folders and files
* checking if a virtualenv is activated - if not, writing a helpful message
* checking if additional dependencies are installed and, based on the OS, suggesting what should be installed (note we do not have Windows support here)
* installing airflow with the right extra ([devel-all] I think)
* initializing the sqlite databases of airflow - both "normal" and "unit test" database | https://github.com/apache/airflow/issues/21105 | https://github.com/apache/airflow/pull/22971 | 03f7d857e940b9c719975e72ded4a89f183b0100 | 03bef084b3f1611e1becdd6ad0ff4c0d2dd909ac | "2022-01-25T16:14:33Z" | python | "2022-04-21T13:59:03Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,104 | ["BREEZE.rst", "dev/breeze/src/airflow_breeze/commands/developer_commands.py", "dev/breeze/src/airflow_breeze/shell/enter_shell.py", "images/breeze/output-commands.svg", "images/breeze/output-exec.svg"] | Breeze: Exec'ing into running Breeze | `./Breeze2 exec` should exec into the currently running Breeze (or fail with helpful message if Breeze is not running). | https://github.com/apache/airflow/issues/21104 | https://github.com/apache/airflow/pull/23052 | b6db0e90aeb30133086716a433cab9dca7408a54 | 94c3203e86252ed120d624a70aed571b57083ea4 | "2022-01-25T16:09:29Z" | python | "2022-04-28T19:31:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,102 | ["dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/ci/build_image.py", "dev/breeze/src/airflow_breeze/ci/build_params.py", "dev/breeze/src/airflow_breeze/prod/prod_params.py"] | Breeze: Add 'prepare-image-cache' in Breeze | We have a separate command that prepares image caches. It is very similar to "building image" but:
* it should prepare both prod and CI images
* the `docker build` command should be slightly modified (--cache-to and some other options)
* a check that the `buildx` plugin is available needs to be performed (and the command should fail with a helpful message if it is not)
Those "differences" between standard build-image can be found with `PREPARE_BUILDX_CACHE` variable == "true" usage in old breeze | https://github.com/apache/airflow/issues/21102 | https://github.com/apache/airflow/pull/22344 | 4e4c0574cdd3689d22e2e7d03521cb82179e0909 | dc75f5d8768c8a42df29c86beb519de282539e1f | "2022-01-25T16:03:43Z" | python | "2022-04-01T08:47:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,100 | ["dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/ci/build_params.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/prod/__init__.py", "dev/breeze/src/airflow_breeze/prod/build_prod_image.py", "dev/breeze/src/airflow_breeze/prod/prod_params.py", "dev/breeze/src/airflow_breeze/utils/docker_command_utils.py", "dev/breeze/src/airflow_breeze/utils/path_utils.py", "dev/breeze/src/airflow_breeze/utils/run_utils.py", "dev/breeze/tests/test_prod_image.py"] | Breeze: Build PROD image with Breeze | Similarly to building CI image, we should build PROD image | https://github.com/apache/airflow/issues/21100 | https://github.com/apache/airflow/pull/21956 | 7418720ce173ca5d0c5f5197c168e43258af8cc3 | 4eebabb76d1d50936c4b63669a93358f4d100ce3 | "2022-01-25T15:53:50Z" | python | "2022-03-29T15:28:42Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,096 | ["airflow/providers/snowflake/hooks/snowflake.py", "tests/providers/snowflake/hooks/test_snowflake.py"] | Snowflake connection, boolean value of extra__snowflake__insecure_mode interpreted as string | ### Apache Airflow version
2.2.3 (latest released)
### What happened
Error thrown when using SnowflakeOperator with a Snowflake Connection.
After creating a Snowflake Connection, the "Extra" field was automatically filled with a dictionary containing the values entered in the other input fields. Note: the value for the key "extra__snowflake__insecure_mode" is a boolean.
![image](https://user-images.githubusercontent.com/82085639/150988536-002c78d7-b405-49c8-af10-ddf3d050ce09.png)
A task using SnowflakeOperator fails, throwing following error:
```
[2022-01-25, 14:39:54 UTC] {taskinstance.py:1700} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1329, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1455, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1511, in _execute_task
result = execute_callable(context=context)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/providers/snowflake/operators/snowflake.py", line 129, in execute
execution_info = hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/providers/snowflake/hooks/snowflake.py", line 293, in run
with closing(self.get_conn()) as conn:
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/providers/snowflake/hooks/snowflake.py", line 236, in get_conn
conn_config = self._get_conn_params()
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/providers/snowflake/hooks/snowflake.py", line 170, in _get_conn_params
insecure_mode = to_boolean(
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/utils/strings.py", line 30, in to_boolean
return False if astring is None else astring.lower() in ['true', 't', 'y', 'yes', '1']
AttributeError: 'bool' object has no attribute 'lower'
[2022-01-25, 14:39:54 UTC] {taskinstance.py:1267} INFO - Marking task as FAILED. dag_id=test, task_id=snowflake_task, execution_date=20220123T000000, start_date=20220125T133954, end_date=20220125T133954
[2022-01-25, 14:39:55 UTC] {standard_task_runner.py:89} ERROR - Failed to execute job 4 for task snowflake_task
Traceback (most recent call last):
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py", line 85, in _start_by_fork
args.func(args, dag=self.dag)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 298, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 107, in _run_task_by_selected_method
_run_raw_task(args, ti)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 180, in _run_raw_task
ti._run_raw_task(
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1329, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1455, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1511, in _execute_task
result = execute_callable(context=context)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/providers/snowflake/operators/snowflake.py", line 129, in execute
execution_info = hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/providers/snowflake/hooks/snowflake.py", line 293, in run
with closing(self.get_conn()) as conn:
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/providers/snowflake/hooks/snowflake.py", line 236, in get_conn
conn_config = self._get_conn_params()
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/providers/snowflake/hooks/snowflake.py", line 170, in _get_conn_params
insecure_mode = to_boolean(
File "/home/test_airflow2/.local/lib/python3.8/site-packages/airflow/utils/strings.py", line 30, in to_boolean
return False if astring is None else astring.lower() in ['true', 't', 'y', 'yes', '1']
AttributeError: 'bool' object has no attribute 'lower'
[2022-01-25, 14:39:55 UTC] {local_task_job.py:154} INFO - Task exited with return code 1
[2022-01-25, 14:39:55 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
```
This error seems to arise because the boolean value of extra__snowflake__insecure_mode is interpreted as a string.
Manually modifying the boolean extra__snowflake__insecure_mode value to be a string in the "Extra" dictionary solves this problem:
**false->"false"**
![image](https://user-images.githubusercontent.com/82085639/150990538-b58a641e-cc16-4c49-88a6-1d32c4a3fc79.png)
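A sketch of a more defensive conversion than the current one (not necessarily the right fix), accepting both real booleans and strings:
```python
def to_boolean(value) -> bool:
    # Booleans pass through untouched instead of hitting .lower().
    if isinstance(value, bool):
        return value
    if value is None:
        return False
    return str(value).lower() in ("true", "t", "y", "yes", "1")
```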
### What you expected to happen
Be able to create a usable Snowflake Connection, by filling in fields other than "Extra". The "Extra" field should be automatically filled with a correct/usable connection dictionary.
I can then use this Snowflake Connection for SnowflakeOperators in DAGs.
### How to reproduce
Create a new Connection of type **Snowflake**, set an arbitrary Connection ID. The rest of the fields can be left empty (doesn't affect error).
![image](https://user-images.githubusercontent.com/82085639/150990006-87c9e388-872b-4323-b34b-3ea816f024bb.png)
Create a DAG with a SnowflakeOperator task, which uses the created Snowflake Connection:
```
from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator
from airflow.utils.dates import days_ago
with DAG('test', start_date=days_ago(2)) as dag:
    snowflake_task = SnowflakeOperator(
        task_id='snowflake_task',
        sql='select 1;',
        snowflake_conn_id='snowflake_conn',
    )
```
Execute the DAG, the task will fail and throw the above error.
### Operating System
Ubuntu 20.04.2 LTS
### Versions of Apache Airflow Providers
apache-airflow-providers-snowflake==2.4.0
### Deployment
Virtualenv installation
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21096 | https://github.com/apache/airflow/pull/21155 | 14cfefa44c0fd2bcbd0290fc5b4f7b6e3d0cf2d9 | 534e9ae117641b4147542f2deec2a077f0a42e2f | "2022-01-25T14:01:42Z" | python | "2022-01-28T22:12:07Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,087 | ["airflow/executors/kubernetes_executor.py", "tests/executors/test_kubernetes_executor.py"] | KubernetesJobWatcher failing on HTTP 410 errors, jobs stuck in scheduled state | ### Apache Airflow version
2.2.3 (latest released)
### What happened
After upgrading Airflow to 2.2.3 (from 2.2.2) and cncf.kubernetes provider to 3.0.1 (from 2.0.3) we started to see these errors in the logs:
```
{"asctime": "2022-01-25 08:19:39", "levelname": "ERROR", "process": 565811, "name": "airflow.executors.kubernetes_executor.KubernetesJobWatcher", "funcName": "run", "lineno": 111, "message": "Unknown error in KubernetesJobWatcher. Failing", "exc_info": "Traceback (most recent call last):\n File \"/usr/local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py\", line 102, in run\n self.resource_version = self._run(\n File \"/usr/local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py\", line 145, in _run\n for event in list_worker_pods():\n File \"/usr/local/lib/python3.9/site-packages/kubernetes/watch/watch.py\", line 182, in stream\n raise client.rest.ApiException(\nkubernetes.client.exceptions.ApiException: (410)\nReason: Expired: too old resource version: 655595751 (655818065)\n"}
Process KubernetesJobWatcher-6571:
Traceback (most recent call last):
File "/usr/local/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/usr/local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py", line 102, in run
self.resource_version = self._run(
File "/usr/local/lib/python3.9/site-packages/airflow/executors/kubernetes_executor.py", line 145, in _run
for event in list_worker_pods():
File "/usr/local/lib/python3.9/site-packages/kubernetes/watch/watch.py", line 182, in stream
raise client.rest.ApiException(
kubernetes.client.exceptions.ApiException: (410)
Reason: Expired: too old resource version: 655595751 (655818065)
```
Pods are created and run to completion, but it seems the KubernetesJobWatcher is incapable of seeing that they completed. From there Airflow goes to a complete halt.
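For context, a rough sketch (not the scheduler's actual code) of how a watch loop can recover from HTTP 410 by dropping the stale `resource_version` and re-watching:
```python
from kubernetes import client, watch


def watch_pods(v1: client.CoreV1Api, namespace: str, resource_version=None):
    w = watch.Watch()
    while True:
        try:
            for event in w.stream(
                v1.list_namespaced_pod,
                namespace=namespace,
                resource_version=resource_version,
            ):
                resource_version = event["object"].metadata.resource_version
                yield event
        except client.exceptions.ApiException as exc:
            if exc.status == 410:  # resource version too old
                resource_version = None  # force a fresh list on the next watch
                continue
            raise
```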
### What you expected to happen
No errors in the logs, and the job watcher does its job of collecting completed jobs.
### How to reproduce
I wish I knew. Trying to downgrade the cncf.kubernetes provider to previous versions to see if it helps.
### Operating System
k8s (Airflow images are Debian based)
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon 2.6.0
apache-airflow-providers-cncf-kubernetes 3.0.1
apache-airflow-providers-ftp 2.0.1
apache-airflow-providers-http 2.0.2
apache-airflow-providers-imap 2.1.0
apache-airflow-providers-postgres 2.4.0
apache-airflow-providers-sqlite 2.0.1
### Deployment
Other
### Deployment details
The deployment is on k8s v1.19.16, made with helm3.
### Anything else
This, in its symptoms, looks a lot like #17629 but happens in a different place.
Redeploying as suggested in that issue seemed to help, but most jobs that were supposed to run last night got stuck again. All jobs use the same pod template, without any customization.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21087 | https://github.com/apache/airflow/pull/23521 | cfa95af7e83b067787d8d6596caa3bc97f4b25bd | dee05b2ebca6ab66f1b447837e11fe204f98b2df | "2022-01-25T08:50:17Z" | python | "2022-05-11T06:25:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,083 | ["airflow/models/dagrun.py", "tests/jobs/test_scheduler_job.py"] | A high value of min_file_process_interval & max_active_runs=1 causes stuck dags | ### Apache Airflow version
2.2.2
### What happened
When the value of `AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL` is set to 86400, all DAGs whose `max_active_runs` is set to 1 stop executing and remain stuck forever. If `max_active_runs` is set to 2 or above, or `AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL` is set to a lower value (between 30 and 300), DAGs work just fine.
### What you expected to happen
These two settings should be independent of each other; one should not directly impact the other.
### How to reproduce
set AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL to 86400
set MAX_ACTIVE_RUNS to 1 on any dag & observe its execution dates.
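A minimal DAG sketch for the reproduction (schedule and dates are arbitrary; `DummyOperator` matches the 2.2.x API):
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator

with DAG(
    dag_id="stuck_dag_repro",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@hourly",
    max_active_runs=1,  # combined with a very high min_file_process_interval
    catchup=True,
) as dag:
    DummyOperator(task_id="noop")
```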
### Operating System
Debian GNU/Linux 11 (bullseye)
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==1!2.3.0
apache-airflow-providers-cncf-kubernetes==1!2.1.0
apache-airflow-providers-elasticsearch==1!2.1.0
apache-airflow-providers-ftp==1!2.0.1
apache-airflow-providers-google==1!6.1.0
apache-airflow-providers-http==1!2.0.1
apache-airflow-providers-imap==1!2.0.1
apache-airflow-providers-microsoft-azure==1!3.3.0
apache-airflow-providers-mysql==1!2.1.1
apache-airflow-providers-postgres==1!2.3.0
apache-airflow-providers-redis==1!2.0.1
apache-airflow-providers-slack==1!4.1.0
apache-airflow-providers-sqlite==1!2.0.1
apache-airflow-providers-ssh==1!2.3.0
### Deployment
Astronomer
### Deployment details
Deployed on AKS via Astronomer Helm chart.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21083 | https://github.com/apache/airflow/pull/21413 | 5fbf2471ab4746f5bc691ff47a7895698440d448 | feea143af9b1db3b1f8cd8d29677f0b2b2ab757a | "2022-01-25T05:49:42Z" | python | "2022-02-24T07:12:12Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,058 | ["airflow/models/taskinstance.py", "airflow/utils/state.py"] | Running airflow dags backfill --reset-dagruns <dag_id> -s <execution_start_dt> -e <execution_end_dt> results in error when run twice. | ### Apache Airflow version
2.2.3 (latest released)
### What happened
It's the same situation as https://github.com/apache/airflow/issues/21023.
The only change is using `airflow dags backfill` instead of `airflow dags test`.
``` bash
(airflow) [www@np-data-eng-airflow-sync001-lde-jp2v-prod ~]$ airflow dags backfill tutorial --reset-dagruns -s 2022-01-20 -e 2022-01-23
You are about to delete these 9 tasks:
<TaskInstance: tutorial.print_date scheduled__2022-01-20T07:48:54.720148+00:00 [success]>
<TaskInstance: tutorial.print_date scheduled__2022-01-21T07:48:54.720148+00:00 [success]>
<TaskInstance: tutorial.print_date scheduled__2022-01-22T07:48:54.720148+00:00 [success]>
<TaskInstance: tutorial.sleep scheduled__2022-01-20T07:48:54.720148+00:00 [success]>
<TaskInstance: tutorial.sleep scheduled__2022-01-21T07:48:54.720148+00:00 [success]>
<TaskInstance: tutorial.sleep scheduled__2022-01-22T07:48:54.720148+00:00 [success]>
<TaskInstance: tutorial.templated scheduled__2022-01-20T07:48:54.720148+00:00 [success]>
<TaskInstance: tutorial.templated scheduled__2022-01-21T07:48:54.720148+00:00 [success]>
<TaskInstance: tutorial.templated scheduled__2022-01-22T07:48:54.720148+00:00 [success]>
Are you sure? (yes/no):
y
Traceback (most recent call last):
File "/home1/www/venv3/airflow/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/__main__.py", line 48, in main
args.func(args)
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/cli/commands/dag_command.py", line 108, in dag_backfill
dag_run_state=State.NONE,
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/models/dag.py", line 1948, in clear_dags
dry_run=False,
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/models/dag.py", line 1887, in clear
dag_run_state=dag_run_state,
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 270, in clear_task_instances
dr.state = dag_run_state
File "<string>", line 1, in __set__
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/models/dagrun.py", line 194, in set_state
raise ValueError(f"invalid DagRun state: {state}")
ValueError: invalid DagRun state: None
```
### What you expected to happen
_No response_
### How to reproduce
1. Set up Airflow 2.2.3
2. Run any DAG (in my case, using the [tutorial dag file](https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html)).
3. Run the same DAG again via the `airflow dags backfill` command.
### Operating System
CentOS Linux 7.9
### Versions of Apache Airflow Providers
Providers info
apache-airflow-providers-celery | 2.1.0
apache-airflow-providers-cncf-kubernetes | 3.0.1
apache-airflow-providers-ftp | 2.0.1
apache-airflow-providers-http | 2.0.2
apache-airflow-providers-imap | 2.1.0
apache-airflow-providers-mysql | 2.1.1
apache-airflow-providers-redis | 2.0.1
apache-airflow-providers-slack | 4.1.0
apache-airflow-providers-sqlite | 2.0.1
apache-airflow-providers-ssh | 2.3.0
### Deployment
Virtualenv installation
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21058 | https://github.com/apache/airflow/pull/21116 | d97e2bac854f9891eb47f0c06c261e89723038ca | d17db3ce8ee8a8a724ced9502c73bc308740a358 | "2022-01-24T09:21:04Z" | python | "2022-01-27T05:36:58Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,057 | ["airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py"] | Templated fields for DynamoDBToS3Operator similar to MySQLToS3Operator | ### Description
I am using Airflow to move data periodically to our data lake and noticed that the MySQLToS3Operator has templated fields while the DynamoDBToS3Operator doesn't. I found a semi-awkward workaround, but proper templated fields would be nice.
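For reference, the workaround I'm currently using is roughly a local subclass that declares the templated attributes itself (just a sketch; the attribute names below are my best guess at the operator's constructor arguments and may not match the current signature exactly):
```python
from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import DynamoDBToS3Operator
class TemplatedDynamoDBToS3Operator(DynamoDBToS3Operator):
    # Hypothetical local subclass, only needed until upstream adds template_fields.
    template_fields = ("s3_bucket_name", "s3_key_prefix")
```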
I suppose an implementation could be as simple as adding
```python
template_fields = (
    's3_bucket',
    's3_key',
)
```
to the operator, similar to the MySQLToS3Operator. Potentially one could also add a log call in `execute()`, as well as using `with` for the `NamedTemporaryFile`.
### Use case/motivation
At the moment the MySQLToS3Operator and DynamoDBToS3Operator behave differently in terms of templating, and some of the code in the MySQLToS3Operator seems more refined.
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21057 | https://github.com/apache/airflow/pull/22080 | c8d49f63ca60fa0fb447768546c2503b746a66dd | a150ee0bc124f21b99fa94adbb16e6ccfe654ae4 | "2022-01-24T08:36:19Z" | python | "2022-03-08T13:09:07Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,046 | ["Dockerfile", "dev/breeze/src/airflow_breeze/build_image/prod/build_prod_params.py"] | More ArtifactsHub specific labels in docker image | ### Description
We added two labels that allow us to publish Docker images in [ArtifactHub](https://artifacthub.io/). We should consider adding other labels as well, if they help expose our image on ArtifactHub. https://artifacthub.io/docs/topics/repositories/#container-images-repositories
Probably, we should consider adding these labels with `docker build --label`, i.e. as a docker command argument instead of a Dockerfile instruction. For now, we should hold off on this change, as the image building process is undergoing a major overhaul (a rewrite from Bash to Python).
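For illustration, the kind of invocation this would mean (the label keys below are taken from the ArtifactHub container-image documentation linked above; the exact set and values are still to be decided):
```bash
docker build . \
  --label "io.artifacthub.package.readme-url=https://raw.githubusercontent.com/apache/airflow/main/README.md" \
  --label "io.artifacthub.package.license=Apache-2.0" \
  --label "io.artifacthub.package.keywords=airflow,workflow,scheduler"
```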
### Use case/motivation
Better exposure of our images on ArtifactHub
### Related issues
https://github.com/apache/airflow/pull/21040#issuecomment-1019423007
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21046 | https://github.com/apache/airflow/pull/23379 | 451c7cbc42a83a180c4362693508ed33dd1d1dab | 5b1ab96865a6e8f18784f88f88f8b17981450cc6 | "2022-01-23T17:22:29Z" | python | "2022-05-03T22:15:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,036 | ["airflow/www/views.py", "tests/www/views/test_views.py"] | Recent Tasks mixes status for current and old dag runs | ### Apache Airflow version
2.1.4
### What happened
The Recent Tasks column on the /home dashboard is showing the status of tasks from different DAG runs.
See images attached:
![image](https://user-images.githubusercontent.com/4999277/150655955-a43d2f0c-0648-4bf2-863d-5acc45c2ae8b.png)
![image](https://user-images.githubusercontent.com/4999277/150655960-7273f34e-7a94-4c05-b3d4-3973a868dfc0.png)
### What you expected to happen
As stated in the tooltip, I expect the Recent Tasks column to show the status of tasks for the last run if the DAG isn't currently running, or for the current DAG run (only) if there's one in progress.
### How to reproduce
- trigger a manual run of a DAG and make any task fail, so that some tasks get marked as `failed` and `upstream_failed`.
- then trigger the DAG again, and see how the Recent Tasks column on the /home dashboard shows the `failed` and `upstream_failed` tasks from the previous run along with the running/completed tasks from the current DAG run in progress.
### Operating System
using apache/airflow:2.1.4-python3.8 image
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21036 | https://github.com/apache/airflow/pull/21352 | 28378d867afaac497529bd2e1d2c878edf66f460 | 28d7bde2750c38300e5cf70ba32be153b1a11f2c | "2022-01-22T21:34:43Z" | python | "2022-02-14T15:55:00Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,023 | ["airflow/cli/commands/dag_command.py", "tests/cli/commands/test_dag_command.py"] | Running airflow dags test <dag_id> <execution_dt> results in error when run twice. | ### Apache Airflow version
main (development)
### What happened
# Product and Version
Airflow Version: v2.3.0.dev0 (Git Version: .release:2.3.0.dev0+7a9ab1d7170567b1d53938b2f7345dae2026c6ea), used to test and learn its functionalities. I installed this by cloning the repository with git and building Airflow in my macOS environment, using Python 3.9.
# Problem Statement
When I was doing a test on my DAG, I wanted run
`airflow dags test <dag_id> <execution_dt>` so that I don't have to use UI to trigger dag runs each time. Running and looking at the result of the dags test proved to be more productive when doing some rapid tests on your DAG.
The test runs perfectly the first time, but when I try to re-run it, the following error message is observed:
```
[2022-01-21 10:30:33,530] {migration.py:201} INFO - Context impl SQLiteImpl.
[2022-01-21 10:30:33,530] {migration.py:204} INFO - Will assume non-transactional DDL.
[2022-01-21 10:30:33,568] {dagbag.py:498} INFO - Filling up the DagBag from /Users/howardyoo/airflow/dags
[2022-01-21 10:30:33,588] {example_python_operator.py:67} WARNING - The virtalenv_python example task requires virtualenv, please install it.
[2022-01-21 10:30:33,594] {tutorial_taskflow_api_etl_virtualenv.py:29} WARNING - The tutorial_taskflow_api_etl_virtualenv example DAG requires virtualenv, please install it.
Traceback (most recent call last):
File "/Users/howardyoo/python3/bin/airflow", line 33, in <module>
sys.exit(load_entry_point('apache-airflow==2.3.0.dev0', 'console_scripts', 'airflow')())
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/__main__.py", line 48, in main
args.func(args)
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/cli/cli_parser.py", line 50, in command
return func(*args, **kwargs)
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/utils/session.py", line 71, in wrapper
return func(*args, session=session, **kwargs)
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/utils/cli.py", line 98, in wrapper
return f(*args, **kwargs)
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/cli/commands/dag_command.py", line 429, in dag_test
dag.clear(start_date=args.execution_date, end_date=args.execution_date, dag_run_state=State.NONE)
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/utils/session.py", line 71, in wrapper
return func(*args, session=session, **kwargs)
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/models/dag.py", line 1906, in clear
clear_task_instances(
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 286, in clear_task_instances
dr.state = dag_run_state
File "<string>", line 1, in __set__
File "/Users/howardyoo/python3/lib/python3.9/site-packages/airflow/models/dagrun.py", line 207, in set_state
raise ValueError(f"invalid DagRun state: {state}")
ValueError: invalid DagRun state: None
```
When going through the DAG runs in the UI, I noticed the following entry for my dag test run.
![Screen Shot 2022-01-21 at 10 31 52 AM](https://user-images.githubusercontent.com/32691630/150564356-f8b95b11-794a-451e-b5ad-ab9b59f3b52b.png)
It looks like when you run the DAG in `test` mode, the dag run is submitted with the `backfill` run type. I am not completely sure why `airflow dags test` only succeeds once, but it looks like some step that should clear out the test run may be missing (just my theory).
# Workaround
A viable workaround to stop it from failing is to find and delete the dag run instance. Once the above dag run entry is deleted, I can successfully run the `airflow dags test` command again.
### What you expected to happen
According to the documentation (https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html#id2), it is stated that:
> The same applies to airflow dags test [dag_id] [logical_date], but on a DAG level. It performs a single DAG run of the given DAG id. While it does take task dependencies into account, no state is registered in the database. It is convenient for locally testing a full run of your DAG, given that e.g. if one of your tasks expects data at some location, it is available.
It does not mention about whether you have to delete the dag run instance to re-run the test, so I would expect that `airflow dags test` command will run successfully, and also successfully on any consecutive runs without any errors.
### How to reproduce
- Get the reported version of airflow and install it to run.
- Run Airflow standalone using the `airflow standalone` command. It should start up the basic webserver, scheduler, and triggerer so you can start testing.
- Pick any existing DAG and run `airflow dags test <dag_id> <start_dt>` to initiate a DAG test.
- Once the test is finished, re-run the command and observe the error.
- Go to the DAG runs, delete the dag run that the first run produced, and run the test again - the test should run successfully.
### Operating System
MacOS Monterey (Version 12.1)
### Versions of Apache Airflow Providers
No providers were used
### Deployment
Other
### Deployment details
This Airflow instance is running standalone in my local macOS environment. I set up a dev environment by cloning from GitHub and building Airflow to run locally. It uses SQLite as its backend database and the SequentialExecutor to execute tasks sequentially.
### Anything else
Nothing much. I would like this issue to be resolved so that I can run my DAG tests easily without 'actually' running the DAG or relying on the UI. Also, there seems to be little information on what this `test` mode means and how it is different from normal runs, so improving the documentation to clarify it would be nice.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21023 | https://github.com/apache/airflow/pull/21031 | 795a6bb066c4f680dcc324758366d59c58eec572 | 515ea84335fc440fe022db2a0e3b158e0d7702da | "2022-01-21T16:47:21Z" | python | "2022-01-23T23:59:10Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,017 | ["airflow/models/taskinstance.py", "tests/models/test_taskinstance.py"] | Opening "Task Instance Details" page in web UI unintentionally resets TaskInstance.max_tries | ### Apache Airflow version
2.2.3 (latest released)
### What happened
`TaskInstance.max_tries` is a cumulative value that's stored in the database. Whenever a failed `TaskInstance` is cleared, its `max_tries` is incremented by `task.retries - 1`. However, the increased `max_tries` value of a `TaskInstance` is reset to the original value (`task.retries`) whenever a user opens the `Task Instance Details` page in the web UI. This is because `www/views.py` queries `TaskInstance` and then calls `refresh_from_task()` on it in a few places, which changes the attributes of the `TaskInstance`. The `session` used in `www/views.py` is then auto-committed, and the updated values such as `max_tries` are saved to the database.
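To spell out the pattern I mean, here is a simplified illustration (a sketch only, not a verbatim copy of the `www/views.py` code):
```python
from airflow.models import TaskInstance
def task_instance_details(session, task, dag_id, task_id):
    # Simplified sketch of what the view does when rendering "Task Instance Details".
    ti = (
        session.query(TaskInstance)
        .filter(TaskInstance.dag_id == dag_id, TaskInstance.task_id == task_id)
        .first()
    )
    ti.refresh_from_task(task)  # resets ti.max_tries to task.retries in memory
    # The request-scoped session is committed automatically afterwards,
    # which persists the reset max_tries back to the database.
    return ti
```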
This looks like a bug. So far the only effect I noticed is that the reset `max_tries` causes sensors in "reschedule" mode to time out prematurely. It happens because of the following lines in `sensors/base.py`. When `max_tries` is reset, this code always gets `first_try_number` 1, which may have happened a long time ago, so it thinks the sensor has timed out.
```python
if self.reschedule:
    # If reschedule, use the start date of the first try (first try can be either the very
    # first execution of the task, or the first execution after the task was cleared.)
    first_try_number = context['ti'].max_tries - self.retries + 1
    task_reschedules = TaskReschedule.find_for_task_instance(
        context['ti'], try_number=first_try_number
    )
```
### What you expected to happen
Opening "Task Instance Details" page should not make any changes to the state of the TaskInstance in the database.
### How to reproduce
Create a DAG like this:
```python
from datetime import datetime
from airflow import DAG
from airflow.sensors.python import PythonSensor
with DAG(
    dag_id="reschedule_clear",
    start_date=datetime(2021, 1, 1),
    catchup=False,
    schedule_interval="@daily",
) as dag:
    task_1 = PythonSensor(
        task_id="task_1",
        python_callable=lambda: False,
        mode="reschedule",
        timeout=90,
        retries=5,
    )
```
```
State of the TaskInstance after it failed for the first time:
airflow=# select max_tries, try_number from task_instance ;
max_tries | try_number
-----------+------------
5 | 1
(1 row)
After the failed TaskInstance was cleared:
airflow=# select max_tries, try_number from task_instance ;
max_tries | try_number
-----------+------------
6 | 1
(1 row)
After the TaskInstance failed for a second time:
airflow=# select max_tries, try_number from task_instance ;
max_tries | try_number
-----------+------------
6 | 2
(1 row)
After the TaskInstance is cleared a second time:
airflow=# select max_tries, try_number from task_instance ;
max_tries | try_number
-----------+------------
7 | 2
(1 row)
After user opens the "Task Instance Details" page in Web UI:
airflow=# select max_tries, try_number from task_instance ;
max_tries | try_number
-----------+------------
5 | 2
(1 row)
```
This is the exception raised when the `max_tries` was accidentally changed from 7 to 5. Note these two lines when the total attempt changed from 8 to 6 after the user clicked on "Task Instance Details":
```
[2022-01-21, 13:57:23 UTC] {taskinstance.py:1274} INFO - Starting attempt 3 of 8
...
[2022-01-21, 13:58:25 UTC] {taskinstance.py:1274} INFO - Starting attempt 3 of 6
```
```
--------------------------------------------------------------------------------
[2022-01-21, 13:57:23 UTC] {taskinstance.py:1274} INFO - Starting attempt 3 of 8
[2022-01-21, 13:57:23 UTC] {taskinstance.py:1275} INFO -
...
[2022-01-21, 13:57:23 UTC] {taskinstance.py:1724} INFO - Rescheduling task, marking task as UP_FOR_RESCHEDULE
[2022-01-21, 13:57:23 UTC] {local_task_job.py:156} INFO - Task exited with return code 0
...
--------------------------------------------------------------------------------
[2022-01-21, 13:58:25 UTC] {taskinstance.py:1274} INFO - Starting attempt 3 of 6
[2022-01-21, 13:58:25 UTC] {taskinstance.py:1275} INFO -
--------------------------------------------------------------------------------
[2022-01-21, 13:58:25 UTC] {taskinstance.py:1294} INFO - Executing <Task(PythonSensor): task_1> on 2022-01-20 00:00:00+00:00
[2022-01-21, 13:58:25 UTC] {standard_task_runner.py:52} INFO - Started process 17578 to run task
[2022-01-21, 13:58:25 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'reschedule_clear', 'task_1', 'scheduled__2022-01-20T00:00:00+00:00', '--job-id', '9', '--raw', '--subdir', '/opt/***/***/example_dags/reschedule_clear.py', '--cfg-path', '/tmp/tmpmkyuigjb', '--error-file', '/tmp/tmpvvtn1w1z']
[2022-01-21, 13:58:25 UTC] {standard_task_runner.py:77} INFO - Job 9: Subtask task_1
[2022-01-21, 13:58:25 UTC] {logging_mixin.py:115} INFO - Running <TaskInstance: reschedule_clear.task_1 scheduled__2022-01-20T00:00:00+00:00 [running]> on host f9e8f282cff4
[2022-01-21, 13:58:25 UTC] {logging_mixin.py:115} WARNING - /opt/***/***/models/taskinstance.py:839 DeprecationWarning: Passing 'execution_date' to 'XCom.clear()' is deprecated. Use 'run_id' instead.
[2022-01-21, 13:58:25 UTC] {taskinstance.py:1459} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=reschedule_clear
AIRFLOW_CTX_TASK_ID=task_1
AIRFLOW_CTX_EXECUTION_DATE=2022-01-20T00:00:00+00:00
AIRFLOW_CTX_DAG_RUN_ID=scheduled__2022-01-20T00:00:00+00:00
[2022-01-21, 13:58:25 UTC] {python.py:70} INFO - Poking callable: <function <lambda> at 0x7fc2eb01b3a0>
[2022-01-21, 13:58:25 UTC] {taskinstance.py:1741} ERROR - Task failed with exception
Traceback (most recent call last):
File "/opt/airflow/airflow/models/taskinstance.py", line 1364, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/opt/airflow/airflow/models/taskinstance.py", line 1490, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/opt/airflow/airflow/models/taskinstance.py", line 1546, in _execute_task
result = execute_callable(context=context)
File "/opt/airflow/airflow/sensors/base.py", line 265, in execute
raise AirflowSensorTimeout(f"Snap. Time is OUT. DAG id: {log_dag_id}")
airflow.exceptions.AirflowSensorTimeout: Snap. Time is OUT. DAG id: reschedule_clear
[2022-01-21, 13:58:25 UTC] {taskinstance.py:1302} INFO - Immediate failure requested. Marking task as FAILED. dag_id=reschedule_clear, task_id=task_1, execution_date=20220120T000000, start_date=20220121T135825, end_date=20220121T135825
[2022-01-21, 13:58:25 UTC] {standard_task_runner.py:89} ERROR - Failed to execute job 9 for task task_1
Traceback (most recent call last):
File "/opt/airflow/airflow/task/task_runner/standard_task_runner.py", line 85, in _start_by_fork
args.func(args, dag=self.dag)
File "/opt/airflow/airflow/cli/cli_parser.py", line 49, in command
return func(*args, **kwargs)
File "/opt/airflow/airflow/utils/cli.py", line 98, in wrapper
return f(*args, **kwargs)
File "/opt/airflow/airflow/cli/commands/task_command.py", line 339, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/opt/airflow/airflow/cli/commands/task_command.py", line 147, in _run_task_by_selected_method
_run_raw_task(args, ti)
File "/opt/airflow/airflow/cli/commands/task_command.py", line 220, in _run_raw_task
ti._run_raw_task(
File "/opt/airflow/airflow/utils/session.py", line 71, in wrapper
return func(*args, session=session, **kwargs)
File "/opt/airflow/airflow/models/taskinstance.py", line 1364, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/opt/airflow/airflow/models/taskinstance.py", line 1490, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/opt/airflow/airflow/models/taskinstance.py", line 1546, in _execute_task
result = execute_callable(context=context)
File "/opt/airflow/airflow/sensors/base.py", line 265, in execute
raise AirflowSensorTimeout(f"Snap. Time is OUT. DAG id: {log_dag_id}")
airflow.exceptions.AirflowSensorTimeout: Snap. Time is OUT. DAG id: reschedule_clear
[2022-01-21, 13:58:25 UTC] {local_task_job.py:156} INFO - Task exited with return code 1
[2022-01-21, 13:58:25 UTC] {local_task_job.py:273} INFO - 0 downstream tasks scheduled from follow-on schedule check
```
### Operating System
Any
### Versions of Apache Airflow Providers
Any
### Deployment
Virtualenv installation
### Deployment details
Any
### Anything else
Every time a user opens "Task Instance Details" page.
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21017 | https://github.com/apache/airflow/pull/21018 | 0d623296f1e5b6354e37dc0a45b2e4ed1a13901e | e3832a77a3e0d374dfdbe14f34a941d22c9c459d | "2022-01-21T14:03:58Z" | python | "2022-01-26T22:47:10Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,999 | ["docs/helm-chart/index.rst"] | ArgoCD deployment: build in redis is not restarted on password change | ### Official Helm Chart version
1.4.0 (latest released)
### Apache Airflow version
2.2.3 (latest released)
### Kubernetes Version
1.21.x,1.18.x
### Helm Chart configuration
CeleryKubernetesExecutor used
### Docker Image customisations
_No response_
### What happened
On each ArgoCD Airflow app update, `{{ .Release.Name }}-redis-password` and `{{ .Release.Name }}-broker-url` are regenerated (since ArgoCD does not honor `"helm.sh/hook": "pre-install"`). Airflow pods are restarted (as expected) and pick up the new Redis connection, and connections start failing with WRONG_PASSWORD, since Redis is not restarted (it still uses the old password). Redis in chart 1.4 has no health checks.
### What you expected to happen
Generally, I have two options:
1. (Preferred) Add a health check to Redis that includes a login sequence. The password should be updated on the fly (read from the mounted secret and try to connect).
2. (Workaround) I implemented a workaround in a parent chart: the `{{ .Release.Name }}-redis-password` and `{{ .Release.Name }}-broker-url` secrets are generated from a template where `immutable: true` is added to the secret definition (see the sketch below).
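A rough sketch of what that parent-chart secret template looks like (the `password` key name and the `.Values.redisPassword` value path are assumptions; adjust them to whatever the deployed chart actually expects):
```yaml
apiVersion: v1
kind: Secret
metadata:
  name: {{ .Release.Name }}-redis-password
type: Opaque
immutable: true
stringData:
  password: {{ .Values.redisPassword | quote }}
```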
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20999 | https://github.com/apache/airflow/pull/29078 | 30ad26e705f50442f05dd579990372196323fc86 | 6c479437b1aedf74d029463bda56b42950278287 | "2022-01-20T20:57:48Z" | python | "2023-01-27T20:58:56Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,993 | ["airflow/models/dag.py", "tests/models/test_dag.py"] | DagFileProcessor 'NoneType' is not iterable | ### Apache Airflow version
2.2.2
### What happened
I'm seeing the same log repeating in the Scheduler.
I'm working in a restricted network, so I cannot paste the entire log:
```
in bulk_write_to_db
if orm_tag.name not in set(dag.tags)
TypeError: 'NoneType' object is not iterable
```
I saw that a single DAG didn't have any tags, and I tried to add one, but the log is still showing up.
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
Debian 10 (Scheduler image)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
I'm deploying on OpenShift 4.8 using the official Helm Chart v1.3.0
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20993 | https://github.com/apache/airflow/pull/21757 | 768d851ca995bbe46cfdaeed7c46a51201b723e2 | d7265791187fb2117dfd090cdb7cce3f8c20866c | "2022-01-20T17:21:52Z" | python | "2022-02-28T00:35:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,991 | ["airflow/providers/elasticsearch/log/es_task_handler.py", "airflow/utils/log/file_task_handler.py"] | Elasticsearch remote log will not fetch task logs from manual dagruns before 2.2 upgrade | ### Apache Airflow Provider(s)
elasticsearch
### Versions of Apache Airflow Providers
```
apache-airflow-providers-amazon==1!2.5.0
apache-airflow-providers-cncf-kubernetes==1!2.1.0
apache-airflow-providers-datadog==1!2.0.1
apache-airflow-providers-elasticsearch==1!2.1.0
apache-airflow-providers-ftp==1!2.0.1
apache-airflow-providers-google==1!6.1.0
apache-airflow-providers-http==1!2.0.1
apache-airflow-providers-imap==1!2.0.1
apache-airflow-providers-microsoft-azure==1!3.3.0
apache-airflow-providers-mysql==1!2.1.1
apache-airflow-providers-postgres==1!2.3.0
apache-airflow-providers-redis==1!2.0.1
apache-airflow-providers-slack==1!4.1.0
apache-airflow-providers-sqlite==1!2.0.1
apache-airflow-providers-ssh==1!2.3.0
```
### Apache Airflow version
2.2.2
### Operating System
Debian Bullseye
### Deployment
Astronomer
### Deployment details
_No response_
### What happened
After upgrading to 2.2, task logs from manual dagruns performed before the upgrade could no longer be retrieved, even though they can still be seen in Kibana. Scheduled dagruns' tasks and tasks for dagruns begun after the upgrade are retrieved without issue.
The issue appears to be because these tasks with missing logs all belong to dagruns that do not have the attribute data_interval_start or data_interval_end set.
### What you expected to happen
Task logs continue to be fetched after upgrade.
### How to reproduce
Below is how I verified the log fetching process.
I ran the code snippet in a python interpreter in the scheduler to test log fetching.
```py
from airflow.models import TaskInstance, DagBag, DagRun
from airflow.settings import Session, DAGS_FOLDER
from airflow.configuration import conf
import logging
from dateutil import parser
logger = logging.getLogger('airflow.task')
task_log_reader = conf.get('logging', 'task_log_reader')
handler = next((handler for handler in logger.handlers if handler.name == task_log_reader), None)
dag_id = 'pipeline_nile_reconciliation'
task_id = 'nile_overcount_spend_resolution_task'
execution_date = parser.parse('2022-01-10T11:49:57.197933+00:00')
try_number=1
session = Session()
ti = session.query(TaskInstance).filter(
    TaskInstance.dag_id == dag_id,
    TaskInstance.task_id == task_id,
    TaskInstance.execution_date == execution_date).first()
dagrun = session.query(DagRun).filter(
    DagRun.dag_id == dag_id,
    DagRun.execution_date == execution_date).first()
dagbag = DagBag(DAGS_FOLDER, read_dags_from_db=True)
dag = dagbag.get_dag(dag_id)
ti.task = dag.get_task(ti.task_id)
ti.dagrun = dagrun
handler.read(ti, try_number, {})
```
The following error log indicates errors in the log reading.
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.9/site-packages/airflow/utils/log/file_task_handler.py", line 239, in read
log, metadata = self._read(task_instance, try_number_element, metadata)
File "/usr/local/lib/python3.9/site-packages/airflow/providers/elasticsearch/log/es_task_handler.py", line 168, in _read
log_id = self._render_log_id(ti, try_number)
File "/usr/local/lib/python3.9/site-packages/airflow/providers/elasticsearch/log/es_task_handler.py", line 107, in _render_log_id
data_interval_start = self._clean_date(dag_run.data_interval_start)
File "/usr/local/lib/python3.9/site-packages/airflow/providers/elasticsearch/log/es_task_handler.py", line 134, in _clean_date
return value.strftime("%Y_%m_%dT%H_%M_%S_%f")
AttributeError: 'NoneType' object has no attribute 'strftime'
```
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20991 | https://github.com/apache/airflow/pull/21289 | c9fdcdbc72543219314430e3de712d69033c57cf | 44bd211b19dcb75eeb53ced5bea2cf0c80654b1a | "2022-01-20T15:33:04Z" | python | "2022-02-12T03:40:29Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,966 | ["airflow/hooks/subprocess.py", "airflow/providers/cncf/kubernetes/utils/pod_manager.py", "tests/hooks/test_subprocess.py"] | Exception when parsing log | ### Apache Airflow version
2.1.4
### What happened
```
[2022-01-19 13:42:46,107] {taskinstance.py:1463} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1165, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1283, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1313, in _execute_task
    result = task_copy.execute(context=context)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 367, in execute
    final_state, remote_pod, result = self.create_new_pod_for_operator(labels, launcher)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 521, in create_new_pod_for_operator
    final_state, remote_pod, result = launcher.monitor_pod(pod=self.pod, get_logs=self.get_logs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/cncf/kubernetes/utils/pod_launcher.py", line 148, in monitor_pod
    timestamp, message = self.parse_log_line(line.decode('utf-8'))
UnicodeDecodeError: 'utf-8' codec can't decode bytes in position 16413-16414: invalid continuation byte
```
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
Debian GNU/Linux 10 (buster)
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.2.0
apache-airflow-providers-celery==2.0.0
apache-airflow-providers-cncf-kubernetes==2.0.2
apache-airflow-providers-docker==2.1.1
apache-airflow-providers-elasticsearch==2.0.3
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==5.1.0
apache-airflow-providers-grpc==2.0.1
apache-airflow-providers-hashicorp==2.1.0
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-microsoft-azure==3.1.1
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-postgres==2.2.0
apache-airflow-providers-redis==2.0.1
apache-airflow-providers-sendgrid==2.0.1
apache-airflow-providers-sftp==2.1.1
apache-airflow-providers-slack==4.0.1
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.1.1
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### Anything else
It always happens with one specific Docker container.
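For what it's worth, a tolerant decode at the call site shown in the traceback would probably avoid the hard failure (just a sketch of the idea, not a tested patch):
```python
# airflow/providers/cncf/kubernetes/utils/pod_launcher.py, inside monitor_pod (illustrative only)
timestamp, message = self.parse_log_line(line.decode('utf-8', errors='replace'))
```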
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20966 | https://github.com/apache/airflow/pull/23301 | c5b72bf30c8b80b6c022055834fc7272a1a44526 | 863b2576423e1a7933750b297a9b4518ae598db9 | "2022-01-19T21:10:09Z" | python | "2022-05-10T20:43:25Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,911 | ["scripts/docker/install_mysql.sh"] | Apt MySQL Client fails to install due to incorrect GPG Key | ### Apache Airflow version
2.0.2
### What happened
When attempting to rebuild an apache airflow image, we are getting failures during our builds when trying to run `apt-get update`.
The error we see:
```
#5 3.879 W: GPG error: http://repo.mysql.com/apt/debian buster InRelease: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 467B942D3A79BD29
#5 3.879 E: The repository 'http://repo.mysql.com/apt/debian buster InRelease' is not signed.
```
### What you expected to happen
This behaviour shouldn't occur but it looks like some changes were published to the `mysql` apt repository this morning which is when we started experiencing issues.
<img width="552" alt="Screen Shot 2022-01-17 at 5 17 49 PM" src="https://user-images.githubusercontent.com/40478775/149842772-78390787-5556-409e-9ab7-67f1260ec48a.png">
I think we just need to update the [airflow script](https://github.com/apache/airflow/blob/2.0.2/scripts/docker/install_mysql.sh ) which installs mysql and has a key already hardcoded [here](https://github.com/apache/airflow/blob/10023fdd65fa78033e7125d3d8103b63c127056e/scripts/docker/install_mysql.sh#L42).
### How to reproduce
Create a `Dockerfile`. Add the following lines to the Dockerfile:
```
FROM apache/airflow:2.0.2-python3.7 as airflow_main
USER root
RUN apt-get update
```
In a shell
run command -> `docker build .`
### Operating System
Debian 10
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
We were able to fix this by running the following command before the `apt-get update`
```
FROM apache/airflow:2.0.2-python3.7 as airflow_main
USER root
RUN sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 467B942D3A79BD29
RUN apt-get update
```
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20911 | https://github.com/apache/airflow/pull/20912 | ddedda420d451866cf177c1eb8efa8d8b9b9f78d | 7e29506037fa543f5d9b438320db064cdb820c7b | "2022-01-17T22:29:57Z" | python | "2022-01-18T03:35:42Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,901 | ["airflow/providers/google/cloud/transfers/gcs_to_local.py", "tests/providers/google/cloud/transfers/test_gcs_to_local.py"] | Bytes cast to String in `airflow.providers.google.cloud.transfers.gcs_to_local.GCSToLocalFilesystemOperator` ~142 | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
```
$ pip freeze | grep apache-airflow-providers
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==6.3.0
apache-airflow-providers-http==2.0.2
apache-airflow-providers-imap==2.1.0
apache-airflow-providers-pagerduty==2.1.0
apache-airflow-providers-sftp==2.4.0
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.3.0
```
### Apache Airflow version
2.2.3 (latest released)
### Operating System
Ubuntu 20.04.3 LTS
### Deployment
Composer
### Deployment details
_No response_
### What happened
Using `airflow.providers.google.cloud.transfers.gcs_to_local.GCSToLocalFilesystemOperator` to load the contents of a file into `xcom` unexpectedly casts the file bytes to string.
### What you expected to happen
`GCSToLocalFilesystemOperator` should not cast to string
### How to reproduce
Store a file on GCS:
```
Hello World!
```
Read the file into XCom:
```
my_task = GCSToLocalFilesystemOperator(
task_id='my_task',
bucket=bucket,
object_name=object_path,
store_to_xcom_key='my_xcom_key',
)
```
Access it via Jinja:
```
{{ ti.xcom_pull(task_ids="my_task", key="my_xcom_key") }}
```
The XCom result is:
```
b'Hello World!'
```
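A workaround for now is to skip `store_to_xcom_key` and write the object to a local file instead, then read and decode it explicitly in the downstream task (the local path below is just illustrative):
```python
my_task = GCSToLocalFilesystemOperator(
    task_id='my_task',
    bucket=bucket,
    object_name=object_path,
    filename='/tmp/my_object',  # illustrative local path
)
```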
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20901 | https://github.com/apache/airflow/pull/20919 | b171e03924fba92924162563f606d25f0d75351e | b8526abc2c220b1e07eed83694dfee972c2e2609 | "2022-01-17T10:04:57Z" | python | "2022-01-19T11:39:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,877 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/www/views.py", "docs/apache-airflow/howto/customize-ui.rst", "tests/www/views/test_views_base.py"] | Allow for Markup in UI page title | ### Description
A custom page title can be set on the UI with the `instance_name` variable in `airflow.cfg`. It would be nice to have the option to include Markup text in that variable for further customization of the title, similar to how dashboard alerts introduced in #18284 allow for `html=True`.
### Use case/motivation
It would be useful to be able to use formatting like underline, bold, color, etc, in the UI title. For example, color-coded environments to minimize chances of a developer triggering a DAG in the wrong environment:
![title](https://user-images.githubusercontent.com/50751110/149528236-403473ed-eba5-4102-a804-0fa78fca8856.JPG)
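For example, this is the kind of configuration the feature would enable (illustrative only; markup in `instance_name` is not currently rendered):
```ini
[webserver]
instance_name = <b style="color: red">DEV</b> Airflow
```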
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20877 | https://github.com/apache/airflow/pull/20888 | 75755d7f65fb06c6e2e74f805b877774bfa7fcda | a65555e604481388da40cea561ca78f5cabb5f50 | "2022-01-14T14:10:45Z" | python | "2022-01-19T12:44:26Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,876 | ["airflow/migrations/versions/e655c0453f75_add_taskmap_and_map_id_on_taskinstance.py", "airflow/models/taskinstance.py", "airflow/models/taskreschedule.py"] | Airflow database upgrade fails with "psycopg2.errors.NotNullViolation: column "map_index" of relation "task_instance" contains null value"s | ### Apache Airflow version
main (development)
### What happened
I currently have Airflow 2.2.3 and due to this [issue](https://github.com/apache/airflow/issues/19699) I have tried to upgrade Airflow to this [commit](https://github.com/apache/airflow/commit/14ee831c7ad767e31a3aeccf3edbc519b3b8c923).
When I run `airflow db upgrade` I get the following error:
```
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
INFO [alembic.runtime.migration] Running upgrade 587bdf053233 -> e655c0453f75, Add TaskMap and map_index on TaskInstance.
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.7/dist-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.NotNullViolation: column "map_index" of relation "task_instance" contains null values
```
The map_index column was introduced with this [PR](https://github.com/apache/airflow/pull/20286).
Could you please advise?
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
Ubuntu 18.04.6 LTS
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other
### Deployment details
Kubernetes
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20876 | https://github.com/apache/airflow/pull/20902 | 7e29506037fa543f5d9b438320db064cdb820c7b | 66276e68ba37abb2991cb0e03ca93c327fc63a09 | "2022-01-14T14:10:05Z" | python | "2022-01-18T04:55:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,873 | ["airflow/www/static/css/bootstrap-theme.css"] | Distinguish links in DAG documentation from code | ### Description
Currently, code blocks (i.e. using backticks in markdown) look the same as links when using `dag.doc_md`.
This makes it very difficult to distinguish what is clickable.
For example in the image below,
![image](https://user-images.githubusercontent.com/3199181/149515893-a84d0ef8-a829-4171-9da9-8a438ec6d810.png)
`document-chunker` is a code block, whereas `offline-processing-storage-chunking-manual-trigger-v5` is a link - but they appear identical.
If this were rendered on GitHub, it would look like:
> The restored files must subsequently be restored using a manual trigger of `document-chunker`: [offline-processing-storage-chunking-manual-trigger-v5](https://invalid-link.com), set `s3_input_base_path` as copied and provide the same start and end dates.
notice that the link is clearly different to the code block.
We're using Airflow version 2.1.4.
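A minimal example of the kind of `doc_md` that produces the screenshot above (the link target is a placeholder):
```python
dag.doc_md = """
The restored files must subsequently be restored using a manual trigger of `document-chunker`:
[offline-processing-storage-chunking-manual-trigger-v5](https://example.com), set `s3_input_base_path`
as copied and provide the same start and end dates.
"""
```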
### Use case/motivation
We're trying to link between the manual recovery steps that are possible with Airflow runs, reducing the need to write long pieces of documentation.
The links are currently difficult to distinguish, which causes issues when following the instructions.
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20873 | https://github.com/apache/airflow/pull/20938 | 892204105154fdc520758e66512fb6021d404e57 | cdb120de2403f5a21aa6d84d10f68c1b7f086aba | "2022-01-14T12:32:24Z" | python | "2022-01-19T16:18:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,839 | ["airflow/www/views.py", "tests/www/views/test_views_connection.py"] | Cannot edit custom fields on provider connections | ### Apache Airflow version
2.2.3 (latest released)
### What happened
Connections from providers are not saving edited values in any custom connection form fields. You can work around the issue by changing the connection type to something like HTTP and modifying the extra field's JSON.
### What you expected to happen
_No response_
### How to reproduce
Using the official docker compose deployment, add a new connection of type 'Azure Data Explorer' and fill out one of the custom connection fields (e.g., "Tenant ID"). Save the record. Edit the record and enter a new value for the same field or any other field that is defined in the associated Hook's `get_connection_form_widgets` function. Save the record. Edit again. The changes were not saved.
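For context, the custom fields come from hook code shaped roughly like this (the field key below is approximated from the Azure Data Explorer hook, not copied verbatim):
```python
@staticmethod
def get_connection_form_widgets() -> dict:
    from flask_appbuilder.fieldwidgets import BS3TextFieldWidget
    from flask_babel import lazy_gettext
    from wtforms import StringField
    return {
        "extra__azure_data_explorer__tenant": StringField(
            lazy_gettext("Tenant ID"), widget=BS3TextFieldWidget()
        ),
    }
```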
### Operating System
Windows 10, using docker
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20839 | https://github.com/apache/airflow/pull/20883 | bad070f7f484a9b4065a0d86195a1e8002d9bfef | 44df1420582b358594c8d7344865811cff02956c | "2022-01-13T01:05:58Z" | python | "2022-01-24T00:33:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,832 | ["airflow/providers/amazon/aws/hooks/glue.py", "tests/providers/amazon/aws/hooks/test_glue.py"] | Unable to specify Python version for AwsGlueJobOperator | ### Apache Airflow Provider(s)
amazon
### Versions of Apache Airflow Providers
_No response_
### Apache Airflow version
2.0.2
### Operating System
Amazon Linux
### Deployment
MWAA
### Deployment details
_No response_
### What happened
When a new Glue job is created using the AwsGlueJobOperator, the job defaults to Python 2. Setting the version in `create_job_kwargs` fails with a KeyError.
### What you expected to happen
Expected the Glue job to be created with a Python 3 runtime. `create_job_kwargs` is passed to the boto3 Glue client's `create_job` method, which accepts a "Command" parameter that is a dictionary containing the Python version.
### How to reproduce
Create a dag with an AwsGlueJobOperator and pass a "Command" parameter in the create_job_kwargs argument.
```python
create_glue_job_args = {
    "Command": {
        "Name": "abalone-preprocess",
        "ScriptLocation": f"s3://{output_bucket}/code/preprocess.py",
        "PythonVersion": "3"
    }
}
glue_etl = AwsGlueJobOperator(
    task_id="glue_etl",
    s3_bucket=output_bucket,
    script_args={
        '--S3_INPUT_BUCKET': data_bucket,
        '--S3_INPUT_KEY_PREFIX': 'input/raw',
        '--S3_UPLOADS_KEY_PREFIX': 'input/uploads',
        '--S3_OUTPUT_BUCKET': output_bucket,
        '--S3_OUTPUT_KEY_PREFIX': str(determine_dataset_id.output) + '/input/data'
    },
    iam_role_name="MLOps",
    retry_limit=2,
    concurrent_run_limit=3,
    create_job_kwargs=create_glue_job_args,
    dag=dag)
```
```
[2022-01-04 16:43:42,053] {{logging_mixin.py:104}} INFO - [2022-01-04 16:43:42,053] {{glue.py:190}} ERROR - Failed to create aws glue job, error: 'Command'
[2022-01-04 16:43:42,081] {{logging_mixin.py:104}} INFO - [2022-01-04 16:43:42,081] {{glue.py:112}} ERROR - Failed to run aws glue job, error: 'Command'
[2022-01-04 16:43:42,101] {{taskinstance.py:1482}} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/glue.py", line 166, in get_or_create_glue_job
get_job_response = glue_client.get_job(JobName=self.job_name)
File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 357, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 676, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.errorfactory.EntityNotFoundException: An error occurred (EntityNotFoundException) when calling the GetJob operation: Job with name: abalone-preprocess not found.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1138, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1341, in _execute_task
result = task_copy.execute(context=context)
File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/operators/glue.py", line 121, in execute
glue_job_run = glue_job.initialize_job(self.script_args)
File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/glue.py", line 108, in initialize_job
job_name = self.get_or_create_glue_job()
File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/glue.py", line 186, in get_or_create_glue_job
**self.create_job_kwargs,
KeyError: 'Command'
```
### Anything else
When a new job is being created.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20832 | https://github.com/apache/airflow/pull/24215 | fd4e344880b505cfe53a97a6373d329515bbc7a3 | 41898d89220c8525d941367a875b4806e618b0d0 | "2022-01-12T17:18:16Z" | python | "2022-06-06T14:41:55Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,823 | ["STATIC_CODE_CHECKS.rst"] | Static check docs mistakes in example | ### Describe the issue with documentation
<img width="1048" alt="Screenshot 2022-01-12 at 4 22 44 PM" src="https://user-images.githubusercontent.com/10162465/149127114-5810f86e-83eb-40f6-b438-5b18b7026e86.png">
Run the flake8 check for the tests.core package with verbose output:
./breeze static-check mypy -- --files tests/hooks/test_druid_hook.py
The doc says this is the flake8 check for the tests.core package, but the command actually runs the mypy check on specific files.
### How to solve the problem
It can be solved by documenting the right instruction, for example:
./breeze static-check flake8 -- --files tests/core/* --verbose
I didn't check whether the above command works exactly as written, but we have to use a similar command to the one above.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20823 | https://github.com/apache/airflow/pull/20844 | 8dc68d47048d559cf4b76874d8d5e7a5af6359b6 | c49d6ec8b67e48d0c0fba1fe30c00f590f88ae65 | "2022-01-12T11:14:33Z" | python | "2022-01-13T07:30:46Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,804 | ["airflow/dag_processing/manager.py", "airflow/sensors/smart_sensor.py", "tests/dag_processing/test_manager.py", "tests/sensors/test_smart_sensor_operator.py"] | Some timing metrics are in seconds but reported as milliseconds | ### Apache Airflow version
2.2.2
### What happened
When Airflow reports timing stats it uses either a `timedelta` or a direct value. When using `timedelta` it is converted automatically to the correct units of measurement but when using a direct value it is accepted to already be in the correct units.
Unfortunately the Stats class, either being statsd.StatsClient or a stub, expects *milliseconds* while the Airflow code passes the value in *seconds*.
The result is two of the timing metrics are wrong by a magnitude of 1000.
This affects `dag_processing.last_duration.<dag_file>` and `smart_sensor_operator.loop_duration`.
The rest either pass `timedelta` or use a `Stats.timer` which calculates timing on its own and is not affected.
### What you expected to happen
All timing metrics to be in the correct unit of measurement.
### How to reproduce
Run a statsd-exporter and a prometheus to collect the metrics and compare to the logs.
For the dag processing metric, the scheduler logs the amounts and can be directly compared to the gathered metric.
### Operating System
Linux
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
Using these two with the configs below to process the metrics. The metrics can be viewed in the prometheus UI on `localhost:9090`.
```yaml
  prometheus:
    image: prom/prometheus:v2.32.1
    command:
      - --config.file=/etc/prometheus/config.yml
      - --web.console.libraries=/usr/share/prometheus/console_libraries
      - --web.console.templates=/usr/share/prometheus/consoles
    ports:
      - 9090:9090
    volumes:
      - ./prometheus:/etc/prometheus
  statsd-exporter:
    image: prom/statsd-exporter:v0.22.4
    command:
      - --statsd.mapping-config=/tmp/statsd_mapping.yml
    ports:
      - 9102:9102
      - 9125:9125
      - 9125:9125/udp
    volumes:
      - ./prometheus/statsd_mapping.yml:/tmp/statsd_mapping.yml
```
The prometheus config is:
```yaml
global:
  scrape_interval: 15s
scrape_configs:
  - job_name: airflow_statsd
    scrape_interval: 1m
    scrape_timeout: 30s
    static_configs:
      - targets:
          - statsd-exporter:9102
```
The metrics mapping for statsd-exporter is:
```yaml
mappings:
  - match: "airflow.dag_processing.last_duration.*"
    name: "airflow_dag_processing_last_duration"
    labels:
      dag_file: "$1"
  - match: "airflow.collect_db_tags"
    name: "airflow_collect_db_tags"
    labels: {}
  - match: "airflow.scheduler.critical_section_duration"
    name: "airflow_scheduler_critical_section_duration"
    labels: {}
  - match: "airflow.dagrun.schedule_delay.*"
    name: "airflow_dagrun_schedule_delay"
    labels:
      dag_id: "$1"
  - match: "airflow.dag_processing.total_parse_time"
    name: "airflow_dag_processing_total_parse_time"
    labels: {}
  - match: "airflow.dag_processing.last_run.seconds_ago.*"
    name: "airflow_dag_processing_last_run_seconds_ago"
    labels:
      dag_file: "$1"
```
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20804 | https://github.com/apache/airflow/pull/21106 | 6e96f04eb515149f185448b8dfb84813c5879fc0 | 1507ca48d7c211799129ce7956c11f4c45fee5bc | "2022-01-11T09:44:40Z" | python | "2022-06-01T04:40:36Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,746 | ["airflow/providers/amazon/aws/hooks/emr.py"] | Airflow Elastic MapReduce connection | ### Description
Airflow has connections:
```
Amazon Web Services
Amazon Redshift
Elastic MapReduce
```
It wasn't easy to find the `Elastic MapReduce` because it wasn't listed as Amazon.
I think it would be better to rename it to `Amazon Elastic MapReduce` so that it is easier to find in the Connections drop-down list.
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20746 | https://github.com/apache/airflow/pull/20767 | da9210e89c618611b1e450617277b738ce92ffd7 | 88e3f2ae5e5101928858099f9d4e7fb6542c4110 | "2022-01-07T12:37:23Z" | python | "2022-01-08T15:26:08Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,741 | [".github/workflows/build-images.yml", "breeze", "dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/utils/path_utils.py"] | CI: Reuse Breeze CI image building for CI | We already have a basic Build image function in `Breeze` - we should be able to use it to build images in CI.
There are actions that build images in `build-images.yml` and they should simply (similarly to free-space) use the Python commands developed in ./dev/breeze. | https://github.com/apache/airflow/issues/20741 | https://github.com/apache/airflow/pull/22008 | 60ec508945277215c557b7666cdb97872facdae2 | eddf4a30f81c40ada1d8805d689d82c3ac70e311 | "2022-01-07T08:57:07Z" | python | "2022-03-09T10:27:05Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,740 | [".pre-commit-config.yaml", "dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/pre_commit_ids.py", "dev/breeze/src/airflow_breeze/pre_commit_ids_TEMPLATE.py.jinja2", "dev/breeze/src/airflow_breeze/utils/run_utils.py", "dev/breeze/tests/test_cache.py", "scripts/ci/pre_commit/pre_commit_check_pre_commit_hook_names.py"] | Breeze: Running static checks with Breeze | We should rewrite the action to run static checks with Breeze. Currently implemented by 'static-checks' opetoin.
The should baiscally run appropriate `pre-commit run` statement. The difference vs. running just pre-commit is that it should implement auto-complete of checks available (pre-commit does not have it) and make sure that CI image is built for the checks that require it. The list of available static checks should be retrieved by parsing the .pre-commit.yml file rather than (as it is currently done) maintaining the list in ./breeze-complete script
More info on static checks: https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst
| https://github.com/apache/airflow/issues/20740 | https://github.com/apache/airflow/pull/20848 | 82adce535eb0c427c230035d648bf3c829824b21 | 684fe46158aa3d6cb2de245d29e20e487d8f2158 | "2022-01-07T08:54:44Z" | python | "2022-01-27T17:34:44Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,739 | ["dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/cache.py", "dev/breeze/src/airflow_breeze/docs_generator/__init__.py", "dev/breeze/src/airflow_breeze/docs_generator/build_documentation.py", "dev/breeze/src/airflow_breeze/docs_generator/doc_builder.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/utils/docker_command_utils.py", "dev/breeze/tests/test_commands.py", "dev/breeze/tests/test_global_constants.py"] | Breeze: Build documentation with Breeze | The "build-documentation" action should be rewritten in python in the new Breeze2 command.
It should allow for the same parameters that are already used by the https://github.com/apache/airflow/blob/main/docs/build_docs.py script: `--package-filter` for example.
It shoudl basically run the ./build_docs.py using CI image.
Also we should add `airflow-start-doc-server` as a separate entrypoint (similar to airflow-freespace). | https://github.com/apache/airflow/issues/20739 | https://github.com/apache/airflow/pull/20886 | 793684a88ce3815568c585c45eb85a74fa5b2d63 | 81c85a09d944021989ab530bc6caf1d9091a753c | "2022-01-07T08:49:46Z" | python | "2022-01-24T10:45:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,735 | ["airflow/providers/sftp/hooks/sftp.py", "tests/providers/sftp/hooks/test_sftp.py", "tests/providers/sftp/hooks/test_sftp_outdated.py"] | SFTPHook uses default connection 'sftp_default' even when ssh_conn_id is passed | ### Apache Airflow Provider(s)
sftp
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.6.0
apache-airflow-providers-apache-hdfs==2.2.0
apache-airflow-providers-apache-hive==2.1.0
apache-airflow-providers-apache-livy==2.1.0
apache-airflow-providers-apache-spark==2.0.3
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-elasticsearch==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==4.0.0
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-jira==2.0.1
apache-airflow-providers-postgres==2.4.0
apache-airflow-providers-redis==2.0.1
**apache-airflow-providers-sftp==2.4.0**
apache-airflow-providers-slack==4.1.0
apache-airflow-providers-sqlite==2.0.1
**apache-airflow-providers-ssh==2.3.0**
### Apache Airflow version
2.2.3 (latest released)
### Operating System
Ubuntu 20.04
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What happened
I believe this was introduced in this commit https://github.com/apache/airflow/commit/f35ad27080a2e1f29efc20a9bd0613af0f6ff2ec
In File airflow/providers/sftp/hooks/sftp.py
In lines 74-79
```python
def __init__(
self,
ssh_conn_id: Optional[str] = 'sftp_default',
ftp_conn_id: Optional[str] = 'sftp_default',
*args,
**kwargs,
) -> None:
if ftp_conn_id:
warnings.warn(
'Parameter `ftp_conn_id` is deprecated.' 'Please use `ssh_conn_id` instead.',
DeprecationWarning,
stacklevel=2,
)
kwargs['ssh_conn_id'] = ftp_conn_id
self.ssh_conn_id = ssh_conn_id
super().__init__(*args, **kwargs)
```
Since `ftp_conn_id` has a default value of `sftp_default`, it will always override `ssh_conn_id` unless explicitly set to None during init.
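For illustration, a minimal sketch of one way the precedence could be handled so that an explicitly passed `ssh_conn_id` wins, mirroring the `__init__` quoted above (a hypothetical sketch, not necessarily the fix that was merged):

```python
# Hypothetical sketch: only treat ftp_conn_id as an override when the caller actually passed it.
def __init__(self, ssh_conn_id='sftp_default', ftp_conn_id=None, *args, **kwargs) -> None:
    if ftp_conn_id is not None:
        warnings.warn(
            'Parameter `ftp_conn_id` is deprecated. Please use `ssh_conn_id` instead.',
            DeprecationWarning,
            stacklevel=2,
        )
        ssh_conn_id = ftp_conn_id
    self.ssh_conn_id = ssh_conn_id
    kwargs['ssh_conn_id'] = ssh_conn_id
    super().__init__(*args, **kwargs)
```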
### What you expected to happen
If you initialise the hook with `ssh_conn_id` parameter it should use that connection instead of ignoring the parameter and using the default value.
The deprecation message is also triggered despite only passing in `ssh_conn_id`
`<stdin>:1 DeprecationWarning: Parameter ftp_conn_id is deprecated.Please use ssh_conn_id instead.`
### How to reproduce
```python
from airflow.providers.sftp.hooks.sftp import SFTPHook
hook = SFTPHook(ssh_conn_id='some_custom_sftp_conn_id')
```
There will be a log message stating
`[2022-01-07 01:12:05,234] {base.py:70} INFO - Using connection to: id: sftp_default.`
### Anything else
This breaks the SFTPSensor and SFTPOperator
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20735 | https://github.com/apache/airflow/pull/20756 | 9ea459a6bd8073f16dc197b1147f220293557dc8 | c2fc760c9024ce9f0bec287679d9981f6ab1fd98 | "2022-01-07T01:30:46Z" | python | "2022-01-07T21:00:17Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,655 | ["airflow/www/views.py", "tests/www/views/test_views_tasks.py"] | Edit Task Instance Page does not save updates. | ### Apache Airflow version
main (development)
### What happened
On the Task Instances table, one can click on `Edit Record` and be directed to an `Edit Task Instance` page. There, if you try to change the state, it will simply not update, nor are there any errors in the browser console.
<img width="608" alt="Screen Shot 2022-01-04 at 10 10 52 AM" src="https://user-images.githubusercontent.com/4600967/148101472-fb6076fc-3fc3-44c4-a022-6625d5112551.png">
### What you expected to happen
Changes to state should actually update the task instance.
### How to reproduce
Try to use the `Edit Task Instance` page
### Operating System
Mac OSx
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
_No response_
### Anything else
This page is redundant, as a user can successfully edit task instance states from the table. Instead of wiring up the correct actions, I say we should just remove the `Edit Task Instance` page entirely.
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20655 | https://github.com/apache/airflow/pull/30415 | 22bef613678e003dde9128ac05e6c45ce934a50c | b140c4473335e4e157ff2db85148dd120c0ed893 | "2022-01-04T17:37:21Z" | python | "2023-04-22T17:10:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,632 | ["chart/templates/flower/flower-deployment.yaml", "chart/templates/pgbouncer/pgbouncer-deployment.yaml", "chart/templates/scheduler/scheduler-deployment.yaml", "chart/templates/statsd/statsd-deployment.yaml", "chart/templates/triggerer/triggerer-deployment.yaml", "chart/templates/webserver/webserver-deployment.yaml", "chart/templates/workers/worker-deployment.yaml", "chart/tests/test_airflow_common.py", "chart/values.schema.json", "chart/values.yaml"] | Add priorityClassName support | ### Official Helm Chart version
1.2.0
### Apache Airflow version
2.1.4
### Kubernetes Version
1.21.2
### Helm Chart configuration
_No response_
### Docker Image customisations
_No response_
### What happened
_No response_
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
It's currently impossible to assign a `priorityClassName` to the Airflow containers.
Seems like a very useful feature to ensure that the Airflow infrastructure has higher priority than "regular" pods.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20632 | https://github.com/apache/airflow/pull/20794 | ab762a5a8ae147ae33500ee3c7e7a73d25d03ad7 | ec41fd51e07ca2b9a66e0b99b730300c80a6d059 | "2022-01-03T14:06:15Z" | python | "2022-02-04T19:51:36Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,627 | ["setup.cfg"] | Bump flask-appbuilder to >=3.3.4, <4.0.0 | ### Description
Reason: Improper Authentication in Flask-AppBuilder: https://github.com/advisories/GHSA-m3rf-7m4w-r66q
### Use case/motivation
_No response_
### Related issues
https://github.com/advisories/GHSA-m3rf-7m4w-r66q
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20627 | https://github.com/apache/airflow/pull/20628 | e1fbfc60d29af3fc0928b904ac80ca3b71a3a839 | 97261c642cbf07db91d252cf6b0b7ff184cd64c6 | "2022-01-03T11:50:47Z" | python | "2022-01-03T14:29:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,580 | ["airflow/api/common/experimental/mark_tasks.py"] | Triggers are not terminated when DAG is mark failed | ### Apache Airflow version
2.2.3 (latest released)
### What happened
Hello, I am quite excited about the new [Deferrable ("Async") Operators in AIP-40](https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=177050929), and am going to adopt this by developing a new async version of ExternalTaskSensor. But while doing so, I found out that Deferrable ("Async") Operators are not canceled when a DAG is marked failed/success.
![Screenshot from 2021-12-30 14-07-44](https://user-images.githubusercontent.com/6848311/147730232-90a6a5eb-3b6d-4270-9090-57c65240fe19.png)
When I mark the DAG as failed, `wait_task` is canceled (changed to the `failed` state), but `wait_task_async` is still in the `deferred` state and the triggerer keeps poking.
![Screenshot from 2021-12-30 14-08-06](https://user-images.githubusercontent.com/6848311/147730385-9c3e8c13-6b21-4bee-b85a-3d064ec1cde5.png)
![image](https://user-images.githubusercontent.com/6848311/147730395-124f8705-adf7-4f1c-832a-15fd3826446c.png)
### What you expected to happen
Deferrable ("Async") Operators should be canceled as sync version of operators
### How to reproduce
Testing DAG.
```python
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.sensors.external_task import ExternalTaskSensor, ExternalTaskSensorAsync
with DAG(
'tutorial_async_sensor',
default_args={
'depends_on_past': False,
'email': ['airflow@example.com'],
'email_on_failure': False,
'email_on_retry': False,
'retries': 1,
'retry_delay': timedelta(minutes=5),
},
description='A simple tutorial DAG using external sensor async',
schedule_interval=timedelta(days=1),
start_date=datetime(2021, 1, 1),
catchup=False,
tags=['example'],
) as dag:
t1 = ExternalTaskSensorAsync(
task_id='wait_task_async',
external_dag_id="tutorial",
external_task_id="sleep",
execution_delta=timedelta(hours=1),
poke_interval=5.0
)
t2 = ExternalTaskSensor(
task_id='wait_task',
external_dag_id="tutorial",
external_task_id="sleep",
execution_delta=timedelta(hours=1),
poke_interval=5.0
)
t3 = BashOperator(
task_id='echo',
depends_on_past=False,
bash_command='echo Hello world',
retries=3,
)
[t1, t2] >> t3
```
for `ExternalTaskSensorAsync`, see #20583
### Operating System
Ubuntu 20.04
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
run `airflow standalone`
### Anything else
_No response_
### Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20580 | https://github.com/apache/airflow/pull/20649 | b83084b1b05415079972a76e3e535d40a1998de8 | 64c0bd50155dfdb84671ac35d645b812fafa78a1 | "2021-12-30T07:29:20Z" | python | "2022-01-05T07:42:57Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,579 | ["airflow/cli/commands/task_command.py", "tests/cli/commands/test_task_command.py"] | Airflow 2.2.3 : "airflow dags trigger" command gets "Calling `DAG.create_dagrun()` without an explicit data interval is deprecated" | ### Apache Airflow version
2.2.3 (latest released)
### What happened
When issuing the following command:
`airflow dags trigger 'VMWARE_BACKUP' --conf '{"VM_NAME":"psfiplb1"}'`
the system replies with:
```
/usr/local/lib/python3.8/dist-packages/airflow/api/common/experimental/trigger_dag.py:85 DeprecationWarning: Calling `DAG.create_dagrun()` without an explicit data interval is deprecated
Created <DagRun VMWARE_BACKUP @ 2021-12-31T00:00:00+00:00: manual__2021-12-31T00:00:00+00:00, externally triggered: True>
```
### What you expected to happen
I have no other switch or parameter to choose to avoid the deprecation warning.
My concern is what will happen to this command once the deprecation turns into an error.
### How to reproduce
_No response_
### Operating System
Ubuntu 20.04.3 LTS
### Versions of Apache Airflow Providers
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-microsoft-mssql==2.0.1
apache-airflow-providers-microsoft-winrm==2.0.1
apache-airflow-providers-openfaas==2.0.0
apache-airflow-providers-oracle==2.0.1
apache-airflow-providers-samba==3.0.1
apache-airflow-providers-sftp==2.3.0
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.3.0
### Deployment
Virtualenv installation
### Deployment details
Airflow inside an LXD continer
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20579 | https://github.com/apache/airflow/pull/27106 | c3095d77b81c1a3a6510246b1fea61e6423d518b | 70680ded7a4056882008b019f5d1a8f559a301cd | "2021-12-30T05:39:01Z" | python | "2023-03-16T19:08:17Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,560 | ["chart/templates/secrets/elasticsearch-secret.yaml", "chart/tests/test_elasticsearch_secret.py"] | Elasticsearch connection user and password should be optional | ### Official Helm Chart version
main (development)
### Apache Airflow version
2.2.3 (latest released)
### Kubernetes Version
1.21.5
### Helm Chart configuration
```yaml
elasticsearch:
enabled: true
connection:
host: elasticsearch-master
port: 9200
```
### Docker Image customisations
_No response_
### What happened
Secret was created with this connection:
```
http://%3Cno+value%3E:%3Cno+value%3E@elasticsearch-master:9200
```
### What you expected to happen
Secret created with this connection:
```
http://elasticsearch-master:9200
```
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20560 | https://github.com/apache/airflow/pull/21222 | 5a6a2d604979cb70c5c9d3797738f0876dd38c3b | 5dc6338346deca8e5a9a47df2da19f38eeac0ce8 | "2021-12-29T21:57:46Z" | python | "2022-02-11T04:30:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,559 | ["airflow/exceptions.py", "airflow/models/dagbag.py", "airflow/models/param.py", "tests/dags/test_invalid_param.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_dag.py", "tests/models/test_dagbag.py", "tests/models/test_param.py"] | Missing required parameter shows a stacktrace instead of a validation error | ### Apache Airflow version
2.2.3 (latest released)
### What happened
I wanted to improve the [Params concepts doc](https://airflow.apache.org/docs/apache-airflow/stable/concepts/params.html) to show how one can reference the params from a task. While doing so, I tried to run that DAG by clicking "Trigger" (i.e. I didn't opt to modify the params first). Then I saw this:
```
Oops.
Something bad has happened.
...
Python version: 3.9.9
Airflow version: 2.2.3+astro.1
Node: df9204fff6e6
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/models/param.py", line 62, in __init__
jsonschema.validate(self.value, self.schema, format_checker=FormatChecker())
File "/usr/local/lib/python3.9/site-packages/jsonschema/validators.py", line 934, in validate
raise error
jsonschema.exceptions.ValidationError: 'NoValueSentinel' is too long
Failed validating 'maxLength' in schema:
{'maxLength': 4, 'minLength': 2, 'type': 'string'}
On instance:
'NoValueSentinel'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.9/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.9/site-packages/airflow/www/auth.py", line 51, in decorated
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/www/decorators.py", line 72, in wrapper
return f(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/www/views.py", line 1650, in trigger
dag = current_app.dag_bag.get_dag(dag_id)
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/dagbag.py", line 186, in get_dag
self._add_dag_from_db(dag_id=dag_id, session=session)
File "/usr/local/lib/python3.9/site-packages/airflow/models/dagbag.py", line 261, in _add_dag_from_db
dag = row.dag
File "/usr/local/lib/python3.9/site-packages/airflow/models/serialized_dag.py", line 180, in dag
dag = SerializedDAG.from_dict(self.data) # type: Any
File "/usr/local/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 947, in from_dict
return cls.deserialize_dag(serialized_obj['dag'])
File "/usr/local/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 861, in deserialize_dag
v = {task["task_id"]: SerializedBaseOperator.deserialize_operator(task) for task in v}
File "/usr/local/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 861, in <dictcomp>
v = {task["task_id"]: SerializedBaseOperator.deserialize_operator(task) for task in v}
File "/usr/local/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 641, in deserialize_operator
v = cls._deserialize_params_dict(v)
File "/usr/local/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 459, in _deserialize_params_dict
op_params[k] = cls._deserialize_param(v)
File "/usr/local/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 439, in _deserialize_param
return class_(**kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/param.py", line 64, in __init__
raise ValueError(err)
ValueError: 'NoValueSentinel' is too long
Failed validating 'maxLength' in schema:
{'maxLength': 4, 'minLength': 2, 'type': 'string'}
On instance:
'NoValueSentinel'
```
I started with the dag in the docs, but ended up making some tweaks. Here's how it was when I saw the error.
```
from airflow import DAG
from airflow.models.param import Param
from airflow.operators.python import PythonOperator
from datetime import datetime
with DAG(
"my_dag",
start_date=datetime(1970, 1, 1),
schedule_interval=None,
params={
# a int param with default value
"int_param": Param(10, type="integer", minimum=0, maximum=20),
# a mandatory str param
"str_param": Param(type="string", minLength=2, maxLength=4),
# a param which can be None as well
"dummy": Param(type=["null", "number", "string"]),
# no data or type validations
"old": "old_way_of_passing",
# no data or type validations
"simple": Param("im_just_like_old_param"),
"email": Param(
default="example@example.com",
type="string",
format="idn-email",
minLength=5,
maxLength=255,
),
},
) as the_dag:
def print_these(*params):
for param in params:
print(param)
PythonOperator(
task_id="ref_params",
python_callable=print_these,
op_args=[
# you can modify them in jinja templates
"{{ params.int_param + 10 }}",
# or just leave them as-is
"{{ params.str_param }}",
"{{ params.dummy }}",
"{{ params.old }}",
"{{ params.simple }}",
"{{ params.email }}",
],
)
```
### What you expected to happen
If there's a required parameter, and I try to trigger a dagrun, I should get a friendly warning explaining what I've done wrong.
### How to reproduce
Run the dag above
### Operating System
docker / debian
### Versions of Apache Airflow Providers
n/a
### Deployment
Astronomer
### Deployment details
`astro dev start`
Dockerfile:
```
FROM quay.io/astronomer/ap-airflow:2.2.3-onbuild
```
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20559 | https://github.com/apache/airflow/pull/20802 | 717169987e76f332d7ce92bc361d2fff6966f6f0 | c59001d79facf7e472e0581ac8a538c25eebfda7 | "2021-12-29T20:52:07Z" | python | "2022-01-16T16:02:20Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,552 | ["airflow/providers/google/cloud/transfers/sftp_to_gcs.py", "tests/providers/google/cloud/transfers/test_sftp_to_gcs.py"] | gzip parameter of sftp_to_gcs operator is never passed to the GCS hook | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
List of versions is the one provided by Cloud Composer image `composer-1.17.7-airflow-2.1.4`.
(cf. https://cloud.google.com/composer/docs/concepts/versioning/composer-versions#images )
Relevant here is `apache-airflow-providers-google==5.1.0`
### Apache Airflow version
2.1.4
### Deployment
Composer
### Deployment details
_No response_
### What happened
Found on version 2.1.4 and reproduced locally on main branch (v2.2.3).
When using `SFTPToGCSOperator` with `gzip=True`, no compression is actually performed; the files are copied/moved to GCS as-is.
This happens because the `gzip` parameter isn't passed to the GCS Hook `upload()` call which then defaults to `False`.
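For illustration, a sketch of the kind of one-line change that addresses this, assuming the operator stores the flag as `self.gzip` and uploads from a local temporary file (the `tmp` variable name and surrounding call are assumptions, so treat this as a sketch rather than the exact merged diff):

```python
# Hypothetical sketch of the upload call inside SFTPToGCSOperator._copy_single_object():
gcs_hook.upload(
    bucket_name=self.destination_bucket,
    object_name=destination_object,
    filename=tmp.name,
    mime_type=self.mime_type,
    gzip=self.gzip,  # previously never forwarded, so GCSHook.upload() defaulted to False
)
```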
### What you expected to happen
I expect the files to be compressed when `gzip=True`.
### How to reproduce
Create any `SFTPToGCSOperator` with `gzip=True` and upload a file.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20552 | https://github.com/apache/airflow/pull/20553 | c5c18c54fa83463bc953249dc28edcbf7179da17 | 3a480f5ff41c2da4ae4fd6b2289e064ee42048a5 | "2021-12-29T10:51:58Z" | python | "2021-12-29T12:20:18Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,545 | ["docs/apache-airflow/templates-ref.rst"] | built in macros (macros.random, macros.time) need documentation change | ### Apache Airflow version
2.2.3 (latest released)
### What happened
My gut says that the way forward is to change the macros object so that it only exposes modules:
- datetime
- time
- uuid
- random
... and then leave it to the user to decide which functions on those modules they want to call. I'm not confident enough to make that change. If instead we want to change the docs to match the actual functionality, I can submit a PR for that.
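For illustration, a sketch of what that "modules only" option could look like (hypothetical, not the current contents of airflow.macros):

```python
# Hypothetical "modules only" macros namespace:
import datetime
import random
import time
import uuid

import dateutil

# Templates would then be explicit about what they call, e.g.
#   {{ macros.datetime.datetime(2021, 12, 12) }}
#   {{ macros.random.random() }}
#   {{ macros.time.time() }}
```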
### What you expected to happen
When using either of the built-in macros `time` or `random`, they don't resolve to `datetime.time` or the `random` module as documented; instead, `macros.time` is the built-in `time` module and `macros.random` returns a function rather than the module.
### How to reproduce
```python
import datetime as dt
import time
from uuid import uuid4
from textwrap import dedent
from airflow.models import DAG
from airflow.operators.python import PythonOperator
from dateutil.parser import parse as dateutil_parse
"""
According to the docs:
macros.datetime - datetime.datetime
macros.timedelta - datetime.timedelta
macros.dateutil - dateutil package
macros.time - datetime.time
macros.uuid - python standard lib uuid
macros.random - python standard lib random
According to the code:
macros.datetime - datetime.datetime
macros.timedelta - datetime.timedelta
macros.dateutil - dateutil package
macros.time - python standard lib time <--- differs
macros.uuid - python standard lib uuid
macros.random - random.random <--- differs
"""
def date_time(datetime_obj):
compare_obj = dt.datetime(2021, 12, 12, 8, 32, 23)
assert datetime_obj == compare_obj
def time_delta(timedelta_obj):
compare_obj = dt.timedelta(days=3, hours=4)
assert timedelta_obj == compare_obj
def date_util(dateutil_obj):
compare_obj = dateutil_parse("Thu Sep 26 10:36:28 2019")
assert dateutil_obj == compare_obj
def time_tester(time_obj):
# note that datetime.time.time() gives an AttributeError
# time.time() on the other hand, returns a float
# this works because macro.time isn't 'datetime.time', like the docs say
# it's just 'time'
compare_obj = time.time()
print(time_obj)
print(compare_obj)
# the macro might have captured a slightly differnt time than the task,
# but they're not going to be more than 10s apart
assert abs(time_obj - compare_obj) < 10
def uuid_tester(uuid_obj):
compare_obj = uuid4()
assert len(str(uuid_obj)) == len(str(compare_obj))
def random_tester(random_float):
# note that 'random.random' is a function that returns a float
# while 'random' is a module (and isn't callable)
# the macro was 'macros.random()' and here we have a float:
assert -0.1 < random_float < 100.1
# so the docs are wrong here too
# macros.random actually returns a function, not the random module
def show_docs(attr):
print(attr.__doc__)
with DAG(
dag_id="builtin_macros_with_airflow_specials",
schedule_interval=None,
start_date=dt.datetime(1970, 1, 1),
render_template_as_native_obj=True, # render templates using Jinja NativeEnvironment
tags=["core"],
) as dag:
test_functions = {
"datetime": (date_time, "{{ macros.datetime(2021, 12, 12, 8, 32, 23) }}"),
"timedelta": (time_delta, "{{ macros.timedelta(days=3, hours=4) }}"),
"dateutil": (
date_util,
"{{ macros.dateutil.parser.parse('Thu Sep 26 10:36:28 2019') }}",
),
"time": (time_tester, "{{ macros.time.time() }}"),
"uuid": (uuid_tester, "{{ macros.uuid.uuid4() }}"),
"random": (
random_tester,
"{{ 100 * macros.random() }}",
),
}
for name, (func, template) in test_functions.items():
(
PythonOperator(
task_id=f"showdoc_{name}",
python_callable=show_docs,
op_args=[f"{{{{ macros.{name} }}}}"],
)
>> PythonOperator(
task_id=f"test_{name}", python_callable=func, op_args=[template]
)
)
```
### Operating System
Docker (debian:buster)
### Versions of Apache Airflow Providers
2.2.2, and 2.2.3
### Deployment
Other
### Deployment details
Astro CLI with sequential executor
### Anything else
Rather than changing the docs to describe what the code actually does, would it be better to make the code behave in a way that is more consistent (e.g. top-level modules only)?
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20545 | https://github.com/apache/airflow/pull/20637 | b1b8f304586e8ad181861dfe8ac15297c78f917b | 8b2299b284ac15900f54bf8c84976cc01f4d597c | "2021-12-29T01:12:01Z" | python | "2022-01-04T03:39:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,539 | [".pre-commit-config.yaml", "BREEZE.rst", "STATIC_CODE_CHECKS.rst", "breeze-complete", "chart/values.schema.json", "scripts/ci/pre_commit/pre_commit_chart_schema.py", "scripts/ci/pre_commit/pre_commit_vendor_k8s_json_schema.py"] | Offline chart installs broken | ### Official Helm Chart version
main (development)
### Apache Airflow version
2.2.3 (latest released)
### Kubernetes Version
1.21.5
### Helm Chart configuration
_No response_
### Docker Image customisations
_No response_
### What happened
If you don't have outbound routing to the internet, you cannot install/template/lint the helm chart, e.g:
```
$ helm upgrade --install oss ~/github/airflow/chart
Release "oss" does not exist. Installing it now.
Error: values don't meet the specifications of the schema(s) in the following chart(s):
airflow:
Get "https://raw.githubusercontent.com/yannh/kubernetes-json-schema/master/v1.22.0-standalone-strict/networkpolicypeer-networking-v1.json": dial tcp: lookup raw.githubusercontent.com: no such host
```
### What you expected to happen
The chart should function without requiring outbound internet access.
### How to reproduce
Disable networking and try installing the chart.
### Anything else
We use external schema references in our chart, e.g:
https://github.com/apache/airflow/blob/8acfe8d82197448d2117beab29e688a68cec156a/chart/values.schema.json#L1298
Unfortunately, naively using relative filesystem reference in `values.schema.json` doesn't work properly in helm (see helm/helm#10481).
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20539 | https://github.com/apache/airflow/pull/20544 | f200bb1977655455f8acb79c9bd265df36f8ffce | afd84f6f2dcb81cb0c61e8b0563ff40ee75a2a6d | "2021-12-28T18:05:52Z" | python | "2021-12-29T19:29:07Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,534 | ["airflow/www/views.py"] | is not bound to a Session; lazy load operation of attribute 'dag_model' | ### Apache Airflow version
2.2.3 (latest released)
### What happened
An error occurred while browsing the page (/rendered-k8s?dag_id=xxx&task_id=yyy&execution_date=zzz):
```
Parent instance <TaskInstance at 0x7fb0b01df940> is not bound to a Session; lazy load operation of attribute 'dag_model' cannot proceed (Background on this error at: http://sqlalche.me/e/13/bhk3) Traceback (most recent call last):
args=self.command_as_list(),
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 527, in command_as_list
dag = self.dag_model
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/orm/attributes.py", line 294, in __get__
return self.impl.get(instance_state(instance), dict_)
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/orm/attributes.py", line 730, in get
value = self.callable_(state, passive)
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/orm/strategies.py", line 717, in _load_for_state
raise orm_exc.DetachedInstanceError(
sqlalchemy.orm.exc.DetachedInstanceError: Parent instance <TaskInstance at 0x7fb0b01df940> is not bound to a Session; lazy load operation of attribute 'dag_model' cannot proceed (Background on this error at: http://sqlalche.me/e/13/bhk3)
```
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
NAME="CentOS Stream" VERSION="8"
### Versions of Apache Airflow Providers
```
apache-airflow-providers-cncf-kubernetes==2.2.0
apache-airflow-providers-elasticsearch==2.0.3
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-postgres==2.3.0
apache-airflow-providers-sftp==2.1.1
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.2.0
```
### Deployment
Other 3rd-party Helm chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20534 | https://github.com/apache/airflow/pull/21006 | 372849486cd455a4ff4821b01805a442f1a78417 | a665f48b606065977e0d3952bc74635ce11726d1 | "2021-12-28T09:11:55Z" | python | "2022-01-21T13:44:40Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,496 | ["airflow/cli/commands/standalone_command.py"] | The standalone model bug, about port 8080 | ### Apache Airflow version
2.2.3 (latest released)
### What happened
When I modify airflow.cfg to change the default webserver listening port to 8000, then start with "airflow standalone",
an error is printed:
webserver | [2021-12-25 22:20:08 +0800] [16173] [ERROR] Connection in use: ('0.0.0.0', 8080)
webserver | [2021-12-25 22:20:08 +0800] [16173] [ERROR] Retrying in 1 second.
webserver | [2021-12-25 22:20:10 +0800] [16173] [ERROR] Connection in use: ('0.0.0.0', 8080)
webserver | [2021-12-25 22:20:10 +0800] [16173] [ERROR] Retrying in 1 second.
webserver | [2021-12-25 22:20:11 +0800] [16173] [ERROR] Connection in use: ('0.0.0.0', 8080)
webserver | [2021-12-25 22:20:11 +0800] [16173] [ERROR] Retrying in 1 second.
webserver | [2021-12-25 22:20:12 +0800] [16173] [ERROR] Connection in use: ('0.0.0.0', 8080)
webserver | [2021-12-25 22:20:12 +0800] [16173] [ERROR] Retrying in 1 second.
webserver | [2021-12-25 22:20:13 +0800] [16173] [ERROR] Connection in use: ('0.0.0.0', 8080)
webserver | [2021-12-25 22:20:13 +0800] [16173] [ERROR] Retrying in 1 second.
webserver | [2021-12-25 22:20:14 +0800] [16173] [ERROR] Can't connect to ('0.0.0.0', 8080)
Source code position:
https://github.com/apache/airflow/blob/main/airflow/cli/commands/standalone_command.py, line 209, function `is_ready`:
`return self.port_open(8080) and self.job_running(SchedulerJob) and self.job_running(TriggererJob)`
The port is hard-coded to **8080**, which is wrong.
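For illustration, a sketch of reading the configured port instead of hard-coding it (hypothetical, not necessarily the exact change that was merged):

```python
# Hypothetical sketch for StandaloneCommand.is_ready():
from airflow.configuration import conf


def is_ready(self):
    web_server_port = conf.getint("webserver", "web_server_port", fallback=8080)
    return (
        self.port_open(web_server_port)
        and self.job_running(SchedulerJob)
        and self.job_running(TriggererJob)
    )
```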
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
All System
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20496 | https://github.com/apache/airflow/pull/20505 | 2976e6f829e727c01b9c2838e32d210d40e7a03c | f743e46c5a4fdd0b76fea2d07729b744644fc416 | "2021-12-25T15:18:45Z" | python | "2021-12-28T21:15:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,475 | ["airflow/models/taskinstance.py", "tests/conftest.py", "tests/models/test_taskinstance.py"] | Custom Timetable error with CeleryExecutor | ### Apache Airflow version
2.2.3 (latest released)
### What happened
I'm really new to Airflow, and I have an error when using a custom Timetable with CeleryExecutor.
### What you expected to happen
I expected the custom timetable to be implemented and used by the DAG.
Error log as follows:
```
[2021-12-23 05:44:30,843: WARNING/ForkPoolWorker-2] Running <TaskInstance: example_cron_trivial_dag.dummy scheduled__2021-12-23T05:44:00+00:00 [queued]> on host 310bbd362d25
[2021-12-23 05:44:30,897: ERROR/ForkPoolWorker-2] Failed to execute task Not a valid timetable: <cron_trivial_timetable.CronTrivialTimetable object at 0x7fc15a0a4100>.
Traceback (most recent call last):
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/executors/celery_executor.py", line 121, in _execute_in_fork
args.func(args)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 298, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 105, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 163, in _run_task_by_local_task_job
run_job.run()
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/jobs/base_job.py", line 245, in run
self._execute()
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/jobs/local_task_job.py", line 78, in _execute
self.task_runner = get_task_runner(self)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/task/task_runner/__init__.py", line 63, in get_task_runner
task_runner = task_runner_class(local_task_job)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py", line 35, in __init__
super().__init__(local_task_job)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/task/task_runner/base_task_runner.py", line 48, in __init__
super().__init__(local_task_job.task_instance)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/utils/log/logging_mixin.py", line 40, in __init__
self._set_context(context)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/utils/log/logging_mixin.py", line 54, in _set_context
set_context(self.log, context)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/utils/log/logging_mixin.py", line 178, in set_context
handler.set_context(value)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/utils/log/file_task_handler.py", line 59, in set_context
local_loc = self._init_file(ti)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/utils/log/file_task_handler.py", line 264, in _init_file
relative_path = self._render_filename(ti, ti.try_number)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/utils/log/file_task_handler.py", line 80, in _render_filename
context = ti.get_template_context()
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1912, in get_template_context
'next_ds': get_next_ds(),
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1868, in get_next_ds
execution_date = get_next_execution_date()
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1862, in get_next_execution_date
next_execution_date = dag.following_schedule(self.execution_date)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/models/dag.py", line 595, in following_schedule
data_interval = self.infer_automated_data_interval(timezone.coerce_datetime(dttm))
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/models/dag.py", line 678, in infer_automated_data_interval
raise ValueError(f"Not a valid timetable: {self.timetable!r}")
ValueError: Not a valid timetable: <cron_trivial_timetable.CronTrivialTimetable object at 0x7fc15a0a4100>
[2021-12-23 05:44:30,936: ERROR/ForkPoolWorker-2] Task airflow.executors.celery_executor.execute_command[7c76904d-1b61-4441-b5f0-96ef2ba7c3b7] raised unexpected: AirflowException('Celery command failed on host: 310bbd362d25')
Traceback (most recent call last):
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/celery/app/trace.py", line 451, in trace_task
R = retval = fun(*args, **kwargs)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/celery/app/trace.py", line 734, in __protected_call__
return self.run(*args, **kwargs)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/executors/celery_executor.py", line 90, in execute_command
_execute_in_fork(command_to_exec, celery_task_id)
File "/opt/rh/rh-python38/root/usr/local/lib/python3.8/site-packages/airflow/executors/celery_executor.py", line 101, in _execute_in_fork
raise AirflowException('Celery command failed on host: ' + get_hostname())
airflow.exceptions.AirflowException: Celery command failed on host: 310bbd362d25
```
Seems like the same problem mentioned here, but for different reasons that I don't really understand:
https://github.com/apache/airflow/issues/19578#issuecomment-974654209
### How to reproduce
Before the 2.2.3 release, I was having DAG import errors with custom timetables, and since I only want the Next Dag Run shown on the UI to be an exact time, I just changed the code of CronDataIntervalTimetable to return DataInterval.exact(end) in next_dagrun_info and infer_manual_data_interval.
Here's the new timetable plugin file that I created (simply copied and tweaked from CronDataIntervalTimetable):
```
import datetime
from typing import Any, Dict, Optional, Union
from croniter import CroniterBadCronError, CroniterBadDateError, croniter
from dateutil.relativedelta import relativedelta
from pendulum import DateTime
from pendulum.tz.timezone import Timezone
from airflow.plugins_manager import AirflowPlugin
from airflow.compat.functools import cached_property
from airflow.exceptions import AirflowTimetableInvalid
from airflow.timetables.base import DagRunInfo, DataInterval, TimeRestriction, Timetable
from airflow.utils.dates import cron_presets
from airflow.utils.timezone import convert_to_utc, make_aware, make_naive
Delta = Union[datetime.timedelta, relativedelta]
class _TrivialTimetable(Timetable):
"""Basis for timetable implementations that schedule data intervals.
This kind of timetable classes create periodic data intervals (exact times) from an
underlying schedule representation (e.g. a cron expression, or a timedelta
instance), and schedule a DagRun at the end of each interval (exact time).
"""
def _skip_to_latest(self, earliest: Optional[DateTime]) -> DateTime:
"""Bound the earliest time a run can be scheduled.
This is called when ``catchup=False``.
"""
raise NotImplementedError()
def _align(self, current: DateTime) -> DateTime:
"""Align given time to the scheduled.
For fixed schedules (e.g. every midnight); this finds the next time that
aligns to the declared time, if the given time does not align. If the
schedule is not fixed (e.g. every hour), the given time is returned.
"""
raise NotImplementedError()
def _get_next(self, current: DateTime) -> DateTime:
"""Get the first schedule after the current time."""
raise NotImplementedError()
def _get_prev(self, current: DateTime) -> DateTime:
"""Get the last schedule before the current time."""
raise NotImplementedError()
def next_dagrun_info(
self,
*,
last_automated_data_interval: Optional[DataInterval],
restriction: TimeRestriction,
) -> Optional[DagRunInfo]:
earliest = restriction.earliest
if not restriction.catchup:
earliest = self._skip_to_latest(earliest)
if last_automated_data_interval is None:
# First run; schedule the run at the first available time matching
# the schedule, and retrospectively create a data interval for it.
if earliest is None:
return None
start = self._align(earliest)
else:
# There's a previous run. Create a data interval starting from when
# the end of the previous interval.
start = last_automated_data_interval.end
if restriction.latest is not None and start > restriction.latest:
return None
end = self._get_next(start)
return DagRunInfo.exact(end)
def _is_schedule_fixed(expression: str) -> bool:
"""Figures out if the schedule has a fixed time (e.g. 3 AM every day).
:return: True if the schedule has a fixed time, False if not.
Detection is done by "peeking" the next two cron trigger time; if the
two times have the same minute and hour value, the schedule is fixed,
and we *don't* need to perform the DST fix.
This assumes DST happens on whole minute changes (e.g. 12:59 -> 12:00).
"""
cron = croniter(expression)
next_a = cron.get_next(datetime.datetime)
next_b = cron.get_next(datetime.datetime)
return next_b.minute == next_a.minute and next_b.hour == next_a.hour
class CronTrivialTimetable(_TrivialTimetable):
"""Timetable that schedules data intervals with a cron expression.
This corresponds to ``schedule_interval=<cron>``, where ``<cron>`` is either
a five/six-segment representation, or one of ``cron_presets``.
The implementation extends on croniter to add timezone awareness. This is
because crontier works only with naive timestamps, and cannot consider DST
when determining the next/previous time.
Don't pass ``@once`` in here; use ``OnceTimetable`` instead.
"""
def __init__(self, cron: str, timezone: Timezone) -> None:
self._expression = cron_presets.get(cron, cron)
self._timezone = timezone
@classmethod
def deserialize(cls, data: Dict[str, Any]) -> "Timetable":
from airflow.serialization.serialized_objects import decode_timezone
return cls(data["expression"], decode_timezone(data["timezone"]))
def __eq__(self, other: Any) -> bool:
"""Both expression and timezone should match.
This is only for testing purposes and should not be relied on otherwise.
"""
if not isinstance(other, CronTrivialTimetable):
return NotImplemented
return self._expression == other._expression and self._timezone == other._timezone
@property
def summary(self) -> str:
return self._expression
def serialize(self) -> Dict[str, Any]:
from airflow.serialization.serialized_objects import encode_timezone
return {"expression": self._expression, "timezone": encode_timezone(self._timezone)}
def validate(self) -> None:
try:
croniter(self._expression)
except (CroniterBadCronError, CroniterBadDateError) as e:
raise AirflowTimetableInvalid(str(e))
@cached_property
def _should_fix_dst(self) -> bool:
# This is lazy so instantiating a schedule does not immediately raise
# an exception. Validity is checked with validate() during DAG-bagging.
return not _is_schedule_fixed(self._expression)
def _get_next(self, current: DateTime) -> DateTime:
"""Get the first schedule after specified time, with DST fixed."""
naive = make_naive(current, self._timezone)
cron = croniter(self._expression, start_time=naive)
scheduled = cron.get_next(datetime.datetime)
if not self._should_fix_dst:
return convert_to_utc(make_aware(scheduled, self._timezone))
delta = scheduled - naive
return convert_to_utc(current.in_timezone(self._timezone) + delta)
def _get_prev(self, current: DateTime) -> DateTime:
"""Get the first schedule before specified time, with DST fixed."""
naive = make_naive(current, self._timezone)
cron = croniter(self._expression, start_time=naive)
scheduled = cron.get_prev(datetime.datetime)
if not self._should_fix_dst:
return convert_to_utc(make_aware(scheduled, self._timezone))
delta = naive - scheduled
return convert_to_utc(current.in_timezone(self._timezone) - delta)
def _align(self, current: DateTime) -> DateTime:
"""Get the next scheduled time.
This is ``current + interval``, unless ``current`` is first interval,
then ``current`` is returned.
"""
next_time = self._get_next(current)
if self._get_prev(next_time) != current:
return next_time
return current
def _skip_to_latest(self, earliest: Optional[DateTime]) -> DateTime:
"""Bound the earliest time a run can be scheduled.
The logic is that we move start_date up until one period before, so the
current time is AFTER the period end, and the job can be created...
"""
current_time = DateTime.utcnow()
next_start = self._get_next(current_time)
last_start = self._get_prev(current_time)
if next_start == current_time:
new_start = last_start
elif next_start > current_time:
new_start = self._get_prev(last_start)
else:
raise AssertionError("next schedule shouldn't be earlier")
if earliest is None:
return new_start
return max(new_start, earliest)
def infer_manual_data_interval(self, *, run_after: DateTime) -> DataInterval:
# Get the last complete period before run_after, e.g. if a DAG run is
# scheduled at each midnight, the data interval of a manually triggered
# run at 1am 25th is between 0am 24th and 0am 25th.
end = self._get_prev(self._align(run_after))
return DataInterval.exact(end)
class CronTrivialTimetablePlugin(AirflowPlugin):
name = "cron_trivial_timetable_plugin"
timetables = [CronTrivialTimetable]
```
The DAG file I'm using/testing:
```
import pendulum
from datetime import datetime, timedelta
from typing import Dict
from airflow.decorators import task
from airflow.models import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator
from cron_trivial_timetable import CronTrivialTimetable
with DAG(
dag_id="example_cron_trivial_dag",
start_date=datetime(2021,11,14,12,0,tzinfo=pendulum.timezone('Asia/Tokyo')),
max_active_runs=1,
timetable=CronTrivialTimetable('*/2 * * * *', pendulum.timezone('Asia/Tokyo')),
default_args={
'owner': '********',
'depends_on_past': False,
'email': [**************],
'email_on_failure': False,
'email_on_retry': False,
'retries': 0,
'retry_delay': timedelta(seconds=1),
'end_date': datetime(2101, 1, 1),
},
tags=['testing'],
catchup=False
) as dag:
dummy = BashOperator(task_id='dummy', queue='daybatch', bash_command="date")
```
### Operating System
CentOS-7
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
My scheduler & workers are in different Docker containers, hence I'm using CeleryExecutor.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20475 | https://github.com/apache/airflow/pull/20486 | 8acfe8d82197448d2117beab29e688a68cec156a | 9e315ff7caec7fd3d4c0dfe8b89ee2a1c7b5fe3a | "2021-12-23T06:24:23Z" | python | "2021-12-28T19:12:21Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,471 | ["airflow/models/dag.py", "airflow/models/dagrun.py", "tests/models/test_dag.py", "tests/models/test_dagrun.py", "tests/models/test_taskinstance.py"] | no logs returned when an end_date specified at the task level is after dag's end date | ### Apache Airflow version
2.2.2
### What happened
When you set a task-level 'end_date' that is after the DAG-level 'end_date' parameter, the task fails and no logs are given to the user.
### What you expected to happen
I expected the task to succeed with an 'end_date' of the date specified at the task level.
### How to reproduce
```
from airflow.models import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime
def this_passes():
pass
with DAG(
dag_id="end_date",
schedule_interval=None,
start_date=datetime(2021, 1, 1),
end_date=datetime(2021, 1, 2),
tags=["dagparams"],
) as dag:
t1 = PythonOperator(
task_id="passer",
python_callable=this_passes,
end_date=datetime(2021, 2, 1),
)
```
### Operating System
Docker (debian:buster)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
Using the astro cli with docker image:
quay.io/astronomer/ap-airflow:2.2.2-buster-onbuild
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20471 | https://github.com/apache/airflow/pull/20920 | f612a2f56add4751e959625c49368d09a2a47d55 | 85871eba420f3324432f55f74fe57005ff47a21c | "2021-12-22T20:54:12Z" | python | "2022-03-27T19:11:38Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,457 | ["airflow/providers/amazon/aws/hooks/base_aws.py", "airflow/providers/amazon/aws/hooks/glue.py", "airflow/providers/amazon/aws/hooks/s3.py"] | Release 2.5.1 of Amazon Provider | ### Apache Airflow version
2.2.3 (latest released)
### What happened
The 2.5.0 version of Amazon Provider contains breaking changes - https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/commits.html specifically https://github.com/apache/airflow/commit/83b51e53062dc596a630edd4bd01407a556f1aa6 or the combination of the following:
- https://github.com/apache/airflow/commit/83b51e53062dc596a630edd4bd01407a556f1aa6
- https://github.com/apache/airflow/commit/d58df468c8d77c5d45e80f2333eb074bb7771a95
- https://github.com/apache/airflow/commit/4be04143a5f7e246127e942bf1d73abcd22ce189
and confirmed by @uranusjr
I have yanked 2.5.0 and we will need to release ~3.0.0~ 2.5.1 with backwards compatibility fixes
I have updated the constraints for 2.2.3 for now - https://github.com/apache/airflow/commit/62d490d4da17e35d4ddcd4ee38902a8a4e9bbfff
UPDATED: (@potiuk) to reflect that we are going to release 2.5.1 instead of 3.0.0
| https://github.com/apache/airflow/issues/20457 | https://github.com/apache/airflow/pull/20463 | 81f92d6c321992905d239bb9e8556720218fe745 | 2ab2ae8849bf6d80a700b1b74cef37eb187161ad | "2021-12-21T22:59:01Z" | python | "2021-12-22T16:52:20Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,453 | ["tests/providers/google/cloud/hooks/test_cloud_memorystore.py", "tests/providers/google/cloud/hooks/test_looker.py", "tests/providers/google/cloud/transfers/test_calendar_to_gcs.py"] | The "has_calls" is used in place of "assert_has_calls" in a few places | ### Body
As explained in https://github.com/apache/airflow/pull/20428#issuecomment-998714275 we seem to have a number (not big) of tests that use "has_calls" rather than "assert_has_calls".
:scream: :scream: :scream: :scream: :scream: :scream: :scream: :scream: :scream:
What "has_calls" does is acually calling "has_calls" method on the mock :) . Which make them no-assertion tests:
:scream: :scream: :scream: :scream: :scream: :scream: :scream: :scream: :scream:
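For illustration, a minimal standard-library example of why such a test can never fail:

```python
from unittest import mock

m = mock.MagicMock()
m.some_method(1)

# Not an assertion: attribute access on a MagicMock auto-creates a child mock,
# so this line just calls that child mock and silently "passes".
m.has_calls([mock.call.some_method(999)])

# The real assertion: this raises AssertionError because the call never happened.
m.assert_has_calls([mock.call.some_method(999)])
```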
The list of those tests:
- [x] ./tests/providers/google/common/hooks/test_base_google.py: mock_check_output.has_calls(
- [x] ./tests/providers/google/common/hooks/test_base_google.py: mock_check_output.has_calls(
- [x] ./tests/providers/google/cloud/transfers/test_sheets_to_gcs.py: mock_sheet_hook.return_value.get_values.has_calls(calls)
- [x] ./tests/providers/google/cloud/transfers/test_sheets_to_gcs.py: mock_upload_data.has_calls(calls)
- [x] ./tests/providers/google/cloud/hooks/test_bigquery.py: mock_poll_job_complete.has_calls(mock.call(running_job_id), mock.call(running_job_id))
- [x] ./tests/providers/google/cloud/hooks/test_bigquery.py: mock_schema.has_calls([mock.call(x, "") for x in ["field_1", "field_2"]])
- [x] ./tests/providers/google/cloud/hooks/test_bigquery.py: assert mock_insert.has_calls(
- [x] ./tests/providers/google/cloud/hooks/test_pubsub.py: publish_method.has_calls(calls)
- [x] ./tests/providers/google/cloud/hooks/test_cloud_memorystore.py: mock_get_conn.return_value.get_instance.has_calls(
- [x] ./tests/providers/google/cloud/hooks/test_cloud_memorystore.py: mock_get_conn.return_value.get_instance.has_calls(
- [x] ./tests/providers/google/cloud/hooks/test_cloud_memorystore.py: mock_get_conn.return_value.get_instance.has_calls(
- [x] ./tests/providers/google/cloud/hooks/test_dataproc.py: mock_get_job.has_calls(calls)
- [x] ./tests/providers/google/cloud/hooks/test_dataproc.py: mock_get_job.has_calls(calls)
- [x] ./tests/providers/google/suite/operators/test_sheets.py: mock_xcom.has_calls(calls)
- [x] ./tests/providers/http/operators/test_http.py: mock_info.has_calls(calls)
- [ ] ./tests/providers/airbyte/hooks/test_airbyte.py: assert mock_get_job.has_calls(calls)
- [ ] ./tests/providers/airbyte/hooks/test_airbyte.py: assert mock_get_job.has_calls(calls)
- [ ] ./tests/providers/airbyte/hooks/test_airbyte.py: assert mock_get_job.has_calls(calls)
- [ ] ./tests/providers/airbyte/hooks/test_airbyte.py: assert mock_get_job.has_calls(calls)
- [ ] ./tests/providers/airbyte/hooks/test_airbyte.py: assert mock_get_job.has_calls(calls)
We should fix those tests and likely add a pre-commit check to ban this.
Thanks to @jobegrabber for noticing it!
### Committer
- [X] I acknowledge that I am a maintainer/committer of the Apache Airflow project. | https://github.com/apache/airflow/issues/20453 | https://github.com/apache/airflow/pull/23001 | 8e75e2349791ee606203d5ba9035146e8a3be3dc | 3a2eb961ca628d962ed5fad68760ef3439b2f40b | "2021-12-21T19:28:24Z" | python | "2022-04-16T10:05:29Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,442 | ["airflow/exceptions.py", "airflow/models/dagbag.py", "airflow/models/param.py", "tests/dags/test_invalid_param.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_dag.py", "tests/models/test_dagbag.py", "tests/models/test_param.py"] | NoValueSentinel in Param is incorrectly serialised | ### Apache Airflow version
2.2.2 (latest released)
### What happened
Hi
We upgraded to version 2.2.2 and started getting errors in the UI when we have DAG params that are numeric (tried with types 'number', 'integer' and 'float' and got the same error).
```
Python version: 3.7.12
Airflow version: 2.2.2
Node: airflow2-webserver-6df5fc45bd-5vdhs
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/param.py", line 61, in __init__
jsonschema.validate(self.value, self.schema, format_checker=FormatChecker())
File "/home/airflow/.local/lib/python3.7/site-packages/jsonschema/validators.py", line 934, in validate
raise error
jsonschema.exceptions.ValidationError: 'NoValueSentinel' is not of type 'number'
Failed validating 'type' in schema:
{'type': 'number'}
On instance:
'NoValueSentinel'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/home/airflow/.local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/airflow/.local/lib/python3.7/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/home/airflow/.local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/home/airflow/.local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/home/airflow/.local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/www/auth.py", line 51, in decorated
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/www/decorators.py", line 109, in view_func
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/www/decorators.py", line 72, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/www/views.py", line 2426, in graph
dag = current_app.dag_bag.get_dag(dag_id)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 186, in get_dag
self._add_dag_from_db(dag_id=dag_id, session=session)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 261, in _add_dag_from_db
dag = row.dag
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/serialized_dag.py", line 180, in dag
dag = SerializedDAG.from_dict(self.data) # type: Any
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 947, in from_dict
return cls.deserialize_dag(serialized_obj['dag'])
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 861, in deserialize_dag
v = {task["task_id"]: SerializedBaseOperator.deserialize_operator(task) for task in v}
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 861, in <dictcomp>
v = {task["task_id"]: SerializedBaseOperator.deserialize_operator(task) for task in v}
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 641, in deserialize_operator
v = cls._deserialize_params_dict(v)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 459, in _deserialize_params_dict
op_params[k] = cls._deserialize_param(v)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/serialization/serialized_objects.py", line 439, in _deserialize_param
return class_(**kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/param.py", line 63, in __init__
raise ValueError(err)
ValueError: 'NoValueSentinel' is not of type 'number'
Failed validating 'type' in schema:
{'type': 'number'}
On instance:
'NoValueSentinel'
```
### What you expected to happen
Declaring a numeric Param without a default value shouldn't cause a serialization error.
### How to reproduce
This is a small DAG to reproduce this error.
We use airflow on kubernetes with:
Python version: 3.7.12
Airflow version: 2.2.2
```
from airflow import DAG
from airflow.models.param import Param
from airflow.operators.dummy import DummyOperator
from datetime import datetime
params = {
    'wd': Param(type='number', description="demo numeric param")
}

default_args = {
    'depends_on_past': False,
    'start_date': datetime(2021, 6, 16)
}

with DAG('mydag', params=params, default_args=default_args, catchup=False, schedule_interval=None) as dag:
    t1 = DummyOperator(task_id='task1')
```
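A possible workaround sketch — this is an assumption on my part, not something verified above: the failure seems tied to the Param having no default, so giving it an explicit default should make a real number (rather than the sentinel string) get serialized and validated:
```python
from airflow.models.param import Param

params = {
    # assumed workaround: an explicit default avoids serializing the 'NoValueSentinel' placeholder
    'wd': Param(default=1.0, type='number', description="demo numeric param")
}
```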
### Operating System
centos
### Versions of Apache Airflow Providers
_No response_
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### Anything else
On every dag with a numeric type of param
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20442 | https://github.com/apache/airflow/pull/20802 | 717169987e76f332d7ce92bc361d2fff6966f6f0 | c59001d79facf7e472e0581ac8a538c25eebfda7 | "2021-12-21T14:14:06Z" | python | "2022-01-16T16:02:20Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,437 | ["airflow/providers/google/cloud/transfers/bigquery_to_gcs.py", "airflow/providers/google/cloud/transfers/bigquery_to_mssql.py", "airflow/providers/google/cloud/transfers/bigquery_to_mysql.py"] | Deprecated arg is passed to BigQueryHook | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
`apache-airflow-providers-google==1.0.0`
### Apache Airflow version
2.0.0
### Operating System
Debian GNU/Linux 10 (buster)
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### What happened
Within the `execute` method of `BigQueryToGCSOperator`, a deprecated arg – namely `bigquery_conn_id` – is being passed to `BigQueryHook`'s init method and causing a warning.
**Relevant code**: https://github.com/apache/airflow/blob/85bedd03c33a1d6c6339e846558727bf20bc16f7/airflow/providers/google/cloud/transfers/bigquery_to_gcs.py#L135
**Warning**:
```
WARNING - /home/airflow/.local/lib/python3.7/site-packages/airflow/providers/google/cloud/transfers/bigquery_to_gcs.py:140 DeprecationWarning: The bigquery_conn_id parameter has been deprecated. You should pass the gcp_conn_id parameter.
```
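The warning itself points at the fix direction — pass `gcp_conn_id` instead of the deprecated kwarg. A rough sketch of the hook construction inside `execute()` (the attribute names on `self` are assumed):
```python
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook

# hypothetical sketch only; not the actual patch
hook = BigQueryHook(
    gcp_conn_id=self.gcp_conn_id,   # rather than bigquery_conn_id=...
    delegate_to=self.delegate_to,
    impersonation_chain=self.impersonation_chain,
)
```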
### What you expected to happen
No `DeprecationWarning` should occur
### How to reproduce
Call the execute method of `BigQueryToGCSOperator`
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20437 | https://github.com/apache/airflow/pull/20502 | b5d520cf73100df714c71ac9898a97bc0df29a31 | 7d4d38b546c44287f8a9d09c4fc141cbea736511 | "2021-12-21T02:01:09Z" | python | "2021-12-28T23:35:20Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,432 | ["airflow/providers/tableau/operators/tableau.py", "tests/providers/tableau/operators/test_tableau.py"] | Tableau provider signs out prior to blocking refresh check | ### Apache Airflow version
2.2.2 (latest released)
### What happened
It logs into the Tableau server, starts a refresh, and then throws a NotSignedInError exception:
```
[2021-12-20, 07:40:13 CST] {{base.py:70}} INFO - Using connection to: id: tableau. Host: https://xxx/, Port: None, Schema: , Login: xxx, Password: ***, extra: {}
[2021-12-20, 07:40:14 CST] {{auth_endpoint.py:37}} INFO - Signed into https://xxx/ as user with id xxx-xxx-xxx-xxx-xxx
[2021-12-20, 07:40:14 CST] {{workbooks_endpoint.py:41}} INFO - Querying all workbooks on site
[2021-12-20, 07:40:14 CST] {{tableau.py:141}} INFO - Found matching with id xxx-xxx-xxx-xxx-xxx
[2021-12-20, 07:40:14 CST] {{auth_endpoint.py:53}} INFO - Signed out
[2021-12-20, 07:40:14 CST] {{jobs_endpoint.py:41}} INFO - Query for information about job xxx-xxx-xxx-xxx-xxx
[2021-12-20, 07:40:14 CST] {{taskinstance.py:1703}} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1509, in _execute_task
result = execute_callable(context=context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/tableau/operators/tableau.py", line 124, in execute
if not tableau_hook.wait_for_state(
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/tableau/hooks/tableau.py", line 180, in wait_for_state
finish_code = self.get_job_status(job_id=job_id)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/tableau/hooks/tableau.py", line 163, in get_job_status
return TableauJobFinishCode(int(self.server.jobs.get_by_id(job_id).finish_code))
File "/home/airflow/.local/lib/python3.9/site-packages/tableauserverclient/server/endpoint/endpoint.py", line 177, in wrapper
return func(self, *args, **kwargs)
File "/home/airflow/.local/lib/python3.9/site-packages/tableauserverclient/server/endpoint/jobs_endpoint.py", line 42, in get_by_id
url = "{0}/{1}".format(self.baseurl, job_id)
File "/home/airflow/.local/lib/python3.9/site-packages/tableauserverclient/server/endpoint/jobs_endpoint.py", line 14, in baseurl
return "{0}/sites/{1}/jobs".format(self.parent_srv.baseurl, self.parent_srv.site_id)
File "/home/airflow/.local/lib/python3.9/site-packages/tableauserverclient/server/server.py", line 168, in site_id
raise NotSignedInError(error)
tableauserverclient.server.exceptions.NotSignedInError: Missing site ID. You must sign in first.
[2021-12-20, 07:40:14 CST] {{taskinstance.py:1270}} INFO - Marking task as FAILED. dag_id=tableau-refresh, task_id=refresh_tableau_workbook, execution_date=20211220T194008, start_date=20211220T194013, end_date=20211220T194014
```
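The log above shows the hook signing out ("Signed out") before the job status is queried, which is exactly what `NotSignedInError` complains about. A rough sketch of what the blocking path presumably needs — keeping the polling inside the signed-in hook context (the helper and parameter names below are assumed, not taken from the provider):
```python
from airflow.exceptions import AirflowException
from airflow.providers.tableau.hooks.tableau import TableauHook, TableauJobFinishCode

# hypothetical sketch of TableauOperator.execute()
with TableauHook(self.site_id, self.tableau_conn_id) as tableau_hook:
    job_id = self._run_refresh(tableau_hook)  # made-up helper for the refresh call
    if self.blocking_refresh:
        # poll while the session is still signed in, i.e. inside the `with` block
        if not tableau_hook.wait_for_state(
            job_id=job_id,
            check_interval=self.check_interval,         # parameter name assumed
            target_state=TableauJobFinishCode.SUCCESS,
        ):
            raise AirflowException(f"The Tableau job {job_id} did not finish successfully.")
```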
### What you expected to happen
It should've logged into Tableau, started a refresh, and then waited for the refresh to finish.
### How to reproduce
Use a TableauOperator with a blocking refresh:
```
TableauOperator(
    resource='workbooks',
    method='refresh',
    find='Tableau workbook',
    match_with='name',
    site_id='tableau_site',
    task_id='refresh_tableau_workbook',
    tableau_conn_id='tableau',
    blocking_refresh=True,
    dag=dag,
)
```
### Operating System
Debian GNU/Linux 10 (buster)
### Versions of Apache Airflow Providers
apache-airflow-providers-tableau==2.1.2
### Deployment
Docker-Compose
### Deployment details
I'm using the official `apache/airflow:2.2.2-python3.9` Docker image
### Anything else
It occurs every time I use the new TableauOperator to do a blocking refresh of a workbook.
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20432 | https://github.com/apache/airflow/pull/20433 | 20d863007be02beccf598cf15909b7caf40f8019 | 636ae0a33dff63f899bc554e6585104776398bef | "2021-12-20T20:02:10Z" | python | "2021-12-22T07:55:16Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,426 | ["airflow/providers/google/common/hooks/base_google.py", "tests/providers/google/common/hooks/test_base_google.py"] | GKE Authentication not possible with user ADC and project ID set in either connection or `gcloud` config | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
`apache-airflow-providers-google==6.1.0`
### Apache Airflow version
2.1.4
### Operating System
Debian GNU/Linux 11 (bullseye)
### Deployment
Docker-Compose
### Deployment details
At my company we're developing our Airflow DAGs in local environments based on Docker Compose.
To authenticate against the GCP, we don't use service accounts and their keys, but instead use our user credentials and set them up as Application Default Credentials (ADC), i.e. we run
```
$ gcloud auth login
$ gcloud auth application-default login
```
We also set the default Project ID in both `gcloud` and Airflow connections, i.e.
```
$ gcloud config set project $PROJECT
$ # run the following inside the Airflow Docker container
$ airflow connections delete google_cloud_default
$ airflow connections add google_cloud_default \
--conn-type=google_cloud_platform \
--conn-extra='{"extra__google_cloud_platform__project":"$PROJECT"}'
```
### What happened
It seems that due to [this part](https://github.com/apache/airflow/blob/ed604b6/airflow/providers/google/common/hooks/base_google.py#L518..L541) in `base_google.py`, when the Project ID is set in either the Airflow connections or `gcloud` config, `gcloud auth` (specifically `gcloud auth activate-refresh-token`) will not be executed.
This results in e.g. `gcloud container clusters get-credentials` in the `GKEStartPodOperator` to fail, since `You do not currently have an active account selected`:
```
[2021-12-20 15:21:12,059] {credentials_provider.py:295} INFO - Getting connection using `google.auth.default()` since no key file is defined for hook.
[2021-12-20 15:21:12,073] {logging_mixin.py:109} WARNING - /usr/local/lib/python3.8/site-packages/google/auth/_default.py:70 UserWarning: Your application has authenticated using end user credentials from Google Cloud SDK without a quota project. You might receive a "quota exceeded" or "API not enabled" error. We recommend you rerun `gcloud auth application-default login` and make sure a quota project is added. Or you can use service accounts instead. For more information about service accounts, see https://cloud.google.com/docs/authentication/
[2021-12-20 15:21:13,863] {process_utils.py:135} INFO - Executing cmd: gcloud container clusters get-credentials REDACTED --zone europe-west1-b --project REDACTED
[2021-12-20 15:21:13,875] {process_utils.py:139} INFO - Output:
[2021-12-20 15:21:14,522] {process_utils.py:143} INFO - ERROR: (gcloud.container.clusters.get-credentials) You do not currently have an active account selected.
[2021-12-20 15:21:14,522] {process_utils.py:143} INFO - Please run:
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO -
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO - $ gcloud auth login
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO -
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO - to obtain new credentials.
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO -
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO - If you have already logged in with a different account:
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO -
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO - $ gcloud config set account ACCOUNT
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO -
[2021-12-20 15:21:14,523] {process_utils.py:143} INFO - to select an already authenticated account to use.
[2021-12-20 15:21:14,618] {taskinstance.py:1463} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1165, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1283, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1313, in _execute_task
result = task_copy.execute(context=context)
File "/usr/local/lib/python3.8/site-packages/airflow/providers/google/cloud/operators/kubernetes_engine.py", line 355, in execute
execute_in_subprocess(cmd)
File "/usr/local/lib/python3.8/site-packages/airflow/utils/process_utils.py", line 147, in execute_in_subprocess
raise subprocess.CalledProcessError(exit_code, cmd)
subprocess.CalledProcessError: Command '['gcloud', 'container', 'clusters', 'get-credentials', 'REDACTED', '--zone', 'europe-west1-b', '--project', 'REDACTED']' returned non-zero exit status 1.
```
If we set the environment variable `GOOGLE_APPLICATION_CREDENTIALS`, `gcloud auth activate-service-account` is run which only works with proper service account credentials, not user credentials.
### What you expected to happen
From my POV, it should work to
1. have the Project ID set in the `gcloud` config and/or Airflow variables and still be able to use user credentials with GCP Operators,
2. set `GOOGLE_APPLICATION_CREDENTIALS` to a file containing user credentials and be able to use these credentials with GCP Operators.
Item 1 was definitely possible in Airflow 1.
### How to reproduce
See Deployment Details. In essence:
- Run Airflow within Docker Compose (but it's not only Docker Compose that is affected, as far as I can see).
- Use user credentials with `gcloud`; `gcloud auth login`, `gcloud auth application-default login`
- Configure project ID in `gcloud` config (mounted in the Docker container) and/or Airflow connection
- Run `GKEStartOperator`
### Anything else
Currently, the only workaround (apart from using service accounts) seems to be to not set a default project in either the `gcloud` config or `google_cloud_platform` connections.
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20426 | https://github.com/apache/airflow/pull/20428 | 83f8e178ba7a3d4ca012c831a5bfc2cade9e812d | 4233ebe5cea4862dbf16c9d7c72c4fdd11db9774 | "2021-12-20T15:55:54Z" | python | "2021-12-31T16:32:23Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,419 | ["docs/apache-airflow/howto/email-config.rst", "docs/apache-airflow/img/email_connection.png"] | Need clearer documentation on setting up Sendgrid to send emails | ### Describe the issue with documentation
https://airflow.apache.org/docs/apache-airflow/stable/howto/email-config.html
needs to have more documentation on how to use sendgrid to send emails. For example, would installing the provider package guarantee the appearance of the "Email" connection type in the frontend? I do not think we can find an Email connection. Secondly, do we need to import the provider packages in our dags to make it work, or should options like email_on_success etc. work automatically once this is set up?
### How to solve the problem
Better documentation
### Anything else
Better documentation.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20419 | https://github.com/apache/airflow/pull/21958 | cc4b05654e9c8b6d1b3185c5690da87a29b66a4b | c9297808579c0d4f93acfd6791172193be19721b | "2021-12-20T11:18:14Z" | python | "2022-03-03T12:07:21Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,384 | ["airflow/cli/commands/task_command.py"] | DagRun for <FOO> with run_id or execution_date of 'manual__XXXXX' not found | ### Apache Airflow version
2.2.2 (latest released)
### What happened
After upgrading from Airflow 2.1.4 to 2.2.2, every DAG gives this error upon execution:
```
[2021-12-17, 15:01:12 UTC] {taskinstance.py:1259} INFO - Executing <Task(_PythonDecoratedOperator): print_the_context> on 2021-12-17 15:01:08.943254+00:00
[2021-12-17, 15:01:12 UTC] {standard_task_runner.py:52} INFO - Started process 873 to run task
[2021-12-17, 15:01:12 UTC] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'example_python_operator', 'print_the_context', 'manual__2021-12-17T15:01:08.943254+00:00', '--job-id', '326', '--raw', '--subdir', 'DAGS_FOLDER/test.py', '--cfg-path', '/tmp/tmpej1imvkr', '--error-file', '/tmp/tmpqn9ad7em']
[2021-12-17, 15:01:12 UTC] {standard_task_runner.py:77} INFO - Job 326: Subtask print_the_context
[2021-12-17, 15:01:12 UTC] {standard_task_runner.py:92} ERROR - Failed to execute job 326 for task print_the_context
Traceback (most recent call last):
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/task/task_runner/standard_task_runner.py", line 85, in _start_by_fork
args.func(args, dag=self.dag)
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 287, in task_run
ti = _get_ti(task, args.execution_date_or_run_id)
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 86, in _get_ti
dag_run = _get_dag_run(task.dag, exec_date_or_run_id, create_if_necssary, session)
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 80, in _get_dag_run
) from None
airflow.exceptions.DagRunNotFound: DagRun for example_python_operator with run_id or execution_date of 'manual__2021-12-17T15:01:08.943254+00:00' not found
```
Both tables `airflowdb.task_instance` and `airflowdb.dag_run` have rows with `run_id` equal to "manual__2021-12-17T15:01:08.943254+00:00".
The issue seems to arise in the `_get_dag_run()` function from [airflow/cli/commands/task_command.py](https://github.com/apache/airflow/blob/bb82cc0fbb7a6630eac1155d0c3b445dff13ceb6/airflow/cli/commands/task_command.py#L61-L72):
```
execution_date = None
with suppress(ParserError, TypeError):
    execution_date = timezone.parse(exec_date_or_run_id)

if create_if_necessary and not execution_date:
    return DagRun(dag_id=dag.dag_id, run_id=exec_date_or_run_id)
try:
    return (
        session.query(DagRun)
        .filter(
            DagRun.dag_id == dag.dag_id,
            DagRun.execution_date == execution_date,
        )
        .one()
    )
```
Here, `exec_date_or_run_id == 'manual__2021-12-17T15:01:08.943254+00:00'` and `timezone.parse(exec_date_or_run_id)` fails, meaning `execution_date` stays as `None` and the session query returns no results.
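A rough sketch of the kind of fallback that appears to be missing — looking the run up by `run_id` when the value does not parse as a date (illustrative only, not the actual fix):
```python
# hypothetical fallback inside _get_dag_run()
if execution_date is None:
    return (
        session.query(DagRun)
        .filter(DagRun.dag_id == dag.dag_id, DagRun.run_id == exec_date_or_run_id)
        .one()
    )
```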
### What you expected to happen
Expect DAGs to run without the above error.
### How to reproduce
Upgraded from 2.1.4 to 2.2.2 and manually ran a few DAGs. The above log is from the [example_python_operator](https://github.com/apache/airflow/blob/main/airflow/example_dags/example_python_operator.py) DAG provided on the Airflow repo.
### Operating System
Amazon Linux 2
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
_No response_
### Anything else
Tried `airflow db upgrade` and `airflow db reset` without any luck. The same issue appears on 2.2.3rc2.
Using MySQL 8.0.23.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20384 | https://github.com/apache/airflow/pull/20737 | 384fa4a87dfaa79a89ad8e18ac1980e07badec4b | 7947b72eee61a4596c5d8667f8442d32dcbf3f6d | "2021-12-17T16:45:45Z" | python | "2022-01-08T09:09:02Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,373 | ["airflow/www/static/js/callModal.js", "airflow/www/static/js/dag.js", "airflow/www/static/js/gantt.js", "airflow/www/static/js/graph.js"] | Paused event is logging twice in some pages. | ### Apache Airflow version
2.2.2 (latest released)
### What happened
If I pause/unpause a dag on the `Tree`, `Graph`, or `Gantt` page, the `/paused` url is called twice.
![image](https://user-images.githubusercontent.com/8676247/146517082-b359a81c-a53f-4aa7-a2e8-11bdbdd1e4d8.png)
So paused events are logging twice.
![image](https://user-images.githubusercontent.com/8676247/146517366-d25efa3c-a029-4aa4-a655-60a8d84a1e1c.png)
### What you expected to happen
The event should be logged only once.
### How to reproduce
Go to the `Tree`, `Graph`, or `Gantt` page and pause/unpause a dag.
### Operating System
centos
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other 3rd-party Helm chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20373 | https://github.com/apache/airflow/pull/28410 | 9c3734bb127ff0d71a0321d0578e556552cfc934 | 2f0f02536f7773dd782bd980ae932091b7badc61 | "2021-12-17T08:59:14Z" | python | "2022-12-22T13:19:40Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,333 | ["airflow/providers/google/cloud/transfers/gcs_to_bigquery.py"] | GCSToBigQueryOperator not rendering list pulled from XCOM | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
_No response_
### Apache Airflow version
2.1.4
### Operating System
Container-Optimized OS with Containerd
### Deployment
Composer
### Deployment details
_No response_
### What happened
I am trying to load some files from GCS to BigQuery. To get the list of files which match the prefix, I am using the GoogleCloudStorageListOperator which pushes the output to XCOM.
XCOM Push from GoogleCloudStorageListOperator:
['file1.csv', 'file2.csv', 'file3.csv', 'file4.csv', 'file5.csv']
When I am pulling the list from XCOM to use for BigQuery Load operation, it is getting rendered like:
**[**['file1.csv', 'file2.csv', 'file3.csv', 'file4.csv', 'file5.csv']**]**
Due to this I am getting the below error in GCSToBigQueryOperator :
Source URI must not contain the ',' character: gs://inbound-bucket/[['file1.csv', 'file2.csv', 'file3.csv', 'file4.csv', 'file5.csv']]
The code I am using can be found below.
```
dag = DAG(
    AIRFLOW_DAG_ID,
    default_args=default_values,
    description="google_operator_test",
    catchup=False,
    max_active_runs=1,
    render_template_as_native_obj=True,
    schedule_interval=None,
)

get_gcs_file_list = GoogleCloudStorageListOperator(
    task_id="get_gcs_file_list_1",
    google_cloud_storage_conn_id="temp_google_access",
    bucket="inbound-bucket",
    prefix=source_object_location,
    delimiter='.csv',
    dag=dag,
)

# Load the data from GCS to BQ
load_to_bq_stage = GCSToBigQueryOperator(
    task_id="bqload",
    bigquery_conn_id="temp_google_access",
    google_cloud_storage_conn_id="temp_google_access",
    bucket="inbound-bucket",
    source_objects='{{task_instance.xcom_pull(task_ids="get_gcs_file_list_1")}}',
    destination_project_dataset_table=dataset + "." + target_table,
    write_disposition='WRITE_TRUNCATE',
    field_delimiter='|',
    dag=dag,
)
```
### What you expected to happen
The list should be rendered as is by the GCSToBigQueryOperator.
### How to reproduce
1. Push a list of strings in to XCOM
2. Perform XCOM pull in source_objects parameter of GCSToBigQueryOperator like below,
source_objects ='{{task_instance.xcom_pull(task_ids="task_name")}}'
3. Notice that the rendered source_objects value has an extra pair of square brackets added.
### Anything else
There is a validation performed in the operator's constructor to check whether or not the source_objects value is a list. If it is not a list, it is presumed to be a string and wrapped in a list. Because source_objects is a template field, the field value isn't evaluated until the task runs (meaning the value isn't rendered until the operator's execute() method).
https://github.com/apache/airflow/blob/6eac2e0807a8be5f39178f079db28ebcd2f83621/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py#L219
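Given that, one possible direction (a sketch only, not the actual fix) would be to defer the list-wrapping until `execute()`, after templates have been rendered:
```python
# hypothetical sketch inside GCSToBigQueryOperator
def execute(self, context):
    source_objects = (
        self.source_objects
        if isinstance(self.source_objects, list)
        else [self.source_objects]
    )
    # ... build the load job from `source_objects` as before ...
```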
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20333 | https://github.com/apache/airflow/pull/20347 | 6e51608f28f4c769c019624ea0caaa0c6e671f80 | 17404f1f10efd41f98eb8a0317b578ff40f9c77d | "2021-12-16T05:47:39Z" | python | "2021-12-16T20:58:20Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,323 | ["airflow/decorators/__init__.py", "airflow/decorators/__init__.pyi", "airflow/decorators/sensor.py", "airflow/example_dags/example_sensor_decorator.py", "airflow/sensors/python.py", "docs/apache-airflow/tutorial/taskflow.rst", "tests/decorators/test_sensor.py"] | Add a `@task.sensor` TaskFlow decorator | ### Description
Implement a taskflow decorator that uses the decorated function as the poke method.
### Use case/motivation
Here is a sketch of the solution that might work:
[sensor_decorator.txt](https://github.com/apache/airflow/files/7721589/sensor_decorator.txt)
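Since the attached sketch is not inlined here, a purely hypothetical illustration of what the usage could look like (the API shape below is an assumption, not part of the proposal):
```python
import os

from airflow.decorators import task

# hypothetical API: the decorated callable becomes the sensor's poke method
@task.sensor(poke_interval=60, timeout=3600, mode="reschedule")
def wait_for_upstream_file(path: str = "/tmp/ready") -> bool:
    return os.path.exists(path)
```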
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20323 | https://github.com/apache/airflow/pull/22562 | a50195d617ca7c85d56b1c138f46451bc7599618 | cfd63df786e0c40723968cb8078f808ca9d39688 | "2021-12-15T17:37:59Z" | python | "2022-11-07T02:06:19Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,320 | ["airflow/api_connexion/openapi/v1.yaml", "airflow/www/static/js/types/api-generated.ts", "tests/api_connexion/endpoints/test_user_endpoint.py"] | Airflow API throwing 500 when user does not have last name | ### Apache Airflow version
2.2.2 (latest released)
### What happened
When listing users via GET api/v1/users, a 500 error is thrown if a user in the database does not have a last name.
```
{
"detail": "'' is too short\n\nFailed validating 'minLength' in schema['allOf'][0]['properties']['users']['items']['properties']['last_name']:\n {'description': 'The user lastname', 'minLength': 1, 'type': 'string'}\n\nOn instance['users'][25]['last_name']:\n ''",
"status": 500,
"title": "Response body does not conform to specification",
"type": "https://airflow.apache.org/docs/apache-airflow/2.2.2/stable-rest-api-ref.html#section/Errors/Unknown"
}
```
### What you expected to happen
Result set should still be returned with a null or empty value for the last name instead of a 500 error.
### How to reproduce
Create a user in the database without a last name and then hit GET api/v1/users
### Operating System
linux
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20320 | https://github.com/apache/airflow/pull/25476 | 98f16aa7f3b577022791494e13b6aa7057afde9d | 3421ecc21bafaf355be5b79ec4ed19768e53275a | "2021-12-15T17:15:42Z" | python | "2022-08-02T21:06:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,259 | ["Dockerfile", "Dockerfile.ci", "breeze", "dev/README_RELEASE_PROVIDER_PACKAGES.md", "docs/docker-stack/build-arg-ref.rst", "scripts/ci/libraries/_build_images.sh", "scripts/ci/libraries/_initialization.sh", "scripts/docker/compile_www_assets.sh", "scripts/docker/install_airflow.sh", "scripts/docker/prepare_node_modules.sh"] | Make sure that stderr is "clean" while building the images. | Our image generates a lot of "Noise" during building, which makes "real" errors difficult to distinguish from "false negatives".
We should make sure that there are no warnings while the docker image is built.
If there are any warnings that we cannot remove, we should add a reassuring note that we know what we are doing.
closed | apache/airflow | https://github.com/apache/airflow | 20,254 | [".github/workflows/ci.yml", "dev/breeze/setup.cfg", "scripts/ci/images/ci_test_examples_of_prod_image_building.sh"] | CI: Rewrite testing examples of images in python | We are runing automated tests for example images building as Bash scripts - those should be converted to Python.
https://github.com/apache/airflow/blob/98514cc1599751d7611b3180c60887da0a25ff5e/.github/workflows/ci.yml#L322 | https://github.com/apache/airflow/issues/20254 | https://github.com/apache/airflow/pull/21097 | bc1f062bdebd5a92b650e2316d4d98d2097388ca | 0d623296f1e5b6354e37dc0a45b2e4ed1a13901e | "2021-12-13T09:08:56Z" | python | "2022-01-26T21:15:05Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,252 | ["dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/cache.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/visuals/__init__.py"] | Breeze: Asciiart disabling | This is a small thing, but might be useful. Currently Breeze prints the Asciiart when started. You should be able to persistently disable the asciiart.
This is done by storing the right "flag" file in the ".build" directory. If the file is there, the ASCIIART should not be printed. If not, it should be printed. | https://github.com/apache/airflow/issues/20252 | https://github.com/apache/airflow/pull/20645 | c9023fad4287213e4d3d77f4c66799c762bff7ba | 8cc93c4bc6ae5a99688ca2effa661d6a3e24f56f | "2021-12-13T09:04:20Z" | python | "2022-01-11T18:16:22Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,251 | ["dev/breeze/doc/BREEZE2.md", "scripts/ci/libraries/_docker_engine_resources.sh", "scripts/in_container/run_resource_check.py", "scripts/in_container/run_resource_check.sh"] | Breeze: Verify if there are enough resources avaiilable in Breeze | At entry of the Breeze command we verify if there iss enouggh CPU/memory/disk space and print (coloured) information if the resources are not enough.
We should replicate that in Python | https://github.com/apache/airflow/issues/20251 | https://github.com/apache/airflow/pull/20763 | b8526abc2c220b1e07eed83694dfee972c2e2609 | 75755d7f65fb06c6e2e74f805b877774bfa7fcda | "2021-12-13T09:01:19Z" | python | "2022-01-19T11:51:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,249 | ["airflow/jobs/base_job.py", "airflow/migrations/versions/587bdf053233_adding_index_for_dag_id_in_job.py", "docs/apache-airflow/migrations-ref.rst"] | DAG deletion is slow due to lack of database indexes on dag_id | ### Apache Airflow version
2.2.1
### What happened
We have an airflow instance for approximately 6k DAGs.
- If we delete a DAG from UI, the UI times out
- If we delete a DAG from CLI, it completes but sometimes takes up to a half-hour to finish.
Most of the execution time appears to be consumed in database queries. I know I can just throw more CPU and memory at the db instance and hope it works, but I think we can do better during the delete operation. Correct me if I am wrong, but I think this is the code that gets executed when deleting a DAG from the UI or CLI via `delete_dag.py`
```python
for model in models.base.Base._decl_class_registry.values():
    if hasattr(model, "dag_id"):
        if keep_records_in_log and model.__name__ == 'Log':
            continue
        cond = or_(model.dag_id == dag_id, model.dag_id.like(dag_id + ".%"))
        count += session.query(model).filter(cond).delete(synchronize_session='fetch')

if dag.is_subdag:
    parent_dag_id, task_id = dag_id.rsplit(".", 1)
    for model in TaskFail, models.TaskInstance:
        count += (
            session.query(model).filter(model.dag_id == parent_dag_id, model.task_id == task_id).delete()
        )
```
I see we are iterating over all the models and doing a `dag_id` match. Some of the tables don't have an index over `dag_id` column like `job` which is making this operation really slow. This could be one easy fix for this issue.
For example, the following query took 20 mins to finish on a 16 CPU / 32 GB Postgres instance:
```sql
SELECT job.id AS job_id FROM job WHERE job.dag_id = $1 OR job.dag_id LIKE $2
```
and explain is as follows
```sql
EXPLAIN SELECT job.id AS job_id FROM job WHERE job.dag_id = '';
QUERY PLAN
---------------------------------------------------------------------------
Gather (cost=1000.00..1799110.10 rows=6351 width=8)
Workers Planned: 2
-> Parallel Seq Scan on job (cost=0.00..1797475.00 rows=2646 width=8)
Filter: ((dag_id)::text = ''::text)
(4 rows)
```
This is just one of the many queries that are being executed during the delete operation.
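A sketch of the "easy fix" mentioned above — adding an index on `job.dag_id` — expressed as an Alembic migration (the index name and the migration itself are illustrative, not the actual change):
```python
from alembic import op

def upgrade():
    # hypothetical index name
    op.create_index('idx_job_dag_id', 'job', ['dag_id'], unique=False)

def downgrade():
    op.drop_index('idx_job_dag_id', table_name='job')
```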
### What you expected to happen
Deletion of DAG should not take this much time.
### How to reproduce
_No response_
### Operating System
nix
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20249 | https://github.com/apache/airflow/pull/20282 | 6d25d63679085279ca1672c2eee2c45d6704efaa | ac9f29da200c208bb52d412186c5a1b936eb0b5a | "2021-12-13T07:31:08Z" | python | "2021-12-30T10:26:24Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,225 | ["airflow/example_dags/example_kubernetes_executor.py", "airflow/example_dags/example_python_operator.py", "airflow/example_dags/tutorial_taskflow_api_etl_virtualenv.py"] | Various example dag errors on db init on fresh install | ### Apache Airflow version
2.2.3rc1 (release candidate)
### What happened
After a fresh install of the latest release and running db init, I get the below error messages among the usual db init messages. Note that I deliberately didn't choose to use a VENV, but I think that should be handled gracefully rather than producing the errors below. Also relates to https://github.com/apache/airflow/pull/19355
```
ERROR [airflow.models.dagbag.DagBag] Failed to import: /usr/local/lib/python3.8/site-packages/airflow/example_dags/tutorial_taskflow_api_etl_virtualenv.py
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 331, in _load_modules_from_file
loader.exec_module(new_module)
File "<frozen importlib._bootstrap_external>", line 843, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/usr/local/lib/python3.8/site-packages/airflow/example_dags/tutorial_taskflow_api_etl_virtualenv.py", line 81, in <module>
tutorial_etl_dag = tutorial_taskflow_api_etl_virtualenv()
File "/usr/local/lib/python3.8/site-packages/airflow/models/dag.py", line 2984, in factory
f(**f_kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/example_dags/tutorial_taskflow_api_etl_virtualenv.py", line 76, in tutorial_taskflow_api_etl_virtualenv
order_data = extract()
File "/usr/local/lib/python3.8/site-packages/airflow/decorators/base.py", line 219, in factory
op = decorated_operator_class(
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 188, in apply_defaults
result = func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/decorators/python_virtualenv.py", line 61, in __init__
super().__init__(kwargs_to_upstream=kwargs_to_upstream, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 188, in apply_defaults
result = func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/decorators/base.py", line 131, in __init__
super().__init__(**kwargs_to_upstream, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 188, in apply_defaults
result = func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/operators/python.py", line 371, in __init__
raise AirflowException('PythonVirtualenvOperator requires virtualenv, please install it.')
airflow.exceptions.AirflowException: PythonVirtualenvOperator requires virtualenv, please install it.
ERROR [airflow.models.dagbag.DagBag] Failed to import: /usr/local/lib/python3.8/site-packages/airflow/example_dags/example_python_operator.py
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 331, in _load_modules_from_file
loader.exec_module(new_module)
File "<frozen importlib._bootstrap_external>", line 843, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/usr/local/lib/python3.8/site-packages/airflow/example_dags/example_python_operator.py", line 86, in <module>
virtualenv_task = callable_virtualenv()
File "/usr/local/lib/python3.8/site-packages/airflow/decorators/base.py", line 219, in factory
op = decorated_operator_class(
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 188, in apply_defaults
result = func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/decorators/python_virtualenv.py", line 61, in __init__
super().__init__(kwargs_to_upstream=kwargs_to_upstream, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 188, in apply_defaults
result = func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/decorators/base.py", line 131, in __init__
super().__init__(**kwargs_to_upstream, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 188, in apply_defaults
result = func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/operators/python.py", line 371, in __init__
raise AirflowException('PythonVirtualenvOperator requires virtualenv, please install it.')
airflow.exceptions.AirflowException: PythonVirtualenvOperator requires virtualenv, please install it.
WARNI [unusual_prefix_97712dd65736a8bf7fa3a692877470abf8eebade_example_kubernetes_executor] Could not import DAGs in example_kubernetes_executor.py
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/airflow/example_dags/example_kubernetes_executor.py", line 36, in <module>
from kubernetes.client import models as k8s
ModuleNotFoundError: No module named 'kubernetes'
WARNI [unusual_prefix_97712dd65736a8bf7fa3a692877470abf8eebade_example_kubernetes_executor] Install Kubernetes dependencies with: pip install apache-airflow[cncf.kubernetes]
ERROR [airflow.models.dagbag.DagBag] Failed to import: /usr/local/lib/python3.8/site-packages/airflow/example_dags/example_kubernetes_executor.py
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 331, in _load_modules_from_file
loader.exec_module(new_module)
File "<frozen importlib._bootstrap_external>", line 843, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/usr/local/lib/python3.8/site-packages/airflow/example_dags/example_kubernetes_executor.py", line 51, in <module>
"pod_override": k8s.V1Pod(metadata=k8s.V1ObjectMeta(annotations={"test": "annotation"}))
NameError: name 'k8s' is not defined
```
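The kubernetes example warning suggests the import itself is already guarded but the later use of `k8s` is not; a guard-pattern sketch of the kind of change that would avoid the hard failure (illustrative only, not the actual fix):
```python
# hypothetical guard for an example DAG with an optional dependency
try:
    from kubernetes.client import models as k8s
except ImportError:
    k8s = None

if k8s is None:
    executor_config = None  # or skip defining the DAG entirely
else:
    executor_config = {
        "pod_override": k8s.V1Pod(metadata=k8s.V1ObjectMeta(annotations={"test": "annotation"}))
    }
```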
### What you expected to happen
No errors, but warnings
### How to reproduce
docker run --name test-airflow-2.2.3rc1 -it python:3.8.12-buster bash
pip install apache-airflow==2.2.3rc1
airflow db init
### Operating System
debian
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
docker run --name test-airflow-2.2.3rc1 -it python:3.8.12-buster bash
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20225 | https://github.com/apache/airflow/pull/20295 | 60b72dd8f5161f0e00f29a9de54522682f7cd5f6 | 5a6c022f946d1be2bd68a42a7a920fdf932932e5 | "2021-12-11T23:46:59Z" | python | "2021-12-14T21:28:17Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,215 | ["airflow/providers/amazon/aws/example_dags/example_emr_serverless.py", "airflow/providers/amazon/aws/hooks/emr.py", "airflow/providers/amazon/aws/operators/emr.py", "airflow/providers/amazon/aws/sensors/emr.py", "airflow/providers/amazon/provider.yaml", "docs/apache-airflow-providers-amazon/operators/emr_serverless.rst", "tests/providers/amazon/aws/hooks/test_emr_serverless.py", "tests/providers/amazon/aws/operators/test_emr_serverless.py"] | EMR serverless, new operator | ### Description
A new EMR serverless has been announced and it is already available, see:
- https://aws.amazon.com/blogs/big-data/announcing-amazon-emr-serverless-preview-run-big-data-applications-without-managing-servers/
- https://aws.amazon.com/emr/serverless/
Having an operator for creating applications and submitting jobs to EMR serverless would be awesome.
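Purely to illustrate the ask — the operator names and arguments below are invented, since nothing for EMR Serverless exists in the provider yet (imports omitted for the same reason):
```python
# hypothetical operators for the requested feature
create_app = EmrServerlessCreateApplicationOperator(
    task_id="create_emr_serverless_app",
    release_label="emr-6.5.0",
    job_type="SPARK",
)
run_job = EmrServerlessStartJobOperator(
    task_id="run_spark_job",
    application_id=create_app.output,
    job_driver={"sparkSubmit": {"entryPoint": "s3://my-bucket/app.py"}},
)
create_app >> run_job
```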
### Use case/motivation
New operator for working with EMR serverless.
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20215 | https://github.com/apache/airflow/pull/25324 | 5480b4ca499cfe37677ac1ae1298a2737a78115d | 8df84e99b7319740990124736d0fc545165e7114 | "2021-12-11T01:40:22Z" | python | "2022-08-05T16:54:57Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,197 | ["airflow/www/templates/airflow/dag.html", "tests/www/views/test_views_tasks.py"] | Triggering a DAG in Graph view switches to Tree view | ### Apache Airflow version
2.2.2 (latest released)
### What happened
After I trigger a DAG from the Graph view, the view switches to the Tree view (happens on both 2.2.2 and main).
### What you expected to happen
I expect to stay in the Graph view, ideally switch to the newly created DAG run
### How to reproduce
Trigger a DAG from the Graph view
### Operating System
N/A
### Versions of Apache Airflow Providers
N/A
### Deployment
Astronomer
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20197 | https://github.com/apache/airflow/pull/20955 | 10f5db863e387c0fd7369cf521d624b6df77a65d | 928dafe6c495bbf3e03d14473753fce915134a46 | "2021-12-10T11:15:26Z" | python | "2022-01-20T08:15:37Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,155 | ["setup.py"] | ssl_cert_reqs has been deprecated in pymongo 3.9 | ### Apache Airflow Provider(s)
mongo
### Versions of Apache Airflow Providers
2.2.0
### Apache Airflow version
2.2.2 (latest released)
### Operating System
Debian GNU/Linux 10 (buster)
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### What happened
`ssl_cert_reqs` has been deprecated in pymongo 3.9
https://github.com/apache/airflow/blob/providers-mongo/2.2.0/airflow/providers/mongo/hooks/mongo.py#L94
```
{taskinstance.py:1703} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1509, in _execute_task
result = execute_callable(context=context)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/transfers/mongo_to_s3.py", line 123, in execute
mongo_db=self.mongo_db,
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/mongo/hooks/mongo.py", line 145, in find
collection = self.get_collection(mongo_collection, mongo_db=mongo_db)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/mongo/hooks/mongo.py", line 116, in get_collection
mongo_conn: MongoClient = self.get_conn()
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/mongo/hooks/mongo.py", line 96, in get_conn
self.client = MongoClient(self.uri, **options)
File "/home/airflow/.local/lib/python3.7/site-packages/pymongo/mongo_client.py", line 707, in __init__
keyword_opts.cased_key(k), v) for k, v in keyword_opts.items()))
File "/home/airflow/.local/lib/python3.7/site-packages/pymongo/mongo_client.py", line 707, in <genexpr>
keyword_opts.cased_key(k), v) for k, v in keyword_opts.items()))
File "/home/airflow/.local/lib/python3.7/site-packages/pymongo/common.py", line 740, in validate
value = validator(option, value)
File "/home/airflow/.local/lib/python3.7/site-packages/pymongo/common.py", line 144, in raise_config_error
raise ConfigurationError("Unknown option %s" % (key,))
pymongo.errors.ConfigurationError: Unknown option ssl_cert_reqs
```
Ref:
https://pymongo.readthedocs.io/en/stable/changelog.html#changes-in-version-3-9-0
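For context, the error happens because an `ssl_cert_reqs` entry reaches `MongoClient` (see the linked hook line). One possible direction — and this is an assumption on my part, the maintainers may prefer pinning pymongo instead — is to translate the option to the newer pymongo name:
```python
# hypothetical adjustment in MongoHook.get_conn(); pymongo >= 4 option name assumed
if options.get('ssl', False):
    # older pymongo accepted ssl_cert_reqs=CERT_NONE; newer versions use the tls* options
    options.setdefault('tlsAllowInvalidCertificates', True)
```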
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20155 | https://github.com/apache/airflow/pull/20511 | 86a249007604307bdf0f69012dbd1b783c8750e5 | f85880e989d7751cfa3ae2d4665d7cc0cb3cc945 | "2021-12-09T08:30:58Z" | python | "2021-12-27T19:28:50Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,110 | ["airflow/www/extensions/init_views.py", "tests/api_connexion/test_cors.py"] | CORS access_control_allow_origin header never returned | ### Apache Airflow version
2.2.2 (latest released)
### What happened
To fix a CORS problem I added the [access_control_allow_headers](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#access-control-allow-headers), [access_control_allow_methods](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#access-control-allow-methods) and [access_control_allow_origins](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#access-control-allow-origins) variables to the 2.2.2 docker-compose file provided in the documentation. Both the headers and methods values are returned correctly, but origins never is.
### What you expected to happen
The CORS response being returned with the provided origin header value.
### How to reproduce
Download the latest docker-compose from the documentation and add the following lines:
`AIRFLOW__API__ACCESS_CONTROL_ALLOW_HEADERS: 'content-type, origin, authorization, accept'`
`AIRFLOW__API__ACCESS_CONTROL_ALLOW_METHODS: 'GET, POST, OPTIONS, DELETE'`
`AIRFLOW__API__ACCESS_CONTROL_ALLOW_ORIGINS: '*'`
run and call with a CORS preflight
### Operating System
Windows 11
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
It's repeatable regardless of ORIGINS value. There was a name change on this variable that's possibly not handled.
On 2.1.4 the same works without problems.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20110 | https://github.com/apache/airflow/pull/25553 | 1d8507af07353e5cf29a860314b5ba5caad5cdf3 | e81b27e713e9ef6f7104c7038f0c37cc55d96593 | "2021-12-07T16:36:19Z" | python | "2022-08-05T17:41:05Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,099 | ["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"] | Scheduler crashlooping when dag with task_concurrency is deleted | ### Apache Airflow version
2.2.2 (latest released)
### What happened
After deleting the dag, the scheduler starts crashlooping and cannot recover. This means that an issue with the dag causes the whole environment to be down.
The stacktrace is as follows:
```
airflow-scheduler [2021-12-07 09:30:07,483] {kubernetes_executor.py:791} INFO - Shutting down Kubernetes executor
airflow-scheduler [2021-12-07 09:30:08,509] {process_utils.py:100} INFO - Sending Signals.SIGTERM to GPID 1472
airflow-scheduler [2021-12-07 09:30:08,681] {process_utils.py:66} INFO - Process psutil.Process(pid=1472, status='terminated', exitcode=0, started='09:28:37') (1472) terminated with exit
airflow-scheduler [2021-12-07 09:30:08,681] {scheduler_job.py:655} INFO - Exited execute loop
airflow-scheduler Traceback (most recent call last):
airflow-scheduler File "/home/airflow/.local/bin/airflow", line 8, in <module>
airflow-scheduler sys.exit(main())
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/__main__.py", line 48, in main
airflow-scheduler args.func(args)
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/cli_parser.py", line 48, in command
airflow-scheduler return func(*args, **kwargs)
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/utils/cli.py", line 92, in wrapper
airflow-scheduler return f(*args, **kwargs)
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/commands/scheduler_command.py", line 75, in scheduler
airflow-scheduler _run_scheduler_job(args=args)
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/commands/scheduler_command.py", line 46, in _run_scheduler_job
airflow-scheduler job.run()
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/base_job.py", line 245, in run
airflow-scheduler self._execute()
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 628, in _execute
airflow-scheduler self._run_scheduler_loop()
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 709, in _run_scheduler_loop
airflow-scheduler num_queued_tis = self._do_scheduling(session)
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 820, in _do_scheduling
airflow-scheduler num_queued_tis = self._critical_section_execute_task_instances(session=session)
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 483, in _critical_section_execute_task_instances
airflow-scheduler queued_tis = self._executable_task_instances_to_queued(max_tis, session=session)
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/utils/session.py", line 67, in wrapper
airflow-scheduler return func(*args, **kwargs)
airflow-scheduler File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 366, in _executable_task_instances_to_queued
airflow-scheduler if serialized_dag.has_task(task_instance.task_id):
airflow-scheduler AttributeError: 'NoneType' object has no attribute 'has_task'
```
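The last frame shows `_executable_task_instances_to_queued` calling `has_task` on a serialized DAG that no longer exists. A defensive sketch of the kind of guard that would keep one deleted dag from taking the whole scheduler down (illustrative only; the lookup call and attribute names are assumed):
```python
# hypothetical guard around scheduler_job.py:366 from the traceback, inside the loop
# over candidate task instances
serialized_dag = self.dagbag.get_dag(dag_id, session=session)
if serialized_dag is None:
    self.log.warning("DAG %s was deleted or is missing from serialized_dag; skipping", dag_id)
    continue
if serialized_dag.has_task(task_instance.task_id):
    ...
```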
### What you expected to happen
I expect that the scheduler does not crash because a dag gets deleted. The biggest issue, however, is that the whole environment goes down; it would be acceptable for the scheduler to have issues with that dag (it is deleted after all), but it should not affect all other dags in the environment.
### How to reproduce
1. I created the following dag:
```
from airflow import DAG
from datafy.operators import DatafyContainerOperatorV2
from datetime import datetime, timedelta

default_args = {
    "owner": "Datafy",
    "depends_on_past": False,
    "start_date": datetime(year=2021, month=12, day=1),
    "task_concurrency": 4,
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG(
    "testnielsdev", default_args=default_args, max_active_runs=default_args["task_concurrency"] + 1, schedule_interval="0 1 * * *",
)

DatafyContainerOperatorV2(
    dag=dag,
    task_id="sample",
    cmds=["python"],
    arguments=["-m", "testnielsdev.sample", "--date", "{{ ds }}", "--env", "{{ macros.datafy.env() }}"],
    instance_type="mx_small",
    instance_life_cycle="spot",
)
```
When looking at the airflow code, the most important setting apart from the defaults is to specify task_concurrency.
2. I enable the dag
3. I delete it. When the file gets removed, the scheduler starts crashlooping.
### Operating System
We use the default airflow docker image
### Versions of Apache Airflow Providers
Not relevant
### Deployment
Other Docker-based deployment
### Deployment details
Not relevant
### Anything else
It occurred at one of our customers and I was quickly able to reproduce the issue.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20099 | https://github.com/apache/airflow/pull/20349 | dba00ce6a32b7f50153887c6974f62985ca8023f | 98715760f72e5205c291293088b5e79636884491 | "2021-12-07T09:51:50Z" | python | "2022-01-13T21:23:10Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,092 | ["airflow/configuration.py", "docs/apache-airflow/howto/set-config.rst", "tests/core/test_configuration.py"] | PR #18772 breaks `sql_alchemy_conn_cmd` config | ### Apache Airflow version
2.2.1
### What happened
#18772 added two options, `include_env` and `include_cmds`, to `tmp_configuration_copy()`; when tasks run as the same user, the temp configuration file for the task runner is now generated with `include_env=False` and `include_cmds=False`. This is a change from the previous defaults, which materialized the values of `*_cmd` configs into the JSON dump.
This presents a problem when `sql_alchemy_conn_cmd` is used to set the database connection and tasks run as the same user: the temporary config JSON dump now includes both `sql_alchemy_conn` (set to the Airflow distribution default) and the user-set `sql_alchemy_conn_cmd`. Because bare settings take precedence over their `_cmd` versions, the Airflow worker can connect to the configured DB, but the task runner itself uses a non-existent SQLite DB, causing all tasks to fail.
TLDR; The temp JSON config dump used to look something like:
```
{
"core": {
"sql_alchemy_conn": "mysql://...",
...
}
}
```
Now it looks something like:
```
{
"core": {
"sql_alchemy_conn": "sqlite:////var/opt/airflow/airflow.db",
"sql_alchemy_conn_cmd": "/etc/airflow/get-config 'core/sql_alchemy_conn'"
...
}
}
```
But because `sql_alchemy_conn` is set `sql_alchemy_conn_cmd` never gets called and tasks are unable to access the Airflow DB.
### What you expected to happen
I'm not quite sure what the preferred way to fix this is. I see a few options (a rough sketch of the first one follows the list):
- Remove the bare `sensitive_config_values` if either `_cmd` or `_secret` versions exist
- Ignore the Airflow default value in addition to empty values for bare `sensitive_config_values` during parsing
- Go back to materializing the sensitive configs
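As an illustration of the first option, the dump step could drop the bare value whenever a `_cmd`/`_secret` counterpart is set, so it cannot shadow the command. This is only a sketch; the function name and dictionary shape are mine, not `airflow.configuration`'s:

```python
def scrub_materialized_defaults(json_dump: dict, sensitive_config_values) -> dict:
    """Sketch: remove the bare sensitive value when a _cmd or _secret variant exists,
    so the command-based config is actually consulted by the task runner."""
    for section, key in sensitive_config_values:
        options = json_dump.get(section, {})
        if (f"{key}_cmd" in options or f"{key}_secret" in options) and key in options:
            del options[key]
    return json_dump
```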
### How to reproduce
With either Airflow 2.2.1 or 2.2.2 configure the DB with `sql_alchemy_conn_cmd` and remove the `sql_alchemy_conn` config from your config file. Then run the Airflow worker and task runner as the same user. Try running any task and see that the task tries to access the default SQLite store instead of the configured one.
### Operating System
Debian Bullseye
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20092 | https://github.com/apache/airflow/pull/21539 | e93cd4b2cfa98be70f6521832cfbd4d6b5551e30 | e07bc63ec0e5b679c87de8e8d4cdff1cf4671146 | "2021-12-07T04:31:34Z" | python | "2022-03-15T18:06:50Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,063 | ["airflow/models/dag.py", "airflow/www/views.py"] | Forward slash in `dag_run_id` gives rise to trouble accessing things through the REST API | ### Apache Airflow version
2.1.4
### Operating System
linux
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.2.0
apache-airflow-providers-celery==2.0.0
apache-airflow-providers-cncf-kubernetes==2.0.2
apache-airflow-providers-docker==2.1.1
apache-airflow-providers-elasticsearch==2.0.3
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==5.1.0
apache-airflow-providers-grpc==2.0.1
apache-airflow-providers-hashicorp==2.1.0
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-microsoft-azure==3.1.1
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-postgres==2.2.0
apache-airflow-providers-redis==2.0.1
apache-airflow-providers-sendgrid==2.0.1
apache-airflow-providers-sftp==2.1.1
apache-airflow-providers-slack==4.0.1
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.1.1
### Deployment
Docker-Compose
### Deployment details
We tend to trigger dag runs by some external event, e.g., a media-file upload, see #19745. It is useful to use the media-file path as a dag run id. The media-id can come with some partial path, e.g., `path/to/mediafile`. All this seems to work fine in airflow, but we can't figure out a way to use the such a dag run id in the REST API, as the forward slashes `/` interfere with the API routing.
### What happened
When using the API route `api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` in, e.g., an HTTP GET, we expect a dag run to be found when `dag_run_id` has the value `path/to/mediafile`, but instead a `404` status is returned. When we change the `dag_run_id` to the format `path|to|mediafile`, the dag run is returned.
### What you expected to happen
We would expect a dag run to be returned, even if it contains the character `/`
### How to reproduce
Trigger a dag using a dag_run_id that contains a `/`, then try to retrieve it through the REST API.
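To make the behaviour concrete (the dag id, host and credentials below are hypothetical; the run was triggered with the media-file path as its `dag_run_id`):

```python
import requests

BASE = "http://localhost:8080/api/v1"

ok = requests.get(f"{BASE}/dags/my_dag/dagRuns/path|to|mediafile", auth=("user", "pass"))
print(ok.status_code)   # 200 - found when the id uses '|' separators

bad = requests.get(f"{BASE}/dags/my_dag/dagRuns/path/to/mediafile", auth=("user", "pass"))
print(bad.status_code)  # 404 - the '/' characters are consumed by the API routing
```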
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20063 | https://github.com/apache/airflow/pull/23106 | ebc1f14db3a1b14f2535462e97a6407f48b19f7c | 451c7cbc42a83a180c4362693508ed33dd1d1dab | "2021-12-06T09:00:59Z" | python | "2022-05-03T21:22:12Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,032 | ["airflow/providers/snowflake/hooks/snowflake.py", "tests/providers/snowflake/hooks/test_snowflake.py"] | Snowflake Provider - Hook's support for not providing a region is broken when using SQLAlchemy | ### Apache Airflow Provider(s)
snowflake
### Versions of Apache Airflow Providers
Versions 2.2.x (since https://github.com/apache/airflow/commit/0a37be3e3cf9289f63f1506bc31db409c2b46738).
### Apache Airflow version
2.2.1
### Operating System
Debian GNU/Linux 10 (buster)
### Deployment
Other 3rd-party Helm chart
### Deployment details
Bitnami Airflow Helm chart @ version 8.0.2
### What happened
When connecting to Snowflake via SQLAlchemy using the Snowflake Hook, I get an error that the URL is not valid because my Snowflake instance is in US West 2 (Oregon) which means I don't provide a region explicitly. Snowflake's documentation says:
> If the account is located in the AWS US West (Oregon) region, no additional segments are required and the URL would be xy12345.snowflakecomputing.com
The error is that `xy12345..snowflakecomputing.com` is not a valid URL (note the double-dot caused by the lack of a region).
### What you expected to happen
I expect the connection to be successful.
### How to reproduce
You can use the default snowflake connection if you have one defined and see this problem with the following one-liner:
```shell
python -c 'from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook; SnowflakeHook().get_sqlalchemy_engine().connect()'
```
### Anything else
Fortunately I imagine the fix for this is just to leave the region URL component out when `region` is `None` here: https://github.com/apache/airflow/commit/0a37be3e3cf9289f63f1506bc31db409c2b46738#diff-2b674ac999a5b938fe5045f6475b0c5cc76e4cab89174ac448a9e1d41a5c04d5R215.
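A sketch of that conditional (the function and variable names are mine, not the hook's):

```python
from typing import Optional


def account_identifier(account: str, region: Optional[str]) -> str:
    """Sketch: build the account part of the SQLAlchemy URI, leaving the region
    segment out entirely when it is not set (US West 2 / Oregon accounts)."""
    return f"{account}.{region}" if region else account
```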
Using version `2.1.1` of the Snowflake provider with version `2.2.1` of Airflow is currently a viable workaround so for now I am just avoiding the update to the provider.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20032 | https://github.com/apache/airflow/pull/20509 | b7086f9815d3856cb4f3ee5bbc78657f19df9d2d | a632b74846bae28408fb4c1b38671fae23ca005c | "2021-12-04T00:26:56Z" | python | "2021-12-28T12:54:54Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,007 | ["airflow/providers/google/cloud/transfers/postgres_to_gcs.py"] | PostgresToGCSOperator fail on empty table and use_server_side_cursor=True | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
apache-airflow-providers-google==6.1.0
### Apache Airflow version
2.2.2 (latest released)
### Operating System
Debian GNU/Linux 10 (buster)
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What happened
When I execute `PostgresToGCSOperator` on an empty table with `use_server_side_cursor=True`, the operator fails with this error:
```
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
result = execute_callable(context=context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 154, in execute
files_to_upload = self._write_local_data_files(cursor)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 213, in _write_local_data_files
row = self.convert_types(schema, col_type_dict, row)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 174, in convert_types
return [self.convert_type(value, col_type_dict.get(name)) for name, value in zip(schema, row)]
TypeError: 'NoneType' object is not iterable
```
The operator call I'm using:
```python
task_send = PostgresToGCSOperator(
    task_id=f'send_{table}',
    postgres_conn_id='postgres_raw',
    gcp_conn_id=gcp_conn_id,
    sql=f'SELECT * FROM public.{table}',
    use_server_side_cursor=True,
    bucket=bucket,
    filename=f'{table}.csv',
    export_format='csv',
)
```
### What you expected to happen
I expected that, for an empty table, the operator would not create a file and would not upload anything to Google Cloud Storage.
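The traceback suggests the loop keeps converting after the cursor returns no row. A guard of this shape is the kind of fix I would expect (a sketch only, not the actual provider code):

```python
def iter_rows(cursor):
    """Yield rows from a (server-side) cursor, stopping cleanly on an empty result."""
    while True:
        row = cursor.fetchone()
        if row is None:  # empty table or end of data - nothing left to convert/write
            return
        yield row
```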
### How to reproduce
- Create an empty PostgreSQL table.
- Create a DAG with a `PostgresToGCSOperator` task that uploads this table to Google Cloud.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20007 | https://github.com/apache/airflow/pull/21307 | 00f0025abf6500af67f4c5b7543d45658d31b3b2 | 2eb10565b2075d89eb283bd53462c00f5d54ab55 | "2021-12-03T08:17:25Z" | python | "2022-02-15T11:41:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,989 | ["airflow/www/static/js/dag_dependencies.js"] | Arrows in DAG dependencies view are not consistent | ### Apache Airflow version
main (development)
### Operating System
N/A
### Versions of Apache Airflow Providers
N/A
### Deployment
Other
### Deployment details
_No response_
### What happened
The arrows in the DAG dependencies view are not consistent with the graph view:
Graph View:
![image](https://user-images.githubusercontent.com/6249654/144484716-31557857-1be0-441b-9bf3-828fa6629a8a.png)
DAG dependencies view:
![image](https://user-images.githubusercontent.com/6249654/144484794-e0e3cb26-b0e7-4f06-98c2-9e903b09e0da.png)
### What you expected to happen
All arrows should be filled black
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19989 | https://github.com/apache/airflow/pull/20303 | 2d9338f2b7fa6b7aadb6a81cd5fc3b3ad8302a4a | 28045696dd3ea7207b1162c2343ba142e1f75e5d | "2021-12-02T18:53:08Z" | python | "2021-12-15T03:50:00Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,988 | ["airflow/utils/db.py"] | Migrations fail due to GTID Consistency Violation when using GTID Replication | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
Debian GNU/Linux 10 (buster)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other 3rd-party Helm chart
### Deployment details
Using MySQL 8.0 via Google CloudSQL
### What happened
When upgrading from 2.1.2 to 2.2.2 the `airflow db upgrade` command failed with the following command:
```
sqlalchemy.exc.OperationalError: (MySQLdb._exceptions.OperationalError) (1786, 'Statement violates GTID consistency: CREATE TABLE ... SELECT.')
[SQL: create table _airflow_moved__2_2__task_instance as select source.* from task_instance as source
left join dag_run as dr
on (source.dag_id = dr.dag_id and source.execution_date = dr.execution_date)
where dr.id is null
]
```
On further investigation, it looks like `CREATE TABLE AS SELECT ...` queries will cause GTID consistency violations whenever mySQL GTID-based replication is used ([source](https://dev.mysql.com/doc/refman/5.7/en/replication-gtids-restrictions.html)).
This is used by default on all Google CloudSQL mySQL instances, so they will always block these queries ([source](https://cloud.google.com/sql/docs/mysql/features#unsupported-statements))
It looks like these queries are created here:
https://github.com/apache/airflow/blob/eaa8ac72fc901de163b912a94dbe675045d2a009/airflow/utils/db.py#L739-L747
Could we refactor this into two separate queries like:
```python
else:
    # Postgres, MySQL and SQLite all have the same CREATE TABLE a AS SELECT ... syntax
    session.execute(
        text(
            f"create table {target_table_name} like {source_table}"
        )
    )
    session.execute(
        text(
            f"INSERT INTO {target_table_name} select source.* from {source_table} as source " + where_clause
        )
    )
```
In order to preserve the GTID consistency?
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19988 | https://github.com/apache/airflow/pull/19999 | 480c333c45d31cdfdc63cdfceecd4ad8529eefd4 | 49965016594618211e47feaaa81f6a03f61ec708 | "2021-12-02T18:51:29Z" | python | "2021-12-03T16:15:05Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,986 | ["BREEZE.rst", "PULL_REQUEST_WORKFLOW.rst", "README.md", "TESTING.rst", "breeze-complete", "docs/apache-airflow/concepts/scheduler.rst", "docs/apache-airflow/howto/set-up-database.rst", "docs/apache-airflow/installation/prerequisites.rst", "scripts/ci/libraries/_initialization.sh"] | Postgres 9.6 end of support | ### Describe the issue with documentation
Support for Postgres 9.6 ended on November 11, 2021:
https://www.postgresql.org/support/versioning/
### How to solve the problem
Remove Postgres 9.6 support from Airflow.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19986 | https://github.com/apache/airflow/pull/19987 | 538612c3326b5fd0be4f4114f85e6f3063b5d49c | a299cbf4ce95af49132a6c7b17cd6a0355544836 | "2021-12-02T18:32:02Z" | python | "2021-12-05T23:04:01Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,977 | ["airflow/providers/google/cloud/transfers/gdrive_to_gcs.py", "tests/providers/google/cloud/transfers/test_gdrive_to_gcs.py"] | GoogleDriveToGCSOperator.dry_run() raises AttributeError due to deprecated template_fields | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
apache-airflow-providers-google==5.1.0
### Apache Airflow version
2.1.4
### Operating System
MacOS BigSur 11.6
### Deployment
Virtualenv installation
### Deployment details
`tox4` virtual environment
### What happened
1) Unit testing some DAG task operators with `assert op.dry_run() is None`
2) `AttributeError` is raised when `baseoperator` iterates through `template_fields` and calls `getattr`
```python
def dry_run(self) -> None:
    """Performs dry run for the operator - just render template fields."""
    self.log.info('Dry run')
    for field in self.template_fields:
        content = getattr(self, field)
```
3) Caused by the deprecated `destination_bucket` argument, which is assigned to `self.bucket_name` rather than being stored under its own name
```python
template_fields = [
    "bucket_name",
    "object_name",
    "destination_bucket",
    "destination_object",
    "folder_id",
    "file_name",
    "drive_id",
    "impersonation_chain",
]

def __init__(
    self,
    *,
    bucket_name: Optional[str] = None,
    object_name: Optional[str] = None,
    destination_bucket: Optional[str] = None,  # deprecated
    destination_object: Optional[str] = None,  # deprecated
    file_name: str,
    folder_id: str,
    drive_id: Optional[str] = None,
    gcp_conn_id: str = "google_cloud_default",
    delegate_to: Optional[str] = None,
    impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
    **kwargs,
) -> None:
    super().__init__(**kwargs)
    self.bucket_name = destination_bucket or bucket_name
    if destination_bucket:
        warnings.warn(
            "`destination_bucket` is deprecated please use `bucket_name`",
            DeprecationWarning,
            stacklevel=2,
        )
    self.object_name = destination_object or object_name
    if destination_object:
        warnings.warn(
            "`destination_object` is deprecated please use `object_name`",
            DeprecationWarning,
            stacklevel=2,
        )
    self.folder_id = folder_id
    self.drive_id = drive_id
    self.file_name = file_name
    self.gcp_conn_id = gcp_conn_id
    self.delegate_to = delegate_to
    self.impersonation_chain = impersonation_chain
```
### What you expected to happen
Calling `op.dry_run()` should return None and not raise any exceptions.
The templated fields contain the deprecated arguments (`destination_bucket`, `destination_object`), which aren't initialized as attributes in the init method for the class.
The base operator loops through these templated fields, but since `GoogleDriveToGCSOperator` does not initialize `self.destination_bucket` or `self.destination_object`, it raises an `AttributeError`
### How to reproduce
```python
from airflow.providers.google.cloud.transfers import gdrive_to_gcs

# pytest fixtures included as arguments
# won't include for brevity, but can provide if necessary
def test_gdrive_to_gcs_transfer(
    test_dag,
    mock_gcp_default_conn,
    patched_log_entry,
    today
):
    op = gdrive_to_gcs.GoogleDriveToGCSOperator(
        task_id="test_gcs_to_gdrive_transfer",
        dag=test_dag,
        bucket_name="some-other-bucket",
        object_name="thing_i_want_to_copy.csv",
        file_name="my_file.csv",
        folder_id="my_folder",
        drive_id="some_drive_id",
    )

    assert op.dry_run() is None
```
### Anything else
Not sure where it would be appropriate to address this issue since the deprecated fields support backward compatibility to previous versions of the operator.
This is my first time contributing to the project, but hope this is helpful.
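For what it's worth, one possible direction (just a sketch, not necessarily the right fix) would be to drop the deprecated names from `template_fields`, since only the new-style attributes are ever set on the instance:

```python
# Sketch: render only attributes the operator actually assigns, so the
# getattr() loop in dry_run() cannot hit a missing attribute.
template_fields = [
    "bucket_name",
    "object_name",
    "folder_id",
    "file_name",
    "drive_id",
    "impersonation_chain",
]
```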
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19977 | https://github.com/apache/airflow/pull/19991 | 53b241534576f1a85fe7f87ed66793d43f3a564e | cb082d361a61da7040e044ff2c1f7758142a9b2d | "2021-12-02T15:14:15Z" | python | "2021-12-02T22:25:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,970 | ["dev/breeze/src/airflow_breeze/branch_defaults.py", "dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/cache.py", "dev/breeze/src/airflow_breeze/ci/build_image.py", "dev/breeze/src/airflow_breeze/ci/build_params.py", "dev/breeze/src/airflow_breeze/console.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/utils.py"] | Breeze: Build CI images with Breeze | This quite a task and it should likely be split into smaller PRs until completed.
Breeze has the possibility of building CI and PROD images, but this one focuses only on building the CI image.
Building the image is not very straightforward, because in multiple places the images are built differently:
* for different Python version
* with different caches
* with different build options (upgrade to newer dependencies for example)
Also, building the image involves multiple steps:
* building base python image (if needed)
* building the CI image
The whole story about images (both PROD and CI) is described here:
https://github.com/apache/airflow/blob/main/IMAGES.rst
This change should end up with a `build-ci-image` command that builds the image with all the possible flags.
NOTE: The old `breeze build-image` command also implements complex logic to determine whether images should be pulled or whether they need to be rebuilt at all. This is NOT part of this change and will be implemented later.
Here are the current flags.
```
./breeze build-image --help
Good version of docker 20.10.7.
Build image
Detailed usage for command: build-image
breeze build-image [FLAGS]
Builds docker image (CI or production) without entering the container. You can pass
additional options to this command, such as:
Choosing python version:
'--python'
Choosing cache option:
'--build-cache-local' or '-build-cache-pulled', or '--build-cache-none'
Choosing whether to force pull images or force build the image:
'--force-build-image', '--force-pull-image'
Checking if the base python image has been updated:
'--check-if-base-python-image-updated'
You can also pass '--production-image' flag to build production image rather than CI image.
For GitHub repository, the '--github-repository' can be used to choose repository
to pull/push images.
Flags:
-p, --python PYTHON_MAJOR_MINOR_VERSION
Python version used for the image. This is always major/minor version.
One of:
3.7 3.8 3.9 3.6
-a, --install-airflow-version INSTALL_AIRFLOW_VERSION
Uses different version of Airflow when building PROD image.
2.0.2 2.0.1 2.0.0 wheel sdist
-t, --install-airflow-reference INSTALL_AIRFLOW_REFERENCE
Installs Airflow directly from reference in GitHub when building PROD image.
This can be a GitHub branch like main or v2-2-test, or a tag like 2.2.0rc1.
--installation-method INSTALLATION_METHOD
Method of installing Airflow in PROD image - either from the sources ('.')
or from package 'apache-airflow' to install from PyPI.
Default in Breeze is to install from sources. One of:
. apache-airflow
--upgrade-to-newer-dependencies
Upgrades PIP packages to latest versions available without looking at the constraints.
-I, --production-image
Use production image for entering the environment and builds (not for tests).
-F, --force-build-images
Forces building of the local docker images. The images are rebuilt
automatically for the first time or when changes are detected in
package-related files, but you can force it using this flag.
-P, --force-pull-images
Forces pulling of images from GitHub Container Registry before building to populate cache.
The images are pulled by default only for the first time you run the
environment, later the locally build images are used as cache.
--check-if-base-python-image-updated
Checks if Python base image from DockerHub has been updated vs the current python base
image we store in GitHub Container Registry. Python images are updated regularly with
security fixes, this switch will check if a new one has been released and will pull and
prepare a new base python based on the latest one.
--cleanup-docker-context-files
Removes whl and tar.gz files created in docker-context-files before running the command.
In case there are some files there it unnecessarily increases the context size and
makes the COPY . always invalidated - if you happen to have those files when you build your
image.
Customization options:
-E, --extras EXTRAS
Extras to pass to build images The default are different for CI and production images:
CI image:
devel_ci
Production image:
amazon,async,celery,cncf.kubernetes,dask,docker,elasticsearch,ftp,google,google_auth,
grpc,hashicorp,http,ldap,microsoft.azure,mysql,odbc,pandas,postgres,redis,sendgrid,
sftp,slack,ssh,statsd,virtualenv
--image-tag TAG
Additional tag in the image.
--skip-installing-airflow-providers-from-sources
By default 'pip install' in Airflow 2.0 installs only the provider packages that
are needed by the extras. When you build image during the development (which is
default in Breeze) all providers are installed by default from sources.
You can disable it by adding this flag but then you have to install providers from
wheel packages via --use-packages-from-dist flag.
--disable-pypi-when-building
Disable installing Airflow from pypi when building. If you use this flag and want
to install Airflow, you have to install it from packages placed in
'docker-context-files' and use --install-from-docker-context-files flag.
--additional-extras ADDITIONAL_EXTRAS
Additional extras to pass to build images The default is no additional extras.
--additional-python-deps ADDITIONAL_PYTHON_DEPS
Additional python dependencies to use when building the images.
--dev-apt-command DEV_APT_COMMAND
The basic command executed before dev apt deps are installed.
--additional-dev-apt-command ADDITIONAL_DEV_APT_COMMAND
Additional command executed before dev apt deps are installed.
--additional-dev-apt-deps ADDITIONAL_DEV_APT_DEPS
Additional apt dev dependencies to use when building the images.
--dev-apt-deps DEV_APT_DEPS
The basic apt dev dependencies to use when building the images.
--additional-dev-apt-deps ADDITIONAL_DEV_DEPS
Additional apt dev dependencies to use when building the images.
--additional-dev-apt-envs ADDITIONAL_DEV_APT_ENVS
Additional environment variables set when adding dev dependencies.
--runtime-apt-command RUNTIME_APT_COMMAND
The basic command executed before runtime apt deps are installed.
--additional-runtime-apt-command ADDITIONAL_RUNTIME_APT_COMMAND
Additional command executed before runtime apt deps are installed.
--runtime-apt-deps ADDITIONAL_RUNTIME_APT_DEPS
The basic apt runtime dependencies to use when building the images.
--additional-runtime-apt-deps ADDITIONAL_RUNTIME_DEPS
Additional apt runtime dependencies to use when building the images.
--additional-runtime-apt-envs ADDITIONAL_RUNTIME_APT_DEPS
Additional environment variables set when adding runtime dependencies.
Build options:
--disable-mysql-client-installation
Disables installation of the mysql client which might be problematic if you are building
image in controlled environment. Only valid for production image.
--disable-mssql-client-installation
Disables installation of the mssql client which might be problematic if you are building
image in controlled environment. Only valid for production image.
--constraints-location
Url to the constraints file. In case of the production image it can also be a path to the
constraint file placed in 'docker-context-files' folder, in which case it has to be
in the form of '/docker-context-files/<NAME_OF_THE_FILE>'
--disable-pip-cache
Disables GitHub PIP cache during the build. Useful if GitHub is not reachable during build.
--install-from-docker-context-files
This flag is used during image building. If it is used additionally to installing
Airflow from PyPI, the packages are installed from the .whl and .tar.gz packages placed
in the 'docker-context-files' folder. The same flag can be used during entering the image in
the CI image - in this case also the .whl and .tar.gz files will be installed automatically
-C, --force-clean-images
Force build images with cache disabled. This will remove the pulled or build images
and start building images from scratch. This might take a long time.
-r, --skip-rebuild-check
Skips checking image for rebuilds. It will use whatever image is available locally/pulled.
-L, --build-cache-local
Uses local cache to build images. No pulled images will be used, but results of local
builds in the Docker cache are used instead. This will take longer than when the pulled
cache is used for the first time, but subsequent '--build-cache-local' builds will be
faster as they will use mostly the locally build cache.
This is default strategy used by the Production image builds.
-U, --build-cache-pulled
Uses images pulled from GitHub Container Registry to build images.
Those builds are usually faster than when ''--build-cache-local'' with the exception if
the registry images are not yet updated. The images are updated after successful merges
to main.
This is default strategy used by the CI image builds.
-X, --build-cache-disabled
Disables cache during docker builds. This is useful if you want to make sure you want to
rebuild everything from scratch.
This strategy is used by default for both Production and CI images for the scheduled
(nightly) builds in CI.
-g, --github-repository GITHUB_REPOSITORY
GitHub repository used to pull, push images.
Default: apache/airflow.
-v, --verbose
Show verbose information about executed docker, kind, kubectl, helm commands. Useful for
debugging - when you run breeze with --verbose flags you will be able to see the commands
executed under the hood and copy&paste them to your terminal to debug them more easily.
Note that you can further increase verbosity and see all the commands executed by breeze
by running 'export VERBOSE_COMMANDS="true"' before running breeze.
--dry-run-docker
Only show docker commands to execute instead of actually executing them. The docker
commands are printed in yellow color.
```
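For reference, a minimal sketch of how the new `build-ci-image` command could expose a subset of these flags with click. All names, choices and defaults here are assumptions that mirror the help output above, not the final Breeze implementation:

```python
import click


@click.command(name="build-ci-image")
@click.option("--python", "python_version", type=click.Choice(["3.6", "3.7", "3.8", "3.9"]))
@click.option("--build-cache-local", "cache", flag_value="local")
@click.option("--build-cache-pulled", "cache", flag_value="pulled", default=True)
@click.option("--build-cache-disabled", "cache", flag_value="disabled")
@click.option("--upgrade-to-newer-dependencies", is_flag=True)
def build_ci_image(python_version, cache, upgrade_to_newer_dependencies):
    """Build the Airflow CI image without entering the container."""
    click.echo(
        f"Would build CI image: python={python_version}, cache={cache}, "
        f"upgrade={upgrade_to_newer_dependencies}"
    )
```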
| https://github.com/apache/airflow/issues/19970 | https://github.com/apache/airflow/pull/20338 | 919ff4567d86a09fb069dcfd84885b496229eea9 | 95740a87083c703968ce3da45b15113851ef09f7 | "2021-12-02T13:31:21Z" | python | "2022-01-05T17:38:05Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,969 | [".github/workflows/build-images.yml", ".github/workflows/ci.yml", "dev/breeze/setup.cfg", "dev/breeze/src/airflow_ci/__init__.py", "dev/breeze/src/airflow_ci/freespace.py", "scripts/ci/tools/free_space.sh"] | CI: Rewrite `free_space` script in Python | In our build/ci.yml (maybe other?) files we are using a "free_space" script that performs cleanup of the machine before running the tasks. This allows us to reclaim memory and disk space for our tasks.
Example: https://github.com/apache/airflow/blob/eaa8ac72fc901de163b912a94dbe675045d2a009/.github/workflows/ci.yml#L334
This should be written in Python and our ci.yml and build.yml should be updated. We should also be able to remove the free_space script from the repo.
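A minimal sketch of what the Python replacement could look like (the exact cleanup commands are assumptions based on what the bash script does today, not the final implementation):

```python
"""Sketch of a Python freespace helper for CI."""
import shutil
import subprocess


def run(*cmd: str) -> None:
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=False)


def free_space() -> None:
    run("sudo", "swapoff", "-a")
    run("sudo", "rm", "-f", "/swapfile")
    run("sudo", "apt-get", "clean")
    run("docker", "system", "prune", "--all", "--force", "--volumes")
    print("Free disk space:", shutil.disk_usage("/"))


if __name__ == "__main__":
    free_space()
```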
| https://github.com/apache/airflow/issues/19969 | https://github.com/apache/airflow/pull/20200 | 7222f68d374787f95acc7110a1165bd21e7722a1 | 27fcd7e0be42dec8d5a68fb591239c4dbb0092f5 | "2021-12-02T13:25:46Z" | python | "2022-01-04T17:16:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,966 | [".pre-commit-config.yaml", "dev/breeze/doc/BREEZE.md"] | Breeze: Make Breeze works for Windows seemleasly | The goal to achieve:
We would like to have `./Breeze2.exe` in the main directory of Airflow which should do the same as `python Breeze2` now - firing up the Breeze command line and managing the virtualenv for it.
We need to have:
* colors in terminal
* possibility to use commands and flags
* [stretch goal] autocomplete in Windows (we might separate it out to a separate task later)
| https://github.com/apache/airflow/issues/19966 | https://github.com/apache/airflow/pull/20148 | 97261c642cbf07db91d252cf6b0b7ff184cd64c6 | 9db894a88e04a71712727ef36250a29b2e34f4fe | "2021-12-02T13:15:37Z" | python | "2022-01-03T16:44:38Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,957 | ["airflow/models/dagrun.py"] | Airflow crashes with a psycopg2.errors.DeadlockDetected exception | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
Ubuntu 21.04 on a VM
### Versions of Apache Airflow Providers
root@AI-Research:~/learning_sets/airflow# pip freeze | grep apache-airflow-providers
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1
### Deployment
Other
### Deployment details
Airflow is at version 2.2.2
psql (PostgreSQL) 13.5 (Ubuntu 13.5-0ubuntu0.21.04.1)
The DAG contains thousands of tasks for data download, preprocessing, and preparation, whose results go to a MongoDB database (so I'm not using the PostgreSQL database inside my tasks).
### What happened
[2021-12-01 19:41:57,556] {scheduler_job.py:644} ERROR - Exception when executing SchedulerJob._run_scheduler_loop
Traceback (most recent call last):
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
self.dialect.do_execute(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.DeadlockDetected: deadlock detected
DETAIL: Process 322086 waits for ShareLock on transaction 2391367; blocked by process 340345.
Process 340345 waits for AccessExclusiveLock on tuple (0,26) of relation 19255 of database 19096; blocked by process 340300.
Process 340300 waits for ShareLock on transaction 2391361; blocked by process 322086.
HINT: See server log for query details.
CONTEXT: while updating tuple (1335,10) in relation "task_instance"
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line 628, in _execute
self._run_scheduler_loop()
File "/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line 709, in _run_scheduler_loop
num_queued_tis = self._do_scheduling(session)
File "/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line 792, in _do_scheduling
callback_to_run = self._schedule_dag_run(dag_run, session)
File "/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line 1049, in _schedule_dag_run
dag_run.schedule_tis(schedulable_tis, session)
File "/usr/local/lib/python3.9/dist-packages/airflow/utils/session.py", line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/airflow/models/dagrun.py", line 898, in schedule_tis
session.query(TI)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/query.py", line 4063, in update
update_op.exec_()
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line 1697, in exec_
self._do_exec()
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line 1895, in _do_exec
self._execute_stmt(update_stmt)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line 1702, in _execute_stmt
self.result = self.query._execute_crud(stmt, self.mapper)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/query.py", line 3568, in _execute_crud
return conn.execute(stmt, self._params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py", line 1011, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py", line 1124, in _execute_clauseelement
ret = self._execute_context(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py", line 1316, in _execute_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py", line 1510, in _handle_dbapi_exception
util.raise_(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py", line 1276, in _execute_context
self.dialect.do_execute(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (psycopg2.errors.DeadlockDetected) deadlock detected
DETAIL: Process 322086 waits for ShareLock on transaction 2391367; blocked by process 340345.
Process 340345 waits for AccessExclusiveLock on tuple (0,26) of relation 19255 of database 19096; blocked by process 340300.
Process 340300 waits for ShareLock on transaction 2391361; blocked by process 322086.
HINT: See server log for query details.
CONTEXT: while updating tuple (1335,10) in relation "task_instance"
[SQL: UPDATE task_instance SET state=%(state)s WHERE task_instance.dag_id = %(dag_id_1)s AND task_instance.run_id = %(run_id_1)s AND task_instance.task_id IN (%(task_id_1)s, %(task_id_2)s, %(task_id_3)s, %(task_id_4)s, %(task_id_5)s, %(task_id_6)s, %(task_id_7)s, %(task_id_8)s, %(task_id_9)s, %(task_id_10)s, %(task_id_11)s, %(task_id_12)s, %(task_id_13)s, %(task_id_14)s, %(task_id_15)s, %(task_id_16)s, %(task_id_17)s, %(task_id_18)s, %(task_id_19)s, %(task_id_20)s)]
[parameters: {'state': <TaskInstanceState.SCHEDULED: 'scheduled'>, 'dag_id_1': 'download_and_preprocess_sets', 'run_id_1': 'manual__2021-12-01T17:31:23.684597+00:00', 'task_id_1': 'download_1379', 'task_id_2': 'download_1438', 'task_id_3': 'download_1363', 'task_id_4': 'download_1368', 'task_id_5': 'download_138', 'task_id_6': 'download_1432', 'task_id_7': 'download_1435', 'task_id_8': 'download_1437', 'task_id_9': 'download_1439', 'task_id_10': 'download_1457', 'task_id_11': 'download_168', 'task_id_12': 'download_203', 'task_id_13': 'download_782', 'task_id_14': 'download_1430', 'task_id_15': 'download_1431', 'task_id_16': 'download_1436', 'task_id_17': 'download_167', 'task_id_18': 'download_174', 'task_id_19': 'download_205', 'task_id_20': 'download_1434'}]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
[2021-12-01 19:41:57,566] {local_executor.py:388} INFO - Shutting down LocalExecutor; waiting for running tasks to finish. Signal again if you don't want to wait.
[2021-12-01 19:42:18,013] {process_utils.py:100} INFO - Sending Signals.SIGTERM to GPID 285470
[2021-12-01 19:42:18,105] {process_utils.py:66} INFO - Process psutil.Process(pid=285470, status='terminated', exitcode=0, started='18:56:21') (285470) terminated with exit code 0
[2021-12-01 19:42:18,106] {scheduler_job.py:655} INFO - Exited execute loop
### What you expected to happen
Maybe 24 concurrent processes/tasks are too many?
### How to reproduce
reproducibility is challenging, but maybe the exception provides enough info for a fix
### Anything else
It occurs every time, after the DAG has been running for a while.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19957 | https://github.com/apache/airflow/pull/20894 | 14a057ff921d7e6ceb70326f5fecac29d3a093ad | 0e4a057cd8f704a51376d9694c0114eb2ced64ef | "2021-12-02T05:44:38Z" | python | "2022-01-19T10:12:53Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,955 | ["airflow/www/views.py"] | Add hour and minute to time format on x-axis in Landing Times | ### Description
It would be great if the date and time format of the x-axis in Landing Times were `%d %b %Y, %H:%M` instead of `%d %b %Y`, so the `execution_date` of task instances is visible.
### Use case/motivation
For the x-axis of every line chart using `nvd3.lineChart`, the date and time format shows only the date, without the time, because the format defaults to `%d %b %Y` when `x_is_date` is `True`. From what I have seen in version 1.7, the date and time format was `%d %b %Y, %H:%M`. It changed when Highcharts was dropped in version 1.8. It is simple to fix by injecting `x_axis_format="%d %b %Y, %H:%M"` wherever an `nvd3.lineChart` is initialized (see the sketch below).
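A minimal sketch of the change; the kwargs besides `x_axis_format` are only illustrative of how these charts are typically created, not a copy of `airflow/www/views.py`:

```python
import nvd3

chart = nvd3.lineChart(
    name="lineChart",
    x_is_date=True,
    x_axis_format="%d %b %Y, %H:%M",  # the proposed format, with hour and minute
    height=600,
)
```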
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19955 | https://github.com/apache/airflow/pull/20002 | 2c80aaab4f486688fa4b8e252e1147a5dfabee54 | 6a77e849be3a505bf2636e8224862d77a4719621 | "2021-12-02T02:32:25Z" | python | "2021-12-16T15:32:37Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,945 | ["airflow/migrations/versions/54bebd308c5f_add_trigger_table_and_task_info.py", "airflow/migrations/versions/7b2661a43ba3_taskinstance_keyed_to_dagrun.py", "airflow/migrations/versions/e9304a3141f0_make_xcom_pkey_columns_non_nullable.py"] | Running alembic migration downgrade on database causes error | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
PRETTY_NAME="Debian GNU/Linux 10 (buster)" NAME="Debian GNU/Linux" VERSION_ID="10" VERSION="10 (buster)" VERSION_CODENAME=buster ID=debian HOME_URL="https://www.debian.org/" SUPPORT_URL="https://www.debian.org/support" BUG_REPORT_URL="https://bugs.debian.org/"
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.4.0
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-cncf-kubernetes==2.1.0
apache-airflow-providers-datadog==2.0.1
apache-airflow-providers-docker==2.3.0
apache-airflow-providers-elasticsearch==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==6.1.0
apache-airflow-providers-grpc==2.0.1
apache-airflow-providers-hashicorp==2.1.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-microsoft-azure==3.3.0
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-odbc==2.0.1
apache-airflow-providers-postgres==2.3.0
apache-airflow-providers-redis==2.0.1
apache-airflow-providers-sendgrid==2.0.1
apache-airflow-providers-sftp==2.2.0
apache-airflow-providers-slack==4.1.0
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.3.0
### Deployment
Docker-Compose
### Deployment details
_No response_
### What happened
My current database alembic_version is set at e9304a3141f0
While running the command `alembic -c alembic.ini downgrade a13f7613ad25` I received the following error:
INFO [alembic.runtime.migration] Running downgrade e9304a3141f0 -> 83f031fd9f1c, make xcom pkey columns non-nullable
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.InvalidTableDefinition: column "key" is in a primary key
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/airflow/.local/bin/alembic", line 8, in <module>
sys.exit(main())
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/config.py", line 588, in main
CommandLine(prog=prog).main(argv=argv)
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/config.py", line 582, in main
self.run_cmd(cfg, options)
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/config.py", line 562, in run_cmd
**dict((k, getattr(options, k, None)) for k in kwarg)
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/command.py", line 366, in downgrade
script.run_env()
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/script/base.py", line 563, in run_env
util.load_python_file(self.dir, "env.py")
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/util/pyfiles.py", line 92, in load_python_file
module = load_module_py(module_id, path)
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/util/pyfiles.py", line 108, in load_module_py
spec.loader.exec_module(module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "migrations/env.py", line 107, in <module>
run_migrations_online()
File "migrations/env.py", line 101, in run_migrations_online
context.run_migrations()
File "<string>", line 8, in run_migrations
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/runtime/environment.py", line 851, in run_migrations
self.get_context().run_migrations(**kw)
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/runtime/migration.py", line 620, in run_migrations
step.migration_fn(**kw)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/migrations/versions/e9304a3141f0_make_xcom_pkey_columns_non_nullable.py", line 76, in downgrade
bop.alter_column("execution_date", type_=_get_timestamp(conn), nullable=True)
File "/usr/local/lib/python3.7/contextlib.py", line 119, in __exit__
next(self.gen)
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/operations/base.py", line 374, in batch_alter_table
impl.flush()
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/operations/batch.py", line 107, in flush
fn(*arg, **kw)
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/ddl/postgresql.py", line 185, in alter_column
**kw
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/ddl/impl.py", line 240, in alter_column
existing_comment=existing_comment,
File "/home/airflow/.local/lib/python3.7/site-packages/alembic/ddl/impl.py", line 197, in _exec
return conn.execute(construct, multiparams)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
return meth(self, multiparams, params)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/sql/ddl.py", line 72, in _execute_on_connection
return connection._execute_ddl(self, multiparams, params)
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1073, in _execute_ddl
compiled,
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
e, statement, parameters, cursor, context
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
sqlalchemy_exception, with_traceback=exc_info[2], from_=e
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.InvalidTableDefinition) column "key" is in a primary key
[SQL: ALTER TABLE xcom ALTER COLUMN key DROP NOT NULL]
It seems the DOWN migration step for revision `e9304a3141f0 - Make XCom primary key columns non-nullable` is not backwards compatible as it is attempting to make a Primary Key nullable, which is not possible.
I suspect the revision of `bbf4a7ad0465 - Remove id column from xcom` has something to do with this - id perhaps being the old primary key for the table.
Revision list: https://airflow.apache.org/docs/apache-airflow/stable/migrations-ref.html
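One possible shape for a fixed downgrade (the constraint name and column types below are assumptions, and this may not be how it ends up being fixed): drop the composite primary key before relaxing NOT NULL, since a primary-key column cannot be made nullable directly.

```python
import sqlalchemy as sa
from alembic import op


def downgrade():
    with op.batch_alter_table("xcom") as bop:
        # "xcom_pkey" is the assumed Postgres default PK name; adjust per backend.
        bop.drop_constraint("xcom_pkey", type_="primary")
        bop.alter_column("key", existing_type=sa.String(512), nullable=True)
        bop.alter_column("execution_date", existing_type=sa.TIMESTAMP(timezone=True), nullable=True)
```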
### What you expected to happen
I expected the alembic version to downgrade to the appropriate version
### How to reproduce
On airflow image 2.2.2 attempt to run `alembic -c alembic.ini downgrade {any version lower than revision id e9304a3141f0}`
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19945 | https://github.com/apache/airflow/pull/19994 | 0afed43a8afde093277be2862138cb32fba8ed29 | 14bff0e044a6710bc1d8f7b4dc2c9a40754de4a7 | "2021-12-01T20:32:12Z" | python | "2021-12-03T20:36:34Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,922 | ["airflow/operators/python.py", "tests/operators/test_python.py"] | ShortCircuitOperator does not save to xcom | ### Description
`ShortCircuitOperator` does not return any value regardless of the result.
Still, if `condition` evaluates to truthy, it can be useful to store/save the result of the condition to XCom so that downstream tasks can use it.
Many objects evaluate to True/False, not just booleans - see https://docs.python.org/3/library/stdtypes.html
### Use case/motivation
The change is trivial and would allow combining a Python task and a ShortCircuit into one (a sketch follows the list below):
1. if callable returns None (or False) -> skip
2. if callable returns non-empty object (or True) -> continue. The proposed change is to pass on the condition to XCom.
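In the meantime, a sketch of the workaround I use while the operator itself does not return the value (the callable's body here is a stand-in for real logic):

```python
from airflow.operators.python import ShortCircuitOperator


def check_and_publish(**context):
    result = {"rows": 42}  # stand-in for the real computation
    context["ti"].xcom_push(key="condition", value=result)
    return bool(result)  # falsey -> downstream tasks are skipped as before


gatekeeper = ShortCircuitOperator(
    task_id="gatekeeper",
    python_callable=check_and_publish,
)
```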
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19922 | https://github.com/apache/airflow/pull/20071 | 993ed933e95970d14e0b0b5659ad28f15a0e5fde | 4f964501e5a6d5685c9fa78a6272671a79b36dd1 | "2021-12-01T05:03:27Z" | python | "2021-12-11T16:27:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,917 | ["airflow/models/dag.py", "airflow/models/dagrun.py", "tests/models/test_dag.py", "tests/models/test_dagrun.py", "tests/models/test_taskinstance.py"] | Task-level end_date stops entire DAG | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
Ubuntu 19.10
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What happened
If a task has an `end_date`, the whole DAG stops at that time instead of only that task.
### What you expected to happen
The task with the `end_date` should stop at that time and the rest should continue.
### How to reproduce
```
from datetime import datetime, timedelta

from airflow.models import DAG
from airflow.operators.dummy import DummyOperator

default_args = {
    "owner": "me",
    "email": [""],
    "start_date": datetime(2021, 11, 20),
}

dag = DAG(
    "end_date_test",
    default_args=default_args,
    schedule_interval="@daily",
)

task1 = DummyOperator(
    task_id="no_end_date",
    dag=dag
)

task2 = DummyOperator(
    task_id="special_start_end",
    start_date=datetime(2021, 11, 22),
    end_date=datetime(2021, 11, 24),
    dag=dag,
)
```
![image](https://user-images.githubusercontent.com/7014837/144162498-a2bce24f-64da-4949-a9ac-c3a9a1edc735.png)
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19917 | https://github.com/apache/airflow/pull/20920 | f612a2f56add4751e959625c49368d09a2a47d55 | 85871eba420f3324432f55f74fe57005ff47a21c | "2021-12-01T02:44:43Z" | python | "2022-03-27T19:11:38Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,905 | ["airflow/utils/trigger_rule.py", "airflow/utils/weight_rule.py", "tests/utils/test_trigger_rule.py", "tests/utils/test_weight_rule.py"] | Refactor `TriggerRule` & `WeightRule` classes to inherit from Enum | ### Body
Both `TriggerRule` & `WeightRule` classes should inherit from Enum
https://github.com/apache/airflow/blob/3172be041e3de95f546123e3b66f06c6adcf32d5/airflow/utils/weight_rule.py#L22
https://github.com/apache/airflow/blob/8fde9a4b5b1360f733d86a04c2b0b747f0bfbf01/airflow/utils/trigger_rule.py#L22
An initial attempt to handle this was in https://github.com/apache/airflow/pull/5302
This is something that was also raised during review in https://github.com/apache/airflow/pull/18627#discussion_r719304267
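A minimal sketch of what the Enum-based class could look like (member list abridged; the exact members should mirror the current constants, and the `str` mixin keeps existing string comparisons working):

```python
from enum import Enum


class TriggerRule(str, Enum):
    """Sketch of an Enum-based TriggerRule."""

    ALL_SUCCESS = "all_success"
    ALL_FAILED = "all_failed"
    ALL_DONE = "all_done"
    ONE_SUCCESS = "one_success"
    ONE_FAILED = "one_failed"
    NONE_FAILED = "none_failed"
    DUMMY = "dummy"

    @classmethod
    def is_valid(cls, trigger_rule: str) -> bool:
        return trigger_rule in cls.all_triggers()

    @classmethod
    def all_triggers(cls) -> set:
        return {rule.value for rule in cls}
```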
### Committer
- [X] I acknowledge that I am a maintainer/committer of the Apache Airflow project. | https://github.com/apache/airflow/issues/19905 | https://github.com/apache/airflow/pull/21264 | d8ae7df08168fd3ab92ed0d917f9b5dd34d1354d | 9ad4de835cbbd296b9dbd1ff0ea88c1cd0050263 | "2021-11-30T20:39:00Z" | python | "2022-02-20T11:06:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,903 | ["airflow/decorators/task_group.py", "tests/utils/test_task_group.py"] | Error setting dependencies on task_group defined using the decorator | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
MacOS 11.6.1
### Versions of Apache Airflow Providers
$ pip freeze | grep airflow
apache-airflow==2.2.2
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1
### Deployment
Other
### Deployment details
`airflow standalone`
### What happened
```
AttributeError: 'NoneType' object has no attribute 'update_relative'
```
### What you expected to happen
Task group should be set as downstream of `start` task, and upstream of `end` task
### How to reproduce
* Add the following code to dags folder
```python
from datetime import datetime

from airflow.decorators import dag, task, task_group


@dag(start_date=datetime(2023, 1, 1), schedule_interval="@once")
def test_dag_1():

    @task
    def start():
        pass

    @task
    def do_thing(x):
        print(x)

    @task_group
    def do_all_things():
        do_thing(1)
        do_thing(2)

    @task
    def end():
        pass

    start() >> do_all_things() >> end()


test_dag_1()
```
* Run `airflow standalone`
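As an aside, a possible workaround for the DAG above, which I have not verified on 2.2.2: have the group function return something chainable, since the decorated call otherwise returns None, which is what `>>` then chokes on.

```python
# Tweak to the DAG above (do_thing is the task defined there); unverified sketch.
@task_group
def do_all_things():
    return [do_thing(1), do_thing(2)]
```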
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19903 | https://github.com/apache/airflow/pull/20671 | f2039b4c9e15b514661d4facbd710791fe0a2ef4 | 384fa4a87dfaa79a89ad8e18ac1980e07badec4b | "2021-11-30T19:55:25Z" | python | "2022-01-08T04:09:03Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,902 | ["airflow/decorators/base.py", "tests/utils/test_task_group.py"] | DuplicateTaskIdFound when reusing tasks with task_group decorator | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
MacOS 11.6.1
### Versions of Apache Airflow Providers
$ pip freeze | grep airflow
apache-airflow==2.2.2
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1
### Deployment
Other
### Deployment details
`airflow standalone`
### What happened
Exception raised :
```
raise DuplicateTaskIdFound(f"Task id '{key}' has already been added to the DAG")
airflow.exceptions.DuplicateTaskIdFound: Task id 'do_all_things.do_thing__1' has already been added to the DAG
```
### What you expected to happen
_No response_
### How to reproduce
* Add the following python file to the `dags` folder:
```python
from datetime import datetime

from airflow.decorators import dag, task, task_group


@dag(start_date=datetime(2023, 1, 1), schedule_interval="@once")
def test_dag_1():

    @task
    def start():
        pass

    @task
    def do_thing(x):
        print(x)

    @task_group
    def do_all_things():
        for i in range(5):
            do_thing(i)

    @task
    def end():
        pass

    start() >> do_all_things() >> end()


test_dag_1()
```
* Start airflow by running `airflow standalone`
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19902 | https://github.com/apache/airflow/pull/20870 | 077bacd9e7cc066365d3f201be2a5d9b108350fb | f881e1887c9126408098919ecad61f94e7a9661c | "2021-11-30T19:44:46Z" | python | "2022-01-18T09:37:35Z" |