BigQuery task decorated functions failing in Airflow 2.9.1 #39541
Comments
Could this be something to do with updates to some of the libraries between 2.8.4 and 2.9.1? Although I'm confused why this would affect @task decorated functions.
You are using BQHook, so it does not matter if the task is decorated. Can you downgrade the libraries and check if it fixes the issue? @VladaZakharova - you might want to take a look at this one.
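The point that decoration is irrelevant here can be illustrated with a toy sketch (plain Python, not Airflow's actual @task machinery, and a fake hook standing in for BigQueryHook):

```python
# Toy analogy: a decorator only wraps the callable -- the hook call inside
# the function body executes unchanged, so decorating a function cannot
# change how the underlying client library behaves.
import functools

calls = []

class FakeBigQueryHook:
    """Stand-in for BigQueryHook; records the job configuration it receives."""
    def insert_job(self, configuration):
        calls.append(configuration)
        return "job-id"

def toy_task(fn):
    """Minimal stand-in for @task: wraps fn without altering its body."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    return wrapper

def plain_function():
    return FakeBigQueryHook().insert_job({"query": {"query": "SELECT 1"}})

@toy_task
def decorated_function():
    return FakeBigQueryHook().insert_job({"query": {"query": "SELECT 1"}})

# Both paths hit the hook identically.
assert plain_function() == decorated_function() == "job-id"
assert calls[0] == calls[1]
```

Whether the hook is called from a decorated function or from inside an operator, the same provider code (and the same google-cloud-bigquery library) runs underneath.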
@potiuk Yes, I guess my confusion is more about why the hook is giving the error in this context rather than the operator, especially as it doesn't seem to be caused by the provider. I downgraded the libraries but the issue persists.
Looks like this is related to
@pankajastro Interesting. Downgrading does work, but I'm curious to know what the issue is and why the hook method gives the error while the operator does not.
It is because of the way we pass
The error is coming from airflow/airflow/providers/google/cloud/hooks/bigquery.py, lines 1674 to 1676, at commit 79042cf.
I have drafted a PR to pin the lib version until we fix it: #39583
I see. Thanks for the info.
Closing this task, as we have pinned the version and we have a TODO in the code that references this issue.
I read the thread, but this also happens with the official Google Cloud BigQuery operators that use BQHook under the hood. Thanks for the info!
Could you also update the constraint file?
Constraints are updated only when they prevent things from being built. We never update constraints with new providers - they are generally frozen in time. The new provider will be in the constraints of 2.9.2. Nothing prevents you from installing the new provider without constraints - this is described in the "upgrade scenario" documentation (see the installation documentation). You should follow this rather than stick to constraints in this case.
Oh, I thought constraints were the recommended way to always install Airflow. I forced the installation of the google-cloud-bigquery package at 3.20.1 without constraints, as you mentioned.
- Unpin versions of dbt-core and dbt-bigquery packages - Pin version of google-cloud-bigquery package until backward compatibility issue is fixed: apache/airflow#39541 Change-Id: I5699fc652d96cb92a934bc5a1271ae42b0c1da8f GitOrigin-RevId: cd260c9249d6d793693eb5b9e64f9bcb5c28c02b
Pin version of google-cloud-bigquery package until backward compatibility issue is fixed: apache/airflow#39541 Change-Id: I57bdeaf9713b9ad3015eb884a3f69ec6893e0151 GitOrigin-RevId: 9bf56dc740ba99ce97a9d121df328f67814c8449
Apache Airflow version
2.9.1
If "Other Airflow 2 version" selected, which one?
No response
What happened?
After upgrading to Airflow 2.9.1, @task decorated functions that use BigQuery hooks are no longer submitting jobs successfully and instead return an error such as the following:
I have replicated this issue against Airflow 2.9.1 and against main, but it does not seem to be related to the Google provider, because using either 10.17 or 10.16 results in this error. Using either Google provider with Airflow 2.8.4 does not cause this error.
What you think should happen instead?
No response
How to reproduce
Here's a simple DAG that will replicate the issue using Breeze. The bq_hook_test task will fail, but the bq_insert_job_test task, based on BigQueryInsertJobOperator with the same configuration, will succeed. We are using Google default credentials for authentication with the following environment variables:
Operating System
n/a
Versions of Apache Airflow Providers
apache-airflow-providers-google==10.17.0
Deployment
Astronomer
Deployment details
No response
Anything else?
Full log exception.
Are you willing to submit PR?
Code of Conduct