Fine, I'll fix it myself -- Thanos
Make sure the --constraint flag is on the first line of your requirements.txt. MWAA scans for a --constraint flag the moment it opens requirements.txt; if it cannot find one on the first line, it attaches its own. If you put yours on the second or third line, MWAA loads your constraint later, so you end up with two constraints:
pip3 install -r requirements.txt -c AIRFLOW_DEFAULT_CONSTRAINTS -c CUSTOM_CONSTRAINTS
and this produces errors like:
[INFO] - The conflict is caused by:
[INFO] - The user requested greenlet==3.2.4
[INFO] - The user requested (constraint) greenlet==3.1.1, ==3.2.4
In the above example, I requested 3.2.4, but my --constraint flag was on the third line, so MWAA also loaded its own constraint (3.1.1).
If your --constraint is on the first line, MWAA reads only your CUSTOM_CONSTRAINTS:
pip3 install -r requirements.txt -c CUSTOM_CONSTRAINTS
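For example, a requirements.txt with the constraint correctly placed on the first line looks like this (the path and package pins are illustrative):

```text
--constraint "/usr/local/airflow/dags/constraints-3.11-updated.txt"
apache-airflow-providers-snowflake==6.4.0
greenlet==3.2.4
```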
Next, make sure that you download the WHLs with constraints. I created a local Docker container (image: amazonlinux:2023); when building the image, make sure to add --platform "linux/amd64" so the wheels match MWAA's x86_64 architecture.
Then run
pip3 download -r requirements.txt -d TARGET_DIRECTORY -c constraints.txt
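A minimal sketch of that container setup; the image tag, Python version, and directory names here are assumptions, so adjust them to your environment:

```shell
# Build an amazonlinux:2023 image on MWAA's x86_64 platform
docker build --platform linux/amd64 -t mwaa-whl-builder - <<'EOF'
FROM amazonlinux:2023
RUN dnf install -y python3.11 python3.11-pip && dnf clean all
EOF

# Download the wheels with the same constraints file you will ship to MWAA
docker run --rm --platform linux/amd64 \
  -v "$PWD":/work -w /work mwaa-whl-builder \
  pip3.11 download -r requirements.txt -d wheels/ -c constraints.txt
```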
If you get a conflict, update constraints.txt and re-run the pip download command, fixing each constraint as you encounter it. Downloading the correct version of the main package alone won't suffice: you must also pin the correct versions of its dependencies. Edit every dependency's version in constraints.txt, or you'll get errors like:
[INFO] - The conflict is caused by:
[INFO] - apache-airflow-providers-snowflake 6.4.0 depends on apache-airflow-providers-common-compat>=1.6.0
[INFO] - The user requested (constraint) apache-airflow-providers-common-compat==1.6.0
The above error occurred because the pinned version of one of the snowflake provider's dependencies was incorrect.
In an Amazon MWAA private-webserver environment, the webserver has no outbound internet access. This means it cannot download Python packages or constraint files directly from public repositories such as PyPI or GitHub during an environment update.
As explained in the Amazon MWAA best practices for managing Python dependencies [1] blog post, for private environments you must package your dependencies into a .zip file, upload that file to Amazon S3, and specify its location in your MWAA environment configuration. Because the private webserver cannot access PyPI through the internet, MWAA installs the dependencies directly from that .zip file during the update or deployment process.
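Under that pattern, the downloaded wheels are zipped into plugins.zip, uploaded to S3, and requirements.txt points pip at the local copies instead of PyPI. A sketch, assuming the wheels end up extracted under /usr/local/airflow/plugins:

```text
--find-links /usr/local/airflow/plugins
--no-index
apache-airflow-providers-snowflake==6.4.0
```

The --no-index flag stops pip from reaching out to PyPI at all, which is exactly what a private webserver requires.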
Additionally, use the aws-mwaa-local-runner utility to test your requirements.txt and dependencies locally before deploying them to an Amazon MWAA environment, so you can verify package compatibility ahead of time.
- https://github.com/aws/aws-mwaa-local-runner
- https://github.com/aws/amazon-mwaa-docker-images
If you continue to face the same issue after following these recommendations, please raise a support case with AWS and attach your requirements.txt file so we can assist you further with troubleshooting.
For Apache Airflow v2.7.2 and later (including your v2.10.3), you need to properly modify the constraints file to use a newer provider version. Here's the correct approach for your private MWAA environment:
- Download the original constraints file for your Airflow version (2.10.3) and Python version (likely 3.11) from GitHub.
- Modify this constraints file to update the specific provider package version you want to change (apache-airflow-providers-snowflake==6.4.0).
- Save the modified constraints file to your Amazon S3 dags folder (for example as "constraints-3.11-updated.txt").
- In your requirements.txt file, reference this local constraints file at the top:

--constraint "/usr/local/airflow/dags/constraints-3.11-updated.txt"
apache-airflow-providers-snowflake==6.4.0
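The download-and-modify steps can be sketched as follows. The two-line constraints file here is a stand-in for the real one, which you would fetch from the apache/airflow GitHub repository (constraints-2.10.3 branch) and then upload to S3 with the AWS CLI:

```shell
# Stand-in for the real constraints file (normally downloaded from
# https://raw.githubusercontent.com/apache/airflow/constraints-2.10.3/constraints-3.11.txt)
printf '%s\n' \
  'apache-airflow-providers-snowflake==5.8.0' \
  'apache-airflow-providers-common-compat==1.2.0' \
  > constraints-3.11-updated.txt

# Bump the provider pin you want, and its dependencies' pins too
sed -i 's/^apache-airflow-providers-snowflake==.*/apache-airflow-providers-snowflake==6.4.0/' constraints-3.11-updated.txt
sed -i 's/^apache-airflow-providers-common-compat==.*/apache-airflow-providers-common-compat==1.6.0/' constraints-3.11-updated.txt

# Then upload next to your DAGs, e.g.:
#   aws s3 cp constraints-3.11-updated.txt s3://YOUR_BUCKET/dags/
cat constraints-3.11-updated.txt
```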
The key is that your local constraints file must properly resolve all dependency conflicts. If there are any conflicts between your specified package version and other dependencies, the installation will fail. Make sure your modified constraints file maintains compatibility with all other packages.
Since you mentioned you're able to install in a Linux container, verify that your container environment matches the MWAA environment closely enough. Also check the CloudWatch logs for your environment after the failed update - they should contain specific error messages about any package conflicts that are causing the installation to fail.
Sources
Apache Airflow provider packages installed on Amazon MWAA environments - Amazon Managed Workflows for Apache Airflow
Amazon MWAA best practices for managing Python dependencies | AWS Big Data Blog
Managing Python dependencies in requirements.txt - Amazon Managed Workflows for Apache Airflow

Yeah, I did mention the version in the constraint, but MWAA still looks for 5.8.0.