I understand that you have enabled auto import and auto export (based on NEW, CHANGED, and DELETED events) for an Amazon FSx for Lustre Persistent 2 file system with an S3 bucket as a data repository, and that you sometimes see the error "Failed to export file because it is being used by another process." in your CloudWatch Logs.
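For context, this kind of setup is typically created as a data repository association. A minimal sketch using boto3 is below; the file system ID, path, and bucket name are hypothetical placeholders, not values from your environment.

```python
import boto3

fsx = boto3.client("fsx", region_name="us-east-1")  # assumed region

# Link an S3 prefix to a directory on the Persistent 2 file system,
# with auto import and auto export on NEW, CHANGED, and DELETED events.
fsx.create_data_repository_association(
    FileSystemId="fs-0123456789abcdef0",        # hypothetical file system ID
    FileSystemPath="/exported",                 # directory on the Lustre file system
    DataRepositoryPath="s3://example-bucket/",  # hypothetical linked S3 prefix
    S3={
        "AutoImportPolicy": {"Events": ["NEW", "CHANGED", "DELETED"]},
        "AutoExportPolicy": {"Events": ["NEW", "CHANGED", "DELETED"]},
    },
)
```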
This error occurs when Amazon FSx is unable to export a file because it is being modified by another client on the file system. You can retry the DataRepositoryTask after your workflow has finished writing to the file (a minimal retry sketch follows the list below). The behavior can also depend on factors such as:
* How are you accessing the file?
* What S3 storage class (e.g. S3 Standard, S3 Standard-IA, S3 Glacier) is the underlying S3 object in?
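Here is a minimal sketch of retrying the export via an export data repository task once your workflow has stopped writing to the file. It uses boto3; the file system ID and path are hypothetical, and the Report block is required by the API even when disabled.

```python
import time
import boto3

fsx = boto3.client("fsx", region_name="us-east-1")  # assumed region

# Re-export the file that failed auto export, now that writers have closed it.
task = fsx.create_data_repository_task(
    Type="EXPORT_TO_REPOSITORY",
    FileSystemId="fs-0123456789abcdef0",        # hypothetical file system ID
    Paths=["path/to/file-that-failed-export"],  # hypothetical path, relative to the mount
    Report={"Enabled": False},                  # or enable a report written to S3
)["DataRepositoryTask"]

# Poll until the task reaches a terminal lifecycle state.
while task["Lifecycle"] in ("PENDING", "EXECUTING", "CANCELING"):
    time.sleep(15)
    task = fsx.describe_data_repository_tasks(
        TaskIds=[task["TaskId"]]
    )["DataRepositoryTasks"][0]

print(task["Lifecycle"])  # SUCCEEDED, FAILED, or CANCELED
```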
To dig deeper, I request that you open a support case with AWS Premium Support, as the following details are needed to understand the cause of the issue:
- Application name and the command used to access the file (e.g. cat)
- The output of "ls -lrt" in the directory where this file is present (this will give us more information to investigate)
- FSx file system ID and region