Questions tagged with AWS DataSync
I have a requirement to sync data from EBS to EFS. If I change any particular file in a folder, or add a file to that folder, it should automatically sync to my EFS-mounted folder. So far I have achieved this with the msrsync utility, but the limitation is that I have to set up a cron job that runs every minute. Is there a better way, without a cron job, so that whenever data changes it is automatically synced to EFS?
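The usual cron-free answer on Linux is an inotify-based watcher (e.g. `inotifywait` or lsyncd) that fires the sync on each filesystem event rather than on a timer. As a self-contained illustration of the idea, here is a minimal polling sketch in Python that syncs only when something actually changed — the paths are placeholders, and the `shutil` copy step stands in for whatever msrsync invocation you use:

```python
import os
import shutil
import time

def snapshot(root):
    """Map each file's path (relative to root) to its (mtime, size)."""
    state = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            state[os.path.relpath(path, root)] = (st.st_mtime, st.st_size)
    return state

def sync_changed(src, dst, old_state):
    """Copy files that are new or modified since old_state; return the new snapshot."""
    new_state = snapshot(src)
    for rel, sig in new_state.items():
        if old_state.get(rel) != sig:
            target = os.path.join(dst, rel)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            shutil.copy2(os.path.join(src, rel), target)  # placeholder for msrsync
    return new_state

def watch(src, dst, interval=1.0):
    """Poll forever; the copy step only runs when the snapshot changed."""
    state = {}
    while True:
        state = sync_changed(src, dst, state)
        time.sleep(interval)
```

An event-driven watcher (inotify) reacts faster and avoids the polling cost, but the structure is the same: detect change, then invoke the sync tool once.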
Hi, I have been using this ([Build a Cloud Sync Engine](https://docs.microsoft.com/en-us/windows/win32/cfapi/build-a-cloud-file-sync-engine)) documentation to build a cloud file sync engine. Is there any way to sync data from AWS the way OneDrive does? We already have user data in an S3 bucket, and I'm planning to sync it with this engine: the folders and files in a user's account would be displayed in the cloud sync folder, like this: ![Enter image description here](https://repost.aws/media/postImages/original/IMbEmtJzX9TjCnGTaelYagdA) Kindly suggest a way to integrate AWS with the cloud sync engine. Any suggestions would be helpful! Thanks for your time.
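One building block such an integration needs, sketched here under assumptions: S3 has no real directories, so the sync engine has to turn the flat key listing (e.g. the `Contents` of a `ListObjectsV2` response) into the folder/file tree it registers as cloud-filter placeholders. A minimal stdlib sketch of that mapping — the keys in the usage example are made up:

```python
import posixpath

def keys_to_tree(keys):
    """Group flat S3 object keys into a {folder: [file, ...]} mapping.

    '/' in an S3 key is only a naming convention, so we split on it
    ourselves. Keys ending in '/' are treated as zero-byte directory
    markers and just ensure the folder entry exists.
    """
    tree = {}
    for key in keys:
        folder, name = posixpath.split(key)
        if not name:  # directory marker like "docs/"
            tree.setdefault(folder.rstrip("/"), [])
            continue
        tree.setdefault(folder, []).append(name)
    return tree
```

From a mapping like this, the engine can create placeholder directories and files under the sync root and hydrate file contents on demand with `GetObject`.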
Hi team, we are facing an issue with AWS DataSync: while running a DataSync task on a specific testing agent, the job fails during execution with an error like "cannot allocate memory". Please suggest some articles or other resources that would help resolve this issue. Thanks in advance.
Hello, I am testing a DataSync solution to migrate data from on-premises to S3. After a successful test of the solution that I created manually, I wanted to automate its setup and configuration using Terraform. For that I had to delete the agent from the AWS DataSync console. Now, when I want to activate it again, I get an error that the agent is unreachable (the agent's port 80 is still closed). Could you please help me understand why port 80 is still closed, and what I should do to fix this issue? Thanks in advance.
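During activation, the activation key is fetched over HTTP on port 80 of the agent from the machine performing the activation, so a quick first check is whether that port is reachable from where you (or Terraform) run the activation. A minimal sketch — the agent IP below is a placeholder:

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (placeholder agent IP):
# port_open("192.0.2.10", 80)
```

If this returns False, the problem is network-level (security group, firewall, or the agent not listening) rather than anything in the DataSync API itself.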
Using DataSync, I transferred files from Glacier Deep Archive to the S3 Glacier Flexible Retrieval class in the same S3 bucket, but now all the files are 0 KB and appear in the STANDARD class. Files transferred: 132,952. Data transferred: 20.26 MiB (it should actually be much higher). I did not take any action on the bucket beforehand; did I miss something or do something wrong?
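To quantify what ended up in the bucket, one can scan the object listing for zero-byte objects and tally the storage classes. A stdlib sketch that operates on entries shaped like the `Contents` of a `ListObjectsV2` response — the sample data in the test is made up:

```python
from collections import Counter

def audit(objects):
    """Tally objects per storage class and collect zero-byte keys.

    `objects` is a list of dicts with 'Key', 'Size', and 'StorageClass',
    as returned in the 'Contents' of an S3 ListObjectsV2 response.
    """
    by_class = Counter(o.get("StorageClass", "STANDARD") for o in objects)
    empty_keys = [o["Key"] for o in objects if o["Size"] == 0]
    return by_class, empty_keys
```

Running this over the real listing (paginated via boto3's `list_objects_v2`) would confirm how many objects are actually zero bytes and in which class, before deciding how to recover.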
Hello, I would like to ask what the difference is between these two configuration methods: Way 1: https://aws.amazon.com/blogs/storage/how-to-replicate-amazon-fsx-file-server-data-across-aws-regions/ Way 2: https://www.youtube.com/watch?v=c4XQMDUVHU8 The case is as follows: I have two Amazon FSx file systems in two different regions, and I would like to implement synchronization between Amazon FSx in region X and Amazon FSx in region Y using DataSync. Which method should I choose? I think method 2 is better because it does not require an additional VM, so I expect it to be more cost-effective. But I am not sure whether there is some hidden issue here that I should pay attention to. I would appreciate any help on this topic. Best regards.
I deployed the DataSync agent (hypervisor VM) locally, generated the activation code in the local console, and entered it in the cloud console. The agent shows as online for about one minute; after that, the status remains offline. All the network tests in the local console pass, but the agent never comes back online.
Hello, we have set up Service Cloud Voice in Salesforce with an Amazon contact center, and we have installed the Voicemail Express package from GitHub. When a customer presses 3 to leave a voicemail message, it creates a Voice Call record and, later, a Case with a link to the voicemail message in Salesforce. However, the Voice Call record is not related to that Case, which is unfortunate because we lose the contact information (in Salesforce, the phone number of the incoming call is associated with a contact if one exists). We would like to link the VoiceCallId to the Case when it is created in Salesforce, but I can't find what creates this Case, nor where to add this parameter. Can you help me, please? Thank you.
Hello, I would like to ask about a DataSync agent deployed on an EC2 VM. We have an EC2 machine with the DataSync agent installed, and we noticed that both the machine's kernel and the agent itself are outdated. According to the FAQ (https://aws.amazon.com/datasync/faqs/): "A: Updates to the agent VM, including both the underlying operating system and the AWS DataSync software packages, are automatically applied by AWS once the agent is activated. Updates are applied non-disruptively when the agent is idle and not executing a data transfer task." But this doesn't seem to be working for us. So the question is: how do we fix the automatic updates, or update the machine and agent manually? I would appreciate any help on this topic. Best regards
We have been using DataSync with no issues against an EFS file system that does not require encryption in transit. For compliance reasons, we have moved to a file system whose policy requires encryption in transit, and DataSync to the new file system fails. When I remove the policy, the task completes; when I restore the policy, the task fails. Now what?
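For reference, an EFS policy that enforces encryption in transit typically works by denying any access where `aws:SecureTransport` is false, as in the sketch below. When such a policy is in place, the DataSync EFS location itself must be configured with in-transit encryption (TLS 1.2) — and, at that point, an EFS access point and an IAM role for file system access — or the agent's mount is rejected and the task fails exactly as described. The ARN here is a placeholder:

```python
import json

def deny_insecure_transport_policy(file_system_arn):
    """Sketch of an EFS file system policy that rejects non-TLS clients.

    The ARN is a placeholder; the shape follows the standard
    aws:SecureTransport deny pattern.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyUnencryptedTransport",
                "Effect": "Deny",
                "Principal": {"AWS": "*"},
                "Action": "*",
                "Resource": file_system_arn,
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            }
        ],
    }

print(json.dumps(
    deny_insecure_transport_policy("arn:aws:elasticfilesystem:REGION:ACCOUNT:file-system/FS-ID"),
    indent=2))
```

If your new file system's policy looks like this, check the task's EFS location configuration: recreating the location with in-transit encryption enabled is usually the fix, since the encryption setting cannot be changed on an existing location.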
Hi there, I want to copy some data from my EBS volume to an EFS file system, and for this purpose I want to use AWS DataSync. When I try to choose the source location (while setting up the DataSync task), there is no option to choose an EBS volume as the source. Can someone guide me on how to accomplish this data transfer from EBS to EFS using the DataSync service?
I have an AWS database for production and another for a UAT environment. I want to be able, whenever needed, to pull production data from a few tables into my UAT database by clicking a button in a GUI. That is, we will design a GUI with one button, and whenever a user needs the data they just click that button; then, using some DMS APIs (I'm not sure which), the selected tables' data from production gets synchronized into the UAT database.
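If DMS is the mechanism, the button's backend would typically start a replication task whose table mappings select just those tables. A stdlib sketch that builds the selection-rules document DMS expects — the schema and table names are placeholders; the resulting JSON string is what you would pass as `TableMappings` when creating or modifying the task via the DMS API:

```python
import json

def selection_rules(schema, tables):
    """Build a DMS table-mapping document including only the given tables."""
    rules = []
    for i, table in enumerate(tables, start=1):
        rules.append({
            "rule-type": "selection",
            "rule-id": str(i),
            "rule-name": f"include-{table}",
            "object-locator": {"schema-name": schema, "table-name": table},
            "rule-action": "include",
        })
    return json.dumps({"rules": rules})

# Usage (placeholder names):
# mappings = selection_rules("public", ["customers", "orders"])
```

The button handler would then call the DMS API (e.g. boto3's `start_replication_task`) with a task configured from this document; only the listed tables are copied.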