Hello.
How about mounting S3 directly using Mountpoint for Amazon S3?
It's low-cost, since you only pay the standard S3 API request fees.
https://github.com/awslabs/mountpoint-s3
https://aws.amazon.com/jp/blogs/aws/mountpoint-for-amazon-s3-generally-available-and-ready-for-production-workloads/
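As a rough sketch, mounting a bucket is a single command (the bucket name and mount path below are placeholders, assuming Mountpoint is installed and AWS credentials are configured):

```
# Hypothetical example: expose an S3 bucket as a local directory.
# "my-bucket" and /mnt/s3 are placeholder names.
mount-s3 my-bucket /mnt/s3

# Unmount when finished.
umount /mnt/s3
```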
It shouldn't be too hard to roll your own. aws s3 sync
will keep the local folders synchronised with the target S3 bucket, and you could run this in a scheduler like cron, or use something like incron to kick off a new sync job any time a change is made on the source.
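For the incron route, a minimal sketch of an incrontab entry (the path and bucket name are placeholders, assuming the AWS CLI and incron are installed):

```
# Hypothetical incrontab entry (edit with `incrontab -e`):
# re-sync the folder to S3 whenever something inside it changes.
# /data/share and my-bucket are placeholder names.
/data/share IN_CLOSE_WRITE,IN_CREATE,IN_DELETE,IN_MOVED_TO aws s3 sync /data/share s3://my-bucket/share --delete
```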
Thanks Steve. But I'm looking for something a little more sophisticated than cron
From the inotify FAQ: "Q: Why use inotify and not dnotify? There are many reasons. The first one is that dnotify sucks. The second one is that dnotify sucks much. The third one is that dnotify sucks very much... See Why to use for more information."

This product was way too basic for my commercial needs.
Hi,
I am using rclone for such use cases: https://rclone.org/
Rclone has powerful cloud equivalents to the unix commands rsync, cp, mv, mount, ls, ncdu,
tree, rm, and cat. Rclone's familiar syntax includes shell pipeline support, and --dry-run protection.
It is used at the command line, in scripts or via its API.
Users call rclone "The Swiss army knife of cloud storage", and "Technology indistinguishable from magic".
Rclone really looks after your data. It preserves timestamps and verifies checksums at all times.
Transfers over limited bandwidth, intermittent connections, or subject to quota can be restarted
from the last good file transferred. You can check the integrity of your files. Where possible,
rclone employs server-side transfers to minimise local bandwidth use and transfers from one provider
to another without using local disk.
Virtual backends wrap local and cloud file systems to apply encryption, compression,
chunking, hashing and joining.
Rclone mounts any local, cloud or virtual filesystem as a disk on Windows, macOS, Linux
and FreeBSD, and also serves these over SFTP, HTTP, WebDAV, FTP and DLNA.
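For this use case, a one-way scheduled copy could look like the sketch below (the remote name "s3remote", paths and bucket are placeholders, assuming the remote was first set up with `rclone config`):

```
# Hypothetical example: mirror a local folder into an S3 bucket.
# "s3remote" must first be defined with `rclone config`;
# /data/share and my-bucket are placeholder names.
rclone sync /data/share s3remote:my-bucket/share --dry-run   # preview the changes first
rclone sync /data/share s3remote:my-bucket/share             # then run the real sync
```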
It works just fine.
Best,
Didier
Thank you for the answer. Unfortunately, I don't have a UNIX/Linux engineer or a script engineer to roll out an implementation like this to 40 of my customers. I'm also not inclined to use any software or script that doesn't have Microsoft validation (all my customers are medical, and their governance and compliance requirements are very high). What I'm looking for is a ready-to-go, certified application I can deploy on customer servers that is safe, compliant, simple, and reliable. I can't believe I'm having trouble finding an application to fulfil these needs. I've already tried S3 Browser and a few other third-party applications, which are all either unreliable or not fast enough.
Thank you, but I'm not looking for a connection tool. Making a connection is not my issue, as I don't want to do a direct backup to an S3 folder. I just want to replicate a local folder to an S3 bucket: a straightforward, scheduled, automated process. I'm looking for an actual application that can replicate a local folder directly into the bucket on a schedule.
As I understand it, you are looking for third-party backup software. You can check out the following tool: https://www.msp360.com/backup/. We previously used this tool for our staff.
- AWS OFFICIAL, updated 2 years ago
Although it is not an application, how about periodically running the AWS CLI's "aws s3 sync" command with cron? This will automatically synchronize the files in your local folders on a regular schedule. https://docs.aws.amazon.com/cli/latest/userguide/cli-services-s3-commands.html#using-s3-commands-managing-objects-sync
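A hedged sketch of such a crontab entry (the schedule, paths and bucket name are placeholders, assuming the AWS CLI is installed and credentials are configured):

```
# Hypothetical crontab entry (edit with `crontab -e`): sync every 15 minutes.
# /data/share and my-bucket are placeholder names.
*/15 * * * * aws s3 sync /data/share s3://my-bucket/share --delete >> /var/log/s3sync.log 2>&1
```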