Most cost-effective way to use a bash script to update CSV files


I have a bash script that cycles through a bunch of CSV files and updates answers in them, mostly one-word entries (self-tracking).

Today I do this on an Ubuntu machine, but I have to fire up the machine every time; it takes about 10 minutes to update all the entries one by one, and then I shut the machine down. Instead of this, I want to do one of the following:

  1. Update all answers in a single CSV file, and have a Lambda function read that file and propagate the updates to all the other CSV files.
  2. Run an EC2 instance in AWS that I can connect to from the Termius app on an iPad, and do the same work I was doing on the desktop, but in the cloud.
  3. Is there a better third way?

I think option 1 is better in terms of compute resources. I need input on which would be the better way.

  • Cost-effective as in being able to use it from any device at any time (preferably without firing up an instance).

Asked 2 years ago · Viewed 352 times
1 Answer

If your script can run within the Lambda runtime limits (memory and execution time), then that's definitely the best way to go. You are only billed while the function is running.
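As a minimal sketch of what that could look like, assuming the CSV files live in a single S3 bucket and each file has two columns (question, answer). The bucket name, key name, and column layout below are illustrative assumptions, not details from your question:

```python
import csv
import io

import boto3

s3 = boto3.client("s3")

# Assumptions for illustration only: adjust the bucket, the master key,
# and the column layout to match your actual files.
BUCKET = "my-tracking-bucket"   # hypothetical bucket holding all the CSVs
MASTER_KEY = "master.csv"       # single file holding the latest answers
# master.csv is assumed to contain two columns per row: question,answer


def handler(event, context):
    # Read the master file into a map of question -> latest one-word answer.
    master_body = s3.get_object(Bucket=BUCKET, Key=MASTER_KEY)["Body"].read()
    master_rows = csv.reader(io.StringIO(master_body.decode("utf-8")))
    answers = {question: answer for question, answer in master_rows}

    # Walk every other CSV in the bucket and overwrite stale answers.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            if key == MASTER_KEY or not key.endswith(".csv"):
                continue
            body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
            rows = csv.reader(io.StringIO(body.decode("utf-8")))
            # Keep the existing answer when the question is not in the master file.
            updated = [[q, answers.get(q, a)] for q, a in rows]
            out = io.StringIO()
            csv.writer(out).writerows(updated)
            s3.put_object(Bucket=BUCKET, Key=key, Body=out.getvalue().encode("utf-8"))
    return {"status": "done"}
```

You could then invoke the function on a schedule with an EventBridge rule, or on each upload of the master file via an S3 event notification, so nothing has to be fired up manually and you only pay per invocation.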

AWS · EXPERT · Answered 2 years ago
