How to provide a unique filename to each file in each event in AWS Lambda (Python)?


Hi there,

I have a Lambda function that triggers as soon as AWS Data Pipeline writes a directory with multiple files to an S3 bucket. The Lambda function, written in Python, reads the files in that directory, converts them to CSV, and writes them back to a folder in a different bucket.

The problem is that when I pass the object name as the file name, it creates a file with that name and uploads it to the S3 bucket, and it does this for every file. This creates a mess, because every day I have to point to the new files. I want to give these files fixed, unique names so that every day the same files are replaced instead of new ones being created.

EDIT:

Here is my code

import boto3
import pandas as pd
from io import StringIO

client = boto3.client('s3')

# inside the Lambda handler:
if event:
    print("Got Events")
    # Reading file (obj and bucket come from the triggering event, not shown here)
    df = pd.read_json(obj['Body'], lines=True, orient='records', dtype='dict')
    csv_buffer = StringIO()
    df.to_csv(csv_buffer, index=False)
    filename = 'file1.csv'
    client.put_object(Bucket=bucket, Key=filename, Body=csv_buffer.getvalue())

Please help...!

3 Answers

You have control over which file name to use. If you want to use a fixed name, just use that in your S3 API call.

AWS
EXPERT
Uri
answered 2 years ago
  • Here is my code:

    def lambda_handler(event, context):
        if event:
            print("Got Events")
            # Reading file
            df = pd.read_json(obj['Body'], lines=True, orient='records', dtype='dict')
            csv_buffer = StringIO()
            df.to_csv(csv_buffer, index=False)
            filename = 'file1.csv'
            client.put_object(Bucket=bucket, Key=filename, Body=csv_buffer.getvalue())
    

    Six files get uploaded, so this code runs six times. Where do I set a name that changes with every event?

    Please help...!

  • You specify the filename in the client.put_object(Bucket=bucket, Key=filename,Body=csv_buffer.getvalue()) line. Just change filename to the value you want. Currently you gave it file1.csv.

    Looking at another answer, I see that you want to use filenames such as file1, file2, etc. To make sure that every invocation uses a different N, you will need to persist it somewhere, e.g. in DynamoDB, and read and increment it in each invocation. You will also need to make sure to reset it every day (a rough sketch of this idea follows below).
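
    A minimal sketch of that counter idea (not from the original thread): it assumes a hypothetical DynamoDB table named file-counters with a string partition key pk. Keying the counter on today's date makes the numbering start over each day, and the atomic ADD update guarantees every invocation gets a different number.

        import boto3
        from datetime import date

        dynamodb = boto3.client('dynamodb')

        def next_file_number():
            # Atomically increment a per-day counter; ADD creates the item and
            # attribute on first use, so the first call of the day returns 1.
            resp = dynamodb.update_item(
                TableName='file-counters',                    # assumed table name
                Key={'pk': {'S': date.today().isoformat()}},  # assumed partition key "pk"
                UpdateExpression='ADD #n :one',
                ExpressionAttributeNames={'#n': 'n'},
                ExpressionAttributeValues={':one': {'N': '1'}},
                ReturnValues='UPDATED_NEW',
            )
            return int(resp['Attributes']['n']['N'])

        filename = f"file{next_file_number()}.csv"            # file1.csv, file2.csv, ...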


Hello,

One possible approach is to append the date and time to your filename to generate a unique filename before uploading in your Lambda function. I'm not a Python or code expert, but see if the following helps.

from datetime import datetime
now = datetime.now() # current date and time
dt = now.strftime("%Y-%m-%d-%H-%M-%S") #store date time in string
file = "file-" + dt #append the name to a string to generate filename

You can test the code on a local machine first and see if it meets the requirement.
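
For illustration only, the timestamped name could then be plugged into the upload call from the question; client, bucket and csv_buffer are assumed to exist exactly as in the original code:

filename = file + ".csv"   # e.g. file-YYYY-MM-DD-HH-MM-SS.csv
client.put_object(Bucket=bucket, Key=filename, Body=csv_buffer.getvalue())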

AWS
answered 2 years ago
  • I want to give names like file1, file2, ..., fileN. I don't want to use any other names, otherwise it will create many files every day and make a mess when consuming them.

  • Sorry, it won't work, because it will create multiple files every time, which I don't want. How can I give a unique name to the four files every time so that the destination files are replaced?


I'm assuming you are using the boto3 S3 Client.

The put_object, upload_file and related methods accept a Key parameter so you can name the file in S3 whatever you'd like.
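
A minimal sketch of that point, with a hypothetical bucket name and key; writing to the same Key on every run is what keeps the destination file stable instead of accumulating new files:

import boto3

s3 = boto3.client('s3')
bucket = 'my-output-bucket'          # hypothetical bucket name

# Whatever you pass as Key becomes the object's name in S3; writing to the
# same Key again overwrites the existing object rather than creating a new one.
s3.put_object(Bucket=bucket, Key='reports/file1.csv', Body=b'col1,col2\n1,2\n')

# upload_file behaves the same way for a local file:
s3.upload_file('/tmp/file1.csv', bucket, 'reports/file1.csv')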

Kyle
answered 2 years ago
