Hi,
There is a Python wrapper module for GnuPG, which can be found here (with instructions):
https://pythonhosted.org/python-gnupg/
Otherwise, you can use subprocesses and 'roll your own'.
Hope this helps.
Anonymous Internet Person
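If you do go the subprocess route, a hedged sketch of what that might look like is below. It only assembles the gpg command line (without running it); the helper name and paths are placeholders, not anything from this thread — only the gpg flags themselves are real.

```python
# Sketch of the "roll your own" approach: build the gpg command line
# that a subprocess.run(...) call would execute. The helper name and
# paths are hypothetical.
def build_decrypt_cmd(in_path, out_path, homedir="/tmp/gnupg"):
    # --batch and --yes keep gpg from prompting inside Lambda
    return [
        "gpg", "--homedir", homedir,
        "--batch", "--yes",
        "--output", out_path,
        "--decrypt", in_path,
    ]

cmd = build_decrypt_cmd("/tmp/file-name.csv.gpg", "/tmp/file-name.csv")
```

Passing a list (rather than a shell string) to subprocess avoids quoting issues with file names.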
Thanks @git-er-dun, I created my own package: I zipped my Python script together with gnupg installed alongside it. I did in fact have debug logging, but for some reason it doesn't log anything, nor does it show any error.
Here is my code.
```python
import boto3
import gnupg
import aws_lambda_logging

def lambda_handler(event, context):
    aws_lambda_logging.setup(level='DEBUG')
    s3 = boto3.client("s3")
    object_path = 'folder/file-name.csv.gpg'
    file = object_path.split('/')[-1]
    folder = object_path.split('/')[0]
    bucket = 'bucket-name'
    secretmanager = boto3.client('secretsmanager')

    def secret_function(secret):
        response = secretmanager.get_secret_value(
            SecretId=secret
        )
        return response['SecretString']

    key_data = secret_function('Public-Key') + '\n' + secret_function('Private-Key')
    gpg = gnupg.GPG(gnupghome='/tmp')
    import_result = gpg.import_keys(key_data)
    local_file_name = '/tmp/' + file
    s3.download_file(file, bucket, local_file_name)
    with open(file, 'rb') as a_file:
        gpg.decrypt_file(a_file, output='testdecrypted-python.csv')
    upload_file_name = '/tmp/testdecrypted-python.csv'
    s3_path = folder + '/testdecrypted-python.csv'
    s3.upload_file(upload_file_name, bucket, s3_path)
```
It looks like you are just initializing the logger, but not actually passing anything to it.
i.e.:
log.debug('This will be logged')
Docs can be found here:
https://pypi.org/project/aws-lambda-logging/
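The same point can be shown with the stdlib logging module alone (a stand-alone sketch, not aws_lambda_logging itself): configuring a logger produces no output by itself — records only appear once you call methods like log.debug(...).

```python
import io
import logging

# Stand-alone sketch: configuring the logger (the equivalent of what
# setup() does) is separate from actually emitting records.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
log = logging.getLogger("lambda-sketch")
log.addHandler(handler)
log.setLevel(logging.DEBUG)        # setup-equivalent: nothing printed yet

log.debug("This will be logged")   # only now does a record appear
captured = stream.getvalue()
```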
I didn't mean that I am not getting anything from the CloudWatch log streams; I am getting details, but nothing useful or pointing to an error.
I changed the code to log every step, looking for any error or anything unusual.
```python
import boto3
import botocore
import gnupg
import os

def lambda_handler(event, context):
    s3 = boto3.client("s3")
    object_path = 'BOSN/testencrypted.csv.gpg'
    file = object_path.split('/')[-1]
    folder = object_path.split('/')[0]
    bucket = 'tfnsw-analytics-land-nprd-s3'
    secretmanager = boto3.client('secretsmanager')
    print(object_path)
    print(file)
    print(folder)
    print(bucket)

    def secret_function(secret):
        response = secretmanager.get_secret_value(
            SecretId=secret
        )
        return response['SecretString']

    key_data = secret_function('BOSN-PGP-Secret-NPRD') + '\n' + secret_function('BOSN-PGP-Secret-NPRD-Private')
    print(key_data)
    gpg = gnupg.GPG(gnupghome='/tmp')
    print(gpg.gnupghome)
    import_result = gpg.import_keys(key_data)
    print(import_result.results)
    local_file_name = '/tmp/' + file
    print(local_file_name)
    s3.download_file(bucket, object_path, local_file_name)
    dirlist = os.listdir("/tmp")
    print(dirlist)
    with open(file, 'rb') as a_file:
        status = gpg.decrypt_file(a_file, output='testdecrypted-python.csv')
    print(status.stderr)
    print(status.status)
    upload_file_name = '/tmp/testdecrypted-python.csv'
    s3_path = folder + '/testdecrypted-python.csv'
    s3.upload_file(upload_file_name, bucket, s3_path)
```
However, after printing "local_file_name", nothing comes up and no further log events are generated. Why is that?
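One way to narrow a failure like this down is to check the parts that don't need AWS at all. The path handling above is plain Python and can be run in isolation (placeholder key name, same split logic as the snippet):

```python
# Same split logic as the snippet above, on a placeholder key; the
# S3 and gnupg calls need AWS, but this part is plain Python.
object_path = "folder/testencrypted.csv.gpg"
file = object_path.split("/")[-1]    # file name only
folder = object_path.split("/")[0]   # top-level prefix
local_file_name = "/tmp/" + file     # where download_file writes
```

If this part behaves as expected, the stall is more likely in the S3 or gpg calls that follow.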
Can you share detailed steps on how to package it for use in a Lambda function?
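One common way to package a handler with pip-installed dependencies is roughly the following (a sketch with placeholder file names, not the exact commands used in this thread):

```shell
# Sketch: install dependencies next to the handler and zip the result.
# Package and file names here are placeholders.
mkdir -p package
pip install python-gnupg -t package/
cp lambda_function.py package/
(cd package && zip -r ../function.zip .)
```

Note that python-gnupg still needs a gpg executable at runtime, which is a separate packaging concern (e.g. a layer or a bundled binary).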
I am facing the same issue. What is the solution to this issue? I added the gnupg package as a layer to the lambda function.