# Questions tagged with AWS Lambda


Browse through the questions and answers listed below or filter and sort to narrow down your results.

### Getting "{"errorType":"Error","errorMessage":"ENOENT: no such file or directory, open" error on linking local file path to GoogleAuth class 'keyFile' property of googleapis...

I am trying to invoke googleapis through AWS Lambda, using a Google service account for server-to-server authentication. We have stored the service account details in a JSON file named 'config.json'. In order to invoke a Google API, we need to create an auth object using the 'GoogleAuth' class and pass the config file path as the value of the 'keyFile' property. Although we have provided the correct location of the file, Lambda cannot resolve the path and throws an error. For the path, I have tried absolute and relative paths, the Path package with '__dirname', process.env.cwd(), environment variables, etc. I even tried assets. I am using AWS CDK with Node.js to generate the CloudFormation template. My intention is to invoke Google APIs from AWS Lambda using service account credentials.

```javascript
import { GoogleAuth } from 'google-auth-library';

const auth = new GoogleAuth({
  keyFile: 'path/to/file',
  scope: SCOPES
});
```

The error:

```
undefined ERROR Uncaught Exception
{"errorType":"Error","errorMessage":"ENOENT: no such file or directory, open '/keys/config.json'","code":"ENOENT","errno":-2,"syscall":"open","path":"/keys/config.json"}
Error: ENOENT: no such file or directory, open '/keys/config.json'
    at Object.openSync (node:fs:601:3)
    at Object.readFileSync (node:fs:469:35)
    at Object.<anonymous> (/var/task/index.js:533512:28)
    at Module._compile (node:internal/modules/cjs/loader:1254:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1308:10)
    at Module.load (node:internal/modules/cjs/loader:1117:32)
    at Module._load (node:internal/modules/cjs/loader:958:12)
    at Module.require (node:internal/modules/cjs/loader:1141:19)
    at require (node:internal/modules/cjs/helpers:110:18)
    at _tryRequireFile (file:///var/runtime/index.mjs:912:37)
```
1 · 0 · 26 views

### Fanning out a DynamoDB stream into a Kinesis stream using a Lambda trigger?

Hi, I'm trying to implement a system that provides in-order, item-level information about DDB updates to multiple consumers. I'm using DDB streams because Kinesis streams sourced from DDB don't guarantee ordering or deduplication ([source](https://stackoverflow.com/a/74487638/19570509)). However, DDB streams can only have a maximum of 2 concurrent consumers before being throttled. So what I'm trying to do is have the DDB stream trigger a Lambda, which serializes the com.amazonaws.services.lambda.runtime.events.DynamodbEvent.DynamodbStreamRecord and passes it into a Kinesis stream. A few questions:

1. What's the best method of serializing the DynamodbStreamRecord? I've seen some examples using the KCL and the RecordAdapter, but that operates on the com.amazonaws.services.dynamodbv2.model.Record object, not on the Lambda event objects DynamodbStreamRecord and StreamRecord.
2. When writing code to send data into the Kinesis stream, the putRecord API in the Kinesis client requires a partition key parameter. If I'm aiming to maintain the same item-level order in the Kinesis stream, should the partition key I supply to the putRecord call simply be the partition key of the relevant item?

Thanks!
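The two decisions above can be sketched as plain functions, shown here in JavaScript for brevity (the question's Java DynamodbStreamRecord has the same shape): serialize the stream record as JSON, and derive the Kinesis partition key from the DynamoDB item's own key attributes, so all updates to one item land on one shard and stay in order. The record shape is the standard DynamoDB stream event structure; the function names are hypothetical.

```javascript
// Serialize a stream record to bytes for a Kinesis PutRecord call.
// Consumers can parse the JSON and unmarshall NewImage/OldImage themselves.
function serializeRecord(record) {
  return Buffer.from(JSON.stringify(record));
}

// Build a stable partition key from the record's DynamoDB Keys map,
// e.g. { pk: { S: 'user-1' }, sk: { N: '42' } } -> 'user-1#42'.
// Sorting the attribute names and concatenating all key values keeps
// the result deterministic and also handles composite keys.
function partitionKeyFor(record) {
  const keys = record.dynamodb.Keys;
  return Object.keys(keys)
    .sort()
    .map((name) => Object.values(keys[name])[0]) // { S: 'abc' } -> 'abc'
    .join('#');
}
```

Because Kinesis orders records only within a shard, keying on the item's own partition (and sort) key is exactly what preserves per-item ordering; any per-call or random key would interleave updates to the same item across shards.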
1 · 0 · 25 views
