Questions tagged with Amazon EventBridge
I want to trigger a Lambda function using EventBridge on specific dates that don't follow any pattern. Let's say the dates are
12th Jan 2022, 29th Jan 2022, 1st Feb 2022, 14th Feb 2022, 15th March 2022, and 17th May 2022
How can I achieve this using EventBridge?
Is it possible to pass multiple cron expressions to a single EventBridge scheduler?
> [0 0 12 Jan * 2022, 0 0 29 Jan * 2022, 0 0 1 Feb * 2022, 0 0 14 Feb * 2022, 0 0 15 Mar * 2022, 0 0 17 May * 2022]
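For reference, the alternative I'm considering is one rule per date, each with its own cron expression, roughly like the sketch below (assuming a separate AWS::Events::Rule per date and an existing Lambda function; the function and logical names are placeholders, and the Lambda invoke permission is omitted):
```
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  # One rule per date; EventBridge cron is cron(minutes hours day-of-month month day-of-week year)
  TriggerJan12:
    Type: AWS::Events::Rule
    Properties:
      ScheduleExpression: cron(0 0 12 1 ? 2022)   # 12th Jan 2022, 00:00 UTC
      State: ENABLED
      Targets:
        - Id: LambdaTarget
          Arn: !GetAtt MyFunction.Arn             # placeholder Lambda function
  TriggerJan29:
    Type: AWS::Events::Rule
    Properties:
      ScheduleExpression: cron(0 0 29 1 ? 2022)   # 29th Jan 2022, 00:00 UTC
      State: ENABLED
      Targets:
        - Id: LambdaTarget
          Arn: !GetAtt MyFunction.Arn
  # ...and so on for 1st Feb, 14th Feb, 15th Mar and 17th May 2022
```
If there is a way to put all of these dates behind a single schedule instead, that is exactly what I'm asking about.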
In the following CloudFormation template, when the string "<userIdentity> tried to change a networkinterface. Probably the security group." is placed on the same line as InputTemplate, the deployment fails with the error "Invalid InputTemplate for target Ab1c2345d6-789e-0f1g-h234-ij5678k90l12 : [Source: (String)"null tried to change a networkinterface. Probably the security group."; line: 1, column: 11]. (Service: AmazonCloudWatchEvents; Status Code: 400; Error Code: ValidationException; Request ID: a1234b56-789c-0123-4d56-78901e234567; Proxy: null)".
When I change the code to:
```
[...]
InputTemplate: >-
"<userIdentity> tried to change a networkinterface. Probably the security group."
```
it works. This seems like a bug to me; can you please look into this?
Thank you in advance!
Frederique
```
---
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  EventRule0:
    Type: AWS::Events::Rule
    Properties:
      EventBusName: default
      EventPattern:
        source:
          - aws.ec2
        detail-type:
          - AWS API Call via CloudTrail
        detail:
          eventSource:
            - ec2.amazonaws.com
          eventName:
            - ModifyNetworkInterfaceAttribute
      Name: Test
      State: ENABLED
      Targets:
        - Id: MyId
          Arn: >-
            arn:aws:sns:eu-west-1:040909972200:bitwarden-AlarmTopicMail-ELLBXyn1jv3z
          InputTransformer:
            InputPathsMap:
              userIdentity: $.detail.userIdentity.principalId
            InputTemplate: "<userIdentity> tried to change a networkinterface. Probably the security group."
```
I want to use an EventBridge Pipe to pipe data from various Kinesis streams into another Kinesis stream.
It looks like I can create a pipe that uses a Kinesis stream as a source, filter it to just the subset of data that I need, and pipe it to another Kinesis stream as the target.
The issue is that it seems like I have to specify a hard-coded partition key for the target Kinesis stream. Am I not allowed to set a partition key path, the way EventBridge rules allow? If I have to hard-code the partition key, I won't be able to shard out, since the data will always go to only one shard.
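For reference, this is roughly the pipe definition I have now, with the hard-coded partition key (a minimal sketch; the stream names, role, and filter pattern are placeholders):
```
Resources:
  KinesisToKinesisPipe:
    Type: AWS::Pipes::Pipe
    Properties:
      Name: my-kinesis-pipe                        # placeholder name
      RoleArn: !GetAtt PipeRole.Arn                # role with read access to the source and write access to the target
      Source: !GetAtt SourceStream.Arn             # placeholder source stream
      SourceParameters:
        KinesisStreamParameters:
          StartingPosition: LATEST
        FilterCriteria:
          Filters:
            - Pattern: '{"data": {"type": ["wanted"]}}'   # placeholder filter for the subset I need
      Target: !GetAtt TargetStream.Arn             # placeholder target stream
      TargetParameters:
        KinesisStreamParameters:
          PartitionKey: my-static-key              # the hard-coded key I would like to replace with a path
```
What I'm hoping for is to put something like a JSON path in PartitionKey instead of the fixed string.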
I'm trying to build a schedule in AWS EventBridge that is going to invoke an API Gateway endpoint at a specific rate. However, I'm not able to specify the API Gateway endpoint when creating the schedule. I couldn't find examples in the documentation of the JSON body that I need to provide to specify the API Gateway ARN, endpoint, and headers.
I reviewed this AWS EventBridge documentation: https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-api-gateway-target.html but did not find details about how to specify the API Gateway endpoint.
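For context, this is roughly what I'm trying to express, based on the linked page (a sketch, assuming an EventBridge rule with a rate expression and the execute-api ARN format; the API ID, stage, path, role, header, and body are all placeholders):
```
Resources:
  CallApiOnSchedule:
    Type: AWS::Events::Rule
    Properties:
      ScheduleExpression: rate(5 minutes)          # placeholder rate
      State: ENABLED
      Targets:
        - Id: ApiGatewayTarget
          # execute-api ARN: arn:aws:execute-api:<region>:<account>:<api-id>/<stage>/<method>/<path>
          Arn: arn:aws:execute-api:eu-west-1:123456789012:abcde12345/prod/POST/jobs   # placeholder
          RoleArn: !GetAtt InvokeApiRole.Arn       # role allowed to invoke the API
          HttpParameters:
            HeaderParameters:
              Content-Type: application/json       # placeholder header
          Input: '{"action": "run"}'               # placeholder request body
```
What I can't find is the equivalent of this when creating the schedule, i.e. the JSON body where the ARN, endpoint, and headers go.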
I have followed the reference link below to create the high-volume outbound campaign:
"https://aws.amazon.com/blogs/contact-center/make-predictive-and-progressive-calls-using-amazon-connect-high-volume-outbound-communications/"
Once I publish the journey in Pinpoint, it only gets through the segment block and never reaches the "contact center" block at all. I tried checking the logs in EventBridge too, but I can only see the logs up to "Initiated". There is no error in any of the configuration. I am stuck at this point.
Your input to resolve this issue is highly appreciated.
Thanks in advance!
The documentation says, to edit a scheduled task (Amazon ECS console):
1. Open the Amazon ECS console at https://console.aws.amazon.com/ecs/.
2. Choose the cluster in which to edit your scheduled task.
3. On the Cluster: cluster-name page, choose Scheduled Tasks.
4. Select the box to the left of the schedule rule to edit, and choose Edit.
5. Edit the fields to update and choose Update.
But I cannot see the "Scheduled Tasks" option. It was there before, but ever since the new interface I cannot see it. Is there any way that I can edit the scheduled task? I tried the rules in EventBridge, but it does not let me edit the containerOverrides.
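For context, this is the kind of target I'm trying to edit (a sketch, assuming the scheduled task is backed by an EventBridge rule with an ECS target and that the containerOverrides are passed through the target's Input; the cluster, task definition, and command are placeholders):
```
Resources:
  ScheduledTaskRule:
    Type: AWS::Events::Rule
    Properties:
      ScheduleExpression: rate(1 hour)             # placeholder schedule
      State: ENABLED
      Targets:
        - Id: RunMyTask
          Arn: !GetAtt MyCluster.Arn               # placeholder ECS cluster ARN
          RoleArn: !GetAtt EventsRunTaskRole.Arn   # role allowed to call ecs:RunTask
          EcsParameters:
            TaskDefinitionArn: !Ref MyTaskDefinition   # placeholder task definition
            TaskCount: 1
          # the part I want to edit: overrides passed as the target input
          Input: >-
            {"containerOverrides": [{"name": "my-container", "command": ["run", "--once"]}]}
```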
Hi there,
Since sharing a database is not recommended by the "Database per service" design pattern, should every integration between microservices be done through a messaging system?
We have an application where users can upload videos.
The API is available using GraphQL, and we have federation to route the video uploads to a cluster of servers responsible for creating the video record in the database (RDS).
Once the video is uploaded to S3, a service that is triggered by an S3 event starts a MediaConvert job to create an HLS profile.
Once completed, we need to mark the video as available to viewers (updating the table).
What is the best practice to do this?
Should the convert service connect to the database and update the record?
Should it call a service API to update the record?
Should it send an SQS message that will be handled by the cluster that is connected to the database? (For this option, something like the sketch below is what I have in mind.)
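A minimal sketch of the SQS option, assuming the MediaConvert "Job State Change" event with status COMPLETE is the trigger and the queue is consumed by the cluster that owns the database; the queue name is a placeholder:
```
Resources:
  VideoReadyQueue:
    Type: AWS::SQS::Queue
    Properties:
      QueueName: video-ready-queue                 # placeholder; consumed by the cluster connected to RDS
  MediaConvertCompleteRule:
    Type: AWS::Events::Rule
    Properties:
      State: ENABLED
      EventPattern:
        source:
          - aws.mediaconvert
        detail-type:
          - MediaConvert Job State Change
        detail:
          status:
            - COMPLETE
      Targets:
        - Id: VideoReadyTarget
          Arn: !GetAtt VideoReadyQueue.Arn
          # a queue policy allowing events.amazonaws.com to send messages is also needed (omitted here)
```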
I have a service which is hosted in my private VPC, and currently we are using API Gateway to expose it publicly. All our API calls go through it, so right now EventBridge's API Destinations point to the API Gateway's public endpoint. But we would like to change that and call the service at its private endpoint from the EventBridge API Destination itself, so as to make sure our internal service calls stay in our VPC. How can we go about this?
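For context, this is roughly the current setup that we would like to change (a sketch; the auth scheme, secret, and endpoint are placeholders):
```
Resources:
  ServiceConnection:
    Type: AWS::Events::Connection
    Properties:
      AuthorizationType: API_KEY                   # placeholder auth scheme
      AuthParameters:
        ApiKeyAuthParameters:
          ApiKeyName: x-api-key
          ApiKeyValue: '{{resolve:secretsmanager:my-api-key}}'   # placeholder secret
  ServiceApiDestination:
    Type: AWS::Events::ApiDestination
    Properties:
      ConnectionArn: !GetAtt ServiceConnection.Arn
      # currently the public API Gateway endpoint; we would like this to be the service's private endpoint
      InvocationEndpoint: https://abcde12345.execute-api.eu-west-1.amazonaws.com/prod/events   # placeholder
      HttpMethod: POST
      InvocationRateLimitPerSecond: 10
```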
What happens to events which trigger a target, say invoking a Lambda function, but the event is not processed successfully?
Will it retry x times? Is the number of retries configurable?
Will the events be pushed to a DLQ when the retries are exhausted?
What happens to events which are sent to an event bus but match no rule to deliver them to a target?
When should we use the default event bus, and when should we create our own custom event bus?
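For context, this is the kind of target configuration my retry and DLQ questions are about (a sketch, assuming RetryPolicy and DeadLetterConfig on the rule target are the relevant settings; the pattern, function, and values are placeholders):
```
Resources:
  FailedEventsDLQ:
    Type: AWS::SQS::Queue
  MyRule:
    Type: AWS::Events::Rule
    Properties:
      EventBusName: default
      EventPattern:
        source:
          - my.custom.source                       # placeholder pattern
      State: ENABLED
      Targets:
        - Id: InvokeLambda
          Arn: !GetAtt MyFunction.Arn              # placeholder Lambda function
          RetryPolicy:
            MaximumRetryAttempts: 3                # is this the "x times"?
            MaximumEventAgeInSeconds: 3600
          DeadLetterConfig:
            Arn: !GetAtt FailedEventsDLQ.Arn       # where events should go once retries are exhausted
```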
I am trying to create an AWS CloudWatch event that will trigger an email whenever an S3 bucket is created or modified to allow public access.
I have created the CloudTrail trail and log stream, and am tracking all the S3 event logs. When I try to create a custom event by giving it a pattern to detect S3 buckets with public access, I don't get any response, and the event doesn't get triggered even when I create a bucket with public access. Can you help me out with the custom pattern for this?
I have tried giving GetPublicAccessBlock, PutPublicAccessBlock, etc. as the event type, but no luck. Please suggest accordingly.
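This is roughly the pattern I have been trying, following the CloudTrail-via-EventBridge approach (a sketch; the eventName list is the part I'm unsure about, and the SNS topic for the email is a placeholder):
```
Resources:
  PublicBucketRule:
    Type: AWS::Events::Rule
    Properties:
      State: ENABLED
      EventPattern:
        source:
          - aws.s3
        detail-type:
          - AWS API Call via CloudTrail
        detail:
          eventSource:
            - s3.amazonaws.com
          eventName:
            - CreateBucket
            - PutBucketAcl
            - PutPublicAccessBlock
      Targets:
        - Id: NotifyMe
          Arn: !Ref AlertTopic                     # placeholder SNS topic that sends the email
```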