Aurora Postgres - Audit and Historical Reporting


We use Aurora Postgres as our application database, and one of our applications has a requirement that any CRUD actions performed through the app services need to be shown through the application UI. Historically, this was done with triggers on every single app table, storing all the CRUD changes into HIST tables. The other alternative is Hibernate Envers, which could store the historical data into HIST tables that we then use for reporting. But for Lambdas using Node.js, Envers is clearly not possible, so we are scouting for an alternate solution to meet our requirement. I looked at options to leverage Postgres WAL logs, Aurora Postgres database activity streams, etc., but all those options will only work if we create a set of HIST tables for every single app table instead of storing the historical data into S3. Any expert advice on this please? Thank you very much

vkaamc
asked 4 months ago · 274 views

1 Answer

Hi,

Amazon Aurora PostgreSQL provides the following options to audit activity in the Aurora database:

1. Audit using pgAudit extension

The pgAudit extension allows you to set up auditing at the instance level, user level, or database level, with various levels of detail. The audit data is written to the Postgres log and can be published to Amazon CloudWatch for further processing.
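As a sketch, publishing the PostgreSQL log (where pgAudit writes) to CloudWatch Logs can be done with the AWS CLI; the cluster identifier below is a placeholder:

```shell
# Illustrative only: export the postgresql log type to CloudWatch Logs
# for an existing Aurora PostgreSQL cluster.
aws rds modify-db-cluster \
    --db-cluster-identifier my-aurora-cluster \
    --cloudwatch-logs-export-configuration '{"EnableLogTypes":["postgresql"]}'
```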

For more details on setting up the pgAudit extension, please refer to the following document:

[-] Using pgAudit to log database activity - Setting up the pgAudit extension - https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/Appendix.PostgreSQL.CommonDBATasks.pgaudit.html#Appendix.PostgreSQL.CommonDBATasks.pgaudit.basic-setup

Setting the pgaudit.log parameter to WRITE audits INSERT, UPDATE, DELETE, TRUNCATE, and COPY when the destination is a table; setting it to READ audits SELECT and COPY when the source is a relation or a query.
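As a minimal sketch, assuming a DB parameter group where pgaudit has already been added to shared_preload_libraries, write auditing for an application role or database might be configured like this (role and database names are illustrative):

```sql
-- Install the extension in the target database (once per database).
CREATE EXTENSION IF NOT EXISTS pgaudit;

-- Audit write activity (INSERT, UPDATE, DELETE, TRUNCATE, and COPY into a
-- table) performed by a specific application role.
ALTER ROLE app_user SET pgaudit.log = 'write';

-- Or audit writes for every connection to a specific database.
ALTER DATABASE app_db SET pgaudit.log = 'write';
```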

2. Audit through Database Activity Streams

With Database Activity Streams enabled, your Aurora DB cluster pushes activities to an Amazon Kinesis data stream in near real time. From Kinesis, you can configure AWS services such as Amazon Kinesis Data Firehose and AWS Lambda to consume the stream and store the data (for example, in Amazon S3). The audit data is provided in a unified JSON format.

For more details, please refer to:

[-] Overview of Database Activity Streams - https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/DBActivityStreams.Overview.html

The following blog posts discuss how to use pgAudit and Database Activity Streams together to create and consume the audit trail:

[-] Part 1: Audit Aurora PostgreSQL databases using Database Activity Streams and pgAudit - https://aws.amazon.com/blogs/database/part-1-audit-aurora-postgresql-databases-using-database-activity-streams-and-pgaudit/

[-] Part 2: Audit Aurora PostgreSQL databases using Database Activity Streams and pgAudit - https://aws.amazon.com/blogs/database/part-2-audit-aurora-postgresql-databases-using-database-activity-streams-and-pgaudit/
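A consumer step for the stream could be sketched as follows in Node.js. This assumes the Kinesis payload has already been decrypted and decompressed into the Database Activity Streams unified JSON format; the field names below follow that format, but treat the exact record shape as an assumption and verify it against your own stream output:

```javascript
// Sketch of a Lambda-style processing step for decrypted Database
// Activity Streams records: keep only DML events, shaped for an audit
// store (e.g. objects delivered to S3 via Kinesis Data Firehose).
const DML_COMMANDS = new Set(['INSERT', 'UPDATE', 'DELETE', 'TRUNCATE']);

function extractDmlEvents(activityRecord) {
  // databaseActivityEventList holds the individual activity events.
  const events = activityRecord.databaseActivityEventList || [];
  return events
    .filter((e) => DML_COMMANDS.has((e.command || '').toUpperCase()))
    .map((e) => ({
      loggedAt: e.logTime,
      database: e.databaseName,
      user: e.dbUserName,
      command: e.command,
      statement: e.commandText,
    }));
}

module.exports = { extractDmlEvents };
```

Decryption of the stream with the cluster's KMS key and decompression of the payload are omitted here; the Part 1/Part 2 blog posts above cover that end to end.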

3. Audit using AWS Database Migration Service (AWS DMS)

In this customized solution, you use AWS DMS to populate a staging table and PostgreSQL database triggers to build the final audit log table. The audit log is a table of timestamped records that captures the history of all updates and deletes made to the table. This solution can be used to audit specific tables and their data.

For more details, please refer to:

[-] Create an audit trail for an Amazon Aurora PostgreSQL table - https://aws.amazon.com/blogs/database/create-an-audit-trail-for-an-amazon-aurora-postgresql-table/
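Independent of DMS, the trigger half of that pattern looks roughly like the following; the table, function, and column names are illustrative, not taken from the blog post:

```sql
-- Illustrative audit-log table storing full row images as JSON.
CREATE TABLE orders_audit (
    audit_id    bigserial   PRIMARY KEY,
    audited_at  timestamptz NOT NULL DEFAULT now(),
    operation   text        NOT NULL,   -- 'INSERT' | 'UPDATE' | 'DELETE'
    row_data    jsonb       NOT NULL
);

CREATE OR REPLACE FUNCTION orders_audit_fn() RETURNS trigger AS $$
BEGIN
    IF TG_OP = 'DELETE' THEN
        INSERT INTO orders_audit (operation, row_data)
        VALUES (TG_OP, to_jsonb(OLD));
        RETURN OLD;
    ELSE
        INSERT INTO orders_audit (operation, row_data)
        VALUES (TG_OP, to_jsonb(NEW));
        RETURN NEW;
    END IF;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER orders_audit_trg
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH ROW EXECUTE FUNCTION orders_audit_fn();
```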

Please let us know if there are any specific requirements which the above options are unable to meet.

AWS SUPPORT ENGINEER
answered 4 months ago
