How do I list the attachment and detachment history of a specific Amazon Elastic Block Store (Amazon EBS) volume using the AWS Command Line Interface (AWS CLI)?
Short description
Amazon Elastic Compute Cloud (Amazon EC2) and Amazon EBS resources don't store the history of attachments or detachments. However, AWS CloudTrail does keep this information. CloudTrail is a service that records AWS API calls and events for AWS accounts. You can use the AWS CloudTrail API through the AWS CLI to pull the attachment and detachment log.
Amazon EBS volumes that are attached or detached as part of RunInstances and TerminateInstances API calls don't generate individual AttachVolume or DetachVolume events, so those operations don't appear in the CloudTrail lookup-events API output.
Resolution
Note: If you receive errors when running AWS CLI commands, make sure that you’re using the most recent version of the AWS CLI.
Call the CloudTrail lookup-events API through the AWS CLI. The following commands use the AWS CLI --query option, which filters output with JMESPath, to search for AttachVolume and DetachVolume events.
There are two methods you can use to return the data.
Print Unix Epoch timestamp
1. Run the following command:
$ aws cloudtrail lookup-events \
    --lookup-attributes AttributeKey=ResourceName,AttributeValue=VOLUME_ID \
    --max-results 3 \
    --region REGION_ID \
    --query 'Events[?EventName == `DetachVolume` || EventName == `AttachVolume`].{EventTime:EventTime,EventName:EventName,InstanceID:(Resources[1].ResourceName)}'
Replace VOLUME_ID with your Amazon EBS volume ID and REGION_ID with the appropriate AWS Region. Use the --max-results parameter to set the number of events returned; it accepts values from 1 to 50, and the default is 50.
2. CloudTrail displays the timestamps in Unix Epoch time. Use one of the following methods to convert the timestamp into UTC:
macOS:
Remove the decimal point from the timestamp, then run the following command:
$ date -r 1571065747 -u
Mon Oct 14 15:09:07 UTC 2019
Linux:
Run the following command:
$ date -d @1571065747.0 -u
Mon Oct 14 15:09:07 UTC 2019
Windows:
Convert the timestamp using a converter, such as epochconverter.com.
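As a platform-independent alternative to the date commands above, the same conversion can be sketched in Python with the standard library (the function name epoch_to_utc is our own, not part of any AWS tooling):

```python
from datetime import datetime, timezone

def epoch_to_utc(ts: float) -> str:
    """Convert a Unix Epoch timestamp, as returned by CloudTrail, to a UTC string."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%a %b %d %H:%M:%S UTC %Y")

# Same timestamp as the date examples above.
print(epoch_to_utc(1571065747.0))  # Mon Oct 14 15:09:07 UTC 2019
```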
Print human-readable timestamp in UTC time zone
Note: This method uses the sed utility and the jq processor and is intended for Linux users only.
The sed utility is used to transform the CloudTrail Event value into a JSON-friendly layout. Most Linux distributions come with the sed utility already installed. You can download the utility from the GNU sed website if it's not installed.
The jq processor is used to search for and return the values for EventName, InstanceID, and EventTime. You can download it from the jq processor website if you don't have it installed.
Run the following command:
$ aws cloudtrail lookup-events \
    --lookup-attributes AttributeKey=ResourceName,AttributeValue=VOLUME_ID \
    --max-results 3 \
    --region REGION_ID \
    --query 'Events[?EventName == `DetachVolume` || EventName == `AttachVolume`].CloudTrailEvent' | \
    sed 's/\\//g' | sed 's/"}"/"}/g' | sed 's/"{"/{"/g' | \
    jq '.[] | {EventName:.eventName, InstanceID:.requestParameters.instanceId, EventTime:.eventTime}'
Replace VOLUME_ID with your Amazon EBS volume ID and REGION_ID with the appropriate AWS Region. Use the --max-results parameter to set the number of events returned; it accepts values from 1 to 50, and the default is 50.
See the following example output:
{
  "EventName": "AttachVolume",
  "InstanceID": "i-00a49ef5dd45af31b",
  "EventTime": "2019-10-02T15:36:18Z"
}
{
  "EventName": "DetachVolume",
  "InstanceID": "i-0554d4452aa4cf91b",
  "EventTime": "2019-10-02T14:26:04Z"
}
{
  "EventName": "AttachVolume",
  "InstanceID": "i-0554d4452aa4cf91b",
  "EventTime": "2019-10-02T14:25:42Z"
}
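The sed edits in the pipeline exist only because each CloudTrailEvent value is itself a JSON document serialized as a string. If you prefer to post-process in Python instead of sed and jq, the same unwrapping and filtering can be sketched as follows (the sample response below is made up and trimmed to the fields this article uses):

```python
import json

# Made-up sample shaped like `aws cloudtrail lookup-events` output:
# each item's CloudTrailEvent field is a JSON document encoded as a string.
response = {
    "Events": [
        {"EventName": "AttachVolume",
         "CloudTrailEvent": json.dumps({
             "eventName": "AttachVolume",
             "eventTime": "2019-10-02T15:36:18Z",
             "requestParameters": {"instanceId": "i-00a49ef5dd45af31b"}})},
        {"EventName": "CreateTags",  # unrelated event, should be filtered out
         "CloudTrailEvent": json.dumps({
             "eventName": "CreateTags",
             "eventTime": "2019-10-02T15:40:00Z"})},
        {"EventName": "DetachVolume",
         "CloudTrailEvent": json.dumps({
             "eventName": "DetachVolume",
             "eventTime": "2019-10-02T14:26:04Z",
             "requestParameters": {"instanceId": "i-0554d4452aa4cf91b"}})},
    ]
}

history = []
for event in response["Events"]:
    if event["EventName"] not in ("AttachVolume", "DetachVolume"):
        continue
    detail = json.loads(event["CloudTrailEvent"])  # unwrap the nested JSON string
    history.append({
        "EventName": detail["eventName"],
        "InstanceID": detail["requestParameters"]["instanceId"],
        "EventTime": detail["eventTime"],
    })

for item in history:
    print(item)
```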
Note: CloudTrail retains event history for 90 days, so events older than 90 days don't appear in the lookup-events results. To retain your event logs for longer than 90 days:
1. Create your own Trail in CloudTrail.
2. Store the logs in an Amazon Simple Storage Service (Amazon S3) bucket.
3. Use Amazon Athena to query the logs in your Amazon S3 bucket.