Owner and Bucket Mismatch S3 Issue - isgaur/AWS-BigData-Solutions GitHub Wiki

Detailed Explanation of the issue:

S3 evaluates a subset of policies in three specific contexts—user context, bucket context, and object context.[1]

  1. User context: Does the requester's IAM user/role have the required permission to perform the operation?
  2. Bucket context: Does the bucket policy/ACL allow the requester to perform the operation?
  3. Object context: Does the requester have permission from the object owner, as determined by the object's Access Control List (ACL), to perform the specific object operation?
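For reference, the bucket owner and the object owner can be compared directly with the AWS CLI. This is a minimal sketch; the bucket name and key below are placeholders, not values from this scenario:

# Owner.ID in the output is the canonical ID of the bucket owner
aws s3api get-bucket-acl --bucket <bucket-name>

# Owner.ID here is the canonical ID of the object owner; a mismatch with the bucket owner indicates this issue
aws s3api get-object-acl --bucket <bucket-name> --key <s3key/path>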

Example:

Consider a scenario where account-B (486380097998) is trying to access objects (CloudTrail logs) in the bucket 'bucket_name' in account-A (870296345612). Account-C (113285607260 - Amazon Internal) writes the logs with the 'bucket-owner-full-control' ACL, so only the bucket owner is allowed to access the logs. In addition, the bucket policy only allows accounts other than '870296345612' to perform operations on those objects that are owned by the bucket owner. Here, the bucket owner and the object owner are different, so the GetObject operation fails because the requester does not have the required permission at the object level/context to operate on the object. You will observe the following 'Authentication Info' in the S3 Log Dive-Analysis with the note 'Object and Bucket Owner Mismatch':

Bucket Owner: 870296345612 (Account-A)
Requester Account: 486380097998 (Account-B)
Object Owner: 113285607260 (Account-C - Amazon Internal)

Solution:

Since the bucket owner has access to the logs, the bucket owner needs to perform any one of the following options:

  1. Create an IAM role with permission to access the objects, and grant permission to the other account to assume the role. That is, create a role/user in the bucket owner account-A (870296345612) with the appropriate permissions and use that role's/user's credentials to access the objects. This solution is provided in the documentation[2]. However, please note that some features such as Glue Crawler won't allow cross-account 'iam:PassRole'. A rough illustration follows below.
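As a rough sketch of option 1 (not the exact setup from the documentation[2]): the role name 'cloudtrail-log-reader', the inline policy name, and the session name below are hypothetical, and the bucket name is a placeholder.

# In account-A (870296345612): create a role that account-B (486380097998) is trusted to assume
aws iam create-role --role-name cloudtrail-log-reader --assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"AWS":"arn:aws:iam::486380097998:root"},"Action":"sts:AssumeRole"}]}'

# In account-A: allow the role to list the bucket and read the log objects
aws iam put-role-policy --role-name cloudtrail-log-reader --policy-name read-cloudtrail-logs --policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":["s3:GetObject","s3:ListBucket"],"Resource":["arn:aws:s3:::<bucket-name>","arn:aws:s3:::<bucket-name>/*"]}]}'

# In account-B: assume the role and use the returned temporary credentials for the GetObject calls
aws sts assume-role --role-arn arn:aws:iam::870296345612:role/cloudtrail-log-reader --role-session-name log-access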

  2. Update the object-level permissions of all the files by adding the canonical ID of account-B to each object's ACL. The s3api put-object-acl[3] operation can only be performed on one object at a time, and it has no '--recursive' parameter. Therefore, the bucket owner account needs to run the command below for every individual object/file of the CloudTrail logs you wish to read.

AWS CLI command to update the object ACL of a single object:

aws s3api put-object-acl --bucket <bucket-name> --key <s3key/path> --grant-read id=<canonical-id-of-account-B> --grant-full-control id=<canonical-id-of-bucket-owner>

We can use the following AWS CLI commands to update the object ACL of all the objects under a given path. It is a custom solution using the pipe operator (|) and 'xargs -I {}', since there is no '--recursive' parameter.

bucket_name='ishan' #bucket_name
path='AWSLogs/114623583815/CloudTrail/ap-northeast-2/2019/08/' #S3 path/key
aws s3api list-objects --bucket $bucket_name --prefix $path --query 'Contents[][Key]' --output text | xargs -I {} aws s3api put-object-acl --bucket $bucket_name --key {} --grant-read id=68a2f1709dbd7dde18e85d5d24c55a2c1a852551f27d30b74a3df72ba51cd817 --grant-full-control id=c5a6839c87506a8c81dce5284b8249fff68f6b36ac17e10a0aae37b707b55dcc
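If the canonical user ID of account-B is not already known, it can be retrieved with that account's credentials; a minimal sketch:

# Returns the canonical user ID of the account whose credentials are currently in use
aws s3api list-buckets --query 'Owner.ID' --output text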

  3. Copy all the objects/files again using 'aws s3 cp':

If the bucket owner copies all the CloudTrail files, the new objects created at the destination are owned by the bucket owner. The aws s3 cp[4] command also supports '--recursive', so we can perform the copy as follows:

aws s3 cp s3://ishan/source/path/ s3://ishan/destination/path/ --recursive

However, if we need to copy the objects to the same location, we need to add a parameter such as --storage-class STANDARD; otherwise, the copy operation will fail with the following error:

(InvalidRequest) when calling the CopyObject operation: This copy request is illegal because it is trying to copy an object to itself without changing the object's metadata, storage class, website redirect location or encryption attributes.

You can use the following command to copy the objects in the same location:

aws s3 cp s3://ishan/source/path/ s3://ishan/source/path/ --recursive --storage-class STANDARD
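After the copy completes, the ownership change can be spot-checked on one of the rewritten objects; a minimal sketch (the key below is a placeholder):

# Owner.ID in the output should now be the canonical ID of the bucket owner (account-A)
aws s3api get-object-acl --bucket ishan --key source/path/<one-of-the-copied-keys>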
