Security Engineering
How to create a secure S3 bucket policy
Many people know they need to secure the data in their S3 buckets, but doing so correctly is difficult. This guide will show you how to create a secure S3 bucket policy that implements least-privilege access and enforces use of encryption. We will build the policy incrementally, explaining why each statement exists and how the policy works.
Context
Let’s start with the deployment scenario and policy requirements.
Suppose we have a simple application deployed entirely in AWS:
Figure 1: Simple App Using Lambda & S3
The application:
- deploys to AWS using an automated delivery pipeline
- runs on AWS Lambda and is identified by its Lambda Execution Role
- is supported by a customer service team
- stores data in the `sensitive-app-data` bucket
The deployment must implement the organization’s high-level security policy requirements:
- implement least privilege, allowing only explicitly-specified principals the actions and access to data they need to perform their business function and denying access to all other principals
- require encryption at rest and in transport
This guide will show you how to create an S3 Bucket resource policy that does that.
The `sensitive-app-data` bucket policy will contain statements to:
- Allow Administration
- Allow Reads
- Allow Writes
- Deny Actions by Unidentified Principals
- Deny Unencrypted Transport or Storage
This guide assumes you know the basic elements of an AWS security policy, particularly how Statements are constructed from Principals, Actions, Resources, and Conditions, but we’ll explain how these function and interact as we go.
Step 1: Identify who needs access
First, identify who needs access to the bucket and what kind of access they need.
Suppose interviews with the application delivery and customer support team reveal the following access requirements for IAM principals:
- the `ci` user needs administration capabilities to deploy application updates
- the `admin` role needs administration capabilities to fix urgent problems
- the `app` role needs read and write data capabilities for the application to function
- the `cust-service` role needs read data capabilities to investigate problems
All of these identities and application resources exist within a single AWS account, `111`.
This describes the intended access precisely enough to provision access in the steps that follow.
Step 2: Create policy document
Start by creating an empty policy document:
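An empty policy document looks like this (the `Version` value is the standard IAM policy language version):

```json
{
  "Version": "2012-10-17",
  "Statement": []
}
```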
This policy will contain several `Statement`s that allow or deny access according to certain policy conditions. Each of the statements that follow will be added to the `Statement` array.
Step 3: Deny everyone who should not have access
Now add a statement to the `Statement` array that denies access to everyone who doesn't need it:
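A sketch of such a deny statement, using the account and principal names from Step 1 (the exact ARN formats are assumptions):

```json
{
  "Sid": "DenyEveryoneElse",
  "Effect": "Deny",
  "Principal": { "AWS": "arn:aws:iam::111:root" },
  "Action": "s3:*",
  "Resource": [
    "arn:aws:s3:::sensitive-app-data",
    "arn:aws:s3:::sensitive-app-data/*"
  ],
  "Condition": {
    "ArnNotEquals": {
      "aws:PrincipalArn": [
        "arn:aws:iam::111:role/admin",
        "arn:aws:iam::111:role/app",
        "arn:aws:iam::111:user/ci",
        "arn:aws:iam::111:role/cust-service"
      ]
    }
  }
}
```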
Let’s break down how the `DenyEveryoneElse` statement works.
First, the statement denies all S3 API actions for the AWS account `111` with the `Effect`, `Action`, and `Principal` elements.
This blocks all access to the account’s IAM principals, regardless of the IAM policy attached to those users or roles. This prevents unintended access to `sensitive-app-data` introduced by overscoped IAM policies, particularly AWS managed policies that provide access to service APIs but have no way of specifying a resource condition to limit the scope of that API access.
This approach forces all of the access to this bucket and its data to become visible.
Of course, we need to allow some access, so this policy uses a `Condition` element to make exceptions for the allowed principals. The `Deny` statement only applies when the principal whose access is being evaluated is not `admin`, `app`, `ci`, or `cust-service`.
Ok, now let’s allow access to the bucket.
Step 4: Allow Intended Access – Administer, Read, Write
Allow intended access to the bucket with distinct statements for administration, reading data, and writing data. Deleting data would get its own statement if we had that use case. Each of these Allow statements has the same form:
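A sketch of the general form (the placeholder values in angle brackets are filled in by each concrete statement below):

```json
{
  "Sid": "AllowRestricted<Capability>",
  "Effect": "Allow",
  "Principal": { "AWS": "*" },
  "Action": [ "<capability-specific S3 API actions>" ],
  "Resource": [
    "arn:aws:s3:::sensitive-app-data",
    "arn:aws:s3:::sensitive-app-data/*"
  ],
  "Condition": {
    "ArnEquals": {
      "aws:PrincipalArn": [ "<intended principal ARNs>" ]
    }
  }
}
```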
This statement allows all the actions identified for the administration, read, or write capability for all principals whose ARN matches the list provided to the `aws:PrincipalArn` policy condition. The `Resource` element allows the actions for both the bucket and the objects in the bucket.
You may be curious why these Allow statements specify a `Principal` of `"AWS": "*"`, which means “all accounts,” instead of `111` as in the `DenyEveryoneElse` statement. That way, if you ever need cross-account access, you can allow it by adding the needed IAM principal ARN to the condition.
Let’s make this concrete by defining a statement allowing administration.
Allow Administration
The `ci` user and `admin` role need access to administer the application’s bucket and data. This `AllowRestrictedAdministerResource` statement specifies administrative S3 API actions and intended principal ARNs to accomplish that:
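A sketch of the statement; the full curated list contains 25 administrative actions, so only a representative subset is shown here (the specific actions chosen are an assumption):

```json
{
  "Sid": "AllowRestrictedAdministerResource",
  "Effect": "Allow",
  "Principal": { "AWS": "*" },
  "Action": [
    "s3:GetBucketPolicy",
    "s3:PutBucketPolicy",
    "s3:PutEncryptionConfiguration",
    "s3:PutReplicationConfiguration",
    "s3:PutBucketTagging"
  ],
  "Resource": [
    "arn:aws:s3:::sensitive-app-data",
    "arn:aws:s3:::sensitive-app-data/*"
  ],
  "Condition": {
    "ArnEquals": {
      "aws:PrincipalArn": [
        "arn:aws:iam::111:role/admin",
        "arn:aws:iam::111:user/ci"
      ]
    }
  }
}
```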
The `AllowRestrictedAdministerResource` statement allows 25 S3 API actions related to administering a bucket. The actions include configuration of security, monitoring, replication, tagging, and other actions needed by operational and automated administration processes.
This list of administrative S3 API actions is curated by k9 Security and used by our Terraform and CDK libraries to generate least privilege security policies for S3.
The `ArnEquals` condition allows only administrators to execute the administrative API actions by specifying the `arn:aws:iam::111:role/admin` and `arn:aws:iam::111:user/ci` ARNs.
Allowing reads and writes works in the same way.
Allow Read & Write
Let’s add statements that allow read and write capabilities.
To allow reads, add an `AllowRestrictedReadData` statement that looks like:
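A sketch of the read statement; the actions shown are the ones named in this guide plus their versioned counterparts, not necessarily the full curated list:

```json
{
  "Sid": "AllowRestrictedReadData",
  "Effect": "Allow",
  "Principal": { "AWS": "*" },
  "Action": [
    "s3:ListBucket",
    "s3:ListBucketVersions",
    "s3:GetObject",
    "s3:GetObjectVersion"
  ],
  "Resource": [
    "arn:aws:s3:::sensitive-app-data",
    "arn:aws:s3:::sensitive-app-data/*"
  ],
  "Condition": {
    "ArnEquals": {
      "aws:PrincipalArn": [
        "arn:aws:iam::111:role/app",
        "arn:aws:iam::111:role/cust-service"
      ]
    }
  }
}
```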
This statement includes the S3 API actions most people expect when they ask for read capabilities (full list). The `s3:ListBucket` and `s3:GetObject` actions illustrate that when people say ‘read’, they often mean they need actions that apply to both the bucket and the objects in the bucket. This is why the `Resource` element covers the bucket and the objects in the bucket.
The `ArnEquals` condition grants the `app` and `cust-service` principal ARNs access to read data from the bucket, and no one else.
To allow writes, add an `AllowRestrictedWriteData` statement that looks like:
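A sketch of the write statement; the exact action list is an assumption covering simple and multipart uploads:

```json
{
  "Sid": "AllowRestrictedWriteData",
  "Effect": "Allow",
  "Principal": { "AWS": "*" },
  "Action": [
    "s3:PutObject",
    "s3:AbortMultipartUpload",
    "s3:ListMultipartUploadParts"
  ],
  "Resource": [
    "arn:aws:s3:::sensitive-app-data",
    "arn:aws:s3:::sensitive-app-data/*"
  ],
  "Condition": {
    "ArnEquals": {
      "aws:PrincipalArn": [
        "arn:aws:iam::111:role/app"
      ]
    }
  }
}
```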
This statement allows only the S3 API actions needed to write to the S3 bucket (full list), and uses the `ArnEquals` condition to grant that access only to the `app` role.
Congratulations! You’ve made it through constructing AWS security policy statements that allow least privilege administration, reads, and writes to an S3 bucket.
You could extend this pattern to allow deletes if your use case required it. Without a delete statement, the team can be confident that no data can be deleted from the `sensitive-app-data` bucket, either accidentally or intentionally.
Step 5: Deny Unencrypted Transport or Storage
Now let’s wrap up this bucket policy by implementing two common encryption security requirements: encryption in transit and at rest.
There’s no direct way to force implementation of those requirements at the bucket configuration level; however, we can deny actions that would violate them.
Deny accessing the bucket or its data over an insecure transport with this statement:
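A sketch of the transport-security statement, using the standard `aws:SecureTransport` condition key:

```json
{
  "Sid": "DenyInsecureCommunications",
  "Effect": "Deny",
  "Principal": "*",
  "Action": "s3:*",
  "Resource": [
    "arn:aws:s3:::sensitive-app-data",
    "arn:aws:s3:::sensitive-app-data/*"
  ],
  "Condition": {
    "Bool": { "aws:SecureTransport": "false" }
  }
}
```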
The `DenyInsecureCommunications` statement denies execution of all S3 API actions when the request is made to an insecure endpoint, i.e. over HTTP instead of HTTPS. The caller can still issue the request, but AWS will deny it as soon as it hits the S3 API endpoint, so use of HTTP endpoints should be found quickly.
Similarly, you can require encryption at rest with statements like:
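A sketch of the two at-rest encryption statements, using the `s3:x-amz-server-side-encryption` condition key with the `Null` and `StringNotEquals` operators:

```json
[
  {
    "Sid": "DenyUnencryptedStorage",
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::sensitive-app-data/*",
    "Condition": {
      "Null": { "s3:x-amz-server-side-encryption": "true" }
    }
  },
  {
    "Sid": "DenyStorageWithoutKMSEncryption",
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::sensitive-app-data/*",
    "Condition": {
      "StringNotEquals": { "s3:x-amz-server-side-encryption": "aws:kms" }
    }
  }
]
```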
These statements both apply to `s3:PutObject` and all objects in the bucket.
The `DenyUnencryptedStorage` statement denies putting data in the bucket if the `s3:x-amz-server-side-encryption` request header is not set. (AWS sets this automatically when using a secure endpoint.) The client could satisfy this encryption requirement by encrypting the object with either AES256 or a KMS encryption key.
The `DenyStorageWithoutKMSEncryption` statement requires objects be encrypted with a KMS key, which is more useful from a security perspective (details). The statement denies `s3:PutObject` unless the `s3:x-amz-server-side-encryption` request header has the value `aws:kms` (and not `AES256`).
Summary
This guide showed you how to create a secure S3 bucket policy that supports common use cases and security requirements in a maintainable way.
The Complete S3 Secure Bucket Policy Example built in this guide is available on GitHub.
Bucket policies are the best way to control access and enforce many security requirements in S3. They are also difficult and time-consuming to build. You can use the k9 Security Terraform module and CDK constructs for S3 and other services to accelerate delivery of strong security policies. We’re happy to help you use those infrastructure code libraries and incorporate feedback.
Contact Us
Please contact us with questions or comments. We’d love to discuss AWS security with you.