Basics of Access To AWS S3 Buckets And Objects

Posted by Panos Matsinopoulos on 03/Mar/2023 (18:12)
Hero Image by Penny from Pixabay

We will run an online live seminar in May 2023 on Terraform and AWS. It is an 11-session seminar, 2 hours per session (22 hours total). If you want to learn more about it, you can click here: Practical Terraform & AWS

AWS S3 is the AWS service for object storage in the cloud. It is used by many different kinds of applications. For example, web or mobile applications that deal with photos often store the photo files in S3 buckets.

When it comes to security, i.e. who can access which bucket and which of its contents, AWS S3 offers a very flexible implementation that covers almost every use case you can think of.

In this blog post, we would like to present you with the basics. However, even the basics are sometimes difficult to understand by just reading the theory. Hence, we will take you through a practical walkthrough so that you can better understand what is going on.

Action Plan

Here is the action plan that we will follow while reading and practising at the same time:

  1. We will use the AWS CLI for our interaction with AWS S3.
  2. We will create a bucket.
  3. We will upload a file.
  4. We will test that the file is private.
  5. We will then make the file public.
  6. We will then upload another file which will be immediately public.
  7. We will then delete the files and the bucket to make sure that we don't consume AWS resources.

Prerequisites

To follow along and take these actions in your own AWS account, there are some prerequisites:

  1. You need an AWS Account.
  2. You need to have an IAM user (let's call it s3_bucket_manager) that has at least the following policy attached:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "s3:CreateBucket",
            "Resource": "arn:aws:s3:::tech-career-booster-s3-blog-post"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:PutObjectAcl"
            ],
            "Resource": "arn:aws:s3:::tech-career-booster-s3-blog-post/*"
        },
        {
            "Sid": "VisualEditor2",
            "Effect": "Allow",
            "Action": "s3:DeleteBucket",
            "Resource": "arn:aws:s3:::tech-career-booster-s3-blog-post"
        }
    ]
}

The policy above allows the user to create the S3 bucket named tech-career-booster-s3-blog-post. It also allows the user to upload, download, delete, and set the ACL (Access Control List) of the objects in this specific bucket. Finally, it allows the user to delete the bucket tech-career-booster-s3-blog-post.

  3. You need to have the AWS CLI installed.
  4. You need to have an AWS profile locally configured for the IAM user that will be used for the practice, i.e. for s3_bucket_manager. If you want to learn how to configure the AWS CLI for a profile, please read here.
  5. Locally, create the file hello.txt and put some content inside.
  6. Locally, create the file bar.txt and put some content inside.
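The policy document and the two sample files from the prerequisites can be prepared from the terminal. The sketch below is one possible way to do it: the file name policy.json, the policy name s3-blog-post-policy, and the sample file contents are all our own choices, not something mandated by AWS.

```shell
# Save the minimum policy shown above to a local file and validate
# that it is well-formed JSON before attaching it to the IAM user.
cat > policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "s3:CreateBucket",
            "Resource": "arn:aws:s3:::tech-career-booster-s3-blog-post"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:PutObjectAcl"
            ],
            "Resource": "arn:aws:s3:::tech-career-booster-s3-blog-post/*"
        },
        {
            "Sid": "VisualEditor2",
            "Effect": "Allow",
            "Action": "s3:DeleteBucket",
            "Resource": "arn:aws:s3:::tech-career-booster-s3-blog-post"
        }
    ]
}
EOF
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"

# Attaching it as an inline policy requires IAM admin credentials, so the
# command is left commented out here:
# aws iam put-user-policy --user-name s3_bucket_manager \
#     --policy-name s3-blog-post-policy --policy-document file://policy.json

# Create the two local sample files used in the walkthrough.
echo "Hello World!" > hello.txt
echo "Some bar content" > bar.txt
```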

With the prerequisites in place, let's carry out our action plan.

Create the S3 Bucket

The following command will create the S3 bucket named tech-career-booster-s3-blog-post in the region eu-west-1.

$ aws s3api create-bucket --profile s3_bucket_manager \
--bucket tech-career-booster-s3-blog-post \
--region eu-west-1 --create-bucket-configuration LocationConstraint=eu-west-1

If everything goes well, you will see the output:

{
    "Location": "http://tech-career-booster-s3-blog-post.s3.amazonaws.com/"
}

When we create an S3 bucket with a command like the one above, the default settings make the bucket private. Also, any object that we might upload to this bucket will be private too. Private means that only the bucket owner and the object owner have access to it.

Upload a File

With the bucket created, let's use the next command to upload the file hello.txt.

$ aws s3api put-object --profile s3_bucket_manager --bucket tech-career-booster-s3-blog-post \
--region eu-west-1 --key hello.txt --body hello.txt

If everything goes well, you will see something like this:

{
    "ETag": "\"8ddd8be4b179a529afa5f2ffae4b9858\"",
    "ServerSideEncryption": "AES256"
}
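A side note on that ETag: for a simple, single-part put-object upload like this one (with S3-managed encryption), the ETag is the hex MD5 digest of the file's contents, so you can verify an upload locally. A minimal sketch, with sample content of our own choosing (your content, and therefore your digest, will differ):

```shell
# Create a sample file and compute its MD5 digest; for a single-part
# upload, this digest is what S3 reports as the object's ETag
# (without the surrounding quotes).
echo "Hello World!" > hello.txt
etag_local=$(md5sum hello.txt | awk '{print $1}')
echo "${etag_local}"
```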

Test File Privacy

As we said earlier, the default settings make this uploaded file private. This means that if we try to download it using curl, for example, or a browser, the request will fail. Let's try that:

$ curl -v -X GET -L 'http://tech-career-booster-s3-blog-post.s3.amazonaws.com/hello.txt' --output hello-downloaded.txt

Note: -L asks curl to follow redirects, because the URL we are passing uses the global S3 endpoint rather than the direct, regional URL of the file in the S3 bucket.
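To make the redirect concrete, here is a small sketch of how the two virtual-hosted-style URLs are formed, using our bucket, key, and region. The global endpoint (the one we used above) answers with a redirect to the regional endpoint, which is why curl needs -L:

```shell
bucket="tech-career-booster-s3-blog-post"
key="hello.txt"
region="eu-west-1"

# Global endpoint: S3 responds with a redirect to the bucket's region.
global_url="http://${bucket}.s3.amazonaws.com/${key}"
# Regional endpoint: the direct URL, no redirect needed.
regional_url="http://${bucket}.s3.${region}.amazonaws.com/${key}"

echo "${global_url}"
echo "${regional_url}"
```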

The response to the curl command is a 403 Forbidden status code, which clearly tells us that we are not allowed to download the file, i.e. the file is not publicly available.

The file is accessible only with the credentials of the AWS account that owns it, i.e. by the user who created it.

Hence, the following command will succeed:

$ aws s3api get-object --profile s3_bucket_manager --region eu-west-1 --bucket tech-career-booster-s3-blog-post \
--key hello.txt hello-downloaded.txt

It will return something like this:

{
    "AcceptRanges": "bytes",
    "LastModified": "2023-03-03T15:14:25+00:00",
    "ContentLength": 13,
    "ETag": "\"8ddd8be4b179a529afa5f2ffae4b9858\"",
    "ContentType": "binary/octet-stream",
    "ServerSideEncryption": "AES256",
    "Metadata": {}
}

and the file hello-downloaded.txt will have been created locally, having the same content as the original hello.txt.

Let's remove the downloaded file. We will download it again later.

$ rm hello-downloaded.txt

Make the File Public

Can we turn the file from private to public? The answer is, of course, yes. We will use the following command to update the ACL (Access Control List) of the S3 object.

$ aws s3api put-object-acl --profile s3_bucket_manager --region eu-west-1 --bucket tech-career-booster-s3-blog-post \
--key hello.txt --acl public-read

Do you see the --acl public-read argument? This is how we specify a canned ACL, i.e. one of a set of ACLs that are predefined by AWS S3 and can be re-used across objects. The public-read canned ACL gives everyone read access to the particular object it is applied to.

With the file now having this ACL, we can use the curl command again:

$ curl -v -X GET -L 'http://tech-career-booster-s3-blog-post.s3.amazonaws.com/hello.txt' --output hello-downloaded.txt

This time, the response code is 200 OK, and we see the file hello-downloaded.txt created with the same content as the original hello.txt file.

Upload File And Make Immediately Public

We can make a file public at the very moment we upload it to the S3 bucket. Let's do that with the other file that we have, bar.txt.

We use the same command to upload but we add the --acl public-read argument too:

$ aws s3api put-object --profile s3_bucket_manager --bucket tech-career-booster-s3-blog-post \
--region eu-west-1 --key bar.txt --body bar.txt --acl public-read

Now the curl command will immediately succeed:

$ curl -v -X GET -L 'http://tech-career-booster-s3-blog-post.s3.amazonaws.com/bar.txt' --output bar-downloaded.txt

Delete the Objects and the Bucket

Before we close this blog post, let's use the AWS CLI to delete the two objects and then the bucket.

To delete the objects, we use the delete-objects command, which allows us to delete both objects with a single call.

$ aws s3api delete-objects --profile s3_bucket_manager --region eu-west-1 --bucket tech-career-booster-s3-blog-post \
--delete 'Objects=[{Key=hello.txt},{Key=bar.txt}]'

To delete the bucket, we use the delete-bucket command.

$ aws s3api delete-bucket --profile s3_bucket_manager --region eu-west-1 --bucket tech-career-booster-s3-blog-post
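With the AWS resources gone, we can optionally tidy up the local files created during the walkthrough as well (the file names below are the ones used throughout this post):

```shell
# Remove the local sample files and the downloaded copies; -f makes this
# safe to run even if some of the files were never created.
rm -f hello.txt bar.txt hello-downloaded.txt bar-downloaded.txt
```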

Next Blog Post

In an upcoming blog post, we will use 3 popular programming languages to create buckets, upload files, and control their privacy.

Don't forget to subscribe to our newsletter to get notified when a new blog post is published to Tech Career Booster blog.

Thank you for reading. Your comments are always welcome.

About Tech Career Booster

Tech Career Booster offers high-quality computer programming courses and professional services.

