AWS Professional Solutions Architect 2020: S3 Security

This course covers the AWS (Amazon Web Services) S3 object cloud storage service. It is important to know how to block objects from being deleted or modified, and this course teaches learners how to use the security options available. The course demonstrates bucket encryption, access policies, and locking. You will learn how to create a permissions policy to allow resources to access an S3 bucket, how to modify bucket permissions using the bucket ACL (access control list), and how to block public access to an S3 bucket. You will explore how to determine when to use CORS (cross-origin resource sharing) to gain access to resources from a different origin. You will learn the difference between governance and compliance modes, the choice of which depends on your legal or regulatory requirements, and how to enable these modes. Finally, the course covers S3 storage class analysis, S3 object locking, and how to enable S3 encryption by using the GUI, CLI, and PowerShell. This course can be used in preparation for the AWS Certified Solutions Architect-Professional SAP-C01 certification exam.

Table of Contents

  1. Course Overview
  2. S3 Bucket Policies
  3. S3 Bucket Access Control List
  4. S3 Bucket Public Access
  5. S3 CORS Support
  6. S3 Storage Class Analysis
  7. S3 Object Locking
  8. S3 Bucket Encryption and the GUI
  9. S3 Bucket Encryption and the CLI
  10. S3 Bucket Encryption and PowerShell
  11. Course Summary

Course Overview

[Video description begins] Topic title: Course Overview. The host for this session is Dan Lachance, an IT Trainer / Consultant.  [Video description ends]

Hi, I’m Dan Lachance. I’ve worked in various IT roles since the early 1990s, including as a technical trainer, as a programmer, a consultant, as well as an IT tech author and editor. I’ve held and still hold IT certifications related to Linux, Novell, Lotus, CompTIA, and Microsoft. Some of my specialties over the years have included networking, IT security, cloud solutions, Linux management, and configuration and troubleshooting across a wide array of Microsoft products. The AWS Certified Solutions Architect Professional SAP-C01 certification exam focuses on the abilities of a solutions architect to design and deploy scalable, available, and reliable applications on AWS; use appropriate AWS services to design and deploy applications that meet given requirements; migrate complex applications on AWS; design and deploy enterprise-wide scalable operations on AWS; and implement cost control strategies.

In this course, we’re going to explore S3 security options, including bucket encryption, access policies, and object locking. I’ll start by examining how to create a permissions policy for an S3 bucket, how to modify bucket permissions using the bucket access control list, or ACL, and how to block public access for an S3 bucket. I’ll then explore how to determine when to use cross-origin resource sharing, otherwise called CORS, C-O-R-S, and how to perform an S3 storage class analysis. Lastly, I’ll demonstrate how to enable S3 object locking and how to use the GUI, CLI, and PowerShell to enable S3 encryption.

S3 Bucket Policies

[Video description begins] Topic title: S3 Bucket Policies. The presenter is Dan Lachance. [Video description ends]

You can control access to the contents of an S3 bucket using a permissions policy, a bucket policy.

[Video description begins] A web page labeled “AWS Management Console” is open. It includes the following sections “AWS services” and “Explore AWS”. [Video description ends]

To get started with this, in the AWS Management Console, I’m going to go ahead and search up S3 and pop up the S3 Management Console. And I’m going to click on an existing bucket where I want to set the policy.

[Video description begins] In the AWS services section, he clicks a search box labeled “Find Services”. A drop-down list opens. He clicks an option labeled “S3”. A web page labeled “S3 Management Console” opens. The web page is divided into two parts. The first part is the navigation pane. The navigation pane is labeled “Amazon S3”. It includes options labeled “Buckets”, “Batch operations”, and “Access analyzer for S3”. The Buckets option is selected. The second part is the content pane. It includes a table with several columns and three rows. The column headers include “Bucket name”, “Access”, and “Region”. He clicks an entry labeled “bucketyhz” under the column header Bucket name. The corresponding page opens. The page contains tabs labeled “Overview”, “Properties”, “Permissions”, “Management”, and “Access points”. The Overview tab is selected. [Video description ends]

Now, to get to it, I’m going to have to go to the Permissions tab, where I’ll see a Bucket Policy button.

[Video description begins] He clicks the Permissions tab. It includes buttons labeled “Block public access”, “Access Control List”, “Bucket Policy”, and “CORS configuration”. [Video description ends]

And from here, if we already have a policy, it’ll be shown here in JSON syntax.

[Video description begins] He clicks the Bucket Policy button. The corresponding section opens. It includes a Bucket policy editor text box and a Policy generator link. [Video description ends]

But we have nothing. Now, down below, instead of typing it in from scratch, starting on line one, we could click instead the Policy Generator link which opens a new tab. And from here, we would choose for the type of policy, S3 Bucket Policy.

[Video description begins] A web page labeled “AWS Policy Generator” opens. It contains sections labeled “Step 1: Select Policy Type”, “Step 2: Add Statement(s)”, and “Step 3: Generate Policy”. The Step 1: Select Policy Type section contains a drop-down list box labeled “Select Type of Policy”. He selects an option labeled “S3 Bucket Policy” in the drop-down list box. [Video description ends]

Now we have to determine whether, with this specific configuration, we want to allow or deny specific access to the bucket. I want to Allow it.

[Video description begins] He points to the Step 2: Add Statement(s) section. It includes two radio buttons labeled “Allow” and “Deny” adjacent to the field “Effect”, two text boxes labeled “Principal” and “Amazon Resource Name (ARN)”, and two drop-down list boxes labeled “AWS Service” and “Actions”. The Allow radio button is selected. [Video description ends]

You could specify the Principal or Principals; you could have a comma-separated list of principals, or security entities.

[Video description begins] He points to the Principal text box. [Video description ends]

Like user accounts that would have access to the bucket, or even IAM roles. Now, you could also specify an asterisk here for everybody.

[Video description begins] He types “*” in the Principal text box and points to the AWS Service drop-down list box. [Video description ends]

So the AWS service is set to S3, then you get to determine the action, or you could just choose All Actions. But normally, you don’t want to do that.

[Video description begins] He clicks the Actions drop-down list box. A drop-down list opens. [Video description ends]

That’s basically in violation of the principle of least privilege, if you’re just giving all access instead of only the access that’s needed. And maybe the only access that’s needed is to retrieve information. So, maybe using the Get series of actions in S3, maybe GetObject, for example, to retrieve objects.

[Video description begins] He selects an option labeled “GetObject” from the drop-down list. [Video description ends]

So once you’ve configured this correctly, you also have to make sure you put in the Amazon Resource Name or the ARN for the S3 bucket. Now the ARN will look like this.

[Video description begins] He switches to the S3 Management Console web page and highlights the following code: “arn:aws:s3:::bucketyhz/*” in the Bucket policy editor text box. [Video description ends]

I’ve gone back to my policy permissions page, where I can see the code that’s been generated here. I can see that the ARN here is set to arn:aws:s3::: followed by the name of my bucket/*.

[Video description begins] He highlights the following code: “s3:GetObject”. [Video description ends]

We can see that s3:GetObject as an action has been selected, the principal here of course, is the “*”, the effect is to allow as opposed to deny.

[Video description begins] He highlights * in the following code: “Principal”: “*”,. [Video description ends]

So we’re going to go ahead and Save that policy.

[Video description begins] He highlights Allow in the following code: “Effect”: “Allow”,. [Video description ends]

And notice when I do that I get an Error, Access denied.

[Video description begins] He clicks a button labeled “Save”. [Video description ends]

Now there was a message that was displayed here down below, that says that The block public access settings turned on for this bucket will prevent granting public access. Okay, well let’s go to Block public access up at the top.

[Video description begins] He clicks the Block public access button. It includes various checkboxes. [Video description ends]

It’s currently on, well that’s not good. Let’s click Edit, we’re going to turn that off, Save that.

[Video description begins] He unchecks a checkbox labeled “Block all public access” and clicks a button labeled “Save”. A dialog box labeled “Edit block public access (bucket settings) opens. It contains a text box and buttons labeled “Cancel” and “Confirm”. [Video description ends]

We’re going to type in the word confirm, as it requests, to confirm I want to do this. I’m going to go back to the Bucket Policy again and I’ve got my policy code in here now.

[Video description begins] He clicks the Confirm button. He clicks the Bucket Policy button and its corresponding section opens. [Video description ends]

So, I’m just going to go ahead and click on Save. Notice, this time, it took. It says, This bucket has public access. Now, naturally, you don’t want to take this lightly. If there’s any type of sensitive information, then you don’t want to do this for the entire bucket. So now that we’ve done this, we need to test it.

[Video description begins] He selects the Overview tab. It includes a folder labeled “projects”. [Video description ends]

So the way we’re going to test it, is go to the Overview tab for the bucket, where we can see, for example, we’ve got a projects folder.

[Video description begins] He clicks the projects folder. It includes several files. Adjacent to each file, there is a checkbox. [Video description ends]

We’ve got a number of files available here, such as Project_A.txt. If I were to just click the checkbox to the left of it, we can see the Object URL over here.

[Video description begins] The corresponding pane opens. It includes an Object URL: https://bucketyhz.s3.amazonaws.com/projects/Project_A.txt. [Video description ends]

So, let’s just click on that. And notice, it takes us directly into the contents of that text document.

[Video description begins] The corresponding web page opens in a browser. [Video description ends]

We aren’t prompted for credentials or anything because the bucket policy, along with turning off blocking public access, allows us to view this S3 object.
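
For reference, the same bucket policy can also be applied without the console. What follows is a minimal AWS CLI sketch, not part of the demonstration, assuming the bucketyhz bucket from this demo and passing the same policy statement inline:

  # Attach the same policy the Policy Generator produced (public access
  # blocking must already be turned off for this public policy to be accepted)
  aws s3api put-bucket-policy --bucket bucketyhz --policy '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::bucketyhz/*"
    }]
  }'

  # Confirm the policy is now in place
  aws s3api get-bucket-policy --bucket bucketyhz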

S3 Bucket Access Control List

[Video description begins] Topic title: S3 Bucket Access Control List. The presenter is Dan Lachance. [Video description ends]

Every Amazon S3 bucket has an access control list, or an ACL, that determines which entities have access to the contents within the S3 bucket. To look at that here in the S3 Management Console, I’ve got to click on an existing bucket to open it up.

[Video description begins] The S3 Management Console web page is open. The Buckets option is selected in the navigation pane and its corresponding page is open in the content pane. He clicks the entry “bucketyhz”. The corresponding page opens. [Video description ends]

I’ve got to click on the Permissions tab at the top, then I’m going to click Access Control List. So we can see up at the top the Canonical ID for the bucket owner, that is myself, which has the ability to List objects, Write objects, and Read and Write bucket permissions. Now, below that we have an Add account button for other AWS accounts. Now, remember, larger enterprises could use multiple AWS accounts for different business units, different projects, or different countries, that type of thing. So you might need to open up the contents of this bucket to another department or another project and whatnot. So to do that, we can click Add account, where we can specify a Canonical ID number or an email address for that other account. And then, of course, set the appropriate permissions.

[Video description begins] He points to checkboxes labeled “Yes” for permissions labeled “List objects”, “Write objects”, “Read bucket permissions”, and “Write bucket permissions”. [Video description ends]

Now down below, we can also grant public access permissions for everyone, or S3 log delivery group permissions for logging purposes.

[Video description begins] He selects a radio button labeled “Everyone” in a section labeled “Public access”. The corresponding pane opens. [Video description ends]

So back to Public access, I’ve turned on Everyone. Let’s say we want to allow List objects. It says This bucket will have public access because that’s what you’re turning on.

[Video description begins] He selects a checkbox labeled “List objects”. [Video description ends]

Now I can also elect to turn on writing of objects, but let’s just leave it at List objects. And I’ll just click Save. Now, when I scroll back up, notice now how it shows us that we’ve got public access for this entire bucket.

[Video description begins] He points to the Access Control List button. [Video description ends]

So that if we were to go to the Overview, let’s say, we can see folders.

[Video description begins] He clicks the Overview tab. It includes the projects folder. He clicks the projects folder. It includes several files. Adjacent to each file, there is a checkbox. [Video description ends]

Click on that. We can also see files.

[Video description begins] He selects the checkbox adjacent to the Project_A.txt file. The corresponding pane opens. It includes the Object URL: https://bucketyhz.s3.amazonaws.com/projects/Project_A.txt. [Video description ends]

We could test out access to a file. So if I put a check mark to the left of a file, it pops up the properties pane on the right, where I can see the Object URL for that file. So when I go ahead and click on it, it lets me straight into it.

[Video description begins] The corresponding web page opens in a browser. [Video description ends]

Now this is because we’ve modified the ACL, the access control list for the bucket, to allow listing of objects in the S3 bucket. Now there are other ways you can do this as well, such as through a bucket policy combined with making sure that public access to the bucket is not blocked.
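
As a point of reference only, and not something shown in the demo, here is a minimal AWS CLI sketch of granting the same public List objects (READ) permission through the bucket ACL. The bucket name bucketyhz comes from the demo; OWNER_CANONICAL_ID is a hypothetical placeholder for the bucket owner’s canonical ID, included because the grant options replace the existing ACL:

  # Replace the bucket ACL: keep full control for the owner and grant
  # READ (list objects) to the Everyone group
  aws s3api put-bucket-acl --bucket bucketyhz \
    --grant-full-control id=OWNER_CANONICAL_ID \
    --grant-read uri=http://acs.amazonaws.com/groups/global/AllUsers

  # Review the resulting ACL grants
  aws s3api get-bucket-acl --bucket bucketyhz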

S3 Bucket Public Access

[Video description begins] Topic title: S3 Bucket Public Access. The presenter is Dan Lachance. [Video description ends]

Here in the S3 Management console, I can see I’ve got two existing buckets. And under Access, it says Objects can be public.

[Video description begins] The S3 Management Console web page is open. The Buckets option is selected in the navigation pane and its corresponding page is open in the content pane. [Video description ends]

Now, when I go to Create a bucket, let’s do this. And I’ll just fill in a fictitious name with some random numbers.

[Video description begins] In the content pane, he clicks a button labeled “Create bucket” and its corresponding wizard opens. A step labeled “1 Name and region” is open in the wizard. [Video description ends]

We’re not actually going to create the bucket. But I want to show you something about the process of creating a new storage bucket in the cloud.

[Video description begins] He clicks a button labeled “Next” and a step labeled “2 Configure options” opens. He again clicks the Next button and a step labeled “3 Set permissions” opens. [Video description ends]

When you get to the Block public access portion of the wizard, notice the default is to Block all public access. So I’m going to click the X, I don’t want to create this. Now, clearly, that’s not been left as it was for bucketyhz because it says objects can be public. Okay, what does that mean exactly? It means public access to the contents of the bucket, such as not having to authenticate to gain access to files. So for example, if I click to open up the bucket, I can see there’s a projects folder, let’s open that up.

[Video description begins] He clicks the entry “bucketyhz”. The corresponding page opens. He clicks the projects folder. It includes several files. [Video description ends]

And I can see a number of files in that folder, I’m going to put a check mark to the left of one of them.

[Video description begins] He selects the checkbox adjacent to the Project_A.txt file. The corresponding pane opens. It includes the Object URL: https://bucketyhz.s3.amazonaws.com/projects/Project_A.txt. [Video description ends]

And over on the right, I’m going to click the Object URL, see if I can connect directly to it.

[Video description begins] The corresponding web page opens in a browser. [Video description ends]

Of course, it lets me in directly because public access is not being blocked currently, as it would be by default upon the initial creation of the bucket. Now, let’s check this out. So if I go back here in my breadcrumb trail in the upper left and click on the name of the bucket.

[Video description begins] He clicks the back button in the browser. A page labeled “Project_A.txt” is open in the S3 Management Console web page. He navigates back to the bucketyhz page. [Video description ends]

And if we click on Permissions, here’s Block public access, and it’s been turned off, down below. Well, that’s not the default, but we can change it back if we so choose. If we decide, you know, I don’t want to allow basically anonymous access to the contents of this bucket any longer, let’s change it around. So I’m going to click the Edit button, and I’m going to choose Block all public access. You have a number of subordinate options here: blocking public access to buckets and objects granted through new ACLs, or through any ACLs; granted through new public bucket or access point policies, which is what you would set by going to the Bucket Policy up here; and also through any policies, not just new ones, all of them. So we’re blocking everything here, we’re not allowing any public access. Okay, let’s Save this.

[Video description begins] He clicks the Save button. The Edit block public access (bucket settings) dialog box opens. He types the text “confirm” in the text box and clicks the Confirm button. [Video description ends]

It wants us to type in confirm, okay, Confirm button. And it says Public access settings updated successfully. Let’s try this again, let’s go back to the Overview tab for the bucket.

[Video description begins] He clicks the Overview tab. He clicks the projects folder. [Video description ends]

There’s the projects folder, and there’s, well, it doesn’t really matter which file, but we’ll choose the same file by clicking in the check box to the left. And I’m going to click the URL over on the right for that file.

[Video description begins] He selects the checkbox adjacent to the Project_A.txt file. The corresponding pane opens. The corresponding web page opens in a browser. [Video description ends]

Now, if we do a refresh, we’re getting an access denied because public access has been blocked, meaning the blocking feature that blocks public access has been turned on.
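
For completeness, the same block public access settings can be toggled from the AWS CLI as well. A minimal sketch, assuming the bucketyhz bucket from the demo:

  # Re-enable all four block public access settings for the bucket
  aws s3api put-public-access-block --bucket bucketyhz \
    --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

  # Check the current settings
  aws s3api get-public-access-block --bucket bucketyhz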

S3 CORS Support

[Video description begins] Topic title: S3 CORS Support. The presenter is Dan Lachance. [Video description ends]

Amazon S3 supports cross-origin resource sharing, otherwise called CORS, spelled C-O-R-S.

[Video description begins] S3 Cross-Origin Resource Sharing (CORS). [Video description ends]

Now, to explain this, imagine that you’ve got a client using a web app. So that means that the client device is connected to a specific DNS domain name for that web app.

[Video description begins] He points to the following domain: webapp.s3-website-us-east-1.amazonaws.com. [Video description ends]

But let’s say that there’s a component in that web app that needs to call upon a resource in a different DNS domain.

[Video description begins] He points to the following domain: bucket1.s3.amazonaws.com. [Video description ends]

Well, that’s where things can get a little bit sticky. And, if permissions aren’t set correctly, that request will fail. Now, we can enable this by setting up cross-origin resource sharing. To do that, you first need to create a CORS config XML file.

[Video description begins] Creating a CORS Configuration. [Video description ends]

Now, you could use a standard text editor and write it from scratch. But more often than not, you’d probably find yourself using the CORS editor that’s built directly into the S3 bucket properties permissions page. And the next order of business is to create one or more rules. Now, you can create up to 100 rules within a single configuration. Now, these rules are used to define origin DNS domain names. And what access they should have to target DNS domain names. And you’ll see what I’m talking about in a minute when we look at a sample configuration. So you specify the allowed origins to target the bucket. And the allowed HTTP methods, whether it’s GET, PUT, POST, DELETE, or HEAD. Now, picture on the screen, you’re looking at a sample CORS configuration.

[Video description begins] Sample CORS Configuration. A screenshot is displayed, which contains the following code, code starts: <CORSConfiguration> <CORSRule> <AllowedOrigin>https://webapp.s3-website-us-east-1.amazonaws.com </AllowedOrigin> <AllowedMethod>GET</AllowedMethod> <MaxAgeSeconds>2000</MaxAgeSeconds> <AllowedHeader>*</AllowedHeader> </CORSRule> </CORSConfiguration>. Code ends. [Video description ends]

So we’ve got the first opening tag CORSConfiguration. And at the very bottom /CORSConfiguration to close that tag. Then we’ve got a single CORSRule within the configuration. The allowed origin here is listed with https:// and it’s got a DNS name. This would be the origin name for, in our example, the web app that the user is connected to. Now, that’s the allowed origin tag. The allowed method in this case is only GET. We’ve also got MaxAgeSeconds here. Now, this is the amount of time that the client web browser will cache the response from Amazon S3. Now, this is just used to speed things up in some cases. Then we’ve got the AllowedHeader tag, which has an “*” in it. This is in reference, of course, to HTTP headers. So because this configuration is within the S3 bucket environment, we know what the target is. So we are allowing calls from the listed allowed origin.
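
If you prefer not to paste XML into the console’s CORS editor, the AWS CLI can apply an equivalent rule expressed as JSON. This is a minimal sketch of the same single-rule configuration, assuming the bucket1 bucket from the diagram:

  # Allow GET requests from the web app's origin, caching preflight
  # responses for 2000 seconds
  aws s3api put-bucket-cors --bucket bucket1 --cors-configuration '{
    "CORSRules": [{
      "AllowedOrigins": ["https://webapp.s3-website-us-east-1.amazonaws.com"],
      "AllowedMethods": ["GET"],
      "AllowedHeaders": ["*"],
      "MaxAgeSeconds": 2000
    }]
  }'

  # View the configuration that was stored
  aws s3api get-bucket-cors --bucket bucket1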

S3 Storage Class Analysis

[Video description begins] Topic title: S3 Storage Class Analysis. The presenter is Dan Lachance. [Video description ends]

When you store something in an S3 bucket, for example here in the S3 Management Console,

[Video description begins] The S3 Management Console web page is open. The Buckets option is selected in the navigation pane and its corresponding page is open in the content pane. He clicks the entry “bucketyhz”. The corresponding page opens. The Overview tab is selected. [Video description ends]

I will open an existing bucket and folder.

[Video description begins] He clicks the projects folder. It includes several files. [Video description ends]

Here we can see we’ve got a number of files in the bucket, and in the Storage class column, they are currently set to Standard.

[Video description begins] He selects the checkbox adjacent to a file labeled “ProjectA_Budget.xls”. He then clicks a drop-down button labeled “Actions” in the Overview tab and clicks an option labeled “Change storage class”. The corresponding dialog box opens. A radio button labeled “Standard” is selected in the dialog box. [Video description ends]

Now, we could select one or more of those entities, go to Actions, Change storage class, and maybe, to save money, switch it to Standard-IA, infrequently accessed, which costs less than the Standard storage class. But how do you know, especially when you initially put contents or files into the bucket, that that is going to be the usage pattern over time? You don’t always know.

[Video description begins] He clicks a button labeled “Cancel”. He unchecks the checkbox adjacent to the ProjectA_Budget.xls file. [Video description ends]

And that’s where S3 storage class analysis comes in handy, because it’ll watch access to your S3 objects, either the entire bucket, or based on tags, or some kind of filter string; it’s up to you. And it will make recommendations, over different time periods measured in days, for when you might consider moving things over to Standard-IA, infrequently accessed, to reduce your storage charges. Now, how does this work? Let’s go back to the bucket. So I’m going to click on the bucket name in the breadcrumb trail, up at the top.

[Video description begins] He clicks the Management tab. [Video description ends]

Now if I go to Management, you can add lifecycle rules manually to determine that you want to move data to a different storage class. But that’s not what we’re talking about, at least not directly. S3 storage class analysis can help you properly craft your lifecycle policies, but it doesn’t do it for you automatically. So what I’m going to do then, under Management, is choose Analytics. Now here, for Storage class analysis, we can search for a filter prefix or tag, but we really haven’t done anything yet. So I’m going to click Add, down below. Now I’m going to add a filter called Filter1.

[Video description begins] A pane labeled “Add filter” opens. It includes a text box labeled “Filter name”. He types Filter1 in the Filter name text box. [Video description ends]

Now you can add multiple filters. Literally, you can have up to 1,000 filters, and you’ll get a separate analysis report for each of those filters. And that can be useful if, for example, you’re storing files in the bucket for project A as well as for project B, and you want to do a storage class analysis just for project A’s stored items. You could do that if you wanted to. So, down below, we can put in a prefix, a text string prefix, or even a folder name, for example, or tag information that we want to filter by, or we can have this filter apply to the entire bucket contents. So I’m going to leave it empty so that the entire bucket will be analyzed, and I’m going to click Save. You can now see it says Analyzing your data. Now, realistically, you shouldn’t expect to get meaningful information right away. This is designed to analyze your storage usage over time.

[Video description begins] A web page labeled “Amazon S3 Analytics – Storage Class Analysis” opens in a new tab. [Video description ends]

Now the different uses of data will vary depending on the nature of the data, but here I’ve gone to the documentation for S3 Analytics, so we can get a sense of what the result will look like over time. What you’re going to start to see, just scrolling down here, is the kind of chart or graph that it will present. So notice we can see here that we’ve got S3 objects between 90 and 119 days old, and we can see the amount of storage involved. But take a look as we go further over to the right. We can see S3 objects that are older. And these are classified as infrequently accessed, and they would be a good candidate for transition to Standard-IA as a storage class, standard infrequently accessed.

So we can see from the legend above, that the blue listings are for the amount of storage and the purple is for the amount of data that was retrieved from that overall storage. So the purpose then of S3 Storage Class analytics is to let it watch access to your S3 Objects over time to determine if some things don’t get accessed a lot or often and therefore should be transitioned to Standard-IA. And you would configure your lifecycle policy rules accordingly, based on this resultant output.
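
The same analytics filter, and the kind of lifecycle rule you might create from its findings, can also be set up from the AWS CLI. This is a minimal sketch, assuming the bucketyhz bucket and the Filter1 name used in the demo; the 90-day transition and the projects/ prefix are only illustrations of what the analysis might suggest:

  # Create a storage class analysis filter covering the whole bucket
  aws s3api put-bucket-analytics-configuration --bucket bucketyhz --id Filter1 \
    --analytics-configuration '{"Id": "Filter1", "StorageClassAnalysis": {}}'

  # Later, act on the findings with a lifecycle rule that transitions the
  # projects/ prefix to Standard-IA after 90 days
  aws s3api put-bucket-lifecycle-configuration --bucket bucketyhz \
    --lifecycle-configuration '{
      "Rules": [{
        "ID": "projects-to-standard-ia",
        "Status": "Enabled",
        "Filter": {"Prefix": "projects/"},
        "Transitions": [{"Days": 90, "StorageClass": "STANDARD_IA"}]
      }]
    }'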

S3 Object Locking

[Video description begins] Topic title: S3 Object Locking. The presenter is Dan Lachance. [Video description ends]

S3 object locking is a powerful feature. There might be times where you need to secure some objects stored in an S3 bucket from being deleted or modified. And we can do this through S3 object locking. But the thing is, it’s one of those features that you have to enable when you create the bucket. For instance, let’s open up an existing bucket I’ve got here called bucketyhz.

[Video description begins] The S3 Management Console web page is open. The Buckets option is selected in the navigation pane and its corresponding page is open in the content pane. [Video description ends]

I’ll click to open it up, we’re going to go to the Properties tab for the bucket and I’m going to scroll down and there’s Object lock. So let’s click on it.

[Video description begins] He clicks the entry “bucketyhz”. The corresponding page opens. The Overview tab is selected. He selects the Properties tab. A pop-up box labeled “Object lock” opens. It includes a button labeled “Close”. [Video description ends]

And it says object lock can be enabled only when a bucket is created. You may not be sure whether you’ll ever need these kinds of options to prevent modification of the contents of S3, maybe due to regulatory compliance, or maybe for legal hold reasons. Maybe you’ve been instructed by your legal department to enable legal hold for some objects in S3 because they might be used in a court of law. And for admissibility in a court of law, there needs to be a way to prove the data was tamper-proof. So given that we can only do this at creation time, it might be a good idea to create buckets with the option on. And then you can determine if you actually want to use it after the fact.

[Video description begins] He clicks the Close button. [Video description ends]

Okay, I’m going to close all of that. We’re going to create a new bucket here.

[Video description begins] He clicks an option labeled “Amazon S3” from the breadcrumb navigation pane and the corresponding page opens. The Buckets option is selected in the navigation pane and its corresponding page is open in the content pane. [Video description ends]

So I’m going to go back to the Amazon S3 Management console and I’m going to click Create bucket, this we’ll call bucketyyy and I’m going to click Next.

[Video description begins] The Create bucket wizard opens. It contains four steps labeled “1 Name and region”, “2 Configure options”, “3 Set permissions”, and “4 Review”. The “1 Name and region” step is open in the wizard. He clicks the Next button. The “2 Configure options” step opens. [Video description ends]

Now, on this page here, the second page of the Wizard here for Configure options, if I scroll down under Advance, there’s Object lock, however, I can’t turn it on.

[Video description begins] He expands a section labeled “Advanced settings”. [Video description ends]

If I just read a little bit further, the fine print below it says, object lock requires bucket versioning. Let’s turn on versioning up at the top here.

[Video description begins] He selects a checkbox labeled “Versioning”. [Video description ends]

Let’s go back down and try it again. Yeah, that worked.

[Video description begins] He selects a checkbox labeled “Object lock”. He clicks the Next button. The “3 Set permissions” step opens. He clicks the Next button. The “4 Review” step opens. He clicks a button labeled “Create bucket”. [Video description ends]

Object lock is now enabled, okay. Let’s go through and accept the rest of the defaults to get the bucket created. There’s our new bucket, bucketyyy.

[Video description begins] The corresponding page opens. He selects the Properties tab. [Video description ends]

If I click to open it up and go to the Properties, sure enough, if we go down under Advanced settings, there’s the Object lock panel, Permanently enabled. If I click on Object lock, we have some other options.

[Video description begins] The “Object lock” pop-up box opens. A radio button labeled “None” is selected. [Video description ends]

We can enable governance mode. Now, it says very clearly here that if you do this, governance mode can be disabled by AWS accounts that have the appropriate IAM permissions. But if you enable something called compliance mode, it cannot be disabled by anybody, not even the AWS root account. That’s interesting. So your legal or regulatory requirements will determine whether you enable governance or compliance mode. Now, either way, if you were to choose governance mode, you have a retention period, the default of which is 1 day. The same thing is true for compliance mode.

[Video description begins] A text box labeled “Retention period” appears. [Video description ends]

What does this mean? It really means how long compliance or governance mode applies, or in other words, how long objects will be locked, after which you can then delete them if you need to. So I’m going to go ahead and turn on Enable governance mode and I’ll choose Save. I’ll leave the retention period at the default of 1 day. It says okay, please type in confirm.

[Video description begins] A dialog box labeled “Confirm governance mode” opens. [Video description ends]

Very good, we’re going to go ahead and do that. So I’ll type in confirm.

[Video description begins] He clicks the Confirm button. [Video description ends]

Looks good. So let’s go back up here, scroll back up and let’s go to the Overview of the bucket. Let’s upload a file to see if it picks up that change because existing files will not have that applied to them.

[Video description begins] He clicks the Overview tab and clicks an option labeled “Upload”. A wizard labeled “Upload” opens. A step labeled “1 Select files” is open in the wizard. [Video description ends]

So I’m going to click Add files and add a sample file here and I’ll go ahead and click Next.

[Video description begins] The File Explorer window opens. He selects a file labeled “Project_A.txt”. The “Project_A.txt” file is displayed in the “1 Select files” step. [Video description ends]

And basically I’m going to accept the rest of the defaults.

[Video description begins] He clicks the Next button in the next two steps labeled “2 Set permissions” and “3 Set properties” and then clicks a button labeled “Upload” in a step labeled “4 Review“. The Project_A.txt file appears in the Overview tab. [Video description ends]

So we’ve got a new file uploaded, and it looks good. Let’s click directly on the file name, and let’s go to the Properties tab for that file.

[Video description begins] The corresponding page opens. He selects the Properties tab. [Video description ends]

And let’s go down and click on the Object lock panel for that individual S3 file, that object. Notice governance mode is automatically selected and the retention date or the lock date is a day after today’s date.

[Video description begins] The “Object lock” pop-up box opens. [Video description ends]

Then notice what’s different when you go into the object lock settings for an individual object as opposed to the entire bucket level: you get the Legal hold option. You can turn on Enable for Legal hold. Think of legal hold as being an override to the retention date.

[Video description begins] He selects a radio button labeled “Enable”. [Video description ends]

So basically, when legal hold is enabled, this item cannot be modified until legal hold is disabled, regardless of what the retain-until date says. So I’m going to go ahead and choose Save, okay. So it says Legal hold was successfully enabled for this object, so I’m just going to go back to the bucket.

[Video description begins] He clicks an option labeled “bucketyyy” from the breadcrumb navigation pane. The corresponding page opens. [Video description ends]

So at this point we’ve enabled object locking at the bucket level during creation time and we also saw how we can enable legal hold for an individual S3 object.
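
For reference, the same steps can be scripted with the AWS CLI. What follows is a minimal sketch, not part of the demonstration, assuming the bucketyyy bucket and the Project_A.txt object from the demo, and a bucket created in us-east-1 (other regions also need a location constraint):

  # Object lock can only be enabled when the bucket is created;
  # versioning is turned on automatically as part of this
  aws s3api create-bucket --bucket bucketyyy --object-lock-enabled-for-bucket

  # Set a default retention of 1 day in governance mode
  aws s3api put-object-lock-configuration --bucket bucketyyy \
    --object-lock-configuration \
    '{"ObjectLockEnabled": "Enabled", "Rule": {"DefaultRetention": {"Mode": "GOVERNANCE", "Days": 1}}}'

  # Place a legal hold on an individual object, overriding the retain-until date
  aws s3api put-object-legal-hold --bucket bucketyyy --key Project_A.txt \
    --legal-hold Status=ON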

S3 Bucket Encryption and the GUI

[Video description begins] Topic title: S3 Bucket Encryption and the GUI. The presenter is Dan Lachance. [Video description ends]

Encryption of data at rest in the Amazon Web Services cloud is sometimes required to be compliant with laws or regulations. We can do it; we can set encryption as a default at the bucket level. So for example, here in the S3 Management Console, I’ve got an existing bucket named bucketyyy. Now let’s click on it to see what’s in there.

[Video description begins] The S3 Management Console web page is open. The Buckets option is selected in the navigation pane and its corresponding page is open in the content pane. [Video description ends]

We’ve got one sample file called project_A.txt.

[Video description begins] The corresponding page opens. [Video description ends]

If I put a check mark to the left of it and look over on the right at the Properties, we can see Encryption is set to None.

[Video description begins] A pane titled “Project_A.txt” opens. [Video description ends]

Well that’s consistent with the fact that if I go into the Properties tab here for the bucket, and if we look at the Default encryption, it’s actually Disabled.

[Video description begins] He clicks the Properties tab. [Video description ends]

Okay, however, what we could do is enable encryption for an individual object in the bucket if we wanted to. So for example, Project_A.txt; I’m going to select the checkbox to the left to select it. I’ll go to Actions and I’ll choose Change encryption.

[Video description begins] He clicks the Overview tab. He clicks the Actions drop-down button and selects an option labeled “Change encryption”. The corresponding pane opens. [Video description ends]

It’s currently set to None, let’s use AES-256.

[Video description begins] He selects a radio button labeled “AES-256”. [Video description ends]

That’s Advanced Encryption Standard 256-bit encryption, where Amazon will manage the key. So otherwise, we could use KMS, where we’ve got our own keys that we control.

[Video description begins] He selects a radio button labeled “AWS-KMS”. A drop-down list box appears. He clicks the drop-down list box. It includes options labeled “aws/s3” and “Key-East”. [Video description ends]

And that might be required sometimes to remain compliant with certain regulations. For this case, I’m just going to use AES-256 and choose Save. It says all objects will change encryption. Well, there’s only one object up there. That’s okay; the change went through.

[Video description begins] A pane with the same name opens. He points to the Project_A.txt file. He then clicks a button labeled “Change”. The pane closes. [Video description ends]

Now let’s just click somewhere else and come back to the Overview and refresh to make sure everything’s good. And I’ll just put the checkmark in to select that file again. Let’s just check the encryption over here on the right. Notice now it says Encryption AES-256. Okay, but wouldn’t it be great if we didn’t have to worry about it?

[Video description begins] He selects the Properties tab and switches back to the Overview tab. He selects the checkbox adjacent to the “Project_A.txt” file. The corresponding pane opens. [Video description ends]

Well we can set it as a default for the bucket. So let’s go back into the Properties tab for the bucket.

[Video description begins] He clicks the Properties tab and then clicks an option labeled “Default encryption”. The corresponding pop-up box opens. [Video description ends]

Let’s go down to Default encryption, and again, I’m going to choose AES-256. There is just a little note here that bucket policies, which control permissions such as reading or writing to the bucket, among other things, get evaluated before bucket encryption occurs. Good to know. That’s fine, Save.

[Video description begins] He clicks a button labeled “Save”. [Video description ends]

Okay, so we’ve enabled AES-256 encryption as a default for the bucket. Let’s go back to Overview. Let’s upload some new content to the bucket.

[Video description begins] He clicks the Overview tab. He then clicks the Upload button. The corresponding wizard opens. The “1 Select files” step is open in the wizard. A file labeled “Project_B.txt” is displayed. He clicks the Next button. [Video description ends]

So I’m going to go ahead and go through Next and accept all of the defaults here.

[Video description begins] He clicks the Next button in the “2 Set permissions” and “3 Set properties” steps and the Upload button in the “4 Review“ step. The Project_B.txt file appears in the Overview tab. [Video description ends]

And we’ve now got a new file Project_B and if I select it here and look over on the right.

[Video description begins] He selects the checkbox adjacent to the Project_B.txt file. The corresponding pane opens. [Video description ends]

We can see that the Encryption is automatically set now to AES-256. So it picked that up from our bucket default encryption setting.
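
One way to double-check the result outside the console is to query the object’s metadata with the AWS CLI; the response should include a ServerSideEncryption field once the default has been applied. A minimal sketch, assuming the bucketyyy bucket and Project_B.txt object from the demo:

  # The head-object output should report "ServerSideEncryption": "AES256"
  aws s3api head-object --bucket bucketyyy --key Project_B.txt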

S3 Bucket Encryption and the CLI

[Video description begins] Topic title: S3 Bucket Encryption and the CLI. The presenter is Dan Lachance. [Video description ends]

You can use the AWS CLI to encrypt the contents of S3 buckets. Here in the S3 Management Console, I’m looking at an existing bucket named bucketyhz. And I’ve got a projects folder within it with a number of sample files.

[Video description begins] The “bucketyhz” page is open in the S3 Management Console web page. The Overview tab is selected. [Video description ends]

Let’s click on that to open it up. Notably, we’re going to be working with Project_A.txt, Project_B.txt and Project_C.txt. Now, if I click on any one of those to select it and take a look at whether or not it’s encrypted.

[Video description begins] He selects the checkbox adjacent to the “Project_A.txt” file. The corresponding pane opens. [Video description ends]

Notice Encryption is set to None. Let’s just look at, say, Project_C; the same thing is true for it.

[Video description begins] He selects the checkbox adjacent to the “Project_C.txt” file. The corresponding pane opens. [Video description ends]

Now, let’s flip over to the CLI and start some encryption. Now, in this particular case, I’m going to encrypt that existing file.

[Video description begins] He opens the Command Prompt window. The following prompt is displayed: C:\>. [Video description ends]

So I’m not going to set it as a default at the bucket level. I’m going to do that using the aws s3 cp command; cp means copy. If you’re a Unix or Linux person, then you definitely know that. So what I’m doing is specifying the s3:// prefix, the name of my bucket, and the projects folder. We know there’s a file there called Project_A.txt. And we want to copy that basically to the same name, so we’re repeating that; that’s the target. And what we’re doing is adding --sse for server-side encryption. We’re specifying we want to use AES256, which means that these are Amazon managed keys. Let’s go ahead and do it. Well, we didn’t get very far because the syntax is wrong; there’s a misplaced space. Well, let’s go back and fix that; these things happen. Let’s try it again. Looks better, but let’s go check our work in the console.

[Video description begins] He executes the following command: aws s3 cp s3://bucketyhz/projects/Project_A.txt s3://bucketyhz/projects/Project_A.txt --sse AES256. The output reads: copy: s3://bucketyhz/projects/Project_A.txt to s3://bucketyhz/projects/Project_A.txt. The prompt does not change. [Video description ends]

Back here in the console, let’s just select Project_A here.

[Video description begins] The corresponding pane opens. [Video description ends]

And let’s just take a peek, and indeed, the Encryption is AES-256. I didn’t even have to click the Refresh icon to see that change. But what if you want to encrypt everything?

[Video description begins] He switches back to the Command Prompt window. [Video description ends]

Certainly, you’re not going to do it for each individual item, and we don’t have to. We’re going to accomplish this with the aws s3 cp command again. So copy again. Same kind of thing, except instead of pointing to a specific S3 object, in this case I could point to just the entire bucket. But here I’m pointing to the projects folder within the bucket, and the same thing is true for the target. The only thing that’s different here is I’m saying --recursive. And of course, once again, --sse for server-side encryption, and I want to set it to AES256. Again, Amazon managed keys. That’s fine. Let’s press Enter and make it so. We’re going to go check our work here.

[Video description begins] He executes the following command: aws s3 cp s3://bucketyhz/projects/ s3://bucketyhz/projects/ --recursive --sse AES256. The output displays the encryption of all the files in the projects folder. [Video description ends]

So it looks like it did it to all the items within the projects folder. Back here in the console, really we can take our pick of the files; let’s look at ProjectB_Budget.xls. So I will select that and deselect the other one.

[Video description begins] The corresponding pane opens. [Video description ends]

And we can take a look here. And we see that Encryption indeed is set to AES-256.
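
Note that the aws s3 cp approach above re-encrypts objects that are already in the bucket. If you also want new uploads encrypted automatically, the CLI can set the bucket default as well, which is what the next demonstration does through PowerShell. A minimal CLI sketch, assuming the bucketyhz bucket:

  # Set AES-256 (Amazon managed keys) as the bucket's default encryption
  aws s3api put-bucket-encryption --bucket bucketyhz \
    --server-side-encryption-configuration \
    '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}'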

S3 Bucket Encryption and PowerShell

[Video description begins] Topic title: S3 Bucket Encryption and PowerShell. The presenter is Dan Lachance. [Video description ends]

PowerShell can be used to enable default S3 bucket encryption. In this example, we’re going to take a look at an existing bucket here in the S3 Management Console called bucketyhz. So I’m going to click on it to open it up and go to the Properties.

[Video description begins] The S3 Management Console web page is open. The Buckets option is selected in the navigation pane and its corresponding page is open in the content pane. [Video description ends]

And let’s just take a look at the Default encryption panel. It currently says Disabled. Here in the GUI, we could enable it for either AES-256, where Amazon manages the keys, or AWS-KMS, the Key Management Service, where we can configure a key and select one of our choosing from here.

[Video description begins] He clicks the Properties tab. [Video description ends]

That might be necessary if we need to manage the keys ourselves for regulatory or legal compliance of some kind.

[Video description begins] He clicks the “Default encryption” option. The corresponding pop-up box opens. He selects the “AWS-KMS” radio button. The drop-down list box appears. He clicks the drop-down list box. It includes the “aws/s3” and “Key-East” options. [Video description ends]

However, I don’t want to change anything here. Let’s just cancel all the way back out, and let’s enable the Default encryption through PowerShell. What that’s going to mean is that any new content placed into that S3 bucket will be automatically encrypted.

[Video description begins] He selects the None radio button and closes the pane. [Video description ends]

To do this, we’re going to use the Set-S3BucketEncryption PowerShell cmdlet. We’re going to use the -BucketName parameter.

[Video description begins] The Windows PowerShell window opens. The following prompt is displayed: PS C:\>. [Video description ends]

Of course we have to give it the name of the bucket, in this case bucketyhz. Then I’m going to use -ServerSideEncryptionConfiguration_ServerSideEncryptionRule; that’s a long parameter name. Then I’m going to have an @ symbol, and then an opening as well as an outer closing curly brace. Now after the first opening one, we’re going to put in ServerSideEncryptionByDefault=, then another @ symbol, another opening curly brace, ServerSideEncryptionAlgorithm=, and, in quotes, AES256. And we’re going to close that curly brace. So you’ve got two opening curly braces at different points, and you need to match them with closing curly braces. Let’s go ahead and press Enter to enable that.

[Video description begins] He executes the following command: Set-S3BucketEncryption -BucketName bucketyhz -ServerSideEncryptionConfiguration_ServerSideEncryptionRule @{ServerSideEncryptionByDefault=@{ServerSideEncryptionAlgorithm="AES256"}}. No output is displayed and the prompt does not change. [Video description ends]

Okay. Let’s go check our work. So back here, we’re still looking at the same page.

[Video description begins] He switches to the S3 Management Console web page. The Properties tab is selected. [Video description ends]

Let’s just click on something else, like the Overview tab, and then Properties again, just to refresh it. Okay, there it is; Default encryption is now set to AES-256. Let’s see if it works, meaning let’s upload some new content.

[Video description begins] He clicks the Overview tab. He then clicks the Upload button. The corresponding wizard opens. The “1 Select files” step is open in the wizard. A file labeled “logo.jpg” is displayed. [Video description ends]

So I’m just going to go ahead and upload something to the root of this bucket, click the Upload button, and let’s go ahead and continue on through the wizard, just accepting all of the defaults. There’s the file.

[Video description begins] He clicks the Next button. He then clicks the Next button in the “2 Set permissions” and “3 Set properties” steps and the Upload button in the “4 Review“ step. The logo.jpg file appears in the Overview tab. [Video description ends]

We just uploaded logo.jpg. And so I’m going to click on the checkbox here to the left to select it and let’s look at the Properties on the right and sure enough.

[Video description begins] The corresponding pane opens. [Video description ends]

It’s encrypted automatically with AES-256 because we enabled that bucket option using PowerShell.
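
If you want to confirm that setting from a script rather than the console, an AWS CLI check is shown below as a minimal sketch, assuming the bucketyhz bucket; the output should list AES256 as the default algorithm:

  # Retrieve the bucket's default encryption configuration
  aws s3api get-bucket-encryption --bucket bucketyhz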

Course Summary

[Video description begins] Topic title: Course Summary. [Video description ends]

So, in this course, we’ve examined S3 security options, including bucket encryption, access policies, and object locking. We did this by exploring S3 bucket policies, access control lists, and public access. We talked about S3 CORS support, storage class analysis, and object locking. We then talked about S3 bucket encryption using the GUI.

We talked about S3 bucket encryption using the CLI. And finally, bucket encryption using PowerShell. In our next course, we’re going to move on to explore VPC management and peering by discovering how to create and configure VPCs to align with business needs.

Question: You are using the Policy Generator tool to configure an S3 bucket permissions policy. Which policy option specifies the S3 bucket name?

Ans- Amazon resource name

Question: Which of the following is NOT a valid S3 ACL entry?

Ans- IAM role

Question: You are using the AWS management console to create a new S3 bucket. What is the default permission for the new bucket?

Ans – Block all public access

Question: Which statement best describes the purpose of CORS for an S3 bucket?

Ans- It defines a way for client web apps in one DNS domain to access resources in a different DNS domain

Question: What is the purpose of conducting an S3 storage analysis?

Ans- To move S3 data to the appropriate storage class

Question: Which of the following are valid statements regarding S3 object locking? Choose two.

Ans- S3 object overwrites can be prevented for a limited time

S3 object overwrites can be prevented indefinitely

Question: What is the encryption option when enabling S3 bucket encryption using Amazon managed keys?

Ans- AES-256

Question: You need to enable encryption for all existing objects in an S3 bucket. Which CLI command should you use?

Ans- aws s3 cp s3://bucket1/ s3://bucket1/ --recursive --sse AES256

Question: Which PowerShell cmdlet enables S3 bucket encryption?

Ans- Set-S3BucketEncryption