I’m taking AWS training from Linux Academy along with some peers. This post is a continuation of my notes.

AWS Certified Solutions Architect - S3

Linux Academy’s S3 section notes. S3 is what I’ve used the most with AWS, and I’m hosting two websites from S3.

S3 Essentials

  • These numbers keep appearing: 11 nines (99.999999999%) durability, 99.99% availability
  • Object size 0 bytes to 5 TB
  • No total storage limit
  • Limit of 100 buckets (concurrent) per AWS account
  • Buckets can’t be transferred
  • Standard, RRS, Glacier explanations redux
  • Versioning, lifecycle redux
  • The S3 permissions breakdown looks like possible test question fodder
    • Private by default
    • ACLs (can share across accounts) - I didn’t realize there’s a distinction between ACLs and IAM policies
    • IAM policies
    • Bucket policies - they specifically mention IP restriction and HTTP referrer header restriction
    • Public (URL)
    • Signed URLs - generated by a developer via the SDK or through CloudFront - can be time-limited (see the signed-URL sketch after this list)
  • S3 endpoints SSL-terminated (for API)
  • Can encrypt server-side
  • Can encrypt client-side, use own encryption keys
  • Can be used for static websites “when used with Route 53”. Wha? I’m using it for static websites with my own hosted DNS, just CNAMEing to the S3 http endpoint name. But can’t do the apex domain that way, so maybe that’s why the caveat.
  • Can be origin for CloudFront (I’m doing that, too)
  • Multipart upload
    • Allows stop/resume uploads
    • Upload parts concurrently
    • Required for > 5 GB
    • Recommended for >= 100 MB
    • Use with the SDK or CLI (I’m using the AWS CLI); see the multipart sketch after this list
  • Objects synced in-region across all AZs
  • Eventual consistency
    • Immediately after creating a new object, can read it from any AZ (although an AZ can’t be specified, anyway)
    • Overwriting PUTs and DELETEs subject to eventual consistency, change may lag
      • I wonder how versioning affects this as behind-the-scenes a new version is a net-new object?
  • Possibly test-worthy list of reasons to use S3
    • Hosting static files
    • Origin for CloudFront
    • Hosting static websites
    • File shares for networks
    • Backup/Archiving with a shout out to AWS Storage Gateway
  • Notifications on actions, including RRS object loss
    • SNS, SQS, Lambda
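
Since signed URLs came up above, here’s a minimal sketch of generating a time-limited one with boto3; the bucket and key names are placeholders I made up:

```python
import boto3

s3 = boto3.client("s3")

# Generate a time-limited signed URL for a private object.
# "my-example-bucket" and the key are hypothetical placeholders.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-example-bucket", "Key": "private/report.pdf"},
    ExpiresIn=3600,  # link works for one hour, then S3 denies access
)
print(url)
```

Anyone with the URL can fetch the object until the expiry elapses, after which S3 returns an access-denied error.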
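
Similarly, a sketch of multipart upload through boto3’s managed transfer, which splits the file into parts and uploads them concurrently behind the scenes. The thresholds just echo the numbers in the list above; the file and bucket names are invented:

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Switch to multipart at 100 MB (the "recommended" threshold from the notes)
# and upload 16 MB parts in parallel. All names here are made up.
config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,
    multipart_chunksize=16 * 1024 * 1024,
    max_concurrency=8,
)
s3.upload_file("backup.tar.gz", "my-example-bucket",
               "backups/backup.tar.gz", Config=config)
```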

Getting Started

Let’s get it started in here?

  • Can manage buckets in any region from any region’s console, as S3 is a global service
  • Reminder that S3 transfer within a region is free, but cross-region transfer is charged
  • Bucket properties redux
  • S3 buckets can be part of a resource group that groups AWS resources together; seems to be related to tags
  • Can have “Requester Pays”, where other AWS accounts reading from our S3 bucket pay the data transfer charges (obviously no anonymous access)
  • Objects can have metadata such as common HTTP header metadata (e.g. Content-Type); see the sketch after this list
  • Explaining the distinction between a file folder and a prefix/namespace
  • Object properties
    • Can change metadata and even server-side encryption
  • Object link is distinct from S3 endpoint
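
A rough sketch of setting HTTP-header-style metadata (and per-object server-side encryption) at upload time with boto3; the bucket, key, and metadata values are hypothetical:

```python
import boto3

s3 = boto3.client("s3")

# Upload an object with explicit HTTP header metadata and SSE.
# Bucket, key, and metadata values are made-up examples.
with open("index.html", "rb") as body:
    s3.put_object(
        Bucket="my-example-bucket",
        Key="site/index.html",
        Body=body,
        ContentType="text/html",        # common HTTP header metadata
        ServerSideEncryption="AES256",  # server-side encryption set per object
        Metadata={"project": "notes"},  # user-defined x-amz-meta-* metadata
    )
```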

S3 Permissions

  • He says the website hosting and permissions are two ways to implement permissions
  • Interesting: CORS options under Permissions section
  • Bucket policies: need to know the capabilities, but not necessarily how to write policies, for the associate cert
    • Possible to require MFA for actions such as deleting an object
    • Can restrict access by IP
  • IAM policies vs. bucket policies:
    • IAM apply to user
      • Can create custom policy to specify down to the bucket (resource) level … oh, can actually specify prefixes and objects, too
      • Will need to give permission to list all buckets if the user needs to use the S3 console (see the policy sketch after this list)
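
Here’s a sketch of what “down to the bucket, prefix, and object level” might look like as an inline IAM policy attached with boto3. The user, bucket, and prefix names are made up, and the exact actions would depend on what the user actually needs:

```python
import json
import boto3

iam = boto3.client("iam")

# Hypothetical policy: console-friendly bucket listing, plus access
# limited to the reports/ prefix of one bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # lets the user see the bucket list in the S3 console
            "Effect": "Allow",
            "Action": ["s3:ListAllMyBuckets", "s3:GetBucketLocation"],
            "Resource": "arn:aws:s3:::*",
        },
        {   # listing allowed only under the reports/ prefix
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::my-example-bucket",
            "Condition": {"StringLike": {"s3:prefix": ["reports/*"]}},
        },
        {   # object access restricted to that prefix
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::my-example-bucket/reports/*",
        },
    ],
}

iam.put_user_policy(
    UserName="example-user",
    PolicyName="s3-reports-prefix-access",
    PolicyDocument=json.dumps(policy),
)
```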

S3 Bucket/Object Versioning and Lifecycle Policies

Expansions on the Essentials training that are obvious to me.

Website Hosting With S3

Been there, done that.

  • For the second time in this series he’s emphasizing the region being in the S3 link (or in this case the HTTP endpoint), so it might be a test question (see the sketch below)
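
For reference, a sketch of enabling static website hosting on a bucket with boto3 (the bucket and document names are made up). The website endpoint S3 hands back does include the region, e.g. http://my-example-bucket.s3-website-us-east-1.amazonaws.com:

```python
import boto3

s3 = boto3.client("s3")

# Turn on static website hosting; bucket and document names are placeholders.
s3.put_bucket_website(
    Bucket="my-example-bucket",
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)
```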

Quiz

100%

Labs

Two labs are offered, but I’m going to skip them at least for now as I feel comfortable with S3.