AWS

Amazon Web Services

Wednesday, Jun 17, 2020

Getting Started

Installing the AWS CLI

Enabling Command Completion

Uninstalling the AWS CLI

AWS Services

  1. Computing
  2. Networking
  3. Storage
  4. Security

AWS Global Infrastructure

EC2

EC2, Amazon’s Elastic Compute Cloud, is a service that provides virtual servers, called instances, for performing computations remotely. Compute capacity is easy to resize, and you pay only for the capacity you actually use.

# Create a key pair named 'aws' and save the private key locally
aws ec2 create-key-pair \
  --key-name 'aws' \
  --query 'KeyMaterial' \
  --output 'text' > ~/.ssh/aws_key.pem

# Restrict the private key's file permissions so SSH will accept it
chmod 400 ~/.ssh/aws_key.pem

# Confirm the key pair is registered, then look up a VPC, subnet, and
# security group to launch into
aws ec2 describe-key-pairs --key-name 'aws'
aws ec2 describe-vpcs
aws ec2 describe-subnets
aws ec2 describe-security-groups

# Launch a single t3.micro instance using the key pair created above
aws ec2 run-instances \
  --count 1 \
  --image-id 'ami-0e34e7b9ca0ace12d' \
  --instance-type 't3.micro' \
  --key-name 'aws' \
  --security-group-ids 'sg-0efcc5d86ade500ec' \
  --subnet-id 'subnet-13bcff58'
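
Once the instance is running, you can connect to it over SSH with the key pair created above. A minimal sketch; the login user is an assumption, since it varies by AMI (e.g. ec2-user on Amazon Linux):

# Look up the instance's public DNS name
aws ec2 describe-instances \
  --query 'Reservations[].Instances[].PublicDnsName' \
  --output text

# Connect with the private key saved earlier, substituting the DNS name
# returned above
ssh -i ~/.ssh/aws_key.pem ec2-user@<public-dns-name>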

AWS CLI

Tip: If you ever need help with a given command, the documentation is surprisingly robust for this program, so suppress that urge to go to Stack Overflow! You can type help after any subcommand and a manual page will appear explaining the available functionality.
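
For example:

aws help                      # Manual for the AWS CLI itself
aws ec2 help                  # Manual for the ec2 subcommand
aws ec2 run-instances help    # Manual for a specific operation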

To interactively set up credentials and defaults for a named profile:

aws configure --profile tommy

The AWS CLI will check for these variables in your shell environment:

  1. AWS_ACCESS_KEY_ID otherwise specified in ~/.aws/credentials or inside ~/.aws/config as aws_access_key_id
  2. AWS_SECRET_ACCESS_KEY otherwise specified in ~/.aws/credentials or inside ~/.aws/config as aws_secret_access_key
  3. AWS_SESSION_TOKEN otherwise specified in ~/.aws/credentials or inside ~/.aws/config as aws_session_token
  4. AWS_PROFILE otherwise specified with aws --profile tommy
  5. AWS_DEFAULT_REGION otherwise specified with aws --region us-east-1 or inside ~/.aws/config as region
  6. AWS_DEFAULT_OUTPUT otherwise specified with aws --output json or inside ~/.aws/config as output

This is an example addition to ~/.aws/config:

[profile example]
aws_access_key_id=foo
aws_secret_access_key=bar

This is an example addition to ~/.aws/credentials:

[example]
aws_access_key_id=foo
aws_secret_access_key=bar

This is an example addition to ~/.profile:

export AWS_DEFAULT_OUTPUT='json'
export AWS_DEFAULT_REGION='us-west-2'
export AWS_ACCESS_KEY_ID='foo'
export AWS_SECRET_ACCESS_KEY='bar'

Warning: Any environment variables set in your shell, such as in the code snippet above, will override the configuration set in ~/.aws/config and ~/.aws/credentials.

--query

Output returned by AWS is in the form of JSON. The --query option filters it using JMESPath, a query language for JSON with library support in all popular programming languages.
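
For example, this query (a small sketch) reduces the output of describe-instances to just each instance's ID and state:

aws ec2 describe-instances \
  --query 'Reservations[].Instances[].{Id: InstanceId, State: State.Name}' \
  --output table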

AWS SageMaker

AWS SageMaker lets you create cloud-hosted Jupyter notebooks, which can easily be connected to the S3 buckets and EC2 instances available on your account.

You can use Amazon’s SDK for Python, known as boto3, to perform operations across AWS services within a Python script, such as a Jupyter notebook. This is an example of pulling a JSON file from the S3 bucket tamagotchi into the SageMaker notebook neopets:

# Import the AWS SDK boto3
import boto3
s3 = boto3.resource('s3')

# Print all of the available S3 buckets
for bucket in s3.buckets.all():
  print(bucket.name)

# Specify the name of the S3 bucket
bucket = s3.Bucket('tamagotchi')

# List all of the objects in a bucket
for obj in bucket.objects.all():
  print(obj.key)

# Download the object from S3 and save it locally
# (the first argument is the object's key, which has no leading slash;
#  the second argument is the local destination path)
bucket.download_file('path/to/sample.json', '/path/to/sample.json')

# Open the file inside the Jupyter notebook
import json
with open('/path/to/sample.json') as my_file:
  my_object = json.load(my_file)

# View properties of the object
print(my_object)

Warning: Be sure to call the .download_file() method first, as you can’t open the object directly in S3 as if it were a local file (or so I believe).

Uploading a file to an S3 bucket can be done as follows:

import boto3
s3 = boto3.resource('s3')
bucket = s3.Bucket('tamagotchi')

# Upload file 'example.json' from Jupyter notebook to S3 Bucket tamagotchi
bucket.upload_file('/local/path/to/example.json', 'remote/path/to/example.json')

Deleting the objects in an S3 bucket can be done as follows:

import boto3
s3 = boto3.resource('s3')
bucket = s3.Bucket('tamagotchi')
request = {
  "Objects": [
    {
      "Key": "sample.json"
    }
  ],
  "Quiet": True
}

# Delete all of the objects specified by keys in the "Objects" array
# (boto3 expects the request dict as the Delete keyword argument)
response = bucket.delete_objects(Delete=request)

Deleting an S3 bucket can be done as follows:

import boto3
s3 = boto3.resource('s3')
bucket = s3.Bucket('tamagotchi')

# Delete the S3 bucket named tamagotchi
bucket.delete()

Note: You won’t be able to delete a bucket until all of the objects within it have been deleted as well.
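
The AWS CLI can do both steps at once; the following sketch empties and then removes the same tamagotchi bucket from the examples above:

# Delete every object in the bucket, then delete the bucket itself
aws s3 rb 's3://tamagotchi' --force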

IAM

Vocabulary:

IAM Policy Structure has a few key components: an Effect (Allow or Deny), an Action, a Resource (identified by its ARN), and an optional Condition.

By default, all permissions are denied; a permission must be explicitly allowed. If the action you are trying to perform is being denied, it could be a result of any of the components above: maybe the current ARN doesn’t have permission for that action, or it would if a different condition were in place.
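
As a sketch of that structure, here is a minimal identity-based policy granting read-only access to the tamagotchi bucket from the examples above (the policy name and file name are made up for illustration):

# Write the policy document, then register it with IAM
cat > read-tamagotchi.json << 'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::tamagotchi",
        "arn:aws:s3:::tamagotchi/*"
      ]
    }
  ]
}
EOF

aws iam create-policy \
  --policy-name 'ReadTamagotchi' \
  --policy-document 'file://read-tamagotchi.json'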

Types of Policies:

AWS S3

AWS Lambda

EC2

Configure

Documentation

It’s worth noting that you can specify which SSO profile name to use in two different ways:

1. By passing a name to the --profile option (e.g. --profile tommy)

2. By assigning a name to the environment variable AWS_DEFAULT_PROFILE
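
For instance, reusing the tommy profile from earlier (AWS_PROFILE, from the environment-variable list above, works the same way):

# Pass the profile explicitly for a single command
aws s3 ls --profile tommy

# Or export it for the whole shell session
export AWS_PROFILE='tommy'
aws s3 ls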

Cloud9

AWS has an in-browser IDE called Cloud9, which you can power using an existing EC2 instance. Supposedly it supports pair programming as well.

Organizations

API Gateway