AMAZON WEB SERVICES CLI GUIDE
Summary by Damian Ndunda © May, June 2019
TABLE OF CONTENTS
AMAZON WEB SERVICES CLI GUIDE
AMAZON AND CLOUD COMPUTING
HOW YOU CAN BENEFIT FROM USING AWS
CLOUD SERVICE LAYERS DIAGRAM
CLOUD SERVICE MODEL COMPARISON DIAGRAM
AWS GLOBAL INFRASTRUCTURE PICTURE
AWS COMPUTING PLATFORM DIAGRAM
CHAPTER – COMMAND LINE INTERFACE
GETTING STARTED WITH AWS CLI
MANAGING ACCESS AND SECURITY USING THE AWS CLI
GETTING HELP WITH THE AWS CLI
AWS CLI API DOCUMENTATION
COMMAND STRUCTURE IN THE AWS CLI
SPECIFYING PARAMETER VALUES FOR THE AWS CLI
USING QUOTATION MARKS WITH STRINGS
LOADING PARAMETERS FROM A FILE
GENERATE THE CLI SKELETON AND INPUT PARAMETERS FROM A JSON INPUT FILE
CONTROLLING COMMAND OUTPUT FROM THE AWS CLI
HOW TO SELECT THE OUTPUT FORMAT
HOW TO FILTER THE OUTPUT WITH THE --QUERY OPTION
USING SHORTHAND SYNTAX WITH THE AWS COMMAND LINE INTERFACE
USING AWS CLI PAGINATION OPTIONS
UNDERSTANDING RETURN CODES FROM THE AWS CLI
USING THE AWS CLI TO WORK WITH AWS SERVICES
USING AMAZON DYNAMODB WITH THE AWS CLI
USING AMAZON EC2 WITH THE AWS CLI
CREATE, DISPLAY, AND DELETE AMAZON EC2 KEY PAIRS
CREATE, CONFIGURE, AND DELETE SECURITY GROUPS FOR AMAZON EC2
ADDING RULES TO YOUR SECURITY GROUP
DELETING YOUR SECURITY GROUP
LAUNCH, LIST, AND TERMINATE AMAZON EC2 INSTANCES
ADDING A BLOCK DEVICE TO YOUR INSTANCE
ADDING A TAG TO YOUR INSTANCE
USING AMAZON S3 GLACIER WITH THE AWS CLI
CREATING AN AMAZON S3 GLACIER VAULT
PREPARING A FILE FOR UPLOADING
INITIATING A MULTIPART UPLOAD AND UPLOAD FILES
USING AWS IDENTITY AND ACCESS MANAGEMENT FROM THE AWS CLI
CREATING IAM USERS AND GROUPS
ATTACH AN IAM MANAGED POLICY TO AN IAM USER
SET AN INITIAL PASSWORD FOR AN IAM USER
CREATE AN ACCESS KEY FOR AN IAM USER
USING AMAZON S3 WITH THE AWS CLI
USING HIGH-LEVEL (S3) COMMANDS WITH THE AWS CLI
USING API-LEVEL (S3API) COMMANDS WITH THE AWS CLI
CONFIGURING A LOGGING POLICY
USING AMAZON SNS WITH THE AWS CLI
USING AMAZON SWF WITH THE AWS CLI
LIST OF AMAZON SWF COMMANDS BY CATEGORY
COMMANDS RELATED TO ACTIVITIES
COMMANDS RELATED TO DECIDERS
COMMANDS RELATED TO WORKFLOW EXECUTIONS
COMMANDS RELATED TO ADMINISTRATION
Workflow Execution Management
Workflow Execution Visibility
WORKING WITH AMAZON SWF DOMAINS USING THE AWS CLI
GET INFORMATION ABOUT A DOMAIN
TROUBLESHOOTING AWS CLI ERRORS
MAIN CLI PROGRAM MUST HAVE 'RUN' PERMISSION
YOU MUST USE VALID CREDENTIALS
YOUR IAM USER MUST BE ABLE TO RUN THE COMMAND
AWS Key Management Service
RECOMMENDATIONS AND BEST PRACTICES
CHAPTER – COMMAND LINE INTERFACE
Sources: Wadia, Y. (February 2016); Amazon Inc. (2019). AWS Command Line Interface: User Guide.
GETTING STARTED WITH AWS CLI
CLIs are more than just simple access and management tools. Using CLIs, you can automate the deployment and management of your AWS services using simple code and script, much like how you would use bash and shell scripting. This provides you with a lot of flexibility and customizability that a standard GUI simply won't provide!
The 64-bit AWS CLI installer for Windows can be downloaded from https://s3.amazonaws.com/aws-cli/AWSCLI64.msi.
The 32-bit installer can be downloaded from https://s3.amazonaws.com/aws-cli/AWSCLI32.msi.
The installation of the CLI on Linux involves two major steps; the first is the installation of Python's setup tools, which are a prerequisite for installing Python's pip. Run the following commands from your Linux terminal:
1. Download the setup tools tar file from the Python source repo:
wget https://pypi.python.org/packages/source/s/setuptools/setuptools-7.0.tar.gz
2. Next, untar the setup tools installer using the tar command:
tar xvf setuptools-7.0.tar.gz
3. Once the contents of the tar file are extracted, change to the setup tools directory:
cd setuptools-7.0
4. Finally, run the setup.py script to install the setup tools package:
python setup.py install
The next process is very simple as well: installing the Python pip package. Run the following commands from your Linux terminal:
1. Download the Python pip installer script from Python's repo:
wget https://bootstrap.pypa.io/get-pip.py
2. Install the pip package:
python get-pip.py
3. Once pip is installed, you can easily install the AWS CLI by executing the following command:
pip install awscli
4. You can test your AWS CLI by executing a few simple commands; for example, check the AWS CLI version using the following command:
aws --version
MANAGING ACCESS AND SECURITY USING THE AWS CLI
Open up a terminal of your Linux box, which has the AWS CLI installed on it, and type in the following command:
# aws configure
Once entered, you will be prompted to enter the user's access key ID and secret access key, along with the default region name and the default output format to use. The default region name can be any of the regions from which your users will be operating, for example, us-east-1, us-west-2, and so on:
AWS Access Key ID [None]:TH1$is$0MUC#fuN
AWS Secret Access Key [None]:iH@vEN01De@W#@T1@mD01ng#ERe
Default region name [None]: us-west-2
Default output format [None]: table
The output format accepts any of these three values as the preferred method to display the output of the commands: table, text, or json.
You should never share your keys with anyone! As an alternative, you can set up named profiles for each of your users, each with their own set of keys, using this simple command:
# aws configure --profile jason
Here, we are creating a named profile for our user named Jason. Similarly, you can create multiple named profiles for individual IAM users using this same syntax. AWS will store these credentials and configuration details in two separate files named ~/.aws/credentials and ~/.aws/config, respectively.
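For reference, after the commands above the two files look roughly like this (the profile name is the one from the example; the key values shown are placeholders):

```ini
# ~/.aws/credentials
[jason]
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = exampleSecretAccessKeyValue

# ~/.aws/config
[profile jason]
region = us-west-2
output = table
```

Note that named profiles are written as [jason] in the credentials file but as [profile jason] in the config file.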
To list the users present in our account, type in the following command:
# aws iam list-users --profile jason
Note that we ran the CLI command using the named profile that we created a short while back.
Next, create a new user using this simple command:
# aws iam create-user --user-name YoYo --profile jason
This command will only create the user. The user still does not have any password or access keys generated for it, so let's go ahead and create some! Type in the following command to create a password for your user:
# aws iam create-login-profile --user-name YoYo --password P@$$w0rD --profile jason
Here, we passed two mandatory arguments with the command: --user-name and --password. Besides these, you can additionally pass an optional argument called --password-reset-required. This flag ensures that the IAM user has to reset his/her password upon first login from the AWS Management Console. Once the password is created, we go ahead and create the user's all-important access key and secret key. To do so, type in the following command:
# aws iam create-access-key --user-name YoYo --profile jason
The create-access-key command requires only one mandatory argument, the username itself. Once executed, it will display the user's access key ID and secret access key in the output. Make sure you save the secret key, as this is the last time it will be shown to you, for obvious security reasons. With this step, your new IAM user is ready to be added to groups!
Next, create a new group and attach our user to it. Type in the following command to create a new group:
# aws iam create-group --group-name SuperUsersGroup --profile jason
The output should display the new group's ARN as well as its Group ID. With the group created, it's now time to attach our new user to it. Simply type in the following command:
# aws iam add-user-to-group --user-name YoYo --group-name SuperUsersGroup --profile jason
This command accepts two mandatory arguments: the username and the name of the group to which the user is to be attached.
Create a simple JSON-based file on your Linux box. This JSON file will contain your new group's or user's set of permissions. For simplicity, I created a very basic policy that will grant its users complete access to all of AWS's products and resources. Run the following command to first create your policy:
# vi /tmp/MyPolicy.json
Add the following contents to your policy file as shown:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": "*",
"Resource": "*"
}
]
}
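Before attaching the policy, it can be worth confirming that the file parses as valid JSON; a quick local check (a sketch, assuming python3 is available on the box) might look like this:

```shell
# Write the policy from a heredoc, then confirm it parses before handing it
# to the CLI; python3 -m json.tool exits non-zero on malformed JSON.
cat > /tmp/MyPolicy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": "*", "Resource": "*" }
  ]
}
EOF
python3 -m json.tool /tmp/MyPolicy.json > /dev/null && echo "policy JSON OK"
```

A malformed file fails here with a parse error instead of a less obvious error from the IAM API call.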
Next, run the following command to attach this policy document to your newly created user:
# aws iam put-user-policy --user-name YoYo \
 --policy-name Admin-Access-All-About-Dogs \
 --policy-document file:///tmp/MyPolicy.json \
 --profile jason
To assign the policy to the group instead, use put-group-policy and replace the --user-name argument with --group-name.
THE AWS CLI
By default, the AWS CLI sends requests to AWS services by using HTTPS on TCP port 443. To use the AWS CLI successfully, you must be able to make outbound connections on TCP port 443.
GETTING HELP WITH THE AWS CLI
The following command displays help for the general AWS CLI options and the available top-level commands.
$ aws help
The following command displays the available Amazon Elastic Compute Cloud (Amazon EC2) specific commands.
$ aws ec2 help
The following example displays detailed help for the Amazon EC2 DescribeInstances operation. The help includes descriptions of its input parameters, available filters, and what is included as output. It also includes examples showing how to type common variations of the command.
$ aws ec2 describe-instances help
The help for each command is divided into six sections:
Name
The name of the command.
describe-instances
Description
A description of the API operation that the command invokes.
DESCRIPTION
Describes one or more of your instances.
If you specify one or more instance IDs, Amazon EC2 returns information for
those instances. If you do not specify instance IDs, Amazon EC2 returns
information for all relevant instances. If you specify an instance ID that is
not valid, an error is returned. If you specify an instance that you do not
own, it is not included in the returned results. ...
Synopsis
The basic syntax for using the command and its options. If an option is shown
in square brackets, it's either optional, has a default value, or has an
alternative option that you can use instead.
SYNOPSIS
describe-instances
[--dry-run | --no-dry-run]
[--instance-ids <value>]
[--filters <value>]
[--cli-input-json <value>]
[--starting-token <value>]
[--page-size <value>]
[--max-items <value>]
[--generate-cli-skeleton]
Options
A description of each of the options shown in the synopsis.
OPTIONS
--dry-run | --no-dry-run (boolean)
Checks whether you have the required permissions for the action, without actually making the request, and provides an error response. If you have the required permissions, the error response is DryRunOperation . Otherwise, it is UnauthorizedOperation .
--instance-ids
(list)
One or more instance IDs.
Default: Describes all your instances.
...
EXAMPLES
To describe an Amazon EC2 instance
Command:
aws ec2 describe-instances --instance-ids i-5203422c
To describe all instances with the instance type m1.small
Command:
aws ec2 describe-instances --filters "Name=instance-type,Values=m1.small"
To describe all instances with an Owner tag
Command:
aws ec2 describe-instances --filters "Name=tag-key,Values=Owner"
Output
Descriptions of each of the fields and data types included in the response from
AWS. For describe-instances, the output is a list of reservation objects, each
of which contains several fields and objects that contain information about the
instances associated with it. This information comes from the API documentation for
the reservation data type used by Amazon EC2.
OUTPUT
When the output is rendered into JSON by the AWS CLI, it becomes an array of reservation objects, similar to the following example.
{
"Reservations": [
{
"OwnerId": "012345678901",
"ReservationId": "r-4c58f8a0",
"Groups": [],
"RequesterId": "012345678901",
"Instances": [
{
"Monitoring": {
"State": "disabled"
},
"PublicDnsName":
"ec2-52-74-16-12.us-west-2.compute.amazonaws.com",
"State": {
"Code": 16,
"Name": "running"
},
...
Each reservation object contains fields describing the reservation and an array of instance objects, each with its own fields (for example, PublicDnsName) and objects (for example, State) that describe it.
Windows users
You can pipe (|) the output of the help command to the more command to view the help file one page at a time. Press the space bar or PgDn to view more of the document, and q to quit.
C:\>aws ec2 describe-instances help | more
AWS CLI API DOCUMENTATION
API Documentation Sections
• Actions – Detailed information on each action and its parameters (including constraints on length or content, and default values). It also lists the errors that can occur for the action. Each action corresponds to a subcommand in the AWS CLI.
• Data Types – Detailed information about structures that a command might require as a parameter or return in response to a request.
• Common Parameters – Detailed information about the parameters that are shared by all of the service's actions.
• Common Errors – Detailed information about errors that can be returned by any of the service's actions.
The name and availability of each section can vary depending on the service.
SERVICE-SPECIFIC CLIS
Some services have a separate CLI that dates from before a single AWS CLI was created to work with all services. These service-specific CLIs have separate documentation that is linked from the service's documentation page. Documentation for service-specific CLIs does not apply to the AWS CLI.
COMMAND STRUCTURE IN THE AWS CLI
1. The base call to the aws program.
2. The top-level command, which typically corresponds to an AWS service supported by the AWS CLI.
3. The subcommand that specifies which operation to perform.
4. General CLI options or parameters required by the operation. You can specify these in any order as long as they follow the first three parts. If an exclusive parameter is specified multiple times, only the last value applies.
$ aws <command> <subcommand> [options and parameters]
Parameters can take various types of input values, such as numbers, strings, lists, maps, and JSON structures.
SPECIFYING PARAMETER VALUES FOR THE AWS CLI
Many parameters used in the AWS Command Line Interface (AWS CLI) are simple string or numeric values, such as the key pair name my-key-pair in the following example.
$ aws ec2 create-key-pair --key-name my-key-pair
Strings without any space characters can be surrounded with quotation marks or not. However, you must use quotation marks around strings that include one or more space characters. Use single quotation marks (' ') in Linux, macOS, Unix, or PowerShell. Use double quotation marks (" ") in the Windows command prompt, as shown in the following examples.
PowerShell, Linux, macOS, or Unix
$ aws ec2 create-key-pair --key-name 'my key pair'
Windows command prompt
C:\> aws ec2 create-key-pair --key-name "my key pair"
You can optionally separate the parameter name from the value with an equals sign (=) instead of a space. This is typically necessary only if the value of the parameter starts with a hyphen.
$ aws ec2 delete-key-pair --key-name=-mykey
COMMON PARAMETER TYPES
$ aws ec2 describe-spot-price-history help
The help for each subcommand describes its function, options, output, and examples.
String – String parameters can contain alphanumeric characters, symbols, and white space from the ASCII character set. Strings that contain white space must be surrounded by quotation marks.
Timestamp – Timestamps are formatted according to the ISO 8601 standard. These are sometimes referred to as "DateTime" or "Date" parameters.
$ aws ec2 describe-spot-price-history --start-time 2014-10-13T19:00:00Z
Acceptable formats include:
• YYYY-MM-DDThh:mm:ss.sssTZD (UTC), for example, 2014-10-01T20:30:00.000Z
• YYYY-MM-DDThh:mm:ss.sssTZD (with offset), for example, 2014-10-01T12:30:00.000-08:00
• YYYY-MM-DD, for example, 2014-10-01
• Unix time in seconds, for example, 1412195400. This is sometimes referred to as Unix Epoch Time and represents the number of seconds since midnight, January 1, 1970 UTC.
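The formats above can be cross-checked locally; for instance, the Unix time 1412195400 from the list corresponds to the UTC timestamp in the first bullet (a sketch, assuming GNU date as found on most Linux systems):

```shell
# Convert the epoch value from the text to ISO 8601 UTC; with GNU date,
# -u prints UTC and -d @N interprets N as seconds since the epoch.
date -u -d @1412195400 +%Y-%m-%dT%H:%M:%SZ
# prints 2014-10-01T20:30:00Z
```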
List – One or more strings separated by spaces. If any of the string items contain a space, you must put quotation marks around that item.
$ aws ec2 describe-spot-price-history --instance-types m1.xlarge m1.medium
Boolean – Binary flag that turns an option on or off. For example, ec2 describe-spot-price-history has a Boolean --dry-run parameter that, when specified, validates the query with the service without actually running the query.
$ aws ec2 describe-spot-price-history --dry-run
The output indicates whether the command was well formed. This command also includes a --no-dry-run version of the parameter that you can use to explicitly indicate that the command should be run normally. Including it isn't necessary because this is the default behavior.
Integer – An unsigned, whole number.
$ aws ec2 describe-spot-price-history --max-items 5
Blob – Binary object. Blob parameters take a path to a local file that contains the binary data. The path should not contain any protocol identifier, such as http:// or file://. The specified path is interpreted as being relative to the current working directory. For example, the --body parameter for aws s3api put-object is a blob.
$ aws s3api put-object --bucket my-bucket --key testimage.png --body /tmp/image.png
Map – A set of key-value pairs specified in JSON or by using the CLI's shorthand syntax. The following JSON example reads an item from an Amazon DynamoDB table named my-table with a map parameter, --key. The parameter specifies the primary key named id with a number value of 1 in a nested JSON structure.
$ aws dynamodb get-item --table-name my-table --key '{"id": {"N":"1"}}'
{
"Item": {
"name": {
"S": "John"
},
"id": {
"N": "1"
}
}
}
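When the key value comes from a shell variable rather than a literal, generating the nested JSON with a small helper avoids quoting mistakes; a sketch, assuming python3 is available (the id value is the one from the example):

```shell
# Build the --key argument for get-item from a variable instead of
# hand-writing the nested quotes; json.dumps handles the quoting.
id=1
key=$(python3 -c 'import json,sys; print(json.dumps({"id": {"N": sys.argv[1]}}))' "$id")
echo "$key"   # the string you would pass as the --key value
```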
USING JSON FOR PARAMETERS
JSON is useful for specifying complex command line parameters. For example, the following command lists all Amazon EC2 instances that have an instance type of t2.micro or m1.medium and that are also in the us-west-2c Availability Zone.
$ aws ec2 describe-instances --filters "Name=instance-type,Values=t2.micro,m1.medium" "Name=availability-zone,Values=us-west-2c"
Alternatively, you can specify the equivalent list of filters as a JSON array. The value to the right of the "Values" key is itself an array. This is required, even if the array contains only one value string.
[
{
"Name": "instance-type",
"Values": ["t2.micro", "m1.medium"]
},
{
"Name": "availability-zone",
"Values": ["us-west-2c"]
}
]
The outermost brackets, however, are required only if more than one filter is specified. A single-filter version of the previous command, formatted in JSON, looks like this.
$ aws ec2 describe-instances --filters '{"Name": "instance-type", "Values": ["t2.micro","m1.medium"]}'
This example shows the JSON to specify a single 20 GiB Amazon Elastic Block Store (Amazon EBS) device to be mapped at /dev/sdb on the launching instance.
{
"DeviceName": "/dev/sdb",
"Ebs": {
"VolumeSize": 20,
"DeleteOnTermination": false,
"VolumeType": "standard"
}
}
To attach multiple devices, list the objects in an array, as shown in the next example.
[
{
"DeviceName": "/dev/sdb",
"Ebs": {
"VolumeSize": 20,
"DeleteOnTermination": false,
"VolumeType": "standard"
}
},
{
"DeviceName": "/dev/sdc",
"Ebs": {
"VolumeSize": 10,
"DeleteOnTermination": true,
"VolumeType": "standard"
}
}
]
USING QUOTATION MARKS WITH STRINGS
Linux, macOS, or Unix
Use single quotation marks (' ') to enclose the JSON data structure, as in the following example:
$ aws ec2 run-instances --image-id ami-12345678 --block-device-mappings '[{"DeviceName":"/dev/sdb","Ebs":{"VolumeSize":20,"DeleteOnTermination":false,"VolumeType":"standard"}}]'
PowerShell
PowerShell requires single quotation marks (' ') to enclose the JSON data structure, as well as a backslash (\) to escape each double quotation mark (") within the JSON structure, as in the following example:
PS C:\> aws ec2 run-instances --image-id ami-12345678 --block-device-mappings '[{\"DeviceName\":\"/dev/sdb\",\"Ebs\":{\"VolumeSize\":20,\"DeleteOnTermination\":false,\"VolumeType\":\"standard\"}}]'
Windows Command Prompt
The Windows command prompt requires double quotation marks (" ") to enclose the JSON data structure. You must then escape (precede with a backslash [\] character) each double quotation mark (") within the JSON data structure itself, as in the following example:
C:\> aws ec2 run-instances --image-id ami-12345678 --block-device-mappings "[{\"DeviceName\":\"/dev/sdb\",\"Ebs\":{\"VolumeSize\":20,\"DeleteOnTermination\":false,\"VolumeType\":\"standard\"}}]"
Only the outermost quotation marks are not escaped. If the value of a parameter is itself a JSON document, escape the quotation marks on the embedded JSON document. For example, the --attributes parameter for aws sqs create-queue takes a JSON document, which in turn can contain a RedrivePolicy key whose value is also a JSON document. The inner JSON embedded in the outer JSON must be escaped.
$ aws sqs create-queue --queue-name my-queue --attributes '{ "RedrivePolicy":"{\"deadLetterTargetArn\":\"arn:aws:sqs:us-west-2:0123456789012:deadletter\", \"maxReceiveCount\":\"5\"}"}'
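Rather than escaping the inner document by hand, you can generate the attribute value programmatically; a sketch assuming python3 is available (the ARN is the placeholder value from the example above):

```shell
# Serialize the inner policy once, then embed it as a string value in the
# outer document; json.dumps produces the backslash-escaped quotes for us.
inner='{"deadLetterTargetArn":"arn:aws:sqs:us-west-2:0123456789012:deadletter","maxReceiveCount":"5"}'
python3 -c 'import json,sys; print(json.dumps({"RedrivePolicy": sys.argv[1]}))' "$inner"
```

The printed string is exactly the form the --attributes parameter expects, with every inner quotation mark escaped.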
LOADING PARAMETERS FROM A FILE
To specify a file that contains a parameter value, use a file:// URI. The URI provides the path to the file that contains the actual parameter content.
Note
This behavior is disabled automatically for parameters that already expect a URL, such as a parameter that identifies an AWS CloudFormation template URL. You can also disable this behavior yourself by adding the following line to your CLI configuration file:
cli_follow_urlparam = false
The file paths in the following examples are interpreted to be relative to the current working directory.
Linux, macOS, or Unix
// Read from a file in the current directory
$ aws ec2 describe-instances --filters file://filter.json
// Read from a file in /tmp
$ aws ec2 describe-instances --filters
Windows
// Read from a file in C:\temp
C:\> aws ec2 describe-instances --filters file://C:\temp\filter.json
The file:// prefix option supports Unix-style expansions, including "~/", "./", and "../". On Windows, the "~/" expression expands to your user directory, stored in the %USERPROFILE% environment variable.
For example, on Windows 10 you would typically have a user directory under C:\Users\UserName\.
JSON documents that are embedded as the value of another JSON document must still be escaped.
$ aws sqs create-queue --queue-name my-queue --attributes file://attributes.json
attributes.json
{
"RedrivePolicy": "{\"deadLetterTargetArn\":\"arn:aws:sqs:uswest-2:0123456789012:deadletter\", \"maxReceiveCount\":\"5\"}"
}
BINARY FILES
For commands that take binary data as a parameter, specify that the data is binary content by using the fileb:// prefix. Commands that accept binary data include:
• aws ec2 run-instances – --user-data parameter.
• aws s3api put-object – --sse-customer-key parameter.
• aws kms decrypt – --ciphertext-blob parameter.
The following example generates a binary 256-bit AES key using a Linux command line tool, and then provides it to Amazon S3 to encrypt an uploaded file server-side.
$ dd if=/dev/urandom bs=1 count=32 > sse.key
32+0 records in
32+0 records out
32 bytes (32 B) copied, 0.000164441 s, 195 kB/s
$ aws s3api put-object --bucket my-bucket --key test.txt --body test.txt --sse-customer-key fileb://sse.key --sse-customer-algorithm AES256
{
"SSECustomerKeyMD5": "iVg8oWa8sy714+FjtesrJg==",
"SSECustomerAlgorithm": "AES256",
"ETag": "\"a6118e84b76cf98bf04bbe14b6045c6c\""
}
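Because --sse-customer-algorithm AES256 requires a key of exactly 256 bits, a quick size check on the generated file can catch a bad dd invocation before the upload fails (a sketch; sse.key is the file from the example above):

```shell
# Generate the key as in the example, then assert it is exactly 32 bytes
# (256 bits) before passing it via fileb:// to put-object.
dd if=/dev/urandom of=sse.key bs=1 count=32 2> /dev/null
[ "$(wc -c < sse.key)" -eq 32 ] && echo "key is 32 bytes (256 bits)"
```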
REMOTE FILES
The AWS CLI also supports loading parameters from a file hosted on the internet with an http:// or https:// URL. The following example references a file stored in an Amazon S3 bucket. This allows you to access parameter files from any computer, but it does require that the file is publicly accessible.
$ aws ec2 run-instances --image-id ami-12345678 --block-device-mappings http://mybucket.s3.amazonaws.com/filename.json
The preceding example assumes that the file filename.json contains the following JSON data.
[
{
"DeviceName": "/dev/sdb",
"Ebs": {
"VolumeSize": 20,
"DeleteOnTermination": false,
"VolumeType": "standard"
}
}
]
GENERATE THE CLI SKELETON AND INPUT PARAMETERS FROM A JSON INPUT FILE
If you have a question about whether a specific command supports these parameters, run the following command, replacing the service and command names with the ones you're interested in:
$ aws service command help
The output includes a Synopsis section that shows the parameters that the specified command supports. The --generate-cli-skeleton parameter causes the command not to run, but instead to generate and display a parameter template that you can customize and then use as input on a later command. The generated template includes all of the parameters supported by the command. For example, if you run the following command, it generates the parameter template for the Amazon Elastic Compute Cloud (Amazon EC2) command run-instances.
$ aws ec2 run-instances --generate-cli-skeleton
To generate and use a parameter skeleton file
1. Run the command with the --generate-cli-skeleton parameter and direct the output to a file to save it.
$ aws ec2 run-instances --generate-cli-skeleton > ec2runinst.json
2. Open the parameter skeleton file in your text editor and remove any of the parameters that you don't need. For example, you might strip it down to the following.
{
"DryRun": true,
"ImageId": "",
"KeyName": "",
"SecurityGroups": [
""
],
"InstanceType": "",
"Monitoring": {
"Enabled": true
}
}
In this example, we leave the DryRun parameter set to true to use EC2's dry run feature, which lets you safely test the command without actually creating or modifying any resources.
3. Fill in the remaining values with values appropriate for your scenario. In this example, we provide the instance type, key name, security group, and the identifier of the AMI to use. This example assumes the default region. The AMI ami-dfc39aef is a 64-bit Amazon Linux image hosted in the us-west-2 region. If you use a different region, you must find the correct AMI ID to use.
{
"DryRun": true,
"ImageId": "ami-dfc39aef",
"KeyName": "mykey",
"SecurityGroups": [
"my-sg"
],
"InstanceType": "t2.micro",
"Monitoring": {
"Enabled": true
}
}
4. Run the command with the completed parameters by passing the JSON file to the --cli-input-json parameter using the file:// prefix. The AWS CLI interprets the path as relative to your current working directory, so the following example, which specifies only the file name with no path, looks for the file directly in the current working directory.
$ aws ec2 run-instances --cli-input-json file://ec2runinst.json
A client error (DryRunOperation) occurred when calling the RunInstances operation: Request would have succeeded, but DryRun flag is set.
The dry run error indicates that the JSON is formed correctly and the parameter values are valid. If any other issues are reported in the output, fix them and repeat the above step until the "Request would have succeeded" message is displayed.
5. Now you can set the DryRun parameter to false to disable dry run.
{
"DryRun": false,
"ImageId": "ami-dfc39aef",
"KeyName": "mykey",
"SecurityGroups": [
"my-sg"
],
"InstanceType": "t2.micro",
"Monitoring": {
"Enabled": true
}
}
6. Now when you run the command, run-instances actually launches an EC2 instance and displays the details generated by the successful launch.
$ aws ec2 run-instances --cli-input-json file://ec2runinst.json
{
"OwnerId": "123456789012",
"ReservationId": "r-d94a2b1",
"Groups": [],
"Instances": [
...
CONTROLLING COMMAND OUTPUT FROM THE AWS CLI
HOW TO SELECT THE OUTPUT FORMAT
The AWS CLI supports three different output formats:
• JSON (json)
• Tab-delimited text (text)
• ASCII-formatted table (table)
As explained in the configuration topic, you can specify the output format in three ways:
• Using the output option in a named profile in the config file. The following example sets the default output format to text.
[default]
output=text
• Using the AWS_DEFAULT_OUTPUT environment variable. The following command sets the format to table for the commands in this command-line session until the variable is changed or the session ends. Using this environment variable overrides any value set in the config file.
$ export AWS_DEFAULT_OUTPUT="table"
• Using the --output option on the command line. The following example sets the output of only this one command to json. Using this option on the command overrides any currently set environment variable or the value in the config file.
$ aws swf list-domains --registration-status REGISTERED --output json
JSON OUTPUT FORMAT
If you need more advanced features that might not be possible with --query, you can check out jq, a command line JSON processor. You can download it and find the official tutorial at http://stedolan.github.io/jq/.
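If jq isn't available, python3 can handle simple extractions from the CLI's JSON output; a minimal sketch over made-up output (the literal here stands in for real aws command output, and the volume ID is invented):

```shell
# Pipe JSON (a literal standing in for `aws ec2 describe-volumes` output)
# into python3 and pull out a single field.
echo '{"Volumes": [{"VolumeId": "vol-1234abcd", "Size": 8}]}' |
  python3 -c 'import json,sys; print(json.load(sys.stdin)["Volumes"][0]["VolumeId"])'
# prints vol-1234abcd
```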
TEXT OUTPUT FORMAT
The text output format follows the basic structure shown below. The columns are sorted alphabetically by the corresponding key names of the underlying JSON object.
IDENTIFIER sorted-column1 sorted-column2
IDENTIFIER2 sorted-column1 sorted-column2
The following is an example of a text output.
$ aws ec2 describe-volumes --output text
$ aws ec2 describe-volumes --query 'Volumes[*].[VolumeId, Attachments[0].InstanceId, AvailabilityZone, Size, FakeKey]' --output text
TABLE OUTPUT FORMAT
The table format produces human-readable representations of complex AWS CLI output in a tabular form.
$ aws ec2 describe-volumes --output table
$ aws ec2 describe-volumes --query 'Volumes[*].{ID:VolumeId,InstanceId:Attachments[0].InstanceId,AZ:AvailabilityZone,Size:Size}' --output table
$ aws ec2 describe-volumes --query 'Volumes[*].[VolumeId,Attachments[0].InstanceId,AvailabilityZone,Size]' --output table
HOW TO FILTER THE OUTPUT WITH THE --QUERY OPTION
The output below describes two Amazon Elastic Block Store (Amazon EBS) volumes attached to separate Amazon EC2 instances.
$ aws ec2 describe-volumes
We can choose to display only the first volume from the Volumes list by using the following command, which indexes the first volume in the array.
$ aws ec2 describe-volumes --query 'Volumes[0]'
In the next example, we use the wildcard notation [*] to iterate over all of the volumes in the list and extract three elements from each: VolumeId, AvailabilityZone, and Size. The dictionary notation requires that you provide an alias for each JSON key, like this: {Alias1:JSONKey1,Alias2:JSONKey2}. A dictionary is inherently unordered, so the ordering of the key aliases within a structure might be inconsistent.
$ aws ec2 describe-volumes --query 'Volumes[*].{ID:VolumeId,AZ:AvailabilityZone,Size:Size}'
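What that query does can be mimicked locally; the following sketch applies the same projection in python3 to a made-up describe-volumes document (the volume IDs, zones, and sizes are invented) to show the shape of the result:

```shell
# Reproduce 'Volumes[*].{ID:VolumeId,AZ:AvailabilityZone,Size:Size}' over a
# small stand-in document to show the structure that --query produces.
python3 - <<'EOF'
import json

doc = {"Volumes": [
    {"VolumeId": "vol-1", "AvailabilityZone": "us-west-2a", "Size": 30},
    {"VolumeId": "vol-2", "AvailabilityZone": "us-west-2b", "Size": 80},
]}
projected = [{"ID": v["VolumeId"], "AZ": v["AvailabilityZone"], "Size": v["Size"]}
             for v in doc["Volumes"]]
print(json.dumps(projected, indent=4))
EOF
```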
Using dictionary notation, you can also chain keys together, like key1.key2[0].key3, to filter elements deeply nested within the structure. The following example demonstrates this with the Attachments[0].InstanceId key, aliased to simply InstanceId.
$ aws ec2 describe-volumes --query 'Volumes[*].{ID:VolumeId,InstanceId:Attachments[0].InstanceId,AZ:AvailabilityZone,Size:Size}'
You can also filter multiple elements using list notation: [key1, key2]. This formats all filtered attributes into a single ordered list per object, regardless of type.
$ aws ec2 describe-volumes --query 'Volumes[*].[VolumeId,Attachments[0].InstanceId,AvailabilityZone,Size]'
The following example query outputs only volumes in the us-west-2a Availability Zone.
$ aws ec2 describe-volumes --query 'Volumes[?AvailabilityZone==`us-west-2a`]'
The --query parameter further limits the output to only those volumes with a Size value that is larger than 50, and shows only the specified fields with user-defined names.
$ aws ec2 describe-volumes \
--filters "Name=availability-zone,Values=us-west-2a" "Name=status,Values=attached" \
--query 'Volumes[?Size > `50`].{Id:VolumeId,Size:Size,Type:VolumeType}'
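If you want to see what this kind of query does without calling AWS, the same Size > 50 filter can be reproduced client-side. The sketch below uses Python's json module on a hypothetical JSON file shaped like describe-volumes output; the volume IDs and values are made up for illustration.

```shell
# Hypothetical sample shaped like 'aws ec2 describe-volumes' output
cat > volumes.json <<'EOF'
{"Volumes": [
  {"VolumeId": "vol-aaa", "AvailabilityZone": "us-west-2a", "Size": 80, "VolumeType": "gp2"},
  {"VolumeId": "vol-bbb", "AvailabilityZone": "us-west-2a", "Size": 8, "VolumeType": "standard"}
]}
EOF

# Client-side equivalent of --query 'Volumes[?Size > `50`].{Id:VolumeId,Size:Size,Type:VolumeType}'
matched=$(python3 - <<'EOF'
import json
data = json.load(open("volumes.json"))
for v in data["Volumes"]:
    if v["Size"] > 50:
        print(v["VolumeId"], v["Size"], v["VolumeType"])
EOF
)
echo "$matched"
```

Only the 80 GiB volume passes the filter; in real use the --query option performs this selection server-side of your shell, inside the CLI itself.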
The --query parameter also enables you to count items in the output. The following example displays the number of available volumes that have more than 1000 IOPS.
$ aws ec2 describe-volumes \
--filters "Name=status,Values=available" \
--query 'length(Volumes[?Iops > `1000`])'
3
The following example lists the five most recent AMIs that you created, sorted from most recent to oldest.
$ aws ec2 describe-images --owners self \
--query 'reverse(sort_by(Images,&CreationDate))[:5].{id:ImageId,date:CreationDate}'
The following example shows only the InstanceId for any unhealthy instances in the specified Auto Scaling group.
$ aws autoscaling describe-auto-scaling-groups --auto-scaling-group-name My-AutoScalingGroup-Name --output text \
--query 'AutoScalingGroups[*].Instances[?HealthStatus==`Unhealthy`].InstanceId'
USING SHORTHAND SYNTAX WITH THE AWS COMMAND LINE INTERFACE
STRUCTURE PARAMETERS
The shorthand syntax in the AWS CLI makes it easier for users to input parameters that are flat (nonnested structures). The format is a comma-separated list of key-value pairs.
Linux, macOS, or Unix
--option key1=value1,key2=value2,key3=value3
PowerShell
--option "key1=value1,key2=value2,key3=value3"
Both are equivalent to the following example, formatted in JSON.
--option '{"key1":"value1","key2":"value2","key3":"value3"}'
There must be no white space between the comma-separated key-value pairs. Here is an example of the Amazon DynamoDB update-table command with the --provisioned-throughput option specified in shorthand.
$ aws dynamodb update-table --provisioned-throughput ReadCapacityUnits=15,WriteCapacityUnits=10 --table-name MyDDBTable
This is equivalent to the following example, formatted in JSON.
$ aws dynamodb update-table --provisioned-throughput '{"ReadCapacityUnits":15,"WriteCapacityUnits":10}' --table-name MyDDBTable
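To see the shorthand-to-JSON equivalence concretely, here is a small sketch that converts a flat shorthand string into the corresponding JSON with Python. This is a simplification for illustration only: values stay strings here, whereas the real CLI coerces types (such as the integer capacity units) from the service model.

```shell
shorthand='ReadCapacityUnits=15,WriteCapacityUnits=10'
json=$(python3 - "$shorthand" <<'EOF'
import json, sys
# Split "k1=v1,k2=v2" into pairs; works only for flat, comma-separated input
pairs = dict(p.split("=", 1) for p in sys.argv[1].split(","))
print(json.dumps(pairs))
EOF
)
echo "$json"
```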
LIST PARAMETERS
You can specify input parameters in list form in two ways: JSON or shorthand.
The basic format is shown here, where values in the list are separated by a single space.
--option value1 value2 value3
This is equivalent to the following example, formatted in JSON.
--option '["value1","value2","value3"]'
The following is an example of the stop-instances command for Amazon Elastic Compute Cloud (Amazon EC2), where the input parameter (a list of strings) for the --instance-ids option is specified in shorthand.
$ aws ec2 stop-instances --instance-ids i-1486157a i-1286157c i-ec3a7e87
This is equivalent to the following example formatted in JSON.
$ aws ec2 stop-instances --instance-ids '["i-1486157a","i-1286157c","i-ec3a7e87"]'
The following example shows the Amazon EC2 create-tags command, which takes a list of nonnested structures for the --tags option. The --resources option specifies the ID of the instance to tag.
$ aws ec2 create-tags --resources i-1286157c --tags Key=My1stTag,Value=Value1 Key=My2ndTag,Value=Value2 Key=My3rdTag,Value=Value3
This is equivalent to the following example, formatted in JSON. The JSON parameter is written on multiple lines for readability.
$ aws ec2 create-tags --resources i-1286157c --tags '[
{"Key": "My1stTag", "Value": "Value1"},
{"Key": "My2ndTag", "Value": "Value2"},
{"Key": "My3rdTag", "Value": "Value3"}
]'
USING AWS CLI PAGINATION OPTIONS
By default, the AWS CLI uses a page size of 1000 and retrieves all available items. You can use the --page-size option to request a smaller number of items from each call to the AWS service. The CLI still retrieves the full list, but performs a larger number of service API calls in the background, retrieving fewer items with each call. This gives the individual calls a better chance of succeeding without a timeout.
$ aws s3api list-objects --bucket my-bucket --page-size 100
{
"Contents": [
...
To include fewer items at a time in the AWS CLI output, use the --max-items option.
$ aws s3api list-objects --bucket my-bucket --max-items 100
The following example shows how to use the NextToken value returned by the previous example to retrieve the second 100 items.
Note
The parameter --starting-token cannot be null or empty. If the previous command does not return a NextToken value, then there are no more items to return and you do not need to call the command again.
$ aws s3api list-objects --bucket my-bucket --max-items 100 --starting-token
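The token-passing pattern looks like the following runnable sketch. The list_page function is a hypothetical stub standing in for repeated aws s3api list-objects calls with --starting-token, so the loop can be tried without AWS credentials; each "page" prints its items followed by a NEXT= line carrying the next token (empty when there are no more items).

```shell
# Stub for 'aws s3api list-objects ... --starting-token TOKEN' (assumption):
# prints one page of items, then the next token on a NEXT= line.
list_page() {
  case "$1" in
    "")      printf 'item-1\nitem-2\nNEXT=token-2\n' ;;
    token-2) printf 'item-3\nitem-4\nNEXT=\n' ;;
  esac
}

token=""
count=0
while :; do
  page=$(list_page "$token")
  # Count the items on this page (everything except the NEXT= line)
  count=$((count + $(printf '%s\n' "$page" | grep -cv '^NEXT=')))
  # Extract the token to pass to the next call
  token=$(printf '%s\n' "$page" | sed -n 's/^NEXT=//p')
  if [ -z "$token" ]; then break; fi
done
echo "retrieved $count items"
```

The real CLI follows the same shape: call, collect items, read NextToken, and stop when no token is returned.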
UNDERSTANDING RETURN CODES FROM THE AWS CLI
To determine the return code of an AWS CLI command, run one of the following commands immediately after running the CLI command.
Linux/Unix/Mac systems
$ echo $?
Windows PowerShell
PS> echo $lastexitcode
Windows Command Prompt
C:\> echo %errorlevel%
The following are the return code values that can be returned at the end of running an AWS Command Line Interface (AWS CLI) command.
Code | Meaning
0 | The command completed successfully. There were no errors generated by either the AWS CLI or by the AWS service to which the request was sent.
1 | One or more S3 transfer operations failed. Limited to s3 commands.
2 | The meaning of this return code depends on the command:
• The command entered on the command line couldn't be parsed. Parsing failures can be caused by, but aren't limited to, missing required subcommands or arguments, or using unknown commands or arguments. Applicable to all CLI commands.
• One or more files marked for transfer were skipped during the transfer process. However, all other files marked for transfer were successfully transferred. Files that are skipped include files that do not exist, files that are character special devices, block special devices, FIFOs, or sockets, and files that the user doesn't have read permission to. Limited to s3 commands.
130 | The command was interrupted by a SIGINT (Ctrl-C).
255 | The command failed. There were errors generated by either the AWS CLI or by the AWS service to which the request was sent.
To learn more details about a failure, run the command with the --debug switch.
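In a script, the same exit-status check can drive control flow. The sketch below is runnable as-is; the false command stands in for a failing AWS CLI call, since a real call would need credentials.

```shell
# 'false' is a stand-in for a failing AWS CLI command (assumption)
status=0
false || status=$?
if [ "$status" -eq 0 ]; then
  echo "command succeeded"
else
  echo "command failed with exit code $status"
fi
```

The `|| status=$?` form captures the exit code even in scripts run with set -e, which would otherwise abort on the failing command.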
USING THE AWS CLI TO WORK WITH AWS SERVICES
USING AMAZON DYNAMODB WITH THE AWS CLI
To list the AWS CLI commands for DynamoDB, use the following command.
$ aws dynamodb help
Note
For readability, long commands in this section are broken into separate lines. The backslash character is the line-continuation character on the Linux command line, and lets you copy and paste (or enter) multiple lines at a Linux prompt. If you are using a shell that doesn't use the backslash for line continuation, replace the backslash with that shell's line-continuation character, or remove the backslashes and put the entire command on a single line.
For example, the following command creates a table named MusicCollection with Artist as the partition key and SongTitle as the sort key.
$ aws dynamodb create-table \
--table-name MusicCollection \
--attribute-definitions AttributeName=Artist,AttributeType=S AttributeName=SongTitle,AttributeType=S \
--key-schema AttributeName=Artist,KeyType=HASH AttributeName=SongTitle,KeyType=RANGE \
--provisioned-throughput ReadCapacityUnits=1,WriteCapacityUnits=1
You can add new items to the table with commands similar to those shown in the following example. These examples use a combination of shorthand syntax and JSON.
$ aws dynamodb put-item \
--table-name MusicCollection \
--item '{
"Artist": {"S": "No One You Know"},
"SongTitle": {"S": "Call Me Today"}}'
It can be difficult to compose valid JSON in a single-line command. To make this easier, the AWS CLI can read JSON files. For example, consider the following JSON snippet, which is stored in a file named expression-attributes.json.
{
":v1": {"S": "No One You Know"},
":v2": {"S": "Call Me Today"}
}
You can use that file to issue a query request using the AWS CLI. In the following example, the content of the expression-attributes.json file is used for the --expression-attribute-values parameter.
$ aws dynamodb query --table-name MusicCollection \
--key-condition-expression "Artist = :v1 AND SongTitle = :v2" \
--expression-attribute-values file://expression-attributes.json
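When composing parameter files like this by hand, it helps to validate the JSON before the CLI call. A small sketch using Python's built-in json.tool (assuming python3 is on the PATH):

```shell
cat > expression-attributes.json <<'EOF'
{
  ":v1": {"S": "No One You Know"},
  ":v2": {"S": "Call Me Today"}
}
EOF

# json.tool exits nonzero on a syntax error, so this catches malformed files early
if python3 -m json.tool expression-attributes.json > /dev/null; then
  echo "expression-attributes.json is valid JSON"
fi
```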
DynamoDB Local is a small client-side database and server that mimics the DynamoDB service.