#format wiki
#language en
= Amazon Web Services - Cloud provider =
 * Links: [[AWS/SSO]], [[AWS/CloudWatch/FlowLog]], [[https://www.apptio.com/blog/aws-ebs-performance-confused/|2021 apptio EBS performance]], [[AWS/LinuxNetwork]]
== Install AWS CLI v2 ==
 * Install client {{{
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
}}}
 * Configure with new access keys created in IAM, and a default region {{{
aws configure
}}}
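 * Sanity check that the install and credentials work; a minimal sketch assuming the default profile was just configured {{{
# confirm the v2 binary is on PATH
aws --version
# confirm the configured credentials resolve to an IAM identity
aws sts get-caller-identity
}}}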

== 2021 get AMI images ==
 * Using aws cli {{{
# list AMIs owned by this account or by Amazon
$ aws ec2 describe-images --region ap-southeast-2 --owners self amazon
# or filter by name and show just the "Name" field
$ aws ec2 describe-images --region ap-southeast-2 --filters "Name=name,Values=Windows_Server-2019-English-Full-Base*" --owners self amazon | grep "\"Name\""
}}}
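 * Instead of grep on raw JSON, the CLI's built-in --query (JMESPath) can sort and select fields directly; a sketch using the same filter as above {{{
$ aws ec2 describe-images --region ap-southeast-2 --owners self amazon \
    --filters "Name=name,Values=Windows_Server-2019-English-Full-Base*" \
    --query 'sort_by(Images, &CreationDate)[-1].[ImageId,Name]' --output text
}}}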

== 2018 CloudWatch syslog ==
 * Install the agent (after downloading amazon-cloudwatch-agent.deb from AWS) {{{
$ sudo dpkg -iE amazon-cloudwatch-agent.deb
$ sudo systemctl enable amazon-cloudwatch-agent.service
$ sudo systemctl start amazon-cloudwatch-agent.service
$ sudo systemctl status amazon-cloudwatch-agent.service
}}}
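 * The agent only ships logs once it has a config; a minimal sketch for syslog (the log group name is a placeholder, the paths are the agent's standard install locations) {{{
$ cat <<'EOF' | sudo tee /opt/aws/amazon-cloudwatch-agent/etc/syslog.json
{
  "logs": {
    "logs_collected": {
      "files": {
        "collect_list": [
          { "file_path": "/var/log/syslog",
            "log_group_name": "my-syslog",
            "log_stream_name": "{instance_id}" }
        ]
      }
    }
  }
}
EOF
$ sudo /opt/aws/amazon-cloudwatch-agent/bin/amazon-cloudwatch-agent-ctl \
    -a fetch-config -m ec2 -s -c file:/opt/aws/amazon-cloudwatch-agent/etc/syslog.json
}}}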
 * --( old awslogs agent, superseded by the CloudWatch agent above {{{
$ curl https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/awslogs-agent-setup.py -O
$ sudo python3 awslogs-agent-setup.py
}}} )--
 * List CloudWatch log groups with the aws cli {{{
$ aws logs describe-log-groups --profile nonprod
}}}
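 * CLI v2 can also tail a log group directly; a sketch, with the group name a placeholder {{{
$ aws logs tail my-syslog --follow --profile nonprod
}}}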
== 2018 aws tool, setup for S3 upload on Raspberry Pi 3 ==
 * sudo pip3 install -U aws
    * If this errors out while compiling native extensions, install the missing dev headers and retry; see the sketch below.
    * sudo apt install libffi-dev libssl-dev

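 * The failing-then-fixed sequence as one sketch (the exact pip error text varies; libffi-dev and libssl-dev cover the usual cffi/ssl header build failures) {{{
# first attempt may fail while building native extensions
sudo pip3 install -U aws
# install the missing C headers, then retry
sudo apt install libffi-dev libssl-dev
sudo pip3 install -U aws
}}}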

== 2016 ==
 * [[https://live.awsevents.com/?sc_channel=em&sc_campaign=chicagosummit2016&sc_publisher=aws&sc_medium=em_13869&sc_content=launch_t1launch_tier1&sc_country=global&sc_geo=global&sc_category=mult&sc_outcome=launch&trk=ema_13869&mkt_tok=eyJpIjoiTWpZMk1XTmtPVGRrTmpnMSIsInQiOiJVb3VIVURraVRGNTRKNEtWNzNjNTlJWmlPUmRwSDRyWFhzaG1PSHY1YXJcL0swRnpKd1BhUEFMdzNGMU53UTd4Mkd6WlJyM1htWGladlNBNHpZMk1sUTIxR3NzTlB4RDdOYnVvaDlZRHErXC9RPSJ9|2016-aws-live]]
 * [[https://aws.amazon.com/partners/success/infor/|AWS-Partner-infor]]
 * On Linux, s3cmd can be used to back up to S3 storage in AWS.
   * $ s3cmd --configure
     * Get the keys from http://aws.amazon.com/ (User Name, Access Key Id, Secret Access Key)
     * s3cmd mb s3://backupVigor
       {{{ Bucket 's3://backupVigor/' created }}}
 * Use tar with the incremental option to back up files, pipe through xz to compress (and gpg to encrypt if needed), then pipe straight into s3cmd to write the backup object; see the sketch after this list.
   * 20160322 - This works well, and no local copy of the archive is needed while creating the backup.
 * Then set an S3 lifecycle policy on the bucket to move older files to Glacier, and even delete very old files, e.g. after 700 days.
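 * A sketch of that streaming backup pipeline, assuming an s3cmd version that accepts '-' for stdin; the snapshot file, source path, and gpg recipient are placeholders {{{
# incremental archive, compressed and encrypted in-stream,
# uploaded with no local temp copy
tar --create --listed-incremental=/var/backups/home.snar /home \
  | xz -6 \
  | gpg --encrypt --recipient backup@example.com \
  | s3cmd put - s3://backupVigor/home-$(date +%Y%m%d).tar.xz.gpg
}}}
 * The lifecycle rule can also be set from the aws cli; a sketch, with the day counts as examples {{{
$ aws s3api put-bucket-lifecycle-configuration --bucket backupVigor \
    --lifecycle-configuration '{
      "Rules": [{
        "ID": "archive-then-expire",
        "Status": "Enabled",
        "Filter": {"Prefix": ""},
        "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
        "Expiration": {"Days": 700}
      }]
    }'
}}}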
== Linux command line bash upload to AWS S3 ==
 * [[https://geek.co.il/2014/05/26/script-day-upload-files-to-amazon-s3-using-bash|2014 Script Day: upload files to Amazon S3 using bash]]
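 * A minimal sketch of what that script does: sign a PUT request with the account keys (HMAC-SHA1, AWS Signature Version 2, which newer regions no longer accept) and send it with curl; all variable values are placeholders {{{
#!/bin/bash
file="backup.tar.xz"
bucket="backupVigor"
s3Key="AKIA..."                         # Access Key Id
s3Secret="..."                          # Secret Access Key
contentType="application/octet-stream"
dateValue=$(date -R)
resource="/${bucket}/${file}"
# SigV2 string-to-sign: VERB, MD5 (empty), content type, date, resource
stringToSign="PUT\n\n${contentType}\n${dateValue}\n${resource}"
signature=$(echo -en "${stringToSign}" \
  | openssl sha1 -hmac "${s3Secret}" -binary | base64)
curl -X PUT -T "${file}" \
  -H "Date: ${dateValue}" \
  -H "Content-Type: ${contentType}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  "https://${bucket}.s3.amazonaws.com/${file}"
}}}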

...
----
CategoryStorage CategoryDevelopment CategorySecurity