I am trying to write a bash script to confirm all unconfirmed users in a Cognito User Pool. The documentation says that I can use cognito:user_status to filter by status, so I wrote this code:
#!/bin/bash
USER_POOL_ID=pool_id
RUN=1
until [ $RUN -eq 0 ]; do
  echo "Listing users"
  # Here is the problem: after the --filter param, how should I query for the unconfirmed users?
  USERS=$(aws --profile jaws-lap cognito-idp list-users --user-pool-id ${USER_POOL_ID} --filter 'cognito:user_status="unconfirmed"' | grep Username | awk -F: '{print $2}' | sed -e 's/\"//g' | sed -e 's/,//g')
  if [ ! "x$USERS" = "x" ]; then
    for user in $USERS; do
      echo "Confirming user $user"
      aws --profile jaws-lap cognito-idp admin-confirm-sign-up --user-pool-id ${USER_POOL_ID} --username ${user}
      echo "Result code: $?"
      echo "Done"
    done
  else
    echo "Done, no more users"
    RUN=0
  fi
done
The problem is that the --filter parameter is not configured properly. How should I write the filter expression so that I get only the unconfirmed users?
Thanks.
This command worked for me:
aws cognito-idp list-users --user-pool-id xxx --filter 'cognito:user_status="CONFIRMED"'
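For the original goal (confirming rather than just listing), here is a minimal sketch built on the same filter, using UNCONFIRMED as the status value and --query instead of grep/sed; the profile is omitted and the pool id is a placeholder:

#!/bin/bash
USER_POOL_ID=pool_id
# list only the unconfirmed users and confirm each of them
for user in $(aws cognito-idp list-users --user-pool-id "${USER_POOL_ID}" --filter 'cognito:user_status="UNCONFIRMED"' --query 'Users[].Username' --output text); do
  echo "Confirming user ${user}"
  aws cognito-idp admin-confirm-sign-up --user-pool-id "${USER_POOL_ID}" --username "${user}"
done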
Related
How can we delete all users from a specific user pool in AWS Cognito using AWS CLI?
Try the following:
aws cognito-idp list-users --user-pool-id $COGNITO_USER_POOL_ID |
jq -r '.Users | .[] | .Username' |
while read uname1; do
echo "Deleting $uname1";
aws cognito-idp admin-delete-user --user-pool-id $COGNITO_USER_POOL_ID --username $uname1;
done
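If the pool holds more users than come back in a single response (and your CLI version does not paginate ListUsers for you), a sketch that simply reruns the same pipeline until nothing is left:

# repeat until list-users returns no usernames (assumes jq is installed)
while true; do
  users=$(aws cognito-idp list-users --user-pool-id "$COGNITO_USER_POOL_ID" | jq -r '.Users[].Username')
  [ -z "$users" ] && break
  for u in $users; do
    echo "Deleting $u"
    aws cognito-idp admin-delete-user --user-pool-id "$COGNITO_USER_POOL_ID" --username "$u"
  done
done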
To speed up deletion, I modified @GRVPrasad's answer to use xargs -P, which farms deletions out to multiple parallel processes.
aws cognito-idp list-users --user-pool-id $COGNITO_USER_POOL_ID | jq -r '.Users | .[] | .Username' | xargs -n 1 -P 5 -I % bash -c "echo Deleting %; aws cognito-idp admin-delete-user --user-pool-id $COGNITO_USER_POOL_ID --username %"
Here is a bash version based on @ajilpm's batch script:
# deleteAllUsers.sh
COGNITO_USER_POOL_ID=$1
aws cognito-idp list-users --user-pool-id $COGNITO_USER_POOL_ID |
jq -r '.Users | .[] | .Username' |
while read user; do
aws cognito-idp admin-delete-user --user-pool-id $COGNITO_USER_POOL_ID --username $user
echo "$user deleted"
done
You must have jq installed and remember to make the script executable: chmod +x deleteAllUsers.sh.
The user pool id can be provided as a command line argument: ./deleteAllUsers.sh COGNITO_USER_POOL_ID.
I created a script to do it from the Windows command prompt, if you have the AWS CLI installed and configured. It deletes users one page at a time, so you need to run it until all users are removed.
You need to have jq downloaded and its location added to the system PATH for the following to work.
---delete.bat---
@echo off
setlocal
for /f "delims=" %%I in ('aws cognito-idp list-users --user-pool-id %COGNITO_USER_POOL_ID% ^| jq -r ".Users | .[] | .Username"') do (
  aws cognito-idp admin-delete-user --user-pool-id %COGNITO_USER_POOL_ID% --username %%I
  echo %%I deleted
)
---delete.bat---
Sorry, I cannot add a comment. I had the same requirement, and a slight modification of the command mentioned by ajilpm worked on Windows 10 for me. You need to download jq.exe and keep it on the PATH for the command line.
---Start.bat---
@echo off
setlocal
for /f "delims=" %%I in ('aws cognito-idp list-users --user-pool-id us-west-2_O7rRBQ5rr --profile dev-hb ^| jq -r ".Users | .[] | .Username"') do ( aws cognito-idp admin-delete-user --user-pool-id us-west-2_O7rRBQ5rr --username %%I --profile dev-hb)
---Start.bat---
With Python and boto3 (I use email as the username):
import boto3 as aws
import pandas as pd

client_cognito = aws.client('cognito-idp')
getProperties = pd.read_csv('CognitoUsers.csv', header=0)
usernames = getProperties['email']
for username in usernames:
    response = client_cognito.admin_delete_user(
        UserPoolId="us-east-1_xxxxxxxxx",
        Username=username,
    )
You need to be logged in to the AWS CLI with your AWS Access Key ID and AWS Secret Access Key.
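For example, boto3 picks up credentials configured through the AWS CLI; a minimal sketch (the profile name is just an illustration):

aws configure                       # prompts for Access Key ID, Secret Access Key, region and output format
# or keep the keys under a named profile and point boto3 at it:
aws configure --profile cognito-admin
export AWS_PROFILE=cognito-admin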
I am trying to get details from the AWS SSM Parameter Store. The value stored in the parameter looks like this:
CompanyName\credits
Here is the SSM command executed through the AWS CLI, along with its output:
aws ssm get-parameters --names "/Data/Details"

Output:

{
    "Parameters": [
        {
            "Name": "/Data/Details",
            "Type": "String",
            "Value": "CompanyName\\Credits",
            "Version": 1,
            "LastModifiedDate": "2019-08-13T18:16:40.836000+00:00",
            "ARN": "arn:aws:ssm:us-west-1:8484848448444:parameter/Data/Details"
        }
    ],
    "InvalidParameters": []
}
In the above output, I am trying to print only **Credits** from **"Value": "CompanyName\\Credits"**, so I added more filters to my command as follows:
aws ssm get-parameters --names "/Data/Details" | grep -i "Value" | sed -e 's/[",]//g' | awk -F'\\' '{print $2}'
The above command gives nothing in the output, but when I print the first field I can see **Value: CompanyName** using the following command:
aws ssm get-parameters --names "/Data/Details" | grep -i "Value" | sed -e 's/[",]//g' | awk -F'\\' '{print $1}'
Since this is a Linux machine, the double backslash in the field **Value: CompanyName\\Credits** is there to escape the '\' character for the actual value CompanyName\Credits. Can someone let me know how I can modify the command to print only the value **Credits** in my output?
Regards,
Karthik
I think this should work:
param_info=$(aws ssm get-parameters \
--names "/Data/Details" \
--query 'Parameters[0].Value' \
--output text)
echo ${param_info} | awk -F'\\' '{print $2}'
# or without awk, using bash parameter expansion
echo ${param_info#*\\}   # Credits
echo ${param_info%\\*}   # CompanyName
This uses the --query and --output options to control the AWS CLI output; the alternative uses bash parameter expansion instead of awk.
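If you prefer to avoid awk, an equivalent one-liner (a sketch, assuming the value always contains exactly one backslash) splits on the backslash with cut:

aws ssm get-parameters --names "/Data/Details" --query 'Parameters[0].Value' --output text | cut -d'\' -f2   # prints Credits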
I have been asked to list all user access in my company's AWS account. Is there any way to list the resources and the respective permissions a user has? I find it difficult to get these details by looking at both IAM policies and resource-based policies, and it is even more difficult if the user has cross-account access.
There is no single command that lists all the permissions. If you are interested in using a tool, you can try a tool for quickly evaluating IAM permissions in AWS.
You can try the script below as well; since listing permissions with a single command is not possible, you can combine multiple commands.
#!/bin/bash
username=ahsan
echo "**********************"
echo "user info"
aws iam get-user --user-name $username
echo "***********************"
echo ""
# if [ $1=="test" ]; then
# all_users=$(aws iam list-users --output text | cut -f 6)
# echo "users in account are $all_users"
# fi
echo "get user groups"
echo "***********************************************"
Groups=$(aws iam list-groups-for-user --user-name $username --output text | awk '{print $5}')
echo "user $username belong to $Groups"
echo "***********************************************"
echo "listing policies in group"
for Group in $Groups
do
echo ""
echo "***********************************************"
echo "list attached policies with group $Group"
aws iam list-attached-group-policies --group-name $Group --output table
echo "***********************************************"
echo ""
done
echo "list attached policies"
aws iam list-attached-user-policies --user-name $username --output table
echo "-------- Inline Policies --------"
for Group in $Groups
do
aws iam list-group-policies --group-name $Group --output table
done
aws iam list-user-policies --user-name $username --output table
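The commands above only list policy names. To see the actual document of an inline policy, you could add something like this (a sketch; the policy names are placeholders for whatever the listing printed):

# show the JSON document of one inline user policy
aws iam get-user-policy --user-name "$username" --policy-name SomeInlinePolicyName
# and of one inline group policy
aws iam get-group-policy --group-name "$Group" --policy-name SomeInlinePolicyName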
Run the one-liner bash script below to list all users with their inline policies, groups, and attached policies.
aws iam list-users | grep -i username > list_users; cat list_users | awk '{print $NF}' | tr '"' ' ' | tr ',' ' ' | while read user; do echo -e "\n\n--------------Getting information for user $user-----------\n\n"; aws iam list-user-policies --user-name $user --output yaml; aws iam list-groups-for-user --user-name $user --output yaml; aws iam list-attached-user-policies --user-name $user --output yaml; done; echo; echo
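A variant of the same idea that skips the grep/tr parsing by letting the CLI return just the usernames (a sketch; --output yaml assumes AWS CLI v2):

for user in $(aws iam list-users --query 'Users[].UserName' --output text); do
  echo "-------------- Getting information for user $user --------------"
  aws iam list-user-policies --user-name "$user" --output yaml
  aws iam list-groups-for-user --user-name "$user" --output yaml
  aws iam list-attached-user-policies --user-name "$user" --output yaml
done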
I am trying to encrypt and decrypt plaintext using aws kms encrypt and decrypt, but it shows the following error:
aws [options] <command> <subcommand> [<subcommand> ...] [parameters]
To see help text, you can run:
aws help
aws <command> help
aws <command> <subcommand> help
**Unknown options: --decode, >, ExampleEncryptedFile.txt, base64**
Commands which I used:
**aws kms encrypt --key-id 1234abcd-12ab-34cd-56ef-1234567890ab --plaintext mysecretpassword --output text --query CiphertextBlob | base64 --decode > ExampleEncryptedFile**
If I use it like the below, it works:
**aws kms encrypt --key-id 1234abcd-12ab-34cd-56ef-1234567890ab --plaintext fileb://ExamplePlaintextFile --output text --query CiphertextBlob**
Decrypt is also failing, with an **InvalidCiphertextException** error.
Thanks in advance!
Try these:
#Encrypt
#!/bin/bash
if [ -z "$2" ]
then
echo 'Encrypt a file with AWS KMS'
echo 'Usage: ./encrypt.sh <inputfile> <outputfile>'
echo 'Example: ./encrypt.sh input.txt output.txt'
exit 1
fi
aws kms encrypt --key-id alias/**SOME_KEY_ALIAS** --plaintext fileb://$1 --output text --query CiphertextBlob | base64 --decode > $2
#####
# Decrypt
#!/bin/bash
if [ -z "$2" ]
then
echo 'Decrypt a file with AWS KMS'
echo 'Usage: ./decrypt.sh <inputfile> <outputfile>'
echo 'Example: ./decrypt.sh input.txt output.txt'
exit 1
fi
aws kms decrypt --ciphertext-blob fileb://$1 --output text --query Plaintext | base64 --decode > $2
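A usage example for the two scripts above (file names are placeholders; it assumes your credentials are allowed to use the key alias):

chmod +x encrypt.sh decrypt.sh
./encrypt.sh secret.txt secret.encrypted
./decrypt.sh secret.encrypted secret.decrypted
diff secret.txt secret.decrypted   # no output means the round trip worked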
I'm trying to write a script that, with the help of Jenkins, will look at the updated files in git, download them, and encrypt them using AWS KMS. I have a working script that does all of this, and the file is downloaded to the Jenkins repository on the local server. But my problem is encrypting this file in the Jenkins repo. Basically, when I encrypt files on my local computer, I use the command:
aws kms encrypt --key-id xxxxxxx-xxx-xxxx-xxxx-xxxxxxxxxx --plaintext fileb://file.json --output text --query CiphertextBlob | base64 --decode > Encrypted-data.json
and all is OK, but if I try to do it with Jenkins I get an error that the aws command is not found.
Does somebody know how to solve this problem and how to make the AWS CLI available to Jenkins?
Here is my working code which breaks down on the last line:
bom_sniffer() {
    head -c3 "$1" | LC_ALL=C grep -qP '\xef\xbb\xbf'
    if [ $? -eq 0 ]
    then
        echo "BOM SNIFFER DETECTED BOM CHARACTER IN FILE \"$1\""
        exit 1
    fi
}

check_rc() {
    # exit if passed in value is not = 0
    # $1 = return code
    # $2 = command / label
    if [ $1 -ne 0 ]
    then
        echo "$2 command failed"
        exit 1
    fi
}

# finding files that differ from this commit and master
echo 'git fetch'
check_rc $? 'echo git fetch'
git fetch
check_rc $? 'git fetch'
echo 'git diff --name-only origin/master'
check_rc $? 'echo git diff'
diff_files=$(git diff --name-only $GIT_PREVIOUS_COMMIT $GIT_COMMIT | xargs)
check_rc $? 'git diff'
for x in ${diff_files}
do
    echo "${x}"
    cat ${x}
    bom_sniffer "${x}"
    check_rc $? "BOM character detected in ${x},"
    aws configure kms encrypt --key-id xxxxxxx-xxx-xxxx-xxxx-xxxxxxxxxx --plaintext fileb://${x} --output text --query CiphertextBlob | base64 --decode > Encrypted-data.json
done
After our discussion, this is how the issue was resolved:
First, corrected the command by removing configure from it.
Then installed the awscli for the jenkins user:
pip install awscli --user
Used the absolute path of aws in your script: for example, if it is in ~/.local/bin/, use ~/.local/bin/aws kms encrypt --key-id xxxxxxx-xxx-xxxx-xxxx-xxxxxxxxxx --plaintext fileb://${x} --output text --query CiphertextBlob | base64 --decode > Encrypted-data.json in your script.
Or add the location of aws to PATH.
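For the last point, a minimal sketch of what the Jenkins shell step could look like (the ~/.local/bin location is an assumption; adjust it to wherever pip installed the CLI for the jenkins user):

# make the user-installed aws binary visible to this Jenkins shell step
export PATH="$HOME/.local/bin:$PATH"
aws --version   # sanity check: should print a version instead of "command not found"
# then the corrected command from the question, without "configure":
aws kms encrypt --key-id xxxxxxx-xxx-xxxx-xxxx-xxxxxxxxxx --plaintext fileb://${x} --output text --query CiphertextBlob | base64 --decode > Encrypted-data.json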