I'm learning AWS by following a Udemy course. I'm on a lesson where I need to manage the c:/Users/home/.aws directory. This folder contains user credentials, but I don't have this directory on my Windows machine.
My steps:
I created a Windows server
Created a user
Logged the user in to the AWS server
I connected to the Windows machine with Administrator credentials
Installed the AWS CLI
In the tutorial the lecturer opens the c:/users/home/.aws directory, but I don't have such a folder on my machine. What is this folder, and what do I need to do to get it?
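For what it's worth, the `.aws` folder is not created by installing the CLI; it is created the first time you run `aws configure`. A minimal sketch of the layout that command writes (the key values below are placeholders, and the scratch `demo` directory just stands in for `%USERPROFILE%` on Windows, `~` on Linux/macOS):

```shell
# `aws configure` prompts for your IAM user's keys and writes two files
# under %USERPROFILE%\.aws on Windows (~/.aws on Linux/macOS). The
# resulting layout looks like this (placeholder values, scratch dir):
mkdir -p demo/.aws
printf '[default]\naws_access_key_id = AKIAEXAMPLE\naws_secret_access_key = EXAMPLESECRET\n' > demo/.aws/credentials
printf '[default]\nregion = us-east-1\noutput = json\n' > demo/.aws/config
ls demo/.aws
```

Once `credentials` and `config` exist, CLI commands pick up the `default` profile automatically.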
Related
I wanted to ask something about a Debian virtual machine on Google Cloud. I've set up my Compute Engine instance and my laptop so that I can SSH to the VM from my laptop with ssh account1@IPAddress, and now I'm logged in as account1@VM-name $ in my terminal. After doing so, I clone my own GitHub repo to start my Node.js server (the Node.js server might be unimportant in this case, but the git clone is important to the story). Then I disconnected from the terminal and tried the SSH-in-browser method instead. With that I'm logged in with my email, let's say as account2@VM-name $. However, when I typed ls, the GitHub folder was not there. Is this normal with Debian VMs across cloud services, such that different accounts cannot see each other's folders, or is there actually a way to share the same files between different "SSH accounts"? Or maybe I should sudo git clone so that it is saved in the root folder on the VM?
Thanks a lot for the help!
When you logged in as account1@vm-name and cloned your GitHub repo, it was placed in the /home/account1 directory.
Now, when you log in as account2@vm and run ls, it shows the contents of /home/account2, so it's expected that nothing is there.
You can list the other account's home directory with the command below. (Note that sudo cd doesn't work: cd is a shell builtin, so sudo can't change your shell's working directory.)
sudo ls -la /home/account1
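If the two accounts actually need to share the working copy, a common approach is group permissions rather than sudo for everything. A sketch using scratch directories so it runs without root; on the VM you would apply the same `chmod` to `/home/account1/<repo>` after putting both accounts in a shared group with `sudo usermod -aG <group> <account>`:

```shell
# Scratch stand-in for /home/account1/your-repo:
REPO=./repo-demo
mkdir -p "$REPO"
touch "$REPO/server.js"
# Give the group read/write everywhere, plus execute on directories
# only (that is what the capital X means):
chmod -R g+rwX "$REPO"
# Group members also need execute on the parent directory to enter it:
chmod g+x .
ls -l "$REPO"
```

Cloning with sudo into root's home just moves the same problem somewhere less convenient, so the group route is usually preferable.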
I've got a Django project which works great. Previously we just cloned it down and used password authentication. I changed the remote to git@bitbucket.org:myteam/our_repo.git.
Recently we started requiring 2FA, so now we can only clone down over SSH.
For this project, I created an access key (read-only, which is all I need for cloning down on a staging server) and I was able to clone the repo (git clone git@bitbucket.org:myteam/our_repo.git) without issue and get it all set up. This appeared to have worked.
The other server admin remoted in and tried to run git pull origin master, and he got a permission issue. His Windows user is part of the Administrators group, but for some reason that didn't matter. His local user had to be added to the directory with full access before he could run git pull origin master.
It appears that this permission issue is causing other issues, too. File uploads (from the Django admin) are no longer actually uploading the files into the directory on the server - my guess is that this is related to the permissions issue, too. Nothing was changed to impact this - the project was just cloned down over SSH.
Does cloning something down over SSH change the permissions on the directories or somehow lock it down more? This wasn't an issue before - only since we've switched over to SSH.
Any feedback is helpful!
Does cloning something down over SSH change the permissions on the directories or somehow lock it down more?
No, it does not change anything locally.
And 2FA only impacts HTTPS URLs (where your password must be a PAT, a Personal Access Token).
It has no bearing on SSH URLs.
Check first the output of ssh -Tv git@bitbucket.org.
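One sanity check you can run anywhere git is installed: the file modes of a fresh clone come from your umask, not from the transport, so an SSH clone and an HTTPS clone produce identical working trees. A throwaway demo (local paths only, no network; repo names are made up):

```shell
umask 022                         # so the resulting modes are predictable
git init -q src-repo
git -C src-repo -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "init"
git clone -q src-repo clone-demo  # transport doesn't matter; same rules apply
stat -c '%a' clone-demo           # directory mode follows the umask: 755
```

So the Windows permission problem is about which account did the cloning (the new directory is owned by whoever ran git clone), not about SSH itself.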
I have installed Jenkins and Apache on a CentOS instance in AWS. I have connected Jenkins with GitHub, but I am not able to access the application through its URL, as it shows the following error.
You should add your website content to the directory /var/www/html/.
I need to copy files from the Jenkins directory to the one mentioned. Can you please help me copy the app so that I can use it in the browser?
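A sketch of the copy step using scratch directories (on the server the source would be something like /var/lib/jenkins/workspace/&lt;your-job&gt; and the target /var/www/html/, you would run the copy with sudo, and afterwards chown -R apache:apache /var/www/html so httpd can serve the files; the job name below is made up):

```shell
# Stand-ins for /var/lib/jenkins/workspace/my-app and /var/www/html:
mkdir -p workspace/my-app html
echo '<h1>hello</h1>' > workspace/my-app/index.html
# The trailing /. copies the directory's contents, not the directory itself:
cp -r workspace/my-app/. html/
ls html
```

You can also make this a post-build step in the Jenkins job so the copy happens automatically on every build.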
We are trying to use Amazon Cognito for user authentication with our existing Laravel 5.1 application. We are looking for a Composer package for Laravel 5.1 that allows working with Amazon Cognito User Pools: registering users into User Pools, password resets, etc.
You can use the AWS SDK for PHP via Composer. See the following quoted steps from this guide.
Open a terminal window and navigate to the directory where your project is stored. Composer is installed on a per-project basis.
Download and install Composer in your project directory. If you have curl installed, you can use the following command:
curl -sS https://getcomposer.org/installer | php
When the installation script finishes, a composer.phar file will be created in the directory where you ran the installer.
Create a file at the root level of your project called composer.json and add the following dependency for the AWS PHP SDK:
{
"require": {
"aws/aws-sdk-php": "2.*"
}
}
Install the dependencies by running Composer's install command:
php composer.phar install
This will create a vendor directory in your project with the required libraries and an autoloader script used to load them for your project.
Require Composer's autoloader by adding the following line to your code's bootstrap process (typically in index.php):
require '/path/to/sdk/vendor/autoload.php';
Your code is now ready to use the AWS SDK for PHP!
AWS added Cognito User Pools management in version 3.32.7. You may have a look at the AWS Service Provider for Laravel as well for more information.
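Note that the quoted guide pins "aws/aws-sdk-php": "2.*", but the Cognito User Pools client only exists in v3 of the SDK, so you would want a v3 constraint instead. A sketch of the updated composer.json (the exact version pin is illustrative):

```json
{
  "require": {
    "aws/aws-sdk-php": "^3.32"
  }
}
```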
This question is quite old, but for anyone looking for such a package, check this out; I think it's what you need:
https://github.com/black-bits/laravel-cognito-auth
I have a Django/Python map application. This application uses the GDAL library to create GeoTIFFs and tiled layers. Everything works absolutely fine: GeoTIFFs are created nicely, and folders with tiles are also created. But there is a problem: I cannot create a store and publish a layer, because GeoServer says that it does not have permission to write to my Django folders. The error message looks like this:
'/var/www/django_project/django_project/layers/tiled/test_layer1/0/0.prj', "[Errno 13] Permission denied:
And there are multiple similar error messages. When I check the permissions on the test_layer1 folder, I see that it is 777 and the owner is apache (not Apache Tomcat, but Apache httpd). I guess the whole problem is the owner: GeoServer for some reason is not able to write to these folders (in spite of the 777 permissions). But I'm not sure what owner and what permissions I should set so that the Django application (or perhaps I should say Apache) can write to this folder and GeoServer can read and write to it. By the way, GeoServer is running on top of Apache Tomcat.
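Since the question is what owner/permissions to set: one common pattern is a shared group for the two service users plus the setgid bit on the layers tree, so that folders Django creates stay in the group GeoServer's tomcat user belongs to. On the server that would mean something like sudo usermod -aG &lt;sharedgroup&gt; tomcat and chgrp -R &lt;sharedgroup&gt; on the layers directory (the group name is hypothetical), followed by the chmod shown below; here it runs on a scratch directory so it works without root. Also worth verifying: every parent directory of layers/ must have execute permission for the tomcat user, since 777 on the leaf folder is not enough if a parent blocks traversal.

```shell
# Scratch stand-in for /var/www/django_project/django_project/layers:
mkdir -p layers
# 2775: group rwx plus the setgid bit; anything created inside a setgid
# directory inherits that directory's group automatically:
chmod 2775 layers
mkdir layers/tiled           # simulates Django creating a tile folder
stat -c '%a %G' layers/tiled # the group matches the parent's group
```

With that in place, new tile folders Django writes are group-writable for GeoServer without needing 777 or ownership changes after every upload.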