Boto.conf not found

I am running a Flask app on an AWS EC2 server and have been using boto to access data stored in DynamoDB. After accidentally adding boto.conf to a git commit (and pushing and pulling on the server), I have found that my Python code can no longer locate the boto.conf file. I rolled back the changes with git, but the problem remains.
The Python module and the boto.conf file exist in the same directory, but when the module calls
boto.config.load_credential_file('boto.conf')
I get the Flask error IOError: [Errno 2] No such file or directory: 'boto.conf'.

As per the documentation:
I'm not really sure why you are using boto.config.load_credential_file. In general, boto picks up its config from a file called either ~/.boto or /etc/boto.cfg.
You can also look at this question from SO, which covers how to get the configuration for boto: Getting Credentials File in the boto.cfg for Python
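If you do want to keep loading boto.conf explicitly, a minimal sketch (assuming the file sits next to the Python module, as described above) is to resolve the path against the module's own directory rather than the process's working directory, which is what a bare 'boto.conf' resolves against under Flask:
import os
import boto

# Resolve boto.conf relative to this module's directory, not the CWD.
conf_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'boto.conf')
boto.config.load_credential_file(conf_path)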

Related

Unable to push on Heroku

Processing /D:/bld/astroid_1640971040574/work
remote: ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory: '/D:/bld/astroid_1640971040574/work'
I am getting the above error while trying to deploy to Heroku using git push heroku main.
How can I push the changes?
Either what you are specifying includes a path (a local path that is only valid on your workstation, i.e. your local development environment), or what you are executing includes a path.
On the specification side, as explained here, check your requirements.txt file content: no local path should be there (see the sketch after this answer).
On the execution side, make sure your server.js is not using PWD.
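For illustration (the version pin below is hypothetical), a requirements.txt generated with pip freeze in a conda environment can contain exactly the kind of local build path shown in the error; replacing it with a plain version pin lets Heroku install from PyPI:
# Bad: a local build path that only exists on the original machine
astroid @ file:///D:/bld/astroid_1640971040574/work
# Good: a plain version pin that Heroku can resolve (pick your actual version)
astroid==2.9.0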
It might be that your project is in the master branch on GitHub, but you are trying to push the main branch to Heroku. Check carefully which branch on GitHub you uploaded the project to, then push accordingly (for example, git push heroku master:main if your work is on master). Please also share a screenshot of all the commands you ran.

Why can't my GCP script/notebook find my file?

I have a working script that finds the data file when it is in the same directory as the script. This works both on my local machine and on Google Colab.
When I try it on GCP, though, it cannot find the file. I tried three approaches:
PySpark Notebook:
Upload the .ipynb file, which includes a wget command. This downloads the file without error, but I am unsure where it saves it to, and the script cannot find the file either (I assume because I am telling it that the file is in the same directory, and presumably wget on GCP saves it somewhere else by default).
PySpark with bucket:
I did the same as the PySpark notebook above, but first I uploaded the dataset to the bucket and then used the two links provided in the file details when you click the file name inside the bucket on the console (neither worked). I would like to avoid this approach anyway, as wget is much faster than downloading over my slow wifi and then re-uploading to the bucket through the console.
GCP SSH:
Create a cluster.
Access the VM through SSH.
Upload the .py file using the cog icon.
wget the dataset and move both into the same folder.
Run the script using python gcp.py.
This just gives me an error saying the file was not found.
Thanks.
As per your first and third approaches: if you are running PySpark code on Dataproc, whether from an .ipynb file or a .py file, please note the points below.
If you use the wget command to download the file, it will be downloaded to the current working directory where your code executes.
When you try to access the file through PySpark code, it will look in HDFS by default. If you want to access the downloaded file from the current working directory, use the 'file:///' URI with the absolute file path (see the sketch after the command below).
If you want to access the file from HDFS, you have to move the downloaded file into HDFS first and then access it there using an absolute HDFS file path. Please refer to the example below:
hadoop fs -put <local file_name> </HDFS/path/to/directory>
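A minimal PySpark sketch showing both access patterns; the file names and user paths here are placeholders:
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-dataset").getOrCreate()

# File downloaded with wget into the working directory:
# use the file:/// scheme plus an absolute local path.
local_df = spark.read.csv("file:///home/your_user/dataset.csv", header=True)

# Same file after copying it into HDFS with 'hadoop fs -put':
# use an absolute HDFS path.
hdfs_df = spark.read.csv("hdfs:///user/your_user/dataset.csv", header=True)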

File_x exists in gCloud directory, but gcloud deploy gives FileNotFoundError for File_x

I've successfully tested my Flask app on my local machine, and I've set up the correct app.yaml and requirements.txt files (and the rest of the application files) in the directory I created in gCloud Shell, and yet I get a FileNotFoundError for a file that actually exists there.
For context, I'm very new to Flask apps and gCloud app deployment. Within my main project directory (named FlaskProject_App), there is a subfolder "static" containing the file CEW_file.txt.
However, when I deploy the project, I reach an error:
with open('CEW_file.txt') as word_file:
FileNotFoundError: [Errno 2] No such file or directory: 'CEW_file.txt'
I tried giving the full path (in gCloud) of CEW_file.txt to the open() function, but it still gave the same error: it could not find the file. The path I tried was /home/vmagent/app/static/CEW_file.txt, and that did not work either.
I also tried ~/FlaskProject_App/static/CEW_file.txt, since those directories are listed in my gCloud shell, but this also failed with the same FileNotFoundError:
common_words = []
with open('~/FlaskProject_App/static/CEW_file.txt', 'r') as word_file:
    f = word_file.read()
    for line in f.split('\n'):
        common_words.append(line)
I expected the gcloud app deploy command to work, but even though my file system in gCloud stores the file as [myusername]/FlaskProject_App/static/CEW_file.txt, the open() call still gave the FileNotFoundError.
This error is just telling you that the file cannot be found at the path you are providing.
I understand that your code is in the FlaskProject_App folder, and inside it you have another folder called static which contains CEW_file.txt.
If that is the case then you should be able to open the file using
open('static/CEW_file.txt', 'r')
If that doesn't work, try these lines to see which files are in the static folder:
import os

cdir = os.getcwd()
files = os.listdir(os.path.join(cdir, "static"))
and print the files variable to see if you can find CEW_file.txt in the list. If you cannot see it there, then you will have to move the file there and that should solve your issue.
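If the working directory turns out not to be the app root, a sketch that sidesteps the question entirely (assuming the static/CEW_file.txt layout described above) is to build the path from the module's own location:
import os

# Build the path relative to this module so it works regardless of the CWD.
base_dir = os.path.dirname(os.path.abspath(__file__))
cew_path = os.path.join(base_dir, 'static', 'CEW_file.txt')

common_words = []
with open(cew_path, 'r') as word_file:
    for line in word_file.read().split('\n'):
        common_words.append(line)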

Python: permission denied when extracting tar.gz

I would like to extract all the tar.gz archives inside a folder, but I am getting [Errno 13] Permission denied. I have been through different posts related to the problem, but nothing helps. Even extracting a specific member inside the tar.gz gives the same error. Can someone help with what could be wrong? These are the posts I have tried:
I want to create a script for unzip (.tar.gz) file via (Python)
Python: Extracting specific files with pattern from tar.gz without extracting the complete file
Overwrite existing read-only files when using Python's tarfile
import tarfile

tar = tarfile.open(fname, "r:gz")  # fname is the path to a .tar.gz archive
tar.extractall()
tar.close()
Are you running this as a local user? Is this running on Unix/Linux? Does the account running the Python script have the appropriate rights to the folder you are attempting to write to?
This happens when you do not have write permission to the tmp folder on your system.
Make a temporary directory in your home directory :
mkdir tmp_local
Try to change the tmp directory to your local folder using the following command:
export TMPDIR='/local_home/ah32097/tmp_local'
After this, you can pip install the Python package in tar.gz format directly.
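Following the first answer's point about write rights, a minimal sketch (the archive name is a placeholder) that extracts into a directory the current user owns, rather than into an unwritable working directory:
import os
import tarfile

# Extract into a directory under $HOME that the current user can write to.
dest = os.path.expanduser('~/extracted')
os.makedirs(dest, exist_ok=True)

with tarfile.open('archive.tar.gz', 'r:gz') as tar:  # placeholder archive name
    tar.extractall(path=dest)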

AWS Elastic Beanstalk deploy not working

I'm new to AWS Elastic Beanstalk. I'm trying to deploy a new application through awsebcli and I'm getting the following error:
Error: OSError :: [WinError 145] The directory is not empty: '.elasticbeanstalk\app_versions'
I was able to init the eb application, and I am running the command line with administrator privileges.
Please help.
I've just run into the same issue.
"eb deploy" temporarily creates a subfolder "app_versions" in the ".elasticbeanstalk" folder at the root of the project that contains the zip file to be uploaded to S3. Once done, the folder gets deleted. Check whether any software on your computer might be responsible for preventing this.
The cause for me was a files-syncing software (Dropbox-like) that was watching the entire project for file/folder changes.
I'm developing a Django application and I get this message:
Uploading app to S3. This may take a while. Upload Complete.
How to fix it every time it happens:
Disable/pause file-syncing applications such as Google Drive Sync, OneDrive, or Dropbox.
Delete mysite\.elasticbeanstalk\app_versions if it exists; don't worry, it is recreated each time you run "eb deploy".
Open a command prompt in the mysite\ folder and run the command
pip freeze > requirements.txt
Navigate to mysite\ and run eb deploy again; it should work.
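For reference, the whole sequence from a command prompt in mysite\ might look like this (mysite is the folder name used above):
rmdir /s /q .elasticbeanstalk\app_versions
pip freeze > requirements.txt
eb deploy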