Hi there, I'm struggling with AWS EC2 from the Mac terminal. Of course I first tried as many possibilities as I could, and I read and applied almost everything I found. I followed the AWS guide step by step and got stuck right at this command:
chmod 400 myApp.pem
which throws the error "No such file or directory".
I tried different combinations; I have the file both on my Desktop and in my Downloads folder, and neither location works. I don't know where the problem is, but I can see that it's a big one.
In my case the key downloaded from AWS as something like keyPair.pem.tex; rename it and just follow the flow :D
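For what it's worth, a minimal sketch of the usual macOS flow, assuming the key landed in ~/Downloads and the instance uses the default ec2-user login (both are assumptions, adjust to your setup):
cd ~/Downloads                        # chmod reports "No such file or directory" if run from another folder
mv keyPair.pem.tex keyPair.pem        # strip the extra extension the browser added
chmod 400 keyPair.pem                 # restrict permissions so ssh will accept the key
ssh -i keyPair.pem ec2-user@your-ec2-public-dns    # hypothetical hostname; use your instance's public DNS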
I'm currently having a problem with the fonts when I generate a PDF with wkhtmltopdf on CentOS 7 on a normal hosting account. However, when I create the PDF as root I get no errors.
The error that I'm getting is:
Fontconfig error: Cannot load default config file
I checked /etc/fonts/fonts.conf and it exists and has read permissions for everyone, and I don't know what else could be going on, taking into account that it works for root and not for the sub-accounts.
The code I am using to generate the PDF:
wkhtmltopdf /path/to/my.html /path/to/my.pdf
The main problem is that the fonts aren't rendering and we always get the "Sans Serif" font as the default. The funny thing is that if I set the font to bold, it does render with the font I need, in this case "Verdana".
Thanks in advance.
I faced this problem with AWS Lambda today, which runs Amazon Linux but is CentOS-like under the hood. I found a fix and solved it, so I think I should contribute to the community by answering here.
First, check whether the fonts are available for that user; if not, you can ship fonts with your app and point to their path.
An easy to deploy implementation of HTML-pdf for AWS Lambda
But any phantom/wkhtmltopdf code throws Error: write EPIPE. The next link lists all the required dependencies, which I think should be documented somewhere but aren't, except there. The configuration is also clearly explained:
AWS Lambda PhantomJS dependencies for Amazon Linux 2
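As a rough sketch of the font part (the paths and file names below are assumptions, not taken from the links above): bundle the fonts and a fonts.conf in your deployment package and point fontconfig at them via environment variables before calling wkhtmltopdf.
export HOME=/tmp                                   # Lambda's writable directory
export FONTCONFIG_PATH=/var/task/fonts             # directory in your package containing fonts.conf
export FONTCONFIG_FILE=/var/task/fonts/fonts.conf  # hypothetical config that lists your bundled font dir
./wkhtmltopdf input.html /tmp/output.pdf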
Ok, so in my particular problem, it was not working because the hosting account had a "Jailed Shell" instead of a "Normal Shell".
This option can be changed in WHM for any specific account in the option "Manage Shell Access".
Hope this helps people in the future.
I am seeing a radical change in behavior from my AWS SageMaker account today. I don't see any service warnings on the site, but unfortunately I am in the trial period and cannot get technical support.
So I set up a simple test to see if anyone else has seen this; TIA for any suggestions on what I might try:
I can upload a file to my Jupyter notebook home directory. The upload is successful and the file appears in the directory.
When I try to download the same file I just uploaded, I get the following error: 403 : Forbidden
The error was: Blocking Cross Origin request from https://alpha-gold.notebook.us-east-1.sagemaker.aws/tree/
I can rename the file in the home directory.
I can run notebooks in the home directory.
I can download Jupyter notebooks, either as notebooks or as .html, via the notebook menu options.
I CANNOT download a .py file or a .png file - 403 error
I can load the .py file into my notebook and it works correctly.
I hope this is a transient condition. However, on the off chance this is something that is fixable, TIA.
This is due to Jupyter Notebook. Alternatively, you can open JupyterLab, and from there you should be able to download the files.
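For example (an assumption on my part, since SageMaker notebook instances also ship JupyterLab), switching the URL from /tree/ to /lab should open the JupyterLab UI for the same instance:
https://alpha-gold.notebook.us-east-1.sagemaker.aws/lab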
I am from SageMaker Notebooks team.
It was an issue on our end, we pushed a bad change. The change has been reverted. Please let us know if you are still seeing this issue.
Thanks,
Neelam
Thank you for using SageMaker.
This issue was related to Jupyter; it has been addressed and should no longer be occurring.
Here is the GitHub issue for this problem: https://github.com/jupyter/notebook/issues/5067
I have tried entering commands into the PuTTY command line such as
rsync -avz myHugeFile.dat amazonusername@my.amazon.host.ip:
Trouble Uploading Large Files to RStudio using Louis Aslett's AMI on EC2
(code taken from the linked question above)
However, I could not get it to work, and I think this was because I did not know how to properly adapt it to the location of my files.
I am posting this as a separate question rather than a comment as I don't have enough 'reputation' to comment. This is in the context of a Windows user asking:
1) Was I correct to use this in PuTTY?
2) Did I need to put anything before it?
3) The only editable parts are the address of the EC2 instance, which I have correct, and the location of the file. If the file is on the desktop, how would I write this?
Even if you can only answer one of these basic questions it would be really helpful as I piece together code on how to do this.
If this question is too basic for this site and you are going to remove it, could you please give me an answer before doing so ;)
To sum up what we said in the comments: don't use PuTTY to upload files; it is intended more for SSHing into your instances. Use software like WinSCP or FileZilla instead, which are free and easy to use; you will find it a lot easier.
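If you do want to stay on the command line, the PuTTY suite also ships pscp.exe, which copies files over SSH. A rough sketch, with the key path, desktop path, username, and host as placeholders you would substitute with your own:
pscp -i C:\keys\myKey.ppk C:\Users\YourName\Desktop\myHugeFile.dat amazonusername@my.amazon.host.ip:/home/amazonusername/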
Linux n00b here, having trouble pulling a file from the server to my local Windows 7 Professional 64-bit machine. I am using Wowza to stream live video, and I am recording these live videos on my Google Cloud instance, located here:
/usr/local/WowzaStreamingEngine/content/myStream.mp4
When I ssh:
gcutil --project="myprojectname" pull "my instance" "/usr/local/WowzaStreamingEngine/content/myStream.mp4" "/folder1"
I receive a permission denied error. When I try saving one folder deeper on my local machine, e.g. "/folder1/folder2", the error returned is file or directory not found. I've checked that I have write permissions set on my local Windows 7 machine, so I do not think it is a permissions error. Again, apologies for the n00b question, I've just been stuck here for hours.
Thx,
~Greg
Comment added 7/18:
I enter the following through ssh:
gcutil --project="Myproject" pull "instance-1" "/usr/local/WowzaStreamingEngine/content/myStream.mp4" "/content"
By entering this I'm expecting the file mystream.mp4 to be copied to my C:/content folder. The following is returned: Warning: Permanently added '107.178.218.8' (ECDSA) to the list of known hosts. Enter passphrase for key '/home/Greg/.ssh/google_compute_engine':
Here I enter the passphrase and the following error is returned: /content: Permission denied. I have write permission set up on this folder. Thanks! – Greg
To answer the question about using Cygwin, I'm not familiar with Cygwin and I do not believe it was used in this instance. I ran these commands through the Google Cloud SDK shell which I installed per the directions found here: https://developers.google.com/compute/docs/gcutil/.
What I am doing:
After setting up my Google Cloud instance I open the Google Cloud SDK shell and enter the following:
gcutil --service_version="v1" --project="myproject" ssh --zone="us-central1-a" "instance-1"
I then am prompted for a passphrase which I create and then run the following:
curl http://metadata/computeMetadata/v1/instance/id -H "X-Google-Metadata-Request:True"
This provides the password I use to login to the Wowza live video streaming engine. All of this works beautifully, I can stream video and record the video to the following location: /usr/local/WowzaStreamingEngine/content/myStream.mp4
Next I attempt to save the .mp4 file to my local drive and that is where I'm having issues. I attempt to run:
gcutil --project="myproject" pull "instance-1" "/usr/local/WowzaStreamingEngine/content/myStream.mp4" "C:/content"
I also tried C:/content, C:\content, and C:\\content.
These attempts threw the following error:
Could not resolve hostname C: Name or service not known
Thanks again for your time, I know it is valuable, I really appreciate you helping out a novice.
Update: I believe I am close, thanks to your help. I switched to the local C: drive and entered the command as you displayed in your answer's update. It now returns a new, not-before-seen error:
Error: API rate limit exceeded
I did some research on S.O., and some suggestions were that billing is not enabled or that the relevant API is not enabled, and that I could solve it by turning on Google Compute Engine. Billing has been enabled on my project for a few weeks now. In terms of Google Compute Engine, below are what I believe to be the relevant items turned on:
User Info: Enabled
Compute: Read Write
Storage: Full
Task Queue: Enabled
BigQuery: Enabled
Cloud SQL: Enabled
Cloud Database: Enabled
The test video I recorded was short and small in size. I also have not done anything else with this instance, so I am at a loss as to why I am getting the API rate exceeded error.
I also went to the Google APIs console. I see very limited usage reported so, again, not sure why I am exceeding the API limit. Perhaps I do not have something set appropriately in the APIs console?
I'm guessing you're using Cygwin here (please correct me if I'm wrong).
The root directory for your Cygwin installation is most likely C:\cygwin (see FAQ) and not C: so when you say /content on the command line, you're referring to C:\cygwin\content and not C:\content.
Secondly, since you're likely running as a regular user (and not root) you cannot write to /content so that's why you're getting the permission denied error.
Solution: specify the target directory as C:/content (or C:\\content) rather than /content.
Update: from the update to the question, you're using the Google Cloud SDK shell, not Cygwin, so the above answer does not apply. The reason you're seeing the error:
Could not resolve hostname C: Name or service not known
is because gcutil (like ssh) parses destinations which include : as having the pattern [hostname]:[path]. Thus, you should avoid : in the destination, which means we need to drop the drive spec.
In this case, the following should suffice, assuming that you're currently at a prompt that looks like C:\...>:
gcutil --project=myproject pull instance-1 /usr/local/WowzaStreamingEngine/content/myStream.mp4 \content
If not, first switch to the C: drive by issuing the command:
C:
and then run the above command.
Note: I removed the quotes from the command line because they aren't needed when the parameters don't contain spaces.
http://www.keciadesign.dk
I am trying to set up table rates in Magento 1.6.2.0. The problem occurs when I try to upload the table rates file (a CSV file): the error "Unable to list current working directory" appears and I can't go any further.
The tmp, media, and var folders have 777 permissions.
I have read everything there was to find on the Internet on this problem - many seem to have had this problem but I have yet to see a solution.
Note:
Probably not very relevant, but I am on Unoeuro hosting, on a shared server.
With some extensions (e.g. Wyomind Simple Google Shopping) the error shows up when var/tmp is missing from the Magento directory structure.
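If that's the cause, a quick sketch of the fix, run from the Magento installation root (777 simply mirrors the permissions already mentioned in the question; tighten as your host allows):
mkdir -p var/tmp     # recreate the missing directory
chmod 777 var/tmp    # make it writable by the web server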
The most common cause of this problem is wrong permissions on the media directory. It should be writable by the web server. More information can be checked here.
Look in your php.ini and find the upload_tmp_dir option (or use echo ini_get('upload_tmp_dir'); in your code). It seems PHP can't list files in the directory where Apache uploads files. I'm afraid you can't change the permissions of this folder on shared hosting.
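A rough way to check this from a shell, assuming the CLI PHP reads the same php.ini as the web server (which may not be the case on shared hosting):
php -r 'echo ini_get("upload_tmp_dir"), PHP_EOL;'        # empty output means the system default (usually /tmp) is used
ls -ld "$(php -r 'echo ini_get("upload_tmp_dir");')"     # inspect ownership and permissions of that directory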
This error can also be reported if you have run out of disk space.