Transferring files to VMs via SSH web browser upload button - google-cloud-platform

I was wondering where files end up once they are uploaded to a VM instance.
I cannot find them easily.
I am using the upload menu available in the SSH-in-browser window.
UPDATES
I managed to find the upload path of the file (see my answer below).
But I am now having problems with downloading files. I provide the full path but am unable to download the file. Is there any trick to this?
I am using Safari Version 10.0.2 (11602.3.12.0.1)

Finally found the location.
Once the upload is done and you see the success message, the file will be in the home directory of the user you logged in as via the SSH web browser to upload the file.
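For illustration, a minimal sketch of checking for the upload from inside the instance; the filename below is hypothetical, and the point is simply that the file lands directly under the logged-in user's home directory:

```python
# Sketch: locate a file uploaded through the SSH-in-browser "Upload file"
# button. The upload lands in the home directory of the user you are
# logged in as. The filename below is a hypothetical example.
from pathlib import Path

uploaded_name = "report.csv"              # hypothetical example filename
candidate = Path.home() / uploaded_name   # e.g. /home/<username>/report.csv

if candidate.exists():
    print(f"Found upload at: {candidate}")
else:
    print("Upload not found in the home directory yet.")
```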

Related

How to download files onto personal computer from EC2 without knowing file name?

I want to use selenium to log into a private database and download some files. I can already do this via a Python script that launches a new Chrome window (via selenium) and automatically downloads the files I need locally.
Once the Python script is run, it launches a Google Chrome window, which selenium then clicks through automatically to download the files.
Now, I want to deploy my code to a web application so that I have a website online for others to use. I have my script on an Amazon EC2 instance, and I invoke it whenever a user on my website clicks a button. However, the files are downloaded onto the EC2 instance. I need these files to be downloaded onto the person's personal computer after they click the button on my website.
Is there a way to achieve this, for example by redirecting the downloads? The file names are not known ahead of time.
In summary, I have a script (which downloads files) on EC2 that is invoked when the button on my website is clicked, but I need the downloaded files to go onto the user's computer, not the EC2 instance.
Thank you in advance!
However, the files are downloaded onto the EC2 instance. I need these files to be downloaded on the person's personal computer
No, there is no way to redirect downloads from an EC2 instance, just as you cannot redirect downloads between machines outside of AWS either.
When you download a file, you download it to the machine that requests the download.
Perhaps try returning the download URL to the UI and triggering the download on the user's machine (if the URL does not require credentials). Or download the file on the instance, re-upload it to S3, and create a pre-signed URL that you can return to the UI.
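For the second option, here is a minimal sketch in Python, assuming boto3 is available on the instance; the bucket name, local path, and object key below are placeholders:

```python
# Sketch: after selenium has saved the file on the EC2 instance, upload
# it to S3 and hand back a time-limited pre-signed URL that the website
# front end can use to trigger the download in the user's browser.
import boto3

BUCKET = "my-download-bucket"              # placeholder bucket name
local_path = "/tmp/downloads/report.pdf"   # wherever selenium saved the file
key = "exports/report.pdf"                 # placeholder object key

s3 = boto3.client("s3")
s3.upload_file(local_path, BUCKET, key)

# The URL is valid for one hour; return this string to the UI so the
# user's browser can fetch the file directly from S3.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET, "Key": key},
    ExpiresIn=3600,
)
print(url)
```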

How do I export my project on AppDrag?

I have used AppDrag's AWS hosting for my client's project, but he now wants to take it to his own hosting service. Is there a way to export the site as a zipped folder so I can send it to him?
Yes. From the AppDrag Dashboard, open the code editor, where you will find the folders of your site. Right-click the root folder and you should see the download-as-zip option.

Scan for viruses before upload - ColdFusion

I am working on a ColdFusion application which is required to scan a file for viruses before it is uploaded to the server.
Is it possible?
There's no guarantee that the user even has an anti-virus program in the first place. Even if it were possible for JavaScript to call a desktop program on the user's computer (it isn't), you wouldn't know whether there was one or which one they had.
Your only choice is to upload the file to your server:
Verify that the file being uploaded is of the correct mime-type and content for what you're expecting.
Make sure that you upload it to a folder that is not publicly available to your website.
Run it through the anti-virus program on your server.
There are more tips for securely uploading files on Pete Freitag's site.
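As a rough illustration of that server-side flow (sketched in Python rather than ColdFusion, and assuming ClamAV's clamscan binary is installed on the server; the paths are hypothetical):

```python
# Sketch of the server-side flow: the uploaded file is saved to a
# directory outside the web root and scanned with ClamAV before the
# application accepts it. Paths and filenames are hypothetical.
import subprocess
from pathlib import Path

UPLOAD_DIR = Path("/var/uploads/quarantine")   # not publicly served

def is_clean(uploaded_file: Path) -> bool:
    # clamscan exits with 0 when no virus is found, 1 when infected.
    result = subprocess.run(
        ["clamscan", "--no-summary", str(uploaded_file)],
        capture_output=True,
        text=True,
    )
    return result.returncode == 0

uploaded = UPLOAD_DIR / "incoming.pdf"         # hypothetical upload
if is_clean(uploaded):
    print("File passed the scan; safe to hand it to the application.")
else:
    uploaded.unlink()                          # reject and delete the file
    print("Virus detected; upload rejected.")
```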

Update (re-deploy) existing azure webjob

I created an on-demand webjob. In the management portal there is no option to upload a new zip to update it.
I can delete the existing webjob and create a new one, but I would like to keep my logs.
Is there any way to re-deploy it, overriding the old version while maintaining the logs?
You can connect via FTP to the website where the webjob lives and update the necessary files without erasing your log files.
You can get the credentials to connect via FTP from the Publish Profile.
UPDATE
Added a screenshot to make the credentials easier to find, per Erik's comment
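For illustration, a minimal sketch of that FTP update in Python, assuming the host, user, and password are taken from the Publish Profile (all values and filenames below are placeholders):

```python
# Sketch: upload an updated webjob executable over FTP without touching
# the existing log files. Host and credentials come from the site's
# Publish Profile; all values below are placeholders.
from ftplib import FTP_TLS

HOST = "waws-prod-xx-000.ftp.azurewebsites.windows.net"  # placeholder
USER = "mysite\\$mysite"                                  # placeholder
PASSWORD = "publish-profile-password"                     # placeholder

ftp = FTP_TLS(HOST)
ftp.login(USER, PASSWORD)
ftp.prot_p()  # switch to a secure data connection

# Path of the triggered webjob on the Azure website file system.
ftp.cwd("/site/wwwroot/App_Data/jobs/triggered/jobname")

# Overwrite only the files that changed; everything else stays in place.
with open("MyWebJob.exe", "rb") as f:
    ftp.storbinary("STOR MyWebJob.exe", f)

ftp.quit()
```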
You can also use your website's debug console at: https://yoursitename.scm.azurewebsites.net/DebugConsole
There you get a file explorer in your browser where you can drag and drop files (even zips, which will be extracted into your website).
In the file browser, go to d:\home\site\wwwroot\App_Data\jobs\triggered\jobname
Some more info about this at: http://blog.amitapple.com/post/74215124623/deploy-azure-webjobs/
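For illustration, a minimal sketch of pushing an update zip through Kudu's REST zip endpoint instead of dragging it into the console, assuming the site's deployment credentials from the Publish Profile (the site name, credentials, and file names below are placeholders):

```python
# Sketch: PUT a zip of the webjob to the Kudu zip endpoint, which
# extracts it in place (the same thing the drag/drop console does).
# Site name, credentials, and file names are placeholders.
import requests

SITE = "yoursitename"                      # placeholder site name
USER = "$yoursitename"                     # deployment user from the Publish Profile
PASSWORD = "deployment-password"           # placeholder

url = (
    f"https://{SITE}.scm.azurewebsites.net"
    "/api/zip/site/wwwroot/App_Data/jobs/triggered/jobname/"
)

# Kudu extracts the zip into the target folder; files that are not in
# the zip (such as existing logs elsewhere) are left in place.
with open("webjob-update.zip", "rb") as f:
    resp = requests.put(url, data=f, auth=(USER, PASSWORD))

resp.raise_for_status()
print("WebJob files updated.")
```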

Joomla manual restore from cPanel

My Joomla website was hacked, and when I restore it the front end is OK, but in the back end none of the Joomla components work; any link I click takes me to the same upload interface, which is not from Joomla.
Could anybody help me solve this issue, or help me update Joomla via cPanel without being logged in to the Joomla back end?
Waiting to hear from you guys.
You should download the Joomla patch file and also take a backup of the old files and of the directory where Joomla is currently installed, then follow the steps below.
Open up your FTP client
Log in to your website
Navigate to the folder where Joomla is actually installed
Upload the patched Joomla installer into the folder
Open your cPanel window
Go to the “File Manager”
Navigate to the patched Joomla installer you uploaded via FTP, and check the box next to it.
Click extract
After a few seconds the files should be extracted
Lastly, log in to your Joomla admin, refresh the page, and the version number should be updated.
For further detail you can go through this link.
You can also follow these steps:
1. Open your cPanel
2. Log in
3. Go to the Softaculous link
4. Then you can update Joomla via cPanel