Update (re-deploy) existing azure webjob - azure-webjobs

I created an on-demand WebJob. In the management portal there is no option to upload a new zip to update it.
I can delete the existing WebJob and create a new one, but I would like to keep my logs.
Is there any way to re-deploy it, overwriting the old version while keeping the logs?

You can connect via FTP to the website that hosts the WebJob and update the necessary files without erasing your log files.
You can get the credentials to connect via FTP from the Publish Profile.
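For illustration, here is a minimal sketch of that FTP update from Python. The host, user name, password, job name, and file names are placeholders taken from the publish profile and your own project; only the listed files are overwritten, so existing logs are left untouched.

```python
# Sketch only: update a triggered WebJob's files over FTP using the publish
# profile credentials. Host, user, password, job name, and file names below
# are placeholders -- replace them with your own values.
from ftplib import FTP_TLS

ftp = FTP_TLS("waws-prod-xx-001.ftp.azurewebsites.windows.net")  # publishUrl from the publish profile
ftp.login("mysite\\$mysite", "publish-profile-password")         # userName / userPWD from the publish profile
ftp.prot_p()                                                     # secure the data channel

# Triggered (on-demand) WebJobs live under this folder.
ftp.cwd("/site/wwwroot/App_Data/jobs/triggered/myjob")

# Upload only the files that changed; nothing else (including logs) is removed.
for name in ("run.cmd", "MyJob.exe"):
    with open(name, "rb") as f:
        ftp.storbinary(f"STOR {name}", f)

ftp.quit()
```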

You can also use your website's debug console at: https://yoursitename.scm.azurewebsites.net/DebugConsole
There you get a file explorer in your browser where you can drag/drop files (even zips that will be extracted into your website).
In the file browser go to d:\home\site\wwwroot\App_Data\jobs\triggered\jobname
Some more info about this at: http://blog.amitapple.com/post/74215124623/deploy-azure-webjobs/
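If you would rather script the upload than drag and drop, a rough sketch using Kudu's zip API is below. The site name, job name, zip path, and deployment credentials are placeholders; the zip is extracted over the existing job folder, and WebJob run logs are typically kept separately (under d:\home\data\jobs), so they should not be affected.

```python
# Sketch only: push an updated zip to an existing triggered WebJob through the
# Kudu zip API. Site name, job name, zip path, and credentials are placeholders.
import requests

site = "yoursitename"
job = "jobname"
user = "$yoursitename"                # deployment user from the publish profile
password = "publish-profile-password" # deployment password from the publish profile

# Kudu extracts the uploaded zip into the target folder, overwriting the files
# it contains but not deleting anything else.
url = (f"https://{site}.scm.azurewebsites.net"
       f"/api/zip/site/wwwroot/App_Data/jobs/triggered/{job}/")

with open("webjob.zip", "rb") as f:
    resp = requests.put(url, data=f, auth=(user, password))
resp.raise_for_status()
print("WebJob updated")
```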

Related

wso2is start in developer mode

I'm working with WSO2 Identity Server and I'm curious whether there is a way to run the product in developer mode without building each component of Identity Server. I found a way to start the "My Account" component in dev mode by following this tutorial (https://is.docs.wso2.com/en/5.11.0/develop/setting-up-my-account-in-a-dev-environment/),
but I want to be able to modify other components, such as the recovery portal and the authentication portal, by forking and cloning the required GitHub repositories and starting the entire app in developer mode so that I can see code changes in real time.
AFAIK the developer mode works only for My Account and Console. You can refer to the doc for more details on that.
The recovery portal, the authentication portal, etc. cannot be run in developer mode. However, there are two ways you can try this:
1. Build the WAR files manually and add them to the webapps directory (see the sketch after this list). If the server is running, WAR file changes will be deployed automatically. If the server is not running, you have to delete the existing exploded directory and restart the server.
2. Make the changes directly in the JSPs that are deployed inside the pack. Once you save the changes, they will be deployed automatically.
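As a rough illustration of the first option, here is a minimal Python sketch that hot-deploys a locally built WAR into a running server. The IS_HOME path and the WAR name are assumptions for your environment, so substitute your own.

```python
# Sketch only: copy a rebuilt portal WAR into the Identity Server webapps
# directory so the running server redeploys it. IS_HOME and the WAR path
# are placeholders.
import shutil
from pathlib import Path

is_home = Path("/opt/wso2is-5.11.0")                      # assumption: your IS_HOME
webapps = is_home / "repository" / "deployment" / "server" / "webapps"

war = Path("target/accountrecoveryendpoint.war")          # assumption: your built WAR

# With the server running, dropping the WAR here triggers automatic redeployment.
# If the server is stopped, also remove webapps/<war-name-without-.war>/ before restarting.
shutil.copy(war, webapps / war.name)
print(f"Copied {war.name} to {webapps}")
```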

How to download files onto personal computer from EC2 without knowing file name?

I want to use Selenium to log into a private database and download some files. I can already do this with a Python script: when it runs, it launches a Chrome window (via Selenium) and automatically clicks through the site to download the files I need locally.
Now, I want to deploy my code to a web application so that I have a website online for others to use. I have my script on an Amazon EC2 instance and I invoke it whenever a user on my website clicks a button. However, the files are downloaded onto the EC2 instance. I need these files to be downloaded onto the person's own computer after they click my button on my website.
Is there a way to achieve this, for example by redirecting the downloads? The file names are not known ahead of time.
In summary, I have a script (which downloads files) on EC2 that is invoked when my button on my website is clicked. But I need the downloaded files to go onto the user's computer, not the EC2 instance/terminal.
Thank you in advance!
However, the files are downloaded onto the EC2 instance. I need these files to be downloaded on the person's personal computer
No, there is no way to redirect downloads from an EC2 instance; this is not specific to AWS, since you can't redirect downloads like that anywhere.
When you download a file, you download it to the machine that requests the download.
Perhaps try returning the download URL back to the UI and triggering the download on the user's machine (if the URL does not require credentials). Or download the file on the instance, re-upload it to S3, and create a pre-signed URL that you return to the UI; a rough sketch of that approach follows.
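Here is a minimal sketch of the S3 route in Python, assuming a bucket you own (my-download-bucket) and a hypothetical Selenium download directory. Since the file names are only known after the run, it simply walks the directory.

```python
# Sketch only: after Selenium saves files on the EC2 instance, upload them to
# S3 and hand back short-lived pre-signed URLs so the user's browser performs
# the actual download. Bucket name and download directory are placeholders.
import os
import boto3

s3 = boto3.client("s3")
bucket = "my-download-bucket"               # assumption: an S3 bucket you own
download_dir = "/home/ec2-user/downloads"   # assumption: Selenium's download directory

urls = []
for name in os.listdir(download_dir):       # names are only known after the run
    s3.upload_file(os.path.join(download_dir, name), bucket, name)
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": name},
        ExpiresIn=3600,                      # link valid for one hour
    )
    urls.append(url)

# Return `urls` from your web endpoint; the browser can then fetch each file directly.
print(urls)
```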

Google Cloud Build Step Logs Not Viewable in Console

I am not able to view Google Cloud Build logs in the console. For each step that I click on, I cannot see the associated logs in the Build Log window on the right. This occurs with both the Build Summary and each detail step. The only way to view these logs is to click View Raw, but that is not a great workaround.
Another issue is that each build step status (Success/Failure) is only populated at the end of the entire build process, as opposed to updating after each step.
Is anybody else experiencing this, or does anyone have suggestions to remedy this issue? My browser is Google Chrome Version 93.0.4577.82 (Official Build) (x86_64).
Experience shows that there can be adverse interactions between Chrome Plugins and a variety of websites that have rich content or streaming (such as Google's Console). If something seems odd, try and create a new Chrome profile or try running in incognito mode and see if that resolves the issue. If it does, you can incrementally add (or remove) the plugins until you find the one that is causing problems. If you do find the culprit plugin, consider posting that as a comment to others on what you find.
As per the documentation, if you are storing your build logs in Cloud Logging, you won't be able to see them on the Cloud Build page; instead you will see them on the Logging page (i.e. Operations Logging).
To view the build logs on the Cloud Build page in the Cloud Console: if your build logs are in the Google-created Cloud Storage bucket, grant the Project Viewer role on the project; if they are in a user-specified Cloud Storage bucket, grant the Storage Object Viewer role. For more information, see the documentation.
Your second point is expected behaviour; please look here.
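If your logs do end up in Cloud Logging, a small Python sketch like the one below can pull a build's log entries directly instead of going through the console. The project ID and build ID are placeholders, and the filter assumes the resource labels Cloud Build normally attaches to its log entries.

```python
# Sketch only: read a Cloud Build run's log entries from Cloud Logging.
# Project ID and build ID are placeholders for your own values.
from google.cloud import logging

client = logging.Client(project="my-project")            # assumption: your project ID
build_id = "00000000-0000-0000-0000-000000000000"         # assumption: the build to inspect

# Cloud Build log entries are tagged with resource.type="build" and the build ID.
log_filter = f'resource.type="build" AND resource.labels.build_id="{build_id}"'

for entry in client.list_entries(filter_=log_filter, order_by=logging.ASCENDING):
    print(entry.payload)
```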
Adding to Kolban's answer above, one of the Chrome extensions that interfered is Imagus. Uninstalling or disabling it should fix the problem.
Another Chrome extension that seems to cause the problem is "Dark Mode". My version is 0.4.2 on Chrome version 96.0.4664.110 and disabling this and refreshing the Build Detail page restored the build log listing.

Transferring files to VMs via SSH web browser upload button

I was wondering where files end up once they are uploaded to a VM instance.
I cannot find them easily.
I am using the upload option available in the SSH-in-browser window.
UPDATES
I managed to find the upload path of the uploaded file (see my answer below).
But I am now having problems with downloading files. I provide the full path but am unable to download the file. Is there any trick to this?
I am using Safari Version 10.0.2 (11602.3.12.0.1)
Finally found the location.
Once the upload is done and you see the success message, the file will be in the home directory of the user you logged in as via the SSH-in-browser session to upload the file.

Azure Deployment not picking up SSL certificates in repo

I'm currently running a Django app on Azure. I have a MySQL database that I access using SSL. The SSL certificates I need to access the server are physically in the repo, and my Django settings file points to them using a relative path.
I have Azure set up to do continuous deployment from Bitbucket. The problem is that, at the end of the deployment, it copies over all the files EXCEPT for the .pem files that I need.
I have to manually copy over the certificates every time I push a commit. The files are in static/certs/*.pem.
Is there something wrong with Azure? Or BitBucket? Or is there a better way of doing this?
I figured it out. Anything placed manually inside the static folder gets cleaned out by Azure during deployment.
Just don't put anything manually inside the static folder; keep the certificates somewhere else in the repo, as in the sketch below.
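For illustration, a minimal settings sketch assuming the certificate is moved to a certs/ folder at the repo root; the database names, host, and certificate file name are placeholders, and the "ssl" dictionary shown is the option the mysqlclient driver accepts.

```python
# Sketch only: point Django's MySQL connection at a CA certificate stored
# outside static/. Database names, host, and file name are placeholders.
import os

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "mydb",
        "USER": "dbuser",
        "PASSWORD": os.environ.get("DB_PASSWORD", ""),
        "HOST": "mydb.mysql.database.azure.com",  # assumption: your MySQL host
        "PORT": "3306",
        "OPTIONS": {
            # Relative to the repo, resolved to an absolute path at runtime.
            "ssl": {"ca": os.path.join(BASE_DIR, "certs", "server-ca.pem")},
        },
    }
}
```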