GhostScript in Azure - azure-webjobs

I'm in the process of moving an on-premises app to Azure and struggling with one aspect: GhostScript. We use GhostScript to convert PDFs to multi-page TIFFs. At present this is deployed on an Azure VM, but it seems like a Web App and WebJob would be a better fit from a management point of view. In all of my testing I've been unable to get a job to run the GhostScript exe.
Has anyone been able to run GhostScript or any third party exe in a WebJob?
I have tried packaging the GhostScript exe, lib and dll into a ZIP file, unzipping it to Path.GetTempPath(), and then using a new System.Diagnostics.Process to run the exe with the required parameters. This didn't work; the process failed to run, exiting with code -1073741819.
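For concreteness, here is a minimal sketch of that attempt; the archive name ghostscript.zip, the binary name gswin64c.exe, and the gs subfolder are illustrative assumptions, not details from the original post.

    // Sketch of the attempt: unpack a bundled ghostscript.zip (exe + lib + dll)
    // to the temp path, then launch the exe with the caller's Ghostscript arguments.
    using System.Diagnostics;
    using System.IO;
    using System.IO.Compression;

    class GhostscriptAttempt
    {
        static int RunGhostscript(string gsArguments)
        {
            string workDir = Path.Combine(Path.GetTempPath(), "gs");
            ZipFile.ExtractToDirectory("ghostscript.zip", workDir);   // unpack exe, lib and dll

            var psi = new ProcessStartInfo
            {
                FileName = Path.Combine(workDir, "gswin64c.exe"),     // assumed binary name
                Arguments = gsArguments,
                UseShellExecute = false,
                CreateNoWindow = true
            };

            using (var gs = Process.Start(psi))
            {
                gs.WaitForExit();
                // The exit code seen above (-1073741819 = 0xC0000005, an access violation) surfaces here.
                return gs.ExitCode;
            }
        }
    }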
Any help or suggestions would be appreciated.

We got it to work here:
Converting PDFs to Multipage Tiff files Using Azure WebJobs. The key was putting the Ghostscript assemblies in the root of the project and setting them to "Copy always". That is what allows them to be pushed to the Azure server and end up in the correct place when you publish the project.
Also, we needed to download the file to be processed by Ghostscript to the local Azure WebJob temp directory. That directory can be discovered with the following code:
Environment.GetEnvironmentVariable("WEBJOBS_PATH");
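To make the working setup concrete, here is a minimal sketch. It assumes the Ghostscript console binary is gswin64c.exe and that it ends up alongside the running WebJob because it sits in the project root marked "Copy always"; the input and output file names are placeholders. The tiffg4 device writes all pages to a single Group 4 TIFF, which gives the multi-page output described above.

    using System;
    using System.Diagnostics;
    using System.IO;

    class ConvertPdfJob
    {
        static void Main()
        {
            // Root path of the currently running WebJob (a local temp directory).
            // The files published with the job, including Ghostscript, are copied
            // here when it runs, and the PDF to be processed is downloaded here too.
            string jobPath = Environment.GetEnvironmentVariable("WEBJOBS_PATH")
                             ?? AppDomain.CurrentDomain.BaseDirectory;

            string gsExe = Path.Combine(jobPath, "gswin64c.exe");     // assumed binary name
            string inputPdf = Path.Combine(jobPath, "input.pdf");     // downloaded beforehand
            string outputTiff = Path.Combine(jobPath, "output.tif");

            var psi = new ProcessStartInfo
            {
                FileName = gsExe,
                Arguments = "-dNOPAUSE -dBATCH -sDEVICE=tiffg4 -r300 " +
                            $"-sOutputFile=\"{outputTiff}\" \"{inputPdf}\"",
                WorkingDirectory = jobPath,
                UseShellExecute = false,
                CreateNoWindow = true
            };

            using (var gs = Process.Start(psi))
            {
                gs.WaitForExit();
                Console.WriteLine($"Ghostscript exited with code {gs.ExitCode}");
            }
        }
    }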

Related

Amazon AWS missing DLLs (MSVCP140, VCRUNTIME140, CONCRT140)

I made a web server app that should reside on an Amazon AWS virtual machine running Windows 10.
When I try to execute the app, it says that it is missing DLLs (MSVCP140, VCRUNTIME140, CONCRT140).
In the app I use cpprestsdk, cryptocpp and opencv.
I tried installing various Visual C++ redistributables, but that didn't give any result.
I downloaded these DLLs separately and put them into the System32 folder, but that didn't help either.
When I put the DLLs in the app folder, I got the error that the app was unable to start correctly (0xc000007b).
I also tried executing the app built with the /MT flag (which doesn't actually remove all the DLL dependencies), and the problem is still there.
The Amazon instance type is t2.micro.

Azure DevOps: reuse artifacts already downloaded in previous runs

Currently we are using NuGet packages as our Azure Artifacts, and during the release process we download the artifact using the "Download Package" task. It is working perfectly. But we noticed that even though we have already downloaded the package, on the next run of the pipeline on the same agent we have to download it again. This takes a lot of time, so we want to prevent the package from being downloaded if it is already present. Could you suggest a way to reuse the already downloaded package?
In a release pipeline, System.DefaultWorkingDirectory (for example C:\agent\_work\r1\a; the same as Agent.ReleaseDirectory and System.ArtifactsDirectory) is the directory to which artifacts are downloaded during deployment of a release. The directory is cleared before every deployment that requires artifacts to be downloaded to the agent. This is the default behavior and unfortunately it cannot be changed.

Selenium cloud execution on a machine without code or IDE

I set up my Selenium project (Maven, Java, TestNG) in a GitHub repo and it is connected to Jenkins. I am able to execute the Maven project via Jenkins and do the testing. This requires all dependent tools (Maven, Java, Jenkins) to be set up on my local machine.
But we have a requirement to do this in the cloud. I know we can use Selenium Grid with Docker, BrowserStack or GCP to execute the tests in the cloud, but what we need is to have everything installed in the cloud, so that any external user with access can execute any test via a UI or an executable file without installing anything on their local machine.
Is this possible at all? If yes, how?
I searched a lot and couldn't find anything. One of my friends said it can be done using AWS but doesn't know how. I just need guidance on the path to take here; I'm willing to learn and implement it myself.
Solved this by deploying the code to AWS EC2.
Here's what I did.
I created a TestNG/Maven project and uploaded it to GitHub. Then I created an AWS EC2 t2.micro Linux instance and installed Chrome and Jenkins on it. I accessed Jenkins from my local machine and connected it to the GitHub repo. When I build the project from Jenkins, everything gets downloaded to the EC2 instance and the execution happens there, as headless Chrome execution.

Blender on IBM Cloud (Cloud Foundry)

I'm currently developing a web application (Django 2.0).
My app will be deployed on IBM Cloud (Cloud Foundry) using the Python buildpack.
One of my requirements is to install Blender.
Everything else works very well, except the Blender installation.
What I've tried so far:
I tried accessing my app over an SSH connection, but of course I don't have root access to run apt-get install blender!
I also tried including Blender in the packages.json file and pushing that file using cf push my-app.
But nothing worked for me.
As a shorter question: what is the usual approach in Cloud Foundry apps to install packages, the way we would use apt-get install on Ubuntu/Debian?
Please correct me if I'm doing anything wrong, or give me some pointers to solve this problem!
I see a couple options for you to install packages if they cannot be installed using the regular requirements file (which is the preferred way):
Download the relevant libraries and put them in subfolders of the app before pushing it. The libraries will be uploaded. That is how I would do it.
Once you have an SSH connection, use secure copy (scp) to upload the files and place them in the subfolders where they are expected.
Regarding Blender, the question is what you need in addition to having the code copied over. Does it need a running daemon? Are there more dependencies? You would need to share more information about your specific app to answer that. Maybe packaging everything as one or more containers and running it on Kubernetes, or on a combination of Cloud Foundry and Kubernetes, is a better way.

How can I deploy my Cloud Code to AWS Elastic Beanstalk? (Parse Server)

I am struggling with how to upload the Cloud Code files I had on Parse.com to my Parse Server hosted on AWS EB.
So far I have:
A Parse Server hosted on AWS EB. To host it on AWS I used the Orange Deploy Button, which basically makes everything easier by sparing you from installing the Parse Server locally and uploading it to AWS later.
An iOS app written in Objective-C, connected to the Parse Server and working perfectly.
Parse Dashboard running locally on my Mac, connected to the Parse Server on AWS.
The only thing I still need is to upload all my Cloud Code files to the Parse Server. How can I do this? I have researched a lot on Google, Stack Overflow, etc. without success. There is some information, but it's unclear. Thanks in advance.
Finally, thanks to Ran Hassid, I now have a fully functional Parse Server on AWS with Cloud Code. For those who are in the same situation I was in, here is the answer to my question:
Go to this link and follow all the steps. (By the time I asked the question, the information provided by this AWS link wasn't as clear as it is now; they have since improved the explanations and the info.)
After you finish all the steps from the link, you will have a working Parse Server on AWS.
Now for the Cloud Code part. Just create a folder on your Mac or PC wherever you like, say on the desktop, and call it Parse Server AWS (you can call it whatever you want).
Install the EB CLI, which is the command-line interface used from Terminal (on Mac), or the equivalent on Windows, to work with the Parse Server you just set up on AWS (similar to Cloud Code with the Parse CLI). The easy way to install it is to run this command:
brew install awsebcli
Now open Terminal on the Mac (or the equivalent on Windows) and go to the folder you just created in step 3.
Run the next command. It will ask you to select the location of your Parse Server, and then its name.
eb init
Now run this command. It will download all the files of your Parse Server from AWS into the folder you are in.
eb labs download
Finally, you will have a folder called Cloud where you can put all your Cloud Code files.
When you finish just run the command:
eb deploy
Now you have your Parse Server with all your Cloud Code files working on AWS.
From now on, whenever you need to change your Cloud Code files, just edit the local files inside the folder created in step 3 and run the command from step 9 again, exactly as you used to do with the parse deploy command.
Hopefully this information will help many people as much as it helped me.
Happy coding!
parse-server Cloud Code is a bit different from Parse.com Cloud Code. On Parse.com we used the Parse CLI to modify and deploy our Cloud Code (parse deploy ...). In parse-server, your Cloud Code lives under the following path of your Parse project: ./cloud/main.js. So your Cloud Code entry point is the main.js file, which by default is located under the cloud folder of your Parse project. If you really want to, you can change this path, but to keep it simple use the default location.
Now about deployment: in parse-server you need to redeploy your Parse Server whenever you modify your Cloud Code. Another option is to edit your Cloud Code remotely, but from my point of view it's better to redeploy.