How to view contents of a PCF .droplet file in Windows? - cloud-foundry

Problem
The question pretty much says it all. I used the plugin: "cf local" to get the .droplet file for my app in PCF. However, I have no idea how to expand or view the contents of the file.
What I tried
I tried adding a .zip at the end, but that did not work.
I tried viewing in NotePad, but that did not work.
Notes
We are using a Diego back-end which prevents us from using "cf files".
It shouldn't matter, but we are deploying a .NET application.
Related: Is it possible to download all files of an application in Cloud Foundry?

It's a gzipped tar archive. Try adding a .tgz or .tar.gz extension. You may need a third-party archive tool; I don't know if Windows will open that file by default, but 7-Zip or something comparable should open it.
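For example, with the 7-Zip command-line tool (7z.exe) on your PATH (the file name below is just an illustration), extraction is a two-step affair, because 7-Zip treats the gzip and tar layers separately:
ren app-name.droplet app-name.droplet.tgz
7z x app-name.droplet.tgz
7z x app-name.droplet
The first 7z call unpacks the gzip layer into a plain tar file (usually named app-name.droplet again), and the second unpacks that tar into the app's files.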

Daniel was on the right track; however, I wanted to post exactly what I used as an alternative to an extraction utility.
I found that the easiest way is to use bash within Windows. When we installed GitHub Desktop, there was an option to use bash, and most of us in my area have done this.
If you have already installed it, go to Preferences and choose your Git shell. Under Default Shell you can choose between CMD, Git Bash, PowerShell, or Custom.
Once that is in place you can navigate to the folder where the .droplet file lives and execute the following command:
tar -xvzf app-name.droplet
This will extract the contents into a folder called "app" in the current directory, containing the same files your app has when deployed to PCF.
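If you only want to inspect the contents without extracting anything, the same command can list the archive instead:
tar -tvzf app-name.droplet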

Related

Download Compressed Folder from Jupyter Notebook in GCloud Deep Learning VM

This seems to be a very simple question, but I couldn't find a way to do it. The Jupyter notebook has the option to download files one by one, but my training process generates too many files, and I want to download them all at once. Is there any way to do it?
Assuming it is JupyterLab you are using:
Open a new Launcher (+ icon) and start a new terminal session.
Use zip -r FILE_NAME.zip PATH/TO/OUTPUT/FOLDER/ to compress the required folder.
Download the zip file as you were doing with the other ones.
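For instance, assuming a hypothetical training-output folder named outputs/, you could also exclude notebook checkpoints while zipping, then right-click the resulting file in the file browser and choose Download:
zip -r results.zip outputs/ -x "*.ipynb_checkpoints*"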

SublimeText 3 - Package Control: Removed directory for orphaned package ColdFusion

I am running SublimeText 3 (build 3083). I followed the manual install directions for the ColdFusion plugin as described here (https://github.com/SublimeText/ColdFusion):
Download Manually
Download the files using the GitHub .zip download option
Unzip the files and rename the folder to ColdFusion
Copy the folder to your Sublime Text 2 Packages directory
This has worked (and still works) on many machines. But today I have a PC where, each time I re-open SublimeText, it removes the ColdFusion directory and the SublimeText console says "Package Control: Removed directory for orphaned package ColdFusion". I haven't been able to find anything on this yet. Help appreciated. Thanks!
The "Download Manually" section is for Sublime Text 2 only. At the very top of the README you linked are the ST3 instructions:
The development branch contains a rewrite of the ColdFusion plugin. The only installation method is via Git.
cd Packages/
git clone https://github.com/SublimeText/ColdFusion.git
cd ColdFusion
git checkout development
On Windows machines, the Packages folder is in %APPDATA%\Sublime Text 3\Packages. You'll need to have a working copy of git on your machine, which can be obtained here.
Alternatively, you can download a zip file of the development branch, extract it, rename the resulting ColdFusion-development folder to ColdFusion, then copy that into your Packages folder.
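A quick sketch of that manual route from a shell with curl and unzip available (the archive URL follows GitHub's usual branch-zip pattern and is an assumption on my part, not taken from the README):
curl -L -o ColdFusion-development.zip https://github.com/SublimeText/ColdFusion/archive/development.zip
unzip ColdFusion-development.zip
mv ColdFusion-development ColdFusion
Then copy the ColdFusion folder into the Packages directory mentioned above.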
EDIT
I did a little digging, and apparently this package is no longer being developed. However, the CFML package has been suggested as a replacement. Not being a ColdFusion user, I haven't tested it myself, but reviews from others are good. It's available for ST3 only, but can be installed directly via Package Control, so you don't have to worry about using git.
If you are using Sublime Text 3 (ST3), the recommended package is the one named CFML, available via Package Control here: https://packagecontrol.io/packages/CFML
If you are using Sublime Text 2 (ST2), the recommended package is the one named ColdFusion, available via Package Control here: https://packagecontrol.io/packages/ColdFusion
For both of them you'll find installation instructions, either using Package Control or manually from GitHub.

Django hosting on pythonanywhere.com

How can I upload my finished Django local project to pythonanywhere.com? Is there an option for this, or do I have to do it file by file?
Right now I have something like this: My Django website on pythonanywhere
but I don't see how to upload my finished project there :(
I uploaded a zip file, but how do I unzip it from a bash console?
To unzip the file from a bash console, just start one from the "Consoles" tab and then run unzip filename.zip.
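For example, assuming a hypothetical myproject.zip sitting in your home directory, you can also extract it straight into a target folder:
unzip myproject.zip -d ~/myproject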
From the PythonAnywhere documentation:
Getting code and content in and out is easy — you can use our built-in browser-based editor and Bash consoles, scp, or you can use git, mercurial and other VCS's to push and pull your code. You can even sync up via Dropbox.
Update: the Dropbox feature is not available anymore; see the first comment below.
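If your project already lives in a git repository, a simpler route than uploading a zip is to clone it from a bash console (the repository URL below is just a placeholder):
git clone https://github.com/yourusername/yourproject.git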

Setting up .htaccess on host for execution of C++ cgi scripts

I have a cgi script that I know works (as far as the code is concerned), but which cannot be accessed through my website. My hosting provider simply states that I need to edit the .htaccess file, but I have no idea what options/handlers I need to set in order to make the contents of a directory execute as C++ CGI programs.
How is this done?
You can't on this service provider. A quick search of the Bluehost KB gave this: https://my.bluehost.com/cgi/help/48
Our LINUX web servers have the capability to run CGI scripts in your own "cgi-bin" directory. Scripts may be written in Perl, Python and CGI languages.
Here are some helpful tips to follow when installing scripts:
Upload to your cgi-bin directory to ensure proper file permission settings.
All scripts on our server must have permissions set to 755 (rwxr-xr-x). If you need help changing script permissions, please see our article about setting file and user permissions.
Upload in ASCII transfer mode (and NOT BINARY mode)
The first line of each script must read: a) #!/usr/bin/perl (for Perl) b) #!/usr/bin/python (for Python)
Ensure the permissions are set to 755
However, there is nothing stopping you from just putting your exe in the cgi-bin dir and seeing if it runs, but this probably won't work.
In this case, you'd need to relink any C++ against the local target server, and I doubt that Bluehost would facilitate this -- just too much support hassle for the few $ / month that you pay.
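As a minimal illustration of those requirements (the file names are hypothetical, and as noted above the binary may need to be built against the server's own environment before it will actually run):
g++ -o mypage.cgi mypage.cpp
chmod 755 mypage.cgi
After uploading mypage.cgi to the cgi-bin directory, it would typically be requested as http://yourdomain.com/cgi-bin/mypage.cgi.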

How do I clone a Mercurial repository into a directory that already exists?

I have a client's Django project that I'm developing locally, using Mercurial for version control. I push my local repository to my personal remote server (where I keep all my projects), and then when I come to deploy it (on whichever web server) I clone that repository there from my personal server.
This works fine on most servers (where I have total control), but I have a few projects where I'm deploying on to WebFaction. WebFaction is great, but a little unusual with its setup, as I need to first declare the Django project as an 'application' through their control panel. This automatically creates a few things, such as 'apache2', 'myproject', etc. folders. It's this same folder, though, where I want to clone the repository from my personal remote server. Doing the usual hg clone command just doesn't work, as it says the destination folder already exists. There isn't much I can do about the contents of this folder, so I need to work around this.
I'm not an expert at Mercurial, and the only way I could seem to work it out is to clone it to another folder and then move all the contents (including the .hg) into the actual folder I want. This seems silly though...
I'm using Mercurial v1.6.2 (installed through easy_install). Could anyone shed some light on this?
Many thanks.
Copying just the .hg dir definitely works, but you could also do a hg init and then hg pull http://remote/repo. A repo that has just been initialized contains only the null (all-zeros) changeset, so you can pull from any repo without getting the "unrelated repositories" warning. This is essentially the same as hg clone --pull with a manual init.
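Spelled out, the init-and-pull route looks like this (the local path is a placeholder; the remote URL is the example one from above):
cd /path/to/existing/folder
hg init
hg pull http://remote/repo
hg update
The final hg update populates the working directory, which hg pull alone does not do.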
You can copy just the .hg folder, then revert or update to tip. E.g.:
cp -a src/.hg dest/
cd dest
hg up -C
You can either move the folder after the fact, or you can just make a symlink to it. My WebFaction directory is actually symlinked, so I know it works fine.
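For example (both paths are hypothetical, and this assumes the link target name doesn't already exist in the app directory, or that you've moved the auto-created folder aside):
hg clone http://remote/repo ~/repos/myproject
ln -s ~/repos/myproject ~/webapps/myapp/myproject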
In the main, it looks like you might be trying to use Mercurial as an installation manager, which is certainly not its design goal.
If I am reading you correctly, part of your source repository should be something like a make deploy target that puts the files into their proper places. Put another way, having a repository clone (the .hg directory) in your deployment directory seems odd and trouble-prone.