Create a Cloud Foundry Service from a hosted Cloud Foundry application

I have a Cloud Foundry application hosted on Swisscom. One of the features I require is that the user can set up a Cloud Foundry service from within the application, specifically the Swisscom Secrets-Store service. However, this is turning out to be a bit of a challenge.
Firstly, I used the cloudfoundry-cli NPM module to create the service from my application much like the example provided:
const cf = require('cloudfoundry-cli')

cf.provisionService({
  name: 'mongoNo1',
  type: 'compose-for-mongodb',
  plan: 'Standard'
}, function (err) {
  if (err) return console.log(err)
  console.log('Service provisioned')
})
This worked fine when I was running my application locally, because obviously I have the CF CLI installed on my computer. However, when I pushed the application to Swisscom and tried to create the service, nothing happened, as the CF CLI isn't installed in the application's container.
I have also checked out cf-nodejs-client, which is suggested if you don't want to install the CF CLI, since it lets you pass your Cloud Foundry login details and API endpoint. However, I've checked the docs and it seems this NPM module doesn't let you create services. The docs show you can getService, getServicePlans, getServices, or remove a service - there is no option to create one.
Worst case, I thought I could SSH into the running application and install the CF CLI there, but that requires root access, which is not possible.
My question is: is it possible to install the CF CLI on the application when it is being pushed? Perhaps a buildpack which can install the CF CLI during the push? I have checked the list of community buildpacks and this doesn't seem to exist, and I wouldn't know where to start creating one. Otherwise, does anyone know how to create Cloud Foundry services from a Node application without needing the CF CLI?

Is it possible to install the CF CLI on the application when it is being pushed?
One option would be to simply bundle the cf CLI with your application. It's a static binary, so it's easy to include. The following should work.
Download the cf CLI for Linux from here and unzip it.
Put this into a folder that's under your application root. For example: mkdir bin/ then move your binary from step #1 to the bin/ folder.
Add a .profile file to the root of your application. In that file, put export PATH=$PATH:$HOME/bin, which will put the bin/ directory from step #2 on the PATH (change bin to whatever you named that folder).
Now cf push your app. This will include the binary. When it starts up, the .profile script will execute and put the path to your cf binary on the PATH. Thus you can run cf ... commands in your container.
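Once the binary is on the PATH, your Node app can shell out to it with child_process. Here's a minimal sketch, assuming the bin/ layout from the steps above; CF_API, CF_USER, CF_PASS, CF_ORG, and CF_SPACE are placeholder environment variables you would set yourself, and the service, plan, and instance names are illustrative, not verified Swisscom catalog entries.

// Sketch: drive the bundled cf binary from Node.
const { execFile } = require('child_process')
const path = require('path')

// $HOME is the app root at runtime, so this matches the bin/ folder above.
const cfBin = path.join(process.env.HOME || '.', 'bin', 'cf')

function cf (args, cb) {
  execFile(cfBin, args, function (err, stdout) {
    if (err) return cb(err)
    cb(null, stdout)
  })
}

// Log in, target an org/space, then create the service instance.
// All values here are placeholders.
const steps = [
  ['api', process.env.CF_API],
  ['auth', process.env.CF_USER, process.env.CF_PASS],
  ['target', '-o', process.env.CF_ORG, '-s', process.env.CF_SPACE],
  ['create-service', 'secrets-store', 'standard', 'mySecrets']
]

;(function runNext () {
  const args = steps.shift()
  if (!args) return console.log('Service created')
  cf(args, function (err) {
    if (err) return console.log(err)
    runNext()
  })
})()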
Hope that helps!

Related

AWS CodeDeploy agent is deleting files in the wrong folder during install

We have an unusual setup. We use git on Azure Devops for our code repositories, and AWS for our cloud-based services. In our arsenal we have a mixture of AWS Lambda functions, along with console apps, web apps, and Windows services running on EC2 instances. We have been able to create CI/CD pipelines for all three classes of apps. For the apps running on EC2 instances we use AWS CodeDeploy. These deployments are more complicated, but they all work -- except for one.
Another unusual thing about our setup is that both our development and QA environments are on the same EC2 instance. When the CodeDeploy agent running on that instance retrieves the deployment archive, it unpacks it, reads the appspec.yml file, runs our before install script, which backs up the existing installation and shuts down any services that might be using those files. Then, the install phase updates the files in the designated environment, then deletes -- or tries to delete -- all the files in the other environment folder.
In other words, if a DEV deployment is running, it replaces the files in the DEV folder and also tries to delete the files in the QA folder. I know this sounds like a scripting problem, but I have checked all the script and YAML files, and nowhere do I reference the opposing environment.
In this case, the app is a Windows service. Normally, I get a Ruby 'Permission denied # unlink_internal' error on a file in the other folder. As an experiment, I shut down the service in the other environment in my before install script and, as I expected, the agent deleted all the files in the other environment. It updated the files in the target environment, but left the folder in the other environment empty!
Here are my files. I suspect the problem is being caused by something I did, but I can't, for the life of me, find it.
These are all .NET projects. In my solution I have a ConfigFiles folder set up with subfolders for each environment. Then, in my pipeline yaml file I run a script to select the correct files to move into the archive based on the git branch that is being built.
Here's the code for the script that selects the correct files.
Here's the Azure pipeline YAML file.
Here's my before install script:
And, finally, here is my appspec.yml file, which the CodeDeploy agent uses to know where to update the files during installation. How I wish this were the wrong path, but in the deployment archive the environment-specific values are all exactly right.
Any ideas on this one would be greatly appreciated.
I encountered the same problem, where deployment of one app unexpectedly deletes files from another app in another folder. My solution was to use a different deployment group for each app, even though they deploy to the same EC2 instance.
Deploying multiple apps to the same EC2 instance using the same deployment group results in files and folders being deleted from the other deployed projects.
From AWS Technical Support:
The reason is that CodeDeploy creates a cleanup file with the format '[deployment group ID]_cleanup' in the directory '/opt/codedeploy-agent/deployment-root/deployment-instructions' every time a deployment is made to the deployment group, and this file deletes all the files that had been installed during the previous deployment made to the deployment group. Since the deployment group is the same in your case, when you make a deployment to the deployment group which installs files to the folder "/var/www/project1", files installed by the previous deployment in the folder "/var/www/project2" are cleaned up, and vice versa, which is an expected mechanism of the CodeDeploy agent.
You can find the explanation here: https://docs.aws.amazon.com/codedeploy/latest/userguide/codedeploy-agent.html#codedeploy-agent-install-files
Please consider creating two different applications/deployment groups and configuring the two pipelines to use different applications/deployment groups, which should fix your problem.
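If you script your AWS setup, creating the extra deployment group is straightforward with the SDK. Here is a minimal sketch using the AWS SDK for JavaScript (v2); the application name, role ARN, region, and tag values are placeholders for illustration, not values from this question.

// Hypothetical sketch: one CodeDeploy deployment group per environment,
// both targeting the same EC2 instance via tags. Names and ARNs are
// placeholders.
const AWS = require('aws-sdk')
const codedeploy = new AWS.CodeDeploy({ region: 'us-east-1' })

const environments = ['DEV', 'QA']

environments.forEach(function (env) {
  codedeploy.createDeploymentGroup({
    applicationName: 'MyWindowsService',            // placeholder
    deploymentGroupName: 'MyWindowsService-' + env, // one group per environment
    serviceRoleArn: 'arn:aws:iam::123456789012:role/CodeDeployServiceRole',
    ec2TagFilters: [
      { Key: 'Name', Value: 'shared-dev-qa-instance', Type: 'KEY_AND_VALUE' }
    ]
  }, function (err) {
    if (err) console.log(env, err)
    else console.log(env, 'deployment group created')
  })
})

With separate deployment groups, each environment's cleanup file only tracks its own files, so a DEV deployment no longer touches the QA folder.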

How to set up development environment for AWS Lambda?

I wish to implement the answer that is outlined here: https://stackoverflow.com/a/50397276/1980516
However, I find that I keep running into an "Unable to import module 'index'" error at exactly this line:
const _archiver = require('archiver');
So, I'm guessing that I cannot do this via the online console. Instead, I probably have to create a deployment package.
How do I go about this? I apparently need the AWS CLI, Node.js, and npm, and I'm new to all of it. In the Amazon docs I can't find a practical list of how to set up my local development environment.
What tools do I install, which versions and in what order exactly?
Edit: Windows :)
My guess is that you need to npm install archiver and package the node_modules dependencies along with your index.js (the handler file for your Lambda entry point). You can then zip it all up and deploy/upload it to your Lambda.
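For reference, a minimal sketch of such a handler might look like this; the zip contents are purely illustrative. As long as node_modules is zipped up next to index.js, the require resolves.

// index.js - minimal handler that depends on the 'archiver' package.
// Run `npm install archiver` locally and include node_modules in the
// deployment zip, otherwise the require fails with
// "Unable to import module 'index'".
const archiver = require('archiver');
const fs = require('fs');

exports.handler = async (event) => {
  const output = fs.createWriteStream('/tmp/example.zip');
  const archive = archiver('zip');

  const finished = new Promise((resolve, reject) => {
    output.on('close', resolve);
    archive.on('error', reject);
  });

  archive.pipe(output);
  archive.append('hello from lambda', { name: 'hello.txt' }); // illustrative
  archive.finalize();
  await finished;

  return { statusCode: 200, body: 'wrote /tmp/example.zip' };
};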
Also have a look at the https://github.com/serverless/serverless framework, which makes these kinds of things easier.
Have a look at AWS SAM, the Serverless Application Model. It provides a local development setup for things like Lambda functions and API Gateway endpoints, and a way to easily package and deploy things. The exact steps you need are:
Create an AWS account and an IAM user with admin privileges
Install node.js
Install the AWS CLI (and configure it with aws configure)
Install SAM CLI and Docker (the local instances run in docker containers)
Initialize a new SAM project with sam init --runtime nodejs (or another runtime if needed)
Run through the quickstart to get an idea of how to define a SAM template, build a SAM app, and deploy.
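For orientation, the nodejs template generated by sam init scaffolds a handler along these lines (a simplified sketch; the exact file names and contents vary by template version):

// hello-world/app.js - simplified version of what sam init generates.
exports.lambdaHandler = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'hello world' })
  };
};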
If you don't want to use the framework or local development environment and just want to create the source bundle, there are docs. The gist is:
Install nodejs (e.g. using homebrew or an installer)
npm install the modules you need
Zip up your code including the node_modules folder
Upload the zip via the AWS Console

AWS cognito user authentication with laravel

We are trying to implement Amazon Cognito services for user authentication in our application built with Laravel 5.1. We are looking for a Composer package for Laravel 5.1 that allows working with Amazon Cognito User Pools: registering users into User Pools, password resets, etc.
You can use the AWS SDK for PHP via Composer. See the following steps, quoted from this guide.
Open a terminal window and navigate to the directory where your project is stored. Composer is installed on a per-project basis.
Download and install Composer in your project directory. If you have curl installed, you can use the following command:
curl -sS https://getcomposer.org/installer | php
When the installation script finishes, a composer.phar file will be created in the directory where you ran the installer.
Create a file at the root level of your project called composer.json and add the following dependency for the AWS PHP SDK:
{
    "require": {
        "aws/aws-sdk-php": "2.*"
    }
}
Install the dependencies by running Composer's install command:
php composer.phar install
This will create a vendor directory in your project with the required libraries and an autoloader script used to load them for your project.
Require Composer's autoloader by adding the following line to your code's bootstrap process (typically in index.php):
require '/path/to/sdk/vendor/autoload.php';
Your code is now ready to use the AWS SDK for PHP!
AWS added Cognito User Pools management in version 3.32.7 of the SDK, so you will need to require "aws/aws-sdk-php": "3.*" rather than the 2.* shown above. You may also have a look at the AWS Service Provider for Laravel for more information.
This question is quite old, but for anyone looking for such a package, check this one out; I think it is what you need:
https://github.com/black-bits/laravel-cognito-auth

How could I deploy my Cloud Code to AWS Elastic Beanstalk? (Parse Server)

I am struggling with how to upload the Cloud Code files I had on Parse.com to my Parse Server hosted on AWS EB.
So far i have:
Parse Server hosted on AWS EB. To host it on AWS I used the orange Deploy button, which makes everything easier by not requiring you to install the Parse Server locally and upload it to AWS later.
iOS app written in Objective-C, connected to the Parse Server and working perfectly
Parse Dashboard running locally on my Mac, connected to the Parse Server on AWS
The only thing I still need is to upload all my cloud code files to the Parse Server. How can I do this? I have researched a lot on Google, Stack Overflow, etc. without success. There is some information, but it's unclear. Thanks in advance.
Finally, thanks to Ran Hassid, I now have a fully functional Parse Server on AWS with cloud code. For those in the same situation I was in, here is the answer to my question:
1. Go to this link and follow all the steps. (At the time I asked the question, the information provided by this AWS link wasn't as clear as it is now; they have since improved the explanations.)
2. After you finish all the steps from the link, you will have a working Parse Server on AWS.
3. Now for the cloud code part. Create a folder on your Mac or PC wherever you like, say on the desktop, and call it Parse Server AWS (you can call it whatever you want).
4. Install the EB CLI, the command-line interface used from Terminal (on Mac) or the equivalent on Windows to work with the Parse Server you just set up on AWS (similar to cloud code with the Parse CLI). The easy way to install it is to run this command:
brew install awsebcli
5. Now open Terminal on Mac (or the equivalent on Windows) and go to the folder you just created in step 3.
6. Run the next command. It will ask you to select the location of your Parse Server, and then its name.
eb init
7. Now run this command. It will download all of your Parse Server's files from AWS into the folder you are in.
eb labs download
8. Finally, you will have a folder called Cloud where you can put all your cloud code files.
9. When you finish, just run the command:
eb deploy
10. Now you have your Parse Server, with all your cloud code files, working on AWS.
For any change you need to make to your cloud code files, just edit the local files inside the folder created in step 3 and run the command from step 9 again, exactly as you used to do with the parse deploy command.
Hopefully this information will help many people as much as it helped me.
Have a happy coding!
parse-server cloud code is a bit different from Parse.com cloud code. On Parse.com we used the Parse CLI to modify and deploy our cloud code (parse deploy ...). In parse-server, your cloud code lives under ./cloud/main.js in your Parse project, so your cloud code entry point is the main.js file, which by default is located under the cloud folder of your project. If you really want, you can change this path, but to keep it simple use the default location.
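For example, a minimal entry point might look like the following sketch, using the classic parse-server 2.x callback API (the function name and message are arbitrary):

// cloud/main.js - minimal cloud code entry point (parse-server 2.x style).
Parse.Cloud.define('hello', function (request, response) {
  // request.params holds any arguments passed by the client.
  response.success('Hello, ' + (request.params.name || 'world') + '!');
});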
Now about deployment: in parse-server you need to redeploy your Parse Server whenever you modify your cloud code. Another option is to edit your cloud code remotely, but in my opinion it's better to redeploy.

How to tell which buildpack has staged my application in Cloud Foundry?

In Cloud Foundry, is there a way to tell which buildpack (and which version of that buildpack) has staged a given application?
I am thinking there must be a way because I can see it on the Bluemix web console, but I can't find the same information from the cf cli (tried both cf apps and cf app APPNAME).
cf curl /v2/apps
Find your app in the output and look at the block of data for it; the entity includes the buildpack and detected_buildpack fields. If you want to get fancy, you can filter on your app name like so: cf curl /v2/apps?q=name:<appname>
In the Bluemix web console, under "Files and Logs", you should have a file called staging_info.yml. The contents of that file will tell you the buildpack that was detected. Here's an example:
buildpack_path: /var/vcap/data/dea_next/admin_buildpacks/5186873d-27b5-4033-ba97-a2db19d387a2_2dcb9b37027cd39d9742223e2690f16f079a0792
detected_buildpack: Liberty for Java(TM) (WAR, liberty-2015.4.0_0, ibmjdk-1.7.1_sr2fp11ifx-20150312, env)
start_command: .liberty/initial_startup.rb