Azure FTPS: how to find the deployed project files (Django)

I am using Azure FTP to update the project files after they are deployed via the CLI. I get to /site/wwwroot, but I only see a tar file there. Where do I find all the project files? The requirement is to update a single file each time to test it.
Command used:
az webapp up --resource-group <resourcegroupname> --location <location> --plan <plan_name> --os-type Linux --runtime "python|3.9" --sku B1

You cannot update individual files in an App Service via the Azure CLI.
Source: https://learn.microsoft.com/en-us/azure/app-service/deploy-ftp?tabs=cli
Please follow this documentation; it covers deploying your files over FTP/FTPS.
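For completeness, a hedged sketch of the FTPS route that documentation describes: fetch the FTPS endpoint and deployment credentials, then push the one changed file with any FTPS-capable client such as curl. The app name, credentials, host, and file path below are placeholders, not values from the question.
az webapp deployment list-publishing-profiles --name <app-name> --resource-group <resourcegroupname> --output table
# then, using the FTP publish URL, user name, and password it prints:
curl -T ./myapp/views.py --ssl-reqd --user '<ftp-user>:<ftp-password>' "ftp://<ftps-host>/site/wwwroot/myapp/views.py"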

Related

Create a Cloud Foundry Service from a hosted Cloud Foundry application

I have a Cloud Foundry application hosted using Swisscom. One of the features I require from the application is that the user can set up a Cloud Foundry Service - Specifically the Swisscom Secrets-Store service. However, this is turning out to be a bit of a challenge.
Firstly, I used the cloudfoundry-cli NPM module to create the service from my application much like the example provided:
const cf = require('cloudfoundry-cli')
cf.provisionService({
  name: 'mongoNo1',
  type: 'compose-for-mongodb',
  plan: 'Standard'
}, function(err) {
  if (err) return console.log(err)
  console.log('Service provisioned')
})
This worked fine when I was running my application locally because obviously I have the CF CLI installed on my computer. However, when I pushed the application to Swisscom and tried to create the service nothing happened as the CF CLI isn't installed on the application.
I have also checked out cf-nodejs-client, which is suggested if you don't want to install the CF CLI, since it lets you pass your Cloud Foundry login details and API endpoint. However, I've checked the docs and it seems this NPM module doesn't allow you to create services. The docs show you can only getService, getServicePlans, getServices, or remove - no option to create.
Worst case scenario I thought I could SSH into the running application and install the CF CLI, but for that you need root access which is not possible.
My question is - is it possible to install the CF CLI on the application when it is being pushed? Perhaps a buildpack which can install the CF CLI when the application is being pushed? I have checked the list of community buildpacks and this doesn't seem to exist, and I wouldn't know where to start creating one. Otherwise, does anyone know how to create Cloud Foundry services from a Node application without needing the CF CLI?
Is it possible to install the CF CLI on the application when it is being pushed?
One option would be to simply bundle the cf CLI with your application. It's a static binary, so it's pretty easy to bundle with your app. The following should work; a condensed shell sketch of the same steps appears below.
Download the cf cli for Linux from here & unzip it.
Put this into a folder that's under your application root. For example: mkdir bin/ then move your binary from step #1 to the bin/ folder.
Add a .profile file to the root of your application. In that file, put export PATH=$PATH:$HOME/bin, which will put the bin/ directory from step #2 on the PATH (change bin to whatever you named that folder).
Now cf push your app. This will include the binary. When it starts up, the .profile script will execute and put the path to your cf binary on the PATH. Thus you can run cf ... commands in your container.
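Put together, the steps look roughly like this, run from your application root before pushing. The download URL is deliberately left out; grab the Linux cf binary from the link in the first step and unzip it next to your app.
mkdir -p bin
mv ./cf bin/                                   # the unzipped binary from the first step
echo 'export PATH=$PATH:$HOME/bin' > .profile  # executed when the app container starts
cf push                                        # pushes the app with bin/cf bundled inside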
Hope that helps!

Google Cloud Build

Hi, I am new to Google Cloud Platform. I want to build a Java application using Google Cloud Build, without Docker containers. The built application should also be tested and the artifact saved in a bucket. Can anyone help me with this?
Cloud Build is conceptually a pipeline mechanism that takes some set of files as input (commonly in some source repo) and applies a number of processing steps to the files including steps that produce output: file(s) | step-1 | step-2 | ... | step-n.
Most of the examples show Cloud Build producing Docker images, but this underplays the many other things it can do.
Importantly, each of the processors (steps) must be a Docker container, but the input and output need not be Docker images.
You can use javac or mvn or gradle steps to compile your code and then use the gsutil step to copy the war or jar to Google Cloud Storage.
https://github.com/GoogleCloudPlatform/cloud-builders/tree/master/javac
https://github.com/GoogleCloudPlatform/cloud-builders/tree/master/mvn
https://github.com/GoogleCloudPlatform/cloud-builders/tree/master/gradle
https://github.com/GoogleCloudPlatform/cloud-builders/tree/master/gsutil
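As a minimal sketch of that idea (the bucket name my-artifact-bucket, the Maven packaging, and the target/*.jar layout are assumptions, not anything from the question): write a cloudbuild.yaml that builds and tests with the mvn builder and copies the resulting jar to Cloud Storage with the gsutil builder, then submit the build.
cat > cloudbuild.yaml <<'EOF'
steps:
  # "mvn package" compiles the code and runs the unit tests
  - name: gcr.io/cloud-builders/mvn
    args: ['package']
  # copy the built artifact into a bucket
  - name: gcr.io/cloud-builders/gsutil
    args: ['cp', 'target/*.jar', 'gs://my-artifact-bucket/']
EOF
gcloud builds submit --config cloudbuild.yaml .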
Since you mentioned that you don't want Docker containers, I assume you want to deploy your application outside of a Docker image. You can deploy your app to Google App Engine Standard. For how to deploy to App Engine, you can refer to this documentation: https://cloud.google.com/build/docs/deploying-builds/deploy-appengine
To run the application on App Engine, you create an app.yaml file in your project, then put these lines inside app.yaml:
runtime: java11
entrypoint: java -Xmx64m -jar {your application artifact in jar file}
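Once app.yaml and the jar are sitting in the same directory, the deployment itself is a single command; run it manually as sketched here, or from a Cloud Build step as the linked documentation shows. The project ID is a placeholder.
gcloud app deploy app.yaml --project <your-gcp-project-id>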

How do I access the AWS PHP SDK on EC2?

I have learned that the AWS SDK comes with the default operating system installed on AWS EC2 when provisioned via Elastic Beanstalk.
I would like to know if I can access the PHP version of the SDK, or whether it needs to be installed separately.
Thank you.
You have to install it yourself, using Phar or Composer. Please refer to the AWS documentation for this:
To use Composer with the AWS SDK for PHP:
Open a terminal window and navigate to the directory where your project is stored. Composer is installed on a per-project basis.
Download and install Composer in your project directory. If you have curl installed, you can use the following command:
curl -sS https://getcomposer.org/installer | php
Create a file at the root level of your project called composer.json and add the following dependency for the AWS PHP SDK:
{
    "require": {
        "aws/aws-sdk-php": "2.*"
    }
}
Install the dependencies by running Composer's install command:
php composer.phar install
This will create a vendor directory in your project with the required libraries and an autoloader script used to load them for your project.
Require Composer's autoloader by adding the following line to your code's bootstrap process (typically in index.php):
require '/path/to/sdk/vendor/autoload.php';
Your code is now ready to use the AWS SDK for PHP!
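The same steps condensed into one shell session, run from the project root on the instance (or locally before deploying); it assumes PHP and curl are already installed and that the composer.json shown above exists.
curl -sS https://getcomposer.org/installer | php    # drops composer.phar into the current directory
php composer.phar install                           # reads composer.json and creates vendor/
php -r "require 'vendor/autoload.php'; echo 'AWS SDK autoloader loaded', PHP_EOL;"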

AWS cognito user authentication with laravel

We are trying to implement Amazon Cognito services for user authentication in our application built with Laravel 5.1. We are looking for a Composer package for Laravel 5.1 that allows working with Amazon Cognito User Pools: registering users into User Pools, password resets, etc.
You can use the AWS SDK for PHP, installed via Composer. See the following steps, quoted from this guide.
Open a terminal window and navigate to the directory where your project is stored. Composer is installed on a per-project basis.
Download and install Composer in your project directory. If you have curl installed, you can use the following command:
curl -sS https://getcomposer.org/installer | php
When the installation script finishes, a composer.phar file will be created in the directory where you ran the installer.
Create a file at the root level of your project called composer.json and add the following dependency for the AWS PHP SDK:
{
    "require": {
        "aws/aws-sdk-php": "2.*"
    }
}
Install the dependencies by running Composer's install command:
php composer.phar install
This will create a vendor directory in your project with the required libraries and an autoloader script used to load them for your project.
Require Composer's autoloader by adding the following line to your code's bootstrap process (typically in index.php):
require '/path/to/sdk/vendor/autoload.php';
Your code is now ready to use the AWS SDK for PHP!
AWS added Cognito User Pools management in version 3.32.7 of the SDK. You may have a look at the AWS Service Provider for Laravel as well for more information.
This question is quite old, but for anyone looking for such a package, check this out; I think it is what you need:
https://github.com/black-bits/laravel-cognito-auth
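A hedged sketch of pulling either option in with Composer: Cognito User Pools support needs SDK version 3.32.7 or later as noted above, and the laravel-cognito-auth package name is assumed to match its GitHub repository, so check Packagist before relying on it.
composer require aws/aws-sdk-php:^3.32.7
composer require black-bits/laravel-cognito-auth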

How could I deploy my Cloud Code to AWS Elastic Beanstalk? (Parse Server)

I am struggling with how to upload the Cloud Code files I had on Parse.com to my Parse Server hosted on AWS EB.
So far I have:
A Parse Server hosted on AWS EB. To host it on AWS I used the Orange Deploy Button, which basically makes everything easier by not having to install the Parse Server locally and upload it later to AWS.
An iOS app written in Objective-C, connected to the Parse Server and working perfectly.
Parse Dashboard running locally on my Mac, connected to the Parse Server on AWS.
The only thing I still need is to upload all my Cloud Code files to the Parse Server. How can I do this? I have researched a lot on Google, Stack Overflow, etc. without success. There is some information, but it's unclear. Thanks in advance.
Finally, and thanks to Ran Hassid, I now have a fully functional Parse Server on AWS with Cloud Code. For those who are in the same situation I was in, here is the answer to my question:
1. Go to this link and follow all the steps. (At the time I asked the question, the information provided by this AWS link wasn't as clear as it is now; they have since improved the explanations.)
2. After you finish all the steps from the link, you will have a working Parse Server on AWS.
3. Now the Cloud Code part. Create a folder on your Mac or PC wherever you like, say on the desktop, and call it Parse Server AWS (you can call it whatever you want).
4. Install the EB CLI, the command-line interface used from Terminal (on Mac) or the equivalent on Windows to work with the Parse Server you just set up on AWS (similar to Cloud Code with the Parse CLI). The easy way to install it is to run this command:
brew install awsebcli
5. Now open Terminal on Mac (or the equivalent on Windows) and go to the folder you just created in step 3.
6. Run the next command. It will ask you to select the location of your Parse Server, and then the name.
eb init
7. Now run this command. It will download all the files of your Parse Server from AWS into the folder you are in.
eb labs download
8. Finally, you will have a folder called cloud where you can put all your Cloud Code files.
9. When you finish, just run the command:
eb deploy
10. Now you have your Parse Server with all your Cloud Code files working on AWS.
11. For any change you need to make to your Cloud Code files, just change the local files inside the folder created in step 3 and run the command from step 9 again, exactly as you used to do with the parse deploy command.
Hopefully this information will help many people, as it helped me.
Have a happy coding!
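The same workflow condensed into shell, with the folder name from the steps above used purely as an illustrative placeholder:
brew install awsebcli                                  # the EB CLI (Mac; use the Windows equivalent otherwise)
mkdir -p ~/Desktop/"Parse Server AWS" && cd ~/Desktop/"Parse Server AWS"
eb init                                                # choose the region and the application of your Parse Server
eb labs download                                       # pulls the deployed files, including the cloud folder
# ...add or edit your Cloud Code files under cloud/ ...
eb deploy                                              # pushes the changes back to Elastic Beanstalk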
parse-server Cloud Code is a bit different from Parse.com Cloud Code. On Parse.com we use the Parse CLI to modify and deploy our Cloud Code (parse deploy ...). In parse-server, your Cloud Code lives under the following path of your Parse project: ./cloud/main.js. So your Cloud Code entry point is the main.js file, which by default is located under the cloud folder of your Parse project. If you really want, you can change this path, but to keep it simple use the default location.
Now about deployment: with parse-server you need to redeploy your Parse Server whenever you modify your Cloud Code. Another option is to edit your Cloud Code remotely, but from my point of view it's better to redeploy.
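So the day-to-day loop implied here is simply: edit the entry point locally, then redeploy. A minimal sketch, run from the folder that eb labs download populated (the editor choice is arbitrary):
${EDITOR:-nano} cloud/main.js    # edit the default Cloud Code entry point
eb deploy                        # redeploy the Parse Server so the change takes effect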