I'm working on a C#/C++ project that needs a specific hardware architecture, so even for unit testing I can't run the tests in a standard Docker container; I need to use a specific AWS EC2 instance.
My current build steps are the following:
Build Step 1: TeamCity makes the build in a local Docker container.
Build Step 2: TeamCity uploads the artifacts to S3.
Build Step 3: TeamCity launches a specific AWS EC2 instance.
Now I want to tell TeamCity to run the tests on this specific instance while still following the progress of the tests, so I can send alerts.
Or, at least, I can manage the test execution at instance startup, in which case I need information on the output format to send back to TeamCity.
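For the startup option, here is a minimal sketch of the kind of runner I have in mind, assuming that writing TeamCity service messages to stdout (and streaming that output back into the build log) is the right way to report results; the runner script and test names are placeholders:

```python
# A minimal sketch, assuming the instance's output is streamed back into a
# TeamCity build step so the service messages below appear in the build log.
# Test names and binaries are hypothetical.
import subprocess
import time

def tc_escape(value: str) -> str:
    """Escape special characters for TeamCity service message values."""
    return (value.replace('|', '||').replace("'", "|'")
                 .replace('\n', '|n').replace('\r', '|r')
                 .replace('[', '|[').replace(']', '|]'))

def run_test(name: str, command: list[str]) -> bool:
    print(f"##teamcity[testStarted name='{tc_escape(name)}']", flush=True)
    start = time.time()
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        print(f"##teamcity[testFailed name='{tc_escape(name)}' "
              f"message='exit code {result.returncode}' "
              f"details='{tc_escape(result.stdout + result.stderr)}']", flush=True)
    duration_ms = int((time.time() - start) * 1000)
    print(f"##teamcity[testFinished name='{tc_escape(name)}' duration='{duration_ms}']", flush=True)
    return result.returncode == 0

# Hypothetical native test binary built in Step 1.
run_test("HardwareSmokeTest", ["./tests/hardware_smoke_test"])
```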
Regards
I have a Django web application which is not too large and uses the default database that comes with Django. It doesn't have a large volume of requests either, probably no more than 100 requests per second.
I wanted to figure out a method of continuous deployment on AWS from my source code residing in GitHub. I don't want to use the EB CLI to deploy to Elastic Beanstalk because it needs commands on the command line and is not automated deployment. I had tried setting up workflows for my app in GitHub Actions and had set up a web server environment in EB too, but it didn't seem to work. Also, I couldn't figure out the final URL to see my app from that EB environment. I am working on a Windows machine.
Please suggest the least expensive way of doing this, or share any videos/articles you may have which will get me to my app finally being visible in the browser after deployment.
You will use AWS CodePipeline, a service that builds, tests, and deploys your code every time there is a code change, based on the release process models you define. Use CodePipeline to orchestrate each step in your release process. As part of your setup, you will plug other AWS services into CodePipeline to complete your software delivery pipeline.
https://docs.aws.amazon.com/whitepapers/latest/cicd_for_5g_networks_on_aws/cicd-on-aws.html
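As a rough illustration only, a pipeline with a GitHub source and an Elastic Beanstalk deploy stage could be created with boto3 along these lines; every name, ARN, and bucket below is a placeholder, and a CodeStar connection to GitHub is assumed to exist already:

```python
# A minimal sketch, assuming a GitHub repo connected via CodeStar Connections
# and an existing Elastic Beanstalk application/environment. All names, ARNs,
# and the bucket are placeholders.
import boto3

codepipeline = boto3.client("codepipeline", region_name="us-east-1")

codepipeline.create_pipeline(pipeline={
    "name": "django-app-pipeline",  # hypothetical
    "roleArn": "arn:aws:iam::123456789012:role/CodePipelineServiceRole",
    "artifactStore": {"type": "S3", "location": "my-pipeline-artifacts-bucket"},
    "stages": [
        {
            "name": "Source",
            "actions": [{
                "name": "GitHubSource",
                "actionTypeId": {"category": "Source", "owner": "AWS",
                                 "provider": "CodeStarSourceConnection", "version": "1"},
                "configuration": {
                    "ConnectionArn": "arn:aws:codestar-connections:us-east-1:123456789012:connection/abc",
                    "FullRepositoryId": "my-org/my-django-app",
                    "BranchName": "main",
                },
                "outputArtifacts": [{"name": "SourceOutput"}],
            }],
        },
        {
            "name": "Deploy",
            "actions": [{
                "name": "DeployToEB",
                "actionTypeId": {"category": "Deploy", "owner": "AWS",
                                 "provider": "ElasticBeanstalk", "version": "1"},
                "configuration": {"ApplicationName": "my-django-app",
                                  "EnvironmentName": "my-django-app-env"},
                "inputArtifacts": [{"name": "SourceOutput"}],
            }],
        },
    ],
})
```

Once deployed, the application is reachable at the Elastic Beanstalk environment's URL (the environment CNAME shown on the environment's dashboard).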
I have a microservice architecture with tens of repositories. It is deployed on AWS.
I have three environments in separate regions: dev, staging, prod.
Code is hosted on VSTS / Azure Devops.
I am forced to use VSTS for hosting code and AWS CodePipeline to deploy.
I have this double CI setup at the moment: I run unit tests in Azure DevOps, which triggers AWS CodePipeline, which deploys the architecture via CloudFormation.
Now I want to trigger unit tests and end-to-end tests from a pull request for each environment.
I have to be able to deploy in order to run end-to-end tests, but I am not sure what happens if the tests don't pass and a non-working architecture is already deployed.
Repositories can be coupled, and I want to be able to trigger several CI runs and roll them back.
What is the best solution?
1.
Keep unit tests in VSTS
Trigger CodePipeline deployments
Wait for all CodePipelines to be executed and successful
Trigger end-to-end tests from VSTS
2.
Trigger CodePipeline from VSTS
Run unit tests in CodePipeline
Deploy the new architecture
Wait for all deployments to be executed and successful
Run e2e tests from CodePipeline
Wait for all CodePipelines to be ready and successful, and send the response to VSTS to mark the CI as okay.
Thanks for your help.
Personally I'd prefer option 2.
CodePipeline is a great tool to orchestrate the entire workflow.
Additionally, as a point of note: when you say "trigger CodePipeline", you will most likely need to trigger it by deploying your artifacts to S3. To wait for it to be successful, you will need to monitor through the API that the workflow has executed successfully.
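Whether the pipeline is kicked off by the S3 upload or started explicitly, the "wait and report back" part could look roughly like this with boto3 (the pipeline name is a placeholder):

```python
# A minimal sketch of "wait for the pipeline and report back": start an
# execution, then poll its status. The pipeline name is a placeholder.
import time
import boto3

codepipeline = boto3.client("codepipeline")

def run_and_wait(pipeline_name: str, poll_seconds: int = 30) -> bool:
    execution_id = codepipeline.start_pipeline_execution(name=pipeline_name)["pipelineExecutionId"]
    while True:
        status = codepipeline.get_pipeline_execution(
            pipelineName=pipeline_name,
            pipelineExecutionId=execution_id,
        )["pipelineExecution"]["status"]
        if status == "Succeeded":
            return True
        if status in ("Failed", "Stopped", "Cancelled", "Superseded"):
            return False
        time.sleep(poll_seconds)

# The VSTS/Azure DevOps task can fail its step based on this return value.
if not run_and_wait("my-service-pipeline"):
    raise SystemExit(1)
```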
Second option seems to be a better choice:
you have almost everything in one place - you don't need to switch between CodePipeline and Azure DevOps to see the whole picture
it should also be easier to develop; yes, you can use webhooks on Azure DevOps to trigger pipelines, but you should be able to achieve better and simpler control by staying (mostly) with one vendor
Imagine a case where you, for instance, need to roll back after end-to-end tests - which approach will support that scenario better?
I am trying to test the build using CodeBuild and want to run the tests on independent machines (i.e. EC2 instances) and report the test results in CodeBuild, to make the call whether the build fails or passes.
Is it possible to do this via CodeBuild?
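One sketch of what I'm imagining, assuming the EC2 instances are SSM-managed (the instance ID and test command are placeholders), would be for the buildspec to run a script like this and fail the build on a non-zero exit:

```python
# A sketch only: run the tests on a separate, SSM-managed EC2 instance from
# inside the CodeBuild build, and propagate the result so CodeBuild marks the
# build as passed or failed. The instance ID and command are placeholders.
import time
import boto3

ssm = boto3.client("ssm")

def run_remote_tests(instance_id: str, command: str, poll_seconds: int = 15) -> bool:
    command_id = ssm.send_command(
        InstanceIds=[instance_id],
        DocumentName="AWS-RunShellScript",
        Parameters={"commands": [command]},
    )["Command"]["CommandId"]
    time.sleep(2)  # give SSM a moment to register the invocation
    while True:
        invocation = ssm.get_command_invocation(CommandId=command_id, InstanceId=instance_id)
        if invocation["Status"] == "Success":
            print(invocation["StandardOutputContent"])
            return True
        if invocation["Status"] in ("Failed", "TimedOut", "Cancelled"):
            print(invocation["StandardErrorContent"])
            return False
        time.sleep(poll_seconds)

# CodeBuild treats a non-zero exit code from the buildspec command as a failed build.
if not run_remote_tests("i-0123456789abcdef0", "./run_tests.sh"):
    raise SystemExit(1)
```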
I have a question regarding AWS. I have an AMI with Windows Server installed, IIS installed, and a site up and running.
My Auto Scaling group always maintains two instances created from this AMI.
However, whenever I need to change something on the site I need to launch a new instance, make the changes, update the AMI, and update the Auto Scaling group, which is quite time-consuming.
Is there any way to automate this by linking to a Git repository?
This is more of a CI/CD job than something achieved purely in AWS.
You can set up a CI/CD pipeline to detect any update that happens in SCM (Git) and trigger a build job (Jenkins or a similar tool) which will produce an artifact for you. You can deploy the artifact to the respective application server using CD tools (Ansible, or even Jenkins or similar tools), whichever suits your infrastructure. In the deploy script itself you can connect to the EC2 service to create a new AMI once the deployment is completed.
You need to use a set of tools to achieve it: an SCM webhook/poll, Jenkins, and Ansible.
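For the last step, creating a new AMI once the deployment is completed, a minimal sketch with boto3 might look like this (the instance ID and image name are placeholders):

```python
# A minimal sketch of the "create a new AMI after deployment" step.
# The instance ID and image name are placeholders.
import datetime
import boto3

ec2 = boto3.client("ec2")

def create_ami_from_instance(instance_id: str) -> str:
    image_name = "iis-site-" + datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")
    response = ec2.create_image(
        InstanceId=instance_id,
        Name=image_name,
        NoReboot=True,  # avoid rebooting the instance that was just deployed to
    )
    return response["ImageId"]

new_ami_id = create_ami_from_instance("i-0123456789abcdef0")
# The Auto Scaling group's launch template/configuration would then be updated
# to point at new_ami_id, e.g. by creating a new launch template version.
print(new_ami_id)
```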
I wanted to know if there is a way to set up a cloud environment using Amazon Web Services automatically (for example, by just invoking a batch file...).
I have a scenario where I want to set up the environment with all the requisite things like the OS, platforms, etc. I want to automate the entire process of setting up the environment. Is this possible?
I am trying to do continuous integration, and as part of CI I want to first set up the environment for the application to be deployed, deploy the application, and then run automated and performance tests. I am using Jenkins to run my automated and performance test cases with Selenium and JMeter. Kindly help me.
You can use different tools based on your requirement.
If you also want to configure a VPC and other network-level configuration, you can use CloudFormation: basically you'll create a template and launch your infrastructure using this template file.
https://aws.amazon.com/cloudformation/
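A minimal sketch of launching such a template with boto3 (the stack name and template file are placeholders):

```python
# A minimal sketch: create a CloudFormation stack from a local template file
# and wait until creation completes. Stack name and file path are placeholders.
import boto3

cloudformation = boto3.client("cloudformation")

with open("environment-template.yaml") as template_file:
    template_body = template_file.read()

cloudformation.create_stack(
    StackName="ci-test-environment",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # only needed if the template creates IAM resources
)

# Block until the stack is fully created (or raise if creation fails).
cloudformation.get_waiter("stack_create_complete").wait(StackName="ci-test-environment")
```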
If you need to launch a project with a database, an application server (Tomcat, Java, Python, ...), and load balancing and auto scaling configuration, you can use Elastic Beanstalk.
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/Welcome.html
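And a comparable sketch for Elastic Beanstalk; the application, environment, and solution stack names are placeholders, and the solution stack string has to match one that actually exists in your region:

```python
# A sketch only: create an Elastic Beanstalk application and a load-balanced,
# auto-scaled environment. Names and the solution stack are placeholders.
import boto3

elasticbeanstalk = boto3.client("elasticbeanstalk")

elasticbeanstalk.create_application(ApplicationName="my-app")

elasticbeanstalk.create_environment(
    ApplicationName="my-app",
    EnvironmentName="my-app-env",
    SolutionStackName="64bit Amazon Linux 2 v3.5.0 running Python 3.8",  # placeholder
    OptionSettings=[
        {"Namespace": "aws:elasticbeanstalk:environment",
         "OptionName": "EnvironmentType",
         "Value": "LoadBalanced"},
        {"Namespace": "aws:autoscaling:asg",
         "OptionName": "MaxSize",
         "Value": "4"},
    ],
)
```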
OpsWorks or Docker could also be an option depending on your requirements, but they need pre-configuration.
It would be easier to advise a solution if you extend the question with your use case.