Configure Redis in GitLab CI/CD for use in Django

I had a working pipeline on GitLab which started failing after I added tests involving Redis. I've applied (as far as I can tell) what is written in the GitLab docs, but Redis is still not discovered by my tests.
Here's my .gitlab-ci.yml file:
pep8:
  image: python:latest
  services:
    - postgres:10-alpine
    - redis:latest
  variables:
    POSTGRES_DB: ci
    DATABASE_URL: "postgresql://postgres:postgres@postgres:5432/$POSTGRES_DB"
    REDIS_URL: redis
  stage: test
  script:
    - python -V
    - pip install -r ./requirements/gitlab.txt
    - pytest --pep8
  cache:
    paths:
      - ~/.cache/pip/
And here's how I'm using it inside Django:
import os
import redis

REDIS_URL = os.environ.get("REDIS_URL", "redis://localhost:6379/0")
cache = redis.StrictRedis.from_url(url=REDIS_URL)
I've also set the REDIS_URL environment variable to redis in the CI/CD settings in GitLab, but when the test executes, None gets assigned to the host and the test fails: self = Connection<host=None,port=6379,db=0>
Any idea how to connect with Redis on GitLab?

Finally figured it out, so I'm posting it here if you are also having this problem :)
I removed the REDIS_URL variable from .gitlab-ci.yml and added the following variable in the GitLab CI environment variables settings:
REDIS_URL -> redis://redis:6379/0
which solved the problem :)
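To see why the original value failed: redis-py's from_url expects a full URL with a scheme, and a bare string like redis parses with no host at all, which matches the host=None in the error. A minimal stdlib sketch of the difference (no Redis server needed):

```python
from urllib.parse import urlparse

# A bare hostname is not a valid URL: urlparse finds no netloc,
# so a client like redis-py ends up with host=None.
bad = urlparse("redis")
print(bad.hostname)  # None

# A full URL names the GitLab service host explicitly.
good = urlparse("redis://redis:6379/0")
print(good.hostname, good.port)  # redis 6379
```

The hostname redis works inside the job because GitLab exposes each entry under services: to the job under its image name.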

Django unable to create multiple databases for test in circleci

I have a Django project which uses 6 databases. For CI/CD, I am using CircleCI. I have written some unit test cases which work fine on my machine (local environment). But when I try to run them in the CircleCI environment, they fail. The reason for the failure is that Django creates only one of the six databases (a different, seemingly random one each time). I am not sure what I am doing wrong.
Here is the config I am using for CircleCI:
version: 2.1
orbs:
  python: circleci/python@0.2.1
jobs:
  test-job:
    docker:
      - image: circleci/python:3.8
        environment:
          DATABASE_URL: mysql://root@127.0.0.1:3306/db0
          DB1_DATABASE_URL: mysql://root@127.0.0.1:3306/db1
          DB2_DATABASE_URL: mysql://root@127.0.0.1:3306/db2
          DB3_DATABASE_URL: mysql://root@127.0.0.1:3306/db3
          DB4_DATABASE_URL: mysql://root@127.0.0.1:3306/db4
          DB5_DATABASE_URL: mysql://root@127.0.0.1:3306/db5
          ALLOWED_HOSTS: localhost
          CORS_ORIGIN_WHITELIST: http://localhost:8080
          CONN_MAX_AGE: 150
          DEBUG: False
          QRCODE_URL: http://test.com/
      - image: circleci/mysql:8.0.18
        command: [--default-authentication-plugin=mysql_native_password]
        environment:
          MYSQL_DATABASE: db0
    steps:
      - checkout
      - python/load-cache
      - python/install-deps
      - python/save-cache
      - run:
          command: python manage.py test
          name: Run Test
workflows:
  main:
    jobs:
      - test-job:
          filters:
            branches:
              only:
                - add_test_to_circleci
Any help would be highly appreciated. Thanks in advance!
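For context, the six URLs above would typically be wired into Django's DATABASES setting; one thing worth checking is that all six aliases actually appear in DATABASES in the CI environment, since the test runner only creates a test database per alias it knows about. A minimal sketch of such a settings fragment (parsing by hand with the stdlib rather than a helper library; the env var names are taken from the config above, the helper name is made up):

```python
import os
from urllib.parse import urlparse

def db_from_url(url):
    """Translate a mysql://user@host:port/name URL into a Django DATABASES entry."""
    p = urlparse(url)
    return {
        "ENGINE": "django.db.backends.mysql",
        "NAME": p.path.lstrip("/"),
        "USER": p.username or "",
        "PASSWORD": p.password or "",
        "HOST": p.hostname or "",
        "PORT": str(p.port or 3306),
    }

# 'default' plus five aliases, matching the env vars in the CircleCI config.
DATABASES = {
    "default": db_from_url(os.environ.get("DATABASE_URL", "mysql://root@127.0.0.1:3306/db0")),
}
for i in range(1, 6):
    url = os.environ.get(f"DB{i}_DATABASE_URL", f"mysql://root@127.0.0.1:3306/db{i}")
    DATABASES[f"db{i}"] = db_from_url(url)
```

If this dict ends up with only one entry in CI (for example because the env vars are set on the wrong container), that would produce exactly the "only one database created" symptom.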

Deploying Django Web App using Devops CI/CD onto Azure App Service

I'm trying to deploy a simple Django web app to Azure App Service using a CI/CD pipeline (the most basic one offered by Microsoft for app deployment, with no changes from me). However, I'm getting the following error:
2021-03-08T16:55:51.172914117Z File "", line 219, in _call_with_frames_removed
2021-03-08T16:55:51.172918317Z File "/home/site/wwwroot/deytabank_auth/wsgi.py", line 13, in
2021-03-08T16:55:51.172923117Z from django.core.wsgi import get_wsgi_application
2021-03-08T16:55:51.172927017Z ModuleNotFoundError: No module named 'django'
I checked other threads and tried doing all the things mentioned but it did not help, or I am missing something:
In wsgi.py I added:
import os
import sys
sys.path.append(os.path.dirname(os.path.abspath(__file__)) + '/..' )
sys.path.append(os.path.dirname(os.path.abspath(__file__)) + '/../licenses_api')
sys.path.append(os.path.dirname(os.path.abspath(__file__)) + '/../deytabank_auth')
from django.core.wsgi import get_wsgi_application
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'deytabank_auth.settings')
application = get_wsgi_application()
But I'm still getting the same error, where django is not recognized. I can see that requirements.txt is being installed successfully and it has all the necessary libraries there (including Django).
My CI/CD yaml file looks like this:
# Python to Linux Web App on Azure
# Build your Python project and deploy it to Azure as a Linux Web App.
# Change python version to one that's appropriate for your application.
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- develop

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureServiceConnectionId: '***'
  # Web app name
  webAppName: 'DeytabankAuth'
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
  # Environment name
  environmentName: 'DeytabankAuth'
  # Project root folder. Point to the folder containing manage.py file.
  projectRoot: $(System.DefaultWorkingDirectory)
  # Python version: 3.7
  pythonVersion: '3.7'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: BuildJob
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '$(pythonVersion)'
      displayName: 'Use Python $(pythonVersion)'
    - script: |
        python -m venv antenv
        source antenv/bin/activate
        python -m pip install --upgrade pip
        pip install setup
        pip install -r requirements.txt
      workingDirectory: $(projectRoot)
      displayName: "Install requirements"
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(projectRoot)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      displayName: 'Upload package'
      artifact: drop

- stage: Deploy
  displayName: 'Deploy Web App'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: DeploymentJob
    pool:
      vmImage: $(vmImageName)
    environment: $(environmentName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: '$(pythonVersion)'
            displayName: 'Use Python version'
          - task: AzureWebApp@1
            displayName: 'Deploy Azure Web App : DeytabankAuth'
            inputs:
              azureSubscription: $(azureServiceConnectionId)
              appName: $(webAppName)
              package: $(Pipeline.Workspace)/drop/$(Build.BuildId).zip
Maybe I need to configure something in the Azure App Service? But I am not sure exactly what.
I have met this issue before, and the problem might be your deployment method. I'm not sure which one you use, but the classic Deployment Center is being deprecated; try using the new Deployment Center.
I checked your workflow against the one that worked on my side, and there is nothing different, so I will post the steps that worked for me:
1. Check your project locally to make sure it runs successfully.
2. Create a new web app (this is to make sure no damage is done to your existing web app) and navigate to the Deployment Center page.
3. Go to your GitHub repository and open the GitHub Actions page to see the log.
4. Test your web app and check the file structure on the Kudu site: https://{yourappname}.scm.azurewebsites.net/wwwroot/
You can test by clicking the Browse button like I did.
If you want to run commands, go to this site: https://{yourappname}.scm.azurewebsites.net/DebugConsole
By the way, I post this link if you need to deploy using DevOps.
The possible reason for this issue is that you don't have Django installed.
On the Microsoft-hosted agent ubuntu-latest, Django is not pre-installed; you need to install it manually:
pip install Django==3.1.7
See this document for detailed information about installing Django.

Is there a way to inspect the process.env variables on a cloud run service?

After deployment, is there a way to inspect the process.env variables on a running cloud run service?
I thought they would be available in the following page:
https://console.cloud.google.com/run/detail
Is there a way to make them available here? Or to inspect it in some other way?
PS: This is a Docker container.
I have the following ENV directives in my Dockerfile, and I know they are present, because everything is working as it should. But I cannot see them in the service details:
Dockerfile
ENV NODE_ENV=production
ENV PROJECT_ID=$PROJECT_ID
ENV SERVER_ENV=$SERVER_ENV
I'm using a cloudbuild.yaml file. The ENV directives are present in my Dockerfile, and they are being passed to my container. Maybe I should add env to my cloudbuild.yaml file? I'm using --substitutions in my gcloud builds submit call and they are passed as --build-arg to my Docker build step, but I'm not declaring them as env in my cloudbuild.yaml.
I followed the official documentation and set the environment variables on a Cloud Run service using the console. Then I was able to list them on the Google Cloud Console.
You can set environment variables using the Cloud Console, the gcloud command line, or a YAML file when you create a new service or deploy a new revision.
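Note that the service detail page only shows variables set on the service or revision; values baked in with Dockerfile ENV live inside the image, which is why they work at runtime but do not appear in the console. As a sketch (service name and region are placeholders), the gcloud CLI can print what the service itself has configured:

```shell
# List the env vars configured on a Cloud Run service (replace name/region).
# Only variables set via --set-env-vars or the console appear here,
# not ones baked into the image with Dockerfile ENV.
gcloud run services describe SERVICE_NAME \
  --region=us-central1 \
  --format="yaml(spec.template.spec.containers[0].env)"
```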
With the help of @marian.vladoi's answer, this is what I've ended up doing.
In my deploy step from the cloudbuild.yaml file:
I added the --set-env-vars parameter.
steps:
  # DEPLOY CONTAINER WITH GCLOUD
  - name: "gcr.io/google.com/cloudsdktool/cloud-sdk"
    entrypoint: gcloud
    args:
      - "beta"
      - "run"
      - "deploy"
      - "SERVICE_NAME"
      - "--image=gcr.io/$PROJECT_ID/SERVICE_NAME:$_TAG_NAME"
      - "--platform=managed"
      - "--region=us-central1"
      - "--min-instances=$_MIN_INSTANCES"
      - "--max-instances=3"
      - "--set-env-vars=PROJECT_ID=$PROJECT_ID,SERVER_ENV=$_SERVER_ENV,NODE_ENV=production"
      - "--port=8080"
      - "--allow-unauthenticated"
    timeout: 180s

How to launch a Docker container in Gitlab CI/CD

I'm new to GitLab (and I only know the basic features of Git: pull, push, merge, branch...).
I'm using a local DynamoDB database launched with docker run -p 8000:8000 amazon/dynamodb-local to do unit testing on my Python project, so I have to launch this Docker container in the GitLab CI/CD for my unit tests to work.
I already read the GitLab documentation on this subject without finding an answer to my problem, and I know that I have to modify my .gitlab-ci.yml file in order to launch the Docker container.
When using GitLab you can use Docker-in-Docker.
At the top of your .gitlab-ci.yml file:
image: docker:stable
services:
  - docker:dind
Then in your stage for tests, you can start up the database and use it.
unit_tests:
  stage: tests
  script:
    ## -d runs the container in the background so the job can continue,
    ## and makes `docker run` print the container ID.
    - export CONTAINER_ID=$(docker run -d -p 8000:8000 amazon/dynamodb-local)
    ## You might need to wait a few seconds with `sleep X` for the container to start up.
    ## Your database is now reachable at docker:8000.
    ## Run your tests here with database host=docker and port=8000.
This is the best way I have found to achieve it, and the easiest to understand.
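Since the container takes a moment to start, a small wait-for-port helper is more reliable than a fixed sleep before running the tests; a sketch in plain Python (the helper name is made up, and host "docker" matches the Docker-in-Docker setup above):

```python
import socket
import time

def wait_for_port(host, port, timeout=30.0):
    """Poll until a TCP connection to host:port succeeds, or the timeout passes."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # something is listening
        except OSError:
            time.sleep(0.5)  # not up yet, retry
    return False

# In the CI job this would be wait_for_port("docker", 8000) before pytest runs.
```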

Integrating Selenium with Gitlab CI

I have created an automated selenium test script which works perfectly fine.
My task now is to set up Gitlab CI and try to automatically run this selenium script when I make a push to git.
Is it possible to make the Selenium script execute automatically and inform the user whether the script runs successfully or fails?
Thank you
How to automatically run automation tests on GitLab CI with Selenium and SpecFlow in a .NET project?
If this is something you are looking for, then here is the core part, which is to set up the .gitlab-ci.yml file.
Here is how the sample .gitlab-ci.yml should look:
image: please give your own docker which can download .net stuff

variables:
  DOCKER_DRIVER: overlay2
  SOURCE_CODE_DIRECTORY: 'src'
  BINARIES_DIRECTORY: 'bin'
  OBJECTS_DIRECTORY: 'obj'
  NUGET_PACKAGES_DIRECTORY: '.nuget'

stages:
  - Build
  - Test

before_script:
  - 'dotnet restore ${SOURCE_CODE_DIRECTORY}/TestProject.sln --packages ${NUGET_PACKAGES_DIRECTORY}'

Build:
  stage: Build
  script:
    - 'dotnet build $SOURCE_CODE_DIRECTORY/TestProject.sln --no-restore'
  except:
    - tags
  artifacts:
    paths:
      - '${SOURCE_CODE_DIRECTORY}/*/${BINARIES_DIRECTORY}'
      - '${SOURCE_CODE_DIRECTORY}/*/${OBJECTS_DIRECTORY}'
      - '${NUGET_PACKAGES_DIRECTORY}'
    expire_in: 2 hr

Test:
  stage: Test
  services:
    - selenium/standalone-chrome:latest
  script:
    - 'export MSBUILDSINGLELOADCONTEXT=1'
    - 'export selenium_remote_url=http://selenium__standalone-chrome:4444/wd/hub/'
    - 'export PATH=$PATH:${SOURCE_CODE_DIRECTORY}/chromedriver.exe'
    - 'dotnet test $SOURCE_CODE_DIRECTORY/ExpressTestProject.sln --no-restore'
  artifacts:
    paths:
      - '${SOURCE_CODE_DIRECTORY}/chromedriver.exe'
      - '${SOURCE_CODE_DIRECTORY}/*/${BINARIES_DIRECTORY}'
      - '${SOURCE_CODE_DIRECTORY}/*/${OBJECTS_DIRECTORY}'
      - '${NUGET_PACKAGES_DIRECTORY}'
That's it. When you set up your project with this .gitlab-ci.yml, 90% of your job is done.
The tests will run automatically in GitLab whenever you commit something to your source tree or TFS.
Thanks