1. I'm building a Flask project and I have also written some unit tests. Now I would like to run the unit tests on GitHub Actions, but the workflow gets stuck at the ./run step (./run starts the server on http://127.0.0.1:5000/) and never runs the pytest command. I know why pytest is not executed: the GitHub Actions runner is blocked by the server listening on http://127.0.0.1:5000/, so it cannot execute any command after ./run. I was wondering, can I run pytest in another terminal on GitHub Actions? Is this possible?
Here is the output of my GitHub Action:
Run cd Flask-backend
cd Flask-backend
./run
pytest
shell: /usr/bin/bash -e {0}
env:
pythonLocation: /opt/hostedtoolcache/Python/3.8.10/x64
* Serving Flask app 'app.py' (lazy loading)
* Environment: development
* Debug mode: on
* Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 404-425-256
2. Here is my YAML file:
name: UnitTesting
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install Python 3
        uses: actions/setup-python@v1
        with:
          python-version: 3.8
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirement.txt
      - name: Run tests with pytest
        run: |
          cd Flask-backend
          ./run
          pytest
You can use the nohup command to run the Flask server in the background instead of running it in a different terminal:
nohup python app.py &
Wait for some time after running this command (using the sleep command) and then run your tests, as in the sketch below.
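A minimal sketch of that step, assuming the Flask-backend directory and app.py entry point from the question and an arbitrary 5-second startup delay:
- name: Run server and tests
  run: |
    cd Flask-backend
    # start the dev server in the background so the step can continue
    nohup python app.py > server.log 2>&1 &
    # give the server a moment to start (assumed 5 seconds)
    sleep 5
    pytest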
I tried it several times without any success. However, there is a simple workaround to test your API locally: I achieved it with a pytest script using test_client() from the flask package. This client simulates your Flask app.
from api import app

# init test_client
app.config['TESTING'] = True
client = app.test_client()

# to test your app
def test_api():
    r = client.get('/your_endpoint')
    assert r.status_code == 200
Note that every request is now made directly through the client and the methods that belong to it.
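With this approach the CI job does not need to start a server at all; the test step can simply run pytest. A sketch under that assumption, reusing the paths from the original workflow:
- name: Run tests with pytest
  run: |
    cd Flask-backend
    # no ./run needed: the Flask test client drives the app in-process
    pytest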
You can find more information here: https://flask.palletsprojects.com/en/2.0.x/testing/
Related
So the question seems easy, but let me start with this: ";" and "&" do not work.
The two commands to be run on the GitHub Actions instance in the CI/CD pipeline:
python3 manage.py runserver
python3 abc.py
After putting the commands in the YAML file, only the first command runs; the workflow is then stuck there and does not execute the second command.
I have tried putting them in two separate blocks in the workflow YAML file, but with no luck.
There are two ways to run commands one after another on GitHub Actions.
On the same step:
steps:
  - name: Run both python files
    run: |
      python manage.py runserver
      python abc.py
On different steps (that will run in sequence):
steps:
  - name: Run first python file
    run: python manage.py runserver
  - name: Run second python file
    run: python abc.py
Also, you don't need to use python3; just python is enough, as the setup-python action already configures the version.
Therefore, your whole workflow would probably look like this:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository content
        uses: actions/checkout@v2.3.4
      - name: Setup Python Version
        uses: actions/setup-python@v2
        with:
          python-version: 3.8
      - name: Install Python dependencies
        run: python -m pip install --upgrade pip [...] # if necessary
      - name: Execute Python scripts
        run: |
          python manage.py runserver
          python abc.py
I have a repo that holds two applications: a Django one and a React one.
I'm trying to integrate tests into the pipeline for the Django application. Currently, the output I'm getting in the pipeline is:
python backend/manage.py test
+ python backend/manage.py $SECRET_KEY
System check identified no issues (0 silenced).
---------------------------------------------------------------
Ran 0 $SECRET_KEYS in 0.000s
However, running the same command in my local Docker container finds 11 tests to run. I'm not sure why they aren't being found in the pipeline.
My folder structure is like this
backend/
- ...
- app/
-- tests/
- manage.py
frontend/
- ...
bitbucket-pipelines.yml
and my pipelines file:
image: python:3.8
pipelines:
  default:
    - parallel:
        - step:
            name: Test
            caches:
              - pip
            script:
              - pip install -r requirements.txt
              - python backend/manage.py test
I was facing the same issue with a Django application: it works fine locally, but in the Bitbucket pipeline it installs all the requirements and related packages and then the last command does not run. My assumption is that sqlite3 is not supported in the Bitbucket pipeline.
I'm trying to test my React app with yarn using GitHub Actions. I need to have Django running for some tests; however, Django prints the
Watching for file changes with StatReloader
Performing system checks...
System check identified no issues (0 silenced).
October 15, 2021 - 03:32:25
Django version 3.2.6, using settings 'controller.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
message, which blocks bash. Any ideas how I can start Django inside the action so that the workflow continues to run?
This is currently my action
name: CI
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
  workflow_dispatch:
jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Installing dependencies
        run: ./install.sh
        shell: bash
      - name: Testing backend
        run: ./backend-test.sh
        shell: bash
      - name: Starting backend
        run: ./backend.sh > /dev/null 2>&1
        shell: bash
      - name: Testing frontend
        run: ./frontend-test.sh
        shell: bash
And backend.sh does this
# Backend test script
cd backend/ && \
sudo service postgresql start && \
source .venv/bin/activate && \
python3 manage.py test
You should add & at the end of the command that runs the backend.sh script to run it in the background. The process ID will be printed to the screen and you can run other scripts/commands:
- name: Starting backend
  run: ./backend.sh > /dev/null 2>&1 &
  shell: bash
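If the frontend tests sometimes start before Django has finished booting, a small wait step between the two can help. A sketch, assuming port 8000 from the question's log and arbitrary retry settings:
- name: Wait for backend
  run: |
    # poll until the dev server accepts connections (up to ~30 seconds)
    curl --silent --retry 10 --retry-delay 3 --retry-connrefused http://127.0.0.1:8000/ > /dev/null
  shell: bash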
I am setting up a GitLab pipeline that I want to use to deploy a Django app on AWS using Terraform.
At the moment I am just setting up the pipeline so that it validates the Terraform and runs tests (pytest) and linting.
The pipeline uses Docker-in-Docker and looks like this:
image:
  name: hashicorp/terraform:1.0.5
  entrypoint:
    - '/usr/bin/env'
    - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
stages:
  - Test and Lint
Test and Lint:
  image: docker:20.10.9
  services:
    - docker:20.10.9-dind
  stage: Test and Lint
  script:
    - apk add --update docker-compose
    - apk add python3
    - apk add py3-pip
    - docker-compose run --rm app sh -c "pytest && flake8"
  rules:
    - if: '$CI_MERGE_REQUEST_TARGET_BRANCH_NAME =~ /^(master|production)$/ || $CI_COMMIT_BRANCH =~ /^(master|production)$/'
The pipeline fails to run the tests due to what I think is a database error, which is weird as I am using pytest to mock the Django database.
If I just run:
docker-compose run --rm app sh -c "pytest && flake8"
in the terminal of my local machine, all tests pass.
Any idea how I can debug this?
p.s.
let me know if I need to add more info.
I don't think you are able to run docker in the CI directly. You can specify which image to use in each step and then run the commands. For instance:
image: "python:3.7"
before_script:
  - python --version
  - pip install -r requirements.txt
stages:
  - Static Analysis
  - Test
unit_test:
  stage: Test
  script:
    - pytest
In this pipeline I used the python:3.7 image. You can also upload your own Docker image to a registry and use it in the pipeline.
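For example, a sketch of a test job that pulls a pre-built image from a registry (the image name here is hypothetical):
unit_test:
  stage: Test
  image: registry.example.com/my-django-app:latest   # hypothetical image
  script:
    - pytest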
I managed to solve it, and the tests in CI pass using:
script:
  - apk add --update docker-compose
  - docker-compose up -d --build && docker-compose run --rm app sh -c "pytest && flake8"
1. I was trying to run the unit tests on GitHub Actions. After I run the script that starts the server on http://127.0.0.1:5000/, I cannot input any commands anymore (unless I open another terminal). I was wondering if there is any way to enter and execute a command after the server is listening on http://127.0.0.1:5000/? Or does GitHub Actions support running in different terminals?
2. Here is my YAML code:
name: UnitTesting
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install Python 3
        uses: actions/setup-python@v1
        with:
          python-version: 3.8
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirement.txt
      - name: Run server
        run: |
          cd Flask-backend
          nohup python3 app.py
          sleep
      - name: Run Unit test
        run: |
          pytest
Your command does not run in the background. You need to change it to
nohup python3 app.py > /dev/null 2>&1 &
Also, sleep requires an argument, like sleep 3. The corrected steps would look like the sketch below.
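Putting both fixes together, the last two steps of the workflow could look like this (the 5-second sleep and running pytest from Flask-backend are assumptions; adjust them to your project):
- name: Run server
  run: |
    cd Flask-backend
    # start the server in the background and discard its output
    nohup python3 app.py > /dev/null 2>&1 &
    # give the server time to come up (assumed 5 seconds)
    sleep 5
- name: Run Unit test
  run: |
    # tests are assumed to live in Flask-backend
    cd Flask-backend
    pytest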