github pages issue when using github actions and github-pages-deploy-action?

I have a simple GitHub repo where I host the content of my CV. I use hackmyresume to generate the index.html. I'm using GitHub Actions to run the npm build, which should publish the generated content to the gh-pages branch.
My workflow file has
on:
  push:
    branches:
      - master
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Deploy with github-pages
        uses: JamesIves/github-pages-deploy-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          BASE_BRANCH: master # The branch the action should deploy from.
          BRANCH: gh-pages # The branch the action should deploy to.
          FOLDER: target # The folder the action should deploy.
          BUILD_SCRIPT: npm install && npm run-script build
And the build command is
"build": "hackmyresume BUILD ./src/main/resources/json/fresh/resume.json target/index.html -t compact",
I can see the generated html file getting committed to the github branch
https://github.com/emeraldjava/emeraldjava/blob/gh-pages/index.html
but the GitHub Pages site doesn't pick this up. I get a 404 error when I hit
https://emeraldjava.github.io/emeraldjava/
I believe my repo setting and secrets are correct but I must be missing something small. Any help would be appreciated.

This is happening because of your use of the GITHUB_TOKEN variable. There's an open issue with GitHub: the built-in token doesn't trigger the GitHub Pages deploy job. This means you'll see the files get committed correctly, but they won't be visible.
To get around this you can use a GitHub access token. You can learn how to generate one here. It needs to be correctly scoped so it has permission to push to a public repository. You'd store this token in your repository's Settings > Secrets menu (Call it something like ACCESS_TOKEN), and then reference it in your configuration like so:
on:
  push:
    branches:
      - master
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Deploy with github-pages
        uses: JamesIves/github-pages-deploy-action@master
        env:
          ACCESS_TOKEN: ${{ secrets.ACCESS_TOKEN }}
          BASE_BRANCH: master # The branch the action should deploy from.
          BRANCH: gh-pages # The branch the action should deploy to.
          FOLDER: target # The folder the action should deploy.
          BUILD_SCRIPT: npm install && npm run-script build
You can find an outline of these variables here. Using an access token will allow the GitHub Pages job to trigger when a new deployment is made. I hope that helps!
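As an aside, newer releases of this action read their settings from `with:` rather than `env:`. A minimal sketch of the equivalent step under the v4-style syntax (still using the same ACCESS_TOKEN secret; the rest of the workflow is unchanged) would be:

```yaml
- name: Deploy with github-pages
  uses: JamesIves/github-pages-deploy-action@v4
  with:
    token: ${{ secrets.ACCESS_TOKEN }}
    branch: gh-pages  # The branch the action should deploy to.
    folder: target    # The folder the action should deploy.
```

Pinning a tagged release instead of `@master` also protects you from breaking changes on the action's default branch.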

Related

How to setup terraform cicd with gcp and github actions in a multidirectory repository

Introduction
I have a repository with all the infrastructure defined using IaC, separated into folders. For instance, all Terraform configuration is in /terraform/. I want to apply all the Terraform files inside that directory from CI/CD.
Configuration
The used github action is shown below:
name: 'Terraform'
on: [push]
permissions:
  contents: read
jobs:
  terraform:
    name: 'Terraform'
    runs-on: ubuntu-latest
    environment: production
    # Use the Bash shell regardless whether the GitHub Actions runner is ubuntu-latest, macos-latest, or windows-latest
    defaults:
      run:
        shell: bash
        #working-directory: terraform
    steps:
      # Checkout the repository to the GitHub Actions runner
      - name: Checkout
        uses: actions/checkout@v3
      # Install the latest version of Terraform CLI and configure the Terraform CLI configuration file with a Terraform Cloud user API token
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v1
      - id: 'auth'
        uses: 'google-github-actions/auth@v1'
        with:
          credentials_json: '${{ secrets.GCP_CREDENTIALS }}'
      - name: 'Set up Cloud SDK'
        uses: 'google-github-actions/setup-gcloud@v1'
      # Initialize a new or existing Terraform working directory by creating initial files, loading any remote state, downloading modules, etc.
      - name: Terraform Init
        run: terraform init
      # Checks that all Terraform configuration files adhere to a canonical format
      - name: Terraform Format
        run: terraform fmt -check
      # On push to "master", build or change infrastructure according to Terraform configuration files
      # Note: It is recommended to set up a required "strict" status check in your repository for "Terraform Cloud". See the documentation on "strict" required status checks for more information: https://help.github.com/en/github/administering-a-repository/types-of-required-status-checks
      - name: Terraform Apply
        run: terraform apply -auto-approve -input=false
Problem
If I authenticate and then change directory before running Terraform, it doesn't find the credentials:
storage.NewClient() failed: dialing: google: could not find default credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
On the other hand, if I don't change the directory, then it doesn't find the configuration files, as expected:
Error: No configuration files
I tried moving the Terraform configuration files to the root of the repository, and that works. How could I implement this in a multidirectory repository?
This feature was requested before. As explained in the issue, the auth file is named gha-creds-*.json.
Therefore, I added a step just before the Terraform steps that updates the environment variable and moves the file itself:
- name: 'Setup google auth in multidirectory repo'
  run: |
    echo "GOOGLE_APPLICATION_CREDENTIALS=$GITHUB_WORKSPACE/terraform/`ls -1 $GOOGLE_APPLICATION_CREDENTIALS | xargs basename`" >> $GITHUB_ENV
    mv $GITHUB_WORKSPACE/gha-creds-*.json $GITHUB_WORKSPACE/terraform/
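That relocation step can be sketched and run locally. In this sketch, GITHUB_WORKSPACE, GITHUB_ENV, GOOGLE_APPLICATION_CREDENTIALS, and the `gha-creds-12345.json` filename are stand-ins for what the runner and the auth action would actually provide:

```shell
# Stand-ins for what the runner / auth action would provide (assumptions).
GITHUB_WORKSPACE=$(mktemp -d)
GITHUB_ENV="$GITHUB_WORKSPACE/github_env"
mkdir -p "$GITHUB_WORKSPACE/terraform"
GOOGLE_APPLICATION_CREDENTIALS="$GITHUB_WORKSPACE/gha-creds-12345.json"
echo '{}' > "$GOOGLE_APPLICATION_CREDENTIALS"

# Rewrite the variable to point inside terraform/, then move the file there,
# so later steps that run inside terraform/ still find the credentials.
echo "GOOGLE_APPLICATION_CREDENTIALS=$GITHUB_WORKSPACE/terraform/$(basename "$GOOGLE_APPLICATION_CREDENTIALS")" >> "$GITHUB_ENV"
mv "$GITHUB_WORKSPACE"/gha-creds-*.json "$GITHUB_WORKSPACE/terraform/"

cat "$GITHUB_ENV"
```

In a real job, writing to $GITHUB_ENV is what makes the updated variable visible to subsequent steps.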

Can't access Heroku /tmp/build_{sha} build files in Github Action once build has completed

I run a Github Action that deploys my app to Heroku with two relevant steps.
Build Step: push to Heroku, build (including heroku/nodejs buildpack), create JS sourcemaps (among other things).
Sentry Release Step: create a Sentry release, and ideally upload the sourcemaps created in Build Step.
I've noticed in build logs that my build directory is /tmp/build_{sha}/. The project is built here, and sourcemaps therefore would be found in /tmp/build_{sha}/static/dist. The problem is, I can't find the build directory or the sourcemaps in Sentry Release Step, or any step that runs after Build Step has completed.
Following Build Step completion, I've examined /tmp but there's no build_{sha} folder inside.
Yet when I run heroku run bash on the deployed dyno, I see sourcemaps in /static/dist and /staticfiles/dist, where I'd expect them. So where did the build files go after Build Step and before deployment? Shouldn't build files be accessible throughout the GitHub Action?
I've had success accessing sourcemaps within Build Step, by using a Sentry Sourcemap buildpack. Obviously this runs during the build. But I would prefer to have this happen in the Github Action. I've also tried the SentryWebpackPlugin but I've determined sourcemaps must be uploaded once webpack has completed - more specifically, once manage.py collectstatic has completed, since this changes the sourcemaps' filenames and I want to upload the final sourcemaps.
I've read that Heroku's ephemeral storage is wiped on restarting the dyno. But I can't even find these files after moving onto another step in my Github Action.
...
- name: Push To Heroku Remote
  run: |
    git fetch --unshallow
    git push --force heroku ${{ github.ref_name }}:master
- name: Create Sentry release
  uses: getsentry/action-release@v1
  env:
    SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
    SENTRY_ORG: ${{ secrets.SENTRY_ORG }}
  with:
    environment: staging
    projects: ${{ secrets.projects }}
    sourcemaps: <PATH_TO_TMP?>/staticfiles/dist

Local packages not loading to GCP python functions with github actions

I am trying to deploy a GCP function. My code uses a package that's on a private repository. I create a local copy of that package in the folder, and then use gcloud function deploy from the folder to deploy the function.
This works well. I can see a function that is deployed, with the localpackage.
The problem is with using github actions to deploy the function.
The function is part of a repository that has multiple functions, so when I deploy, I run GitHub Actions from outside the function's folder. The function gets deployed, but the dependencies do not get picked up.
For example, this is my folder structure:
my_repo
- .github/
  - workflows/
    - function_deploy.yaml
- function_1_folder
  - main.py
  - requirements.txt
  - .gcloudignore
  - localpackages --> These are the packages I need uploaded to GCP
My function_deploy.yaml looks like :
name: Build and Deploy to GCP functions
on:
  push:
    paths:
      - function_1_folder/**.py
env:
  PROJECT_ID: <project_id>
jobs:
  job_id:
    runs-on: ubuntu-latest
    permissions:
      contents: 'read'
      id-token: 'write'
    steps:
      - uses: 'actions/checkout@v3'
      - id: 'auth'
        uses: 'google-github-actions/auth@v0'
        with:
          credentials_json: <credentials>
      - id: 'deploy'
        uses: 'google-github-actions/deploy-cloud-functions@v0'
        with:
          name: <function_name>
          runtime: 'python38'
          region: <region>
          event_trigger_resource: <trigger_resource>
          entry_point: 'main'
          event_trigger_type: <pubsub>
          memory_mb: <size>
          source_dir: function_1_folder/
The google function does get deployed, but it fails with:
google-github-actions/deploy-cloud-functions failed with: operation failed: Function failed on loading user code. This is likely due to a bug in the user code. Error message: please examine your function logs to see the error cause...
When I look at the google function, I see that the localpackages folder hasn't been uploaded to GCP.
When I deploy from my local machine however, it does upload the localpackages.
Any suggestions on what I may be doing incorrectly? And how to upload the localpackages?
I looked at this question:
Github action deploy-cloud-functions not building in dependencies?
But didn't quite understand what was done here.

Github Actions to mirror and sync with AWS codecommit

I am planning to synchronize a repo from GitHub to AWS CodeCommit. All the present code, and future PRs merged to main, dev, and preprod, should end up in CodeCommit. I am looking at GitHub Actions and I see three different wiki/documentation pages. I am not sure which one to follow:
1. https://github.com/marketplace/actions/github-to-aws-codecommit-sync
2. https://github.com/marketplace/actions/mirroring-repository
3. https://github.com/marketplace/actions/automatic-repository-mirror
The first one (actions/github-to-aws-codecommit-sync) should be enough.
Its script entrypoint.sh does a:
git config --global credential.'https://git-codecommit.*.amazonaws.com'.helper '!aws codecommit credential-helper $@'
git remote add sync ${CodeCommitUrl}
git push sync --mirror
That should push all branches, including PR branches (in the refs/pull/ namespace).
The action should be called on merged PRs:
name: OnMergedPR
on:
  push:
    branches:
      - "**"
      - "!main"
  pull_request:
    branches:
      - main
    types: [opened, synchronize, closed]
  workflow_dispatch:
jobs:
  build:
    if: (!(github.event.action == 'closed' && github.event.pull_request.merged != true))
    ...
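The `git push sync --mirror` step can be demonstrated locally with a bare repository standing in for the CodeCommit remote (the repo paths, branch names, and commit here are made up for the sketch):

```shell
set -e
work=$(mktemp -d)

# A source repo with two branches, standing in for the GitHub clone.
git init -q -b main "$work/source"
git -C "$work/source" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "initial"
git -C "$work/source" branch dev

# A bare repo standing in for the CodeCommit remote.
git init -q --bare "$work/codecommit.git"

# Same shape as entrypoint.sh: add the remote, then mirror every ref.
git -C "$work/source" remote add sync "$work/codecommit.git"
git -C "$work/source" push -q sync --mirror

# The mirror now holds every ref from the source repo.
git -C "$work/codecommit.git" for-each-ref --format='%(refname)'
```

Note that `--mirror` also deletes remote refs that no longer exist in the source, so the mirror tracks the source exactly rather than only accumulating branches.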

is it wrong/dangerous to include aws-exports.js file in source control?

Amplify auto-ignores aws-exports.js via .gitignore, possibly simply because it may change frequently and is fully generated; however, maybe there are also security concerns?
For this project my GitHub repository is private, so that is not a concern, but I am wondering about future projects that could be public.
The reason I ask is that if I want to run my app's setup/build/test through GitHub workflows, I need this file for the build to complete properly on GitHub's machines.
I also appear to need it for my Amplify CI hosting to work on the Amplify console (I have connected my Amplify console build->deploy to my GitHub master branch and it all works perfectly, but only when aws-exports.js is in source control).
Here is my amplify.yml, I am using reason-react with nextjs, and my amplify console is telling me I have connected to the correct backend:
version: 1
frontend:
  phases:
    preBuild:
      commands:
        - yarn install
    build:
      commands:
        - yarn run build
  artifacts:
    baseDirectory: out
    files:
      - '**/*'
  cache:
    paths:
      - node_modules/**/*
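One common way to keep aws-exports.js out of source control is to have the Amplify console regenerate it at build time via a backend phase in amplify.yml. This is a sketch, assuming a backend connected to the same Amplify app; `amplifyPush --simple` is the helper the Amplify console provides for this:

```yaml
version: 1
backend:
  phases:
    build:
      commands:
        # Regenerates backend config (including aws-exports.js) during the
        # console build, so the file need not be committed (assumption: the
        # branch is connected to an Amplify backend environment).
        - amplifyPush --simple
frontend:
  phases:
    preBuild:
      commands:
        - yarn install
    build:
      commands:
        - yarn run build
  artifacts:
    baseDirectory: out
    files:
      - '**/*'
```

For GitHub-hosted workflows that are not run by the Amplify console, the equivalent would be generating the file with the Amplify CLI (e.g. `amplify pull`) using CI credentials, rather than committing it.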