I have a number of repos that use a custom domain to give the GitHub Pages consistent URLs (all set to Public). I would like to make the GitHub Pages private, but then they are published to some GitHub-assigned URL. Can you change the URL on private GitHub Pages to match your custom domain? If so, how?
To be able to have a custom domain on a private repo, you would need either GitHub Pro, GitHub Team, GitHub Enterprise Cloud, or GitHub Enterprise Server. Custom domains are not available for private repos on a free account.
https://docs.github.com/en/pages/configuring-a-custom-domain-for-your-github-pages-site
I've created an application in the Google Workspace Marketplace which requires a service account.
The approval of the application is complete, but when I install the application from the marketplace, the service account does not have permissions until I add it to the domain-wide delegation:
https://developers.google.com/identity/protocols/oauth2/service-account#delegatingauthority
But the documentation suggests the delegation entry should appear automatically once the app is installed (screenshot omitted), so why is it not added in my GCP project?
I have tried messing around with the scopes, installing on several organizations, adding and removing the domain delegation.
I am expecting that once I approve the application on my organization, the delegation will be added automatically.
I am trying to set up a Cloud Build trigger from a public GitHub repository with the Cloud Build GitHub App. I installed the app on my repository and authorized it, but when I was redirected to GCP to connect the repository to a project, this error message came up:
Failed to retrieve GitHub repositories.
The caller does not have permission
I suspect it may have something to do with having two-factor authentication enabled on my GitHub account, which I need for an organization.
I was able to mirror the same GitHub repository from Cloud Source Repositories without any issues, though. I am the owner of the repository and the GCP project.
*edit
Looks like the issue is due to having 2-factor authentication enabled on my GitHub account. I disabled it and Cloud Build was able to connect with my repository. However, I will need to keep 2-factor auth enabled, as my GitHub organization requires it.
*edit
I hadn't mentioned that the GitHub organization I was part of had an IP whitelist configured on top of requiring 2-factor auth. I left the organization and re-enabled 2-factor auth, and Cloud Build was able to connect to my repo. I'm not sure why I would get the original issue if the repo is not in the GitHub organization.
After looking more into this problem: you either need to add the GCE IP address ranges (https://cloud.google.com/compute/docs/faq#find_ip_range) to the GitHub organization's IP whitelist, or disable the whitelist if possible.
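If you go the whitelist route, Google publishes its Cloud IP ranges in a machine-readable feed, so the CIDR blocks can be extracted rather than copied by hand. A minimal sketch, assuming the feed's published JSON shape; the sample data below is illustrative, not the live feed, and the feed URL in the comment should be verified against the docs linked above.

```python
import json

# Google publishes Cloud IP ranges as JSON (see the FAQ link above); the
# live feed URL is https://www.gstatic.com/ipranges/cloud.json (assumption
# based on current docs -- verify before relying on it). This sample only
# mirrors the feed's shape for illustration.
sample_feed = json.dumps({
    "syncToken": "1697040000000",
    "prefixes": [
        {"ipv4Prefix": "34.80.0.0/15", "service": "Google Cloud", "scope": "asia-east1"},
        {"ipv6Prefix": "2600:1900::/35", "service": "Google Cloud", "scope": "global"},
    ],
})

def extract_ipv4_ranges(feed_json: str) -> list[str]:
    """Collect the IPv4 CIDR blocks to paste into an org IP allow list."""
    data = json.loads(feed_json)
    return [p["ipv4Prefix"] for p in data["prefixes"] if "ipv4Prefix" in p]

print(extract_ipv4_ranges(sample_feed))  # ['34.80.0.0/15']
```

Note the ranges change over time, so a periodic re-sync of the whitelist would be needed.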
We are running a static website that gets deployed by CI automatically to a public S3 bucket. The website is a jekyll page that has multiple folders. We are very happy with the setup because of the ease of deployment and no infrastructure.
But we now have traffic to our website and we want to add a staging phase.
This phase should be reachable by selected non-technical people from known IPs. We are not able to achieve this using an S3 bucket, as the bucket needs to be public.
So we are looking for a way to deploy the static website with a staging area that is not public. Is this possible with an AWS service or another cloud offering?
The first part is relatively easy: just set up another bucket, deploy there for staging, and promote from there to your production bucket to go live.
The second part turns out to be straightforward too: you can specify a policy on an S3 bucket that restricts access to an IP range - see the example here: http://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html#example-bucket-policies-use-case-3
Personally I'd suggest that it would be better to use a login-based restriction if at all possible (the person you need to sign off being out of the office is a classic example of where IP address restrictions get you into trouble). Either way, you have sufficiently fine-grained control over S3 bucket permissions to let you do what you need.
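A minimal sketch of what such a policy looks like, following the AWS "restrict access to specific IP addresses" pattern in the linked example. The bucket name and CIDR are hypothetical placeholders; substitute your own.

```python
import json

# Hypothetical bucket name and office IP range -- substitute your own.
bucket = "my-staging-bucket"
allowed_cidr = "203.0.113.0/24"

# Allow reads (s3:GetObject) only when the request comes from the
# given source-IP range, via the aws:SourceIp condition key.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowReadFromOfficeIPs",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
            "Condition": {"IpAddress": {"aws:SourceIp": allowed_cidr}},
        }
    ],
}

# This JSON is what you would paste into the bucket's policy editor.
print(json.dumps(policy, indent=2))
```

The bucket itself can then stay non-public, with the policy as the only read path.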
We solved this issue by having subfolders in the S3 bucket with unguessable names. The subfolders are publicly accessible, but their unguessable names act as a shared-secret password to the static website. Every pull request gets automatically deployed to a subfolder in this bucket.
Example:
s3-staging-bucket
└ ed567c0e-dca9-44fc-b1bc-18ed5237f598/
└ index.html
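The scheme above can be sketched in a few lines. The bucket name matches the example; the PR numbers and URL form are illustrative, and the mapping from PR to folder is something your CI would record so reviewers can be sent the secret link.

```python
import uuid

# Bucket name from the example above; PR numbers are illustrative.
bucket = "s3-staging-bucket"

def new_staging_prefix() -> str:
    """Return an unguessable folder name (uuid4: 122 random bits)."""
    return str(uuid.uuid4())

# CI would store this mapping and post the secret URL on each pull request.
deployments = {pr: new_staging_prefix() for pr in (101, 102)}
for pr, prefix in deployments.items():
    print(f"PR #{pr}: https://{bucket}.s3.amazonaws.com/{prefix}/index.html")
```

The security rests entirely on the randomness of the folder name, so make sure the staging bucket has directory listing disabled, or the "secret" prefixes become enumerable.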
My Google account has been added to another Google Cloud Platform account. I want to create a simple static website on the cloud, so I have been following this: https://cloud.google.com/storage/docs/hosting-static-website
I need to create a bucket with the name of the domain. The docs state you need to be the owner of the domain, which you verify with Webmaster Tools; that's fine, as I own the domain on the Google account I was added with. I have then added that account's email address as a verified owner, but every time I go to create the bucket it still says I need to verify it! Does the domain need to be verified by the primary cloud account? Or is this just a cache thing? Or am I doing something else wrong!?
It turned out I had added the site as https within Google Search Console, and static sites hosted in buckets don't support SSL!
I am building an app with technologies I'm not familiar with (sails.js, AWS). I am trying to set up CI between Bitbucket and an AWS Ubuntu EC2 instance.
I have created a couple of repositories in Bitbucket and I am the owner of these repositories. The instructions I have found say that you need to log in to Bitbucket as an admin and click on the admin link. For some reason, I cannot see an admin link in Bitbucket.
Why is this? Do I need to create an admin account? Why can't the owner see the admin link in Bitbucket?