I am trying this guide.
What should I put in the Introspection Endpoint field? This is the phase where I already have the Access Token and need to request the resource.
Or, how do I find the configuration for the introspection endpoint, and why is it even asking for that endpoint?
I think you are using the master branch of the product-is git repo. If you're using IS 5.2.0, you should use the corresponding branch; the 5.2.0 branch should not ask for such an endpoint. As mentioned here, change the branch to v5.2.0 after you clone the samples from the git repo.
git checkout -b v5.2.0 v5.2.0
https://docs.wso2.com/display/IS520/Downloading+a+Sample
I have deployed my Django portfolio using an nginx server. Now I want a feature where, whenever I push changes to my GitHub repo, they get automatically deployed to my nginx server.
How can I do this?
Thank you
Read more about Jenkins; it will help you pull code from GitHub via a webhook and deploy it automatically, so you only have to push code to GitHub. You just have to install it on the server and set everything up.
You can use a CI/CD service, but recently those require validating your account with a credit card. If you don't have a card, you can use a Git hook for auto deployment.
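For the Git-hook route, a minimal sketch of a server-side post-receive hook might look like this; the paths, branch name and service names are assumptions you would adapt to your setup:

#!/bin/bash
# post-receive hook inside a bare repo on the server,
# e.g. /srv/git/portfolio.git/hooks/post-receive (path assumed)
TARGET=/var/www/portfolio          # working tree that nginx/gunicorn serves (assumed)
GIT_DIR=/srv/git/portfolio.git     # bare repository that receives the push (assumed)
BRANCH=main

while read oldrev newrev ref; do
  if [ "$ref" = "refs/heads/$BRANCH" ]; then
    # check the pushed branch out into the web root
    git --work-tree="$TARGET" --git-dir="$GIT_DIR" checkout -f "$BRANCH"
    sudo systemctl restart gunicorn   # assumed app server service name
    sudo systemctl reload nginx
  fi
done

With this approach you push to the server's bare repository (for example as a second remote) instead of, or in addition to, GitHub; if you want GitHub itself to trigger the deploy, you would pair a GitHub webhook with a small receiver on the server that runs the same checkout.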
My Next.js front-end app on AWS has a back-end dependency in package.json, linked in this way:
"api-client": "git+https://username:password@bitbucket.org/username/api_client_dev.git".
When I update my back-end repository with changes, locally (npm run dev) everything works, but when the app builds on AWS (with Amplify), it reports a type error about a variable that refers to something I haven't implemented yet.
My front-end doesn't recognize the updated repository.
If I check my repo on Bitbucket, it is updated.
No problems with branches.
I don't understand why. Any suggestions?
Thank you
The problem was in amplify.yml.
Adding the script npm update to the pre-build phase forces Amplify to refresh the cached dependencies in node_modules, my dependency included.
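For reference, a minimal amplify.yml sketch with npm update added to the pre-build phase might look like this (the phases, commands and artifacts path are assumptions based on a typical Next.js setup, not the exact file from this project):

version: 1
frontend:
  phases:
    preBuild:
      commands:
        - npm install
        - npm update        # refresh cached dependencies, including the git-linked api-client
    build:
      commands:
        - npm run build
  artifacts:
    baseDirectory: .next    # assumed build output directory for Next.js
    files:
      - '**/*'
  cache:
    paths:
      - node_modules/**/*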
I want to rebuild WSO2 EI from the source code in its Git repository, but I do not know how to start. I sent an email to dev@wso2.org a long time ago to ask for advice, but I have not received any information from them yet.
I also successfully downloaded and rebuilt the carbon-commons repository tag that was used for the WSO2 AS 5.3.0 release. But I do not know which tag of the carbon repository is used for WSO2 EI.
(following this reference link: https://docs.wso2.com/display/Carbon4411/Working+with+the+Source+Code)
Could you give me some advice on how to do this job quickly?
You can build the EI pack by cloning the product-ei repository from GitHub and checking out the relevant tag. All WSO2-related dependencies are hosted in the WSO2 Nexus repository, so you don't need to build the dependent repositories. You can build the EI project without running test cases by simply running "mvn clean install -Dmaven.test.skip=true".
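As a rough sequence of commands (the tag name is a placeholder for whichever tag matches your EI release):

git clone https://github.com/wso2/product-ei.git
cd product-ei
git checkout <relevant-tag>
mvn clean install -Dmaven.test.skip=true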
If you need to make a code change in a dependent repository, make the necessary changes in that repository and build the repositories in the following order.
For example, if you need to make a code change to Synapse, first build Synapse, then build the carbon-mediation project with the updated Synapse version, and finally build EI with the latest Synapse and carbon-mediation versions.
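Sketched as commands, assuming the wso2-synapse and carbon-mediation repositories are already cloned and you update the POM versions where noted:

cd wso2-synapse
mvn clean install -Dmaven.test.skip=true     # build the patched Synapse first
cd ../carbon-mediation                       # point its POM at the new Synapse version first
mvn clean install -Dmaven.test.skip=true
cd ../product-ei                             # point its POM at the new Synapse and carbon-mediation versions
mvn clean install -Dmaven.test.skip=true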
I am currently trying to build a project with source in a Git repository and some dependencies in Artifactory. I need to first download all the sources and binaries from the repo and Artifactory to my local workspace.
I could not find any information regarding Artifactory integration with Bazel. I can see that this feature has been requested: https://www.jfrog.com/jira/browse/RTFACT-15428?jql=labels%20%3D%20bazel.
Is anyone aware of any build tools that can first download resources and then build them?
I need both Git and Artifactory support.
According to the Bazel documentation for Java, you can define external dependencies resolved from Maven with the maven_jar rule.
As Artifactory supports Maven, you can set up your dependencies in a Maven repository, and retrieve artifacts from there with your Bazel build script.
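As a sketch, with an older Bazel where maven_jar is a native workspace rule, a WORKSPACE entry pointing at an Artifactory-hosted Maven repository could look like this (the coordinates and repository URL are made-up placeholders):

# WORKSPACE
maven_jar(
    name = "com_google_guava_guava",
    artifact = "com.google.guava:guava:23.0",
    repository = "https://artifactory.example.com/artifactory/libs-release/",  # assumed Artifactory Maven repo
    # optionally add sha1 = "..." to verify the downloaded jar
)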
On the other side of the build, publication seems to be a work-in-progress and on the roadmap for Bazel builds.
You can also attempt to write Artifactory repository rules in Skylark: https://docs.bazel.build/versions/master/skylark/repository_rules.html
Remote build cache
Bazel supports any HTTP/1.1 server with PUT and GET methods as an HTTP cache, and simple HTTP auth is also supported. This means using Artifactory as a remote build cache is straightforward.
Create a new Generic repository in Artifactory.
Now run bazel as
bazel test \
  --remote_http_cache=https://user:password@[...].com:8081/artifactory/bazel/ \
  //...
See https://docs.bazel.build/versions/master/remote-caching.html for the relevant Bazel doc.
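If you prefer not to pass the flag on every invocation, the same setting can go into a .bazelrc (the URL is a placeholder for your Artifactory instance, using the same flag as above):

# .bazelrc
build --remote_http_cache=https://user:password@artifactory.example.com:8081/artifactory/bazel/

Options set for build are inherited by bazel test, so one line covers both commands.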
Is there a way to add multiple Git repositories to the same Google Cloud project?
You currently cannot do this. We know this is a useful feature, and we're working hard on it. Stay tuned!
We've added the ability to have multiple Cloud Source Repositories for every cloud project.
You can read about how to add a new repo to your project here: https://cloud.google.com/source-repositories/docs/setting-up-repositories
There is no way of doing this as of today. Every project can only have one remote repository.
Git submodules should do the trick. Add the Git repositories as submodules; a rough sketch follows the links below.
See
https://git-scm.com/docs/git-submodule
https://git-scm.com/book/en/v2/Git-Tools-Submodules
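As a rough sketch (URLs and paths are placeholders):

# From the root of the main repository, add each additional repo as a submodule
git submodule add https://github.com/example/other-repo.git other-repo
git commit -m "Add other-repo as a submodule"

# Anyone cloning the main repository then pulls the submodule contents with
git clone --recurse-submodules <main-repo-url>
# or, in an existing clone:
git submodule update --init --recursive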
No, there isn't, but you can use Git subtree merges to add multiple "subrepositories" as folders in your main repository, which will do the trick.
See details here https://help.github.com/articles/about-git-subtree-merges/
(There are also submodules, as @Shishir stated, but as I understand it they are only set up for your current local clone and won't be included in checkouts/clones done by others, so I think submodules won't work.)
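A rough sketch of the subtree-merge approach described in the linked article (remote name, URL, branch and prefix are placeholders):

# Add the other repository as a remote and fetch it
git remote add -f other-repo https://github.com/example/other-repo.git
# Merge its history without touching the working tree yet
git merge -s ours --no-commit --allow-unrelated-histories other-repo/master
# Read its tree into a subdirectory of the main repository
git read-tree --prefix=other-repo/ -u other-repo/master
git commit -m "Merge other-repo as a subtree"

# Later, pull updates from the subrepository into that folder
git pull -s subtree other-repo master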
Every Google Cloud project can only have one remote repository.
However, it's definitely possible to have multiple local repositories that correspond with the same remote Google Cloud repository.
The official documentation describes the following procedure for using a Cloud Source Repository as a remote for a local Git repository:
Create a local Git repository
Now, create a repository in your environment using the Git command line tool and pull the source files for a sample application into the repository. If you have real-world application files, you can use these instead.
$ cd $HOME
$ git init my-project
$ cd my-project
$ git pull https://github.com/GoogleCloudPlatform/appengine-helloworld-python
Add the Cloud Source Repository as a remote
Authenticate with Google Cloud Platform and add the Cloud Source
Repository as a Git remote.
On Linux or Mac OS X:
$ gcloud auth login
$ git config credential.helper gcloud.sh
$ git remote add google https://source.developers.google.com/p/<project-id>/
On Windows:
$ gcloud auth login
$ git config credential.helper gcloud.cmd
$ git remote add google https://source.developers.google.com/p/<project-id>/
The credential helper scripts provide the information needed by Git to connect securely to the Cloud Source Repository using your Google account credentials. You don't need to perform any additional configuration steps (for example, uploading SSH keys) to establish this secure connection.
Note that the gcloud command must be in your $PATH for the credential helper scripts to work.
It also explains how to create a local Git repository by cloning a Cloud Source Repository:
Clone a Cloud Source Repository
Alternatively, you can create a new local Git repository by cloning
the contents of an existing Cloud Source Repository:
$ gcloud init
$ gcloud source repos clone default <local-directory>
$ cd <local-directory>
The gcloud source repos clone command adds the Cloud Source
Repository as a remote named origin and clones it into a local Git
repository located in <local-directory>.