Custom Identity Provider on Google Cloud

I'm a beginner when it comes to Google Cloud. I have only worked with AWS before, but for this purpose I want to give Google Cloud a try.
I want to create an application with no human users; instead, there are multiple instances of the same client application trying to access the Pub/Sub service. I would like each of these clients to register with my Cloud Function, which in return will:
create a pub/sub topic that only this client can listen to
return an identifier/key/something that can be used to authenticate the client the next time
How should I handle the authentication in this case? Should I create service credentials for each one of the clients? Or is there a way to provide a custom Identity Provider?

The first question is answered in this answer.
For the second one, the best way is for the user to be identified with Google OAuth (i.e. a Google account).
When you create the Pub/Sub topic for this user, you will have already identified them, so you can set the proper permissions on the topic (sketched below). Then, the user can simply call the Pub/Sub endpoint with that identity.
GCF, GAE apps, apps running on GKE, etc. all have service accounts associated with them, so there should be no problem properly identifying each client app running there.
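To illustrate, here is a minimal sketch of what the registration Cloud Function could do once the client is identified. The project ID, resource names, and the client's service account email are all placeholders, and it assumes the google-cloud-pubsub library:

    # Minimal sketch: provision a topic + subscription for one client and grant
    # only that client's identity permission to pull from it.
    # Project ID, resource names, and the service account email are hypothetical.
    from google.cloud import pubsub_v1

    PROJECT_ID = "my-project"
    CLIENT_ID = "client-123"
    CLIENT_MEMBER = "serviceAccount:client-123@my-project.iam.gserviceaccount.com"

    publisher = pubsub_v1.PublisherClient()
    subscriber = pubsub_v1.SubscriberClient()

    topic_path = publisher.topic_path(PROJECT_ID, f"{CLIENT_ID}-topic")
    subscription_path = subscriber.subscription_path(PROJECT_ID, f"{CLIENT_ID}-sub")

    publisher.create_topic(request={"name": topic_path})
    subscriber.create_subscription(
        request={"name": subscription_path, "topic": topic_path}
    )

    # Allow only this client to pull messages from its own subscription.
    policy = subscriber.get_iam_policy(request={"resource": subscription_path})
    policy.bindings.add(role="roles/pubsub.subscriber", members=[CLIENT_MEMBER])
    subscriber.set_iam_policy(
        request={"resource": subscription_path, "policy": policy}
    )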
If those users don't have an account (e.g. the client app is running outside of GCP), you can ask your human users (the ones running the client apps) to either:
Authenticate with their user account on your client app
Create a service account in GCP and make the client app use it
If those are not options, you can create a service account for each of your users, and provide the proper service account key file to each client.
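On the client side, whichever of the options above provides the identity, consuming the per-client subscription is then a plain pull. Another minimal sketch with made-up names, assuming the same library; on GCP the runtime's built-in service account is picked up via Application Default Credentials, and off GCP the key file from the last option can be used:

    # Minimal sketch of a client app pulling from "its" subscription.
    # Project and subscription names are hypothetical.
    from concurrent.futures import TimeoutError

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("my-project", "client-123-sub")

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        print("received:", message.data)
        message.ack()

    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    try:
        streaming_pull.result(timeout=30)  # listen for 30 seconds in this sketch
    except TimeoutError:
        streaming_pull.cancel()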

Related

What AWS service to use as OAUTH2 for the use with microservices

I have been doing some research on using an AWS service as OAuth2 for our application, which runs in 3 Docker containers (backend, frontend, database). The backend has an API which is obviously not open to the public and is accessible only within the Docker network. We are looking to extend our app with a chat service, which we want to implement as a separate service so that our app follows a microservices architecture, since we will add other services later on. So when a user logs into our app, their session should also be "shared" with the chat service.
Our chat service will be using sockets, and since sockets require a direct connection to user resources, we cannot just implement an integration layer that supplies all the resources the chat service needs. We have to either:
implement sockets within our application API (which we don't want to do; we want it as a microservice), or
open API endpoints for the chat service to use, but this option requires OAuth2, and that's what we are trying to deal with.
I am not sure if there is some other way to handle this that is also ready for the long run, but if this works, which AWS service would fit best to play the OAuth2 role for handling security here?
I also checked this post but it didn't help much in my case. I'm open to any suggestions. I've looked at AWS Lambda, AWS Cognito, and AWS Amplify; it's pretty confusing, with many features, and we don't want to overload the architecture with features we don't need.
What exactly is the thing you want? User accounts managed by AWS? Use Cognito.
Users logging in with Apple, Facebook, etc.? Use Cognito again.
Then just have some backend code so that once a user logs in, you create a token or session they can use to chat.
There are many YouTube videos on AWS Cognito, but a lot of them suck. The best one is written in React, but from before hooks came out. Here is part one: https://www.youtube.com/watch?v=EaDMG4amEfk
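As a concrete (hypothetical) illustration of that last point: once the frontend has logged the user in through Cognito, the chat microservice only needs to verify the ID token Cognito issued rather than run its own OAuth2 server. A minimal sketch, assuming python-jose and made-up region, pool, and client IDs:

    # Minimal sketch: the chat service validates a Cognito-issued ID token.
    # Region, user pool ID, and app client ID below are made up.
    import requests
    from jose import jwt  # pip install python-jose

    REGION = "eu-west-1"
    USER_POOL_ID = "eu-west-1_EXAMPLE"
    APP_CLIENT_ID = "example-app-client-id"

    JWKS_URL = (
        f"https://cognito-idp.{REGION}.amazonaws.com/{USER_POOL_ID}"
        "/.well-known/jwks.json"
    )
    jwks = requests.get(JWKS_URL, timeout=5).json()

    def verify_cognito_id_token(token: str) -> dict:
        """Return the token's claims if signature, issuer and audience check out."""
        kid = jwt.get_unverified_header(token)["kid"]
        key = next(k for k in jwks["keys"] if k["kid"] == kid)
        return jwt.decode(
            token,
            key,
            algorithms=["RS256"],
            audience=APP_CLIENT_ID,
            issuer=f"https://cognito-idp.{REGION}.amazonaws.com/{USER_POOL_ID}",
        )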

Can GCP service account keys be used as a direct substitute for Cloud Endpoints api keys?

Is it possible to create google API KEY programmatically?
I did see the above question, but I wanted to verify its applicability to my use case. I have a REST API deployed to Google App Engine for which I want to introduce some API key mechanism for external users. I'm not making a website where I would just be trying to make sure it's only my code talking to my code from front end to back end; it's a public API that anyone with valid credentials is able to access.
Google Cloud Endpoints will only authenticate API keys generated through GCP, so my thinking is that if it's possible to create a service account and an associated ServiceAccountKey via HTTP request, then it could be plausible to generate "API keys" (service account keys) for any prospective user by creating a service account per user and then giving them the relevant private key, which would allow them to authenticate through Cloud Endpoints (JWT signing?).
It sounds like a good plan to me but in all likelihood I'm missing something that makes this a terrible idea. Thoughts? Has this been done before/proven?
TL;DR: Wrong way
First, an API key on GCP only identifies a GCP project, not a user or a service account.
Then, a service account key file is not an API key. It's a secret identity that you can use to generate a signed JWT and obtain an OAuth2 identity token, following the OAuth2 flow. This identity token can then be presented for authentication (and it's valid for only 1 hour).
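To make that concrete, here is a minimal sketch of turning a service account key file into such an identity token with the google-auth Python library; the key file path and target audience are placeholders:

    # Minimal sketch: a client exchanges its service account key file for a
    # short-lived OAuth2 identity token; the key file itself is never used as
    # an "API key". File name and audience are hypothetical.
    import google.auth.transport.requests
    from google.oauth2 import service_account

    credentials = service_account.IDTokenCredentials.from_service_account_file(
        "client-sa-key.json",
        target_audience="https://my-api.example.com",
    )
    credentials.refresh(google.auth.transport.requests.Request())

    # Send as "Authorization: Bearer <token>"; it expires after about an hour.
    print(credentials.token)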
In addition, you are limited to 100 service accounts per project, and a service account is meant to authenticate an app, not a user. If you want to authenticate users, I recommend you have a look at Cloud Identity Platform.
Finally, API key generation has evolved very recently (about a month ago) and ... I would like to find the doc again, but it's a dead link. Maybe the beta is not ready yet.
Note: maybe the answer I just posted here can help you?
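Regarding the Cloud Identity Platform suggestion above: if the callers are end users rather than apps, the backend only has to verify the ID token they present. A minimal sketch, assuming the Firebase Admin SDK (Identity Platform tokens are verified the same way as Firebase Auth tokens):

    # Minimal sketch: verify a Cloud Identity Platform ID token on the backend.
    import firebase_admin
    from firebase_admin import auth

    firebase_admin.initialize_app()  # uses Application Default Credentials

    def caller_uid(id_token: str) -> str:
        """Return the authenticated user's uid, or raise if the token is invalid."""
        return auth.verify_id_token(id_token)["uid"]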

How do I make my end users (under wso2 identity server) subscribe to an api in wso2 api manager?

I am currently doing a PoC on WSO2 API Manager (v2.6.0). I already have a web application (e.g. pizza booking) and registered customers (end users) who use the application to book pizza. Now I want to bring the backend services of the pizza booking app, such as:
Choose store location,
Book pizza,
Track order, etc.
as APIs into WSO2 API Manager. For this, I would create the required APIs in API Manager. Then I want to bring my existing web application users (end users) into API Manager and give them access to those APIs.
What would be the best way to implement this?
Bring my users under WSO2 Identity Server and use Identity Server as the key manager for my API Manager?
Bring my users under a secondary user store / use a custom user store of API Manager?
But in that case, how would I provide access to particular APIs (subscribing to APIs) without logging into the API Manager Store and subscribing manually for every user?
Also,
What is the use of creating a Service Provider and an OAuth application under Inbound Authentication?
What can I do with this application?
Is this the same as the application we create before subscribing to an API in the API Manager Store?
Can I add users to this application and grant them access in common?
Can I subscribe to APIs using this application so that all users under this application will have access to them?
You can do it either way: using IS as the key manager (if you are already using IS), or adding your user store as a secondary user store.
So, if you are already using WSO2 Identity Server in your deployment, configuring it as the key manager (by sharing user stores) will automatically enable all the users in IS (with the proper permissions) to access the APIs.
If you do not use IS currently, the best option is to add your user store as a secondary user store to the existing APIM deployment.
Please find the answers to the other questions below.
What is the use of creating a Service Provider and an OAuth application under Inbound Authentication?
What can I do with this application?
Is this the same as the application we create before subscribing to an API in the API Manager Store?
Can I add users to this application and grant them access in common?
Can I subscribe to APIs using this application so that all users under this application will have access to them?
Answer
The Service Provider is created automatically when an OAuth application is created and keys are generated. But there are different aspects to these two entities.
The Service Provider is generally used for generating application keys to get an access token to invoke the APIs.
The OAuth application (when you create it through the API Store) has several other uses, such as subscribing to APIs, enforcing throttling policies for subscriptions, etc.
In order to use a token generated by an application, the application must be subscribed to the respective API. Otherwise, you will not be able to invoke that API even though you have a valid access token.
You can subscribe to an API only from the OAuth application created through the API Store.
Your users can use the same OAuth application (which is created via the Store portal and subscribed to an API) to generate access tokens for themselves. That is, by providing them with the application keys and using the password grant type, they can generate tokens for themselves (as sketched below).
Refer to this documentation for more info on the Token API and grant types: https://docs.wso2.com/display/AM260/Token+API
Adding to what Menaka has explained:
Your end users don't have to subscribe to APIs. Only the application developer has to subscribe and embed the consumer key/secret in their app. The application should then generate tokens for the end users using that key pair plus the end user's credentials.
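A minimal sketch of that token call against the default APIM 2.6.0 Token API; the gateway host, consumer key/secret, and the end user's credentials below are all placeholders:

    # Minimal sketch: an end user gets an access token through the Store
    # application's consumer key/secret using the password grant type.
    # Host, keys, and credentials below are placeholders.
    import base64

    import requests

    TOKEN_ENDPOINT = "https://localhost:8243/token"
    CONSUMER_KEY = "xxxxxxxx"
    CONSUMER_SECRET = "yyyyyyyy"

    basic = base64.b64encode(f"{CONSUMER_KEY}:{CONSUMER_SECRET}".encode()).decode()

    response = requests.post(
        TOKEN_ENDPOINT,
        headers={"Authorization": f"Basic {basic}"},
        data={
            "grant_type": "password",
            "username": "enduser",         # the end user's own credentials
            "password": "enduser-password",
            "scope": "default",
        },
        verify=False,  # default PoC setup uses a self-signed certificate
    )
    print(response.json()["access_token"])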

How to provide Service Instance specific Credentials in Cloud Foundry with Service Broker API?

A request to list all service instances to the Cloud Controller API of Cloud Foundry (API Docs) shows a credentials property in the response body.
I know you can provide credentials in service bindings and service keys through the Open Service Broker API, but how do I fill this global credentials object in a service instance?
IMO, this could only happen during service provisioning, but all the Service Broker API defines in the provisioning response is a dashboard URL and an operation.
I looked at a couple of my lab environments, which have a number of different service brokers installed on them. None of them used the field you're asking about.
i.e. cf curl /v2/service_instances. The dictionary resources[].entity.credentials was always empty.
My understanding is that service credentials are associated with a service binding or a service key, not the service itself. If you want to see the service bindings or service keys, you need to use a different API call.
Ex: service bindings: cf curl /v2/service_instances/<service-instance-guid>/service_bindings. In that output, resources[].entity.credentials should be populated with the service information (i.e. hostname, port, username, password, etc.; whatever is provided by the service).
Similarly, service key credentials would be under the API cf curl /v2/service_instances/<service-instance-guid>/service_keys.
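For completeness, a small sketch that pulls those per-binding credentials by shelling out to cf curl; the instance GUID is a placeholder:

    # Minimal sketch: read credentials from the service *bindings* of an
    # instance, since that's where they live rather than on the instance itself.
    import json
    import subprocess

    INSTANCE_GUID = "00000000-0000-0000-0000-000000000000"  # placeholder

    output = subprocess.run(
        ["cf", "curl", f"/v2/service_instances/{INSTANCE_GUID}/service_bindings"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout

    for binding in json.loads(output)["resources"]:
        print(binding["entity"]["credentials"])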
Maybe someone else can come along and tell us the purpose of this global field, but at the time of writing it appears to be unused.
Hope that helps!

WSO2 API Manager endpoints and cloud

I have several questions about WSO2 API Manager that I am not able to figure out reading the documentation:
Is it possible to set up a "default" base path for all APIs? For example, if I have to switch my endpoints from localhost:8080/rest/myapi to 12.43.56.89:8080/rest/myapi, is it possible to do so without editing every single API's endpoint?
Is it possible to create a role which allows access only to the sandbox endpoint but not to the production endpoint? The only way to do this, as far as I know, is to manually block access to production once the user has subscribed to the API. My idea is to allow all users to access the sandbox but enable only trusted users to access the production APIs once their applications have been validated.
Is it possible to distribute several instances of the API Manager Gateway? According to the documentation, it seems it is only possible to run the gateway, store, and publisher + key manager on different servers, but not to run multiple instances of the gateway in parallel.
Thank you!
1) You can use a variable for the endpoint base path, like this:
http://{uri.var.host}:{uri.var.port}/apis/weather
These variables can be taken from system variables. See this for how to do it.
2) You can use Key Generation (i.e. OAuth App Registration) workflows for this. This will send an approval request to the admin user. If you want to automate approval based on user roles or something similar, you can customize the workflow.
3) You can have multiple gateways.