Generating consumer key and consumer secret in WSO2 API Manager - wso2

I want to integrate the WSO2 API Manager with a website so that the user doesn't need to login to the API Store.
From what I can see there is a stumbling block, in that the user needs to click on the 'Generate' button in the store in order to first generate the consumer key and secret.
Once this has been done, it is possible to call the subscription API and the generate-token API as documented:
$ curl -b cookies http://localhost:9763/store/site/blocks/subscription/subscription-list/ajax/subscription-list.jag?action=getAllSubscriptions
{
"error" : false,
"subscriptions" : [
{
"sandRegenarateOption" : true,
"prodKey" : "2486e65cbac4e372fb319375744fb",
"subscriptions" : [
{
...
"prodConsumerSecret" : "Tx9i9WYu6_a3qqW08bF7jEG660",
"prodConsumerKey" : "VfS5r5u4rFhec2vVBlFosxRgcE",
"prodAuthorizedDomains" : "ALL"
...
}
],
...
"prodConsumerSecret" : "Tx9i9WYu6_a3qqW08bF7jEG660",
"prodConsumerKey" : "VfS5r5u4rFhec2vVBlFosxRgcE"
...
}
]
}
However, until the 'Generate' button has been clicked in the browser, the values above are null.
When I tried using the suggestion:
$ curl -X POST -b cookies http://localhost:9763/store/site/blocks/subscription/subscription-add/ajax/subscription-add.jag -d "action=generateApplicationKey&application=4&keytype=PRODUCTION&provider=admin&tier=Unlimited&version=0.1&callbackUrl=&authorizedDomains="
The server logs:
ERROR - APIStoreHostObject Error while obtaining the application access token for the application:4
org.wso2.carbon.apimgt.api.APIManagementException: Application should be approved before registering.
at org.wso2.carbon.apimgt.impl.APIConsumerImpl.requestApprovalForApplicationRegistration(APIConsumerImpl.java:1678)
at org.wso2.carbon.apimgt.impl.UserAwareAPIConsumer.requestApprovalForApplicationRegistration(UserAwareAPIConsumer.java:34)
at org.wso2.carbon.apimgt.hostobjects.APIStoreHostObject.jsFunction_getApplicationKey(APIStoreHostObject.java:649)
But according to the getApplications API it is approved already:
curl -b cookies http://localhost:9763/store/site/blocks/application/application-list/ajax/application-list.jag?action=getApplications
{
"applications" : [
...
{
"tier" : "Unlimited",
"status" : "APPROVED",
"callbackUrl" : "",
"name" : "app2",
"id" : 4,
"description" : ""
}
],
"error" : false
}
Any more thoughts?

You can try the following REST invocation:
curl -X POST -b cookies http://localhost:9763/store/site/blocks/subscription/subscription-add/ajax/subscription-add.jag -d "action=generateApplicationKey&application=DefaultApplication&keytype=PRODUCTION&provider=admin&tier=Unlimited&version=1.0.0&callbackUrl=&authorizedDomains="

I do not see any direct way to get that done. But you can make use of the Store API exposed by WSO2 API Manager to retrieve all the subscriptions, and from that you can filter out the required ConsumerKey and ConsumerSecret, as in the sketch below.
I have tried this and it worked pretty well for me :)
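A minimal sketch of that approach, assuming the same session cookie as in the question and that jq is available for the filtering:
# pull every subscription and keep only the production key/secret pairs
curl -s -b cookies "http://localhost:9763/store/site/blocks/subscription/subscription-list/ajax/subscription-list.jag?action=getAllSubscriptions" | jq '.subscriptions[] | {prodConsumerKey, prodConsumerSecret}'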

You need to promote the API before subscribing. Call the "/site/blocks/life-cycles/ajax/life-cycles.jag" endpoint with the "PUBLISHED" status and the "updateStatus" action. Once the API is in the promoted state you can subscribe to it.
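As a sketch, that publish call could look like the following. The /publisher prefix and the name/version/provider/publishToGateway parameters are assumptions based on the WSO2 Publisher API conventions, so check them against your API Manager version; also note the cookie here has to come from logging in to the publisher, not the store:
curl -X POST -b cookies http://localhost:9763/publisher/site/blocks/life-cycles/ajax/life-cycles.jag -d "action=updateStatus&name=MyAPI&version=0.1&provider=admin&status=PUBLISHED&publishToGateway=true&requireResubscription=false"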

Related

Google Cloud Functions Credentials for Local Development

I have a google cloud function. Within this function, I want to write files to GCS (google cloud storage), then get a signed URL of the file that is written to GCS and send that URL to the caller.
For local development, I run the functions locally using the functions-framework command:
functions-framework --source=.build/ --target=http-function --port 8082
When I want to write to GCS or get the signed URL, the cloud functions framework just tries to get the credentials from the signed-in gcloud CLI user. However, I want to point it to read the credentials from a service account. For all other gcloud development purposes, we have put the service account information in a local creds.json file and point the gcloud to read from that file.
Is there any way I can achieve this for functions? Meaning that when I start the functions locally (using functions-framework), I point it to the creds.json file to read the credentials from there?
All of Google's SDKs, e.g., the one for GCS, make use of Application Default Credentials, which you should be using instead of explicitly pointing to a key. If this holds for functions-framework, then exporting the GOOGLE_APPLICATION_CREDENTIALS variable should work.
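For the setup described in the question, that would look roughly like this (creds.json and the functions-framework flags are taken from the question itself):
# point Application Default Credentials at the local service-account key,
# then start the framework; the SDK calls inside the function will pick it up
export GOOGLE_APPLICATION_CREDENTIALS="$PWD/creds.json"
functions-framework --source=.build/ --target=http-function --port 8082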
The command gcloud auth application-default login is a better recommendation in that case, especially for testing the signed URL: with that local credential, as with the Cloud Functions credential (obtained through the metadata server), no private key is present, so the URL must be signed in a specific manner (you provide a token and the service account that is able to sign the URL).
Using gcloud auth application-default login creates Application Default Credentials, which have all the powers of the user's account and are persisted in a file at {HOME}/.config/gcloud/application_default_credentials.json.
Cloud Function Local Development Authentication
This is what I'm doing for local development of a Google Cloud Function, written in Node.js, that is triggered by a Pub/Sub event. The function reads a file from Google Cloud Storage and uses the Functions Framework for Node.js.
TL;DR;
# shell A
gcloud auth application-default login
npm start
pub/sub event message
# shell B
curl -d "#mockPubSub.json" \
-X POST \
-H "Content-Type: application/json" \
http://localhost:8080
Greater Details
Cloud Function with Functions Framework
Docs: Functions Framework Nodejs
package.json
note the --target and --signature-type
{
...
"scripts": {
"start": "npx functions-framework --target=helloPubSub --signature-type=http"
},
"dependencies": {
"#google-cloud/debug-agent": "^7.0.0",
"#google-cloud/storage": "^6.0.0"
},
"devDependencies": {
"#google-cloud/functions-framework": "^3.1.2"
}
...
}
sample nodejs cloud function that downloads file into memory
/* modified from the sample
index.js
*/
const {Storage} = require('@google-cloud/storage');
function log(message, severity = 'DEBUG', payload) {
// Structured logging
// https://cloud.google.com/functions/docs/monitoring/logging#writing_structured_logs
if (!!payload) {
// If payload is an Error, get the stack trace.
if (payload instanceof Error && !!payload.stack) {
if (!!message ) {
message = message + '\n' + payload.stack;
} else {
message = payload.stack;
}
}
}
const logEntry = {
message: message,
severity: severity,
payload : payload
};
console.log(JSON.stringify(logEntry));
}
function getConfigFile(payload){
console.log("Get Config File from GCS")
const bucketName = 'some-bucket-in-a-project';
const fileName = 'config.json';
// Creates a client
const storage = new Storage();
async function downloadIntoMemory() {
// Downloads the file into a buffer in memory.
const contents = await storage.bucket(bucketName).file(fileName).download();
console.log(
`Contents of gs://${bucketName}/${fileName} are ${contents.toString()}.`
);
}
downloadIntoMemory().catch(console.error);
}
exports.helloPubSub = async (pubSubEvent, context) => {
/*
Read payload from the event and log the exception in App project if the payload cannot be parsed
*/
try {
const payload = Buffer.from(pubSubEvent.body.message.data, 'base64').toString()
const pubSubEventObj = JSON.parse(payload) ;
console.log("name: ", pubSubEventObj.name);
getConfigFile(pubSubEventObj)
} catch (err) {
log('failed to process payload', 'ERROR', err);
}
};
Mock Message for Pub/Sub Event
blog reference, but I'm not using the emulator
myJson.json
{"widget": {
"debug": "on",
"window": {
"title": "Sample Konfabulator Widget",
"name": "main_window",
"width": 500,
"height": 500
},
"image": {
"src": "Images/Sun.png",
"name": "sun1",
"hOffset": 250,
"vOffset": 250,
"alignment": "center"
},
"text": {
"data": "Click Here",
"size": 36,
"style": "bold",
"name": "text1",
"hOffset": 250,
"vOffset": 100,
"alignment": "center",
"onMouseUp": "sun1.opacity = (sun1.opacity / 100) * 90;"
}
}}
encode for the Pub/Sub message (likely there's a better way)
cat myJson.json | grep -v % | base64
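A slightly tidier variant of the same encoding step, if you prefer:
# -w 0 disables line wrapping (GNU coreutils); on macOS, `base64 -i myJson.json` should do
base64 -w 0 myJson.json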
take that output and put it into value for the data key:
mockPubSub.json
{
"message": {
"attributes": {
"greeting": "Hello from the Cloud Pub/Sub Emulator!"
},
"data": "< put the output of the base64 from above here >",
"messageId": "136969346945"
},
"subscription": "projects/myproject/subscriptions/mysubscription"
}
Follow the steps from the TL;DR above.
Disclaimers
gcloud auth application-default login uses the permissions of the user who executes the command. So remember that in production the service account the Cloud Function runs as will need to read from the storage bucket (see the IAM sketch after these notes).
While scrubbing this (i.e. renaming bits) and copying it over, I may have messed it up. Sorry if that is true.
This is all contrived; if you are curious, my design is to take a message from Cloud Scheduler that includes relevant details about what to read from the config.
This article explains a way to hot-reload the cloud function.
It remains unclear why I'm using `--signature-type=http` to mock the message, but for now, I am.
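For the first disclaimer, granting the function's runtime service account read access to the bucket might look like this (the service-account address is a placeholder; the bucket name is the one used in the sample function):
gsutil iam ch \
  serviceAccount:my-function-sa@my-project.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://some-bucket-in-a-project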

Log entries API not retrieving log entries

I am trying to retrieve custom logs for a particular project in Google Cloud. I am using this API:
https://logging.googleapis.com/v2/entries:list
as per the example given in this link.
Below is the payload:
{
"filter": "projects/projectA/logs/slow_log",
"resourceNames": [
"projects/projectA"
]
}
There is a custom log-based metric called slow_log that I created in projectA, which gathers query logs from the Cloud SQL database in that project. I also generated data before calling this API. I am able to see the data in the Stackdriver console, but unable to get it from the REST call.
Every time I call this API, I only get this response and nothing else:
"nextPageToken": "EAA4suKu3qnLwbtrSg8iDSIDCgEAKgYIgL7q8wVSBwibvMSMvhhglPDiiJzdjt_zAWocCgwI2buKhAYQlvTd2gESCAgLEMPV7ukCGAAgAQ"
Is there anything missing here?
How is it possible to pass a time range in this query?
Update
I changed the request as per the comment below and gave the full path of the logs; still only the token is displayed:
{
"filter": "projects/projectA/logs/cloudsql.googleapis.com%2Fmysql-slow.log",
"projectIds": [
"projectA"
],
"orderBy": "timestamp desc"
}
I also ran this command from the command line:
gcloud logging read logName="projects/projectA/logs/cloudsql.googleapis.com%2Fmysql-slow.log"
and it fetches the logs on the command line, so I am not sure what I am missing in the API Explorer and Postman, where I only get the nextPageToken.
resourceNames, filter and orderBy are mandatory; try it like this:
{
"resourceNames": [
"projects/projectA"
],
"filter": "projects/projectA/logs/cloudsql.googleapis.com%2Fmysql-slow.log",
"orderBy": "timestamp desc"
}
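On the time-range part of the question: the Logging filter language accepts timestamp comparisons, so, as a sketch with placeholder timestamps, the same call restricted to a window could look like this:
curl -s -X POST "https://logging.googleapis.com/v2/entries:list" \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
-d '{
"resourceNames": ["projects/projectA"],
"filter": "logName=\"projects/projectA/logs/cloudsql.googleapis.com%2Fmysql-slow.log\" AND timestamp >= \"2019-01-01T00:00:00Z\" AND timestamp <= \"2019-01-02T00:00:00Z\"",
"orderBy": "timestamp desc"
}'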

Google Admin SDK - Create posix attributes on existing user

I can't create POSIX attributes on an existing account in admin.google.com (also known as Google Cloud Identity / Google Directory) using the Admin SDK (Directory API).
To explain my issue, I will use the API tester: https://developers.google.com/admin-sdk/directory/v1/reference/users/update?apix=true
I use the update function to update an existing account without POSIX attributes.
To do that I copy the request body below and use the request key testmdr@contoso.com:
{
"posixAccounts": [
{
"username": "testmdr_contoso_com",
"uid": "2147483645", # I use id between 65535 and 2147483647 (explain: in google documentation)
"gid": "1001",
"homeDirectory": "/home/testmdr_contoso.com",
"shell": "/bin/bash"
}
]
}
I obtain a 503 error:
{
"error": {
"errors": [
{
"domain": "global",
"reason": "backendError",
"message": "Service unavailable. Please try again"
}
],
"code": 503,
"message": "Service unavailable. Please try again"
}
}
If I update the name or other fields, it works.
If I update an existing POSIX attribute (existing because it was created when connecting to GCE using the OS Login functionality, see here), it works.
Please help me work out whether this is a limitation or a bug.
The requestKey should be the UUID of the user. There are probably better ways to do this, but you can get the username / name (requestKey/UUID) by querying the metadata on an OS Login-enabled instance, e.g. (first column is the username, second column is the requestKey for the API tester):
curl -s "http://metadata.google.internal/computeMetadata/v1/oslogin/users?pagesize=50&pagetoken=0" -H "Metadata-Flavor: Google" | \
jq -r '.loginProfiles[]|.posixAccounts[].username,.name' | \
paste - -
(You may have to play with the pagesize & pagetoken parameters)
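Putting it together, the users.update call with that UUID as the userKey might look like the following sketch; the UUID and access token are placeholders, and the body is the same one from the question:
curl -X PUT "https://admin.googleapis.com/admin/directory/v1/users/112233445566778899000" \
-H "Authorization: Bearer $ACCESS_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"posixAccounts": [
{
"username": "testmdr_contoso_com",
"uid": "2147483645",
"gid": "1001",
"homeDirectory": "/home/testmdr_contoso.com",
"shell": "/bin/bash"
}
]
}'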

AWS Pinpoint/Ionic - "Resource not found" error when trying to send push through CLI

I am new to programming with AWS services, so some fundamental things are pretty hard for me. Recently, I was asked to develop an app that uses Amazon Pinpoint to send push notifications, as a test for considering future implementations.
As you can see in another question I posted here (Amazon Pinpoint and Ionic - Push notifications not working when app is in background), I was having trouble trying to send push notifications to users while my app is running in the background. The app was developed using Ionic by following these steps.
When I was almost giving up, I decided to try sending the pushes directly through Firebase, and it finally worked. Some research took me to this question, in which another user described the problem as only happening in the AWS Console, so the solution would be to use the CLI. After searching a little, I found this tutorial about how to send Pinpoint messages to users using the CLI, which seems to be what I wanted. Combining it with this documentation about the PhoneGap plugin, I was able to generate a JSON file I thought could be a solution:
{
"ApplicationId":"io.ionic.starter",
"MessageRequest":{
"Addresses": {
"": {
"BodyOverride": "",
"ChannelType": "GCM",
"Context": {
"": ""
},
"RawContent": "",
"Substitutions": {},
"TitleOverride": ""
}
},
"Context": {
"": ""
},
"Endpoints": {"us-east-1": {
"BodyOverride": "",
"Context": {},
"RawContent": "",
"Substitutions": {},
"TitleOverride": ""
}
},
"MessageConfiguration": {
"GCMMessage": {
"Action": "OPEN_APP",
"Body": "string",
"CollapseKey": "",
"Data": {
"": ""
},
"IconReference": "",
"ImageIconUrl": "",
"ImageUrl": "",
"Priority": "High",
"RawContent": "{\"data\":{\"title\":\"sometitle\",\"body\":\"somebody\",\"url\":\"insertyourlinkhere.com\"}}",
"RestrictedPackageName": "",
"SilentPush": false,
"SmallImageIconUrl": "",
"Sound": "string",
"Substitutions": {},
"TimeToLive": 123,
"Title": "",
"Url": ""
}
}
}
}
But when I executed it in cmd with aws pinpoint send-messages --color on --region us-east-1 --cli-input-json file://test.json, I got the response An error occurred (NotFoundException) when calling the SendMessages operation: Resource not found.
I believe I didn't write the JSON file correctly, since it's my first time doing this. So please, if any of you know what I am doing wrong, no matter which step I misunderstood, I would appreciate the help!
"Endpoints" field in the Message request deals with the endpoint id (the identifier associated with an end user device while registering to pinpoint and not the region.)
In case if you haven't registered any endpoints with Pinpoint, you can use the "Addresses" field. After registering the GCM Channel in Amazon Pinpoint, you can get the GCM device token from your device and specify it here.
Here is a sample for sending direct messages using Amazon Pinpoint Note: The example deals with sending SMS message. You should have registered a SMS channel first and created an endpoint with the endpoint id as "test-endpoint1". Otherwise, you can use the "Addresses" field instead of "Endpoints" field.
aws pinpoint send-messages --application-id $APP_ID --message-request '{"MessageConfiguration": {"SMSMessage":{"Body":"hi hello"}},"Endpoints": {"test-endpoint1": {}}}'
Also Note: ApplicationId is generated by Pinpoint. When you visit the Pinpoint console and choose your application, the URL will be of the format
https://console.aws.amazon.com/pinpoint/home/?region=us-east-1#/apps/someverybigstringhere/
Here "someverybigstringhere" is the ApplicationId and not the name you give for your project.

How to get user attributes (username, email, etc.) using cognito identity id

I have AWS Cognito Identity Pool that is configured with Cognito User Pool as an authentication provider.
Assume I have identity ID of an identity in Cognito Identity Pool (e.g. us-east-1:XXaXcXXa-XXXX-XXXX-XXX-XXXXXXXXXXXX) where this identity has a linked login to a user in Cognito User Pool.
Using identity ID, how can I get the linked user details (email, phone, username)?
The ID token that you exchange with the Cognito federated identity service to get the identity id and credentials already has all the user attributes. You do not need an extra call to any service.
It is a JWT, and you can use any library on the client to decode the values. You can read this guide for more information about the tokens vended by Cognito user pools.
Alternatively, you can also use the Access Token to call GetUser API which will return all the user information.
Using REST API
AccessToken
I thought this could be very helpful to someone, as I've spent a lot of time trying to figure out how to get UserAttributes with only the accessToken and region (similar to this, but with the REST API, without using the aws-sdk).
You can get UserAttributes with the accessToken using this HTTP request (GetUser):
Method: POST
Endpoint: https://cognito-idp.{REGION}.amazonaws.com/
Content-Type: application/x-amz-json-1.1
Content-Length: 1162 // Access Token bytes length
X-Amz-Target: AWSCognitoIdentityProviderService.GetUser
Body: {"AccessToken":"ACCESS_TOKEN"}
If the accessToken is valid, you should receive a response like the following:
{
"UserAttributes": [
{
"Name": "sub",
"Value": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx"
},
{
"Name": "email_verified",
"Value": "true"
},
{
"Name": "name",
"Value": "Jason"
},
{
"Name": "phone_number_verified",
"Value": "true"
},
{
"Name": "phone_number",
"Value": "+xxxxxxxxxxx"
},
{
"Name": "email",
"Value": "xxxx#gmail.com"
}
],
"Username": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx"
}
AWS cognito-idp list-users has a filter option that allows you to filter based on attribute. 'sub' is the attribute that matches the identity id you are describing.
e.g. at the command line:
aws cognito-idp list-users --user-pool-id us-east-1_abcdFghjI --filter "sub=\":XXaXcXXa-XXXX-XXXX-XXX-XXXXXXXXXXXX\""
This also requires the user-pool-id, which I suspect you have. Additionally, I have no idea how this is implemented or how it performs when filtering a large number of users, but I take custom attributes not being usable in filters as a hint that there is some form of indexing behind the curtain.
I faced a similar issue and, after much head-scratching, I was not able to find the exact way of pulling out the details. My use case was to get the details in an Android app.
After looking into the AWSMobileClient API code, I found the call below, and it is working for me.
Log.i(TAG, "User Details"+ AWSMobileClient.getInstance().getUserAttributes().toString());
Recommendation: try using AWSMobileClient if you are doing Android development, as this is the newer library that is recommended.
I just struggled with this for a while; the way I got the user name using the Java API is:
identityManager.login(this, new DefaultSignInResultHandler() {
@Override
public void onSuccess(Activity activity, IdentityProvider identityProvider) {
...
String userName = ((CognitoUserPoolsSignInProvider) identityProvider).getCognitoUserPool().getCurrentUser().getUserId();
There is a listener we can initialize that will listen to changes in our authentication state and allow us to have access to the type of authentication event that happened and update the application state based on that data.
With Amplify, the Hub module allows us to do this pretty easily:
import { Hub } from 'aws-amplify';
Hub.listen('auth', (data) => {
const {payload} = data;
if (payload.event === 'signOut') {
console.log('signOut');
} else if (payload.event === 'signIn') {
console.log('A new auth event has happened: ', data.payload.data.username + ' has ' + data.payload.event);
}
});
For those who are looking for how to get the value of the email attribute programmatically in Java:
I assume you have already figured out how to get the needed / all users from the pool.
Say I have a ListUsersResult with all my users, and I want to check the email value of the first user:
ListUsersResult allUsers = getAllUsers();
UserType userType = allUsers.getUsers().get(0);
First I can get user's all attributes:
List<AttributeType> attributes = userType.getAttributes();
Then loop through the attributes looking for the one we're interested in (in our case, email):
for (AttributeType att : attributes) {
if (att.getName().equals("email")) {
// do whatever you want
}
}
Remember that printing it to the console will most probably not work, since it is sensitive data. But you can compare it like this:
att.getValue().equals("mymail@mail")
Use this piece of code
GetDetailsHandler detailsHandler = new GetDetailsHandler() {
@Override
public void onSuccess(CognitoUserDetails cognitoUserDetails) {
CognitoUserAttributes cognitoUserAttributes=cognitoUserDetails.getAttributes();
stringStringHashMap=new HashMap<>();
stringStringHashMap =cognitoUserAttributes.getAttributes();
userNumber=stringStringHashMap.get("phone_number");
e1.setText(userNumber);
Log.d("Response"," Inside DEATILS HANDLER");
// Store details in the AppHandler
AppHelper.setUserDetails(cognitoUserDetails);
// Trusted devices?
handleTrustedDevice();
// e1.setText(input.getText().toString());
}
@Override
public void onFailure(Exception exception) {
closeWaitDialog();
showDialogMessage("Could not fetch user details!", AppHelper.formatException(exception), true);
}
};
private void getDetails() {
AppHelper.getPool().getUser(username).getDetailsInBackground(detailsHandler);
}
console.log('username is ' + cognitoUser.getUsername());