Get serviceInstanceName and serviceKeyName for a Cloud Foundry user-provided service using cloudFoundryOperations - cloud-foundry

I'm trying to get the credentials of a user-provided service (UPS) in Cloud Foundry using:
Mono<ServiceKey> serviceKey = (Mono<ServiceKey>) cloudFoundryOperations
    .services()
    .getServiceKey(
        GetServiceKeyRequest.builder()
            .serviceKeyName("digital_cassandra")
            .serviceInstanceName("2a5aa377-e992-4f88-9f85-d9cec5c3bea9")
            .build())
    .subscribe();
serviceKey.map(serviceKey1 -> {
    System.out.println(serviceKey1.getCredentials().toString());
    return serviceKey1.getCredentials().get(0);
});
but nothing is printed.
How do I get the serviceKeyName and serviceInstanceName with cloudFoundryOperations?
I need to print all the service key names and service instance names in my space.

.serviceInstanceName("2a5aa377-e992-4f88-9f85-d9cec5c3bea9")
It should be the actual name, not the GUID, e.g. "my-key" or whatever you called your key.
but nothing printed. how to get the serviceKeyName and serviceInstanceName by cloudFoundryOperations?
If you just want to print to the console, try something like this:
cloudFoundryOperations
    .services()
    .getServiceKey(GetServiceKeyRequest.builder()
        .serviceInstanceName("reservation-db")
        .serviceKeyName("cf-mysql")
        .build())
    .doOnNext(key -> {
        System.out.println("Key:");
        System.out.println("  " + key.getName() + " (" + key.getId() + ")");
        key.getCredentials().forEach((k, v) -> {
            System.out.println("    " + k + " => " + v);
        });
    })
    .block();
The GetServiceKeyRequest determines which service key is looked up. The doOnNext call lets you inspect the key without consuming it, which works fine for printing it out. The example then calls .block() to wait for the result, which is fine because this is just an example; you wouldn't want to do that in your actual code. There you'd probably want one of the subscribe() variants (you could also swap doOnNext() for subscribe(), it just depends on what your code is doing).
i need to print all the serviceKeyName and serviceInstanceName in my space.
To get all the keys for all the service instances:
cloudFoundryOperations
    .services()
    .listInstances()
    .doOnNext(si -> {
        System.out.println(si.getName() + " (" + si.getId() + ")");
    })
    .flatMap((ServiceInstanceSummary si) -> {
        return cloudFoundryOperations
            .services()
            .listServiceKeys(ListServiceKeysRequest.builder()
                .serviceInstanceName(si.getName())
                .build())
            .doOnNext(key -> {
                System.out.println("Key:");
                System.out.println("  " + key.getName() + " (" + key.getId() + ")");
                key.getCredentials().forEach((k, v) -> {
                    System.out.println("    " + k + " => " + v);
                });
            });
    })
    .blockLast();
This one enumerates all the service instances, printing each name/id, then uses flatMap to fetch the service keys for each service instance, merging them all into one Flux<ServiceKey>. The doOnNext() is just for printing; you don't necessarily have to do that. You could consume the result in a number of ways, like collecting it into a list or subscribing to it. This just works nicely for an example; use what works best for your code.

Related

Multiple cloud-sql table export as csv

Is there a way to export multiple SQL tables as CSV by issuing specific queries from Cloud SQL?
Below is the code I currently have. When I call exportTables for multiple tables back to back, I see a 409 error. It's probably because the Cloud SQL instance is busy with an export and doesn't allow a subsequent export request.
How can I get this to work? What would be the ideal solution here?
private void exportTables(String table_name, String query)
    throws IOException, InterruptedException {
  HttpClient httpclient = new HttpClient();
  PostMethod httppost =
      new PostMethod(
          "https://www.googleapis.com/sql/v1beta4/projects/"
              + "abc"
              + "/instances/"
              + "zxy"
              + "/export");
  String destination_bucket =
      String.join(
          "/",
          "gs://" + "test",
          table_name,
          DateTimeUtil.getCurrentDate() + ".csv");
  GoogleCredentials credentials =
      GoogleCredentials.getApplicationDefault().createScoped(SQLAdminScopes.all());
  AccessToken access_token = credentials.refreshAccessToken();
  httppost.addRequestHeader("Content-Type", "application/json");
  httppost.addRequestHeader("Authorization", "Bearer " + access_token.getTokenValue());
  String request =
      "{"
          + "  \"exportContext\": {"
          + "    \"fileType\": \"CSV\","
          + "    \"uri\": \"" + destination_bucket + "\","
          + "    \"databases\": [\"" + "xyz" + "\"],"
          + "    \"csvExportOptions\": {"
          + "      \"selectQuery\": \"" + query + "\""
          + "    }"
          + "  }"
          + "}";
  httppost.setRequestEntity(new StringRequestEntity(request, "application/json", "UTF-8"));
  try {
    httpclient.executeMethod(httppost);
    if (httppost.getStatusCode() > 200) {
      String response = new String(httppost.getResponseBody(), StandardCharsets.UTF_8);
      if (httppost.getStatusCode() == 409) {
        throw new IOException("SQL instance seems to be busy at the moment. Please retry");
      }
      throw new RuntimeException(
          "Exception occurred while exporting the table: " + table_name + " Error " + response);
    }
  } finally {
    // Release the connection even when an error status triggers a throw above.
    httppost.releaseConnection();
  }
  logger.info("Finished exporting table {} to {}", table_name, destination_bucket);
}
I don't have a suggestion to fix the issue on Cloud SQL directly, but here is a solution to execute the exports in sequence thanks to a newer tool: Workflows.
Define the data format that you want, in JSON, to describe ONE export, then provide an array of those configurations to your workflow.
In the workflow:
- Loop over the configuration array.
- For each configuration, make an API call to Cloud SQL to start the export.
- Read the answer of the API call; it gives you the jobId.
- Sleep a while.
- Check whether the export is over (with the jobId).
- If not, sleep and check again.
- If yes, continue the loop (and thus start the next export).
It's serverless, and the free tier makes this use case free.
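The start-then-poll sequence above can also be sketched in plain Java, independent of Workflows. In this sketch, `startExport` and `isDone` are hypothetical stand-ins for the real Cloud SQL calls (POST to the export endpoint, then GET the operation status for the returned jobId):

```java
import java.util.List;
import java.util.function.Function;
import java.util.function.Predicate;

public class SequentialExporter {

    // Runs one export at a time: start it, poll until it finishes,
    // only then start the next one. Returns the number of completed exports.
    public static int runSequentially(List<String> configs,
                                      Function<String, String> startExport,
                                      Predicate<String> isDone,
                                      long pollMillis) throws InterruptedException {
        int completed = 0;
        for (String config : configs) {
            String jobId = startExport.apply(config); // kick off ONE export
            while (!isDone.test(jobId)) {             // poll the job status
                Thread.sleep(pollMillis);
            }
            completed++;                              // finished; next config may start
        }
        return completed;
    }
}
```

Because only one export is ever in flight, the instance is never busy with a previous export and the 409 should not occur.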

Why do profile pic URLs returned from graph.facebook result in a 404

The backend of my application makes a request to:
https://graph.facebook.com/v2.8/me?access_token=<firebase-access-token>&fields=id,name,first_name,birthday,email,picture.type(large){url}&format=json&method=get&pretty=0&suppress_http_code=1
I get a successful (200) response with the JSON data I expect and picture field as such:
"picture": {
"data": {
"url": "https://platform-lookaside.fbsbx.com/platform/profilepic/?asid=<asid>&height=200&width=200&ext=<ext>&hash=<hash>"
}
}
(where in place of <asid> and <ext>, there are numbers and <hash> is some alphanumeric string).
However, when I make a GET request to the platform-lookaside URL above, I get a 404 error.
It's been happening every time since my very first graph.facebook request for the same user. The very first one returned a platform-lookaside URL which pointed to a proper image (not sure if this is simply coincidence).
Is there something I'm doing wrong or is this likely a bug with the Facebook API?
FB currently seems to have issues with some CDNs, so your issue might be only temporary. You should also see missing/broken images in some places on facebook.com. Worst time to debug your issue :)
Try this code; it worked for me:
GraphRequest request = GraphRequest.newMeRequest(
    AccessToken.getCurrentAccessToken(), new GraphRequest.GraphJSONObjectCallback() {
        @Override
        public void onCompleted(JSONObject object, GraphResponse response) {
            // Insert your code here
            try {
                String name = object.getString("name");
                String email = object.getString("email");
                String last_name = object.getString("last_name");
                String first_name = object.getString("first_name");
                String middle_name = object.getString("middle_name");
                String link = object.getString("link");
                String picture = object.getJSONObject("picture").getJSONObject("data").getString("url");
                Log.e("Email = ", " " + email);
                Log.e("facebookLink = ", " " + link);
                Log.e("name = ", " " + name);
                Log.e("last_name = ", " " + last_name);
                Log.e("first_name = ", " " + first_name);
                Log.e("middle_name = ", " " + middle_name);
                Log.e("pictureLink = ", " " + picture);
            } catch (JSONException e) {
                e.printStackTrace();
                Log.e("GraphRequest", e.getMessage());
            }
        }
    });
Bundle parameters = new Bundle();
parameters.putString("fields", "id,name,email,link,last_name,first_name,middle_name,picture");
request.setParameters(parameters);
request.executeAsync();

Reading AWS ECS Cluster Tags from dotnet core code

I created tags for an ECS cluster in AWS.
For example, in the cluster mycluster's tags I may have something like:
ENVIRONMENT=Production
I spent a long time searching for .NET Core sample code showing how to read the key-value pairs from those ECS cluster tags.
I would much appreciate it if anyone could provide a simple .NET code sample for how to do it.
Thank you
I found it -
The NuGet package AWSSDK.ResourceGroupsTaggingAPI is needed for this.
using Amazon.ResourceGroupsTaggingAPI;
using Amazon.ResourceGroupsTaggingAPI.Model;

public string AWSGetClusterTag()
{
    string ret = "\n demo - ";
    try
    {
        var client2 = new AmazonResourceGroupsTaggingAPIClient(Amazon.RegionEndpoint.USEast2);
        // Note: GetTagValues returns the values used with this key across
        // all tagged resources in the region, not only the one cluster.
        GetTagValuesRequest req = new GetTagValuesRequest();
        req.Key = "your tag's key name here";
        GetTagValuesResponse res = client2.GetTagValuesAsync(req).GetAwaiter().GetResult();
        List<string> values = res.TagValues;
        ret = ret + " Cluster Tag = " + req.Key + " : " + values[0];
    }
    catch (Exception e)
    {
        ret = "Exception happened: " + e.Message;
    }
    return ret;
}

How to set failed in POSTMAN Collection runner via script

How can I set a "FAILED" result via script? I have an if statement that compares X to Y; if they don't match, I want the run to report "FAILED". For now I get a "FAILED" in my console, but the test still shows "PASS". I already searched other sites but found nothing about this. Hope you know what I mean.
if (pm.iterationData.get("ProfileSeries") == response.ProfileSeries) {
    console.log("expProfileSeries " + pm.iterationData.get("ProfileSeries") + " = " + response.ProfileSeries);
} else {
    console.log("FAILED");
}
console.log only writes to the console; it never affects the test result. Add a pm.test with a relevant pm.expect, so a mismatch marks the test as failed in the Collection Runner:
pm.test('ProfileSeries should be equal', function () {
    pm.expect(pm.iterationData.get("ProfileSeries")).to.equal(response.ProfileSeries);
});

Improving performance of WURFL code

I tested the following code, which prints the properties of the user agent. However, I notice that it takes noticeable time to execute.
// Initialize the WURFL library.
String projRoot = System.getProperty("user.dir");
wurflFile = projRoot + File.separator + "wurfl-2.3.3" + File.separator + "wurfl.xml";
File dataFile = new File(wurflFile);
wurfl = new CustomWURFLHolder(dataFile);

String deviceUrl = "Apple-iPhone5C1";
WURFLManager manager = wurfl.getWURFLManager();
Device device = manager.getDeviceForRequest(deviceUrl);
System.out.println("Device: " + device.getId());
System.out.println("Capability: " + device.getCapability("preferred_markup"));
System.out.println("Device UA: " + device.getUserAgent());

Map capabilities = device.getCapabilities();
System.out.println("Size of the map: " + capabilities.keySet().size());
Iterator itr = capabilities.keySet().iterator();
while (itr.hasNext()) {
    String str = (String) itr.next();
    System.out.println(str);
}
One of the reasons is that it takes time to load and parse the WURFL XML database file (which is about 20 MB in size).
I want to know if there is a different WURFL API that would improve this performance. Eventually I will be putting this code in an HTTP proxy, where I want to check the device profile parameters for adaptation of the content.
Thanks.
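Whichever API is used, the usual mitigation in a proxy is to pay the XML parse cost once at startup and reuse the holder for every request, rather than rebuilding it per lookup. A minimal, generic sketch of that init-once pattern (the WURFL load itself is represented by a supplier you pass in, so nothing here is WURFL-specific):

```java
import java.util.function.Supplier;

// Lazily runs an expensive loader exactly once and caches the result,
// so concurrent request threads share a single loaded instance.
public class Holder<T> {
    private final Supplier<T> loader;
    private volatile T instance;

    public Holder(Supplier<T> loader) {
        this.loader = loader;
    }

    // Double-checked locking: the expensive load runs at most once;
    // subsequent callers just read the cached volatile field.
    public T get() {
        T local = instance;
        if (local == null) {
            synchronized (this) {
                local = instance;
                if (local == null) {
                    instance = local = loader.get();
                }
            }
        }
        return local;
    }
}
```

In the proxy, something like `new Holder<>(() -> new CustomWURFLHolder(dataFile))` would live in a single static field, and each request calls get(); only the first request pays the ~20 MB parse.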