Log entries API not retrieving log entries - google-cloud-platform

I am trying to retrieve custom logs for a particular project in Google Cloud. I am using this API:
https://logging.googleapis.com/v2/entries:list
as per the example given in this link.
Below is the payload:
{
  "filter": "projects/projectA/logs/slow_log",
  "resourceNames": [
    "projects/projectA"
  ]
}
There is a custom log-based metric called slow_log that I created in projectA, which gathers query logs from the Cloud SQL database in that project. I also generated data before calling this API. I am able to see the data in the Stackdriver console, but unable to get it from the REST call.
Every time I call this API, I only get this response and nothing else:
"nextPageToken": "EAA4suKu3qnLwbtrSg8iDSIDCgEAKgYIgL7q8wVSBwibvMSMvhhglPDiiJzdjt_zAWocCgwI2buKhAYQlvTd2gESCAgLEMPV7ukCGAAgAQ"
Is there anything missing here?
How can I pass a time range in this query?
Update
I changed the request as per the comment below and gave the full path of the log; still, only the token is returned:
{
  "filter": "projects/projectA/logs/cloudsql.googleapis.com%2Fmysql-slow.log",
  "projectIds": [
    "projectA"
  ],
  "orderBy": "timestamp desc"
}
I also ran this command from the command line:
gcloud logging read logName="projects/projectA/logs/cloudsql.googleapis.com%2Fmysql-slow.log"
and it fetches the logs, so I am not sure what I am missing in the API Explorer and Postman, where I get only the nextPageToken.

resourceNames is required; also set filter and orderBy, and try like this:
{
  "resourceNames": [
    "projects/projectA"
  ],
  "filter": "projects/projectA/logs/cloudsql.googleapis.com%2Fmysql-slow.log",
  "orderBy": "timestamp desc"
}
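To answer the time-range part of the question: the Logging filter language accepts timestamp comparisons with RFC 3339 values, so you can append them to the filter. A minimal sketch (the timestamps are placeholders):
{
  "resourceNames": [
    "projects/projectA"
  ],
  "filter": "logName=\"projects/projectA/logs/cloudsql.googleapis.com%2Fmysql-slow.log\" AND timestamp >= \"2019-01-01T00:00:00Z\" AND timestamp < \"2019-01-02T00:00:00Z\"",
  "orderBy": "timestamp desc"
}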

Related

Dealing With Incoming Null Values In Cloud Data Fusion When Building Data Pipeline

I have started trying out Google Cloud Data Fusion as a prospective ETL tool. When building a data pipeline to fetch data from a REST API source and load it into a MySQL database, I am facing this error: 'Expected a string but was NULL at line 1 column 221. Please check the system logs for more details.' And yes, it's true: I have a field that is null in the JSON response I am seeing:
"systemanswertime": null
How do I deal with null values, since the available dropdown (string) in the Cloud Data Fusion studio is not working? Are there other optional data types that I can use?
Below are two screenshots showing my current data pipeline structure:
general view
view showing mapping and the output schema
Thank You!!
What you need to do is tell the HTTP plugin that you are expecting a null by checking the null checkbox in front of the output field on the right side. See the example below.
You might be getting this error because in the JSON schema you are defining the value properties. You should allow the systemanswertime parameter to be NULL.
You could try to parse the JSON value as follows:
"systemanswertime": {
"type": [
"string",
"null"
]
}
In case you don't have access to the JSON file, you could try to use this plugin in order to enable the HTTP source to manage nullable values by dynamically substituting the configurations that can be served by the HTTP server. You will need access to the HTTP endpoint in order to construct an accessible HTTP endpoint that can serve content similar to:
{
  "name": "output.schema",
  "type": "schema",
  "value": [
    { "name": "id", "type": "int", "nullable": true },
    { "name": "first_name", "type": "string", "nullable": true },
    { "name": "last_name", "type": "string", "nullable": true },
    { "name": "email", "type": "string", "nullable": true }
  ]
}
In case you are facing an error such as No matching schema found for union type: ["string","null"], you could try the following workaround. The root cause of this error is that some entries in the response from the API don't have all the fields they need to have. For example, some entries may have callerId, channel, last_channel, last data, etc., but other entries may not have last_channel or some other field from the JSON. This leads to a mismatch with the schema provided in the HTTP source, and the pipeline fails right away.
As per this, when nodes encounter null values, logical errors, or other sources of errors, you may use an error handler plugin to catch errors. The way is as follows:
In the HTTP source plugin, change the following:
Output schema to account for the custom field.
JSON/XML field mapping to account for the custom field.
Change the Non-HTTP Error Handling field to Send to Error. This way it pushes the records through the error collector and the pipeline proceeds with subsequent records.
Add an Error Collector and a sink to capture the error records.
With this method you will be able to run the pipeline and have the problematic fields detected.
Kind regards,
Manuel

Google Cloud Vision API only returns "name"

I am trying to use Google Cloud Vision API.
I am using the REST API in this link.
POST https://vision.googleapis.com/v1/files:asyncBatchAnnotate
My request is
{
  "requests": [
    {
      "inputConfig": {
        "gcsSource": {
          "uri": "gs://redaction-vision/pdf_page1_employment_request.pdf"
        },
        "mimeType": "application/pdf"
      },
      "features": [
        {
          "type": "DOCUMENT_TEXT_DETECTION"
        }
      ],
      "outputConfig": {
        "gcsDestination": {
          "uri": "gs://redaction-vision"
        }
      }
    }
  ]
}
But the response always contains only "name", like below:
{
  "name": "operations/a7e4e40d1e1ac4c5"
}
My "gs" location is valid.
When I write a wrong path in "gcsSource", a 404 Not Found error is returned.
Does anyone know why my response is weird?
This is expected; it will not send you the output as an HTTP response. To see what the API did, you need to go to your destination bucket and check for a file named "xxxxxxxxoutput-1-to-1.json". Also, you need to specify the name of the object in your gcsDestination section, for example: gs://redaction-vision/test.
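For instance, the outputConfig could look like the sketch below; the test prefix is just an illustration, and batchSize (how many pages go into each output JSON file) is set to 1 here:
"outputConfig": {
  "gcsDestination": {
    "uri": "gs://redaction-vision/test"
  },
  "batchSize": 1
}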
Since asyncBatchAnnotate is an asynchronous operation, it won't return the result directly; it instead returns the name of the operation. You can use that unique name to call GetOperation to check the status of the operation.
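A minimal sketch of polling the operation with curl, assuming API-key auth (YOUR_API_KEY is a placeholder; the operation name comes from the response above):
curl "https://vision.googleapis.com/v1/operations/a7e4e40d1e1ac4c5?key=YOUR_API_KEY"
Once the response shows "done": true, the annotation results will be in the GCS destination.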
Note that there could be more than one output file for your PDF if the PDF has more pages than batchSize, and the output JSON file names change depending on the number of pages. It isn't safe to always append "output-1-to-1.json".
Make sure that the URI prefix you put in the output config is unique, because you have to do a wildcard search in GCS on the prefix you provide to get all of the JSON files that were created.
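For example, to list every output file written under a given prefix (a sketch with gsutil; the bucket and prefix are the illustrative ones from above):
gsutil ls gs://redaction-vision/test*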

AWS Pinpoint/Ionic - "Resource not found" error when trying to send push through CLI

I am new at programming with AWS services, so some fundamental things are pretty hard for me. Recently, I was asked to develop an app that used Amazon Pinpoint to send push notifications, as a test for considering future implementations.
As you can see in another question I posted here (Amazon Pinpoint and Ionic - Push notifications not working when app is in background), I was having trouble trying to send push notifications to users when my app is running in the background. The app was developed using Ionic by following these steps.
When I was almost giving up, I decided to try sending the pushes directly through Firebase, and it finally worked. Some research took me to this question, in which another user described the problem as only happening in the AWS console, so the solution would be to use the CLI. After searching a little about it, I found this tutorial about how to send Pinpoint messages to users using the CLI, which seems to be what I wanted. Combining it with this documentation about the PhoneGap plugin, I was able to generate a JSON file I thought could be a solution:
{
  "ApplicationId": "io.ionic.starter",
  "MessageRequest": {
    "Addresses": {
      "": {
        "BodyOverride": "",
        "ChannelType": "GCM",
        "Context": {
          "": ""
        },
        "RawContent": "",
        "Substitutions": {},
        "TitleOverride": ""
      }
    },
    "Context": {
      "": ""
    },
    "Endpoints": {
      "us-east-1": {
        "BodyOverride": "",
        "Context": {},
        "RawContent": "",
        "Substitutions": {},
        "TitleOverride": ""
      }
    },
    "MessageConfiguration": {
      "GCMMessage": {
        "Action": "OPEN_APP",
        "Body": "string",
        "CollapseKey": "",
        "Data": {
          "": ""
        },
        "IconReference": "",
        "ImageIconUrl": "",
        "ImageUrl": "",
        "Priority": "High",
        "RawContent": "{\"data\":{\"title\":\"sometitle\",\"body\":\"somebody\",\"url\":\"insertyourlinkhere.com\"}}",
        "RestrictedPackageName": "",
        "SilentPush": false,
        "SmallImageIconUrl": "",
        "Sound": "string",
        "Substitutions": {},
        "TimeToLive": 123,
        "Title": "",
        "Url": ""
      }
    }
  }
}
But when I executed it in cmd with aws pinpoint send-messages --color on --region us-east-1 --cli-input-json file://test.json, I got the response: An error occurred (NotFoundException) when calling the SendMessages operation: Resource not found.
I believe I didn't write the JSON file correctly, since it's my first time doing this. So please, if any of you knows what I am doing wrong, no matter which step I misunderstood, I would appreciate the help!
"Endpoints" field in the Message request deals with the endpoint id (the identifier associated with an end user device while registering to pinpoint and not the region.)
In case if you haven't registered any endpoints with Pinpoint, you can use the "Addresses" field. After registering the GCM Channel in Amazon Pinpoint, you can get the GCM device token from your device and specify it here.
Here is a sample for sending direct messages using Amazon Pinpoint Note: The example deals with sending SMS message. You should have registered a SMS channel first and created an endpoint with the endpoint id as "test-endpoint1". Otherwise, you can use the "Addresses" field instead of "Endpoints" field.
aws pinpoint send-messages --application-id $APP_ID --message-request '{"MessageConfiguration": {"SMSMessage":{"Body":"hi hello"}},"Endpoints": {"test-endpoint1": {}}}'
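A sketch of the "Addresses" variant for GCM, where DEVICE_TOKEN is a placeholder for the token obtained from your device:
aws pinpoint send-messages --application-id $APP_ID --message-request '{"MessageConfiguration": {"GCMMessage": {"RawContent": "{\"data\":{\"title\":\"sometitle\",\"body\":\"somebody\"}}"}}, "Addresses": {"DEVICE_TOKEN": {"ChannelType": "GCM"}}}'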
Also Note: ApplicationId is generated by Pinpoint. When you visit the Pinpoint console and choose your application, the URL will be of the format
https://console.aws.amazon.com/pinpoint/home/?region=us-east-1#/apps/someverybigstringhere/
Here "someverybigstringhere" is the ApplicationId and not the name you give for your project.

Amazon API Gateway Swagger importer tool does not import minItems field from Swagger

I am trying the API Gateway validation example from https://github.com/rpgreen/apigateway-validation-demo. I observed that, in the given swagger.json file, minItems is not imported into the models that were created during the Swagger import.
"CreateOrders": {
"title": "Create Orders Schema",
"type": "array",
"minItems" : 1,
"items": {
"type": "object",
"$ref" : "#/definitions/Order"
}
}
Because of this, when you give an empty array [] as input, instead of throwing an error about the minimum number of items in the array, the API responds with the message 'created orders successfully'.
When I manually add the same from the API Gateway console UI, it seems to work as expected. Am I missing something, or is this a bug in the importer?
This is a known issue with the Swagger import feature of API Gateway.
From http://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-known-issues.html
The maxItems and minItems tags are not included in simple request validation. To work around this, update the model after import before doing validation.
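As a sketch of that workaround via the CLI, you could patch the model's schema after import with update-model; the REST API ID and the simplified schema below are placeholders:
aws apigateway update-model \
  --rest-api-id abc123 \
  --model-name CreateOrders \
  --patch-operations '[{"op":"replace","path":"/schema","value":"{\"type\":\"array\",\"minItems\":1}"}]'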

Getting number of page likes, post shares and website clicks

How can I get the number of page likes, post shares and website clicks aggregated by ad group? Post comments and post likes are fetched through the field 'inline_actions' - https://developers.facebook.com/docs/reference/ads-api/adreportstats#columns (/reportstats/date_preset=yesterday&data_columns=['adgroup_id','inline_actions']&actions_group_by=['action_type']). I need a way to fetch these report items using the API, just like they are returned by https://www.facebook.com/ads/manage/reporting.php.
To get data at the adgroup level you must include adgroup_id or adgroup_name in data_columns, as you're doing.
Comments, likes and many other actions are returned in the actions column.
You can specify which action types you're interested in by specifying filters:
filters=[
{
"field": "action_type",
"type": "in",
"value": [
"like",
"page_engagement",
"post_engagement",
"post_like",
"comment",
"games.plays",
"post",
"photo_view",
"link_click",
"receive_offer",
"checkin",
"mention",
"tab_view",
"vote",
"follow",
"gift_sale",
"video_play",
"video_view",
"offsite_conversion",
"offsite_conversion.checkout",
"offsite_conversion.registration",
"offsite_conversion.lead",
"offsite_conversion.key_page_view",
"offsite_conversion.add_to_cart",
"offsite_conversion.other",
"app_install",
"app_engagement",
"app_story",
"app_use",
"credit_spent",
"mobile_app_install",
"app_custom_event",
"app_custom_event.fb_mobile_activate_app",
"app_custom_event.fb_mobile_complete_registration",
"app_custom_event.fb_mobile_content_view",
"app_custom_event.fb_mobile_search",
"app_custom_event.fb_mobile_rate",
"app_custom_event.fb_mobile_tutorial_completion",
"app_custom_event.fb_mobile_add_to_cart",
"app_custom_event.fb_mobile_add_to_wishlist",
"app_custom_event.fb_mobile_initiated_checkout",
"app_custom_event.fb_mobile_add_payment_info",
"app_custom_event.fb_mobile_purchase",
"app_custom_event.fb_mobile_level_achieved",
"app_custom_event.fb_mobile_achievement_unlocked",
"app_custom_event.fb_mobile_spent_credits",
"app_custom_event.other",
"rsvp"
]
}
]
However, that's optional IIRC; the API should send you all non-zero actions by default.
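Putting it together with the report call from the question, a sketch of a full query (the columns and date preset follow the question's example; the filter list is trimmed here to a few of the action types above):
/reportstats?date_preset=yesterday&data_columns=['adgroup_id','actions']&actions_group_by=['action_type']&filters=[{"field":"action_type","type":"in","value":["like","post_like","comment","link_click"]}]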