Google Admin Reports API: Users Usage Report stats accuracy - google-admin-sdk

I am trying to use the Google Admin Reports API: Users Usage Report to pull the number of emails received/sent per user per day in our org's Google Apps domain.
When I use the Google APIs Explorer to pull my own stats for a particular day and compare them with what actually happened, the numbers seem far off.
For example, on Sunday, 7th Dec 2014, I only sent out one email, but the stats show 4 emails sent by me on that day.
Any assistance would be appreciated
Cheers,

You should get the same results as searching in Gmail:
in:sent from:me after:2014/12/07 before:2014/12/08
The missing bit is the time zone the server is using, which in my testing is always Pacific Standard Time.
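For reference, here is a rough Python sketch of pulling the same counters from the Reports API with the Google API client library; the service account file, the delegated admin address, and the user address are placeholders, not something from the original question:

# Sketch only: assumes a service account with domain-wide delegation
# and the admin.reports.usage.readonly scope has already been set up.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/admin.reports.usage.readonly']
creds = service_account.Credentials.from_service_account_file(
    'service-account.json', scopes=SCOPES,
    subject='admin@example.com')          # hypothetical admin user
service = build('admin', 'reports_v1', credentials=creds)

# The date is interpreted in Pacific time, per the note above.
report = service.userUsageReport().get(
    userKey='user@example.com',           # hypothetical user
    date='2014-12-07',
    parameters='gmail:num_emails_sent,gmail:num_emails_received',
).execute()

for usage in report.get('usageReports', []):
    for param in usage.get('parameters', []):
        print(param['name'], param.get('intValue'))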

Did you:
Send out any calendar invitations that day? (1 email per attendee)
Share any Google Drive files/folders that day? (1 email per file shared)
Send mail from a Google Group?
There are likely other actions you performed in other Google Apps that caused emails to go out in your name and count against your quota without necessarily showing in your Sent folder.
If you'd like for these messages to appear in your Sent folder, turn on Comprehensive Storage.

Related

Fetching Facebook Ad spends grouped by campaign names

I want to fetch the daily ad spend from Facebook broken down by campaign, i.e. the spend for each campaign separately.
I have tried the API call below, but that gives a region/country-wise breakdown. How do I group by campaign name?
"https://graph.facebook.com/v14.0/act_xxxxxx/insights?" + encodeURI("level=ad&fields=spend&access_token=FB_ACCESS_TOKEN&breakdowns=region,country&time_range={'since':'2022-08-09','until':'2022-08-09'}&filtering=[{\"field\":\"country\",\"operator\":\"CONTAIN\",\"value\":\"US\"}]&limit=100")
Campaigns and ad sets have an insights field where you can read performance and spend metrics.
Get insights from the campaigns endpoint:
https://graph.facebook.com/v14.0/act_xxx/campaigns?fields=name,insights{spend}
You can use date filters with your calls to get lifetime spend.
You can also get spend from a campaign's ad set insights field.
Check the official documentation:
https://developers.facebook.com/docs/marketing-api/reference/ad-campaign/insights
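A minimal Python sketch of that campaigns call, using the requests library; the account id and token are placeholders from the question:

import requests

ACCESS_TOKEN = 'FB_ACCESS_TOKEN'   # placeholder token from the question
AD_ACCOUNT_ID = 'act_xxxxxx'       # placeholder ad account id

# Read the spend per campaign from the campaigns edge, as in the URL above.
resp = requests.get(
    f'https://graph.facebook.com/v14.0/{AD_ACCOUNT_ID}/campaigns',
    params={
        'fields': 'name,insights{spend}',
        'access_token': ACCESS_TOKEN,
        'limit': 100,
    },
)
resp.raise_for_status()

for campaign in resp.json().get('data', []):
    insights = campaign.get('insights', {}).get('data', [])
    spend = insights[0].get('spend', '0') if insights else '0'
    print(campaign.get('name'), spend)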

Google Finance API Time Period

I am using a C# class to get the prices of a given stock from Google Finance.
The communication between the class and the Web Service is working well.
I am using the following url:
http://www.google.com/finance/historical?output=csv&q=BBAS3
The problem is that the returned prices are only from about 1 year ago.
I need to get all the available prices records.
If I search for the stock ticker "BBAS3" on the Google Finance page, it shows me a chart that goes back to 2003.
As far as I could find, there is no official documentation for the API.
I found some more info in this site:
Google's Undocumented Finance API
As it shows, I can use the following parameter:
p - Period. (A number followed by a "d" or "Y", i.e. days or years. Ex: 40Y = 40 years.)
Then I tried (&p=5Y):
http://www.google.com/finance/historical?output=csv&p=5Y&q=BBAS3
But the result was the same.
Does anyone know another way to achieve what I need?
Google has changed this now.
I've been playing around with the undocumented Google Finance API. It provides intra-day data for the past 10 days and day-granularity data going back for years. Handy!
There are lots of web pages that attempt to describe how this thing works. I've pulled together a bunch of data from them, as well as a few of my own observations. You can watch this thing in action by popping open Firebug while messing about with the Google Finance chart; it will make AJAX requests to this API.
Here's an example URL to pull all historical data for GOOG at daily granularity:
http://www.google.com/finance/getprices?q=GOOG&x=NASD&i=86400&p=40Y&f=d,c,v,k,o,h,l&df=cpct&auto=0&ei=Ef6XUYDfCqSTiAKEMg
Visit this documentation. For the ticker you searched, the data is available at:
https://www.google.com/finance/getprices?q=BBAS3&x=BVMF
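For what it's worth, a rough Python sketch of fetching and parsing that getprices response; since the API is undocumented, the column layout and the 'a'-prefixed Unix timestamps below are based on observation rather than any official reference:

import datetime
import urllib.request

url = ('https://www.google.com/finance/getprices'
       '?q=BBAS3&x=BVMF&i=86400&p=40Y&f=d,c,h,l,o,v')
raw = urllib.request.urlopen(url).read().decode('utf-8')

base_ts = None
for line in raw.splitlines():
    cols = line.split(',')
    first = cols[0]
    if first.startswith('a'):              # row carrying an absolute Unix timestamp
        base_ts = int(first[1:])
        offset = 0
    elif first.isdigit() and base_ts is not None:
        offset = int(first)                # offset in intervals from base_ts
    else:
        continue                           # header lines (COLUMNS=, INTERVAL=, ...)
    day = datetime.datetime.utcfromtimestamp(base_ts + offset * 86400)
    print(day.date(), cols[1])             # date and close (per f=d,c,...)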

How to retrieve the service fee details from Amazon market API

Is it possible to retrieve service fee charges that are independent of a SKU, like the Subscription Fee, FBA Inventory Storage Fee, etc., using the Amazon marketplace API?
I tried the Financial Events API, which returns the service fees in this format:
<ServiceFeeEvent>
    <FeeList>
        <FeeComponent>
            <FeeType>FBADisposalFee</FeeType>
            <FeeAmount>
                <CurrencyAmount>-0.15</CurrencyAmount>
                <CurrencyCode>USD</CurrencyCode>
            </FeeAmount>
        </FeeComponent>
    </FeeList>
</ServiceFeeEvent>
This does not contain data like PostedDate. Are there any other APIs available to get detailed service fee amounts?
In case it's useful for someone else, I figured out an approach that kind of works for me, though it's not ideal.
I'm using the Reports API to download the _GET_V2_SETTLEMENT_REPORT_DATA_FLAT_FILE_ report, which has the fees and a posted-date column. Some of Amazon's documentation about it can be found here: http://docs.developer.amazonservices.com/en_US/reports/Reports_ReportType.html
The disadvantage is that it's only generated once every two weeks. The advantage, compared to the Finances API, is that you get the posting date and the specific transaction that the fee came from.
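In case it helps, a small Python sketch of pulling the account-level fee rows (no SKU) out of a downloaded settlement report; the column names are the ones I see in my own flat files and may vary:

import csv

# Assumes _GET_V2_SETTLEMENT_REPORT_DATA_FLAT_FILE_ has already been
# downloaded as a tab-delimited file named settlement_report.txt.
with open('settlement_report.txt', newline='', encoding='utf-8') as fh:
    reader = csv.DictReader(fh, delimiter='\t')
    for row in reader:
        if row.get('sku'):
            continue   # skip SKU-level rows, keep account-level fees
        if not row.get('amount-description'):
            continue   # skip the settlement summary row
        print(row.get('posted-date'),
              row.get('transaction-type'),
              row.get('amount-description'),
              row.get('amount'))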

"Average Daily Active Users" for Facebook API Limits

I've suddenly started running into API limits. I've been restricting my API calls to: number of users * 200, but I'm getting error #4 about once per day.
This calculation was based on the docs from the end of 2015, which said: the number of users your app had yesterday, plus today's new logins.
But it looks like that has changed to:
The number of users your app has is the average daily active users of your app, plus today's new logins
Can someone explain to me what "average daily active users" is? And is there a way I can get access to this number?
Some information on what I'm doing:
My app fetches pages and posts from pages. To do this, I hit the Facebook API to get each user's liked pages. Then, each hour, I fetch posts from the pages the system knows about.
I do the following:
Batch requests (50 per batch)
I'm only fetching posts since the last fetch (using since, until and limit params. 90% of the requests return 0 posts)
I'm only fetching posts from pages my users like
I'm using my app token for these requests
I limit the number of calls per hour to users * 200
Batch requests don't reduce API limits; they are only faster, that's all. That being said: you wrote that you are using an app token for the requests - you should use a user token instead. It's still a LOT of calls though, and the only thing you can do in addition is to reduce the number of API calls.
I found this endpoint in the documentation: https://developers.facebook.com/docs/graph-api/reference/application/
I tested this via
https://graph.facebook.com/v2.10/<my_app_id>?access_token=<my_access_token>&fields=daily_active_users
And it returned
{
    "daily_active_users": "152",
    "id": "<my_app_id>"
}
It is not the average daily active users, though.
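For completeness, a quick Python sketch of reading that field and turning it into an hourly call budget; the users * 200 figure is the one from the question, not anything documented:

import requests

APP_ID = '<my_app_id>'              # placeholders, as in the URL above
ACCESS_TOKEN = '<my_access_token>'

resp = requests.get(
    f'https://graph.facebook.com/v2.10/{APP_ID}',
    params={'fields': 'daily_active_users', 'access_token': ACCESS_TOKEN},
)
resp.raise_for_status()
dau = int(resp.json()['daily_active_users'])

# Spread the (users * 200) daily allowance over 24 hourly fetch runs.
hourly_budget = (dau * 200) // 24
print('daily_active_users:', dau, 'calls per hour:', hourly_budget)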

Analytics django app with noSQL db and GA

I've started a django project that will include an analytics app. I want that app to use either couchDB or mongoDB for storing data.
The initial idea was (since the client is already using Google Analytics) to grab data from GA once a day/week/month and store it locally as values in the database. This would ultimately build a database of entries, one entry per user per month, with summed values like
{"date": "11.2011", "clicks": 21, "pageviews": 40, "n": n},
For premium users there could be one entry per user per week or even per day.
The question would be:
grab analytics from GA and sum the entries for clicks, visits, etc.,
or
store clicks and whatever values locally and do the sums once a month for display?
Lukasz, unless Google Analytics has really relaxed their privacy levels, you're not going to be able to access user-level records (but check out the answer here: Django saving the whole request for statistics, whats available?)
Right, old question but I've just finished the project so I'll just write what I did.
Since I didn't need concurrency and wanted a faster approach, I found that MongoDB was the better fit.
The final document schema that I've used is
{'date': '11.2009', 'pageviews': 40, 'clicks': 13, 'otherdata': 'that i can use as filters'}
The scope of my local analytics is monthly, so I create one entry in MongoDB per user per month and update it each day, storing only summaries and averages.
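The daily update is just an upsert; a minimal pymongo sketch of it (the database/collection names and the numbers are placeholders):

from pymongo import MongoClient

client = MongoClient()
stats = client['analytics']['monthly_stats']      # hypothetical names

# One document per user per month; add today's GA numbers to the sums.
stats.update_one(
    {'user_id': 42, 'date': '11.2009'},
    {'$inc': {'pageviews': 40, 'clicks': 13},
     '$set': {'otherdata': 'that i can use as filters'}},
    upsert=True,                                   # create the doc if missing
)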
What else? Re: Jamie's answer: the system is using GA events, so I have access to all the data that I need.
Hope someone finds it interesting.
Cheers and thanks for the ideas!