When I go to the AWS Management Console => AWS Billing Dashboard => Bills page, select the desired month (e.g. May 2022), and click Download CSV, the CSV does not download and the following error appears:
(!) The Monthly Usage Report CSV is only available for the months
after opting in via the Preferences page
So I tried to find the Preferences page to opt in but I could not find it.
Where can I find it?
Thank you.
It's in AWS Cost and Usage Reports:
You can reference the links below.
If you get the following error, please pick the most recent month and try again. If you continue to receive this error please reference
the Configure Monthly Reports Lab Guide for steps to enable your
Monthly Report.
https://www.wellarchitectedlabs.com/cost/100_labs/100_4_cost_and_usage_analysis/3_cost_usage_download/
https://wellarchitectedlabs.com/cost/100_labs/100_1_aws_account_setup/9_monthly_report/
Has the Local Currency Preference option been deprecated from AWS console?
My Billing Dashboard shows the cost in USD, I wanted to change this to my local currency INR.
When I searched I found this post - Set Preferred Payment Currency for your AWS Account - but it is a couple of years old, and the options it mentions are no longer found in the console. Is there an alternate way to change the currency in the Billing Dashboard?
Any help will be appreciated. Thank You!
INR does not seem to be a supported currency for AWS Billing:
https://aws.amazon.com/premiumsupport/knowledge-center/supported-aws-currencies/
https://aws.amazon.com/blogs/aws/new-set-preferred-payment-currency-for-your-aws-account/
It seems INR is not yet supported for the Billing Dashboard.
However, you still receive your invoice amount in INR. The setting for this is under:
Billing > Payment Preferences > Edit Payment Preferences
I'm trying to get data from one of the reports available in the Google Play Console, specifically the user_acquisition report. I set up the Data Transfer Service in Google Cloud Platform in order to use the BigQuery API.
When querying that specific report, the results are partial. Some columns match the results I get when downloading the report manually, but other columns are just null even though the downloaded report shows numerical values there.
Another peculiar thing: when specifying a date range for the query (the month of May, for example), the result shows only about a third of the dates in that month, although there should be a row for each day.
When looking at the transfer run history, some runs have completed successfully and some have failed with the error message: Error code 5 : No files found for any reports. Please make sure you selected the correct Google Cloud Storage bucket and Google Play reports exist. But if no files are found, then how am I getting any results at all?
The users of both the GCP and Google Play Console are the owners of the project, so there shouldn't be any issue with the permissions to access the bucket where the reports are stored.
I tried creating another data transfer service to see if it can even find the reports. It did find some of the files but not the one I'm interested in. The transfer run history shows the same error as mentioned above.
Has anyone had some similar problem before and perhaps can offer some sort of solution? Or maybe just has some insights into why this problem is occurring?
I think the issue could be related with the availability of the desired report, since I've found that only some reports are supported by this service:
Detailed reports (Reviews, Financial reports)
Aggregated reports (Statistics, User acquisition)
Could it be that the specific report you want to export is not supported?
If that's not the case, I think you should file a support case sharing the "Resource name" shown in the Transfer details of the failed exports (and of the correct ones, for reference). As an alternative to a support ticket, you can also report a defect against the transfer service in the Public Issue Tracker. The support team can help you review the error message further.
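To quantify the missing-days symptom while you investigate, one option is to compare a generated calendar against the dates actually present in the transferred table. A minimal sketch, assuming the google-cloud-bigquery client library; the table and column names here are hypothetical placeholders, so adjust them to match your transfer's output:

```python
from datetime import date

def missing_dates_sql(table, start, end):
    """Build a query listing the days in [start, end] that have no rows
    in `table`. It compares a generated calendar against the distinct
    dates actually present in the transferred report."""
    return f"""
    SELECT day
    FROM UNNEST(GENERATE_DATE_ARRAY('{start}', '{end}')) AS day
    WHERE day NOT IN (SELECT DISTINCT date FROM `{table}`)
    ORDER BY day
    """

def run(sql):
    # Assumes google-cloud-bigquery is installed and credentials are set up.
    from google.cloud import bigquery
    client = bigquery.Client()
    return [row.day for row in client.query(sql).result()]

sql = missing_dates_sql(
    "myproject.play_transfer.p_UserAcquisition",  # hypothetical table name
    date(2022, 5, 1), date(2022, 5, 31))
```

If the query reports many missing days, that points at incomplete transfer runs rather than at your downstream query.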
For context, we would like to visualize our data in Google Data Studio - this dataset receives more entries each week. I have tried hosting our datasets in Google Drive, but it seems they're too large and this slows down Google Data Studio (the file is only 50 MB, am I doing something wrong?).
I have loaded our data into Google Cloud Storage --> BigQuery, and connected Google Data Studio to my BigQuery table. This has made the Data Studio dashboard much quicker to use!
I'm not sure what the best way is to update our data weekly in Google Cloud/BigQuery. I have found a slow way to do this - uploading the new weekly data to Cloud Storage and then manually appending it to my table in BigQuery - but I'm wondering if there's a better (or at least more automated) way to do this?
I'm open to any suggestions, and if you think that bigquery/google cloud storage is not the answer for me, please let me know!
If I understand your question correctly, you want to automate the query that populates your table, which is connected to Data Studio.
If this is the case, you can use Scheduled Queries in BigQuery. A scheduled query lets you define a query whose results are written to a destination table. In particular, you can specify the repetition schedule (as frequently as every 15 minutes) and the destination write options (destination table; write mode: append or truncate).
In order to use Scheduled Queries, your account must have the right permissions. Have a look at the following documentation to better understand how to use Scheduled Queries [1].
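As a concrete illustration, here is a minimal sketch of creating such a scheduled query with the Python client. It assumes the google-cloud-bigquery-datatransfer package is installed and credentials are configured; the project, dataset, schedule, and source table names are placeholders to adapt:

```python
def scheduled_query_config(query, dest_table, schedule="every mon 09:00"):
    """Assemble the settings for a BigQuery scheduled query that appends
    its results to `dest_table` on the given schedule."""
    return {
        "data_source_id": "scheduled_query",
        "schedule": schedule,
        "params": {
            "query": query,
            "destination_table_name_template": dest_table,
            "write_disposition": "WRITE_APPEND",  # or "WRITE_TRUNCATE"
        },
    }

def create(project_id, dataset_id, config):
    # Assumes google-cloud-bigquery-datatransfer is installed and the
    # caller has permission to manage transfers in this project.
    from google.cloud import bigquery_datatransfer
    client = bigquery_datatransfer.DataTransferServiceClient()
    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id=dataset_id,
        display_name="weekly append",
        data_source_id=config["data_source_id"],
        schedule=config["schedule"],
        params=config["params"],
    )
    return client.create_transfer_config(
        parent=f"projects/{project_id}", transfer_config=transfer_config)

cfg = scheduled_query_config(
    "SELECT * FROM `myproject.staging.weekly_upload`",  # hypothetical source
    "my_table")
```

With WRITE_APPEND, each run adds the new weekly rows to the existing table, which matches the manual append workflow described in the question.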
Also, please note that on the front end, the updated data in the BigQuery table only appears in Data Studio after a refresh (click the refresh button in Data Studio). To refresh the visualization automatically, you can use the following plugin [2] or automate the click on the refresh button through browser console commands.
[1] https://cloud.google.com/bigquery/docs/scheduling-queries
[2] https://chrome.google.com/webstore/detail/data-studio-auto-refresh/inkgahcdacjcejipadnndepfllmbgoag?hl=en
I am looking to pull a monthly performance report of a campaign running in DoubleClick Bid Manager. I'm pulling this report via the DoubleClick Bid Manager API (https://developers.google.com/bid-manager/v1/queries/createquery).
I just fill in the JSON request body with the account and the various fields; when I click Execute, I can download the report from the Reports tab of Bid Manager. I then upload it to S3 manually so that I can later load it into Redshift.
I was wondering if there was a way I can do this programmatically, instead of uploading the report manually every time.
If there is a question already on this, please point me in the right direction.
Thanks in advance.
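One way to automate the upload step is a small script that fetches the finished report file and pushes it to S3; Redshift can then COPY from that key. A rough sketch, assuming the requests and boto3 libraries are installed, AWS credentials are configured, and the report URL has been obtained from the Bid Manager query's metadata (the bucket/key naming scheme below is just an example):

```python
from datetime import date

def report_s3_key(prefix, report_id, day):
    """Build a deterministic S3 key for a downloaded report
    (this naming scheme is only an example)."""
    return f"{prefix}/dbm/{report_id}/{day:%Y/%m}/report_{day:%Y%m%d}.csv"

def download_and_upload(report_url, bucket, key):
    # Assumes `requests` and `boto3` are installed. `report_url` is the
    # path to the latest generated report file, taken from the query's
    # metadata in the Bid Manager API response.
    import requests, boto3
    body = requests.get(report_url, timeout=120).content
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body)

key = report_s3_key("raw", "12345", date(2022, 5, 31))
```

Scheduling this (cron, Lambda, etc.) removes the manual download/upload cycle entirely.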
I am trying to use the Google Admin Reports API: Users Usage Report to pull emails received/sent per user per day in our org's Google Apps domain.
When I used the Google APIs Explorer to pull my own stats for a particular day and compared them with what actually happened, the numbers were far off.
For example, on Sunday, 7th Dec 2014, I only sent out one email. But the stats shows there were 4 emails sent out by me on that day.
Any assistance would be appreciated
Cheers,
You should get the same results as searching in Gmail:
in:sent from:me after:2014/12/07 before:2014/12/08
The missing bit is the time zone the server uses, which in my research is always Pacific Standard Time.
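Since Gmail's after:/before: operators take bare dates and the upper bound is exclusive, a small helper (a sketch using only the standard library) can build that search string for any reporting day:

```python
from datetime import date, timedelta

def sent_on_day_query(day):
    """Build the Gmail search matching one report day.
    after:/before: take bare dates and before: is exclusive, so the
    window is [day, day + 1). The usage reports are bucketed in
    Pacific time, so pass the date as it falls in US/Pacific."""
    nxt = day + timedelta(days=1)
    return f"in:sent from:me after:{day:%Y/%m/%d} before:{nxt:%Y/%m/%d}"

q = sent_on_day_query(date(2014, 12, 7))  # the day from the question
```

Comparing this search's results against the report for the same day isolates whether the gap comes from the time zone or from the auto-generated messages listed below.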
Did you:
Send out any calendar invitations that day? (1 email per attendee)
Share any Google Drive files/folders that day? (1 email per file shared)
Send mail from a Google Group?
There are likely other actions you may have performed in other Google Apps that caused emails to go out in your name and count against your quota without necessarily showing in your Sent folder.
If you'd like for these messages to appear in your Sent folder, turn on Comprehensive Storage.