Expiration date in Amazon S3: How to set a dynamic trigger? - amazon-web-services

I've managed to create a bucket in my Amazon S3 account, and I've added a couple of files to it.
I've downloaded CloudBerry Explorer (the freeware edition), and I've uploaded a file and set it as private with an expiration date.
This is what my link looks like:
http://dja61b1p3y3bp.cloudfront.net/taller_parte1.flv?AWSAccessKeyId=AKIAIEZ23XILJFNS3ZVA&Expires=1379991642&Signature=GrBLzn13nkm8NiU6JXBj0jC0i%2F8%3D
The thing is that this is a "static" expiration date: the link expires one week from now.
I mean, if I run an online course and receive different students every week, and I want to use that file in the course, I would have to go and update that expiration date every single week.
Is there any way to configure the file so that it expires one week after each individual user clicks the link?
How may I do that?
Thanks for your insight!
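One way to get a per-user window is to generate the signed link on your server at the moment each student clicks it, so the one-week countdown starts from that request rather than from when you upload the file. Below is a minimal sketch assuming boto3 and an S3 presigned URL (rather than the CloudFront-signed link above); the bucket and object names are placeholders.

import boto3

s3 = boto3.client("s3")

def one_week_link(bucket: str, key: str) -> str:
    # The URL is signed when this function runs, so each caller gets a
    # link whose one-week window starts at their own request.
    # 604800 seconds = 7 days, the maximum for SigV4 presigned URLs.
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=604800,
    )

print(one_week_link("my-course-bucket", "taller_parte1.flv"))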

Related

Older Power-BI pbix using a SharePoint-Online list that has had new fields added

I have a Power-BI dashboard that was created some time ago. Its data source is a SharePoint-Online list. Since the Power-BI dashboard was created, several new fields have been added to the SharePoint list. Now I am being asked to add a new page to the dashboard that reports on those new fields. However, I have not found a way to get the existing Power-BI list/dataset to show the new fields.
Refreshing the data does refresh the values, but refresh does not add the new fields.
I've spent the last 4 hours looking on the internet for a solution. The only thing I have been able to do so far is to attach the list again with a different name - the new fields DO show up when I do this. (I can't just replace the older Power-BI list/dataset because several calculated columns and measures have been added to it.)
I can work with this and create the report, but is this the only way? It doesn’t seem like it should be.
Any help would be appreciated! Thank you!
(I'm using Power BI April 2021 and SharePoint Online)
So, it looks like there's no good answer to this issue. I found that adding another instance of the referenced SharePoint list, which included the new columns, did work (however inelegant). That seems to be the best direct answer for times when the older pbix file must continue to be used.
What I ended up doing, though, was to create a new, separate pbix file that included the latest version of the SharePoint list. This was the best solution for my organization since it will allow us to be more focused on the specific manufacturing processes involved.
Thanks to @Jon and @Alejandro for their efforts to help!
If you have access to Power Automate, you could create a flow that refreshes the dataset on a schedule (say, once or twice a day) so that it picks up the newly created items.
Otherwise, if you are working with the service version of Power BI, you can schedule a refresh of the dataset directly from the workspace, in the dataset's settings. You would need a gateway set up for that, which can be in personal mode or not.
Also, if you want to update the data in the service version, you can do it manually in the workspace.
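If you ever need to trigger the refresh programmatically rather than through Power Automate or the workspace settings, the Power BI REST API also exposes a refresh endpoint. A minimal sketch in Python, assuming you already have an Azure AD access token with dataset permissions and the dataset ID (both placeholders here):

import requests

# Placeholders: acquiring the Azure AD token and looking up the dataset ID
# are assumed to be handled elsewhere.
access_token = "<azure-ad-access-token>"
dataset_id = "<dataset-id>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/refreshes",
    headers={"Authorization": f"Bearer {access_token}"},
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued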

Dataset shows only 5 event tables after re-linking Firebase with another Google Analytics account

Recently unlinked and re-linked a Firebase project with a different Google Analytics account.
The BigQuery integration configured to export GA data created the new dataset, and data started populating into it.
The old dataset, corresponding to the unlinked "default" GA account and containing ~2 years of data, is still accessible in the BigQuery UI; however, only the 5 most recent event_ tables are visible in the dataset (5 days' worth of event data).
Is it possible to extract historical data from the old, unlinked dataset?
What I would suggest is to run some queries to further validate the data that you have within your BigQuery dataset.
In this case, I would start by getting the dates for each table to see how many days of data the dataset contains.
SELECT event_date
FROM `firebase-public-project.analytics_153293282.events_*`
GROUP BY event_date ORDER BY event_date
EDIT
A better way to do this, and to get all the tables within the dataset, is to use the bq command-line tool; see the reference here.
bq ls firebase-public-project:analytics_153293282
You'll get a listing of every table in the dataset.
You could also do a COUNT(event_date), so you can see how many records you have per day, and compare this to what you have or can see in your Firebase project.
SELECT event_date, COUNT(event_date) AS events
FROM `firebase-public-project.analytics_153293282.events_*` GROUP BY event_date ORDER BY event_date
In case there is data missing, you could use table decorators to try to recover that data; see the example here.
About the table expiration date, see this: in short, a default expiration time can be set at the dataset level and applies to new tables (existing tables require a manual update of their expiration time, one by one), and an expiration time can also be set when a table is created. To check whether the expiration time was changed, you could look in your logs for protoPayload.methodName="tableservice.update" and see whether an expireTime was set, as follows:
tableUpdateRequest: {
  resource: {
    expireTime: "2020-12-31T00:00:00Z"
    ...
  }
}
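If you would rather check the remaining tables' expiration times directly instead of digging through the logs, here is a small sketch with the BigQuery Python client (the dataset ID is taken from the question; the project is assumed to be the client's default):

from google.cloud import bigquery

client = bigquery.Client()
dataset_id = "analytics_153293282"  # dataset from the question

for item in client.list_tables(dataset_id):
    table = client.get_table(item.reference)
    # table.expires is None when the table has no expiration set
    print(table.table_id, table.expires)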
Besides this, if you have a GCP support plan, you could reach out to them for further assistance on what could have happened with the tables in that dataset. Otherwise, you could open an issue in the public issue tracker. Keep in mind that Firebase doesn't delete your data when you unlink a Firebase project from BigQuery, so in theory the data should still be there.

Error in Google Play transfer frequency - Google BigQuery

I want to set up a weekly Google Play transfer, but it cannot be saved.
At first, I set up a daily Play transfer job. It worked. Then I tried to change the transfer frequency to weekly - every Monday 7:30 - and got an error:
"This transfer config could not be saved. Please try again.
Invalid schedule [every mon 7:30]. Schedule has to be consistent with CustomScheduleGranularity [daily: true ].
I think this document shows that the transfer frequency can be changed:
https://cloud.google.com/bigquery-transfer/docs/play-transfer
Can Google Play transfer be set to weekly?
By default, the transfer is created as daily. From the same docs:
Daily, at the time the transfer is first created (default)
Try to create a brand new weekly transfer. If it works, I would think it is a web UI bug. Here are two other options for changing your existing transfer:
BigQuery command-line tool: bq update --transfer_config
A very limited number of options are available, and schedule is not available for update.
BigQuery Data Transfer API: transferConfigs.patch. Most transfer options are updatable. An easy way to try it is with the API Explorer. See the details of the transferConfig object; the schedule field needs to be defined:
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
NOTE: the granularity should be at least 8 hours, or less frequent.
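For reference, here is a sketch of the same patch with the Python client for the BigQuery Data Transfer API, assuming you know the transfer config's resource name (placeholder below) and only want to update the schedule field. Whether the backend accepts a weekly schedule for Play transfers, given the CustomScheduleGranularity error above, is a separate question.

from google.cloud import bigquery_datatransfer
from google.protobuf import field_mask_pb2

client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder resource name of the existing transfer config
transfer_config = bigquery_datatransfer.TransferConfig(
    name="projects/1234/locations/us/transferConfigs/abcd",
    schedule="every monday 07:30",
)

updated = client.update_transfer_config(
    transfer_config=transfer_config,
    update_mask=field_mask_pb2.FieldMask(paths=["schedule"]),
)
print(updated.schedule)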

How to get expired table data in bigquery, If the expired time is more than two days?

I have a process in which I get the table data in BigQuery on a daily basis. I need some old table data, but unfortunately the tables have expired, and they expired more than two days ago. I know we can get table data back if it was deleted less than two days ago, but is it possible in the case of an expired table when more than 2 days have passed?
I tried using a timestamp from 2 days back and tried to get the data with the bq tool, but I need data that was deleted more than 2 days ago.
GCP Support here!
Actually, if you read through the SO question linked by @niczky12, and as stated in the documentation:
It's possible to restore a table within 2 days of deletion. By leveraging snapshot decorator functionality, you may be able to reference a table prior to the deletion event and then copy it. Note the following:
You cannot reference a deleted table if you have already created a new table with the same name in the same dataset.
You cannot reference a deleted table if you deleted the dataset that housed the table, and you have already created a new dataset with the same name.
At this point, unfortunately it is impossible to restore the deleted data.
BigQuery tables don't necessarily expire in 2 days. You can set the expiration to whatever you like:
https://cloud.google.com/bigquery/docs/managing-tables#updating_a_tables_expiration_time
Once they have expired, there's no way to retrieve the table, unless you have snapshots in place. In that case, you can restore a snapshot and use that to get the data you want. See this SO question on how to do that:
How can I undelete a BigQuery table?
To add for future searchers here: I was able to follow the explanation below on Medium and restore data that was still there 7 days ago.
Actually, the Cloud Shell in the UI gave back the maximum time to go back when I tried a date that was too far out. The maximum time it gave back was 7 days, in epoch milliseconds. Just type that into the converter below, add 1 or 2 hours, and you should be good. Don't take the exact copy of what the console provides, as that is already outdated by the time it's printed.
https://medium.com/@dhafnar/how-to-instantly-recover-a-table-in-google-bigquery-544a9b7e7a8d
https://www.epochconverter.com/
And make sure to set future tables to never expire (or to a date you know)! This can be found in the table details, and also at the dataset level, in the console.cloud environment for BigQuery.
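As a small sketch of the arithmetic described above, and of the snapshot-decorator restore the linked article walks through (table names are placeholders):

from datetime import datetime, timedelta, timezone

# Go back a bit less than 7 days so the timestamp stays inside the window,
# adding the suggested 1-2 hours of slack.
snapshot = datetime.now(timezone.utc) - timedelta(days=7) + timedelta(hours=2)
epoch_ms = int(snapshot.timestamp() * 1000)

# The restore itself can use a bq snapshot decorator, e.g.:
print(f"bq cp mydataset.mytable@{epoch_ms} mydataset.mytable_restored")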
(as of 2022 at least) In general, you can recover data from BQ tables for 7 days via time travel. See the GCP doc:
https://cloud.google.com/bigquery/docs/time-travel
and the related:
How can I undelete a BigQuery table?
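Here is a minimal sketch of reading a table's earlier state within that 7-day window using the BigQuery Python client (project, dataset and table names are placeholders):

from google.cloud import bigquery

client = bigquery.Client()
query = """
SELECT *
FROM `myproject.mydataset.mytable`
  FOR SYSTEM_TIME AS OF TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 6 DAY)
"""
for row in client.query(query).result():
    print(dict(row))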

How to Change Access log From Hourly to Daily or Weekly?

I tried to follow the instructions in https://cloud.google.com/storage/docs/access-logs to create access logs for my bucket in Google Cloud Platform. The problem is that the access logs are created hourly, which means that in one day there will be 24 files, and in one week there will be 24 * 7 files. That is not convenient to manage.
Is it possible to create the access logs daily or weekly? Or to auto-merge all the hourly logs?
Thanks
It is not possible to change the frequency with which these logs are created. However, you could create a Cloud Function that is triggered every day or week and merges all the logs into one CSV. Here I found a blog that explains how to do it and may be useful for you.
Also, I have created a feature request, and you can follow its progress at this link.
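As a rough sketch of the merge step such a Cloud Function could perform, assuming the google-cloud-storage Python client and placeholder bucket/prefix names (compose() accepts at most 32 source objects per call, and each hourly usage log carries its own CSV header row, so a real merge may need extra handling):

from google.cloud import storage

client = storage.Client()
log_bucket = client.bucket("my-log-bucket")  # placeholder log bucket

# Hourly usage logs for one day share a date-based name prefix
day_prefix = "my-source-bucket_usage_2021_06_01"
sources = list(client.list_blobs("my-log-bucket", prefix=day_prefix))

merged = log_bucket.blob(f"merged/{day_prefix}.csv")
merged.compose(sources[:32])  # compose is limited to 32 components per call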