Auto-refresh a Tableau Online dashboard using an "Auto refresh" Chrome extension

I've created a view in Tableau Online using a live connection. I want the page to refresh automatically every 10 minutes. Currently this can only be done by manually pressing the refresh button in the dashboard/view in Tableau Online; refreshing the browser page won't refresh the dashboard. I saw a Tableau discussion with this tip: "Tip: To continually refresh a view, in the <head> section of the web page, add <meta http-equiv="refresh" content="#">, where # is the number of seconds between refreshes." How can I do this? Can this be done in Tableau Online?
As a second option, I could add parameters to the dashboard URL to fix this issue. I saw this in this discussion: https://community.tableau.com/thread/289924 At the least, the part ":refresh=yes" has to be added to the URL. Since I'm completely new to this area, I was not able to get this working. Where and how do I need to add this to the URL so that it works permanently?
I'm also open to other suggestions.

There are a couple of ways you could approach this; which one you choose will depend on your situation, scale, and available resources.
Option 1: Embed with meta tag
This is the first option you were describing. To do this, you will need to embed your dashboard into your own custom, separate webpage. You can get the embed code from the Share button on any dashboard and can customize it using parameters and the JavaScript Embedding API. The meta tag you mentioned would then go in the <head> of the custom webpage where you are embedding the dashboard. So it would look something like this:
<html>
  <head>
    <!-- Reload the page every 600 seconds (10 minutes) -->
    <meta http-equiv="refresh" content="600">
  </head>
  <body>
    <script>
      // Your embed code from the dashboard here
    </script>
  </body>
</html>
You would also want to make sure to include the :refresh=yes parameter you mentioned, so you always get the latest data; a fuller sketch follows below.
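Putting it all together, here is a minimal sketch assuming the older Tableau JavaScript API (v2); the site, workbook, and view names in the URLs are placeholders for your own:
<html>
  <head>
    <!-- Reload the whole page every 600 seconds (10 minutes) -->
    <meta http-equiv="refresh" content="600">
    <script src="https://online.tableau.com/javascripts/api/tableau-2.min.js"></script>
  </head>
  <body>
    <div id="vizContainer"></div>
    <script>
      // :refresh=yes asks Tableau to bypass its cache and fetch fresh data
      var url = "https://online.tableau.com/t/yoursite/views/YourWorkbook/YourDashboard?:refresh=yes";
      new tableau.Viz(document.getElementById("vizContainer"), url, { hideTabs: true });
    </script>
  </body>
</html>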
Pros: Anyone can open the page and have an auto-refreshing dashboard without installing anything.
Cons: You will need some form of web server to host your custom page. Requires some coding. Hard to scale to many dashboards.
Option 2: Chrome Extension
This is the second option you were describing. In this case, a Chrome extension in the browser refreshes the page for you, which means you don't need your own separate webpage. However, it will only work in the browser you install and set up the extension on. There are a couple of auto-refresh extensions in the Chrome Web Store to choose from. You would need to configure one to refresh the page and, again, make sure to include the :refresh parameter in the URL, as in the example below.
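For reference, the URL you point the extension at might look something like this (again, the site, workbook, and view names are placeholders):
https://online.tableau.com/t/yoursite/views/YourWorkbook/YourDashboard?:refresh=yes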
Pros: Don't need a separate webserver. No coding. Easy to scale for multiple dashboards.
Cons: Only works for the browser that the chrome extension is installed on.
Option 3: Dashboard extension
One option you didn't mention, and the one I think is best, is a Dashboard Extension. Dashboard extensions are web apps that you can bring directly into the dashboard. We currently have an Auto-Refresh extension in the gallery built for just this purpose. Once you've downloaded it, simply open your dashboard, drag in a new Extension object, select the downloaded file, and configure it for 10 minutes.
Pros: Don't need a separate webserver. No coding. Easy to scale for multiple dashboards. Anyone can open the dashboard and have it auto-refreshing without installing anything.
Cons: The Auto-Refresh extension only works with Tableau 2019.4+.
Hope this helps!

Related

Cookieless analytics headache. How do I get GDPR-compliant analytics to work?

Keep in mind, this is only the second site I've ever built. I'm not a programmer; I'm just good at googling things and following instructions.
The gist of it.
I don't want to have to ask for cookie consent just to know how many visitors my site has. I'm not working for a huge corporate entity that needs the most precise data in the world. All I need is approximate traffic data, and maybe to know how many people pushed the "contact us" buttons.
There has to be a way to do this without cookies, but my research has led to two dead ends:
Following "Setup Google Analytics without cookies", I tried this code:
<!-- Google Analytics -->
<script>
window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
ga('create', 'G-C88PM0YJP2', {
'storage': 'none'
});
ga('send', 'pageview');
</script>
<script async src='https://www.google-analytics.com/analytics.js'></script>
<!-- End Google Analytics -->
That got me nowhere. My dashboard registered no data. The Chrome analytics debugger extension showed no data. (Just for reference, Google Analytics with the standard script, which uses cookies, did work when checked with the debugger.)
Fed up with that, I tried Tinyanalytics in their cookieless mode.
<!-- Pixel Code for http://app.tinyanalytics.io/ -->
<script defer src="http://app.tinyanalytics.io/pixel/oaFj0HuEqsS9uMW9"></script>
<!-- END Pixel Code -->
But when I go to look at the data, I'm again getting: "No data available. Did you install the Tracking Code?"
I did install it and have no idea why it isn't working.
Their "Help" section says "step 6. Click on the Verify installation tab and then click the button. If you see an alert box saying that the pixel is installed, then you finished the installation process."
Except I can't find a "Verify installation Tab" anywhere in their interface.
Can you tell me why my two attempts have failed and help me get something like Tinyanalytics to work? Or can you point me towards better alternatives for free cookieless analytics?
Please note, I'm aware that my host Netlify.com offers an analytics package for $10/month. I consider that a last resort. But my research suggests third-party cookies are getting deprecated anyway. Is Google Analytics going cookieless soon? Should I just pay Netlify.com and wait for that?
I wish I could find more alternatives on my own, but if you type the word "analytics" into any search engine, all you get is Google Analytics because they are so huge.
EDIT: I seem to have unearthed some error messages from my website
Mixed Content: The page at 'https://teatrulcomoara.ro/' was loaded over HTTPS, but requested an insecure script 'http://app.tinyanalytics.io/pixel/oaFj0HuEqsS9uMW9'. This request has been blocked; the content must be served over HTTPS.

analytics.js:1 Failed to load resource: net::ERR_BLOCKED_BY_CLIENT
But this also doesn't check out for me. Tinyanalytics asks whether my URL is https or http, and I selected https, so why is Tinyanalytics giving me an http tracking code?
I've since added the "s" in the script URL. That makes the error go away, but I still don't have analytics:
<!-- Pixel Code for http://app.tinyanalytics.io/ -->
<script defer src="https://app.tinyanalytics.io/pixel/oaFj0HuEqsS9uMW9"></script>
<!-- END Pixel Code -->
Thank you in advance for your help.
Here's a link to my site so you can view source in case that helps.

Google Cloud Storage favicon for PDF files

I have some PDF files hosted on Google Cloud Storage. These files are public and I open them using their public link.
Is there a way I can customize the favicon shown by the browser when I view those PDFs?
Can I put a custom Favicon for my bucket?
See the screenshot for the icon I mean: it's the icon shown in the browser tab (this is Chrome):
Currently, this is not possible. When this kind of feature is not available, the best option is to open a Feature Request (FR) in the Public Issue Tracker of GCP. Before doing so, please make sure that there are no other existing feature requests similar to yours.
Here's an existing issue tracker.
I would recommend you "star" it to ensure that you receive updates about it. You can also adjust notification settings by clicking the gear icon in the top right corner and selecting Settings.
The more "stars" an issue has, the more likely the feature request is to be implemented. Filing this kind of request also gives the GCP engineering team better visibility into the real, current needs of users.

How to download multiple Bokeh images as a single report

My application is hosted on Django, and one of the HTML pages shows multiple bar graphs drawn with Bokeh. I know each graph can be downloaded separately using the SaveTool icon that comes with Bokeh.
Now my requirement is to have an export button on the page; when I click it, all the images should be downloaded as a single PDF file, or whatever other format is easiest to implement.
Please guide me on how I can achieve this.
Thanks in advance.
If this were standalone Bokeh content (not a Bokeh server application), you could use the export_png function. However, it sounds like it is a Bokeh server application, in which case there is nothing built-in for this. It looks like there is a JavaScript API for screen capture, so you could try using that API in a CustomJS callback for a Button. Note that, for security reasons, that API will make users provide active consent every time before allowing a screenshot to be taken.
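As a rough, untested sketch (not a built-in Bokeh feature), the JavaScript inside that CustomJS callback could use the browser's Screen Capture API along these lines, saving the captured frame as a PNG:
async function capturePage() {
  // Prompts the user for consent, then captures the chosen screen/window
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();
  // Draw the current frame onto a canvas and download it as a PNG
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext("2d").drawImage(video, 0, 0);
  stream.getTracks().forEach((t) => t.stop());
  const link = document.createElement("a");
  link.download = "report.png";
  link.href = canvas.toDataURL("image/png");
  link.click();
}
capturePage();
Combining several captures into a single PDF would need an additional client-side library on top of this; that part is not something Bokeh provides.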

To refresh a section of page

In Joomla 2.5, how can I refresh part of a page's content via a hyperlink on the same page, without refreshing the entire page?
You need to use Ajax for this; there are plenty of useful how-tos out there on getting it done.
Also try this tutorial:
http://www.w3schools.com/ajax/default.asp
You might want to look at jQuery if you are not familiar with it already; a sketch follows below.
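For example, with jQuery's load() you can replace a single container when a link on the page is clicked. A minimal sketch, assuming jQuery is loaded on the page, a container with id "content", and Joomla's standard tmpl=component URL parameter (which returns only the component output, without the surrounding template); the article URL here is a placeholder:
<a href="#" id="refresh-link">Refresh this section</a>
<div id="content"><!-- section to refresh --></div>
<script>
jQuery(document).ready(function ($) {
  $("#refresh-link").on("click", function (e) {
    e.preventDefault();
    // Fetch only the component output of the page and swap it into the div
    $("#content").load("index.php?option=com_content&view=article&id=1&tmpl=component");
  });
});
</script>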

Is QtWebKit needed to fetch data from websites that require login?

As the title implies,
I need to fetch data from a certain website that requires a login.
The login procedure might need cookies or sessions.
Do I need QtWebKit, or can I get away with just QNetworkAccessManager?
I have no experience with either, and will start learning as I go.
So please save me a bit of the time it takes to compare the two ^^
Thank you in advance,
Evan
Edit: Having read some related answers, I'll add some clarifications:
The website in question does not have an API, so I will need to scrape the data from web elements myself.
Can I do that with just QNetworkAccessManager?
No, in most cases you don't need a fully simulated web browser; just performing the same web requests a web browser would make is enough.
Try recording the web requests in your browser, using a plugin like "Live HTTP Headers" or "Firebug" in Firefox. I think Chrome provides a similar tool out of the box (the Network tab in its developer tools). These tools record the GET and POST requests made by the website when you submit a form on the webpage.
Another option is to inspect the HTML code of the login page. Find the <form> tag and its fields, then put them together in a GET/POST request in your application to simulate the same form submission, as illustrated below.
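For illustration, the login form you find might look something like this (the field names here are hypothetical and vary from site to site):
<form action="/login" method="post">
  <!-- Hidden anti-CSRF token: request the login page first to obtain a fresh value -->
  <input type="hidden" name="csrf_token" value="...">
  <input type="text" name="username">
  <input type="password" name="password">
  <input type="submit" value="Log in">
</form>
Your application would then POST a body like csrf_token=...&username=...&password=... to /login, with Content-Type application/x-www-form-urlencoded.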
Remember that some pages use randomized "tokens" in their forms, and some set the tokens as cookies. In such cases, you need to request the login page itself in your application first (before sending the filled-in form). Both QWebView and QNetworkAccessManager have cookie support.
To sum things up, I think QWebView provides a far more elegant way to simulate user interaction with a web page. The manual way is, however, more "lightweight", as you don't need WebKit, and your application might be faster (because only the HTML page is loaded, without any linked resources like images, CSS, or JavaScript files).
QWebView, as the class name states, is a view, so it displays something (in this case, web pages). If you don't need to display the loaded page, then you don't need a view. QNetworkAccessManager can do the work, but you need some knowledge of the HTTP protocol, and also some knowledge of the target site: how it handles logins, what type of request you have to send to log in, etc.