Automatically creating & deploying a separate dashboard for each customer - shiny

I am working on a use case where the product is subscription-based and each new customer who signs in will see a unique dashboard based on the algorithm output from the product. I have automated the process up to the point of generating different results for each customer, but my question is: which tool can give me a unique dashboard for each customer on which I can plot the outputs I have generated? This will be the last part of my automated pipeline.
I would like all of these dashboards on one server, because if this product scales to 1000 customers, having 1000 servers running would be infeasible.
I am used to working in R Shiny, but I couldn't find any resources on doing this easily, i.e. deploying multiple apps on the same Shiny Server and just getting a separate URL for each. Which other tools can I explore? Or can I implement the above process in Shiny itself?
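To make the idea concrete, here is a minimal sketch of the single-app approach I have in mind: one Shiny app that reads a customer identifier from the URL query string and loads that customer's precomputed output. The customer parameter and the results/<id>.rds path are just placeholders for wherever my pipeline writes its results:

```r
library(shiny)

ui <- fluidPage(
  titlePanel(textOutput("title")),
  plotOutput("results_plot")
)

server <- function(input, output, session) {
  # Pull the customer id out of the query string, e.g. /myapp/?customer=acme
  customer_id <- reactive({
    query <- parseQueryString(session$clientData$url_search)
    if (is.null(query$customer)) "unknown" else query$customer
  })

  # Load that customer's precomputed algorithm output
  # (results/<id>.rds is a placeholder for wherever the pipeline writes it)
  customer_data <- reactive({
    readRDS(file.path("results", paste0(customer_id(), ".rds")))
  })

  output$title <- renderText(paste("Dashboard for", customer_id()))
  output$results_plot <- renderPlot(plot(customer_data()))
}

shinyApp(ui, server)
```

With something like this every customer's URL would differ only by the query string, so one Shiny Server could serve them all - but I'm not sure whether this is the recommended pattern, or whether deploying separate apps with separate URLs would scale better.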
Thanks in advance!

Related

Using Power BI Desktop to connect to Business Central

I work for a small team of developers using Power BI Desktop to create reports for different customers. An increasing number of them want to pull data from Business Central. I'm also finding they don't want to create an account for me on their tenant, but are happy to provide 'guest' access. My question is: when starting a new project for a customer, is my only option to remove all credentials for the Business Central connector and start again with the appropriate ones? Also, if I want to publish to the customer's tenant I can't seem to sign in as myself, so how do I do this? Uploading files directly is possible, but we like to use Power BI datasets and then point other reports at these datasets. Trying to upload these files just results in an error.

Need help building an uptime dashboard for a distributed system

I have a product for which I would like to create a dashboard that shows its availability/uptime over time and displays any outages. Specifically, I am looking for the ability to report historical information on service uptime and to provide details on any service outages.
The product runs on a fleet of Linux servers and connects to a DB running on a separate instance; we also have some dedicated instances that run nightly batch jobs. The system also relies on some external services to provide additional functionality for select customers. There is also a Redis cache for caching data for multiple customers.
We replicate all of the above (application servers, DB, job servers, Redis cache, etc.) into dedicated clusters for large customers. Small customers are put on one of the shared clusters to keep costs low.
Currently we run health checks on the application servers only and publish that information on a simple HTML page. This is the go-to page for end users/customers and support teams.
Since the product is composed of multiple systems/services, our current HTML page often says that the system is up and running fine while it may actually be experiencing issues with some of its components or external services.
The current health check uses a simple HTTP request and looks for a 200 status code; it runs every minute and we plot the data in a simple chart covering the last 30 days. We also show a list of outages with timestamps and additional static information that is added manually.
We would like to build a more robust solution that monitors much more than the HTTP port and gives us more detail, such as which part of the system is having issues, how those issues are impacting the system, and which customers are impacted.
I'd appreciate any guidance or help. We prefer to build the solution using open-source tools since we don't have much of a budget. The goal is to improve things for my team members, who are already overloaded.
I'm not sure whether this will be overkill for your setup, given that I don't know your product, but have a look at the ELK Stack and see if you can use some of its components, or at least some ideas from there:
What is the ELK Stack?
The Complete Guide to the ELK Stack
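Whichever stack you end up with, a cheap first step could be to probe each component separately rather than only the front-door URL, so an outage can be attributed to a specific component and cluster. Here is a rough sketch in R (the endpoint names are made up for illustration; any scripting language would do the same job):

```r
# Rough sketch: probe each component separately so an outage can be attributed
# to a component rather than only the front-door URL.
library(httr)

# Hypothetical component endpoints; replace with your real app servers,
# job runners, external services, etc.
components <- c(
  app_server   = "https://app.example.com/health",
  batch_jobs   = "https://jobs.example.com/health",
  external_api = "https://partner.example.com/status"
)

check_component <- function(name, url) {
  resp <- tryCatch(GET(url, timeout(5)), error = function(e) NULL)
  data.frame(
    component  = name,
    up         = !is.null(resp) && status_code(resp) == 200,
    checked_at = Sys.time()
  )
}

# One row per component per run; written out for the status page to consume.
results <- do.call(rbind, Map(check_component, names(components), components))
write.csv(results, "healthcheck_log.csv", row.names = FALSE)
```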

PowerBI Web Embed Has Mixed Refresh Data

My organization recently published a Power BI dashboard via Publish to Web and an embed code. We have configured a daily refresh via a gateway running on a virtual machine that is always on. The data refreshes automatically every day. This is all successful and works well.
The issue we are running into is that the data seems to update incrementally on the embedded version. For example, data in one tab will update to the most current data, while changing a slicer selection will continue to display the previous day's data.
This is incredibly confusing, especially as it's a public-facing dashboard.
Is there a way to resolve this?
Thanks!

Predictive Personalisation between different multi-tenant Sitecore websites

We have different website instances, some of them with up to 4 multisites each. They each have their own xDB backend.
We have the following requirements:
We should be able to track the user across the sites. When a user visits the dental site and then comes to the company's main site, show the carousel banner with dental ads.
When a user fills in a form or downloads certain PDF documents, the accumulated goal value should increase. For example, if a user visits the dental site and fills in a form (worth 10 points), then goes to a different site and downloads a PDF (worth 5 points), the total accumulated goal value should be 15.
We should be able to view the exact same user profile details on each instance.
I understand we could use Federated Experience Manager, but all of the above are Sitecore instances.
Could you help us understand how the above can be achieved? For example:
Do we need to share the same xDB and analytics databases for ALL instances?
Do all sites need to be on a SINGLE instance to achieve the above?
Is it possible to share the goals, personas, and segments setup between different instances?
Any other recommendations?
Finally, how does Sitecore work out the predictive personalization, i.e. does it read xDB, analytics, or something else?
Thanks.
Do we need to share the same xDB and analytics databases for ALL instances? Yes.
Do all sites need to be on a SINGLE instance to achieve the above? No, all sites can exist on separate instances; that is not an issue here.
Is it possible to share the goals, personas, and segments setup between different instances? Yes: share the xDB among the instances, set up database replication on the CD servers, and make sure the machine keys are the same. Please note that there are a lot of other things to consider before coming to a conclusion on this point.
More reference links:
http://digital-learnings.blogspot.in/2015/12/sitecore-multi-site-or-multi-instance.html
https://doc.sitecore.net/sitecore_experience_platform/developing/xdb_overview/scalability_options
http://www.nonlinearcreations.com/Digital/how-we-think/whitepapers/Whitepaper-Planning-your-Sitecore-xDB-infrastructure.aspx

WSO2 BAM for aggregating events?

I have a real time web analytics problem to address, and I'm wondering if some of the WSO2 products might be an appropriate solution.
An ecommerce website shows pages of products to a browser user, and the website vendor wants to collect details of which products were viewed in a list, which products were selected from the list for more info, which products were put into the basket, and which products were actually purchased - all in real time. I can use web page tagging to generate logging events for the four states (i.e. in list, view detail, in basket, purchased). The website vendor wants to see results summarized by product and by rolling time band (e.g. last hour, last 6 hours, last 24 hours, last 72 hours) for the four product states.
As a complete WSO2 newbie I'm hoping somebody can help with some pointers on how to address this. I've been reading about the BAM module for capturing events. Is that a good place to start? Also, can anybody suggest a good in-memory data store to hold the event data aggregated by event type and rolling time period?
TIA
Yes. BAM is more of a batch-processing, monitoring, and analytics engine: with it you can capture data, process it, and then present it. From an architectural point of view, the product-state changes made by the browser user would be captured by the web server and published to the BAM server.
A good place to start is learning about data publishing. Once you define the data to be published (in BAM this is known as a stream definition), you can write a Hive script to process and present it. You can pump all the data into BAM, use a Hive script to process and store it in the form you want, and later retrieve and present it.
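To illustrate the shape of the roll-up the vendor is asking for (counts by product and state over rolling time bands), here is a small sketch in R on a made-up events table; in BAM the equivalent logic would be expressed in the Hive script rather than in R:

```r
# Illustration only: counts by product and state over rolling time bands,
# computed on a made-up in-memory events table.
events <- data.frame(
  product = c("p1", "p1", "p2", "p1", "p2"),
  state   = c("in_list", "view_detail", "in_list", "in_basket", "purchased"),
  ts      = Sys.time() - c(1800, 7200, 3600 * 7, 3600 * 25, 3600 * 70)
)

# Rolling time bands, in hours
windows <- c(last_1h = 1, last_6h = 6, last_24h = 24, last_72h = 72)

summarise_window <- function(hours) {
  recent <- events[events$ts >= Sys.time() - hours * 3600, ]
  # Count of events per product and state within the band
  aggregate(ts ~ product + state, data = recent, FUN = length)
}

# One summary table per rolling band
summaries <- lapply(windows, summarise_window)
summaries$last_24h
```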