I want to implement a "fast login".
I'm developing enterprise software where many users in the same organization work with the same data on the same computer, and I want to be able to know who did what and when. Right now they have to log out and log back in, and the data has to be loaded into the store all over again.
What I want is for them to be able to, without logging out, click on a user from the organization, enter that user's password, and have the user switched while preserving the store.
Any idea how I can accomplish this?
I'm using ember-simple-auth v1.1.0 and ember v2.10.2
The simplest solution would be disabling the page reload when the user logs out. As far as I know, it's the reload that causes the data loss from the store, not the logging out itself. To do this, you need to override the sessionInvalidated method in your application route. For example,
sessionInvalidated() {
  this.transitionTo('/login');
},
But remember - you lower security with this approach: if someone logs out and leaves the page with the app open, another person could extract data (if they have enough technical background to at least install the Ember Inspector).
The other solution would require heavy research. I think it should be possible to implement a custom authenticator that authenticates a new user without logging out the previous one, simply by replacing the tokens in the session store. But I don't know how hard it would be to implement or what obstacles you might run into; you will need to read ember-simple-auth's sources a lot, that's for sure.
I was actually able to solve it by simply calling authenticate() with another user and never calling invalidateSession(), which is the function that triggers sessionInvalidated(). That method looks like this:
sessionInvalidated() {
  if (!testing) {
    if (this.get('_isFastBoot')) {
      this.transitionTo(Configuration.baseURL);
    } else {
      window.location.replace(Configuration.baseURL);
    }
  }
}
So by never triggering sessionInvalidated(), the user isn't redirected and the page isn't refreshed, and the newly authenticated user can keep using the existing store without changing pages.
I have low-priority users and high-priority users. Queries from high-priority users need to be processed first; low-priority users should be throttled because they slow the high-priority users down.
So far I have not found a solution. Is this possible at all, or do I need to fork apache-superset and implement such logic myself in the source code? Is this functionality planned on the roadmap?
Superset generally is a thin layer over your existing data stores and doesn't have much of a compute layer.
With this in mind, the right technical decision is probably to configure this at the database / data store layer. Many people integrate LDAP into both Superset and their data store, so roles and data store priorities / permissions can be configured on both sides.
With that being said, Superset is open source! You're definitely welcome to fork the code and implement it yourself. Better yet, feel free to raise a discussion by creating a GitHub issue.
Like Srini said, it depends on your data layer.
One way you could do this is by defining a custom SQL_QUERY_MUTATOR in your superset_config.py:
# superset_config.py
# VIP_LIST is an example set of high-priority usernames that you define yourself.
VIP_LIST = {"alice", "bob"}

def SQL_QUERY_MUTATOR(sql, username, security_manager):
    pool = "vip" if username in VIP_LIST else "normal"
    return f"-- pool: {pool}\n{sql}"
This will prepend a comment to the query specifying a pool that's either "vip" or "normal". You could then send the query to a proxy that parses the comment and dispatches the query to the proper pool.
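For illustration, here is a minimal sketch of how such a proxy might extract the pool name from that comment (the function name and regex are assumptions, not part of Superset):

import re

POOL_COMMENT = re.compile(r"^--\s*pool:\s*(\w+)", re.MULTILINE)

def extract_pool(sql, default="normal"):
    # Look for the "-- pool: <name>" comment prepended by SQL_QUERY_MUTATOR.
    match = POOL_COMMENT.search(sql)
    return match.group(1) if match else default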
Another way of doing this is specifying a DB_CONNECTION_MUTATOR that sets connection parameters depending on the user:
# superset_config.py
def DB_CONNECTION_MUTATOR(uri, params, username, security_manager, source):
    # Route high-priority users (VIP_LIST as above) to a dedicated queue.
    pool = "vip" if username in VIP_LIST else "normal"
    params["configuration"] = {"job.queue.name": pool}
    return uri, params
I am trying to update the commerce catalog from an external source. After the incremental update I need to have fresh data in the Sitecore tree (the data provider should return the new data instead of the old, cached data). However, if I go to Sitecore right after the data import I can only see the old data until I click the "Refresh Catalog Cache" button in the Sitecore Commerce menu.
I found the same information in the documentation for Sitecore Commerce Connect; however, I can't find any example of how to clear the cache via code.
I found several types in the "Sitecore.Commerce.Connect.CommerceServer.Caching" namespace, for example the CacheRefresh static class. It has a RefreshCatalogCaches method which needs an ICommerceServerContextManager contextManager as an input parameter. If I create the contextManager just by using the constructor, new CommerceServerContextManager(), and pass it to the method, it doesn't work (at least I still need to clear the cache manually).
I would appreciate any advice or suggestions. Thank you in advance.
In your code you should do the same thing that happens when the "Refresh Catalog Cache" button is clicked:
CacheRefreshEvent eventX = new CacheRefreshEvent("catalogcache", "master", ID.Null);
EventManager.QueueEvent<CacheRefreshEvent>(eventX, true, true);
For more details, look at the Sitecore.Commerce.Connect.CommerceServer.Caching.RefreshCache implementation in the Sitecore.Commerce.Connect.CommerceServer assembly via a reflector/decompiler.
I use Camunda 7.2.0 and I'm not very experienced with it. I'm trying to write data about the users who did something with a process instance to a database (I'm using the REST services), so I can build reports later. The problem is that I don't know how to trigger my REST call (which sends information to the database about the current user and assignee) when a user assigns a task to somebody else or claims a task for himself. I can see that the Camunda engine sends a request like
link: engine/engine/default/task/5f965ab7-e74b-11e4-a710-0050568b5c8a/assignee
post: {"userId":"Tom"}
As a partial solution I can think of creating a global variable "currentUser", and on form load checking whether the user is different from the current one; if he is, run the REST call and change the variable. But this solution doesn't look correct to me. So is there a better way to do it? Thanks in advance.
You could use a task listener that fires on the assignment event and updates your data whenever the assignee of a task changes. If you want this behaviour for every task, you can register a global task listener (for example via a process engine plugin that attaches the listener to every user task) instead of configuring it on each task individually.
I am working on doing some simple analytics on a Django website (v1.4.1). Seeing as this data will be gathered on pretty much every server request, I figured the right way to do this would be with a piece of custom middleware.
One important metric for the site is how often given images are accessed. Since each image is its own object, I thought about using django-hitcount, but figured that was unnecessary for what I was trying to do. If it proves easier, I may use it though.
The current conundrum I face is that I don't want to query the database and look up a given object for every HttpRequest that occurs. Instead, I would like to wait until a successful response (indicated by an HttpResponse.status_code of 200 or whatever) and then query the database and update a hit field for the corresponding image. The problem is that the only way to access the path of the image is in process_request, while the only way to access the status code is in process_response.
So, what do I do? Is it as simple as creating a class variable that can hold the path and then looking up the file once a 200 response code is returned, or should I just use django-hitcount?
Thanks for your help
Set up a cron task to parse your Apache/Nginx/whatever access logs on a regular basis, perhaps with something like pylogsparser.
You could use memcache to store the counters and then periodically persist them to the database. There are risks that memcache will evict the value before it's been persisted but this could be acceptable to you.
This article provides more information and highlights a risk arising when using hosted memcache with keys distributed over multiple servers. http://bjk5.com/post/36567537399/dangers-of-using-memcache-counters-for-a-b-tests
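As a minimal sketch of that approach (assuming Django 1.4-style middleware, a memcached CACHES backend, and an illustrative IMAGE_PREFIX for recognising image URLs; none of these names come from the question):

# middleware.py
from django.core.cache import cache

IMAGE_PREFIX = '/media/images/'  # hypothetical prefix for image URLs

class ImageHitCounterMiddleware(object):
    def process_response(self, request, response):
        # Only count successful responses for image paths.
        if response.status_code == 200 and request.path.startswith(IMAGE_PREFIX):
            key = 'imagehits:%s' % request.path
            try:
                cache.incr(key)
            except ValueError:
                # incr() raises ValueError if the key does not exist yet.
                cache.set(key, 1)
        return response

A periodic job (cron, Celery beat, etc.) can then read those keys and flush the counts into the hit field of the corresponding image objects.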
My app downloads all of the user's friends' pictures.
All the requests are of this kind:
https://graph.facebook.com/<friend id>/picture?type=small
But, after a certain limit is reached, instead of the picture I get:
{"error""message":"(#4) Application request limit reached","type":"OAuthException"}}
Actually, the only way I have found to get around this is to change the server IP (manually).
Isn't there a better way?
For the record:
The limit applies to the Graph API only, and the graph.facebook.com/<friend id>/picture URL is a Graph API call that returns a redirect.
So, to avoid the daily limit, simply fetch all the image URLs with FQL, like:
SELECT uid, pic_small, pic_big, pic, pic_square FROM user WHERE uid = me() OR uid IN (SELECT uid2 FROM friend WHERE uid1 = me())
These are the direct URLs to the images, for example:
http://profile.ak.fbcdn.net/hprofile-ak-snc4/275716_1546085527_622218197_q.jpg
Don't store them, though, since they change continuously.
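A rough sketch of running that FQL query over HTTP with Python's requests library (the graph.facebook.com/fql endpoint and the token handling are assumptions to verify against your API version):

import requests

FQL = ("SELECT uid, pic_small, pic_big, pic, pic_square FROM user "
       "WHERE uid = me() OR uid IN (SELECT uid2 FROM friend WHERE uid1 = me())")

def fetch_friend_picture_urls(access_token):
    # Run the FQL query through the Graph API's FQL endpoint.
    resp = requests.get(
        "https://graph.facebook.com/fql",
        params={"q": FQL, "access_token": access_token},
    ).json()
    # Each row contains direct CDN URLs (pic, pic_square, ...), not API calls.
    return {row["uid"]: row["pic_square"] for row in resp.get("data", [])}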
If it's for an online app, a better way is not to download those images at all, but to use the online version. There are a couple of reasons for doing so:
Users change pictures (some frequently); do you need an updated version?
Facebook's servers are probably faster than yours, and your users' friends' pictures are probably already cached in their browsers.
Update:
Since the limit you are hitting is the Graph API call limit, not a limit on image retrieval, another solution that comes to mind is to use the user's friends connection in the Graph API and specify picture in the fields argument, e.g. https://graph.facebook.com/me/friends?fields=picture. This returns direct URLs for the friends' pictures, so you only need one call per user to get everything required to download the images.
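A minimal sketch of that approach in Python, assuming the requests library and a valid user access token; note that depending on the Graph API version the picture field is either a plain URL or an object like {"data": {"url": ...}}, so adjust accordingly:

import requests

def download_friend_pictures(access_token, dest_dir="."):
    url = "https://graph.facebook.com/me/friends"
    params = {"fields": "picture", "access_token": access_token}
    while url:
        resp = requests.get(url, params=params).json()
        for friend in resp.get("data", []):
            pic = friend.get("picture")
            # Newer API versions wrap the URL: {"data": {"url": ...}}
            pic_url = pic["data"]["url"] if isinstance(pic, dict) else pic
            image = requests.get(pic_url).content
            with open("%s/%s.jpg" % (dest_dir, friend["id"]), "wb") as f:
                f.write(image)
        # Follow pagination; the "next" URL already carries the query params.
        url = resp.get("paging", {}).get("next")
        params = None

This keeps you at one friends call (plus pagination) per user instead of one Graph API call per picture.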