I have an app that sends out notifications (timeline cards) to users, and some of the users are reporting that they receive the same timeline card multiple times (up to 5 times in one instance). Has anyone encountered this? My app uses the Mirror API.
I've reviewed my log files and only see the timeline card produced once. I'm at a loss. I'll provide any code or logs that are needed. My app is written in Python.
Thanks!
This shouldn't be happening. If you're seeing it persist, file a bug in the official issue tracker.
If you do file a bug, there's one thing that might help Google find the root cause. Do a timeline.list on a user who reports the multiple notifications. Does the API show multiple cards? If so, include the JSON representation of them (including the ID).
The specific code to do this list depends on the language you're developing in. Here's an example of how to do it in Java:
import com.google.api.services.mirror.Mirror;
import com.google.api.services.mirror.Mirror.Timeline;
import com.google.api.services.mirror.model.TimelineItem;
import com.google.api.services.mirror.model.TimelineListResponse;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public static List<TimelineItem> retrieveAllTimelineItems(Mirror service) {
  List<TimelineItem> result = new ArrayList<TimelineItem>();
  try {
    Timeline.List request = service.timeline().list();
    do {
      // Fetch one page of timeline items and follow the page token
      // until the listing is exhausted.
      TimelineListResponse timelineItems = request.execute();
      if (timelineItems.getItems() != null && !timelineItems.getItems().isEmpty()) {
        result.addAll(timelineItems.getItems());
        request.setPageToken(timelineItems.getNextPageToken());
      } else {
        break;
      }
    } while (request.getPageToken() != null && request.getPageToken().length() > 0);
  } catch (IOException e) {
    System.err.println("An error occurred: " + e);
    return null;
  }
  return result;
}
I'm in a bit of trouble here. Here is the context:
One of our customers asked us to develop an alternative to storing the documents of a document library in the content database, as their content database is growing too fast. They provided us with network storage so that the documents could be stored in the filesystem instead. After googling a bit, I found a feature called Remote Blob Storage (RBS), but as the references say, it is a per-content-database feature, which is not acceptable in this context. The other option I've come up with is to use an SPItemEventReceiver, so that in the ItemAdded event I could save the SPFile associated with the ListItem of the SPItemEventProperties to the filesystem and then delete or truncate the SPFile object,
public static void DeleteAssociatedFile(SPWeb web, SPListItem item)
{
    try
    {
        if (item == null) { throw new ArgumentNullException("item"); }
        if (item.FileSystemObjectType == SPFileSystemObjectType.File)
        {
            web.AllowUnsafeUpdates = true;

            // Truncate the file content so that (ideally) nothing remains
            // in the content database.
            using (var fileStream = item.File.OpenBinaryStream())
            {
                if (fileStream.CanWrite)
                {
                    fileStream.SetLength(0);
                }
            }
            item.File.Update();
        }
    }
    catch (Exception ex)
    {
        // log error message
        Logger.Unexpected("ListItemHelper.DeleteAssociatedFile", ex.Message);
        throw;
    }
    finally
    {
        web.AllowUnsafeUpdates = false;
    }
}
forcing it not to store its content in the content database. But it didn't work out: every time I somehow manage to delete or truncate the SPFile associated with the ListItem, either the ListItem itself gets deleted from the document library or the file isn't affected by the change at all. So my question is: is there a solution to this problem? Any other thoughts that could help me in this quest?
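For reference, the receiver is wired up roughly like this (a sketch only; the UNC share path is illustrative, and ListItemHelper is the class containing the method above):

using System.IO;
using Microsoft.SharePoint;

public class ExternalizeFileReceiver : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        SPListItem item = properties.ListItem;

        // Copy the uploaded file to the network storage first...
        byte[] content = item.File.OpenBinary();
        File.WriteAllBytes(Path.Combine(@"\\fileserver\documents", item.File.Name), content);

        // ...then try to remove its content from the content database.
        ListItemHelper.DeleteAssociatedFile(item.Web, item);
    }
}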
Thanks in advance!
As you have asked for other thoughts:
One thing that comes to mind is OneDrive for Business instead of network storage.
Another is to develop a custom file upload: upload the file directly to the network storage and, once uploaded, add an entry to a SharePoint list (sketched below).
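A minimal sketch of that second idea (the share path, list name, and column name are assumptions for illustration):

using System.IO;
using Microsoft.SharePoint;

public static class ExternalFileStore
{
    public static void SaveAndRegister(SPWeb web, string fileName, Stream content)
    {
        // 1. Write the binary content to the network storage.
        string uncPath = Path.Combine(@"\\fileserver\documents", fileName);
        using (FileStream target = File.Create(uncPath))
        {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = content.Read(buffer, 0, buffer.Length)) > 0)
            {
                target.Write(buffer, 0, read);
            }
        }

        // 2. Record only metadata in SharePoint, so the content database
        //    holds a pointer to the file instead of the blob itself.
        SPListItem item = web.Lists["External Documents"].Items.Add();
        item["Title"] = fileName;
        item["FilePath"] = uncPath; // assumed text column holding the UNC path
        item.Update();
    }
}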
I have a web service which will be called by, let us say, about 100,000 users at the same time (within 3 hours). The service reads and updates the SQL database using Entity Framework 4.1. Here is the code:
[WebMethod]
public bool addVotes(string username, string password, int votes)
{
    bool success = false;
    if (Membership.ValidateUser(username, password))
    {
        // Read-modify-write: load the user's row, bump the counter, save.
        using (var context = new DbContext())
        {
            AppUsers user = context.AppUsers.Where(x => x.Username.Equals(username)).FirstOrDefault();
            if (user != null)
            {
                user.Votat += votes;
                context.SaveChanges();
                success = true;
            }
        }
    }
    return success;
}
The web service will be called from Android phones (as I said, maybe 100,000, maybe more, maybe less, but that's not important right now). Is there a possibility of deadlocks, or of things otherwise going wrong?
What will happen when reading from the database, and what when updating? As one of the answers said: I am updating just the vote field for each user. If there is any problem with this, how do you advise me to correct it?
Thank you in advance :)
This should be fine.
The reason I say that is that, as far as I can tell, the only thing that happens when this method is called on behalf of a user is that the vote count (Votat) in their row in the database is increased. As long as they are only touching their own row, and not any row that might also be touched by one of the 99,999 other users, there is no contention between users, and this should scale well.
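One caveat: the read-modify-write pattern in the question (load the row, increment in memory, save) can lose an increment if the same user somehow submits twice at the same moment. A minimal sketch of pushing the increment down to the database so it is applied atomically (assuming EF 4.1's ExecuteSqlCommand and the AppUsers table from the question):

using (var context = new DbContext())
{
    // A single UPDATE statement increments the counter atomically, so two
    // concurrent submissions for the same user cannot overwrite each other.
    context.Database.ExecuteSqlCommand(
        "UPDATE AppUsers SET Votat = Votat + {0} WHERE Username = {1}",
        votes, username);
}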
We set up a new SharePoint 2013 server to test how it would work as document storage.
The problem is that it is very slow, and I don't know why.
I adapted this from MSDN:
ClientContext _ctx;

private void btnConnect_Click(object sender, RoutedEventArgs e)
{
    try
    {
        _ctx = new ClientContext("http://testSP1");
        Web web = _ctx.Web;

        Stopwatch w = new Stopwatch();
        w.Start();
        // No server call happens here; the list is only bound client-side.
        List list = _ctx.Web.Lists.GetByTitle("Test");
        Debug.WriteLine(w.ElapsedMilliseconds); // 24 first time, 0 second time

        w.Restart();
        CamlQuery q = CamlQuery.CreateAllItemsQuery(10);
        ListItemCollection items = list.GetItems(q);
        _ctx.Load(items);
        // The actual round trip to the server happens in ExecuteQuery.
        _ctx.ExecuteQuery();
        Debug.WriteLine(w.ElapsedMilliseconds); // 1800 first time, 900 second time
    }
    catch (Exception)
    {
        throw;
    }
}
There aren't very many documents in the Test list - just 3 folders and 1 Word file.
Any suggestions/ideas why it is this slow?
Storing unstructured content (Word docs, PDFs, anything except metadata) in SharePoint's SQL content database is going to result in slower upload and retrieval than if the files are stored on the file system. That's why Microsoft created the Remote BLOB (Binary Large Object) Storage interface to enable files to be managed in SharePoint but live on the file system or in the cloud. The bigger the files, the greater the performance hit.
There are several third-party solutions that leverage this interface, including my company's offering, Metalogix StoragePoint. You can reach out to me at trossi#metalogix.com if you would like to learn more or visit http://www.metalogix.com/Products/StoragePoint/StoragePoint-BLOB-Offloading.aspx
I want to register a goal/conversion on my Sitecore 6.5 site using the API rather than a 'thank-you' page.
I've seen this question about how to do it: Sitecore OMS - achieving a goal on a form submission. However, the answer relates to the API prior to Sitecore 6.5, where it was overhauled quite significantly.
Has anyone done this? Or has this functionality been intentionally removed?
Have you tried something like this?

protected void btnSubmit_Click(object sender, EventArgs e)
{
    if (Sitecore.Analytics.Tracker.IsActive && Sitecore.Analytics.Tracker.CurrentPage != null)
    {
        // Register the goal against the current page, then flush the tracker.
        PageEventData eventData = new PageEventData("My Goal Name");
        eventData.Data = "this is some event data.";
        VisitorDataSet.PageEventsRow pageEventsRow = Sitecore.Analytics.Tracker.CurrentPage.Register(eventData);
        Sitecore.Analytics.Tracker.Submit();
    }
}
That should register the goal on the current page, but not before you decide to in your code.
You can also use a modified version of the code which references the Goal Item by its GUID:
if (Sitecore.Analytics.Tracker.IsActive && Sitecore.Analytics.Tracker.CurrentPage != null)
{
    PageEventItem goal = new PageEventItem(Sitecore.Context.Database.GetItem("GOALGUID"));
    VisitorDataSet.PageEventsRow pageEventsRow = Sitecore.Analytics.Tracker.CurrentPage.Register(goal);
    Sitecore.Analytics.Tracker.Submit();
}
Make sure you have deployed and published your goal and/or goal category too, as the code will fail otherwise.
I'm not sure that what I'm going to do is the right approach, so first of all let me tell you my issue.
I have TFS as a bug-tracking system and another system for tracking work time. I want the status in the other system to change whenever a work item's status changes in TFS.
What I have done so far is the following:
I wrote a plugin for the TFS web service where I catch the WorkItemChangedEvent.
public EventNotificationStatus ProcessEvent(
    TeamFoundationRequestContext requestContext,
    NotificationType notificationType,
    object notificationEventArgs,
    out int statusCode,
    out string statusMessage,
    out ExceptionPropertyCollection properties)
{
    statusCode = 0;
    properties = null;
    statusMessage = String.Empty;
    try
    {
        if (notificationType == NotificationType.Notification && notificationEventArgs is WorkItemChangedEvent)
        {
            WorkItemChangedEvent ev = notificationEventArgs as WorkItemChangedEvent;
            EventLog.WriteEntry("WorkItemChangedEventHandler", "WorkItem " + ev.WorkItemTitle + " was modified");
        }
    }
    catch (Exception)
    {
    }
    return EventNotificationStatus.ActionPermitted;
}
I dropped the DLL into C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services\bin\Plugins, but it looks like the extension is never called, as nothing appears in the event log.
And if I try to debug the service as described in this post http://geekswithblogs.net/jakob/archive/2010/10/27/devleoping-and-debugging-server-side-event-handlers-in-tfs-2010.aspx, I can't hook into the process, so debugging doesn't work either.
Why can't I debug the service? And is there a better way to do this?
Not sure if you fixed this, but to me it looks like you are missing the SubscribedTypes method in the class.
public Type[] SubscribedTypes()
{
    return new Type[1] { typeof(WorkItemChangedEvent) };
}
Without this, your plugin will never get hit, and thus you will be unable to debug it.
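For context, here is a rough sketch of how the pieces fit together in a TFS 2010 server plugin: the class implements ISubscriber, and SubscribedTypes() is what tells TFS to route WorkItemChangedEvent notifications to ProcessEvent (the handling logic from the question goes inside it):

using System;
using Microsoft.TeamFoundation.Framework.Server;
using Microsoft.TeamFoundation.WorkItemTracking.Server;

public class WorkItemChangedEventHandler : ISubscriber
{
    public string Name
    {
        get { return "WorkItemChangedEventHandler"; }
    }

    public SubscriberPriority Priority
    {
        get { return SubscriberPriority.Normal; }
    }

    // Without this, TFS never routes any notifications to the plugin.
    public Type[] SubscribedTypes()
    {
        return new Type[] { typeof(WorkItemChangedEvent) };
    }

    public EventNotificationStatus ProcessEvent(
        TeamFoundationRequestContext requestContext,
        NotificationType notificationType,
        object notificationEventArgs,
        out int statusCode,
        out string statusMessage,
        out ExceptionPropertyCollection properties)
    {
        statusCode = 0;
        statusMessage = String.Empty;
        properties = null;
        // ... the WorkItemChangedEvent handling from the question goes here ...
        return EventNotificationStatus.ActionPermitted;
    }
}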
To debug the w3wp.exe process, you need to be running Visual Studio as an administrator.
From the menu, select Debug > Attach to process (or Ctrl-Alt-P)
Select Show processes from all users and Show processes in all sessions.
Find the w3wp.exe process that corresponds to your TFS Application Pool, and attach to it.
I notice that you're using EventLog.WriteEntry() - have you registered the event source previously in your code? To avoid the registration (which requires admin permissions), you might try using the TFS logger:
TeamFoundationApplication.Log("WorkItem " + ev.WorkItemTitle + " was modified", 0, System.Diagnostics.EventLogEntryType.Information);
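Alternatively, if you want to keep using EventLog.WriteEntry(), the event source must exist before you write to it. Creating it requires admin rights, so it is usually done once at install time rather than inside the plugin; a sketch:

using System.Diagnostics;

// Run once, elevated (e.g. from an installer), not from the plugin itself.
if (!EventLog.SourceExists("WorkItemChangedEventHandler"))
{
    EventLog.CreateEventSource("WorkItemChangedEventHandler", "Application");
}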