I have an ASP application which retrieves a set of locations from a datasource, uses the Bing Maps REST services to geocode the addresses, and then displays them in a table and on a map, in pages of 10 results at a time.
Currently, the application processes the locations sequentially ...
// Note: a string argument to replace() only replaces the first occurrence;
// regexes with the g flag strip every '&' and ',' from the address.
var geocodeRequest = "http://ecn.dev.virtualearth.net/REST/v1/Locations/" + fullAddress.replace(/&/g, ' ').replace(/,/g, ' ') + "?output=json&jsonp=GeocodeCallback&key=" + getCredentials;
CallRestService(geocodeRequest);
......
function GeocodeCallback(result) {
    if (result &&
        result.resourceSets &&
        result.resourceSets.length > 0 &&
        result.resourceSets[0].resources &&
        result.resourceSets[0].resources.length > 0) {
        // Set the map view using the returned bounding box
        var bbox = result.resourceSets[0].resources[0].bbox;
        var viewBoundaries = MM.LocationRect.fromLocations(new MM.Location(bbox[0], bbox[1]), new MM.Location(bbox[2], bbox[3]));
        map.setView({ bounds: viewBoundaries });
        // Add a pushpin at the found location
        MM.Location.prototype.locID = null;
        var location = new MM.Location(result.resourceSets[0].resources[0].point.coordinates[0], result.resourceSets[0].resources[0].point.coordinates[1]);
        location.locID = tableRowIndex;
        locs.push(location);
.....
Is there any way to speed this up by passing 10 locations in one call and then processing result.resourceSets[0], result.resourceSets[1], etc.?
How would multiple addresses be passed into the REST services call? (comma delimited?)
Thanks
Bing has two REST-accessible geocoding APIs. One is the one you're using, which only supports one address at a time; the other is the Dataflow API, which is designed for high-volume batch processing. Neither really seems right for you as your system is currently designed.
Depending on where you're getting your street addresses from (all you mention is 'a datasource'), you might be able to do a big batch geocode for all the locations in your datasource: move the geocoding from request time to a batch process, and only use request-time geocoding for the addresses the batch process hasn't gotten to yet.
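As a rough illustration, such a batch job could look something like the sketch below, in C# since the geocoding would move server-side. The GetAddressesToGeocode and SaveGeocodeResult helpers are hypothetical placeholders for your datasource access; the endpoint and key parameter are the same ones your page already uses.
using System;
using System.Net;

public static class BatchGeocoder
{
    // Hypothetical sketch: pre-geocode every address in the datasource so the
    // page only has to geocode addresses the batch hasn't reached yet.
    public static void Run(string bingMapsKey)
    {
        foreach (string fullAddress in GetAddressesToGeocode())
        {
            // Same Locations endpoint as the page uses, minus the jsonp wrapper.
            string url = "http://ecn.dev.virtualearth.net/REST/v1/Locations/"
                + Uri.EscapeDataString(fullAddress)
                + "?output=json&key=" + bingMapsKey;

            using (var client = new WebClient())
            {
                string json = client.DownloadString(url);
                // Parse resourceSets[0].resources[0].point.coordinates out of
                // the JSON and persist the lat/long next to the address.
                SaveGeocodeResult(fullAddress, json);
            }
        }
    }

    // Placeholders - swap in your own datasource queries.
    static string[] GetAddressesToGeocode() { return new string[0]; }
    static void SaveGeocodeResult(string fullAddress, string json) { }
}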
There is no way of doing this as things look right now. Supporting it natively has been proposed (I think), but I do not believe it has been implemented yet. If you want some concurrency, you could look at Web Workers:
http://en.wikipedia.org/wiki/Web_Workers
https://developer.mozilla.org/En/Using_web_workers
But Web Workers are not supported in IE yet. Maybe you could also check out the HTML5 async attribute; I do not know whether it can be applied to the script element that is created when you call the REST services.
I have a REST API which needs to return an item's presentation details.
I tried this line of code, but Sitecore.Context.Device is null, since this is a REST API call.
LayoutItem layoutItem = item.Visualization.GetLayout(Sitecore.Context.Device);
Update: I tried moving this code to when I index my data (hoping to read the value and write it to Solr), but I am facing the same issue.
How would I go about doing this?
The renderings are added per "device" in Sitecore, but in most cases nowadays there's only one device. Depending on how you route your API endpoint, the device may not be resolved automatically. If you know you only have one device, you can pick the first device, or provide the device as a parameter. Then you can use a device switcher in case you need to call other Sitecore methods that require a device to be set. For example like this:
var deviceName = "Default"; // The default device. Modify according to needs.
var deviceItem = item.Database.Resources.Devices.GetAll().First(d => string.Equals(d.Name, deviceName, StringComparison.OrdinalIgnoreCase));
using (new DeviceSwitcher(deviceItem))
{
    ...
}
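Inside the using block, Sitecore.Context.Device resolves to the switched device, so the call from your question should work there. A minimal sketch, assuming the item and deviceItem variables from above:
using (new DeviceSwitcher(deviceItem))
{
    // Context.Device is now set, so GetLayout no longer receives null.
    LayoutItem layoutItem = item.Visualization.GetLayout(Sitecore.Context.Device);
}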
I'm getting some strange behaviour. When I update a state with a list of partner ids (other nodes) and read the state afterwards, it seems that via rpcOps.vaultQueryBy I can see the updated (unconsumed) state with the updated list of partners, but if I do the same query via serviceHub.vaultService.queryBy it looks like the state's partner list hasn't changed at all.
If I get all states in the flow, including the consumed ones, it looks like there has been no change, but via the API all updates to the partners list are visible. Is this some sort of bug I have encountered, or am I just not understanding something?
We're using Corda 4.0.
Via API
var servicestates = rpcOps.vaultQueryBy<ServiceState>().states.map { it.state.data }
var services = getServices().filter {
    it.linearId == UniqueIdentifier.fromString(serviceId)
}.single()
Inside flow
val serviceStateAndRef = serviceHub.vaultService.queryBy<ServiceState>(
    QueryCriteria.LinearStateQueryCriteria(linearId = listOf(serviceLinearId))
).states.single()
@Ashutosh Meher You got it near enough. The problem was in a previous flow: when creating a new partner state, the contract command listed only the caller as a signer.
So
Command(ServiceContract.Commands.AddPartner(), listOf(ourIdentity.owningKey))
had to be edited to include the necessary other parties:
Command(ServiceContract.Commands.AddPartner(), updatedServiceState.participants.map { it.owningKey })
That was why the other node did not see the change. It was right under my eyes all the time... ;)
I am working on implementing Sitecore DMS in 7.2 and I'm having one main issue that I'm having a hard time finding an answer for. I have some goals and events set up, and I am attempting to set one off through the Analytics API. The event is logged as triggered in the PageEventId database, but what I am trying to do is add Engagement Value to the current visit/visitor.
I'm looking to update the Value field in the Visits database for the current visit. Here is what I am currently using:
public static void triggerGoal(ID goal)
{
    if (Tracker.IsActive && Tracker.CurrentPage != null)
    {
        Sitecore.Data.Items.Item goalToTrigger = Sitecore.Context.Database.GetItem(goal);
        if (goalToTrigger != null)
        {
            Sitecore.Analytics.Data.Items.PageEventItem reg = new Sitecore.Analytics.Data.Items.PageEventItem(goalToTrigger);
            Sitecore.Analytics.Data.DataAccess.DataSets.VisitorDataSet.PageEventsRow eventData =
                Tracker.CurrentPage.Register(reg);
            eventData.Data = goalToTrigger["Description"];
            Tracker.Submit();
        }
    }
}
This updates the PageEventId database properly, noting that the event has been triggered, but it adds no Engagement Value to the Visits database, regardless of how many engagement points are assigned to the goal being triggered.
I've tried various ways of getting the API to update this field, but nothing has worked for me so far. Here are a bunch of the different things I've tried:
Tracker.CurrentVisit.BeginEdit();
Tracker.CurrentVisit.Value += 3; // look up value here instead of hardcoding. Create new PageEventItem class to get field ID.
Tracker.CurrentVisit.AcceptChanges();
Tracker.CurrentVisit.EndEdit();
Tracker.CurrentVisit.Load();

Tracker.CurrentPage.BeginEdit();
Tracker.CurrentPage.Visit.Value += 3;
Tracker.CurrentPage.AcceptChanges();
Tracker.CurrentPage.EndEdit();

Tracker.Visitor.CurrentVisit.BeginEdit();
Tracker.Visitor.CurrentVisit.Value += 3;
Tracker.Visitor.CurrentVisit.AcceptChanges();
Tracker.Visitor.CurrentVisit.EndEdit();
Tracker.Visitor.CurrentVisit.Load();

Tracker.CurrentVisit.CurrentPage.Visit.BeginEdit();
Tracker.CurrentVisit.CurrentPage.Visit.Value += 3;
Tracker.CurrentVisit.CurrentPage.Visit.AcceptChanges();
Tracker.CurrentVisit.CurrentPage.Visit.EndEdit();
Tracker.CurrentVisit.CurrentPage.Visit.Load();

Tracker.CurrentVisit.CurrentPage.VisitorsRow.BeginEdit();
Tracker.CurrentVisit.CurrentPage.VisitorsRow.Value += 3;
Tracker.CurrentVisit.CurrentPage.VisitorsRow.AcceptChanges();
Tracker.CurrentVisit.CurrentPage.VisitorsRow.EndEdit();
I've used various combinations of AcceptChanges(), BeginEdit(), EndEdit() and Load(), as I'm not completely sure what each of them does, but either way, none of them updates the Value field.
I am trying to avoid a custom SQL query to update this field; I want to do it through the built-in Sitecore Analytics API. Can anyone help me figure out how to update this field?
The following works fine for me; are you waiting long enough to see the value written for the visit?
if (Tracker.IsActive)
{
    Tracker.CurrentVisit.Value += 3;
}
No need to BeginEdit, AcceptChanges, EndEdit, etc.
What you've done should work as long as you have set up your goal correctly in Sitecore. I like to mirror Sitecore's default goals and create goals from the "Page Event" template. Make sure you've assigned the goal to your content item: Analyze tab -> Goals -> [select your goal from the checkbox list]. If you're going to set the CurrentVisit value, I suggest using the line below to avoid hardcoding the engagement point values.
//This line will add points specified in Content Editor
Tracker.CurrentVisit.Value += int.Parse(reg.Points);
This document explains how to set up your goals the correct way; if you follow it, your code will work as you have it, reporting the value specified in the Content Editor without setting CurrentVisit.Value.
Sitecore SDN - Marketing Operations Cookbook
For my Flash Builder 4.6 project I have an HTTP service defined which points at a URL on our website.
What I'd like to be able to do is change the web service URL on the fly within the app, i.e. use the existing URL as the default but have an admin/settings screen to change where the web service points (stored either in our SQLite database or in local memory).
This would be so that we could allow our customers to host their own version of the website/database but still be able to use/download the app through the app stores.
Has anyone had any experience with doing this?
EDIT: Adding some more details after the comments below.
When I created the HTTP service through the Flash Builder wizard, it generated two web service classes: a superclass and a subclass that inherits from it. All of the code that the wizard populates goes into the superclass.
I assume the code I need to add belongs in the subclass, but I do not know which function to put it in, or how.
Below is a sample of the Super's constructor:
// initialize service control
_serviceControl = new mx.rpc.http.HTTPMultiService("websitehere");
var operations:Array = new Array();
var operation:mx.rpc.http.Operation;
var argsArray:Array;
operation = new mx.rpc.http.Operation(null, "loginRequest");
operation.url = "login.php";
operation.method = "GET";
argsArray = new Array("un","pw");
operation.argumentNames = argsArray;
operation.serializationFilter = serializer0;
operation.properties = new Object();
operation.properties["xPath"] = "/";
operation.contentType = "application/x-www-form-urlencoded";
operation.resultType = valueObjects.Data;
operations.push(operation);
_serviceControl.operationList = operations;
I'm not sure what property of the _serviceControl variable I would need to alter.
Also, when I search for my website in my code, it brings back a .fml file inside a .model directory, which seems to get auto-refreshed if I change the service URL through the wizard. Would this not cause an issue?
I then have the challenge of accessing the user-defined URL. Within the app we use an SQLite database to store data, but I think it would probably be better to use a SharedObject, which we also use to know which account they are logged into. How reliable is this? I assume I would be able to access it via the service?
The awkward thing is that we were planning to make this configurable on a settings screen accessed after logging in, but to log in the app would already need to know which server to point to.
If I'm reading your question correctly, your main ambition is to dynamically change the URL for the services based on a user-defined variable.
This is very easy to accomplish, and even easier if you are using Parsley/Spicelib.
A few points:
Don't change the code in the super file; it will get overwritten whenever the service gets refreshed. Change everything in its generated subclass.
Shared Objects are very good for small quantities of data but should never be used for massive datasets, e.g. storing a big ArrayCollection.
Anyway, here is how I achieve this.
In the subclass you can change the constructor function. Here is how I change my URLs based on a config variable, but you can just as easily use a SharedObject instead.
public function SubClassConstructor(){
    if(CONFIG::DOMAIN_IDENT == "development" || CONFIG::DOMAIN_IDENT == "dev" || CONFIG::DOMAIN_IDENT == "d"){
        _serviceControl.endpoint = "http://yoururl1";
    }
    else if(CONFIG::DOMAIN_IDENT == "production" || CONFIG::DOMAIN_IDENT == "prod" || CONFIG::DOMAIN_IDENT == "p"){
        _serviceControl.endpoint = "http://yoururl2";
    }
}
Of course this isn't exactly what you're looking for, but it's a working solution. You can use bindings to a global ApplicationModel or a direct reference to the SharedObject; I guess you already know how to use the SharedObject.
Ask if you need any further help or guidance.
As cghrmauritius' solution didn't quite work for me, I am posting the final solution that did work in my situation.
public function subConstructor()
{
    super();
    _serviceControl.baseURL = "http://url1";
}
Obviously for my final solution I need to implement the SharedObject as well, but overriding the URL was my main priority.
Webservice1 can receive a set of Lon/Lat variables. Based on these variables it returns a resultset of items nearby.
In order to create the resultset, Webservice1 has to pass the variables to multiple webservices of our own and to multiple external webservices. All these webservices return a resultset; the combination of these secondary resultsets is the resultset to be returned by Webservice1.
What is the best design approach within Windows Azure with costs and performance in mind?
Should we sequentially fire requests from Webservice1 to the other webservices, waiting for each response before continuing? Or could we, for example, use a queue where we post the variables to be picked up by the secondary webservices?
I think you've answered your own question in the title.
I wouldn't worry about using a queue. Queues are great for sending information off to be dealt with by something else when it doesn't matter how long processing takes. Since you've got a web service that's waiting to return results, that's not ideal here.
Sending the requests to each of the other web services one at a time will work and is technically the easiest option, but it won't give you the best performance.
In this situation I would send requests to each of the other web services in parallel using the Task Parallel Library. Presuming the order of the items you return isn't important, your code might look a bit like this:
public List<LocationResult> PlacesOfInterest(LocationParameters parameters)
{
    WebService[] webServices = GetArrayOfAllWebServices();
    LocationResult[][] results = new LocationResult[webServices.Length][];

    // Call all of the web services in parallel
    Parallel.For(0,
                 webServices.Length,
                 i =>
                 {
                     results[i] = webServices[i].PlacesOfInterest(parameters);
                 });

    // Put all the results together
    var finalResults = new List<LocationResult>();
    for (int i = 0; i < webServices.Length; i++)
    {
        finalResults.AddRange(results[i]);
    }
    return finalResults;
}