Implementing ISupportLastWriteTime in Sync Framework 2.1

I want to implement ISupportLastWriteTime in Sync Framework 2.1, but I cannot find a single code sample for it anywhere:
http://msdn.microsoft.com/en-us/library/windows/desktop/dd744781(v=vs.85).aspx
I want last-writer-wins conflict resolution in Sync Framework 2.1, and I cannot add a new column to the existing tables.
I have created a RelationalSyncProvider and hooked its ApplyChangeFailed event:
private void LocalProvider_ApplyChangeFailed(object sender, DbApplyChangeFailedEventArgs e)
{
    // ISupportLastWriteTime t = sender;
    // t.GetChangeUnitChangeTime(...);
    if (dbInfo.SynchronizeDirection == eSyncDirection.Download)
    {
    }
}
How can I implement ISupportLastWriteTime here? Please help.

Unfortunately, RelationalSyncProvider lacks the functionality to do last-writer-wins type of resolution.
The easiest way to do this is to add a last-update DateTime column to your tables; when a conflict fires, compare the two timestamps and specify which row wins.
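For illustration, a sketch of that conflict handler, assuming you have added a LastUpdated datetime column to the tables involved (the column name is an assumption; DbConflictType, ApplyAction, and the LocalChange/RemoteChange DataTables are part of the Sync Framework conflict API):

```csharp
private void LocalProvider_ApplyChangeFailed(object sender, DbApplyChangeFailedEventArgs e)
{
    if (e.Conflict.Type == DbConflictType.LocalUpdateRemoteUpdate)
    {
        // Read the hypothetical LastUpdated column from both conflicting rows
        DateTime localTime = (DateTime)e.Conflict.LocalChange.Rows[0]["LastUpdated"];
        DateTime remoteTime = (DateTime)e.Conflict.RemoteChange.Rows[0]["LastUpdated"];

        // Last writer wins: keep whichever side was written most recently
        e.Action = localTime >= remoteTime
            ? ApplyAction.Continue             // local row is newer; ignore the remote change
            : ApplyAction.RetryWithForceWrite; // remote row is newer; force it through
    }
}
```

Note this only works if LastUpdated is populated on every write on both sides.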

FIX API of FXCM: Get the price value every T period

I am doing some tests on the FIX engine sample of FXCM. The complete code is available here.
There is a function void FixApplication::SubscribeMarketData() that allows you to continuously receive updates for a particular market symbol. Here is what it looks like:
// Subscribes to the EUR/USD trading security
void FixApplication::SubscribeMarketData()
{
// Subscribe to market data for EUR/USD
string request_ID = "EUR_USD_Request_";
FIX44::MarketDataRequest request;
request.setField(MDReqID(request_ID));
request.setField(SubscriptionRequestType(
SubscriptionRequestType_SNAPSHOT_PLUS_UPDATES));
request.setField(MarketDepth(0));
request.setField(NoRelatedSym(1));
// Add the NoRelatedSym group to the request with Symbol
// field set to EUR/USD
FIX44::MarketDataRequest::NoRelatedSym symbols_group;
symbols_group.setField(Symbol("EUR/USD"));
request.addGroup(symbols_group);
// Add the NoMDEntryTypes group to the request for each MDEntryType
// that we are subscribing to. This includes Bid, Offer, High, and Low
FIX44::MarketDataRequest::NoMDEntryTypes entry_types;
entry_types.setField(MDEntryType(MDEntryType_BID));
request.addGroup(entry_types);
entry_types.setField(MDEntryType(MDEntryType_OFFER));
request.addGroup(entry_types);
entry_types.setField(MDEntryType(MDEntryType_TRADING_SESSION_HIGH_PRICE));
request.addGroup(entry_types);
entry_types.setField(MDEntryType(MDEntryType_TRADING_SESSION_LOW_PRICE));
request.addGroup(entry_types);
Session::sendToTarget(request, sessionID(true));
}
Is there a way to tell the FIX server that I only want to receive updates every 5 minutes?
Or should I implement a function that catches the continuous flow of data and outputs one data point every 5 minutes?
I have already searched for a parameter in the FIX engine that I could modify to return a T-periodic flow of data, but I didn't find anything. If it exists, I would prefer to use it rather than write a function to handle the tick flow.
The feature you are suggesting would have to be a counterparty-specific feature, probably implemented with custom fields. I don't believe the standard FIX dictionary provides fields that would support this.
So yes, your hypothetical client-side solution would be the way to go.

How to use Qt QSqlDriver::subscribeToNotification with SQLite3?

I'm writing a Qt application where different models can insert/delete/update the same table. When one model changes the database, I would like the other models to be notified of the change, so they can update their views accordingly.
It seems that the best way to monitor inserts, deletes and updates in SQLite is to use QSqlDriver::subscribeToNotification and then react to the notification signal. I know the syntax is along the lines of:
db.driver()->subscribeToNotification("anEventId");
However, I'm not sure what anEventId means. Is anEventId a constant provided by SQLite or do I code these specific events into SQLite using triggers or something else and then subscribe to them?
The subscribeToNotification implementation in the Qt SQLite driver relies on the sqlite3_update_hook function of the SQLite C API. The Qt driver, however, does not forward the operation performed, just the table name, and that is the mysterious anEventId argument to pass to subscribeToNotification. In short, you can listen for events occurring in any table (provided it is a rowid table, which it generally is) by passing the table name to the subscribeToNotification method. When your slot catches the notification signal, though, you only know that an operation occurred on that table; (sadly) Qt won't tell you which one (INSERT, UPDATE, or DELETE).
So, given a QSqlDriver * driver:
driver->subscribeToNotification("mytable1");
driver->subscribeToNotification("mytable2");
driver->subscribeToNotification("mytable3");
then in your slot:
void MyClass::notificationSlot(const QString &name)
{
    if (name == "mytable1")
    {
        // do something
    }
    else if (name == "mytable2")
    {
        // etc...
    }
}
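For completeness, the notification only reaches your slot once you connect the driver's notification signal to it (notificationSlot is the slot sketched above; in Qt 5 the signal also has a three-argument overload carrying a NotificationSource):

```cpp
// Connect the SQLite driver's notification signal to our slot;
// the signal is emitted with the name of the table that changed.
QObject::connect(db.driver(), SIGNAL(notification(const QString&)),
                 this, SLOT(notificationSlot(const QString&)));
```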

Joining a stream against a "table" in Dataflow

Let me use a slightly contrived example to explain what I'm trying to do. Imagine I have a stream of trades coming in, with the stock symbol, share count, and price: { symbol = "GOOG", count = 30, price = 200 }. I want to enrich these events with the name of the stock, in this case "Google".
For this purpose I want to, inside Dataflow, maintain a "table" of symbol->name mappings that is updated by a PCollection<KV<String, String>>, and join my stream of trades with this table, yielding e.g. a PCollection<KV<Trade, String>>.
This seems like a thoroughly fundamental use case for stream processing applications, yet I'm having a hard time figuring out how to accomplish this in Dataflow. I know it's possible in Kafka Streams.
Note that I do not want to use an external database for the lookups – I need to solve this problem inside Dataflow or switch to Kafka Streams.
I'm going to describe two options. One using side-inputs which should work with the current version of Dataflow (1.X) and one using state within a DoFn which should be part of the upcoming Dataflow (2.X).
Solution for Dataflow 1.X, using side inputs
The general idea here is to use a map-valued side-input to make the symbol->name mapping available to all the workers.
This table will need to be in the global window (so nothing ever ages out), will need to be triggered every element (or as often as you want new updates to be produced), and accumulate elements across all firings. It will also need some logic to take the latest name for each symbol.
The downside to this solution is that the entire lookup table will be regenerated every time a new entry comes in and it will not be immediately pushed to all workers. Rather, each will get the new mapping "at some point" in the future.
At a high level, this pipeline might look something like the following (I haven't tested this code, so there may be some typos):
PCollection<KV<Symbol, Name>> symbolToNameInput = ...;
final PCollectionView<Map<Symbol, Iterable<Name>>> symbolToNames = symbolToNameInput
    .apply(Window.<KV<Symbol, Name>>into(new GlobalWindows())
        .triggering(Repeatedly.forever(AfterProcessingTime
            .pastFirstElementInPane()
            .plusDelayOf(Duration.standardMinutes(5))))
        .accumulatingFiredPanes())
    .apply(View.asMultimap());
Note that we had to use View.asMultimap here. This means that we actually build up all the names for every symbol. When we look things up, we'll need to make sure to take the latest name in the iterable.
PCollection<Detail> symbolDetails = ...;
symbolDetails
    .apply(ParDo.withSideInputs(symbolToNames).of(new DoFn<Detail, AugmentedDetails>() {
      @Override
      public void processElement(ProcessContext c) {
        Iterable<Name> names = c.sideInput(symbolToNames).get(c.element().symbol());
        Name name = chooseName(names);
        c.output(augmentDetails(c.element(), name));
      }
    }));
Solution for Dataflow 2.X, using the State API
This solution uses a new feature that will be part of the upcoming Dataflow 2.0 release. It is not yet part of the preview releases (currently Dataflow 2.0-beta1) but you can watch the release notes to see when it is available.
The general idea is that keyed state allows us to store some values associated with the specific key. In this case, we're going to remember the latest "name" value we've seen.
Before running the stateful DoFn, we're going to wrap each element into a common element type (a NameOrDetails object). This would look something like the following:
// Convert SymbolToName entries to KV<Symbol, NameOrDetails>
PCollection<KV<Symbol, NameOrDetails>> left = symbolToName
    .apply(ParDo.of(new DoFn<SymbolToName, KV<Symbol, NameOrDetails>>() {
      @ProcessElement
      public void processElement(ProcessContext c) {
        SymbolToName e = c.element();
        c.output(KV.of(e.getSymbol(), NameOrDetails.name(e.getName())));
      }
    }));

// Convert detailed entries to KV<Symbol, NameOrDetails>
PCollection<KV<Symbol, NameOrDetails>> right = details
    .apply(ParDo.of(new DoFn<Details, KV<Symbol, NameOrDetails>>() {
      @ProcessElement
      public void processElement(ProcessContext c) {
        Details e = c.element();
        c.output(KV.of(e.getSymbol(), NameOrDetails.details(e)));
      }
    }));
// Flatten the two streams together
PCollectionList.of(left).and(right)
.apply(Flatten.create())
    .apply(ParDo.of(new DoFn<KV<Symbol, NameOrDetails>, AugmentedDetails>() {
      @StateId("name")
      private final StateSpec<ValueState<String>> nameSpec =
          StateSpecs.value(StringUtf8Coder.of());

      @ProcessElement
      public void processElement(ProcessContext c,
          @StateId("name") ValueState<String> nameState) {
        NameOrDetails e = c.element().getValue();
        if (e.isName()) {
          nameState.write(e.getName());
        } else {
          String name = nameState.read();
          if (name == null) {
            // Use the symbol if we haven't received a mapping yet.
            name = c.element().getKey();
          }
          c.output(e.getDetails().withName(name));
        }
      }
    }));

Sitecore 6.5 DMS - Registering a goal completion via the API

I want to register a goal/conversion on my Sitecore 6.5 site using the API rather than a 'thank-you' page.
I've seen this question about how to do it (Sitecore OMS - achieving a goal on a form submission), but the answer relates to the API prior to Sitecore 6.5, where it was overhauled quite significantly.
Has anyone done this? Or has this functionality been intentionally removed?
Have you tried something like this?
protected void btnSubmit_Click(object sender, EventArgs e)
{
    if (Sitecore.Analytics.Tracker.IsActive && Sitecore.Analytics.Tracker.CurrentPage != null)
    {
        PageEventData eventData = new PageEventData("My Goal Name");
        eventData.Data = "this is some event data.";
        VisitorDataSet.PageEventsRow pageEventsRow = Sitecore.Analytics.Tracker.CurrentPage.Register(eventData);
        Sitecore.Analytics.Tracker.Submit();
    }
}
That should register the goal on the current page, but only when you decide to in your code.
You can also use a modified version of the code which references the goal item by its GUID:
if (Sitecore.Analytics.Tracker.IsActive && Sitecore.Analytics.Tracker.CurrentPage != null)
{
    PageEventItem goal = new PageEventItem(Sitecore.Context.Database.GetItem("GOALGUID"));
    VisitorDataSet.PageEventsRow pageEventsRow = Sitecore.Analytics.Tracker.CurrentPage.Register(goal);
    Sitecore.Analytics.Tracker.Submit();
}
Make sure you have deployed and published your goal and/or goal category too, as the code will fail otherwise.

How to manually set a primary key in Doctrine2

I am importing data into a new Symfony2 project using Doctrine2 ORM.
All new records should have an auto-generated primary key. However, for my import, I would like to preserve the existing primary keys.
I am using this as my Entity configuration:
type: entity
id:
    id:
        type: integer
        generator: { strategy: AUTO }
I have also created a setter for the id field in my entity class.
However, when I persist and flush this entity to the database, the key I manually set is not preserved.
What is the best workaround or solution for this?
The following answer is not mine but OP's, which was posted in the question. I've moved it into this community wiki answer.
I stored a reference to the Connection object and used that to manually insert rows and update relations. This avoids the persister and identity generators altogether. It is also possible to use the Connection to wrap all of this work in a transaction.
Once you have executed the insert statements, you may then update the relations.
This is a good solution because it avoids any potential problems you may experience when swapping out your configuration on a live server.
In your init function:
// Get the Connection
$this->connection = $this->getContainer()->get('doctrine')->getEntityManager()->getConnection();
In your main body:
// Loop over my array of old data adding records
$this->connection->beginTransaction();
try
{
    foreach(array_slice($records, 1) as $record)
    {
        $this->addRecord($records[0], $record);
    }
    $this->connection->commit();
}
catch(Exception $e)
{
    $output->writeln($e->getMessage());
    $this->connection->rollBack();
    exit(1);
}
Create this function:
// Add a record to the database using the Connection
protected function addRecord($columns, $oldRecord)
{
    // Insert data into the Record table
    $record = array();
    foreach($columns as $key => $column)
    {
        $record[$column] = $oldRecord[$key];
    }
    $record['id'] = $record['rkey'];

    // Insert the data
    $this->connection->insert('Record', $record);
}
You've likely already considered this, but my approach would be to set the generator strategy to 'none' for the import so you can manually import the existing id's in your client code. Then once the import is complete, change the generator strategy back to 'auto' to let the RDBMS take over from there. A conditional can determine whether the id setter is invoked. Good luck - let us know what you end up deciding to use.
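As a sketch of that switch, using the same YAML mapping style as the question (NONE tells Doctrine to persist whatever identifier you assign):

```yaml
# Entity mapping during the import: preserve manually assigned ids
type: entity
id:
    id:
        type: integer
        generator: { strategy: NONE }
```

Once the import is done, restore `generator: { strategy: AUTO }` so new records get database-generated keys again.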