Sitecore: Session Started Pipeline

I need to detect when a new session is started and I'm unable to locate a pipeline or processor for it. Ironically, I've found the sessionEnd pipeline!
The only other way I seem to be able to make it work is by using Global.asax to hook the session start event.

You can add your own processor to the httpRequestBegin pipeline and check for a session flag yourself:
var session = HttpContext.Current.Session;
if (session != null && session["exist"] == null)
{
    // your code that should be executed on session start
    session["exist"] = true;
}
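For completeness, here is a rough sketch of how that check could live in a dedicated processor. The class and namespace names are made up, and it assumes session state is actually available at the point in the request lifecycle where you patch the processor in (registered with a config patch under the httpRequestBegin pipeline):
using System.Web;
using Sitecore.Pipelines.HttpRequest;

namespace MySite.Pipelines // hypothetical namespace
{
    public class SessionStartProcessor : HttpRequestProcessor // hypothetical class name
    {
        public override void Process(HttpRequestArgs args)
        {
            // Session state is not available for every request (media requests,
            // requests handled before session state is acquired, etc.), so guard for null.
            var httpContext = HttpContext.Current;
            if (httpContext == null || httpContext.Session == null)
                return;

            if (httpContext.Session["exist"] == null)
            {
                // Runs once per session: put your "session started" logic here.
                httpContext.Session["exist"] = true;
            }
        }
    }
}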

You might find what you're looking for here
Depending on what you're looking for, the startTracking pipeline may be what you want.

Related

Enabling Change Tracking for Entity

Is there a way to enable change tracking for a couple of entities programmatically? I could not find any API in Dataverse that can help with this.
You cannot do that with the Web API, but you can definitely achieve it programmatically.
You can either create a console application or run the code through the Code Now plugin for XrmToolBox.
Below is a code snippet for reference.
UpdateEntityRequest updateBankAccountRequest = new UpdateEntityRequest
{
    Entity = BankAccountEntity,
    ChangeTrackingEnabled = true // or false to disable
};
_serviceProxy.Execute(updateBankAccountRequest);
Microsoft docs for ChangeTrackingEnabled
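If you would rather work from the table's metadata explicitly, a sketch along these lines should also do it; the "account" logical name and the service variable are placeholders, and EntityMetadata.ChangeTrackingEnabled is the property being flipped:
// Sketch: retrieve the table's metadata, flip ChangeTrackingEnabled, and push the update back.
// Assumes an already-connected IOrganizationService named "service".
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

var retrieveRequest = new RetrieveEntityRequest
{
    LogicalName = "account",             // placeholder table name
    EntityFilters = EntityFilters.Entity // entity-level metadata is enough here
};
var retrieveResponse = (RetrieveEntityResponse)service.Execute(retrieveRequest);

EntityMetadata metadata = retrieveResponse.EntityMetadata;
metadata.ChangeTrackingEnabled = true; // or false to disable

service.Execute(new UpdateEntityRequest { Entity = metadata });
// Depending on your environment you may also want to publish customizations afterwards.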

Is there any way to monitor Azure Synapse Pipelines execution?

In my project, I need to show how a pipeline is progressing on a custom web portal built in PHP. Is there any way, in any language such as C# or Java, to list pipelines and monitor their progress, or even log it into Application Insights?
Are you labelling your queries with the OPTION (LABEL = 'MY LABEL') syntax?
This makes it easy to monitor the progress of your pipeline by querying sys.dm_pdw_exec_requests to pick out individual queries, and if you use a naming convention like 'pipeline_query' you can probably achieve what you want.
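As an illustration only, a small C# check of that DMV from a portal backend could look like the following; the connection string, label prefix and column selection are all assumptions:
// Sketch: poll sys.dm_pdw_exec_requests for labelled queries (assumed label prefix "pipeline_").
// Requires the Microsoft.Data.SqlClient package and a connection string to the dedicated SQL pool.
using System;
using Microsoft.Data.SqlClient;

const string connectionString = "<dedicated SQL pool connection string>"; // placeholder
using var connection = new SqlConnection(connectionString);
connection.Open();

using var command = new SqlCommand(
    @"SELECT request_id, [label], [status], submit_time, end_time, total_elapsed_time
      FROM sys.dm_pdw_exec_requests
      WHERE [label] LIKE 'pipeline_%'
      ORDER BY submit_time DESC", connection);

using var reader = command.ExecuteReader();
while (reader.Read())
{
    Console.WriteLine($"{reader["label"]}: {reader["status"]} ({reader["total_elapsed_time"]} ms)");
}
Alternatively, if you are orchestrating with Synapse pipelines themselves, you can poll run status through the Azure.Analytics.Synapse.Artifacts SDK, for example: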
try
{
    // Requires the Azure.Analytics.Synapse.Artifacts and Azure.Identity packages.
    PipelineRunClient pipelineRunClient = new(new Uri(_Settings.SynapseExtractEndpoint), new DefaultAzureCredential());
    PipelineRun run = await pipelineRunClient.GetPipelineRunAsync(runId);
    while (run.Status == "InProgress" || run.Status == "Queued")
    {
        _Logger.LogInformation($"!!Pipeline {run.PipelineName} {runId} Status: {run.Status}");
        await Task.Delay(30000);
        run = await pipelineRunClient.GetPipelineRunAsync(runId);
    }
    _Logger.LogInformation($"!!Pipeline {run.PipelineName} {runId} Status: {run.Status} Runtime: {run.DurationInMs} Message: {run.Message}");
}
catch (Exception ex)
{
    _Logger.LogError(ex, $"Monitoring pipeline run {runId} failed");
}
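If you also need to obtain the runId in the first place, the same package has a PipelineClient that can start a run. This is a sketch only; the workspace endpoint and pipeline name are placeholders, and you should confirm the method shapes against the package version you use:
// Sketch: trigger a pipeline run and feed the run id into the polling loop above.
// Assumes the Azure.Analytics.Synapse.Artifacts and Azure.Identity packages.
using System;
using Azure.Analytics.Synapse.Artifacts;
using Azure.Analytics.Synapse.Artifacts.Models;
using Azure.Identity;

var endpoint = new Uri("https://<your-workspace>.dev.azuresynapse.net"); // placeholder
var pipelineClient = new PipelineClient(endpoint, new DefaultAzureCredential());

CreateRunResponse runResponse = await pipelineClient.CreatePipelineRunAsync("MyPipeline"); // placeholder pipeline name
string runId = runResponse.RunId;
// Pass runId to PipelineRunClient.GetPipelineRunAsync as shown in the loop above.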

Cancelling an upload task

I've done some reading regarding the Azure SDK and in order to cancel a task you seemingly need to pass in a cancellation_token.
My upload code is very simple:
azure::storage::cloud_block_blob blockBlob = container.get_block_blob_reference(fileLeaf.wstring());
auto task = blockBlob.upload_from_file_async(fullFilePath);
However, some files I upload are potentially very large, and I would like to be able to cancel this operation. I'll probably also use continuations and would need all of those to be cancelled too, if that's possible.
The problem I'm having is I can't see any way of attaching a cancellation_token to the task.
Any pointers?
There is sample code that uses the PPL library; I referred to it and adapted the code to cancel a task using the PPLX library from the C++ REST SDK, which is what the Azure Storage SDK for C++ uses. Please try the code below.
// Declare a pplx::cancellation_token_source and get its cancellation_token, see
// http://microsoft.github.io/cpprestsdk/classpplx_1_1cancellation__token__source.html
#include <pplxtasks.h>

pplx::cancellation_token_source cts;
auto token = cts.get_token();

// Pass the cancellation_token to the task via the then() overload that accepts a token,
// see https://msdn.microsoft.com/en-us/library/jj987787.aspx
task.then([]{}, token).wait();

// Cancel the task (call this from the code path that decides to abort the upload)
cts.cancel();
Hope it helps.

C++ Poco ODBC Transactions - AutoCommit mode

I am currently attempting to use transactions in my C++ app, but I have a problem with ODBC's auto-commit mode.
I am using the POCO libraries to create a connection to a PostgreSQL database on the same machine. Currently, I can send data to this database as single statements, but I cannot work out how to use Poco's transaction support to send this data more quickly.
I have several thousand records to insert, so continuing to use single insert statements is extremely slow and impractical; I am therefore trying to use Poco's transactions to speed this up (by a fair bit).
The error I am encountering is, in theory, a simple one. Poco is throwing the following error:
'Invalid access: Session is in auto commit mode.'
I understand that, as a result of this, I should somehow set "auto commit" to false, as it currently only allows me to commit data to the database line by line, rather than as a single transaction.
The problem is how to set this.
Currently, I have a session created from Session.h that looks a lot like this:
session = new Poco::Data::Session(
    "ODBC",
    connection_data.str()
);
Where connection_data is a simple stringstream with the login information, password, database, server and "Driver={PostgreSQL ANSI};" to tell ODBC to use PostgreSQL's driver.
I have tried simply setting an "autocommit" property to false through the session's setFeature or setProperty methods; this, of course, was to no avail (it was more of a last-ditch attempt at this point).
session->setFeature("AUTOCOMMIT", false);
Looking around, I saw a possible alternative: creating an ODBC SessionImpl directly from ODBC/session/SessionImpl.h instead of using the generic method above, and then creating a new session object from it.
The benefit of this is that ODBC's SessionImpl has references to auto-commit mode in its header, which suggests it would be able to handle this:
void autoCommit(const std::string&, bool val);
    /// Sets autocommit property for the session.
However, having not used SessionImpl before, I cannot guarantee that this will work, or that I can get it to work with the limited documentation available.
I am using C++03 (not C++11), with Visual Studio 2015
Poco 1.7.5
Boost (where needed)
Would anyone know the correct way of setting this feature (above), or an alternative method of achieving this?
Edit: Looking at the source of Poco, at:
https://github.com/pocoproject/poco/blob/develop/Data/ODBC/src/SessionImpl.cpp#L153
the property seems to be named autoCommit, and looking at
https://github.com/pocoproject/poco/blob/develop/Data/include/Poco/Data/AbstractSessionImpl.h#L120
the case of the property names seems to matter. So, does it help if you use session->setFeature("autoCommit", false);?
Can't you just call session->begin(); and session->commit(); on the corresponding Session object?
What is returned by session->canTransact()?
According to the docs, begin() will start a new transaction; the docs do not mention any property that needs to be set before or after.
See: https://pocoproject.org/docs/Poco.Data.Session.html
I also faced a similar issue.
First of all, before begin() you need:
m_ses.setFeature("autoCommit", false);
m_ses.begin();
The second issue is that the "autoCommit" feature stays false for all other sessions, so for the next session don't forget to call:
session.setFeature("autoCommit", true);

ember-data 2.0 and Offline

I am creating a new Ember app and want to use the newest version of ember-data (ember-data 2.0). It will be a mobile web app, so it must handle variable network access and even being offline.
I want it to store all data locally and use that data when it goes offline, so the user gets the same experience regardless of the network connectivity.
Is ember-data 2.0 capable of handling the offline case? Do I just make an adapter that detects offline/online and then do....?
Or do I have to make my own in-between layer to hide the offline handling from ember-data?
Are there any libraries out there that have this problem solved? I have found some, but are there any that are up to date with the latest version of ember-data?
If the device goes offline and the user tries to transition to a route whose model is not loaded yet, you will get an error. You need to handle these situations yourself; for example, you can create a nice page with an error message and a refresh button. To do this:
First, in the application route, create an error action (it will catch errors thrown during the model hook), and when an error occurs, save the transition in memory. Do not try to use local storage for this task; it will save only plain properties, while we need the actual transition object. Use either window.failedTransition or inject into controllers and routes a simple object that will hold the failed transition.
actions: {
    error: function (error, transition) {
        transition.abort();
        /**
         * You need to adjust this line, as you don't have memoryStorage
         * injected. Use window.failedTransition, or create a simple
         * storage object; it's up to you.
         */
        this.get('memoryStorage').set('failedTransition', transition);
        return true; // This line is important, or the whole idea will not work
    }
}
Second, create an error controller and template. In the error controller, define a retry action:
actions: {
    retry: function () {
        /**
         * Correct this line too
         */
        var transition = this.get('memoryStorage').getAndRemove('failedTransition');
        if (transition !== undefined) {
            transition.retry();
        }
    }
}
Finally, in the error template, display the status and error text (if any is available) and a button with that retry action to retry the transition.
This is a simple solution for a simple case (the device going offline for just a few seconds); you may need something more complex. If you want your application to work fully without network access, then you may want to use local storage for all data (there is an addon: https://github.com/funkensturm/ember-local-storage) and sync it with the server from time to time (e.g. sync data every 10 seconds in the background). Unfortunately I haven't tried such things myself, but I think it is possible.