I'm trying to sync two SQL Server databases by following the Microsoft example "Synchronizing SQL Server and SQL Express", and the basic sync is working for me.
Now I tried to create a conflict by changing the same row in both databases to different values, but when I run my sync process the ApplyChangeFailed event is not fired.
I read the question "Microsoft Sync Framework Conflict Event does not fire", but I don't understand why the framework ignores conflicts when I sync a client<->server configuration.
Here is my code, just for reference. I have a remote SQL Server 2008 R2 instance as the server and a local SQL Server 2012 Express instance as the client:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;
using Microsoft.Synchronization.Data.SqlServerCe;
namespace ExecuteExpressSync
{
    class Program
    {
        static void Main(string[] args)
        {
            SqlConnection clientConn = new SqlConnection(@"Data Source=.\SQLEXPRESS; Initial Catalog=SyncExpressDB; Trusted_Connection=Yes");
            SqlConnection serverConn = new SqlConnection("Data Source=X.Y.Z.W; Initial Catalog=SyncDB; uid=sa;password=******;Integrated Security=False");

            // create the sync orchestrator
            SyncOrchestrator syncOrchestrator = new SyncOrchestrator();

            // set the local provider of the orchestrator to a sync provider associated with
            // the ProductsScope in the SyncExpressDB Express client database
            syncOrchestrator.LocalProvider = new SqlSyncProvider("ProductsScope", clientConn);

            // set the remote provider of the orchestrator to a sync provider associated with
            // the ProductsScope in the SyncDB server database
            syncOrchestrator.RemoteProvider = new SqlSyncProvider("ProductsScope", serverConn);

            // set the direction of the sync session to Upload and Download
            syncOrchestrator.Direction = SyncDirectionOrder.UploadAndDownload;

            // subscribe for errors that occur when applying changes to the client
            ((SqlSyncProvider)syncOrchestrator.LocalProvider).ApplyChangeFailed += new EventHandler<DbApplyChangeFailedEventArgs>(Program_ApplyChangeFailed);

            // execute the synchronization process
            SyncOperationStatistics syncStats = syncOrchestrator.Synchronize();

            // print statistics
            Console.WriteLine("Start Time: " + syncStats.SyncStartTime);
            Console.WriteLine("Total Changes Uploaded: " + syncStats.UploadChangesTotal);
            Console.WriteLine("Total Changes Downloaded: " + syncStats.DownloadChangesTotal);
            Console.WriteLine("Complete Time: " + syncStats.SyncEndTime);
            Console.WriteLine(String.Empty);
        }

        static void Program_ApplyChangeFailed(object sender, DbApplyChangeFailedEventArgs e)
        {
            // display conflict type
            Console.WriteLine(e.Conflict.Type);

            // display error message
            Console.WriteLine(e.Error);
        }
    }
}
As @JuneT notes, the subscription on the remote provider is missing:

// subscribe for errors that occur when applying changes to the server
((SqlSyncProvider)syncOrchestrator.RemoteProvider).ApplyChangeFailed += new EventHandler<DbApplyChangeFailedEventArgs>(Program_ApplyChangeFailed);
I am using gRPC 1.35.0 on Windows 10 and following the sample code here to create a gRPC channel for the client to use. But I have to provide a root cert to create the channel; otherwise it complains with the error below.
Then I wrote my client in a Python version, and there I can create the channel without giving a root cert.
So, is this a gRPC bug, or am I misunderstanding the sample code?
gRPC sample code
// Create a default SSL ChannelCredentials object.
auto channel_creds = grpc::SslCredentials(grpc::SslCredentialsOptions());
// Create a channel using the credentials created in the previous step.
auto channel = grpc::CreateChannel(server_name, channel_creds);
// Create a stub on the channel.
std::unique_ptr<Greeter::Stub> stub(Greeter::NewStub(channel));
// Make actual RPC calls on the stub.
grpc::Status s = stub->sayHello(&context, *request, response);
My code
const std::string SECURE_GRPC_CHANNEL_ADDRESS = <MY_SERVER>;

class GrpcChannel
{
    GrpcChannel()
    {
        auto ca_cert = get_file_contents(cacert_path);
        grpc::SslCredentialsOptions options = { ca_cert, "", "" };
        auto channel_creds = grpc::SslCredentials(options);
        channel_ = grpc::CreateChannel(SECURE_GRPC_CHANNEL_ADDRESS, channel_creds);
    }

    std::shared_ptr<grpc::Channel> channel_;
};
It turns out to be a gRPC documentation issue: gRPC core C++ on Windows does not ship a default root certificate, so the user needs to specify one. Please refer to here. (As far as I know, you can also point gRPC at a PEM bundle via the GRPC_DEFAULT_SSL_ROOTS_FILE_PATH environment variable instead of passing it in code.)
I have an Azure WebJob which I am publishing from Visual Studio 2017 to a Standard S1 App Service. The WebJob should be triggered by CRON, but it always publishes as Continuous, and I cannot figure out what I have done wrong (two other WebJobs publish fine).
I have the App Service set to 'Always On' in the application settings.
I have a settings.job file in the root with my schedule:
{
"schedule": "0 3 5 * * 1-5"
}
My Program class:
namespace EventPushUpdater
{
    using Microsoft.Azure.WebJobs;
    using MBL.AzureKeyVaultHelpers;

    internal class Program
    {
        private static void Main()
        {
            Properties.Settings s = Properties.Settings.Default;
            IKeyVault kv = new KeyVaultHelper(s.ClientId, s.ClientKey, s.KeyVaultRoot);

            var config = new JobHostConfiguration();
            config.DashboardConnectionString = kv.GetSecretValue(s.DashboardConnectionString);
            config.StorageConnectionString = kv.GetSecretValue(s.StorageConnectionString);

            var host = new JobHost(config);
            host.Call(typeof(Functions).GetMethod("PushEvents"), new { keyVault = kv });
        }
    }
}
And the function being called:
public class Functions
{
    [NoAutomaticTrigger]
    public static void PushEvents(IKeyVault keyVault)
    {
        // do stuff
    }
}
The first time you choose 'Publish as a WebJob', it asks you whether you want Continuous or On Demand (which includes scheduled).
If you picked the wrong choice, simply delete webjob-publish-settings.json under Properties, and try again.
As an aside, your code is overly complex, as you're needlessly using the WebJobs SDK. Instead, your code can simply be:

static void Main()
{
    // Do stuff
}
You can switch between 'Continuous' and 'Triggered' modes by editing the webjob-publish-settings.json file found within the Properties folder of your WebJob project.
In this JSON file you can set "runMode" to either of:
Continuous
OnDemand (triggered)
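For reference, a triggered (scheduled) WebJob's Properties\webjob-publish-settings.json looks roughly like this; the webJobName value here is only an example, so substitute your own project name:

```json
{
  "$schema": "http://schemastore.org/schemas/json/webjob-publish-settings.json",
  "webJobName": "EventPushUpdater",
  "runMode": "OnDemand"
}
```

With "runMode" set to "OnDemand", the settings.job schedule file controls when the job fires.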
Have you set { "is_singleton": true } in your settings.job?
If so, you cannot run more than one instance of your WebJob. If you publish and run your WebJob in the Azure cloud, you can never run it locally unless you use a different storage account.
I have a KML server that outputs KML data and that I can configure as a network place in Google Earth. The KML server uses embedded Jetty.
I would like to also run the KML server under Cesium, but then I need to configure Jetty to allow CORS. Cesium runs from a web browser.
There are many examples w.r.t. Jetty/CORS, but many of them do not run, are outdated, or are just unclear.
The KML server main program is:
/*
** Create HTTP server
*/
final Server server = new Server(config.getKmlPortNumber());

// Set a handler for each context
ContextHandlerCollection contexts = new ContextHandlerCollection();
Handler[] contextHandler = new Handler[ForceIdentifier.TOTAL_IDENTIFIERS + 1];

final ContextHandler context = new ContextHandler("/");
context.setContextPath("/");
context.setHandler(new DefaultHandler(env));
contextHandler[0] = context;

// Set a handler for each Force Identifier.
for (byte i = 0; i < ForceIdentifier.TOTAL_IDENTIFIERS; i++) {
    ContextHandler contexti = new ContextHandler("/" + i);
    contexti.setHandler(new DefaultHandler(env, new ForceIdentifier(i)));
    contextHandler[i + 1] = contexti;
}

contexts.setHandlers(contextHandler);
server.setHandler(contexts);

// Start the server and set some options
server.start();
//server.dumpStdErr();
server.setStopTimeout(1000);
server.setStopAtShutdown(true);

/*
** Start the federate
*/
try {
    federate.start();
} catch (RTIexception ex) {
    Main.logger.log(Level.SEVERE, null, ex);
}

/*
** Stop the federate
*/
federate.stop();
The KML server uses several context handlers.
What needs to be done to enable CORS here?
(The Jetty version is jetty-all-9.2.10.v20150310.)
org.eclipse.jetty.servlets.CrossOriginFilter, the mechanism Jetty provides for enabling CORS-related features, is only available under a ServletContext. This means your example code, which uses neither Servlets nor a ServletContext, cannot utilize this filter.
You can, however, write your own Handler to do the CORS-related work in your servlet-free environment. (Consider looking at the Cougar project and its CrossOriginHandler implementation for inspiration.)
Or you can switch to using a ServletContextHandler instead of a ContextHandler and then gain the benefit of using Jetty's CrossOriginFilter in your project.
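Taking the last option, a minimal sketch could look like the following. This assumes Jetty 9.2 with jetty-servlets on the classpath; the port, path specs, and allowed origins are placeholders you would adapt to the KML server:

```java
import java.util.EnumSet;
import javax.servlet.DispatcherType;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.servlet.FilterHolder;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlets.CrossOriginFilter;

public class CorsServerSketch {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);

        // A ServletContextHandler (unlike a plain ContextHandler)
        // provides the ServletContext that CrossOriginFilter requires.
        ServletContextHandler context =
                new ServletContextHandler(ServletContextHandler.NO_SESSIONS);
        context.setContextPath("/");

        // Register the CrossOriginFilter on all paths and configure
        // which origins, methods, and headers are allowed.
        FilterHolder cors = context.addFilter(CrossOriginFilter.class, "/*",
                EnumSet.of(DispatcherType.REQUEST));
        cors.setInitParameter(CrossOriginFilter.ALLOWED_ORIGINS_PARAM, "*");
        cors.setInitParameter(CrossOriginFilter.ALLOWED_METHODS_PARAM, "GET,POST,HEAD");
        cors.setInitParameter(CrossOriginFilter.ALLOWED_HEADERS_PARAM,
                "X-Requested-With,Content-Type,Accept,Origin");

        // The KML-serving logic would be registered here as servlets, e.g.:
        // context.addServlet(KmlServlet.class, "/kml/*");

        server.setHandler(context);
        server.start();
        server.join();
    }
}
```

The trade-off is that the existing DefaultHandler logic would need to move into servlets, since the filter only runs for requests dispatched through the servlet pipeline.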
I am trying to use the Java API to connect to Informatica. I am trying to run the samples at
C:\Program Files\Informatica\PowerCenter8.6.1\MappingSDK\samples\src\com\informatica\powercenter\sdk\mapfwk\samples, which use the com.informatica.powercenter.sdk.mapfwk.core.* libraries.
When I try to run CreateConnectionSample.java (a simple connection to the repository), I get an exception.
Code:
CachedRepositoryConnectionManager rpMgr = new CachedRepositoryConnectionManager(
        new PmrepRepositoryConnectionManager());

Repository rep = new Repository();
RepoProperties repoProp = new RepoProperties();
repoProp.setProperty(RepoPropsConstant.PC_CLIENT_INSTALL_PATH,
        "C:\\Program Files\\Informatica\\PowerCenter8.6.1\\client\\bin");
repoProp.setProperty(RepoPropsConstant.TARGET_REPO_NAME, "EDW_DEV_REPO");
repoProp.setProperty(RepoPropsConstant.REPO_SERVER_DOMAIN_NAME, "DOM_GWM_DEV01");
repoProp.setProperty(RepoPropsConstant.SECURITY_DOMAIN, "MSSB_INFA_DVLPR_DEV");
repoProp.setProperty(RepoPropsConstant.ADMIN_USERNAME, "Username");
repoProp.setProperty(RepoPropsConstant.ADMIN_PASSWORD, "Password");
repoProp.setProperty(RepoPropsConstant.TARGET_FOLDER_NAME, "CORE");
rep.setProperties(repoProp);
rep.setRepositoryConnectionManager(rpMgr);

ConnectionObject connObj = new ConnectionObject("Con", ConnectionAttributes.CONN_TYPE_RELATION);
rep.createConnection(connObj);
I am getting this exception:
com.informatica.powercenter.sdk.mapfwk.exceptions.ConnectionFailedException: Failed to list connections in PowerCenter Repository
Has anyone done this before? Can anyone help me set up the Java API?
Well, this is really old, and hopefully you ended up getting connected using the SDK. Here's some recent code I put together to get a connection and query some information about workflows.
public static void main(String[] args) throws Exception {
    if (System.getenv("INFA_DOMAINS_FILE") == null) // make sure the .infa file exists
        throw new Exception("INFA_DOMAINS_FILE path not set in environment variables.");

    Repository rep = new Repository();
    RepoConnectionInfo rci = new RepoConnectionInfo();
    rci.setRepoServerHost("your host DNS name");     // set host URI
    rci.setRepoServerPort("your host port number");  // host port
    rci.setRepoServerDomainName("your-domain-name"); // repository domain name
    rci.setTargetRepoName("your-repository");        // repository
    rci.setSecurityDomain("e-directory");            // security type
    rci.setAdminUsername("your-credentials");        // uid
    rci.setAdminPassword(getPassword());             // pwd (stored in an environment variable -- encoded so it's not cleartext)
    rci.setPmrepCacheFolder("c:\\users\\your-credentials\\Informatica\\"); // a cache folder that must be set
    rci.setPcClientInstallPath("C:\\Informatica\\9.0.1\\clients\\PowerCenterClient\\client\\bin\\");
    rep.setRepoConnectionInfo(rci); // provide connection info to the rep object

    RepositoryConnectionManager repmgr = new PmrepRepositoryConnectionManager(); // set up the repository connection manager
    rep.setRepositoryConnectionManager(repmgr); // tell the repository to use the connection manager

    System.out.println("Folders:");
    System.out.println("===========================================================================");
    List<Folder> folders = rep.getFolders();
    for (Folder f : folders) { System.out.println(f); }
}
Goal:
Retrieve data from Dynamics CRM 2011 into my SQL Server R2 database by using a web service through Integration Services (SSIS). The web service needs to be located inside of SSIS. I'm going to use the data for a data warehouse.
Problem:
How do I do it?
We only write to Dynamics, so I can't give the specific method name, but the general idea below should get you started.
Assumptions
Two variables have been defined in your package and they are passed to the script component as ReadOnlyVariables: CrmOrganizationName, CrmWebServiceUrl.
A script component has been added to the dataflow as a Source component. On the Inputs and Outputs tab, an appropriate number of columns have been added to Output 0 (or whatever you define your output collection as) with appropriate data types.
Inside the script, add a web reference to your CRM instance. This code assumes it's called CrmSdk.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Windows.Forms;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
// web reference
using CrmSdk;
[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    public override void CreateNewOutputRows()
    {
        // Get a reference to the CRM SDK
        CrmSdk.CrmService CrmService = new CrmSdk.CrmService();

        // An authentication token is required because CRM requires an OrganizationName
        // to identify the organization to be used
        CrmSdk.CrmAuthenticationToken token = new CrmSdk.CrmAuthenticationToken();
        token.AuthenticationType = 0;
        token.OrganizationName = this.Variables.CrmOrganizationName;
        CrmService.CrmAuthenticationTokenValue = token;

        // Use default credentials
        CrmService.Credentials = System.Net.CredentialCache.DefaultCredentials;

        // Get the web service url from the config file
        CrmService.Url = this.Variables.CrmWebServiceUrl;

        //////////////////////////////////////////////////
        // This code is approximate.
        // Use the appropriate service call to retrieve
        // data and then enumerate through it. For each
        // row encountered, call the AddRow() method for
        // your buffer and then populate the fields. Be
        // wary of NULLs.
        //////////////////////////////////////////////////
        foreach (CrmSdk.entity person in CrmService.Get())
        {
            Output0Buffer.AddRow();
            Output0Buffer.FirstName = person.FirstName;
            Output0Buffer.LastName = person.LastName;
        }
    }
}
Caveats
There is no error handling, no null checks, nor anything elegant. The service should probably have been defined with a using statement, etc., etc. But it should provide an appropriate starting point for understanding how to consume a web service and load data into the pipeline.
The easiest solution for your requirement is to use a third-party library for SSIS. The commercial COZYROC SSIS+ library includes Dynamics CRM adapters, which support all deployment models: Premise, Live, Hosted, Federation, Office 365.