Java API for Informatica

I am trying to use the Java API to connect to Informatica. I am trying to run the samples at
C:\Program Files\Informatica\PowerCenter8.6.1\MappingSDK\samples\src\com\informatica\powercenter\sdk\mapfwk\samples, which use the com.informatica.powercenter.sdk.mapfwk.core.* libraries.
When I try to run CreateConnectionSample.java (a simple connection to the repository) I get an exception.
code:
CachedRepositoryConnectionManager rpMgr = new CachedRepositoryConnectionManager(
new PmrepRepositoryConnectionManager());
Repository rep = new Repository();
RepoProperties repoProp = new RepoProperties();
repoProp.setProperty(RepoPropsConstant.PC_CLIENT_INSTALL_PATH,
"C:\\Program Files\\Informatica\\PowerCenter8.6.1\\client\\bin");
repoProp.setProperty(RepoPropsConstant.TARGET_REPO_NAME, "EDW_DEV_REPO");
repoProp.setProperty(RepoPropsConstant.REPO_SERVER_DOMAIN_NAME, "DOM_GWM_DEV01");
repoProp.setProperty(RepoPropsConstant.SECURITY_DOMAIN, "MSSB_INFA_DVLPR_DEV");
repoProp.setProperty(RepoPropsConstant.ADMIN_USERNAME, "Username");
repoProp.setProperty(RepoPropsConstant.ADMIN_PASSWORD, "Password");
repoProp.setProperty(RepoPropsConstant.TARGET_FOLDER_NAME,"CORE");
rep.setProperties(repoProp);
rep.setRepositoryConnectionManager(rpMgr);
ConnectionObject connObj = new ConnectionObject("Con", ConnectionAttributes.CONN_TYPE_RELATION);
rep.createConnection(connObj);
I am getting this exception:
com.informatica.powercenter.sdk.mapfwk.exceptions.ConnectionFailedException: Failed to list connections in PowerCenter Repository
Has anyone done this before? Can anyone help me set up the Java API?

Well, this is really old, and hopefully you ended up getting connected using the SDK. Here's some recent code I put together to get a connection and query some stuff about workflows.
public static void main(String[] args) throws Exception {
if(System.getenv("INFA_DOMAINS_FILE") == null) // make sure the INFA_DOMAINS_FILE environment variable is set (it points to the domains.infa file)
throw new Exception("INFA_DOMAINS_FILE path not set in environment variables.");
Repository rep = new Repository();
RepoConnectionInfo rci = new RepoConnectionInfo();
rci.setRepoServerHost("your host DNS name"); // set host URI
rci.setRepoServerPort("your host port number"); // host port
rci.setRepoServerDomainName("your-domain-name"); // repository domain name
rci.setTargetRepoName("your-repository"); // repository
rci.setSecurityDomain("e-directory"); // security domain
rci.setAdminUsername("your-credentials"); // uid
rci.setAdminPassword(getPassword()); // pwd (stored in an environment variable, encoded so it's not cleartext; see the sketch after this block)
rci.setPmrepCacheFolder("c:\\users\\your-credentials\\Informatica\\"); // some cache folder that must be set
rci.setPcClientInstallPath("C:\\Informatica\\9.0.1\\clients\\PowerCenterClient\\client\\bin\\");
rep.setRepoConnectionInfo(rci); // provide connection info to rep object
RepositoryConnectionManager repmgr = new PmrepRepositoryConnectionManager(); // set up repository connection manager
rep.setRepositoryConnectionManager(repmgr); // tell repository to use connection manager
System.out.println("Folders:");
System.out.println("===========================================================================");
List<Folder> folders = rep.getFolders();
for(Folder f: folders) { System.out.println(f);}
}
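For reference, getPassword() above is not an SDK call; it just pulls an encoded password out of an environment variable so it never sits in the source in cleartext. Here is a minimal sketch of that idea, assuming a hypothetical INFA_REPO_PASSWORD variable and plain Base64 encoding (Java 8+):
// Sketch only: the variable name and Base64 encoding are assumptions,
// not something mandated by the Informatica SDK.
private static String getPassword() throws Exception {
    String encoded = System.getenv("INFA_REPO_PASSWORD");
    if (encoded == null)
        throw new Exception("INFA_REPO_PASSWORD not set in environment variables.");
    byte[] decoded = java.util.Base64.getDecoder().decode(encoded);
    return new String(decoded, java.nio.charset.StandardCharsets.UTF_8);
}
A stronger setup would keep the password in an OS credential store or an encrypted file, but the environment-variable approach matches what the comment in the code above describes.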

Related

gRPC CreateChannel() error: Could not get default pem root certs

I am using gRPC 1.35.0 on Windows 10 and following the sample code here to create a gRPC channel for the client to use. But I have to provide a root cert to create the channel; otherwise it complains with the error below.
Then I wrote my client in Python and I can create the channel without giving a root cert.
So, is this a gRPC bug, or am I misunderstanding the sample code?
gRPC sample code
// Create a default SSL ChannelCredentials object.
auto channel_creds = grpc::SslCredentials(grpc::SslCredentialsOptions());
// Create a channel using the credentials created in the previous step.
auto channel = grpc::CreateChannel(server_name, channel_creds);
// Create a stub on the channel.
std::unique_ptr<Greeter::Stub> stub(Greeter::NewStub(channel));
// Make actual RPC calls on the stub.
grpc::Status s = stub->sayHello(&context, *request, response);
my code
const std::string SECURE_GRPC_CHANNEL_ADDRESS = <MY_SERVER>;
class GrpcChannel
{
GrpcChannel()
{
auto ca_cert = get_file_contents(cacert_path);
SslCredentialsOptions options = { ca_cert, "", "" };
auto channel_creds = SslCredentials(options);
channel_ = grpc::CreateChannel(SECURE_GRPC_CHANNEL_ADDRESS, channel_creds);
}
std::shared_ptr<grpc::Channel> channel_; // member used above, declared here for completeness
};
Turns out it's a gRPC documentation issue: gRPC core C++ on Windows does not support the default root certs, so the user needs to specify one. Please refer to here.

Cross-origin requests (CORS) with embedded Jetty

I have a KML Server that outputs KML data and that I can configure as a network place in Google Earth. The KML Server uses embedded Jetty.
I would like to also run the KML Server under Cesium, but then I need to configure Jetty to allow CORS. Cesium runs from a web browser.
There are many examples w.r.t. Jetty/CORS, but many of them do not run, are outdated, or are just unclear.
The KML Server main program is:
/*
** Create HTTP server
*/
final Server server = new Server(config.getKmlPortNumber());
// Set a handler for each context
ContextHandlerCollection contexts = new ContextHandlerCollection();
Handler[] contextHandler = new Handler[ForceIdentifier.TOTAL_IDENTIFIERS + 1];
final ContextHandler context = new ContextHandler("/");
context.setContextPath("/");
context.setHandler(new DefaultHandler(env));
contextHandler[0] = context;
// Set a handler for each Force Identifier.
for (byte i = 0; i < ForceIdentifier.TOTAL_IDENTIFIERS; i++) {
ContextHandler contexti = new ContextHandler("/" + i);
contexti.setHandler(new DefaultHandler(env, new ForceIdentifier(i)));
contextHandler[i + 1] = contexti;
}
contexts.setHandlers(contextHandler);
server.setHandler(contexts);
// Start the server and set some options
server.start();
//server.dumpStdErr();
server.setStopTimeout(1000);
server.setStopAtShutdown(true);
/*
** Start the federate
*/
try {
federate.start();
} catch (RTIexception ex) {
Main.logger.log(Level.SEVERE, null, ex);
}
/*
** Stop the federate
*/
federate.stop();
The KML Server uses several context handlers.
What needs to be done to enable CORS here?
(Jetty version: jetty-all-9.2.10.v20150310)
org.eclipse.jetty.servlets.CrossOriginFilter, the mechanism Jetty provides for enabling CORS-related features, is only available under a ServletContext, meaning your example code, which doesn't use Servlets or a ServletContext, cannot utilize this filter.
You can, however, make your own Handler to do the CORS-related work for your servlet-free environment. (Consider looking at the cougar project and its CrossOriginHandler implementation for inspiration.)
Or you can switch to using a ServletContextHandler instead of a ContextHandler and then gain the benefit of using Jetty's CrossOriginFilter in your project, as in the sketch below.
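To illustrate that second option, here is a minimal sketch of how the root "/" context from the question could be rebuilt with a ServletContextHandler and a CrossOriginFilter on Jetty 9.2. The init parameters are only examples (tighten the allowed origins for production), and KmlServlet is a hypothetical servlet that would wrap the existing DefaultHandler(env) logic:
import java.util.EnumSet;
import javax.servlet.DispatcherType;
import org.eclipse.jetty.servlet.FilterHolder;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlet.ServletHolder;
import org.eclipse.jetty.servlets.CrossOriginFilter;
// Root context as a ServletContextHandler so servlet filters can be applied.
ServletContextHandler context = new ServletContextHandler(ServletContextHandler.NO_SESSIONS);
context.setContextPath("/");
// CrossOriginFilter adds the CORS preflight handling and response headers for every request.
FilterHolder cors = new FilterHolder(CrossOriginFilter.class);
cors.setInitParameter(CrossOriginFilter.ALLOWED_ORIGINS_PARAM, "*");
cors.setInitParameter(CrossOriginFilter.ALLOWED_METHODS_PARAM, "GET,POST,HEAD");
cors.setInitParameter(CrossOriginFilter.ALLOWED_HEADERS_PARAM, "X-Requested-With,Content-Type,Accept,Origin");
context.addFilter(cors, "/*", EnumSet.of(DispatcherType.REQUEST));
// KmlServlet is hypothetical: a servlet producing the same KML output as DefaultHandler(env).
context.addServlet(new ServletHolder(new KmlServlet(env)), "/*");
// Plug the context back into the ContextHandlerCollection in place of contextHandler[0].
contextHandler[0] = context;
The per-ForceIdentifier contexts can be converted the same way, each getting its own ServletContextHandler and servlet.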

Use wsse security header in soap message (Visual Studio 2015, .Net Framework 4.5)

I would like to consume a Soap Service provided by DHL. You can find the wsdl here: https://wsbexpress.dhl.com/sndpt/expressRateBook?WSDL
Therefore I created a new ClassLibrary in Visual Studio 2015 targeting .net framework 4.5.
Then I added a Web Reference to the created project by providing the WSDL address. It generated a proxy file with all types and ports in it, but my first problem is that the generated service extends System.Web.Services.Protocols.SoapHttpClientProtocol. As I read in recent posts, it is not possible to get the WSSE header into that proxy. Some posts advise adding WSE, but it seems WSE is not supported by newer Visual Studio versions.
I tried to generate my proxy with svcutil. After that I added the generated .cs file to the project and copied the content of the generated config file to app.config. (Of course I removed the Web Reference.)
Now the service class extends System.ServiceModel.ClientBase. (I thought the generator in VS uses svcutil internally. If Microsoft wants people to use WCF, why does the generator generate non-WCF proxy files?)
I also created an NUnit test project which should test my service, but if I use the svcutil-generated version I get an error. I'll try to translate it to English, as the error is displayed in German:
Could not find a default endpoint element which points to the service contract. As I figured out, this is because the proxy is in its own class library and therefore doesn't really have an app.config. But my test project is a class library too.
What would be the actual way to consume a web service which needs WS-Security Username/Password auth these days?
You can add the Web Reference in compatibility mode (I am guessing you are doing so). If you are not adding the reference in compatibility mode, do the following:
Right-click on References -> Add Service Reference -> Advanced -> Add Web Reference (below the compatibility section), type the URL of the WS, and add the reference.
The WSE 2.0 extensions are available as a NuGet package at:
https://www.nuget.org/packages/Microsoft.Web.Services2/
Install the NuGet package from the Package Manager Console by running the following command:
Install-Package Microsoft.Web.Services2
After you install the NuGet package, make sure your project references the following DLLs:
System.Web
System.Web.Services
Microsoft.Web.Services2 (This will be added after you install the nuget package)
In order to use the WSE 2.0 extensions, you need to modify the proxy class that was created when you added the Web Reference so it inherits from "Microsoft.Web.Services2.WebServicesClientProtocol" instead of "System.Web.Services.Protocols.SoapHttpClientProtocol". Be aware that if you update the Web Reference, the proxy class will inherit again from SoapHttpClientProtocol.
Add the following using clauses to the code consuming the Proxy class:
using Microsoft.Web.Services2;
using Microsoft.Web.Services2.Security;
using Microsoft.Web.Services2.Security.Tokens;
After you make these changes, your code should look something like this:
var token = new UsernameToken("theUser", "thePassword", PasswordOption.SendHashed);
var serviceProxy = new ExpressRateBook.gblExpressRateBook();
SoapContext requestContext = serviceProxy.RequestSoapContext;
requestContext.Security.Timestamp.TtlInSeconds = 60;
requestContext.Security.Tokens.Add(token);
//The rest of the logic goes here...
NOTE: I was unable to test the code since I am unfamiliar with the actual methods that you need to consume; the code displayed is just an example of what I saw in the proxy class, so update it according to your needs. It should work fine if you follow the steps described before. Check the following link for more detailed instructions:
https://msdn.microsoft.com/en-us/library/ms819938.aspx
You can configure your Service Reference to add the security header as AW Rowse describes at http://cxdeveloper.com/article/implementing-ws-security-digest-password-nonce-net-40-wcf:
private void Configure()
{
System.Net.ServicePointManager.ServerCertificateValidationCallback = (senderX, certificate, chain, sslPolicyErrors) => { return true; };
defaultBinding = new BasicHttpBinding
{
Security =
{
Mode = BasicHttpSecurityMode.Transport,
Transport =
{
ClientCredentialType = HttpClientCredentialType.Digest
}
}
};
defaultToken = new UsernameToken(UserName, Password, PasswordOption.SendHashed);
defaultSecurityHeader = MessageHeader.CreateHeader(
"Security",
"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd",
defaultToken.GetXml(new XmlDocument())
);
}
And create your client/proxy like this:
public consulta_informacao_respttClient CriaConsultaClinicaClient()
{
var client = new consulta_informacao_respttClient(defaultBinding, new EndpointAddress("https://resqa.homologacao.unimed.coop.br/chs-integration-external-services-ptu-clinical/proxy-services/execute-query/execute-query-proxy-service"));
client.ClientCredentials.UserName.UserName = UserName;
client.ClientCredentials.UserName.Password = Password;
var scope = new OperationContextScope(client.InnerChannel);
OperationContext.Current.OutgoingMessageHeaders.Add(defaultSecurityHeader);
return client;
}
The fields you will need to declare in your class are:
private BasicHttpBinding defaultBinding;
private UsernameToken defaultToken;
private MessageHeader defaultSecurityHeader;
You won't need to configure anything in app/web.config.

ApplyChangeFailed is not fired on conflict

I'm trying to sync two SQL Server DBs by following the Microsoft example from here: Synchronizing SQL Server and SQL Express. The basic sync is working for me.
Now I tried to create a conflict by changing the same row in both DBs to different values, but when I run my sync process the ApplyChangeFailed event is not fired.
I read this question, Microsoft Sync Framework Conflict Event does not fire, but I don't understand why the framework ignores conflicts when I sync a client<->server configuration.
Here is my code, just for reference. I have a remote SQL 2008 R2 server as the server and a local SQL 2012 Express as the client:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;
using Microsoft.Synchronization.Data.SqlServerCe;
namespace ExecuteExpressSync
{
class Program
{
static void Main(string[] args)
{
SqlConnection clientConn = new SqlConnection(@"Data Source=.\SQLEXPRESS; Initial Catalog=SyncExpressDB; Trusted_Connection=Yes");
SqlConnection serverConn = new SqlConnection("Data Source=X.Y.Z.W; Initial Catalog=SyncDB; uid=sa;password=******;Integrated Security=False");
// create the sync orchestrator
SyncOrchestrator syncOrchestrator = new SyncOrchestrator();
// set local provider of orchestrator to a sync provider associated with the
// ProductsScope in the SyncExpressDB express client database
syncOrchestrator.LocalProvider = new SqlSyncProvider("ProductsScope", clientConn);
// set the remote provider of orchestrator to a server sync provider associated with
// the ProductsScope in the SyncDB server database
syncOrchestrator.RemoteProvider = new SqlSyncProvider("ProductsScope", serverConn);
// set the direction of sync session to Upload and Download
syncOrchestrator.Direction = SyncDirectionOrder.UploadAndDownload;
// subscribe for errors that occur when applying changes to the client
((SqlSyncProvider)syncOrchestrator.LocalProvider).ApplyChangeFailed += new EventHandler<DbApplyChangeFailedEventArgs>(Program_ApplyChangeFailed);
// execute the synchronization process
SyncOperationStatistics syncStats = syncOrchestrator.Synchronize();
// print statistics
Console.WriteLine("Start Time: " + syncStats.SyncStartTime);
Console.WriteLine("Total Changes Uploaded: " + syncStats.UploadChangesTotal);
Console.WriteLine("Total Changes Downloaded: " + syncStats.DownloadChangesTotal);
Console.WriteLine("Complete Time: " + syncStats.SyncEndTime);
Console.WriteLine(String.Empty);
}
static void Program_ApplyChangeFailed(object sender, DbApplyChangeFailedEventArgs e)
{
// display conflict type
Console.WriteLine(e.Conflict.Type);
// display error message
Console.WriteLine(e.Error);
}
}
}
As @JuneT notes, this line is missing (the handler also needs to be attached to the remote provider):
// subscribe for errors that occur when applying changes to the server
((SqlSyncProvider)syncOrchestrator.RemoteProvider).ApplyChangeFailed += new EventHandler<DbApplyChangeFailedEventArgs>(Program_ApplyChangeFailed);

How do I duplicate certificate authentication (Mumble (C/C++)) in Python?

Alright, so before I really get into this post, I have to warn you that this might not be an easy fix. Whoever reads and is able to reply to this post must know a lot of C/C++ and at least some Python to be able to answer the question I have above.
Basically, I have a connection method from Mumble (a VoIP client) that connects to a server and sends it an SSL certificate for authentication purposes. I also have a Python script that connects to the same Mumble VoIP server, but I don't send a certificate.
I need to modify my existing code to send a certificate, as the current Mumble client does.
--
Here is the C++ code that seems to send a certificate:
ServerHandler::ServerHandler() {
MumbleSSL::addSystemCA();
{
QList<QSslCipher> pref;
foreach(QSslCipher c, QSslSocket::defaultCiphers()) {
if (c.usedBits() < 128)
continue;
pref << c;
}
if (pref.isEmpty())
qFatal("No ciphers of at least 128 bit found");
QSslSocket::setDefaultCiphers(pref);
}
void ServerHandler::run() {
qbaDigest = QByteArray();
QSslSocket *qtsSock = new QSslSocket(this);
qtsSock->setPrivateKey(g.s.kpCertificate.second);
qtsSock->setLocalCertificate(g.s.kpCertificate.first.at(0));
QList<QSslCertificate> certs = qtsSock->caCertificates();
certs << g.s.kpCertificate.first;
qtsSock->setCaCertificates(certs);
cConnection = ConnectionPtr(new Connection(this, qtsSock));
qtsSock->setProtocol(QSsl::TlsV1);
qtsSock->connectToHostEncrypted(qsHostName, usPort);
void ServerHandler::serverConnectionConnected() {
tConnectionTimeoutTimer->stop();
qscCert = cConnection->peerCertificateChain();
qscCipher = cConnection->sessionCipher();
if (! qscCert.isEmpty()) {
const QSslCertificate &qsc = qscCert.last();
qbaDigest = sha1(qsc.publicKey().toDer());
bUdp = Database::getUdp(qbaDigest);
} else {
bUdp = true;
}
QStringList tokens = Database::getTokens(qbaDigest);
foreach(const QString &qs, tokens)
mpa.add_tokens(u8(qs));
QMap<int, CELTCodec *>::const_iterator i;
for (i=g.qmCodecs.constBegin(); i != g.qmCodecs.constEnd(); ++i)
mpa.add_celt_versions(i.key());
sendMessage(mpa);
--
And alas, this is what I do to connect to it right now (in Python):
try:
self.socket.connect(self.host)
except:
print self.threadName,"Couldn't connect to server"
return
self.socket.setblocking(0)
print self.threadName,"connected to server"
--
So... what more do I need to do to my Python source to connect to servers that require a certificate? My source currently connects just fine to any Mumble server with requirecert set to false. I need it to work on all servers, as this will be used on my own server (which, ironically enough, has requirecert on).
I can pregenerate the certificate as a .p12 or whatever type of file, so I don't need the program to generate the cert. I just need it to send the cert the way the server wants it (as is done in the C++ I posted).
Please help me really soon! If you need more info, message me again.
Stripped out all irrelevant code, now it's just the code that deals with ssl.
From the C++ code it looks like you simply need to have SSL support, negotiate with the correct certificate file, and encrypt the payload with the correct private key. Those certificates and private keys are most likely stored in your original program somewhere. If there are non-standard authorities that the C++ might be loading up, you'll need to find out where to put those root authorities in your Python installation, or make sure Python simply ignores those issues, which is less secure.
In Python you can create a socket, like above, except with urllib. This library has support for HTTPS and for providing the certificate and private key (URLopener).
Example Usage:
opener = urllib.URLopener(key_file = 'mykey.key', cert_file = 'mycert.cer')
self.socket = opener.open(url)
You'll probably need to make it more robust with the appropriate error checking and such, but hopefully this info will help you out.