Difficulty configuring the database for the domain connection repository during server installation - Informatica

The error I am getting is shown in the attached screenshot. My intention is to have Oracle 12c + Informatica on my local system.
I have installed Oracle and it is working fine. While trying to run the SERVER INSTALLATION for INFORMATICA, I am having difficulty configuring the database for the domain connection repository. The connection fails even though I am giving the correct Oracle credentials, which are as follows:
DB USERID: infadmin
JDBC URL: localhost:1521
Service: orcl
My question is:
Why is it failing? Do I have to do anything special before trying this connection, like installing a special JDBC driver?
Thanks in advance.
Vaibhav Gautam

Check the contents of the tnsnames.ora file located at <Oracle Installation Directory>/network/admin. Make sure that the hostname, port number, and service name match exactly.
Also, make sure the Oracle service is running.
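For reference, a typical tnsnames.ora entry matching the values in the question would look roughly like the sketch below; the host, port, and service name come from the question, while the rest is a standard template, so treat it as an assumption about your setup:

    ORCL =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = orcl)
        )
      )

If the entry resolves, tnsping orcl and sqlplus infadmin@orcl should both succeed from a command prompt. Note also that a complete Oracle thin JDBC URL using a service name has the form jdbc:oracle:thin:@//localhost:1521/orcl, in case the installer expects the full URL rather than just host:port.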

Related

Oracle 19c Client installation stuck/freezes at "INFO: Checking whether the IP address of the localhost could be determined"; suspecting GP policy

I am trying to install Oracle Client 19c. When I start the installation it freezes at stage 1; checking the logs, it hangs on "INFO: Checking whether the IP address of the localhost could be determined...". I followed an article suggesting how to skip that step, but the installer then gets stuck at stage 7.
My system is joined to a domain.
I tried the installation on a recently formatted system and it worked there, even with antivirus installed.
I also tried disabling the antivirus on my own system but faced the same freezing issue. If I unjoin the domain and try again, the installation works.
I suspect some domain policy is the culprit, but I don't understand which policy is creating the problem. Please suggest a solution.
If you are seeing the "Reading from the pipe" message in the ...\Inventory\logs OUT files, then log in to Windows with an admin account and start the services called "Server" and "Workstation".
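The same can be done from an elevated command prompt; a minimal sketch, assuming the stock Windows service names (internally LanmanServer and LanmanWorkstation):

    net start "Server"
    net start "Workstation"

Afterwards, sc query LanmanServer and sc query LanmanWorkstation should report RUNNING.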

Cloud Composer worker fails to connect to external database

I am attempting to take my existing Cloud Composer environment and connect to a remote SQL database (Azure SQL). I've been banging my head against this for a few days and I'm hoping someone can point out where my problem lies.
Following the documentation found here, I've spun up a GKE service and SQL proxy workload. I then created a new Airflow connection, as shown here, using the full name of the service, azure-sqlproxy-service.
I test-ran one of my DAG tasks and got the following:
Unable to connect: Adaptive Server is unavailable or does not exist
Not sure of the issue, I decided to remote directly into one of the workers, whitelist that IP on the remote DB firewall, and try to connect to the server. With no command-line MSSQL client installed, I launched Python on the worker and attempted to connect to the database with the following:
import pymssql
connection = pymssql.connect(host='database.url.net', user='sa', password='password', database='database')
I get the same error as above with both the service name and the remote IP entered as the host. Even ignoring the service/proxy, shouldn't this Airflow worker be able to reach the remote database? I can ping websites, but checking the remote logs, the DB doesn't show any failed logins. With such a generic error and not many ideas on what to do next, I'm stuck. A few Google results have suggested switching libraries, but I'm not quite sure how, or whether I even need to, within Airflow.
What troubleshooting steps could I take next to get at least a single worker communicating with the DB before moving on to the service/proxy?
After much pain I've found that Cloud Composer uses Ubuntu 18.04, which currently breaks pymssql, as described here:
https://github.com/pymssql/pymssql/issues/687
I tried downgrading to 2.1.4 with no success. Needing to get this done, I followed the instructions outlined in this post to use pyodbc instead:
Google Composer- How do I install Microsoft SQL Server ODBC drivers on environments
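For anyone hitting the same wall, a minimal pyodbc connection sketch is below. It assumes the Microsoft ODBC Driver 17 for SQL Server has been installed on the worker (per the linked post); the server, database, and credentials are the placeholders from the question:

    import pyodbc

    # The DRIVER value must match a name reported by pyodbc.drivers()
    conn = pyodbc.connect(
        'DRIVER={ODBC Driver 17 for SQL Server};'
        'SERVER=database.url.net,1433;'
        'DATABASE=database;'
        'UID=sa;PWD=password'
    )
    cursor = conn.cursor()
    cursor.execute('SELECT @@VERSION')
    print(cursor.fetchone()[0])

If pyodbc.drivers() comes back empty, the ODBC driver itself is missing and no connection string will help.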

Pentaho DI can't connect to AWS Redshift - Amazon Error 100021

Referring to Pentaho's doc, we should be using RedshiftJDBC4.jar instead of version 4.1. I have downloaded the driver and placed it in the lib/ directory. After relaunching spoon.sh, it no longer complains about being unable to find the com.amazon.redshift.jdbc4 driver class, as it did when I was using the 4.1 driver. However, it still cannot establish the connection.
Error connecting to database [aws_redshift] :
org.pentaho.di.core.exception.KettleDatabaseException: Error occurred while trying to connect to the database
Error connecting to database: (using class com.amazon.redshift.jdbc4.Driver)
Amazon Error setting default driver property values.
Can anyone help with this?
On the flip side, I can connect to my endpoint using SQLWorkbench/J, a SQL client tool.
Somehow I managed to get it working. Downloading the AWS Redshift drivers (versions 4, 4.1, or 4.2), placing them in the lib/ directory, and choosing Redshift as the connection type (in the Database Connection setup) did not work for me with any of the three versions.
Instead, I chose PostgreSQL via JDBC. In the host name field, I entered the endpoint WITHOUT the port number 5439 at the end, so the endpoint should just end with ...amazonaws.com. Fill in the database name, port number 5439, username, and password. If this does not work, try downloading the latest PostgreSQL JDBC driver, placing it in the lib/ directory, and trying again.
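This works because Redshift speaks the PostgreSQL wire protocol. To take Pentaho out of the equation, the same endpoint and credentials can be checked with a plain PostgreSQL client first; a minimal sketch using Python's psycopg2, with a placeholder endpoint, database, and credentials:

    import psycopg2

    # Mirrors the Pentaho setup: endpoint without the port in the host field,
    # port supplied separately as 5439
    conn = psycopg2.connect(
        host='mycluster.abc123xyz.us-east-1.redshift.amazonaws.com',
        port=5439,
        dbname='dev',
        user='awsuser',
        password='password'
    )
    cur = conn.cursor()
    cur.execute('SELECT version()')
    print(cur.fetchone()[0])

If this connects, the cluster, security group, and credentials are fine, and the problem is confined to the Pentaho driver setup.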

Trying to run WebLogic samples in VMWare Lab Manager

I'm trying to do the following:
Installing an Oracle WebLogic 11g server with the examples in a VMware Lab Manager virtual machine (Windows XP SP3).
The problem is the following:
According to the installation instructions everything is quite easy and should work out of the box. The installation does not show any errors.
Normally (I tried on a real machine first) all that is needed is to go to
Start - Oracle WebLogic - WebLogic Server 11gR1 - Examples - Start Medical Records Server (Spring Edition)
and everything should work fine (a web page should open).
The problem is that no Medical Records web page shows up.
Digging a little deeper:
Start Medical Records Server is only a batch script. On the real machine (same OS) a Derby server is started; on the VM it is not.
The batch scripts on both machines are identical, but both call a setDomainEnv.cmd which contains, on the real machine,
set DERBY_FLAG=true
and on the VM
set DERBY_FLAG=false
Changing the parameter from false to true does start up Derby, but the web page still does not open.
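To confirm Derby really is up after flipping the flag, one can check whether anything is listening on Derby's default network port, 1527; a minimal sketch (localhost and 1527 are assumptions based on Derby's defaults):

    import socket

    # Derby's network server listens on port 1527 by default
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(3)
    result = sock.connect_ex(('localhost', 1527))
    sock.close()
    print('Derby reachable' if result == 0 else 'nothing listening on 1527')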
The only info message that looks like an error on startup is
Ignoring the trusted CA certificate "CN=T-TeleSec GlobalRoot Class 3,OU=T-Systems Trust Center,O=T-Systems Enterprise Services GmbH,C=DE". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.
but the machine finally gets to
Server started in RUNNING mode
Is there anything else needed to fire up the sample pages?
The admin console starts up at localhost:7011/console, but the samples do not.

ColdFusion DSN with DB2 via ODBC

I'm attempting to connect a ColdFusion application to a DB2 ODBC DSN.
Here's my error message:
Connection verification failed for data source: <DSN NAME>
java.sql.SQLException: [Macromedia][SequeLink JDBC Driver][ODBC Socket][IBM][CLI Driver] SQL30082N Attempt to establish connection failed with security reason "24" ("USERNAME AND/OR PASSWORD INVALID"). SQLSTATE=08001
The root cause was that: java.sql.SQLException: [Macromedia][SequeLink JDBC Driver][ODBC Socket][IBM][CLI Driver] SQL30082N Attempt to establish connection failed with security reason "24" ("USERNAME AND/OR PASSWORD INVALID"). SQLSTATE=08001
I've installed the DB2 client tools on the server ColdFusion runs on. I've verified that the credentials are correct.
I'm not a DB2 guy, but have you seen the ColdFusion DB2 Universal Driver doc located here?
http://livedocs.adobe.com/coldfusion/8/htmldocs/help.html?content=datasources_ADV_MJS_07.html
Based on your comments, you're using the ODBC Socket driver instead. So while this isn't one-to-one with what you're asking, it might have something useful: Configure Solaris to DB2 ODBC
http://kb2.adobe.com/cps/171/tn_17188.html
The other thing to keep in mind is that DB2 is only supported in CF Enterprise and Developer Editions. http://www.adobe.com/products/coldfusion/systemreqs/
http://kb2.adobe.com/cps/801/80121c8.html
This CFMX doc ended up being what we went with to get it working.
Thanks to SO for pointing us in the right direction.
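Since SQL30082N reason code "24" points squarely at the username/password, it can also help to verify the credentials outside ColdFusion entirely. A minimal sketch using the ibm_db Python client; the hostname, port, database, and credentials are placeholders (50000 is merely DB2's common default port):

    import ibm_db

    # The connection string mirrors what the CLI driver uses under the hood
    try:
        conn = ibm_db.connect(
            'DATABASE=mydb;HOSTNAME=db2host;PORT=50000;PROTOCOL=TCPIP;'
            'UID=username;PWD=password',
            '', ''
        )
        print('Connected')
        ibm_db.close(conn)
    except Exception as exc:
        print('Failed:', exc)

If this fails with the same SQL30082N, the problem is the account itself (locked, expired password, or OS-level authentication on the DB2 server) rather than the ColdFusion DSN.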