How to load varchar into date type in SQL Server using Informatica?

I get an error while moving data from one table (nvarchar) to another table (date) within SQL Server 2014 using IICS (Informatica Cloud). I tried using the expression
To_date_out = To_date(Incep_Date, 'yyyy-mm-dd')
but the same error still shows up:
TT_11019 There is an error in the port [Incep_Date]: The default value for the port is set to: ERROR(Transformation error: IUpdateStrategyNode12)

I found the answer. It seems Informatica implicitly converts the varchar from YYYY-MM-DD to YYYY/MM/DD, even though the session and the source data use the former format. So in the expression we need to convert using the latter format, and it works: TO_DATE(Incep_Date, 'YYYY/MM/DD'). Thanks #bitbangs for your time & help.
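The same failure mode can be reproduced outside Informatica: a date parser errors out when the literal separators in the format mask don't match the data. A minimal Python sketch, with strptime standing in for TO_DATE:

```python
from datetime import datetime

raw = "2013-01-01"  # source value in YYYY-MM-DD

# Mask separators match the data: parses fine.
parsed = datetime.strptime(raw, "%Y-%m-%d")
assert parsed.year == 2013

# Mask with the wrong separator fails, much like TO_DATE with a
# mismatched format mask raising a transformation error.
try:
    datetime.strptime(raw, "%Y/%m/%d")
except ValueError as exc:
    print("parse failed:", exc)
```

The takeaway is the same in both tools: the format mask must describe the string the engine actually sees at that point in the pipeline, not the string you loaded originally.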

Related

PowerBI subscription error: there is no data for the field at position x

We ran a Power BI subscription to generate a visualisation report in PDF format and got many errors like this:
There is no data for the field at position x
We searched about it many times and found that it may occur due to missing data in the dataset.
But we have about 30 datasets, each with a query to an Oracle database; we cannot figure out which data is missing, and the log does not mention which report causes the error.
Is there a way to figure out which field is missing?
Or is there a way to enrich the report error log so it tells us which report failed?
A sample of the exact error, repeated with different positions:
processing!ReportServer_0-8!1e18!02/07/2022-09:56:36:: e
ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: , Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 29.;
I found a solution that helped me; I will share it.
The error is due to missing data, not missing values, which means the column name defined in the dataset field has been changed in the database.
Note: setting the value to null does not give the same error; even if the field is used in the report, it gives a different error.
How to detect it? Install Report Builder on a machine that has a connection to the database, open the report with Report Builder, and run verify fields. It gives a detailed error naming the dataset fields that were not found. We tracked those fields in the database, found the column name had been changed, and fixing it in either the dataset or the database column name resolved the issue.
The next challenge we are going to handle: whether the column name exists or has been changed, never raise this error, just render that part of the report empty. That is better, because some of the databases the report connects to may not have the same column names.
Thanks, BR

Redshift uses wrong timezone when loading from S3

Redshift allows loading time-related types from their epoch representation, as stated here, by setting the timeformat 'epochmillisecs' parameter of the COPY command. It works great for TIMESTAMP columns, but something is broken for TIME columns.
The epoch value 1636984022000 (ms), which corresponds to 13:47:02 UTC, is imported as 22:41:11 by Redshift. I can see that it matches 13:47:02 UTC at PST (-08:00).
I tried alter user awsuser set timezone to 'UTC' and set timezone to default, but it doesn't seem to help. What am I missing?
Sometimes timezone issues can be hard to diagnose because SQL clients perform a conversion on the field when displaying data.
A trick I use is to convert the timestamp to TEXT so that the SQL client does not alter its contents. So, try selecting the data as SELECT field::TEXT to verify how it is actually being stored.
This prevents the SQL client from performing any nicely-intentioned timezone conversion and lets you see the 'real' underlying data.
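As a complementary sanity check, you can decode the epoch value outside Redshift entirely; if the decoded UTC time disagrees with what the table holds, the shift happened during COPY or display, not in the source data. A quick Python sketch using the value from the question:

```python
from datetime import datetime, timezone

epoch_ms = 1636984022000

# Decode the epoch milliseconds explicitly in UTC, with no client or
# session timezone involved.
dt = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)

# The raw value really is 13:47:02 UTC, so any other wall-clock time in
# the table points at a timezone conversion applied on load or display.
print(dt.time())  # 13:47:02
```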

Failed to insert BLOB object using ODBC driver

I'm trying to store a file in SQL Server using an ODBC driver, into a column defined as varbinary(max). When I use the SQL Server driver I get:
the text, ntext, or image pointer value conflicts with the column name specified, the insert/update of a text or image column did not succeed
When I use the Native Client driver I get:
string data, right truncation
Both are symptoms of the same problem, well documented on MSDN. Inserting BLOBs bigger than 400 KB triggers this error. Any suggested fix?
Migrating to OleDB is not an option.
The sqlsrv32.dll installed on my machine is file version 6.1.7601.17514.
Finally I managed to find the right way.
All you have to do in your DoFieldExchange function is:
m_rgODBCFieldInfos[6].m_nSQLType = SQL_LONGVARBINARY; // defined as -4 in the ODBC headers
BLOBs should always come at the end of your query, so m_rgODBCFieldInfos[x] refers to the m_nFields-1 location in this array; if you have more than one BLOB you should check which one is which.
In my case this solution solved both exceptions:
from Native Client 11:
"String data, right truncation."
from SQL Server:
"the text, ntext, or image pointer value conflicts with the column name specified, the insert/update of a text or image column did not succeed"
Cheers :)

Why are custom date formats not working in an Oracle BI web service parameter?

I am using SoapUI to call the Oracle BI web service runReport to generate a report. It's working fine for reports with no report parameters, but when I pass Date parameters with a dateFormatString defined, the report doesn't show any result. Basically, as per my understanding, the web service can't parse the date correctly. Here's what I tried:
<v2:parameterNameValues>
<v2:listOfParamNameValues>
<!--Other parameters-->
<v2:item>
<v2:dataType>Date</v2:dataType>
<v2:dateFormatString>dd-MM-yyyy</v2:dateFormatString>
<v2:name>val_from_dt_in</v2:name>
<v2:values>
<v2:item>01-01-2013</v2:item>
</v2:values>
</v2:item>
</v2:listOfParamNameValues>
</v2:parameterNameValues>
According to the Oracle docs, the dateFormatString parameter will take any Java date/time format string. I have also tried dd-MMM-yyyy and yyyy-MM-dd, but it doesn't work. Can anyone tell me where I am going wrong?
try this format please:
dd/mm/yyyy
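Whichever mask you settle on, the literal value inside <v2:item> has to match it character for character. A small Python sketch, with strptime masks standing in for the Java-style dateFormatString (dd-MM-yyyy maps to %d-%m-%Y):

```python
from datetime import datetime

# dd-MM-yyyy (Java) ~ %d-%m-%Y (strptime): matches "01-01-2013".
d = datetime.strptime("01-01-2013", "%d-%m-%Y").date()
assert d.isoformat() == "2013-01-01"

# dd/MM/yyyy ~ %d/%m/%Y: only parses if the value uses slashes too.
d = datetime.strptime("01/01/2013", "%d/%m/%Y").date()
assert d.isoformat() == "2013-01-01"
```

So if you switch the format string to dd/mm/yyyy as suggested, the parameter value must be sent as 01/01/2013 as well, or the mismatch simply moves.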

C++ store a modified MySQL timestamp as string

I am currently trying to get a DATETIME value from a MySQL database and save it as a string.
However, I don't want the time in the usual MySQL format YYYY-MM-DD HH:MM:SS but rather in a 12-hour format like this: HH:MM:SS AM/PM
I already figured out how to "convert" the time in the MySQL database to my desired format by using this:
sql::ResultSet* time = database->Query("SELECT DATE_FORMAT(`lastLogin`, '%r') FROM `users` WHERE `user_id`='%i'", id);
The MySQL command is valid and the query returns the right time format (I tested it directly in MySQL).
Now I want to store the result in a char* array, but for some reason it always crashes with an SQL Invalid Command exception when I try to copy the result into the array.
time->first();
char* lastLogin = new char[50];
// here are the variants of the commands I tried; every one crashed:
strcpy(lastLogin, time->getString("lastLogin").c_str());
strcpy(lastLogin, time->getString("DATE_FORMAT(`lastLogin`, '%r')").c_str());
strcpy(lastLogin, time->getString("DATE_FORMAT(lastLogin, '%r')").c_str());
strcpy(lastLogin, time->getString(0).c_str());
Does anybody know what I am doing wrong? Or is this even possible with MySQL Connector/C++?
Try naming the expression in your SQL, giving it an alias:
"SELECT DATE_FORMAT(`lastLogin`, '%r') as last_login_str FROM `users` WHERE `user_id`='%i'"
Then use "last_login_str" to retrieve the value.
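The underlying issue is general: a computed column with no alias gets a driver-dependent result-set name, so looking it up by name is fragile. The same pattern, sketched with Python's built-in sqlite3 instead of MySQL Connector/C++ (strftime plays the role of DATE_FORMAT here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row  # allows access by column name
conn.execute("CREATE TABLE users (user_id INTEGER, lastLogin TEXT)")
conn.execute("INSERT INTO users VALUES (1, '2013-01-01 14:30:00')")

# Alias the computed expression so it has a stable column name
# in the result set.
row = conn.execute(
    "SELECT strftime('%H:%M', lastLogin) AS last_login_str "
    "FROM users WHERE user_id = ?", (1,)
).fetchone()

print(row["last_login_str"])  # 14:30
```

Without the AS alias, the result column would be named after the whole expression text, which is exactly why the getString("DATE_FORMAT(...)") lookups in the question are so brittle.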