In Kettle Job execution I am maintaining a log. It works fine with everything except the "CheckPoints Log Table", which gives the error "Column_Name Timestamp : there was a value XML encoding error", as the column has datatype DateTime.
I've imported a CSV file in AWS DataBrew. By default, it has converted every date-time column to string. I need to check whether a field is in date-time format or not. When I try to convert the "Source" column into "timeStamp" format, it gives null for fields whose time (hours) is a single digit. Can anyone please tell me how to resolve this issue?
We have run a Power BI subscription to generate visualisation reports in PDF format and we get many errors like this:
There is no data for the field at position x
We have searched about it many times and found that it may occur due to missing data in the dataset.
But we have about 30 datasets with queries to an Oracle database; we cannot figure out which data is missing, and the log does not mention which report causes the error.
Is there a way to figure out which field is missing?
Or is there a way to enrich the report error log so it tells us which report failed?
A sample of the exact error, which is repeated with different positions:
processing!ReportServer_0-8!1e18!02/07/2022-09:56:36:: e
ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: , Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 29.;
I found a solution that helped me, and I will share it.
The error is due to missing data, not missing values, which means the column name defined in the dataset field has been changed in the database.
Note: when the value is set to null it will not give the same error; even if the field is used in the report, it gives a different error.
How do you detect it? Simply install Report Builder on a machine that has a connection to this database, open the report with Report Builder, and verify the fields; it gives a detailed error with the names of the dataset fields that were not found. We tracked them in the database, found the column name had been changed, and fixing it in either the dataset or the column name in the database resolved the issue.
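If you also want to confirm on the database side that a column the dataset expects still exists, you can query Oracle's data dictionary. This is a minimal sketch, assuming an Oracle source; the table and column names are placeholders, not values from the report.

-- Sketch: check whether the column the dataset expects still exists (placeholder names)
SELECT table_name, column_name, data_type
FROM all_tab_columns
WHERE table_name = 'MY_TABLE'
  AND column_name = 'MY_COLUMN';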
The new challenge we are going to handle: whether the column name exists or has been changed, the report should never raise an error; an empty report is better, since some databases the report connects to may not have the same column names, so it should render that part of the report empty instead of failing.
thanks BR,
Given the following query:
ALTER TABLE public.alldatatypes ADD x_time time;
how do I insert a value into x_time?
Time appears to be a valid column type according to the documentation.
https://docs.aws.amazon.com/redshift/latest/dg/r_Datetime_types.html#r_Datetime_types-time
https://docs.aws.amazon.com/redshift/latest/dg/c_Supported_data_types.html
However, when I try to do an insert, I always get an error.
Insert query: insert into public.alldatatypes(x_time) values('08:00:00');
Error:
SQL Error [500310] [0A000]: Amazon Invalid operation:
Specified types or functions (one per INFO message) not supported on
Redshift tables.;
I do not want to use another column type.
I am testing all the column types defined in the documentation.
That cryptic error message is the one Redshift gives when you try to use a leader-node-only function as source data for a compute node. So I expect you aren't showing the exact code you ran to generate this error. I know it can seem like you didn't change anything important to the issue, but you likely have.
You see, select now(); works just fine, but insert into <table> select now(); will give the error you are showing. This is because now() is a leader-node-only function. However, insert into <table> select getdate(); works great - this is because getdate() is a function that runs on the compute nodes.
Now the following SQL runs just fine for me:
create table fred (ttt time);
insert into public.fred(ttt) values('01:23:00'); -- this is more correctly written as values('01:23:00'::time)
insert into public.fred(ttt) select getdate()::time; -- getdate() runs on the compute nodes, so this works
select * from fred;
While this throws the error you are getting:
insert into public.fred(ttt) select now()::time;
So if this doesn't help clear things up, please post a complete test case that demonstrates the error.
I get an error while moving data from one table (nvarchar) to another table (date) within SQL Server 2014 using IICS (Informatica Cloud). I tried using the expression
To_date_out = To_date(Incep_Date, 'yyyy-mm-dd')
but the same error still shows up:
TT_11019 There is an error in the port [Incep_Date]: The default value for the port is set to: ERROR(Transformation error: IUpdateStrategyNode12)
I found the answer. It seems Informatica implicitly converts the varchar in YYYY-MM-DD to YYYY/MM/DD, despite the session and the source data being in the former format. So in the expression we need to convert to the latter format and it works: TO_DATE(Incep_Date, 'YYYY-MM-DD'). Thanks #bitbangs for your time & help.
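If it helps to see which source rows fail the conversion before the mapping runs, you could check on the SQL Server side with TRY_CONVERT. This is just a sketch; the table name source_table is a placeholder, and it assumes the source is reachable with plain SQL.

-- Sketch: list nvarchar values that do not parse as DATE with style 23 (yyyy-mm-dd)
-- "source_table" is a placeholder; Incep_Date is the column from the question
SELECT Incep_Date
FROM source_table
WHERE Incep_Date IS NOT NULL
  AND TRY_CONVERT(date, Incep_Date, 23) IS NULL;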
The MD5 hashing function in Athena is not working for strings, even though the documentation suggests it should: https://docs.aws.amazon.com/redshift/latest/dg/r_MD5.html
Not sure what I am missing here. If I transform the varchar to varbinary, then the hash that gets generated is not correct.
Getting this error:
SYNTAX_ERROR: line 1:8: Unexpected parameters (varchar(15)) for function md5. Expected: md5(varbinary)
This query ran against the "temp" database, unless qualified by the query. Please post the error message on our forum or contact customer support with Query Id: dd959e8a-7fa4-4170-8895-ce7cf58be6ea.
The md5 function in Athena/Presto takes binary input. You can convert a string to a varbinary using the to_utf8 function:
SELECT md5(to_utf8('hello world'))
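Since md5 returns a varbinary in Athena/Presto, you may want to hex-encode the result to get the familiar 32-character digest; a minimal sketch, where the table and column names are placeholders:

-- Sketch: hex-encode the varbinary digest; my_table / some_column are placeholder names
SELECT lower(to_hex(md5(to_utf8(some_column)))) AS md5_hex
FROM my_table;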