I'm creating an SSIS 2008 package that reads data from an ASCII flat file source and writes it to a SQL Server 2008 database. BIDS was complaining about the implicit cast between unicode and non-unicode data types, so I used a derived column tool to make the cast. This is what it looks like:
Derived Column Name | Derived Column | Expression | Data Type
AccountName2 | Replace 'AName' | (DT_WSTR,100) AName | string [DT_STR]
I'm still getting the same error:
Validation error. InsertAccountRecords: InsertAccountRecords: Columns "AccountName2" and "AName" cannot convert between unicode and non-unicode string data types.
The error is showing up in the SQL Server Destination, which leads me to believe the problem is the Data Type in the derived column. I'm casting it from DT_STR to DT_WSTR, but the Data Type is still nominally DT_STR according to this, no?
I've Googled around, and I can't seem to find any good answers to this question. Can anyone provide any guidance?
Edit: Yup. Looking at the Data Viewer downstream of the Derived Columns, AccountName2 is still coming up as DT_STR. Why isn't the cast casting?
The documentation notes that the new data type is set correctly only when you choose to add a new column. You need to select "add as new column" in the Derived Column drop-down.
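With "add as new column" selected, and assuming you want AccountName2 created as a new unicode column, the grid row should end up looking something like this (the Data Type is then derived from the expression rather than typed in):
Derived Column Name | Derived Column | Expression | Data Type
AccountName2 | &lt;add as new column&gt; | (DT_WSTR,100)AName | Unicode string [DT_WSTR]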
Or just use the Data Conversion transformation, which is probably easier if changing the data type is the only transformation you're making.
I have an Athena table that has a column in it that I would like to query. The type of the column is double, but it contains data of mixed types. The data is either:
A double (0-1 inclusive)
An array with 0 or 1 elements (again, a double 0-1 inclusive).
I have no idea how the column got into this state. I'm just trying to fix it.
If I do a naive query:
SELECT col FROM tbl.db;
I get the error: "HIVE_BAD_DATA: Error parsing field value '[]' for field 0: org.openx.data.jsonserde.json.JSONArray cannot be cast to java.lang.Double"
Some things that I've tried, but don't work:
Use try_cast
The docs on try_cast make it sound like the perfect solution; reality is not so kind.
When I tried to run
SELECT COALESCE(
try_cast(col AS double),
try_cast(col AS array<double>)) FROM tbl.db;
I get the error: "SYNTAX_ERROR: line 3:5: Cannot cast double to array(double)". Indeed, when I try more simple examples, I continue to get an error: both
SELECT try_cast(3.4 AS array<double>);
SELECT try_cast(ARRAY [3.4] AS double);
trigger errors. It appears that, although the docs claim a cast error causes the function to return null, that may only hold when casting between primitive data types.
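For what it's worth, a quick sanity check with primitive types (on Athena's Presto-based engine) does behave the way the docs describe:
SELECT try_cast('not a number' AS double);  -- returns NULL instead of failing
SELECT try_cast('3.4' AS double);           -- returns 3.4
which suggests the double-to-array cast is rejected at query analysis time, before try_cast ever gets a chance to return null.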
Cast to JSON
While casting both doubles and arrays to JSON works fine as in these examples:
SELECT try_cast(3.4 AS JSON);
SELECT try_cast(ARRAY [3.4] AS JSON);
when I perform the cast on the actual column like so:
SELECT try_cast(col AS JSON) FROM tbl.db;
I get the error: "HIVE_BAD_DATA: Error parsing field value '["0.01"]' for field 0: org.openx.data.jsonserde.json.JSONArray cannot be cast to java.lang.Double"
I'd really like to be able to query this data. Alternatively, if it's possible to migrate it into a state where it's all one type, that would be an acceptable solution as well.
I am writing a large MERGE statement in BigQuery.
When I attempt to run this query the validator gives me an error involving a lot of ...'s that hides the useful information as shown below:
Value has type ARRAY<STRUCT<eventName STRING, eventUUID STRING, eventDate DATE, ...>> which cannot be inserted into column Events, which has type ARRAY<STRUCT<eventName STRING, eventUUID STRING, eventDate DATE, ...>> at [535:1]
I am extremely confident these two array objects match exactly; however, since I am struggling to get around this, I would love to see the full error message.
Is there any way to see the full error?
I have looked into the Google Logging tool and cannot see any additional information.
I have also tried the following Cloud Shell command:
bq --format=prettyjson show -j [Job Id Goes Here]
Again, this seems to provide no additional information.
This approach feels pretty silly, but it could be the last resort for a really long nested type.
1) Use INFORMATION_SCHEMA.COLUMNS to get a full string of the target type, in your case, the type of column Events.
2) Use CREATE TABLE <yourDataset>.<yourTempTable> AS SELECT ... to dump one row of the Value into a table. Use 1) again to see its full type string.
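A rough sketch of both steps (the project, dataset, and table names below are made up, and the SELECT feeding the temp table stands in for whatever expression produces your Value):
-- 1) Full declared type of the target column:
SELECT column_name, data_type
FROM `my_project.my_dataset.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'target_table'
  AND column_name = 'Events';
-- 2) Materialize one row of the value side, then run the query from 1)
--    against the temp table to get its full type string:
CREATE TABLE my_dataset.tmp_events_probe AS
SELECT Events
FROM my_dataset.merge_source
LIMIT 1;
Diffing the two data_type strings should show exactly which nested field differs.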
I have four DateTime columns, all in long format, e.g. 2016-08-01T21:13:02Z. They are called EnqDateTime, QuoteCreatedDateTime, BookingCreatedDateTime and RejAt.
I want to add columns for the duration (in days) between EnqDateTime and each of the other three columns, i.e.
DATEDIF(EnqDateTime, QuoteCreatedDateTime, day)
This works for RejAt, but throws an error for all the other columns:
Parameter "rhs" accepts only ["Datetime"]
As per the image below, all four columns are DateTime.
Can anyone see any other reason this may not be working for two of the three columns?
As you can see in the image below, I reproduced a scenario like the one you presented here, and I had no issue with it. I created the three columns X2Y using the same formulas that you shared:
DATEDIF(EnqDateTime, QuoteCreatedDateTime, day)
DATEDIF(EnqDateTime, BookingCreatedDateTime, day)
DATEDIF(EnqDateTime, RejAt, day)
My guess is that, for some reason, the columns do not have an appropriate Datetime format. Maybe you can try applying some transformations to the data in order to make sure that the data contained in the columns has the appropriate format. I recommend that you try the following:
Clean all missing values by clicking on the column and then Clean > Missing > Fill with NULL. Missing values can prevent Dataprep from recognizing a data type properly.
Change the data type again to Datetime, just to double-check that every field has the Datetime type. You can do so by clicking on the column and then Change type > Date/Time.
If these methods do not solve your issue, maybe you can try working with a minimal example, having only a few rows, so that you can narrow down the variables with which to work. Then you can update your question with more information.
It would also be nice to know where you are getting the error Parameter "rhs" accepts only ["Datetime"]. It is not clear to me what the rhs (right-hand side) parameter is in this case, so maybe you can also provide more details about that.
I am coding a C++ project with a PostgreSQL database.
I created a table in my database with a column whose type is character varying(40).
Now I need to SELECT this data FROM the table in my C++ project. I know that I should use the libpq library, which is the C/C++ interface to PostgreSQL.
I have succeeded in selecting data from the table. Now I am wondering whether it's possible to get the data type of a column in this table as well. For example, here I want to get character varying(40).
You need to use PQftype.
As described here: http://www.idiap.ch/~formaz/doc/postgreSQL/libpq-chapter17861.htm
And just take a look here about decoding return values: http://www.postgresql.org/message-id/da7021e0608040738l3b0880a1q5a76b838937f8c78#mail.gmail.com
You must also use PQfsize to get field size.
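Note that for character varying, PQfsize actually reports -1 (it is a variable-length type), so the declared length of 40 comes from PQfmod. A minimal sketch of how the calls fit together (the connection string, table name, and column name are invented for the example):
#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("dbname=mydb user=me");   /* hypothetical connection string */
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    PGresult *res = PQexec(conn, "SELECT mycolumn FROM mytable LIMIT 1");
    if (PQresultStatus(res) != PGRES_TUPLES_OK) {
        fprintf(stderr, "query failed: %s", PQerrorMessage(conn));
        PQclear(res);
        PQfinish(conn);
        return 1;
    }

    /* PQftype: OID of the column's type (1043 = varchar).
     * PQfsize: server-internal size, -1 for variable-length types.
     * PQfmod:  type modifier; for varchar(n) it is n + 4, so
     *          character varying(40) yields 44 here.            */
    Oid type_oid = PQftype(res, 0);
    int size     = PQfsize(res, 0);
    int typmod   = PQfmod(res, 0);

    printf("type oid = %u, internal size = %d, declared length = %d\n",
           type_oid, size, typmod >= 4 ? typmod - 4 : typmod);

    PQclear(res);
    PQfinish(conn);
    return 0;
}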
What built-in routine can I make use of to cast data of type LVARCHAR to data of type TEXT?
The larger context: I have a table with a column that has been defined as LVARCHAR(4096). Now a developer wishes to change the data type of this column to TEXT. Ideally this would be done with:
ALTER TABLE foo MODIFY bar TEXT;
...but in such a case the following error is puked to the screen:
ALTER TABLE can not modify column (bar) type. Need a cast from the current type to the new type.
I have read up on the CREATE CAST construction, but I cannot begin to think what on earth the proper conversion function would look like. Without a function, Informix will not allow the CREATE CAST to work. That is, if I do, simply:
CREATE CAST (LVARCHAR AS TEXT)
...Informix tells me that a cast function is required (which makes sense).
Beware, Informix developers: if you inadvertently run into a problem like this, there is no way to get out of it using SQL or DDL alone. Let me repeat that.
If you have a VARCHAR or an LVARCHAR column that you need to migrate to be a TEXT column, and if you cannot afford to lose data in that column, there is no way to do this in SQL or DDL.
Instead, you must write a program that does the conversion for you inside the database driver, in memory. In my case, I used JDBC mutable result sets and copied the column to a new column, letting the JDBC driver perform the conversion, then dropped the old column, and renamed the new column back to the old column. This general pattern is the only way to migrate existing character data into a TEXT column.
@Storm: Which version of IDS/ODBC are you using? AFAIK, IDS 9 or 10 can't do that without specific embedded C in the server (see IBM's "boulder" documentation site), and there is no way to do it directly through SQL, only with blob-related functions or the like.
Another way is by using UNLOAD / LOAD.
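For example, a rough sketch of that route in DB-Access, assuming a hypothetical table foo with an integer key and the LVARCHAR column bar (try it on a copy first; I have not verified that LOAD round-trips every value into a TEXT column intact):
UNLOAD TO 'foo.unl' SELECT * FROM foo;
DROP TABLE foo;
CREATE TABLE foo (
    id  INTEGER,
    bar TEXT    -- formerly LVARCHAR(4096)
);
LOAD FROM 'foo.unl' INSERT INTO foo;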
In my scenario we have lots of constraints: no admin rights to the enterprise server; as we are service providers, we can only use the database, not modify its structure, and we cannot modify the TEXT fields just by running queries.