"Invalid Datetime" for JSON when uploading to GCP BigQuery - google-cloud-platform

I'm trying to get some records into BigQuery, but I've gotten stuck on a date error. I've tried formatting the date the way BQ wants, but that hasn't seemed to help. Here is a (fake, obviously) record I am trying to work with.
{"ID":"1","lastName":"Doe","firstName":"John","middleName":"C","DOB":"1901-01-01 00:00:00","gender":"Male","MRN":"1","diagnosis":["something"],"phone":"888-555-5555","fax":"888-555-5555","email":"j#doe.org"}
And here is the error that I get when I try to upload the file:
Provided Schema does not match Table x:y.z. Field dob has changed type from DATETIME to TIMESTAMP
I'm just not sure what it is about my format that BQ is unhappy with. The date is formatted properly and the time is formatted properly (I even tried 00:00:00.0), but I just can't seem to get this data into the table. I also haven't specified any time zone, which makes it even odder that BQ thinks I'm supplying a timestamp.
Any help would be appreciated
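The error itself says the load job is presenting dob as TIMESTAMP while the table declares DATETIME, which commonly points at schema autodetection inferring the wrong type from the string values. Purely as a sketch of one possible fix, assuming the BigQuery Python client library and a hypothetical file name records.json (neither is from the question): declare the schema explicitly on the load job instead of letting it be detected.

from google.cloud import bigquery

client = bigquery.Client()

# Declaring the schema explicitly stops autodetection from inferring
# TIMESTAMP for the "1901-01-01 00:00:00" strings in the DOB field.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    schema=[
        bigquery.SchemaField("ID", "STRING"),
        bigquery.SchemaField("DOB", "DATETIME"),
        # ... declare the remaining fields of the record the same way
    ],
)

# "x.y.z" stands in for the project.dataset.table from the error message.
with open("records.json", "rb") as source_file:
    job = client.load_table_from_file(source_file, "x.y.z", job_config=job_config)
job.result()  # wait for completion; raises if the load fails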

Related

Synapse CopyData Tool Changing datetime to int96

So I'm pulling items from a database via a query to put that data into a data lake. All of that works great, and the preview comes out wonderfully on the Source tab. On the Mapping tab, however, whenever I hit "Import Data" or even add the "StartDate" column manually to a blank mapping (it is a datetime in the source DB), it changes StartDate to int96. Then, once I pull this data into Power BI, I obviously have to do a bunch of weird massaging to get the int96 back to a datetime. It's ridiculous.
Here is a picture of what's happening (screenshot not included).
Does anybody know why this is happening or what I can do to map the sink column as a datetime? I can't seem to change the type anywhere.
Parquet internally stores dates as integers, but clients, including Power BI, should automatically convert them back to dates. E.g. this works fine for me, with a Parquet file created as you indicate:
let
    Source = AzureStorage.DataLake("https://xxxx.dfs.core.windows.net/datalake/stage/xxx.parquet"),
    f = Source{[#"Folder Path"="https://xxxx.dfs.core.windows.net/datalake/stage/",Name="xxx.parquet"]}[Content],
    #"Imported Parquet" = Parquet.Document(f)
in
    #"Imported Parquet"

How would one go about creating a due-by attribute in Redshift

I am currently trying to calculate due-by dates in a table by adding the SLA time to the time the request was created. From what I understand, the way to go about this is to create a table with the work days and hours and query that table to find the due date. However, Redshift does not allow one to declare variables. I was wondering how I would go about creating a work-hour table in Redshift and, if that is not possible, how I would calculate the due date by other means. Thanks!
It appears that you would like to provide a timestamp and then calculate the timestamp that is 'n work hours later', most probably taking into account certain rules such as:
Weekdays: 9am-5pm
Weekends: No Hours
Holidays: Occasional weekdays with No Hours
This could be done by creating a scalar Python UDF (see "Creating a scalar Python UDF" in the Amazon Redshift documentation) that would be passed a 'start' timestamp and a number of hours, and would return the 'end' timestamp.
Please note that scalar UDFs cannot access tables or 'call outside' of Redshift, so the function would need to be self-contained.
There is code on the web that shows the reverse calculation, e.g. "How to find the number of hours between two dates excluding weekends and certain holidays in Python? BusinessHours package" on Stack Overflow. You would need to modify such code to take the duration as an input rather than computing the duration.
The alternative method of "creating a work hour table" would work well for finding the number of work hours between two timestamps, but it is a bit harder to apply when adding work hours to a timestamp.
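To make the UDF route concrete, here is a minimal, self-contained Python sketch under the rules listed above. The function name, the holiday list, and the exact business hours are all assumptions; in Redshift the body would be wrapped in CREATE FUNCTION ... LANGUAGE plpythonu.

from datetime import datetime, timedelta

# Assumed rules (from the list above): weekdays 9am-5pm, no weekend
# hours, plus a hardcoded holiday list. All names are illustrative.
WORK_START_HOUR = 9
WORK_END_HOUR = 17
HOLIDAYS = {datetime(2015, 12, 25).date()}  # example holiday

def f_add_work_hours(start, hours):
    """Return the timestamp that is `hours` working hours after `start`."""
    remaining = timedelta(hours=hours)
    current = start
    while True:
        day_start = current.replace(hour=WORK_START_HOUR, minute=0,
                                    second=0, microsecond=0)
        day_end = current.replace(hour=WORK_END_HOUR, minute=0,
                                  second=0, microsecond=0)
        # Skip weekends, holidays, and anything at or past closing time.
        if current.weekday() >= 5 or current.date() in HOLIDAYS or current >= day_end:
            current = day_start + timedelta(days=1)
            continue
        # Clamp a before-hours start up to opening time.
        if current < day_start:
            current = day_start
        available = day_end - current
        if remaining <= available:
            return current + remaining
        remaining -= available
        current = day_start + timedelta(days=1)

# 4 work hours after Friday 3pm lands on Monday 11am (no holidays hit).
print(f_add_work_hours(datetime(2023, 6, 2, 15, 0), 4))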

Power BI - How to connect data with exact time in one dataset to a time range in other dataset

Please, could somebody help me with my problem?
I have many XML files with call logs and one excel file with attendance of operators.
XML file contains ID of the phone, date of the call and some other stuff.
Excel file contains ID of the phone, date of the attendance of the operator, time from, time to (when the operator started and finished work that day) and the operator's name.
What I need is to add a new column to the XML data in Power BI that holds the name of the operator who handled the call. So it should match on the phone ID and the date, then check that the call time is greater than or equal to the "time from" and, at the same time, less than or equal to the "time to". If a row meets all the conditions, the "name" column should be filled with the operator's name from that row.
Is that possible in Power BI somehow? If it is, could somebody show me how to do it?
I hope it is, because I need to automate this process fully (with no manual steps like preparing CSV files) so that I can create reports from the data.
Thank You!!!
Tereza
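One way to think about the matching logic: join the two sources on phone ID, then keep only the rows where the call time falls inside the operator's shift window. Here is a small pandas sketch of that logic with made-up column names and data; in Power Query the same idea would be a merge followed by a Table.SelectRows filter.

import pandas as pd

# Hypothetical frames standing in for the XML call logs and the Excel roster.
calls = pd.DataFrame({
    "phone_id": [1, 1],
    "call_time": pd.to_datetime(["2023-05-02 09:30", "2023-05-02 15:10"]),
})
shifts = pd.DataFrame({
    "phone_id": [1, 1],
    "shift_from": pd.to_datetime(["2023-05-02 08:00", "2023-05-02 14:00"]),
    "shift_to": pd.to_datetime(["2023-05-02 14:00", "2023-05-02 20:00"]),
    "operator": ["Alice", "Bob"],
})

# Join on phone ID, then keep rows where the call falls inside the shift.
merged = calls.merge(shifts, on="phone_id")
matched = merged[(merged.call_time >= merged.shift_from) &
                 (merged.call_time <= merged.shift_to)]
print(matched[["phone_id", "call_time", "operator"]])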

How to change a column to datetime in Apache Superset table configuration?

I'm getting this error message:
Datetime column not provided as part table configuration and is required by this type of chart
I'd like to create a chart showing the change in total volume over time. I have a date field created as a column that I'd like to use as a datetime, but I cannot figure out how.
Data is an uploaded .csv file.
How do you change a column to datetime in table configuration?
You should be fine casting that column to a timestamp instead.
Superset asks for a "datetime" column in Python terms because Superset itself is written in Python; assuming you're querying a Postgres DB, the equivalent column type is timestamp.
If your datasource is based on a query, you should be able to cast the date type to timestamp (use ::timestamp or to_timestamp()) and use that as your "datetime" column.
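As a sketch of that cast from client code, assuming a reachable Postgres database and made-up table/column names (psycopg2 is just one way to run the query):

import psycopg2

# Hypothetical DSN, table, and column names; only the ::timestamp
# cast in the query is the point being illustrated.
conn = psycopg2.connect("dbname=mydb user=me")
cur = conn.cursor()
cur.execute("SELECT my_date::timestamp AS my_datetime FROM my_table LIMIT 1")
print(cur.fetchone())  # the column now comes back as a timestamp
conn.close()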
When uploading the CSV file, you should configure which columns contain dates:
Check whether you have date or time series in your data. Once you are sure of that, pick the matching format string: if the date format is just a year, the DATETIME format is '%Y'; if the format is yyyy/mm/dd, the DATETIME format is '%Y/%m/%d'. (These are Python strptime codes; the Python documentation covers the rest of the formats.)
All these formats can be edited in the Edit Dataset window in Superset.
Going through the documentation helps a lot.
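Since these format strings follow Python's strptime codes, you can sanity-check a format against a sample value before uploading; a tiny sketch (sample values made up):

from datetime import datetime

# The two formats mentioned above, applied to example strings.
print(datetime.strptime("2015", "%Y"))              # -> 2015-01-01 00:00:00
print(datetime.strptime("2015/09/09", "%Y/%m/%d"))  # -> 2015-09-09 00:00:00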
Define the date formats in the "CSV to Database" configuration before uploading (screenshot not included).

Dynamics Nav 2009 - API order save Date error

I am using the API to save new sales orders, and every time I send a date as one of the parameters, it returns the following error message:
'2015-09-09+01:00' is not a valid date.
Any idea?
Thank you,
Bilel
The Date data type in NAV cannot contain time (or time-zone) data. Try using the DateTime data type instead. Be sure to correct the input format.
If you are not able to change the data type inside NAV, your best bet, as stated above, is to strip the time-zone offset from the date value before sending it.
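If you control the client that calls the web service, here is a small sketch of that idea in Python (the variable names are made up; this only illustrates stripping the offset before the value is sent):

from datetime import datetime, timedelta, timezone

# A timezone-aware value serialized as an XSD date can come out as
# '2015-09-09+01:00', which NAV's Date type rejects.
order_date = datetime(2015, 9, 9, tzinfo=timezone(timedelta(hours=1)))

# Send a plain ISO date with no offset instead.
print(order_date.date().isoformat())  # '2015-09-09'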