I am using Grails 2.5.5 with Java 8. To get this working with the new date types, I defined a mapping between the Java 8 dates (ZonedDateTime) and the database/Hibernate dates. This works fine, no problem at all.
Here's where the problem starts: unit testing.
I have a method which uses
Foo.withCriteria {
    ge("startDate", foo.startDate)
}
The problem now is that startDate is a ZonedDateTime, and I get an error saying that startDate is not an existing property of Foo. Using findAllBy gives the same problem.
I cannot mock this method, because it is private. How do I get these Java 8 dates working in unit tests?
(If I gave too little information, just ask and I can provide more; I thought this would be enough, and I wanted to keep it as general as possible for Stack Overflow.)
Each date type has a different format, so you must check that the dates are of the same type and format. Your Java 8 code is probably using java.util.Date and SimpleDateFormat.
Please check your unit testing framework's date format; it might be a DateTime format, which would cause the difference and may cause your error.
In the following code:
Foo.withCriteria {
    ge("startDate", foo.startDate)
}
Make sure that foo.startDate was initialized correctly by your unit testing framework, with the correct format.
You can also use other date formats such as:
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd");
Date convertedCurrentDate = sdf.parse(currentDateText);
or ISO 8601 DateTime Format
These are very popular formats in databases, which may explain why using findAllBy gives the same problem.
I have hit a dead end with this problem. My code works perfectly in development, but when I deploy my project and configure DigitalOcean Spaces and an S3 bucket, I get the following error when uploading media:
TypeError at /api/chunked_upload/
Unicode-objects must be encoded before hashing
I'm using django-chunked-upload and it doesn't play well with botocore.
I'm using Python 3.7.
My code is taken from this demo: https://github.com/juliomalegria/django-chunked-upload-demo
Any help will be massively appreciated.
This library was implemented for Python 2, so there might be a couple of things that don't work out of the box with Python 3.
The issue you're facing is one of them, since files in Python 3 are read directly as Unicode (py3's str is py2's unicode). The MD5 hashing is the part of the code triggering this exception (this line), because it doesn't expect Unicode strings.
If you have created your own model inheriting from AbstractChunkedUpload, you can override the md5 property to encode the chunks before updating the hash. See this other SO question on how to solve this specific issue.
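As a minimal sketch of that fix (outside Django, with a hypothetical helper name rather than the library's actual md5 property), the idea is to encode any str chunk before feeding it to hashlib.md5:

```python
import hashlib

# Sketch of the fix, assuming chunks may arrive as str under Python 3:
# hashlib.md5().update() only accepts bytes, so encode str chunks first.
# md5_of_chunks is a hypothetical helper, not part of the library's API.
def md5_of_chunks(chunks):
    md5 = hashlib.md5()
    for chunk in chunks:
        if isinstance(chunk, str):
            chunk = chunk.encode("utf-8")  # encode before hashing
        md5.update(chunk)
    return md5.hexdigest()
```

In an overridden md5 property, the same isinstance/encode check would wrap each chunk read from the file before it reaches md5.update().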
Hopefully this helped!
Disclaimer: I'm the creator of this library. However, I haven't maintained it in a long time, to the point that it might no longer be usable.
I want to test an API in my project.
I have an Article model with a DateField that is filled automatically with date.today() each time it is saved.
So if I run this line of test code today, it runs correctly, but on future days it will fail:
response=self.client.get("/api/v1.0.0/blog/archive/en-2021-01/")
How can I change the date part of the above line of code dynamically? I mean the "en-2021-01" part. I also tested it with a variable, but it did not work, like this:
edate=str(date.today())
response=self.client.get("/api/v1.0.0/blog/archive/en-edate/")
I do not know how to change it to make it work.
Thanks for trying to help me.
I hope you are using unittest.
For this purpose, I think the easiest solution during testing is to use something like freezegun:
https://pypi.org/project/freezegun/
It mocks your datetime, so your test runs with a specific date and will always succeed.
Just follow the example in the library.
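Separately, the literal "en-edate" in the question's URL never interpolates the variable; the date has to be formatted into the string. A stdlib-only sketch, assuming the URL pattern from the question:

```python
from datetime import date

# The question's "/en-edate/" is a literal string; interpolate instead.
edate = date.today().strftime("%Y-%m")          # e.g. "2021-01"
url = f"/api/v1.0.0/blog/archive/en-{edate}/"
```

Under freezegun's freeze_time("2021-01-15"), date.today() inside the test returns the frozen date, so this same line always produces a URL matching the test data.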
I'm trying to convert an ISO timestamp into a UNIX timestamp so I can compare the converted value with the current value.
The date is 2018-02-15T00:33:02.000Z
Can this be converted into a UNIX timestamp within Postman?
Thanks in advance.
You can convert this time in Postman using moment.js which comes with the native application.
var moment = require("moment");
console.log(moment("2018-02-15T00:33:02.000Z").valueOf());
This would convert the value and print it on the Postman Console if you add this to the pre-request script or Tests tab.
This is always a good site for cross checking: http://currentmillis.com/
The same could be done in native JavaScript but moment makes it much easier.
The question is very vague as to what you are trying to do but this is a basic answer.
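For cross-checking the value outside Postman, the same conversion can be sketched in plain Python (note that moment's valueOf() returns milliseconds, while Python's timestamp() returns seconds):

```python
from datetime import datetime, timezone

# Parse the ISO timestamp; "Z" is written as "+00:00" for fromisoformat.
dt = datetime.fromisoformat("2018-02-15T00:33:02+00:00")
seconds = dt.timestamp()          # 1518654782.0
millis = int(seconds * 1000)      # matches moment's valueOf()
```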
I am using Ben Nadel's POIUtility.cfm to read and write to Excel files. There are some files which I can read very easily using the given code/file. But for some other files, I keep getting an instantiation error. I cannot figure out what's going wrong.
Code:
<cfset arrSheets = objPOI.ReadExcel(
    FilePath = ExpandPath( "./File giving error.xls" ),
    HasHeaderRow = true
) />
Error:
Object instantiation exception.
An exception occurred while instantiating a Java object. The class
must not be an interface or an abstract class.
I'm using CF10, site hosted locally on IIS. Link to sample file.
Short answer:
The format of the file you are trying to read is too old (Excel 95). POI only supports Excel 97 and later. Unless you really have files in that old format, I would not worry about it.
Given that spreadsheet functions are built into CF 10, you probably do not even need the POIUtility.
Longer answer:
If you look at the end of the stack trace, the "caused by" message explains that the format, Excel 95 (BIFF5), is not supported. (It is over twenty years old!) For that file to be compatible with POI, you would need to open it in another tool, like Excel, and save it in Excel 97 format (or later).
Caused by: org.apache.poi.hssf.OldExcelFormatException: The supplied spreadsheet seems to be Excel 5.0/7.0 (BIFF5) format. POI only supports BIFF8 format (from Excel versions 97/2000/XP/2003)
As an aside, the POIUtility was originally designed back in the days of ColdFusion 7, because it had no official support for manipulating spreadsheets. However, CF7 did include the POI library. So that component was written to fill the gap. Then along came CF9, which had spreadsheet functions already baked in, so the component became less necessary.
I am recently trying to change our company's old program. One of the huge rocks in my way is that the old program was made with Borland C++, and it had its own way of connecting to the SQL Server 2000 database.
After 8 years, I'm trying to retire this program. But when I looked at the database, I got freaked out!
The whole database was in a vague language that was supposed to be Persian.
I'll give you a portion of the database converted to SQL Server 2005, so you can see it for yourself. I've spent many days trying to figure out how to decode this data, but so far no results have come of it.
Link to the sample Database File
So please, if you can tell me how to use this data in Microsoft C#.NET, it would be much appreciated.
These are the datatypes used for the columns, and this is how the data looks (screenshots were attached here):
Thanks a lot.
1) Analyse the existing program and original database
Try to figure out how the C++ program stores Persian text in the database. What collations are defined on the original server, the database, and at column level?
Does the C++ program convert the data to be stored and retrieved from the database? If so, find out how.
It may well be that the program displays data in Persian, but does not store it in a compatible way. Or it uses a custom font that supports custom encoding. All this needs to be analyzed.
2) The screenshots look as if everything Persian is encoded as single-byte characters above CHAR(128).
Is this a standardized encoding, or custom-created?
3) To migrate the database, you will most likely need to convert the data, mapping the original characters to Unicode characters.
First recreate the tables using Unicode-enabled columns (NVARCHAR, NVARCHAR(MAX)) rather than CHAR and VARCHAR, which only support Latin or Extended Latin code pages.
4) Even if you successfully migrated your data, SSMS may not correctly display the stored data due to font settings or OS support.
I summarized the difficulties of displaying Unicode in SSMS on my blog.
But first, you need to investigate the original database and application.
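Step 3's byte-to-Unicode conversion could be sketched like this in Python. The mapping entries below are hypothetical; the real table has to come from analysing the original program or font, as described above:

```python
# Hypothetical legacy mapping: byte values above 128 map to Persian
# (Arabic-script) code points. These two entries (alef, beh) are
# illustrative only -- the real table must be reverse-engineered.
LEGACY_MAP = {0x80: "\u0627", 0x81: "\u0628"}

def decode_legacy(raw: bytes) -> str:
    # Bytes below 128 are plain ASCII; others go through the custom table.
    return "".join(LEGACY_MAP.get(b, chr(b)) for b in raw)
```

The decoded str values can then be written into the new NVARCHAR columns of the migrated database.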