Unit Test Stored Proc with H2 Database - unit-testing

I am unit testing a stored procedure that returns two result sets.
How can I do this with H2?
I have created an ALIAS, but when I return an array of result sets it does not work.
public static ResultSet[] createDummyStoredProc(
        Connection connection,
        String one,
        String two,
        String three,
        String four) {
    Statement statement = null;
    ResultSet[] resultSets = new ResultSet[2];
    ResultSet resultSet = null;
    ResultSet resultSet1 = null;
    try {
        statement = connection.createStatement();
        resultSet = statement.executeQuery(
                "select nm_feature, in_feature, id_feature as id, in_ui from tempui");
        resultSet1 = connection.createStatement().executeQuery(
                "select t.nm_sp from tempbusiness t");
    } catch (SQLException e) {
        e.printStackTrace();
    }
    resultSets[0] = resultSet;
    resultSets[1] = resultSet1;
    return resultSets;
}
I am using Spring Boot and JdbcTemplate (I did not find a way to handle multiple result sets in JPA).
Note: it works as expected when a single ResultSet is returned instead of an array, but I want to test it with two result sets.
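For reference, H2 registers such a method with CREATE ALIAS, and as far as I know an alias can return a single ResultSet (which H2 treats as a table) but not an array of result sets. A sketch of a workaround, under that assumption, is to register one alias per result set and have the test query each one; the class, package, and alias names below are made up for illustration:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class H2StoredProcStubs {

    // First result set of the original proc: the "tempui" query.
    public static ResultSet dummyProcUi(Connection connection) throws SQLException {
        return connection.createStatement().executeQuery(
                "select nm_feature, in_feature, id_feature as id, in_ui from tempui");
    }

    // Second result set of the original proc: the "tempbusiness" query.
    public static ResultSet dummyProcBusiness(Connection connection) throws SQLException {
        return connection.createStatement().executeQuery(
                "select t.nm_sp from tempbusiness t");
    }

    // Call once in test setup to register both aliases in the H2 schema.
    public static void registerAliases(Connection connection) throws SQLException {
        try (Statement stmt = connection.createStatement()) {
            stmt.execute("CREATE ALIAS IF NOT EXISTS DUMMY_PROC_UI FOR "
                    + "\"com.example.H2StoredProcStubs.dummyProcUi\"");
            stmt.execute("CREATE ALIAS IF NOT EXISTS DUMMY_PROC_BUSINESS FOR "
                    + "\"com.example.H2StoredProcStubs.dummyProcBusiness\"");
        }
    }
}

The test then reads the two result sets with two separate calls (for example, CALL DUMMY_PROC_UI() and CALL DUMMY_PROC_BUSINESS()) instead of expecting both from a single stored procedure call.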

Related

how to test method with resultSet and preparedStatement

I would like to unit test my method, which returns the id of the last inserted object from the database:
public int getLastId() throws SQLException {
    String query = "select LAST_INSERT_ID();";
    PreparedStatement stmt = connection.prepareStatement(query);
    int id = -1;
    ResultSet resultSet = stmt.executeQuery();
    while (resultSet.next()) {
        id = resultSet.getInt("LAST_INSERT_ID()");
    }
    return id;
}
and then I want to write a unit test for it like this:
dataBaseService = mock(DataBaseService.class);
resultSet = mock(ResultSet.class);
@Test
public void getLastIdIsCorrect() throws SQLException {
    //given
    int expectedId = 1;
    int expectedIncorrectId = 101;
    //when
    when(resultSet.getInt("LAST_INSERT_ID()")).thenReturn(1);
    when(dataBaseService.getLastId()).thenCallRealMethod();
    int result = dataBaseService.getLastId();
    //then
    Assert.assertEquals(expectedId, result);
    Assert.assertNotEquals(expectedIncorrectId, result);
}
But I am getting a NullPointerException at the line PreparedStatement stmt = connection.prepareStatement(query);
Can you please explain how to write unit tests like this correctly? I have a few methods left with the same problem and I am still learning unit testing.
The problem is that you seem to be unclear about what you are testing. If you are testing the code of getLastId, then normally you should not mock the class containing it, DataBaseService. Instead, your test would be a test OF DataBaseService, with something like this...
Connection connection = mock(Connection.class);
PreparedStatement stmt = mock(PreparedStatement.class);
...and...
when(connection.prepareStatement(Mockito.anyString())).thenReturn(stmt);
when(stmt.executeQuery()).thenReturn(resultSet);
When you are outside of DataBaseService and want to test something that calls it, then you can mock DataBaseService and simply return a number. But for testing DataBaseService itself it is not necessary to do something like...
dataBaseService = mock(DataBaseService.class);
...
when(dataBaseService.getLastId()).thenCallRealMethod();
Simply create a new instance of DataBaseService and mock the things "inside" it, for example the Connection.
To show some more code, it would be necessary to know the setup of your class and which JUnit version you are using.
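As a rough illustration of that approach, here is a minimal sketch with Mockito and JUnit 4; it assumes DataBaseService can receive its Connection from outside (constructor injection), which is an assumption about your class:

import static org.mockito.Mockito.anyString;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.junit.Assert;
import org.junit.Test;

public class DataBaseServiceTest {

    @Test
    public void getLastIdIsCorrect() throws Exception {
        // given: mock everything "inside" DataBaseService
        Connection connection = mock(Connection.class);
        PreparedStatement stmt = mock(PreparedStatement.class);
        ResultSet resultSet = mock(ResultSet.class);

        when(connection.prepareStatement(anyString())).thenReturn(stmt);
        when(stmt.executeQuery()).thenReturn(resultSet);
        when(resultSet.next()).thenReturn(true, false);   // one row, then stop
        when(resultSet.getInt("LAST_INSERT_ID()")).thenReturn(1);

        // real instance under test, no mocking of DataBaseService itself
        DataBaseService dataBaseService = new DataBaseService(connection);

        // when
        int result = dataBaseService.getLastId();

        // then
        Assert.assertEquals(1, result);
    }
}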
Ok, thanks for your answer :)
I've changed it, created a DataBaseService object, and then tried to mock everything inside it like this:
@Test
public void getLastIdIsCorrect() throws SQLException {
    //given
    int expectedId = 1;
    int expectedIncorrectId = 101;
    //when
    when(DriverManager.getConnection(Mockito.anyString(), Mockito.anyString(), Mockito.anyString())).thenReturn(connection);
    when(connection.prepareStatement(Mockito.anyString())).thenReturn(stmt);
    when(stmt.executeQuery()).thenReturn(resultSet);
    int result = dataBaseService.getLastId();
    //then
    Assert.assertEquals(expectedId, result);
    Assert.assertNotEquals(expectedIncorrectId, result);
}
and now I'm getting this exception, and a headache too :D :
com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure
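DriverManager.getConnection is a static call, so plain Mockito's when(...) cannot stub it; the real MySQL driver still runs and fails with the Communications link failure above. In line with the earlier answer, one way around this is to hand the service its Connection from outside and mock only that. A minimal sketch, assuming your DataBaseService can be reshaped like this (an assumption about your class):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class DataBaseService {

    private final Connection connection;

    // The connection is supplied by the caller (real one in production, mock in tests).
    public DataBaseService(Connection connection) {
        this.connection = connection;
    }

    public int getLastId() throws SQLException {
        String query = "select LAST_INSERT_ID();";
        PreparedStatement stmt = connection.prepareStatement(query);
        int id = -1;
        ResultSet resultSet = stmt.executeQuery();
        while (resultSet.next()) {
            id = resultSet.getInt("LAST_INSERT_ID()");
        }
        return id;
    }
}

With that shape, the test from the earlier sketch passes a mocked Connection and never touches DriverManager, so no real database is contacted.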

ORA-00947: not enough values when creating object in Oracle

I created a new TYPE in Oracle in order to have parity between my table and a local C++ object (I am using the OCCI interface for C++).
In the code I use
void insertRowInTable ()
{
    string sqlStmt = "INSERT INTO MY_TABLE_T VALUES (:x)";
    try {
        stmt = con->createStatement (sqlStmt);
        ObjectDefinition *o = new ObjectDefinition ();
        o->setA(0);
        o->setB(1);
        o->setC(2);
        stmt->setObject (1, o);
        stmt->executeUpdate ();
        cout << "Insert - Success" << endl;
        delete (o);
    } catch (SQLException ex)
    {
        // exception code
    }
}
The code compiles and connects to the DB, but it throws the following exception:
Exception thrown for insertRow Error number: 947 ORA-00947: not enough values
Do I have a problematic "sqlStmt"? Is something wrong with the syntax or the binding?
Of course I have already set up an environment and connection:
env = Environment::createEnvironment (Environment::OBJECT);
occiobjm (env);
con = env->createConnection (user, passwd, db);
How many columns are in the table? The error message indicates that you didn't provide enough values in the INSERT statement. If you only provide a VALUES clause, you must provide values for all columns in the table. Otherwise you need to list each of the columns you're providing values for:
string sqlStmt = "INSERT INTO MY_TABLE_T (x_col) VALUES (:x)";
Edit:
The VALUES clause is listing placeholder arguments. I think you need to list one for each value passed, e.g.:
string sqlStmt = "INSERT INTO MY_TABLE_T (GAME_ID, VERSION) VALUES (:x1,:x2)"
Have a look at occidml.cpp in the Oracle OCCI docs for an example.

Adding stored procedures to In-Memory DB using SqLite

I am using an in-memory database (via ServiceStack.OrmLite.Sqlite.Windows) for unit testing in a ServiceStack-based web API. I want to test the service endpoints that depend on stored procedures through the in-memory database, for which I have gone through the link Servicestack Ormlite SqlServerProviderTests. The unit test class that I am using for the test is as follows,
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;
using NUnit.Framework;
using ServiceStack.Text;
using ServiceStack.Configuration;
using ServiceStack.Data;

namespace ServiceStack.OrmLite.Tests
{
    public class DummyTable
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    [TestFixture]
    public class SqlServerProviderTests
    {
        private IDbConnection db;
        protected readonly ServiceStackHost appHost;

        public SqlServerProviderTests()
        {
            appHost = TestHelper.SetUp(appHost).Init();
            db = appHost.Container.Resolve<IDbConnectionFactory>().OpenDbConnection("inventoryDb");
            if (bool.Parse(System.Configuration.ConfigurationManager.AppSettings["IsMock"]))
                TestHelper.CreateInMemoryDB(appHost);
        }

        [TestFixtureTearDown]
        public void TearDown()
        {
            db.Dispose();
        }

        [Test]
        public void Can_SqlColumn_StoredProc_returning_Column()
        {
            var sql = @"CREATE PROCEDURE dbo.DummyColumn
                @Times integer
                AS
                BEGIN
                    SET NOCOUNT ON;
                    CREATE TABLE #Temp
                    (
                        Id integer NOT NULL,
                    );
                    declare @i int
                    set @i=1
                    WHILE @i < @Times
                    BEGIN
                        INSERT INTO #Temp (Id) VALUES (@i)
                        SET @i = @i + 1
                    END
                    SELECT * FROM #Temp;
                    DROP TABLE #Temp;
                END;";

            db.ExecuteSql("IF OBJECT_ID('DummyColumn') IS NOT NULL DROP PROC DummyColumn");
            db.ExecuteSql(sql);

            var expected = 0;
            10.Times(i => expected += i);

            var results = db.SqlColumn<int>("EXEC DummyColumn @Times", new { Times = 10 });
            results.PrintDump();
            Assert.That(results.Sum(), Is.EqualTo(expected));

            results = db.SqlColumn<int>("EXEC DummyColumn 10");
            Assert.That(results.Sum(), Is.EqualTo(expected));

            results = db.SqlColumn<int>("EXEC DummyColumn @Times", new Dictionary<string, object> { { "Times", 10 } });
            Assert.That(results.Sum(), Is.EqualTo(expected));
        }
    }
}
When I tried to execute this against the live DB it worked fine, but when I tried the in-memory DB I got exceptions as follows,
System.Data.SQLite.SQLiteException : SQL logic error or missing database near "IF": syntax error
near the code line,
db.ExecuteSql("IF OBJECT_ID('DummyColumn') IS NOT NULL DROP PROC DummyColumn");
I commented the above line out and executed the test case, but I still get the following exception,
System.Data.SQLite.SQLiteException : SQL logic error or missing database near "IF": syntax error
for the code line,
db.ExecuteSql(sql);
The in-memory DB is created as follows, and it works fine for the remaining cases.
public static void CreateInMemoryDB(ServiceStackHost appHost)
{
    using (var db = appHost.Container.Resolve<IDbConnectionFactory>().OpenDbConnection("ConnectionString"))
    {
        db.DropAndCreateTable<DummyData>();
        TestDataReader<TableList>("Reservation.json", "InMemoryInput").Reservation.ForEach(x => db.Insert(x));
        db.DropAndCreateTable<DummyTable>();
    }
}
Why are we facing this exception? Is there any other way to add and run stored procedures in an in-memory DB with SQLite?
The error is because you're trying to run SQL Server-specific queries written in T-SQL against an in-memory version of SQLite, i.e. a completely different, embeddable database. As the name suggests, SqlServerProviderTests only works on SQL Server; I'm confused why you would try to run this against SQLite.
SQLite doesn't support stored procedures, T-SQL, etc., so trying to execute SQL Server T-SQL statements will always result in an error. The only thing you can do is fake it with a custom Exec Filter, where you can catch the exception and return whatever custom result you like, e.g:
public class MockStoredProcExecFilter : OrmLiteExecFilter
{
    public override T Exec<T>(IDbConnection dbConn, Func<IDbCommand, T> filter)
    {
        try
        {
            return base.Exec(dbConn, filter);
        }
        catch (Exception ex)
        {
            if (dbConn.GetLastSql() == "exec sp_name @firstName, @age")
                return (T)(object)new Person { FirstName = "Mocked" };
            throw;
        }
    }
}

OrmLiteConfig.ExecFilter = new MockStoredProcExecFilter();

org.apache.calcite.sql.validate.SqlValidatorException

I'm using Apache Calcite to parse a simple SQL statement and return its relational tree. I obtain a database schema using a JDBC connection to a simple SQLite database. The schema is then added using FrameworkConfig. The parser configuration is then modified to handle identifier quoting and case (not sensitive). However, the SQL validator is unable to find the quoted table identifier in the SQL statement. Somehow the parser ignores the configuration settings and converts the table name to upper case. A SqlValidatorException is raised, stating that the table name is not found. I suspect the configuration is not being updated correctly? I have already verified that the table name is correctly included in the schema's metadata.
public class ParseSQL {

    public static void main(String[] args) {
        try {
            // register the JDBC driver
            String sDriverName = "org.sqlite.JDBC";
            Class.forName(sDriverName);

            JsonObjectBuilder builder = Json.createObjectBuilder();
            builder.add("jdbcDriver", "org.sqlite.JDBC")
                    .add("jdbcUrl", "jdbc:sqlite://calcite/students.db")
                    .add("jdbcUser", "root")
                    .add("jdbcPassword", "root");
            Map<String, JsonValue> JsonObject = builder.build();

            // argument for JdbcSchema.Factory().create(....)
            Map<String, Object> operand = new HashMap<String, Object>();
            // explicitly extract JsonString(s) and load into operand map
            for (String key : JsonObject.keySet()) {
                JsonString value = (JsonString) JsonObject.get(key);
                operand.put(key, value.getString());
            }

            final SchemaPlus rootSchema = Frameworks.createRootSchema(true);
            Schema schema = new JdbcSchema.Factory().create(rootSchema, "students", operand);
            rootSchema.add("students", schema);

            // build a FrameworkConfig using defaults where values aren't required
            Frameworks.ConfigBuilder configBuilder = Frameworks.newConfigBuilder();
            // set defaultSchema
            configBuilder.defaultSchema(rootSchema);
            // build configuration
            FrameworkConfig frameworkdConfig = configBuilder.build();

            // use SQL parser config builder to ignore case of quoted identifier
            SqlParser.configBuilder(frameworkdConfig.getParserConfig()).setQuotedCasing(Casing.UNCHANGED).build();
            // use SQL parser config builder to set SQL case sensitive = false
            SqlParser.configBuilder(frameworkdConfig.getParserConfig()).setCaseSensitive(false).build();

            // get planner
            Planner planner = Frameworks.getPlanner(frameworkdConfig);

            // parse SQL statement
            SqlNode sql_node = planner.parse("SELECT * FROM \"Students\" WHERE age > 15.0");
            System.out.println("\n" + sql_node.toString());

            // validate SQL
            SqlNode sql_validated = planner.validate(sql_node);

            // get associated relational expression
            RelRoot relationalExpression = planner.rel(sql_validated);
            relationalExpression.toString();
        } catch (SqlParseException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (RelConversionException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (ValidationException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } catch (ClassNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    } // end main
} // end class
***** ERROR MESSAGE ******
Jan 20, 2016 8:54:51 PM org.apache.calcite.sql.validate.SqlValidatorException
SEVERE: org.apache.calcite.sql.validate.SqlValidatorException: Table 'Students' not found
This is a case-sensitivity issue, similar to table not found with apache calcite. Because you enclosed the table name in quotes in your SQL statement, the validator is looking for a table called "Students", and the error message attests to this. If your table is called "Students", I am surprised that Calcite can't find it.
There is a problem with how you are using the SqlParser.ConfigBuilder. When you call build(), you are not using the SqlParser.Config object that it creates. If you passed that object to Frameworks.ConfigBuilder.parserConfig, I think you would get the behavior you want.
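A minimal sketch of that fix, building the parser configuration first and then actually handing it to the FrameworkConfig before the Planner is created (the builder method names follow the Calcite version used in the question and should be treated as an assumption):

// Build the parser configuration up front.
SqlParser.Config parserConfig = SqlParser.configBuilder()
        .setQuotedCasing(Casing.UNCHANGED)
        .setCaseSensitive(false)
        .build();

// Pass the built config to the framework configuration.
FrameworkConfig frameworkConfig = Frameworks.newConfigBuilder()
        .parserConfig(parserConfig)
        .defaultSchema(rootSchema)
        .build();

// The planner now parses and validates with the intended casing rules.
Planner planner = Frameworks.getPlanner(frameworkConfig);
SqlNode sqlNode = planner.parse("SELECT * FROM \"Students\" WHERE age > 15.0");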

Same Instances header ( arff ) for all my database queries

I am using InstanceQuery, i.e. SQL queries, to construct my Instances. But my query results do not always come back in the same order, which is normal for SQL.
Because of this, Instances constructed from different SQL queries have different headers. A simple example can be seen below. I suspect my results change because of this behavior.
Header 1
@attribute duration numeric
@attribute protocol_type {tcp,udp}
@attribute service {http,domain_u}
@attribute flag {SF}
Header 2
@attribute duration numeric
@attribute protocol_type {tcp}
@attribute service {pm_dump,pop_2,pop_3}
@attribute flag {SF,S0,SH}
My question is: how can I give the correct header information to the Instances construction?
Is a workflow like the one below possible?
get pre-prepared header information from an arff file or another place.
give this header information to the Instances construction.
call the sql function and get Instances (header + data).
I am using the following sql function to get instances from the database.
public static Instances getInstanceDataFromDatabase(String pSql,
        String pInstanceRelationName) {
    try {
        DatabaseUtils utils = new DatabaseUtils();
        InstanceQuery query = new InstanceQuery();
        query.setUsername(username);
        query.setPassword(password);
        query.setQuery(pSql);
        Instances data = query.retrieveInstances();
        data.setRelationName(pInstanceRelationName);
        if (data.classIndex() == -1) {
            data.setClassIndex(data.numAttributes() - 1);
        }
        return data;
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
I tried various approaches to my problem, but it seems that the Weka internal API does not allow a solution to this problem right now. I modified the weka.core.Instances append command-line code for my purposes. This code is also given in this answer.
Accordingly, here is my solution. I created a SampleWithKnownHeader.arff file, which contains the correct header values. I read this file with the following code.
public static Instances getSampleInstances() {
    Instances data = null;
    try {
        BufferedReader reader = new BufferedReader(new FileReader(
                "datas\\SampleWithKnownHeader.arff"));
        data = new Instances(reader);
        reader.close();
        // setting class attribute
        data.setClassIndex(data.numAttributes() - 1);
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
    return data;
}
After that, I use the following code to create the instances. I had to use a StringBuilder and the string values of the instances, then save the corresponding string to a file.
public static void main(String[] args) {
    Instances SampleInstance = MyUtilsForWeka.getSampleInstances();
    DataSource source1 = new DataSource(SampleInstance);
    Instances data2 = InstancesFromDatabase
            .getInstanceDataFromDatabase(DatabaseQueries.WEKALIST_QUESTION1);
    MyUtilsForWeka.saveInstancesToFile(data2, "fromDatabase.arff");
    DataSource source2 = new DataSource(data2);
    Instances structure1;
    Instances structure2;
    StringBuilder sb = new StringBuilder();
    try {
        structure1 = source1.getStructure();
        sb.append(structure1);
        structure2 = source2.getStructure();
        while (source2.hasMoreElements(structure2)) {
            String elementAsString = source2.nextElement(structure2)
                    .toString();
            sb.append(elementAsString);
            sb.append("\n");
        }
    } catch (Exception ex) {
        throw new RuntimeException(ex);
    }
    MyUtilsForWeka.saveInstancesToFile(sb.toString(), "combined.arff");
}
My save-instances-to-file code is as below.
public static void saveInstancesToFile(String contents, String filename) {
    FileWriter fstream;
    try {
        fstream = new FileWriter(filename);
        BufferedWriter out = new BufferedWriter(fstream);
        out.write(contents);
        out.close();
    } catch (Exception ex) {
        throw new RuntimeException(ex);
    }
}
This solves my problem, but I wonder if a more elegant solution exists.
I solved a similar problem with the Add filter, which allows adding attributes to Instances. You need to add a correct Attribute with the proper list of values to both datasets (in my case, to the test dataset only):
Load train and test data:
/* "train" contains labels and data */
/* "test" contains data only */
CSVLoader csvLoader = new CSVLoader();
csvLoader.setFile(new File(trainFile));
Instances training = csvLoader.getDataSet();
csvLoader.reset();
csvLoader.setFile(new File(predictFile));
Instances test = csvLoader.getDataSet();
Set a new attribute with the Add filter:
Add add = new Add();
/* the name of the attribute must be the same as in "train"*/
add.setAttributeName(training.attribute(0).name());
/* getValues returns a String with comma-separated values of the attribute */
add.setNominalLabels(getValues(training.attribute(0)));
/* put the new attribute to the 1st position, the same as in "train"*/
add.setAttributeIndex("1");
add.setInputFormat(test);
/* result - a compatible with "train" dataset */
test = Filter.useFilter(test, add);
As a result, the headers of both "train" and "test" are the same (compatible for Weka machine learning).
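As a quick sanity check before training, you can compare the two headers directly; this sketch assumes Weka's Instances.equalHeaders(Instances) method, which returns true when the headers are compatible:

// Fail fast if the Add filter did not produce a compatible header.
if (!training.equalHeaders(test)) {
    throw new IllegalStateException("Train and test headers are still incompatible");
}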