Add a unit test for Flink SQL

I am using Flink v1.7.1. I have finished a Flink streaming job with a TableSource, SQL, and a TableSink, but I have no idea how to add a unit test for it.

With the help of the user mailing list I found a good example of how to test Flink SQL; here it is.
package org.apache.flink.table.runtime.stream.sql;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.api.java.tuple.Tuple5;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.runtime.utils.JavaStreamTestData;
import org.apache.flink.table.runtime.utils.StreamITCase;
import org.apache.flink.test.util.AbstractTestBase;
import org.apache.flink.types.Row;
import org.junit.Test;
import java.util.ArrayList;
import java.util.List;
/**
* Integration tests for streaming SQL.
*/
public class JavaSqlITCase extends AbstractTestBase {
@Test
public void testRowRegisterRowWithNames() throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);
StreamITCase.clear();
List<Row> data = new ArrayList<>();
data.add(Row.of(1, 1L, "Hi"));
data.add(Row.of(2, 2L, "Hello"));
data.add(Row.of(3, 2L, "Hello world"));
TypeInformation<?>[] types = {
BasicTypeInfo.INT_TYPE_INFO,
BasicTypeInfo.LONG_TYPE_INFO,
BasicTypeInfo.STRING_TYPE_INFO};
String[] names = {"a", "b", "c"};
RowTypeInfo typeInfo = new RowTypeInfo(types, names);
DataStream<Row> ds = env.fromCollection(data).returns(typeInfo);
Table in = tableEnv.fromDataStream(ds, "a,b,c");
tableEnv.registerTable("MyTableRow", in);
String sqlQuery = "SELECT a,c FROM MyTableRow";
Table result = tableEnv.sqlQuery(sqlQuery);
DataStream<Row> resultSet = tableEnv.toAppendStream(result, Row.class);
resultSet.addSink(new StreamITCase.StringSink<Row>());
env.execute();
List<String> expected = new ArrayList<>();
expected.add("1,Hi");
expected.add("2,Hello");
expected.add("3,Hello world");
StreamITCase.compareWithList(expected);
}
}
the related code is here
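Note that StreamITCase and JavaStreamTestData in the example are Flink-internal test utilities (they are only published in the flink-table test jar, if at all), so the same pattern also works with a small collecting sink of your own. Below is a minimal sketch along the same lines, assuming Flink 1.7.x and JUnit 4; the class names, the sink, and the query are made up for illustration and are not part of the original example.

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.SinkFunction;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.test.util.AbstractTestBase;
import org.apache.flink.types.Row;
import org.junit.Assert;
import org.junit.Test;

public class MySqlJobTest extends AbstractTestBase {

    // Collected results; static so the sink instances running in the tasks
    // and the asserting test code see the same list.
    private static final List<String> RESULTS = Collections.synchronizedList(new ArrayList<>());

    // Simple sink that records the string form of every result row.
    private static final class CollectSink implements SinkFunction<Row> {
        @Override
        public void invoke(Row value) {
            RESULTS.add(value.toString());
        }
    }

    @Test
    public void testFilterQuery() throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);
        RESULTS.clear();

        // Build a small test input with named Row fields, as in the example above.
        RowTypeInfo typeInfo = new RowTypeInfo(
                new TypeInformation<?>[] {BasicTypeInfo.INT_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO},
                new String[] {"a", "b"});
        DataStream<Row> input = env.fromCollection(
                Arrays.asList(Row.of(1, "Hi"), Row.of(2, "Hello")), typeInfo);
        tableEnv.registerTable("MyTable", tableEnv.fromDataStream(input, "a, b"));

        // Run the SQL under test and collect the result rows.
        Table result = tableEnv.sqlQuery("SELECT b FROM MyTable WHERE a > 1");
        tableEnv.toAppendStream(result, Row.class).addSink(new CollectSink());

        env.execute();
        Assert.assertEquals(Collections.singletonList("Hello"), RESULTS);
    }
}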

Related

Kotlin test expected SingletonMap but was LinkedHashMap

I am new to Kotlin and Java, so bear with me, but I just wrote a Kotlin test as follows:
package com.squareup.cash.transactiongraph.service.actions
import com.squareup.cash.transactiongraph.TransactionGraphTestingModule
import com.squareup.cash.transactiongraph.client.franklin.FakeFranklinClient
import com.squareup.cash.transactiongraph.dataloader.DataLoaderRegistryFactory
import com.squareup.cash.transactiongraph.graphql.GraphQLContextFactory
import com.squareup.cash.transactiongraph.graphql.TransactionGraphContextFactory
import com.squareup.cash.transactiongraph.service.TransactionGraphGraphqlModule
import com.squareup.graphql.dataloaders.FlowDataLoaderDispatcher
import kotlinx.coroutines.future.await
import kotlinx.coroutines.runBlocking
import misk.testing.MiskTest
import misk.testing.MiskTestModule
import okhttp3.Headers
import org.junit.jupiter.api.Test
import org.assertj.core.api.Assertions.assertThat
@MiskTest(startService = true)
class CashCustomerTransactionsQueryTest {
@MiskTestModule
private val module = TransactionGraphTestingModule()
@Test
fun `returns an array of CashTransactions`() = runBlocking<Unit> {
val query = """
{
cashCustomerTransactions(customerToken: "customerToken") {
id
reasonCode
createdAt
}
}
""".trimIndent()
val result = execute(query)
assertThat(result["errors"]).isNull()
assertThat(result["data"]).isEqualTo(
mapOf(
"cashCustomerTransactions" to arrayOf(
mapOf(
"createdAt" to "2019-03-20T18:26:18Z",
"id" to "TOKEN",
"reasonCode" to "CARD_PRESENT_PURCHASE"
)
)
)
)
}
private suspend fun execute(query: String): Map<String, Any> {
val franklinClient = FakeFranklinClient()
val dataLoaderRegistryFactory = DataLoaderRegistryFactory()
val flowDataLoaderDispatcher = FlowDataLoaderDispatcher(dataLoaderRegistryFactory)
return flowDataLoaderDispatcher.run { registry ->
val contextFactory: GraphQLContextFactory =
TransactionGraphContextFactory(franklinClient)
TransactionGraphGraphqlModule().graphQL().executeAsync {
it
.query(query)
.context(contextFactory.build(Headers.Builder().build(), registry))
}.await().toSpecification()
}
}
}
Upon running the test it fails with the following error: expected: "{"cashCustomerTransactions"=[{"createdAt"="2019-03-20T18:26:18Z", "id"="TOKEN", "reasonCode"="CARD_PRESENT_PURCHASE"}]} (SingletonMap#58303289)" but was: "{"cashCustomerTransactions"=[{"createdAt"="2019-03-20T18:26:18Z", "id"="TOKEN", "reasonCode"="CARD_PRESENT_PURCHASE"}]} (LinkedHashMap#c32f16d)"
The two values appear to be identical, except that one is a SingletonMap and one is a LinkedHashMap. I do not understand why the types are different. What am I doing incorrectly? Can someone please point me in the right direction? Thank you.
Change arrayOf to listOf and the problem will be solved: the actual GraphQL result contains a List, while arrayOf produces a Java array, and arrays only compare equal by identity, so the expected map can never equal the actual one.
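A small plain-Java illustration of that rule (Java 9+ for Map.of/List.of; this is a generic sketch, not code from the test above):

import java.util.List;
import java.util.Map;

public class ArrayVsListEquality {
    public static void main(String[] args) {
        // Two maps whose values are arrays with identical contents are still unequal,
        // because arrays inherit identity-based equals() from Object.
        Map<String, Object> a1 = Map.of("k", new String[] {"v"});
        Map<String, Object> a2 = Map.of("k", new String[] {"v"});
        System.out.println(a1.equals(a2)); // false

        // Lists compare element by element, so the same comparison succeeds.
        Map<String, Object> l1 = Map.of("k", List.of("v"));
        Map<String, Object> l2 = Map.of("k", List.of("v"));
        System.out.println(l1.equals(l2)); // true
    }
}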

How to import 'org.hamcrest.collection' in JUnit to test collection methods

I am using JUnit 5 and am getting the following error while importing the collection packages:
The import org.hamcrest.collection cannot be resolved
I need to use the assertThat method and hasSize() from the collection matchers, for which the imports below are required.
import org.hamcrest.collection.IsEmptyCollection;
import static org.hamcrest.CoreMatchers.*;
import static org.hamcrest.collection.IsCollectionWithSize.hasSize;
import static org.hamcrest.collection.IsIterableContainingInAnyOrder.containsInAnyOrder;
import static org.hamcrest.collection.IsIterableContainingInOrder.contains;
import static org.hamcrest.MatcherAssert.assertThat;
import java.awt.List;
import java.util.ArrayList;
public class PortJUnit {
PortBO portBO;
ArrayList<Port> list = new ArrayList<Port>();
@Before
public void createObjectForPort()
{
portBO = new PortBO();
}
@Test
public void testPortDetails()
{
list.add(new Port(101,"abc","cbe"));
list.add(new Port(102,"abd","chennai"));
list.add(new Port(103,"abe","bangalore"));
list.add(new Port(104,"abf","mumbai"));
list.add(new Port(105,"abg","delhi"));
String detail = "107,abh,Toronto";
portBO.addElementAtSpecfiedPosition(list, 6, detail);
assertThat(list, hasSize(6));
}
}
I suspect you are using only the hamcrest-core dependency, which does not contain the org.hamcrest.collection matchers.
Use hamcrest-all-1.3.jar instead and your problem will be solved.
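For completeness, here is a minimal sketch of a test class whose imports all resolve once hamcrest-all-1.3.jar is on the test classpath (made-up class name and data, JUnit 4 annotations to match the @Before/@Test usage above):

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.collection.IsCollectionWithSize.hasSize;
import static org.hamcrest.collection.IsIterableContainingInAnyOrder.containsInAnyOrder;

import java.util.Arrays;
import java.util.List;

import org.junit.Test;

public class HamcrestCollectionImportTest {

    @Test
    public void collectionMatchersResolve() {
        List<String> codes = Arrays.asList("abc", "abd", "abe");

        // Both matchers live in org.hamcrest.collection, i.e. outside hamcrest-core.
        assertThat(codes, hasSize(3));
        assertThat(codes, containsInAnyOrder("abe", "abc", "abd"));
    }
}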

Cloud Dataflow executed successfully but did not insert data into BigQuery

I have a CSV file which contains a header and data. I want to insert the file's data into BigQuery. I have written code that reads the file header and uses it for table/column mapping, making the file import dynamic (in BigQuery I have created one static empty table).
My Cloud Dataflow job executed successfully, but no data was inserted into my BigQuery table. I am not sure what the problem is.
I ran the code below in Eclipse:
package com.coe.cog;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.*;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.PCollection;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.api.services.bigquery.model.TableReference;
import com.google.api.services.bigquery.model.TableRow;
public class DemoPipeline_SameCodeProcess {
private static final Logger LOG = LoggerFactory.getLogger(DemoPipeline_SameCodeProcess.class);
//Get Project,dataset & table
public static TableReference getGCDSTableReference() {
TableReference ref = new TableReference();
ref.setProjectId("myownprojectbqs");
ref.setDatasetId("DS_Employee");
ref.setTableId("tLoadEmp");
return ref;
}
// Split the input file into header and data separately
static class TransformToTable extends DoFn<String, TableRow> {
@ProcessElement
public void processElement(ProcessContext c) throws IOException {
BufferedReader br = null;
String line = "";
String csvSplitBy = ",";
Integer incFlg = 0;
StringReader strdr = new StringReader(c.element().toString());
br = new BufferedReader(strdr);
line = br.readLine(); //Header as FirstLine
String[] colmnsHeader = line.split(csvSplitBy); //Only Header array
while ((line = br.readLine()) != null) {
// Content of the file excluding header
String[] colmnsList = line.split(csvSplitBy);
TableRow row = new TableRow();
for (int i = 0; i < colmnsList.length; i++) {
row.set(colmnsHeader[i], colmnsList[i]);
}
c.output(row);
}
}
}
public static void main(String[] args) {
MyOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
options.setTempLocation("gs://demo-bucket-data/temp");
Pipeline p = Pipeline.create(options);
PCollection<String> lines = p.apply("Read From Storage", TextIO.read().from("gs://demo-bucket-data/Demo/Test/MasterLoad_WithHeader.csv"));
PCollection<TableRow> rows = lines.apply("Transform To Table",ParDo.of(new TransformToTable()));
rows.apply("Write To Table",BigQueryIO.writeTableRows().to(getGCDSTableReference())
.withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
.withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER));
p.run();
}
}
Input file format (MasterLoad_WithHeader.csv):
ID,NAME,AGE,SEX
1,John,25,M
2,Smith,28,M
3,Josephine,22,F

Managing Google Cloud instances using the jclouds API

I want to add and list all the instances/VMs under my project in Google Cloud using the jclouds API. In this code, I am treating a node as an instance.
I have set all the variables as required and extracted the private key from the JSON file. The context build takes place successfully.
images = compute.listImages() => lists all the images provided by Google.
nodes = compute.listNodes() => should list the nodes, but instead it throws a NullPointerException.
Output:
No of images 246
Exception in thread "main" java.lang.NullPointerException: group
at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:229)
at org.jclouds.compute.internal.FormatSharedNamesAndAppendUniqueStringToThoseWhichRepeat.checkGroup(FormatSharedNamesAndAppendUniqueStringToThoseWhichRepeat.java:124)
at org.jclouds.compute.internal.FormatSharedNamesAndAppendUniqueStringToThoseWhichRepeat.sharedNameForGroup(FormatSharedNamesAndAppendUniqueStringToThoseWhichRepeat.java:120)
at org.jclouds.googlecomputeengine.compute.functions.FirewallTagNamingConvention$Factory.get(FirewallTagNamingConvention.java:39)
at org.jclouds.googlecomputeengine.compute.functions.InstanceToNodeMetadata.apply(InstanceToNodeMetadata.java:68)
at org.jclouds.googlecomputeengine.compute.functions.InstanceToNodeMetadata.apply(InstanceToNodeMetadata.java:43)
at com.google.common.base.Functions$FunctionComposition.apply(Functions.java:211)
at com.google.common.collect.Iterators$8.transform(Iterators.java:794)
at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48)
at com.google.common.collect.Iterators$7.computeNext(Iterators.java:646)
at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
at com.google.common.collect.Iterators.addAll(Iterators.java:356)
at com.google.common.collect.Iterables.addAll(Iterables.java:350)
at com.google.common.collect.Sets.newLinkedHashSet(Sets.java:328)
at org.jclouds.compute.internal.BaseComputeService.listNodes(BaseComputeService.java:335)
at org.jclouds.examples.compute.basics.Example.main(Example.java:54)
package org.jclouds.examples.compute.basics;
import static com.google.common.base.Charsets.UTF_8;
import static org.jclouds.compute.config.ComputeServiceProperties.TIMEOUT_SCRIPT_COMPLETE;
import java.io.File;
import java.io.IOException;
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import org.jclouds.ContextBuilder;
import org.jclouds.compute.ComputeService;
import org.jclouds.compute.ComputeServiceContext;
import org.jclouds.compute.domain.ComputeMetadata;
import org.jclouds.compute.domain.Image;
import org.jclouds.domain.Credentials;
import org.jclouds.enterprise.config.EnterpriseConfigurationModule;
import org.jclouds.googlecloud.GoogleCredentialsFromJson;
import org.jclouds.logging.slf4j.config.SLF4JLoggingModule;
import org.jclouds.sshj.config.SshjSshClientModule;
import com.google.common.base.Supplier;
import com.google.common.collect.ImmutableSet;
import com.google.common.io.Files;
import com.google.inject.Module;
public class Example {
public static void main(String[] args)
{
String provider = "google-compute-engine";
String identity = "***@developer.gserviceaccount.com";
String credential = "path to private key file ";
credential = getCredentialFromJsonKeyFile(credential);
Properties properties = new Properties();
long scriptTimeout = TimeUnit.MILLISECONDS.convert(20, TimeUnit.MINUTES);
properties.setProperty(TIMEOUT_SCRIPT_COMPLETE, scriptTimeout + "");
Iterable<Module> modules = ImmutableSet.<Module> of(
new SshjSshClientModule(),new SLF4JLoggingModule(),
new EnterpriseConfigurationModule());
ContextBuilder builder = ContextBuilder.newBuilder(provider)
.credentials(identity, credential)
.modules(modules)
.overrides(properties);
ComputeService compute=builder.buildView(ComputeServiceContext.class).getComputeService();
Set<? extends Image> images = compute.listImages();
System.out.printf(">> No of images %d%n", images.size());
Set<? extends ComputeMetadata> nodes = compute.listNodes();
System.out.printf(">> No of nodes/instances %d%n", nodes.size());
compute.getContext().close();
}
private static String getCredentialFromJsonKeyFile(String filename) {
try {
String fileContents = Files.toString(new File(filename), UTF_8);
Supplier<Credentials> credentialSupplier = new GoogleCredentialsFromJson(fileContents);
String credential = credentialSupplier.get().credential;
return credential;
} catch (IOException e) {
System.err.println("Exception reading private key from '%s': " + filename);
e.printStackTrace();
System.exit(1);
return null;
}
}
}

Web service test (EJB 3): passing a Date parameter

My web service looks like this:
import entities.Expense;
import java.math.BigDecimal;
import java.util.Date;
import javax.ejb.EJB;
import javax.jws.WebService;
import javax.ejb.Stateless;
import javax.jws.WebMethod;
import javax.jws.WebParam;
import logic.ExpenseSessionBeanLocal;
/**
*
* @author nikola
*/
@WebService(serviceName = "ExpenseWebService")
@Stateless()
public class ExpenseWebService {
@EJB
private ExpenseSessionBeanLocal ejbRef;// Add business logic below. (Right-click in editor and choose
// "Insert Code > Add Web Service Operation")
@WebMethod(operationName = "makeExpenseOnce")
public Expense makeExpenseOnce(@WebParam(name = "expenseName") String expenseName, @WebParam(name = "expenseRecipient") String expenseRecipient, @WebParam(name = "purpose") String purpose, @WebParam(name = "username") String username, @WebParam(name = "expenseDate") Date expenseDate, @WebParam(name = "amount") BigDecimal amount, @WebParam(name = "currency") String currency) {
return ejbRef.makeExpenseOnce(expenseName, expenseRecipient, purpose, username, expenseDate, amount, currency);
}
}
[Image from the test omitted]
How do I input a date in the test when the parameter is a javax.xml.datatype.XMLGregorianCalendar?
AFAIK it's not possible to pass objects other than Strings to the web service tester. At least I never managed to post BigIntegers, for instance.
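One additional note on the date value itself: javax.xml.datatype.XMLGregorianCalendar is JAXB's mapping of xs:dateTime, whose lexical form is an ISO-8601 string such as 2015-06-30T12:00:00Z. A minimal sketch of building one programmatically with the standard DatatypeFactory API (for a client-side call rather than the GUI tester; the sample value is made up):

import javax.xml.datatype.DatatypeFactory;
import javax.xml.datatype.XMLGregorianCalendar;

public class XmlDateTimeExample {
    public static void main(String[] args) throws Exception {
        // xs:dateTime uses the ISO-8601 lexical form; this is the string that a
        // generated SOAP client would marshal for the expenseDate parameter.
        XMLGregorianCalendar expenseDate = DatatypeFactory.newInstance()
                .newXMLGregorianCalendar("2015-06-30T12:00:00Z");
        System.out.println(expenseDate.toGregorianCalendar().getTime());
    }
}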