WSO2 CEP - Custom Receiver Adapter: Event Formats

I am trying to build a custom receiver adapter that will read from a CSV file and push events to a stream.
As far as I understand, we have to follow one of the WSO2 standard formats (TEXT, XML, or JSON) to push data to a stream.
The problem is that CSV does not match any of the standard formats stated above, so we have to convert the CSV values to one of the supported formats within the custom adapter.
As per my observation, the WSO2 TEXT format does not support commas (,) within a string value, so I have decided to convert CSV to JSON.
My questions are:
1. How can I generate WSO2 TEXT events if values have commas?
2. (If point 1 is not possible) In my custom adapter's MessageType, if I add either only TEXT or all three (TEXT, XML, JSON), it works fine. But if I add only JSON, I get the error below. My goal is to add only JSON and convert all the CSV to JSON to avoid confusion.
[2016-09-19 15:38:02,406] ERROR {org.wso2.carbon.event.receiver.core.EventReceiverDeployer} - Error, Event Receiver not deployed and in inactive state, Text Mapping is not supported by event adapter type file

To read from a CSV file and push events to a stream, you could use the file-tail adapter. Refer to the sample 'Receiving Custom RegEx Text Events via File Tail'. This sample contains the regex patterns which you could use to map your CSV input.
In addition to this, as Charini has suggested in a comment, you could also check out the event simulator. However, the event simulator is not an event receiver - meaning, it will not receive events in real time; rather, it will "play" a previously defined set of events (in the CSV file, in this case) to simulate a flow of events. It will not continuously monitor the file for new events. If you want to monitor the file for new events, then consider using the file-tail adapter.
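To illustrate the kind of regex-to-attribute mapping that sample relies on, here is a minimal, self-contained Java sketch; the three-column CSV layout and the pattern are hypothetical examples, not taken from the WSO2 sample itself:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CsvRegexDemo {
    public static void main(String[] args) {
        // Hypothetical CSV line: sensor id, temperature, humidity
        String line = "sensor-01,26.4,71";
        // One capturing group per CSV column; the file-tail receiver maps
        // each group to a stream attribute in much the same way
        Pattern pattern = Pattern.compile("^([^,]+),([^,]+),([^,]+)$");
        Matcher matcher = pattern.matcher(line);
        if (matcher.matches()) {
            System.out.println("id=" + matcher.group(1)
                    + ", temp=" + matcher.group(2)
                    + ", humidity=" + matcher.group(3));
        }
    }
}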

I have just made it work. It is not an elegant way, but it works fine for me.
As I have mentioned, the JSON format is the most flexible one to me. I am reading from the file and converting each line/event to the WSO2 JSON format.
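For illustration, here is a minimal sketch of that conversion, assuming a simple three-column CSV and a stream whose payload attributes match; the attribute names and the "event"/"payloadData" envelope shown here are assumptions, so check them against your actual stream definition:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CsvToWso2Json {
    // Builds a WSO2-style JSON event from one CSV line.
    // Assumes three hypothetical payload attributes: id, temp, humidity.
    static String toJsonEvent(String csvLine) {
        // Naive split; a real CSV parser is needed if quoted values
        // may themselves contain commas
        String[] fields = csvLine.split(",");
        return "{\"event\": {\"payloadData\": {"
                + "\"id\": \"" + fields[0] + "\", "
                + "\"temp\": " + fields[1] + ", "
                + "\"humidity\": " + fields[2]
                + "}}}";
    }

    public static void main(String[] args) throws IOException {
        BufferedReader reader = new BufferedReader(new FileReader("events.csv"));
        String line;
        while ((line = reader.readLine()) != null) {
            // In the real adapter this string is handed to the event
            // receiver instead of being printed
            System.out.println(toJsonEvent(line));
        }
        reader.close();
    }
}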
The issue with this option was that I wanted to limit the message format to JSON only in the management console (the "Message Format" menu while creating a new receiver). If I add only JSON [supportInputMessageTypes.add(MessageType.JSON)], it shows the error I mentioned in question #2 above.
The solution is to use the corresponding string directly instead of the static variable from the MessageType class. So now, my method getSupportedMessageFormats() in the EventAdapterFactory class is as below:
@Override
public List<String> getSupportedMessageFormats() {
    List<String> supportInputMessageTypes = new ArrayList<String>();
    // Just convert the type to its string value
    // to avoid the error "Text Mapping is not supported by event adapter type file"
    String jsonType = MessageType.JSON;
    supportInputMessageTypes.add(jsonType);
    //supportInputMessageTypes.add(MessageType.JSON);
    //supportInputMessageTypes.add(MessageType.XML);
    //supportInputMessageTypes.add(MessageType.TEXT);
    return supportInputMessageTypes;
}
My request to the WSO2 team: please allow the JSON format for the file event adapter type.
Thanks, Obaid

Related

QT C++ Project (Creating Login with XML)

I have a question about my internship project. They want me to create a basic login page (ID, password). I created an XML file for the username and password. The program should check the XML file for the username and password. If they are correct, it will direct to a second window. I'm stuck on processing the XML file for the username and password. How can I read that information from the XML file?
As @JarMan said, I would recommend the QXmlStreamReader. You can fill it with a file (QIODevice), QString, QByteArray, etc.
Parsing a value could, e.g., look like this:
xml.attributes().value(attribute).toString();
where attribute is a QString and xml is the QXmlStreamReader.
See the doc https://doc.qt.io/qt-5/qxmlstreamreader.html
There are several ways to do it. Marris mentioned one, but another one is to have this kind of code generated automatically. The way this works is that you first write an XML Schema that describes what your XML data looks like. An introduction to the XML Schema language can be found e.g. here.
Then you use an XML Schema compiler to translate the XML Schema to C++ classes. The schema compiler will also generate code to parse an XML file into objects, meaning you don't have to write any code to deal with XML by hand. It's a purely declarative approach: declare what the data looks like, and let the computer figure out the details.

Is it possible to write back to data file in postman?

While working with Postman, data.someVariable returns data from within a CSV file; the value can also be used as {{someVariable}} in the URI/JSON.
This gives us the data for that variable from that row/iteration.
Is there a mechanism to write back to the data file by doing something like postman.setData('responseCode') = responseCode?
This would be really helpful for storing the response code in the data file and recording call-wise details in the same format as the input within the CSV.
The only solution I figured out is:
- to populate JSON objects in the environment with information about the data file name and the structure/values of the information to be added;
- to create a separate web service (maybe in Node.js) that exposes an HTTP call to write to a file, takes as a parameter a JSON input like the one created in the environment as mentioned above, and writes that to a file / the original data file (or a copy of it) in the desired format - see the sketch after this list;
- to call the above-mentioned web service at the end of each run or desired REST call execution to generate a step-wise information/debug report.
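A minimal sketch of such a write-back service in Java (the answer above suggests Node.js, but any HTTP stack works); the port, the /writeback path, and the append-to-CSV behavior are assumptions made for this example:

import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class WriteBackService {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8089), 0);
        // Postman posts the JSON built in the environment to this endpoint
        server.createContext("/writeback", exchange -> {
            byte[] body = exchange.getRequestBody().readAllBytes();
            String line = new String(body, StandardCharsets.UTF_8);
            // Append one line per call to a copy of the data file
            Files.write(Paths.get("results.csv"),
                    (line + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
            exchange.sendResponseHeaders(200, -1); // no response body
            exchange.close();
        });
        server.start();
    }
}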
There is no way to write back to the data file in Postman as of now.
However, you can populate the value in your environment file at run time using
pm.environment.set("varname", value)
Name varname in such a way that you know it is the variable you wanted to write back into the data file.

JMeter - How to extract values from a response which has been decoded from base64 and stored in a variable? All under the same sampler

I am trying to test a web service's performance and am having a few issues with using and passing variables. There are multiple sequential requests which depend on some data coming from a previous response. All requests need to be encoded to base64 and placed in a SOAP envelope namespace before being sent to the endpoint. It returns an encoded response which needs to be decoded to see the XML values that are needed for the next request. What I have done so far is:
1) A Beanshell pre-processor added to the first sampler to encode the payload, which is read from a file.
2) A regex to pull the encoded response bit from the whole response.
3) A Beanshell post-processor to decode the response and write it to a file (just in case). I have stored the decoded response in a variable 'Output' and I know this works since it writes the response to the file correctly.
4) After this, I have added 4 regex extractors and tried various things such as applying to different parts, checking different fields, checking the JMeter variable, etc. However, it doesn't seem to work.
This is what my tree looks like: [screenshot: JMeter tree]
I am storing the decoded response in the 'Output' variable like this, and it works since it writes to the file properly:
import org.apache.commons.io.FileUtils;
import org.apache.commons.codec.binary.Base64;

// Pull the base64 string captured by the 'Createregex' extractor
String Createresponse = vars.get("Createregex");
// Decode it and store the result in the JMeter variable 'response'
vars.put("response", new String(Base64.decodeBase64(Createresponse.getBytes("UTF-8"))));
Output = vars.get("response");

// Write the decoded response to a file for debugging
f = new FileOutputStream("filepath/Createresponse.txt");
p = new PrintStream(f);
this.interpreter.setOut(p);
print(Output);
p.close();
f.close();
And this is how I am using the regex after that; I have tried different options: [screenshot: Regex Extractor settings]
Unfortunately, though, the regex is not picking up these values from the 'Output' variable. I basically need them saved so I can use ${docID} in the payload file for the next request.
Any help on this is appreciated! Also happy to provide more detail if needed.
EDIT:
I have a follow-up question. I am trying to run this with multiple users. I have a field ${searchuser} in my payload XML file, which is called in the pre-processor [screenshot elided].
The CSV Data Set Config above it looks like this: [screenshot elided]
However, it is not picking up the values from the CSV and substituting them in the payload file. Any help is appreciated!
You have 2 problems with your Regular Expression Extractor configuration:
- Apply to: needs to be JMeter Variable (the response variable holding the decoded output)
- Field to check: needs to be Body; Body as a Document is used for binary file formats like PDF or Word.
By the way, you can do Base64 decoding and encoding using the __base64Decode() and __base64Encode() functions available via JMeter Plugins. The plugins, in their turn, can be installed in one click using the Plugin Manager.
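If the extractor still will not cooperate, the same extraction can be done in code from a JSR223/Beanshell PostProcessor. A minimal sketch, assuming the decoded XML sits in the response variable and contains a <docID> element (the element name is hypothetical; adjust the pattern to the real XML):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Read the decoded XML stored earlier by the post-processor
String output = vars.get("response");
// Capture the text between the (hypothetical) docID tags
Matcher m = Pattern.compile("<docID>(.+?)</docID>").matcher(output);
if (m.find()) {
    // Makes ${docID} available to the next request's payload
    vars.put("docID", m.group(1));
}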

Grok Parse Failure on Custom Log Format and regex in logstash

I have a custom log format; I am new to it, so I am trying to figure out how it works. It is not getting parsed in Logstash. Can someone help identify the issue?
The log format is as follows:
{u'key_id': u'1sdfasdfvaa/sd456dfdffas/zasder==', u'type': u'AUDIO'}, {u'key_id': u'iu-dsfaz+ka/q1sdfQ==', u'type': u'HD'}], u'model': u'Level1', u'license_metadata': {u'license_type': u'STREAMING THE SET', u'request_type': u'NEW', u'content_id': u'AAAA='}, u'message_type': u'LICENSE', u'cert_serial_number': u'AAAASSSSEERRTTYUUIIOOOasa='}
I need to get it parsed in Logstash and then store it in Elasticsearch.
The problem is that none of the existing grok patterns take care of it, and I am unfamiliar with custom regex configuration.
Alain's comment may be useful to you: if that log is, in fact, coming in as JSON, you may want to look at the JSON Filter to automagically parse a JSON message into an Elastic-friendly format, or use the JSON Codec in your input.
If you want to stick with grok, a great resource for building custom grok patterns is Grok Constructor.
It seems like you're dumping a JSON hash from Python 2.x to a logfile and then trying to parse it from Logstash.
First - Fix your JSON format and encoding:
Your file doesn't contain correctly generated JSON strings. My recommendation is to fix this in your application before trying to consume the data from Logstash; if not, you'll have to make use of some tricks to do it from there:

import json

# Disable the ASCII default charset and encode to UTF-8
js_string = json.dumps(u"someCharactersHere", ensure_ascii=False).encode('utf8')
# Validate that your new string is correct
print js_string
Second - Use the Logstash JSON filter
Grok is a module intended to parse any kind of text using regular expressions. Every expression converts to a variable, and those variables can be converted to event fields. You could do it that way, but it would be much more complex and prone to errors.
Your input already has a format (JSON), so you can make use of the Logstash JSON filter. It will do all the heavy lifting for you by converting the JSON structure into fields:
filter {
  json {
    # this is your default input; you shouldn't need to touch it
    source => "message"
    # you can map the result into a variable. Simply uncomment the
    # following:
    # target => "doc"
    # note: if you don't use the target option, the filter will try to
    # map the json string into fields at the 'root' of your event
  }
}
Hope it helps,

Access Alternate Data Stream data from Thumbnail Image Handler code?

I'm trying to write a Thumbnail Image Handler for a custom file type based on Microsoft's example code here: Building Thumbnail Handlers.
As a proof of concept, I'm storing the base64 of a JPEG file in an ADS on my custom file.
My problem is that I'm not sure how to access this data from the DLL code. Microsoft's example uses base64 stored in an XML tag within the custom file type. I don't have the option to modify the internal components of my file type, which led me to using an ADS instead.
Is there a way to access an ADS from the DLL handler code?