GStreamer: display and record an H264 stream simultaneously

Using gst-launch-1.0 I am able to record an H.264 stream to a file with:
gst-launch-1.0 -e v4l2src device=/dev/video3 do-timestamp=true \
! video/x-h264, width=$WIDTH, height=$HEIGHT, framerate=$FRAMERATE/1 \
! h264parse ! mp4mux ! queue ! filesink location=video.mp4
and I am able to display the stream with:
gst-launch-1.0 -e v4l2src device=/dev/video3 \
! video/x-h264, width=$WIDTH,height=$HEIGHT,framerate=$FRAMERATE/1 ! tee name=t \
t. ! queue ! h264parse ! decodebin ! xvimagesink sync=false
However, doing both at the same time fails: nothing happens at all. Command:
gst-launch-1.0 -e v4l2src device=/dev/video3 \
! video/x-h264,width=$WIDTH,height=$HEIGHT,framerate=$FRAMERATE/1 ! tee name=t \
t. ! queue ! h264parse ! decodebin ! xvimagesink sync=false \
t. ! queue ! h264parse ! mp4mux ! filesink location=video.mp4
Output (no error, but no display and the file size stays at 0 kB):
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstTee:t.GstTeePad:src_0: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstTee:t.GstTeePad:src_1: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstH264Parse:h264parse1.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstTee:t.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse2.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse2.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse2.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse2.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-h264, width=(int)640, height=(int)480, framerate=(fraction)15/1, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:7:1, parsed=(boolean)true, profile=(string)main, level=(string)4.1
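For what it's worth, the same tee layout can also be driven from a small script, which makes it easy to push EOS before shutting down so that mp4mux can finalize the file. A minimal Python (PyGObject) sketch of the pipeline above, with the width/height/framerate values hard-coded as placeholders:

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

# Same topology as the combined gst-launch command above; the caps values are placeholders.
pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video3 do-timestamp=true "
    "! video/x-h264,width=640,height=480,framerate=15/1 ! tee name=t "
    "t. ! queue ! h264parse ! decodebin ! xvimagesink sync=false "
    "t. ! queue ! h264parse ! mp4mux ! filesink location=video.mp4")
pipeline.set_state(Gst.State.PLAYING)

bus = pipeline.get_bus()
try:
    msg = None
    while msg is None:
        # Poll the bus so Ctrl-C is still handled by Python.
        msg = bus.timed_pop_filtered(100 * Gst.MSECOND,
                                     Gst.MessageType.ERROR | Gst.MessageType.EOS)
    if msg.type == Gst.MessageType.ERROR:
        print(msg.parse_error())
except KeyboardInterrupt:
    # Push EOS so mp4mux writes its headers before teardown (what -e does for gst-launch-1.0).
    pipeline.send_event(Gst.Event.new_eos())
    bus.timed_pop_filtered(5 * Gst.SECOND,
                           Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)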

Related

GStreamer H264 video and KLV mpegtsmux error

I need your help! I have a .ts video file that carries an H264 video stream and a KLV metadata stream. I need to demux the video, overlay an image on it, and create a new .ts file with the result and the same KLV stream.
I’ve tried different approaches using GStreamer without success. The following pipeline is the simplest version I’ve tried:
gst-launch-1.0 -t -v --gst-debug=3 filesrc location=/home/ubuntu/videoTeste.ts ! tsdemux name=demux demux. ! h264parse ! "video/x-h264" ! avdec_h264 ! x264enc bitrate=10000 ! video/x-h264, profile=baseline ! queue ! mpegtsmux name=encod encod. ! filesink location=/home/ubuntu/teste.ts async=false demux. ! "meta/x-klv" ! queue ! encod.
This is the output:
Setting pipeline to PAUSED ...
0:00:00.053580180 19705 0x5647f4e61c00 WARN basesrc gstbasesrc.c:3600:gst_base_src_start_complete: pad not activated yet
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstCapsFilter:capsfilter3: caps = meta/x-klv
/GstPipeline:pipeline0/GstCapsFilter:capsfilter4: caps = meta/x-klv
/GstPipeline:pipeline0/GstCapsFilter:capsfilter4.GstPad:src: caps = meta/x-klv, parsed=(boolean)true
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = meta/x-klv, parsed=(boolean)true
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:src: caps = meta/x-klv, parsed=(boolean)true
/GstPipeline:pipeline0/MpegTsMux:encod.GstPad:sink_66: caps = meta/x-klv, parsed=(boolean)true
(gst-launch-1.0:19705): GStreamer-CRITICAL **: 11:56:27.551: gst_segment_to_running_time: assertion 'segment->format == format' failed
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
0:00:00.054452886 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 15508 will be dropped
0:00:00.054506922 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 24322 will be dropped
0:00:00.054553198 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 21270 will be dropped
0:00:00.054591203 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 26579 will be dropped
0:00:00.054630071 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25436 will be dropped
0:00:00.054663925 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25557 will be dropped
0:00:00.054697735 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25598 will be dropped
0:00:00.054734388 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25866 will be dropped
0:00:00.054768227 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25782 will be dropped
0:00:00.054805325 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 26169 will be dropped
0:00:00.054838313 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25996 will be dropped
0:00:00.054875026 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25735 will be dropped
0:00:00.054908194 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25991 will be dropped
0:00:00.054940617 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 26233 will be dropped
0:00:00.054974736 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 19915 will be dropped
0:00:00.055007396 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 26268 will be dropped
0:00:00.055037714 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 19609 will be dropped
0:00:00.055069936 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 26015 will be dropped
0:00:00.055103250 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 19613 will be dropped
0:00:00.055135263 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25920 will be dropped
0:00:00.055164929 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 19735 will be dropped
0:00:00.055197066 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25851 will be dropped
0:00:00.055235518 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 19638 will be dropped
0:00:00.055268897 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 19733 will be dropped
0:00:00.055301416 19705 0x5647f4dfcea0 WARN h264parse gsth264parse.c:1349:gst_h264_parse_handle_frame: broken/invalid nal Type: 1 Slice, Size: 25492 will be dropped
It seems to be a synchronization problem between the H264 frames and the KLV chunks, I think...
I've tried changing the placement of the queues, using a tee after filesrc, and using a separate demux and a multiqueue...
I'm able to create a .ts file with the H264 video only, and I'm also able to extract the KLV data and save it to a file. I can even do both in the same pipeline:
gst-launch-1.0 -t -v --gst-debug=0 filesrc location=/home/ubuntu/videoTeste.ts ! tee name=t t. ! queue ! tsdemux name=demux demux. ! h264parse ! "video/x-h264" ! avdec_h264 ! x264enc bitrate=10000 ! video/x-h264, profile=baseline ! mpegtsmux name=encod encod. ! filesink location=/home/ubuntu/teste.ts async=false t. ! queue ! tsdemux ! "meta/x-klv" ! filesink location=/home/ubuntu/klv.dat async=false
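For completeness, since tsdemux only exposes its video and KLV pads dynamically, the same topology can also be wired up from a script with a pad-added handler, which makes it easier to see which branch stalls. A rough Python (PyGObject) sketch reusing the file paths and settings from the pipeline above (untested, element placement assumed):

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

# Two chains: the video re-encode branch and a KLV queue, both ending in mpegtsmux.
pipeline = Gst.parse_launch(
    "filesrc location=/home/ubuntu/videoTeste.ts ! tsdemux name=demux "
    "h264parse name=vparse ! avdec_h264 ! x264enc bitrate=10000 "
    "! video/x-h264,profile=baseline ! queue ! mpegtsmux name=mux "
    "! filesink location=/home/ubuntu/teste.ts async=false "
    "queue name=klvq ! mux.")

vparse = pipeline.get_by_name("vparse")
klvq = pipeline.get_by_name("klvq")

def on_pad_added(demux, pad):
    # Route each dynamic tsdemux pad by its caps: video -> h264parse, KLV -> queue -> mpegtsmux.
    caps = pad.get_current_caps() or pad.query_caps(None)
    name = caps.get_structure(0).get_name()
    if name == "video/x-h264":
        pad.link(vparse.get_static_pad("sink"))
    elif name == "meta/x-klv":
        pad.link(klvq.get_static_pad("sink"))

pipeline.get_by_name("demux").connect("pad-added", on_pad_added)
pipeline.set_state(Gst.State.PLAYING)

bus = pipeline.get_bus()
msg = bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                             Gst.MessageType.ERROR | Gst.MessageType.EOS)
if msg.type == Gst.MessageType.ERROR:
    print(msg.parse_error())
pipeline.set_state(Gst.State.NULL)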
Do you have any suggestions to solve this?
Thank you!

AWS Glue error when converting a DynamicFrame to a Spark DataFrame

I am using an AWS Glue crawler to read some data from S3 into a table.
I would like to then use an AWS Glue job to do some transformations. I am able to modify and run the script on a small file, but when I run it on larger data I get the following error, which seems to be complaining about converting the DynamicFrame to a Spark DataFrame. I am not even sure how to start debugging it.
I didn't see many posts on this here -- only about going the other way, from Spark DataFrames to DynamicFrames.
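For context, the conversion that fails is essentially the standard DynamicFrame-to-DataFrame step from the generated Glue script; it has roughly this shape (the catalog database and table names below are placeholders):

import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ['JOB_NAME'])
sc = SparkContext()
glueContext = GlueContext(sc)
job = Job(glueContext)
job.init(args['JOB_NAME'], args)

# Table created by the crawler; database/table names are placeholders.
datasource0 = glueContext.create_dynamic_frame.from_catalog(
    database="my_database", table_name="my_table")

# This is the call that blows up on the larger dataset (line 30 in the traceback below).
dns = datasource0.toDF()

job.commit()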

er$$anonfun$init$1.apply(GrokReader.scala:62)
at scala.collection.Iterator$$anon$9.next(Iterator.scala:162)
at scala.collection.Iterator$$anon$16.hasNext(Iterator.scala:599)
at com.amazonaws.services.glue.readers.GrokReader.hasNext(GrokReader.scala:117)
at com.amazonaws.services.glue.hadoop.TapeHadoopRecordReader.nextKeyValue(TapeHadoopRecordReader.scala:73)
at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:230)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:462)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.aggregate(TraversableOnce.scala:214)
at scala.collection.AbstractIterator.aggregate(Iterator.scala:1334)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1145)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1145)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$25.apply(RDD.scala:1146)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$25.apply(RDD.scala:1146)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
2020-05-12 18:49:22,434 INFO [Thread-9] scheduler.DAGScheduler (Logging.scala:logInfo(54)) - Job 0 failed: fromRDD at DynamicFrame.scala:241, took 8327.404883 s
2020-05-12 18:49:22,450 WARN [task-result-getter-1] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9983.0 in stage 0.0 (TID 9986, ip-172-32-50-149.us-west-2.compute.internal, executor 2): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,451 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 10000.0 in stage 0.0 (TID 10003, ip-172-32-50-149.us-west-2.compute.internal, executor 2): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,451 WARN [task-result-getter-3] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9986.0 in stage 0.0 (TID 9989, ip-172-32-50-149.us-west-2.compute.internal, executor 2): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,451 WARN [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9985.0 in stage 0.0 (TID 9988, ip-172-32-50-149.us-west-2.compute.internal, executor 2): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,451 WARN [task-result-getter-1] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9864.0 in stage 0.0 (TID 9864, ip-172-32-62-222.us-west-2.compute.internal, executor 5): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,454 INFO [dispatcher-event-loop-3] storage.BlockManagerInfo (Logging.scala:logInfo(54)) - Added broadcast_25_piece0 in memory on ip-172-32-56-53.us-west-2.compute.internal:34837 (size: 32.1 KB, free: 2.8 GB)
2020-05-12 18:49:22,455 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9900.0 in stage 0.0 (TID 9900, ip-172-32-62-222.us-west-2.compute.internal, executor 5): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,456 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9991.0 in stage 0.0 (TID 9994, ip-172-32-56-53.us-west-2.compute.internal, executor 4): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,456 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9949.0 in stage 0.0 (TID 9949, ip-172-32-56-53.us-west-2.compute.internal, executor 4): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,456 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9975.0 in stage 0.0 (TID 9977, ip-172-32-62-222.us-west-2.compute.internal, executor 5): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,456 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9995.0 in stage 0.0 (TID 9998, ip-172-32-62-222.us-west-2.compute.internal, executor 7): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,456 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 10001.0 in stage 0.0 (TID 10004, ip-172-32-62-222.us-west-2.compute.internal, executor 5): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,457 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9993.0 in stage 0.0 (TID 9996, ip-172-32-62-222.us-west-2.compute.internal, executor 7): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,457 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9939.0 in stage 0.0 (TID 9939, ip-172-32-62-222.us-west-2.compute.internal, executor 7): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,457 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9930.0 in stage 0.0 (TID 9930, ip-172-32-62-222.us-west-2.compute.internal, executor 7): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,457 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9998.0 in stage 0.0 (TID 10001, ip-172-32-54-163.us-west-2.compute.internal, executor 6): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,462 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9965.0 in stage 0.0 (TID 9967, ip-172-32-56-53.us-west-2.compute.internal, executor 1): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,463 WARN [task-result-getter-3] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9934.0 in stage 0.0 (TID 9934, ip-172-32-56-53.us-west-2.compute.internal, executor 1): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,463 WARN [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9990.0 in stage 0.0 (TID 9993, ip-172-32-56-53.us-west-2.compute.internal, executor 1): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,464 WARN [task-result-getter-1] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9992.0 in stage 0.0 (TID 9995, ip-172-32-56-53.us-west-2.compute.internal, executor 4): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,464 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9967.0 in stage 0.0 (TID 9969, ip-172-32-56-53.us-west-2.compute.internal, executor 4): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,464 WARN [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9982.0 in stage 0.0 (TID 9985, ip-172-32-54-163.us-west-2.compute.internal, executor 3): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,464 WARN [task-result-getter-3] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9999.0 in stage 0.0 (TID 10002, ip-172-32-54-163.us-west-2.compute.internal, executor 3): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,464 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9984.0 in stage 0.0 (TID 9987, ip-172-32-54-163.us-west-2.compute.internal, executor 3): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,464 WARN [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9966.0 in stage 0.0 (TID 9968, ip-172-32-54-163.us-west-2.compute.internal, executor 3): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,467 WARN [task-result-getter-3] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9996.0 in stage 0.0 (TID 9999, ip-172-32-54-163.us-west-2.compute.internal, executor 6): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,474 WARN [task-result-getter-1] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9960.0 in stage 0.0 (TID 9962, ip-172-32-54-163.us-west-2.compute.internal, executor 6): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,474 WARN [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logWarning(66)) - Lost task 9980.0 in stage 0.0 (TID 9982, ip-172-32-54-163.us-west-2.compute.internal, executor 6): TaskKilled (Stage cancelled)
2020-05-12 18:49:22,514 INFO [dispatcher-event-loop-0] yarn.YarnAllocator (Logging.scala:logInfo(54)) - Driver requested a total number of 1 executor(s).
Traceback (most recent call last):
File "script_2020-05-12-16-29-01.py", line 30, in <module>
dns = datasource0.toDF()
File "/mnt/yarn/usercache/root/appcache/application_1589300850182_0001/container_1589300850182_0001_01_000001/PyGlue.zip/awsglue/dynamicframe.py", line 147, in toDF
return DataFrame(self._jdf.toDF(self.glue_ctx._jvm.PythonUtils.toSeq(scala_options)), self.glue_ctx)
File "/mnt/yarn/usercache/root/appcache/application_1589300850182_0001/container_1589300850182_0001_01_000001/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "/mnt/yarn/usercache/root/appcache/application_1589300850182_0001/container_1589300850182_0001_01_000001/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
File "/mnt/yarn/usercache/root/appcache/application_1589300850182_0001/container_1589300850182_0001_01_000001/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o66.toDF.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 9929 in stage 0.0 failed 4 times, most recent failure: Lost task 9929.3 in stage 0.0 (TID 9983, ip-172-32-56-53.us-west-2.compute.internal, executor 1): java.io.IOException: too many length or distance symbols
at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.inflateBytesDirect(Native Method)
at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.decompress(ZlibDecompressor.java:225)
at org.apache.hadoop.io.compress.DecompressorStream.decompress(DecompressorStream.java:111)
at org.apache.hadoop.io.compress.DecompressorStream.read(DecompressorStream.java:105)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at com.amazonaws.services.glue.readers.BufferedStream.read(DynamicRecordReader.scala:91)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
at java.io.BufferedReader.readLine(BufferedReader.java:389)
at com.amazonaws.services.glue.readers.GrokReader$$anonfun$init$1$$anonfun$apply$1.apply$mcV$sp(GrokReader.scala:68)
at scala.util.control.Breaks.breakable(Breaks.scala:38)
at com.amazonaws.services.glue.readers.GrokReader$$anonfun$init$1.apply(GrokReader.scala:66)
at com.amazonaws.services.glue.readers.GrokReader$$anonfun$init$1.apply(GrokReader.scala:62)
at scala.collection.Iterator$$anon$9.next(Iterator.scala:162)
at scala.collection.Iterator$$anon$16.hasNext(Iterator.scala:599)
at com.amazonaws.services.glue.readers.GrokReader.hasNext(GrokReader.scala:117)
at com.amazonaws.services.glue.hadoop.TapeHadoopRecordReader.nextKeyValue(TapeHadoopRecordReader.scala:73)
at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:230)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:462)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.aggregate(TraversableOnce.scala:214)
at scala.collection.AbstractIterator.aggregate(Iterator.scala:1334)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1145)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1145)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$25.apply(RDD.scala:1146)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$25.apply(RDD.scala:1146)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2158)
at org.apache.spark.rdd.RDD$$anonfun$fold$1.apply(RDD.scala:1098)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.fold(RDD.scala:1092)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1.apply(RDD.scala:1161)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.treeAggregate(RDD.scala:1137)
at org.apache.spark.sql.glue.util.SchemaUtils$.fromRDD(SchemaUtils.scala:72)
at com.amazonaws.services.glue.DynamicFrame.recomputeSchema(DynamicFrame.scala:241)
at com.amazonaws.services.glue.DynamicFrame.schema(DynamicFrame.scala:227)
at com.amazonaws.services.glue.DynamicFrame.toDF(DynamicFrame.scala:290)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: too many length or distance symbols
at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.inflateBytesDirect(Native Method)
at org.apache.hadoop.io.compress.zlib.ZlibDecompressor.decompress(ZlibDecompressor.java:225)
at org.apache.hadoop.io.compress.DecompressorStream.decompress(DecompressorStream.java:111)
at org.apache.hadoop.io.compress.DecompressorStream.read(DecompressorStream.java:105)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at com.amazonaws.services.glue.readers.BufferedStream.read(DynamicRecordReader.scala:91)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
at java.io.BufferedReader.readLine(BufferedReader.java:389)
at com.amazonaws.services.glue.readers.GrokReader$$anonfun$init$1$$anonfun$apply$1.apply$mcV$sp(GrokReader.scala:68)
at scala.util.control.Breaks.breakable(Breaks.scala:38)
at com.amazonaws.services.glue.readers.GrokReader$$anonfun$init$1.apply(GrokReader.scala:66)
at com.amazonaws.services.glue.readers.GrokReader$$anonfun$init$1.apply(GrokReader.scala:62)
at scala.collection.Iterator$$anon$9.next(Iterator.scala:162)
at scala.collection.Iterator$$anon$16.hasNext(Iterator.scala:599)
at com.amazonaws.services.glue.readers.GrokReader.hasNext(GrokReader.scala:117)
at com.amazonaws.services.glue.hadoop.TapeHadoopRecordReader.nextKeyValue(TapeHadoopRecordReader.scala:73)
at org.apache.spark.rdd.NewHadoopRDD$$anon$1.hasNext(NewHadoopRDD.scala:230)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:462)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
at scala.collection.TraversableOnce$class.aggregate(TraversableOnce.scala:214)
at scala.collection.AbstractIterator.aggregate(Iterator.scala:1334)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1145)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$24.apply(RDD.scala:1145)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$25.apply(RDD.scala:1146)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1$$anonfun$25.apply(RDD.scala:1146)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:801)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
... 1 more
2020-05-12 18:49:22,587 ERROR [Driver] yarn.ApplicationMaster (Logging.scala:logError(70)) - User application exited with status 1
2020-05-12 18:49:22,588 INFO [Driver] yarn.ApplicationMaster (Logging.scala:logInfo(54)) - Final app status: FAILED, exitCode: 1, (reason: User application exited with status 1)
2020-05-12 18:49:22,591 INFO [pool-4-thread-1] spark.SparkContext (Logging.scala:logInfo(54)) - Invoking stop() from shutdown hook
2020-05-12 18:49:22,594 INFO [pool-4-thread-1] server.AbstractConnector (AbstractConnector.java:doStop(318)) - Stopped Spark#3a4d5cae{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
2020-05-12 18:49:22,595 INFO [pool-4-thread-1] ui.SparkUI (Logging.scala:logInfo(54)) - Stopped Spark web UI at http://ip-172-32-50-149.us-west-2.compute.internal:40355
2020-05-12 18:49:22,597 INFO [dispatcher-event-loop-2] yarn.YarnAllocator (Logging.scala:logInfo(54)) - Driver requested a total number of 0 executor(s).
2020-05-12 18:49:22,598 INFO [pool-4-thread-1] cluster.YarnClusterSchedulerBackend (Logging.scala:logInfo(54)) - Shutting down all executors
2020-05-12 18:49:22,598 INFO [dispatcher-event-loop-3] cluster.YarnSchedulerBackend$YarnDriverEndpoint (Logging.scala:logInfo(54)) - Asking each executor to shut down
2020-05-12 18:49:22,600 INFO [pool-4-thread-1] cluster.SchedulerExtensionServices (Logging.scala:logInfo(54)) - Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
started=false)
2020-05-12 18:49:22,604 INFO [dispatcher-event-loop-3] spark.MapOutputTrackerMasterEndpoint (Logging.scala:logInfo(54)) - MapOutputTrackerMasterEndpoint stopped!
2020-05-12 18:49:22,616 INFO [pool-4-thread-1] memory.MemoryStore (Logging.scala:logInfo(54)) - MemoryStore cleared
2020-05-12 18:49:22,616 INFO [pool-4-thread-1] storage.BlockManager (Logging.scala:logInfo(54)) - BlockManager stopped
2020-05-12 18:49:22,617 INFO [pool-4-thread-1] storage.BlockManagerMaster (Logging.scala:logInfo(54)) - BlockManagerMaster stopped
2020-05-12 18:49:22,618 INFO [dispatcher-event-loop-2] scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint (Logging.scala:logInfo(54)) - OutputCommitCoordinator stopped!
2020-05-12 18:49:22,621 INFO [pool-4-thread-1] spark.SparkContext (Logging.scala:logInfo(54)) - Successfully stopped SparkContext
2020-05-12 18:49:22,623 INFO [pool-4-thread-1] yarn.ApplicationMaster (Logging.scala:logInfo(54)) - Unregistering ApplicationMaster with FAILED (diag message: User application exited with status 1)
2020-05-12 18:49:22,631 INFO [pool-4-thread-1] impl.AMRMClientImpl (AMRMClientImpl.java:unregisterApplicationMaster(476)) - Waiting for application to be successfully unregistered.
2020-05-12 18:49:22,733 INFO [pool-4-thread-1] yarn.ApplicationMaster

WSO2 API Manager - Can't start the server

I would like to install and use WSO2 API Manager on my computer, but I have some issues when I try to start the server.
I am using API Manager version 3.0.0, installed through the .msi installer for Windows.
Here are the errors that appear when I try to start the server with a clean setup:
[Broker] BRK-1001 : Startup : Version: 0.11 Build: 90784:90849
[Broker] MNG-1001 : Startup
[Broker] MNG-1004 : Ready : Using the platform JMX Agent
[Broker] BRK-1002 : Starting : Listening on TCP port 5672
[2020-04-01 11:18:18,692] INFO - listening [Broker] BRK-1002 : Starting : Listening on TCP port 5672
[Broker] BRK-1002 : Starting : Listening on TCP/SSL port 8672
[2020-04-01 11:18:18,706] INFO - listening [Broker] BRK-1002 : Starting : Listening on TCP/SSL port 8672
[Broker] BRK-1004 : Qpid Broker Ready
[2020-04-01 11:18:25,147] WARN - RevokedJWTTokensRetriever Failed retrieving revoked JWT token signatures from remote endpoint: Connection refused: connect. Retrying after 15 seconds...
[2020-04-01 11:18:25,147] WARN - KeyTemplateRetriever Failed retrieving throttling data from remote endpoint: Connection refused: connect. Retrying after 15 seconds...
[2020-04-01 11:18:25,161] WARN - BlockingConditionRetriever Failed retrieving Blocking Conditions from remote endpoint: Connection refused: connect. Retrying after 15 seconds...
[2020-04-01 11:18:35,958] ERROR - DataEndpointConnectionWorker Error while trying to connect to the endpoint. Cannot borrow client for ssl://192.168.1.42:9711
org.wso2.carbon.databridge.agent.exception.DataEndpointAuthenticationException: Cannot borrow client for ssl://192.168.1.42:9711
at org.wso2.carbon.databridge.agent.endpoint.DataEndpointConnectionWorker.connect(DataEndpointConnectionWorker.java:147) ~[org.wso2.carbon.databridge.agent_5.2.12.jar:?]
at org.wso2.carbon.databridge.agent.endpoint.DataEndpointConnectionWorker.run(DataEndpointConnectionWorker.java:59) [org.wso2.carbon.databridge.agent_5.2.12.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
Caused by: org.wso2.carbon.databridge.agent.exception.DataEndpointException: Error while opening socket to 192.168.1.42:9711. Connection timed out: connect
at org.wso2.carbon.databridge.agent.endpoint.binary.BinarySecureClientPoolFactory.createClient(BinarySecureClientPoolFactory.java:75) ~[org.wso2.carbon.databridge.agent_5.2.12.jar:?]
at org.wso2.carbon.databridge.agent.client.AbstractClientPoolFactory.makeObject(AbstractClientPoolFactory.java:39) ~[org.wso2.carbon.databridge.agent_5.2.12.jar:?]
at org.apache.commons.pool.impl.GenericKeyedObjectPool.borrowObject(GenericKeyedObjectPool.java:1212) ~[commons-pool_1.5.6.wso2v1.jar:?]
at org.wso2.carbon.databridge.agent.endpoint.DataEndpointConnectionWorker.connect(DataEndpointConnectionWorker.java:137) ~[org.wso2.carbon.databridge.agent_5.2.12.jar:?]
... 6 more
Caused by: java.net.ConnectException: Connection timed out: connect
at java.net.PlainSocketImpl.connect0(Native Method) ~[?:?]
at java.net.PlainSocketImpl.socketConnect(PlainSocketImpl.java:101) ~[?:?]
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:399) ~[?:?]
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:242) ~[?:?]
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:224) ~[?:?]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:403) ~[?:?]
at java.net.Socket.connect(Socket.java:609) ~[?:?]
at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:285) ~[?:?]
at sun.security.ssl.SSLSocketImpl.<init>(SSLSocketImpl.java:144) ~[?:?]
at sun.security.ssl.SSLSocketFactoryImpl.createSocket(SSLSocketFactoryImpl.java:88) ~[?:?]
at org.wso2.carbon.databridge.agent.endpoint.binary.BinarySecureClientPoolFactory.createClient(BinarySecureClientPoolFactory.java:58) ~[org.wso2.carbon.databridge.agent_5.2.12.jar:?]
at org.wso2.carbon.databridge.agent.client.AbstractClientPoolFactory.makeObject(AbstractClientPoolFactory.java:39) ~[org.wso2.carbon.databridge.agent_5.2.12.jar:?]
at org.apache.commons.pool.impl.GenericKeyedObjectPool.borrowObject(GenericKeyedObjectPool.java:1212) ~[commons-pool_1.5.6.wso2v1.jar:?]
at org.wso2.carbon.databridge.agent.endpoint.DataEndpointConnectionWorker.connect(DataEndpointConnectionWorker.java:137) ~[org.wso2.carbon.databridge.agent_5.2.12.jar:?]
... 6 more
[2020-04-01 11:18:39,720] ERROR - QpidServiceComponent Wait until Qpid server starts on port 5672
java.net.ConnectException: Connection timed out: connect
at java.net.PlainSocketImpl.connect0(Native Method) ~[?:?]
at java.net.PlainSocketImpl.socketConnect(PlainSocketImpl.java:101) ~[?:?]
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:399) ~[?:?]
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:240) ~[?:?]
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:224) ~[?:?]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:403) ~[?:?]
at java.net.Socket.connect(Socket.java:609) ~[?:?]
at java.net.Socket.connect(Socket.java:558) ~[?:?]
at java.net.Socket.<init>(Socket.java:454) ~[?:?]
at java.net.Socket.<init>(Socket.java:264) ~[?:?]
at org.wso2.carbon.andes.internal.QpidServiceComponent.startAMQPServer(QpidServiceComponent.java:463) [org.wso2.carbon.andes_3.3.3.jar:?]
at org.wso2.carbon.andes.internal.QpidServiceComponent.startAndesBroker(QpidServiceComponent.java:423) [org.wso2.carbon.andes_3.3.3.jar:?]
at org.wso2.carbon.andes.internal.QpidServiceComponent.activate(QpidServiceComponent.java:132) [org.wso2.carbon.andes_3.3.3.jar:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponent.activate(ServiceComponent.java:235) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.activate(ServiceComponentProp.java:146) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.build(ServiceComponentProp.java:345) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponent(InstanceProcess.java:620) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponents(InstanceProcess.java:197) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.Resolver.getEligible(Resolver.java:343) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.SCRManager.serviceChanged(SCRManager.java:222) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.osgi.internal.serviceregistry.FilteredServiceListener.serviceChanged(FilteredServiceListener.java:113) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.framework.BundleContextImpl.dispatchEvent(BundleContextImpl.java:985) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:234) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:151) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEventPrivileged(ServiceRegistry.java:866) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEvent(ServiceRegistry.java:804) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistrationImpl.register(ServiceRegistrationImpl.java:130) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.registerService(ServiceRegistry.java:228) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.framework.BundleContextImpl.registerService(BundleContextImpl.java:525) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.framework.BundleContextImpl.registerService(BundleContextImpl.java:544) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.wso2.carbon.server.admin.internal.ServerAdminServiceComponent.activate(ServerAdminServiceComponent.java:99) [org.wso2.carbon.server.admin_4.5.1.jar:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponent.activate(ServiceComponent.java:260) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.activate(ServiceComponentProp.java:146) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.build(ServiceComponentProp.java:345) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponent(InstanceProcess.java:620) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponents(InstanceProcess.java:197) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.Resolver.getEligible(Resolver.java:343) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.SCRManager.serviceChanged(SCRManager.java:222) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.osgi.internal.serviceregistry.FilteredServiceListener.serviceChanged(FilteredServiceListener.java:113) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.framework.BundleContextImpl.dispatchEvent(BundleContextImpl.java:985) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:234) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:151) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEventPrivileged(ServiceRegistry.java:866) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEvent(ServiceRegistry.java:804) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistrationImpl.register(ServiceRegistrationImpl.java:130) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.registerService(ServiceRegistry.java:228) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.framework.BundleContextImpl.registerService(BundleContextImpl.java:525) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.framework.BundleContextImpl.registerService(BundleContextImpl.java:544) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.wso2.carbon.core.init.CarbonServerManager.initializeCarbon(CarbonServerManager.java:529) [org.wso2.carbon.core_4.5.1.jar:?]
at org.wso2.carbon.core.init.CarbonServerManager.start(CarbonServerManager.java:234) [org.wso2.carbon.core_4.5.1.jar:?]
at org.wso2.carbon.core.internal.CarbonCoreServiceComponent.activate(CarbonCoreServiceComponent.java:85) [org.wso2.carbon.core_4.5.1.jar:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponent.activate(ServiceComponent.java:260) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.activate(ServiceComponentProp.java:146) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.build(ServiceComponentProp.java:345) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponent(InstanceProcess.java:620) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponents(InstanceProcess.java:197) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.Resolver.getEligible(Resolver.java:343) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.equinox.internal.ds.SCRManager.serviceChanged(SCRManager.java:222) [org.eclipse.equinox.ds_1.4.400.v20160226-2036.jar:?]
at org.eclipse.osgi.internal.serviceregistry.FilteredServiceListener.serviceChanged(FilteredServiceListener.java:113) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.framework.BundleContextImpl.dispatchEvent(BundleContextImpl.java:985) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:234) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:151) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEventPrivileged(ServiceRegistry.java:866) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEvent(ServiceRegistry.java:804) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistrationImpl.register(ServiceRegistrationImpl.java:130) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.registerService(ServiceRegistry.java:228) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.osgi.internal.framework.BundleContextImpl.registerService(BundleContextImpl.java:525) [org.eclipse.osgi_3.14.0.v20190517-1309.jar:?]
at org.eclipse.equinox.http.servlet.internal.Activator.registerHttpService(Activator.java:81) [org.eclipse.equinox.http.servlet_1.1.400.v20130418-1354.jar:?]
at org.eclipse.equinox.http.servlet.internal.Activator.addProxyServlet(Activator.java:60) [org.eclipse.equinox.http.servlet_1.1.400.v20130418-1354.jar:?]
at org.eclipse.equinox.http.servlet.internal.ProxyServlet.init(ProxyServlet.java:40) [org.eclipse.equinox.http.servlet_1.1.400.v20130418-1354.jar:?]
at org.wso2.carbon.tomcat.ext.servlet.DelegationServlet.init(DelegationServlet.java:38) [org.wso2.carbon.tomcat.ext_4.5.1.jar:?]
at org.apache.catalina.core.StandardWrapper.initServlet(StandardWrapper.java:1122) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1077) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:971) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4868) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5177) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1384) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1374) [tomcat_9.0.22.wso2v1.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75) [tomcat_9.0.22.wso2v1.jar:?]
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:140) [?:?]
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:909) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:841) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1384) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1374) [tomcat_9.0.22.wso2v1.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75) [tomcat_9.0.22.wso2v1.jar:?]
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:140) [?:?]
at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:909) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.StandardEngine.startInternal(StandardEngine.java:262) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183) [tomcat_9.0.22.wso2v1.jar:?]
at org.wso2.carbon.tomcat.ext.service.ExtendedStandardService.startInternal(ExtendedStandardService.java:52) [org.wso2.carbon.tomcat.ext_4.5.1.jar:?]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.core.StandardServer.startInternal(StandardServer.java:932) [tomcat_9.0.22.wso2v1.jar:?]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183) [tomcat_9.0.22.wso2v1.jar:?]
at org.wso2.carbon.tomcat.internal.CarbonTomcat.start(CarbonTomcat.java:113) [org.wso2.carbon.tomcat_4.5.1.jar:?]
at org.wso2.carbon.tomcat.internal.ServerManager$1.run(ServerManager.java:167) [org.wso2.carbon.tomcat_4.5.1.jar:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
I have already tried applying an offset to the ports, but it still does not work.
I have checked port usage with netstat, and none of the WSO2 ports are already in use.
Thank you for your help.
Best Regards,
Edit: I finally found out how to solve my problem. It seems that localhost is not the default address, so I had to specify it manually in the deployment.toml file for the throttling and AMQP broker settings.
Here is what I added:
[apim.throttling]
receiver_url = "tcp://localhost:9611"
receiver_auth_url = "ssl://localhost:9711"
[broker.transport.amqp]
bind_address = "localhost"
[broker.transport.amqp.default_connection]
enabled = true
port = 5672
The deployment.toml configuration file is located at
./repository/conf/deployment.toml
I found the file using the command below on macOS:
find ./ -name "deployment.toml"
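After restarting the server, a quick way to confirm the new bindings took effect (besides netstat) is to try opening TCP connections to the configured ports. The following is only a hypothetical sanity-check sketch, not part of the original post; the port numbers are the ones from the deployment.toml snippet above.
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Hypothetical sanity check: verify that the throttling receiver (9611),
// its auth endpoint (9711) and the AMQP broker (5672) accept TCP
// connections on localhost after the configuration change.
public class PortCheck {
    public static void main(String[] args) {
        int[] ports = {9611, 9711, 5672};
        for (int port : ports) {
            try (Socket socket = new Socket()) {
                // Plain TCP connect only; 9711 is TLS, but a successful connect
                // still tells us whether anything is listening on the port.
                socket.connect(new InetSocketAddress("localhost", port), 2000);
                System.out.println("localhost:" + port + " is reachable");
            } catch (IOException e) {
                System.out.println("localhost:" + port + " is NOT reachable: " + e.getMessage());
            }
        }
    }
}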

Pubsub subscription receives unknown host exception

I have implemented a Pub/Sub poller which subscribes to a specific topic and then schedules tasks that poll for new messages at a specified interval.
I occasionally see it get an UnknownHostException, which results in no messages being received:
java.net.UnknownHostException: accounts.google.com
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184) ~[na:1.8.0_25]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) ~[na:1.8.0_25]
at java.net.Socket.connect(Socket.java:589) ~[na:1.8.0_25]
at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:649) ~[na:1.8.0_25]
at sun.net.NetworkClient.doConnect(NetworkClient.java:175) ~[na:1.8.0_25]
at sun.net.www.http.HttpClient.openServer(HttpClient.java:432) ~[na:1.8.0_25]
at sun.net.www.http.HttpClient.openServer(HttpClient.java:527) ~[na:1.8.0_25]
at sun.net.www.protocol.https.HttpsClient.<init>(HttpsClient.java:275) ~[na:1.8.0_25]
at sun.net.www.protocol.https.HttpsClient.New(HttpsClient.java:371) ~[na:1.8.0_25]
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(AbstractDelegateHttpsURLConnection.java:191) ~[na:1.8.0_25]
at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1103) ~[na:1.8.0_25]
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:997) ~[na:1.8.0_25]
at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:177) ~[na:1.8.0_25]
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream0(HttpURLConnection.java:1281) ~[na:1.8.0_25]
at sun.net.www.protocol.http.HttpURLConnection.getOutputStream(HttpURLConnection.java:1256) ~[na:1.8.0_25]
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getOutputStream(HttpsURLConnectionImpl.java:250) ~[na:1.8.0_25]
at com.google.api.client.http.javanet.NetHttpRequest.execute(NetHttpRequest.java:77) ~[google-http-client-1.22.0.jar:1.22.0]
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:981) ~[google-http-client-1.22.0.jar:1.22.0]
at com.google.api.client.auth.oauth2.TokenRequest.executeUnparsed(TokenRequest.java:283) ~[google-oauth-client-1.22.0.jar:1.22.0]
at com.google.api.client.auth.oauth2.TokenRequest.execute(TokenRequest.java:307) ~[google-oauth-client-1.22.0.jar:1.22.0]
at com.google.api.client.googleapis.auth.oauth2.GoogleCredential.executeRefreshToken(GoogleCredential.java:384) ~[google-api-client-1.22.0.jar:1.22.0]
at com.google.api.client.auth.oauth2.Credential.refreshToken(Credential.java:489) ~[google-oauth-client-1.22.0.jar:1.22.0]
at com.google.api.client.auth.oauth2.Credential.intercept(Credential.java:217) ~[google-oauth-client-1.22.0.jar:1.22.0]
at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:868) ~[google-http-client-1.22.0.jar:1.22.0]
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419) ~[google-api-client-1.22.0.jar:1.22.0]
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352) ~[google-api-client-1.22.0.jar:1.22.0]
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469) ~[google-api-client-1.22.0.jar:1.22.0]
at com.company.events.subscriber.PubsubPollerImpl.doPoll(PubsubPollerImpl.java:86) [classes/:na]
at com.company.events.subscriber.PubsubPollerImpl.poll(PubsubPollerImpl.java:58) [classes/:na]
at com.company.events.subscriber.PollingEventListenerContainer.pull(PollingEventListenerContainer.java:132) [classes/:na]
at com.company.events.subscriber.PollingEventListenerContainer.lambda$scheduleRecurring$0(PollingEventListenerContainer.java:120) [classes/:na]
at com.company.events.subscriber.PollingEventListenerContainer$$Lambda$1/777341499.run(Unknown Source) [classes/:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_25]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_25]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_25]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_25]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_25]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_25]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_25]
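The trace shows the failure happening while the Google OAuth client refreshes its access token (GoogleCredential.executeRefreshToken), so the UnknownHostException for accounts.google.com usually indicates a transient DNS or network problem rather than a bug in the poller itself. As a rough illustration only (not the original code), one way to keep a single failed resolution from silently dropping a polling cycle is to wrap the pull in a small retry with backoff; PollAction below is a hypothetical stand-in for the PubsubPollerImpl.doPoll call seen in the trace.
import java.io.IOException;
import java.util.concurrent.TimeUnit;

// Hypothetical retry wrapper around one polling cycle.
public class RetryingPoll {

    // Stand-in for PubsubPollerImpl.doPoll, which can fail with an IOException
    // (UnknownHostException is a subclass of IOException).
    @FunctionalInterface
    public interface PollAction {
        void run() throws IOException;
    }

    public static void pollWithRetry(PollAction doPoll) throws IOException, InterruptedException {
        final int maxAttempts = 3;
        long backoffMillis = 500;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                doPoll.run();
                return; // success, nothing more to do for this cycle
            } catch (IOException e) {
                System.err.println("poll attempt " + attempt + " failed: " + e);
                if (attempt == maxAttempts) {
                    throw e; // give up; the scheduler will run the next cycle anyway
                }
                TimeUnit.MILLISECONDS.sleep(backoffMillis);
                backoffMillis *= 2; // simple exponential backoff
            }
        }
    }
}
The recurring task scheduled by PollingEventListenerContainer could then call something like pollWithRetry(poller::doPoll) instead of invoking the poll directly.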

Issue with AS 5.0.0 and BAM 2.0.0 when I create a new tenant in AS

In AS 5.0.0, when I create a tenant with the BAM statistics bundle disabled, everything works fine and I can log in to the tenant. But if I activate the BAM statistics bundle and try to log in to the same tenant, I see this error in the console:
[2012-10-15 18:17:53,653] INFO {org.wso2.carbon.core.services.authentication.AuthenticationAdmin} - 'admin@carbon.super [-1234]' logged out at [2012-10-15 18:17:53,0653]
[2012-10-15 18:18:03,306] jorgeio@cdae.uci.cu [2] [Application Server] INFO {org.wso2.carbon.core.services.util.CarbonAuthenticationUtil} - 'jorgeio@cdae.uci.cu [2]' logged in at [2012-10-15 18:18:03,306-0400]
[2012-10-15 18:18:03,327] INFO {org.wso2.carbon.core.multitenancy.TenantAxisConfigurator} - Creating tenant AxisConfiguration for tenant: cdae.uci.cu[2]
[2012-10-15 18:18:03,450] WARN {org.wso2.carbon.stratos.landing.page.deployer.LandingPageWebappDeployer} - Product landing page not found.
[2012-10-15 18:18:03,565] #carbon.super [2] [Application Server] ERROR {org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils} - Error occurred while running deployment for tenant cdae.uci.cu
java.lang.NullPointerException
at org.wso2.carbon.bam.service.data.publisher.publish.StreamDefinitionCreatorUtil.getStreamDefinition(StreamDefinitionCreatorUtil.java:30)
at org.wso2.carbon.bam.service.data.publisher.internal.ServiceStatisticsAxis2ConfigurationContextObserver.setEventingConfigDataSpecificForTenant(ServiceStatisticsAxis2ConfigurationContextObserver.java:76)
at org.wso2.carbon.bam.service.data.publisher.internal.ServiceStatisticsAxis2ConfigurationContextObserver.createdConfigurationContext(ServiceStatisticsAxis2ConfigurationContextObserver.java:51)
at org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils.createTenantConfigurationContext(TenantAxisUtils.java:326)
at org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils.getTenantConfigurationContext(TenantAxisUtils.java:121)
at org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils.getTenantAxisConfiguration(TenantAxisUtils.java:101)
at org.wso2.carbon.activation.module.ActivationHandler.invoke(ActivationHandler.java:90)
at org.apache.axis2.engine.Phase.invokeHandler(Phase.java:340)
at org.apache.axis2.engine.Phase.invoke(Phase.java:313)
at org.apache.axis2.engine.AxisEngine.invoke(AxisEngine.java:262)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:168)
at org.apache.axis2.transport.local.LocalTransportReceiver.processMessage(LocalTransportReceiver.java:169)
at org.apache.axis2.transport.local.LocalTransportReceiver.processMessage(LocalTransportReceiver.java:82)
at org.wso2.carbon.core.transports.local.CarbonLocalTransportSender.finalizeSendWithToAddress(CarbonLocalTransportSender.java:45)
at org.apache.axis2.transport.local.LocalTransportSender.invoke(LocalTransportSender.java:77)
at org.apache.axis2.engine.AxisEngine.send(AxisEngine.java:443)
at org.apache.axis2.description.OutInAxisOperationClient.send(OutInAxisOperation.java:406)
at org.apache.axis2.description.OutInAxisOperationClient.executeImpl(OutInAxisOperation.java:229)
at org.apache.axis2.client.OperationClient.execute(OperationClient.java:165)
at org.wso2.carbon.core.commons.stub.loggeduserinfo.LoggedUserInfoAdminStub.getUserInfo(LoggedUserInfoAdminStub.java:187)
at org.wso2.carbon.ui.AbstractCarbonUIAuthenticator.setUserAuthorizationInfo(AbstractCarbonUIAuthenticator.java:278)
at org.wso2.carbon.ui.AbstractCarbonUIAuthenticator.processUserAuthorization(AbstractCarbonUIAuthenticator.java:196)
at org.wso2.carbon.ui.DefaultCarbonAuthenticator.authenticate(DefaultCarbonAuthenticator.java:198)
at org.wso2.carbon.ui.DefaultCarbonAuthenticator.authenticate(DefaultCarbonAuthenticator.java:121)
at org.wso2.carbon.ui.CarbonUILoginUtil.handleLogin(CarbonUILoginUtil.java:331)
at org.wso2.carbon.ui.CarbonSecuredHttpContext.handleSecurity(CarbonSecuredHttpContext.java:223)
at org.eclipse.equinox.http.servlet.internal.ServletRegistration.handleRequest(ServletRegistration.java:86)
at org.eclipse.equinox.http.servlet.internal.ProxyServlet.processAlias(ProxyServlet.java:111)
at org.eclipse.equinox.http.servlet.internal.ProxyServlet.service(ProxyServlet.java:67)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
at org.wso2.carbon.tomcat.ext.servlet.DelegationServlet.service(DelegationServlet.java:58)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.wso2.carbon.tomcat.ext.filter.CharacterSetFilter.doFilter(CharacterSetFilter.java:61)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:225)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
at org.wso2.carbon.tomcat.ext.valves.CompositeValve.invoke(CompositeValve.java:179)
at org.wso2.carbon.tomcat.ext.valves.CarbonStuckThreadDetectionValve.invoke(CarbonStuckThreadDetectionValve.java:156)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)
at org.wso2.carbon.tomcat.ext.valves.CarbonContextCreatorValve.invoke(CarbonContextCreatorValve.java:49)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1001)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:579)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1653)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
[2012-10-15 18:18:03,579] #carbon.super [2] [Application Server] ERROR {org.apache.axis2.engine.AxisEngine} - Failed to determine Activation status.
org.apache.axis2.AxisFault: Failed to determine Activation status.
at org.wso2.carbon.activation.module.ActivationHandler.invoke(ActivationHandler.java:96)
at org.apache.axis2.engine.Phase.invokeHandler(Phase.java:340)
at org.apache.axis2.engine.Phase.invoke(Phase.java:313)
at org.apache.axis2.engine.AxisEngine.invoke(AxisEngine.java:262)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:168)
at org.apache.axis2.transport.local.LocalTransportReceiver.processMessage(LocalTransportReceiver.java:169)
at org.apache.axis2.transport.local.LocalTransportReceiver.processMessage(LocalTransportReceiver.java:82)
at org.wso2.carbon.core.transports.local.CarbonLocalTransportSender.finalizeSendWithToAddress(CarbonLocalTransportSender.java:45)
at org.apache.axis2.transport.local.LocalTransportSender.invoke(LocalTransportSender.java:77)
at org.apache.axis2.engine.AxisEngine.send(AxisEngine.java:443)
at org.apache.axis2.description.OutInAxisOperationClient.send(OutInAxisOperation.java:406)
at org.apache.axis2.description.OutInAxisOperationClient.executeImpl(OutInAxisOperation.java:229)
at org.apache.axis2.client.OperationClient.execute(OperationClient.java:165)
at org.wso2.carbon.core.commons.stub.loggeduserinfo.LoggedUserInfoAdminStub.getUserInfo(LoggedUserInfoAdminStub.java:187)
at org.wso2.carbon.ui.AbstractCarbonUIAuthenticator.setUserAuthorizationInfo(AbstractCarbonUIAuthenticator.java:278)
at org.wso2.carbon.ui.AbstractCarbonUIAuthenticator.processUserAuthorization(AbstractCarbonUIAuthenticator.java:196)
at org.wso2.carbon.ui.DefaultCarbonAuthenticator.authenticate(DefaultCarbonAuthenticator.java:198)
at org.wso2.carbon.ui.DefaultCarbonAuthenticator.authenticate(DefaultCarbonAuthenticator.java:121)
at org.wso2.carbon.ui.CarbonUILoginUtil.handleLogin(CarbonUILoginUtil.java:331)
at org.wso2.carbon.ui.CarbonSecuredHttpContext.handleSecurity(CarbonSecuredHttpContext.java:223)
at org.eclipse.equinox.http.servlet.internal.ServletRegistration.handleRequest(ServletRegistration.java:86)
at org.eclipse.equinox.http.servlet.internal.ProxyServlet.processAlias(ProxyServlet.java:111)
at org.eclipse.equinox.http.servlet.internal.ProxyServlet.service(ProxyServlet.java:67)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
at org.wso2.carbon.tomcat.ext.servlet.DelegationServlet.service(DelegationServlet.java:58)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.wso2.carbon.tomcat.ext.filter.CharacterSetFilter.doFilter(CharacterSetFilter.java:61)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:225)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
at org.wso2.carbon.tomcat.ext.valves.CompositeValve.invoke(CompositeValve.java:179)
at org.wso2.carbon.tomcat.ext.valves.CarbonStuckThreadDetectionValve.invoke(CarbonStuckThreadDetectionValve.java:156)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)
at org.wso2.carbon.tomcat.ext.valves.CarbonContextCreatorValve.invoke(CarbonContextCreatorValve.java:49)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1001)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:579)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1653)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.RuntimeException: Cannot create tenant ConfigurationContext for tenant cdae.uci.cu
at org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils.getTenantConfigurationContext(TenantAxisUtils.java:124)
at org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils.getTenantAxisConfiguration(TenantAxisUtils.java:101)
at org.wso2.carbon.activation.module.ActivationHandler.invoke(ActivationHandler.java:90)
... 46 more
Caused by: java.lang.Exception: Error occurred while running deployment for tenant
at org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils.createTenantConfigurationContext(TenantAxisUtils.java:335)
at org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils.getTenantConfigurationContext(TenantAxisUtils.java:121)
... 48 more
Caused by: java.lang.NullPointerException
at org.wso2.carbon.bam.service.data.publisher.publish.StreamDefinitionCreatorUtil.getStreamDefinition(StreamDefinitionCreatorUtil.java:30)
at org.wso2.carbon.bam.service.data.publisher.internal.ServiceStatisticsAxis2ConfigurationContextObserver.setEventingConfigDataSpecificForTenant(ServiceStatisticsAxis2ConfigurationContextObserver.java:76)
at org.wso2.carbon.bam.service.data.publisher.internal.ServiceStatisticsAxis2ConfigurationContextObserver.createdConfigurationContext(ServiceStatisticsAxis2ConfigurationContextObserver.java:51)
at org.wso2.carbon.core.multitenancy.utils.TenantAxisUtils.createTenantConfigurationContext(TenantAxisUtils.java:326)
... 49 more
We have come across this issue and fixed it; the fix is available in the WSO2 AppServer 5.0.1 release.
Amila.