Unexpected character in wdqs-updater_1 upon restart of a Wikibase

While working with a local Wikibase I noticed that the autocomplete function had stopped working in the local query service. After shutting down and starting the Docker stack again, I get the following output:
...
Stopping phenowiki_wdqs_1 ... done
...
./startup.sh
Starting phenowiki_wdqs_1 ... done
Starting phenowiki_elasticsearch_1 ... done
Starting phenowiki_mysql_1 ... done
Starting phenowiki_wdqs-proxy_1 ... done
Starting phenowiki_wikibase_1 ... done
Starting phenowiki_quickstatements_1 ... done
Starting phenowiki_wdqs-updater_1 ... done
Starting phenowiki_wdqs-frontend_1 ... done
Attaching to phenowiki_wdqs_1, phenowiki_elasticsearch_1, phenowiki_mysql_1, phenowiki_wikibase_1, phenowiki_wdqs-updater_1, phenowiki_quickstatements_1, phenowiki_wdqs-proxy_1, phenowiki_wdqs-frontend_1
wdqs_1 | /wdqs /wdqs
wdqs_1 | Running Blazegraph from /wdqs on :9999/bigdata
wdqs_1 | OpenJDK 64-Bit Server VM warning: Cannot open file /var/log/wdqs/wdqs-blazegraph_jvm_gc.pid7-2019-07-12_07-54-42.log due to No such file or directory
wdqs_1 |
wdqs_1 | 2019-07-12 07:54:42.564:INFO::main: Logging initialized @166ms
elasticsearch_1 | [2019-07-12T07:54:44,800][INFO ][o.e.n.Node ] [] initializing ...
wdqs_1 | 2019-07-12 07:54:42.571:INFO:oejr.Runner:main: Runner
wdqs_1 | 2019-07-12 07:54:42.659:INFO:oejs.Server:main: jetty-9.2.3.v20140905
wikibase_1 | wait-for-it.sh: waiting 120 seconds for mysql.svc:3306
wdqs_1 | 2019-07-12 07:54:47.605:WARN:oeja.AnnotationConfiguration:main: ServletContainerInitializers: detected. Class hierarchy: empty
wikibase_1 | wait-for-it.sh: mysql.svc:3306 is available after 0 seconds
wikibase_1 | wait-for-it.sh: waiting 120 seconds for mysql.svc:3306
wdqs_1 | #logback.classic pattern: %d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} IP:%X{req.remoteHost} UA:%X{req.userAgent} - %msg%n
elasticsearch_1 | [2019-07-12T07:54:44,903][INFO ][o.e.e.NodeEnvironment ] [5sOo-00] using [1] data paths, mounts [[/usr/share/elasticsearch/data (/dev/nvme0n1p2)]], net usable_space [297.7gb], net total_space [878.6gb], spins? [possibly], types [ext4]
elasticsearch_1 | [2019-07-12T07:54:44,903][INFO ][o.e.e.NodeEnvironment ] [5sOo-00] heap size [494.9mb], compressed ordinary object pointers [true]
mysql_1 | 2019-07-12 7:54:45 0 [Note] mysqld (mysqld 10.3.16-MariaDB-1:10.3.16+maria~bionic) starting as process 1 ...
wikibase_1 | wait-for-it.sh: mysql.svc:3306 is available after 0 seconds
wikibase_1 | wait-for-it.sh: waiting 60 seconds for elasticsearch.svc:9200
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Using Linux native AIO
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Uses event mutexes
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Compressed tables use zlib 1.2.11
quickstatements_1 | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 192.168.80.7. Set the 'ServerName' directive globally to suppress this message
wdqs_1 | INFO: com.bigdata.util.config.LogUtil: Configure: file:/tmp/jetty-0.0.0.0-9999-blazegraph-service-0.3.0-SNAPSHOT.war-_bigdata-any-1586101280383644326.dir/webapp/WEB-INF/classes/log4j.properties
wdqs-updater_1 | wait-for-it.sh: waiting 120 seconds for myserver.nl:80
wdqs_1 | 07:54:48.105 [main] WARN com.bigdata.Banner IP: UA: - Defaulting log level to WARN: com.bigdata
quickstatements_1 | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 192.168.80.7. Set the 'ServerName' directive globally to suppress this message
wdqs_1 |
wdqs_1 | BlazeGraph(TM) Graph Engine
wdqs_1 |
wdqs_1 | Flexible
wdqs_1 | Reliable
wdqs_1 | Affordable
wdqs_1 | Web-Scale Computing for the Enterprise
wdqs_1 |
wdqs_1 | Copyright SYSTAP, LLC DBA Blazegraph 2006-2016. All rights reserved.
wdqs_1 |
wdqs_1 | 1a307070fc17
wdqs_1 | Fri Jul 12 07:54:48 GMT 2019
wdqs_1 | Linux/4.15.0-50-generic amd64
wdqs_1 | Intel(R) Xeon(R) Silver 4114 CPU @ 2.20GHz Family 6 Model 85 Stepping 4, GenuineIntel #CPU=40
wdqs_1 | IcedTea 1.8.0_212
wdqs_1 | freeMemory=642250776
wdqs_1 | buildVersion=2.1.5-SNAPSHOT
wdqs_1 | gitCommit=8ff64aab6071b5e591e80676809e43f0ddb8ad49
wdqs_1 |
wdqs_1 | Dependency License
wdqs_1 | ICU http://source.icu-project.org/repos/icu/icu/trunk/license.html
wdqs_1 | bigdata-ganglia http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | blueprints-core https://github.com/tinkerpop/blueprints/blob/master/LICENSE.txt
wdqs_1 | colt http://acs.lbl.gov/software/colt/license.html
wdqs_1 | commons-codec http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | commons-fileupload http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | commons-io http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | commons-logging http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | dsiutils http://www.gnu.org/licenses/lgpl-2.1.html
wdqs_1 | fastutil http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | flot http://www.opensource.org/licenses/mit-license.php
wdqs_1 | high-scale-lib http://creativecommons.org/licenses/publicdomain
wdqs_1 | httpclient http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | httpclient-cache http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | httpcore http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | httpmime http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | jackson-core http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | jetty http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | jquery https://github.com/jquery/jquery/blob/master/MIT-LICENSE.txt
wdqs_1 | jsonld https://raw.githubusercontent.com/jsonld-java/jsonld-java/master/LICENCE
wdqs_1 | log4j http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | lucene http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | nanohttp http://elonen.iki.fi/code/nanohttpd/#license
wdqs_1 | rexster-core https://github.com/tinkerpop/rexster/blob/master/LICENSE.txt
wdqs_1 | river http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | semargl https://github.com/levkhomich/semargl/blob/master/LICENSE
wdqs_1 | servlet-api http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 | sesame http://www.openrdf.org/download.jsp
wdqs_1 | slf4j http://www.slf4j.org/license.html
wdqs_1 | zookeeper http://www.apache.org/licenses/LICENSE-2.0.html
wdqs_1 |
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Number of pools: 1
wdqs-updater_1 | wait-for-it.sh: myserver.nl:80 is available after 0 seconds
quickstatements_1 | [Fri Jul 12 07:54:48.712372 2019] [mpm_prefork:notice] [pid 11] AH00163: Apache/2.4.25 (Debian) PHP/5.6.40 configured -- resuming normal operations
quickstatements_1 | [Fri Jul 12 07:54:48.712425 2019] [core:notice] [pid 11] AH00094: Command line: 'apache2 -D FOREGROUND'
wdqs_1 | 07:54:49.105 [main] WARN com.bigdata.rdf.ServiceProviderHook IP: UA: - Running.
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Using SSE2 crc32 instructions
elasticsearch_1 | [2019-07-12T07:54:44,921][INFO ][o.e.n.Node ] node name [5sOo-00] derived from node ID [5sOo-00hTAC1FV3BKuzvxA]; set [node.name] to override
wdqs-updater_1 | wait-for-it.sh: waiting 120 seconds for wdqs.svc:9999
wdqs_1 | 07:54:49.159 [main] INFO o.w.q.r.b.mwapi.MWApiServiceFactory IP: UA: - Loading MWAPI service configuration from ./mwservices.json
wdqs_1 | 07:54:49.537 [main] INFO o.w.q.r.b.mwapi.MWApiServiceFactory IP: UA: - Registered 4 services.
elasticsearch_1 | [2019-07-12T07:54:44,922][INFO ][o.e.n.Node ] version[5.6.14], pid[1], build[f310fe9/2018-12-05T21:20:16.416Z], OS[Linux/4.15.0-50-generic/amd64], JVM[Oracle Corporation/OpenJDK 64-Bit Server VM/1.8.0_181/25.181-b13]
wdqs-updater_1 | wait-for-it.sh: wdqs.svc:9999 is available after 3 seconds
elasticsearch_1 | [2019-07-12T07:54:44,922][INFO ][o.e.n.Node ] JVM arguments [-Xms2g, -Xmx2g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -Djdk.io.permissionsUseCanonicalPath=true, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Dlog4j.skipJansi=true, -XX:+HeapDumpOnOutOfMemoryError, -Xms512m, -Xmx512m, -Des.path.home=/usr/share/elasticsearch]
wdqs-updater_1 | Updating via http://wdqs.svc:9999/bigdata/namespace/wdq/sparql
elasticsearch_1 | [2019-07-12T07:54:45,984][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [aggs-matrix-stats]
elasticsearch_1 | [2019-07-12T07:54:45,984][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [ingest-common]
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Initializing buffer pool, total size = 256M, instances = 1, chunk size = 128M
wdqs-updater_1 | OpenJDK 64-Bit Server VM warning: Cannot open file /var/log/wdqs/wdqs-updater_jvm_gc.pid7.log due to No such file or directory
wdqs-updater_1 |
elasticsearch_1 | [2019-07-12T07:54:45,984][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [lang-expression]
wdqs-updater_1 | I> No access restrictor found, access to any MBean is allowed
elasticsearch_1 | [2019-07-12T07:54:45,984][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [lang-groovy]
wdqs-updater_1 | Jolokia: Agent started with URL http://127.0.0.1:8778/jolokia/
wdqs_1 | 07:54:49.555 [main] INFO o.w.q.r.b.WikibaseContextListener IP: UA: - Wikibase services initialized.
wdqs_1 | 07:54:49.642 [main] INFO o.w.q.r.b.t.ThrottlingFilter IP: UA: - ThrottlingFilter MBean registered as org.wikidata.query.rdf.blazegraph.throttling.ThrottlingFilter:filterName=throttling-filter.
elasticsearch_1 | [2019-07-12T07:54:45,984][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [lang-mustache]
wdqs_1 | 2019-07-12 07:54:49.681:INFO:oejsh.ContextHandler:main: Started o.e.j.w.WebAppContext@2d363fb3{/bigdata,file:/tmp/jetty-0.0.0.0-9999-blazegraph-service-0.3.0-SNAPSHOT.war-_bigdata-any-1586101280383644326.dir/webapp/,AVAILABLE}{file:/wdqs/blazegraph-service-0.3.0-SNAPSHOT.war}
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Completed initialization of buffer pool
wdqs-updater_1 | #logback.classic pattern: %d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n
elasticsearch_1 | [2019-07-12T07:54:45,985][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [lang-painless]
elasticsearch_1 | [2019-07-12T07:54:45,985][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [parent-join]
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority().
wdqs_1 | 2019-07-12 07:54:49.682:WARN:oejsh.RequestLogHandler:main: !RequestLog
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: 128 out of 128 rollback segments are active.
elasticsearch_1 | [2019-07-12T07:54:45,985][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [percolator]
wdqs_1 | 2019-07-12 07:54:49.693:INFO:oejs.ServerConnector:main: Started ServerConnector@46e8a539{HTTP/1.1}{0.0.0.0:9999}
elasticsearch_1 | [2019-07-12T07:54:45,985][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [reindex]
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Creating shared tablespace for temporary tables
wdqs_1 | 2019-07-12 07:54:49.696:INFO:oejs.Server:main: Started @7339ms
elasticsearch_1 | [2019-07-12T07:54:45,985][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [transport-netty3]
elasticsearch_1 | [2019-07-12T07:54:45,985][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded module [transport-netty4]
elasticsearch_1 | [2019-07-12T07:54:45,985][INFO ][o.e.p.PluginsService ] [5sOo-00] loaded plugin [extra]
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ...
elasticsearch_1 | [2019-07-12T07:54:46,148][WARN ][o.e.d.s.ScriptModule ] Native scripts are deprecated. Use a custom ScriptEngine to write scripts in java.
elasticsearch_1 | [2019-07-12T07:54:47,312][INFO ][o.e.d.DiscoveryModule ] [5sOo-00] using discovery type [zen]
elasticsearch_1 | [2019-07-12T07:54:48,064][INFO ][o.e.n.Node ] initialized
elasticsearch_1 | [2019-07-12T07:54:48,065][INFO ][o.e.n.Node ] [5sOo-00] starting ...
elasticsearch_1 | [2019-07-12T07:54:48,207][INFO ][o.e.t.TransportService ] [5sOo-00] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB.
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: 10.3.16 started; log sequence number 6230267; transaction id 2553
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
mysql_1 | 2019-07-12 7:54:45 0 [Note] Plugin 'FEEDBACK' is disabled.
mysql_1 | 2019-07-12 7:54:45 0 [Note] Server socket created on IP: '::'.
mysql_1 | 2019-07-12 7:54:45 0 [Warning] 'proxies_priv' entry '@% root@c16b13b97f72' ignored in --skip-name-resolve mode.
mysql_1 | 2019-07-12 7:54:45 0 [Note] Reading of all Master_info entries succeeded
mysql_1 | 2019-07-12 7:54:45 0 [Note] Added new Master_info '' to hash table
mysql_1 | 2019-07-12 7:54:45 0 [Note] mysqld: ready for connections.
mysql_1 | Version: '10.3.16-MariaDB-1:10.3.16+maria~bionic' socket: '/var/run/mysqld/mysqld.sock' port: 3306 mariadb.org binary distribution
mysql_1 | 2019-07-12 7:54:45 0 [Note] InnoDB: Buffer pool(s) load completed at 190712 7:54:45
elasticsearch_1 | [2019-07-12T07:54:51,287][INFO ][o.e.c.s.ClusterService ] [5sOo-00] new_master {5sOo-00}{5sOo-00hTAC1FV3BKuzvxA}{tfRuBuVIToaQhhqABeBQEg}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
elasticsearch_1 | [2019-07-12T07:54:51,343][INFO ][o.e.h.n.Netty4HttpServerTransport] [5sOo-00] publish_address {192.168.80.3:9200}, bound_addresses {0.0.0.0:9200}
elasticsearch_1 | [2019-07-12T07:54:51,344][INFO ][o.e.n.Node ] [5sOo-00] started
wdqs-updater_1 | 07:54:51.561 [main] INFO org.wikidata.query.rdf.tool.Update - Checking where we left off
wdqs-updater_1 | 07:54:51.563 [main] INFO o.w.query.rdf.tool.rdf.RdfRepository - Checking for left off time from the updater
wdqs-updater_1 | 07:54:52.111 [main] INFO o.w.query.rdf.tool.rdf.RdfRepository - Checking for left off time from the dump
wdqs-updater_1 | 07:54:52.149 [main] INFO org.wikidata.query.rdf.tool.Update - Defaulting start time to 90 days ago: 2019-04-13T07:54:52Z
elasticsearch_1 | [2019-07-12T07:54:52,178][INFO ][o.e.g.GatewayService ] [5sOo-00] recovered [3] indices into cluster_state
wikibase_1 | wait-for-it.sh: elasticsearch.svc:9200 is available after 5 seconds
wdqs-updater_1 | 07:54:52.324 [main] ERROR org.wikidata.query.rdf.tool.Update - Error during updater run.
wdqs-updater_1 | java.lang.RuntimeException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
wdqs-updater_1 | at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2]
wdqs-updater_1 | at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.fetchRecentChanges(WikibaseRepository.java:270)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.change.RecentChangesPoller.fetchRecentChanges(RecentChangesPoller.java:301)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.change.RecentChangesPoller.batch(RecentChangesPoller.java:314)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.change.RecentChangesPoller.firstBatch(RecentChangesPoller.java:139)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.change.RecentChangesPoller.firstBatch(RecentChangesPoller.java:33)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.Updater.run(Updater.java:125)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.Update.main(Update.java:80)
wdqs-updater_1 | Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
wdqs-updater_1 | at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2]
wdqs-updater_1 | at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1798)
wdqs-updater_1 | at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:663)
wdqs-updater_1 | at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:561)
wdqs-updater_1 | at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2625)
wdqs-updater_1 | at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:826)
wdqs-updater_1 | at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:723)
wdqs-updater_1 | at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:4129)
wdqs-updater_1 | at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3988)
wdqs-updater_1 | at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3058)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.getJson(WikibaseRepository.java:422)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.fetchRecentChanges(WikibaseRepository.java:264)
wdqs-updater_1 | ... 6 common frames omitted
wdqs-updater_1 | Exception in thread "main" java.lang.RuntimeException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
wdqs-updater_1 | at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2]
wdqs-updater_1 | at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.fetchRecentChanges(WikibaseRepository.java:270)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.change.RecentChangesPoller.fetchRecentChanges(RecentChangesPoller.java:301)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.change.RecentChangesPoller.batch(RecentChangesPoller.java:314)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.change.RecentChangesPoller.firstBatch(RecentChangesPoller.java:139)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.change.RecentChangesPoller.firstBatch(RecentChangesPoller.java:33)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.Updater.run(Updater.java:125)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.Update.main(Update.java:80)
wdqs-updater_1 | Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
wdqs-updater_1 | at [Source: (org.apache.http.conn.EofSensorInputStream); line: 1, column: 2]
wdqs-updater_1 | at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1798)
wdqs-updater_1 | at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:663)
wdqs-updater_1 | at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:561)
wdqs-updater_1 | at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2625)
wdqs-updater_1 | at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:826)
wdqs-updater_1 | at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:723)
wdqs-updater_1 | at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:4129)
wdqs-updater_1 | at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3988)
wdqs-updater_1 | at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3058)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.getJson(WikibaseRepository.java:422)
wdqs-updater_1 | at org.wikidata.query.rdf.tool.wikibase.WikibaseRepository.fetchRecentChanges(WikibaseRepository.java:264)
wdqs-updater_1 | ... 6 more
wikibase_1 | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 192.168.80.6. Set the 'ServerName' directive globally to suppress this message
wikibase_1 | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 192.168.80.6. Set the 'ServerName' directive globally to suppress this message
wikibase_1 | [Fri Jul 12 07:54:52.423218 2019] [mpm_prefork:notice] [pid 59] AH00163: Apache/2.4.25 (Debian) PHP/7.2.19 configured -- resuming normal operations
wikibase_1 | [Fri Jul 12 07:54:52.423284 2019] [core:notice] [pid 59] AH00094: Command line: 'apache2 -D FOREGROUND'
phenowiki_wdqs-updater_1 exited with code 1
elasticsearch_1 | [2019-07-12T07:54:53,079][INFO ][o.e.c.r.a.AllocationService] [5sOo-00] Cluster health status changed from [RED] to [GREEN] (reason: [shards started [[mw_cirrus_metastore_first][0]] ...]).
Update finished
wikibase_1 | 10.2.0.24 - - [12/Jul/2019:07:57:36 +0000] "GET /wiki/Special:RecentChanges?hidebots=1&hideWikibase=1&limit=50&days=7&enhanced=1&urlversion=2&peek=1&from=20190712075407&isAnon=true&action=render HTTP/1.1" 204 472 "http://myserver.nl:8181/wiki/Special:RecentChanges?hidebots=1&hideWikibase=1&limit=50&days=7&enhanced=1&urlversion=2" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1.1 Safari/605.1.15"
wikibase_1 | 10.2.0.24 - - [12/Jul/2019:07:57:36 +0000] "GET /wiki/Special:RecentChanges?hidebots=1&hideWikibase=1&limit=50&days=7&enhanced=1&urlversion=2 HTTP/1.1" 200 8081 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1.1 Safari/605.1.15"
wikibase_1 | 10.2.0.24 - - [12/Jul/2019:07:57:37 +0000] "POST /w/api.php HTTP/1.1" 200 923 "http://myserver.nl:8181/wiki/Special:RecentChanges?hidebots=1&hideWikibase=1&limit=50&days=7&enhanced=1&urlversion=2" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1.1 Safari/605.1.15"
wikibase_1 | 10.2.0.24 - - [12/Jul/2019:07:57:37 +0000] "POST /w/api.php HTTP/1.1" 200 924 "http://myserver.nl:8181/wiki/Special:RecentChanges?hidebots=1&hideWikibase=1&limit=50&days=7&enhanced=1&urlversion=2" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1.1 Safari/605.1.15"
wikibase_1 | 10.2.0.24 - - [12/Jul/2019:07:57:37 +0000] "GET /w/opensearch_desc.php HTTP/1.1" 200 793 "-" "com.apple.Safari.SearchHelper/14607.2.6.1.1 CFNetwork/978.0.7 Darwin/18.6.0 (x86_64)"
wikibase_1 | 10.2.0.24 - - [12/Jul/2019:07:58:54 +0000] "GET /wiki/Special:RecentChanges?hidebots=1&hideWikibase=1&limit=50&days=7&enhanced=1&urlversion=2&peek=1&from=20190712075737&isAnon=true&action=render HTTP/1.1" 204 472 "http://myserver.nl:8181/wiki/Special:RecentChanges?hidebots=1&hideWikibase=1&limit=50&days=7&enhanced=1&urlversion=2" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1.1 Safari/605.1.15"
wdqs-proxy_1 | 192.168.80.9 - - [12/Jul/2019:07:59:29 +0000] "GET /bigdata/namespace/wdq/sparql?query=prefix%20schema:%20%3Chttp://schema.org/%3E%20SELECT%20*%20WHERE%20%7B%3Chttp://www.wikidata.org%3E%20schema:dateModified%20?y%7D&nocache=26048639 HTTP/1.0" 200 83 "http://myserver.nl:8282/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1.1 Safari/605.1.15" "-"
wdqs-frontend_1 | 10.2.0.24 - - [12/Jul/2019:07:59:29 +0000] "GET /proxy/wdqs/bigdata/namespace/wdq/sparql?query=prefix%20schema:%20%3Chttp://schema.org/%3E%20SELECT%20*%20WHERE%20%7B%3Chttp://www.wikidata.org%3E%20schema:dateModified%20?y%7D&nocache=26048639 HTTP/1.1" 200 94 "http://myserver.nl:8282/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_5) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1.1 Safari/605.1.15" "-"
Is anyone familiar with how this could have happened? I can reproduce the problem reliably: starting from a clean Docker-based Wikibase, I let it start up, shut it down, and then start it up again.
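For what it's worth, the stack trace itself narrows the cause down: the updater called the wiki's API expecting JSON, but the very first character of the response body was '<', i.e. it received an HTML page (typically an error or placeholder page) instead of an API response. A minimal sketch of what Jackson is complaining about, in plain Python with hypothetical response bodies:

```python
import json

# What the updater expects back from api.php (action=query&list=recentchanges&format=json)
good = '{"batchcomplete":"","query":{"recentchanges":[]}}'
json.loads(good)  # parses without error

# What it apparently received: an HTML document, so the first byte is '<'
bad = "<html><body>Service not ready</body></html>"
try:
    json.loads(bad)
except json.JSONDecodeError as e:
    # Fails immediately at the start of the stream, mirroring Jackson's
    # "Unexpected character ('<' (code 60)) ... line: 1, column: 2"
    print(e.lineno, e.colno)
```

So the question is really why `myserver.nl:80` answered the updater's API request with HTML. One plausible scenario (an assumption, not confirmed by the logs): wait-for-it.sh only checks that the TCP port is open, so the updater can start polling before MediaWiki is actually able to serve api.php, receive an HTML error page, crash, and exit with code 1 as shown above. Fetching the same endpoint by hand, e.g. with curl against `http://myserver.nl/w/api.php?action=query&list=recentchanges&format=json`, would show whether JSON or HTML comes back.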

dashboard_local_flower | WARNING 2022-07-07 09:36:21,715 inspector 7 140098792433408 Inspect method registered failed
dashboard_local_flower | WARNING 2022-07-07 09:36:21,715 inspector 7 140098276423424 Inspect method conf failed
dashboard_local_flower | WARNING 2022-07-07 09:36:21,715 inspector 7 140098809218816 Inspect method stats failed
dashboard_local_flower | WARNING 2022-07-07 09:36:21,716 inspector 7 140098775648000 Inspect method active failed
dashboard_local_celeryworker | [2022-07-07 09:36:21,802: INFO/SpawnProcess-1] mingle: all alone
dashboard_local_celeryworker | [2022-07-07 09:36:21,811: WARNING/SpawnProcess-1] /usr/local/lib/python3.9/site-packages/celery/fixups/django.py:203: UserWarning: Using settings.DEBUG leads to a memory
dashboard_local_celeryworker | leak, never use this setting in production environments!
dashboard_local_celeryworker | warnings.warn('''Using settings.DEBUG leads to a memory
dashboard_local_celeryworker |
dashboard_local_celeryworker | [2022-07-07 09:36:21,811: INFO/SpawnProcess-1] celery@e1ac9f770cbd ready.
dashboard_local_django | Watching for file changes with StatReloader
dashboard_local_django | INFO 2022-07-07 09:36:22,862 autoreload 9 140631340287808 Watching for file changes with StatReloader
dashboard_local_django | Performing system checks...
dashboard_local_django |
dashboard_local_django | System check identified no issues (0 silenced).
dashboard_local_django | July 07, 2022 - 09:36:23
dashboard_local_django | Django version 3.2.14, using settings 'config.settings.local'
dashboard_local_django | Starting development server at http://0.0.0.0:8000/
dashboard_local_django | Quit the server with CONTROL-C.
dashboard_local_celeryworker | [2022-07-07 09:36:25,661: INFO/SpawnProcess-1] Events of group {task} enabled by remote.
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
69591187e44d dashboard_local_flower "/entrypoint /start-…" 11 minutes ago Up 2 minutes 0.0.0.0:5555->5555/tcp, :::5555->5555/tcp dashboard_local_flower
15914b6b91e0 dashboard_local_celerybeat "/entrypoint /start-…" 11 minutes ago Up 2 minutes dashboard_local_celerybeat
e1ac9f770cbd dashboard_local_celeryworker "/entrypoint /start-…" 11 minutes ago Up 2 minutes dashboard_local_celeryworker
6bbfc900c346 dashboard_local_django "/entrypoint /start" 11 minutes ago Up 2 minutes 0.0.0.0:8000->8000/tcp, :::8000->8000/tcp dashboard_local_django
b8bec3422bae redis:6 "docker-entrypoint.s…" 11 minutes ago Up 2 minutes 6379/tcp dashboard_local_redis
2b7c3d9eabe3 dashboard_production_postgres "docker-entrypoint.s…" 11 minutes ago Up 2 minutes 5432/tcp dashboard_local_postgres
0249aaaa040c mailhog/mailhog:v1.0.0 "MailHog" 11 minutes ago Up 2 minutes 1025/tcp, 0.0.0.0:8025->8025/tcp, :::8025->8025/tcp dashboard_local_mailhog
d5dd94cbb070 dashboard_local_docs "/start-docs" 11 minutes ago Up 2 minutes 0.0.0.0:9000->9000/tcp, :::9000->9000/tcp dashboard_local_docs
The ports are listening:
telnet 127.0.0.1 8000
Trying 127.0.0.1...
Connected to 127.0.0.1.
Escape character is '^]'.
^]
% sudo netstat -tulpn | grep LISTEN
tcp 0 0 0.0.0.0:139 0.0.0.0:* LISTEN 29532/smbd
tcp 0 0 127.0.0.1:43979 0.0.0.0:* LISTEN 31867/BlastServer
tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN 963/rpcbind
tcp 0 0 0.0.0.0:46641 0.0.0.0:* LISTEN -
tcp 0 0 0.0.0.0:51857 0.0.0.0:* LISTEN 4149/rpc.statd
tcp 0 0 0.0.0.0:5555 0.0.0.0:* LISTEN 14326/docker-proxy
tcp 0 0 0.0.0.0:6100 0.0.0.0:* LISTEN 31908/Xorg
tcp 0 0 127.0.0.53:53 0.0.0.0:* LISTEN 973/systemd-resolve
tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN 29295/sshd
tcp 0 0 0.0.0.0:8025 0.0.0.0:* LISTEN 13769/docker-proxy
tcp 0 0 127.0.0.1:25 0.0.0.0:* LISTEN 30117/master
tcp 0 0 127.0.0.1:6010 0.0.0.0:* LISTEN 882/sshd: noakes@no
tcp 0 0 0.0.0.0:445 0.0.0.0:* LISTEN 29532/smbd
tcp 0 0 0.0.0.0:8000 0.0.0.0:* LISTEN 14272/docker-proxy
tcp 0 0 0.0.0.0:9000 0.0.0.0:* LISTEN 13850/docker-proxy
tcp6 0 0 :::139 :::* LISTEN 29532/smbd
tcp6 0 0 :::40717 :::* LISTEN -
tcp6 0 0 :::41423 :::* LISTEN 4149/rpc.statd
tcp6 0 0 :::111 :::* LISTEN 963/rpcbind
tcp6 0 0 127.0.0.1:41265 :::* LISTEN 30056/java
tcp6 0 0 :::5555 :::* LISTEN 14333/docker-proxy
tcp6 0 0 :::6100 :::* LISTEN 31908/Xorg
tcp6 0 0 :::22 :::* LISTEN 29295/sshd
tcp6 0 0 :::13782 :::* LISTEN 2201/xinetd
tcp6 0 0 :::13783 :::* LISTEN 2201/xinetd
tcp6 0 0 :::8025 :::* LISTEN 13779/docker-proxy
tcp6 0 0 ::1:25 :::* LISTEN 30117/master
tcp6 0 0 ::1:6010 :::* LISTEN 882/sshd: noakes@no
tcp6 0 0 :::13722 :::* LISTEN 2201/xinetd
tcp6 0 0 :::6556 :::* LISTEN 2201/xinetd
tcp6 0 0 :::445 :::* LISTEN 29532/smbd
tcp6 0 0 :::8000 :::* LISTEN 14278/docker-proxy
tcp6 0 0 :::1057 :::* LISTEN 2201/xinetd
tcp6 0 0 :::7778 :::* LISTEN 2201/xinetd
tcp6 0 0 :::7779 :::* LISTEN 2201/xinetd
tcp6 0 0 :::9000 :::* LISTEN 13860/docker-proxy
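Since docker-proxy owns the listening sockets above, a successful TCP connect only proves the port mapping exists; to confirm the app inside the container actually answers, one quick check is an HTTP request from the host (a sketch, assuming curl is installed):

```shell
# Ask the Django dev server for its response headers.
# A connect that hangs or resets here points at the container,
# not at the docker-proxy port forwarding.
curl -sI http://127.0.0.1:8000/ | head -n 1
```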
local.yml
version: '3'
volumes:
dashboard_local_postgres_data: {}
dashboard_local_postgres_data_backups: {}
services:
django: &django
build:
context: .
dockerfile: ./compose/local/django/Dockerfile
#user: "root:root"
image: dashboard_local_django
container_name: dashboard_local_django
platform: linux/x86_64
depends_on:
- postgres
- redis
- mailhog
volumes:
- .:/app:z
env_file:
- ./.envs/.local/.django
- ./.envs/.local/.postgres
ports:
- "8000:8000"
command: /start
postgres:
build:
context: .
dockerfile: ./compose/production/postgres/Dockerfile
image: dashboard_production_postgres
container_name: dashboard_local_postgres
volumes:
- dashboard_local_postgres_data:/var/lib/postgresql/data:Z
- dashboard_local_postgres_data_backups:/backups:z
env_file:
- ./.envs/.local/.postgres
docs:
image: dashboard_local_docs
container_name: dashboard_local_docs
platform: linux/x86_64
build:
context: .
dockerfile: ./compose/local/docs/Dockerfile
env_file:
- ./.envs/.local/.django
volumes:
- ./docs:/docs:z
- ./config:/app/config:z
- ./dashboard:/app/dashboard:z
ports:
- "9000:9000"
command: /start-docs
mailhog:
image: mailhog/mailhog:v1.0.0
container_name: dashboard_local_mailhog
ports:
- "8025:8025"
redis:
image: redis:6
container_name: dashboard_local_redis
celeryworker:
<<: *django
image: dashboard_local_celeryworker
container_name: dashboard_local_celeryworker
depends_on:
- redis
- postgres
- mailhog
ports: []
command: /start-celeryworker
celerybeat:
<<: *django
image: dashboard_local_celerybeat
container_name: dashboard_local_celerybeat
depends_on:
- redis
- postgres
- mailhog
ports: []
command: /start-celerybeat
flower:
<<: *django
image: dashboard_local_flower
container_name: dashboard_local_flower
ports:
- "5555:5555"
command: /start-flower

Spring Cloud application fails to connect to AWS Kinesis 'dynamoDBClient' must not be null

Unable to connect to a Kinesis stream using the Spring Cloud Stream Kinesis binder, due to the error java.lang.IllegalArgumentException: 'dynamoDBClient' must not be null
// Stream binding
@EnableBinding(DataStream.class)
public class StreamsConfig {}
// Stream config
public interface DataStream {
    String AUDIT_IN = "auditIn";

    @Input(AUDIT_IN)
    SubscribableChannel auditIn();
}
// Stream listener
@StreamListener(DataStream.AUDIT_IN)
public void processAudit(final AuditMessage auditMessage) {
    logger.info("Received audit message : {}", auditMessage);
    MDCUtil.setContext(auditMessage.getTenantId(), null, null);
    persistAuditMessage(auditMessage);
    MDCUtil.clearContext();
}
Application.yaml
cloud:
aws:
stack:
auto: false
region:
static: us-west-2
credentials:
useDefaultAwsCredentialsChain: true
management:
context-path: /actuator
security:
enabled: false
roles: ADMIN
endpoint:
metrics:
enabled: true
loggers:
enabled: true
health:
show-details: always
endpoints:
web:
exposure:
include: "health,metrics,info,loggers"
metrics:
export:
statsd:
enabled: true
flavor: telegraf
host: telegraf-s.cluster-services.svc.cluster.local
port: 8125
tags:
service: audit
namespace: {{ .Release.Namespace }}
distribution:
percentiles:
http:
server:
requests: 0.5,0.9,0.99
percentiles-histogram:
http:
server:
requests: true
logging:
level:
org.hibernate.SQL: DEBUG
org.hibernate.type.descriptor.sql.BasicBinder: TRACE
spring:
application:
name: audit
zipkin:
enabled: false
cloud:
stream:
bindings:
auditIn:
content-type: application/json
destination: ***
consumer:
describeStreamRetries: 10
group: ***
kinesis:
binder:
autoCreateStream: true
autoAddShards: false
kplKclEnabled: true
vault:
token: 00000000-0000-0000-0000-000000000000
ssl:
trust-store: ***
trust-store-password: ***
generic:
enabled: false
host: localhost
port: 8200
scheme: https
connection:
timeout: 5000
read-timeout: 15000
authentication: KUBERNETES
kubernetes:
role: owner
kubernetes-path: kubernetes
service-account-token-file: token
kv:
enabled: false
backend: kv-v2
application-name: configuration/mysql
jpa:
open-in-view: false
audit:
filter:
block:
methods: GET
Debug log
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Detecting the operating system and CPU architecture
[INFO] ------------------------------------------------------------------------
[INFO] os.detected.name: osx
[INFO] os.detected.arch: x86_64
[INFO] os.detected.classifier: osx-x86_64
[INFO]
[INFO] ---------------------------< com.nile:audit >---------------------------
[INFO] Building audit 0.0.1-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] >>> spring-boot-maven-plugin:2.2.2.RELEASE:run (default-cli) > test-compile @ audit >>>
[INFO]
[INFO] --- maven-resources-plugin:3.1.0:resources (default-resources) @ audit ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:compile (default-compile) @ audit ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:3.1.0:testResources (default-testResources) @ audit ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 5 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:testCompile (default-testCompile) @ audit ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] <<< spring-boot-maven-plugin:2.2.2.RELEASE:run (default-cli) < test-compile @ audit <<<
[INFO]
[INFO]
[INFO] --- spring-boot-maven-plugin:2.2.2.RELEASE:run (default-cli) @ audit ---
[INFO] Attaching agents: []
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.2.2.RELEASE)
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.springframework.cglib.core.ReflectUtils (file:/Users/arvindkgs/.m2/repository/org/springframework/spring-core/5.2.2.RELEASE/spring-core-5.2.2.RELEASE.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of org.springframework.cglib.core.ReflectUtils
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2021-02-24 20:40:38,529 ERROR [main] com.nile.cloudutilities.audit.aspect.MethodMap [] [] - STARTING AUDIT BUILD AT 2021-02-24T15:10:38.524052Z
2021-02-24 20:40:39,984 ERROR [main] com.nile.cloudutilities.audit.aspect.MethodMap [] [] - FINISHED AUDIT BUILD AT 2021-02-24T15:10:39.984543Z: 3/14/0 (0 mappings)
Running custom WebSecurity
2021-02-24 20:40:52,203 DEBUG [main] org.hibernate.SQL [] [] - select loggerconf0_.id as id1_2_, loggerconf0_.create_time as create_t2_2_, loggerconf0_.last_update as last_upd3_2_, loggerconf0_.configuration as configur4_2_, loggerconf0_.service as service5_2_ from logger_configs loggerconf0_ where loggerconf0_.service=?
2021-02-24 20:40:52,219 TRACE [main] org.hibernate.type.descriptor.sql.BasicBinder [] [] - binding parameter [1] as [VARCHAR] - [audit]
2021-02-24 20:40:54,068 ERROR [main] org.springframework.cloud.stream.binding.BindingService [] [] - Failed to create consumer binding; retrying in 30 seconds
org.springframework.cloud.stream.binder.BinderException: Exception thrown while starting consumer:
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindConsumer(AbstractMessageChannelBinder.java:471) ~[spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindConsumer(AbstractMessageChannelBinder.java:90) ~[spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.cloud.stream.binder.AbstractBinder.bindConsumer(AbstractBinder.java:143) ~[spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.cloud.stream.binding.BindingService.doBindConsumer(BindingService.java:169) [spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.cloud.stream.binding.BindingService.bindConsumer(BindingService.java:126) [spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.cloud.stream.binding.AbstractBindableProxyFactory.createAndBindInputs(AbstractBindableProxyFactory.java:112) [spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.cloud.stream.binding.InputBindingLifecycle.doStartWithBindable(InputBindingLifecycle.java:58) [spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.cloud.stream.binding.AbstractBindingLifecycle$$Lambda$1628/000000000000000000.accept(Unknown Source) [spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at java.util.LinkedHashMap$LinkedValues.forEach(LinkedHashMap.java:608) [?:?]
at org.springframework.cloud.stream.binding.AbstractBindingLifecycle.start(AbstractBindingLifecycle.java:57) [spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.cloud.stream.binding.InputBindingLifecycle.start(InputBindingLifecycle.java:34) [spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:182) [spring-context-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:53) [spring-context-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:360) [spring-context-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:158) [spring-context-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:122) [spring-context-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:894) [spring-context-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.finishRefresh(ServletWebServerApplicationContext.java:162) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:553) [spring-context-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:141) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:747) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:315) [spring-boot-2.2.2.RELEASE.jar:2.2.2.RELEASE]
at com.nile.audit.AuditService.main(AuditService.java:60) [classes/:?]
Caused by: java.lang.IllegalArgumentException: 'dynamoDBClient' must not be null.
at org.springframework.util.Assert.notNull(Assert.java:198) ~[spring-core-5.2.2.RELEASE.jar:5.2.2.RELEASE]
at org.springframework.integration.aws.inbound.kinesis.KclMessageDrivenChannelAdapter.<init>(KclMessageDrivenChannelAdapter.java:143) ~[spring-integration-aws-2.3.0.RELEASE.jar:?]
at org.springframework.cloud.stream.binder.kinesis.KinesisMessageChannelBinder.createKclConsumerEndpoint(KinesisMessageChannelBinder.java:314) ~[spring-cloud-stream-binder-kinesis-2.0.0.RELEASE.jar:2.0.0.RELEASE]
at org.springframework.cloud.stream.binder.kinesis.KinesisMessageChannelBinder.createConsumerEndpoint(KinesisMessageChannelBinder.java:291) ~[spring-cloud-stream-binder-kinesis-2.0.0.RELEASE.jar:2.0.0.RELEASE]
at org.springframework.cloud.stream.binder.kinesis.KinesisMessageChannelBinder.createConsumerEndpoint(KinesisMessageChannelBinder.java:89) ~[spring-cloud-stream-binder-kinesis-2.0.0.RELEASE.jar:2.0.0.RELEASE]
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindConsumer(AbstractMessageChannelBinder.java:417) ~[spring-cloud-stream-3.0.1.RELEASE.jar:3.0.1.RELEASE]
... 23 more
2021-02-24 20:40:54,167 DEBUG [main] org.hibernate.SQL [] [] - select loggerconf0_.id as id1_2_, loggerconf0_.create_time as create_t2_2_, loggerconf0_.last_update as last_upd3_2_, loggerconf0_.configuration as configur4_2_, loggerconf0_.service as service5_2_ from logger_configs loggerconf0_ where loggerconf0_.service=?
2021-02-24 20:40:54,169 TRACE [main] org.hibernate.type.descriptor.sql.BasicBinder [] [] - binding parameter [1] as [VARCHAR] - [audit]
^C[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 49.955 s
[INFO] Finished at: 2021-02-24T20:40:58+05:30
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "nexus" could not be activated because it does not exist.
pom.xml - https://drive.google.com/file/d/1kBEj6HqTQS5BgA7StaFZBEKfbLrpgSMR/view?usp=sharing
Using the following jars:
org.springframework.cloud:spring-cloud-stream-binder-kinesis:jar:2.0.0.RELEASE:compile
[INFO] | +- com.amazonaws:amazon-kinesis-client:jar:1.13.0:compile
[INFO] | | +- com.amazonaws:aws-java-sdk-cloudwatch:jar:1.11.655:compile
[INFO] | | \- com.google.protobuf:protobuf-java:jar:2.6.1:compile
[INFO] | +- com.amazonaws:amazon-kinesis-producer:jar:0.14.0:compile
[INFO] | | \- commons-lang:commons-lang:jar:2.6:compile
[INFO] | +- org.springframework.integration:spring-integration-aws:jar:2.3.0.RELEASE:compile
[INFO] | | +- org.springframework.cloud:spring-cloud-aws-core:jar:2.2.0.RELEASE:compile
[INFO] | | | +- com.amazonaws:aws-java-sdk-ec2:jar:1.11.415:compile
[INFO] | | | \- com.amazonaws:aws-java-sdk-cloudformation:jar:1.11.415:compile
[INFO] | | +- org.springframework.cloud:spring-cloud-aws-messaging:jar:2.2.0.RELEASE:runtime
[INFO] | | | +- com.amazonaws:aws-java-sdk-sns:jar:1.11.415:runtime
[INFO] | | | \- com.amazonaws:aws-java-sdk-sqs:jar:1.11.415:runtime
[INFO] | | +- org.springframework.integration:spring-integration-file:jar:5.2.2.RELEASE:runtime
[INFO] | | \- org.springframework.integration:spring-integration-http:jar:5.2.2.RELEASE:runtime
[INFO] | +- org.springframework.cloud:spring-cloud-starter-aws:jar:2.2.0.RELEASE:compile
[INFO] | | +- org.springframework.cloud:spring-cloud-aws-context:jar:2.2.0.RELEASE:compile
[INFO] | | +- org.springframework.cloud:spring-cloud-aws-autoconfigure:jar:2.2.0.RELEASE:compile
[INFO] | | \- javax.activation:javax.activation-api:jar:1.2.0:compile
[INFO] | +- com.amazonaws:aws-java-sdk-dynamodb:jar:1.11.415:compile
[INFO] | | \- com.amazonaws:aws-java-sdk-s3:jar:1.11.415:compile
[INFO] | | \- com.amazonaws:aws-java-sdk-kms:jar:1.11.415:compile
[INFO] | +- com.amazonaws:dynamodb-lock-client:jar:1.1.0:compile
[INFO] | | \- org.apache.httpcomponents:httpcore:jar:4.4.12:compile
[INFO] | +- com.amazonaws:aws-java-sdk-kinesis:jar:1.11.415:compile
[INFO] | \- com.amazonaws:dynamodb-streams-kinesis-adapter:jar:1.5.0:compile
To work with Spring Boot apps and AWS APIs, try using the official AWS SDK for Java V2 (as opposed to non-Amazon wrappers). Amazon recommends moving from V1 to V2.
There is an example that uses the DynamoDB V2 API within a Spring Boot application: a user submits data from a simple web form, the data is stored in an Amazon DynamoDB table, and SNS then fires off a text message. It also shows how to deploy the Spring Boot app to the cloud:
Creating your first AWS Java web application
Once you have a basic Spring Boot app working with the AWS Java V2 APIs, you can use other services.
My bad. I was creating a DynamoDBClient bean that returned null. This bean is autowired in org.springframework.cloud.stream.binder.kinesis.KinesisMessageChannelBinder.
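For comparison, a minimal sketch of a DynamoDB client bean that does not return null (class name, bean name, and region here are hypothetical; this assumes the binder resolves an `AmazonDynamoDBAsync` bean from the context, as the stack trace above suggests):

```java
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBAsync;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBAsyncClientBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AwsClientConfig {

    @Bean
    public AmazonDynamoDBAsync dynamoDbClient() {
        // Never return null here: the Kinesis binder hands this bean to
        // KclMessageDrivenChannelAdapter, whose constructor asserts
        // 'dynamoDBClient' must not be null.
        return AmazonDynamoDBAsyncClientBuilder.standard()
                .withRegion(Regions.US_WEST_2)
                .withCredentials(DefaultAWSCredentialsProviderChain.getInstance())
                .build();
    }
}
```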

111: Connection refused on elastic beanstalk

I built a jar file with Java Spring Boot on my computer and uploaded it to AWS Elastic Beanstalk.
It runs fine on my machine, but the Elastic Beanstalk URL shows errors:
2020/08/26 06:40:20 [error] 3143#0: *1 connect() failed (111: Connection refused) while connecting to upstream, client: 125.121.75.33, server: , request: "GET /favicon.ico HTTP/1.1", upstream: "http://127.0.0.1:5000/favicon.ico", host: "xxxxxx-env.eba-4gp64tmr.us-east-1.elasticbeanstalk.com", referrer: "http://xxxxx-env.eba-4gp64tmr.us-east-1.elasticbeanstalk.com/"
However, the web log looks fine:
-------------------------------------
/var/log/web-1.log
-------------------------------------
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.3.4.BUILD-SNAPSHOT)
2020-08-26 06:39:42.667 INFO 3167 --- [ main] c.b.restservice.RestServiceApplication : Starting RestServiceApplication v0.0.1-SNAPSHOT on ip-171-21-39-87 with PID 3167 (/var/app/current/application.jar started by webapp in /var/app/current)
2020-08-26 06:39:42.678 INFO 3167 --- [ main] c.b.restservice.RestServiceApplication : No active profile set, falling back to default profiles: default
2020-08-26 06:39:46.929 INFO 3167 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http)
2020-08-26 06:39:46.966 INFO 3167 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat]
2020-08-26 06:39:46.967 INFO 3167 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.37]
2020-08-26 06:39:47.199 INFO 3167 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
2020-08-26 06:39:47.204 INFO 3167 --- [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 4295 ms
2020-08-26 06:39:49.410 INFO 3167 --- [ main] pertySourcedRequestMappingHandlerMapping : Mapped URL path [/v2/api-docs] onto method [springfox.documentation.swagger2.web.Swagger2Controller#getDocumentation(String, HttpServletRequest)]
2020-08-26 06:39:49.636 INFO 3167 --- [ main] o.s.s.concurrent.ThreadPoolTaskExecutor : Initializing ExecutorService 'applicationTaskExecutor'
2020-08-26 06:39:50.201 INFO 3167 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path ''
2020-08-26 06:39:50.203 INFO 3167 --- [ main] d.s.w.p.DocumentationPluginsBootstrapper : Context refreshed
2020-08-26 06:39:50.252 INFO 3167 --- [ main] d.s.w.p.DocumentationPluginsBootstrapper : Found 1 custom documentation plugin(s)
2020-08-26 06:39:50.349 INFO 3167 --- [ main] s.d.s.w.s.ApiListingReferenceScanner : Scanning for api listing references
2020-08-26 06:39:50.401 INFO 3167 --- [ main] c.b.restservice.RestServiceApplication : Started RestServiceApplication in 9.6 seconds (JVM running for 12.522)
I set the SERVER_PORT to 8080 in Environment properties.
How should I handle this error? If you need any additional information just let me know.
Thank you for @Marcin's reminder!
I set SERVER_PORT to 5000 and the problem is solved:
Elastic Beanstalk's reverse proxy forwards to port 5000 by default.
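Either side of the mismatch can be changed; one option is to make the app itself listen on the port the proxy expects (a sketch, assuming the stock nginx proxy that forwards to 5000):

```properties
# application.properties — make Spring Boot listen where the
# Elastic Beanstalk proxy sends traffic (5000 by default).
# Alternatively, set SERVER_PORT=5000 in the environment properties.
server.port=5000
```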

EC2 Bootstrap script not executed on Ubuntu

I am trying to set a bootstrap script for an EC2 Ubuntu machine with the commands below:
#cloud-boothook
#!/bin/bash
sudo chmod 750 -R /home/
The text above is set as user data via the AWS console.
Once the machine is up and running, the permissions have still not been changed by the user data script.
I tried an alternative form:
Content-Type: text/cloud-boothook
Content-Type: text/x-shellscript
sudo chmod 750 -R /home/
The directory permissions are still not changed.
My objective is to set the user home directory to mode 750 on machine startup.
I would appreciate any solution to this.
Here is the error message appearing for me:
[[0;32m OK [0m] Started Initial cloud-init job (pre-networking).
[[0;32m OK [0m] Reached target Network (Pre).
[[0;32m OK [0m] Started ifup for eth0.
Starting Raise network interfaces...
[[0;32m OK [0m] Started Raise network interfaces.
[[0;32m OK [0m] Reached target Network.
Starting Initial cloud-init job (metadata service crawler)...
[ 7.535726] cloud-init[1003]: Cloud-init v. 0.7.9 running 'init' at Sun, 29 Oct 2017 19:44:35 +0000. Up 7.23 seconds.
[ 7.548260] cloud-init[1003]: ci-info: ++++++++++++++++++++++++++++++++++++++Net device info++++++++++++++++++++++++++++++++++++++
[ 7.565098] cloud-init[1003]: ci-info: +--------+------+-----------------------------+---------------+-------+-------------------+
[ 7.580238] cloud-init[1003]: ci-info: | Device | Up | Address | Mask | Scope | Hw-Address |
[ 7.592760] cloud-init[1003]: ci-info: +--------+------+-----------------------------+---------------+-------+-------------------+
[ 7.602150] cloud-init[1003]: ci-info: | lo | True | 127.0.0.1 | 255.0.0.0 | . | . |
[ 7.613465] cloud-init[1003]: ci-info: | lo | True | ::1/128 | . | host | . |
[ 7.625404] cloud-init[1003]: ci-info: | eth0 | True | 172.31.19.221 | 255.255.240.0 | . | 06:ad:ba:ed:47:10 |
[ 7.635005] cloud-init[1003]: ci-info: | eth0 | True | fe80::4ad:baff:feed:4710/64 | . | link | 06:ad:ba:ed:47:10 |
[ 7.648580] cloud-init[1003]: ci-info: +--------+------+-----------------------------+---------------+-------+-------------------+
[ 7.663022] cloud-init[1003]: ci-info: +++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++
[ 7.674192] cloud-init[1003]: ci-info: +-------+-------------+-------------+---------------+-----------+-------+
[ 7.682430] cloud-init[1003]: ci-info: | Route | Destination | Gateway | Genmask | Interface | Flags |
[ 7.690935] cloud-init[1003]: ci-info: +-------+-------------+-------------+---------------+-----------+-------+
[ 7.703027] cloud-init[1003]: ci-info: | 0 | 0.0.0.0 | 172.31.16.1 | 0.0.0.0 | eth0 | UG |
[ 7.712719] cloud-init[1003]: ci-info: | 1 | 172.31.16.0 | 0.0.0.0 | 255.255.240.0 | eth0 | U |
[ 7.719950] cloud-init[1003]: ci-info: +-------+-------------+-------------+---------------+-----------+-------+
**[ 7.728024] cloud-init[1003]: 2017-10-29 19:44:35,859 - __init__.py[WARNING]: Unhandled non-multipart (text/x-not-multipart) userdata: 'b'Content-Type: text/cloud'...'**
[[0;32m OK [0m] Started Initial cloud-init job (metadata service crawler).
[[0;32m OK [0m] Reached target Cloud-config availability.
[[0;32m OK [0m] Reached target System Initialization.
[[0;32m OK [0m] Listening on D-Bus System Message Bus Socket.
Starting Socket activation for snappy daemon.
You are assigning permission 750 to /home recursively.
Mode 750 means:
The owner may read, write and execute.
The group may read and execute (but not write).
The world may not do anything with the file.
The /home directory itself is owned by root and is in the root group. Therefore, after recursively applying 750, only root can access the contents of /home (including other users' home directories).
Therefore, you really don't want to do that.
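If the goal is only to tighten one user's home directory, a plain shell-script user data block is enough; the Content-Type headers in the question are only needed inside multipart MIME user data. A minimal sketch (assuming the stock ubuntu user):

```shell
#!/bin/bash
# Plain shell-script user data: cloud-init runs this once at first boot,
# already as root, so sudo is unnecessary. Note: no -R on /home itself.
chmod 750 /home/ubuntu
```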

Unit Test using embedded cassandra server (cassandra-unit) runs in intellij but crashes when run on command line via sbt

I'm trying to run my unit tests via sbt on the command line, but every time I get this error:
java.lang.NoSuchMethodError: org.apache.cassandra.utils.FBUtilities.getNetworkInterface(Ljava/net/InetAddress;)Ljava/lang/String;
at org.apache.cassandra.net.MessagingService.getServerSockets(MessagingService.java:556)
at org.apache.cassandra.net.MessagingService.listen(MessagingService.java:488)
at org.apache.cassandra.net.MessagingService.listen(MessagingService.java:472)
at org.apache.cassandra.service.StorageService.prepareToJoin(StorageService.java:832)
at org.apache.cassandra.service.StorageService.initServer(StorageService.java:727)
at org.apache.cassandra.service.StorageService.initServer(StorageService.java:613)
at org.apache.cassandra.service.CassandraDaemon.setup(CassandraDaemon.java:349)
at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:551)
at org.cassandraunit.utils.EmbeddedCassandraServerHelper$1.run(EmbeddedCassandraServerHelper.java:125)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
3126 [pool-6-thread-1] ERROR org.apache.cassandra.service.CassandraDaemon - Exception encountered during startup
java.lang.NoSuchMethodError: org.apache.cassandra.utils.FBUtilities.getNetworkInterface(Ljava/net/InetAddress;)Ljava/lang/String;
at org.apache.cassandra.net.MessagingService.getServerSockets(MessagingService.java:556)
at org.apache.cassandra.net.MessagingService.listen(MessagingService.java:488)
at org.apache.cassandra.net.MessagingService.listen(MessagingService.java:472)
at org.apache.cassandra.service.StorageService.prepareToJoin(StorageService.java:832)
at org.apache.cassandra.service.StorageService.initServer(StorageService.java:727)
at org.apache.cassandra.service.StorageService.initServer(StorageService.java:613)
at org.apache.cassandra.service.CassandraDaemon.setup(CassandraDaemon.java:349)
at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:551)
at org.cassandraunit.utils.EmbeddedCassandraServerHelper$1.run(EmbeddedCassandraServerHelper.java:125)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Whereas when I run the tests in IntelliJ, they pass just fine.
Below is the relevant part of the dependency tree I got from the command line; the full tree is far too large to post here. Any ideas?
+-org.cassandraunit:cassandra-unit:3.0.0.1
[info] | +-com.google.guava:guava:18.0
[info] | +-io.netty:netty-handler:4.0.27.Final (evicted by: 4.0.33.Final)
[info] | +-io.netty:netty-handler:4.0.33.Final
[info] | | +-io.netty:netty-buffer:4.0.33.Final
[info] | | | +-io.netty:netty-common:4.0.33.Final
[info] | | |
[info] | | +-io.netty:netty-codec:4.0.33.Final
[info] | | | +-io.netty:netty-transport:4.0.33.Final
[info] | | | +-io.netty:netty-buffer:4.0.33.Final
[info] | | | +-io.netty:netty-common:4.0.33.Final
[info] | | |
[info] | | +-io.netty:netty-transport:4.0.33.Final
[info] | | +-io.netty:netty-buffer:4.0.33.Final
[info] | | +-io.netty:netty-common:4.0.33.Final
[info] | |
[info] | +-junit:junit:4.12 (evicted by: 4.6)
[info] | | +-org.hamcrest:hamcrest-core:1.3
[info] | |
[info] | +-junit:junit:4.6
[info] | +-org.apache.cassandra:cassandra-all:3.4
[info] | | +-com.addthis.metrics:reporter-config3:3.0.0
[info] | | | +-com.addthis.metrics:reporter-config-base:3.0.0
[info] | | | | +-org.apache.commons:commons-lang3:3.1 (evicted by: 3.4)
[info] | | | | +-org.apache.commons:commons-lang3:3.4
[info] | | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | | | +-org.yaml:snakeyaml:1.11
[info] | | | |
[info] | | | +-io.dropwizard.metrics:metrics-core:3.1.0 (evicted by: 3.1.2)
[info] | | | +-io.dropwizard.metrics:metrics-core:3.1.2
[info] | | | | +-org.slf4j:slf4j-api:1.6.4 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.10 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.2 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | | +-org.slf4j:slf4j-api:1.7.5 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | | |
[info] | | | +-org.apache.commons:commons-lang3:3.1 (evicted by: 3.4)
[info] | | | +-org.apache.commons:commons-lang3:3.4
[info] | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | | +-org.yaml:snakeyaml:1.11
[info] | | |
[info] | | +-com.boundary:high-scale-lib:1.0.6
[info] | | +-com.clearspring.analytics:stream:2.5.2 (evicted by: 2.7.0)
[info] | | +-com.clearspring.analytics:stream:2.7.0
[info] | | +-com.github.jbellis:jamm:0.3.0
[info] | | +-com.google.guava:guava:18.0
[info] | | +-com.googlecode.concurrentlinkedhashmap:concurrentlinkedhashmap..
[info] | | +-com.googlecode.json-simple:json-simple:1.1
[info] | | +-com.ning:compress-lzf:0.8.4 (evicted by: 1.0.3)
[info] | | +-com.ning:compress-lzf:1.0.3
[info] | | +-com.thinkaurelius.thrift:thrift-server:0.3.7
[info] | | | +-com.lmax:disruptor:3.0.1
[info] | | | +-junit:junit:4.6
[info] | | | +-org.apache.thrift:libthrift:0.9.2
[info] | | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | | |
[info] | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | |
[info] | | +-commons-cli:commons-cli:1.1 (evicted by: 1.2)
[info] | | +-commons-cli:commons-cli:1.2
[info] | | +-commons-codec:commons-codec:1.10
[info] | | +-commons-codec:commons-codec:1.2 (evicted by: 1.10)
[info] | | +-io.dropwizard.metrics:metrics-core:3.1.0 (evicted by: 3.1.2)
[info] | | +-io.dropwizard.metrics:metrics-core:3.1.2
[info] | | | +-org.slf4j:slf4j-api:1.6.4 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.10 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.2 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | +-org.slf4j:slf4j-api:1.7.5 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | |
[info] | | +-joda-time:joda-time:2.4 (evicted by: 2.9.4)
[info] | | +-joda-time:joda-time:2.9.4
[info] | | +-net.java.dev.jna:jna:4.0.0
[info] | | +-net.jpountz.lz4:lz4:1.3.0
[info] | | +-org.antlr:antlr-runtime:3.5.2
[info] | | +-org.antlr:antlr:3.5.2
[info] | | | +-org.antlr:ST4:4.0.8
[info] | | | | +-org.antlr:antlr-runtime:3.5.2
[info] | | | |
[info] | | | +-org.antlr:antlr-runtime:3.5.2
[info] | | |
[info] | | +-org.apache.cassandra:cassandra-thrift:3.4
[info] | | | +-com.carrotsearch:hppc:0.5.4
[info] | | | +-com.github.rholder:snowball-stemmer:1.3.0.581.1
[info] | | | +-com.googlecode.concurrent-trees:concurrent-trees:2.4.0
[info] | | | +-de.jflex:jflex:1.6.0
[info] | | | | +-org.apache.ant:ant:1.7.0
[info] | | | | +-org.apache.ant:ant-launcher:1.7.0
[info] | | | |
[info] | | | +-net.mintern:primitive:1.0
[info] | | | +-org.apache.commons:commons-lang3:3.1 (evicted by: 3.4)
[info] | | | +-org.apache.commons:commons-lang3:3.4
[info] | | | +-org.apache.thrift:libthrift:0.9.2
[info] | | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | | |
[info] | | | +-org.slf4j:jcl-over-slf4j:1.7.19
[info] | | | | +-org.slf4j:slf4j-api:1.7.19 (evicted by: 1.7.21)
[info] | | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | |
[info] | | | +-org.slf4j:jcl-over-slf4j:1.7.7 (evicted by: 1.7.19)
[info] | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | |
[info] | | +-org.apache.commons:commons-lang3:3.1 (evicted by: 3.4)
[info] | | +-org.apache.commons:commons-lang3:3.4
[info] | | +-org.apache.commons:commons-math3:3.2 (evicted by: 3.4.1)
[info] | | +-org.apache.commons:commons-math3:3.4.1
[info] | | +-org.apache.thrift:libthrift:0.9.2
[info] | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | |
[info] | | +-org.caffinitas.ohc:ohc-core:0.4.2
[info] | | | +-com.google.guava:guava:18.0
[info] | | | +-net.java.dev.jna:jna:4.0.0
[info] | | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | |
[info] | | +-org.codehaus.jackson:jackson-core-asl:1.9.13
[info] | | +-org.codehaus.jackson:jackson-core-asl:1.9.2 (evicted by: 1.9.13)
[info] | | +-org.codehaus.jackson:jackson-mapper-asl:1.9.13
[info] | | | +-org.codehaus.jackson:jackson-core-asl:1.9.13
[info] | | |
[info] | | +-org.codehaus.jackson:jackson-mapper-asl:1.9.2 (evicted by: 1.9..
[info] | | +-org.eclipse.jdt.core.compiler:ecj:4.4.2
[info] | | +-org.fusesource:sigar:1.6.4
[info] | | +-org.mindrot:jbcrypt:0.3m
[info] | | +-org.slf4j:jcl-over-slf4j:1.7.19
[info] | | | +-org.slf4j:slf4j-api:1.7.19 (evicted by: 1.7.21)
[info] | | | +-org.slf4j:slf4j-api:1.7.21
[info] | | |
[info] | | +-org.slf4j:jcl-over-slf4j:1.7.7 (evicted by: 1.7.19)
[info] | | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | | +-org.slf4j:slf4j-api:1.7.21
[info] | | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] | | +-org.xerial.snappy:snappy-java:1.1.1.7
[info] | | +-org.yaml:snakeyaml:1.11
[info] | |
[info] | +-org.apache.commons:commons-lang3:3.1 (evicted by: 3.4)
[info] | +-org.apache.commons:commons-lang3:3.4
[info] | +-org.hamcrest:hamcrest-core:1.3
[info] | +-org.hamcrest:hamcrest-library:1.3
[info] | | +-org.hamcrest:hamcrest-core:1.3
[info] | |
[info] | +-org.slf4j:slf4j-api:1.7.12 (evicted by: 1.7.21)
[info] | +-org.slf4j:slf4j-api:1.7.21
[info] | +-org.slf4j:slf4j-api:1.7.7 (evicted by: 1.7.21)
[info] |
[info] +-org.scalamock:scalamock-scalatest-support_2.11:3.2.2 [S]
[info] | +-org.scalamock:scalamock-core_2.11:3.2.2 [S]
[info] | | +-org.scala-lang:scala-reflect:2.11.5 (evicted by: 2.11.8)
[info] | | +-org.scala-lang:scala-reflect:2.11.8 [S]
[info] | |
[info] | +-org.scalatest:scalatest_2.11:2.2.4 (evicted by: 2.2.6)
[info] | +-org.scalatest:scalatest_2.11:2.2.6 [S]
[info] | +-org.scala-lang.modules:scala-xml_2.11:1.0.2 (evicted by: 1.0.4)
[info] | +-org.scala-lang.modules:scala-xml_2.11:1.0.4 [S]
[info] | +-org.scala-lang:scala-reflect:2.11.7 (evicted by: 2.11.8)
[info] | +-org.scala-lang:scala-reflect:2.11.8 [S]
For me the culprit was the embedded thrift-server forcing junit down to 4.6. It should be safe to exclude that library, since Cassandra no longer relies on Thrift as of 2.1.
Adding this line to build.sbt fixed the issue:
"org.cassandraunit" % "cassandra-unit" % "3.0.0.1" % "test" exclude("com.thinkaurelius.thrift", "thrift-server")
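For context, here is a minimal build.sbt sketch with the exclusion in place. Only the cassandra-unit dependency line comes from the answer above; the project name and the other settings are hypothetical, and the Scala version is taken from the dependency tree in the question:

```scala
// build.sbt -- minimal sketch, not a complete project definition
name := "cassandra-unit-demo"   // hypothetical project name
scalaVersion := "2.11.8"        // matches scala-reflect 2.11.8 in the tree above

libraryDependencies ++= Seq(
  // Exclude the embedded thrift-server so it cannot pull in junit 4.6,
  // which otherwise evicts/shadows the junit 4.12 that cassandra-unit expects.
  "org.cassandraunit" % "cassandra-unit" % "3.0.0.1" % "test"
    exclude("com.thinkaurelius.thrift", "thrift-server")
)
```

After changing the dependency, re-run the tree (e.g. `sbt dependencyTree`, built into sbt 1.4+ and available earlier via the sbt-dependency-graph plugin) to confirm that thrift-server and its junit 4.6 dependency are gone.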