Odoo 9 is not running tests - unit-testing

I am trying to write tests for an Odoo module. The module has this structure:
/pr_mobile
|--->controllers
|--->demo
|--->models
|--->security
|--->tests
....|----> __init__.py
....|----> test_mobileproduct.py
|--->views
__init__.py
...
__init__.py:
from . import tests
tests/__init__.py:
import test_mobileproduct
tests/test_mobileproduct.py:
# -*- coding: utf-8 -*-
import unittest

from openerp.tests import common
from openerp.exceptions import ValidationError


class TestMobileProduct(common.TransactionCase):

    def test_get_maket(self):
        self.assertEqual(1, 2)

    def test_get_maket2(self):
        self.assertEqual(1, 2)

    def test_get_maket3(self):
        self.assertEqual(1, 2)

    def test_get_maket4(self):
        self.assertEqual(1, 2)


if __name__ == '__main__':
    unittest.main()
I run the tests with this command:
./odoo.py --addons-path=addons,openerp/addons,openerp/my-addons -d pr_odoo_v9 --db-filter="^pr_odoo_v9" -u pr_mobile --test-enable --log-level=test
And the result is:
skif@ubuntu-desktop:/opt/odoo$ ./odoo.py --addons-path=addons,openerp/addons,openerp/my-addons -d pr_odoo_v9 --db-filter="^pr_odoo_v9" -u pr_mobile --test-enable --log-level=test
2019-04-10 15:11:48,708 8919 INFO ? openerp: OpenERP version 9.0c
2019-04-10 15:11:48,709 8919 INFO ? openerp: addons paths: ['/home/skif/.local/share/Odoo/addons/9.0', u'/opt/odoo/addons', u'/opt/odoo/openerp/addons', u'/opt/odoo/openerp/my-addons']
2019-04-10 15:11:48,709 8919 INFO ? openerp: database: default#default:default
2019-04-10 15:11:48,774 8919 INFO ? openerp.service.server: HTTP service (werkzeug) running on 0.0.0.0:8069
2019-04-10 15:11:48,780 8919 INFO pr_odoo_v9 openerp.modules.loading: loading 1 modules...
2019-04-10 15:11:48,788 8919 INFO pr_odoo_v9 openerp.modules.loading: 1 modules loaded in 0.01s, 0 queries
2019-04-10 15:11:49,541 8919 INFO pr_odoo_v9 openerp.modules.loading: loading 64 modules...
2019-04-10 15:11:49,560 8919 INFO ? openerp.addons.bus.models.bus: Bus.loop listen imbus on db postgres
2019-04-10 15:11:49,579 8919 INFO pr_odoo_v9 openerp.addons.report.models.report: You need Wkhtmltopdf to print a pdf version of the reports.
2019-04-10 15:11:50,252 8919 INFO pr_odoo_v9 openerp.modules.module: module pr_mobile: creating or updating database tables
/opt/odoo/openerp/models.py:451: UnicodeWarning: Unicode unequal comparison failed to convert both arguments to Unicode - interpreting them as being unequal
if cols[k][key] != vals[key]:
2019-04-10 15:11:50,601 8919 INFO pr_odoo_v9 openerp.modules.loading: loading pr_mobile/views/views.xml
2019-04-10 15:11:50,970 8919 INFO pr_odoo_v9 openerp.modules.loading: loading pr_mobile/views/saleorder.xml
2019-04-10 15:11:51,017 8919 INFO pr_odoo_v9 openerp.modules.loading: loading pr_mobile/views/templates.xml
2019-04-10 15:11:51,021 8919 INFO pr_odoo_v9 openerp.addons.base.ir.ir_translation: module pr_mobile: no translation for language ru
2019-04-10 15:11:51,171 8919 INFO pr_odoo_v9 openerp.modules.loading: 64 modules loaded in 1.63s, 5 queries
2019-04-10 15:11:51,626 8919 INFO pr_odoo_v9 openerp.modules.loading: Modules loaded.
2019-04-10 15:11:51,628 8919 INFO pr_odoo_v9 openerp.modules.loading: All post-tested in 0.00s, 0 queries
Why are the tests not running? The database contains data from the production server.

You have to set at_install or post_install for the tests to run at installation or upgrade:
class TestMobileProduct(common.TransactionCase):
    at_install = True
    post_install = True
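For reference, a minimal sketch of what tests/test_mobileproduct.py could look like with those flags set (the failing assertion is the placeholder from the question; the file is picked up through tests/__init__.py during the --test-enable phase):

# -*- coding: utf-8 -*-
from openerp.tests import common


class TestMobileProduct(common.TransactionCase):
    # Run while the module itself is being installed/updated...
    at_install = True
    # ...and again once all modules have finished loading.
    post_install = True

    def test_get_maket(self):
        # Placeholder from the question; replace with a real check.
        self.assertEqual(1, 2)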

How to use SonarCloud with Travis?

Description
We are already using SonarQube locally and we want to use it for our open source projects.
This is an example open source project we are trying to set up:
https://www.npmjs.com/package/@yeutech-lab/accept-dot-path
https://github.com/yeutech-lab/accept-dot-path
Using the dev branch, we have followed the documentation, and the build is failing:
https://travis-ci.org/yeutech-lab/accept-dot-path/jobs/396729046
Reproduction
Failing job on travis
This is my sonar-project.properties:
sonar.testExecutionReportPaths=reports/test-report.xml
sonar.projectKey=com.github.yeutech-lab.accept-dot-path
sonar.projectName=com.github.yeutech-lab.accept-dot-path
sonar.sources=src
sonar.exclusions=/src/**/tests/*.test.js
sonar.test.exclusions=/src/**/tests/*.test.js
sonar.dynamicAnalysis=reuseReports
sonar.javascript.jstest.reportsPath=coverage
sonar.javascript.lcov.reportPaths=coverage/lcov.info
This is the failing stage of my .travis.yml:
- stage: test
  if: branch IN (dev, master)
  node_js:
    - lts/*
    - 10
    - 8
  addons:
    sonarcloud:
      organization: "yeutech-lab"
  script:
    - npm run test
    - sonar-scanner -X -Dsonar.branch=${TRAVIS_BRANCH} -Dsonar.projectVersion=${SONAR_VERSION}
I get the following error:
26.52s$ sonar-scanner -X -Dsonar.branch=${TRAVIS_BRANCH} -Dsonar.projectVersion=${SONAR_VERSION}
06:30:58.836 INFO: Scanner configuration file: /home/travis/.sonarscanner/sonar-scanner-3.0.3.778/conf/sonar-scanner.properties
06:30:58.845 INFO: Project root configuration file: /home/travis/build/yeutech-lab/accept-dot-path/sonar-project.properties
06:30:58.931 INFO: SonarQube Scanner 3.0.3.778
06:30:58.931 INFO: Java 1.8.0_151 Oracle Corporation (64-bit)
06:30:58.931 INFO: Linux 4.4.0-101-generic amd64
06:30:59.317 DEBUG: keyStore is :
06:30:59.317 DEBUG: keyStore type is : jks
06:30:59.318 DEBUG: keyStore provider is :
06:30:59.319 DEBUG: init keystore
06:30:59.321 DEBUG: init keymanager of type SunX509
06:30:59.534 DEBUG: Create : /home/travis/.sonar/cache
06:30:59.537 INFO: User cache: /home/travis/.sonar/cache
06:30:59.539 DEBUG: Create : /home/travis/.sonar/cache/_tmp
06:30:59.539 DEBUG: Extract sonar-scanner-api-batch in temp...
06:30:59.565 DEBUG: Get bootstrap index...
06:30:59.565 DEBUG: Download: https://sonarcloud.io/batch/index
06:31:00.321 DEBUG: Get bootstrap completed
06:31:00.323 DEBUG: Download https://sonarcloud.io/batch/file?name=sonar-scanner-engine-shaded-developer-7.3.0.13459-all.jar to /home/travis/.sonar/cache/_tmp/fileCache1590224166395973229.tmp
06:31:05.257 DEBUG: Create isolated classloader...
06:31:05.277 DEBUG: Start temp cleaning...
06:31:05.304 DEBUG: Temp cleaning done
06:31:05.304 DEBUG: Execution getVersion
06:31:05.310 DEBUG: Execution start
06:31:05.598 INFO: Publish mode
06:31:05.771 INFO: Load global settings
06:31:06.441 DEBUG: GET 200 https://sonarcloud.io/api/settings/values.protobuf | time=659ms
06:31:06.467 INFO: Load global settings (done) | time=697ms
06:31:06.485 INFO: Server id: AWHW8ct9-T_TB3XqouNu
06:31:06.502 DEBUG: Create : /home/travis/.sonar/_tmp
06:31:06.503 INFO: User cache: /home/travis/.sonar/cache
06:31:06.686 INFO: Load/download plugins
06:31:06.686 INFO: Load plugins index
06:31:06.806 DEBUG: GET 200 https://sonarcloud.io/api/plugins/installed | time=120ms
06:31:06.850 INFO: Load plugins index (done) | time=164ms
06:31:06.853 DEBUG: Download plugin 'authbitbucket' to '/home/travis/.sonar/_tmp/fileCache8382949818402309739.tmp'
06:31:06.972 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=authbitbucket&acceptCompressions=pack200 | time=118ms
06:31:07.480 DEBUG: Unpacking plugin authbitbucket
06:31:07.564 DEBUG: Download plugin 'scmgit' to '/home/travis/.sonar/_tmp/fileCache3430018907165592069.tmp'
06:31:07.688 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=scmgit&acceptCompressions=pack200 | time=123ms
06:31:07.936 DEBUG: Unpacking plugin scmgit
06:31:08.353 DEBUG: Download plugin 'github' to '/home/travis/.sonar/_tmp/fileCache5247411604780626227.tmp'
06:31:08.471 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=github&acceptCompressions=pack200 | time=116ms
06:31:08.885 DEBUG: Unpacking plugin github
06:31:09.011 DEBUG: Download plugin 'authgithub' to '/home/travis/.sonar/_tmp/fileCache1172914636956968383.tmp'
06:31:09.128 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=authgithub&acceptCompressions=pack200 | time=116ms
06:31:09.143 DEBUG: Unpacking plugin authgithub
06:31:09.164 DEBUG: Download plugin 'license' to '/home/travis/.sonar/_tmp/fileCache2891083593711587642.tmp'
06:31:09.280 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=license&acceptCompressions=pack200 | time=115ms
06:31:09.282 DEBUG: Unpacking plugin license
06:31:09.288 DEBUG: Download plugin 'scmmercurial' to '/home/travis/.sonar/_tmp/fileCache480957901258776338.tmp'
06:31:09.405 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=scmmercurial&acceptCompressions=pack200 | time=117ms
06:31:09.407 DEBUG: Unpacking plugin scmmercurial
06:31:09.411 DEBUG: Download plugin 'authmicrosoft' to '/home/travis/.sonar/_tmp/fileCache7929759057179488686.tmp'
06:31:09.528 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=authmicrosoft&acceptCompressions=pack200 | time=115ms
06:31:10.238 DEBUG: Unpacking plugin authmicrosoft
06:31:10.479 DEBUG: Download plugin 'abap' to '/home/travis/.sonar/_tmp/fileCache6155881230164947210.tmp'
06:31:10.596 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=abap&acceptCompressions=pack200 | time=115ms
06:31:10.714 DEBUG: Unpacking plugin abap
06:31:10.918 DEBUG: Download plugin 'csharp' to '/home/travis/.sonar/_tmp/fileCache6706825159734964118.tmp'
06:31:11.034 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=csharp&acceptCompressions=pack200 | time=115ms
06:31:11.279 DEBUG: Unpacking plugin csharp
06:31:11.422 DEBUG: Download plugin 'cpp' to '/home/travis/.sonar/_tmp/fileCache5652771019902212699.tmp'
06:31:11.539 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=cpp&acceptCompressions=pack200 | time=116ms
06:31:12.227 DEBUG: Unpacking plugin cpp
06:31:12.863 DEBUG: Download plugin 'flex' to '/home/travis/.sonar/_tmp/fileCache8167974862316719743.tmp'
06:31:12.982 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=flex&acceptCompressions=pack200 | time=115ms
06:31:13.237 DEBUG: Unpacking plugin flex
06:31:13.426 DEBUG: Download plugin 'go' to '/home/travis/.sonar/_tmp/fileCache4775478942526974201.tmp'
06:31:13.542 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=go&acceptCompressions=pack200 | time=116ms
06:31:14.679 DEBUG: Unpacking plugin go
06:31:15.380 DEBUG: Download plugin 'javascript' to '/home/travis/.sonar/_tmp/fileCache6735152755692319121.tmp'
06:31:15.497 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=javascript&acceptCompressions=pack200 | time=116ms
06:31:15.839 DEBUG: Unpacking plugin javascript
06:31:16.231 DEBUG: Download plugin 'java' to '/home/travis/.sonar/_tmp/fileCache4775164839730523442.tmp'
06:31:16.348 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=java&acceptCompressions=pack200 | time=117ms
06:31:16.921 DEBUG: Unpacking plugin java
06:31:17.871 DEBUG: Download plugin 'php' to '/home/travis/.sonar/_tmp/fileCache4310559658352997108.tmp'
06:31:17.989 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=php&acceptCompressions=pack200 | time=117ms
06:31:18.335 DEBUG: Unpacking plugin php
06:31:18.630 DEBUG: Download plugin 'plsql' to '/home/travis/.sonar/_tmp/fileCache4483462510508490361.tmp'
06:31:18.746 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=plsql&acceptCompressions=pack200 | time=116ms
06:31:18.873 DEBUG: Unpacking plugin plsql
06:31:19.120 DEBUG: Download plugin 'python' to '/home/travis/.sonar/_tmp/fileCache7976201852420985200.tmp'
06:31:19.236 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=python&acceptCompressions=pack200 | time=116ms
06:31:19.361 DEBUG: Unpacking plugin python
06:31:19.548 DEBUG: Download plugin 'security' to '/home/travis/.sonar/_tmp/fileCache4952173467535429371.tmp'
06:31:19.664 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=security&acceptCompressions=pack200 | time=116ms
06:31:19.782 DEBUG: Unpacking plugin security
06:31:19.930 DEBUG: Download plugin 'swift' to '/home/travis/.sonar/_tmp/fileCache7219239880236505170.tmp'
06:31:20.046 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=swift&acceptCompressions=pack200 | time=115ms
06:31:20.175 DEBUG: Unpacking plugin swift
06:31:20.422 DEBUG: Download plugin 'typescript' to '/home/travis/.sonar/_tmp/fileCache8846622888447642464.tmp'
06:31:20.539 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=typescript&acceptCompressions=pack200 | time=116ms
06:31:21.109 DEBUG: Unpacking plugin typescript
06:31:21.247 DEBUG: Download plugin 'tsql' to '/home/travis/.sonar/_tmp/fileCache6294671329122059465.tmp'
06:31:21.363 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=tsql&acceptCompressions=pack200 | time=116ms
06:31:21.480 DEBUG: Unpacking plugin tsql
06:31:21.723 DEBUG: Download plugin 'vbnet' to '/home/travis/.sonar/_tmp/fileCache2366356249389465444.tmp'
06:31:21.841 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=vbnet&acceptCompressions=pack200 | time=118ms
06:31:22.076 DEBUG: Unpacking plugin vbnet
06:31:22.169 DEBUG: Download plugin 'web' to '/home/travis/.sonar/_tmp/fileCache1192878208453770217.tmp'
06:31:22.286 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=web&acceptCompressions=pack200 | time=117ms
06:31:22.401 DEBUG: Unpacking plugin web
06:31:22.611 DEBUG: Download plugin 'xml' to '/home/travis/.sonar/_tmp/fileCache6491864581862163918.tmp'
06:31:22.728 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=xml&acceptCompressions=pack200 | time=116ms
06:31:22.955 DEBUG: Unpacking plugin xml
06:31:23.186 INFO: Load/download plugins (done) | time=16500ms
06:31:23.250 DEBUG: API compatibility mode is enabled on plugin Mercurial [scmmercurial] (built with API lower than 5.2)
06:31:23.491 DEBUG: Plugins:
06:31:23.492 DEBUG: * Bitbucket Authentication for SonarQube 1.1.0.344 (authbitbucket)
06:31:23.493 DEBUG: * SonarPLSQL 3.2.0.1753 (plsql)
06:31:23.493 DEBUG: * SonarC# 7.2.0.5463 (csharp)
06:31:23.494 DEBUG: * SonarSecurity 7.2.0.944 (security)
06:31:23.495 DEBUG: * SonarJava 5.4.0.14284 (java)
06:31:23.495 DEBUG: * SonarWeb 2.6.0.1053 (web)
06:31:23.496 DEBUG: * SonarFlex 2.4.0.1222 (flex)
06:31:23.496 DEBUG: * SonarXML 1.5.1.1452 (xml)
06:31:23.497 DEBUG: * SonarTS 1.7.0.2828 (typescript)
06:31:23.498 DEBUG: * SonarVB 5.1.0.442 (vbnet)
06:31:23.498 DEBUG: * SonarSwift 3.3.0.2492 (swift)
06:31:23.499 DEBUG: * GitHub 1.4.2.1027 (github)
06:31:23.500 DEBUG: * SonarCFamily 5.1.0.10083 (cpp)
06:31:23.501 DEBUG: * SonarPython 1.10.0.2131 (python)
06:31:23.501 DEBUG: * GitHub Authentication for SonarQube 1.4.0.660 (authgithub)
06:31:23.501 DEBUG: * Mercurial 1.1.1 (scmmercurial)
06:31:23.501 DEBUG: * SonarGo 1.1.0.1612 (go)
06:31:23.501 DEBUG: * Microsoft Authentication for SonarCloud 1.0.0.157 (authmicrosoft)
06:31:23.501 DEBUG: * SonarTSQL 1.2.0.2539 (tsql)
06:31:23.501 DEBUG: * SonarJS 4.1.0.6085 (javascript)
06:31:23.501 DEBUG: * License for SonarLint 7.3.0.13459 (license)
06:31:23.504 DEBUG: * Git 1.5.0.1160 (scmgit)
06:31:23.504 DEBUG: * SonarPHP 2.13.0.3107 (php)
06:31:23.505 DEBUG: * SonarABAP 3.6.0.1269 (abap)
06:31:23.543 INFO: Loaded core extensions: branch-scanner
06:31:23.544 DEBUG: Execution getVersion
06:31:23.545 INFO: SonarQube server 7.3.0
06:31:23.546 INFO: Default locale: "en_US", source code encoding: "UTF-8" (analysis is platform dependent)
06:31:23.548 DEBUG: Work directory: /home/travis/build/yeutech-lab/accept-dot-path/.scannerwork
06:31:23.549 DEBUG: Execution getVersion
06:31:23.550 DEBUG: Execution execute
06:31:23.860 INFO: Installed core extension: branch-scanner
06:31:24.122 INFO: Installed core extension: branch-scanner
06:31:24.129 INFO: Process project properties
06:31:24.143 DEBUG: Process project properties (done) | time=14ms
06:31:24.158 INFO: Load project branches
06:31:24.278 DEBUG: GET 404 https://sonarcloud.io/api/project_branches/list?project=com.github.yeutech-lab.accept-dot-path%3Adev | time=116ms
06:31:24.282 DEBUG: Could not process project branches - continuing without it
06:31:24.285 INFO: Load project branches (done) | time=127ms
06:31:24.289 INFO: Load project pull requests
06:31:24.406 DEBUG: GET 404 https://sonarcloud.io/api/project_pull_requests/list?project=com.github.yeutech-lab.accept-dot-path%3Adev | time=115ms
06:31:24.407 DEBUG: Could not process project pull requests - continuing without it
06:31:24.410 INFO: Load project pull requests (done) | time=122ms
06:31:24.410 INFO: Load branch configuration
06:31:24.411 DEBUG: Not on a Bitbucket pipeline.
06:31:24.419 INFO: ------------------------------------------------------------------------
06:31:24.419 INFO: EXECUTION FAILURE
06:31:24.419 INFO: ------------------------------------------------------------------------
06:31:24.419 INFO: Total time: 25.644s
06:31:24.518 INFO: Final Memory: 54M/188M
06:31:24.518 INFO: ------------------------------------------------------------------------
06:31:24.518 ERROR: Error during SonarQube Scanner execution
java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.ProjectLock
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:678)
at org.sonar.core.platform.ComponentContainer.getComponentByType(ComponentContainer.java:281)
at org.sonar.scanner.scan.ProjectScanContainer.doBeforeStart(ProjectScanContainer.java:123)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:134)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:122)
at org.sonar.scanner.task.ScanTask.execute(ScanTask.java:48)
at org.sonar.scanner.task.TaskContainer.doAfterStart(TaskContainer.java:81)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:136)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:122)
at org.sonar.scanner.bootstrap.GlobalContainer.executeTask(GlobalContainer.java:132)
at org.sonar.batch.bootstrapper.Batch.doExecuteTask(Batch.java:116)
at org.sonar.batch.bootstrapper.Batch.executeTask(Batch.java:111)
at org.sonarsource.scanner.api.internal.batch.BatchIsolatedLauncher.execute(BatchIsolatedLauncher.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.sonarsource.scanner.api.internal.IsolatedLauncherProxy.invoke(IsolatedLauncherProxy.java:60)
at com.sun.proxy.$Proxy0.execute(Unknown Source)
at org.sonarsource.scanner.api.EmbeddedScanner.doExecute(EmbeddedScanner.java:233)
at org.sonarsource.scanner.api.EmbeddedScanner.runAnalysis(EmbeddedScanner.java:151)
at org.sonarsource.scanner.cli.Main.runAnalysis(Main.java:123)
at org.sonarsource.scanner.cli.Main.execute(Main.java:77)
at org.sonarsource.scanner.cli.Main.main(Main.java:61)
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.DefaultInputModuleHierarchy
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 24 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.ProjectBuildersExecutor
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.SingleMemberInjector.getMemberArguments(SingleMemberInjector.java:61)
at org.picocontainer.injectors.MethodInjector.getMemberArguments(MethodInjector.java:100)
at org.picocontainer.injectors.MethodInjector$2.run(MethodInjector.java:112)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.MethodInjector.decorateComponentInstance(MethodInjector.java:120)
at org.picocontainer.injectors.CompositeInjector.decorateComponentInstance(CompositeInjector.java:58)
at org.picocontainer.injectors.Reinjector.reinject(Reinjector.java:142)
at org.picocontainer.injectors.ProviderAdapter.getComponentInstance(ProviderAdapter.java:96)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 38 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.plugins.github.PullRequestProjectBuilder
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:621)
at org.picocontainer.parameters.CollectionComponentParameter.getArrayInstance(CollectionComponentParameter.java:334)
at org.picocontainer.parameters.CollectionComponentParameter.access$100(CollectionComponentParameter.java:49)
at org.picocontainer.parameters.CollectionComponentParameter$1.resolveInstance(CollectionComponentParameter.java:139)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:141)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 53 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.plugins.github.GitHubPluginConfiguration
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 69 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.MutableProjectSettings
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 83 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.repository.ProjectRepositories
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 97 more
Caused by: java.lang.IllegalStateException: Unable to load component interface org.sonar.scanner.scan.branch.BranchConfiguration
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.SingleMemberInjector.getMemberArguments(SingleMemberInjector.java:61)
at org.picocontainer.injectors.MethodInjector.getMemberArguments(MethodInjector.java:100)
at org.picocontainer.injectors.MethodInjector$2.run(MethodInjector.java:112)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.MethodInjector.decorateComponentInstance(MethodInjector.java:120)
at org.picocontainer.injectors.CompositeInjector.decorateComponentInstance(CompositeInjector.java:58)
at org.picocontainer.injectors.Reinjector.reinject(Reinjector.java:142)
at org.picocontainer.injectors.ProviderAdapter.getComponentInstance(ProviderAdapter.java:96)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 111 more
Caused by: Project was never analyzed. A regular analysis is required before a branch analysis
06:31:24.545 DEBUG: Execution getVersion
06:31:24.545 DEBUG: Execution stop
The command "sonar-scanner -X -Dsonar.branch=${TRAVIS_BRANCH} -Dsonar.projectVersion=${SONAR_VERSION}" exited with 1.
I have no clue what this error message is about; the same configuration works with our local SonarQube.
Can I have some information on how to resolve this error?
You are trying to directly analyze a branch whereas your project has not been created yet. This is why you get the following message:
Project was never analyzed. A regular analysis is required before a
branch analysis
Fixing the situation is simple:
Go to your organization "Administration > Projects Management" page
Click on "Create Project" and set the project name and key (com.github.yeutech-lab.accept-dot-path in your case)
This should fix your issue.
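If you would rather bootstrap the project from the scanner itself, a sketch of the same idea (following the error message above, which says a regular analysis must come first) is to drop the sonar.branch parameter on the very first run so the project gets created, and only add it back afterwards:

# first run: regular analysis, which creates the project on SonarCloud
sonar-scanner -Dsonar.projectVersion=${SONAR_VERSION}
# subsequent runs: branch analysis is now possible
sonar-scanner -Dsonar.branch=${TRAVIS_BRANCH} -Dsonar.projectVersion=${SONAR_VERSION}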

2 RabbitMQ workers and 2 Scrapyd daemons running on 2 local Ubuntu instances, in which one of the RabbitMQ workers is not working

I am currently building a "Scrapy spiders control panel", for which I am testing the existing solution available at Distributed Multi-User Scrapy System with a Web UI: https://github.com/aaldaber/Distributed-Multi-User-Scrapy-System-with-a-Web-UI.
I am trying to run this on my local Ubuntu dev machine but am having issues with the scrapyd daemon.
One of the workers, linkgenerator, is working, but the scraper, worker1, is not.
I cannot figure out why scrapyd won't run on the other local instance.
Background information about the configuration:
The application comes bundled with Django, Scrapy, a pipeline for MongoDB (for saving the scraped items) and a Scrapy scheduler for RabbitMQ (for distributing the links among workers). I have 2 local Ubuntu instances: Django, MongoDB, the Scrapyd daemon and the RabbitMQ server run on Instance1.
Another Scrapyd daemon is running on Instance2.
RabbitMQ Workers:
linkgenerator
worker1
IP Configurations for Instances:
IP For local Ubuntu Instance1: 192.168.0.101
IP for local Ubuntu Instance2: 192.168.0.106
List of tools used:
MongoDB server
RabbitMQ server
Scrapy Scrapyd API
One RabbitMQ link generator worker (worker name: linkgenerator), with Scrapy installed and the scrapyd daemon running, on local Ubuntu Instance1: 192.168.0.101
Another RabbitMQ scraper worker (worker name: worker1), with Scrapy installed and the scrapyd daemon running, on local Ubuntu Instance2: 192.168.0.106
Instance1: 192.168.0.101
"Instance1" is where the Django, RabbitMQ and scrapyd daemon servers are running -- IP: 192.168.0.101
Instance2: 192.168.0.106
Scrapy is installed on Instance2, and the scrapyd daemon is running there.
Scrapy control panel UI snapshot:
As can be seen in the snapshot of the control panel, there are two workers: linkgenerator worked successfully but worker1 did not; its logs are given at the end of the post.
RabbitMQ status info:
The linkgenerator worker can successfully push messages to the RabbitMQ queue; the linkgenerator spider generates start_urls for the scraper spider, which are consumed by the scraper (worker1), which is not working. Please see the logs for worker1 at the end of the post.
RabbitMQ settings
The below file contains the settings for MongoDB and RabbitMQ:
SCHEDULER = ".rabbitmq.scheduler.Scheduler"
SCHEDULER_PERSIST = True
RABBITMQ_HOST = 'ScrapyDevU79'
RABBITMQ_PORT = 5672
RABBITMQ_USERNAME = 'guest'
RABBITMQ_PASSWORD = 'guest'
MONGODB_PUBLIC_ADDRESS = 'OneScience:27017' # This will be shown on the web interface, but won't be used for connecting to DB
MONGODB_URI = 'localhost:27017' # Actual uri to connect to DB
MONGODB_USER = 'tariq'
MONGODB_PASSWORD = 'toor'
MONGODB_SHARDED = True
MONGODB_BUFFER_DATA = 100
# Set your link generator worker address here
LINK_GENERATOR = 'http://192.168.0.101:6800'
SCRAPERS = ['http://192.168.0.106:6800']
LINUX_USER_CREATION_ENABLED = False # Set this to True if you want a linux user account
linkgenerator scrapy.cfg settings:
[settings]
default = tester2_fda_trial20.settings
[deploy:linkgenerator]
url = http://192.168.0.101:6800
project = tester2_fda_trial20
scraper scrapy.cfg settings:
[settings]
default = tester2_fda_trial20.settings
[deploy:worker1]
url = http://192.168.0.101:6800
project = tester2_fda_trial20
scrapyd.conf file settings for Instance1 (192.168.0.101)
cat /etc/scrapyd/scrapyd.conf
[scrapyd]
eggs_dir = /var/lib/scrapyd/eggs
dbs_dir = /var/lib/scrapyd/dbs
items_dir = /var/lib/scrapyd/items
logs_dir = /var/log/scrapyd
max_proc = 0
max_proc_per_cpu = 4
finished_to_keep = 100
poll_interval = 5.0
bind_address = 0.0.0.0
#bind_address = 127.0.0.1
http_port = 6800
debug = on
runner = scrapyd.runner
application = scrapyd.app.application
launcher = scrapyd.launcher.Launcher
webroot = scrapyd.website.Root
[services]
schedule.json = scrapyd.webservice.Schedule
cancel.json = scrapyd.webservice.Cancel
addversion.json = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json = scrapyd.webservice.ListSpiders
delproject.json = scrapyd.webservice.DeleteProject
delversion.json = scrapyd.webservice.DeleteVersion
listjobs.json = scrapyd.webservice.ListJobs
daemonstatus.json = scrapyd.webservice.DaemonStatus
scrapyd.conf file settings for Instance2 (192.168.0.106)
cat /etc/scrapyd/scrapyd.conf
[scrapyd]
eggs_dir = /var/lib/scrapyd/eggs
dbs_dir = /var/lib/scrapyd/dbs
items_dir = /var/lib/scrapyd/items
logs_dir = /var/log/scrapyd
max_proc = 0
max_proc_per_cpu = 4
finished_to_keep = 100
poll_interval = 5.0
bind_address = 0.0.0.0
#bind_address = 127.0.0.1
http_port = 6800
debug = on
runner = scrapyd.runner
application = scrapyd.app.application
launcher = scrapyd.launcher.Launcher
webroot = scrapyd.website.Root
[services]
schedule.json = scrapyd.webservice.Schedule
cancel.json = scrapyd.webservice.Cancel
addversion.json = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json = scrapyd.webservice.ListSpiders
delproject.json = scrapyd.webservice.DeleteProject
delversion.json = scrapyd.webservice.DeleteVersion
listjobs.json = scrapyd.webservice.ListJobs
daemonstatus.json = scrapyd.webservice.DaemonStatus
RabbitMQ Status
sudo service rabbitmq-server status
[sudo] password for mtaziz:
Status of node rabbit@ScrapyDevU79
[{pid,53715},
{running_applications,
[{rabbitmq_shovel_management,
"Management extension for the Shovel plugin","3.6.11"},
{rabbitmq_shovel,"Data Shovel for RabbitMQ","3.6.11"},
{rabbitmq_management,"RabbitMQ Management Console","3.6.11"},
{rabbitmq_web_dispatch,"RabbitMQ Web Dispatcher","3.6.11"},
{rabbitmq_management_agent,"RabbitMQ Management Agent","3.6.11"},
{rabbit,"RabbitMQ","3.6.11"},
{os_mon,"CPO CXC 138 46","2.2.14"},
{cowboy,"Small, fast, modular HTTP server.","1.0.4"},
{ranch,"Socket acceptor pool for TCP protocols.","1.3.0"},
{ssl,"Erlang/OTP SSL application","5.3.2"},
{public_key,"Public key infrastructure","0.21"},
{cowlib,"Support library for manipulating Web protocols.","1.0.2"},
{crypto,"CRYPTO version 2","3.2"},
{amqp_client,"RabbitMQ AMQP Client","3.6.11"},
{rabbit_common,
"Modules shared by rabbitmq-server and rabbitmq-erlang-client",
"3.6.11"},
{inets,"INETS CXC 138 49","5.9.7"},
{mnesia,"MNESIA CXC 138 12","4.11"},
{compiler,"ERTS CXC 138 10","4.9.4"},
{xmerl,"XML parser","1.3.5"},
{syntax_tools,"Syntax tools","1.6.12"},
{asn1,"The Erlang ASN1 compiler version 2.0.4","2.0.4"},
{sasl,"SASL CXC 138 11","2.3.4"},
{stdlib,"ERTS CXC 138 10","1.19.4"},
{kernel,"ERTS CXC 138 10","2.16.4"}]},
{os,{unix,linux}},
{erlang_version,
"Erlang R16B03 (erts-5.10.4) [source] [64-bit] [smp:4:4] [async-threads:64] [kernel-poll:true]\n"},
{memory,
[{connection_readers,0},
{connection_writers,0},
{connection_channels,0},
{connection_other,6856},
{queue_procs,145160},
{queue_slave_procs,0},
{plugins,1959248},
{other_proc,22328920},
{metrics,160112},
{mgmt_db,655320},
{mnesia,83952},
{other_ets,2355800},
{binary,96920},
{msg_index,47352},
{code,27101161},
{atom,992409},
{other_system,31074022},
{total,87007232}]},
{alarms,[]},
{listeners,[{clustering,25672,"::"},{amqp,5672,"::"},{http,15672,"::"}]},
{vm_memory_calculation_strategy,rss},
{vm_memory_high_watermark,0.4},
{vm_memory_limit,3343646720},
{disk_free_limit,50000000},
{disk_free,56257699840},
{file_descriptors,
[{total_limit,924},{total_used,2},{sockets_limit,829},{sockets_used,0}]},
{processes,[{limit,1048576},{used,351}]},
{run_queue,0},
{uptime,34537},
{kernel,{net_ticktime,60}}]
scrapyd daemon on Instance1 (192.168.0.101) running status:
scrapyd
2017-09-11T06:16:07+0600 [-] Loading /home/mtaziz/.virtualenvs/onescience_dist_env/local/lib/python2.7/site-packages/scrapyd/txapp.py...
2017-09-11T06:16:07+0600 [-] Scrapyd web console available at http://0.0.0.0:6800/
2017-09-11T06:16:07+0600 [-] Loaded.
2017-09-11T06:16:07+0600 [twisted.scripts._twistd_unix.UnixAppLogger#info] twistd 17.5.0 (/home/mtaziz/.virtualenvs/onescience_dist_env/bin/python 2.7.6) starting up.
2017-09-11T06:16:07+0600 [twisted.scripts._twistd_unix.UnixAppLogger#info] reactor class: twisted.internet.epollreactor.EPollReactor.
2017-09-11T06:16:07+0600 [-] Site starting on 6800
2017-09-11T06:16:07+0600 [twisted.web.server.Site#info] Starting factory <twisted.web.server.Site instance at 0x7f5e265c77a0>
2017-09-11T06:16:07+0600 [Launcher] Scrapyd 1.2.0 started: max_proc=16, runner='scrapyd.runner'
2017-09-11T06:16:07+0600 [twisted.python.log#info] "192.168.0.101" - - [11/Sep/2017:00:16:07 +0000] "GET /listprojects.json HTTP/1.1" 200 98 "-" "python-requests/2.18.4"
2017-09-11T06:16:07+0600 [twisted.python.log#info] "192.168.0.101" - - [11/Sep/2017:00:16:07 +0000] "GET /listversions.json?project=tester2_fda_trial20 HTTP/1.1" 200 80 "-" "python-requests/2.18.4"
2017-09-11T06:16:07+0600 [twisted.python.log#info] "192.168.0.101" - - [11/Sep/2017:00:16:07 +0000] "GET /listjobs.json?project=tester2_fda_trial20 HTTP/1.1" 200 92 "-" "python-requests/2.18.4"
scrapyd daemon on Instance2 (192.168.0.106) running status:
scrapyd
2017-09-11T06:09:28+0600 [-] Loading /home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/scrapyd/txapp.py...
2017-09-11T06:09:28+0600 [-] Scrapyd web console available at http://0.0.0.0:6800/
2017-09-11T06:09:28+0600 [-] Loaded.
2017-09-11T06:09:28+0600 [twisted.scripts._twistd_unix.UnixAppLogger#info] twistd 17.5.0 (/home/mtaziz/.virtualenvs/scrapydevenv/bin/python 2.7.6) starting up.
2017-09-11T06:09:28+0600 [twisted.scripts._twistd_unix.UnixAppLogger#info] reactor class: twisted.internet.epollreactor.EPollReactor.
2017-09-11T06:09:28+0600 [-] Site starting on 6800
2017-09-11T06:09:28+0600 [twisted.web.server.Site#info] Starting factory <twisted.web.server.Site instance at 0x7fbe6eaeac20>
2017-09-11T06:09:28+0600 [Launcher] Scrapyd 1.2.0 started: max_proc=16, runner='scrapyd.runner'
2017-09-11T06:09:32+0600 [twisted.python.log#info] "192.168.0.101" - - [11/Sep/2017:00:09:32 +0000] "GET /listprojects.json HTTP/1.1" 200 98 "-" "python-requests/2.18.4"
2017-09-11T06:09:32+0600 [twisted.python.log#info] "192.168.0.101" - - [11/Sep/2017:00:09:32 +0000] "GET /listversions.json?project=tester2_fda_trial20 HTTP/1.1" 200 80 "-" "python-requests/2.18.4"
2017-09-11T06:09:32+0600 [twisted.python.log#info] "192.168.0.101" - - [11/Sep/2017:00:09:32 +0000] "GET /listjobs.json?project=tester2_fda_trial20 HTTP/1.1" 200 92 "-" "python-requests/2.18.4"
2017-09-11T06:09:37+0600 [twisted.python.log#info] "192.168.0.101" - - [11/Sep/2017:00:09:37 +0000] "GET /listprojects.json HTTP/1.1" 200 98 "-" "python-requests/2.18.4"
2017-09-11T06:09:37+0600 [twisted.python.log#info] "192.168.0.101" - - [11/Sep/2017:00:09:37 +0000] "GET /listversions.json?project=tester2_fda_trial20 HTTP/1.1" 200 80 "-" "python-requests/2.18.4"
worker1 logs
After updating the code for the RabbitMQ server settings, following the suggestion made by @Tarun Lalwani:
The suggestion was to use the RabbitMQ server IP, 192.168.0.101:5672, instead of 127.0.0.1:5672. After I updated it as suggested (see the sketch below), I got the new problems below.
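The change amounts to something like this (my reconstruction, based on the RABBITMQ_* keys in the settings file shown earlier):

RABBITMQ_HOST = '192.168.0.101'  # RabbitMQ server on Instance1, instead of 127.0.0.1/localhost
RABBITMQ_PORT = 5672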
2017-09-11 15:49:18 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: tester2_fda_trial20)
2017-09-11 15:49:18 [scrapy.utils.log] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'tester2_fda_trial20.spiders', 'ROBOTSTXT_OBEY': True, 'LOG_LEVEL': 'INFO', 'SPIDER_MODULES': ['tester2_fda_trial20.spiders'], 'BOT_NAME': 'tester2_fda_trial20', 'FEED_URI': 'file:///var/lib/scrapyd/items/tester2_fda_trial20/tester2_fda_trial20/79b1123a96d611e79276000c29bad697.jl', 'SCHEDULER': 'tester2_fda_trial20.rabbitmq.scheduler.Scheduler', 'TELNETCONSOLE_ENABLED': False, 'LOG_FILE': '/var/log/scrapyd/tester2_fda_trial20/tester2_fda_trial20/79b1123a96d611e79276000c29bad697.log'}
2017-09-11 15:49:18 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.feedexport.FeedExporter',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.logstats.LogStats',
'scrapy.extensions.corestats.CoreStats']
2017-09-11 15:49:18 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-09-11 15:49:18 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
2017-09-11 15:49:18 [scrapy.middleware] INFO: Enabled item pipelines:
['tester2_fda_trial20.pipelines.FdaTrial20Pipeline',
'tester2_fda_trial20.mongodb.scrapy_mongodb.MongoDBPipeline']
2017-09-11 15:49:18 [scrapy.core.engine] INFO: Spider opened
2017-09-11 15:49:18 [pika.adapters.base_connection] INFO: Connecting to 192.168.0.101:5672
2017-09-11 15:49:18 [pika.adapters.blocking_connection] INFO: Created channel=1
2017-09-11 15:49:18 [scrapy.core.engine] INFO: Closing spider (shutdown)
2017-09-11 15:49:18 [pika.adapters.blocking_connection] INFO: Channel.close(0, Normal Shutdown)
2017-09-11 15:49:18 [pika.channel] INFO: Channel.close(0, Normal Shutdown)
2017-09-11 15:49:18 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method ?.close_spider of <scrapy.extensions.feedexport.FeedExporter object at 0x7f94878b8c50>>
Traceback (most recent call last):
File "/home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/scrapy/extensions/feedexport.py", line 201, in close_spider
slot = self.slot
AttributeError: 'FeedExporter' object has no attribute 'slot'
2017-09-11 15:49:18 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method ?.spider_closed of <Tester2Fda_Trial20Spider 'tester2_fda_trial20' at 0x7f9484f897d0>>
Traceback (most recent call last):
File "/home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
return receiver(*arguments, **named)
File "/tmp/user/1000/tester2_fda_trial20-10-d4Req9.egg/tester2_fda_trial20/spiders/tester2_fda_trial20.py", line 28, in spider_closed
AttributeError: 'Tester2Fda_Trial20Spider' object has no attribute 'statstask'
2017-09-11 15:49:18 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'finish_reason': 'shutdown',
'finish_time': datetime.datetime(2017, 9, 11, 9, 49, 18, 159896),
'log_count/ERROR': 2,
'log_count/INFO': 10}
2017-09-11 15:49:18 [scrapy.core.engine] INFO: Spider closed (shutdown)
2017-09-11 15:49:18 [twisted] CRITICAL: Unhandled error in Deferred:
2017-09-11 15:49:18 [twisted] CRITICAL:
Traceback (most recent call last):
File "/home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
result = g.send(result)
File "/home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/scrapy/crawler.py", line 95, in crawl
six.reraise(*exc_info)
File "/home/mtaziz/.virtualenvs/scrapydevenv/local/lib/python2.7/site-packages/scrapy/crawler.py", line 79, in crawl
yield self.engine.open_spider(self.spider, start_requests)
OperationFailure: command SON([('saslStart', 1), ('mechanism', 'SCRAM-SHA-1'), ('payload', Binary('n,,n=tariq,r=MjY5OTQ0OTYwMjA4', 0)), ('autoAuthorize', 1)]) on namespace admin.$cmd failed: Authentication failed.
MongoDBPipeline
# coding:utf-8
import datetime

from pymongo import errors
from pymongo.mongo_client import MongoClient
from pymongo.mongo_replica_set_client import MongoReplicaSetClient
from pymongo.read_preferences import ReadPreference
from scrapy.exporters import BaseItemExporter

try:
    from urllib.parse import quote
except:
    from urllib import quote


def not_set(string):
    """ Check if a string is None or ''

    :returns: bool - True if the string is empty
    """
    if string is None:
        return True
    elif string == '':
        return True
    return False


class MongoDBPipeline(BaseItemExporter):
    """ MongoDB pipeline class """
    # Default options
    config = {
        'uri': 'mongodb://localhost:27017',
        'fsync': False,
        'write_concern': 0,
        'database': 'scrapy-mongodb',
        'collection': 'items',
        'replica_set': None,
        'buffer': None,
        'append_timestamp': False,
        'sharded': False
    }

    # Needed for sending acknowledgement signals to RabbitMQ for all persisted items
    queue = None
    acked_signals = []

    # Item buffer
    item_buffer = dict()

    def load_spider(self, spider):
        self.crawler = spider.crawler
        self.settings = spider.settings
        self.queue = self.crawler.engine.slot.scheduler.queue

    def open_spider(self, spider):
        self.load_spider(spider)

        # Configure the connection
        self.configure()
        self.spidername = spider.name
        self.config['uri'] = 'mongodb://' + self.config['username'] + ':' + quote(self.config['password']) + '@' + self.config['uri'] + '/admin'
        self.shardedcolls = []

        if self.config['replica_set'] is not None:
            self.connection = MongoReplicaSetClient(
                self.config['uri'],
                replicaSet=self.config['replica_set'],
                w=self.config['write_concern'],
                fsync=self.config['fsync'],
                read_preference=ReadPreference.PRIMARY_PREFERRED)
        else:
            # Connecting to a stand alone MongoDB
            self.connection = MongoClient(
                self.config['uri'],
                fsync=self.config['fsync'],
                read_preference=ReadPreference.PRIMARY)

        # Set up the collection
        self.database = self.connection[spider.name]

        # Autoshard the DB
        if self.config['sharded']:
            db_statuses = self.connection['config']['databases'].find({})
            partitioned = []
            notpartitioned = []
            for status in db_statuses:
                if status['partitioned']:
                    partitioned.append(status['_id'])
                else:
                    notpartitioned.append(status['_id'])
            if spider.name in notpartitioned or spider.name not in partitioned:
                try:
                    self.connection.admin.command('enableSharding', spider.name)
                except errors.OperationFailure:
                    pass
            else:
                collections = self.connection['config']['collections'].find({})
                for coll in collections:
                    if (spider.name + '.') in coll['_id']:
                        if coll['dropped'] is not True:
                            if coll['_id'].index(spider.name + '.') == 0:
                                self.shardedcolls.append(coll['_id'][coll['_id'].index('.') + 1:])

    def configure(self):
        """ Configure the MongoDB connection """
        # Set all regular options
        options = [
            ('uri', 'MONGODB_URI'),
            ('fsync', 'MONGODB_FSYNC'),
            ('write_concern', 'MONGODB_REPLICA_SET_W'),
            ('database', 'MONGODB_DATABASE'),
            ('collection', 'MONGODB_COLLECTION'),
            ('replica_set', 'MONGODB_REPLICA_SET'),
            ('buffer', 'MONGODB_BUFFER_DATA'),
            ('append_timestamp', 'MONGODB_ADD_TIMESTAMP'),
            ('sharded', 'MONGODB_SHARDED'),
            ('username', 'MONGODB_USER'),
            ('password', 'MONGODB_PASSWORD')
        ]

        for key, setting in options:
            if not not_set(self.settings[setting]):
                self.config[key] = self.settings[setting]

    def process_item(self, item, spider):
        """ Process the item and add it to MongoDB

        :type item: Item object
        :param item: The item to put into MongoDB
        :type spider: BaseSpider object
        :param spider: The spider running the queries
        :returns: Item object
        """
        item_name = item.__class__.__name__

        # If we are working with a sharded DB, the collection will also be sharded
        if self.config['sharded']:
            if item_name not in self.shardedcolls:
                try:
                    self.connection.admin.command('shardCollection', '%s.%s' % (self.spidername, item_name), key={'_id': "hashed"})
                    self.shardedcolls.append(item_name)
                except errors.OperationFailure:
                    self.shardedcolls.append(item_name)

        itemtoinsert = dict(self._get_serialized_fields(item))

        if self.config['buffer']:
            if item_name not in self.item_buffer:
                self.item_buffer[item_name] = []
                self.item_buffer[item_name].append([])
                self.item_buffer[item_name].append(0)

            self.item_buffer[item_name][1] += 1

            if self.config['append_timestamp']:
                itemtoinsert['scrapy-mongodb'] = {'ts': datetime.datetime.utcnow()}

            self.item_buffer[item_name][0].append(itemtoinsert)

            if self.item_buffer[item_name][1] == self.config['buffer']:
                self.item_buffer[item_name][1] = 0
                self.insert_item(self.item_buffer[item_name][0], spider, item_name)

            return item

        self.insert_item(itemtoinsert, spider, item_name)
        return item

    def close_spider(self, spider):
        """ Method called when the spider is closed

        :type spider: BaseSpider object
        :param spider: The spider running the queries
        :returns: None
        """
        for key in self.item_buffer:
            if self.item_buffer[key][0]:
                self.insert_item(self.item_buffer[key][0], spider, key)

    def insert_item(self, item, spider, item_name):
        """ Process the item and add it to MongoDB

        :type item: (Item object) or [(Item object)]
        :param item: The item(s) to put into MongoDB
        :type spider: BaseSpider object
        :param spider: The spider running the queries
        :returns: Item object
        """
        self.collection = self.database[item_name]

        if not isinstance(item, list):
            if self.config['append_timestamp']:
                item['scrapy-mongodb'] = {'ts': datetime.datetime.utcnow()}

            ack_signal = item['ack_signal']
            item.pop('ack_signal', None)
            self.collection.insert(item, continue_on_error=True)
            if ack_signal not in self.acked_signals:
                self.queue.acknowledge(ack_signal)
                self.acked_signals.append(ack_signal)
        else:
            signals = []
            for eachitem in item:
                signals.append(eachitem['ack_signal'])
                eachitem.pop('ack_signal', None)
            self.collection.insert(item, continue_on_error=True)
            del item[:]
            for ack_signal in signals:
                if ack_signal not in self.acked_signals:
                    self.queue.acknowledge(ack_signal)
                    self.acked_signals.append(ack_signal)
To sum up, I believe the problem lies with the scrapyd daemons running on the two instances, but somehow the scraper (worker1) cannot access one of them. I could not figure it out, and I did not find any related use cases on Stack Overflow.
Any help in this regard is highly appreciated. Thank you in advance!

Configure DDS with scrapy-splash. ERROR: no base objects

LS,
I have installed Django-Dynamic-Scraper (DDS) and I would like to render JavaScript via Splash. Therefore I have installed scrapy-splash and the Splash Docker image. The image below shows that the Docker container can be reached.
Splash docker container
Nevertheless, when I test it via DDS, it returns the following error:
2016-10-25 17:06:00 [scrapy] INFO: Spider opened
2016-10-25 17:06:00 [scrapy] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2016-10-25 17:06:00 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-10-25 17:06:05 [scrapy] DEBUG: Crawled (200) <POST http://192.168.0.150:8050/render.html> (referer: None)
2016-10-25 17:06:06 [root] ERROR: No base objects found!
2016-10-25 17:06:06 [scrapy] INFO: Closing spider (finished)
2016-10-25 17:06:06 [scrapy] INFO: Dumping Scrapy stats:
when executing:
scrapy crawl my_spider -a id=1
I have configured the DDS admin page and checked the checkbox to render the JavaScript:
Admin configuration
I have followed the configuration from scrapy-splash:
# ----------------------------------------------------------------------
# SPLASH SETTINGS
# https://github.com/scrapy-plugins/scrapy-splash#configuration
# --------------------------------------------------------------------
SPLASH_URL = 'http://192.168.0.150:8050/'
DSCRAPER_SPLASH_ARGS = {'wait': 3}
DOWNLOADER_MIDDLEWARES = {
    'scrapy_splash.SplashCookiesMiddleware': 723,
    'scrapy_splash.SplashMiddleware': 725,
    'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware': 810,
}
# This middleware is needed to support cache_args feature;
# it allows to save disk space by not storing duplicate Splash arguments
# multiple times in a disk request queue.
SPIDER_MIDDLEWARES = {
    'scrapy_splash.SplashDeduplicateArgsMiddleware': 100,
}
DUPEFILTER_CLASS = 'scrapy_splash.SplashAwareDupeFilter'
# If you use Scrapy HTTP cache then a custom cache storage backend is required.
# scrapy-splash provides a subclass
HTTPCACHE_STORAGE = 'scrapy_splash.SplashAwareFSCacheStorage'
I assume that, with a correct configuration of DDS/scrapy-splash, it will send the required arguments to the Splash Docker container to render. Is this the case?
What am I missing? Do I need to adjust the spider with a Splash script?

Druid not storing to AWS S3

I am trying to push the data to AWS S3. I used the example in http://druid.io/docs/0.7.0/Tutorial:-The-Druid-Cluster.html but modified common.runtime.properties as below:
druid.storage.type=s3
druid.s3.accessKey=AKIAJWTETHZDEQLHQ7AQ
druid.s3.secretKey=tcTtvGXcqLmmMbo2hRunzlSA1P2X0O0bjVf537Nt
druid.storage.bucket=testfeed
druid.storage.baseKey=sample
Below are the logs for the realtime node:
2015-03-02T15:03:44,809 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.QueryConfig] from props[druid.query.] as [io.druid.query.QueryConfig@2edcd9d]
2015-03-02T15:03:44,843 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@7939de8b]
2015-03-02T15:03:44,861 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@bea8209]
2015-03-02T15:03:44,874 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [100000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-03-02T15:03:44,878 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-03-02T15:03:44,878 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-03-02T15:03:44,880 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-03-02T15:03:44,956 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@276503c4]
2015-03-02T15:03:44,960 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.segment.loading.LocalDataSegmentPusherConfig] from props[druid.storage.] as [io.druid.segment.loading.LocalDataSegmentPusherConfig@360548eb]
2015-03-02T15:03:44,967 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig@75ba7964]
2015-03-02T15:03:44,971 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@1ff2a544]
2015-03-02T15:03:44,984 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ZkPathsConfig] from props[druid.zk.paths.] as [io.druid.server.initialization.ZkPathsConfig@58d3f4be]
2015-03-02T15:03:44,990 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.curator.CuratorConfig] from props[druid.zk.service.] as [io.druid.curator.CuratorConfig@5fd11499]
I found the issue: I had missed the S3 extension in common.runtime.properties. Once that was added, data started getting pushed to S3.
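For anyone else hitting this, the missing piece in common.runtime.properties looks roughly like the sketch below (my reconstruction; the exact coordinates depend on the Druid version, and 0.7.x pulled extensions in via druid.extensions.coordinates):

# load the S3 deep-storage extension so druid.storage.type=s3 can be resolved
druid.extensions.coordinates=["io.druid.extensions:druid-s3-extensions:0.7.0"]
druid.storage.type=s3
druid.storage.bucket=testfeed
druid.storage.baseKey=sample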

How can I use scrapy shell with a username and password on a URL (a login-required website)?

I want to scrape a login-required website and check whether my XPath is right or wrong, using the scrapy shell of the Python Scrapy framework, like:
C:\Users\Ranvijay.Sachan>scrapy shell https://www.google.co.in/?gfe_rd=cr&ei=mIl8V6LovC8gegtYHYDg&gws_rd=ssl
:0: UserWarning: You do not have a working installation of the service_identity module: 'No module named service_identity'. Please install it from <https://pypi.python.org/pypi/service_identity> and make sure all of its dependencies are satisfied. Without the service_identity module and a recent enough pyOpenSSL to support it, Twisted can perform only rudimentary TLS client hostname verification. Many valid certificate/hostname mappings may be rejected.
2014-12-01 21:00:04-0700 [scrapy] INFO: Scrapy 0.24.2 started (bot: scrapybot)
2014-12-01 21:00:04-0700 [scrapy] INFO: Optional features available: ssl, http11
2014-12-01 21:00:04-0700 [scrapy] INFO: Overridden settings: {'LOGSTATS_INTERVAL': 0}
2014-12-01 21:00:05-0700 [scrapy] INFO: Enabled extensions: TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2014-12-01 21:00:05-0700 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2014-12-01 21:00:05-0700 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2014-12-01 21:00:05-0700 [scrapy] INFO: Enabled item pipelines:
2014-12-01 21:00:05-0700 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2014-12-01 21:00:05-0700 [scrapy] DEBUG: Web service listening on 127.0.0.1:6081
2014-12-01 21:00:05-0700 [default] INFO: Spider opened
2014-12-01 21:00:06-0700 [default] DEBUG: Crawled (200) <GET https://www.google.co.in/?gfe_rd=cr> (referer: None)
[s] Available Scrapy objects:
[s] crawler <scrapy.crawler.Crawler object at 0x01B71910>
[s] item {}
[s] request <GET https://www.google.co.in/?gfe_rd=cr>
[s] response <200 https://www.google.co.in/?gfe_rd=cr>
[s] settings <scrapy.settings.Settings object at 0x023CBC90>
[s] spider <Spider 'default' at 0x29402f0>
[s] Useful shortcuts:
[s] shelp() Shell help (print this help)
[s] fetch(req_or_url) Fetch request (or URL) and update local objects
[s] view(response) View response in a browser
>>> response.xpath("//div[@id='_eEe']/text()").extract()
[u'Google.co.in offered in: ', u' ', u' ', u' ', u' ', u' ', u' ', u' ', u' ', u' ']
>>>
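One way to do this from the shell (a sketch, not from the original post; the URLs and form field names below are hypothetical and must be adapted to the target site) is to fetch the login page, submit a FormRequest, and, once the shell's cookies hold the session, fetch the protected page and test the XPath there:

>>> from scrapy.http import FormRequest
>>> fetch('https://example.com/login')  # hypothetical login page
>>> req = FormRequest.from_response(response, formdata={'username': 'me', 'password': 'secret'})
>>> fetch(req)  # submits the form; the shell keeps the session cookies
>>> fetch('https://example.com/members')  # hypothetical page behind the login
>>> response.xpath("//div[@id='content']/text()").extract()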