When I try to start the Ember server (ember serve) I get this error:
[Embroider:MacrosConfig] the given config from 'C:\WebDev\cpc\front-end_service\node_modules\ember-get-config' for packageName 'undefined' is not JSON serializable.
Here are the stack error details:
=================================================================================
ENV Summary:
TIME: Sat Apr 09 2022 01:13:15 GMT+0200 (Central European Summer Time)
TITLE: ember
ARGV:
- C:\Program Files\nodejs\node.exe
- C:\Users\zak\AppData\Roaming\npm\node_modules\ember-cli\bin\ember
- s
EXEC_PATH: C:\Program Files\nodejs\node.exe
TMPDIR: C:\Users\zak\AppData\Local\Temp
SHELL: null
PATH:
- C
- \Program Files (x86)\AMD APP\bin\x86_64;C
- \Program Files (x86)\AMD APP\bin\x86;C
- \Windows\system32;C
- \Windows;C
- \Windows\System32\Wbem;C
- \Windows\System32\WindowsPowerShell\v1.0\;C
- \Windows\System32\OpenSSH\;C
- \Program Files\Git\cmd;C
- \Program Files\dotnet\;C
- \Program Files (x86)\Intel\OpenCL SDK\2.0\bin\x86;C
- \Program Files (x86)\Intel\OpenCL SDK\2.0\bin\x64;C
- \Program Files\nodejs\;C
- \Program Files (x86)\Intel\OpenCL SDK\2.0\bin\x86;C
- \Program Files (x86)\Intel\OpenCL SDK\2.0\bin\x64;C
- \Program Files\PuTTY\;C
- \Users\zak\AppData\Local\Microsoft\WindowsApps;C
- \Users\zak\AppData\Local\Programs\Microsoft VS Code\bin;C
- \Users\zak\AppData\Roaming\npm
PLATFORM: win32 x64
FREEMEM: 4299919360
TOTALMEM: 12786380800
UPTIME: 60966
LOADAVG: 0,0,0
CPUS:
- Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz - 2195
- Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz - 2195
- Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz - 2195
- Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz - 2195
- Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz - 2195
- Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz - 2195
- Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz - 2195
- Intel(R) Core(TM) i7-2670QM CPU @ 2.20GHz - 2195
ENDIANNESS: LE
VERSIONS:
- ares: 1.17.1
- brotli: 1.0.9
- cldr: 39.0
- icu: 69.1
- llhttp: 2.1.3
- modules: 83
- napi: 8
- nghttp2: 1.42.0
- node: 14.17.3
- openssl: 1.1.1k
- tz: 2021a
- unicode: 13.0
- uv: 1.41.0
- v8: 8.4.371.23-node.67
- zlib: 1.2.11
ERROR Summary:
- broccoliBuilderErrorStack: [undefined]
- code: [undefined]
- codeFrame: [undefined]
- errorMessage: [Embroider:MacrosConfig] the given config from 'projectPath\node_modules\ember-get-config' for packageName 'undefined' is not JSON serializable.
- errorType: [undefined]
- location:
- column: [undefined]
- file: [undefined]
- line: [undefined]
- message: [Embroider:MacrosConfig] the given config from 'projectPath\node_modules\ember-get-config' for packageName 'undefined' is not JSON serializable.
- name: Error
- nodeAnnotation: [undefined]
- nodeName: [undefined]
- originalErrorMessage: [undefined]
- stack: Error: [Embroider:MacrosConfig] the given config from 'projectPath\node_modules\ember-get-config' for packageName 'undefined' is not JSON serializable.
at MacrosConfig.internalSetConfig (projectPath\node_modules\@embroider\macros\src\macros-config.js:163:19)
at MacrosConfig.setOwnConfig (projectPath\node_modules\@embroider\macros\src\macros-config.js:142:21)
at Class.included (projectPath\node_modules\@embroider\macros\src\ember-addon-main.js:25:26)
at Class.superWrapper [as included] (projectPath\node_modules\core-object\lib\assign-properties.js:34:20)
at projectPath\node_modules\ember-cli\lib\models\addon.js:497:26
at Array.reduce (<anonymous>)
at Class.eachAddonInvoke (projectPath\node_modules\ember-cli\lib\models\addon.js:494:24)
at Class.included (projectPath\node_modules\ember-cli\lib\models\addon.js:769:10)
at Class.superWrapper [as included] (projectPath\node_modules\core-object\lib\assign-properties.js:34:20)
at Class.included (projectPath\node_modules\ember-get-config\index.js:29:26)
=================================================================================
I searched a lot on Google but didn't find any solution. If someone can help, thanks.
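A common cause of this message is a value in config/environment.js that does not survive JSON serialization (a function, class instance, or regex): ember-get-config forwards the whole config through Embroider's MacrosConfig, which rejects anything JSON.stringify would drop. A minimal sketch of the check, assuming a hypothetical config object (the `helper` key is illustrative, not from the question):

```shell
# If stringify/parse loses a key, MacrosConfig will refuse the config.
node -e '
const config = {
  modulePrefix: "front-end",   // serializable: fine
  helper: function () {},      // NOT serializable: silently dropped by JSON
};
const roundTrip = JSON.parse(JSON.stringify(config));
console.log(Object.keys(roundTrip).length); // prints 1: helper is gone
'
```

Auditing config/environment.js for such values (or updating ember-get-config to a release compatible with your @embroider/macros version) is the usual direction to investigate.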
This is my docker-compose.yaml file:
# Copyright IBM Corp. All Rights Reserved.
#
# SPDX-License-Identifier: Apache-2.0
#
version: '2'
volumes:
orderer1.workspace:
orderer2.workspace:
orderer3.workspace:
orderer4.workspace:
orderer5.workspace:
peer1.developers.workspace:
peer2.developers.workspace:
peer1.accounts.workspace:
peer2.accounts.workspace:
peer1.hr.workspace:
peer2.hr.workspace:
peer1.marketing.workspace:
peer2.marketing.workspace:
networks:
byfn:
services:
orderer1.workspace:
extends:
file: base.yaml
service: orderer-base
container_name: orderer1.workspace
networks:
- byfn
volumes:
- ./channel-artifacts/genesis.block:/var/hyperledger/orderer/orderer.genesis.block
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer1.workspace/msp:/var/hyperledger/orderer/msp
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer1.workspace/tls/:/var/hyperledger/orderer/tls
- orderer1.workspace:/var/hyperledger/production/orderer
ports:
- 7050:7050
orderer2.workspace:
extends:
file: base.yaml
service: orderer-base
container_name: orderer2.workspace
networks:
- byfn
volumes:
- ./channel-artifacts/genesis.block:/var/hyperledger/orderer/orderer.genesis.block
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer2.workspace/msp:/var/hyperledger/orderer/msp
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer2.workspace/tls/:/var/hyperledger/orderer/tls
- orderer2.workspace:/var/hyperledger/production/orderer
ports:
- 8050:7050
orderer3.workspace:
extends:
file: base.yaml
service: orderer-base
container_name: orderer3.workspace
networks:
- byfn
volumes:
- ./channel-artifacts/genesis.block:/var/hyperledger/orderer/orderer.genesis.block
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer3.workspace/msp:/var/hyperledger/orderer/msp
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer3.workspace/tls/:/var/hyperledger/orderer/tls
- orderer3.workspace:/var/hyperledger/production/orderer
ports:
- 9050:7050
orderer4.workspace:
extends:
file: base.yaml
service: orderer-base
container_name: orderer4.workspace
networks:
- byfn
volumes:
- ./channel-artifacts/genesis.block:/var/hyperledger/orderer/orderer.genesis.block
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer4.workspace/msp:/var/hyperledger/orderer/msp
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer4.workspace/tls/:/var/hyperledger/orderer/tls
- orderer4.workspace:/var/hyperledger/production/orderer
ports:
- 10050:7050
orderer5.workspace:
extends:
file: base.yaml
service: orderer-base
container_name: orderer5.workspace
networks:
- byfn
volumes:
- ./channel-artifacts/genesis.block:/var/hyperledger/orderer/orderer.genesis.block
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer5.workspace/msp:/var/hyperledger/orderer/msp
- ./crypto-config/ordererOrganizations/workspace/orderers/orderer5.workspace/tls/:/var/hyperledger/orderer/tls
- orderer5.workspace:/var/hyperledger/production/orderer
ports:
- 11050:7050
peer1.developers.workspace:
container_name: peer1.developers.workspace
extends:
file: base.yaml
service: peer-base
environment:
- CORE_PEER_ID=peer1.developers.workspace
- CORE_PEER_ADDRESS=peer1.developers.workspace:7051
- CORE_PEER_LISTENADDRESS=0.0.0.0:7051
- CORE_PEER_CHAINCODEADDRESS=peer1.developers.workspace:7052
- CORE_PEER_CHAINCODELISTENADDRESS=0.0.0.0:7052
- CORE_PEER_GOSSIP_BOOTSTRAP=peer2.developers.workspace:8051
- CORE_PEER_GOSSIP_EXTERNALENDPOINT=peer1.developers.workspace:7051
- CORE_PEER_LOCALMSPID=Org1MSP
volumes:
- /var/run/:/host/var/run/
- ./crypto-config/peerOrganizations/developers.workspace/peers/peer1.developers.workspace/msp:/etc/hyperledger/fabric/msp
- ./crypto-config/peerOrganizations/developers.workspace/peers/peer1.developers.workspace/tls:/etc/hyperledger/fabric/tls
- peer1.developers.workspace:/var/hyperledger/production
ports:
- 7051:7051
networks:
- byfn
peer2.developers.workspace:
container_name: peer2.developers.workspace
extends:
file: base.yaml
service: peer-base
environment:
- CORE_PEER_ID=peer2.developers.workspace
- CORE_PEER_ADDRESS=peer2.developers.workspace:8051
- CORE_PEER_LISTENADDRESS=0.0.0.0:8051
- CORE_PEER_CHAINCODEADDRESS=peer2.developers.workspace:8052
- CORE_PEER_CHAINCODELISTENADDRESS=0.0.0.0:7052
- CORE_PEER_GOSSIP_BOOTSTRAP=peer1.developers.workspace:7051
- CORE_PEER_GOSSIP_EXTERNALENDPOINT=peer2.developers.workspace:8051
- CORE_PEER_LOCALMSPID=Org1MSP
volumes:
- /var/run/:/host/var/run/
- ./crypto-config/peerOrganizations/developers.workspace/peers/peer2.developers.workspace/msp:/etc/hyperledger/fabric/msp
- ./crypto-config/peerOrganizations/developers.workspace/peers/peer2.developers.workspace/tls:/etc/hyperledger/fabric/tls
- peer2.developers.workspace:/var/hyperledger/production
ports:
- 8051:8051
networks:
- byfn
peer1.accounts.workspace:
container_name: peer1.accounts.workspace
extends:
file: base.yaml
service: peer-base
environment:
- CORE_PEER_ID=peer1.accounts.workspace
- CORE_PEER_ADDRESS=peer1.accounts.workspace:9051
- CORE_PEER_LISTENADDRESS=0.0.0.0:9051
- CORE_PEER_CHAINCODEADDRESS=peer1.accounts.workspace:9052
- CORE_PEER_CHAINCODELISTENADDRESS=0.0.0.0:9052
- CORE_PEER_GOSSIP_BOOTSTRAP=peer2.accounts.workspace:10051
- CORE_PEER_GOSSIP_EXTERNALENDPOINT=peer1.accounts.workspace:9051
- CORE_PEER_LOCALMSPID=Org2MSP
volumes:
- /var/run/:/host/var/run/
- ./crypto-config/peerOrganizations/accounts.workspace/peers/peer1.accounts.workspace/msp:/etc/hyperledger/fabric/msp
- ./crypto-config/peerOrganizations/accounts.workspace/peers/peer1.accounts.workspace/tls:/etc/hyperledger/fabric/tls
- peer1.accounts.workspace:/var/hyperledger/production
ports:
- 9051:9051
networks:
- byfn
peer2.accounts.workspace:
container_name: peer2.accounts.workspace
extends:
file: base.yaml
service: peer-base
environment:
- CORE_PEER_ID=peer2.accounts.workspace
- CORE_PEER_ADDRESS=peer2.accounts.workspace:10051
- CORE_PEER_LISTENADDRESS=0.0.0.0:10051
- CORE_PEER_CHAINCODEADDRESS=peer2.accounts.workspace:10052
- CORE_PEER_CHAINCODELISTENADDRESS=0.0.0.0:10052
- CORE_PEER_GOSSIP_BOOTSTRAP=peer1.accounts.workspace:9051
- CORE_PEER_GOSSIP_EXTERNALENDPOINT=peer2.accounts.workspace:10051
- CORE_PEER_LOCALMSPID=Org2MSP
volumes:
- /var/run/:/host/var/run/
- ./crypto-config/peerOrganizations/accounts.workspace/peers/peer2.accounts.workspace/msp:/etc/hyperledger/fabric/msp
- ./crypto-config/peerOrganizations/accounts.workspace/peers/peer2.accounts.workspace/tls:/etc/hyperledger/fabric/tls
- peer2.accounts.workspace:/var/hyperledger/production
ports:
- 10051:10051
networks:
- byfn
peer1.hr.workspace:
container_name: peer1.hr.workspace
extends:
file: base.yaml
service: peer-base
environment:
- CORE_PEER_ID=peer1.hr.workspace
- CORE_PEER_ADDRESS=peer1.hr.workspace:11051
- CORE_PEER_LISTENADDRESS=0.0.0.0:11051
- CORE_PEER_CHAINCODEADDRESS=peer1.hr.workspace:11052
- CORE_PEER_CHAINCODELISTENADDRESS=0.0.0.0:11052
- CORE_PEER_GOSSIP_BOOTSTRAP=peer2.hr.workspace:12051
- CORE_PEER_GOSSIP_EXTERNALENDPOINT=peer1.hr.workspace:11051
- CORE_PEER_LOCALMSPID=Org3MSP
volumes:
- /var/run/:/host/var/run/
- ./crypto-config/peerOrganizations/hr.workspace/peers/peer1.hr.workspace/msp:/etc/hyperledger/fabric/msp
- ./crypto-config/peerOrganizations/hr.workspace/peers/peer1.hr.workspace/tls:/etc/hyperledger/fabric/tls
- peer1.hr.workspace:/var/hyperledger/production
ports:
- 11051:11051
networks:
- byfn
peer2.hr.workspace:
container_name: peer2.hr.workspace
extends:
file: base.yaml
service: peer-base
environment:
- CORE_PEER_ID=peer2.hr.workspace
- CORE_PEER_ADDRESS=peer2.hr.workspace:12051
- CORE_PEER_LISTENADDRESS=0.0.0.0:12051
- CORE_PEER_CHAINCODEADDRESS=peer2.hr.workspace:12052
- CORE_PEER_CHAINCODELISTENADDRESS=0.0.0.0:12052
- CORE_PEER_GOSSIP_BOOTSTRAP=peer1.hr.workspace:11051
- CORE_PEER_GOSSIP_EXTERNALENDPOINT=peer2.hr.workspace:12051
- CORE_PEER_LOCALMSPID=Org3MSP
volumes:
- /var/run/:/host/var/run/
- ./crypto-config/peerOrganizations/hr.workspace/peers/peer2.hr.workspace/msp:/etc/hyperledger/fabric/msp
- ./crypto-config/peerOrganizations/hr.workspace/peers/peer2.hr.workspace/tls:/etc/hyperledger/fabric/tls
- peer2.hr.workspace:/var/hyperledger/production
ports:
- 12051:12051
networks:
- byfn
peer1.marketing.workspace:
container_name: peer1.marketing.workspace
extends:
file: base.yaml
service: peer-base
environment:
- CORE_PEER_ID=peer1.marketing.workspace
- CORE_PEER_ADDRESS=peer1.marketing.workspace:13051
- CORE_PEER_LISTENADDRESS=0.0.0.0:13051
- CORE_PEER_CHAINCODEADDRESS=peer1.marketing.workspace:13052
- CORE_PEER_CHAINCODELISTENADDRESS=0.0.0.0:13052
- CORE_PEER_GOSSIP_BOOTSTRAP=peer2.marketing.workspace:14051
- CORE_PEER_GOSSIP_EXTERNALENDPOINT=peer1.marketing.workspace:13051
- CORE_PEER_LOCALMSPID=Org4MSP
volumes:
- /var/run/:/host/var/run/
- ./crypto-config/peerOrganizations/marketing.workspace/peers/peer1.marketing.workspace/msp:/etc/hyperledger/fabric/msp
- ./crypto-config/peerOrganizations/marketing.workspace/peers/peer1.marketing.workspace/tls:/etc/hyperledger/fabric/tls
- peer1.marketing.workspace:/var/hyperledger/production
ports:
- 13051:13051
networks:
- byfn
peer2.marketing.workspace:
container_name: peer2.marketing.workspace
extends:
file: base.yaml
service: peer-base
environment:
- CORE_PEER_ID=peer2.marketing.workspace
- CORE_PEER_ADDRESS=peer2.marketing.workspace:14051
- CORE_PEER_LISTENADDRESS=0.0.0.0:14051
- CORE_PEER_CHAINCODEADDRESS=peer2.marketing.workspace:14052
- CORE_PEER_CHAINCODELISTENADDRESS=0.0.0.0:14052
- CORE_PEER_GOSSIP_BOOTSTRAP=peer1.marketing.workspace:13051
- CORE_PEER_GOSSIP_EXTERNALENDPOINT=peer2.marketing.workspace:14051
- CORE_PEER_LOCALMSPID=Org4MSP
volumes:
- /var/run/:/host/var/run/
- ./crypto-config/peerOrganizations/marketing.workspace/peers/peer2.marketing.workspace/msp:/etc/hyperledger/fabric/msp
- ./crypto-config/peerOrganizations/marketing.workspace/peers/peer2.marketing.workspace/tls:/etc/hyperledger/fabric/tls
- peer2.marketing.workspace:/var/hyperledger/production
ports:
- 14051:14051
networks:
- byfn
cli:
container_name: cli
image: hyperledger/fabric-tools:$IMAGE_TAG
tty: true
stdin_open: true
environment:
- SYS_CHANNEL=$SYS_CHANNEL
- GOPATH=/opt/gopath
- CORE_VM_ENDPOINT=unix:///host/var/run/docker.sock
#- FABRIC_LOGGING_SPEC=DEBUG
- FABRIC_LOGGING_SPEC=INFO
- CORE_PEER_ID=cli
- CORE_PEER_ADDRESS=peer1.developers.workspace:7051
- CORE_PEER_LOCALMSPID=Org1MSP
- CORE_PEER_TLS_ENABLED=true
- CORE_PEER_TLS_CERT_FILE=/opt/gopath/src/github.com/hyperledger/fabric/peer/crypto/peerOrganizations/developers.workspace/peers/peer1.developers.workspace/tls/server.crt
- CORE_PEER_TLS_KEY_FILE=/opt/gopath/src/github.com/hyperledger/fabric/peer/crypto/peerOrganizations/developers.workspace/peers/peer1.developers.workspace/tls/server.key
- CORE_PEER_TLS_ROOTCERT_FILE=/opt/gopath/src/github.com/hyperledger/fabric/peer/crypto/peerOrganizations/developers.workspace/peers/peer1.developers.workspace/tls/ca.crt
- CORE_PEER_MSPCONFIGPATH=/opt/gopath/src/github.com/hyperledger/fabric/peer/crypto/peerOrganizations/developers.workspace/users/Admin@developers.workspace/msp
working_dir: /opt/gopath/src/github.com/hyperledger/fabric/peer
command: /bin/bash
volumes:
- /var/run/:/host/var/run/
- ./../chaincode/:/opt/gopath/src/github.com/chaincode
- ./crypto-config:/opt/gopath/src/github.com/hyperledger/fabric/peer/crypto/
- ./scripts:/opt/gopath/src/github.com/hyperledger/fabric/peer/scripts/
- ./myscripts:/opt/gopath/src/github.com/hyperledger/fabric/peer/myscripts/
- ./channel-artifacts:/opt/gopath/src/github.com/hyperledger/fabric/peer/channel-artifacts
depends_on:
- orderer1.workspace
- orderer2.workspace
- orderer3.workspace
- orderer4.workspace
- orderer5.workspace
- peer1.developers.workspace
- peer2.developers.workspace
- peer1.accounts.workspace
- peer2.accounts.workspace
- peer1.hr.workspace
- peer2.hr.workspace
- peer1.marketing.workspace
- peer2.marketing.workspace
networks:
- byfn
# ca1:
# extends:
# file: base.yaml
# service: ca-base
# environment:
# - FABRIC_CA_SERVER_CA_NAME=ca-developers
# - FABRIC_CA_SERVER_TLS_CERTFILE=/etc/hyperledger/fabric-ca-server-config/ca.developers.workspace-cert.pem
# - FABRIC_CA_SERVER_TLS_KEYFILE=/etc/hyperledger/fabric-ca-server-config/priv_sk
# - FABRIC_CA_SERVER_PORT=7054
# ports:
# - "7054:7054"
# command: sh -c 'fabric-ca-server start --ca.certfile /etc/hyperledger/fabric-ca-server-config/ca.developers.workspace-cert.pem --ca.keyfile /etc/hyperledger/fabric-ca-server-config/a75d012725ef434f86c9d31f02e748922f9d81f0dfcbb9e4890f1dfbd69a0424_sk -b admin:adminpw -d'
# volumes:
# - ./crypto-config/peerOrganizations/developers.workspace/ca/:/etc/hyperledger/fabric-ca-server-config
# container_name: ca_developers
# ca2:
# extends:
# file: base.yaml
# service: ca-base
# environment:
# - FABRIC_CA_SERVER_CA_NAME=ca-accounts
# - FABRIC_CA_SERVER_TLS_CERTFILE=/etc/hyperledger/fabric-ca-server-config/ca.accounts.workspace-cert.pem
# - FABRIC_CA_SERVER_TLS_KEYFILE=/etc/hyperledger/fabric-ca-server-config/priv_sk
# - FABRIC_CA_SERVER_PORT=9054
# ports:
# - "9054:9054"
# command: sh -c 'fabric-ca-server start --ca.certfile /etc/hyperledger/fabric-ca-server-config/ca.accounts.workspace-cert.pem --ca.keyfile /etc/hyperledger/fabric-ca-server-config/caf48a1b5c6e3d0afa0ef05ff9e42dd890f7a64299d7a7a1cd0da301ffc65263_sk -b admin:adminpw -d'
# volumes:
# - ./crypto-config/peerOrganizations/accounts.workspace/ca/:/etc/hyperledger/fabric-ca-server-config
# container_name: ca_finance
# ca3:
# extends:
# file: base.yaml
# service: ca-base
# environment:
# - FABRIC_CA_SERVER_CA_NAME=ca-hr
# - FABRIC_CA_SERVER_TLS_CERTFILE=/etc/hyperledger/fabric-ca-server-config/ca.hr.workspace-cert.pem
# - FABRIC_CA_SERVER_TLS_KEYFILE=/etc/hyperledger/fabric-ca-server-config/priv_sk
# - FABRIC_CA_SERVER_PORT=11054
# ports:
# - "11054:11054"
# command: sh -c 'fabric-ca-server start --ca.certfile /etc/hyperledger/fabric-ca-server-config/ca.hr.workspace-cert.pem --ca.keyfile /etc/hyperledger/fabric-ca-server-config/973b6fbc8397b467ec76dc32ad61104ac77034a1e2de1b98dedbb787c0540def_sk -b admin:adminpw -d'
# volumes:
# - ./crypto-config/peerOrganizations/hr.workspace/ca/:/etc/hyperledger/fabric-ca-server-config
# container_name: ca_hr
# ca4:
# extends:
# file: base.yaml
# service: ca-base
# environment:
# - FABRIC_CA_SERVER_CA_NAME=ca-marketing
# - FABRIC_CA_SERVER_TLS_CERTFILE=/etc/hyperledger/fabric-ca-server-config/ca.marketing.workspace-cert.pem
# - FABRIC_CA_SERVER_TLS_KEYFILE=/etc/hyperledger/fabric-ca-server-config/priv_sk
# - FABRIC_CA_SERVER_PORT=13054
# ports:
# - "13054:13054"
# command: sh -c 'fabric-ca-server start --ca.certfile /etc/hyperledger/fabric-ca-server-config/ca.marketing.workspace-cert.pem --ca.keyfile /etc/hyperledger/fabric-ca-server-config/82b1b4f6bf80f8b948f8aac606ae7a46f8605b221e309b8cc3edea307020e56d_sk -b admin:adminpw -d'
# volumes:
# - ./crypto-config/peerOrganizations/marketing.workspace/ca/:/etc/hyperledger/fabric-ca-server-config
# container_name: ca_marketing
# networks:
# - byfn
I tried to start the network using the command below:
docker-compose -f docker-compose.yaml up -d
This is the error message I got:
WARNING: Found orphan containers (ca_cbaccounts, peer2.apple.workspace, ca_apple, peer1.apple.workspace, peer1.citizenbank.workspace, peer2.fiserv.workspace, ca_finance, peer1.cbaccounts.workspace, peer2.cbaccounts.workspace, peer2.citizenbank.workspace, ca_citizenbank, peer1.fiserv.workspace) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up.
Starting peer2.accounts.workspace ...
Starting peer1.accounts.workspace ...
Starting orderer1.workspace ...
orderer3.workspace is up-to-date
Starting peer2.hr.workspace ...
Starting peer1.marketing.workspace ...
Starting peer2.marketing.workspace ...
orderer2.workspace is up-to-date
Starting peer2.developers.workspace ...
Starting peer1.hr.workspace ...
Starting peer1.developers.workspace ...
orderer4.workspace is up-to-date
orderer5.workspace is up-to-date
ERROR: for peer2.accounts.workspace a bytes-like object is required, not 'str'
ERROR: for peer1.accounts.workspace a bytes-like object is required, not 'str'
ERROR: for peer1.marketing.workspace a bytes-like object is required, not 'str'
ERROR: for peer2.hr.workspace a bytes-like object is required, not 'str'
Starting orderer1.workspace ... done
ERROR: for peer1.developers.workspace a bytes-like object is required, not 'str'
ERROR: for peer2.marketing.workspace a bytes-like object is required, not 'str'
ERROR: for peer2.developers.workspace a bytes-like object is required, not 'str'
ERROR: for peer1.hr.workspace a bytes-like object is required, not 'str'
ERROR: for peer2.accounts.workspace a bytes-like object is required, not 'str'
ERROR: for peer1.accounts.workspace a bytes-like object is required, not 'str'
ERROR: for peer1.marketing.workspace a bytes-like object is required, not 'str'
ERROR: for peer2.hr.workspace a bytes-like object is required, not 'str'
ERROR: for peer1.developers.workspace a bytes-like object is required, not 'str'
ERROR: for peer2.marketing.workspace a bytes-like object is required, not 'str'
ERROR: for peer2.developers.workspace a bytes-like object is required, not 'str'
ERROR: for peer1.hr.workspace a bytes-like object is required, not 'str'
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/docker/api/client.py", line 261, in _raise_for_status
response.raise_for_status()
File "/usr/lib/python3/dist-packages/requests/models.py", line 940, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.22/containers/f3081941fb6114633593ec5d8b4f258ecebe5829007027295c30a284ed7a3c8c/start
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/compose/service.py", line 625, in start_container
container.start()
File "/usr/lib/python3/dist-packages/compose/container.py", line 241, in start
return self.client.start(self.id, **options)
File "/usr/lib/python3/dist-packages/docker/utils/decorators.py", line 19, in wrapped
return f(self, resource_id, *args, **kwargs)
File "/usr/lib/python3/dist-packages/docker/api/container.py", line 1095, in start
self._raise_for_status(res)
File "/usr/lib/python3/dist-packages/docker/api/client.py", line 263, in _raise_for_status
raise create_api_error_from_http_exception(e)
File "/usr/lib/python3/dist-packages/docker/errors.py", line 31, in create_api_error_from_http_exception
raise cls(e, response=response, explanation=explanation)
docker.errors.APIError: 500 Server Error: Internal Server Error ("b'driver failed programming external connectivity on endpoint peer1.hr.workspace (fcfb797b4e0e47c5d3611d652e56f06b9d5f4cdaad74b0663729c3773c39030a): Bind for 0.0.0.0:11051 failed: port is already allocated'")
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/bin/docker-compose", line 11, in <module>
load_entry_point('docker-compose==1.25.0', 'console_scripts', 'docker-compose')()
File "/usr/lib/python3/dist-packages/compose/cli/main.py", line 72, in main
command()
File "/usr/lib/python3/dist-packages/compose/cli/main.py", line 128, in perform_command
handler(command, command_options)
File "/usr/lib/python3/dist-packages/compose/cli/main.py", line 1107, in up
to_attach = up(False)
File "/usr/lib/python3/dist-packages/compose/cli/main.py", line 1088, in up
return self.project.up(
File "/usr/lib/python3/dist-packages/compose/project.py", line 565, in up
results, errors = parallel.parallel_execute(
File "/usr/lib/python3/dist-packages/compose/parallel.py", line 112, in parallel_execute
raise error_to_reraise
File "/usr/lib/python3/dist-packages/compose/parallel.py", line 210, in producer
result = func(obj)
File "/usr/lib/python3/dist-packages/compose/project.py", line 548, in do
return service.execute_convergence_plan(
File "/usr/lib/python3/dist-packages/compose/service.py", line 567, in execute_convergence_plan
return self._execute_convergence_start(
File "/usr/lib/python3/dist-packages/compose/service.py", line 506, in _execute_convergence_start
_, errors = parallel_execute(
File "/usr/lib/python3/dist-packages/compose/parallel.py", line 112, in parallel_execute
raise error_to_reraise
File "/usr/lib/python3/dist-packages/compose/parallel.py", line 210, in producer
result = func(obj)
File "/usr/lib/python3/dist-packages/compose/service.py", line 508, in <lambda>
lambda c: self.start_container_if_stopped(c, attach_logs=not detached, quiet=True),
File "/usr/lib/python3/dist-packages/compose/service.py", line 620, in start_container_if_stopped
return self.start_container(container)
File "/usr/lib/python3/dist-packages/compose/service.py", line 627, in start_container
if "driver failed programming external connectivity" in ex.explanation:
TypeError: a bytes-like object is required, not 'str'
using:
Python 3.8.10
docker:
Client: Docker Engine - Community
Version: 20.10.20
API version: 1.41
Go version: go1.18.7
Git commit: 9fdeb9c
Built: Tue Oct 18 18:20:23 2022
OS/Arch: linux/amd64
Context: default
Experimental: true
Server: Docker Engine - Community
Engine:
Version: 20.10.20
API version: 1.41 (minimum version 1.12)
Go version: go1.18.7
Git commit: 03df974
Built: Tue Oct 18 18:18:12 2022
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: 1.6.8
GitCommit: 9cd3357b7fd7218e4aec3eae239db1f68a5a6ec6
runc:
Version: 1.1.4
GitCommit: v1.1.4-0-g5fd4c4d
docker-init:
Version: 0.19.0
GitCommit: de40ad0
This doesn't seem to be an issue with Fabric itself.
I removed all the Docker containers and ran the docker-compose -f docker-compose.yaml up -d command again, and it worked. I was probably hitting the issue because of the orphaned containers.
Let me know your views in the comments. Thanks.
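A sketch of that cleanup as commands, under the assumption that the orphans (and the stale container holding port 11051, visible in the "Bind for 0.0.0.0:11051 failed: port is already allocated" line buried in the traceback) are what blocked the start:

```shell
# Tear down the project and the orphaned containers compose warned about.
docker-compose -f docker-compose.yaml down --remove-orphans

# Optional: see which container, if any, still holds the conflicting port.
docker ps --filter "publish=11051"

# Bring the network back up cleanly.
docker-compose -f docker-compose.yaml up -d
```

Note the `TypeError: a bytes-like object is required, not 'str'` lines are a known docker-compose 1.25 bug in its error-reporting path; the port conflict underneath is the real failure.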
istio-1.12.2
istioctl install --set profile=demo -y
I'm not sure why the pod is still Pending; the scheduler says it didn't match the Pod's node affinity.
SecretName: istio-ingressgateway-ca-certs
Optional: true
kube-api-access-sfpw2:
Type: Projected (a volume that contains injected data from multiple sources)
TokenExpirationSeconds: 3607
ConfigMapName: kube-root-ca.crt
ConfigMapOptional: <nil>
DownwardAPI: true
QoS Class: Burstable
Node-Selectors: <none>
Tolerations: node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Warning FailedScheduling 55s (x4 over 4m31s) default-scheduler 0/1 nodes are available: 1 node(s) didn't match Pod's node affinity/selector.
From the pod, here is the affinity:
spec:
affinity:
nodeAffinity:
preferredDuringSchedulingIgnoredDuringExecution:
- preference:
matchExpressions:
- key: kubernetes.io/arch
operator: In
values:
- amd64
weight: 2
- preference:
matchExpressions:
- key: kubernetes.io/arch
operator: In
values:
- ppc64le
weight: 2
- preference:
matchExpressions:
- key: kubernetes.io/arch
operator: In
values:
- s390x
weight: 2
requiredDuringSchedulingIgnoredDuringExecution:
nodeSelectorTerms:
- matchExpressions:
- key: kubernetes.io/arch
operator: In
values:
- amd64
- ppc64le
- s390x
When I check the node (single node):
NAME STATUS ROLES AGE VERSION LABELS
minikube Ready control-plane,master 11m v1.23.3 beta.kubernetes.io/arch=arm64,beta.kubernetes.io/os=linux,kubernetes.io/arch=arm64,kubernetes.io/hostname=minikube,kubernetes.io/os=linux,minikube.k8s.io/commit=362d5fdc0a3dbee389b3d3f1034e8023e72bd3a7,minikube.k8s.io/name=minikube,minikube.k8s.io/primary=true,minikube.k8s.io/updated_at=2022_05_01T10_16_33_0700,minikube.k8s.io/version=v1.25.2,node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.kubernetes.io/exclude-from-external-load-balancers=
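The required affinity terms above only accept amd64, ppc64le, and s390x, while the single minikube node is labelled kubernetes.io/arch=arm64, so no node can ever match. A hedged sketch of that comparison (the kubectl line assumes the node name `minikube` from the output above):

```shell
# Read the node's architecture label, then check it against the
# architectures istio's requiredDuringScheduling term allows.
arch=$(kubectl get node minikube \
  -o jsonpath='{.metadata.labels.kubernetes\.io/arch}')
case "$arch" in
  amd64|ppc64le|s390x) echo "node matches istio's required affinity" ;;
  *) echo "node arch '$arch' is not in istio's required affinity list" ;;
esac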
As of June 2022, there is no support for Istio on the macOS M1 chip. Istio init containers fail initialization because of iptables:
Command error output: xtables parameter problem: iptables-restore: unable to initialize table 'nat'.
The temporary fix for this is to use the Istio operator with the hub pointing to RESF's repo.
After you have installed/downloaded the istioctl command line, run the following commands.
Set the hub for the Istio operator:
istioctl operator init --hub=ghcr.io/resf/istio
Create a namespace for the Istio installation:
kubectl create ns istio-system
Install Istio:
kubectl apply -f - <<EOF
apiVersion: install.istio.io/v1alpha1
kind: IstioOperator
metadata:
namespace: istio-system
name: example-istiocontrolplane
spec:
hub: ghcr.io/resf/istio
profile: demo
EOF
This temporary fix is provided by RESF until there is an official one; all credit goes to them.
I found this workaround in Apple Silicon/M1 builds for Istioctl #29596.
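After applying the IstioOperator resource above, a quick way to confirm the control plane actually came up (rather than sticking in Init or Pending as before):

```shell
# Pods in istio-system should reach Running once the arm64-capable
# images from the RESF hub are pulled.
kubectl get pods -n istio-system

# If anything is still Pending, the scheduler's reason shows up here.
kubectl describe pods -n istio-system | grep -A5 'Events:'
```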
I'm following https://github.com/WowzaMediaSystems/wse-example-pushpublish-hls in order to inject an HLS stream from a Wowza server into an AWS MediaPackage channel.
My PushPublishProfilesCustom.xml
<?xml version="1.0" encoding="UTF-8"?>
<Root>
<PushPublishProfiles>
<PushPublishProfile>
<Name>cupertino-file</Name>
<Protocol>HTTP</Protocol>
<BaseClass>com.mycompany.wms.example.pushpublish.protocol.cupertino.PushPublishHTTPCupertinoFileHandler</BaseClass>
<Implementation>
<Name>Cupertino File</Name>
</Implementation>
<HTTPConfiguration>
</HTTPConfiguration>
<Properties>
</Properties>
</PushPublishProfile>
<PushPublishProfile>
<Name>cupertino-http</Name>
<Protocol>HTTP</Protocol>
<BaseClass>com.mycompany.wms.example.pushpublish.protocol.cupertino.PushPublishHTTPCupertinoHTTPHandler</BaseClass>
<Implementation>
<Name>Cupertino HTTP</Name>
</Implementation>
<HTTPConfiguration>
</HTTPConfiguration>
<Properties>
</Properties>
</PushPublishProfile>
</PushPublishProfiles>
</Root>
My #APP_NAME#/PushPublishMap.txt (I'm adding line breaks to make it easier to read):
MediaPackage={
"entryName":"MediaPackage",
"profile":"cupertino-http",
"streamName":"MediaPackageStream",
"destinationName":"MediaPackage0",
"host":"xxxx.mediapackage.eu-west-1.amazonaws.com/in/v2/xxxx/xxxx/channel",
"port":"443",
"sendSSL":"true",
"username":"xxxx,
"password":"xxxx",
"http.path":"hls"
}
When I send data to my Wowza server ( rtsp://X.X.X.X:1935/#APP_NAME#/MediaPackage ) I start to see logs like this:
WARN server comment 2020-06-02 09:23:49 - - - - - 4325.922 - - - - - - - - PushPublishHTTPCupertinoHTTPHandler.outputSend([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) Found 79 segments to send
WARN server comment 2020-06-02 09:23:49 - - - - - 4325.922 - - - - - - - - PushPublishHTTPCupertinoHTTPHandler.outputSend([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) Found 76 segments to delete
ERROR server comment 2020-06-02 09:23:49 - - - - - 4325.934 - - - - - - - - PushPublishHTTPCupertinoHTTPHandler.outputSend([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) Send media segment. rendition: AUDIOVIDEO chunkId:77 uri:pdmekxw9/media_77.aac result:FAILURE
So HLS Push Publishing is sending chunks, but without success.
I have read https://www.wowza.com/docs/how-to-configure-apple-hls-packetization-cupertinostreaming but I don't know which values I should change.
What am I doing wrong? Any ideas?
EDIT: More logs
2020-06-02 14:32:39 UTC comment server INFO 200 - PushPublishHTTPCupertinoHTTPHandler.createOutputItem([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) chunkCount:10, chunkStartIndex:201, lastChunkIndex:209 - - -22856.082 - - - - - - - - - - - - - - - - - - - - - - - - -
2020-06-02 14:32:39 UTC comment server INFO 200 - PushPublishHTTPCupertinoHTTPHandler.createOutputItem([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) playlistChunkCount:3, playlistChunkStartIndex:208 - - - 22856.082 - - - - - - - - - - - - - - - - - - - - - - - - -
2020-06-02 14:32:39 UTC comment server INFO 200 - PushPublishHTTPCupertinoHTTPHandler.createOutputItem([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) New chunk: chunkRendition:AUDIOVIDEO, chunkId:210, chunkIndex:2 - -- 22856.082 - - - - - - - - - - - - - - - - - - - - - - - - -
2020-06-02 14:32:39 UTC comment server INFO 200 - PushPublishHTTPCupertinoHTTPHandler.createOutputItem([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) Marking MediaSegmentModel: pcnod08j/media_207.aac for deletion - -- 22856.083 - - - - - - - - - - - - - - - - - - - - - - - - -
2020-06-02 14:32:39 UTC comment server WARN 200 - PushPublishHTTPCupertinoHTTPHandler.outputSend([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) Found 32 segments to send - - - 22856.083 - - -- - - - - - - - - - - - - - - - - - - - - -
2020-06-02 14:32:39 UTC comment server WARN 200 - PushPublishHTTPCupertinoHTTPHandler.outputSend([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) Found 29 segments to delete - - - 22856.083 - -- - - - - - - - - - - - - - - - - - - - - - -
2020-06-02 14:32:39 UTC comment server ERROR 500 - PushPublishHTTPCupertinoHTTPHandler.outputSend([MediaPackage] TV/_definst_/MediaPackage->MediaPackageStream) Send media segment. rendition: AUDIOVIDEO chunkId:208 uri:pcnod08j/media_208.aac result:FAILURE - - - 22856.097
I am configuring an Ember project on my local PC.
I have installed all the required packages, but when I try to build the project it gives me an error.
After some research I found a suggested solution: uninstall ember-browserify and use ember-auto-import instead.
I tried uninstalling ember-browserify. That gets me a successful build, but ember-pusher requires ember-browserify, so my app doesn't load properly.
I then reinstalled ember-browserify, but so far I have had no luck fixing the issue.
My package.json file:
{
  "name": "compliance",
  "version": "2.36.0",
  "private": true,
  "description": "Ember application for COMpliance API",
  "repository": "",
  "license": "",
  "author": "",
  "directories": {
    "doc": "doc",
    "test": "tests"
  },
  "scripts": {
    "build": "ember build",
    "lint:hbs": "ember-template-lint .",
    "lint:js": "eslint .",
    "start": "ember serve",
    "test": "ember test"
  },
  "devDependencies": {
    "@ember/jquery": "^0.5.2",
    "@ember/optional-features": "^0.6.4",
    "aws-sdk": "^2.676.0",
    "ember-ajax": "^3.1.0",
    "ember-bootstrap-datetimepicker": "^1.1.0",
    "ember-cli-app-version": "^3.2.0",
    "ember-cli-babel": "^6.18.0",
    "ember-cli-chart": "^3.6.0",
    "ember-cli-dependency-checker": "^3.2.0",
    "ember-cli-deploy": "^1.0.2",
    "ember-cli-deploy-build": "^1.1.2",
    "ember-cli-deploy-cloudfront": "~1.1.0",
    "ember-cli-deploy-display-revisions": "^1.0.1",
    "ember-cli-deploy-gzip": "^1.0.1",
    "ember-cli-deploy-manifest": "~1.1.0",
    "ember-cli-deploy-revision-data": "~1.0.0",
    "ember-cli-deploy-s3": "~1.1.0",
    "ember-cli-deploy-s3-index": "^1.0.1",
    "ember-cli-deploy-slack": "^1.0.1",
    "ember-cli-dotenv": "1.2.0",
    "ember-cli-dropzonejs": "^1.3.6",
    "ember-cli-eslint": "^4.2.3",
    "ember-cli-flash": "^1.8.1",
    "ember-cli-htmlbars": "^3.1.0",
    "ember-cli-htmlbars-inline-precompile": "^1.0.3",
    "ember-cli-inject-live-reload": "^1.8.2",
    "ember-cli-lightbox": "1.0.2",
    "ember-cli-mentionable": "0.0.10",
    "ember-cli-moment-shim": "^1.1.0",
    "ember-cli-qunit": "^4.3.2",
    "ember-cli-sass": "^7.2.0",
    "ember-cli-sentry": "^2.4.4",
    "ember-cli-shims": "^1.2.0",
    "ember-cli-sri": "^2.1.1",
    "ember-cli-string-helpers": "^1.10.0",
    "ember-cli-template-lint": "^1.0.0",
    "ember-cli-uglify": "^2.1.0",
    "ember-cli-windows-addon": "^1.3.1",
    "ember-data": "^3.18.0",
    "ember-drag-drop": "^0.5.1",
    "ember-export-application-global": "^2.0.1",
    "ember-highcharts": "^1.2.0",
    "ember-infinity": "^1.4.9",
    "ember-initials": "^3.13.0",
    "ember-jquery-legacy": "^1.0.0",
    "ember-load-initializers": "^1.1.0",
    "ember-maybe-import-regenerator": "^0.1.6",
    "ember-moment": "6.1.0",
    "ember-radio-button": "1.2.0",
    "ember-resolver": "^5.3.0",
    "ember-responsive-tabs": "^1.0.7",
    "ember-side-menu": "^0.1.0",
    "ember-simple-auth": "^1.9.2",
    "ember-source": "^3.18.1",
    "ember-tag-input": "^1.2.2",
    "ember-toggle": "^5.3.3",
    "ember-tooltips": "^3.4.2",
    "ember-welcome-page": "^3.2.0",
    "ember-wormhole": "^0.5.5",
    "eonasdan-bootstrap-datetimepicker": "^4.17.47",
    "eslint-plugin-ember": "^5.4.0",
    "highcharts": "^6.2.0",
    "loader.js": "^4.7.0",
    "moment": "^2.25.3",
    "moment-timezone": "0.5.2",
    "pdfjs-dist": "^2.3.200",
    "pusher-js": "^3.1.0",
    "qunit-dom": "^0.8.5"
  },
  "engines": {
    "node": "6.* || 8.* || >= 10.*"
  },
  "dependencies": {
    "broccoli-asset-rev": "^3.0.0",
    "ember-browserify": "^1.2.2",
    "ember-cli": "^3.18.0",
    "ember-progress-bar": "^1.0.0",
    "ember-pusher": "^1.1.1",
    "jam-icons": "^2.0.0",
    "jquery-csv": "^1.0.11",
    "liquid-fire": "^0.31.0",
    "node-sass": "^4.14.1"
  }
}
Error log:
=================================================================================
ENV Summary:
TIME: Fri May 15 2020 18:37:52 GMT+0500 (Pakistan Standard Time)
TITLE: ember
ARGV:
- C:\Program Files\nodejs\node.exe
- C:\Users\upervaiz\AppData\Roaming\npm\node_modules\ember-cli\bin\ember
- s
EXEC_PATH: C:\Program Files\nodejs\node.exe
TMPDIR: C:\Users\upervaiz\AppData\Local\Temp
SHELL: null
PATH:
- C
- \ProgramData\DockerDesktop\version-bin;C
- \Program Files\Docker\Docker\Resources\bin;C
- \Python37\Scripts\;C
- \Python37\;C
- \Program Files\Python37\Scripts\;C
- \Program Files\Python37\;C
- \Program Files (x86)\Common Files\Oracle\Java\javapath;C
- \Windows\system32;C
- \Windows;C
- \Windows\System32\Wbem;C
- \Windows\System32\WindowsPowerShell\v1.0\;C
- \Windows\System32\OpenSSH\;C
- \Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\;C
- \Program Files\Microsoft SQL Server\130\Tools\Binn\;C
- \Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\;C
- \Program Files\Microsoft SQL Server\130\DTS\Binn\;C
- \Program Files\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\;C
- \Program Files (x86)\Microsoft SQL Server\Client SDK\ODBC\130\Tools\Binn\;C
- \Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\ManagementStudio\;C
- \Program Files\Microsoft SQL Server\120\Tools\Binn\;C
- \Python27\;C
- \Python27\Scripts;C
- \Program Files\PuTTY\;C
- \Program Files\Java\jdk1.6.0_45\bin;C
- \Program Files\Git;"C
- \Windows;C
- \Windows\System32;C
- \Python27";C
- \Program Files\helm\windows-amd64;C
- \Program Files\IBM\Cloud\bin;C
- \xampp\php;C
- \ProgramData\ComposerSetup\bin;C
- \WINDOWS\system32;C
- \WINDOWS;C
- \WINDOWS\System32\Wbem;C
- \WINDOWS\System32\WindowsPowerShell\v1.0\;C
- \WINDOWS\System32\OpenSSH\;C
- \Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin;C
- \AIRSDK\bin;C
- \Program Files (x86)\gnupg\bin;C
- \Program Files\nodejs\;C
- \Program Files\Git\cmd;C
- \Program Files\Git\bin;C
- \Users\upervaiz\AppData\Roaming\npm;C
- \Users\upervaiz\AppData\Local\Microsoft\WindowsApps;C
- \Program Files\Java\jdk1.6.0_45\jre\bin\server;C
- \Users\upervaiz\AppData\Local\atom\bin;C
- \Program Files (x86)\Nmap;C
- \Users\upervaiz\AppData\Roaming\Composer\vendor\bin;C
- \Users\upervaiz\AppData\Local\GitHubDesktop\bin;C
- \Users\upervaiz\AppData\Local\Programs\Microsoft VS Code\bin;C
- \Users\upervaiz\AppData\Local\Microsoft\WindowsApps;C
- \Users\upervaiz\AppData\Roaming\npm
PLATFORM: win32 x64
FREEMEM: 3896344576
TOTALMEM: 17054658560
UPTIME: 12891
LOADAVG: 0,0,0
CPUS:
- Intel(R) Core(TM) i5-6300U CPU @ 2.40GHz - 2496
- Intel(R) Core(TM) i5-6300U CPU @ 2.40GHz - 2496
- Intel(R) Core(TM) i5-6300U CPU @ 2.40GHz - 2496
- Intel(R) Core(TM) i5-6300U CPU @ 2.40GHz - 2496
ENDIANNESS: LE
VERSIONS:
- ares: 1.16.0
- brotli: 1.0.7
- cldr: 36.0
- http_parser: 2.9.3
- icu: 65.1
- llhttp: 2.0.4
- modules: 72
- napi: 5
- nghttp2: 1.40.0
- node: 12.16.3
- openssl: 1.1.1g
- tz: 2019c
- unicode: 12.1
- uv: 1.34.2
- v8: 7.8.279.23-node.35
- zlib: 1.2.11
ERROR Summary:
- broccoliBuilderErrorStack: [undefined]
- code: [undefined]
- codeFrame: [undefined]
- errorMessage: WatcherAdapter's first argument must be an array of SourceNodeWrapper nodes
- errorType: [undefined]
- location:
- column: [undefined]
- file: [undefined]
- line: [undefined]
- message: WatcherAdapter's first argument must be an array of SourceNodeWrapper nodes
- name: TypeError
- nodeAnnotation: [undefined]
- nodeName: [undefined]
- originalErrorMessage: [undefined]
- stack: TypeError: WatcherAdapter's first argument must be an array of SourceNodeWrapper nodes
at new WatcherAdapter (D:\Projects\Neb\Front\compliance-app\node_modules\broccoli\dist\watcher_adapter.js:18:19)
at new Watcher (D:\Projects\Neb\Front\compliance-app\node_modules\broccoli\dist\watcher.js:23:44)
at Watcher.constructBroccoliWatcher (D:\Projects\Neb\Front\compliance-app\node_modules\ember-cli\lib\models\watcher.js:45:19)
at new Watcher (D:\Projects\Neb\Front\compliance-app\node_modules\ember-cli\lib\models\watcher.js:27:41)
at ServeTask.run (D:\Projects\Neb\Front\compliance-app\node_modules\ember-cli\lib\tasks\serve.js:58:7)
at D:\Projects\Neb\Front\compliance-app\node_modules\ember-cli\lib\models\command.js:238:24
at processTicksAndRejections (internal/process/task_queues.js:97:5)
Change the line
"ember-pusher": "^1.1.1",
to
"ember-pusher": "truecoach/ember-pusher#2.0.1"
and remove ember-browserify and pusher-js from your package.json.
This points ember-pusher at a GitHub fork (not published to npm) that uses ember-auto-import and therefore no longer requires ember-browserify.
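For reference, here is a sketch of the resulting dependencies section after that change (ember-browserify removed; pusher-js is likewise dropped from devDependencies). npm treats the user/repo#ref value as a GitHub dependency and fetches it directly from github.com rather than the npm registry:

```json
"dependencies": {
  "broccoli-asset-rev": "^3.0.0",
  "ember-cli": "^3.18.0",
  "ember-progress-bar": "^1.0.0",
  "ember-pusher": "truecoach/ember-pusher#2.0.1",
  "jam-icons": "^2.0.0",
  "jquery-csv": "^1.0.11",
  "liquid-fire": "^0.31.0",
  "node-sass": "^4.14.1"
}
```

After editing, delete node_modules and run npm install again so the lockfile picks up the GitHub source.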
FAILED
Error reading manifest file:
yaml: line 3: did not find expected key
---
applications:
- name: Ragu-psdonationwebservice
  memory: 128MB
  instrances: 1
  command: node ./bin/www
  services:
  - mymongo
You have a typo: instrances should be instances. Try this:
---
applications:
- name: Ragu-psdonationwebservice
  memory: 128MB
  instances: 1
  command: node ./bin/www
  services:
  - mymongo
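A quick way to catch manifest problems before pushing is to parse the file locally. A minimal sketch in Python using PyYAML (a third-party package, assumed installed via pip install pyyaml); note that a misspelled key like instrances is still valid YAML, so checking for the keys you expect is what actually catches the typo:

```python
import yaml

# The corrected manifest from the answer above, inlined for the example.
manifest = """\
---
applications:
- name: Ragu-psdonationwebservice
  memory: 128MB
  instances: 1
  command: node ./bin/www
  services:
  - mymongo
"""

# safe_load raises yaml.YAMLError on structural problems
# (bad indentation, "did not find expected key", etc.).
doc = yaml.safe_load(manifest)
app = doc["applications"][0]

# Spelling mistakes parse fine, so assert on the expected keys explicitly.
assert "instances" in app, "typo? expected an 'instances' key"
print(app["name"], app["instances"])
```

In a real check you would read the file with open("manifest.yml") instead of the inline string.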