I have been making lots of changes to Superset's appearance source code (variables.less => static/assets/variables.less) and then ran superset init, but nothing changed. Do we have to do something else to apply those changes?
Running superset init just creates default roles and permissions, so it's a backend task! When you want to customize Superset's appearance you should run the backend and the frontend separately, as per the official instructions.
Then, in a terminal, you should have one tab for the backend and another for the frontend. When changing the frontend's code you'll have to run npm install inside the superset-frontend directory once, and then start it with npm run dev-server.
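A minimal sketch of that two-terminal workflow, assuming a standard local dev install (the ports are the defaults used in the Superset docs):
# terminal 1: backend (Flask API on port 8088)
$ FLASK_ENV=development superset run -p 8088 --with-threads --reload --debugger
# terminal 2: frontend (webpack dev server, proxying API calls to the backend)
$ cd superset-frontend
$ npm install
$ npm run dev-server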
All these steps are documented in the link I provided though. I recommend re-reading the documentation.
I believe you need to rebuild the assets:
cd into the /assets directory; then
$ npm install
$ npm install webpack
$ npm run prod
If this doesn't work, try https://github.com/apache/incubator-superset/blob/master/CONTRIBUTING.md#frontend-assets
I am trying to get this app to work on my laptop. I am using npm start to start the application, but it gives me errors. How do I get it to start? This is the code I am using; it is a clone of an app on GitHub.
https://github.com/adrianhajdin/project_medical_pager_chat.git
From your comment, you're not running npm install in the right directory.
There is a client and a server folder. You need to npm install in both, and run both.
Download the ZIP file and extract
Go into the server folder and run npm install && npm start
Go into the client folder and run npm install && npm start
This will cause http://localhost:3000/ to open in your default browser; that is the client application, which apparently connects to the server application.
Note: I had to change the server port from 5000 to 5001 for some reason. That change can be made in server/index.js. You may not have this problem.
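Putting the steps together, a sketch of the commands (cloning instead of downloading the ZIP; run the server and the client in separate terminals):
$ git clone https://github.com/adrianhajdin/project_medical_pager_chat.git
$ cd project_medical_pager_chat/server && npm install && npm start
# in a second terminal:
$ cd project_medical_pager_chat/client && npm install && npm start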
I'm developing an Aurelia Single Page App that will talk to a REST api built with Django Rest Framework.
Without containers, I would have two buildpacks, one for Node that runs a script in the package.json file and another for Python that builds the Django app.
If I push a container image, then what mechanism replaces the node buildpack that calls the script in package.json to trigger webpack to create the asset bundles?
what mechanism replaces the node buildpack that calls the script in package.json
You're not really giving any info regarding your current setup and what you've tried already, so I'll assume you already know how to run Docker on Heroku, and that you got your current setup working on Heroku without Docker.
If you've got a script called build in your package.json that kicks off the webpack build, and a start script that starts a Node.js Express app serving your app from the webpack output folder, you'd do something like this in your Dockerfile:
FROM node:8.9.4
# copy the app source (including package.json) into the image
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
CMD npm run start
Of course this doesn't account for any permission setting or other project-specific tweaks you may need to do, but that depends on your project setup.
The important bit is that you're essentially running the thing as a Node app, and you need the appropriate scripts in your package.json so that building and running can be delegated to them; then you only need to call one or two of those scripts from your Dockerfile. You don't want to be doing too much npm stuff there directly.
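To get that image onto Heroku's container stack, a sketch of the deploy commands (the app name my-app is just a placeholder):
$ heroku container:login
$ heroku container:push web -a my-app      # builds the image from the Dockerfile and pushes it
$ heroku container:release web -a my-app   # releases the pushed image as the web process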
The apt-buildpack is experimental and not yet intended for production use. I guess that's also why there is no documentation.
Creating container
Successfully created container
Downloading app package...
Downloaded app package (862.7K)
Warning: this buildpack can only be run as a supply buildpack, it can not be run alone
Failed to compile droplet: Failed to compile droplet: exit status 1
Destroying container
Exit status 223
Stopping instance abdfc8d0-699e-4834-9f2d-2b8aec218423
Successfully destroyed container
Can you give me an example of how to push the cf-env sample app and install, for example, rtorrent and/or openvpn? Is it possible to install gnome for testing purposes?
As far as usage goes it's pretty simple: you just need to include an apt.yml in the root directory of your app. That should contain, among other things, the list of packages to install.
Ex:
---
packages:
- ascii
- libxml
- https://example.com/exciting.deb
The buildpack supports installing package names, deb files, custom APT repositories, and even PPAs.
Please see the README for further instructions.
This message:
Warning: this buildpack can only be run as a supply buildpack, it can not be run alone
Is telling you that the Apt buildpack only functions to supply binaries. It doesn't actually know how to run your app or any application. For more on the supply script, check out the docs here.
The trick to making it work is that you need to use multi buildpack support. Instructions for doing that can be found here. This should work with most apps, but there's a simple example here.
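One way to wire up the multiple buildpacks from the command line is with repeated -b flags; a sketch assuming a Node.js app named apt-test (the apt buildpack has to come before the final buildpack, so substitute whichever final buildpack your app actually needs):
$ cf push apt-test -b https://github.com/cloudfoundry/apt-buildpack -b nodejs_buildpack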
Once your app stages & starts, you can confirm that your packages were installed by running cf ssh apt-test -t -c "/tmp/lifecycle/launcher /home/vcap/app bash ''". Anything that was installed should be on the path, but if you want to see where things are installed it'll be under the /home/vcap/deps/<buildpack-number>/.
That should be about it. Hope that helps!
I'm trying to run unit tests against our Angular CLI project using our hosted VSTS build agents, however it keeps running into trouble when it tries to run 'ng test'.
To resolve this I have tried to make the agent use the ng tool directly by providing the path to the tool. This hasn't worked, as it looks like it's trying to run 'ng test' where the tool is located rather than in the specified working directory.
I've also tried adding it as an environment variable in Windows (we're using Windows Server 2012 to host the VSTS agent) and setting the tool in the VSTS agent to just ng, but it doesn't appear to find the ng tool.
How can I get the VSTS agent to make use of the ng tool to run tests? We have @angular/cli installed on the server hosting the agent.
The thing is, you won't get the Angular CLI installed globally on VSTS, since the hosted build server doesn't support that. But the good thing is you don't even need the CLI installed globally on your agent.
All you need is npm run ng -- build --prod; this way it always runs the local version. It also means you don't have to take care of updating a global package at all.
Use npm run ng -- test to run tests and npm run ng -- e2e to run Protractor. If you need to pass any more params to any of these, just add them after the --.
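On a build agent you typically also want Karma to run once and exit rather than watch for changes; a sketch, assuming Chrome is available on the agent (ChromeHeadless is an assumption, not something from the question):
$ npm run ng -- test --watch=false --browsers=ChromeHeadless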
As mentioned by @Kuncevic, to use the Angular CLI without installing it globally, you will need to use the npm run command.
To run an Angular build using Azure Devops:
Add an npm task to install dependencies (choose install for the command)
Add another npm task, but choose custom for the command. Then add your command and arguments:
run ng -- build --output-path=dist --configuration=prod
Note how npm is not a part of the command and arguments since this will be provided by the task. Also note how -- separates the command to be run and the arguments to be passed to the command.
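For reference, those two tasks end up executing roughly the following on the agent (the prod configuration is assumed to be defined in your angular.json):
$ npm install
$ npm run ng -- build --output-path=dist --configuration=prod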
I am very close to setting up the dev environment for Hyperledger Fabric and am following this link:
https://github.com/IBM-Blockchain/learn-chaincode/blob/master/docs/setup.md
When I run this command, git clone -b v0.6 http://gerrit.hyperledger.org/r/fabric,
and then run go build, I get the following error:
can't load package: package github.com/hyperledger/fabric: no
buildable Go source files in
/Users/test/work/src/github.com/hyperledger/fabric
However, when I run step 4 from the link, the build succeeds.
cd $GOPATH/src/github.com//learn-chaincode/start
go build ./
The build fails only for http://gerrit.hyperledger.org/r/fabric.
Any thoughts?
Please suggest!
I think the manual is not precisely worded here. You are not supposed to run go build on the cloned fabric repository. The manual merely states that if you are getting build errors later, the clone into your Go sources did not work; it is not asking you to build the fabric repository. If your build command from step 4 succeeds, everything should be set up correctly.
If you are setting up the dev environment and do want to build things after cloning the repo, that is done with make, e.g. make all to build and test everything.
To build chaincode later on, you use go build in the folder that contains the chaincode source file.
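A sketch of both cases (the <your-username> path is the placeholder from the learn-chaincode tutorial, not a literal path):
# the fabric repo itself is built with make, not go build
$ cd $GOPATH/src/github.com/hyperledger/fabric
$ make all
# chaincode is built with go build from the folder containing its source file
$ cd $GOPATH/src/github.com/<your-username>/learn-chaincode/start
$ go build ./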