Docker image creation using nodejs with redis - dockerfile

I am using the Dockerfile below. How can I configure Redis in my Dockerfile?
I am also using the build command docker build - < Dockerfile, but this didn't work out.
If I run this command, the following error shows up:
COPY failed: no source files were specified
FROM node:lts
RUN mkdir -p /app
WORKDIR /app
COPY package*.json /app
RUN yarn
COPY . /app
CMD ["yarn","run","start"]

One cannot use docker build - < Dockerfile to build an image whose Dockerfile contains COPY instructions, because COPY requires the referenced files to be present in the build context.
One must use docker build ., where . is the relative path to the build context.
Using docker build - < Dockerfile effectively means that the only thing in the build context is the Dockerfile itself. The files you want to copy into the image are not known to Docker, because they are not included in the context.
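For example, running docker build from the project root (the directory containing the Dockerfile, package*.json and the source) sends that whole directory as the context, so the COPY instructions can find their sources; the tag my-node-app here is just a placeholder:
docker build -t my-node-app .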

Related

exec: "--env": executable file not found in $PATH: unknown [duplicate]

I am having a problem running a Docker container after upgrading from NodeJS 8.2 to 9.1. This is the message I am getting.
I used the Dockerfile I found on Docker Hub but got an error about not being able to find package.json, so I commented that out and used the one I found on the NodeJS website.
Below is the Dockerfile:
FROM node:9.1.0
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
ONBUILD ARG NODE_ENV
ONBUILD ENV NODE_ENV $NODE_ENV
ONBUILD COPY package*.json ./
ONBUILD RUN npm install && npm cache clean --force
ONBUILD COPY . /usr/src/app
CMD [ "npm", "start" ]
I would appreciate help from more experienced users.
Your docker run command syntax is wrong. Everything after the image name is used to override the command run in your container. So docker run myimage -d will try to run -d inside the container, while docker run -d myimage will run your container with the -d option to docker run (detached mode).
The Dockerfile you referenced is meant to build a parent image for an easy dockerization of your application.
So to dockerize your Node.js application, you'd need to create a Dockerfile that uses the image built from that Dockerfile as its base.
The ONBUILD instructions are executed whenever a new image is built with this particular image as its parent (that is, named in the FROM instruction).
I've never used an image like this, but from the looks of it, it should be enough to reference the image in your FROM instruction and then provide NODE_ENV via build args.
The Dockerfile to add to your project:
FROM this_image:9.1
How to build your application image:
docker build -t IMAGE_NAME:TAG --build-arg NODE_ENV=production .
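For reference, when that application image is built, the parent's ONBUILD triggers fire right after the FROM step, so the build behaves roughly as if the project Dockerfile contained the following (a sketch based on the parent Dockerfile above; CMD is simply inherited from the parent image):
FROM this_image:9.1
ARG NODE_ENV
ENV NODE_ENV $NODE_ENV
COPY package*.json ./
RUN npm install && npm cache clean --force
COPY . /usr/src/app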

Google Cloud Run: COPY fails when changing source folder from ./ to build

$ gcloud builds submit --tag gcr.io/projectname/testserver
// ... works fine until the COPY step:
Step 6/7 : COPY build ./
COPY failed: stat /var/lib/docker/tmp/docker-builder653325957/build: no such file or directory
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/docker" failed: exit status 1
That build folder listed above, /var/lib/docker/tmp/docker-builder653325957/build, is not a local folder. Does Cloud Builder create a temp folder in that format?
How do I get it to copy my local build folder?
I also tried COPY ./build ./, but the CLI output was the same.
Dockerfile below.
FROM node:12-slim
# Create app folder
WORKDIR /usr/src/app
# Install app deps. Copy the lock file
COPY package*.json ./
RUN npm install
ENV SCOPES=removed \
SHOPIFY_API_KEY=removed \
SHOPIFY_API_SECRET=removed \
CLIENT_APP_URL=removed
COPY build ./
CMD ["node", "server.js"]
The gcloud command uses the .gitignore and .gcloudignore files to determine which files and directories to include with the Docker build. If your build directory is listed in either of these files, it won't be available to copy into your container image.
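A common culprit here is that build/ is listed in the project's .gitignore (typical for React/Node projects), so gcloud strips it from the upload. One hedged fix, assuming the compiled output really lives in build/, is to add an explicit .gcloudignore; when that file exists, gcloud uses it instead of falling back to .gitignore:
# .gcloudignore
.git
.gitignore
node_modules
# build/ is intentionally not listed here, so it is uploaded with the source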

How do I let my docker volume sync to my filesystem and allow writing from docker build

I'm building a Django app using Docker. The issue I am having is that my local filesystem is not synced to the Docker environment, so making local changes has no effect until I rebuild.
I added a volume
- ".:/app:rw"
which syncs with my local filesystem, but the bundles that get built via webpack during the image build don't show up (because they aren't in my local filesystem).
My dockerfile has this
... setup stuff...
ENV NODE_PATH=$NVM_DIR/versions/node/v$NODE_VERSION/lib/node_modules \
PATH=$NVM_DIR/versions/node/v$NODE_VERSION/bin:$PATH
ENV PATH=/node_modules/.bin:$PATH
COPY package*.json /
RUN (cd / && npm install && rm -rf /tmp/*)
...pip install stuff...
COPY . /app
WORKDIR /app
RUN npm run build
RUN DJANGO_MODE=build python manage.py collectstatic --noinput
So I want to sync with my local filesystem so I can make changes and have them show up immediately, AND have my bundles and static assets present. The way I've been developing so far is to just comment out the .:/app:rw line in my docker-compose.yml, which allows all the assets and bundles to be present.
The solution that ended up working for me was to declare a separate volume for each directory I did not want synced with my local environment.
volumes:
- ".:/app/:rw"
- "/app/project_folder/static_source/bundles/"
- "/app/project_folder/bundle_tracker/"
- "/app/project_folder/static_source/static/"
Arguably there's probably a better way to do this, but this solution does work. The Dockerfile compiles the webpack bundles and collectstatic does its job, both within the container, and the last three entries above keep my local machine from overwriting them. The downside is that I still have to figure out a better solution for live recompilation of SCSS or JavaScript, but that's a job for another day.
You can mount a local folder into your Docker container. Just use the --mount option with the docker run command. In the following example, the target subdirectory of the current directory will be available inside the container at /app.
docker run -d \
-it \
--name devtest \
--mount type=bind,source="$(pwd)"/target,target=/app \
nginx:latest
Reference: https://docs.docker.com/storage/bind-mounts/
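Since the question uses docker-compose rather than docker run directly, the same bind mount can also be declared in the compose file; a minimal sketch, assuming a service named web and the same source/target paths as above:
services:
  web:
    image: nginx:latest
    volumes:
      - ./target:/app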

build and deploy from one dockerfile or have different files for each purpose?

I am creating a React app and want to build it and deploy the build to a new container. I am trying stages in my Dockerfile.
FROM node:alpine as builder
WORKDIR /app/
COPY package.json .
RUN npm install
COPY . .
RUN npm run build
FROM nginx:alpine
EXPOSE 80
Well, the steps you followed are all correct. You just missed the part where you copy the build folder from the builder stage into the nginx image.
By default, nginx serves files from
/usr/share/nginx/html
Try the code below.
FROM node:alpine as builder
WORKDIR /app/
COPY package.json .
RUN npm install
COPY . .
RUN npm run build
FROM nginx:alpine
EXPOSE 80
COPY --from=builder /app/build /usr/share/nginx/html
If it doesn't work, leave a comment.
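To build and run the resulting image locally (the image name and host port are just placeholders), something like this should work:
docker build -t my-react-app .
docker run -d -p 8080:80 my-react-app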

Dockerfile copying war to local linked volume

I have a note app that I am building with Maven via a Dockerfile.
I want to copy the artifact note-1.0.war to a locally linked volume, into a folder like webapps. So far I have the following in the Dockerfile:
FROM maven:latest
MAINTAINER Sonam <emailme#gmail.com>
RUN apt-get update
WORKDIR /code
#Prepare by downloading dependencies
ADD pom.xml /code/pom.xml
RUN ["mvn", "dependency:resolve"]
RUN ["mvn", "verify"]
#Adding source, compile and package into a fat jar
ADD src /code/src
RUN ["mvn", "clean"]
#RUN ["mvn", "install"]
RUN ["mvn", "install", "-Dmaven.test.skip=true"]
RUN mkdir webapps
COPY note-1.0.war webapps
#COPY code/target/note-1.0.war webapps
Unfortunately, I keep seeing "no such file or directory" at the COPY statement. The following is the error from the build on Docker Hub:
...
---> bd555aecadbd
Removing intermediate container 69c09945f954
Step 11 : RUN mkdir webapps
---> Running in 3d114c40caee
---> 184903fa1041
Removing intermediate container 3d114c40caee
Step 12 : COPY note-1.0.war webapps
lstat note-1.0.war: no such file or directory
How can I copy the war file to the "webapps" folder that I created with
RUN mkdir webapps
Thanks
The COPY instruction copies new files or directories from <src> and adds them to the filesystem of the container at the path <dest>.
In your example, docker build is looking for note-1.0.war in the same directory as the Dockerfile.
If I understand your intention, you want to copy a file inside the image that was produced by a previous RUN instruction in the Dockerfile.
So you should use something like
RUN cp /code/target/note-1.0.war /code/webapps
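As for getting the .war onto the host afterwards, one hedged option (the image and container names below are placeholders) is to create a container from the built image and copy the artifact out with docker cp:
docker create --name note-tmp my-note-image
docker cp note-tmp:/code/webapps/note-1.0.war ./webapps/
docker rm note-tmp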