Epoptes is lab-monitoring software (as in a classroom computer lab), commonly available in Debian-based distributions. I'm trying to rebuild it for CentOS, basically as described here and here. The openSUSE rpm can be found here, the deb here.
Either way, whether converting from the deb or from the openSUSE rpm, the stopping point is the same:
/var/tmp/rpm-tmp.leBIra: line 1: fg: no job control
error: %pre(epoptes-0.5.10-3.1.noarch) scriptlet failed, exit status 1
error: epoptes-0.5.10-3.1.noarch: install failed
error: epoptes-0.5.10-3.noarch: erase skipped
Basically, trying to install the rpm after the rebuild, e.g.,
rpmrebuild -pe epoptes-0.5.10-3.noarch.rpm
rpm -Uvh epoptes-0.5.10-3.noarch.rpm
fails, complaining about job control. I am not sure how to fix this given the workflow for rebuilding the rpm.
From openSUSE:
Running rpm -qp --scripts epoptes-0.5.10-3.1.noarch.rpm returns:
preinstall scriptlet (using /bin/sh):
%service_add_pre epoptes-server.service
postinstall scriptlet (using /bin/sh):
if ! getent group epoptes >/dev/null; then
    groupadd --system epoptes
fi
if ! [ -f /etc/epoptes/server.key ] || ! [ -f /etc/epoptes/server.crt ] || ! [ -s /etc/epoptes/server.crt ]; then
    if ! [ -d /etc/epoptes ]; then
        mkdir /etc/epoptes
    fi
    openssl req -batch -x509 -nodes -newkey rsa:1024 -days $(($(date --utc +%s) / 86400 + 3652)) -keyout /etc/epoptes/server.key -out /etc/epoptes/server.crt
    chmod 600 /etc/epoptes/server.key
fi
%service_add_post epoptes-server.service
preuninstall scriptlet (using /bin/sh):
%service_del_preun epoptes-server.service
postuninstall scriptlet (using /bin/sh):
%service_del_postun epoptes-server.service
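Note what the first scriptlet line is: `%service_add_pre` is an openSUSE-only macro that was never expanded for CentOS, so the shell running the scriptlet sees a literal line starting with `%`. A non-interactive shell treats `%word` as a job specification, which reproduces the installer's error exactly:

```shell
# An unexpanded "%macro" line at command position is read by bash as a job
# spec, giving the same message as the failed %pre scriptlet:
bash -c '%service_add_pre epoptes-server.service' 2>&1
# prints something like: bash: line 1: fg: no job control

# Sketch of a possible fix (an assumption, not a verified recipe): use
# "rpmrebuild -e" on the rpm and replace the SUSE %service_* macros with
# plain systemd calls, e.g.
#   %service_add_post  epoptes-server.service  ->  systemctl daemon-reload; systemctl enable epoptes-server.service
#   %service_del_preun epoptes-server.service  ->  systemctl disable --now epoptes-server.service
```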
Rebuilding from the Ubuntu deb, installation is successful, but launch fails with:
# epoptes
Traceback (most recent call last):
File "/usr/bin/epoptes", line 30, in <module>
import epoptes
ImportError: No module named epoptes
Running rpm -qlp on the RPM generated from the .deb shows:
/etc
/etc/default
/etc/default/epoptes
/etc/epoptes
/etc/init.d/epoptes
/usr
/usr/bin/epoptes
/usr/lib/python2.7
/usr/lib/python2.7/dist-packages
/usr/lib/python2.7/dist-packages/epoptes
/usr/lib/python2.7/dist-packages/epoptes-0.5.10_2.egg-info
/usr/lib/python2.7/dist-packages/epoptes/__init__.py
/usr/lib/python2.7/dist-packages/epoptes/common
/usr/lib/python2.7/dist-packages/epoptes/common/__init__.py
/usr/lib/python2.7/dist-packages/epoptes/common/config.py
/usr/lib/python2.7/dist-packages/epoptes/common/constants.py
/usr/lib/python2.7/dist-packages/epoptes/common/ltsconf.py
/usr/lib/python2.7/dist-packages/epoptes/common/xdg_dirs.py
/usr/lib/python2.7/dist-packages/epoptes/core
/usr/lib/python2.7/dist-packages/epoptes/core/__init__.py
/usr/lib/python2.7/dist-packages/epoptes/core/lib_users.py
/usr/lib/python2.7/dist-packages/epoptes/core/structs.py
/usr/lib/python2.7/dist-packages/epoptes/core/wol.py
/usr/lib/python2.7/dist-packages/epoptes/daemon
/usr/lib/python2.7/dist-packages/epoptes/daemon/__init__.py
/usr/lib/python2.7/dist-packages/epoptes/daemon/bashplex.py
/usr/lib/python2.7/dist-packages/epoptes/daemon/commands.py
/usr/lib/python2.7/dist-packages/epoptes/daemon/exchange.py
/usr/lib/python2.7/dist-packages/epoptes/daemon/guiplex.py
/usr/lib/python2.7/dist-packages/epoptes/daemon/uiconnection.py
/usr/lib/python2.7/dist-packages/epoptes/ui
/usr/lib/python2.7/dist-packages/epoptes/ui/__init__.py
/usr/lib/python2.7/dist-packages/epoptes/ui/about_dialog.py
/usr/lib/python2.7/dist-packages/epoptes/ui/benchmark.py
/usr/lib/python2.7/dist-packages/epoptes/ui/client_information.py
/usr/lib/python2.7/dist-packages/epoptes/ui/execcommand.py
/usr/lib/python2.7/dist-packages/epoptes/ui/graph.py
/usr/lib/python2.7/dist-packages/epoptes/ui/gui.py
/usr/lib/python2.7/dist-packages/epoptes/ui/notifications.py
/usr/lib/python2.7/dist-packages/epoptes/ui/remote_assistance.py
/usr/lib/python2.7/dist-packages/epoptes/ui/sendmessage.py
/usr/lib/python2.7/dist-packages/twisted
/usr/lib/python2.7/dist-packages/twisted/plugins
/usr/lib/python2.7/dist-packages/twisted/plugins/epoptesd.py
/usr/share
/usr/share/applications
/usr/share/applications/epoptes.desktop
/usr/share/doc
/usr/share/doc/epoptes
/usr/share/doc/epoptes/README
/usr/share/doc/epoptes/changelog.Debian.gz
/usr/share/doc/epoptes/copyright
/usr/share/epoptes
/usr/share/epoptes/about_dialog.ui
/usr/share/epoptes/client-functions
/usr/share/epoptes/client_information.ui
/usr/share/epoptes/epoptes.ui
/usr/share/epoptes/executeCommand.ui
/usr/share/epoptes/images
/usr/share/epoptes/images/16
/usr/share/epoptes/images/16/assist.png
/usr/share/epoptes/images/16/broadcast-stop.png
/usr/share/epoptes/images/16/broadcast-windowed.png
/usr/share/epoptes/images/16/broadcast.png
/usr/share/epoptes/images/16/execute.png
/usr/share/epoptes/images/16/graph.png
/usr/share/epoptes/images/16/info.png
/usr/share/epoptes/images/16/lock-screen.png
/usr/share/epoptes/images/16/logout.png
/usr/share/epoptes/images/16/message.png
/usr/share/epoptes/images/16/mute.png
/usr/share/epoptes/images/16/observe.png
/usr/share/epoptes/images/16/poweron.png
/usr/share/epoptes/images/16/restart.png
/usr/share/epoptes/images/16/root-terminal.png
/usr/share/epoptes/images/16/shutdown.png
/usr/share/epoptes/images/16/terminal.png
/usr/share/epoptes/images/16/unlock-screen.png
/usr/share/epoptes/images/16/unmute.png
/usr/share/epoptes/images/assist.svg
/usr/share/epoptes/images/broadcast-stop.svg
/usr/share/epoptes/images/broadcast-windowed.svg
/usr/share/epoptes/images/broadcast.svg
/usr/share/epoptes/images/execute.svg
/usr/share/epoptes/images/fat.svg
/usr/share/epoptes/images/graph.png
/usr/share/epoptes/images/info.svg
/usr/share/epoptes/images/lock-screen.svg
/usr/share/epoptes/images/login.png
/usr/share/epoptes/images/logout.svg
/usr/share/epoptes/images/message.svg
/usr/share/epoptes/images/mute.svg
/usr/share/epoptes/images/observe.svg
/usr/share/epoptes/images/off.png
/usr/share/epoptes/images/offline.svg
/usr/share/epoptes/images/on.png
/usr/share/epoptes/images/poweron.svg
/usr/share/epoptes/images/restart.svg
/usr/share/epoptes/images/root-terminal.svg
/usr/share/epoptes/images/shutdown.svg
/usr/share/epoptes/images/standalone.svg
/usr/share/epoptes/images/systemgrp.png
/usr/share/epoptes/images/terminal.svg
/usr/share/epoptes/images/thin.svg
/usr/share/epoptes/images/unlock-screen.svg
/usr/share/epoptes/images/unmute.svg
/usr/share/epoptes/images/usersgrp.png
/usr/share/epoptes/netbenchmark.ui
/usr/share/epoptes/remote_assistance.ui
/usr/share/epoptes/sendMessage.ui
/usr/share/icons
/usr/share/icons/hicolor
/usr/share/icons/hicolor/scalable
/usr/share/icons/hicolor/scalable/apps
/usr/share/icons/hicolor/scalable/apps/epoptes.svg
/usr/share/locale
/usr/share/locale/af
/usr/share/locale/af/LC_MESSAGES
/usr/share/locale/af/LC_MESSAGES/epoptes.mo
/usr/share/locale/ar
/usr/share/locale/ar/LC_MESSAGES
/usr/share/locale/ar/LC_MESSAGES/epoptes.mo
/usr/share/locale/bg
/usr/share/locale/bg/LC_MESSAGES
/usr/share/locale/bg/LC_MESSAGES/epoptes.mo
/usr/share/locale/ca
/usr/share/locale/ca/LC_MESSAGES
/usr/share/locale/ca/LC_MESSAGES/epoptes.mo
/usr/share/locale/ca#valencia
/usr/share/locale/ca#valencia/LC_MESSAGES
/usr/share/locale/ca#valencia/LC_MESSAGES/epoptes.mo
/usr/share/locale/cs
/usr/share/locale/cs/LC_MESSAGES
/usr/share/locale/cs/LC_MESSAGES/epoptes.mo
/usr/share/locale/da
/usr/share/locale/da/LC_MESSAGES
/usr/share/locale/da/LC_MESSAGES/epoptes.mo
/usr/share/locale/de
/usr/share/locale/de/LC_MESSAGES
/usr/share/locale/de/LC_MESSAGES/epoptes.mo
/usr/share/locale/el
/usr/share/locale/el/LC_MESSAGES
/usr/share/locale/el/LC_MESSAGES/epoptes.mo
/usr/share/locale/en_AU
/usr/share/locale/en_AU/LC_MESSAGES
/usr/share/locale/en_AU/LC_MESSAGES/epoptes.mo
/usr/share/locale/en_GB
/usr/share/locale/en_GB/LC_MESSAGES
/usr/share/locale/en_GB/LC_MESSAGES/epoptes.mo
/usr/share/locale/es
/usr/share/locale/es/LC_MESSAGES
/usr/share/locale/es/LC_MESSAGES/epoptes.mo
/usr/share/locale/eu
/usr/share/locale/eu/LC_MESSAGES
/usr/share/locale/eu/LC_MESSAGES/epoptes.mo
/usr/share/locale/fi
/usr/share/locale/fi/LC_MESSAGES
/usr/share/locale/fi/LC_MESSAGES/epoptes.mo
/usr/share/locale/fr
/usr/share/locale/fr/LC_MESSAGES
/usr/share/locale/fr/LC_MESSAGES/epoptes.mo
/usr/share/locale/gl
/usr/share/locale/gl/LC_MESSAGES
/usr/share/locale/gl/LC_MESSAGES/epoptes.mo
/usr/share/locale/he
/usr/share/locale/he/LC_MESSAGES
/usr/share/locale/he/LC_MESSAGES/epoptes.mo
/usr/share/locale/hu
/usr/share/locale/hu/LC_MESSAGES
/usr/share/locale/hu/LC_MESSAGES/epoptes.mo
/usr/share/locale/id
/usr/share/locale/id/LC_MESSAGES
/usr/share/locale/id/LC_MESSAGES/epoptes.mo
/usr/share/locale/it
/usr/share/locale/it/LC_MESSAGES
/usr/share/locale/it/LC_MESSAGES/epoptes.mo
/usr/share/locale/lt
/usr/share/locale/lt/LC_MESSAGES
/usr/share/locale/lt/LC_MESSAGES/epoptes.mo
/usr/share/locale/ml
/usr/share/locale/ml/LC_MESSAGES
/usr/share/locale/ml/LC_MESSAGES/epoptes.mo
/usr/share/locale/ms
/usr/share/locale/ms/LC_MESSAGES
/usr/share/locale/ms/LC_MESSAGES/epoptes.mo
/usr/share/locale/nb
/usr/share/locale/nb/LC_MESSAGES
/usr/share/locale/nb/LC_MESSAGES/epoptes.mo
/usr/share/locale/nl
/usr/share/locale/nl/LC_MESSAGES
/usr/share/locale/nl/LC_MESSAGES/epoptes.mo
/usr/share/locale/oc
/usr/share/locale/oc/LC_MESSAGES
/usr/share/locale/oc/LC_MESSAGES/epoptes.mo
/usr/share/locale/pl
/usr/share/locale/pl/LC_MESSAGES
/usr/share/locale/pl/LC_MESSAGES/epoptes.mo
/usr/share/locale/pt
/usr/share/locale/pt/LC_MESSAGES
/usr/share/locale/pt/LC_MESSAGES/epoptes.mo
/usr/share/locale/pt_BR
/usr/share/locale/pt_BR/LC_MESSAGES
/usr/share/locale/pt_BR/LC_MESSAGES/epoptes.mo
/usr/share/locale/ru
/usr/share/locale/ru/LC_MESSAGES
/usr/share/locale/ru/LC_MESSAGES/epoptes.mo
/usr/share/locale/se
/usr/share/locale/se/LC_MESSAGES
/usr/share/locale/se/LC_MESSAGES/epoptes.mo
/usr/share/locale/sk
/usr/share/locale/sk/LC_MESSAGES
/usr/share/locale/sk/LC_MESSAGES/epoptes.mo
/usr/share/locale/sl
/usr/share/locale/sl/LC_MESSAGES
/usr/share/locale/sl/LC_MESSAGES/epoptes.mo
/usr/share/locale/so
/usr/share/locale/so/LC_MESSAGES
/usr/share/locale/so/LC_MESSAGES/epoptes.mo
/usr/share/locale/sr
/usr/share/locale/sr/LC_MESSAGES
/usr/share/locale/sr/LC_MESSAGES/epoptes.mo
/usr/share/locale/sr#latin
/usr/share/locale/sr#latin/LC_MESSAGES
/usr/share/locale/sr#latin/LC_MESSAGES/epoptes.mo
/usr/share/locale/sv
/usr/share/locale/sv/LC_MESSAGES
/usr/share/locale/sv/LC_MESSAGES/epoptes.mo
/usr/share/locale/tr
/usr/share/locale/tr/LC_MESSAGES
/usr/share/locale/tr/LC_MESSAGES/epoptes.mo
/usr/share/locale/uk
/usr/share/locale/uk/LC_MESSAGES
/usr/share/locale/uk/LC_MESSAGES/epoptes.mo
/usr/share/locale/vi
/usr/share/locale/vi/LC_MESSAGES
/usr/share/locale/vi/LC_MESSAGES/epoptes.mo
/usr/share/locale/zh_CN
/usr/share/locale/zh_CN/LC_MESSAGES
/usr/share/locale/zh_CN/LC_MESSAGES/epoptes.mo
/usr/share/locale/zh_TW
/usr/share/locale/zh_TW/LC_MESSAGES
/usr/share/locale/zh_TW/LC_MESSAGES/epoptes.mo
/usr/share/ltsp
/usr/share/ltsp/plugins
/usr/share/ltsp/plugins/ltsp-build-client
/usr/share/ltsp/plugins/ltsp-build-client/common
/usr/share/ltsp/plugins/ltsp-build-client/common/040-epoptes-certificate
/usr/share/man
/usr/share/man/man1
/usr/share/man/man1/epoptes.1.gz
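The listing above also explains the ImportError: every module landed under Debian's `/usr/lib/python2.7/dist-packages`, a directory that CentOS's Python never searches (it uses `site-packages`). One way to bridge the two is a `.pth` file in a directory Python does scan. The sketch below demonstrates the mechanism in a scratch directory; on the real system the `.pth` file would contain the line `/usr/lib/python2.7/dist-packages` and live under `/usr/lib/python2.7/site-packages/` (paths assumed from the listing above):

```shell
# Build a throwaway layout mirroring the problem: a module in an unscanned
# "dist-packages" dir, made visible through a .pth file in a scanned dir.
tmp=$(mktemp -d)
mkdir -p "$tmp/dist-packages" "$tmp/site"
printf 'value = 42\n' > "$tmp/dist-packages/demo_pkg.py"     # stand-in for the epoptes package
printf '%s/dist-packages\n' "$tmp" > "$tmp/site/extra.pth"   # the .pth bridge
# site.addsitedir processes .pth files the same way site-packages is
# processed at startup:
python3 -c "import site; site.addsitedir('$tmp/site'); import demo_pkg; print(demo_pkg.value)"
# → 42
```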
Here is my build process:
I open mingw32 from the x64 Native Tools Command Prompt for VS 2022,
then in the mingw32 shell:
# cd /
# ./c/Program\ Files/Microsoft\ Visual\ Studio/2022/Community/VC/Auxiliary/Build/vcvars32.bat
# cd ~
# pacman -Sy diffutils git make gcc yasm pkg-config --noconfirm
# git clone --depth 1 https://git.ffmpeg.org/ffmpeg.git ffmpeg
# git clone https://git.videolan.org/git/ffmpeg/nv-codec-headers.git nv-codec-headers
# cd nv-codec-headers/
# make PREFIX=/usr/local
# make install PREFIX=/usr/local
# cd ..
# mkdir nv_sdk
# cp -r /c/Program\ Files/NVIDIA\ GPU\ Computing\ Toolkit/CUDA/v11.7/lib/Win32/* nv_sdk
# cp -r /c/Program\ Files/NVIDIA\ GPU\ Computing\ Toolkit/CUDA/v11.7/include/* nv_sdk
# export PATH="/c/Program Files/Microsoft Visual Studio/2022/Community/VC/Tools/MSVC/14.32.31326/bin/Hostx86/x86/":$PATH
# export PATH="/c/Program Files/NVIDIA GPU Computing Toolkit/CUDA/v11.7/bin/":$PATH
# ./configure --disable-everything --enable-decoder=h264 --enable-decoder=hevc --enable-cross-compile --disable-avdevice --disable-swresample --disable-postproc --disable-avfilter --target-os=mingw32 --enable-cuda-nvcc --enable-nonfree --toolchain=msvc --extra-cflags=-I../nv_sdk --extra-ldflags=" -m32 -L../nv_sdk" --enable-shared --shlibdir=SHARED_LIBS --arch=x86_32 --enable-runtime-cpudetect --enable-w32threads
# make -j8
# make install
First, I get a bunch of warnings during the build, looking like this:
libavutil/opt.c(1075): warning C4133: 'function': incompatible types - from 'AVPixelFormat *' to 'int *'
And finally, make install returns:
EXTERN_PREFIX="_" AR="lib.exe" NM="dumpbin.exe -symbols" ./compat/windows/makedef libavutil/libavutil.ver libavutil/adler32.o libavutil/aes.o libavutil/aes_ctr.o libavutil/audio_fifo.o libavutil/avsscanf.o libavutil/avstring.o libavutil/base64.o libavutil/blowfish.o libavutil/bprint.o libavutil/buffer.o libavutil/camellia.o libavutil/cast5.o libavutil/channel_layout.o libavutil/color_utils.o libavutil/cpu.o libavutil/crc.o libavutil/des.o libavutil/detection_bbox.o libavutil/dict.o libavutil/display.o libavutil/dovi_meta.o libavutil/downmix_info.o libavutil/encryption_info.o libavutil/error.o libavutil/eval.o libavutil/fifo.o libavutil/file.o libavutil/file_open.o libavutil/film_grain_params.o libavutil/fixed_dsp.o libavutil/float_dsp.o libavutil/frame.o libavutil/hash.o libavutil/hdr_dynamic_metadata.o libavutil/hdr_dynamic_vivid_metadata.o libavutil/hmac.o libavutil/hwcontext.o libavutil/hwcontext_d3d11va.o libavutil/hwcontext_dxva2.o libavutil/imgutils.o libavutil/integer.o libavutil/intmath.o libavutil/lfg.o libavutil/lls.o libavutil/log.o libavutil/log2_tab.o libavutil/lzo.o libavutil/mastering_display_metadata.o libavutil/mathematics.o libavutil/md5.o libavutil/mem.o libavutil/murmur3.o libavutil/opt.o libavutil/parseutils.o libavutil/pixdesc.o libavutil/pixelutils.o libavutil/random_seed.o libavutil/rational.o libavutil/rc4.o libavutil/reverse.o libavutil/ripemd.o libavutil/samplefmt.o libavutil/sha.o libavutil/sha512.o libavutil/slicethread.o libavutil/spherical.o libavutil/stereo3d.o libavutil/tea.o libavutil/threadmessage.o libavutil/time.o libavutil/timecode.o libavutil/tree.o libavutil/twofish.o libavutil/tx.o libavutil/tx_double.o libavutil/tx_float.o libavutil/tx_int32.o libavutil/utils.o libavutil/version.o libavutil/video_enc_params.o libavutil/x86/cpu.o libavutil/x86/cpuid.o libavutil/x86/fixed_dsp.o libavutil/x86/fixed_dsp_init.o libavutil/x86/float_dsp.o libavutil/x86/float_dsp_init.o libavutil/x86/imgutils.o libavutil/x86/imgutils_init.o 
libavutil/x86/lls.o libavutil/x86/lls_init.o libavutil/x86/tx_float.o libavutil/x86/tx_float_init.o libavutil/xga_font_data.o libavutil/xtea.o > libavutil/avutil-57.def
Could not create temporary library.
make: *** [ffbuild/library.mak:118: libavutil/avutil-57.dll] Error 1
What am I doing wrong?
Should I install other packages from pacman?
I open mingw32 from the x64 Native Tools Command Prompt for VS 2022
Maybe you should open it from the x86 Native Tools Command Prompt for VS 2022 instead: your configure line targets 32-bit (--arch=x86_32), so the MSVC toolchain on PATH needs to be the x86-targeting one.
I am running a Dataflow job which reads a file and pushes data to Cloud SQL. It works in local mode (DirectRunner) but fails in DataflowRunner mode. I am getting the following error:
I 2021-08-24T10:08:13.866094Z ERROR: Command errored out with exit status 1:
I 2021-08-24T10:08:13.866142Z command: /usr/local/bin/python3 /tmp/pip-standalone-pip-_qnajhyd/__env_pip__.zip/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-idcssgqm/overlay --no-warn-script-location --no-binary :none: --only-binary :none: --no-index --find-links /var/opt/google/dataflow -- 'setuptools>=54.0' 'setuptools_scm[toml]>=5.0' 'wheel>=0.36.2' 'Cython>=0.29.22'
I 2021-08-24T10:08:13.866202Z cwd: None
I 2021-08-24T10:08:13.866213Z Complete output (15 lines):
I 2021-08-24T10:08:13.866222Z Looking in links: /var/opt/google/dataflow
I 2021-08-24T10:08:13.866232Z Processing /var/opt/google/dataflow/setuptools-57.4.0.tar.gz
I 2021-08-24T10:08:13.866244Z Installing build dependencies: started
I 2021-08-24T10:08:13.866253Z Installing build dependencies: finished with status 'error'
I 2021-08-24T10:08:13.866262Z ERROR: Command errored out with exit status 1:
I 2021-08-24T10:08:13.866272Z command: /usr/local/bin/python3 /tmp/pip-standalone-pip-_qnajhyd/__env_pip__.zip/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-x9grztra/overlay --no-warn-script-location --no-binary :none: --only-binary :none: --no-index --find-links /var/opt/google/dataflow -- wheel
I 2021-08-24T10:08:13.866290Z cwd: None
I 2021-08-24T10:08:13.866305Z Complete output (3 lines):
I 2021-08-24T10:08:13.866314Z Looking in links: /var/opt/google/dataflow
I 2021-08-24T10:08:13.866324Z ERROR: Could not find a version that satisfies the requirement wheel (from versions: none)
I 2021-08-24T10:08:13.866335Z ERROR: No matching distribution found for wheel
I 2021-08-24T10:08:13.866344Z ----------------------------------------
I 2021-08-24T10:08:13.866359Z WARNING: Discarding file:///var/opt/google/dataflow/setuptools-57.4.0.tar.gz. Command errored out with exit status 1: /usr/local/bin/python3 /tmp/pip-standalone-pip-_qnajhyd/__env_pip__.zip/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-x9grztra/overlay --no-warn-script-location --no-binary :none: --only-binary :none: --no-index --find-links /var/opt/google/dataflow -- wheel Check the logs for full command output.
I 2021-08-24T10:08:13.866383Z ERROR: Could not find a version that satisfies the requirement setuptools>=54.0 (from versions: 57.4.0)
I 2021-08-24T10:08:13.866394Z ERROR: No matching distribution found for setuptools>=54.0
I 2021-08-24T10:08:13.866404Z ----------------------------------------
I 2021-08-24T10:08:13.869072Z WARNING: Discarding file:///var/opt/google/dataflow/pymssql-2.2.2.tar.gz. Command errored out with exit status 1: /usr/local/bin/python3 /tmp/pip-standalone-pip-_qnajhyd/__env_pip__.zip/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-idcssgqm/overlay --no-warn-script-location --no-binary :none: --only-binary :none: --no-index --find-links /var/opt/google/dataflow -- 'setuptools>=54.0' 'setuptools_scm[toml]>=5.0' 'wheel>=0.36.2' 'Cython>=0.29.22' Check the logs for full command output.
I 2021-08-24T10:08:13.869405Z ERROR: Could not find a version that satisfies the requirement pymssql (from versions: 2.2.2)
I 2021-08-24T10:08:13.869475Z ERROR: No matching distribution found for pymssql
I 2021-08-24T10:08:13.943732Z /usr/local/bin/pip failed with exit status 1
F 2021-08-24T10:08:13.943818Z Failed to install packages: failed to install requirements: exit status 1
I 2021-08-24T10:11:13.976686Z [topologymanager] RemoveContainer - Container ID: 6006a3d10b0289d5b69478c1a8189eef02db1fa0af2216bb5e7c57659498009c
I 2021-08-24T10:11:14.015846Z [topologymanager] RemoveContainer - Container ID: bb14dd1d3a5414c2ea02157ebff7e7ba227337e47ff86216a3f86847e261cdee
E 2021-08-24T10:11:14.016267Z Error syncing pod 5814435f816ec192ccc2709209670a6a ("dataflow-cloudsql-upload-2021-08-2-08240304-up8o-harness-x0f4_default(5814435f816ec192ccc2709209670a6a)"), skipping: failed to "StartContainer" for "python" with CrashLoopBackOff: "back-off 1m20s restarting failed container=python pod=dataflow-cloudsql-upload-2021-08-2-08240304-up8o-harness-x0f4_default(5814435f816ec192ccc2709209670a6a)"
I 2021-08-24T10:11:25.595721Z [topologymanager] RemoveContainer - Container ID: bb14dd1d3a5414c2ea02157ebff7e7ba227337e47ff86216a3f86847e261cdee
E 2021-08-24T10:11:25.596300Z Error syncing pod 5814435f816ec192ccc2709209670a6a ("dataflow-cloudsql-upload-2021-08-2-08240304-up8o-harness-x0f4_default(5814435f816ec192ccc2709209670a6a)"), skipping: failed to "StartContainer" for "python" with CrashLoopBackOff: "back-off 1m20s restarting failed container=python pod=dataflow-cloudsql-upload-2021-08-2-08240304-up8o-harness-x0f4_default(5814435f816ec192ccc2709209670a6a)"
I 2021-08-24T10:11:38.592275Z [topologymanager] RemoveContainer - Container ID: bb14dd1d3a5414c2ea02157ebff7e7ba227337e47ff86216a3f86847e261cdee
E 2021-08-24T10:11:38.592668Z Error syncing pod 5814435f816ec192ccc2709209670a6a ("dataflow-cloudsql-upload-2021-08-2-08240304-up8o-harness-x0f4_default(5814435f816ec192ccc2709209670a6a)"), skipping: failed to "StartContainer" for "python" with CrashLoopBackOff: "back-off 1m20s restarting failed container=python pod=dataflow-cloudsql-upload-2021-08-2-08240304-up8o-harness-x0f4_default(5814435f816ec192ccc2709209670a6a)"
I 2021-08-24T10:11:53.598346Z [topologymanager] RemoveContainer - Container ID: bb14dd1d3a5414c2ea02157ebff7e7ba227337e47ff86216a3f86847e261cdee
There are many SO posts suggesting to look at conflicts with the dependencies in requirements, and even after trying multiple changes to requirements.txt (trial and error) I am unable to figure out the right dependencies. I followed this and
this but was unable to debug it. Following are my code files:
dataflow.py
import csv
import datetime
import logging
import apache_beam as beam
from apache_beam.io.fileio import MatchFiles, ReadMatches
import argparse
import os
import json
from ldif3 import LDIFParser
import pymssql
from apache_beam.options.pipeline_options import PipelineOptions, GoogleCloudOptions

logging.basicConfig(level='INFO')

# Change the project_id
project_id = os.getenv('GOOGLE_CLOUD_PROJECT')


def get_db_connection():
    mssqlhost = '127.0.0.1'
    mssqluser = 'a'
    mssqlpass = 'b'
    mssqldb = 'usersdb'
    cnx = pymssql.connect(user=mssqluser, password=mssqlpass,
                          host=mssqlhost, database=mssqldb)
    return cnx


class SQLWriteDoFn(beam.DoFn):
    # Max documents to process at a time
    MAX_DOCUMENTS = 200

    def __init__(self, project):
        self._project = project

    def setup(self):
        os.system("wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy")
        os.system("chmod +x cloud_sql_proxy")
        os.system(f"./cloud_sql_proxy -instances=mydatabase-database=tcp:0.0.0.0:1433 &")

    def start_bundle(self):
        self._mutations = []
        logging.info("In start_bundle")

    def finish_bundle(self):
        logging.info("In finish_bundle")
        if self._mutations:
            self._flush_batch()

    def process(self, element, *args, **kwargs):
        logging.info("In process")
        self._mutations.append(element)
        if len(self._mutations) > self.MAX_DOCUMENTS:
            self._flush_batch()

    def _flush_batch(self):
        try:
            mssqlconn = get_db_connection()
            print("Connection Established to MS SQL server.")
            cursor = mssqlconn.cursor()
            stmt = "insert into usersdb.dbo.users_dataflow (uname, password) values (%s,%s)"
            cursor.executemany(stmt, self._mutations)
            mssqlconn.commit()
            mssqlconn.close()
        except Exception as e:
            print(e)
        self._mutations = []


def return_dictionary_element_if_present(dict_entry, element):
    if dict_entry.get(element):
        return dict_entry.get(element)[0]
    return ''


class CreateEntities(beam.DoFn):
    def process(self, file):
        parser = LDIFParser(file.open())
        arr = []
        for dn, entry in parser.parse():
            # dict1 ={}
            dict_entry = dict(entry)
            uname = return_dictionary_element_if_present(dict_entry, 'uname')
            userPassword = return_dictionary_element_if_present(dict_entry, 'userPassword')
            arr.append(tuple((uname, userPassword)))
        return arr


def dataflow(pipeline_options):
    print("starting")
    options = GoogleCloudOptions.from_dictionary(pipeline_options)
    with beam.Pipeline(options=options) as p:
        (p
         | 'Reading data from GCS' >> MatchFiles(file_pattern="gs://my_bucket_name/*.ldiff")
         | 'file match' >> ReadMatches()
         | 'Create entities' >> beam.ParDo(CreateEntities())
         # | 'print to screen' >> beam.Map(print)
         | 'Write to CloudSQL' >> beam.ParDo(SQLWriteDoFn(pipeline_options['project']))
         )


if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description='dataflow options for ldif to sql')
    parser.add_argument('--project', help='Project ID',
                        default=f'{project_id}')
    parser.add_argument('--region', help='region', default='us-central1')
    parser.add_argument('--runner', help='Runner', default='DirectRunner')
    parser.add_argument('--staging_location',
                        default=f'gs://{project_id}/staging')
    parser.add_argument('--temp_location',
                        default=f'gs://{project_id}/tmp')
    args = parser.parse_args()

    JOB_NAME = 'cloudsql-upload-{}'.format(
        datetime.datetime.now().strftime('%Y-%m-%d-%H%M%S'))

    pipeline_options = {
        'project': args.project,
        'staging_location': args.staging_location,
        'runner': args.runner,
        'job_name': JOB_NAME,
        'temp_location': args.temp_location,
        'save_main_session': True,
        'requirements_file': 'requirements.txt',
        'region': args.region,
        'machine_type': 'n1-standard-8'
    }
    dataflow(pipeline_options)
requirements.txt
ldif3
pymssql
apache-beam[gcp]==2.31.0
Execution way
python3 dataflow.py --runner=DataflowRunner
Any help is really appreciated. Thanks in Advance.
Edit1:
I have done multiple trial and error and finally changed my requirements file as following
setuptools==57.4.0
wheel==0.37.0
setuptools_scm[toml]==6.0.1
Cython==0.29.24
ldif3
apache-beam[gcp]==2.31.0
But I am getting only the following error now:
ERROR: pymssql-2.2.2-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.whl is not a supported wheel on this platform.
The way I am running it now is:
python3 dataflow.py --runner=DataflowRunner --requirements_file=requirements.txt --extra_package=pymssql-2.2.2-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.whl
When you run your pipeline locally, the packages that your pipeline
depends on are available because they are installed on your local
machine. However, when you want to run your pipeline remotely, you
must make sure these dependencies are available on the remote
machines.
In your execution command I see that you don't specify the requirements.txt, so you need to start by adding --requirements_file requirements.txt.
Ref: https://beam.apache.org/documentation/sdks/python-pipeline-dependencies/
The error
is not a supported wheel on this platform
is probably due to trying to use a Python 3.6 wheel with a Beam pipeline that runs on a different Python version. Can you try a wheel that matches the Python version your Beam pipeline runs with?
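A quick way to see why pip rejects a wheel is to compare the tags encoded in its filename against the tags the local interpreter supports. A minimal sketch, assuming the third-party `packaging` library is installed:

```shell
python3 - <<'EOF'
# Compare a wheel's filename tags against what this interpreter supports.
# The cp36/i686 wheel from the question will not match a 64-bit Python
# that is not 3.6.
from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

wheel = "pymssql-2.2.2-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.whl"
_, _, _, tags = parse_wheel_filename(wheel)
print("supported" if set(tags) & set(sys_tags()) else "not supported")
EOF
```

To fetch a wheel that matches the workers instead, `pip download` accepts `--python-version`, `--implementation`, `--platform`, and `--only-binary :all:` to pin the target environment.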
This works fine on Ubuntu 16.04, but not on 17.10:
+ curl https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/awslogs-agent-setup.py -O
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 56093  100 56093    0     0  56093      0  0:00:01 --:--:--  0:00:01 98929
+ chmod +x ./awslogs-agent-setup.py
+ ./awslogs-agent-setup.py -n -c /etc/awslogs/awslogs.conf -r us-west-2
Step 1 of 5: Installing pip ... libyaml-dev does not exist in system DONE
Step 2 of 5: Downloading the latest CloudWatch Logs agent bits ... Traceback (most recent call last):
File "./awslogs-agent-setup.py", line 1317, in <module>
main()
File "./awslogs-agent-setup.py", line 1313, in main
setup.setup_artifacts()
File "./awslogs-agent-setup.py", line 858, in setup_artifacts
self.install_awslogs_cli()
File "./awslogs-agent-setup.py", line 570, in install_awslogs_cli
subprocess.call([AWSCLI_CMD, 'configure', 'set', 'plugins.cwlogs', 'cwlogs'], env=DEFAULT_ENV)
File "/usr/lib/python2.7/subprocess.py", line 168, in call
return Popen(*popenargs, **kwargs).wait()
File "/usr/lib/python2.7/subprocess.py", line 390, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1025, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
I noticed that earlier in the process, in the AWS boilerplate, it failed to install libyaml-dev, but I'm not sure if that's the only problem.
Always find the answer right after I post it...
Here's my modified CF template command:
050_install_awslogs:
command: !Sub
"/bin/bash -x\n
exec >>/var/log/cf_050_install_awslogs.log 2>&1 \n
echo 050_install_awslogs...\n
set -xe\n
# Get the CloudWatch Logs agent\n
mkdir /opt/awslogs\n
cd /opt/awslogs\n
# Needed for python3 in 17.10\n
apt-get install -y libyaml-dev python-dev \n
pip3 install awscli-cwlogs\n
# avoid it complaining about not having /var/awslogs/bin/aws binary\n
if [ ! -d /var/awslogs/bin ] ; then\n
mkdir -p /var/awslogs/bin\n
ln -s /usr/local/bin/aws /var/awslogs/bin/aws\n
fi\n
curl https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/awslogs-agent-setup.py -O\n
chmod +x ./awslogs-agent-setup.py\n
# Hack for python 3.6 & old awslogs-agent-setup.py\n
sed -i 's/3,6/3,7/' awslogs-agent-setup.py\n
./awslogs-agent-setup.py -n -c /etc/awslogs/awslogs.conf -r ${AWS::Region}\n
echo 050_install_awslogs end\n
"
Not entirely sure about the need for the dir creation, but I expect this is a temporary situation that will get resolved soon, as one still needs to fudge the Python 3.6 compatibility check.
It may be installable using Python 2.7 as well, but that felt like going backwards at this point, as my rationale for 17.10 was Python 3.6.
Credit for the yaml package and dir creation idea to https://forums.aws.amazon.com/thread.jspa?threadID=265977, but I prefer to avoid easy_install.
I had a similar issue on Ubuntu 18.04.
The standalone-install instructions from AWS worked in my case.
To download and run it standalone, use the following commands and follow the prompts:
curl https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/awslogs-agent-setup.py -O
curl https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/AgentDependencies.tar.gz -O
tar xvf AgentDependencies.tar.gz -C /tmp/
sudo python ./awslogs-agent-setup.py --region us-east-1 --dependency-path /tmp/AgentDependencies
Hi, I am using Docker to deploy my Rails app using the phusion/passenger image. Here is my Dockerfile:
FROM phusion/passenger-ruby22:0.9.19
# set correct environment variables
ENV HOME /root
ENV RAILS_ENV production
# Use baseimage-docker's init system.
CMD ["/sbin/my_init"]
# Expose Nginx HTTP service
EXPOSE 80
# Start Nginx / Passenger
RUN rm -f /etc/service/nginx/down
# Remove the default site
RUN rm /etc/nginx/sites-enabled/default
# Add the nginx site and config
ADD nginx.conf /etc/nginx/sites-enabled/nginx.conf
ADD rails-env.conf /etc/nginx/main.d/rails-env.conf
# Make sure these packages are installed;
# otherwise, install them anyway
RUN apt-get update && apt-get install -y build-essential \
nodejs \
libpq-dev
# bundle gem and cache them
WORKDIR /tmp
ADD Gemfile /tmp/
ADD Gemfile.lock /tmp/
RUN gem install bundler
RUN bundle install
# Add rails app
ADD . /home/app/webapp
WORKDIR /home/app/webapp
RUN touch log/delayed_job.log log/production.log log/
RUN chown -R app:app /home/app/webapp
RUN RAILS_ENV=production rake assets:precompile
# Clean up APT and bundler when done.
RUN apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
I am getting a permission issue for tmp and log files:
web_1 | [ 2016-07-19 08:45:12.6653 31/7ff812726700 age/Cor/App/Implementation.cpp:304 ]: Could not spawn process for application /home/app/webapp: An error occurred while starting up the preloader.
web_1 | Error ID: 42930e85
web_1 | Error details saved to: /tmp/passenger-error-9DeJ86.html
web_1 | Message from application: Permission denied # rb_sysopen - log/logentries.log (Errno::EACCES)
web_1 | /usr/lib/ruby/2.2.0/logger.rb:628:in `initialize'
web_1 | /usr/lib/ruby/2.2.0/logger.rb:628:in `open'
web_1 | /usr/lib/ruby/2.2.0/logger.rb:628:in `open_logfile'
web_1 | /usr/lib/ruby/2.2.0/logger.rb:584:in `initialize'
web_1 | /usr/lib/ruby/2.2.0/logger.rb:318:in `new'
web_1 | /usr/lib/ruby/2.2.0/logger.rb:318:in `initialize'
web_1 | /var/lib/gems/2.2.0/gems/le-2.7.2/lib/le/host/http.rb:37:in `new'
web_1 | /var/lib/gems/2.2.0/gems/le-2.7.2/lib/le/host/http.rb:37:in `initialize'
I tried chmod -R 665/775/777 log/ and it still didn't fix the problem.
Thanks
Reorder your lines: run RUN RAILS_ENV=production rake assets:precompile first, then RUN chown -R app:app /home/app/webapp (after your rake task), so the files the rake task creates as root get their ownership fixed. It should look like this:
RUN RAILS_ENV=production rake assets:precompile
RUN chown -R app:app /home/app/webapp
I'm using Vagrant and Chef solo to setup my django dev environment. Using Chef Solo I successfully install my packages (vim, git, apt, python, mysql) but then when I setup my project using pip to download/install my requirements (django, south, django-registration, etc), these ones are not correctly downloaded/found in my fresh VM.
I'm not sure if it's a location issue. The downloads run with only warnings, never errors, but afterwards the packages aren't at the expected location (I have another project set up exactly the same way and it works, so maybe I'm missing something here...).
Here is my Vagrantfile:
Vagrant::Config.run do |config|
config.vm.define :djangovm do |django_config|
# Every Vagrant virtual environment requires a box to build off of.
django_config.vm.box = "lucid64"
# The url from where the 'config.vm.box' box will be fetched if it
# doesn't already exist on the user's system.
django_config.vm.box_url = "http://files.vagrantup.com/lucid64.box"
# Forward a port from the guest to the host, which allows for outside
# computers to access the VM, whereas host only networking does not.
django_config.vm.forward_port 80, 8080
django_config.vm.forward_port 8000, 8001
# Enable provisioning with chef solo, specifying a cookbooks path (relative
# to this Vagrantfile), and adding some recipes and/or roles.
django_config.vm.provision :chef_solo do |chef|
chef.json = {
python: {
install_method: 'source',
version: '2.7.5',
checksum: 'b4f01a1d0ba0b46b05c73b2ac909b1df'
},
mysql: {
server_root_password: 'root',
server_debian_password: 'root',
server_repl_password: 'root'
},
}
chef.cookbooks_path = "vagrant_resources/cookbooks"
chef.add_recipe "apt"
chef.add_recipe "build-essential"
chef.add_recipe "git"
chef.add_recipe "vim"
chef.add_recipe "openssl"
chef.add_recipe "mysql::client"
chef.add_recipe "mysql::server"
chef.add_recipe "python"
end
django_config.vm.provision :shell, :path => "vagrant_resources/vagrant_bootstrap.sh"
end
end
And here the bootstrap file to download Django and continue setting up things:
#!/usr/bin/env bash
eval vagrantfile_location="~/.vagrantfile_processed"
if [ -f $vagrantfile_location ]; then
echo "Vagrantfile already processed. Exiting..."
exit 0
fi
#==================================================================
# install dependencies
#==================================================================
/usr/bin/yes | pip install --upgrade pip
/usr/bin/yes | pip install --upgrade virtualenv
/usr/bin/yes | sudo apt-get install python-software-properties
#==================================================================
# set up the local dev environment
#==================================================================
if [ -f "/home/vagrant/.bash_profile" ]; then
echo -n "removing .bash_profile for user vagrant..."
rm /home/vagrant/.bash_profile
echo "done!"
fi
echo -n "creating new .bash_profile for user vagrant..."
ln -s /vagrant/.bash_profile /home/vagrant/.bash_profile
source /home/vagrant/.bash_profile
echo "done!"
#==================================================================
# set up virtual env
#==================================================================
cd /vagrant;
echo -n "Creating virtualenv..."
virtualenv myquivers;
echo "done!"
echo -n "Activating virtualenv..."
source /vagrant/myquivers/bin/activate
echo "done!"
echo -n "installing project dependencies via pip..."
/usr/bin/yes | pip install -r /vagrant/myquivers/myquivers/requirements/dev.txt
echo "done!"
#==================================================================
# install front-endy things
#==================================================================
echo -n "adding node.js npm repo..."
add-apt-repository ppa:chris-lea/node.js &> /dev/null || exit 1
echo "done!"
echo -n "calling apt-get update..."
apt-get update &> /dev/null || exit 1
echo "done!"
echo -n "nodejs and npm..."
apt-get install nodejs npm &> /dev/null || exit 1
echo "done!"
echo -n "installing grunt..."
npm install -g grunt-cli &> /dev/null || exit 1
echo "done!"
echo -n "installing LESS..."
npm install -g less &> /dev/null || exit 1
echo "done!"
echo -n "installing uglify.js..."
npm install -g uglify-js &> /dev/null || exit 1
echo "done!"
#==================================================================
# cleanup
#==================================================================
echo -n "marking vagrant as processed..."
touch $vagrantfile_location
echo "done!"
My requirements dev.txt looks like this:
Django==1.5.1
Fabric==1.7.0
South==0.8.2
Pillow==2.1.0
django-less==0.7.2
paramiko==1.11.0
psycopg2==2.5.1
pycrypto==2.6
wsgiref==0.1.2
django-registration==1.0
Any idea why I can't find Django and my other things in my VM?
This is a whole 'nother path, but I highly recommend using Berkshelf and doing it the Berkshelf way. There's a great guide online for rolling cookbooks this way.
That is, create a cookbook as a wrapper that will do everything your script does.
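As a rough sketch of the Berkshelf approach, a Berksfile declaring the same cookbooks the Vagrantfile pulls in might look like this (the supermarket URL and cookbook availability are assumptions):

```ruby
# Berksfile -- tells Berkshelf which cookbooks to fetch and resolve.
# A sketch only; source URL and cookbook names are assumptions.
source "https://supermarket.chef.io"

cookbook "apt"
cookbook "build-essential"
cookbook "git"
cookbook "vim"
cookbook "openssl"
cookbook "mysql"
cookbook "python"
```

The wrapper cookbook's recipes would then replace the vagrant_bootstrap.sh steps (virtualenv creation, pip install, node/npm tooling), so the whole environment is converged by Chef instead of a one-shot shell script.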
So the solution was to remove the Postgres dependency psycopg2==2.5.1 from my requirements (carried over from my other project's setup), because this project uses a MySQL database instead.
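In requirements terms that just means dropping the psycopg2 line; if the Django MySQL backend needs a driver, one would be added in its place (the driver name and version below are assumptions, not from the original setup):

```text
# requirements/dev.txt (fragment)
# psycopg2==2.5.1    # removed: this project uses MySQL, not Postgres
MySQL-python==1.2.5  # assumption: whichever MySQL driver the project uses
```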