I am currently trying to install version 3 of Google Protocol Buffers on Windows for Python.
I have gone to the python folder through the command line and am attempting to run:
python setup.py build
However, I am getting the following error:
python : protoc is not installed nor found in ../src. Please compile it or install the binary package.
What's going on here?
As the error says, you must first install protoc.exe. You can get it from the Win32 package included with every Protobuf release. The latest version is here:
https://github.com/google/protobuf/releases/download/v3.0.0-alpha-3/protoc-3.0.0-alpha-3-win32.zip
(You can also build protoc from source by downloading the C++ source code release.)
I was able to solve this issue, by following the steps below:
Download the package that contains the precompiled version of protoc from https://github.com/protocolbuffers/protobuf/releases. You will find the zip file at the bottom of the page, in the assets section (e.g., protoc-3.14.0-win32.zip).
Add the path of the bin folder inside the extracted Protoc folder (it contains protoc.exe) to your system's environment variables.
Open cmd and go to the directory where you cloned the protocol buffers source code (https://github.com/protocolbuffers/protobuf), then go into the python folder.
Check that Python 2.7 or newer is installed by running python -V. If so, run python setup.py build followed by
python setup.py install
Check the installed protoc version with protoc --version
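Before running setup.py build, you can also do a quick sanity check that the build will actually find protoc. A minimal sketch (assuming Python 3; on Python 2 you can simply run where protoc and protoc --version at the command prompt):

import shutil
import subprocess

# Look protoc up on the PATH; None means it was not found.
protoc_path = shutil.which("protoc")
if protoc_path is None:
    raise SystemExit("protoc not found on PATH; check that the bin folder was added to the system variables")

# Show which binary will be used and its version.
print(protoc_path)
print(subprocess.check_output([protoc_path, "--version"]).decode().strip())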
I have been trying to install ONOS using Bazel's new version, i.e., Bazel 5.1.1, on Ubuntu 20.04 LTS for Mininet/Containernet. I keep having issues with the bazel build onos command. I have searched for the installation process all over and tried many approaches, but I run into similar issues. Is there a link or article with a clear step-by-step procedure? I need to finish this installation within a week. Any help would be appreciated. Thanks in advance.
The error I get:
sendate#sendate04:~/onos$ bazel build onos
ERROR: The project you're trying to build requires Bazel 3.7.2 (specified in /home/sendate/onos/.bazelversion), but it wasn't found in /home/sendate/.bazel/bin.
Bazel binaries for all official releases can be downloaded from here:
https://github.com/bazelbuild/bazel/releases
You can download the required version directly using this command:
(cd "/home/sendate/.bazel/bin" && curl -fLO https://releases.bazel.build/3.7.2/release/bazel-3.7.2-linux-x86_64 && chmod +x bazel-3.7.2-linux-x86_64)
I tried the suggested steps, such as downloading the required version, and also tried changing the version in the .bazelversion file, but the Bazel build still did not succeed.
1. Check your Bazel version with bazel --version. If you don't have Bazel 3.7.x, download it.
2. If you installed Java version 11, you need to create the default Java symbolic link (under /usr/lib/jvm).
3. Test echo $PATH; it should include the ONOS path.
4. Then run sudo apt install --reinstall build-essential.
5. Build ONOS.
I want to embed Python scripts in my C++ Qt application. Searching the net, I found that PythonQt is exactly what I am looking for, but its GitHub repo only gives build instructions for Windows, not for Ubuntu. After cloning the repo, if I include its src in my Qt .pro file, I get the error
Python.h not found. I think the reason is that I didn't build it on my system. Can anyone tell me how to build PythonQt on Ubuntu? The link to the repo is: https://github.com/MeVisLab/pythonqt
If this doesn't work, you can also suggest some other way to embed Python scripts into my Qt C++ application.
First clone the repo using the following command:
git clone https://github.com/MeVisLab/pythonqt.git
After that, cd into the cloned folder and run the command below to generate the build files:
qmake
This will generate the Makefile in your current directory. Then run the following commands to build and install PythonQt on your system:
sudo make all
sudo make install
If you get the following error while running those commands:
fatal error: 'private/qmetaobjectbuilder_p.h'
run the command below to fix it:
sudo apt install qtbase5-private-dev
I've previously installed caffe and Fast-RCNN, so I should have all the required libraries and dependencies.
I need to install it again for another repository (https://github.com/ronghanghu/natural-language-object-retrieval) that uses Caffe.
When I run
make all
it gives me the following error:
CXX .build_release/src/caffe/proto/caffe.pb.cc
In file included from .build_release/src/caffe/proto/caffe.pb.cc:5:0:
.build_release/src/caffe/proto/caffe.pb.h:12:2: error: #error This file was generated by a newer version of protoc which is
#error This file was generated by a newer version of protoc which is
^
.build_release/src/caffe/proto/caffe.pb.h:13:2: error: #error incompatible with your Protocol Buffer headers. Please update
#error incompatible with your Protocol Buffer headers. Please update
^
.build_release/src/caffe/proto/caffe.pb.h:14:2: error: #error your headers.
#error your headers.
^
In file included from .build_release/src/caffe/proto/caffe.pb.cc:5:0:
.build_release/src/caffe/proto/caffe.pb.h:26:55: fatal error: google/protobuf/generated_enum_reflection.h: No such file or directory
#include <google/protobuf/generated_enum_reflection.h>
compilation terminated.
make: *** [.build_release/src/caffe/proto/caffe.pb.o] Error 1
I thought maybe protobuf had been updated, so I tried
protoc --version
which returns
libprotoc 2.5.0
It seems a newer version of protobuf has been released (2.6 or later).
So my question would be:
1) Is there a simple way to update it?
2) If I do update it, will it affect Caffe and Fast-RCNN, which I previously installed and which depend on the older version of protobuf?
I suspect your problem is that you have multiple versions of protobuf in your include path. It may be picking up the headers from an older version instead of the latest. I can confirm that the latest Caffe (git master as of right now) compiles cleanly against the libprotobuf-dev 2.5.0-9ubuntu1 package in Ubuntu 14.04 LTS.
I guess that before you got this problem, you had used protoc to generate caffe.pb.h. If so, my solution may be useful to you.
First, find out how many protoc binaries are installed on your system.
For example, in my OS:
Prompt> whereis protoc
protoc: /usr/bin/protoc /home/xxx/.conda/envs/python27/bin/protoc /usr/share/man/man1/protoc.1.gz
So there are two protoc binaries on my system. You can use which protoc and protoc --version to find which one is used by default. On my system:
Prompt> which protoc
/home/xxx/.conda/envs/python27/bin/protoc
Prompt> protoc --version
libprotoc 3.5.1
Finally, use the other protoc to regenerate caffe.pb.h. Change directory to caffe/src/caffe/proto and execute:
/usr/bin/protoc --cpp_out=. caffe.proto
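If you prefer to do the same check from Python rather than with whereis/which, here is a small sketch. It simply walks PATH and reports every protoc it finds, which makes a conda protoc shadowing /usr/bin/protoc easy to spot (assumes Python 3):

import os
import subprocess

# Report every protoc binary on PATH together with its version,
# to spot mismatched installations (e.g. a conda protoc shadowing /usr/bin/protoc).
for directory in os.environ.get("PATH", "").split(os.pathsep):
    candidate = os.path.join(directory, "protoc")
    if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
        version = subprocess.check_output([candidate, "--version"]).decode().strip()
        print(candidate, "->", version)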
I solved this on my computer and maybe it can help you. My environment is Ubuntu 16.04, and I installed Anaconda (for Python 2.7) before installing Caffe. I had used conda to install libprotobuf-dev, which conflicted with Caffe's sudo apt-get install libprotobuf-dev step, because apt-get and conda installed different versions on my computer (you can see this with locate protobuf). So I removed Anaconda's version of libprotobuf, and the problem did not happen again:
conda uninstall libprotobuf
If you install TensorFlow before installing Caffe, this problem can also happen because of the libprotobuf conflict.
I solved the problem by running
conda uninstall libprotobuf
then removed the caffe folder and cloned a fresh copy:
git clone https://github.com/BVLC/caffe.git
and then ran:
make all -j8
I'm trying to use the GLPK solver with Pyomo. I have a working model that's been tested, but keep getting an error saying GLPK can't be found.
WARNING: Could not locate the 'glpsol' executable, which is required for solver 'glpk'
I've installed GLPK successfully. I also added the directory to my PATH variable so the executable can be called globally. I tested this with glpsol --help from my command line and saw the help info printed.
The below thread says it should be working, but alas, it is not.
How do you install glpk-solver along with pyomo in Winpython
Any ideas?
This answer is late but I want to share the solution that worked for me.
import sys
from pyomo.environ import SolverFactory  # imports needed for the snippets below

solvername = 'glpk'
solverpath_folder = 'C:\\glpk\\w64'  # does not need to be directly on the C drive
solverpath_exe = 'C:\\glpk\\w64\\glpsol'  # does not need to be directly on the C drive
I used to do this:
sys.path.append(solverpath_folder)
solver=SolverFactory(solvername)
This works for the CBC solver from COIN-OR, but it does not work for GLPK. Then I tried something different:
solver=SolverFactory(solvername,executable=solverpath_exe)
This worked for both cbc and glpk. No idea why this works (I really didn't do anything else).
Version: Python 2.7 or Python 3.7 (tested both), glpk 4.65
You can install the GLPK solver using this command:
brew install glpk
Installing the glpk package worked for me. As I use Anaconda:
conda install -c conda-forge glpk
This was after already including the glpsol executable's folder path in my PATH variables.
So it looks like the path variable you set is not handled by your Python installation.
A normal Python installation uses a separate "PYTHONPATH" environment variable to look up additional modules.
There is also the option to make an entry in the Windows registry or (as you already mentioned) to move the files into the Python home directory, which is resolved relative to your installation directory if "PYTHONHOME" is not set.
More information is in the Python documentation under section 3.3.3:
https://docs.python.org/2/using/windows.html#finding-modules
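If it helps, here is a small sketch (assuming Python 3) for inspecting where your interpreter actually looks for modules and whether PYTHONPATH/PYTHONHOME are set:

import os
import sys

# Directories Python searches when importing modules.
for entry in sys.path:
    print(entry)

# These environment variables, if set, influence the list above.
print("PYTHONPATH =", os.environ.get("PYTHONPATH"))
print("PYTHONHOME =", os.environ.get("PYTHONHOME"))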
I was having the same issue. I don't know if this is a solution but it definitely got the solver working.
After downloading the Windows installation, I copied all the files in the w64 folder and pasted them directly into my Python working directory.
After that my python code could locate the solver.
NOTE: this was for Python 3.4.3.4, Windows 8.1 64 bit
Reading the source code here suggests you try:
from pyutilib.services import register_executable, registered_executable
register_executable(name='glpsol')
Maybe it will give a clue.
I had the same issue on Windows 10 and it was down to glpk being installed in a different conda environment. Full steps for installing pyomo and glpk below.
Test the installation by running the 'Repeated Solves' example from https://pyomo.readthedocs.io/en/latest/working_models.html
Instructions (run at an anaconda prompt)
conda create --name myenv
conda activate myenv
conda install -c conda-forge pyomo
conda install -c conda-forge pyomo.extras
conda install -c conda-forge glpk
Run Spyder from myenv so that it finds everything:
spyder activate myenv
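As a quick check that Pyomo can now locate the solver, a minimal sketch (run it inside the activated myenv environment):

from pyomo.environ import SolverFactory

# available() returns True only if the glpsol executable can actually be located.
solver = SolverFactory("glpk")
print("GLPK available:", solver.available(exception_flag=False))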
Here is the relevant part where Pyomo 6.2 searches for the glpsol executable:
https://github.com/Pyomo/pyomo/blob/568c6595a56570c6ea69c3ae3198b73b9f473abd/pyomo/common/fileutils.py#L288
def _path():
return (os.environ.get('PATH','') or os.defpath).split(os.pathsep)
There are two options to solve the PATH problem:
Putting the executable in a folder that is already on PATH (recommended practice). The glpsol executable must be in one of the folders listed in the PATH system environment variable. Use print(os.environ['PATH']) in your code to see those folders and put it in one of them.
Adding the folder to PATH at runtime. You can add it to the system PATH statically or use code to add it dynamically (only while your script is running):
import os

GLPK_FOLDER_PATH = "path/to/glpk"
os.environ["PATH"] += os.pathsep + GLPK_FOLDER_PATH
In my case, my Python project has a virtual environment .venv, and my installation process copies the files needed by the glpsol executable into the .venv/Scripts folder when the project is installed. Because that folder is automatically added to PATH when Python is called from the virtual environment, libraries like Pyomo can find it, and I don't have to remember to add the folder to PATH at runtime whenever I want to use Pyomo.
For anyone that has the same issue, I found a workaround (not a solution!). I copied all the glpk files into my C:/Python27 directory, and (Surprise!) Python can now find them.
I'll hold out for a real solution before accepting this one.
I have developed some software using Python under Windows 7.
I have given it to a colleague to run on a Mac (OS X 10.9.2). I have never used a Mac and am having trouble helping them get started. I have downloaded and installed Anaconda 1.9.2 on the Mac. According to the Continuum documentation, libtiff is included, but when I run my Python file using the Spyder IDE I get the following error when it tries to import libtiff:
ImportError: No module named libtiff.
Following one of the answers on Stack Overflow, I tried:
conda install libtiff
This runs and returns:
All requested packages already installed.
However on Windows 7 I can see a libtiff folder under \python27\lib\site-packages. On the Mac there is no libtiff folder under /lib/python2.7/site-packages.
Can anyone tell me what I am missing?
This question is answered here:
Installing Python modules with Anaconda or Canopy
If pip install libtiff does not work, you can download the source for PyLibTiff as directed at https://code.google.com/p/pylibtiff/source/checkout and run setup.py with whichever interpreter you want PyLibTiff installed for.
Also, you do not need the C libraries that Anaconda installs for PyLibTiff to work, as long as libtiff libraries are installed elsewhere.
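To see which interpreter you are actually running and whether it can import libtiff, a quick check (works on both Python 2 and 3):

import sys

print("Interpreter: " + sys.executable)

try:
    import libtiff
    print("libtiff found at: " + libtiff.__file__)
except ImportError:
    print("libtiff is not importable from this interpreter")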
I'm unclear on this, but to begin with you can type echo $PATH in the terminal to see which paths are set. I'm also unclear on how Anaconda interacts with the system, but a good hunch is that the library file not being on a search path would cause this.
Also, looking at this thread on Google Groups, it seems that Anaconda installs its own libraries, which might need to be symbolically linked into the main /usr/local/lib directory. The user Denis Engemann, who made the post, gives this bash script in the last response in the thread:
for lib in ~/anaconda/lib/*;
do
ln -s $lib /usr/local/lib/$(basename $lib);
done
I would recommend checking those two directories first before linking to make sure all is as expected.