Is it possible to use go build with extra build steps?

What should you do when go build is not enough and you need to run extra commands alongside it? Do the Go tools cover this use case? If so, what's the convention?
I noticed it's possible to pass extra flags to build tools with:
//#cgo pkg-config: glib-2.0 gobject-2.0 etc etc
import "C"
Is it possible to run extra commands or at least tell go build to use a Makefile?

No. The go tool isn't intended to be a generic build system. There are some provisions made for cgo (like pkg-config), but it's not extendable.
In Go 1.4 there will be the generate command. It lets you run arbitrary commands for pre-processing source files, but it always has to be a separate step that is run explicitly; you can't hook it into go get, go build, or go install.
Many projects that require a more complicated build use a script or a Makefile and give up on being go get-able. Library packages, however, should strive to be get-able for simplicity of dependency resolution.

I don't believe you can add extra steps.
pkg-config is a special keyword for the built-in build system.
Normally, more complex builds are accomplished with a makefile that calls go build at the appropriate step.
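For example, a minimal Makefile along those lines might look like this (a sketch only; the generate target is relevant only if the project uses //go:generate directives, and the ./... pattern is just the usual "everything below here" package pattern):

```makefile
# Sketch: wrap go build in make so extra steps can run first.
.PHONY: all generate build test

all: generate build

generate:            # run any //go:generate directives explicitly
	go generate ./...

build: generate      # compile once generated files exist
	go build ./...

test: build
	go test ./...
```

Users then run make instead of go build directly, and the extra steps happen in the right order.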

Related

Autotools: Compiling the same project using multiple configurations

I am setting up a project based on Autotools for the first time, and I need to build it for two platforms (using two different toolchains).
Currently, when I want to build for one toolchain, I restart from scratch (autoconf, configure, make) because I am not sure I fully understand how the process works.
But is the autoconf/configure part required each time?
If I create two build directories and call configure from each directory (with different flags for each one), can I then just call make without repeating the whole process?
Thanks
But is the autoconf/configure part required each time? If I create two build directories and call configure from each directory (with different flags for each one), can I then just call make without repeating the whole process?
Autoconf is for building the (cross-platform) build system. There is no reason to think that running the same version of Autoconf with the same inputs (your configure.ac and any custom macros bundled with your project, mostly) would yield different outputs. So no, you don't need to run autoconf separately for each build, and in fact it is the standard Autotools model that you not run autoconf at each build.
So yes, it is absolutely natural and expected usage to create two build directories, and in each one run configure; make. Moreover, if indeed you are creating separate build directories instead of building in the source directory, then you will be able to see that configure writes all its output to the build directory. Thus, in that case, one build cannot interfere with another.
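A self-contained sketch of the idea, using a stub configure script in place of a real Autotools-generated one (the toolchain names are placeholders), just to show that each build directory keeps its own configuration:

```shell
# Two out-of-tree build dirs sharing one source dir
demo="$(mktemp -d)"
mkdir -p "$demo/src" "$demo/build-native" "$demo/build-arm"

# Stand-in for a real Autotools configure script: it records its arguments
cat > "$demo/src/configure" <<'EOF'
#!/bin/sh
echo "configured with: $*" > config.status
EOF
chmod +x "$demo/src/configure"

# Configure each build directory with its own flags; a real build would
# then run make in each one
( cd "$demo/build-native" && ../src/configure CC=gcc )
( cd "$demo/build-arm"    && ../src/configure CC=arm-linux-gnueabihf-gcc )

cat "$demo/build-native/config.status"   # configured with: CC=gcc
cat "$demo/build-arm/config.status"      # configured with: CC=arm-linux-gnueabihf-gcc
```

Each configure writes only into its own build directory, so the two configurations never interfere.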

How to manage building a huge source code that uses gnu autotools?

I have multiple source trees which I have to cross-build and use together as one huge project. The build process for each one is the same ./configure && make && make install sequence, with extra parameters added for cross-compilation. So far I have managed this by typing a really long configure command ("./configure CC=.....") in a text editor, copy-pasting it into a terminal, and running it, then repeating the process for the next source tree. With multiple include paths, library paths, pkg-config paths, etc. to keep track of, the process turns out to be very messy, error-prone, and cumbersome. I have already tried the Eclipse IDE and found no option for customizing the "./configure .." command to my needs. Is there any elegant way to handle this problem? I would like a solution that requires me to write a minimal amount of script/instructions.
Is there any elegant way to handle this problem?
If you want to automate the configuration and compilation of several sub-projects which are really one project, I suggest you use the canonical GNU Autotools way of dealing with this, which is a nested Autotools project.
That way, you can have one project which contains all the other projects, in the following fashion:
./UmbrellaProject
subproject1/
subproject2/
...
Makefile.am
Inside the parent project Makefile.am you will have a line at the beginning such as:
SUBDIRS = subproject1 subproject2
More information at the GNU Automake docs
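A sketch of the matching top-level configure.ac (the project and subproject names are placeholders; AC_CONFIG_SUBDIRS is what tells the top-level configure to recurse into each sub-project's own configure):

```m4
AC_INIT([UmbrellaProject], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_CONFIG_SUBDIRS([subproject1 subproject2])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

With this in place, one configure invocation at the top (with all the cross-compilation flags) propagates to every sub-project.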

How does one find the targets that a particular Makefile exposes

Many a time, I would do the standard ./configure, make, and make install when compiling a package.
However, there are times when I see a package being built with optional make parameters that are specific to that package.
Is there an easy way to list the available targets other than reading through the Makefile source?
As a general statement, no.
If there are meaningful optional targets or variables, the project should call them out specifically in its documentation.
The bash (and probably zsh) tab-completion support does attempt to discover the available make targets (with varying degrees of success), if that is of help.
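One common heuristic, which is my addition rather than something the answer above endorses, is to dump GNU make's rule database and filter out lines that look like target definitions:

```shell
# -q: question mode (run nothing), -p: print the rule database.
# Heuristic only: it also catches some internal targets and misses
# targets that are generated at runtime.
make -qp 2>/dev/null \
  | awk -F':' '/^[a-zA-Z0-9][^$#\/\t=]*:([^=]|$)/ {print $1}' \
  | sort -u
```

This is roughly what the bash completion machinery does under the hood, which is why its results vary from project to project.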

How to manage growing C++ project

I am wondering how I should manage a growing C++ project. I am currently developing it with NetBeans, whose generated makefiles are messy to work with. The project has become too big, and I have decided to split it up into a few parts. What is the best way of doing this?
I am trying to use SCons as my build system. I have had some success with it, but I have to edit the build scripts every time I add or delete files. It's too dull.
So I need your advice.
P.S. By the way, how does a large project like Google Chrome handle this? Does everybody use some kind of IDE, with build scripts generated only for software distribution?
I also use NetBeans for C++ and compile with SCons. I use the jVi NetBeans plugin, which works really well.
For some reason the NetBeans Python plugin is no longer official, which I don't understand at all. You can still get it, though, and it really makes editing the SCons build scripts a nice experience. Even though NetBeans doesn't have an SCons plugin (yet?), you can still configure its build command to execute SCons.
As for having the IDE maintain the SCons scripts automatically, I don't do that either; I do it by hand. But it's not like I have to deal with this on a daily basis, so I don't see that it's that important, especially considering how easy to read the scripts are.
Here's the SCons build script that does the same as the CMake example in the other answer:
env = Environment()
env.EnsurePythonVersion(2, 5)
env.EnsureSConsVersion(2, 1)
libTarget = env.SharedLibrary(target = 'foo', source = ['a.cpp', 'b.cpp', 'c.cpp'])
env.Program(target = 'bar', source = ['bar.cpp', libTarget])
The SCons Glob() function is a nice option, but I tend to shy away from automatically building all the files in a directory. The same goes for listing sub-directories to be built. I've been burned enough times by this, and prefer explicitly specifying the files/dirs to be built.
In case you hear those rumors that SCons is slower than other alternatives, the SCons GoFastButton has some pointers that can help out.
Most large projects stick with a build system that automatically handles all the messy details for them. I'm a huge fan of CMake (which is what KDE uses for all their components) but scons is another popular choice. My editor (KDevelop) supposedly handles CMake projects itself, but I still edit the build scripts myself because it's not that hard.
I'd recommend learning one tool really well and sticking with it (plenty of documentation is available for any tool you'll be interested in). Be sure you also look into version control if you haven't already (I have a soft spot for git, but Mercurial and Subversion are also very popular choices).
A simple CMake example:
cmake_minimum_required(VERSION 2.8)
project("My Awesome Project" CXX)
add_library(foo SHARED a.cpp b.cpp c.cpp) #we'll build an so file
add_executable(bar bar.cpp)
target_link_libraries(bar foo) #link bar to foo
This is obviously a trivial case, but it's very easy to manage and expand as needed.
I am trying to use SCons as my build system. I have had some success with it, but I have to edit the build scripts every time I add or delete files. It's too dull.
Depending on how your files are organized, you can use, for example, SCons's Glob() function to get the source files as a list without having to list each file individually. For example, to build all C++ source files in a directory into an executable, you can do:
Program('program', Glob('*.cpp'))
You can do the same in CMake using its commands.
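For reference, a rough CMake equivalent of that Glob() call (the file pattern and target name are placeholders; note that the CMake documentation itself discourages globbing for sources, for the same staleness reasons mentioned above):

```cmake
file(GLOB SRC_FILES "*.cpp")          # collect every .cpp in this directory
add_executable(program ${SRC_FILES})  # build them into one executable
```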
And, if you're using SCons, since it's Python you can write arbitrary Python code to make your source file lists.
You can also organize files into multiple folders and have subsidiary SCons (or CMakeLists.txt) build files that the master build script can call.

Making a Makefile

How can I make a Makefile? It seems to be the best way to distribute a program by source code. Keep in mind that this is for a C++ program and I'm just starting out in C development. But is it also possible to make a Makefile for my Python programs?
From your question it sounds like a tutorial or an overview of what Makefiles actually do might benefit you.
A good places to start is the GNU Make documentation.
It includes the following overview "The make utility automatically determines which pieces of a large program need to be recompiled, and issues commands to recompile them."
And its first three chapters cover:
Overview of make
An Introduction to Makefiles
Writing Makefiles
I use Makefiles for some Python projects, but this is highly dubious... I do things like:
SITE_ROOT=/var/www/apache/...
site_dist:
	cp -a assets/css build/$(SITE_ROOT)/css
	cp -a src/public/*.py build/$(SITE_ROOT)
and so on. Makefiles are nothing but batch-execution systems (and fairly complex ones at that). You can use your normal Python tools (to generate .pyc files and so on) the same way you would use GCC.
PY_COMPILE_TOOL=pycompiler

all: myfile.pyc
	cp myfile.pyc /usr/share/python/...wherever
myfile.pyc: <deps>
	$(PY_COMPILE_TOOL) myfile.py
Then
$ make all
And so on. Just treat your operations like any other. Your pycompiler might be something simple like:
#!/usr/bin/python
import sys
import py_compile
py_compile.compile(sys.argv[1])  # compile the file named on the command line
or some variation on
$ python -m compileall .
It is all the same. Makefiles are nothing special, just automated executions and the ability to check if files need updating.
How can I make a Makefile? It seems to be the best way to distribute a program by source code.
It's not. For example, KDE uses CMake, and Wesnoth uses SCons. I would suggest one of these systems instead, they are easier and more powerful than make. CMake can generate makefiles. :-)
A simple Makefile usually consists of a set of targets, their dependencies, and the actions performed by each target:
all: output.out

output.out: dependency.o dependency2.o
	gcc -o output.out dependency.o dependency2.o

dependency.o: dependency.c
	gcc -c -o dependency.o dependency.c

dependency2.o: dependency2.c
	gcc -c -o dependency2.o dependency2.c
The target all (the first one in the example) is what runs when no target argument is specified on the make command line; make first builds its dependencies if they don't exist or are out of date.
For Python programs: they're usually distributed with a setup.py script which uses distutils to build and install the software. distutils has extensive documentation, which should be a good starting point.
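A minimal sketch of such a setup.py (the project name "myprogram" and module "myfile" are hypothetical placeholders):

```python
# Minimal setup.py sketch; "myprogram" and "myfile" are hypothetical names.
# Note: on modern Python, distutils has been removed, and
# "from setuptools import setup" is the drop-in replacement.
from distutils.core import setup

setup(
    name="myprogram",
    version="0.1",
    py_modules=["myfile"],  # the modules to install
)
```

Users then run python setup.py sdist to produce a source tarball, or python setup.py install to build and install the software.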
If you are looking for a portable way of creating Makefiles, have a look at CMake: http://www.cmake.org/cmake/project/about.html