I've recently started using a lightweight IDE called Geany. It's really efficient, very light on resources, and has all the basic functionality I need. It has built-in syntax highlighting for lots of programming languages, including C++, which I'm coding in; however, some of the highlighting doesn't seem to work properly.
Looking at the screenshot I took, you can see that there is some syntax highlighting going on; however, the user-declared function "addition" has no colouring applied to it, even after changing its colour in the configuration files. I also found that when I change the colour of "operator" in the configuration file, it changes every semicolon, bracket, etc. to that colour, so clearly the detection there isn't great.
I think this issue is due to the way the syntax highlighter works, which I believe is Scintilla, judging by the Geany GitHub source files. Here is the lexer file specifically for C++: https://github.com/geany/geany/blob/master/scintilla/lexers/LexCPP.cxx
I have a few questions:
Can the lexer file for C++ be replaced with a better one so it can actually detect user-defined functions as well as other parts of the language?
Is there a way of viewing all the different syntax elements that Scintilla picks up on (string, operator, preprocessor, etc.)?
Are there any better syntax highlighters that could possibly be integrated into Geany?
I made an external plugin that enables surgical filetype editing, with color selectors.
https://github.com/webdev23/Geany-editor-dynamic-color-schemes
It dynamically edits the filetypes.xml config.
I'm thinking of adding the ability to import highlighting themes from other editors; that would be doable from this base.
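For comparison, here is roughly the kind of hand edit the plugin automates. This is only a sketch: it assumes the classic name=foreground;background;bold;italic entry format of the [styling] section in a per-user filetype override, the exact keys and format can differ between Geany versions, and the colour values are placeholders.
# Hypothetical manual override of two C++ styles in Geany's user config.
# Assumed entry format: name=foreground;background;bold;italic
mkdir -p ~/.config/geany/filedefs
cat >> ~/.config/geany/filedefs/filetypes.cpp <<'EOF'
[styling]
word=0x0000d0;0xffffff;true;false
operator=0x800000;0xffffff;false;false
EOF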
Otherwise, here is a one-line install: all plugins, all themes, and config tweaks:
sudo apt install geany geany-plugins &&
curl -o geany-themes-master.zip https://codeload.github.com/codebrainz/geany-themes/zip/master && unzip geany-themes-master.zip && cd geany-themes-master && ./install.sh &&
curl -o geany16.zip https://codeload.github.com/RobLoach/base16-geany/zip/master && unzip geany16.zip && echo -e "Config written. Close Geany to apply." > ~/info_geany.txt && cd base16-geany-master && find . -type f -name "*.conf" -exec cp -n {} ~/.config/geany/colorschemes/ \; &&
geany ~/info_geany.txt && rm ~/info_geany.txt && sed -i 's/autocomplete_doc_words=false/autocomplete_doc_words=true/' ~/.config/geany/geany.conf && sed -i 's/pref_editor_tab_width=4/pref_editor_tab_width=2/' ~/.config/geany/geany.conf && sed -i 's/symbolcompletion_min_chars=4/symbolcompletion_min_chars=2/' ~/.config/geany/geany.conf && sed -i 's/color_scheme=/color_scheme=base16-atelierlakeside.dark.conf/' ~/.config/geany/geany.conf && echo "Install done."
I would like to use Vim to find a certain string and replace it with another across multiple files. For every replacement, it should ask for confirmation, similar to what %s/foo/replace/gc does for a single file in Vim.
What have I tried?
sed: It doesn't do interactive replacements.
One of the comments at this link suggests vim -esnc '%s/foo/bar/g|:wq' file.txt. I tried vim -esnc '%s/foo/bar/gc|:wq' file.txt (gc instead of g), but the terminal just gets stuck.
Emacs xah-find-replace package: unfortunately it didn't do interactive replacements as promised in the link.
Combining :argdo with the substitute command would be the recommended way to do this.
You can populate the argument list either by opening all the files at once (vim *.txt) or by populating it manually after opening Vim using the command:
:args `find . -type f -name '*.txt'`
Now set hidden using the command:
:set hidden
This is required so that you're not prompted to save the file when switching from one buffer to another. Refer to :h hidden for more information.
Now use the substitute command like you're used to, prefixing it with argdo to perform it for every file in the argument list:
:silent argdo %s/pattern/replace/gec
The silent is optional and just mutes the reporting. The e flag stops the "no matches found" error from being reported in buffers where the pattern doesn't occur.
After the replacements, you can write the changes using the following command:
:argdo update
This writes only the buffers that were modified.
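Putting the steps together, the whole thing can also be driven from the shell in one invocation. This is only a sketch: pattern, replace, and the *.txt glob are placeholders, and I've left off the optional silent so the confirmation prompts stay visible.
# Open all matching files, run the confirmed substitution across the
# argument list, then write only the buffers that changed.
vim -c 'set hidden' \
    -c 'argdo %s/pattern/replace/gec' \
    -c 'argdo update' \
    *.txt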
If you are looking for an interactive mode of replacement, it is easier to do it with vim.
vim -c '%s/PATTERN/REPLACEMENT/gc' -c 'wq' FILENAME
The stuck terminal in your case is due to chaining the save command onto the substitute command in the same command string, which does not allow the interactive mode to come into action. And it is not really a stuck terminal: if you type y and press Enter, it should still give you the expected result.
In case multiple files are involved, spread across multiple subdirectories, using the find command with a for loop will help, as shown below:
for FILENAME in $(find DIRECTORYPATH -type f -name '*.txt')
do
vim -c '%s/PATTERN/REPLACEMENT/gc' -c 'wq' "$FILENAME"
done
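If the filenames may contain spaces, a variant that avoids the word splitting of the for loop is to let find invoke Vim directly. A sketch with the same placeholder pattern:
# find hands each matching file to Vim one at a time.
find DIRECTORYPATH -type f -name '*.txt' \
    -exec vim -c '%s/PATTERN/REPLACEMENT/gc' -c 'wq' {} \;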
In bash, turn on globstar so that ** matches all files in all subdirectories:
shopt -s globstar
Now start vim once with all files and run the substitute command for all files, then save and exit:
vim -c 'set nomore' -c 'argdo %s/foo/bar/gc' -c xa **/*.txt
I'm creating an R package that uses a third-party (closed-source) API for importing .edf files into R (from SR Research Eyelink eye trackers). Someone who has already gotten this to work in Linux has shared his code, and I was able to get it to work on Mac. It was a matter of changing the src/Makevars file to point to the API as it's installed on the Mac:
PKG_LIBS=-framework edfapi -F/Library/Frameworks/
To make it work in linux, Makevars needs to have:
PKG_LIBS=-L/usr/local/lib -ledfapi -lm
I know that for Windows-specific options, I need to create a Makevars.win file, but how do I have the build options change for Mac versus Linux? I would like to do something like:
if [[ `uname` == Darwin ]]; then
PKG_LIBS=-framework edfapi -F/Library/Frameworks/
fi
if [[ `uname` == Linux ]]; then
PKG_LIBS=-L/usr/local/lib -ledfapi -lm
fi
but putting this into Makevars doesn't work. From researching this it seems that I need a combination of setting options in configure and Makevars, but I haven't quite figured it out. I am comfortable with R programming and know just enough C++ to make some basic functions, but I still don't understand all the nuances involved in the build process. If someone could explain the main purpose of configure/configure.ac versus Makevars/Makevars.in, that would be helpful as well.
Ideally I would like to bundle the API along with the R package and have the different versions in a platform-specific folder. The API consists of just 3 header files and a binary (and it rarely changes). I realize this would prevent me from putting the package on CRAN but that is fine. I've managed to successfully build the package with the API files in a different folder, but at runtime it still looks for it in the standard spot (/Library/Frameworks). I realize this is a more loaded question and I can create a separate post as well.
This post helped me figure it out: stackoverflow.com/a/32590600/1457051
configure (in the package root directory) looks like this:
#!/bin/bash
#make the Makevars file
if [ ! -e "./src/Makevars" ]; then
touch ./src/Makevars
fi
#if mac
if [[ `uname` == Darwin ]]; then
echo "PKG_LIBS=-framework edfapi -F/Library/Frameworks/" > ./src/Makevars
#if linux
elif [[ `uname` == Linux ]]; then
echo "PKG_LIBS=-L/usr/local/lib -ledfapi -lm" > ./src/Makevars
fi
Makevars is created and the appropriate options are added based on the platform. There may be a more direct solution, but this works for my purposes.
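A slightly more compact configure that does the same thing, using a case statement and failing loudly on platforms it doesn't know about (a sketch; the library paths are the same ones from the question):
#!/bin/sh
# Write src/Makevars with platform-specific linker flags.
case "$(uname)" in
    Darwin) echo 'PKG_LIBS=-framework edfapi -F/Library/Frameworks/' > ./src/Makevars ;;
    Linux)  echo 'PKG_LIBS=-L/usr/local/lib -ledfapi -lm' > ./src/Makevars ;;
    *)      echo "Unsupported platform: $(uname)" >&2; exit 1 ;;
esac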
Hello, I wrote a bash script that compiles several cpp and object files with g++. My goal is to run the script from Vim with :!, but it doesn't work within Vim, only when I'm outside it. In addition, I wanted to know why using % in a script doesn't give me the current file, but gives an error instead.
The script:
#!/bin/bash
# Search for the main module and remove the ext.
delimain=`grep main *.cpp | cut -d. -f1`
# Checks if there are also object files
if [ -f ./*.o ]; then
g++ -g *.cpp *.o -o "$delimain.exe"
# If there are only cpp files
else
g++ -g *.cpp -o "$delimain.exe"
fi
Thanks!
RE your comment
I'm using Ubuntu 13.10; the alias is: alias link 'sh ~/bin/lin_script %'
You should not invoke a shell script with an explicit interpreter; the #!/bin/bash first line already tells the shell which interpreter to use. You're obviously a beginner in Bash; try to read some introductions to gain a better understanding.
Aliases won't work in Vim because they are only defined in an interactive shell, but the commands launched from Vim usually are launched in a non-interactive shell (because this is faster and comes with less unnecessary stuff).
The alias is interpreted by the shell, but the % symbol is special to Vim. The two are not the same. See my other answer for how to pass a filename to the script.
You're right that % is automatically expanded when supplied to an Ex command inside Vim, but this does not apply to external scripts. What you have to do is pass the current file when invoking the external script, and in there reference the command-line argument:
Inside Vim:
:!linkage.sh %
In your script:
if [ $# -gt 0 ]; then
delimain=$1
else
delimain=`grep ...`
fi
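Putting the two pieces together, a sketch of the full script with the argument handling added. The compile logic and the grep fallback are taken from the question; I've also stripped a trailing .cpp from the argument so the output name matches the grep branch. Treat this as a starting point rather than a finished tool.
#!/bin/bash
# Use the file passed from Vim (via %) if given, otherwise fall back to grep.
if [ $# -gt 0 ]; then
    delimain=${1%.cpp}
else
    delimain=$(grep main *.cpp | cut -d. -f1)
fi
# Link any existing object files as well.
if [ -f ./*.o ]; then
    g++ -g *.cpp *.o -o "$delimain.exe"
else
    g++ -g *.cpp -o "$delimain.exe"
fi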
I've got a file structure that looks like:
A/
2098765.1ext
2098765.2ext
2098765.3ext
2098765.4ext
12345.1ext
12345.2ext
12345.3ext
12345.4ext
B/
2056789.1ext
2056789.2ext
2056789.3ext
2056789.4ext
54321.1ext
54321.2ext
54321.3ext
54321.4ext
I need to rename all the files that begin with 20 to start with 10; i.e., I need to rename B/2022222.1ext to B/1022222.1ext
I've seen many of the other questions regarding renaming multiple files, but couldn't seem to make it work for my case. Just to see if I've figured out what I'm doing before I actually do the copy/renaming, I've tried:
for file in "*/20?????.*"; do
echo "{$file/20/10}";
done
but all I get is
{*/20?????.*/20/10}
Can someone show me how to do this?
You just have a little bit of incorrect syntax is all:
for file in */20?????.*; do mv $file ${file/20/10}; done
Remove quotes from the argument to in. Otherwise, the filename expansion does not occur.
The $ in the substitution should go before the brace.
Here is a solution which uses the find command:
find . -name '20*' | while read oldname; do echo mv "$oldname" "${oldname/20/10}"; done
This command does not actually do your bidding; it only prints out what should be done. Review the output, and if you are happy, remove the echo command and run it for real.
Just wanna add to Explosion Pill's answer.
On OS X, though, you must quote the arguments:
mv "${file}" "${file/20/10}"
or the mv command does not recognize them.
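That is, the full loop from the earlier answer with both arguments quoted (a minimal sketch):
for file in */20?????.*; do mv "${file}" "${file/20/10}"; done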
Brace expansions like:
{*/20?????.*/20/10}
can't be surrounded by quotes.
Instead, try doing this with Perl rename. Note that rename sees the leading directory as part of the name it rewrites, so anchor on the slash rather than on ^:
rename 's|/20|/10|' */20?????.*
You can do this using the Perl tool rename from the shell prompt. (There are other tools with the same name which may or may not be able to do this, so be careful.)
If you want to do a dry run to make sure you don't clobber any files, add the -n switch to the command.
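For example, a dry run first and then the real rename (a sketch using the corrected expression from above):
# Show the planned renames without touching anything.
rename -n 's|/20|/10|' */20?????.*
# If the output looks right, run it for real.
rename 's|/20|/10|' */20?????.*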
Note
If you run the following command (Linux)
$ file $(readlink -f $(type -p rename))
and you have a result like
.../rename: Perl script, ASCII text executable
then this seems to be the right tool =)
This seems to be the default rename command on Ubuntu.
To make it the default on Debian and derivatives like Ubuntu:
sudo update-alternatives --set rename /path/to/rename
The glob behavior of * is suppressed in double quotes. Try:
for file in */20?????.*; do
echo "${file/20/10}";
done
My job mostly consists of engineering analysis, but I find myself distributing code more and more frequently among my colleagues. A big pain is that not every user is proficient in the intricacies of compiling source code, and I cannot distribute executables.
I've been working with C++ using Boost, and the problem is that I cannot ask every sysadmin of every network to install the libraries. Instead, I want to distribute a single source file (or as few as possible) so that the user can just run g++ source.c -o program.
So, the question is: can you pack the Boost libraries with your code, and end up with a single file? I am talking about the Boost libraries which are "headers only" or "templates only".
As an inspiration, please look at the distribution of SQLite or the Lemon Parser Generator; the author amalgamates the stuff into a single source file which is trivial to compile.
Thank you.
Edit:
A related question on SO covers the Windows environment; I work in Linux.
There is a utility that comes with Boost called bcp, which can scan your source and extract any Boost header files that are used from the Boost source. I've set up a script that does this extraction into our source tree, so that we can package the source that we need along with our code. It will also copy the Boost source files for a couple of Boost libraries we use that are not header-only, which are then compiled directly into our applications.
This is done once, and then anybody who uses the code doesn't even need to know that it depends on Boost. Here is what we use. It will also build bjam and bcp if they haven't been built already.
#!/bin/bash
BOOST_SRC=.../boost_1_43_0
DEST_DIR=../src/boost
TOOLSET=
if ( test `uname` = "Darwin") then
TOOLSET="--toolset=darwin"
fi
# make bcp if necessary
if ( ! test -x $BOOST_SRC/dist/bin/bcp ) then
if ( test -x $BOOST_SRC/tools/jam/*/bin.*/bjam ) then
BJAM=$BOOST_SRC/tools/jam/*/bin.*/bjam
else
echo "### Building bjam"
pushd $BOOST_SRC/tools/jam
./build_dist.sh
popd
if ( test -x $BOOST_SRC/tools/jam/*/bin.*/bjam ) then
BJAM=$BOOST_SRC/tools/jam/*/bin.*/bjam
fi
fi
echo "BJAM: $BJAM"
pushd $BOOST_SRC/tools/bcp
echo "### Building bcp"
echo "$BJAM $TOOLSET"
$BJAM $TOOLSET
if [ $? != "0" ]; then
exit 1;
fi
popd
fi
if ( ! test -x $BOOST_SRC/dist/bin/bcp) then
echo "### Couldn't find bcp"
exit 1;
fi
mkdir -p $DEST_DIR
echo "### Copying boost source"
MAKEFILEAM=$DEST_DIR/libs/Makefile.am
rm -f $MAKEFILEAM
# Signals
# copy source libraries
mkdir -p $DEST_DIR/libs/signals/src
cp $BOOST_SRC/libs/signals/src/* $DEST_DIR/libs/signals/src/.
echo -n "boost_sources += " >> $MAKEFILEAM
for f in `ls $DEST_DIR/libs/signals/src | fgrep .cpp`; do
echo -n "boost/libs/signals/src/$f " >> $MAKEFILEAM
done
echo >> $MAKEFILEAM
echo "### Extracting boost includes"
$BOOST_SRC/dist/bin/bcp --scan --boost=$BOOST_SRC ../src/*/*.[Ch] ../src/boost/libs/*/src/*.cpp ../src/smart_assert/smart_assert/priv/fwd/*.hpp $DEST_DIR
if [ $? != "0" ]; then
echo "### bcp failed"
rm -rf $DEST_DIR
exit 1;
fi
Have you considered just writing a build script for a build system like SCons?
You could write a Python script to download Boost, unpack it, compile the needed files (you can even run bjam if needed), and compile your own code.
The only dependencies your colleagues will need are Python and SCons.
Run the preprocessor on your code and save the output. If you started with one main.cpp with a bunch of includes in it, you will end up with one file where all of the includes have been sucked in. If you have multiple cpp files, you will have to concatenate them and then run the preprocessor on the concatenated file; this should work as long as you don't have any duplicate global symbol names.
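A sketch of that idea (the file names and the Boost include path are placeholders; note that -E also expands whatever system headers you include and adds #line markers, so the result is large):
# Concatenate the translation units, then let the preprocessor expand
# every #include (Boost headers found via -I) into a single file.
cat main.cpp helpers.cpp > combined.cpp
g++ -E -I /path/to/boost combined.cpp -o amalgamated.cpp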
For a more portable method, do what SQLite does and write your own script that just combines and concatenates the files you created plus the Boost headers, without pulling in the system includes. See mksqlite3c.tcl in the SQLite code:
http://www2.sqlite.org/src/finfo?name=tool/mksqlite3c.tcl
Why not just check all the necessary files into SVN, and send your co-workers the URL of the repository? Then they can check out the code whenever they want to, do an 'svn up' any time they want to update to the latest version, etc.
If you're on a Debian-derived variety of Linux, problems like this just shouldn't come up: let the packaging system and policy manual do the work. Just make it clear that the libboost-dev or whatever package is a build dependency of your code and needs to be installed beforehand, and then /usr/include/boost should be right there where your code expects to find it. If you're using a more recent version of Boost than the distro ships, it's probably worth figuring out how to package it yourself and work within the existing packaging/dependencies framework rather than reinventing another one.
I'm not familiar enough with .rpm based distros to comment on how things work there. But knowing I can easily setup exactly the build environment I need is, for me, one of the biggest advantages of Debian based development over Windows.