gettext - Load local catalogs - C++

While developing and translating an application, it would be nice if gettext used the catalogs found in the local po/ directory, so it wouldn't be necessary to run make install each time.
Is there a way to do it?
One of the problems is the naming convention: gettext looks for catalog files in a hierarchy that looks like /usr/share/locale/LL/LC_MESSAGES/package.mo (where LL is a two-letter language code), while in the development tree the binary catalogs usually reside in po/LL.gmo.

it would be nice if gettext used the catalogs found in the local po/ dir so it wouldn't be necessary to call make install each time ... Is there a way to do it?
If I'm understanding your idea correctly, gettext can do just that (i.e. change the translation search path) if you follow the prescribed method to set it up...
Translations should be stored in a path having a fixed structure.
First of all, we'll have a root folder named to your taste (for example "languages"). Inside it, we have to create a folder for every targeted language, whose name must be a valid locale code (an ISO 639 language code, optionally followed by an ISO 3166 country code). So, valid names can be "it_IT" (Italian of Italy), "it_CH" (Italian of Switzerland), "en_US" (English of the USA), and so on. Within the folder with the locale code, we must have a folder named "LC_MESSAGES" where, finally, we'll store the translation files.
From Here (the link includes an example script showing one method to perform this task).
Change "languages" in description above to "po", and that may do what you want?

Related

Is there a way to apply SAS EG processes to new files?

I'm taking over a project from a coworker that involves several extensive SAS process flows. I have all the files with all the same names and a copy of the process flows they used. Since the file paths in their processes are direct references to their computer, normally I would just re-import the files with the same output names and run the process from there. In a few cases I would have to recreate a query builder as I'm using a few .sas7bdat files from another project.
However, there are quite a few files involved, and I may end up having to pass this to another coworker in a few months. Since I can't get a good look at exactly what the import task is doing, I'm concerned I may have some of the variables imported incorrectly. Is there an easy way to just change the file path the import or other task refers to?
Given the updates in comments, there are two possibilities I see.
If the paths you're changing are, or can be, relative to the location of the EGP, then you can right-click on the project, go to Properties -> File References, and check "Use paths relative to the project...". That means that instead of storing a file as c:\my EGP folder\my code folder\code.sas it would be stored as my code folder\code.sas. Then if the whole project moves to another computer (or just to any other folder), it automatically has the right path. This is mostly useful for code or similar things.
Otherwise, you're going to have to convert things to SAS code modules. There you can use macro variables to define the locations of things.

Methods for opening a specific file inside the project WITHOUT knowing what the working directory will be

I've had trouble with this issue across many languages, most recently with C++.
The Issue Exemplified
Let's say we're working with C++ and have the following file structure for a project:
("Project" main folder with three [modules, data, etc] subfolders)
Now say:
Our maincode.cpp is in the Project folder
moduleA.cpp is in modules folder
data.txt is in data folder
moduleA.cpp wants to read data.txt
So the way I'd currently do it would be to assume maincode.cpp gets compiled and executed inside the Project folder, and so hardcode the relative path data/data.txt in moduleA.cpp to do the reading (say I used std::fstream fs("data/data.txt") to do so).
But what if the code was, for some reason, executed inside etc folder?
Is there a way around this?
The Questions
Is this a valid question? Or am I missing something with the wd (working directory) concept fundamentals?
Are there any methods for working around absolute paths so as to solve this issue in C++?
Are there any universal methods for doing the same with any language?
If there are no reasonable methods, how would you approach this issue?
Please leave a comment if I missed any important details with the problem's illustration!
At some point the program has to make an assumption about where the file(s) are, either by getting the location from user input or by using a relative path with a presumed filename. As already said in the comments, C++ recently got std::filesystem in C++17, which can help you write cross-platform code that interacts with the host's filesystem.
That being said, every program, big or small, has to make certain assumptions at some point. Deleting or moving files is problematic for any program that requires them to be at a certain location under a certain name, and that is not solvable other than by presenting the user with an error message.
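For example, here is a small C++17 sketch that checks the assumed relative path before opening it and reports the current working directory when the assumption fails (the path data/data.txt is taken from the question):
#include <filesystem>
#include <fstream>
#include <iostream>

int main() {
    namespace fs = std::filesystem;
    const fs::path rel{"data/data.txt"};
    if (!fs::exists(rel)) {
        std::cerr << "expected " << rel << " relative to "
                  << fs::current_path() << ", but it was not found\n";
        return 1;
    }
    std::ifstream in(rel);
    // ... read data from `in` ...
    return 0;
}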
As @Hatted Rooster said, it's not generally solvable for some arbitrary file without making some assumptions. However, there are frameworks that allow you to "store" some files in resources embedded into the executable (or otherwise). Those frameworks usually let you handle such files in an opaque way, without the need to rely on the current working dir or relative paths.
For example, see the Qt Resource System.
Your program can deduce the path from argv[0] in the call to main, if you know the data is always in a fixed location relative to your executable, or you can use an absolute path like "C:\myProgram\data\data.txt".
The second approach works in every language.
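A rough C++17 sketch of the argv[0] approach, assuming data/ always sits next to the executable (note that argv[0] is not guaranteed to contain a full path on every platform, so treat this as best-effort):
#include <filesystem>
#include <fstream>

int main(int argc, char* argv[]) {
    namespace fs = std::filesystem;
    (void)argc;
    // Directory that contains the executable, derived from how it was invoked
    const fs::path exe_dir = fs::absolute(fs::path(argv[0])).parent_path();
    const fs::path data_file = exe_dir / "data" / "data.txt";
    std::ifstream in(data_file);
    // ... read from `in`, independent of the current working directory ...
    return 0;
}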

How to require file via absolute and relative paths in Crystal?

There's a file, just a single file; there are no shards or anything else.
/alex/projects/inspector/inspector.cr
I want to require it in another file in another folder
/alex/projects/my-project/play.cr
This won't work:
require "/alex/projects/inspector/inspector"
require "/alex/projects/inspector/inspector.cr"
This also doesn't work:
CRYSTAL_PATH=$CRYSTAL_ROOT/src:lib:/alex/projects/inspector
require "inspector"
require "inspector.cr"
require "./inspector"
require "./inspector.cr"
P.S.
I would like to avoid using shards etc., as I have no plans to share or publish this file. It's just a file that's used by a couple of other files in different locations.
I solved it by creating a symlink:
ln -s /alex/projects/inspector/inspector.cr /alex/projects/my-project/inspector.cr
require "./inspector"
Currently, require is only relative (as you learned from your forum post). However, you were on the right track with CRYSTAL_PATH.
CRYSTAL_PATH is an environment variable used by the Crystal compiler which tells it where to look for dependencies. So instead of using it in code, as it appears you did, you should use it when building the executable:
CRYSTAL_PATH=$CRYSTAL_ROOT/src:lib:/alex/projects/inspector crystal build /alex/projects/my-project/play.cr
Note that CRYSTAL_ROOT must be defined for that exact command to work. If you need to find the current CRYSTAL_PATH in order to append to it, you can use crystal env.
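For example, one way to append to the existing search path from a POSIX shell (assuming crystal env prints CRYSTAL_PATH on your installation):
CRYSTAL_PATH="$(crystal env CRYSTAL_PATH):/alex/projects/inspector" crystal build /alex/projects/my-project/play.cr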

Where to store resources for C++ program on linux

This question says the best place to store settings on Linux is in ~/.config/appname.
The program I'm writing needs to use a 99 MB .dat file for recognizing facial landmarks; embedding it in the binary doesn't seem like a good idea.
Is there some default place to store resources on Linux? Currently the file just sits in the same directory as the executable, but this requires the program to be run with that directory as the current directory.
What's the best way to deal with resources like this on Linux (ideally in a way that could also be cross-platform, at least with OS X)?
You should take a look at the Filesystem Hierarchy Standard. Depending on the data (will it change, is it constant across all installations, etc.), the path where it gets placed will differ based on the standard.
In general:
/usr/lib/program: includes object files, libraries, and internal binaries for an application
/usr/share/program: for all read-only architecture independent data files
/var/lib/program: holds state information pertaining to an application or the system
Those seem like pretty good places to start, and you can check the documentation to see if your app falls into one of those categories.
If the file is specific to the user running the app, it should be in a subdirectory of ~/, but AFAIK there's no standard, and the best choice depends a lot on the file type and usage. If it should be visible to the user via a GUI, you could use ~/Desktop or ~/Downloads. If it's temporary, you can use ~/tmp or ~/var/tmp.
If it's not user-specific, you should place it in a subdirectory of /var. Again, the exact subdirectory may depend on its kind and other factors.
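A small C++17 sketch of the usual compromise: look in the read-only system location first and fall back to the directory next to the executable during development. The application name "myapp" and the file name "landmarks.dat" are placeholders, not anything mandated by the standard:
#include <filesystem>
#include <string>

namespace fs = std::filesystem;

// Return the first existing candidate path for a read-only data file,
// or an empty path if none was found.
fs::path find_data_file(const fs::path& exe_dir, const std::string& name) {
    const fs::path candidates[] = {
        fs::path("/usr/share/myapp") / name,  // installed, architecture-independent data
        exe_dir / name                        // development fallback, next to the binary
    };
    for (const auto& p : candidates)
        if (fs::exists(p))
            return p;
    return {};
}

int main(int, char* argv[]) {
    const fs::path exe_dir = fs::absolute(fs::path(argv[0])).parent_path();
    const fs::path model = find_data_file(exe_dir, "landmarks.dat");
    // ... load the model from `model` if it is non-empty ...
    return 0;
}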

finding my py modules in sub folders, from the main application working dir

This question might have been asked before, but I could not find it.
I am on a Linux box. I have a Python app that runs from a folder called /avt (example).
I did not write this code, and it has about 12 modules that go with it. I was the lucky engineer to inherit this mess.
This app imports other modules that live under the directory /avt/bin.
I want to be able to find my modules in the /avt/bin dir no matter what the current working dir is. Sometimes the app changes dir to some other subfolder to perform some file I/O and should then return, but it seems like sometimes it does not make it back, because the code errors out with a "no such file or directory" error. So I want to test the working dir each time before I do any file I/O in the /avt/bin dir.
As an example, I want to create files in /avt/bin and then later open those files and read data from them. How can I test to make sure my current working dir is always /avt, and if it is not, chdir to it? Note: it also has to be portable code, meaning it must run on any directory structure on any Linux machine.
I tried this code, but it is not very clean, I think. Python is not my main language. Is this coding proper, and will it work for this? Forgive me, I don't know how to format it for this forum.
import inspect
import os
import sys

# Directory containing the file that is currently executing
Avtfolder = os.path.realpath(os.path.abspath(os.path.split(inspect.getfile(inspect.currentframe()))[0]))
if Avtfolder not in sys.path:
    sys.path.insert(0, Avtfolder)
if Avtfolder.__contains__('/avt'):
    modfilespath = Avtfolder + '/bin'
    print 'bin dir is ' + modfilespath
else:
    print 'directory lost...'
    # write some code here that changes to the root /avt dir
I have a few notes.
First, I'm afraid you are mixing up two problems (or I couldn't tell from the question which one you're facing). These problems are:
I/O to files that can reside in different directories on different machines
Importing Python modules used by your app that can also be in slightly different locations.
The title of the question and some of the text suggests you're dealing with problem 2, whereas references to I/O and "no such file or directory" error point to problem 1.
Those are, however, separate problems and are treated separately. I won't be able to give exact recipes for both, but here are some suggestions:
For problem 1: I don't think it's a good idea to do I/O, create files, etc. in the folder where the user installs the Python libraries. It's a folder for Python modules, not data. Also, if the library is installed via setup.py, pip, or easy_install (and even if that isn't the case now, it can change in the future), then the program will probably have insufficient permissions to write there unless invoked as root. And that's right. Create files somewhere else.
As to "how to track the directory changes" part: I must confess I don't quite understand what you mean. Why do you even using the concept of "current directory"? In my mind you should just have some variable such as write_path, data_path, etc. and the code would be
data = open(os.path.join(data_path, 'data.foo'))
dump = open(os.path.join(write_path, 'dump.bar'), 'w')
etc.
Why do you even care where your libraries are located? I don't think it's right; I'd change that. This inspect.currentframe() stuff smells like you really need to rethink the design of the library.
Now, what the location of the libraries matters for is problem 2. But again, the absolute path shouldn't matter (if it does, change that!). You only need all the modules to be inside one folder (or its subfolders). If they are in the same folder, you're good: import foo will just work. If some are in subfolders, those subfolders should each have a file named __init__.py in them; they will then be seen as packages by the Python interpreter, so you'll be able to do from foo import bar, where foo is a subfolder with __init__.py and bar.py in it.
So, try to rewrite it so that you don't depend on where the .py files are. You really shouldn't need to use inspect there at all. On another note, don't use special methods like __contains__ directly unless you really need to; if '/avt' in Avtfolder will do the same.