How to change the tmp directory location in ember apps - ember.js

Is there a way to change the location of the tmp directory ember-cli is using to process trees? (without using symlinks)
I am trying to develop an Ember app in a Linux VM on a Windows host. Shared folders of any kind (VirtualBox shared folders, NFS, or SMB) are slow and don't allow symlinks, and ember-cli produces a lot of files in the tmp directory. Being able to move this tmp directory to the VM's native filesystem would help a lot.
This should be easily configurable, but I couldn't find any configuration option that would allow it.

No. Nobody has answered otherwise, and the help output of ember build doesn't show any such parameter.

How about creating a RAM disk and linking your project's tmp folder to it with an NTFS junction? On Windows this can be done as follows:
Create a RAM disk with a tool such as ImDisk and mount it as, say, H:.
Run mklink /J "C:\project\tmp" "H:\ember-tmp", where C:\project is your local project folder and H: is your RAM disk.
I'm afraid I don't know the Linux equivalents, but they should be easy enough to find (a tmpfs mount is the rough counterpart). Apparently this can roughly halve your build times, with no real risk of losing data, since the tmp folder only holds build artifacts.
Information sourced from https://emberjs-developer.quora.com/How-to-make-Ember-js-CLI-ember-s-32-times-faster and kudos to Stefan Penner for the suggestion.
Note that the target folder (H:\ember-tmp) MUST already exist before you run mklink, otherwise you get the misleading message "Local volumes are required to complete the operation", and the junction path (C:\project\tmp) MUST NOT already exist, otherwise you get the message "Cannot create a file when that file already exists".

Note that as of Ember 3.0 you can change the cache location by setting the environment variable below (for example, set it in your shell before running ember build):
BROCCOLI_PERSISTENT_FILTER_CACHE_ROOT=/path/to/my/other/tmp/
This will put the Broccoli files in the path of your choosing. You will need to handle clean-up of this custom path yourself.
More information here:
https://github.com/stefanpenner/async-disk-cache/issues/35

Related

VS2008 Build C++ Project on Network Drive

The short of it is, I have VMs for building different scenarios of software. I do not wish to snapshot the code, as it is backed up elsewhere, so I am storing all my code on the host PC and only building/testing in the VMs to save space. Unfortunately, I am receiving program database update errors when I try to build from a location mapped to the host HDD.
I know there is nothing wrong with the C++ projects as they build fine if moved inside the VM.
I have tried:
- cleaning/rebuilding
- removing the debug/release folders entirely
- checking out a copy of the source onto the host drive from within the VM
Even when the .idb and .pdb files that the compiler complains about are created by VS, the problem persists.
How can I stop these C2471 errors when building from a non-local drive?
Perhaps the problem is with files created by the compiler during a previous build. Try removing the Debug (or Release) folder and building the project again.
I'm not entirely sure why, but the issue seems to be related to using shared folders in VirtualBox. If the folders are referenced via a direct UNC path to the host machine it appears to work fine, but accessing them through a VirtualBox shared folder (whether or not it is mapped to a drive letter) doesn't appear to provide the correct permissions.

How to specify a standard directory in a Qt project file

I have developed an application that I plan to deploy on Windows, Mac, and Linux. The program requires access to some files (scripts and the like) at run-time.
The installation process should install the files to a location that my application can later determine without user-intervention (or perhaps just a prompt allowing the user to change the location, if desired).
What is the best way to achieve this? I can't seem to find any way to:
1. Use a "standardized path" variable in the project file's INSTALLS statement. (e.g., my application could use QStandardPaths to initialize the location, but I can't figure out how to access this path from the INSTALLS statement)
2. Save the path to my project's QSettings (.plist, registry, whatever) for later retrieval
That leaves me with creating a custom project file and INSTALLS command for each environment, and even then I can't install to the user's directory because I don't know the user's name when I run the make command at deploy time. It seems as if there must be a better way, but I can't find any documentation for this. Am I just using the wrong keywords in my searches? Thanks in advance!
Which standard directory do you mean, and how do you want to obtain it?
For instance, you can put something like this in the win32 branch of your .pro file:
win32 {
APPDATA_DIR = $$system(echo %APPDATA%) # should be %LOCALAPPDATA% as requested
message($$APPDATA_DIR)
}
I'm just not sure what exact kind of standardized path you are talking about; QStandardPaths knows many. It would help to be more specific, so the corresponding location on each concrete OS can be identified.
A somewhat related answer of mine, on how to test against a certain variable, etc.: Qt .pro file - how to add conditioning on OSX version?
Maybe the QStandardPaths class will help you; see the QStandardPaths documentation. But your problem is still a little unclear to me.
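For the second part of the question, one common pattern is to resolve the location at run time with QStandardPaths and remember it in QSettings, rather than trying to compute it inside the .pro INSTALLS statement. Below is a minimal sketch, assuming Qt 5.4+; the organization/application names ("MyCompany", "MyApp"), the settings key ("scriptsDir"), and the function name are placeholders, not anything from the question.

#include <QDir>
#include <QSettings>
#include <QStandardPaths>
#include <QString>

// Resolve (and remember) the directory that holds the application's script files.
// The first run picks the platform's standard writable data location; later runs
// reuse whatever was stored, so the user could be offered a prompt to change it.
QString resolveScriptsDir()
{
    QSettings settings("MyCompany", "MyApp");              // placeholder names
    QString dir = settings.value("scriptsDir").toString();
    if (dir.isEmpty()) {
        // A per-user writable data directory; the exact path depends on the
        // platform and on the application/organization names in use.
        dir = QStandardPaths::writableLocation(QStandardPaths::AppDataLocation);
        QDir().mkpath(dir);
        settings.setValue("scriptsDir", dir);               // persisted for later runs
    }
    return dir;
}

With this approach the INSTALLS step only has to copy the files to a fixed, system-wide location; the per-user path is created and recorded the first time the application runs.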

Running a process inside a virtual file system?

What I'm trying to achieve is to run a program, which thinks a folder exists within its own folder while actually the folder is somewhere else on the system.
So my program would launch a process and say to the process: Folder A which is at C:\A is within your own directory at C:\Program Files (x86)\SomeProgram\A
So the "virtual" directory would only be visible to that process.
I'm using Qt to write my program, so if there are any Qt functions I could use, that would be great (for portability). However, plain C++ or any Windows-specific APIs would be fine too.
I was thinking about NTFS junctions or symbolic links but I would have no idea how to create either of those in C++, let alone bind them to a specific process.
Thanks in advance!
EDIT:
In relation to the above, I've found this question: https://superuser.com/questions/234422/does-windows7-support-symbolic-links-folder-shortcuts. However, it only shows how to perform the required actions from the command line, and the result wouldn't be process-bound.
EDIT 2:
Some extra information: I'm trying to create a virtual directory that is made up of a couple of other directories merged together (using a priority system to decide which files "win" over other files). This merged directory would then appear to the target process as a single directory containing the merged files.
I think I'm going to stick with Windows' mklink command; it seems to suit my needs best.
What I'm going to do is use QFile::link() on every operating system other than Windows, and QProcess with mklink on Windows, as sketched below. This should work on every operating system.
For a good example look here: https://stackoverflow.com/a/21013935/979732
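Here is a minimal sketch of that plan; createDirectoryLink is a hypothetical helper (not a Qt API), and the Windows branch assumes it is acceptable to shell out to cmd.exe for mklink:

#include <QDir>
#include <QFile>
#include <QProcess>
#include <QString>

// Create a directory link at linkPath pointing to targetPath.
// On Windows this shells out to mklink /J (an NTFS junction; mklink is a cmd.exe
// built-in, so it must be run through cmd /C). Elsewhere it falls back to
// QFile::link(), which creates a symbolic link.
bool createDirectoryLink(const QString &targetPath, const QString &linkPath)
{
#ifdef Q_OS_WIN
    // The junction path must not exist yet; the target should already exist.
    return QProcess::execute("cmd", {"/C", "mklink", "/J",
                                     QDir::toNativeSeparators(linkPath),
                                     QDir::toNativeSeparators(targetPath)}) == 0;
#else
    return QFile::link(targetPath, linkPath);
#endif
}

One reason to prefer /J here: junctions normally need no elevation, whereas a true directory symbolic link (mklink /D) may require administrator rights or developer mode.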
Such tasks are usually accomplished with a filesystem filter driver. The driver intercepts OS requests going to the filesystem and lets you insert your own virtual files and directories into an existing directory on disk. A filter driver may be overkill for your particular task, though.
The Detours approach mentioned in the comments requires system-wide hooking of the file APIs and will slow down the whole system (a filesystem filter driver is attached to one disk and is a documented approach, so it's faster and more robust).

Is it possible to make ImageMagick++ based applications check the local directory for dlls

I'm creating an application that uses ImageMagick++ to load a sequence of PNGs and convert them into GIFs.
Everything works on my dev machine (unless I uninstall ImageMagick++), but it crashes on other users' machines when it tries to use the GIF and PNG coders. I don't want end users to have to install ImageMagick in order to use the software.
It requires IM_MOD_RL_gif_.dll and IM_MOD_RL_png_.dll from the ImageMagick install directory: C:\Program Files (x86)\ImageMagick-6.8.9-Q16\modules\coders
It finds all the DLLs and functions correctly when they are copied to the local directory, except for the coders, which don't work regardless of whether I copy them directly to the program directory. The following locations also failed (based on advice I found elsewhere on the web):
applicationDir/
applicationDir/ImageMagick-6.8.9-Q16/modules/coders
applicationDir/bin/ImageMagick-6.8.9-Q16/modules/coders
applicationDir/modules/coders
Is there any way to make an application using ImageMagick++ check the local directory for coder DLLs without having to rebuild ImageMagick++ myself?
I'll respond here because I've seen a similar question go unanswered elsewhere.
You just need to set the environment variable MAGICK_CODER_MODULE_PATH for the process that uses the coders.
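A minimal sketch of that idea, assuming ImageMagick reads the variable via getenv() when it loads its coder modules, so it has to be set before the first Magick++ call; the "coders" folder next to the executable is just an illustration of where you might ship the DLLs:

#include <cstdlib>
#include <Magick++.h>

int main(int argc, char **argv)
{
    // Point ImageMagick at the bundled coder modules before any Magick++ call.
#ifdef _WIN32
    _putenv_s("MAGICK_CODER_MODULE_PATH", ".\\coders");
#else
    setenv("MAGICK_CODER_MODULE_PATH", "./coders", 1);
#endif
    Magick::InitializeMagick(argv[0]);

    // ... load the PNG sequence and write the GIF as before ...
    return 0;
}

A relative path depends on the current working directory, so in practice you would probably build an absolute path from the executable's location before setting the variable.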

Program configuration data in Unix/Linux

What is the recommended way to keep user configuration data in Unix/Linux?
My programming language is C++. The configuration data will be kept in XML/text/binary format; I have no problem handling such files. I want to know where to keep them. For example, on Windows configuration data may be kept in the Registry (the old way) or in the user's application data directory. What about Linux?
I need read/write access to configuration files.
The concept of the registry is peculiar to Windows, and Microsoft itself has acknowledged more than once that it was ill-conceived.
In Unix and Linux, configuration for system-wide programs lives in /etc, or sometimes in an application-specific subdirectory of it.
Per-user configuration data is kept in the user's home directory, either in a hidden file (in text format) or in an application-specific hidden directory. The proper way to reference the home directory is through the HOME environment variable. Files and directories are hidden by making . (a dot) the first character of the name.
Examples of system-wide configuration are /etc/wgetrc and /etc/ssh/; examples of per-user data are $HOME/.bashrc and $HOME/.mozilla/.
The XDG Base Directory Specification defines where configuration and other files should be stored on Linux and other systems that follow the freedesktop.org standards:
http://freedesktop.org/wiki/Specifications/basedir-spec
This is the modern way, and may eventually reduce the dotfile mess in the typical user's home directory.
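A minimal sketch of the lookup rule the specification describes: use $XDG_CONFIG_HOME if it is set and non-empty, otherwise fall back to $HOME/.config. The function and application names are placeholders.

#include <cstdlib>
#include <string>

// Resolve the per-user configuration directory for an application,
// following the XDG Base Directory Specification.
std::string userConfigDir(const std::string &appName)
{
    const char *xdg = std::getenv("XDG_CONFIG_HOME");
    if (xdg && *xdg)
        return std::string(xdg) + "/" + appName;

    const char *home = std::getenv("HOME");
    return std::string(home ? home : ".") + "/.config/" + appName;
}

// e.g. userConfigDir("myapp") might yield "/home/alice/.config/myapp"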
Dotfiles are the classic Unix solution. If you want to deal with reading/writing everything yourself, go for it.
However, most modern programs I use have used GConf for storing preferences. It makes a lot of things easier, both as a developer and as a user (and apparently as an administrator, but I have no experience there).
That depends a little on your flavor of Linux, but as a general rule most programs have their system default configuration somewhere in /etc, with .config files in your home directory that can override the defaults in the /etc dir.
Good point, though ".config" should really read ".[name of config file]".