How does precompile work? - coldfusion

Adobe ColdFusion Server comes with a precompile tool to parse and compile CFML templates (.cfm, .cfc) to Java bytecode. When I run a precompilation, all templates are saved as .class files in \cfusion\wwwroot\WEB-INF\cfclasses\. But how does CF work with them?
In CF Admin there is an option to "Save Class Files", so already compiled templates are re-used on server start. But how does this option affect precompiled templates? Are precompiled templates even considered then (since they are in the same folder)?
And how does CF handle these .class files? Are they loaded into memory and then compiled to machine code by the JVM? How does CF distinguish between old versions of templates and newer ones? Seeing how the folder can grow to 1,000,000 files and several GB of data, I doubt there's any cleaning routine.
Background: We are running about 100 applications with about 1,000 templates per application. On server restart and after bigger code changes (affecting roughly 100 of the 1,000 templates in each of the 100 applications), CF takes ages to recompile everything, to the point where the whole server just hangs because threads are blocked.
How can I utilize the precompile tool to prepare code changes and make CF respond faster?
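For reference, ColdFusion's precompiler is the cfcompile utility in {cf_root}/bin; the invocation below is only a sketch with illustrative paths (check the version-specific Adobe docs for the exact flags):

```shell
REM Compile templates in place (bytecode cached under WEB-INF\cfclasses):
cfcompile.bat C:\ColdFusion\cfusion\wwwroot C:\ColdFusion\cfusion\wwwroot\myapp

REM Sourceless deployment: write the compiled bytecode into .cfm/.cfc stubs
REM in a separate output directory, which can then be deployed:
cfcompile.bat -deploy C:\ColdFusion\cfusion\wwwroot C:\ColdFusion\cfusion\wwwroot\myapp C:\deploy\myapp
```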

Related

How to use Jetbrains Webstorm with Citrix

I cannot use WebStorm on my Citrix account because the screen isn't rendered properly. It is hard to describe; it looks as if a screen refresh would help. When I scroll through the source, the lines appear one by one, but some regions of the GUI (icons, menus, ...) are never shown.
Not likely the answer you would expect, but I can only suggest installing WebStorm on a local machine and working with source files located on local drives. All IDE functionality is based on the index of the project files, which WebStorm builds when the project is loaded and updates on the fly as you edit your code. To provide efficient coding assistance, WebStorm needs to re-index code fast, which requires fast access to project files. The latter can be ensured only for local files, that is, files that are stored on your hard disk and are accessible through the file system.
Moreover, the fsnotifier tool the IDE uses to synchronize its virtual file system with external changes doesn't support remote drives, so you might have problems synchronizing files generated by LESS/SASS compilers and other external tools...
See also https://community.oracle.com/thread/1353486

What is the SVN best practice for storing source when developing and testing with IDEs?

I do a fair amount of personal development on my computer and have used TortoiseSVN (I'm on windows) for web projects, but haven't used any version control for other languages. Anyways, soon I will be starting a decent sized C++ project and was going to try using SVN for it.
For web development, I normally just used Notepad++ and it was really easy to manage with SVN (just commit the whole source folder). However, for this project I will be using an IDE (most likely Eclipse CDT or Visual Studio) and was wondering what the best practice is to manage all of the IDE, project, and binary files. My guess was to keep the IDE project outside of version control and just point it at the source files kept in SVN, so the build and project files aren't committed. This way the only files in SVN would be the .cpp and .h files.
However, if I wanted to switch to a new branch, then I would need to update the location of all of the source and headers to the new folder which seems like it would be a huge hassle.
What's the best way to handle this?
Thanks
OK, it seems I misunderstood the aim of the question in the first round. Now I'm assuming what is really being asked is what to put under source control and what not.
Well, naturally everything but temporary/transient files.
If you install GitExtensions, it has a built-in feature to populate the .gitignore file; you then adjust it depending on the language. Certainly solution, project, and make files belong under version control; .user files storing IDE preferences do not. As IDEs and source control have been used together ubiquitously for many years, the content has been fairly well separated for a long time, and the split should be pretty obvious as you go.
External dependencies normally also belong in a repo, though you have to choose which one: some teams store everything together, some keep a single dependency repo, and others use separate repos per component; it all depends on the actual components and workflow. You can also replace physical storage of the dependencies with an info file holding stable links to the versions used. This decision can also be deferred until the first change in dependencies.
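As an illustration of that split, a typical ignore list for a C++ project using Visual Studio or Eclipse might look like this (the patterns are common examples, not a complete list; adjust them to your setup, and for SVN apply them via the svn:ignore property rather than a .gitignore file):

```
# Build output (transient, never commit)
Debug/
Release/
*.o
*.obj
*.exe
# Per-user IDE preferences (keep .sln/.vcxproj/.project files under control, not these)
*.user
*.suo
.vs/
.metadata/
.settings/
```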
For Visual Studio, there is a plugin that manages your files for you. As long as the files are part of the project, then they will be put into source control by the plugin. See ankhsvn for plugin info. Note that the express versions of Visual Studio are not supported.
I am sure Eclipse has a plugin for SVN as well.

how to upload and compile cpp for cgi-bin on WHM/cPanel?

I am brand new to C++. I am creating AJAX endpoints in the cgi-bin for maximum speed compared to PHP.
My host is no help, I can find no docs on this, and searches return nothing (probably because of my limited vocabulary on the subject).
I want to upload cpp files to my WHM/cPanel VPS to compile for the cgi-bin. I have no idea what directory to put these for compiling or how exactly to do it.
How do I upload cpp files to a WHM/cPanel VPS, compile them, and move them to the cgi-bin?
Many thanks in advance!
You should put your compiled binaries (your C++ code must implement the CGI interface) in whichever folder your web server treats as active (i.e. CGI). If there is a folder where PHP files are run using the CGI interface, then putting the C++ binaries in the same folder should work (unless the server is set up with special filters to block running binaries for security purposes).
And yes, this isn't a very common problem. Though when I did it, it was very straightforward (I just dropped the binaries in and off it went).

Analogue for maven-resources-plugin or maven-antrun-plugin for leiningen

I use Leiningen to manage my Clojure project, and I want to copy the jar file along with some other files into a certain directory as the final part of the build process. Leiningen treats 'resources' as something which should be included in the jar file, and that is unacceptable for me. If I used Maven, I could configure such a task with the maven-resources-plugin or fall back to Ant using the maven-antrun-plugin, but Leiningen is a far more convenient tool for Clojure projects.
Strangely, I couldn't manage to find anything about similar functionality in Leiningen on the internet. This is curious, because one of the major uses of Clojure is web sites, and web sites usually do not include their resources (js, css, etc.) in the jar (or do they? That would be weird, since a slight css tweak would require a rather lengthy recompilation). It follows naturally that we have to prepare the site environment (copy static resources along with the jar bundle into some directory layout), and this task should be done by the build tool.
Is there a plugin to copy files around the filesystem (or something which could substitute for it, like running Ant), or must I write one myself? Right now I'm using shell scripts, but that is very inconvenient, since I have to run several commands instead of one, and it is also not portable.
Did you check out lein-resource?
In any case, here is a long list of available plugins for lein; maybe you will find some of them helpful.
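For completeness, here is a project.clj sketch of how lein-resource is typically wired up. The plugin version and the :resource configuration keys are assumptions recalled from the plugin's README, so verify them against the current lein-resource documentation:

```clojure
;; Sketch only: plugin version and :resource keys are assumptions, check lein-resource docs.
(defproject mysite "0.1.0-SNAPSHOT"
  :plugins [[lein-resource "RELEASE"]]   ; placeholder version
  ;; copy files from these paths to :target-path instead of packing them into the jar
  :resource {:resource-paths ["static"]
             :target-path    "target/site"})
```

If the keys are as described, `lein resource` copies the files, and it can be chained into a single command, e.g. `lein do jar, resource`.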

Safe Unity Builds

I work on a large C++ project which makes use of Unity builds. For those unfamiliar with the practice, Unity builds #include multiple related C++ implementation files into one large translation unit which is then compiled as one. This saves recompiling headers, reduces link times, improves executable size/performance by bringing more functions into internal linkage, etc. Generally good stuff.
However, I just caught a latent bug in one of our builds. An implementation file used a library without including the associated header file, yet it compiled and ran. After scratching my head for a bit, I realized that the header was being included by an implementation file included before this one in our unity build. No harm done here, but it could have been a perplexing surprise if someone had tried to reuse that file independently later.
Is there any way to catch these silent dependencies and still keep the benefits of Unity builds besides periodically building the non-Unity version?
I've used UB approaches before for frozen projects in our source repository that we never planned to maintain again. Unfortunately, I'm pretty sure the answer to your question is no. You have to periodically build all the cpp files separately if you want to test for those kinds of errors.
Probably the closest thing you can get to an automagic solution is a buildbot which automatically gathers all the cpp files in a project (with the exception of your UB files) and builds the regular way periodically from the source repository, pointing out any build errors in the meantime. This way your local development is still fast (using UBs), but you can still catch any errors you miss from using a unity build from these periodic buildbot builds which build all cpps separately.
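The hidden-include hazard from the question can be reproduced with a few lines of shell (assuming g++ is installed; the file names are made up):

```shell
cat > a.cpp <<'EOF'
#include <string>
std::string greet() { return "hi"; }
EOF
cat > b.cpp <<'EOF'
// No #include <string> here: it silently relies on a.cpp's include in the unity TU.
std::string shout() { return "HI"; }
EOF
cat > unity.cpp <<'EOF'
#include "a.cpp"
#include "b.cpp"
EOF
g++ -c unity.cpp -o unity.o && echo "unity build: OK"
g++ -c b.cpp -o b.o 2>/dev/null || echo "standalone b.cpp: fails to compile"
```

A periodic build that compiles each .cpp on its own is exactly what catches the second case.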
I suggest not using Unity builds for your local development environment. A Unity build won't improve compile times when you are editing and recompiling anyway. Use Unity builds only for your non-incremental continuous build system, whose output you are not expected to ship to production.
As long as everyone commits changes only after they have compiled locally, the situation you described should not appear.
A Unity build might also cause unexpected overload resolution between locally defined functions that happen to have duplicate names once the files are merged into one translation unit. That is dangerous, and you may not be aware of it, i.e. no compile error or warning might be generated in such a case. Unless you have a way to prevent that, please do not rely on binaries produced by a Unity build.