Precompiled XSLT, ReBase and NGEN

Advanced performance question here. Here's my scenario:
I have a database that contains thousands of XSLT documents, one for each page of a website; these translate XML into HTML. An ASP.NET web server (farm) loads the XSLT documents from the database and uses them to render HTML for each web request.
I've implemented the optimization of using XslCompiledTransform and caching it between database refreshes (every 30 minutes). I'm looking to notch performance up further by pre-compiling the XSLT to DLLs with xsltc.exe. This is supposed to eliminate all the dynamic method invocations that XslCompiledTransform otherwise creates.
So, I have a separate server writing the XSLTs to files and running through them with xsltc.exe. Takes about 20 minutes but that's OK. I then drop the DLLs onto each webserver. Now I can just have the webserver dynamically load the DLLs on an as-needed basis. Here's the code I'm using to load the assembly into XslCompiledTransform:
// Read the compiled stylesheet assembly from disk and load it from a byte array.
byte[] bytes = File.ReadAllBytes(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "XsltDlls\\" + fileName + ".dll"));
Assembly assembly = Assembly.Load(bytes);
// xsltc.exe generates a type named after the stylesheet, so look it up by file name.
Type type = assembly.GetType(fileName);
XslCompiledTransform compiledTransform = new XslCompiledTransform();
compiledTransform.Load(type);
Should I ReBase.exe the DLLs in the directory and/or NGEN.exe them? ReBase takes about 5 minutes, and NGEN.exe with /queue will take about 10 minutes, during which the CPU is hit hard - likely impacting the web server's ability to serve traffic. Given that I'm loading the assembly by reading its bytes, will the native NGEN image even be referenced, or will the JIT fire up anyway?
Any/all insight into this will be much appreciated!
malcolm

Wow!
Assembly.Load(string) does permit native images to be loaded. However, I suspect that the overload that takes a byte array may not use them. I can't find a reference for that, but some experimentation with the Assembly Binding Log Viewer on a test project might prove it one way or the other.
You also have to make sure that your assemblies are strongly named for the native image to be used.
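If you do want the native images to be picked up, one option is to load by assembly name in the default load context instead of from a byte array. A minimal sketch, assuming the xsltc-generated DLLs are copied into the application base (e.g. the bin folder) and that fileName is both the assembly's simple name and the generated type name, as in the question:
using System.Reflection;
using System.Xml.Xsl;

// Loading by simple name makes the binder probe the app base in the default load context,
// which is the path on which an NGEN native image can be used. Assembly.Load(byte[])
// bypasses normal binding, so the IL would be JIT-compiled regardless of NGEN.
Assembly assembly = Assembly.Load(fileName);
Type type = assembly.GetType(fileName);
XslCompiledTransform compiledTransform = new XslCompiledTransform();
compiledTransform.Load(type);
The Assembly Binding Log Viewer mentioned above should then show whether the native image was actually bound.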
As for rebasing, this blog suggests that it's not required on Vista-generation OSes or later.

Related

Is there a way to cache includes in xslt?

When transforming on the client side, the browser fetches all the includes every time the transformation is applied. I find it makes the application run slower.
Is there a way, with XSLT, to cache the includes so each file is not downloaded every time it is used again?
I am using a service worker now but I was wondering if there is another way.
I would have expected the included files to be cached the way everything else is. However, even if they're cached, there's still a significant cost in that you're compiling the stylesheet every time it's used. The way to avoid that is to redesign your site as a "single page application", so that following a link loads new content and transforms it using the existing, already-loaded stylesheet.
SaxonJS [disclaimer: my company's product] is designed very much on the basis that you will compile the stylesheet once and then use it repeatedly. You can even precompile the stylesheet on the server.
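On the .NET server side (as in the first question in this digest), the same compile-once-and-reuse principle looks roughly like the sketch below; the cache class and method names are illustrative, not part of any product:
using System.Collections.Concurrent;
using System.Xml.Xsl;

static class StylesheetCache
{
    private static readonly ConcurrentDictionary<string, XslCompiledTransform> cache =
        new ConcurrentDictionary<string, XslCompiledTransform>();

    public static XslCompiledTransform GetTransform(string stylesheetPath)
    {
        // Compile each stylesheet at most once; later requests reuse the compiled transform,
        // which is thread-safe once loaded.
        return cache.GetOrAdd(stylesheetPath, path =>
        {
            var transform = new XslCompiledTransform();
            transform.Load(path);
            return transform;
        });
    }
}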

How can I compile my ColdFusion code for sourceless distribution, and have it be unreadable?

I've been tasked with creating a deployable version of a ColdFusion web app to be installed on a client's server. I'm trying to find a way to give them a compiled version of our code, and my first inclination was to use the CFCompile utility that I found here. However, after running CFCompile, most of the code in the CFM files is still readable. The only thing that appears to be obfuscated at all is the actual ColdFusion code - all of the SQL queries are still perfectly readable.
The HTML and JavaScript are also still readable in the compiled code, but that doesn't matter, as those can be seen in a web browser anyway.
Is there another way to distribute my source code in a format that is completely unreadable to the user? I'm guessing that for whatever method I choose, there will be some way of decompiling the code. That's not an issue; I just need to find a way to make it more difficult than opening the file and seeing the queries.
Hostek has a pretty good write-up on the subject over on their site - How to Encrypt or Compile ColdFusion Files.
Basically, from that article:
Using cfcompile.bat
The cfcompile.bat utility will compile all .cfm and .cfc files within a given directory into Java bytecode. This has the effect of making your source code unreadable, and it also prevents ColdFusion from having to compile your ColdFusion files on first use, which provides a small performance enhancement.
More details about using cfcompile.bat can be found in ColdFusion's Documentation
Using cfencode.exe
The cfencode.exe utility will apply basic encryption to a specific file or directory. If used to encrypt a directory, it will apply encryption to ALL files in the directory which can break any JS, CSS, images, or other non-ColdFusion files.
They do also include this note at the bottom:
Note: Encrypting your site files with cfencode does not guarantee absolute security of your source code, but it does add a layer of obfuscation to help prevent unauthorized individuals from viewing the source.
The article goes on to give basic instructions on how to use each.
Adobe has this note on their site regarding cfencode:
Note: You can also use the cfencode utility, located in the cf_root/bin directory, to obscure ColdFusion pages that you distribute. Although this technique cannot prevent persistent hackers from determining the contents of your pages, it does prevent inspection of the pages. The cfencode utility is not available on OS X.
I would also add that it will be trivial for anyone familiar with ColdFusion to decode anything encoded with this utility because they also provide the decoder.

How to optimize loading of a large XAP file

We have inherited a Silverlight application used in banks. This is a huge Silverlight application with a single XAP file of 6.5 MB.
Recently, one of the core banking applications updated its services to delete the entire browser cache from users' machines on a daily basis.
This impacts the Silverlight application directly. We cannot afford to download the 6.5 MB file every day. In the long term, I know we need to break this monolith into smaller, manageable pieces and load them dynamically.
I wanted to check if there are any short-term alternatives.
Can we have the Silverlight runtime load the XAP into a different directory?
Will making the application an out-of-browser application give us any additional flexibility in terms of where we load the XAP from?
Any other suggestions that could provide a short-term solution would be helpful.
What is inside your XAP file? (Change the extension to .zip and check what is inside.)
Are you including images or sound inside your XAP file?
Are all the DLLs you include necessary?
Short-term alternatives are:
do some cleanup of your application (remove unused DLLs, images, code, ...)
re-zip your XAP file using a more powerful compression tool to save some space
Also, some tools exist to "minify" the size of your XAP file (but I have never tried them).
Here is a link that helped me reduce my XAP size:
http://www.silverlightshow.net/items/My-XAP-file-is-5-Mb-size-is-that-bad.aspx
Edit to answer your comment:
I would suggest using Isolated Storage.
Quote from http://www.silverlightshow.net/items/My-XAP-file-is-5-Mb-size-is-that-bad.aspx :
Use Isolated Storage: keep in cache XAP files, DLL’s, resources and application data. This won't enhance the first load of the application, but in subsequent downloads you can check whether the resource is already available on the client local storage and avoid downloading it from the server. Three things to bear in mind: It’s up to you building a timestamp mechanism to invalidate the cached information, by default isolated storage is limited to 1 Mb size (by user request it can grow), and users can disable isolated storage on a given app. More info: Basics about Isolated Storage and Caching and sample.
Related links:
http://msdn.microsoft.com/en-us/magazine/dd434650.aspx
http://timheuer.com/blog/archive/2008/09/24/silverlight-isolated-storage-caching.aspx
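For illustration, here is a minimal sketch of the check-before-download idea in Silverlight C#. The class name is a placeholder, and a real implementation would also need the timestamp/invalidation and quota handling mentioned in the quote above:
using System.IO;
using System.IO.IsolatedStorage;

public class XapResourceCache
{
    // Returns the cached copy if we already have it, otherwise null so the caller
    // can download the resource (e.g. with WebClient) and then store it via SaveToCache.
    public Stream TryGetCached(string name)
    {
        var store = IsolatedStorageFile.GetUserStoreForApplication();
        return store.FileExists(name)
            ? store.OpenFile(name, FileMode.Open, FileAccess.Read)
            : null;
    }

    public void SaveToCache(string name, Stream downloaded)
    {
        var store = IsolatedStorageFile.GetUserStoreForApplication();
        using (var file = store.CreateFile(name))
        {
            // Copy the downloaded stream into isolated storage in small chunks.
            var buffer = new byte[4096];
            int read;
            while ((read = downloaded.Read(buffer, 0, buffer.Length)) > 0)
            {
                file.Write(buffer, 0, read);
            }
        }
    }
}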

How to make hyperlinks call same C++ CGI process

So my C++ CGI program generates an HTML page with several links. How can I arrange, within the same C++ process, for a click on one of these links to display another page whose content depends on which hyperlink was clicked?
For now, my only approach is to have another C++ CGI program that reads a URL parameter with getenv, with a different parameter value for each link on my first page. But I believe there must be a way of doing this with one C++ program.
You are trying to store session information in the memory of your CGI program. The CGI protocol doesn't allow this by itself. You must store session information somewhere else. Your options are:
Output HTML where the result of your calculations is embedded in the URLs, so that the next execution will see those results (if that information is sensitive, this is a security flaw; you may overcome it with proper encryption).
Store results outside your C++ program's memory (a file, perhaps). Then output a cookie or embed a session identifier in the URLs. On the next execution, you look up the session identifier and load those results on the server (see the sketch after this list). You must take care to delete old data to avoid running out of space.
Turn your C++ application into a web server! Your C++ application will answer HTTP requests (it will no longer be only a CGI application). That may be overkill, but it might be necessary. I think there are free open-source libraries that help with that, or you can develop an Apache (httpd) module.
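The question is about C++, but to keep the code in this digest in one language, here is a C# sketch of the second option: per-session state kept in a file, keyed by an identifier that travels in the URL or a cookie. The directory and the simple string payload are illustrative assumptions; the same structure maps directly to C++ with ofstream/ifstream plus getenv("QUERY_STRING").
using System;
using System.IO;

class FileSessionStore
{
    private readonly string directory;

    public FileSessionStore(string directory)
    {
        this.directory = directory;
        Directory.CreateDirectory(directory); // no-op if it already exists
    }

    // First request: create a session and remember its state; embed the returned id in every link.
    public string CreateSession(string state)
    {
        string id = Guid.NewGuid().ToString("N");
        File.WriteAllText(Path.Combine(directory, id), state);
        return id;
    }

    // Later requests: the id comes back in the query string, so the state can be reloaded.
    public string LoadSession(string id)
    {
        string path = Path.Combine(directory, id);
        return File.Exists(path) ? File.ReadAllText(path) : null;
    }
}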
Hope that answers your question!

Comparing DLL sets from the same app on two different machines

Is there a good way to compare the DLLs loaded by the same app on two different machines? (And to repeat the process across N other machines, two at a time?)
Background: I am trying to track down a configuration/setup issue. It's the age-old, DLL-hell-type problem where an app will run on one machine but not on another.
I have eliminated our installer as an issue; it's stable, but there are differences between the target systems: different Windows flavors, MDAC versions, etc.
I have tried exporting EXE snapshots with Process Explorer to a delimited file and using Excel to do the comparison, but this is very time-consuming and error-prone. (I'm not ruling out Excel as a possibility; I just don't know enough tricks to use it for my purposes.)
I'd recommend you take a look at EasyHook. Using it, you can create a detour on all calls to LoadLibraryA and LoadLibraryW. This way you can monitor every file that gets loaded and get the path to it. After that, you can compare the files however you'd like. If you need help using EasyHook, let me know and I'll cook up an example.
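If hooking feels heavier than you need, here is a simpler (non-EasyHook) sketch: snapshot the modules a running instance of the app has loaded and write them to a text file you can diff between machines. The process name is a placeholder.
using System;
using System.Diagnostics;
using System.IO;
using System.Linq;

class ModuleSnapshot
{
    static void Main(string[] args)
    {
        // Note: reading another process's modules generally requires matching bitness
        // and sufficient privileges.
        string processName = args.Length > 0 ? args[0] : "MyApp"; // placeholder name
        var process = Process.GetProcessesByName(processName).FirstOrDefault();
        if (process == null)
        {
            Console.WriteLine("Process not found: " + processName);
            return;
        }

        // One line per loaded module: full path plus file version, which is what usually
        // differs between machines in DLL-hell scenarios.
        var lines = process.Modules
            .Cast<ProcessModule>()
            .Select(m => m.FileName + "\t" + m.FileVersionInfo.FileVersion)
            .OrderBy(l => l, StringComparer.OrdinalIgnoreCase);

        File.WriteAllLines(processName + "-modules-" + Environment.MachineName + ".txt", lines);
    }
}
Run it on each machine while the app is up, then diff the two output files; path and version differences usually point straight at the culprit.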