Excluding Android-specific assets from an iOS build in Titanium?

I have a single code base in Titanium that produces both Android and iOS versions. I want to exclude Android-specific images from my iOS build to keep the package as lean as possible.
Besides manually moving assets out of the app folder prior to doing a build, is there any way to achieve this?

This looks like a good resource for you. I'll admit I knew it was in there and it took me a while to find it.
Platform-specific resources
Titanium gives you various ways to include platform-specific resources, like images, stylesheets, and scripts, in your project. Titanium uses an "overrides" system to make it easy to use platform-specific resources. Any file in the platform-specific Resources directories (Resources/android, Resources/iphone, or Resources/mobileweb) will override, or be used in place of, those in the Resources directory. You don't have to use any special notation in your code to specify that these files should be used.
http://docs.appcelerator.com/titanium/3.0/#!/guide/Supporting_Multiple_Platforms_in_a_Single_Codebase
This is just a snippet of the information, there is more in the docs.
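For example (the file names here are just for illustration), a shared header image could be laid out like this; each build only packages the variant for its own platform, and your code keeps referring to the plain path:

    Resources/images/header.png            <- default, used when no override exists
    Resources/android/images/header.png    <- packaged into the Android build only
    Resources/iphone/images/header.png     <- packaged into the iOS build only

So anything that is truly Android-only (say, nine-patch images) can live under Resources/android and will be left out of the iOS package automatically.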

Related

Qt Application : How to create standalone executable file for Windows (& Mac) from Mac?

I developed a Qt application on a MacBook (El Capitan 10.11.2) and it is now ready to be released.
What I want now is to create the standalone executable file for both Mac and Windows OS.
But I don't know how!
I found this link but I am unable to follow its guidance; it looks different from what my system is showing me.
If you have any idea, please help me.
Thank you
Well, to compile an application for Windows, you will need a Windows machine (or at least a virtual machine). You can't compile for Windows on a Mac.
Regarding the "standalone": the easy way is to deploy your application together with all the required DLLs/frameworks and ship them as one "package". To do this, there are the tools windeployqt and macdeployqt. However, those will not be "single file" applications, but rather a collection of files.
If you want to have one single file, you will have to build Qt statically! You can do this, but you will have to do it on your own. And if you do, please note that the LGPL license (the one for the free version of Qt) then requires you to let users relink your application against their own build of Qt, which in practice usually means publishing your source code or object files. That's not an issue if you just link to the dynamic libraries.
EDIT:
Deployment
Deployment can be really hard, because you have to do it differently for each platform. Most of the time you will have three steps:
Dependency resolving: In this step, you collect all the executables/libraries/translations/... your application requires and put them somewhere they can find each other. For Windows and Mac, this can be done using the tools I mentioned above.
Installation: Here you will have to create some kind of "installer". The easiest way is to create a zip file that contains everything you need. But if you want a "nice" installation, you will have to create proper "installers" for each platform. (One of many possibilities is the Qt Installer Framework. Best thing about it: it's cross-platform.)
Distribution: Distribution is how to get your program to the user. On Mac you have the App Store; for Windows you don't. The best way is to provide the download on a website created for this (like SourceForge, GitHub, ...).
I can help you with the first step, but for the other steps you will have to research the possibilities and decide on a way to do it.
Dependencies
Resolving the dependencies can be done either by building Qt statically (this way you will have only one single file, but it is additional work because you will have to compile Qt yourself) or by using the dynamic build. For the dynamic build, Qt will help you to resolve the dependencies:
macdeployqt is rather easy to use. Compile your app in release mode and call <qt_install_dir>/bin/macdeployqt <path_to_your_bundle>/<bundle>.app. After that's done, all Qt libraries are stored inside the <bundle>.app folder.
For windeployqt it's basically the same: <qt_install_dir>\bin\windeployqt --release <path_to_your_build>\<application>.exe. All dependencies will be placed inside the build folder. (Hint: copy the <application>.exe into an empty directory and run windeployqt on that path instead. This way you get rid of all the build files.)
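Put together, a macOS release build plus deployment might look roughly like this (a sketch only; <qt_install_dir> and the bundle name are placeholders, and the optional -dmg switch additionally wraps the bundle into a disk image):

    cd <path_to_your_project>
    <qt_install_dir>/bin/qmake -config release
    make
    <qt_install_dir>/bin/macdeployqt <bundle>.app -dmg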
Regarding the static build: just google it, you will find hundreds of explanations for any platform. But unless you have no other choice but to use one single file (for whatever reason), I would recommend using dynamic builds. And regarding the user experience: on Mac, they won't notice a difference, since in both cases everything will be hidden inside the app bundle. On Windows, it's normal to have multiple files, so no one will be bothered. (And if you create an installer for Windows, just make sure to add a desktop shortcut. This way the user will have "a single file" to click.)
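For completeness, building Qt itself statically starts from the Qt sources and usually looks something like the following (a rough sketch for Qt 5 on Mac/Linux; the exact configure flags differ between Qt versions, so check the documentation for yours). Afterwards you rebuild your application with the qmake from that prefix:

    cd <path_to_qt_sources>
    ./configure -static -release -opensource -confirm-license -prefix /opt/qt-static -nomake examples -nomake tests
    make -j4
    make install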

Embedding project folders (Lua) into an exe

I have been searching for a long time for a way to embed project files such as folders (with Lua scripts and images) into an exe.
Basically I have some folders which are needed to run my game and I want to hide them somehow, because right now they are exposed and can easily be edited by anyone.
I saw a method in which folders were turned into a .dll file to protect them.
I am using Visual Studio 2013.
I'll be very thankful for an answer.
You can use PhysicsFS, which allows you to map an archive into a hierarchical virtual filesystem. From the project page: "It is intended for use in video games, and the design was somewhat inspired by Quake 3's file subsystem." It's used by some open-source Lua frameworks (for example, Love2d), so you may check how they implemented the integration and access.
This doesn't guarantee full protection (nothing does), but it will at least make it more difficult for the users to make changes to those resources you want to protect.
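A minimal sketch of what the access could look like from C++ (assuming PhysicsFS 2.1+ is installed and linked; the archive name assets.dat and the script path are made up for illustration):

    #include <physfs.h>
    #include <vector>

    int main(int argc, char* argv[]) {
        PHYSFS_init(argv[0]);                 // initialise with the program path
        PHYSFS_mount("assets.dat", NULL, 1);  // mount the archive (e.g. a renamed zip) into the virtual filesystem

        PHYSFS_File* f = PHYSFS_openRead("scripts/main.lua");   // path inside the archive
        if (f) {
            PHYSFS_sint64 len = PHYSFS_fileLength(f);
            std::vector<char> buf(static_cast<size_t>(len));
            PHYSFS_readBytes(f, buf.data(), static_cast<PHYSFS_uint64>(len));  // buffer can be handed to luaL_loadbuffer
            PHYSFS_close(f);
        }
        PHYSFS_deinit();
        return 0;
    }

Love2d, mentioned above, takes a similar route: the zipped game folder is appended to the executable and mounted through PhysicsFS at startup, which is a convenient pattern to copy.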

How to build a Dojo layer against an existing layer file?

I'm using Dojo 1.9, and am happily building layer files using the Dojo builder. What I'd like to be able to do is build my layer files containing my modules which refer to third party modules, but where I only have pre-built layer files containing those modules rather than the individual third party module files.
(There are two reasons for wanting this: sometimes I don't have the individual module files, just the layer files, and sometimes even if I do have the individual module files, I have no intention of bundling them into my layer, and so don't want to increase the build time by having the builder scan all of those module files.)
If I have the raw module-by-module source for those third party modules, I can make it all work fine, but can it be made to work if I only have the pre-built Dojo layer?
I've tried specifying 'exclude' options in my layer specification, but that seems to affect which modules are generated into my layer rather than which modules it tries to locate as individual module files.
Is there a way to do this?
A bit late, but hoping this may be of help to someone. Facing the same absence of a relevant answer, I eventually gave up trying to get it done with the builder, especially given that my third-party code wouldn't build even by itself. Instead, I set up grunt-contrib-concat to output AMD layers and built the project with it, taking care to explicitly reference the library layers within dojoConfig.
https://github.com/mdolidon/grunt-amd-concat

Source code dependency manager for C++

There are already some questions about dependency managers here, but it seems to me that they are mostly about build systems, while I am looking for something targeted purely at making dependency tracking and resolution simpler (and I'm not necessarily interested in learning a new build system).
So, typically we have a project plus some common code shared with another project. This common code is organized as a library, so when I want to get the latest code version for a project, I should also get all the libraries from source control. To do this, I need a list of dependencies. Then, to build the project, I can reuse this list too.
I've looked at Maven and Ivy, but I'm not sure if they would be appropriate for C++, as they look quite heavily Java-targeted (even though there might be plugins for C++, I haven't found people recommending them).
I see it as a GUI tool producing some standardized dependency list which can then be parsed by different scripts etc. It would be nice if it could integrate with source control (tag, get a tagged version with dependencies etc), but that's optional.
Would you have any suggestions? Maybe I'm just missing something, and usually it's done some other way with no need for such a tool? Thanks.
You can use Maven with C++ in two ways. First, you can use it for dependency management between components. Second, you can use the Maven NAR plugin for creating shared libraries and unit tests (in my experience, together with the Boost library). In the end you can create RPMs out of it (maven-rpm-plugin) to have an adequate installation medium. Furthermore, I have created the installation for a CI environment via Maven (RPMs for Hudson, a Nexus installation as RPMs).
I'm not sure if you would see a version control system (VCS) as a build tool, but Mercurial and Git support sub-repositories. In your case a sub-repository would hold your dependencies:
Join multiple subrepos into one and preserve history in Mercurial
Multiple git repo in one project
Use your VCS to archive the build results -- needed anyway for maintenance -- and refer to the libs and header files in your build environment.
If you are looking for a reference take a look at https://android.googlesource.com/platform/manifest.
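With Git, a sketch of that setup (the repository URLs and paths are placeholders) would be:

    # register the shared library as a sub-repository of the project
    git submodule add https://example.com/common-lib.git libs/common-lib

    # clone the project together with all its dependencies
    git clone --recurse-submodules https://example.com/project.git

    # or pull the dependencies into an existing checkout
    git submodule update --init --recursive

The build environment then simply points at libs/common-lib for headers and libraries, and the submodule pin gives you the tagged-version-with-dependencies behaviour you mentioned.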

Keeping Eclipse-generated makefiles in version control - any issues to expect?

We work under Linux/Eclipse/C++ using Eclipse's "native" C++ projects (.cproject). The system comprises several C++ projects, all kept under SVN version control using the integrated Subclipse plugin.
We want to have a script that would check out, compile and package the system, without us needing to drive this process manually from Eclipse, as we do now.
I see that there are generated makefiles and support files (sources.mk, subdir.mk, etc.) scattered around, which are not under version control (probably the Subclipse plugin is "clever" enough to exclude them). I guess I could put them under SVN and use them in the script we need.
However, this feels shaky. Has anybody tried it? Are there any issues to expect? Are there recommended ways to achieve what we need?
N.B. I don't believe the idea of adopting another build system will be received well unless it's SUPER-smooth. We are a small company of four developers running full steam ahead, and any additional overhead or learning curve will not be appreciated :)
thanks a lot in advance!
I would not recommend putting things that are generated in an external tool into version control. My favorite phrase for this tactic is "version the recipe, not the cake". Instead, you should use a third party tool like your script to manipulate Eclipse appropriately to generate these files from your sources, and then compile them. This avoids the risk of having one of these automatically generated files be out of sync with your root sources.
I'm not sure what your threshold for "super-smooth" is, but you might want to take a look at Maven2, which has a plugin for Eclipse projects to do just this.
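One way such a script can drive Eclipse without versioning any generated files is the headless build application that ships with Eclipse CDT (a sketch; the SVN URL, workspace path and project locations are placeholders):

    # fetch the sources
    svn checkout http://svn.example.com/repo/trunk src

    # import all projects into a throwaway workspace and build them headlessly
    eclipse -nosplash -data /tmp/headless-ws \
        -application org.eclipse.cdt.managedbuilder.core.headlessbuild \
        -importAll src -build all

This regenerates the makefiles from the checked-in .project/.cproject files on every run, so they never need to go into SVN.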
I know that this is a big problem (I had exactly the same one; in addition, maintaining a build workspace in SVN is a real pain!).
Problems I see:
You will get into problems as soon as somebody adds or changes project settings files but doesn't trigger a new build for all possible platforms (the makefiles aren't updated).
There is no overall makefile, so you cannot easily reuse the build order of your projects that Eclipse has calculated.
BTW: I wrote an Eclipse plugin that builds up a workspace from a given (textual) list of projects and then triggers the build. That's possible but also not an easy task.
Unfortunately I can't post the plugin somewhere because I wrote it for my former employer...