Same application name on multiple IIS sites - ColdFusion

We're majorly redesigning our website, in terms of both aesthetics and architecture. In order to avoid locking up the application against updates during this redesign, we'd like to redesign one module at a time and just pass users between the old and new sites as needed. However, we have a variety of session information, including their logins, that we need to preserve throughout this shuttling.
If we set up two different webroot folders with ColdFusion files (one for the old site and one for the new), each with its own IIS site, but set up the Application.cfc files in both to use the same application name, will the session and application scopes be preserved as users move back and forth between the two sites? Will this cause any unexpected issues?
We are using ColdFusion 9.
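Roughly, the Application.cfc in both webroots would look something like this (just a sketch; the application name and timeouts below are placeholders, not our real values):

<!--- Application.cfc in BOTH webroots (old and new site) --->
component
{
    this.name = "ourSharedApp";                              // identical in both sites
    this.applicationTimeout = createTimeSpan(1, 0, 0, 0);
    this.sessionManagement = true;
    this.sessionTimeout = createTimeSpan(0, 0, 30, 0);
    // The CFID/CFTOKEN cookies also have to reach both sites,
    // e.g. by serving both from the same domain.
    this.setClientCookies = true;
}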

While this will work, as Dan has suggested, it's not a great idea. For one thing, you will have no ability to change anything in the application scope on site A without affecting site B. This can introduce head-scratching bugs. Consider DSN names, which often live in the application scope. Unless you are using the exact same DSN for both sites, whichever site initializes the application first dictates which DSN is used, so you basically have to live with identical application scopes on both sites.
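To make the DSN point concrete, imagine both sites effectively sharing an onApplicationStart() like this (a sketch only; the DSN and variable names are invented):

// In the shared application, whichever site starts the application first
// sets these values, and the other site simply inherits them.
function onApplicationStart()
{
    application.dsn = "siteA_dsn";      // invented name; site B will see this too
    application.siteRoot = "/";         // any site-specific value here is a trap
    return true;
}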
Still, it can be done. See this post for a plain example that might be helpful.
Application Variables on ICE - ColdFusion Muse

Related

How to migrate a web application to another technology in a seamless, incremental way

I have a ColdFusion application, developed without any framework and with almost no architecture.
I'd like to slowly migrate parts of the application to some kind of Java-based web application framework.
The original application must remain the main application all the way until the end of the migration.
The application has users, sessions, and a lot of functionality that is not easy to decouple.
I'm looking for different ideas.
For example, I could try to develop a REST API back end and start using it from the ColdFusion application, until ColdFusion is left as the front end only. This process must go hand in hand with database refactoring and migrations, i.e. the new API must use its own database, so I guess that decoupling the database will be necessary (and database synchronization issues will probably arise).
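Just to sketch the idea (the endpoint, token and field names below are invented), a ColdFusion page could delegate a single feature to the new back end over HTTP:

<!--- Invented endpoint, purely to illustrate delegating one feature --->
<cfhttp url="https://newapi.example.com/v1/orders/#url.orderId#" method="get" result="apiResponse">
    <!--- pass the logged-in user's identity so the new back end can check permissions --->
    <cfhttpparam type="header" name="Authorization" value="Bearer #session.apiToken#">
</cfhttp>
<cfset order = deserializeJSON(apiResponse.fileContent)>
<!--- 'order' is then rendered with the existing ColdFusion templates --->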
But I would also like to switch to a different front-end technology.
The situation would be: users log in through ColdFusion and enter the main dashboard in ColdFusion, and then "some" functionality is handled by another framework. That means the other technology's template engine must understand the user, their roles and their permissions, and reproduce the same graphical layout, while serving the content itself.
The final result must be that all functionality is migrated to the new technology.
I mention Java as it is in some way related to ColdFusion, but any web application framework could be used in principle.
I also think that the fact that ColdFusion is used in the original application is not really relevant. The fact that no framework is used probably gives me more flexibility.
Any architectural suggestion is welcome.
I recently did this very thing: a spaghetti-code app developed by a graphic designer with a CFML book, and an MSSQL database so denormalized that not even Bizarro could have wrestled it and won.
What we did was we left the old app in place, built a new one and tested it extensively, perfected it, made sure it was capable of evolving (and it has). Then we flipped a DNS switch and sent out an announcement that the new system is the only option going forward.
I then built an archive feature that provided read-only access to the mess that we left behind. Some day, if the app owners want to migrate that data, we can try that (it won't go well), but generally the vast improvements have convinced everyone that the old stuff is only needed in that read-only mode.
Obviously that means reporting only goes back to the first day of the new tool, but it was either that (with a 4 week dev time) or they get it all and it takes 8 months doing what you described.
It's worked very well for us, and it's probably been the best-received work I've done here.

Tracking users on internet

We want to track users when they visit our competitors' websites.
Suppose the user visits a.com (our site) and then visits b.com and c.com (our competitors); we want to know that the user has visited those sites. Is there a way to track this without using plugins like Chrome extensions?
I know there are limitations with cookies. Is there a way to do this?
Thankfully, no.
From time to time, accidental mechanisms that leak such information become known (for example, at one time it was possible to inspect link styles to see whether a link had been visited). Such things are regarded as serious security breaches and are swiftly eliminated by browser vendors.
Basic web security principles isolate the interaction with unrelated domains, unless both web pages willingly try to talk to each other. Furthermore, if a user visits your site first and then closes it, their interaction with you is finished; in no sane scenario would you be able to know what they do afterwards.
Not to mention the obvious fact that what you're asking for is plainly unethical.
Unless you can get your competitors to give you access to their websites, either through included scripts or indirectly by tracking users through ad networks or Facebook pings, it's not possible. You cannot access (and should not be able to access, in any case) cookies set by another domain.
Some browsers allow access to the previously visited URL through document.referrer, but that is not guaranteed to work, nor is it reliable. In short, there's no way of doing this in a targeted fashion.
Not being able to easily track people is a good thing, but it makes constructions like the one you want a lot harder.

Is it possible to use Django and Node.Js?

I have a Django back end set up for user logins and user management, along with my entire set of templates, which are used by visitors to the site to display HTML pages. However, I am trying to add real-time functionality to my site, and I found a perfect library for Node.js that allows two users to type in a text box and have the text appear on both their screens. Is it possible to merge the two back ends?
It's absolutely possible (and sometimes extremely useful) to run multiple back ends for different purposes. However, it opens up a few cans of worms, depending on what kind of rigour your system is expected to have, who's on your team, etc.:
State. You'll want session state to be shared between the different app servers. The easiest way to do this is to store session state externally in a framework-agnostic way. I'd suggest JSON objects in a key/value store, and you'll probably benefit from a JSON schema.
Domains/routing. You'll need your login cookie to be available to both app servers, which means either a single domain routed by Apache/Nginx or separate subdomains routed via DNS. I'd suggest separate subdomains, for the following reason:
Websockets. I may be out of date, but to my knowledge neither Apache nor Nginx supports proxying of WebSockets, which means that if you want to use them you'll sacrifice the flexibility of using an HTTP server as an app proxy and instead expose Node directly via a subdomain.
Non-specified requirements. Things like monitoring, logging, error notification, build systems, testing, continuous integration/deployment, documentation, etc. all need to be extended to support a new type of component.
Skills. You'll have to pay in time or money for the skill sets required to manage a more complex application architecture.
So, my advice would be to think very carefully about whether you need this. There can be a lot of time and thought involved.
Update: There are actually companies springing up that specialise in adding real-time functionality to existing sites. I'm not going to name any names, but if you look for 'real-time' on the add-on marketplace for hosting platforms (e.g. Heroku), you'll find them.
Update 2: Nginx now has support for WebSockets.
You can't merge them. You can send messages from Django to Node.js through a queue system like Redis.
If you really want to use two back ends, you could use a database that is supported by both.
Though I would not recommend it.

Is it a bad idea for multiple ColdFusion applications to use the same client variable store?

We have two ColdFusion applications that share a common database. There are three instances of each application. (One instance of each application runs on each of three servers.)
I can see that the three instances of a given application should share a client variable store. (Load-balancing can cause a single user session to bounce between the three instances.) My question is: Is there any danger to having all instances of both applications share the same data store? Or should only one application be pointing at a given data store?
You can use the same client data store. The CDATA table has an 'app' column that stores the ColdFusion application name. That column will keep your data unique to each application.
I'm working at an enterprise level ColdFusion shop with multiple CF applications running on the same server that are all pointed at the same client variable store. The only concern within the organization is how the client variable store affects regular backups, and that falls under the data team's purview. We don't have any problems with the different apps actually using the same client variable storage.
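In practice that just means each application keeps its own name but points at the same client storage; roughly (a sketch, with placeholder names):

component
{
    this.name = "appA";                      // "appB" etc. in the other application
    this.clientManagement = true;
    this.clientStorage = "sharedClientDB";   // the one datasource every app/instance shares
}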
Related, from the ColdFusion documentation:
Some browsers allow only 20 cookies to be set from a particular host. ColdFusion uses two of these cookies for the CFID and CFToken identifiers, and also creates a cookie named cfglobals to hold global data about the client, such as HitCount, TimeCreated, and LastVisit. This limits you to 17 unique applications per client-host pair.
I guess this deals more with how many applications you actually run rather than whether you have them all share the same client data store, but it does suggest that there may be some kind of hard limit on the total number of apps you can run at once, although I'd recommend splitting across hosts (or just using a different domain name) if you're planning on more than 16 apps!
As Eric stated above, running multiple apps off of one data store is fine. What I would warn you about is that these databases can fill up fast if you're not careful to block spiders and search engines from using them. Because CF creates client variables on each request for a new session, a search engine gets a new set every time: it never sends its old cookies back, so CF thinks it's a new user who needs a new set of client variables. Also, be absolutely certain to check "Disable global client variable updates" in the CF administrator. This will save you a lot of unnecessary overhead.
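One way to keep crawlers out of the client store is to switch client management off when the user agent looks like a bot; a rough sketch (the application name and pattern list are only examples):

// Application.cfc pseudo-constructor (runs on every request)
component
{
    this.name = "myApp";
    // Don't create client records or hand out CFID/CFTOKEN cookies to obvious crawlers
    isBot = reFindNoCase("(bot|crawler|spider|slurp)", cgi.http_user_agent) > 0;
    this.clientManagement = !isBot;
    this.setClientCookies = !isBot;
}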
I would think that multiple applications sharing the same data store would open up the possibility of users from one application having access to the other applications. While it may not be likely, the possibility could exist. (I don't have any facts to back that up, it just seems like a logical conclusion).
The question is, are you comfortable with that possibility, or do you have to absolutely make sure each application is secure?

Offline web application

I’m thinking about building an offline-enabled web application.
The architecture I’m considering is as follows:
Web server (remote) <--> Web server/cache (local) <--> Browser/Prism
The advantages I envision for this model are:
Deployment is web-based, with all the advantages of this approach
Offline-enabled
UI (html/js) synchronization is a non-issue
Data synchronization can be mostly automated as long as I stay within a RESTful paradigm; I can break this as required, but manual synchronization would largely remain surgical
The local web server is started as a service; I can run arbitrary code, including behind-the-scenes data synchronization
I have complete control of the data (location, no size limit, no possibility of the user deleting it unknowingly)
Prism with an extension could allow me to keep the JavaScript closed source
Any thoughts on this architecture? Why should I / shouldn’t I use it? I'm particularly looking for success/horror stories.
The long version
Notes:
Users are not very computer-literate. For instance, even superficially explaining how Gears works is totally out of the question.
I WILL be held liable if data is lost, even if it's really the user's fault (short of them deleting random directories on their machine).
I can require users to install something on their machine. It doesn’t have to be 100% web-based and/or run in a sandbox
The common solutions to this problem don’t feel adequate somehow. Here is a short analysis of each.
Gears/HTML5:
no control over data; it can be deleted by users without any warning
no control over location of data (not uniform across browsers and platforms)
users need to open the application in a browser for synchronization to happen; no automatic, behind-the-scenes synchronization
different browsers are treated differently; there is no uniform view of data on a single machine
limited disk space available
synchronization is completely manual, and SQL-based storage makes this a pain (it would be less complicated if the SQL tables were completely replicated, but that's not so in my case). This is a very complex problem.
my code would be almost completely open source (HTML/JS)
Adobe AIR:
some of the above
no server-side includes (!)
can run in the background, but not windowless
manual synchronization
web caching seems complicated
feels like a kludge somehow; I've had trouble installing it on some machines
My requirements are:
Web-based (must). For a number of reasons, sharing data between users for instance.
Offline (must). The application must be fully usable offline (w/ some rare exceptions).
Quick development (must). I’m a single developer going against players with far more business resources.
Closed source (nice to have). Yes, I understand the open source model. However, at this point I don’t want competitors to copy me too easily. Again, they have more resources so they could take my hard work and make it better in less time than I could myself. Obviously, they can still copy me developing their own code -- that is fine.
Horror stories from a CRM product:
If your application is heavily used, storing a complete copy of its data on a user's machine is unfeasible.
If your application features data that can be updated by many users, replication is not simple. If three users with local changes synch, who wins?
In reality, this isn't really what users want. They want real-time access to the most current data from anywhere. We had better luck offering a mobile interface to a single source of truth.
The part about running the local Web server as a service appears unwise. Besides the fact that you are tied to certain operating environments available on the client, you are also imposing the additional burden of managing the server on the end user. Additionally, the local Web server itself cannot be deployed in a Web-based model.
All in all, I am not too thrilled by the prospect of a real "local Web server". There is a certain bias to it, no doubt since I have proposed embedded Web servers that run inside a Web browser as part of my proposal for seamless off-line Web storage. See BITSY 0.5.0 (http://www.oracle.com/technology/tech/feeds/spec/bitsy.html)
I wonder how essential your requirement to prevent data loss at any cost is. What happens when you are offline and the disk crashes? Or the device is lost? In general, you want the local cache to be as little ahead of the server as possible, but be prepared to tolerate loss of data to the extent that the server is behind the client. This may involve some amount of contractual negotiation or training. In practice this may not be a deal-breaker.
The only way to do this reliably is to offer some sort of "check out and lock" at the record level. When a user is going remote, they must check out the records they want to work with. This check-out copies the data to a local DB and prevents the record in the central DB from being modified while it is checked out.
When the roaming user reconnects and checks their locked records back in, the data is updated in the central DB and the records are unlocked.