We need to incorporate ColdFusion pages into a DotNetNuke site. For example, a login page consisting of a simple login form: on submit, a ColdFusion CFC web service is called to check the credentials and return a success flag. I am brand new to DotNetNuke and don't even know if this can be accomplished. Googling reveals next to nothing, which probably isn't a good sign. If anyone can provide a really simple example of how to do this, I would be extremely grateful.
The easiest way to do this would be to use the IFrame module in DotNetNuke. If you don't have the IFrame module available, check out the Host/Extensions page and click the Available Extensions option.
Your ColdFusion pages will have to be hosted somewhere else, but within the IFrame they can be loaded into a DotNetNuke website.
Long term, you would likely be best served by rewriting that functionality as a DotNetNuke module, or finding a module that provides the same functionality.
I am concerned about page ranking on Google in the following situation:
I am looking to convert my existing site, which has 150k+ unique page results, to an Ember app driven off the route. Currently the URLs look like domain.com/model/id; with Ember and hash change they will become /#/model/id. I really want the history state API, but the lack of IE support doesn't leave that as an option. My sitemap for Google has lots and lots of great results using the old model/id URLs. On the Rails side I will test the browser for compatibility before rendering either the JS-rich app or the plain HTML/CSS. Does anyone have good SEO suggestions for success with my current schema?
Linked below is my schema, for reference when looking at the options:
http://static.allplaces.net/images/EmberTF.pdf
History state is awesome, but it looks like it is supported by only around 60% of browsers.
http://caniuse.com/history
Thanks, guys, for the suggestions; the Google guide is similar to what I'm going to try. I will roll it out to one client this month and see what Webmaster Tools and Analytics show.
Here is everything you need to make your hash links SEO-friendly: https://developers.google.com/webmasters/ajax-crawling/
Basically, you write your whole app with hash links, but you have to add "!" to them, so you have #!/model/id. You also have to have all pages generated somewhere, and when Google asks for them, return the plain HTML as described here: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
Use Google Webmaster Tools to check whether your site is crawlable.
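To make that concrete, here is a minimal sketch of the server-side half of the scheme. Google's crawler rewrites domain.com/#!/model/42 into a request for domain.com/?_escaped_fragment_=/model/42, and your server has to answer that request with plain HTML. The sketch below uses Python/Flask purely for illustration (the asker's backend is Rails, where the same check applies), and render_snapshot is a hypothetical helper:

from flask import Flask, request, render_template

app = Flask(__name__)

def render_snapshot(fragment):
    # Hypothetical helper: look up the model behind e.g. "/model/42"
    # and return pre-rendered, crawlable HTML for it.
    model_id = fragment.rsplit("/", 1)[-1]
    return "<html><body><h1>Model %s</h1></body></html>" % model_id

@app.route("/")
def index():
    # The crawler turns domain.com/#!/model/42 into
    # domain.com/?_escaped_fragment_=/model/42
    fragment = request.args.get("_escaped_fragment_")
    if fragment is not None:
        return render_snapshot(fragment)
    # Normal visitors get the JS-rich Ember app.
    return render_template("index.html")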
I'm not sure if you're aware that you can configure Ember to use the browser's history API for its location handling and keep your URLs the way they are referenced now. All you need to do is configure the Router's location property:
App.Router.reopen({
location: 'history'
});
See more details about specifying the location API here.
I am working on a hosted CMS, and am thinking about allowing site editors to add custom JavaScript and HTML (a much-requested feature).
I am concerned that this will open up an attack vector: nasty JS could make calls to the functions that our hosted CMS exposes (see the Samy worm for an example of what user scripts did to MySpace). But I really want to give users control over their site (what's the point of a CMS you can't add your own clever stuff to?).
What is a good approach to fixing this issue? I can think of several which I would like commentary on, but am not going to list them for fear of the 'no list questions mods'!
I suspect that Caja is on your list, so I'll mention that this is squarely in Caja's use cases; for example, Google Sites is very like a CMS and uses Caja to embed arbitrary JS and HTML.
Caja host pages can provide arbitrary additional interfaces for use by the sandboxed content, which can include, for example, embedding widgets provided by your CMS inside the user-supplied HTML while maintaining encapsulation.
(Disclosure: I work for Google on the Caja team.)
I am an ASP.NET MVC / WebForms developer by trade, so all of the websites/apps I have created in the past let me use Master/Layout pages for the look and feel of the whole site while changing just the parts specific to each page.
Now, I am doing some freebie web work for a friend and I want to write it in HTML, CSS, and JavaScript using Aptana 3 so that it can be hosted wherever. Master pages are not an option for me since I am no longer in the ASP.NET/Visual Studio world, so I am looking at Server Side Includes to accomplish this. My question:
Is this a good use of SSI? I am seeing conflicting forum posts: some say SSI should only be used for small pieces of the page (like a specific piece of text, the time, etc.). I want it to generate a large portion of my page: things like the footer, footer links, menus, banner image, etc. Basically, I want to use SSI for most of the page and then just plug in the pieces specific to each page. Have others done this in the past with success?
If the purpose behind your choice is so that it can be hosted anywhere, then even using SSI may be restrictive, since it relies on that functionality being enabled on the server.
Having said that, it is a valid option, but personally I would suggest familiarising yourself with the equivalent options in both PHP and .NET so that you are comfortable adapting your code to either. You are rarely going to be asked to move a site hosted on one framework to another, and if you are, you can factor changing the code into your costs.
I'm interested in emulating the functionality of a web browser in C++ so that I can create a wrapper for several web sites. Right now, the biggest issue with these sites is that they make heavy use of JavaScript that interacts with the HTML DOM. Thus, the simple solution of using curl to download the page and something like RapidXML to parse its contents is out.
Next, I considered using something like v8 with curl, and that solves the issue of interpreting the JavaScript on the page nicely. However, it doesn't solve the issue of connecting the HTML DOM methods with the JavaScript; in other words, document.getElementById() would fail in v8.
Next, I considered WebKit, which seems like it's perfectly suited to emulate a web browser--after all, Chromium and Safari both utilize it in their web browsers. However, it's a little too complete. I don't need all of the rendering aspects it includes.
So, I'd be looking for some way to:
Make an SSL connection to a web site
Interpret the JavaScript on that web site in connection with the HTML DOM
Set the value of the username/password <input> fields with my username and password
Simulate clicking the "Submit" button by calling the formSubmit() function, from <input type="button" onClick="formSubmit()">
Handle the HTTP POST form action and the subsequent HTTP 301 and JavaScript redirects (accomplished using window.location)
Repeat 2-5 as needed
Besides what I've already considered, what other options do I have? Ideally, I'd want this to be extremely lightweight, without requiring linking to many libraries.
I'm primarily concerned with developing for Windows 7 64-bit.
Well, this sounds all too much like a brute-force program. Disregarding that, and since you don't seem to need to render the website, I think you should just fetch the page through cURL or something, parse it, find the form with a regex, retrieve the form action, and then make a request using the method taken from the <form> tag and whichever inputs you want.
The problem is that there is no general way to know whether you've logged in successfully unless you do some kind of per-site checking. That comes mainly from the fact that many sites use sessions rather than plain cookies or HTTP auth, and since you can't read the session directly, it is impossible to tell when the session has changed.
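For what it's worth, the shape of that flow looks roughly like the sketch below. It's written in Python with the requests library only to keep it short (the question targets C++, where cURL/libcurl would play the same role), and the URL, field names, and regex are all made up:

import re
import requests

session = requests.Session()

# 1. Fetch the login page (hypothetical URL).
html = session.get("https://example.com/login").text

# 2. Pull the form action out with a regex (fragile, but lightweight).
match = re.search(r'<form[^>]+action="([^"]+)"', html)
action = match.group(1) if match else "/login"

# 3. POST the credentials using whatever field names the form declares
#    (hypothetical names here); cookies set by the site stay in the session.
resp = session.post("https://example.com" + action,
                    data={"username": "me", "password": "secret"},
                    allow_redirects=True)

# 4. Per-site check for success, e.g. look for a logout link in the response.
logged_in = "logout" in resp.text.lower()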
That's the most lightweight solution I can come up with right now.
I'm developing a web application. It's months away from completion, but I would like to build a landing page to show to potential customers, to explain things and gauge their interest: basically collecting their email addresses and, if they feel like it, additional information like names and addresses.
Because I'm already using Django to build my site, I thought I might use another Django app to serve as this landing page. The features I need are:
to display a fairly static page and potentially a series of pages,
collect emails (and additional customer data)
track their actions, e.g., they got through the first two pages but didn't fill out the final page.
Is there any pre-existing Django app that provides any of these features?
If there is not a Django app, does anyone know of another, faster/better way than building my own app? Perhaps a pre-existing web service that you can skin and make look like your own? Maybe there's a perfect system but it's PHP? I'm open to whatever.
Option 1: Google Sites
You can set it up very, very quickly, though your tracking wouldn't be as detailed as you're asking for. Still, easy and fast!
Option 2: bbclone
Something else that may be helpful is to set up a PHP-based site (WordPress or something) and use bbclone for tracking on it. I've found bbclone to be pretty thorough in reporting what every visitor does, though it's been a while since I used it.
Option 3: Django Flatpages
The flatpages Django contrib app is pretty handy for making static flat pages. I'd probably just embed a Google Docs form to collect email addresses (as that's super fast and lets you get back to real work). But this suggestion would still leave you needing to figure out how to get the level of detail you want on the stats side.
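If you do go the flatpages route, the wiring is small. A rough sketch, assuming a Django 1.8-era URLconf (newer versions use path() instead of url()); the Lead model is my own invention for the email-collection part, not an existing app:

# settings.py -- enable the flatpages contrib app
INSTALLED_APPS += (
    'django.contrib.sites',
    'django.contrib.flatpages',
)
SITE_ID = 1

# urls.py -- serve the mostly-static landing pages as flatpages
from django.conf.urls import include, url

urlpatterns += [
    url(r'^pages/', include('django.contrib.flatpages.urls')),
]

# models.py -- minimal lead capture, if you'd rather not embed a Google form
from django.db import models

class Lead(models.Model):
    email = models.EmailField()
    name = models.CharField(max_length=200, blank=True)
    created = models.DateTimeField(auto_now_add=True)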
Perhaps consider Google Analytics anyway?
Regardless, I suggest you use Google Analytics with everything. That'll work with anything you do really, and for all I know, perhaps you can find a way to get the stats you're really looking for out of it.