I created PHP pages for a registration form with the fields FirstName, LastName and CompanyName. When I fill in the form and hit submit, the data is stored in a MySQL database. I also created a page that displays the data added through the registration form; it has a button that displays the data. My question is: when I click that button, can the displayed data be stored on the iPhone? Is that possible, or is there another solution? And how do I use PHP with Xcode?
You appear to be confusing server-side and client-side programming.
If you want a web page to save data locally on the iPhone, you will have to use HTML5 and JavaScript, specifically HTML5's local database feature (if the iPhone doesn't support this currently, it certainly will soon).
PHP will run on the server, if you want to save the data on the server-side.
If you want to run stuff only locally on the iPhone, then you'll want to look into Xcode and Objective-C. The iPhone uses SQLite as a local database for applications to use.
Are you opening your webpage in Safari on the iPhone, or are you planning on having an application running?
If you have a server with a PHP application, why would you want to store the "display data" on the iPhone? Is this for offline use?
Users can copy/paste info into Notes on their iPhone, though this is not a very handy solution.
If you want the phone to remember the data locally, you can set the Expires header of your application to a date far in the future. Be careful, though: this is risky if you change data and the client never learns about it because it thinks it still has a valid local copy.
Also, read this (short) article about caching data with HTML on the iPhone: http://ajaxian.com/archives/html5-features-in-latest-iphone-application-cache-and-database
That suggests it is supported, but I wouldn't know how; I'm sure Google can help you there.
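For what it's worth, the client-side database that article talks about was the Web SQL openDatabase API (long since deprecated); a minimal sketch, with made-up database and table names:

// Open (or create) a small client-side database and store one record in it
var db = window.openDatabase('registrations', '1.0', 'Registration data', 1024 * 1024);
db.transaction(function (tx) {
    tx.executeSql('CREATE TABLE IF NOT EXISTS people (firstName TEXT, lastName TEXT, companyName TEXT)');
    tx.executeSql('INSERT INTO people (firstName, lastName, companyName) VALUES (?, ?, ?)',
                  ['Jane', 'Doe', 'Example Inc.']);
});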
You could just capture the user's information in standard UITextFields and then submit the form programmatically using ASIFormDataRequest from the ASIHTTPRequest library. It's made specifically for posting data to a form.
Firstly, sorry if this question has been asked before, but I'm a novice, so even if it has, I'm not sure what terms I'd use to search for it.
I'm beginning to learn about REST APIs and it got me thinking: is it possible to load the JSON response directly from the API server into the user's browser and bypass your own server?
Imagine you have say a Django app running on a server that accesses email messages from Outlook.com using the graph API.
I assume an ordinary flow would go something like:
User request -> your server -> Graph API -> your server -> user's browser.
It seems like a waste for it to hit your server that second time before it goes on to be presented to the user's browser.
Is there a way the Django app can render a template and effectively tell the browser "expect some data from source X, and place it in location Y in this template"?
You could do that with JavaScript. You'd have to include either a script tag in your template, or create and include some static JavaScript files with the code.
I'd recommend learning and using the jQuery JavaScript library, as it makes what you're talking about much easier to implement. Research Ajax requests; those are what you'll need to make requests directly to another server, bypassing your own.
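As a rough sketch with jQuery (the endpoint URL here is just a placeholder, and the remote API has to allow cross-origin requests, e.g. via CORS, for the browser to call it directly):

// Ask the third-party API directly from the browser, skipping your own server
$.getJSON('https://api.example.com/messages', function (data) {
    // Drop the response into a placeholder element your Django template rendered
    $('#messages').text(JSON.stringify(data));
});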
I have an existing Django project in the sports domain with apps that are built on the Model-View-Template structure. Many of the models and views are fairly sophisticated and work well currently. Data from the database (scores etc) is combined with some incoming user inputs through forms (HTTP POST requests) to be displayed on a web page via templates.
However, I now need live data to be displayed to users and be automatically refreshed to all the users continuously, either because one of the users entered something new (in the front-end), or because the scores changed during the game (directly enters the back-end).
I've done some research on Stack Overflow as well as tutorials on YouTube and the rest of the web, and it appears that to use Django Channels I'd have to start from scratch and build everything from the ground up, which I'd like to avoid. How do I use the websocket protocol for my Django app easily, without having to completely rework everything I've done so far?
You don't really need to start from scratch or anything; you just need to add a module using Channels. I am assuming that currently the data is fetched only when the page is refreshed. What you need to do is write a consumer that sends messages directly to the client via a websocket. Then, on the front end, you can update the widget with the scores on each message received over the websocket. You can also stream user actions through the websocket to the server, which the consumer will then broadcast to the clients that need them. You may not even need to change anything in the existing code.
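On the front end that can be as little as a few lines; a minimal sketch, assuming a Channels routing entry at /ws/scores/ and a score element in the template (both names are made up here):

// Open a websocket to the consumer and update the score widget on every message
var socket = new WebSocket('ws://' + window.location.host + '/ws/scores/');
socket.onmessage = function (event) {
    var data = JSON.parse(event.data);
    document.getElementById('score').textContent = data.score;
};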
It will be easier to understand how this works and how you can incorporate it into your project by reading the Channels tutorials. It became clearer to me after reading them, so I would advise you to do the same.
Recently I have created some UI tests for a qooxdoo application with the built-in simulator (I am using qooxdoo 3.0.1, selenium-server-standalone-2.35.0 and Firefox 23),
and I need to store a cookie in the browser and have it persist for the next time the browser opens.
Code that stores the cookie:
// Create the cookie only if it isn't already present, then read it back
if (!this.getQxSelenium().isCookiePresent(debugVariable)) {
    this.getQxSelenium().createCookie("debugVariable=0", "path=/, max_age=350000, domain=subdomain.foo.com");
}
console.log(this.getQxSelenium().getCookieByName("debugVariable"));
I have found that the server has an argument, -profilesLocation, that specifies the directory holding the Firefox profiles that Java clients can use to start up Firefox.
I even tried to use -browserSessionReuse, but it is not working for me either.
I see this is not enough. What other solution could I try to make Firefox remember the cookies?
This is not a qooxdoo-specific issue. I tried it with a plain HTML+JS page and Selenium's -firefoxProfileTemplate option and it didn't keep the cookie either.
You could try using an older version of Selenium (and perhaps also Firefox). -firefoxProfileTemplate is specific to Selenium RC, which is deprecated and gets more broken with every new release.
I've got fw1 using the content of the default.cfm page as the editable content region. While this works fine for static content, I'd like to add the ability to edit the content in the browser with fckeditor or some other WYSIWYG tool.
Is there any tool you could recommend that would make this easy? I don't want to convert to a CMS like Mura; I just want to log in and be able to edit the contents of about 5 files, with the possibility of creating a timestamped backup of each file.
We have the concept of a dynamic text area on some pages in applications that don't require a full-on CMS.
This is with ColdBox, but you should be able to implement something similar in fw1.
We have a helper component with a method that allows us to "render dynamic text" with a specific code, e.g. "helppagetext", in a zone on the page. We then have a very simple CRUD application using CKEditor that saves text blocks against those codes. The CRUD application is protected by a pre-existing login system.
It is pretty simple to implement something like this, especially if you already have a security and login system in place.
Hope that helps.
ColdFusion 8 and above has a built-in WYSIWYG editor. It is part of <cftextarea>.
http://livedocs.adobe.com/coldfusion/8/htmldocs/help.html?content=Tags_t_02.html
I want to make my website available offline even if the user clears the cache and cookies. Is it possible? Also, I am dealing with a database. Is it possible to handle databases offline?
A user could store a local copy of a single web page using Chrome (right-click, Save As) and it will store all resources (images, CSS, JS) required to fully load the page offline. Other browsers have similar options.
You can use wget to mirror a whole website for offline browsing.
wget --mirror --convert-links --html-extension -p http://www.example.com/
Of course, neither of these options will handle the database-driven elements of your site/page.
If you want to mock a database or dynamic elements of a page offline, then Google Gears is probably the closest to what you are looking for, but I think it was deprecated by Google last year.
If your users have modern browsers, try HTML5 Application Cache.
References:
Overview - http://www.html5rocks.com/en/features/offline
Demo - https://jonathanstark.com/labs/app-cache-7/
Tutorial - https://www.html5rocks.com/en/tutorials/appcache/beginner/
Article - http://grinninggecko.com/developing-cross-platform-html5-offline-app-1/
Summary: Click me, I'm the newish thing that browsers now support!
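For reference, a page opted into the Application Cache with a manifest attribute on the html element (e.g. <html manifest="example.appcache">), and from script you could react when a newer cache had been downloaded. A minimal sketch (note that Application Cache has since been deprecated in favour of Service Workers):

// Reload once the browser has finished downloading an updated cache
window.applicationCache.addEventListener('updateready', function () {
    window.location.reload();
});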
I clicked some of the links found in other answers, and all tools mentioned are deprecated or will/should be soon.
Later, when I wasn't connected to the internet, I opened a site operated by Google (either Google Docs or YouTube; sadly I've since forgotten which) and went to view the page source, as I was curious to see the other answers in action. I found something called ORIGIN-TRIAL in the manifest file.
After a quick Google search, I found this, which brought me to this, which somehow brought me to the last link:
https://developers.google.com/web/fundamentals/primers/service-workers
In conclusion, use Service Workers now. If you're wondering whether they work in all browsers, don't worry: all popular browsers should support them, as seen here.
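Registering one only takes a few lines; a minimal sketch, where '/sw.js' is just a placeholder path and the actual caching logic lives inside that worker script:

// Register the service worker if the browser supports it
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/sw.js')
        .then(function (registration) {
            console.log('Service worker registered with scope:', registration.scope);
        })
        .catch(function (error) {
            console.log('Service worker registration failed:', error);
        });
}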
No. If your databases are housed online, then you need an internet connection for the PHP/ASP (whatever you're using to talk to the DBs) to connect/communicate with them.
For storing data locally and accessing it offline, take a look at Gears and Web Storage.
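For the Web Storage part, a minimal sketch (the key name is just an example):

// Persist a small value across page loads; it survives until explicitly cleared
localStorage.setItem('lastViewedPage', '/scores');
var page = localStorage.getItem('lastViewedPage');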
The main problem is what degree of functionality you want to provide with your website. It always requires some work on the client (user) side to "store", i.e. save, your website offline. You would have to put all your functionality into one page that the user stores (be it a Flash movie or some JavaScript code).
You can use a simple command to download a whole website locally with all links working properly.
wget -rk 'http://www.website.com'
For an https URL you need to add one more option, like below:
wget -rk --no-check-certificate 'https://www.website.com'