Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 1 year ago.
Locked. This question and its answers are locked because the question is off-topic but has historical significance. It is not currently accepting new answers or interactions.
Is there a web service of some sort (or any other way) to pull the current time zone settings for a (US) city? For the parts of the country that don't follow Daylight Saving Time and effectively jump time zones when everyone else switches between summer and winter time... I don't fancy creating my own database of the places that don't follow DST. Is there a way to pull this data on demand?
I need this for the database server (not for client workstations): there are entities stored in the database that have City, State as properties, and I need to know the current time zone for these entities at any moment in time.
We encountered the same issue and, alongside the great suggestions above, found that Google has two complementary APIs: a Time Zone API that works from geocoded (latitude/longitude) data, and the Geocoding API itself.
For example, to get the time zone and offset for San Francisco:
1) Convert the city to a geocoded location:
http://maps.googleapis.com/maps/api/geocode/json?address=San%20Francisco,+CA&sensor=false
The geocoded location is in the JSON return data:
"location": {
"lat": 37.77492950,
"lng": -122.41941550
}
2) Convert the geocoded location to a local timezone and offset, if any:
https://maps.googleapis.com/maps/api/timezone/json?location=37.77492950,-122.41941550&timestamp=1331161200&sensor=false
Which returns the current time zone information:
{
"status": "OK",
"dstOffset": 0.0,
"rawOffset": -28800.0,
"timeZoneId": "America/Los_Angeles",
"timeZoneName": "Pacific Standard Time"
}
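To make the two-step flow concrete, here is a minimal Python sketch (standard library only) chaining the two calls above. Note that current versions of both endpoints require an API key appended as &key=YOUR_KEY, which is omitted here to match the URLs quoted in this answer.

import json
import time
import urllib.parse
import urllib.request

def city_to_timezone(city):
    # Step 1: geocode the city name to latitude/longitude.
    geo_url = ("https://maps.googleapis.com/maps/api/geocode/json?address="
               + urllib.parse.quote(city))
    with urllib.request.urlopen(geo_url) as resp:
        geo = json.load(resp)
    loc = geo["results"][0]["geometry"]["location"]

    # Step 2: resolve the coordinates to a time zone for a given timestamp.
    tz_url = ("https://maps.googleapis.com/maps/api/timezone/json"
              "?location={},{}&timestamp={}".format(
                  loc["lat"], loc["lng"], int(time.time())))
    with urllib.request.urlopen(tz_url) as resp:
        return json.load(resp)  # dstOffset, rawOffset, timeZoneId, ...

print(city_to_timezone("San Francisco, CA"))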
Time zones for a region can change for a variety of reasons, so it is a good idea to use an authoritative server-based solution rather than caching results. For more information see Wikipedia's Time Zone article.
earthtools.org provides a free web service to get the time zone from a city here:
http://www.earthtools.org/webservices.htm#timezone
You just pass in the latitude/longitude values like this (this one is for New York):
http://www.earthtools.org/timezone-1.1/40.71417/-74.00639
EDIT:
It seems like earthtools has been shut down. A good alternative (one that did not exist in 2008 when this question was first answered) is the Google Time Zone API. To use it you must first activate the Time Zone API on your account. It is free if you stay below these limits:
2500 requests per 24 hour period.
5 requests per second.
The documentation is available on Google Developers.
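If you call the API from a server, a simple client-side throttle helps you stay under those limits. A minimal sketch, assuming the quoted quotas of 5 requests per second and 2,500 per day:

import time

class Throttle:
    def __init__(self, per_second=5, per_day=2500):
        self.min_interval = 1.0 / per_second
        self.per_day = per_day
        self.count = 0
        self.last = 0.0

    def wait(self):
        # Block until another request is allowed, or fail once the
        # daily quota is used up.
        if self.count >= self.per_day:
            raise RuntimeError("daily quota exhausted")
        elapsed = time.monotonic() - self.last
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last = time.monotonic()
        self.count += 1

throttle = Throttle()
# throttle.wait()  # call this before each Time Zone API request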
Geonames.org has a wonderful set of worldwide data that's available via web service or download:
http://www.geonames.org/export/ws-overview.html
In particular
http://www.geonames.org/export/web-services.html#timezone
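For reference, a short Python sketch against the GeoNames time zone endpoint linked above. GeoNames requires a free registered username; "demo" below is a placeholder account that is heavily rate-limited, so substitute your own:

import json
import urllib.request

def geonames_timezone(lat, lng, username="demo"):
    url = ("http://api.geonames.org/timezoneJSON"
           "?lat={}&lng={}&username={}".format(lat, lng, username))
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)  # includes timezoneId, gmtOffset, dstOffset

print(geonames_timezone(40.71417, -74.00639))  # New York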
EarthTools' time zone info is not up to date... for instance, Sri Lanka's current offset is +5.5 from GMT, but EarthTools shows +6, which was the old offset before 2005.
I suggest GeoNames.org.
WorldTimeServer.com has what appears to be a comprehensive time zone database, which you can purchase access to in a variety of formats, including a .NET component for Web use.
No connection, just had to research the same thing myself recently.
Simple offline library: APTimeZones
To find the time zone for a location you can use a service such as the Google Maps Time Zone API. Unfortunately that requires you to query a remote service, and you are subject to its limits.
Here's a library from Alterplay called APTimeZones (GitHub link attached) that allows you to extract an NSTimeZone from a given location without the need to connect to a remote service. APTimeZones works by querying a local listing of time zones (included with the library).
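APTimeZones itself is an Objective-C library. As a rough Python analogue of the same offline idea, the third-party timezonefinder package ships time zone boundary data locally (pip install timezonefinder), so no network call is needed:

from timezonefinder import TimezoneFinder

tf = TimezoneFinder()
# Resolve San Francisco's coordinates to an IANA zone id, fully offline.
print(tf.timezone_at(lat=37.7749295, lng=-122.4194155))
# -> "America/Los_Angeles"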
In case anyone should bump into this question.
You could use the Google API to search for an address. That returns latitude/longitude. With those values in hand, you can find the closest time zone using e.g. PHP.
Or you can use an API like timezoneapi.io (I'm behind that), which enables you to search for an address/city/country. It returns the address, the time zone information and the current date/time for that time zone.
https://timezoneapi.io/developers/address
I know this is answered, but I am posting this anyway, as people might still find it useful: the selected answer does not work successfully any more.
Google has its own service, which is very reliable, easy to use, and outputs info in JSON format. It even allows specifying a custom time, e.g. getting the time zone in Malta on 02/02/2013.
https://developers.google.com/maps/documentation/timezone/
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 2 years ago.
I am building a web application (a book readers' group) where everyone can select a book from the website's books and then create a readers' group for that book.
1- A user selects a book named "How to develop with ReactJS".
2- The user specifies the number of people in the group, say 20 people to read that book. The selected book's pages will then be divided among those 20 people.
3- The user will have a read URL that he sends to his reader group.
4- A table will be shown to each reader who opens the shared URL.
I may later add a feature where readers can send an internal message to the group creator...
Now I am confused about which DB I should choose; ease of implementation is not important to me.
My concern is the cost, because I am expecting high-volume traffic on the website.
Speed is not really critical for me as long as the read-speed difference between SQL and NoSQL is less than 1 second; the important things are accessibility and availability of the service 24 hours a day, and of course the cost.
Say I select Amazon DynamoDB: DynamoDB will cost me according to each read and write request.
The hourly rate for Amazon RDS (MySQL) on a db.m5.xlarge instance is $0.396, plus $0.133 per GB-month of storage, and later on I may need to auto-scale and start more instances.
DynamoDB, by contrast, charges per read and write request plus storage usage (a rough comparison sketch follows below).
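For a rough sense of how the two pricing models compare, here is a back-of-the-envelope sketch. The RDS figures are the ones quoted above; the DynamoDB on-demand rates are assumptions for illustration (check current AWS pricing):

HOURS_PER_MONTH = 730

def rds_monthly(storage_gb):
    # db.m5.xlarge runs around the clock whether or not it is busy.
    return 0.396 * HOURS_PER_MONTH + 0.133 * storage_gb

def dynamodb_monthly(million_reads, million_writes, storage_gb,
                     read_rate=0.25, write_rate=1.25, gb_rate=0.25):
    # Assumed on-demand rates in USD per million requests / per GB-month.
    return (million_reads * read_rate + million_writes * write_rate
            + storage_gb * gb_rate)

print(rds_monthly(storage_gb=50))             # ~ $296/month, always on
print(dynamodb_monthly(10, 2, storage_gb=5))  # ~ $6.25/month at low traffic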
In my experience, and with my use cases, I have found that for small to medium-sized projects DynamoDB ends up being cheaper, and in some cases even completely free, because the usage fits within the free tier that AWS offers, which is pretty generous. DynamoDB is my go-to for these types of applications.
On larger projects I have found it not so clear-cut: without knowing your usage patterns and the amount of data storage used/needed, there is no easy, one-size-fits-all answer.
Based on the use case scenarios mentioned above, if the solution has to be developed using DynamoDB, it may require main tables plus some secondary indexes to search books by name etc. In terms of pricing, AWS will charge you separately for reads/writes on the main table and on each secondary index.
In general, DynamoDB gives better results if you find items by id or key (i.e. the partition key). As soon as you need a wildcard search, or to find data by non-key attributes, you may need to scan the full table or create a secondary index, as the sketch below illustrates.
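To illustrate the difference, a hedged boto3 sketch; the table and attribute names ("Books", "book_id", "title") are hypothetical:

import boto3
from boto3.dynamodb.conditions import Attr

table = boto3.resource("dynamodb").Table("Books")

# Cheap: fetch a single item by its partition key.
item = table.get_item(Key={"book_id": "42"}).get("Item")

# Expensive: finding a book by a non-key attribute reads every item in
# the table (a full scan) unless a secondary index covers "title".
matches = table.scan(
    FilterExpression=Attr("title").contains("ReactJS")
)["Items"]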
If you foresee a wide range of features being added to your application to give a better user experience, you should go with a typical RDBMS option. It will be cost-effective and flexible for adding more features as well.
You can consider MariaDB on AWS if you are going to stick with the AWS cloud.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed last year.
I know that the Google Geocoding API has a 2,500-hits-a-day limit before it starts returning REQUEST_DENIED. What is the limit for the Bing one? I heard that it's unlimited, but I had trouble confirming that.
Just came across this question. It's a bit old but still relevant. For Bing Maps there are two different ways to geocode addresses. The first is batch geocoding, which has the limits outlined by alfski.
The second option is the REST geocoding service, which allows you to geocode one address at a time, as sketched below. The free transaction allowance for the Bing Maps services varies depending on the use case. For Windows/Windows Phone/WPF apps you can make up to 50,000 transactions a day under the free terms of use. For other platforms, such as iOS, Android and web apps, 125,000 transactions per year are allowed for free. You can find more details here: http://www.microsoft.com/maps/create-a-bing-maps-key.aspx
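A minimal Python sketch of such a one-at-a-time geocode against the Bing Maps REST Locations service; the key is a placeholder, and the parameter and response shapes follow the public documentation, so verify them against the current docs:

import json
import urllib.parse
import urllib.request

BING_KEY = "YOUR_BING_MAPS_KEY"  # placeholder; get one from the link above

def bing_geocode(address):
    url = ("http://dev.virtualearth.net/REST/v1/Locations?q="
           + urllib.parse.quote(address) + "&key=" + BING_KEY)
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # Coordinates live under resourceSets -> resources -> point.
    return data["resourceSets"][0]["resources"][0]["point"]["coordinates"]

print(bing_geocode("1 Microsoft Way, Redmond, WA"))  # [lat, lng]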
The Microsoft Bing Maps terms & conditions are at http://www.microsoft.com/maps/product/terms.html
The T&C's will change so please check the link.
As of January 2013 under the General Terms of Use, the Bing geocoding limits are:
"a total of 5 batch geocoding or file uploads with a maximum of 50 records each, using the Bing Spatial Data Services API, within any 24 hour period."
However, if you have a Bing Maps Agreement:
"...a total of 24 batch geocoding or file uploads with a maximum of 200,000 records each, using the Bing Spatial Data Services API, within any 24 hour period."
1) Google Maps API recently changed their prices:
Standard plan:
2,500 requests per day free.
After that is $0.50 per 1000 requests.
Premium plan:
Pay $10,000/year.
Get 100,000 requests per day "for free".
Above that - 500,000 requests per year are "free".
Above that - buy next package.
However "Web service APIs and the JavaScript API require the Premium Plan."
My understanding is that formally you must use the Premium Plan ($10,000/year) if you are calling geocoding from your server (rather than from a mobile phone).
More details here: http://dennisgorelik.livejournal.com/112993.html
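To see what the standard plan works out to, a tiny cost helper using the quoted numbers (2,500 free requests per day, then $0.50 per 1,000):

def geocoding_daily_cost(requests_per_day, free=2500, per_1000=0.50):
    billable = max(0, requests_per_day - free)
    return billable / 1000 * per_1000

print(geocoding_daily_cost(10000))  # 7,500 billable requests -> $3.75/day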
2) Bing Maps API does not publish their prices.
Here's the Licensing tool to find out how much they'll charge you:
http://www.microsoft.com/maps/Licensing/licensing.aspx
By the way, you can also try the Yandex Geocoding API, which is free for up to 25,000 requests per day per website/mobile app:
http://api.yandex.com/maps/doc/geocoder/desc/concepts/input_params.xml
Updated answer as of Feb 2022
Google Maps offers a $200 monthly credit, which is equivalent to 40,000 free geocodes per month.
Bing Maps allows you to sign up for a "basic key" for development purposes which gives you 125,000 free geocodes per calendar year. Their licensing page is here.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
Would it be more cost-effective for a small business (around 25 concurrent users) to buy a PAF database and code it up ourselves, or to use a postcode service such as Postcode Anywhere?
The Royal Mail site is really confusing! http://www.royalmail.com/marketing-services/address-management-unit/address-data-products/postcode-address-file-paf/prices
We operate 24 hours a day and at any one time we have between 1 and 25 users doing postcode searches. We are currently using a PAYG service and it is really pricey, so we want to buy a PAF database and create our own. I don't understand the pricing on the link above (basically, we're looking at something in the region of £2 to £49,500?!).
Also, what do you actually get with a PAF database? As in, what kind of files do they send you, is there an API, and do you pay a one-off fee or an ongoing fee? Do you have to agree to delete the data once you stop paying Royal Mail?
Thanks
For the time it would take to code it up yourselves, it would be more time- and cost-efficient to go with someone like Postcode Anywhere. They'll also provide a guaranteed first-class service along with regular data updates.
We use them on a lesser scale (after moving from QAS, which was crap in comparison).
Have you investigated pricing with any providers yet - if so, what's it coming out at?
I can't add any more to the answer by Alan, which describes how the files are provided and what needs to be done.
You get a bunch of flat files and need to use the PAF programmer's guide to help build yourself a system (a rough sketch of such a loader follows the links below):
http://www.royalmail.com/marketing-services/address-management-unit/address-data-products/programmers-guide
See also
www.royalmail.com/sites/default/files/docs/pdf/01_tell_me_the_basics.pdf
www.royalmail.com/pafnews
You're probably better off buying a package: www.poweredbypaf.com/
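To give a flavour of the kind of loader the programmer's guide leads you to write, here is a heavily hedged Python sketch. The column layout below is hypothetical, purely for illustration; the real fixed-width record layouts are defined in the guide:

import sqlite3

# HYPOTHETICAL fixed-width fields: (name, start offset, length).
LAYOUT = [("postcode", 0, 8), ("building", 8, 30), ("thoroughfare", 38, 30)]

def parse_record(line):
    return {name: line[start:start + length].strip()
            for name, start, length in LAYOUT}

def load_paf(path, db="paf.sqlite"):
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS paf "
                "(postcode TEXT, building TEXT, thoroughfare TEXT)")
    with open(path, encoding="ascii") as f:
        con.executemany(
            "INSERT INTO paf VALUES (:postcode, :building, :thoroughfare)",
            (parse_record(line) for line in f))
    con.commit()
    con.close()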
You don't need to purchase the Royal Mail Postcode Address File (PAF). There are lots of APIs available.
getAddress.io is the only one I've found that's free:
https://getAddress.io
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 6 years ago.
I have the following requirement: I have a database containing the contact and address details of at least 2,000 members of my school's alumni organization. We want to store all that information in a relational model so that:
This data can be created and edited on demand.
This data is always backed up and should be simple to restore in case the master copy becomes unusable.
All sensitive personal information residing in this database is guaranteed to be available only to authorized users.
This database won't be online for the first 6 months. It will go online only after a website is built on top of it.
I am not a DBA and I don't want to spend time doing things like backups. I thought Amazon's RDS, with its automatic backup facility, was the perfect solution for our needs. The only problem is that, being a voluntary organization, we cannot spare the monthly $100 to $150 fee this service demands.
So my question is: are there any less costly alternatives to Amazon's RDS?
In your case of just contact and address data I would choose Amazon SimpleDB. I know SimpleDB might not be suitable for a large number of tables with relationships and all, but for your kind of data I think SimpleDB is sufficient. And it costs much, much less than Amazon RDS.
I also wanted to use RDS, but the smallest DB size costs $80 per month.
Without a bit more info I may be way off base here, but 2,000 names, addresses, etc. is not a large DB, and I would have thought that using Amazon's RDS was overkill, to say the least.
Depending on how (and by whom) you want it viewed, edited, etc., there are a number of free or almost-free alternatives.
One method may be to set up or use a hosting package that has something like phpMyAdmin linked to a MySQL DB. Doing this, it is possible to access and edit the DB without a website front end. Not pretty (like a website front end would be), but practical. A good host should also do backups for you.
Another is to look at Google Docs. OK, not really a database, more a spreadsheet, very much along the lines of Excel. You can share Google Docs with invited people and even set up a small website via Google Docs. This is a free method, but it may not be practical depending on your needs.
Have you taken a look at Microsoft SQL Azure? You can use it free for something like 90 days, and then if you only need a 1 GB DB it would only be about $10 a month.
You mention backups, so I thought I would talk about that as well. The way SQL Azure works is that it automatically creates 2 additional copies of your database on different machines in the data center. If one of the machines or databases becomes unavailable, it automatically fails over to one of the other copies.
If you need anything above that, you can also use the COPY command to back up the database.
You can check
http://www.enciva.com/postgresql9-hosting.htm
and
http://www.acugis.com/postgresql-hosting.htm
They work for Postgres and MySQL.
For a frankly tiny DB of that size I'd seriously look at http://www.sqlite.org/
It's in-process, it's easy to constantly .dump off to S3, and you can use update hooks to keep checkpoints after updates.
Backups/restores are almost as simple as Windows batch files and wget.
Good encryption is available using http://sqlcipher.net/
The standard OS filesystem and user-level ACLs control security.
Running a file-backed DB makes sense given the fragility of a normal EC2-backed RDBMS in the face of EBS gremlins.
There are exclusions from SQL92 (no real showstoppers), but given the project's cost sensitivity and the RPO and RTO of an alumni database, I reckon it's a good bet.
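As a sketch of the ".dump off to S3" idea: Python's sqlite3 module can serialize the whole database as SQL text, which you can push to a bucket with boto3. The bucket and key names are placeholders, and AWS credentials must already be configured:

import sqlite3
import boto3

def backup_to_s3(db_path, bucket="alumni-backups", key="alumni.sql"):
    con = sqlite3.connect(db_path)
    dump = "\n".join(con.iterdump())  # the library equivalent of .dump
    boto3.client("s3").put_object(Bucket=bucket, Key=key,
                                  Body=dump.encode("utf-8"))
    con.close()

backup_to_s3("alumni.db")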
Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
We need an address finder (premise level) based on postcode. We have a budget of £40k for this, but I have been assigned to find some cheaper alternatives to the Royal Mail PAF database. Is Google any good at finding premise-level addresses when you send a full postcode? Any recommendations besides the Royal Mail PAF file? Are there any web services out there to accomplish this? Please share your knowledge.
Cheers,
Naren
We use products from AFD for this; they work well for us.
Edit: just saw "Best way to geocode UK postcode with Google Maps API?" on the front page.
In the UK the government has said that PAF data should be made free [1]. I'm painfully aware of the almost extortionate way Royal Mail operates.
Having worked with the Royal Mail PAF API, I know a 'friend' (wink wink) who created a class wrapper around the APIs. This 'friend' of mine built a custom importer that automatically ripped all the PAF data into an MS SQL database. After the data import, he no longer needed to renew his licences because he was no longer using PAF data.
This may be something you could do as well: buy the data one time and import it.
As for data changes, you could buy a refresh every few years (e.g. every 2-3 years) and update your existing data.
[1] Damn it! Guess I was wrong: http://www.guardian.co.uk/technology/2010/jan/22/postcode-petition-fails-blocked-number-ten
I work as the integrations specialist for Postcode Anywhere (we are one of the leading Royal Mail PAF resellers). Address capture doesn’t have to be expensive – and you don’t have to sacrifice reliability for an affordable service. Postcode Anywhere can be licensed either on a simple credit pack based system or on an annual basis, and you can be up and running in 10 minutes using our JavaScript client. If you are looking to create a more bespoke integration we also have an array of web services and code samples to help you.
If you want to have a play around with the service to see what you think we will be more than happy to provide you with a free trial. A full run-down of all of our products and services can be found here: http://www.postcodeanywhere.co.uk/products.
I work for CraftyClicks.
There are a few PAF resellers around. The data is all the same, prices can vary significantly. Best to spend a few minutes browsing the various sites.
At CraftyClicks our focus is on uptime/availability and keeping the price of PAF data reasonable - at high volumes the price falls to well below 1 penny a click.
Our address lookup web service can be integrated client side via JavaScript or server side via XML.
Let us know your requirements (adam at craftyclicks.co.uk) - you shouldn't be spending anywhere near 40k for this!
Adam.
The base PAF data is the same, but a lot of value comes from adding information that is not included in PAF, to help with real-time and batch address matching in products based on PAF. We have a lot of locality information that is not included within PAF but that people tend to use in their addresses.
As for updates, there are thousands of changes every month, so it's vital that you use a source with regular updates to the PAF data, as well as to associated files such as business and consumer names data, which also help in the matching process.
Have a look at our site www.capscan.com for both UK and international data quality, with services delivered either installed or as a web service.
You can also contact us on 0207 428 1255