looking for an intraday stock quote feed [closed] - web-services

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
I have an application which needs to get intraday stock quotes on several assets (indices, commodities, etc.).
I want to be able to query the data over HTTP and get it in CSV/XML format.
Now, I'd like to be able to ask the data provider, for example, what the last bid/ask/price on GE (General Electric) was at 4:00 PM, and to ask this at, say, 4:05 PM on that day, for further processing.
Similar services to what I'm looking for:
Reuters' DataLink service can give me this data for the last trade of the day.
I need it to flow through all day long - intraday.
Yahoo Finance (the query format within it) is a great service that does what I want in terms of data delivery, yet I'm unsure about its reliability/timing since it's free.
Also, I couldn't find any information about the delay of the data they provide relative to real-world timing (many websites provide this data with a delay of ~20 min).
QuoteRSS gives this for free as well; it lets me pick a ticker and get its data, yet once again I'm unsure about its reliability as well as its timing, and I doubt it is "realtime" or close to that.
Finally, this blog post by Google, "At long last, real-time stock quotes are here", claims to offer free data on certain stocks, but I can't find anything about it on Google Finance's pages, nor at their API pages, and again, who knows what delay I get from the real-time data.
In addition to the concerns with the above-mentioned services (Yahoo, QuoteRSS & Google), I'm not sure how/if they provide intraday information on stocks, which is something I need.
Worth mentioning is that many websites which deal with Forex claim to be getting their data feed from Reuters/Bloomberg.
I didn't find such a solution on either site. I even chatted online with a sales rep at Reuters to ask about it, and his answer, after a decent discussion, was that "he's afraid he cannot offer me anything better than their DataLink service". How odd!
So, to summarize my question:
1) Where can I get such a data feed, in which I select several tickers from several markets and get closer-than-20-min information on these tickers, in a concise format (CSV/XML)?
2) If Reuters/Bloomberg offer it (I'll probably also call them later) - where is it offered on their websites? I'd like to get the data from a "big name" such as these guys, for reliability reasons.
3) Regarding "realtime" or not, it depends on the cost. What costs should I expect? I'm assuming that a realtime feed costs a LOT, so is there an option between realtime and the 20-min delayed feed? Something like a 2-5 min delay?
4) Please mention how, or whether, I can query stock data at a point in time, like "what was the price of GOOG at 4:00 PM?".
Note #1:
Please keep in mind, when answering, that I need the quotes intraday and not "by the end of the day".
Note #2:
If Google/Yahoo actually do offer this kind of service for free, how do I find it directly? I don't mind starting with these free services for testing and such, especially if I can query for data at a point in time as mentioned above ("what was the price of GOOG at 4:00 PM?").
Note #3:
In terms of licensing, I do not intend to resell this information. Simple as that.
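For what it's worth, the point-in-time lookup described above is easy to sketch once you have any timestamped CSV feed. The field layout (symbol, ISO timestamp, price) and the sample prices below are invented purely for illustration; real vendors each define their own format:

```python
import csv
import io
from datetime import datetime

def quote_at(csv_text, symbol, when):
    """Return the most recent (timestamp, price) row for `symbol`
    at or before `when`, from CSV rows of: symbol,timestamp,price."""
    best = None
    for row in csv.reader(io.StringIO(csv_text)):
        sym, ts, price = row[0], datetime.fromisoformat(row[1]), float(row[2])
        if sym == symbol and ts <= when and (best is None or ts > best[0]):
            best = (ts, price)
    return best

# Example: "what was the price of GE at 4:00 PM?", asked at 4:05 PM
feed = (
    "GE,2012-05-01T15:59:30,19.41\n"
    "GE,2012-05-01T16:00:00,19.44\n"
    "GOOG,2012-05-01T16:00:00,604.85\n"
)
print(quote_at(feed, "GE", datetime(2012, 5, 1, 16, 0)))
```

Any provider that can dump timestamped CSV over HTTP would slot into this; the hard part is the feed itself, not the lookup.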

Before they closed shop, I used opentick. My blog post about opentick shutting down got quite a bit of traffic, so I decided to write another post that examined some potential opentick alternatives. Take a look at the companies in the post and comments. Hopefully one of them will work for you.

I have used IQFeed for some time. It is not HTTP or CSV, but it is a streaming push of ticks from their servers to you. The client is a bit kludgy, but overall I find it acceptable for the price. This type of feed would be considered "realtime" by most people, and since you are talking about minutes, I assume you are not worried about a couple of seconds of latency here or there.
I have experience with Reuters (Thomson) feeds. They are expensive, since we are now talking about TotalView/OpenBook data. This would be used to reconstruct the history of the order book and could be used for analyzing things like the liquidity of an equity at different price levels. I had a good experience with them at another job: 24/7 engineering support, fixes, a decent securities database. The reality is that there is a wide variety of ways to get these feeds, mostly from brokerages. I don't think this is what you are looking for, though, since you mentioned free options.
There are "mid tier" providers like CQG although I have no experience with them.
In general, no matter whom you use, you need to be willing to implement their protocol and format. I have found this to be true no matter which feed I use. The good news is that all you need to do is write a parser.
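As a tiny illustration of what such a parser boils down to, here is a sketch for a hypothetical pipe-delimited tick message; every vendor defines its own layout, so the fields here are assumptions:

```python
def parse_tick(line):
    """Parse a hypothetical 'SYMBOL|BID|ASK|LAST|SIZE' tick message
    into a dict with proper numeric types."""
    symbol, bid, ask, last, size = line.strip().split("|")
    return {
        "symbol": symbol,
        "bid": float(bid),
        "ask": float(ask),
        "last": float(last),
        "size": int(size),
    }

print(parse_tick("GE|19.43|19.45|19.44|300"))
```

The real work is usually in handling the vendor's session/handshake protocol and malformed messages, not in the field splitting itself.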
What was the price of Google at 4:00 PM? Who can say - which part of 4 PM? Would the price at 4 PM be something like the final print to the tape of the closing auction? Is it the auction midpoint? The price is what you can transact at, which can be very different than what you see printed. ;-P
A final note: if you are building a trading system of some sort, pay for your data. It should be cleaner than trying to assemble it yourself. The exchanges charge for data and there is no real way around it. If you can't afford a couple of hundred bucks a month for some data, then you probably don't have enough capital to be trading.

Concerning Bloomberg, I just called them & they said that they only provide market data for personal use. So you cannot show it on your site, but you can do whatever you want with it as long as you don't publish it.


Wiki, Content Management or Roll my Own? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Oh, Collective Wisdom of the Crowd,
I've been handed the Task. My inclination is to go code, but as I'm old and weary I'm aware of my total ignorance of Web coding, of my tendency to code instead of using off-the-shelf parts (a.k.a. NIH), and of the similarity of the Task to the problems solved by Wikis and Content Management Systems. So, my question is: solve the Task using a Wiki, a Content Management System, or roll my own site?
The Task:
I've videoed a three-day sports event at my Ninjutsu club and have created a set of three DVDs containing many "chapters". Most chapters consist of an explanation and demonstration of a technique, followed by the instructor moving around and correcting people.
The big Honcho would like some of the senior students to review the DVDs prior to production.
One way would be to reproduce a few sets of the DVDs, mail them to said students, and have them e-mail me their comments. This, however, is low-tech, not sexy, and I'm sure it would not generate the desired involvement.
As an alternative I thought about creating a web site for this purpose, with the added value that the web site would later, upon release, serve as a companion to the DVD. First-draft requirements appear to be:
Allow each reviewer to pick up the part of the material he’s "the owner of" (i.e. responsible for).
Provide a web page for each DVD chapter, together with a navigation system.
Upon creation, each page will contain an embedded video of that chapter.
Allow each owner to mark her sections as “OK”, “With Issues” or “Remove”.
Direct reviewers to pages with sections having problems, or not-yet-reviewed, or with high activity (i.e. interesting).
Allow reviewers to collectively document the techniques demonstrated in the video sequences, especially during the corrections, when the instructor can't be clearly heard because the speakers are turned off. Upon release this documentation will be "frozen" and will provide additional insight into the technique, beyond what was provided at the event.
Generate the basis for sub-titles.
In addition to above documentation, each such page will also contain discussions between the reviewers concerning the technique. These discussions will be visible on the page, below the video and the documentation, unlike Wikipedia where the discussions appear on different tabs.
When documenting the techniques, the instructors will be able to create and use a collection of terms – names for the techniques. These names will be collected into a central ontology, together with their translation, and will later be used to index the content.
Hebrew support for the content is mandatory.
The site will be able to contain translated versions of the content, where the user can choose the language she will use. So after release, a Spanish-speaking student who has purchased the DVD and gone to the site would be able to view a Spanish translation of the documentation.
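As a small sketch of the "basis for sub-titles" requirement above: timed review notes can be turned into SRT subtitle blocks mechanically. The (start_seconds, end_seconds, text) note format is an assumption for illustration:

```python
def to_srt(notes):
    """Convert (start_sec, end_sec, text) notes into SRT subtitle blocks."""
    def ts(sec):
        # SRT timestamps look like HH:MM:SS,mmm
        h, rem = divmod(int(sec), 3600)
        m, s = divmod(rem, 60)
        ms = int(round((sec - int(sec)) * 1000))
        return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"
    blocks = []
    for i, (start, end, text) in enumerate(notes, 1):
        blocks.append(f"{i}\n{ts(start)} --> {ts(end)}\n{text}")
    return "\n\n".join(blocks)

print(to_srt([(0.0, 2.5, "Omote gyaku: turn the wrist outward."),
              (2.5, 5.0, "Keep the elbow close to the body.")]))
```

If reviewers attach timestamps to their documentation comments, the subtitle track falls out of the same data for free.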
I know, I know this is a tall order and I'm only an egg :(
Stick with the email for the review; ninjas <3 email. Enforce participation through intimidation. Focus on shortening the time-to-release, IMO.
Use the time to figure out if creating an online back-end is worth it for the release--even a good martial arts video doesn't sell a lot of copies; if you're not Hatsumi or Hayes, even less.
It looks like the biggest requirement is I18N and comments.
I'd go with a Wiki; its collaborative model of content creation is perfect for things like this, and many support translations--although keeping up with the translating can be problematic. Wiki gardening is time-consuming and non-trivial, adding a layer of translation...
Although it'd give a whole new meaning to ninja edits.
Potential revenue or the emotional investment will dictate the scope of the project, but here's a couple of ideas to consider:
Ticketing system to allocate the work to users, track progress, and define the state of completion. I recommend the open source Request Tracker. This would be the easier option to implement in terms of managing the project, but it doesn't touch on the i18n or the web development.
OR
A Component Content Management System to act as database and publishing tool. I would suggest the open source Pressgang CCMS. This would take more effort to implement but offers the features of Request Tracker with the addition of publishing output functionality (especially in terms of the use of DocBook XML and Publican). It is also built to work with the open source translation tool Zanata.

High Volume Geocoding API? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I would like some recommendations on a high volume Geocoding API. I've reached out to Google and Yahoo so far. Google wants too much money for too little offering and Yahoo doesn't have a commercial offering.
I need to geocode about 250,000 items per day initially, but this number will grow exponentially in the near future, so I need a solution that will grow with us.
Any thoughts?
There are many providers that offer bulk and/or batch geocoding. You can also purchase datasets, depending on your accuracy and coverage-area requirements. As one example, Microsoft offers a solution; I cannot vouch for its quality.
I'd get on the horn to SimpleGeo and see what kind of deal could be put in play; they're a startup so probably hungry for volume business.
Otherwise, I'd probably start looking to the source rather than brokers like Google, e.g., TeleAtlas, but that's bound to be painful.
If your daily numbers will be exponentially larger than 250K, e.g., 1.5e+16, you're bound to be repeating a lot of queries; find a way to clean them up/normalize them to increase cache hits, and shove the results into memcached to keep the third-party queries down.
This is a helpful resource I stumbled across when looking for ways to get TIGER data, which has to be free but is US only: http://www.vterrain.org/Culture/geocoding.html
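A sketch of that normalize-then-cache idea, with a plain dict standing in for memcached and an invented substitution list (a real normalizer would use a proper address-standardization library):

```python
def normalize(addr):
    """Canonicalize an address so equivalent queries share one cache key.
    The substitution list is illustrative, not exhaustive."""
    a = " ".join(addr.lower().split())
    for full, abbrev in [("street", "st"), ("avenue", "ave"), ("road", "rd")]:
        a = a.replace(full, abbrev)
    return a

class CachingGeocoder:
    """Wrap any geocode function with a cache keyed on normalized addresses
    (a plain dict here; swap in memcached/redis in production)."""
    def __init__(self, geocode_fn):
        self.geocode_fn = geocode_fn
        self.cache = {}
        self.misses = 0
    def geocode(self, addr):
        key = normalize(addr)
        if key not in self.cache:
            self.misses += 1  # only misses hit the third-party API
            self.cache[key] = self.geocode_fn(key)
        return self.cache[key]

# Fake backend standing in for a third-party geocoding API
geo = CachingGeocoder(lambda addr: (40.7128, -74.0060))
geo.geocode("350 Fifth Avenue, New York")
geo.geocode("350  fifth ave,  new york")   # same place, different spelling
print(geo.misses)
```

Two differently-spelled queries for the same address cost only one upstream call; at 250K+ queries a day, that ratio is what keeps the bill down.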
One comment on SimpleGeo: their API is simple and you can query a large number of records in a very short time. But their geocoding quality is not as good as Google's, or even Bing's. I had many cases where I got the same coordinates for different sets of addresses.
I'm looking for a solution myself and I'm testing the MapQuest API; it seems there is no rate limit (per se).

System Analysis and design of A social Network [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 3 years ago.
Is it possible to perform a system analysis and design for a website (particularly a social network)?
What would the expected contents of the document be?
Can you provide an example, please?
{ I made a social network (www.sy-stu.com) as my graduation project and I want to add a full analysis study to the graduation document. I have experience with UML and use cases; it's just that the idea of analyzing a website is not clear to me, and I have never performed one before. }
Thanks in advance.
This sounds very ambitious, but I'm sure it's possible. Unfortunately, I've forgotten a bit of Systems Analysis, but I do adhere to many of its guiding principles in my own projects. In fact, I would say that most data-driven Web sites are excellent candidates for Systems Analysis, and it should always be used during Web planning for any project you plan on putting into production.
Straight from the wiki:
The development of a feasibility study, involving determining whether a project is economically, socially, technologically and organizationally feasible.
Conducting fact-finding measures, designed to ascertain the requirements of the system's end-users. These typically span interviews, questionnaires, or visual observations of work on the existing system.
Gauging how the end-users would operate the system (in terms of general experience in using computer hardware or software), what the system would be used for, etc.
For the first point, I would analyze different technologies such as ASP.NET, Ruby on Rails and PHP. Each technology has its strengths and weaknesses. One key thing to keep in mind is if you plan on making your social network free, you may consider open source technologies over proprietary - as many servers and application frameworks for proprietary projects are costly. I would also consider Web startup and hosting fees. If you plan on getting a reseller account with Host Gator, then you would need to factor in monthly billing costs. If you plan to host your own servers, you may be amazed at the cost of doing so. For a truly stable system, you would need to put a lot of work and cash into managing your own Web servers.
For the second point, you could probably locate plenty of information on user requirements from similar sites - just check out forums for DIY social networks and see what people are having issues with in the Technical Support section. Obviously, looking into technology based articles and magazines would be a good place to search on end user expectations - or even just joining Facebook and Twitter - see what they are doing since people seem content.
For the third point, again you can consult your competition and see how the user interface works out. Is it easy to use? Is it difficult in some aspects? If you had to use their system for 8 hours a day at least 5 days a week, what would drive you mad and how would you do it better? And keep in mind logical work flow as well. Knowing your user base is important too. In some systems, you may be developing for other programmers. Using strong jargon may be fine, but for a social network you must remember that they aren't familiar with Web site data flow and terminology. So your controls should still make sense to a computer novice and still work securely (don't forget system security too!) and in an organized fashion.
Finally, remember that things happen. I recently created a back-end site for a client of mine. I thought the system worked very well, and they were very pleased, but I just got an email today saying that they want the way order items are stored to work differently. This is why there's a maintenance aspect to the System Development Life Cycle: things change after you finish deploying. It could also be said that if I had communicated with my client about their needs more closely, this could have been avoided. Fortunately, the change is relatively minor, and we do live in a real world where things don't always work as we expect. We just do our best :)
As I said earlier, Systems Analysis is a lot of work and should be. The point of it is to determine that what you are trying to accomplish is feasible and practical without committing to a long term project that could span years. And always remember that no plan is perfect. If there were perfect plans, we wouldn't need new systems :).

What data source could I use for my stock market program? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 7 years ago.
I would like to make a free open-source C++ application for both Linux and Windows which will create live stock market charts (i.e. they're refreshed frequently).
Please could you give me some pointers on these issues:
What should I use as the data source? Are there free services I can implement? I would like to use the same or similar information as companies like Google.
I'm not sure which GUI toolkit would be best to use; is there one which has charting built in, or would I need to use a specialized library for this?
Some things to note:
This is my first attempt at both cross-platform C++ development, and a GUI application for Linux.
I'm based in the UK, so I'd like to use data sources that provide information for the London stock exchange (LON) as well as NASDAQ, etc.
As of Nov 2014, these links are dead.
Google Finance API: http://code.google.com/apis/finance/
Yahoo! Finance API: http://developer.yahoo.com/finance/
Cross-platform C++ charts w/ Qt: http://www.int.com/products/2d/carnac/chart_component.htm
Assuming the rules in the UK are the same as in the US, you basically have 3-tiered choices.
You can hack together a lame feed from things like Google or Yahoo but you absolutely are not getting every tick, if that is what you are after.
A step up from the obvious internet sources are some of the online brokers. Their data is more reliable and timely, but obviously you need an account and they have to offer some kind of API. Check into something like InteractiveBrokers.com. They are mostly Java-centric but offer a Windows-based C++ DLL as well. Several other brokers offer similar APIs, but IB is excellent in that it covers a multitude of exchanges including, I believe, London. They also make it relatively easy to transfer currencies if that is a concern.
Lastly you have to go to commercial brokers. You can find them easily enough with a search but be prepared to pay a couple of hundred dollars per month minimum.
I think Mark's suggestion of QT is a good way to go for a GUI. Java tends to be adequate for putting up a grid of running quotes but tends to fail in the charting area, IMO.
You said you wanted "live" market charts. If you mean real-time, you will never get that for free. All the data you see on google etc is delayed, usually at least 15 minutes, and they don't get every tick.
If a delay is not a problem and you are only interested in daily data, you can easily get historical data for free via a simple HTTP request using this historical data API.
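Since the exact endpoint varies (and the link above may rot), here is a sketch of parsing the Date,Open,High,Low,Close,Volume layout that free historical-data endpoints commonly return; the column names and sample figures are assumptions:

```python
import csv
import io

def parse_daily_history(csv_text):
    """Parse a Date,Open,High,Low,Close,Volume CSV (a common layout for
    free historical-data endpoints) into a list of typed dicts."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rows.append({
            "date": row["Date"],
            "open": float(row["Open"]),
            "high": float(row["High"]),
            "low": float(row["Low"]),
            "close": float(row["Close"]),
            "volume": int(row["Volume"]),
        })
    return rows

sample = (
    "Date,Open,High,Low,Close,Volume\n"
    "2014-11-03,98.50,99.10,98.20,98.90,1200000\n"
    "2014-11-04,98.90,99.80,98.70,99.50,1350000\n"
)
print(parse_daily_history(sample)[-1]["close"])
```

Whatever provider you end up with, the HTTP fetch plus a parser like this is usually the whole data layer for a daily chart.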

Timezone lookup from latitude longitude [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
Is there any library (or even better, web service) available which can convert from a latitude/longitude into a time zone?
I looked fairly deeply into this question for a project I am working on. GeoNames.org and EarthTools.com are both good options for many situations but with the following serious flaws:
GeoNames.org finds the time zone by searching for the nearest point in their database that contains a time zone field. This often leads to the wrong result near borders. It is also painfully slow, leading to query times on the order of a couple of seconds per request. It also doesn't return a valid time zone if there is no item in their database near the query point. GeoNames also restricts the number of queries that can be made per day, making bulk operations difficult.
EarthTools.org uses a map and is able to return queries quickly, but it doesn't take into account daylight saving time for most locations, and it returns a raw offset rather than a time zone ID (i.e., they return "GMT-7" instead of "America/Chicago"). Also, I just looked at their page while preparing this post and Google Chrome warned about malware on their site. That is new to me and may change, but it is obviously a cause for concern.
These flaws meant that these existing tools were not suitable for my needs so I rolled my own solution and have published it for general use. You can find it here:
http://www.askgeo.com/
AskGeo is based on a time zone map of the world, so it returns a valid time zone for every valid latitude and longitude. It returns the standard time zone ID (e.g., "America/Los_Angeles") used on Linux and most other operating systems and programming frameworks. It also returns the current offset, taking full account of daylight saving time.
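The practical difference between a raw offset (what EarthTools returns) and a proper time zone ID is easy to see with Python's standard zoneinfo module: the same ID yields different UTC offsets depending on the date, because DST is applied for you.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # standard library since Python 3.9

tz = ZoneInfo("America/Los_Angeles")  # a time zone ID, not a fixed offset

winter = datetime(2023, 1, 15, 12, 0, tzinfo=tz)
summer = datetime(2023, 7, 15, 12, 0, tzinfo=tz)

# Same zone ID, different UTC offsets: DST is handled automatically.
print(winter.utcoffset())  # UTC-8 in January
print(summer.utcoffset())  # UTC-7 in July
```

A service that only hands back "GMT-7" has already collapsed this distinction, which is why the ID is the more useful answer.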
It is extremely easy to use and usage is documented on the main page of the site. The API supports batch queries, so if you need to do a lot of look-ups, please use the batch interface rather than bog down our servers with serial requests. The bulk queries are also much faster, so everybody wins.
When we first launched this, we built it on Google App Engine (GAE) and made it free to all users. This was possible because GAE's prices were so low at that time. Since then, our server load has increased substantially and GAE's prices went way up. Both factors combined led us to switch to Amazon Web Services for hosting and to start charging for commercial use, while keeping the service free for non-profit, non-commercial open source projects, and researchers. For commercial users, we provide 1000 free queries to let potential customers evaluate the API to make sure it meets their needs. See the web site for pricing and terms.
The underlying library was written in Java and due to popular demand, we also released the library under a commercial license. Full documentation of the library and pricing details are on the web site.
I hope this is useful. It certainly was useful for the project I was working on.
Take a look at GeoNames.org.
It's a free web service that allows you to get a lot of information from a long/lat.
They also provide a free (and open source) Java client library for the GeoNames web services (libraries for other languages are also provided: Ruby, Python, Perl, Lisp...).
Here's some info you can get from long/lat: (complete list of webservices here)
Find nearest Address
Find nearest Intersection
Find nearby Streets
Elevation
Timezone
Timezones are now available via Google API
https://developers.google.com/maps/documentation/timezone/
The Yahoo Places API provides timezone information via reverse geocoding.
Check it out.
http://developer.yahoo.com/geo/placefinder/guide/requests.html
Eric Muller has made shapefile maps for the timezones of the tz (Olson) database. A few minor caveats, though:
The boundaries used are often unofficial.
It isn't updated as regularly as the tz database itself, so some newly-formed or -adjusted zones may be missing.
Those aside, however, it seems to be very accurate for most purposes.
How much accuracy do you need? Dividing the longitude by 15 would almost be right :p
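That back-of-the-envelope rule is literally one line; it ignores political boundaries and DST entirely, so treat it as a last-resort fallback:

```python
def naive_utc_offset(longitude):
    """Approximate the UTC offset in whole hours from longitude alone.
    Each 15 degrees of longitude is one hour of solar time."""
    return round(longitude / 15)

print(naive_utc_offset(-74.0))   # New York: -5, matches EST
print(naive_utc_offset(139.7))   # Tokyo: 9
print(naive_utc_offset(116.4))   # Beijing: 8, lucky, since China uses one zone by law
```

It fails exactly where the question matters most: near borders, and anywhere a country's legal zone differs from its solar one.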
These looked pretty promising, but the project is now dead :-/
Archive link:
https://web.archive.org/web/20150503145203/http://www.earthtools.org/webservices.htm
DRT Engine takes a latitude, longitude and local datetime and returns a timezone offset. This can be used to establish the timezone of a particular location at a future date.