I'm looking at the YouTube API v3 for a client project. They want to know how long the application will work without maintenance. Are there any published dates for how long API v3 will be supported, or at least a version history showing when the previous APIs were created and deprecated?
The answer can be found in the Terms of Service:
Deprecation.
Google will announce if it intends to discontinue or make backwards
incompatible changes to this API or Service. Google will use
commercially reasonable efforts to continue to operate those YouTube
API versions and features identified at
http://developers.google.com/youtube/youtube-api-list without these
changes until the later of: (i) one year after the announcement or
(ii) April 20, 2015, unless (as Google determines in its reasonable
good faith judgment):
required by law or third party relationship (including if there is a
change in applicable law or relationship), or doing so could create a
security risk or substantial economic or material technical burden.
The above policy is the "Deprecation Policy."
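As a worked illustration of the "later of" rule in that policy (the announcement dates below are made up), the guaranteed support window can be computed like this:

```python
from datetime import date

def guaranteed_until(announcement: date) -> date:
    """Deprecation Policy: support continues until the LATER of
    (i) one year after the announcement, or (ii) April 20, 2015."""
    one_year_later = date(announcement.year + 1, announcement.month, announcement.day)
    floor_date = date(2015, 4, 20)
    return max(one_year_later, floor_date)

# An early announcement is still covered until the fixed floor date:
print(guaranteed_until(date(2014, 3, 4)))   # 2015-04-20
# A later announcement extends the window to a full year:
print(guaranteed_until(date(2015, 1, 10)))  # 2016-01-10
```

So for planning purposes, the policy only guarantees a moving one-year horizon after any deprecation announcement, not an open-ended support date.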
I am trying to wrap my head around Microsoft Cloud for Sustainability. Apparently it's a solution based on Microsoft Dynamics. I need more back-end access to that solution, because as it is right now I'm either lacking permissions (or extra paid access to Microsoft resources) or missing a chunk of documentation, because I'm unable to:
Change the default language across the board - I can switch MS Dynamics to any language I want, but that only works for the shell. Anything that's CfS-specific is in English. Do I remove the demo data and import my own scopes and data? Since the only things available are the database, a Cube for BI analytics, and JSON files describing the CfS structure in general (in CDM), do I really have to create it from scratch? This brings me to the second question:
Access the entry-level data that's already in the demo version - I need to see what's in the database CfS is using, or be able to modify it. Is there any way to get to it via Business Central, if that's at all possible?
Since I will be preparing several presentations for potential customers, I need a way to quickly create a dataset based on the initial and very basic information provided by each customer. How can I do that with a trial user?
I work for a company that's a Microsoft Certified Partner, so logically the resources I need should be available to me, but the links in the documentation are either dead (and some are, as they redirect to general info) or require some special access level (or are dead, but the error message is really not helpful at all).
Is there somewhere else I can go? The Documentation page offers little towards what I need...
P.S. I think I should tag this question with CfS-specific tags, but I don't have enough rep...
Refer to 7.2 Deprecation Policy. Google will announce if it intends to discontinue or make backwards incompatible changes to the Services specified at the URL in the next sentence. Google will use commercially reasonable efforts to continue to operate those Services versions and features identified at https://cloud.google.com/terms/deprecation without these changes for at least one year after that announcement, unless (as Google determines in its reasonable good faith judgment)
Question: how will the announcement be made? Is it an email notice to the project owner, or an announcement on the official website?
Thanks
Usually, when you are affected by the deprecation of a GCP tool, the project owner gets an email describing what will be affected, along with some guidance on reducing the impact.
They also make an announcement in the documentation with the dates on which they plan to deprecate the tools; a note along those lines is usually added there.
Hope you find this useful!
I currently work for a company that uses a lot of Sitecore servers and has many dev seats across Europe.
A problem I have run into is that we desperately need a testing environment for smoke testing, automated tests, and other manual pre-QA deployment checks.
The internal department that deals with licenses says that kind of environment is classed as a full server and requires the full license fee (which has a lot of zeros!!).
Because it's an enterprise business we are now in a catch-22 situation. I have heard that spinning up a new VM on the machine I am developing on is allowed under a developer's license, and that I can reuse my developer license on any machine as long as I am the only person that uses it.
So, if our tester sets up his own test machine that only he uses, is it covered by his developer license? That machine will be rebuilt several times a week and will rarely have anybody else connect to it, except maybe other developers. (License overlapping?)
Has anybody had similar issues or found a solution? I need to provide formal proof if I have any chance of pushing this forward. (I contacted Sitecore as well, but it may take a while for them to come back; just looking to see if anybody else can help in the meantime.)
I have experienced this same scenario with several clients who did not purchase licenses for their test servers but are now wondering if their developer seats can cover this. I have always recommended that a separate server license be procured and not to attempt to use the developer seat.
You state that you need a 'formal' proof. That can only be obtained from your Sitecore sales rep. They are usually very quick to respond to clarify licensing questions on what your particular licensing agreement covers for your organization.
If you are working with an implementation partner, they may also be able to help you understand your licensing, but in most cases they would still need to confirm with your Sitecore sales rep.
Sitecore 8+ licensing structure has changed and will allow you to create multiples of Virtual Machines using a single license. This can be leveraged for test systems, load balancing, pre-production or quality assurance uses.
IMHO: The only reason they did this was probably to get onto the "cloud" marketing train, having realised that their extremely restrictive 1990s-era licensing terms needed to be overhauled because they prevented Sitecore themselves from using their own software in virtual machines.
So prior to Sitecore 8: no. You have to have a full license for each machine.
Basically, extremely restrictive licensing that cost a fortune, as that is the Sitecore business model.
I was wondering whether there is a survey or report on the current state of browser compliance with the three cookie specifications: Netscape's original draft, RFC 2109, and RFC 2965, which obsoletes RFC 2109.
I know that, due to its age, Netscape’s draft will be supported by most clients. But some recommend not to use it any more, e.g. this tutorial on Apache’s HttpClient:
Netscape draft: This specification conforms to the original draft specification published by Netscape Communications. It should be avoided unless absolutely necessary for compatibility with legacy code.
So what about the other specifications? Are they ready to be used yet?
The consensus seems to be that they still aren't ready to be used yet. Some of the reasons for that are mentioned here and mostly relate to browser compliance.
However, on a hunch, I suspect your motive for asking this might relate to the session hijacking problem that has been brought into the limelight by applications like FireSheep.
If that's the case, I came across an interesting paper proposing a solution to the problem called OTCs (one-time cookies). It might be worth a read. Its title is One-Time Cookies: Preventing Session Hijacking Attacks with Disposable Credentials and it's from four PhD students at Georgia Tech.
(In case that Google Docs link doesn't work, here's a direct link to the PDF.)
In summary, it basically concludes:
While completely replacing HTTP with HTTPS will improve the overall security of the Web, it can be a challenging and complex project for some web applications . . . As a result, many web applications will remain vulnerable while site-wide HTTPS is being deployed, a process that is likely to take several years.
...
By relying on a well-known cryptographic construction such as hash chains, OTC creates disposable authentication tokens that cannot be reused, providing more robust session integrity . . . OTC is considerably more efficient than HTTPS and has approximately the same performance as current cookie-based mechanisms.
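The hash-chain construction the paper relies on can be sketched in a few lines. This is a simplified illustration, not the paper's actual protocol: the client reveals tokens from a precomputed chain in reverse order, and the server only stores the last token it accepted, checking that hashing the new token reproduces it, which makes every token single-use.

```python
import hashlib

def hash_chain(seed: bytes, length: int) -> list[bytes]:
    """Build a chain t[0] = H(seed), t[i] = H(t[i-1])."""
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

class Server:
    def __init__(self, anchor: bytes):
        self.last = anchor  # the tip of the chain, established at login

    def accept(self, token: bytes) -> bool:
        """A token is valid iff hashing it yields the last accepted token,
        so a captured token cannot be replayed."""
        if hashlib.sha256(token).digest() == self.last:
            self.last = token
            return True
        return False

chain = hash_chain(b"session-secret", 4)
server = Server(anchor=chain[-1])

# The client spends tokens in reverse chain order; replaying one fails.
assert server.accept(chain[-2])
assert not server.accept(chain[-2])  # replay rejected
assert server.accept(chain[-3])
```

An eavesdropper who steals one token cannot forge the next, because that would require inverting the hash, which is exactly the property a session-hijacking defence needs.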
It's a very interesting read. I hope that helps someone in some way,
~gMale
The most recent survey out there seems to be the one written by Ka-Ping Yee in 2002, which is considered ancient in the evolution of the WWW/Internet. The upside is that it surveyed 12 browsers across 3 OSs, which may give a fair insight into how they handle cookie management.
Yee, Ka-Ping, "A Survey of Cookie Management Functionality and Usability in Web Browsers," http://zesty.ca/2002/priv/cookie-survey.pdf, 2002.
Another more recent article, although less relevant, is written by Yue, Xie, and Wang in 2009 (published in 2010). It conducted a large-scale study on HTTP cookie management with more than 5000 websites, using a system that can automatically validate the usefulness of cookies from a website and set the cookie usage permission on behalf of users.
Chuan Yue, Mengjun Xie, and Haining Wang, "An Automatic HTTP Cookie Management System," Journal of Computer Networks (COMNET), 54(13), pp. 2182–2198, 2010.
You might want to check
http://lists.w3.org/Archives/Public/www-tag/2011Mar/0021.html
which refers to
http://www.ietf.org/id/draft-ietf-httpstate-cookie-23.txt
This draft is intended to obsolete RFC 2965.
"Document Quality
This document defines the HTTP Cookie and Set-Cookie HTTP
header fields as they are presently utilized on the Internet. As a
result, there are already many implementations of this specification."
Web services and web APIs have managed to increase the accessibility of the information stored and catalogued on the internet. They have also opened up a vast array of enterprise power functionality for smaller thin client applications.
By tapping into these services, developers can provide functionality that would have taken them months, perhaps years, to set up themselves. They can combine them into single applications that make life generally easier for their users.
Whether displaying information about the music being played, finding items of interest near the user, or simply tweeting and blogging from the same application - the possibilities are growing every day.
I want to know about the most interesting or useful services that are out there, especially ones that most of us may not have heard about yet. Do you maintain an API or service? Or do you have a clever mashup that provides even more benefits than the originals?
YQL - Yahoo provides a tool that lets you query many different APIs across the web, even for sites that don't provide an API as such.
From the site:
The Yahoo! Query Language is an
expressive SQL-like language that lets
you query, filter, and join data
across Web services.
...
With YQL, developers can access and
shape data across the Internet through
one simple language, eliminating the
need to learn how to call different
APIs.
The World Bank API is pretty cool. Google uses it in search results. My favourite implementations are the cartograms at worldmapper.
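For a concrete sense of how simple the World Bank API is to call, here is a sketch that builds a request URL against its v2 REST endpoint (the indicator code SP.POP.TOTL is total population; no network request is made, so check the current World Bank docs for response details):

```python
from urllib.parse import urlencode

def worldbank_url(country: str, indicator: str, year: int) -> str:
    """Build a World Bank v2 API request URL for one indicator and year."""
    base = f"https://api.worldbank.org/v2/country/{country}/indicator/{indicator}"
    return base + "?" + urlencode({"format": "json", "date": str(year)})

# Total population of Brazil in 2010, as JSON:
print(worldbank_url("br", "SP.POP.TOTL", 2010))
```

Country codes and indicator IDs are the only things you need to learn; everything else is plain HTTP and JSON, which is why it mashes up so easily into things like cartograms.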
It's very niche, but I happen to think the OpenCongress API is amazing.
Less niche: Google Translate has an API which will guess the language of a piece of text. You'd be AMAZED how frequently this comes in handy (even though it's not as tweakable as you'd like and is not trained on small samples).
I was just about to have a stab at using the SoundCloud API.
I know many people who already use SoundCloud for sharing their musical masterpieces, and it's a pretty good site. Hopefully the API will be just as good!
I like the RESTful API for weather.com. It's free and very useful for the new age of location-aware apps: https://registration.weather.com/ursa/xmloap/step1
It does require registration, but they don't spam you or anything - it's just to provide you a key to use the API.
Ah yes - here's another one I've been meaning to check out but haven't tried yet.
The BBC offer a bunch of APIs/feeds that look very promising:
http://ideas.welcomebackstage.com/data
They include APIs for accessing schedule data for both TV and radio listings, along with all kinds of news searches. It even looks like they'll be offering some sort of geo-location service soon, so it will be interesting to see what that has to offer.
Another interesting one for liberal brits! ;)
The Guardian newspaper has its own API:
http://www.guardian.co.uk/open-platform
MusicBrainz
An excellent service for music mashups.
Not many people know that Last.fm's initial database was scraped from this service.
The United States Postal Service offers a web service that does address standardization. Quite useful in reducing clutter and cleaning data before it gets put into your database.
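As an illustration of what talking to that service involves, here is a sketch that builds the XML payload for the classic USPS Web Tools address "Verify" API. The element names follow the old Web Tools documentation and the user ID is a placeholder, so confirm both against the current USPS docs before relying on them; no request is sent here.

```python
import xml.etree.ElementTree as ET

def build_verify_request(user_id: str, street: str, city: str, state: str) -> str:
    """Build the XML payload for the USPS Web Tools address Verify API.
    (Element names follow the classic Web Tools docs; confirm against
    current USPS documentation before use.)"""
    root = ET.Element("AddressValidateRequest", USERID=user_id)
    addr = ET.SubElement(root, "Address", ID="0")
    ET.SubElement(addr, "Address1").text = ""      # apartment/suite, often blank
    ET.SubElement(addr, "Address2").text = street  # street address line
    ET.SubElement(addr, "City").text = city
    ET.SubElement(addr, "State").text = state
    ET.SubElement(addr, "Zip5").text = ""          # left blank: the service fills it in
    ET.SubElement(addr, "Zip4").text = ""
    return ET.tostring(root, encoding="unicode")

xml_payload = build_verify_request("DEMO_USER", "1600 Pennsylvania Ave NW", "Washington", "DC")
print(xml_payload)
```

The service echoes back the standardized address with corrected spelling, casing, and ZIP+4, which is what makes it handy as a cleaning step before data hits your database.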