I'm looking to use ApiAxle to offer a prepaid API to my customers. I want to collect payment in a different system, and have that system add credits for API calls to ApiAxle. ApiAxle would then decrement the amount of credits available whenever an API call is made, and return an error if the user is out of credits.
For example, the user would buy 1,000 API calls for $1 in my external system. The external system would make an ApiAxle API call to credit the user's API key with 1,000 calls. When they tried to make their 1,001st call, it would return an error instead of hitting my API endpoint.
Is this possible? If not, is there another free API management platform that could handle this?
I ended up modifying apiaxle-proxy/node_modules/apiaxle-base/app/model/redis/api_limits.js. I added a Quota that is decremented with every API hit, and is set directly in Redis by my billing system. Not ideal, but workable.
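For reference, the billing-system side can be a plain Redis write. A minimal sketch (the key name is a placeholder for whatever your modified api_limits.js actually reads):

    import redis

    r = redis.Redis(host="localhost", port=6379)

    def credit_api_key(api_key, calls_purchased):
        # INCRBY is atomic, so top-ups and the proxy's decrements don't race.
        # "quota:" is a placeholder prefix; match it to your api_limits.js change.
        r.incrby("quota:" + api_key, calls_purchased)

    credit_api_key("customer-abc", 1000)  # e.g. $1 buys 1,000 calls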
I'm still really surprised this isn't built into ApiAxle. Prepaid APIs aren't that uncommon.
Is there a REST or Node.js library API that provides global network metadata for Bitcoin and/or Ethereum?
The metadata I'm looking for is:
Average wait time for a transaction confirmation on the network
Average fee cost per transaction on the network
I know I could crawl/parse one of the many sites that provide this data, but that's not ideal. Hence I'm looking for a dedicated API to obtain this information.
Eventually I found out about services like CryptoCompare that provide that data.
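For example, fetching this kind of metadata over REST could look like the sketch below. The endpoint path and field names are hypothetical; check the provider's docs (e.g. CryptoCompare) for the real ones:

    import requests

    def get_network_stats(base_url):
        # Hypothetical path and fields; substitute your provider's actual API.
        resp = requests.get(base_url + "/network/stats", timeout=30)
        resp.raise_for_status()
        data = resp.json()
        return {
            "avg_confirmation_time": data.get("avg_confirmation_time"),
            "avg_fee_per_tx": data.get("avg_fee_per_tx"),
        }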
I have an app that consumes my own API (Google Cloud Functions) and my own storage (it holds images).
Now I have a couple of clients that want to consume my API and my storage (a Google Cloud Storage bucket).
The bucket contains a lot of photos that have public read access.
I'm trying to define a tier pricing model, in which the price depends on 2 things:
The number of API calls,
The Cloud Storage Bandwidth
Meaning, I want to set pricing in relation to the costs they generate on my Google Cloud account.
To give an example:
If a client makes between 1 and 500,000 API calls, I'll charge them $10. Between 500,001 and 1,000,000, I'll charge $18, and so on.
Same thing for the Cloud Storage bandwidth: if they consume between 0 GB and 10 GB, it costs $10; between 10 GB and 100 GB, it costs $18, and so on.
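In code, a tier lookup for the example above could be as simple as this (thresholds and prices are just the example values from my question):

    # Illustrative tier lookup; thresholds/prices are the example values above.
    API_TIERS = [(500_000, 10), (1_000_000, 18)]  # (upper bound, price in $)

    def price_for_calls(n_calls):
        for upper, price in API_TIERS:
            if n_calls <= upper:
                return price
        raise ValueError("volume exceeds defined tiers")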
How can I do this with Google Cloud? How can I know what my clients are consuming? And is there a way to share that information with them, so they are able to monitor their usage every day?
I'm thinking that measuring the API usage is not going to be THAT hard, because I can just save a value in the DB every time a user calls the API. But if there is a way to avoid that, it would be good, since Google Cloud is going to charge me for the DB write I'd use to track the API usage.
On the other hand, for measuring the Cloud Storage, I was thinking something like this:
Let's suppose I have a public bucket with photos at the URL buckets.google.com/photos.
If my client wants to get the /cats/ugly-cat.jpg photo, I can ask them to call A FUNCTION at /api/get-photo/?url=/cats/ugly-cat.jpg. In that function I can track that the user just requested a photo, and then redirect the call to the real URL where the user will see the photo (buckets.google.com/photos/cats/ugly-cat.jpg). As you can see, this idea seems to perform too poorly, since it gets charged for the Function invocation, the DB write, and the Storage bandwidth. And even then, it doesn't actually track bandwidth; it only tracks the number of photos the client wants to show.
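For concreteness, a minimal sketch of that redirect function (assuming a Python Cloud Function; the bucket URL, collection, and field names are just placeholders):

    from flask import redirect
    from google.cloud import firestore

    BUCKET_URL = "https://storage.googleapis.com/photos"  # stand-in for the real bucket URL
    db = firestore.Client()

    def get_photo(request):
        api_key = request.args.get("api-key")
        path = request.args.get("url")  # e.g. /cats/ugly-cat.jpg
        if not api_key or not path:
            return ("Bad Request", 400)
        # One counter per client; each hit is one atomic increment
        # (and one billed write, which is exactly the cost I'm worried about).
        db.collection("usage").document(api_key).set(
            {"photo_requests": firestore.Increment(1)}, merge=True
        )
        return redirect(BUCKET_URL + path, code=302)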
As you can see, both ideas are a bit ugly, with poor performance.
There should already be something out there that handles this cleanly.
Obviously, the API call (and also the photo link) would include the client's API key, to help measure the usage. Something like:
functions.google.com/api/search-photos/?api-key=111, and
bucket.google.com/photos/cats/ugly-cat.jpg?api-key=111
Where 111 identifies client 111.
So, the question: is there a "best-known" way to measure this usage?
I think Cloud Endpoints is the best solution for you because managing your API as you suggest might get unwieldy quickly.
Endpoints provides all the tools to control authentication, quota, and cost management, plus a developer portal so your users can access documentation and interact with your API. It also integrates with all Cloud Platform products, including Cloud Functions.
I'm building a website to allow people to donate to a local charity quickly and easily. The charity accepts direct donations, but its primary function is "per mile" style donations, except with pull-ups. In the past, they have collected the pledges ("I'll pay $1 per pull-up"), then manually contacted people for payment after the event. This isn't very slick and is very time consuming.
What I'd like to be able to do is collect a pledge and payment information, then charge people automatically after the event. From what I've seen, I should put a hold/authorization on their account, then capture it with the appropriate amount after the event. But reauthorizing will only allow up to 115% of the original, and I can't very well just authorize a large amount and let it sit for two months before reauthorizing and capturing it.
I know this can be done, but I haven't messed with this side of things before, and PayPal's REST API doesn't have an obvious solution. Is there something I'm missing? Should I be going about this a different way?
You can use reference transactions. I would recommend sticking with the Classic API for now, though. REST isn't as mature and doesn't have all of the same functionality quite yet.
So in the classic API you would use Express Checkout and/or Payments Pro. You can process an original authorization and then simply void it, or use the card verification process with Payments Pro.
You won't need to capture an original amount, so you won't need to worry about the 115% cap on the capture.
Instead, you'll use the DoReferenceTransaction API to process any amount you need to at any time from that user's account.
With Express Checkout you have to be sure to include a billing agreement in the setup. This guide outlines that whole process.
With Payments Pro you just do the original card verification / auth and then pass that auth ID into the DoReferenceTransaction API.
In either case, if you're working with PHP, this PayPal PHP SDK will make all of the API calls very quick and easy for you.
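If you're not on PHP, the underlying NVP call is simple in any language. A rough sketch in Python (parameter names follow the Classic NVP docs; credentials, version, and amounts are placeholders to adapt):

    import requests
    from urllib.parse import parse_qs

    NVP_ENDPOINT = "https://api-3t.sandbox.paypal.com/nvp"  # sandbox; swap for live

    def charge_pledge(reference_id, amount, user, pwd, signature):
        params = {
            "METHOD": "DoReferenceTransaction",
            "VERSION": "119.0",
            "USER": user,                 # your API credentials
            "PWD": pwd,
            "SIGNATURE": signature,
            "REFERENCEID": reference_id,  # the voided auth / billing agreement ID
            "PAYMENTACTION": "Sale",
            "AMT": "%.2f" % amount,       # e.g. pull-up count * pledge per pull-up
            "CURRENCYCODE": "USD",
        }
        resp = requests.post(NVP_ENDPOINT, data=params, timeout=30)
        # NVP responses come back URL-encoded: ACK=Success&TRANSACTIONID=...
        return parse_qs(resp.text)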
I have a .NET application that takes a list of names/email addresses and finds their matches on Facebook using the Graph API. During testing, my list had 900 names. I checked Facebook matches for each name in a loop, and the process completed. After that, when I opened my Facebook page, it gave me a message that my account had been suspended due to suspicious activity.
What am I doing wrong here? Doesn't Facebook allow a large number of search requests to their server? And 900 doesn't seem like a big number either.
Per the platform policies (https://developers.facebook.com/policy/), this may be a suspected breach of their "Principles" section.
See Policies I.5:
If you exceed, or plan to exceed, any of the following thresholds please contact us by creating confidential bug report with the "threshold policy" tag as you may be subject to additional terms: (>5M MAU) or (>100M API calls per day) or (>50M impressions per day).
Also IV.5:
Facebook messaging (i.e., email sent to an @facebook.com address) is designed for communication between users, and not a channel for applications to communicate directly with users.
Then the biggie, V. Enforcement. No surprise, it's both automated and monitored by humans. So they may well be seeing 900+ requests coming from your app.
What I'd recommend doing:
Store what you can client-side (in a cache or data store) so you make fewer calls to the API.
Put logging on your API calls so you, the developer, can see exactly what is happening. You might be surprised at what you find there.
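Combining both suggestions, a minimal sketch (Python for brevity; the same pattern applies in .NET, and the search endpoint reflects the Graph API of that era, so adjust to the current API):

    import logging
    import time

    import requests

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("graph")

    _cache = {}  # name -> (result, fetched_at)
    CACHE_TTL = 24 * 3600  # re-query a given name at most once a day

    def search_user(name, access_token):
        hit = _cache.get(name)
        if hit and time.time() - hit[1] < CACHE_TTL:
            return hit[0]  # served from the cache, no API call made
        log.info("Graph API search for %r", name)  # shows your real call volume
        resp = requests.get(
            "https://graph.facebook.com/search",
            params={"q": name, "type": "user", "access_token": access_token},
            timeout=30,
        )
        result = resp.json()
        _cache[name] = (result, time.time())
        return result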
I'm building a site with django that lets users move content around between a bunch of photo services. As you can imagine the application does a lot of api hits.
For example: a user connects Picasa, Flickr, Photobucket, and Facebook to their account. Now we need to pull content from 4 different APIs to keep this user's data up to date.
Right now I have a function that updates each API, and I run them all simultaneously via threading. (All the APIs that are not enabled return False on the second line, so no, it's not much overhead to run them all.)
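Roughly, the fan-out looks like this (the service objects are placeholders for my per-API wrappers):

    import threading

    def update_service(user, service):
        if not service.enabled:      # disabled APIs bail out immediately
            return False
        service.pull_updates(user)   # the actual API hits happen here
        return True

    def update_all(user, services):
        threads = [threading.Thread(target=update_service, args=(user, s))
                   for s in services]
        for t in threads:
            t.start()
        for t in threads:
            t.join()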
Here is my question:
What is the best strategy for keeping content up to date using these APIs?
I have two ideas that might work:
Update the APIs periodically (like a cron job), and whatever we have at the time is what the user gets.
benefits:
It's easy and simple to implement.
We'll always have pretty good data when a user loads their first page.
pitfalls:
we have to do API hits all the time for users that are not active, which wastes a lot of bandwidth
It will probably make the API providers unhappy
Trigger the updates when the user logs in (on a page load)
benefits:
we save a bunch of bandwidth and run less risk of pissing off the API providers
doesn't require NEARLY the amount of resources on our servers
pitfalls:
we either have to do the update asynchronously (and won't have anything on first login) or...
the first page will take a very long time to load because we're getting all the API data (I've measured 26 seconds this way)
Edit: the design is very light; it has only two images, an external CSS file, and two external JavaScript files.
Also, the 26-second figure comes from the Firebug network monitor running on a machine that was on the same LAN as the server.
Personally, I would opt for the second method you mention. The first time you log in, you can query each of the services asynchronously, showing the user some kind of activity/status bar while the processes are running. You can then populate the page as you get the results back from each of the services.
You can then cache the results of those calls per user so that you don't have to call the APIs each time.
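With Django that can be as simple as the cache framework. A sketch (key format and TTL are up to you):

    from django.core.cache import cache

    CACHE_TTL = 60 * 30  # refresh a service at most every 30 minutes

    def get_user_content(user_id, service_name, fetch_fn):
        key = "content:%s:%s" % (user_id, service_name)
        content = cache.get(key)
        if content is None:
            content = fetch_fn()  # the slow API call
            cache.set(key, content, CACHE_TTL)
        return content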
That lightens the load on your servers, loads your page fast, and gives the user some indication of activity (along with incremental updates to the page as their content loads). I think those add up to the best user experience you can provide.