Stripe - API Request rate limit exceeded - Firebase Cloud Functions - google-cloud-platform

Let's say I'm creating a PWA (Progressive Web App) where products can be added by users.
Prices of these products range from 0.01 EUR to 1.00 EUR.
I'm using Stripe for payments.
The Stripe Order object does not support a dynamic price passed on the fly, without any reference (a kind of foreign key).
To accept the Order, Stripe needs a reference to a SKU.
In my case, each SKU will be a price variation of the product.
That means that, to cover all variations, I need 100 SKUs, from 1 (0.01 EUR) to 100 (1.00 EUR).
So, for each product created in Stripe, I need to create 100 SKUs in Stripe.
I tried to insert a test dataset of 200 products, which means (200 products + (200 x 100 SKUs)) = 20200 requests.
I got a surprising "Request rate limit exceeded" error from Stripe.
Less than half of the records were created... :(
That "Request rate limit exceeded" is the core of the problem.
Right now, the insertion process is the following (x 200; a rough code sketch follows the list):
Create product in Firestore.
Firebase Cloud Function listener:
OMG, new product inserted in Firestore. OK, let's:
Import the official Node.js Stripe & Algolia libraries
Create the product in Stripe to make it billable
Create the 100 SKUs related to the product in Stripe, with Promise.all (this is where, at some point, I end up with a rate limit error, because my concurrent Cloud Function instances all use the same Stripe key, which means the same Stripe account)
Create product in Algolia to make it searchable
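A rough sketch of that trigger in Node.js, assuming hypothetical collection names, config keys and a productId-cents SKU id convention (none of these come from the original post):

const functions = require('firebase-functions');
const algoliasearch = require('algoliasearch');
const stripe = require('stripe')(functions.config().stripe.secret_key);

exports.onProductCreated = functions.firestore
  .document('products/{productId}')          // assumed collection name
  .onCreate(async (snap, context) => {
    const product = snap.data();
    const productId = context.params.productId;

    // 1. Make the product billable in Stripe (legacy Orders API products need type 'good').
    await stripe.products.create({ id: productId, name: product.name, type: 'good' });

    // 2. Create the 100 price-variation SKUs in parallel -- this burst is what
    //    triggers the "Request rate limit exceeded" error.
    await Promise.all(
      Array.from({ length: 100 }, (_, i) => i + 1).map((cents) =>
        stripe.skus.create({
          id: `${productId}-${cents}`,         // assumed SKU id convention
          product: productId,
          price: cents,                        // amount in cents: 0.01 to 1.00 EUR
          currency: 'eur',
          inventory: { type: 'infinite' },
        })
      )
    );

    // 3. Make the product searchable in Algolia.
    const algolia = algoliasearch(
      functions.config().algolia.app_id,
      functions.config().algolia.api_key
    );
    await algolia.initIndex('products').saveObject({ objectID: productId, ...product });
  });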
I need solutions to counter this Stripe API rate limit error.
I have several solutions in mind :
Solution 1 :
Be able to increase the Stripe API rate limit for a given amount of time.
Not sure this is possible.
Solution 2 :
Be able to use different Stripe keys and rotate over them to perform admin tasks, such as inserting multiple products/SKUs into Stripe.
Ultimately, in production, be able to programmatically create one Stripe key per user, so each user would have their own limit.
Not sure this is possible.
Solution 3 :
Slow down the insertion process in JavaScript.
I don't know how to do that.
Besides, Cloud Functions have a 60-second execution timeout by default, so I can't delay too much.
Solution 4 :
Delay work using Pub/Sub (?), or Firestore Triggers
For example, have an integer in Firestore that each function call increments; the same function listens to the write and re-increments the number, and so on, until the number reaches 100 for the 100th SKU (as sketched below). That solution would serialize the 100 SKU writes to Stripe.
I'm not sure this would really slow the work down enough to stay under the API rate limit. In addition, such a solution would cost a lot of money: 100+ Firestore writes, and 100+ function calls to perform those writes, for just one product, which means 20,000+/20,000+ for the 200 products. That would be expensive.
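A very rough sketch of what that self-retriggering counter could look like, reusing the functions/stripe setup from the earlier sketch and assuming the create trigger seeds a skuCount field at 1 (field name and id convention are assumptions):

exports.onSkuCounterWrite = functions.firestore
  .document('products/{productId}')
  .onUpdate(async (change, context) => {
    const before = change.before.data().skuCount || 0;
    const after = change.after.data().skuCount || 0;
    if (after <= before || after > 100) return null;   // not a counter bump, or already done

    // Create exactly one SKU per invocation.
    await stripe.skus.create({
      id: `${context.params.productId}-${after}`,       // assumed SKU id convention
      product: context.params.productId,
      price: after,
      currency: 'eur',
      inventory: { type: 'infinite' },
    });

    // Writing the next value re-triggers this same function for the next SKU.
    return change.after.ref.update({ skuCount: after + 1 });
  });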
Solution 5 :
Perform Just-In-Time insertions, when user pays.
The server-side algorithm, after a Payment Request API call, might look like this:
Create order in Stripe
If error 'No such sku...' caught {
For each SKU { // Ideally filter here SKUs to create (only those in error)
If price not between 1 and 100 {
continue // Bad price, not legit
}
Create SKU in Stripe
If error 'Already exists' {
continue // no creation needed for that SKU
}
If error 'No such product...' caught {
If productId does not exist in Firestore {
continue // Bad productId, not legit
}
Create product in Stripe
}
Create SKU in Stripe
}
}
Create order in Stripe
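In Node.js, that flow might look roughly like the sketch below; the item shape, the SKU id convention and the productExistsInFirestore helper are assumptions, and the error checks are simplified to message matching, as in the pseudocode:

async function createOrderJit(stripe, items) {
  // items: [{ productId, name, price }] where price is 1-100 cents (assumed shape)
  const createSku = (it) =>
    stripe.skus.create({
      id: `${it.productId}-${it.price}`,   // assumed SKU id convention
      product: it.productId,
      price: it.price,
      currency: 'eur',
      inventory: { type: 'infinite' },
    });

  const orderParams = {
    currency: 'eur',
    items: items.map((it) => ({ type: 'sku', parent: `${it.productId}-${it.price}` })),
  };

  try {
    return await stripe.orders.create(orderParams);
  } catch (err) {
    if (!/no such sku/i.test(err.message)) throw err;

    for (const it of items) {                              // ideally only the SKUs reported missing
      if (it.price < 1 || it.price > 100) continue;        // bad price, not legit
      try {
        await createSku(it);
      } catch (skuErr) {
        if (/already exists/i.test(skuErr.message)) continue;          // SKU already there
        if (!/no such product/i.test(skuErr.message)) throw skuErr;
        if (!(await productExistsInFirestore(it.productId))) continue; // bad productId, not legit (assumed helper)
        await stripe.products.create({ id: it.productId, name: it.name, type: 'good' });
        await createSku(it);
      }
    }
    return stripe.orders.create(orderParams);              // retry the order
  }
}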
This last solution could do the job.
But it might come with some delay for the user at payment time, which could increase stress. It might also increase Stripe calls during business hours. Many purchases at the same time could lead to a Stripe API rate limit error, especially with well-filled carts (say an average of 30 products per cart, so in the worst case 30+ HTTPS calls during payment, times 1000 users = 30,000 calls => Stripe error). That problem might decrease over time for a given product, because once a SKU is created it exists for good. Still, since there would be new products every day, i.e. products with zero SKUs at creation, the problem remains.
What do you think ?
Do you have any other ideas ?

Solution 3 and Solution 5 with some tweaks will work best.
Solution 3: You can limit the number of concurrent requests to Stripe using the async module's forEachLimit or queue, as in the sketch below.
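For example, a minimal sketch with eachLimit (forEachLimit is an alias) from the async module; the concurrency value of 5 and the SKU id convention are arbitrary assumptions:

const async = require('async');

// Create the 100 SKUs with at most 5 Stripe requests in flight at any time.
async function createSkusThrottled(stripe, productId) {
  const prices = Array.from({ length: 100 }, (_, i) => i + 1);
  await async.eachLimit(prices, 5, async (cents) => {
    await stripe.skus.create({
      id: `${productId}-${cents}`,      // assumed SKU id convention
      product: productId,
      price: cents,
      currency: 'eur',
      inventory: { type: 'infinite' },
    });
  });
}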
Solution 5: Just-in-time insertion is also a good option, as it won't put much load on the Stripe servers at the same time. Regarding your concern about getting the same error during business hours, that will be a very rare case, as the Stripe APIs are built to perform very well. But if you still have doubts, you can run a background process that adds SKUs during non-business hours, which will keep creating SKUs for you without hitting the Stripe API rate limit.
Solution 6 (Modified Solution 5): Keep the just-in-time insertion, but also send an extra API request to your server whenever a product is added to the cart, which then checks whether the SKU exists in Stripe and, if not, creates it in the background before the cart payment happens.

Solution 6 :
Same idea (JIT), but moving SKU creation from payment time to product-selection time. Each time a product is selected, try to create the product and its current SKU (price variation) in Stripe. This way, Stripe calls should be spread out more over time. Or maybe it would end up with more API calls, since we select products more often than we pay: users can select and unselect products, so they might select more products during their journey than the number of products finally paid for in the cart?
Solution 7 :
Same idea (JIT), but with the SKUs cached in Algolia or Firebase, so I can answer "does this SKU exist?" without querying Stripe. That should reduce Stripe calls, as long as the existence test is performed before the create call (we do not call Stripe.skus.create() blindly). The drawback is that Firebase and Algolia are exposed on the front end, so the SKUs and prices would be too, which is a potential source of threat; so a dedicated index, known only by the server, has to be used.
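A small sketch of that cached-existence check, assuming a Firestore Admin instance (db) and a server-only collection named stripeSkus (the collection name and SKU id convention are assumptions):

// Only call Stripe when the server-side cache misses.
async function ensureSkuCached(db, stripe, productId, cents) {
  const skuId = `${productId}-${cents}`;
  const cacheRef = db.collection('stripeSkus').doc(skuId);   // server-only index

  if ((await cacheRef.get()).exists) return skuId;           // no Stripe call needed

  await stripe.skus.create({
    id: skuId,
    product: productId,
    price: cents,
    currency: 'eur',
    inventory: { type: 'infinite' },
  });
  await cacheRef.set({ createdAt: Date.now() });             // remember for next time
  return skuId;
}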

Related

Amazon obtain current buybox prices

I have a list containing Amazon products (ASINs). I want to update the buy box price in my list roughly every 5 hours. I am a registered Amazon seller, so I do have access to the Selling Partner API (Amazon-Services-API). But the issue here is the rate limit: it is only 0.5 requests per second.
I have about 500k products in my list; it would take multiple days at a rate limit of 0.5 requests per second.
There are several tools, like scanunlimited or analyzer.tools, which are able to obtain the current buy box price of a product much faster. Where are they getting their live data from? Am I missing out on some API?
Does anyone have an idea how I can gather the data more quickly than 0.5 requests per second?
Kind regards

How to build events aggregation service for high load system with DynamoDB

I'm working on an Ad-tech system which serves millions of users.
Basically, users (non-anonymous users) can see different ads that are created by the marketing team.
Our marketing team wants to be able to set frequency caps on those ads (among other targeting rules they already have).
For example:
"We should not show this ad to a user if they have already seen/clicked this ad more than X times in the last Y days."
Also, ads can be grouped into campaigns, so rules like this are also possible:
"We should not show this to a user if they have viewed ads in this campaign more than X times in the last Y days."
Also, our marketing team might want to know how many people viewed/clicked a specific ad in the last Y days.
We have roughly 200K RPM and our responses should be very fast.
The smallest unit of time for our queries is one day and it will not change.
Few questions and thoughts:
Is DynamoDB a good fit?
I thought about creating a table for each event type (Click/View/Close..)
What is the best way to configure the primary key?
I thought about setting the primary key as the user id and the sort key as a combination of the ad id and the current day {dd/mm/yyyy}.
I thought about using the "ADD" operation to increase the counter when a user clicks/views an ad on a specific date (a sketch follows below). Are these expensive operations? Do I have an alternative?
What is the best way to also be able to query per ad and per campaign (for example: "all user views for all ads in a campaign", or "get all ad views in the last 40 days")?
What other considerations should I take in mind?
Thanks a lot
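For illustration, the ADD-based increment described above might look like this with the AWS SDK for JavaScript v3; the table name, key names and attribute names are assumptions:

const { DynamoDBClient, UpdateItemCommand } = require('@aws-sdk/client-dynamodb');

const client = new DynamoDBClient({ region: 'us-east-1' });

// Increment the view counter for (user, ad, day). ADD treats a missing attribute
// as 0, so no separate insert is needed for the first event of the day.
async function recordView(userId, adId, day /* 'dd/mm/yyyy' */) {
  await client.send(new UpdateItemCommand({
    TableName: 'ad_views',                    // assumed table name
    Key: {
      userId: { S: userId },                  // partition key
      adDay: { S: `${adId}#${day}` },         // sort key: ad id + day
    },
    UpdateExpression: 'ADD #count :one',
    ExpressionAttributeNames: { '#count': 'viewCount' },
    ExpressionAttributeValues: { ':one': { N: '1' } },
  }));
}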

Authorize.Net Query on recurring billing

We have been trying to integrate the Authorize.Net payment gateway in one of our client's projects, based on an ASP.NET Web API. We have a few queries that came up while implementing recurring billing scenarios.
Query 1
We checked the APIs for creating a subscription, getting a subscription, and updating a subscription. However, once we have created a subscription, is there any way we can update the amount of the subscription?
Let’s say for example.
We created a subscription for our user for a $50 amount on 1st Jan 2021 with a 30-day interval.
And on 15th Jan 2021, our user wishes to purchase 1 more license, which will cost him $10 more.
Hence, can we increase the amount billed in his subscription cycle by updating the subscription?
We checked the Update Subscription API, and it only allows updating credit card info; hence, is there any way to update the amount?
Query 2
Is there any way to implement auto-renewal, so that when a user wishes, he/she can turn auto-renewal on/off for recurring billing?
Query 3
If there is a way to switch off auto-renewal of recurring billing, is there any link that we can generate and send them through which they can pay their next due?
Query 1: You cannot update a subscription amount. If the amount needs to change you either need to cancel the current subscription and create a new one for the new amount (being sure to prorate credit from the previous subscription payment) or use CIM to manage your subscription service which allows you to charge against their card at your discretion but requires you to also manage the subscription yourself.
Query 2: Not through Authorize.Net. If you want a subscription to start or end you need to explicitly do so through their API.
Query 3: Not through Authorize.Net. That is application logic and, once again, something you would be responsible for managing.
I'm assuming you are using or are aware of the API provided for Authorize.net here: https://github.com/AuthorizeNet/sdk-dotnet/tree/master/Authorize.NET/Api/Controllers
Query 1: As of now, there is a way to update the amount for a given subscription. You can use ARBSubscriptionType class. There is an amount property there you can set. Then you can create the request ARBUpdateSubscriptionRequest, passing in the ARBSubscriptionType class and the subscription Id.
Note: You might have to handle pro-rating.
Query 2: There isn't a built-in renewal feature in Authorize.Net as far as I know. It seems like you could potentially update totalOccurrences by some amount to act as a "renewal", when technically it's an extension of the subscription. The method by which you check when to update, either a modulo operation or a date check, is up to you. You can use the paymentScheduleType class to update totalOccurrences, passing it along to an ARBUpdateSubscriptionRequest.
Query 3: Authorize.Net does not have any in house link generation.

"Average Daily Active Users" for Facebook API Limits

I've suddenly started running into API limits. I've been restricting my API calls to: number of users * 200, but I'm getting error #4 about once per day.
This calculation was based on the docs from the end of 2015, which said: the number of users your app had yesterday, plus new logins today.
But it looks like that has changed to:
The number of users your app has is the average daily active users of your app, plus today's new logins
Can someone explain to me what "average daily active users" is? And is there a way I can get access to this number?
Some information on what I'm doing:
My app fetches pages and posts from pages. To do this, I hit the Facebook API to get users' liked pages. Then, each hour, I fetch posts from the pages the system knows about.
I do the following (a rough sketch follows the list):
Batch requests (50 per batch)
I'm only fetching posts since the last fetch (using the since, until and limit params; 90% of the requests return 0 posts)
I'm only fetching posts from pages my users like
I'm using my app token for these requests
I limit the number of calls per hour to users * 200
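For illustration, one of those batched calls might look roughly like this in Node.js; the page ids, the node-fetch dependency and the per-page limit are assumptions:

const fetch = require('node-fetch');

// Fetch new posts for up to 50 pages in a single batched Graph API request.
async function fetchNewPosts(pageIds, sinceUnixTs, accessToken) {
  const batch = pageIds.slice(0, 50).map((pageId) => ({
    method: 'GET',
    relative_url: `${pageId}/posts?since=${sinceUnixTs}&limit=100`,
  }));

  const res = await fetch('https://graph.facebook.com/v2.10/', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      access_token: accessToken,
      batch: JSON.stringify(batch),
    }).toString(),
  });
  return res.json();   // one entry per sub-request; each still counts against the rate limit
}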
Batch requests don't reduce API limits, they are only faster, that's all. That being said: you wrote that you are using an "App Token" for the requests - you should use a "User Token" instead. It's still a LOT of calls though; the only thing you can do in addition is to reduce the number of API calls.
I found this endpoint in the documentation: https://developers.facebook.com/docs/graph-api/reference/application/
I tested this via
https://graph.facebook.com/v2.10/<my_app_id>?access_token=<my_access_token>&fields=daily_active_users
And it returned
{
"daily_active_users": "152",
"id": "<my_app_id>"
}
It is not the average daily active users, though.

Any way to find out my Amazon Product API Limits

I was given an Amazon awsAccessKeyId and awsSecretKey.
Also, our company is affiliated with Amazon, so we have an Associate Tag.
And I was told we may get higher API limits because we are affiliated.
But I don't have any detailed info about the API limits.
I want to know how many calls I can make in a second.
Is there any way I could check our API key status?
The call I use checks product info, like:
Service=AWSECommerceService
&Operation=ItemLookup&ItemId=[ID]
&IdType=ASIN
.....
When you exceed the request limit, the Amazon Product Advertising API sends a (possibly gzipped) response with a 503 status code. Example response for an ItemLookup query:
<?xml version="1.0"?>
<ItemLookupErrorResponse xmlns="http://ecs.amazonaws.com/doc/2013-08-01/">
<Error>
<Code>RequestThrottled</Code>
<Message>AWS Access Key ID: YOUR-AWS-ACCESS-KEY-ID. You are submitting requests too quickly. Please retry your requests at a slower rate.</Message>
</Error>
<RequestId>fabebd87-54a2-44ec-b547-deb5feee900a</RequestId>
</ItemLookupErrorResponse>
The rules have changed since #at0mzk's answer.
You have to make sales to use the API. The limits are set by sales in the last 30 days.
Effective 23-Jan-2019, the usage limit for each account is calculated based on revenue performance attributed to calls (also called requests) to the Product Advertising API (PA API) during your account’s latest 30-day trailing period.
Each account used for Product Advertising API is allowed an initial usage limit up to a maximum of 1 request per second and a cumulative daily maximum of 8640 requests per day (TPD) for the first 30-day period after your account has been approved. Following that period, your PA API usage limit will solely be based on your shipped item revenue. Your account will earn a usage limit of 1 TPD for every 5 cents or 1 TPS (up to a maximum of 10 TPS) for every $4320 of shipped item revenue generated via the use of Product Advertising API for shipments in the previous 30-day period.
Docs
Apparently there's no way of checking exactly what those limits are, and they will change based on the performance of your account.
It seems that the minimum rate limits are:
1 request per second
8640 requests per day
Those limits will increase if your account has a good performance (as in shipped items revenue) using the API links.
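As a worked example of those formulas: $432 of shipped item revenue in the trailing 30-day period would earn 8,640 requests per day ($432 / $0.05 per request), while reaching the 10 TPS ceiling would require $43,200 of shipped revenue ($4,320 per TPS).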
From: Amazon Product API - Troubleshooting
API Rates
Curious to know how we provision API call rates for Product Advertising API 5.0? First, some definitions:
TPS – Transactions per second, refers to the maximum number of API calls you can make in one second. Each API call counts as one transaction. For example, if you send 10 ASINs in the request parameter of a GetItems() call, it counts as a single transaction.
TPD – Transactions per day, refers to the maximum number of API calls you can make in one day. If Associate has 1 TPS and 8640 TPD, then maximum of 1 request can be sent per second and 8640 per day. Even if 1 TPS is there, once TPD is exhausted requests will be throttled.
Primary Account – This refers to the Amazon username (email address) and password that you used to create your Associates account and used to generate Product Advertising API 5.0 credentials.
Shipped revenue – This refers to the total sales volume of all items Amazon has shipped from orders resulting from clicks through links you created using Product Advertising API 5.0.
Also:
As soon as you create your Product Advertising API 5.0 credentials, you are allowed an initial usage limit up to a maximum of one request per second (one TPS) and a cumulative daily maximum of 8640 requests per day (8640 TPD) for the first 30-day period. This will help you begin your integration with the API, test it out, and start building links and referring products to your readers.
Your PA API usage limit will be adjusted based on your shipped item revenue. Your account will earn a usage limit of one TPD for every five cents or one TPS (up to a maximum of ten TPS) for every $4320 of shipped item revenue generated via the use of Product Advertising API 5.0 for shipments in the previous 30-day period. For correct attribution of shipped item revenue please ensure that you always call Product Advertising API 5.0 with the primary account credentials and retain all the URL parameters that the API returns in its response.
...
If you are trying to submit requests that exceed your account’s usage limit, or if your access has been revoked you will receive a 429 TooManyRequests error message from Product Advertising API 5.0. Please refer our API integration best practices to learn more on how to avoid these situations and optimally access the API.