We used Stripe for the payment system.
Imagine a user performs different actions in our system, and each one incurs a charge. We send these charges to Stripe by calling:
stripe.InvoiceItem.create()
Then we create and finalize the invoice for him by calling:
invoice=stripe.Invoice.create()
stripe.Invoice.finalize_invoice(invoice.id)
So if the user has to pay for 3 items:
item1 = 1
item2 = 2
item3 = 3
The finalized invoice will have an id, a total, etc., and:
total = 6
To check that every item sends the correct amount to Stripe, I'd like to verify the total.
To test our payment system I had to mock Stripe, but the mocked invoice total was always zero.
I mocked stripe.InvoiceItem.create, stripe.Invoice.finalize_invoice, and stripe.Invoice.create like this:
@patch("app_name.models.stripe.InvoiceItem.create")
@patch("app_name.models.stripe.Invoice.finalize_invoice")
@patch("app_name.models.stripe.Invoice.create")
def test_method(
    self,
    mock_create,
    mock_finalize,
    mock_invoice_item,
):
    response = MagicMock()
    # api_key and stripe_account from https://stripe.com/docs/api/connected_accounts
    response.api_key = "sk_test_MSc53AbTknQXPy"
    response.stripe_account = "acct_1032D82eZvKYlo2C"  # Stripe account ID
    # latest version here: https://stripe.com/docs/upgrades
    response.stripe_version = "2022-08-01"
    mock_invoice_item.return_value = response

    response = MagicMock()
    response.total = 20
    response.invoice_pdf = "https://google.com"
    response.id = "sk_test_MSc53AbTknQXPy"
    mock_create.return_value = response
    mock_finalize.return_value = response.id
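One way to make the mocked total reflect the created items (so it isn't always zero) is to give the mocks side_effects that accumulate the amounts passed in. This is only a sketch: fake_create_item, fake_finalize, and amounts are illustrative names, not part of the Stripe API.

```python
from unittest.mock import MagicMock

amounts = []  # every amount the code under test passes to InvoiceItem.create

def fake_create_item(**kwargs):
    amounts.append(kwargs.get("amount", 0))
    return MagicMock(**kwargs)

def fake_finalize(invoice_id):
    invoice = MagicMock()
    invoice.id = invoice_id
    invoice.total = sum(amounts)  # total now reflects the created items
    return invoice

mock_invoice_item = MagicMock(side_effect=fake_create_item)
mock_finalize = MagicMock(side_effect=fake_finalize)

# Simulate the application creating three items and finalizing the invoice:
for amount in (1, 2, 3):
    mock_invoice_item(amount=amount)
invoice = mock_finalize("in_test_123")
print(invoice.total)  # → 6
```

With this setup an assertion like `assert invoice.total == 6` catches an item that sends the wrong amount, which a hard-coded return_value cannot.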
Stripe might have a mocking feature, but it was not clear to me how to use stripe-mock in unit tests.
I really don't know how you're mocking the different Stripe functions, so it's hard to pinpoint the issue with the invoice total.
If you're thinking of using stripe-mock, the best way to handle unit testing is probably to do it in a stack-agnostic way: run the stripe-mock Docker container as described in the GitHub README, and create a proxy that routes any API call to the Docker URL instead of the actual API URL (https://api.stripe.com). This lets you run unit tests locally on your machine and with your preferred CI/CD.
With that being said, please bear in mind that there are some known limitations described in the README.
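As a concrete sketch of the routing idea, the official Python library can be pointed at a local stripe-mock directly via its api_base setting, with no separate proxy. This assumes stripe-mock is listening on its default port 12111; the key value is a placeholder, since stripe-mock accepts any test key.

```python
import stripe

# Route all API calls to a local stripe-mock container instead of
# https://api.stripe.com. 12111 is stripe-mock's default HTTP port.
stripe.api_base = "http://localhost:12111"
stripe.api_key = "sk_test_123"  # any test key is accepted by stripe-mock
```

After this, calls like stripe.Invoice.create() hit the mock server, so the same test code runs locally and in CI.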
Related
I'm writing a Django app and want to send out tokens using Web3 once Coinpayments sends me a callback about a successful payment. The problem is that Coinpayments sends multiple callbacks at once, and only in one case are the tokens sent; the other callbacks get a "replacement transaction underpriced" error. I've already tried solutions like adding +1 to the nonce or removing that parameter, but that doesn't help, because the transactions are still built with the same nonce. How can this be fixed, or what am I doing wrong?
class CoinpaymentsIPNPaymentView(BaseCoinpaymentsIPNView):
    def post(self, request, order_id, *args, **kwargs):
        status = int(request.POST.get('status'))
        order = Order.objects.get(id=order_id)
        order.status = request.POST.get("status_text")
        if not status >= 100:
            order.save()
            return JsonResponse({"status": status})
        amount = Decimal(request.POST.get('amount1'))
        record = Record.objects.create(
            user=order.user,
            type='i',
            amount=amount,
        )
        order.record = record
        order.save()
        gold_record = GoldRecord.objects.get(from_record=record)
        contract = w3.eth.contract(address=CONTRACT_ADDRESS, abi=ABI_JSON)
        transaction = contract.functions.transfer(order.user.wallet.address, int(gold_record.amount * 10 ** 18)).buildTransaction({
            'chainId': 1,
            'gas': 70000,
            'nonce': w3.eth.getTransactionCount(WALLET_ADDRESS)  # address where all tokens are stored
        })
        signed_tx = w3.eth.account.signTransaction(transaction, WALLET_PRIVATE_KEY)  # signing with the wallet's private key
        tx_hash = w3.eth.sendRawTransaction(signed_tx.rawTransaction)
        print(tx_hash.hex())
        tx_receipt = w3.eth.waitForTransactionReceipt(tx_hash)
        return JsonResponse({"status": status})
P.S. I've already asked this on Ethereum StackExchange, but nobody has answered or commented on it: https://ethereum.stackexchange.com/questions/80961/sending-tokens-out-on-coinpayments-success-payment-using-web3py
OK, let the web know the answer and solution I found out myself.
Each transaction should have a unique nonce. I noticed that if I send the transactions in a loop and set the nonce to w3.eth.getTransactionCount(WALLET_ADDRESS) + index, all of them are sent without any errors. So I removed the instant coin sending (I even removed waitForTransactionReceipt to speed things up) and made a management command that processes all payouts, assigning each one its tx_hash once it is sent successfully. I run it every 10 minutes with Heroku Scheduler.
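The nonce-offset idea above can be sketched as a small pure helper; build_transfer_params, its field names, and the example addresses are illustrative, not part of web3.py. In real code base_nonce would come from w3.eth.getTransactionCount(WALLET_ADDRESS).

```python
def build_transfer_params(base_nonce, recipients):
    """Give each queued transfer a unique nonce: the base nonce plus its index."""
    params = []
    for index, (to_address, amount) in enumerate(recipients):
        params.append({
            'to': to_address,
            'amount': amount,
            'nonce': base_nonce + index,  # unique per pending transaction
        })
    return params

# base nonce 7, three queued payouts:
txs = build_transfer_params(7, [('0xabc', 10), ('0xdef', 20), ('0x123', 30)])
print([tx['nonce'] for tx in txs])  # → [7, 8, 9]
```

Because every pending transaction now carries a distinct nonce, the node no longer treats later transfers as underpriced replacements of the first one.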
I have a non-profit website from which I need to send newsletter emails to perhaps a thousand people (let's be realistic and give an upper bound of at most 2000-2500 registered users).
I have implemented sending email this way:
@login_required
def SendEmail(request):
    users = Users.objects.all()
    receivers = [user.Email for user in users]
    emailTypeSelected = request.POST.get('email_type', -1)
    email_factory = EmailFactory()
    emailManager = email_factory.create_email(emailTypeSelected)
    emailManager.prepare("Some Title")
    emailManager.send_email_to(receivers)
    return render(request, 'new_user_email.html')
And here is the "abstract" class.
class Email(object):
    title = ""
    plain_message = ""
    html_message = ""

    def send_email_to(self, receivers):
        send_mail(
            self.title,
            self.plain_message,
            SENDER,
            receivers,
            html_message=self.html_message
        )
I have tested this code, and it takes a while to send one email to one user. My concern is that sending a thousand emails will put a big overhead on the server.
I was thinking to do the following:
Break the users into group of 100 and send email to those users every 30 minutes.
But I am not sure how this can be implemented. It seems I will need some sort of background tasks that are triggered independently and handle the email sending for me.
Is there a design pattern you are aware of for solving this problem?
I know the best way to do this is to use an external service that handles newsletter email and frees my server from the work, but as a non-profit website I am trying to minimise expenses, since I already have to pay for the server. So for the moment I am trying to implement this in-house, unless a big problem arises that forces me onto third-party services.
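The batching idea above (groups of 100, spaced out over time) can be sketched as a chunking helper that a scheduled job, for example a Django management command run by cron, calls once per interval. chunked, BATCH_SIZE, and send_newsletter are illustrative names, not Django APIs.

```python
BATCH_SIZE = 100

def chunked(items, size):
    """Yield successive fixed-size batches from a list."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def send_newsletter(email_manager, receivers):
    # Each send_email_to call gets at most BATCH_SIZE addresses; a scheduler
    # (cron, Heroku Scheduler, Celery beat) spaces the runs out instead of
    # the view sleeping in-process.
    for batch in chunked(receivers, BATCH_SIZE):
        email_manager.send_email_to(batch)

batches = list(chunked(list(range(250)), BATCH_SIZE))
print([len(b) for b in batches])  # → [100, 100, 50]
```

Moving the loop out of the request/response cycle and into a scheduled command is the main point: the view only records what should be sent, and the job drains it in batches.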
I want to make testing endpoints in a Google App Engine instance easier by using a script. Based on this information https://webapp-improved.appspot.com/guide/testing.html, I wrote a simple script like this:
import urllib

import webapp2

from main import app

params = {
    'top': 'true',
    'sort_by': 'most-recent',
}
query = urllib.urlencode(params)
url = '/activities?' + query
request = webapp2.Request.blank(url)
response = request.get_response(app)
But there is a problem: my handler has a security check to ensure only signed-in users can call the endpoint. It returns a status code of 401 (HTTPUnauthorized) if the cookie __auth__ is not set properly:
HTTPUnauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
Here is the relevant webapp2 configuration
'webapp2_extras.sessions': {
    'cookie_name': '__session__',
    'secret_key': login.secrets.SESSION_KEY,
    'cookie_args': {
        'max_age': 30 * 24 * 60 * 60
    }
},
'webapp2_extras.auth': {
    'cookie_name': '__auth__',
    # Use with 'remember' flag to make persistent cookies
    'token_max_age': 30 * 24 * 60 * 60,
    'user_attributes': []
},
I have tried to 'monkey-patch' my handler classes to bypass the check.
Here is the get function of my handler class ActivitiesHandler (it extends webapp2.RequestHandler)
def get(self):
    # Check if the user is logged in
    if self.secure and not self.logged_in:
        self.abort(401)
    # ...
My monkey-patch code goes like this:
ActivitiesHandler.secure = False
However, it does not take effect. I think this is because the app variable, having module scope, is already initialised along with all the handlers well before my monkey-patch code executes.
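One possible variant of the monkey patch is to apply it only for the duration of the test with mock.patch.object, which also restores the original value afterwards. This is a sketch under the assumption that secure is a plain class attribute; DummyHandler stands in for ActivitiesHandler here.

```python
from unittest.mock import patch  # on Python 2, use the standalone `mock` package

class DummyHandler(object):
    # Stand-in for ActivitiesHandler; 'secure' gates the auth check in get().
    secure = True

with patch.object(DummyHandler, 'secure', False):
    inside_value = DummyHandler.secure  # False while the patch is active

print(inside_value, DummyHandler.secure)  # → False True
```

If the handler reads secure at request time (as the get() above does), patching the class this way before driving the request through request.get_response(app) should bypass the check without permanently mutating the class.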
As an alternative, I want to programmatically create an __auth__ cookie for a valid User entity, but I cannot find relevant information on this.
Any pointer or sample code will be welcome.
This is the code I'm using; is there any way to make it run faster?
src_uri = boto.storage_uri(bucket, google_storage)
for obj in src_uri.get_bucket():
    f.write('%s\n' % (obj.name))
This is an example where it pays to use the underlying Google Cloud Storage API more directly, using the Google API Client Library for Python to consume the RESTful HTTP API. With this approach you can use request batching to retrieve the names of all objects in a single HTTP request (reducing per-request overhead), and use field projection with the objects.get operation (by setting &fields=name) to obtain a partial response, so that you aren't sending all the other fields and data over the network (or waiting for the backend to retrieve unnecessary data).
Code for this would look like:
from googleapiclient import discovery

def get_credentials():
    # Your code goes here... check out the oauth2client documentation:
    # http://google-api-python-client.googlecode.com/hg/docs/epy/oauth2client-module.html
    # Or look at some of the existing samples for how to do this
    pass

def get_cloud_storage_service(credentials):
    return discovery.build('storage', 'v1', credentials=credentials)

def get_objects(cloud_storage, bucket_name, autopaginate=False):
    result = []
    # Actually, it turns out that request batching isn't needed in this
    # example, because the objects.list() operation returns not just
    # the URL for each object but also its name. If it returned
    # just the URL, then that would be a case where we'd need such batching.
    projection = 'nextPageToken,items(name,selfLink)'
    request = cloud_storage.objects().list(bucket=bucket_name, fields=projection)
    while request is not None:
        response = request.execute()
        result.extend(response.get('items', []))
        if autopaginate:
            request = cloud_storage.objects().list_next(request, response)
        else:
            request = None
    return result

def main():
    credentials = get_credentials()
    cloud_storage = get_cloud_storage_service(credentials)
    bucket = # ... your bucket name ...
    for obj in get_objects(cloud_storage, bucket, autopaginate=True):
        print 'name=%s, selfLink=%s' % (obj['name'], obj['selfLink'])
You may find the Google Cloud Storage Python Example and other API Client Library Examples helpful in figuring out how to do this. There are also a number of YouTube videos on the Google Developers channel such as Accessing Google APIs: Common code walkthrough that provide walkthroughs.
So I have a Django app, which as part of its functionality makes a request (using the requests module) to another server. What I want to do is have a server available for unittesting which gives me canned responses to test requests from the Django app (allowing to test how Django handles the different potential responses).
An example of the code would be:
payload = {'access_key': key,
'username': name}
response = requests.get(downstream_url, params=payload)
# Handle response here ...
I've read that you can use SimpleHTTPServer to accomplish this, but I'm not sure how to use it to that end; any thoughts would be much appreciated!
Use the mock module.
from mock import patch, MagicMock

@patch('your.module.requests')
def test_something(self, requests_mock):
    response = MagicMock()
    response.json.return_value = {'key': 'value'}
    requests_mock.get.return_value = response
    …
    requests_mock.get.assert_called_once_with(…)
    response.json.assert_called_once()
Much more examples in the docs.
You don't need to (and should not) test the code that makes the request. You want to mock out that part and focus on testing the logic that handles the response.