How to detect how many GeoCode requests per day I'm making? - geocoding

I understand that the Google Geocoding API has a limit of 2,500 requests per day.
Is there a way that I could detect the number of requests that I am making?

Just log every request you make. Something like this:
var i = 0;
if (i < 2500) {
    // your geocoding request here
    i++;
}
// display i, or save it to a file so the count survives restarts
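Since the answer suggests saving the count to a file, here is a language-agnostic sketch of that idea in Python, with a counter that resets each day (the file layout and function name are invented for illustration):

```python
import json
import datetime
import pathlib

def bump_request_count(path):
    """Increment today's geocode-request counter stored in a small JSON file.
    Returns the count after incrementing; resets automatically on a new day."""
    today = datetime.date.today().isoformat()
    p = pathlib.Path(path)
    data = {"date": today, "count": 0}
    if p.exists():
        stored = json.loads(p.read_text())
        if stored.get("date") == today:
            data = stored
    data["count"] += 1
    p.write_text(json.dumps(data))
    return data["count"]
```

Call `bump_request_count()` once before each geocoding request and stop (or queue the request) once it returns 2,500.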

Related

Cloudfront serves cache at 3rd request

I'm writing tests to make sure my objects are cached by CloudFront. If an object is not yet cached, I expect the second request for it to be served from the cache. However, I noticed some objects are only served from the cache on the 3rd request.
Does anyone know why this happens? In production it's not an issue, but I'd like to understand it, and I have to adjust my test code accordingly.
A test looks like this. 10% of my requests break the loop at i = 1, the rest at i = 0:
const client = axios.create({ baseURL: myUrl }) // note: the axios option is baseURL, not baseUrl
let response = await client.get(myPath)
expect(response.headers['x-cache']).toMatch(/Miss/)
for (let i = 0; i < 20; i++) {
    response = await client.get(myPath)
    if (response.headers['x-cache'] && /Hit/.test(response.headers['x-cache'])) {
        console.log('number of requests ', i + 1, myPath)
        break
    }
}
expect(response.headers['x-cache']).toMatch(/Hit/)
I have read that requests to different edge locations are cached separately. However, the response header "x-amz-cf-pop" is always the same for me.

Reading cookies or sending info to Google Tag Manager

What is the correct way to send data to my Google Tag Manager?
I have a cookie notice that gives the user the opportunity to accept certain cookies (performance, marketing and analytics). So far I have this script to read the user's choice:
...
if (e.detail.performance) {
    setCookie('cookie_performance', e.detail.performance, 365);
}
if (e.detail.analytics) {
    setCookie('cookie_analytics', e.detail.analytics, 365);
}
if (e.detail.marketing) {
    setCookie('cookie_marketing', e.detail.marketing, 365);
}
...
However, this only sets a cookie in the user's browser. I'd like to know inside my Google Tag Manager container whether the user accepted the cookie.
I have read stuff about using the dataLayer, but I am stuck on configuring the triggers or tags inside my container.
Is it possible to send an event to my container whenever the user accepts a certain cookie?
Sure. If you go to the Variables section and click "New", one of the variable types you can choose is "1st Party Cookie".
As an aside, if the 365 in your code is the lifetime of your cookie in days (I assume it is, since 365 days is a year), be aware that on Safari and iOS the cookie lifetime will be capped at seven days due to the new version of their "Intelligent Tracking Prevention".
While using cookies works fine with GTM, the dataLayer is usually best practice (cookies come with a few caveats: the browser may not allow them, their size is limited, their number per domain is limited, etc.).
As mentioned before, there are built-in cookie variables within GTM, so you can reference a cookie in an if statement by using {{cookieVar_Name}} inside GTM code.
However, to answer your question about cluing GTM in on those cookies: one way is to send along a dataLayer.push event with the necessary data.
For example you could adapt your current code to push an event when those cookies are set:
...
if (e.detail.performance) {
    setCookie('cookie_performance', e.detail.performance, 365);
    window.dataLayer.push({
        event: 'performanceEvent',
        cookie_performance: true
    });
}
if (e.detail.analytics) {
    setCookie('cookie_analytics', e.detail.analytics, 365);
    window.dataLayer.push({
        event: 'analyticsEvent',
        cookie_analytics: true
    });
}
if (e.detail.marketing) {
    setCookie('cookie_marketing', e.detail.marketing, 365);
    window.dataLayer.push({
        event: 'marketingEvent',
        cookie_marketing: true
    });
}
...
At this point you could create a custom event trigger named, say, marketingEvent, and use it to fire a tag whenever that dataLayer.push happens, i.e. whenever consent is given.
In regards to reading the cookies of returning visitors, you could either use a custom HTML tag with a cookie-reading function:
// Retrieve a cookie's value by its key (name)
function getCookie(name) {
    var nameEQ = name + "=";
    var ca = document.cookie.split(';');
    for (var i = 0; i < ca.length; i++) {
        var c = ca[i];
        while (c.charAt(0) == ' ') c = c.substring(1, c.length);
        if (c.indexOf(nameEQ) == 0) return c.substring(nameEQ.length, c.length);
    }
    return null;
}
getCookie('cookie_marketing'); // note: the name must be passed as a string
Or store the cookie value in a built-in GTM cookie variable and write an if statement:
if ({{cookie_marketing}} == 'true') { // cookie values are strings
    // fire code here
}
Hope this helps get you on the right track.

fetch the retweets for the tweets using python

I have to fetch the retweets for a set of tweets and create a JSON file with the retweets, user id, etc. using a Python script. Kindly help me sort out this issue.
Thanks in advance!!
This task requires a few different pieces of knowledge, and since you ask in a general way, I reckon you need a script to run immediately; but be aware that setting up this process takes some time.
First, connect to the Twitter API:
from twython import Twython, TwythonError
APP_KEY = 'YOUR_APP_KEY'
APP_SECRET = 'YOUR_APP_SECRET'
twitter = Twython(APP_KEY, APP_SECRET)
Next, use a Twitter API call from Twython; you can find the full list here: https://twython.readthedocs.io/en/latest/api.html. The parameters are the same as the Twitter API's:
response = twitter.get_retweets(id=tweet_id, count=100)  # tweet_id is the numeric id of the tweet
Pagination
Each call to the API has a limit on how much it returns; for example, twitter.get_friends_ids is limited to 5,000 ids per call (https://dev.twitter.com/rest/reference/get/friends/ids). If you want more than 5,000, you have to use the cursor in the returned result (cursor == 0 in the returned JSON means there are no more results). The following is an example of handling the cursor:
# Start from -1, the initial cursor value
cur = -1
# Stop when there are no more results
while cur != 0:
    response = twitter.get_friends_ids(user_id=user_id, cursor=cur)
    # ... some code to handle the response ...
    cur = response["next_cursor"]
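To see the cursor loop end-to-end without touching the real API, here is a stdlib-only sketch with a stand-in fetch function (fake_get_friends_ids and its three pages are made up for illustration):

```python
def fake_get_friends_ids(user_id, cursor):
    """Stand-in for twitter.get_friends_ids: serves three pages, then cursor 0."""
    pages = {
        -1: {"ids": [1, 2], "next_cursor": 10},
        10: {"ids": [3, 4], "next_cursor": 20},
        20: {"ids": [5],    "next_cursor": 0},
    }
    return pages[cursor]

def collect_all_ids(fetch, user_id):
    """Same loop as above: keep following next_cursor until it is 0."""
    ids, cur = [], -1
    while cur != 0:
        response = fetch(user_id=user_id, cursor=cur)
        ids.extend(response["ids"])
        cur = response["next_cursor"]
    return ids
```

Swapping `fake_get_friends_ids` for the real Twython call gives you the full friends list in one function.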
API key
Keys are rate-limited per time window (https://dev.twitter.com/rest/public/rate-limits), so you need some code to rotate your keys automatically, or to wait out the window (a key that has hit its limit gets error code 429 back).
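If you decide to wait rather than rotate keys, the sleep duration can be derived from the x-rate-limit-reset response header (epoch seconds, per Twitter's rate-limiting docs). A minimal stdlib-only sketch; wiring it into a retry loop around the actual Twython call is left to you:

```python
import time

def seconds_until_reset(rate_limit_reset, now=None):
    """Given the x-rate-limit-reset header value (epoch seconds),
    return how long to sleep before retrying; never negative."""
    if now is None:
        now = time.time()
    return max(0, int(rate_limit_reset) - int(now))

# e.g. after a 429: time.sleep(seconds_until_reset(headers["x-rate-limit-reset"]))
```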
Response
The response from the API is in JSON format, which is easy to use: you access the data by selecting response[key], for example
response["ids"] or response["next_cursor"]
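Concretely, that key-based access looks like this (the payload below is a made-up fragment shaped like a cursor response, not real API output):

```python
import json

# made-up fragment shaped like a Twitter cursor response
raw = '{"ids": [101, 102, 103], "next_cursor": 0}'
response = json.loads(raw)

ids = response["ids"]             # the list of ids in this page
cursor = response["next_cursor"]  # 0 means there are no more pages
```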

Tweepy - Get All Followers For Account - Rate Limit Issues

Below is my working code to get the Twitter followers for certain accounts (#hudsonci in this case).
My issue is the time it takes to pull in all of these followers. This account has approximately 1,000 followers, and with the rate limiting restrictions I can only get 300 at a time, so it takes more than an hour to get all the followers for this account. I can imagine this will become a huge pain in the ass for large accounts.
I am looking for suggestions on how I can improve this. I feel like I am not taking full advantage of the pagination cursor, but I can't be sure.
Any help is appreciated.
#!/usr/bin/env python
# encoding: utf-8

import tweepy
import time

# Twitter API credentials
consumer_key = "mine"
consumer_secret = "mine"
access_key = "mine"
access_secret = "mine"

# authorize twitter, initialize tweepy
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_key, access_secret)
api = tweepy.API(auth)

def handle_errors(cursor):
    while True:
        try:
            yield cursor.next()
        except tweepy.TweepError:
            time.sleep(20 * 60)

for user in handle_errors(tweepy.Cursor(api.followers, screen_name='hudsonci').items()):
    print user.screen_name
As per the Twitter documentation you need to use the count parameter. For followers/ids:
"Specifies the number of IDs attempt retrieval of, up to a maximum of 5,000 per distinct request."
Note that the 5,000 maximum applies to followers/ids; for followers/list (which api.followers uses) count maxes out at 200, up from the default of 20. So adding count=200, or switching to IDs only, should help you.
You are getting 300 followers at a time because fetching full follower objects (as opposed to IDs only) has a default page size of 20. With 15 requests per window, that comes out to 300 followers per window.
Here are the docs for followers: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-list
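The 300-per-window figure is simple arithmetic; a tiny sketch to check it (15 requests per 15-minute window and the 20/200 page sizes are the documented figures for followers/list, to the best of my knowledge):

```python
def followers_per_window(requests_per_window, page_size):
    """How many follower objects one rate-limit window can yield."""
    return requests_per_window * page_size

# default page size of 20        -> 300 followers per 15-minute window
# count=200 (followers/list max) -> 3000 followers per window
```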

Can a facebook user opt out of the graph api?

I am using Spring Social to get a list of all my friends. However, it seems that the number of friends I get from the Facebook Graph API is not the same as the number of friends Facebook reports on my profile.
According to Facebook I have 175 friends, but using Spring Social Facebook I can only get 156. I know about the paging issues with Spring Social Facebook, so I am using the code below.
private Map<String, FacebookProfile> getFacebookProfiles()
{
    Connection<Facebook> connection = connectionRepository.findPrimaryConnection(Facebook.class);
    FriendOperations friendOperations = connection.getApi().friendOperations();

    // getFriendProfiles() only returns 100 friends at a time, so batch to get all of them
    List<FacebookProfile> friendProfiles = new ArrayList<>(250);
    final int batchSize = 100;
    int offset = 0;
    List<FacebookProfile> batch = friendOperations.getFriendProfiles(offset, batchSize);
    do
    {
        friendProfiles.addAll(batch);
        offset = offset + batchSize;
        batch = friendOperations.getFriendProfiles(offset, batchSize);
    } while (batch.size() != 0);

    Map<String, FacebookProfile> result = new HashMap<>(friendProfiles.size());
    for (FacebookProfile facebookProfile : friendProfiles)
    {
        result.put(facebookProfile.getId(), facebookProfile);
    }
    return result;
}
Either there is a bug in the Spring Social Facebook integration, or there is a way for Facebook users to say that they should not be returned as part of Graph API calls.
Can a Facebook user opt out of being returned as a friend from the Graph API?
Yes, precisely. Facebook users can opt out of third-party applications by going to Privacy Settings > Ads, Apps and Websites > Edit Settings > "How people bring your info to apps they use".
Friends who have edited those settings will not show up in your Graph API calls.