REST URI to get data from Commission Junction (CJ) fails - web-services

I am making REST calls to download reports from Commission Junction (CJ) but am not able to fetch the data. I am, however, able to manually download a report with the desired columns:
Account name, Campaign name, Ad group, Destination URL, Ad distribution, Impressions, Clicks, CTR, Average CPC, Spend, Avg. position.
The REST uri I am using is
https://commission-detail.api.cj.com/v3/commissions?date-type=posting&start-date=2013-03-14&end-date=2013-04-14&action-types=impression
Is this the right REST URI to get the data for the desired columns mentioned above? Please suggest.

I believe using v3 you can only get yesterday's data.
http://www.ericnagel.com/how-to-tips/commission-junction-web-services.html
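For what it's worth, here is a minimal sketch of issuing that request with Python's requests library. The assumption that CJ's v3 API takes the developer key in a plain Authorization header should be checked against your account's web-services documentation:

import requests

DEV_KEY = "your-cj-developer-key"  # hypothetical placeholder

resp = requests.get(
    "https://commission-detail.api.cj.com/v3/commissions",
    params={
        "date-type": "posting",
        "start-date": "2013-03-14",
        "end-date": "2013-04-14",
        "action-types": "impression",
    },
    headers={"Authorization": DEV_KEY},  # assumption: dev key goes in the Authorization header
)
resp.raise_for_status()
print(resp.text)  # v3 responses are XML

Note that, per the answer above, v3 may only return very recent data, so a month-wide date range like this one may come back empty even when the call itself succeeds.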

Related

How to specify dataset location using BigQuery API v0.27?

I am trying to figure out how to specify the dataset location in a BigQuery API query using v0.27 of the BigQuery API.
I have a dataset located in northamerica-northeast1 and the BigQuery API is returning 404 errors since this is not the default multi-regional location "US."
I am using the run_async_query method to execute my queries but, based on the documentation, I am unsure how to pass a location to it to make the query location-aware.
I have also tried to previously update my client instantiation like this:
def _get_client(self):
    bigquery.Client.SCOPE = (
        'https://www.googleapis.com/auth/bigquery',
        'https://www.googleapis.com/auth/cloud-platform',
        'https://www.googleapis.com/auth/drive')
    client = bigquery.Client.from_service_account_json(_KEY_FILE)
    if self._params['bq_data_location'].strip():
        client.location = self._params['bq_data_location']
    return client
However, it does not appear that this is the correct way to inform the BigQuery API of a dataset location.
For additional context, in my SQL that I am passing to the BigQuery API, I am already specifying the PROJECT_ID.DATASET_ID.TABLE_ID, however, this does not seem to be sufficient to find regional data.
Furthermore, I am making this request from Google App Engine using the CRMint open source data flow platform.
Can you please help me with an example of how location can be added to the BigQuery API for v0.27 so that the API does not return 404?
Thank you!
From the code sample it seems you're likely talking about google-cloud-bigquery 0.27, which was released in Aug 2017 and predates location support (as well as many other features).
Your best bet is to update that dependency to something more recent.
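For illustration, a minimal sketch of how a newer library version handles this, assuming google-cloud-bigquery 1.x (where query jobs accept an explicit location) and a hypothetical key-file path:

from google.cloud import bigquery

_KEY_FILE = 'service-account.json'  # hypothetical path, standing in for the one in the question
client = bigquery.Client.from_service_account_json(_KEY_FILE)

sql = 'SELECT COUNT(*) FROM `PROJECT_ID.DATASET_ID.TABLE_ID`'
# Pass the dataset's region explicitly so the job is routed correctly
job = client.query(sql, location='northamerica-northeast1')
rows = list(job.result())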

What to use to get data from OpenFIGI API to store into AWS or Snowflake?

I am looking for advice on what to use to get data from the OpenFIGI API and store it in AWS or Snowflake.
https://www.openfigi.com/api
I would like to build a form or input sheet where I can enter the POST /v2/search request format for OpenFIGI and send the API request.
I need help figuring out how to build that input and where to stage the returned data in AWS or Snowflake so I can join it to other tables.
Thanks for any help or quickstarts that may help me along.
Best,
PJ
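No answer was posted, but one possible shape for this, sketched with Python's requests and boto3 under stated assumptions (the /v2/search endpoint from the question, and a hypothetical S3 bucket as the staging area that Snowflake could then load from via an external stage):

import json
import boto3
import requests

def search_openfigi(query):
    # POST the search request; the body format follows the /v2/search docs
    resp = requests.post(
        "https://api.openfigi.com/v2/search",
        json={"query": query},
    )
    resp.raise_for_status()
    return resp.json()

def stage_to_s3(payload, key):
    # Stage the raw JSON in S3; Snowflake can then COPY INTO a table from an external stage
    boto3.client("s3").put_object(
        Bucket="my-staging-bucket",  # hypothetical bucket name
        Key=key,
        Body=json.dumps(payload),
    )

results = search_openfigi("IBM")
stage_to_s3(results, "openfigi/search/ibm.json")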

Not able to delete multiple campaigns using Postman from Eloqua

I have been trying to delete multiple campaigns from Eloqua at a time using Postman, but have not been able to. I don't see a reference in the documentation either: http://docs.oracle.com/cloud/latest/marketingcs_gs/OMCAB/index.html#Developers/RESTAPI/REST-API.htm%3FTocPath%3D%2520Application%2520API%7C_____0.
Please let me know if deleting the multiple campaigns is possible.
It is not possible.
The link you provided mentions it's outdated, and a redirection link was available: http://docs.oracle.com/cloud/latest/marketingcs_gs/OMCAC/rest-endpoints.html
Have a look at all the DELETE methods over there, and you will see that there is no provision for sending more than one id at a time.
Edit: You say you are using Postman. It is possible to perform repetitive tasks (like deleting multiple campaigns) with different parameters each time by using Collections.
Edit 2:
Create an environment,
type your URL with the id as a variable, e.g.: xyz.com/delete/{{id}} (Postman variables use double curly braces),
and send all the id values as a JSON or CSV data file. They have given a sample JSON; you would simply have to provide your ids inside an array, e.g.:
[
    {"id": 1},
    {"id": 2},
    {"id": 3}
]
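If scripting is an option, the same one-id-at-a-time pattern can also be sketched in Python. The endpoint path and auth format below are assumptions based on Eloqua's documented application API, so verify them against your instance:

import requests

BASE_URL = "https://secure.p01.eloqua.com"  # hypothetical pod URL for your instance
session = requests.Session()
session.auth = ("SiteName\\user.name", "password")  # Eloqua uses sitename\username basic auth

campaign_ids = [1, 2, 3]
for cid in campaign_ids:
    # The API accepts only a single id per DELETE, so loop over the ids
    resp = session.delete(f"{BASE_URL}/api/REST/2.0/assets/campaign/{cid}")
    print(cid, resp.status_code)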

Selecting Categories using API in Amazon MWS

I am developing an integration between a desktop application and Amazon MWS and need to be able to offer users a choice of categories to put the product they are listing into. My problem is that I can't find any way of programmatically getting the current categories from MWS using the API.
Additionally, once I have a category reference to use, I will need a way to pull in and add the category-specific XML child of ProductData (e.g. Home, Jewelry, Computers, etc.), but they don't seem to be linked in any well-defined way. For example, I can't say "if the chosen category is reference nnnnn, ask them to populate the Computers-specific ProductData" unless I write something myself to map them.
Has anyone else come across these problems and found a workable solution?
Any help appreciated...
I am currently exploring the option of limiting users to only selling products already listed on Amazon, but still can't figure out how to pull in the correct category-specific XML.
There are various product look-ups, but they all seem to work from either my SKU (which will not yet be there) or Amazon's ASIN (which I don't yet know).
You can use the Amazon Product Advertising API for this.
You have to create an account on the Amazon affiliate programme and obtain security credentials from it.
After that, go to the BrowseNode tree page, download the list of root categories, and save it to a file or database. From there you get each category name and its BrowseNodeId.
Then call the BrowseNodeLookup operation to get the child categories for a parent category.
Please follow this link:
http://docs.aws.amazon.com/AWSECommerceService/latest/DG/ProgrammingGuide.html
Code for calling the BrowseNodeLookup operation:
// Sign the request with your AWS credentials (SignedRequestHelper comes
// from the Product Advertising API sample code)
SignedRequestHelper helper =
    new SignedRequestHelper(appConfig["AWSAccessKey"], appConfig["AWSSecretKey"], appConfig["endpoint"]);
string url = helper.Sign("http://ecs.amazonaws.com/onca/xml?Service=AWSECommerceService&Operation=BrowseNodeLookup&BrowseNodeId=" + value + "&AssociateTag=beginners00-00&Version=2011-08-01");
HttpWebRequest request = WebRequest.Create(url) as HttpWebRequest;
// Get the response and read the XML, which lists the child browse nodes
using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string xml = reader.ReadToEnd();
}
You will also need to download the SignedRequestHelper class, which generates the HMAC signature.

A way to bypass the per-IP limit when retrieving profile pictures?

My app downloads all of the user's friends' pictures.
All the requests are of this kind:
https://graph.facebook.com/<friend id>/picture?type=small
But, after a certain limit is reached, instead of the picture I get:
{"error""message":"(#4) Application request limit reached","type":"OAuthException"}}
Currently, the only way I have found to prevent this is to change the server IP (manually).
Isn't there a better way?
For the record:
The limit is related to the Graph API only, and the graph.facebook.com/<user>/picture URL is a Graph API call that returns a redirect.
So, to avoid the daily limit, simply fetch all the image URLs from FQL, like:
SELECT uid, pic_small, pic_big, pic, pic_square FROM user WHERE uid = me() OR uid IN (SELECT uid2 FROM friend WHERE uid1 = me())
These are the direct URLs to the images, e.g.:
http://profile.ak.fbcdn.net/hprofile-ak-snc4/275716_1546085527_622218197_q.jpg
Don't store them, since they change continuously.
If it's needed for an online app, a better way is not to download those images at all but to use the online version. There are a couple of reasons for doing so:
Users change pictures (some frequently); do you need an updated version?
Facebook's servers are probably faster than yours, and friends' pictures are probably already cached in your users' browsers.
Update:
Since the limit you are hitting is the Graph API call limit, not image retrieval, another solution is to use the user's friends connection in the Graph API and specify picture in the fields argument, e.g.: https://graph.facebook.com/me/friends?fields=picture. This returns direct URLs for the friends' pictures, so you can make a single call to get everything needed to download the images for each user.
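As a sketch of that last approach in Python, assuming a valid user access token (the exact shape of the picture field varies by Graph API version):

import requests

ACCESS_TOKEN = "your-user-access-token"  # hypothetical token

resp = requests.get(
    "https://graph.facebook.com/me/friends",
    params={"fields": "picture", "access_token": ACCESS_TOKEN},
)
resp.raise_for_status()

for friend in resp.json().get("data", []):
    # In newer API versions the picture field is nested under data.url
    pic_url = friend["picture"]["data"]["url"]
    image = requests.get(pic_url).content  # direct CDN URL; don't store it long-term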