I am trying to use the Google Groups Migration API to add an entry to a Google Group. According to the documentation I use this URL:
https://www.googleapis.com/upload/groups/v1/groups/transend@googlegroups.com/archive?uploadType=media
I believe I am supplying the auth token correctly (I got past the HTTP 401 error), but now I am getting an HTTP 500 Internal Server Error. The JSON response says "Backend Error". My HTTP headers are:
Content-Length: 225
Content-Type: message/rfc822
The data that follows is as plain an RFC 822 message as I can make it:
From: jmckay9351@gmail.com
To: transend@googlegroups.com
Subject: forward test
MIME-Version: 1.0
Date: Mon, 22 Feb 2016 08:03:00 -0800
Content-Type: text/plain; charset="UTF-8"

This is the first line of the message.
I believe the group is set up correctly - it can receive messages via email from jmckay9351@gmail.com, just not via the API. Any suggestions for me?
The Groups Migration API can only be used with Google Apps (Google Workspace) accounts; it does not work with consumer googlegroups.com groups.
See Prerequisites:
Have a Google account and create an administrator. The API applies to Google Apps for Business, Education, Government, Reseller, and ISP accounts.
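If the group lives in a Google Apps domain, the same request succeeds. A minimal sketch of the upload call in Python (the domain, group address, and token are placeholders, and it assumes a token carrying the apps.groups.migration scope):

import requests

token = "ya29...."             # placeholder OAuth 2.0 access token
group = "mygroup@example.com"  # must be a group in a Google Apps domain

# RFC 822 requires a blank line between the headers and the body.
message = (
    "From: user@example.com\r\n"
    "To: mygroup@example.com\r\n"
    "Subject: forward test\r\n"
    "MIME-Version: 1.0\r\n"
    "Date: Mon, 22 Feb 2016 08:03:00 -0800\r\n"
    'Content-Type: text/plain; charset="UTF-8"\r\n'
    "\r\n"
    "This is the first line of the message.\r\n"
)

resp = requests.post(
    "https://www.googleapis.com/upload/groups/v1/groups/%s/archive" % group,
    params={"uploadType": "media"},
    headers={"Authorization": "Bearer %s" % token,
             "Content-Type": "message/rfc822"},
    data=message.encode("utf-8"),
)
print(resp.status_code, resp.text)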
We are catching a BigCommerce webhook event in our Google Cloud Run application. The request looks like:
Headers
host: abc-123-ue.a.run.app
AccountId: ABC
Content-Type: application/json
Password: Goodbye
Platform: BC
User-Agent: akka-http/10.1.10
Username: Hello
Content-Length: 197
Connection: keep-alive
Body
{"created_at":1594914374,"store_id":"1001005173","producer":"stores/gy68868uk5","scope":"store/product/created","hash":"139fab64ded23b3e1b8473ba24ab21bedd3f535b","data":{"type":"product","id":132}}
For some reason, this causes a 400 response from Google Cloud Run; the request never even seems to reach our application. All other endpoints work (including other POST requests).
Any ideas?
Edit
In the original post, I had the path in the host header. That was a mistake I made while writing this post, not the actual value passed to us. We can only inspect the request via RequestBin (I can't find the request values anywhere in the Google logs), so I was speculating on the host value and wrote it out incorrectly here.
Research so far...
So upon further testing, it seems that BigCommerce webhooks also fail to reach any Google Cloud Function we set up. As a workaround, I'm having Pipedream catch the webhook and forward the payload to our application; no problems there. The endpoint also works with mirrored payloads sent from local machines and from Zapier, which seems to rule out authentication errors.
We are running FastAPI on Google Cloud Run and the simplest possible function on Google Cloud Functions. This looks like an issue with how Google's serverless platforms and BigCommerce webhook events communicate with each other; I'm just not sure how...
Here are the headers we managed to capture on one of the only times a BigCommerce Webhook Event came through to our Google Cloud Function:
Content-Length: 197
Content-Type: application/json
Host: us-central1-abc-123.cloudfunctions.net
User-Agent: akka-http/10.1.10
Forwarded: for="0.0.0.0";proto=https
Function-Execution-Id: unes7v34vzyo
X-Appengine-Country: ZZ
X-Appengine-Default-Version-Hostname: f696ddc1d56c3fd66p-tp.appspot.com
X-Appengine-Https: on
X-Appengine-Request-Log-Id: 5f10e15c00ff082ecbb02ee3a70001737e6636393664646331643536633366643636702d7470000165653637393633633164376565323033383131366437343031613365613263303a36000100
X-Appengine-Timeout-Ms: 599999
X-Appengine-User-Ip: 0.0.0.0
X-Cloud-Trace-Context: a62207698d141465d0f38488492d088b/9870406606828581415
X-Forwarded-For: 0.0.0.0
X-Forwarded-Proto: https
Accept-Encoding: gzip
Connection: close
> host: abc-123-ue.a.run.app/bigcommerce/webhooks/
This is most likely the issue: a Host header must contain only the hostname, never the request path.
You can clearly see this will fail:
$ curl -IvH 'Host: pdf-2wvlk7vg3a-uc.a.run.app/foo' https://pdf-2wvlk7vg3a-uc.a.run.app
...
HTTP/2 400
However if you don't craft the Host header yourself, it will work.
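For comparison, a minimal sketch in Python (the service URL is a placeholder): if you let the HTTP client derive Host from the URL, it sends the hostname only, with no path, and Cloud Run accepts the request.

import requests

resp = requests.post(
    "https://abc-123-ue.a.run.app/bigcommerce/webhooks/",
    json={"scope": "store/product/created",
          "data": {"type": "product", "id": 132}},
)
# requests sends "Host: abc-123-ue.a.run.app" -- hostname only, no path.
print(resp.status_code)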
The AWS S3 PUT REST API docs lack a clear example of the Authorization string in the request syntax.
Request Syntax
PUT /Key+ HTTP/1.1
Host: Bucket.s3.amazonaws.com
x-amz-acl: ACL
Cache-Control: CacheControl
Content-Disposition: ContentDisposition
Content-Encoding: ContentEncoding
Content-Language: ContentLanguage
Content-Length: ContentLength
Content-MD5: ContentMD5
Content-Type: ContentType
Expires: Expires
x-amz-grant-full-control: GrantFullControl
x-amz-grant-read: GrantRead
x-amz-grant-read-acp: GrantReadACP
x-amz-grant-write-acp: GrantWriteACP
x-amz-server-side-encryption: ServerSideEncryption
x-amz-storage-class: StorageClass
x-amz-website-redirect-location: WebsiteRedirectLocation
x-amz-server-side-encryption-customer-algorithm: SSECustomerAlgorithm
x-amz-server-side-encryption-customer-key: SSECustomerKey
x-amz-server-side-encryption-customer-key-MD5: SSECustomerKeyMD5
x-amz-server-side-encryption-aws-kms-key-id: SSEKMSKeyId
x-amz-server-side-encryption-context: SSEKMSEncryptionContext
x-amz-request-payer: RequestPayer
x-amz-tagging: Tagging
x-amz-object-lock-mode: ObjectLockMode
x-amz-object-lock-retain-until-date: ObjectLockRetainUntilDate
x-amz-object-lock-legal-hold: ObjectLockLegalHoldStatus
Body
The docs show this request example further on...
PUT /my-image.jpg HTTP/1.1
Host: myBucket.s3.<Region>.amazonaws.com
Date: Wed, 12 Oct 2009 17:50:00 GMT
Authorization: authorization string
Content-Type: text/plain
Content-Length: 11434
x-amz-meta-author: Janet
Expect: 100-continue
[11434 bytes of object data]
But again, the doc does not give an example format for the auth string. I tried "AccessKeyID Secret" but that didn't work. I don't even see logical parameters in the request syntax for passing the two parts of the credential (the access key ID and the secret) anywhere in the examples!
Does anyone have a simple example of how to use PUT to add a .json file to S3 using the REST API? Preferably a screenshot of a Postman setup, to better explain where the values go (in the URL vs. as headers).
From the AWS docs here, it appears you cannot PUT to an S3 bucket using the REST API with a static credential string alone:
For authenticated requests, unless you are using the AWS SDKs, you have to write code to calculate signatures that provide authentication information in your requests.
This was a new concept to me; I've used token requests and sent keys in headers before when authenticating via REST APIs. It sounds like a more secure method of auth.
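The signature the docs refer to is AWS Signature Version 4. A minimal sketch in Python of signing a PUT of a .json object (the bucket, region, and credentials are placeholders; the AWS SDKs and Postman's built-in "AWS Signature" auth type do all of this for you):

import datetime
import hashlib
import hmac

import requests

access_key = "AKIA..."   # placeholder
secret_key = "wJalr..."  # placeholder
region = "us-east-1"
bucket = "my-bucket"
key = "data.json"
host = "%s.s3.amazonaws.com" % bucket
body = b'{"hello": "world"}'

now = datetime.datetime.utcnow()
amz_date = now.strftime("%Y%m%dT%H%M%SZ")
date_stamp = now.strftime("%Y%m%d")
payload_hash = hashlib.sha256(body).hexdigest()

# Step 1: the canonical request (method, URI, query string, canonical
# headers, signed header list, payload hash).
canonical_headers = ("host:%s\nx-amz-content-sha256:%s\nx-amz-date:%s\n"
                     % (host, payload_hash, amz_date))
signed_headers = "host;x-amz-content-sha256;x-amz-date"
canonical_request = ("PUT\n/%s\n\n%s\n%s\n%s"
                     % (key, canonical_headers, signed_headers, payload_hash))

# Step 2: the string to sign, scoped to date/region/service.
scope = "%s/%s/s3/aws4_request" % (date_stamp, region)
string_to_sign = ("AWS4-HMAC-SHA256\n%s\n%s\n%s"
                  % (amz_date, scope,
                     hashlib.sha256(canonical_request.encode()).hexdigest()))

# Step 3: derive the signing key by chaining HMAC-SHA256 over date,
# region, service, and the literal "aws4_request".
def _hmac(key_bytes, msg):
    return hmac.new(key_bytes, msg.encode(), hashlib.sha256).digest()

signing_key = _hmac(_hmac(_hmac(_hmac(("AWS4" + secret_key).encode(),
                                      date_stamp), region), "s3"),
                    "aws4_request")
signature = hmac.new(signing_key, string_to_sign.encode(),
                     hashlib.sha256).hexdigest()

# Step 4: the Authorization header the question was asking about.
authorization = ("AWS4-HMAC-SHA256 Credential=%s/%s, "
                 "SignedHeaders=%s, Signature=%s"
                 % (access_key, scope, signed_headers, signature))

resp = requests.put(
    "https://%s/%s" % (host, key),
    data=body,
    headers={"x-amz-date": amz_date,
             "x-amz-content-sha256": payload_hash,
             "Authorization": authorization,
             "Content-Type": "application/json"},
)
print(resp.status_code, resp.text)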
I have some images for which I need to issue an HTTP HEAD request in order to find out some details of the image.
When I go to the image URL in a browser, it loads without a problem.
When I attempt to get the header info via my code or via online tools, it fails.
An example URL is http://www.adorama.com/images/large/CHHB74P.JPG
As mentioned, I have used the online tool Hurl.it to try to obtain the HEAD response, but I am getting the same 403 Forbidden message that I get in my code.
I have tried adding many various headers to the Head request (User-Agent, Accept, Accept-Encoding, Accept-Language, Cache-Control, Connection, Host, Pragma, Upgrade-Insecure-Requests) but none of this seems to work.
A normal GET request via Hurl.it also fails with the same 403 error.
If it is relevant, my code is a C# web service running in the AWS cloud (just in case the Adorama servers have something against AWS that I don't know about). To test this, I also spun up an EC2 instance (a Linux box) and ran curl, which also returned the 403 error. Running curl locally on my personal computer returns the image data.
And just to head off the obvious suggestions: my code works successfully for many other websites; it is only this one that has an issue.
Any idea what is required for me to download the image headers and not get the 403?
Same problem here.
Locally it works smoothly; doing it from an AWS instance I get the very same problem.
I thought it was a DNS resolution problem (redirecting to a malfunctioning node), so I tried specifying the same IP address my client had resolved, but that didn't fix it.
My guess is that Akamai (the site is fronted by an Akamai CDN in this case) is blocking AWS. That is somewhat understandable: customers pay for CDN traffic, and by abusing it people can generate huge bills.
Connecting to www.adorama.com (www.adorama.com)|104.86.164.205|:80... connected.
HTTP request sent, awaiting response...
HTTP/1.1 403 Forbidden
Server: AkamaiGHost
Mime-Version: 1.0
Content-Type: text/html
Content-Length: 301
Cache-Control: max-age=604800
Date: Wed, 23 Mar 2016 09:34:20 GMT
Connection: close
2016-03-23 09:34:20 ERROR 403: Forbidden.
I tried that URL from Amazon and it didn't work for me either. wget did work from other servers that weren't on Amazon EC2, however. Here is the wget output on EC2:
wget -S http://www.adorama.com/images/large/CHHB74P.JPG
--2016-03-23 08:42:33-- http://www.adorama.com/images/large/CHHB74P.JPG
Resolving www.adorama.com... 23.40.219.79
Connecting to www.adorama.com|23.40.219.79|:80... connected.
HTTP request sent, awaiting response...
HTTP/1.0 403 Forbidden
Server: AkamaiGHost
Mime-Version: 1.0
Content-Type: text/html
Content-Length: 299
Cache-Control: max-age=604800
Date: Wed, 23 Mar 2016 08:42:33 GMT
Connection: close
2016-03-23 08:42:33 ERROR 403: Forbidden.
But from another Linux host it did work. Here is the output:
wget -S http://www.adorama.com/images/large/CHHB74P.JPG
--2016-03-23 08:43:11-- http://www.adorama.com/images/large/CHHB74P.JPG
Resolving www.adorama.com... 23.45.139.71
Connecting to www.adorama.com|23.45.139.71|:80... connected.
HTTP request sent, awaiting response...
HTTP/1.0 200 OK
Content-Type: image/jpeg
Last-Modified: Wed, 23 Mar 2016 08:41:57 GMT
Server: Microsoft-IIS/8.5
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
ServerID: C01
Content-Length: 15131
Cache-Control: private, max-age=604800
Date: Wed, 23 Mar 2016 08:43:11 GMT
Connection: keep-alive
Set-Cookie: 1YDT=CT; expires=Wed, 20-Apr-2016 08:43:11 GMT; path=/; domain=.adorama.com
P3P: CP="NON DSP ADM DEV PSD OUR IND STP PHY PRE NAV UNI"
Length: 15131 (15K) [image/jpeg]
Saving to: “CHHB74P.JPG”
100%[=====================================>] 15,131 --.-K/s in 0s
2016-03-23 08:43:11 (460 MB/s) - “CHHB74P.JPG” saved [15131/15131]
I would guess that the image provider is deliberately blocking requests from EC2 address ranges.
The outgoing requests resolve to different IP addresses in the two examples because of DNS-based routing by the CDN provider that Adorama uses.
Web servers may check particular fingerprint attributes to block automated bots. Here are a few things they can check:
GeoIP / IP address
Browser headers
User agent
Plugin info
Browser fonts
You can simulate browser headers and learn about some fingerprinting "attributes" here: https://panopticlick.eff.org
You can try to replicate how a browser behaves and inject similar headers and a matching user agent. Plain curl/wget are unlikely to satisfy those conditions; even tools like PhantomJS occasionally get blocked. There is a reason some people prefer tools like Selenium WebDriver that launch an actual browser.
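For example, a minimal sketch in Python that sends a browser-like HEAD request (the header set is just one plausible browser profile; whether it gets through depends on which attributes the CDN checks):

import requests

headers = {
    "User-Agent": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                   "AppleWebKit/537.36 (KHTML, like Gecko) "
                   "Chrome/49.0.2623.87 Safari/537.36"),
    "Accept": "image/webp,image/*,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.8",
    "Accept-Encoding": "gzip, deflate",
}
resp = requests.head("http://www.adorama.com/images/large/CHHB74P.JPG",
                     headers=headers)
print(resp.status_code, resp.headers.get("Content-Type"))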
I found another URL protected by AkamaiGHost that was blocked because of certain parts of the user agent. In particular, a user agent containing a link with an intact protocol was blocked.
Using curl -H 'User-Agent: some-user-agent' https://some.website I found the following results for different user agents:
Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:70.0) Gecko/20100101 Firefox/70.0: okay
facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php): 403
https ://bar (protocol broken by a space): okay
https://bar: 403
All I could find so far is this (downvoted) answer https://stackoverflow.com/a/48137940/230422 stating that colons (:) are not allowed in header values. That is clearly not the whole story here, since the Mozilla example also contains a colon, just not as part of a link.
I'd guess most web servers don't care and allow Facebook's bot and other bots that carry a contact URL in their user agent, but apparently AkamaiGHost does block it.
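A minimal sketch in Python of the same experiment (https://some.website stands in for the real URL, as above):

import requests

user_agents = [
    "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:70.0) Gecko/20100101 Firefox/70.0",
    "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)",
    "https ://bar",  # protocol broken by a space
    "https://bar",   # intact protocol -- the blocked case
]
for ua in user_agents:
    resp = requests.get("https://some.website", headers={"User-Agent": ua})
    print(resp.status_code, repr(ua))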
We are working on a product that uses the Azure Storage service for storing data.
We are using the Azure REST API from C++, with cURL to execute the REST requests.
Right now we are working on the list-blobs functionality, but it is failing with this error:
<?xml version="1.0" encoding="utf-8"?>
<Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate
the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:16cd7e3d-0001-0032-2dd6-6f2e4f000000
Time:2016-02-25T14:14:23.2377982Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request 'CyPhzsBdBCRRg2w157IYY4sIB23XwzKsfdAaUTVCAts=' is not the same as any computed signature. Server used following string to sign: 'GET
x-ms-date:Thu, 25 Feb 2016 14:16:20 GMT
x-ms-version:2015-02-21
/sevenstars/container2
comp:list
delimiter:/
maxresults:2
restype:container'
</AuthenticationErrorDetail></Error>
======================
Following is the Wireshark output that we observed:
GET /container2?comp=list&delimiter=/&maxresults=2&restype=container HTTP/1.1
Host: sevenstars.blob.core.windows.net
Accept: */*
x-ms-date:Thu, 25 Feb 2016 14:16:20 GMT
x-ms-version:2015-02-21
Authorization:SharedKey sevenstars:CyPhzsBdBCRRg2w157IYY4sIB23XwzKsfdAaUTVCAts=
HTTP/1.1 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Content-Length: 704
Content-Type: application/xml
Server: Microsoft-HTTPAPI/2.0
x-ms-request-id: 16cd7e3d-0001-0032-2dd6-6f2e4f000000
Date: Thu, 25 Feb 2016 14:14:22 GMT
...<?xml version="1.0" encoding="utf-8"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:16cd7e3d-0001-0032-2dd6-6f2e4f000000
Time:2016-02-25T14:14:23.2377982Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request 'CyPhzsBdBCRRg2w157IYY4sIB23XwzKsfdAaUTVCAts=' is not the same as any computed signature. Server used following string to sign: 'GET
======================
As per suggestions on the Microsoft forum, I ensured all parameters are set correctly (":" is used instead of "=" in the string to sign).
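For reference, here is a minimal sketch in Python of how the signature for this request should be computed (the account key is a placeholder); comparing the locally built string to sign against the one in AuthenticationErrorDetail is the quickest way to spot a mismatch:

import base64
import hashlib
import hmac

account = "sevenstars"
account_key = base64.b64decode("...")  # placeholder: the base64 account key

x_ms_date = "Thu, 25 Feb 2016 14:16:20 GMT"
x_ms_version = "2015-02-21"

# VERB, then the 11 standard header slots (all empty for this GET), then
# the canonicalized x-ms-* headers, then the canonicalized resource with
# each query parameter on its own line as name:value, sorted by name.
string_to_sign = (
    "GET\n" + "\n" * 11
    + "x-ms-date:%s\nx-ms-version:%s\n" % (x_ms_date, x_ms_version)
    + "/%s/container2\ncomp:list\ndelimiter:/\nmaxresults:2\nrestype:container" % account
)

signature = base64.b64encode(
    hmac.new(account_key, string_to_sign.encode("utf-8"),
             hashlib.sha256).digest()
).decode()
authorization = "SharedKey %s:%s" % (account, signature)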
Can you please let us know how we can resolve this issue?
Your help is much appreciated.
Thanks and regards,
Rahul Naik
My company is working on a new SharePoint site, which will use forms-based authentication (FBA) to let our customers log into the site for subscriber-specific content (downloads, license info, etc.).
All these customers are located in our CRM, NetSuite, which is where we want our customer care teams to update a customer's information and assign them to FBA roles (the roles are already added to groups in SharePoint).
To do this, I'm looking to create SOAP XML files that can be used by NetSuite's own development language, SuiteScript, which would send the SOAP request and then process the response.
For example: Using soapUI I'm constructing the following XML:
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:dir="http://schemas.microsoft.com/sharepoint/soap/directory/">
<soapenv:Header/>
<soapenv:Body>
<dir:GetUserInfo>
<dir:userLoginName>myUserName</dir:userLoginName>
</dir:GetUserInfo>
</soapenv:Body>
</soapenv:Envelope>
The problem is that my XML response, when executing this request in soapUI, is 403 FORBIDDEN - the raw response is:
HTTP/1.1 403 Forbidden
Cache-Control: private, max-age=0
Server: Microsoft-IIS/7.5
SPRequestGuid: 36264ce4-9702-44bb-9693-23852a5e0c99
X-SharePointHealthScore: 1
X-Forms_Based_Auth_Required: http://mySPserver/_layouts/login.aspx?ReturnUrl=/_layouts/Error.aspx&Source=%2f_vti_bin%2fusergroup.asmx
X-Forms_Based_Auth_Return_Url: http://ec2-devmoss1/_layouts/Error.aspx
X-MSDAVEXT_Error: 917656; Access denied. Before opening files in this location, you must first browse to the web site and select the option to login automatically.
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.4762
Date: Tue, 19 Jul 2011 19:25:47 GMT
Content-Length: 13
403 FORBIDDEN
I'm guessing I need to log in somehow using credentials within the XML, but how do I do that? I tried using this in my <soapenv:Header>...
<soapenv:Header>
<h:BasicAuth xmlns:h="http://soap-authentication.org/basic/2001/10/" SOAP-ENV:mustUnderstand="1">
<Name>user</Name>
<Password>password</Password>
</h:BasicAuth>
</soapenv:Header>
but then my Raw response becomes:
HTTP/1.1 400 Bad Request
Cache-Control: private
Server: Microsoft-IIS/7.5
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.4762
Date: Tue, 19 Jul 2011 19:43:12 GMT
Content-Length: 0
Can anyone advise on how to correctly form an XML SOAP call for this, or any, SharePoint web service method, or point me to an article/question (with answer) that explains it? I tried googling and looking through Stack Overflow (of course), but I just cannot find the information/solution I need.
(sorry for the really long question)
Kevin
In the warm, fuzzy .Net world...
Accessing webservices on a SharePoint site using FBA takes a little extra work.
In .Net, it's pretty simple. In fact, there's an MSDN article with code samples and all on precisely how to do that. In summary, you first call the Login method on Authentication.asmx, and then use the returned cookie in all subsequent web service requests.
Outside .Net
One dark and stormy night, I ventured out into the non-Microsoft world. No-man's land. Without the .Net-generated web service proxies, we were rolling our own SOAP messages to communicate with SharePoint web services.
Where's my cookie??
Without the .Net proxy, you can't use CookieContainer as the MSDN article suggests. Authentication.asmx's description for Login shows the following SOAP response:
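<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <LoginResponse xmlns="http://schemas.microsoft.com/sharepoint/soap/">
      <LoginResult>
        <!-- abridged; a successful FBA login, per the service description -->
        <CookieName>.ASPXAUTH</CookieName>
        <ErrorCode>NoError</ErrorCode>
      </LoginResult>
    </LoginResponse>
  </soap:Body>
</soap:Envelope>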
The response XML simply contains the authentication cookie's name. Where did the actual cookie go? GIMME MY COOKIE!!!
Getting the cookie
It turns out the cookie is sent in the SOAP header. If login is successful, the response's SOAP header will look something like this:
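HTTP/1.1 200 OK
Cache-Control: private
Set-Cookie: .ASPXAUTH=987A98...; path=/; HttpOnly
Content-Type: text/xml; charset=utf-8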
The Set-Cookie field above gives us the FBA cookie called .ASPXAUTH, with value 987A98.......
Using the cookie
To use the cookie for web service requests, we need to include it in the SOAP request header by adding a Cookie field:
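POST /_vti_bin/usergroup.asmx HTTP/1.1
Content-Type: text/xml; charset=utf-8
Cookie: .ASPXAUTH=987A98...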
You can include multiple cookies by separating the name/value pairs with semicolons. Once the .ASPXAUTH cookie is set, you can send the request and a response should be returned as normal.
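Putting it together, a minimal sketch in Python (the host and credentials are placeholders):

import requests

host = "http://mySPserver"  # placeholder

login = requests.post(
    host + "/_vti_bin/Authentication.asmx",
    data="""<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
      xmlns:sp="http://schemas.microsoft.com/sharepoint/soap/">
      <soapenv:Body>
        <sp:Login>
          <sp:username>myUserName</sp:username>
          <sp:password>myPassword</sp:password>
        </sp:Login>
      </soapenv:Body>
    </soapenv:Envelope>""",
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "http://schemas.microsoft.com/sharepoint/soap/Login"},
)

# Grab the FBA cookie (e.g. ".ASPXAUTH=987A98...") from the response header.
cookie = login.headers["Set-Cookie"].split(";")[0]

info = requests.post(
    host + "/_vti_bin/usergroup.asmx",
    data="""<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
      xmlns:dir="http://schemas.microsoft.com/sharepoint/soap/directory/">
      <soapenv:Body>
        <dir:GetUserInfo>
          <dir:userLoginName>myUserName</dir:userLoginName>
        </dir:GetUserInfo>
      </soapenv:Body>
    </soapenv:Envelope>""",
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "http://schemas.microsoft.com/sharepoint/soap/directory/GetUserInfo",
             "Cookie": cookie},
)
print(info.status_code)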
No-man's land behind ISA lines
SharePoint sites behind an ISA server with forms authentication can be handled similarly. The difference is that we have to get the authentication cookies from the ISA server instead of from SharePoint. That can be done by POSTing to the /CookieAuth.dll?Logon URL. I won't go over the details, but it shouldn't be hard to figure out the appropriate URL and query string to use.
I found this again after the original blog disappeared then reappeared. Added for posterity here in case the blog goes away.
New blog location.