Acquire SharePoint Document Library Level Security Users PowerShell

I am using SharePoint Web Services to gather site-level security users. SharePoint recognizes Active Directory security groups as users (not as groups). I gathered these pseudo-groups with a SOAP request (see below) against http://{site}/_vti_bin/Lists.asmx:
# Build the SOAP envelope for GetUserCollectionFromWeb
$uri = $context
$soap = '<?xml version="1.0" encoding="utf-8"?>'
$soap+= '<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
$soap+= '<soap:Body>'
$soap+= '<GetUserCollectionFromWeb xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/" />'
$soap+= '</soap:Body>'
$soap+= '</soap:Envelope>'
# POST the envelope and pull the User nodes out of the XML response
[xml]$WF = Invoke-RestMethod $uri -Credential $CRED -Method POST -ContentType "text/xml" -Body $soap
$Users = $WF.Envelope.Body.GetUserCollectionFromWebResponse.GetUserCollectionFromWebResult.GetUserCollectionFromWeb.Users.User
The above request succeeds at the site and subsite level, but not at the library, list, and document level.
Is there a web service that mirrors GetUserCollectionFromWeb at the library, list, and document level?
I'm beginning to rip my hair out trying to access this data.
Thanks.

You would use the Permissions web service, accessible at /_vti_bin/Permissions.asmx.
You can refer to the MSDN documentation on the Permissions service.
The GetPermissionCollection method will give you an XML fragment with members (by ID number) and permission masks.
You can use the People web service (/_vti_bin/People.asmx) to associate member ID numbers with actual people and groups.
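As a minimal, untested sketch of that first step, the same Invoke-RestMethod pattern from the question can be pointed at Permissions.asmx; the library name "Documents" below is an assumption, and objectType may be "web" or "list":
# Ask the Permissions service for a list's permission collection
$uri = "http://{site}/_vti_bin/Permissions.asmx"
$soap = '<?xml version="1.0" encoding="utf-8"?>'
$soap+= '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
$soap+= '<soap:Body>'
$soap+= '<GetPermissionCollection xmlns="http://schemas.microsoft.com/sharepoint/soap/directory/">'
$soap+= '<objectName>Documents</objectName>'
$soap+= '<objectType>list</objectType>'
$soap+= '</GetPermissionCollection>'
$soap+= '</soap:Body>'
$soap+= '</soap:Envelope>'
[xml]$resp = Invoke-RestMethod $uri -Credential $CRED -Method POST -ContentType "text/xml" -Body $soap
$Perms = $resp.Envelope.Body.GetPermissionCollectionResponse.GetPermissionCollectionResult.GetPermissionCollection.Permissions.Permission
The Permission nodes returned carry member IDs and permission masks, which you can then resolve to people and groups with the People service as described above.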

Related

How to access Amazon images with https (AWSECommerceService)

For each product on my website I have a page that promotes a few books from Amazon. I get the books using a query to AWSECommerceService from my web server. The XML I receive from Amazon contains a list of books with information such as title, price, image URL, etc. I use that information to generate my website page.
The image URLs provided by Amazon are all HTTP, while I need to publish them over HTTPS in order to avoid warnings for the page's visitors at the browser level. Just replacing HTTP with HTTPS doesn't work.
Example:
http://ecx.images-amazon.com/images/I/51tD0SDNMeL.SX166.jpg => OK
https://ecx.images-amazon.com/images/I/51tD0SDNMeL.SX166.jpg => ERR_CERT_COMMON_NAME_INVALID
Any suggestion?
I just found out that the same images can be accessed via HTTPS on a different amazon.com sub-domain:
Replacing 'http://ecx.images-amazon.com' with 'https://images-na.ssl-images-amazon.com' will generate a perfectly working URL.
The image in the example in my question can be successfully accessed via https at the following URL:
https://images-na.ssl-images-amazon.com/images/I/51tD0SDNMeL.SX166.jpg

AWS S3 gracefully handle 403 after getSignedUrl expired

I'm trying to gracefully handle the 403 returned when visiting an S3 resource via an expired URL. Currently it returns Amazon's XML error page. I have uploaded a 403.html resource and thought I could redirect to that.
The bucket resources are assets saved/fetched by my app. Still, following the docs, I set the bucket properties to serve the bucket as a static website and uploaded a 403.html to the bucket root. All public permissions are blocked, except public GET access to the resource 403.html. In the bucket's website settings I indicated 403.html as the error document. Visiting http://<bucket>.s3-website-us-east-1.amazonaws.com/some-asset.html redirects correctly to http://<bucket>.s3-website-us-east-1.amazonaws.com/403.html
However, when I use aws-sdk js/node and call the method getSignedUrl('getObject', params) to generate the signed URL, it returns a different host URL: https://<bucket>.s3.amazonaws.com/. Visiting expired resources from this method does not get redirected to 403.html. I'm guessing that since the host address is different, this is the reason it is not automatically redirecting.
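For context, a minimal sketch of the URL generation in question, using the v2 Node aws-sdk (the bucket, key, and expiry values are assumptions):
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
const url = s3.getSignedUrl('getObject', {
  Bucket: 'my-bucket',      // assumption: your bucket name
  Key: 'some-asset.html',   // assumption: the object key
  Expires: 60               // the URL starts returning 403 after 60 seconds
});
// url points at https://<bucket>.s3.amazonaws.com/... (the REST endpoint)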
I have also set up static website routing rules for condition
<Condition>
    <HttpErrorCodeReturnedEquals>403</HttpErrorCodeReturnedEquals>
</Condition>
<Redirect>
    <ReplaceKeyWith>403.html</ReplaceKeyWith>
</Redirect>
Still, that's not redirecting the signed URLs, so I'm at a loss as to how to gracefully handle these expired URLs. Any help would be greatly appreciated.
S3 buckets have 2 public-facing interfaces, REST and website. That is the difference between the two hostnames, and the difference in behavior you are seeing.
They have two different feature sets.
Feature            REST Endpoint     Website Endpoint
----------------   ---------------   -------------------------------------
Access control     yes               no, public content only
Error messages     XML               HTML
Redirection        no                yes: bucket-, rule-, and object-level
Request types      all supported     GET and HEAD only
Root of bucket     lists keys        returns index document
SSL                yes               no
Source: http://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteEndpoints.html
So, as you can see from the table, the REST endpoint supports signed URLs, but not friendly errors, while the website endpoint supports friendly errors, but not signed URLs. The two can't be mixed and matched, so what you're trying to do isn't natively supported by S3.
I have worked around this limitation by passing all requests for the bucket through HAProxy on an EC2 instance and on to the REST endpoint for the bucket.
When a 403 error message is returned, the proxy modifies the response body XML using the new embedded Lua interpreter, adding this before the <Error> tag.
<?xml-stylesheet type="text/xsl" href="/error.xsl"?>\n
The file /error.xsl is publicly readable, and uses browser-side XSLT to render a pretty error response.
The proxy also injects a couple of additional tags into the xml, <ProxyTime> and <ProxyHTTPCode> for use in the output. The resulting XML looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="/error.xsl"?>
<Error><ProxyTime>2015-10-13T17:36:01Z</ProxyTime><ProxyHTTPCode>403</ProxyHTTPCode><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>9D3E05D20C1BD6AC</RequestId><HostId>WvdkvIRIDMjfa/1Oi3DGVOTR0hABCDEFGHIJKLMNOPQRSTUVWXYZ+B8thZahg7W/I/ExAmPlEAQ=</HostId></Error>
Then I vary the output shown to the user with XSL tests to determine what error condition S3 has thrown:
<xsl:if test="//Code = 'AccessDenied'">
<p>It seems we may have provided you with a link to a resource to which you do not have access, or a resource which does not exist, or that our internal security mechanisms were unable to reach consensus on your authorization to view it.</p>
</xsl:if>
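An expired signature can be distinguished the same way; as a sketch, this assumes S3's usual "Request has expired" message text for expired presigned URLs:
<xsl:if test="//Message = 'Request has expired'">
<p>The link you followed has expired. Please request a fresh link and try again.</p>
</xsl:if>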
The final result is a friendly, browser-rendered error page: a general "Access Denied" when no credentials were supplied, and a distinct message when the signature has expired.
I don't include the HostId in the output, since it's ugly and noisy, and, if I ever need it, the proxy captured and logged it for me, and I can cross-reference to the request-id.
As a bonus, of course, running the requests through my proxy means I can use my own domain name and my own SSL certificate when serving up bucket content, and I have real-time access logs with no delay. When the proxy is in the same region as the bucket, there is no additional charge for the extra step of data transfer, and I've been very happy with this setup.

GWT interacting with RESTful architecture

I have a backend REST API and a front end in GWT. I'm on a network where my IP address is 192.168.1.4.
I've declared a constant URL as
public static final String DomainName="http://127.0.0.1/recess/restApp/";
With the IP 127.0.0.1 everything works fine, but when I change the IP to 192.168.1.4 the application does not work and I get a status code of 0.
The GWT app is in the web root, same as the REST API.
In fact, this can't be a same-origin policy problem because it's a web service, can it?
Does anyone have an idea about this?
I'm using XAMPP and the GWT project is in htdocs, same as the REST API.
It certainly sounds like you are running into a sandbox policy forbidding external connections.
One policy you may be running into is the GWT browser add-on connection limitation. In Chrome, add your IP addresses at Tools | Extensions | GWT Dev Plugin | Options. In Firefox, it is at Tools | Add-ons | Extensions | Google... | Options.
May I suggest that, instead of using hard-coded URLs, you conform to the same-origin policy by using the exact URL that loaded the web page. The web toolkit provides a mechanism for doing just that.
GWT.getHostPageBaseURL() is useful for prepending to paths of resources relative to the host page.
GWT.getModuleBaseURL() is useful for prepending to URLs intended to be module-relative.
http://google-web-toolkit.googlecode.com/svn/javadoc/2.3/com/google/gwt/core/client/GWT.html#getHostPageBaseURL()
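As a hedged illustration, here is how that combines with GWT's standard RequestBuilder (the "recess/restApp/" path is taken from the question; imports from com.google.gwt.http.client are assumed):
// Build the service URL from wherever the host page was loaded,
// so the app works on 127.0.0.1, 192.168.1.4, or a real domain alike.
String base = GWT.getHostPageBaseURL(); // e.g. "http://192.168.1.4/"
RequestBuilder rb = new RequestBuilder(RequestBuilder.GET, base + "recess/restApp/");
try {
    rb.sendRequest(null, new RequestCallback() {
        public void onResponseReceived(Request req, Response resp) {
            // use resp.getText() here
        }
        public void onError(Request req, Throwable e) {
            // connection failed or timed out
        }
    });
} catch (RequestException e) {
    // the request could not be initiated
}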
I'm afraid this is a Same-Origin Policy violation: if you ask your browser to load some URL using XMLHttpRequest (or, in GWT, a RequestBuilder), then that URL has to be on the same origin as the page trying to access it, unless the browser supports CORS (all except IE, at least until IE10 is out the door) and the server you're trying to reach sends the appropriate HTTP headers to allow the cross-origin connection.
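For reference, the server side of CORS is just a couple of response headers; a minimal sketch (the origin value is an assumption matching the question's setup):
Access-Control-Allow-Origin: http://192.168.1.4
Access-Control-Allow-Methods: GET, POST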

Calling a custom SharePoint web service from ASP.Net AJAX gives a 403 error?

I have developed a custom SharePoint web service and deployed it to /_vti_bin/myservice.asmx. As a "regular" user, browsing to that ASMX URL works fine. But when I browse to "/_vti_bin/myservice.asmx/js", as required to call this service from ASP.Net AJAX, I get a 403 unless I browse to it as a farm admin (site collection admin doesn't work). It is entirely possible that the farm admin's role as a local server admin is what allows it to work.
This is my web service class:
[WebService(Namespace = "http://sharepointservices.genericnamespace.com/")]
[WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
[System.ComponentModel.ToolboxItem(false)]
[System.Web.Script.Services.ScriptService]
public class ApprovalSvc : System.Web.Services.WebService
{
    [WebMethod]
    [ScriptMethod(ResponseFormat = ResponseFormat.Xml)]
    public XmlDocument GetInboxItems(string inboxName, string s_Id)
    {
        // code removed
    }
}
This is the part of my web part code where I hook up the ASP.Net AJAX stuff:
ScriptManager scriptMgr = new ScriptManager();
string webUrl = SPContext.Current.Web.Url;
ServiceReference srvRef = new ServiceReference(webUrl + "/_vti_bin/ApprovalSvc.asmx");
scriptMgr.Services.Add(srvRef);
this.Controls.Add(scriptMgr);
If I'm logged in as a farm/server admin, it works. Otherwise, no. The web service assembly is in the GAC and listed in SafeControls. Any ideas?
Good old Process Monitor to the rescue.
The facts:
The service code DLL is in the web application's bin directory, as it cannot be signed because it references unsigned DLLs.
The request for the service DLL is coming from ASP.Net and not SharePoint (specifically, from an HttpModule in the System.Web.Extensions assembly).
The solution:
Because the request didn't come through SharePoint, and identity impersonation is also turned on by default, the default NTFS permissions on the web app's BIN directory were not good enough: the user's account had no access to the BIN directory or the DLLs within it.
We gave NT AUTHORITY\Authenticated Users Read access (not Read & Execute, not List Folder Contents, just Read) to the folder, and all is well.
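If you prefer to script that ACL change, something like the following should do it; the path is an assumption, so point it at your web application's actual BIN directory:
icacls "C:\inetpub\wwwroot\wss\VirtualDirectories\80\bin" /grant "NT AUTHORITY\Authenticated Users":R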

Login failed when a web service tries to communicate with SharePoint 2007

I created a very simple web service in ASP.NET 2.0 to query a SharePoint 2007 list, like this:
namespace WebService1
{
    /// <summary>
    /// Summary description for Service1
    /// </summary>
    [WebService(Namespace = "http://tempuri.org/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    [System.ComponentModel.ToolboxItem(false)]
    // To allow this Web Service to be called from script, using ASP.NET AJAX, uncomment the following line.
    // [System.Web.Script.Services.ScriptService]
    public class Service1 : System.Web.Services.WebService
    {
        [WebMethod]
        public string HelloWorld()
        {
            return "Hello World";
        }

        [WebMethod]
        public string ShowSPMyList()
        {
            string username = this.User.Identity.Name;
            return GetList();
        }

        private string GetList()
        {
            string result = "";
            SPSite siteCollection = new SPSite("http://localhost:89");
            using (SPWeb web = siteCollection.OpenWeb())
            {
                SPList mylist = web.Lists["MySPList"];
                SPQuery query = new SPQuery();
                query.Query = "<Where><Eq><FieldRef Name=\"AssignedTo\"/><Value Type=\"Text\">Ramprasad</Value></Eq></Where>";
                SPListItemCollection items = mylist.GetItems(query);
                foreach (SPListItem item in items)
                {
                    result = result + SPEncode.HtmlEncode(item["Title"].ToString());
                }
            }
            return result;
        }
    }
}
This web service runs well when tested using the built-in server of Visual Studio 2008. The username is exactly my domain account (domain\myusername).
However, when I create a virtual folder to host and launch this web service (still located on the same machine as SP2007), I get the following error when invoking the ShowSPMyList() method, at the line that executes OpenWeb(). These are the details of the error:
System.Data.SqlClient.SqlException: Cannot open database "WSS_Content_8887ac57951146a290ca134778ddc3f8" requested by the login. The login failed.
Login failed for user 'NT AUTHORITY\NETWORK SERVICE'.
Does anyone have any idea why this error happens? Why does the web service run fine inside Visual Studio 2008, but not when running stand-alone? I checked and in both cases, the username variable has the same value (domain\myusername).
Thank you very much.
Thank you very much for the replies. I'll look into the documents to see how I can change the settings related to the application pool as suggested.
I want to make clear that I wanted to build a web service that runs outside of SharePoint (but can be deployed on the same server as SharePoint).
Is there any way I can programmatically pass credentials (another domain account instead of the default 'NT AUTHORITY\NETWORK SERVICE') to SharePoint when invoking the OpenWeb method? I believe if I'm able to do that, I can work around the security issue above.
When you create your own custom virtual folder and set it up inside IIS, it is highly possible that the user account that runs the application pool for that particular IIS virtual directory is currently set to NT AUTHORITY\NETWORK SERVICE.
You can check carefully by looking closely at which application pool runs that particular IIS virtual directory.
From there, you can go to the "Application Pools" folder, right-click, and choose Properties. Select the "Identity" tab, and it will show you which user account is currently running the application pool.
Alternatively, you can refer to the SharePoint SDK for something similar to ExtractCrmAuthenticationToken in Dynamics CRM, to extract the authentication token ticket.
Or, alternatively, you can use a NetworkCredential to embed your own custom user ID and password.
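As a hedged sketch, the NetworkCredential approach applies when your external service calls SharePoint's own web services (for example Lists.asmx) rather than the server object model; the proxy class and account below are assumptions:
// "ListsService" is an assumed Visual Studio web-reference proxy
// generated from http://localhost:89/_vti_bin/Lists.asmx
ListsService.Lists lists = new ListsService.Lists();
lists.Url = "http://localhost:89/_vti_bin/Lists.asmx";
lists.Credentials = new System.Net.NetworkCredential("myusername", "mypassword", "MYDOMAIN");
System.Xml.XmlNode items = lists.GetListItems("MySPList", null, null, null, null, null, null);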
Hope this helps,
hadi teo
I fully agree with Hadi. If this is something you want to just quickly test, for a proof of concept, you can change the credentials that the application pool runs under to a user that has permissions, or you could use the identity impersonate setting in your config file.
However, resist the temptation to do this in a production environment; use the proper authentication. It will come back to bite you.
If you need to set this up for production, there are a couple of areas that you want to look at: duplicate SPNs and delegation are probably the most common things that are not configured correctly. Your error, however, points to impersonation not happening.
Also make sure you are deploying the web service to its own web site that does not already run SharePoint. If you want the web service to run on the same web site as SharePoint, read Creating a Custom Web Service.
You can check which application pool identity SharePoint is using by following the same instructions that Hadi gives, but for an app pool running SharePoint. Make sure to change only the application pool used by your web service and not SharePoint's, or else other permission errors could occur within SP. (There should be no reason to, but if you are interested in changing the app pool identity used by SharePoint, follow these instructions.)
One solution would be to "impersonate" the SharePoint system account using the following code:
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // also dispose SPSite
    using (SPSite siteCollection = new SPSite("http://localhost:89"))
    {
        using (SPWeb web = siteCollection.OpenWeb())
        {
            // ...
        }
    }
});