Error while deploying SharePoint 2013 timer job: The EXECUTE permission was denied on the object 'proc_putObjectTVP', database 'MSSQL', schema 'dbo'

While trying to create a custom SharePoint timer job at feature activation I got the following error from the log files:
System.Data.SqlClient.SqlException (0x80131904): The EXECUTE permission was denied on the object 'proc_putObjectTVP', database 'MSSQL', schema 'dbo'.
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
   at System.Data.SqlClient.Sql... 5c6d109c-dbc6-e02e-7ae4-010d7f559e0b
To make it work, I located the stored procedure proc_putObjectTVP and granted execute permission to the site app pool account. It worked as desired.
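For reference, the grant looked roughly like this (a sketch; the database name comes from the error message and DOMAIN\AppPoolAccount is a placeholder for your app pool identity):

-- Run against the database named in the error message.
USE [MSSQL];
GO
-- Grant execute on the stored procedure to the app pool account.
GRANT EXECUTE ON [dbo].[proc_putObjectTVP] TO [DOMAIN\AppPoolAccount];
GO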
My questions are:
Is this a bug in SharePoint 2013?
Is this the proper way to fix it? (In a production environment the server administrator may not allow me to perform such operations.)

I had a similar error in the event log for the account used for SharePoint 2013 services:
Insufficient SQL database permissions for user 'Name: XXXXX\SP_Services SID: xxxxxxxxxxxxxxx ImpersonationLevel: None' in database 'XXXX_Config' on SQL Server instance 'XXXXXXXXX'. Additional error information from SQL Server is included below.
The EXECUTE permission was denied on the object 'proc_putObjectTVP', database 'XXXX_Config', schema 'dbo'.
Googling around, lots of blog posts recommend the same approach of granting execute permission directly on the stored procedure. Personally I didn't like this approach; however, I eventually found a TechNet post which grants the required permission by adding the stored procedure to the securables of the WSS_Content_Application_Pools role.
Using SQL Server Management Studio, do the following:
1. Expand Databases, then expand the SharePoint_Config database.
2. Expand Security -> Roles -> Database Roles.
3. Find the WSS_Content_Application_Pools role, right-click it, and select Properties.
4. Click Securables and click Search.
5. Select Specific objects and click OK.
6. Click Object Types, select Stored Procedures, and click OK.
7. Add the stored procedure proc_putObjectTVP and click OK (if it is not automatically granted Execute permission, tick the Execute checkbox and save).
Using this method any new accounts added to the WSS_Content_Application_Pools role will have the correct rights preventing the problem cropping up again.
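If you prefer to script it rather than click through SSMS, the equivalent grant is a one-liner (a sketch; run it against the configuration database the role lives in):

USE [SharePoint_Config];
GO
-- Grant execute on the stored procedure to the database role, so every
-- member of WSS_Content_Application_Pools inherits the permission.
GRANT EXECUTE ON [dbo].[proc_putObjectTVP] TO [WSS_Content_Application_Pools];
GO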

The SPDataAccess role in SharePoint_Config was already configured to execute proc_putObjectTVP in my install of SharePoint 2013 (which has been a trial by fire in getting used to SQL Server 2012). Making sure my SharePoint accounts were members of that role seems to have done the trick (and, of course, brought up more errors to debug, now that more things are successfully starting).
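Adding an account to that role can be scripted too (a sketch; DOMAIN\SP_Services is a placeholder for your service account, and ALTER ROLE ... ADD MEMBER is the SQL Server 2012+ syntax):

USE [SharePoint_Config];
GO
-- Make the service account a member of the SPDataAccess role.
ALTER ROLE [SPDataAccess] ADD MEMBER [DOMAIN\SP_Services];
GO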
SPDataAccess (also written as SP_DATA_ACCESS) has been a useful role to Google for, bringing up tons of good resources and tips to fix one problem or another. I'll be reading blogs all night. I suspect configuring databases is old hat for quite a few SharePoint admins and devs, but it's not well explained, particularly as the wizard does so much (and so little) for you.
I signed up for Safari Books just to access http://my.safaribooksonline.com/book/programming/microsoft-sharepoint/9781118655047 and books like it. It's useful to help me "think like SharePoint", though Google has been just as much help. (More, really.)

Related

GCP: Is it possible to have access to a resource without having project access?

This is my first experience with Google Cloud Platform and I'm confused.
I've been given access to a resource:
xxx@gmail.com has granted you the following roles for resource resource_name (projects/project_name/datasets/ClientsExport/tables/resource_name): BigQuery Data Editor
But if I open the BigQuery editor, I don't see project_name or resource_name. Searching for resource_name also returns no results.
This is the only access I have in the project (I didn't receive any other access grants or emails).
Could you please help me with this? Do I need some additional access before resource_name becomes visible, or is there another way to find the resource?
Thank you in advance!
The message says you have access to the BigQuery data inside a table. You can query it from your own project; you are authorised to read it (and to write as well, because you are a Data Editor).
However, this table isn't in your project, it's in another project; that's why you don't see it directly in the BigQuery console. In addition, you don't have the right to read the metadata (roles/bigquery.metadataViewer) on the dataset of the other project. As a result, you can't view the table schema in the console either, but the bq CLI allows you to view it.
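Even though the table doesn't show up in your resource tree, you can still query it by its fully qualified name (a sketch; the path is taken from the grant message above):

-- Query the shared table directly from your own project; the table
-- won't appear in the console tree, but the query is authorised.
SELECT *
FROM `project_name.ClientsExport.resource_name`
LIMIT 10;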
I had some discussions with the Google BigQuery team about this (because I ran into the same issue at my company), and updates should happen by the end of the year (or early in 2022) to fix this "view" issue in the console.
It looks like you have IAM permission to access a specific resource in BigQuery but cannot access it from the GUI.
Some reasons you may not see access on your GUI:
You have permission to interact with BigQuery but don't have access to any of the data.
You aren't a member of the organization which provided the resource, and it has higher-level (org-level) permissions that prevent sharing resources outside the org.
Your access is restricted to the command line/app level. (If your account is a service account then this is likely the case.)

WSO2 IS - POST_DELETE_USER error while deleting user from IS

We have installed WSO2 AM 2.6.0 with IS as Key Manager (5.7). We deployed AM as an active-active all-in-one instance and IS as KM active-active too, following all the directives in the official documentation.
Based on the documentation, we created the following databases with their respective datasources: regdb (registry), carbondb, userdb (user store), mb-store, apimdb.
The issue we have now is on the IS side. We tried several things to check that everything was working correctly, like creating users, checking registry access, etc. We created a user called "test", changed some properties, and after that we proceeded to delete the user.
When we deleted the user, we got the following popup on the IS console:
Checking the logs we find the following:
Caused by: org.postgresql.util.PSQLException: ERROR: relation "cm_receipt" does not exist
Position: 135
TID: [-1234] [] [2020-05-11 09:00:30,062] ERROR {org.wso2.carbon.user.mgt.ui.UserAdminClient} - Error when handling event : POST_DELETE_USER
org.wso2.carbon.user.mgt.stub.UserAdminUserAdminException: UserAdminUserAdminException
We checked the database and the user was deleted correctly, and the IS carbon console no longer displays it, so the user was indeed deleted. Digging a little deeper, the delete-user process is trying to access the table "cm_receipt" in carbondb, but the table exists in apimdb.
On postgres side, we have this log during the delete:
<2020-05-08 11:49:50.452 -03:172.19.35.21(45740):wso2carbon#carbondb:[12476]:>ERROR: relation "cm_receipt" does not exist at character 135
<2020-05-08 11:49:50.452 -03:172.19.35.21(45740):wso2carbon#carbondb:[12476]:>STATEMENT: SELECT R.CONSENT_RECEIPT_ID, R.LANGUAGE, R.PII_PRINCIPAL_ID, R.PRINCIPAL_TENANT_ID, R.STATE,RS.SP_DISPLAY_NAME,RS.SP_DESCRIPTION FROM CM_RECEIPT R INNER JOIN CM_RECEIPT_SP_ASSOC RS ON R.CONSENT_RECEIPT_ID=RS.CONSENT_RECEIPT_ID WHERE PII_PRINCIPAL_ID LIKE $1 AND PRINCIPAL_TENANT_ID =$2 AND SP_NAME LIKE $3 AND STATE LIKE $4 ORDER BY ID ASC LIMIT $5 OFFSET $6
Have you got any idea why this could be happening? Is there a related bug or something?
Thanks!
There could be two reasons for this.
You forgot to execute the DB script that contains the consent management tables: /wso2is-5.7.0/dbscripts/consent/postgresql.sql.
Your wso2is-5.7.0/repository/conf/consent-mgt-config.xml configuration file is referring to the wrong datasource.
Solution
Check which datasource the consent-mgt-config.xml file refers to. By default it looks like this:
<ConsentManager xmlns="http://wso2.org/carbon/consent/management" xmlns:svns="http://org.wso2.securevault/configuration">
    <DataSource>
        <!-- Include a data source name (jndiConfigName) from the set of data sources defined in master-datasources.xml -->
        <Name>jdbc/WSO2IdentityDB</Name>
    </DataSource>
</ConsentManager>
Here it's jdbc/WSO2IdentityDB. Then go to your wso2is-5.7.0/repository/conf/datasources/master-datasources.xml file and check the database behind that datasource. If the mentioned tables are not created in that database, you can execute the above-mentioned postgresql.sql script against it. (If you've already created these tables in a different datasource, you might want to change the datasource defined in the consent-mgt-config.xml file.)
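To confirm which database actually contains the consent tables, a quick probe helps (a sketch for PostgreSQL; run it against each candidate database, e.g. carbondb and apimdb):

-- Returns the table's OID if cm_receipt is visible in the current
-- database/search path, or NULL if it doesn't exist there.
SELECT to_regclass('cm_receipt');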
P.S. Never use the -Dsetup argument for automatic execution of database scripts at startup. Always execute the database scripts against the database manually.
P.S. The reason the user deletion itself succeeds is that the consent removal runs as a POST_DELETE_USER event; a failure in a POST handler won't affect the action itself.

APEX_ADMINISTRATOR_ROLE in AWS RDS Oracle Instance

I am trying to install APEX on my AWS Oracle 12 RDS instance. To achieve this, I am following these instructions: http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.APEX.HTML
However, I got stuck at step 7:
Step 7:
You must set a password for the APEX admin user. To do this, use
SQL*Plus to connect to your DB instance as the master user, and then
issue the following commands:
grant APEX_ADMINISTRATOR_ROLE to master;
@/home/apexuser/apex/apxchpwd.sql
Replace master with your master user name. When the apxchpwd.sql script prompts you, type a new admin password.
When I log into my RDS instance with my master user and execute this:
grant APEX_ADMINISTRATOR_ROLE to [mymasteruser];
I received this error:
ERROR at line 1:
ORA-01924: role 'APEX_ADMINISTRATOR_ROLE' not granted or does not exist
Can you please help me to solve this?
Edit 12/09/2017.
Using this post/answer:
https://serverfault.com/questions/276541/how-do-you-recover-you-rds-master-user-username
I understand my master user is the one shown in the following image. As far as I know, in an RDS instance I have no access to the sys or system users, so this is the only user I can use.
Many thanks
Edit 20/09/2017.
I applied Alex's solution, and it works! However, a few issues worth commenting on:
The tutorial was changed; in fact the URL changed, and is now
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.Oracle.Options.APEX.html (the trailing "html" was uppercase before).
But it is not reliable now; there are some points that should be fixed. For example, it now says RDS supports Oracle APEX version 5.1.2; I tried that version and got an error. Also, some directories don't match the previous steps.
So I used the version the tutorial originally specified: Oracle APEX version 4.2.6.v1.
I had to execute both statements :
EXEC rdsadmin.rdsadmin_util.grant_apex_admin_role;
grant APEX_ADMINISTRATOR_ROLE to [master];
Then I could execute the apxchpwd.sql script successfully!
But, unfortunately, when I accessed my APEX home page and tried to create a new workspace "ws_prueba", I received this error (I'm trying to create it with my APEX admin user):
Any ideas?
Use
EXEC rdsadmin.rdsadmin_util.grant_apex_admin_role;
instead. I have a case open on this with AWS and just asked them to update the documentation page.
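For reference, the full sequence that worked in the edit above looks like this in SQL*Plus (a sketch; replace master with your master user name, and the script path assumes the apexuser layout from the AWS guide):

-- Grant the APEX admin role through the RDS admin package first,
-- since a direct grant can fail with ORA-01924 on RDS.
EXEC rdsadmin.rdsadmin_util.grant_apex_admin_role;
GRANT APEX_ADMINISTRATOR_ROLE TO master;
-- Then run the password-change script shipped with APEX.
@/home/apexuser/apex/apxchpwd.sql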

PowerBI Embedded: Datasource has no credentials, unable to Patch the gateway

I wanted to test out Power BI Embedded, so I downloaded the sample app that is able to publish a pbix file and embed it.
So I created the simplest Power BI file one can make, with Azure SQL as the underlying data source, using the DirectQuery option.
I successfully imported the Power BI file into my workspace collection.
I changed the connection string of my Power BI file successfully.
After that, the code to patch the gateway with the username and password credentials fails.
When I then tried to view the embedded report, I got this error.
I believe the connection string is in the correct format because it was updated successfully. I also tried pointing it at another SQL database, and the error message then shows the other SQL database.
1) I thought this could be because the gateway does not receive the credentials I gave it; is that correct?
2) Does someone know how I can fix this?
Thanks in advance!
As @Cuong Le stated, this was a Microsoft issue at first.
When the problem was fixed I still received a BadRequest exception. After trying to update the credentials with the PowerBI-CLI, the problem became clearer: I needed to grant access for Azure IP addresses to the relevant SQL database. Once I did that, I was able to update the credentials. Unfortunately the PowerBI API SDK's exception messages are not as good as the PowerBI-CLI messages. I also tried it with the PowerBI API SDK and it worked as well.
The exception message I got was the following:
[ powerbi ] {"error":{"code":"DM_GWPipeline_Gateway_DataSourceAccessError","pbi.error":{"code":"DM_GWPipeline_Gateway_DataSourceAccessError","parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-2146232060"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"Cannot open server 'engiep-dev-weeu-sql' requested by the login. Client with IP address 'xx.xx.xx.213' is not allowed to access the server. To enable access, use the Windows Azure Management Portal or run sp_set_firewall_rule on the master database to create a firewall rule for this IP address or address range. It may take up to five minutes for this change to take effect."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-2146232060"}},{"code":"DM_ErrorDetailNameCode_UnderlyingNativeErrorCode","detail":{"type":1,"value":"40615"}}]}}}
The correct connection string format to use is:
Data Source=yourDataSource;Initial Catalog=yourDataBase;User ID=yourUser;Password=yourPass;
(Don't use quotes anywhere.)
I was experiencing the same issue. Also it is an open issue on github.
To solve this, I used the PowerBI CLI 1.0.4 from NPM and its update-connection operation (remember to add -d).
powerbi update-connection -c [workspace name] -k [access key] -w [workspace id] -d [dataset id] -s "Data Source=xxx.database.windows.net;Initial Catalog=xxx;User ID=xxx;Password=xxx"
If it fails, run the update-connection operation again.
The issue happens because sometimes datasource credentials are not carried over to the workspace.
In the case of reports that use DirectQuery, credentials are never brought along with the pbix, as an import is done and all private info is stripped out.
Hope this helps!
Thanks

Sitecore allow role to publish content in specific areas only

I am trying to create a role within Sitecore which can publish content, but only within a specific area(s) of the site. I've added the standard Sitecore\Client Publishing role to my role, but I can't see how to prevent the role from being able to publish all areas of the site. I've looked at the Security editor and the Access viewer, but setting the write access of the sections only seems to affect the ability to edit those sections and has no effect on the ability to publish on those sections.
Workflow is the typical way this is handled. Giving roles access to approve (this could be called 'publish') content of certain sections of the content tree will be the best way to achieve what you are describing. Combine this with an auto-publish action to make it more user friendly.
One thing to keep in mind when using this method, though, is referenced items (for example, images from the media library that the content may be using). Take a look at the 'Publishing Spider' module on the Shared Source library: http://trac.sitecore.net/PublishingSpider
EDIT: I recently discovered this setting in the web.config: "Publishing.CheckSecurity". If set to true, items will only be published if the user has Read and Write access on the item, and items will only be removed from the web database if the user has Delete permissions.
I had a similar situation once, and I created a role per section which only had read and write access to that section and nowhere else (say, 'editor section 1'), and another role which only had publishing permission for that section (say, 'publisher section 1'). Then I added the 'editor section 1' role to the 'publisher section 1' role, which gives you a role that can publish only that specific section.
You do not need multiple workflows; the same workflow with multiple roles can also achieve this goal.
The answer to this is to set Publishing.CheckSecurity to true.
You need to find this code inside web.config:
<!-- PUBLISHING SECURITY
Check security rights when publishing?
When CheckSecurity=true, Read rights are required for all source items. When it is
determined that an item should be updated or created in the target database,
Write right is required on the source item. If it is determined that the item
should be deleted from target database, Delete right is required on the target item.
In summary, only the Read, Write and Delete rights are used. All other rights are ignored.
Default value: false
-->
<setting name="Publishing.CheckSecurity" value="false" />
Set the value="true"
But again you have to govern the security tightly, and assign user role properly. Failed to
do so you will experience buggy publishing.
Hope that helps.