I'm doing my internship at a company with very strict security settings; sometimes something on the network gets blocked without any notice, and I don't have an admin role on my computer. Right now I've run into a problem, and I'm not sure whether it's a permission issue.
I'm trying to upload files to OneDrive for Business (the company account) from Python code.
This is the tutorial I'm using; see the OneDrive for Business part.
Before uploading an item, I have to pass authentication.
I have tested my code line by line, and it worked well,
import onedrivesdk
from onedrivesdk.helpers import GetAuthCodeServer
from onedrivesdk.helpers.resource_discovery import ResourceDiscoveryRequest

redirect_uri = 'http://localhost:8080'
client_secret = '[my client secret]'
client_id = '[my client id]'
resourceId = 'https://api.office.com/discovery/'
auth_server_url = 'https://login.microsoftonline.com/common/oauth2/authorize'
auth_token_url = 'https://login.microsoftonline.com/common/oauth2/token'

http = onedrivesdk.HttpProvider()
auth = onedrivesdk.AuthProvider(http,
                                client_id,
                                auth_server_url=auth_server_url,
                                auth_token_url=auth_token_url)
auth_url = auth.get_auth_url(redirect_uri)
code = GetAuthCodeServer.get_auth_code(auth_url, redirect_uri)
auth.authenticate(code, redirect_uri, client_secret, resource=resourceId)
service_info = ResourceDiscoveryRequest().get_service_info(auth.access_token)
until I try to get the service_info, which comes back empty....
Do you know why it's empty?
Or how can I continue the code from here so that I can upload files to OneDrive for Business?
I get the feeling that this is happening because our SDK drops all services whose service_api_version is less than 2.0. Assuming that the service you are trying to access works with API 2.0, you can use this workaround to get past your issue.
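To illustrate the idea, here is a minimal sketch that queries the discovery endpoint directly and keeps every MyFiles service regardless of its reported service_api_version. The endpoint URL and the ServiceInfo helper come from the SDK's bundled resource_discovery module; the rest is an assumption, not the SDK's official API:

import json
import requests
from onedrivesdk.helpers.resource_discovery import ServiceInfo

def get_all_service_info(access_token):
    # Query the discovery service directly, skipping the SDK's
    # version filter; keep any service that exposes MyFiles.
    headers = {'Authorization': 'Bearer ' + access_token}
    response = requests.get(
        'https://api.office.com/discovery/v2.0/me/services',
        headers=headers)
    services = json.loads(response.text)['value']
    return [ServiceInfo(entry) for entry in services
            if entry.get('capability') == 'MyFiles']

The returned ServiceInfo objects can then be used the same way as the result of get_service_info.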
Related
Quick overview: I have a Ruby app that runs nightly and does something with a user's Google Calendar. The user has already given access via a separate React app. I'm having trouble getting the Ruby app to access the user's calendar with the authorization code from the React app.
Details: I have a React front end that can sign in a user using gapi and subsequently sign the user into Firebase. Here is how I configure the gapi object:
this.auth2 = await loadAuth2WithProps({
  apiKey: config.apiKey, // from Firebase
  clientId: config.clientId, // from GCP
  // ....
  access_type: "offline", // so we get an authorization code
})
Here is the sign-in:
doSignInWithGoogle = async () => {
  const googleUser = await this.auth2.signIn();
  const token = googleUser.getAuthResponse().id_token;
  const credential = app.auth.GoogleAuthProvider.credential(token);
  return this.auth.signInWithCredential(credential);
};
The user's next step is to grant the app offline access to their calendar:
doConnectGoogleCalendar = async () => {
  const params = { scope: scopes };
  const result = await this.auth2.grantOfflineAccess(params);
  console.log(result.code); // logs: "4/ygFsjdK....."
};
At this point the front end has the authorization code, which can be passed to a server-side application to be exchanged for access and refresh tokens. I haven't been able to find a good way to use a user-supplied auth code to make calls to the granted scopes. This is how I've configured the OAuth client:
auth_client = Google::APIClient::ClientSecrets.load(
  File.join(Rails.root, 'config', 'client_secrets.json') # downloaded from GCP
).to_authorization
^ I'm using the same GCP credentials on the backend that I'm using for the frontend. It is an "OAuth 2.0 Client ID" type of credential. I'm unsure whether this is good practice or not. Also, do I need to define the same config on the backend that I do on the frontend (like access_type and scope)?
Next I do what the docs say to get the access and refresh tokens (click Ruby):
auth_client.code = authorization_code_from_frontend
auth_client.fetch_access_token!
which raises:
Signet::AuthorizationError (Authorization failed. Server message:)
{
  "error": "invalid_grant",
  "error_description": "Bad Request"
}
Is there something I'm missing in setting up a separate backend application that can handle offline access to a user-granted scope? There is so much different information on these libraries, but I haven't been able to distill it down to something that works.
UPDATE
I found this page describing the "one-time-code flow", which I haven't found anywhere else in all of the docs I've gone through. It does answer one of my minor questions above: yes, you can use the same client secrets as the web application for the backend (see the full example at the bottom, where they do just that). I'll explore it more and see if my bigger problem can be resolved. I'm also going to update the title to include one-time-code flow.
After a good amount of digging through code samples and source code, I have a clean working solution. Once I found the page mentioned in my "update", it led me to discover that the ClientSecrets way I was doing things had been deprecated in favor of the google-auth-library-ruby project. I'm glad I found it, because it seems to be a more complete solution that handles all of the token management for you. Here is the code to set everything up:
def authorizer
  client_secrets_path = File.join(Rails.root, 'config', 'client_secrets.json')
  client_id = Google::Auth::ClientId.from_file(client_secrets_path)
  scope = [Google::Apis::CalendarV3::AUTH_CALENDAR_READONLY]
  redis = Redis.new(url: Rails.application.secrets.redis_url)
  token_store = Google::Auth::Stores::RedisTokenStore.new(redis: redis)
  # "postmessage" is the redirect value used for auth codes obtained via gapi in the browser
  Google::Auth::WebUserAuthorizer.new(client_id, scope, token_store, "postmessage")
end
Then this is how I use the authorization code:
def exchange_for_token(user_id, auth_code)
  credentials_opts = { user_id: user_id, code: auth_code }
  authorizer.get_and_store_credentials_from_code(credentials_opts)
end
After calling that method, the library stores the exchanged tokens in Redis (you can configure where they are stored) for later use, like this:
def run_job(user_id)
  credentials = authorizer.get_credentials(user_id)
  service = Google::Apis::CalendarV3::CalendarService.new
  service.authorization = credentials
  calendar_list = service.list_calendar_lists.items
  # ... do more things ...
end
There is so much info out there that it is difficult to isolate what applies to each situation. Hopefully this helps anyone else who gets stuck with the "one-time-code flow" so they don't spend days banging their head on their desk.
My company has a Google Brand Account to manage our YouTube (YT) channel. We are trying to use the YouTube Analytics & Reporting API to automatically export metrics about our channel. To do so, we have created an App in our GCP Organisation, and in that App we have created an OAuth client. Everything works fine when retrieving YouTube data for a channel owned by a user within our GCP Org. However, the Brand Account that owns the YT channel we are interested in is not a member of our GCP Org. This means that when trying to access that channel using our OAuth client, we get the following error:
Error 403: org_internal
This client is restricted to users within its organization.
From searching the online documentation, it seems that we can do one of the following:
1. Make our App external from the APIs & Services -> OAuth consent screen section
2. Migrate the Brand Account to our GCP Org
I don't know how to do 2., or whether it is actually feasible at all. And 1. seems a bit overboard, as we don't really want to access data from any user with a Google Account, but it might be the only way. So I am looking for help on how best to proceed so that we can use an OAuth client within our GCP Org to get analytics data from our YT channel.
Looks like this was first documented here. It's marked "won't fix" by Google; I'm not sure why.
I was able to use this OAuth Playground (OAP) workaround to get my app to work. It's a pretty sad workaround, because the token only works for an hour and then you must manually refresh it in the playground.
Once you get the token from OAP, here is the code I'm using.
import json
from pathlib import Path
import google.oauth2.credentials
from googleapiclient.discovery import build

# Pasted from OAP; note that only access_token is actually needed
Path('oap.json').write_text('''
{
    "access_token": "tokenstring",
    "scope": "https://www.googleapis.com/auth/youtube.readonly https://www.googleapis.com/auth/yt-analytics.readonly",
    "token_type": "Bearer",
    "expires_in": 3599,
    "refresh_token": "refreshtoken"
}
''')

TOKEN_FILE = 'oap.json'

# For reference:
# SCOPES = ['https://www.googleapis.com/auth/yt-analytics.readonly',
#           'https://www.googleapis.com/auth/youtube.readonly']

API_SERVICES = [
    ('youtubeAnalytics', 'v2'),
    ('youtube', 'v3'),
]

oap = json.load(open(TOKEN_FILE, 'r'))
creds = google.oauth2.credentials.Credentials(oap['access_token'])

service_list = []
for api_service_name, api_version in API_SERVICES:
    service = build(api_service_name, api_version, credentials=creds)
    service_list.append(service)
ytAnalytics, ytData = service_list

# Test ytData
req = ytData.channels().list(part='id,snippet', mine=True)
res = req.execute()
print(res)
for channel in res['items']:
    print('Channel:', channel['snippet']['title'])
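For completeness, here is a hypothetical follow-up query against the ytAnalytics service built above; the date range, metrics, and dimensions are placeholders, and ids='channel==MINE' assumes the playground token belongs to the channel owner:

# Example Analytics query: daily views for the authorized channel.
report = ytAnalytics.reports().query(
    ids='channel==MINE',
    startDate='2020-01-01',  # placeholder dates
    endDate='2020-01-31',
    metrics='views',
    dimensions='day'
).execute()
print(report)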
I have an application that must verify the identity of its caller. For this I need ID tokens (the JWTs from Google's OpenID Connect implementation) from each client calling the service.
I would like to write client code that works both locally using the default user credentials—for testing and development—and on a Compute Engine instance in production. The official Python auth SDK generally does a good job of handling those cases and saving me the trouble of checking the environment, e.g. I can just call google.auth.default and it figures out where to get credentials.
However, that google.auth package only seems to be able to give me access tokens, not ID tokens, in an environment-independent way. Here is what I tried:
import google.auth
from google.auth.transport import requests
credentials, project = google.auth.default(scopes=["openid"])
req = requests.Request()
credentials.refresh(req)
print(credentials.id_token)
This works on my laptop with my default credentials, but on the Compute Engine instance I instead get an error: AttributeError: 'Credentials' object has no attribute 'id_token'
According to this page in the docs, you are supposed to fetch an ID token for an instance by requesting it from the metadata server...
import requests
audience = 'service_identifier'
metadata_server_token_url = 'http://metadata/computeMetadata/v1/instance/service-accounts/default/identity?audience='
token_request_url = metadata_server_token_url + audience
token_request_headers = {'Metadata-Flavor': 'Google'}
token_response = requests.get(token_request_url, headers=token_request_headers)
jwt = token_response.content.decode("utf-8")
print(jwt)
I don't want to do that. I don't want to manually check the environment. The SDK is supposed to handle that complexity for me. Shouldn't there be a way to leverage the google-auth SDK to generate an ID token in an environment-independent way?
EDIT 1: Why I need this
The application is based on Cloud Functions, and it returns highly sensitive data. Only a specific set of subjects—some trusted devs and services—should be able to access that data. So, the cloud functions must verify the ID of the caller (user or service account) using an ID token signed by Google. Specifically, I need to know the sub claim, the "subject" in the JWT. It is effectively the same issue that the IAM features are meant to solve, documented here. However, I cannot use these because they are still in beta. So, I'm writing identity checks manually into the cloud functions for the time being.
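For context, the identity check inside each function looks roughly like this. It's a minimal sketch, assuming the token arrives in an Authorization: Bearer header; ALLOWED_SUBJECTS is a hypothetical allow-list:

from google.oauth2 import id_token
from google.auth.transport import requests as google_requests

ALLOWED_SUBJECTS = {'123456789012345678901'}  # hypothetical subject IDs

def caller_is_allowed(request):
    # Extract the bearer token, verify its signature against Google's
    # public keys, then check the sub claim against the allow-list.
    auth_header = request.headers.get('Authorization', '')
    token = auth_header.replace('Bearer ', '', 1)
    claims = id_token.verify_oauth2_token(token, google_requests.Request())
    return claims['sub'] in ALLOWED_SUBJECTS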
I think I have an answer to this question. I could always get it to work locally or in the cloud; the trick was finding a way of combining the two. A colleague of mine actually showed me how to do this, and I just wanted to share it with others who are looking for a solution.
import google.auth
from google.auth.transport.requests import AuthorizedSession, Request
from google.oauth2.id_token import fetch_id_token
import requests

def GetIdToken(audience):
    credentials, _ = google.auth.default()
    session = AuthorizedSession(credentials)
    request = Request(session)
    credentials.refresh(request)
    if hasattr(credentials, "id_token"):
        # Local user credentials expose an id_token after refresh.
        return credentials.id_token
    # On Compute Engine and similar environments, fetch the ID token
    # from the metadata server instead.
    return fetch_id_token(request, audience)

def ProcessPayload(url, payload):
    # Get the ID token, using the function URL as the audience
    id_token = GetIdToken(url)
    # Process the POST request with the token as a bearer credential
    headers = {'Authorization': f'Bearer {id_token}'}
    response = requests.post(url, json=payload, headers=headers)
    return response
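A hypothetical call, using the function URL as both the target and the token audience (the URL is a placeholder):

# Hypothetical Cloud Function URL; replace with your own deployment.
resp = ProcessPayload(
    'https://us-central1-my-project.cloudfunctions.net/my-function',
    {'key': 'value'})
print(resp.status_code)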
I am using the Power BI API to upload some pbix files. Most of these files use Import mode for the SQL connection.
When I use the REST API to upload the files, the credentials do not get updated on the website. I know the credentials do not live in the actual file. I also know there is an API to patch these credentials, but I have not been able to make it work with Import mode; it only seems to work with DirectQuery.
I have also tried Set All Connections, which is documented to work only with DirectQuery connections, using this format:
Data Source=xxxx.mydb.net; Initial Catalog=dbname;User ID=xxx;Password=xxxx;
My problem now is that the way Power BI manages cached credentials makes it hard to figure out which credentials are being used. There is some magic happening where updating one file sometimes makes other files that use the same credential also allow refresh.
This is the error I am getting for all files uploaded via API.
Data source error: Scheduled refresh has been disabled due to at least one data source not having credentials provided. Please provide credentials for all data sources, and then turn scheduled refresh back on.
Cluster: This-is-not-relevant.net
Activity ID: 00000000-0000-0000-0000-000000000000
Request ID: 00000000-0000-0000-0000-000000000000
Time: 2020-09-99 99:54:11Z
Thank you,
Chéyo
This is the solution using the Power BI C# SDK. Make sure the JSON payload is properly escaped.
var request = new UpdateDatasourceRequest
{
    CredentialDetails = new CredentialDetails
    {
        Credentials = $"{{\"credentialData\":[{{\"name\":\"username\",\"value\":{JsonConvert.SerializeObject(credential.Username)}}},{{\"name\":\"password\",\"value\":{JsonConvert.SerializeObject(credential.Password)}}}]}}",
        CredentialType = "Basic",
        EncryptedConnection = "Encrypted",
        EncryptionAlgorithm = "None",
        PrivacyLevel = "None"
    }
};

await PowerBI.Client().Gateways.UpdateDatasourceAsync(
    gatewayId: datasource.GatewayId,
    datasourceId: datasource.DatasourceId,
    updateDatasourceRequest: request);
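If you are not using C#, the same call can be made against the REST endpoint directly. Here is a rough Python sketch, assuming you already have a bearer token and the gateway and datasource IDs from the Get Datasources endpoint; the payload mirrors the SDK request above:

import json
import requests

def update_datasource_credentials(token, gateway_id, datasource_id,
                                  username, password):
    # PATCH /v1.0/myorg/gateways/{gatewayId}/datasources/{datasourceId}
    url = (f'https://api.powerbi.com/v1.0/myorg/gateways/{gateway_id}'
           f'/datasources/{datasource_id}')
    credential_data = json.dumps({'credentialData': [
        {'name': 'username', 'value': username},
        {'name': 'password', 'value': password},
    ]})
    body = {'credentialDetails': {
        'credentialType': 'Basic',
        'credentials': credential_data,
        'encryptedConnection': 'Encrypted',
        'encryptionAlgorithm': 'None',
        'privacyLevel': 'None',
    }}
    resp = requests.patch(url, json=body,
                          headers={'Authorization': f'Bearer {token}'})
    resp.raise_for_status()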
I'm having difficulties using Google's new Admin SDK, in particular the Directory API using OAuth2.
I think I'm almost there, but I've got stuck trying to retrieve a user's details using the Directory API (I'm using a Google Education Edition domain).
Basically, what I'm trying to do is write a Python script that provisions or de-provisions users based on their enrollment status, which is managed by our AD. I've got a script that does this using OAuth1, but I want to update it to use OAuth2.
Here is a code snippet based on some examples I found.
import pprint
import httplib2
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials

f = open('test_key.p12', 'rb')
key = f.read()
f.close()

credentials = SignedJwtAssertionCredentials(
    '606346240424-10stfco1ukj9h4m4b4r40@developer.gserviceaccount.com',
    key,
    scope='https://www.googleapis.com/auth/admin.directory.user')

http = httplib2.Http()
http = credentials.authorize(http)

service = build(serviceName='admin', version='directory_v1', http=http)
lists = service.users().get(userKey='joe.blogs@mydomain.com').execute(http=http)
pprint.pprint(lists)
This piece of code appears to connect correctly but when I try to execute the query I get a 403 error.
ERROR: https://www.googleapis.com/admin/directory/v1/users/joe.blogs@mydomain.com?alt=json returned "Not Authorized to access this resource/api">
My first thought was that I hadn't turned on this API in the administrator's console (Google APIs Console), but I have. (Actually, I turned on the Admin SDK rather than the Directory API, because there is no separate Directory API to turn on; since it's part of the Admin SDK, that should work, right?)
Is there another step I'm missing or have I made a silly mistake somewhere?
Bruce, you're pretty close. A couple of items:
1. If you're using App Engine, you need to convert the p12 key to pem and strip the header.
2. You need to include a user with super-user credentials (who has permission to do these operations) whom you're impersonating (not the user who is being changed), via the sub= parameter.
So the full code will look a bit like this:
import httplib2
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials

# domain configuration settings
import domainconfig

f = open(domainconfig.KEY_FILE, 'rb')  # 'b' reads the file in binary mode; not strictly necessary, but safer to avoid strange Windows EOL characters: https://stackoverflow.com/questions/9644110/difference-between-parsing-a-text-file-in-r-and-rb-mode
key = f.read()
f.close()

credentials = SignedJwtAssertionCredentials(
    domainconfig.SERVICE_ACCOUNT_EMAIL,
    key,
    scope=domainconfig.SCOPE,
    sub=domainconfig.SUB_ACCOUNT_EMAIL  # 'sub' supersedes the deprecated 'prn'
)

http = httplib2.Http()
http = credentials.authorize(http)

directoryservice = build("admin", "directory_v1", http=http)
users = directoryservice.users()
response = users.get(userKey='joe.blogs@mydomain.com').execute()
This should be of help: https://developers.google.com/drive/delegation
When asserting the credentials, you need to connect them to the user that is going to be changed. From the link above, note this section:
credentials = SignedJwtAssertionCredentials(
    SERVICE_ACCOUNT_EMAIL, key,
    scope='https://www.googleapis.com/auth/drive', sub=user_email)
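As a hypothetical follow-up, once the delegated credentials exist you authorize an HTTP client and build the service as usual (the Drive v2 service here is illustrative):

# Authorize an HTTP client with the delegated credentials and build
# a client for the target API (Drive v2 as an example).
http = credentials.authorize(httplib2.Http())
drive = build('drive', 'v2', http=http)
about = drive.about().get().execute()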