What's the best way of getting content from SharePoint to AWS/Azure programmatically?

How do we move content from SharePoint to our AWS estate?
I have found various sources on how to do it in the UI, but nothing programmatic.
Any suggestions would be greatly appreciated.
Here are the UI steps I've found, but nothing programmatic - https://www.youtube.com/watch?v=VW6gqVsvOeQ

You should be able to do this in code using the Graph APIs. In particular, you'll be looking for the Working with files in Microsoft Graph section of the API documentation.
Follow these steps to install the Graph SDK.
Follow these steps to Create an app registration.
Follow these steps to Add a certificate to the app registration.
Get an auth token in your code.
Get the site ID by appending /_api/site/id to the site URL, e.g. https://contoso.sharepoint.com/sites/TheSite/_api/site/id
Get the list of drives associated with the document libraries on your site.
For each drive, get a list of children.
Iterate each child recursively to expand through folders and subfolders (a traversal sketch follows the code samples below).
Download items.
Upload items to AWS (an S3 upload sketch follows the download example below).
Getting an auth token
using Azure.Identity;
using Microsoft.Graph;
using System.Security.Cryptography.X509Certificates;

var scopes = new[] { "https://graph.microsoft.com/.default" };

// Multi-tenant apps can use "common";
// single-tenant apps must use the tenant ID from the Azure portal.
var tenantId = "common";

// Values from the app registration
var clientId = "YOUR_APP/CLIENT_ID";
var clientCertificate = new X509Certificate2("MyCertificate.pfx");

var options = new TokenCredentialOptions
{
    AuthorityHost = AzureAuthorityHosts.AzurePublicCloud
};

// https://learn.microsoft.com/dotnet/api/azure.identity.clientcertificatecredential
var clientCertCredential = new ClientCertificateCredential(
    tenantId, clientId, clientCertificate, options);

var graphClient = new GraphServiceClient(clientCertCredential, scopes);
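Get the site ID
The site ID lookup can also be done through the SDK instead of the /_api/site/id endpoint. A minimal sketch, assuming the graphClient above and Graph's {hostname}:/{server-relative-path} addressing form:
// Sketch: resolve the site (and its ID) via Graph rather than /_api/site/id.
var site = await graphClient.Sites["contoso.sharepoint.com:/sites/TheSite"]
    .Request()
    .GetAsync();
var siteId = site.Id; // use as {site-id} below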
Get list of drives
var drives = await graphClient.Sites["{site-id}"].Drives
    .Request()
    .GetAsync();
Get root items of a drive
var children = await graphClient.Drives["{drive-id}"].Root.Children
    .Request()
    .GetAsync();
Get children of items
var children = await graphClient.Drives["{drive-id}"].Items["{driveItem-id}"].Children
    .Request()
    .GetAsync();
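Walk folders recursively
The steps above call for recursing through folders, which the documentation leaves to you. A minimal sketch built on the same request builders; WalkFolderAsync and its leaf handling are my own naming, not SDK APIs:
// Sketch: depth-first walk of a drive, starting from an item ID gathered above.
// For large folders you would also follow children.NextPageRequest to pull
// every page, not just the first.
async Task WalkFolderAsync(GraphServiceClient graphClient, string driveId, string itemId)
{
    var children = await graphClient.Drives[driveId].Items[itemId].Children
        .Request()
        .GetAsync();

    foreach (var child in children)
    {
        if (child.Folder != null)
        {
            // Recurse into sub folders
            await WalkFolderAsync(graphClient, driveId, child.Id);
        }
        else
        {
            // Leaf item: download it and upload to AWS (see the sections below)
            Console.WriteLine(child.Name);
        }
    }
}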
Download files
var stream = await graphClient.Drives["{drive-id}"].Items["{driveItem-id}"].Content
    .Request()
    .GetAsync();
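Upload files to AWS
The final step, uploading to AWS, is outside the Graph documentation. A minimal sketch using the AWS SDK for .NET (the AWSSDK.S3 NuGet package); the bucket name and object key are placeholders, and credentials are assumed to come from the standard AWS credential chain (environment, profile, or instance role):
using Amazon.S3;
using Amazon.S3.Transfer;

// Sketch: push the downloaded content stream straight to S3.
// "my-bucket" and the key are placeholders; a real run might mirror the
// SharePoint folder path in the key.
var s3Client = new AmazonS3Client();
var transferUtility = new TransferUtility(s3Client);

await transferUtility.UploadAsync(stream, "my-bucket", "TheSite/path/to/file.docx");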

Related

Uploaded media files 404 on production but show up hours later

We've built a custom front end for users to post threads and upload images on top of Sitecore 9. Recently, we moved the user-generated content to its own database, so the media files are no longer in the proper Sitecore 'media gallery'. Also, file storage is on a networked share. Uploading images works just fine. It's immediately after where the problem lies.
Upon image upload, our REST API returns the media URL from MediaManager.GetMediaUrl(itemId). This also works. It returns a valid URL, it is formatted correctly and should resolve. Unfortunately, for some time after, the URL does the Sitecore dance and 302s to our 404 page.
I can see the image through the Content Editor, and our custom content handler injects the image folder node into both master and web databases.
Why would the link manager be able to find the URL, but the image is not available? I uploaded something yesterday afternoon, and when I checked this morning, the image is now available on the site. Any information or suggestions are appreciated.
I have tried saving the image multiple times hoping that this might trigger whatever mystical Sitecore event causes images to show. Since there isn't a publish, due to the separate database, I can't try that. I've removed all versions thinking it couldn't default to a particular language. Nothing. Only time seems to make the images visible. The code below works just fine; I'm just putting it here to show some work.
public MediaItem UploadSimpleMedia(MediaGallerySimpleUploadRequest request)
{
    try
    {
        var destinationFolder = request.ParentItemId != null
            ? _content.GetItem<Item>(request.ParentItemId.Value)
            : _content.GetItem<Item>(_publicLibraryPath + "/embeded");
        var name = !string.IsNullOrEmpty(request.Name)
            ? ItemUtil.ProposeValidItemName(request.Name)
            : ItemUtil.ProposeValidItemName(request.Files[0].FileName);
        var creator = new MediaCreator();
        var tags = request.Tags != null ? request.Tags.Split(',') : new string[0];
        var tagIds = _tagService.GetTagIds(tags);
        var tagIdString = string.Join(",", tagIds);
        var options = new MediaCreatorOptions()
        {
            AlternateText = request.Name,
            FileBased = true,
            IncludeExtensionInItemName = false,
            Versioned = false,
            Destination = $"{destinationFolder.Paths.Path}/{name}",
            Database = Factory.GetDatabase("content")
        };
        using (new SecurityDisabler())
        using (new DatabaseCacheDisabler())
        {
            MediaItem item = creator.CreateFromStream(request.Files[0].InputStream, request.Files[0].FileName, options);
            item.BeginEdit();
            item.InnerItem["Title"] = request.Name;
            item.InnerItem["Description"] = request.Description;
            item.InnerItem["__Semantics"] = tagIdString;
            // Sitecore won't calculate MIME type or file size automatically unless you upload directly
            // to the Sitecore Media Library. We're uploading in place for private groups for permission
            // inheritance, and now, because of indexing, they are uploaded to /Community/Gallery/Files if not private.
            item.InnerItem["Size"] = request.Files[0].ContentLength.ToString();
            item.InnerItem["Mime Type"] = request.Files[0].ContentType;
            item.EndEdit();
            // Just pausing here to reflect on why Sitecore is so sexily complicated
            // that it takes time to display an image on the CD, but has no problem giving
            // up the media url
            CacheManager.GetHtmlCache(Sitecore.Context.Site);
            _searchService.RefreshIndex(item.ID.Guid);
            var fromDatabase = _content.GetItem<Item>(item.ID.Guid);
            return fromDatabase;
        }
    }
    catch (Exception ex)
    {
        _logger.Error(this.GetType().AssemblyQualifiedName, ex);
        _logger.Trace(ex.StackTrace);
    }
    return null;
}
I don't get any error messages, either in our custom logs or the default Sitecore logs. Images just aren't resolving. Is it caching? I've even nuked all caches by calling CacheManager.ClearAllCaches().

WSO2 API Manager Admin Services - Get List of Secondary Store Users

I am using admin services to get a list of all users available in the store. I am calling the service through Jaggery using this code:
ws = require('ws');
var user = "";
var wsUser = new ws.WSRequest();
var optionsUser = new Object();
optionsUser.useSOAP = 1.2;
optionsUser.useWSA = 1.0;
optionsUser.action = "urn:listUsers";
wsUser.open(optionsUser, "https://localhost:9443/services/RemoteUserStoreManagerService", false, "admin", "admin");
wsUser.send('<ser:listUsers xmlns:ser="http://service.ws.um.carbon.wso2.org"><ser:filter></ser:filter><ser:maxItemLimit>-1</ser:maxItemLimit></ser:listUsers>');
resultUser = wsUser.responseText;
This gives me the list of users of the Primary Store. There is also a Secondary User Store connected to the APIM through Active Directory, and I would like to get the list of the users of that store as well.
Is there a way to get the list of users of all stores using an admin service, and if so, how would I do that?
Thanks
If you set '*' (i.e. an asterisk) as the search filter, like below, you should get the users of all user stores.
<ser:listUsers xmlns:ser="http://service.ws.um.carbon.wso2.org">
    <ser:filter>*</ser:filter>
    <ser:maxItemLimit>-1</ser:maxItemLimit>
</ser:listUsers>

Facebook ads insights API - get basic metrics

I would like to connect to the Facebook Ads Insights API with Google Apps Script in order to generate and update a Google Sheet containing my ads' key performance indicators.
I have read Facebook's documentation but I'm a bit lost. For example, in the documentation's website, I can see that I am supposed to follow this syntax to get a campaign's impressions:
GET <AD_OBJECT>/insights?fields=impressions
but I'm not quite sure where that would fit in a cURL GET query; should it look like this?
https://graph.facebook.com/v2.5/CAMPAIGN_ID/insights?fields=impressions%?access_token=TOKEN
I have tried to build the following Google Apps Script, but I'm not sure it's getting anywhere. Any help?
var myClientID = '';
var myClientSecret = '';
var myAccessToken = 'MY_TOKEN';
var graphURL = 'https://graph.facebook.com/v2.5/';

function getPageLikes(campaign_id) {
  var searchParams = '?fields=impressions%2Cunique_clicks%2Creach';
  var campaignID = MY_CAMPAIGN_ID;
  var fullURL = graphURL + campaignID + '/insights/' + searchParams + '&access_token=' + myAccessToken;
  var fetchResult = UrlFetchApp.fetch(fullURL);
  var campaign = JSON.parse(fetchResult);
  var likes = campaign.data[0];
  return likes;
}
Thank you
Yes. If you like cURL, you should play with https://developers.facebook.com/tools/explorer/
Then you can formulate something like:
https://graph.facebook.com/v2.9/[campaign_id]/insights?fields=impressions%2Creach&access_token=[token]
But if you want an easier life, I would recommend using one of the SDKs. There is one for:
PHP: https://github.com/facebook/facebook-php-ads-sdk
Python: https://github.com/facebook/facebook-python-ads-sdk
Java: https://github.com/facebook/facebook-java-ads-sdk
And we also have a tool to guide you through generating code, in the Getting Started section of https://developers.facebook.com/apps/[app_id]/marketing-api/
Basically, you can pick metrics and the wizard will generate working code for you. (It only generates Java code for now.)
Also, if you don't really want to code, you may try the new product we just released, called Facebook Ads Manager for Excel. It allows you to download Insights data into Excel directly. More info:
https://www.facebook.com/business/m/facebook-ads-manager-for-excel

How to login into a third-party website using google app script and manage data on login?

I am interested in creating a Google Apps Script that, on run, would log into a specific website (third party) and complete certain functions within the website (pressing buttons/copying text).
After browsing Stack Overflow and other forums, I have created a script that allows me to log into my website (source1 source2).
However, I am having difficulties staying logged in and managing the data.
//The current code just tests whether I can get data from within the website.
//The results are displayed in a Google app.
function doGet() {
  var app = UiApp.createApplication();
  app.add(app.createLabel(display_basic_data()));
  return app;
}

//logs into the website and displays data
function display_basic_data() {
  var data;
  var url = "http://www.website.bla/users/sign_in";
  var payload = {"user[username]":"usr","user[password]":"ps"};
  var opt = {"method":"post", "payload":payload, "followRedirects":false};
  var response = UrlFetchApp.fetch(url,opt);
  data = response;
  return data;
}
Currently, the data returned from display_basic_data() is
"<html><body>You are being redirected.</body></html>".
If I change my script so that "followRedirects" is true, the data is equivalent to the HTML of the login page.
I understand I have to play around with cookies in order to 'stay' logged in, but I have no idea what to do, as the examples I found online proved fruitless for me.
Any help would be much appreciated!
You may want to do something like this:
var cookie = response.getAllHeaders()['Set-Cookie'];
//maybe parse cookies here, depends on what cookie is
var headers = {'Cookie':cookie};
var opt2 = {"headers":headers};
var pagedata = UrlFetchApp.fetch("http://www.website.bla/home",opt2);

New Google Drive Directory APIs error out: Bad request

I am using the below piece of code to list all domain users in my simple console application:
var certificate = new X509Certificate2("D:\\3acf2c2008cecd33b43de27e30016a72e1482c41-privatekey.p12", "notasecret", X509KeyStorageFlags.Exportable);
var privateKey = certificate.Export(X509ContentType.Cert);
var provider = new AssertionFlowClient(GoogleAuthenticationServer.Description, certificate)
{
    ServiceAccountId = "877926787679-b7fd15en1sh2oc65e164v90cfcvrfftq@developer.gserviceaccount.com",
    Scope = DirectoryService.Scopes.AdminDirectoryUserReadonly.GetStringValue(),
    ServiceAccountUser = "user1@05.mygbiz.com"
};
var auth = new OAuth2Authenticator<AssertionFlowClient>(provider, AssertionFlowClient.GetState);
DirectoryService dirService = new DirectoryService(new BaseClientService.Initializer()
{
    Authenticator = auth,
    ApplicationName = "My APP"
});
Users users = dirService.Users.List().Execute();
The Execute() method errors out saying Bad Request.
Questions:
How do I overcome this issue?
Does the Admin SDK support a trial version of a Google Apps account?
I have updated the service account Client ID in the Google Console and also updated it in the Admin Console with the below scopes
https://www.googleapis.com/auth/admin.directory.group
https://www.googleapis.com/auth/admin.directory.user
and also set the API access check box. Am I missing something in the settings?
Like JoBe said, you should include the domain parameter.
happy_user = service.users().list(domain='mydomain.com').execute()
This has worked for me.
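Since the question uses the .NET client, here is a minimal C# sketch of the same fix, assuming the dirService from the question is in scope (the .NET Directory client exposes the domain parameter as a property on the list request):
// Sketch: the C# counterpart of the Python call above.
var request = dirService.Users.List();
request.Domain = "mydomain.com";
Users users = request.Execute();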