I'd like to do a key phrase analysis of a Microsoft Word document. It looks like the API only takes JSON documents. Is there a route to use real-life documents such as Microsoft Office documents?
A recent hackathon project, Resolving Managed Metadata Madness in SharePoint, answers this question.
The developer of that project used a three-step process involving custom code. An Azure Function was written to extract the text and pass it to the API; the function then returns the results of the analysis to Microsoft Flow.
1. A Flow attached to the Document Library calls an Azure Function that does the heavy lifting.
2. The Azure Function runs, extracts the text, analyzes it using Azure Cognitive Services, and then writes the information back to SharePoint Online.
3. Finally, it notifies an admin and the creator of the file about the execution.
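In outline, the function's job is to turn the document into plain text and wrap it in the JSON the service expects. A minimal Python sketch of that last step, assuming the Text Analytics v3 key-phrase REST endpoint; the Word-to-text extraction itself (e.g. with python-docx) is omitted:

```python
import json
import urllib.request

def build_payload(doc_id, text, language="en"):
    """Wrap extracted document text in the JSON shape the
    key-phrase endpoint expects."""
    return {"documents": [{"id": doc_id, "language": language, "text": text}]}

def extract_key_phrases(endpoint, api_key, text):
    """POST the extracted text to the Text Analytics key-phrase endpoint.
    `endpoint` and `api_key` come from your Cognitive Services resource."""
    req = urllib.request.Request(
        f"{endpoint}/text/analytics/v3.0/keyPhrases",
        data=json.dumps(build_payload("1", text)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": api_key,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The Flow then only has to pass the file contents to the function and route the returned phrases back into SharePoint metadata.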
As part of an automation procedure, I must copy emails with attachments from Outlook to GCS (the attachments are .csv files). Can somebody advise me on how best to complete this process? Please keep in mind that I am new to GCP, so the simplest explanation would be beneficial.
Thanks in advance.
You can use a REST API such as the Microsoft Graph API to retrieve the required data from the Office 365 side and transfer it to GCS. See Use the Microsoft Graph API for more information.
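As a rough Python sketch of that route (token acquisition and error handling are omitted, the bucket name is a placeholder, and the google-cloud-storage package is assumed for the GCS side):

```python
import base64

def csv_attachments(attachments):
    """Yield (filename, bytes) for every .csv file attachment.
    Graph returns file contents base64-encoded in 'contentBytes'."""
    for att in attachments:
        name = att.get("name", "")
        if name.lower().endswith(".csv") and "contentBytes" in att:
            yield name, base64.b64decode(att["contentBytes"])

def copy_csv_attachments(token, bucket_name):
    """Copy .csv attachments from the signed-in mailbox to a GCS bucket.
    Acquiring `token` (an OAuth access token for Graph) is not shown."""
    import requests                   # pip install requests
    from google.cloud import storage  # pip install google-cloud-storage

    graph = "https://graph.microsoft.com/v1.0"
    headers = {"Authorization": f"Bearer {token}"}
    bucket = storage.Client().bucket(bucket_name)

    msgs = requests.get(
        f"{graph}/me/messages",
        params={"$filter": "hasAttachments eq true", "$select": "id"},
        headers=headers,
    ).json()["value"]
    for msg in msgs:
        atts = requests.get(
            f"{graph}/me/messages/{msg['id']}/attachments", headers=headers
        ).json()["value"]
        for name, data in csv_attachments(atts):
            bucket.blob(name).upload_from_string(data, content_type="text/csv")
```

A scheduled Cloud Function or a small VM cron job could run this periodically.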
If you are dealing with Outlook as an installed application, you can develop a VBA macro or a COM add-in, or just automate Outlook from an external application.
The simplest choice is VBA, which allows you to automate tasks in Outlook. VBA macros are not designed for distribution across multiple machines; that is what COM add-ins were invented for. In that case you can create an installer, as for any other distributable software.
The Outlook object model provides a rich set of properties and methods for getting the job done. You can use the Attachment.SaveAsFile method to save attached files to disk, from where you could upload them to GCS.
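If you go the external-automation route, the same Outlook object model is reachable from Python via pywin32 (Windows with Outlook installed is required). A hedged sketch; `unique_path` is an illustrative helper to avoid filename collisions:

```python
import os

def unique_path(folder, filename, taken):
    """Return a path in `folder` that doesn't collide with names in `taken`."""
    base, ext = os.path.splitext(filename)
    candidate, n = filename, 1
    while candidate in taken:
        candidate = f"{base} ({n}){ext}"
        n += 1
    taken.add(candidate)
    return os.path.join(folder, candidate)

def save_inbox_attachments(folder):
    """Walk the default Inbox and save every attachment with SaveAsFile."""
    import win32com.client  # pip install pywin32; Windows only
    outlook = win32com.client.Dispatch("Outlook.Application")
    inbox = outlook.GetNamespace("MAPI").GetDefaultFolder(6)  # 6 = olFolderInbox
    taken = set()
    for item in inbox.Items:
        for att in item.Attachments:
            att.SaveAsFile(unique_path(folder, att.FileName, taken))
```

From the local folder, the files can then be uploaded to GCS with gsutil or a client library.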
I have exported Firestore collections to Google BigQuery for data analysis and aggregation.
What is the best practice (using Google Cloud products) for serving BigQuery outputs to a client web application?
Google provides seven client libraries for BigQuery. You can take any library and write a webserver that will serve requests from client web application. The webserver can use a GCP service account to access BigQuery on behalf of its clients.
One such sample is this project. It's written in TypeScript and uses the Node.js client library on the server and React for the client app. I'm the author.
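A minimal sketch of such a webserver in Python (the query and table name are placeholders; the service account is picked up from GOOGLE_APPLICATION_CREDENTIALS):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def rows_to_json(rows):
    """Convert BigQuery row objects (mapping-like) to a JSON string."""
    return json.dumps([dict(r) for r in rows])

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # pip install google-cloud-bigquery; credentials come from the
        # service account, not from the web client.
        from google.cloud import bigquery
        client = bigquery.Client()
        rows = client.query(
            "SELECT name, total FROM `my_project.exports.daily_totals` LIMIT 100"
        ).result()
        body = rows_to_json(rows).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("", 8080), Handler).serve_forever()
```

In production you would add caching and parameterized queries, but the shape stays the same: the browser never talks to BigQuery directly.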
You may also take a quick tour through Google Data Studio to see the main features this Google analytics service can offer. If your aim is to visualize data from BigQuery, Data Studio is a good option: it provides a variety of informative dashboards and reports, and lets you customize charts and graphs and share them publicly or via user collaboration groups.
Data Studio offers many connectors to different data sources, including a dedicated BigQuery connector for integrating with data residing in a BigQuery warehouse.
You can track future product enhancements here.
I will be starting full-time at a company where I was told that the application is developed using 'SAS' and 'Salesforce'. What is the difference between the two?
And which online resources are recommended for learning more about them?
SAS is software for statistical analysis. If your company/job description doesn't involve working with large data sets and complex reporting, that's probably not it.
They probably mean the SaaS (Software as a Service) model, also known as "the cloud", cloud computing, etc. You write the program (or use/modify an existing one) but you don't buy servers or worry about network connections, electricity costs, or load balancing (spikes in traffic will not cause your website to go down). Many apps operate in this model: Microsoft's Azure cloud (or even online versions of MS Office). There's Siebel / Oracle CRM On Demand, Microsoft Dynamics, and I think SAP also has a SaaS offering...
It's a big topic, I'm simplifying a lot here. And then there are Platform as a Service things too (PaaS) where they give you "just" the hosting etc but no base application to build on top of. You write everything you need from scratch and upload it. Think Heroku or Amazon Web Services (AWS).
Salesforce is "just" one more SaaS application. You start with a base application and database, similar to all other clients in the world. You can install plugins to it (some free, some paid), configure it yourself, and write custom code if your functionality is too complex... You can do a lot with just clicks and drag and drop, but if you need to code, JavaScript (client-side) and Apex (server-side) will be your friends. Apex is a bit similar to Java.
Where to start... Trailhead is a good source of self-paced trainings. You can sign up for a free Salesforce Developer Edition (it has almost all the features of the paid one, but limited storage space) and try to pass some courses... Or in the Salesforce Help & Training site there are tons of videos (in that link, the whole left menu "Getting started with Salesforce" might be good).
I am working with my team to prep a project for a potential client. We've researched Amazon MWS API, and we're trying to develop an algorithm using the data scraped from this API.
Just want to make sure we understand the research correctly:
Is it possible to scrape data from Amazon.com like the plugins RevSeller or HowMany do? Then can we add that data to a database for use in an algorithm to determine whether or not an Amazon reseller should invest in reselling a product?
Thanks!
I am doing a similar project. I don't know the specifics of RevSeller or HowMany, but another very popular plugin is Amzpecty. If you use a tool like Fiddler, you can see the HTTP traffic and figure out what it does. They basically scrape the ASINs and offer listing IDs on the page you are looking at and, one by one, call the Amazon Product Advertising API, which is not the same thing as MWS. From the data returned, they produce a nice overlay that tells you all kinds of important stuff.
Instead of a browser plugin, I'm just writing an app that makes HTTP calls to the PA API based on a list of ASINs; I can then run the results through my own algorithms. Hope that gives you a starting point.
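That loop can be sketched in Python like this; `fetch_offer_data` is a placeholder for the signed PA API call (in practice you'd use an SDK for request signing), and the margin rule is only an illustrative stand-in for a real algorithm:

```python
def fetch_offer_data(asin):
    """Placeholder for a Product Advertising API call; a real version
    would return price, estimated fees, sales rank, etc. per ASIN."""
    raise NotImplementedError

def worth_reselling(offer, cost, min_margin=0.15):
    """Toy decision rule: buy only if the expected margin beats a threshold."""
    revenue = offer["lowest_price"] - offer["estimated_fees"]
    return (revenue - cost) / cost >= min_margin

def scan(asins, costs, fetch=fetch_offer_data):
    """Run the fetched offer data for each ASIN through the decision rule."""
    return {a: worth_reselling(fetch(a), costs[a]) for a in asins}
```

Persisting the fetched offers to a database between the fetch and the analysis, as the question describes, slots in naturally between `fetch` and `worth_reselling`.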
I have a project which requires SharePoint 2013 Search Service (on premise deployment) to index one Office 365 Shared Mailbox.
Based on my research the following is not possible:
- Exchange content source: it only works with old on-premises Exchange servers.
- eDiscovery: this is a different feature. It has Exchange configuration, but can't be used in search scenarios.
- Business Connectivity Services: I tried creating an external content source in Visual Studio, providing OData URLs, but didn't succeed.
- Hybrid federated search: works in the SharePoint Server 2013 to SharePoint Online scenario, not in mine.
- Third-party solutions are not accessible to my client.
This post is close to my scenario, but it is old and doesn't lead to a solution.
I also explored the new "Groups" in O365, but this is not applicable. My client needs a single place for searching SharePoint data and this Shared Mailbox.
What I see as the only possible approach is using the Outlook REST API for real-time searches against this Mailbox. I tested it and I'm able to retrieve data in a SharePoint-hosted app. The big problem is that I don't have refinements or a total item count.
Can someone point me to a better solution? Am I missing something?
I suggest that you use the Microsoft Graph REST API instead of the Outlook REST API.
It exposes multiple APIs from Microsoft cloud services through a single REST API endpoint (https://graph.microsoft.com). Using the Microsoft Graph, you can turn formerly difficult or complex queries into simple navigations.
The Microsoft Graph gives you:
- A unified API endpoint for accessing aggregated data from multiple Microsoft cloud services in a single response
- Seamless navigation between entities and the relationships among them
- Access to intelligence and insights coming from the Microsoft cloud
And you can use the $count query parameter to return the number of items in the collection. Here is an example that returns the number of results, for your reference:
GET: https://graph.microsoft.com/v1.0/me/messages?$filter=contains(subject,'a')&$count=true
You would get a response like the one below, where @odata.count holds the number of items:
@odata.context = https://graph.microsoft.com/v1.0/$metadata#users('')/messages
@odata.count = 341
@odata.nextLink = https://graph.microsoft.com/v1.0/me/messages?$filter=contains(subject,'a')&$count=true&$skip=10
value: [ ... ]
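A small stdlib-only Python sketch of issuing that query and reading the count (token acquisition omitted):

```python
import json
import urllib.parse
import urllib.request

def build_count_url(filter_expr):
    """Build the messages URL with $filter and $count=true, as above."""
    params = urllib.parse.urlencode({"$filter": filter_expr, "$count": "true"})
    return f"https://graph.microsoft.com/v1.0/me/messages?{params}"

def total_count(response_body):
    """Pull the collection size out of a Graph response payload."""
    return json.loads(response_body)["@odata.count"]

def search_messages(token, filter_expr):
    """Run the search and return the parsed JSON response."""
    req = urllib.request.Request(
        build_count_url(filter_expr),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The @odata.count field gives you the total item count the Outlook REST API approach was missing, and @odata.nextLink handles paging through the results.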