I have a project which requires SharePoint 2013 Search Service (on-premises deployment) to index one Office 365 Shared Mailbox.
Based on my research the following is not possible:
Exchange content source: it only works with older on-premises Exchange servers.
eDiscovery: this is a different feature. It has an Exchange configuration, but it can't be used in search scenarios.
Business Connectivity Services: I tried creating an external content source in Visual Studio using OData URLs, but didn't succeed.
Hybrid federated search: works from SharePoint Server 2013 to SharePoint Online, not in my scenario.
Third-party solutions are not accessible to my client.
This post is close to my scenario, but it is old and doesn't lead to a solution.
I also explored the new “Groups” in O365, but they are not applicable. My client needs a single place for searching both SharePoint data and this Shared Mailbox.
What I see as the only possible approach is using the Outlook REST API for real-time searches against this Mailbox. I tested it and I'm able to retrieve data in a SharePoint-hosted app. The big problem is that I don't have refinements or a total item count.
Can someone point me to a better solution? Am I missing something?
I suggest that you use the Microsoft Graph REST API instead of the Office 365 REST API.
It exposes multiple APIs from Microsoft cloud services through a single REST API endpoint (https://graph.microsoft.com). Using the Microsoft Graph, you can turn formerly difficult or complex queries into simple navigations.
The Microsoft Graph gives you:
A unified API endpoint for accessing aggregated data from multiple Microsoft cloud services in a single response
Seamless navigation between entities and the relationships among them
Access to intelligence and insights coming from the Microsoft cloud
And you can use the $count query parameter to return the number of items in the collection. Here is an example that returns the number of results, for your reference:
GET: https://graph.microsoft.com/v1.0/me/messages?$filter=contains(subject,'a')&$count=true
You would get a response like the one below, including the number of items:
@odata.context: https://graph.microsoft.com/v1.0/$metadata#users('')/messages
@odata.count: 341
@odata.nextLink: https://graph.microsoft.com/v1.0/me/messages?$filter=contains(subject,'a')&$count=true&$skip=10
value: [ ... ]
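If it helps, here is a minimal sketch of making that same call from code. It assumes you already have a valid OAuth access token with the Mail.Read permission and a runtime where fetch is available (Node 18+ or a browser); error handling is omitted.

// Query messages matching a subject term and read the total count ($count=true).
async function searchMessages(accessToken: string, term: string) {
  const url =
    "https://graph.microsoft.com/v1.0/me/messages" +
    `?$filter=contains(subject,'${term}')&$count=true`;

  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });

  const data = await response.json();
  console.log("Total matches:", data["@odata.count"]); // e.g. 341
  console.log("Items in this page:", data.value.length);
  return data;
}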
I have exported Firestore collections to Google BigQuery to do data analysis and aggregation.
What is the best practice (using Google Cloud Products) to serve Big Query outputs to a client web application?
Google provides seven client libraries for BigQuery. You can take any of these libraries and write a webserver that serves requests from the client web application. The webserver can use a GCP service account to access BigQuery on behalf of its clients.
One such sample is this project. It's written in TypeScript and uses the Node.js library on the server and React for the client app. I'm the author.
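As a rough illustration of that pattern (not taken from the project above), here is a minimal Express server that runs a query and returns the rows as JSON. The project ID, dataset, and table names are placeholders, and credentials are assumed to come from a service-account key referenced by GOOGLE_APPLICATION_CREDENTIALS.

import express from "express";
import { BigQuery } from "@google-cloud/bigquery";

const app = express();
const bigquery = new BigQuery(); // picks up the service-account credentials

app.get("/api/daily-totals", async (_req, res) => {
  // Hypothetical aggregation over an exported collection.
  const query = `
    SELECT DATE(created_at) AS day, COUNT(*) AS documents
    FROM \`my-project.my_dataset.my_exported_collection\`
    GROUP BY day
    ORDER BY day DESC
    LIMIT 30`;
  const [rows] = await bigquery.query({ query });
  res.json(rows); // the web client consumes this JSON
});

app.listen(8080, () => console.log("Listening on :8080"));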
You may also want to take a quick tour of Google Data Studio and look at the main features this Google analytics service offers. If your aim is to visualize data from BigQuery, Data Studio is a good option: it provides a variety of informative dashboards and reports, and it lets users customize charts and graphs and share them publicly or with collaboration groups.
Data Studio offers many connectors to different data sources, including a dedicated BigQuery connector for integrating with data residing in the BigQuery warehouse.
You can track any future product enhancements here.
I'd like to do a key phrase analysis of a Microsoft Word document. It looks like the API only takes JSON documents. Is there a way to use real-life documents like Microsoft Office documents?
A recent hackathon project, Resolving Managed Metadata Madness in SharePoint, answers this question.
The developer of that project used a three-step process involving custom code. An Azure Function was written to extract the text to pass to the API, and the function returns the results of the analysis back to Microsoft Flow:
A Flow attached to the Document Library calls the Azure Function that does the heavy lifting.
The Azure Function runs: it extracts the text, analyzes it using Azure Cognitive Services (see the sketch below), and writes the information back to SharePoint Online.
Finally, the Flow notifies the admin and the creator of the file about the execution.
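For the Cognitive Services step, a minimal sketch of the key-phrase call might look like the following. It assumes the Text Analytics key-phrase endpoint (v3.0), a subscription key and endpoint stored in environment variables, and that the plain text has already been extracted from the Word document; none of this is taken from the project's actual code.

// Send extracted text to the Text Analytics keyPhrases endpoint and return the phrases.
async function keyPhrases(text: string): Promise<string[]> {
  const endpoint = process.env.TEXT_ANALYTICS_ENDPOINT; // e.g. https://<region>.api.cognitive.microsoft.com
  const response = await fetch(`${endpoint}/text/analytics/v3.0/keyPhrases`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Ocp-Apim-Subscription-Key": process.env.TEXT_ANALYTICS_KEY ?? "",
    },
    body: JSON.stringify({ documents: [{ id: "1", language: "en", text }] }),
  });
  const result = await response.json();
  return result.documents?.[0]?.keyPhrases ?? [];
}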
I am working with my team to prep a project for a potential client. We've researched the Amazon MWS API, and we're trying to develop an algorithm using data scraped from this API.
Just want to make sure we understand the research correctly:
Is it possible to scrape data from Amazon.com like the plugins RevSeller or HowMany do? Then can we add that data to a database for use in an algorithm to determine whether or not an Amazon reseller should invest in reselling a product?
Thanks!
I am doing a similar project. I don't know the specifics of RevSeller or HowMany, but another very popular plugin is Amzpecty. If you use a tool like Fiddler, you can see the HTTP traffic and figure out what it does. These plugins basically scrape the ASINs and offer listing IDs on the page you are currently looking at and, one by one, call the Amazon Product Advertising API, which is not the same thing as MWS. Out of the data returned, they produce a nice overlay that tells you all kinds of important information.
Instead of a browser plugin, I'm just writing an app that makes HTTP calls to the PA API based on a list of ASINs, and then I can run the results through my own algorithms. Hope that gives you a starting point.
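To make that concrete, here is a rough sketch of the loop described above. getOffersForAsin is a hypothetical wrapper around whatever Product Advertising API client and request-signing code you end up using, and the scoring rule is an arbitrary placeholder for your own algorithm.

interface OfferSummary {
  asin: string;
  lowestPrice: number;
  offerCount: number;
}

// Hypothetical PA API wrapper: one signed request per ASIN.
declare function getOffersForAsin(asin: string): Promise<OfferSummary>;

async function evaluateAsins(asins: string[]) {
  const results = [];
  for (const asin of asins) {
    const offer = await getOffersForAsin(asin);
    // Placeholder rule; replace with your own investment criteria.
    const worthReselling = offer.offerCount < 10 && offer.lowestPrice > 15;
    results.push({ ...offer, worthReselling });
  }
  return results; // persist these rows to your database for later analysis
}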
I am tasked with creating an API that would allow third-party customers to send orders into our Microsoft Dynamics NAV 5.0 SP1 system.
I want to be able to create a SalesOrder in Dynamics NAV not through the client but via an API, so I can allow a separate process to enter orders automatically.
Any help is appreciated in leading me in the right direction.
Well, it depends on how complicated you want to make it. Do you need confirmation of the Sales Order creation in "real time"? If so, you'll need to use a Web Service and ensure that there is a network path from wherever customers will create orders (public internet, extranet) to your NAV Web Service - likely using a VPN tunnel, etc.
Alternatively, if you can live with a batch-type process, you can have your customers create SOs via a web-based form, etc., and then import these orders into NAV on a regular basis using Dataports or XMLPorts.
For example, you may have an online form where your customer can create an Order, which places the Order in a staging table in SQL or even in an XML or CSV file. Then you can run a process on a regular basis that imports these Orders into NAV and creates the appropriate SalesOrders.
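As a sketch of that batch approach (the field names and the file path are placeholders, not part of any NAV schema), the web form could post to a small endpoint that appends each order to a staging file for NAV to import later:

import express from "express";
import { appendFile } from "node:fs/promises";

const app = express();
app.use(express.json());

app.post("/orders", async (req, res) => {
  const { customerNo, itemNo, quantity } = req.body;
  // One CSV line per order line; a scheduled Dataport/XMLPort import picks this file up.
  await appendFile("staging/orders.csv", `${customerNo},${itemNo},${quantity}\n`);
  res.status(202).json({ status: "queued" });
});

app.listen(3000);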
Presumably, you also need a way to expose your Item database to the Ordering interface so customers can select which Items to order (and therefore create SalesLines from).
Which type of scenario are you interested in?
Web Services are the way to go; we have several applications with a similar requirement. I'd recommend building an interface (ASP, to utilise the web service from NAV) and having it talk to NAV that way.
Editing the database directly is not recommended, as it will cause locking and may result in deadlocks if you're not careful. Also, NAV can be quite sensitive when it comes to the database, so it's best not to write to it directly if possible :)
I'd recommend creating a codeunit that handles the sales order, in which you can create your functions, such as 'CreateOrder', and then expose that via Web Services. Even if you're not planning to use a web-based interface, NAV uses the SOAP protocol, and many libraries exist that let you connect and interface with Web Services from other languages, like Java, for instance.
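Purely as an illustration (the endpoint URL, namespace, and parameter names below are assumptions, not the actual schema NAV generates for your codeunit), a client could call such a 'CreateOrder' function over SOAP roughly like this:

// Build a SOAP envelope by hand and post it to the exposed codeunit.
async function createOrder(customerNo: string, itemNo: string, qty: number) {
  const envelope = `<?xml version="1.0" encoding="utf-8"?>
  <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <soap:Body>
      <CreateOrder xmlns="urn:example-nav/codeunit/SalesOrderWS">
        <customerNo>${customerNo}</customerNo>
        <itemNo>${itemNo}</itemNo>
        <quantity>${qty}</quantity>
      </CreateOrder>
    </soap:Body>
  </soap:Envelope>`;

  const response = await fetch("https://nav-server:7047/WS/Codeunit/SalesOrderWS", {
    method: "POST",
    headers: { "Content-Type": "text/xml; charset=utf-8", SOAPAction: "CreateOrder" },
    body: envelope,
  });
  return response.text(); // SOAP response XML, e.g. the created order number
}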
I've recently been toying with data migration into Microsoft Dynamics CRM using MS SQL Server Integration Services. First, the basic problem domain:
I have an exported flat file from a previous homebrew CRM system. The goal is to efficiently clean up the data and then move it over into Dynamics CRM. I've decided to put in one entity at a time in order to keep the orchestrations simple. There is currently an attribute in CRM that contains the primary key we used in the old CRM.
The basic process in my head currently is: import the flat file into SSIS using the Excel adapter, then make a connection to the Microsoft Dynamics database in order to query for data related to the import. Since I'm not updating the database in any way, I figure this is fine. Once I have my list of Account GUIDs and foreign keys, I will compare the list of Excel rows to the list from the CRM database and create a new derived column containing the GUID, indicating that the operation should be an update and that the GUID to use is the one in that row.
I then create a script object and make a call out to the CRM Web Service. I go down the Excel file row by row; if a row has a value in the derived column, the script updates CRM, otherwise it creates a new entity.
If all goes well I'll package the SSIS and execute it from the SQL server.
Is there any gaping flaw in this logic? I'm sure there are ways to make it faster, but I can't think of any that would make a drastic difference. Any thoughts?
Your design is good. Actually, specialized CRM integration software like Scribe (and probably others too) does it very much this way with most of its adapters: direct database access for reads, and calls to the web service for insert/update/delete and other operations.
I just wonder if this complication is actually necessary. It depends on the size of the data you have to import; I usually deal with data that gets imported overnight.
Sounds good to me - by getting the GUIDs directly from the database, you are reducing the number of necessary web service calls.
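For what it's worth, the create-versus-update decision the question describes boils down to something like the following outline (names are illustrative and the CRM calls are stubbed out; in practice this logic lives in the SSIS script component):

interface SourceRow { legacyId: string; name: string; }

// Stubs for the actual CRM web service calls made by the script component.
declare function createAccount(row: SourceRow): void;
declare function updateAccount(guid: string, row: SourceRow): void;

function upsertAccounts(rows: SourceRow[], legacyIdToGuid: Map<string, string>) {
  for (const row of rows) {
    const guid = legacyIdToGuid.get(row.legacyId); // the derived column from the DB query
    if (guid) {
      updateAccount(guid, row); // match found: update the existing record by GUID
    } else {
      createAccount(row);       // no match: create a new entity
    }
  }
}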
CozyRoc has recently released a new version, which includes Dynamics CRM integration components. Check the official release announcement here.