Using SSIS 2008 to migrate data into Microsoft Dynamics CRM

I've recently been toying with data migration into Microsoft Dynamics CRM using MS SQL Server Integration Services. First, the basic problem domain:
I have an exported flat file from a previous homebrew CRM system, and the goal is to clean up the data efficiently and then move it over into Dynamics CRM. I've decided to import one entity at a time to keep the orchestrations simple. There is currently an attribute in CRM that contains the primary key we used in the old CRM. The basic process in my head is: import the flat file into SSIS using the Excel adapter, then make a connection to the Microsoft Dynamics database to query for data related to the import. Since I'm not updating the database in any way, I figure this is fine. Once I have my list of account GUIDs and foreign keys, I will compare the list of Excel rows against the list from the CRM database and create a new derived column holding the GUID, indicating that the operation should be an update and that the GUID to use is the one in that row.
I then create a script object and make a call out to the CRM web service. I go down the Excel file row by row; if a row has a value in the derived column, it updates CRM, otherwise it just creates a new entity.
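For concreteness, here is a rough sketch of what that per-row script logic could look like in a C# Script Component against the CRM 4.0 SDK. The service URL, organization name, and column names (AccountName, CrmGuid) are placeholders for illustration, not a definitive implementation:

```csharp
// Sketch only: per-row create-or-update against the CRM 4.0 web service.
// Assumes the Microsoft.Crm.Sdk / Microsoft.Crm.SdkTypeProxy assemblies;
// column names and the service URL/organization are placeholders.
using System;
using Microsoft.Crm.Sdk;            // CrmAuthenticationToken
using Microsoft.Crm.SdkTypeProxy;   // CrmService, account, Key

public class ScriptMain : UserComponent
{
    private CrmService crmService;

    public override void PreExecute()
    {
        var token = new CrmAuthenticationToken
        {
            AuthenticationType = 0,        // 0 = Active Directory
            OrganizationName = "MyOrg"     // placeholder
        };
        crmService = new CrmService
        {
            Url = "http://crmserver/mscrmservices/2007/crmservice.asmx", // placeholder
            CrmAuthenticationTokenValue = token,
            UseDefaultCredentials = true
        };
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        var account = new account { name = Row.AccountName };

        if (!Row.CrmGuid_IsNull)
        {
            // The derived column has a GUID -> the record already exists, update it
            account.accountid = new Key { Value = new Guid(Row.CrmGuid) };
            crmService.Update(account);
        }
        else
        {
            // No GUID from the lookup -> brand new record
            crmService.Create(account);
        }
    }
}
```

The row-by-row web service calls are the slow part; as far as I know CRM 4.0 has no bulk create/update call, so if throughput becomes a problem the usual trick is to run several of these streams in parallel rather than to batch.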
If all goes well, I'll package up the SSIS project and execute it from the SQL Server.
Is there any gaping flaw in this logic? I'm sure there are ways to make it faster, but I can't think of any that would make a drastic difference. Any thoughts?

Your design is good. In fact, specialized CRM integration software such as Scribe (and probably others) works in much the same way with most of its adapters: direct database access for reads, and web service calls for inserts, updates, deletes, and other operations.
I just wonder whether this complexity is actually necessary. It depends on the size of the data you have to import; I usually deal with data that gets imported overnight.

Sounds good to me - by getting the GUIDs directly from the database, you are reducing the number of necessary web service calls.

CozyRoc has recently released a new version, which includes Dynamics CRM integration components. Check the official release announcement.

Related

Pentaho / Salesforce: How to integrate SF-Enterprise-Web-Services-API V48.0 into PDI 9.0 that only supports v47.0

Actually, I am working with PDI 8.2; however, I am able to upgrade to 9.0.
The main issue is that a customer wants to pull data from Salesforce, which works well so far. But he is using the Enterprise Web Services API at version 48.0, while the latest Pentaho supports only v47.0.
I strongly assume that reading via v48.0 won't work with PDI, so I'd have to build a workaround. Could anyone point me to a feasible solution? To be honest, I don't even know whether the Enterprise or the Partner API is relevant for Pentaho. I have my own SF account, so I can experiment with the APIs.
Is the "Web Services lookup" the right step for the workaround?
Any answer would be appreciated! Thanks in advance!
Oh man, what a crazy question, all over the place.
I strongly assume that reading via v48.0 won't work
You'd have to try it, but it should work. Salesforce has three releases a year, and that's when they upgrade API versions. We're in Spring '20 now, which is v48. That doesn't mean anything below it is deprecated; you should have no problems calling with any API version >= 20. From what I remember, their master service agreement states that a released API version will stay up for at least 3 years. Well, v20 is 9 years old and still going strong...
Check for example https://baa.my.salesforce.com/services/data/ (if your client has "My Domain" enabled, you can use their domain instead of some unknown company's); you should see a list similar to this: https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_versions.htm (no login required; that would be a chicken-and-egg situation, since you have to choose the API version you want when making the login call).
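If you'd rather script that check than open a browser, a plain unauthenticated GET is all it takes; a minimal C# sketch (the org URL is a placeholder):

```csharp
// Sketch: GET /services/data/ returns the org's supported REST API
// versions as JSON; no authentication is required for this endpoint.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class ApiVersionProbe
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            string json = await client.GetStringAsync(
                "https://yourcompany.my.salesforce.com/services/data/"); // placeholder
            // e.g. [{"label":"Spring '20","url":"/services/data/v48.0","version":"48.0"}, ...]
            Console.WriteLine(json);
        }
    }
}
```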
So... what does your integration do? I assume it reads from or writes to SF tables (objects), pretty basic stuff. In that sense the 47 vs. 48 difference won't matter much. You should still see Accounts, Contacts, custom objects... You just won't see tables introduced specifically in v48. Unless you must see something mentioned in the Spring '20 release notes, I wouldn't worry too much.
If your client wrote a specific class (service) to present you with data and it's written in v48, it might not be visible when you log in as v47. But then they can just downgrade the version and all should be well. Such custom services are rarely usable by generic ETL tools anyway, so it'd be a concern only if you do custom coding.
whether the Enterprise or the Partner API is relevant for Pentaho
Sounds like your ETL tool uses SOAP API. Salesforce offers 2 versions of the WSDL file with service definitions.
"Partner" is generic, all SF orgs in the world produce same WSDL file. It doesn't contain any concrete info about tables, columns, custom services written on top of vanilla salesforce. But it does contain info how to call login() or run a "describe" that gives you all tables your user can see, what are their names, what are columns, data types... So you learn stuff at runtime. "Partner" is great when you're building a generic reusable app that can connect to any SF or you want to be dynamic (some backup tool that learns columns every day and can handle changes without problems. Or there's a "connection wizard" where you specify which tables, which columns, what mapping... new field comes in - just rerun the wizard).
"Enterprise" will be specific to this particular customer. It contains everything "Partner" has but will also have description of current state of database tables etc. So you don't have to call "describe", you already have everything on the plate. You can use this to "consume" the WSDL file, generate your Java/PHP/C# classes out of it and interact with them in your program like any other normal object instead of crafting XML messages.
The downside is that if new field or new table is added - you won't know if your program doesn't call "describes". You'd need to generate fresh WSDL and consume it again and recompile your program...
Picking right one really depends what you need to do. ETL tools I've met generally are fine with "partner".
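To make the "learn at runtime" idea concrete, here's a hedged C# sketch against proxy classes generated from the Partner WSDL (SforceService, LoginResult, and friends are what .NET's WSDL import typically produces; treat the exact names as assumptions):

```csharp
// Sketch: Partner API login, then runtime discovery via describeGlobal().
// SforceService, LoginResult, SessionHeader, DescribeGlobalResult are all
// generated from the Partner WSDL, not hand-written types.
using System;

class PartnerDescribeDemo
{
    static void Main()
    {
        var binding = new SforceService();
        LoginResult lr = binding.login("user@example.com", "passwordPlusSecurityToken");

        // After login, talk to the server instance the login call returned
        binding.Url = lr.serverUrl;
        binding.SessionHeaderValue = new SessionHeader { sessionId = lr.sessionId };

        // Discover every table the user can see - no compile-time schema needed
        DescribeGlobalResult dgr = binding.describeGlobal();
        foreach (DescribeGlobalSObjectResult sobj in dgr.sobjects)
            Console.WriteLine(sobj.name); // Account, Contact, custom objects...
    }
}
```

With the Enterprise WSDL you'd skip the describe call and work with strongly typed Account/Contact classes instead - convenient until the schema changes under you, as noted above.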
Is the "Web Services lookup" the right step
No idea; I've used Informatica, Azure Data Factory, Jitterbit, Talend... but I don't know this Pentaho thing. Just try it. If you pull data straight from SF tables without invoking any custom code (you can think of SF custom services like pulling data from stored procedures), the API version shouldn't matter that much. If you go below 41.0, I believe you won't see the Individual object, for example, but I doubt you need to be on that much of a cutting edge.

How to expose sql query (with joins on multiple table) or stored procedure as webservice

I am a novice on the web services front, so please bear with me. I have developed a JavaFX desktop application that connects to a database the standard JDBC way. Now that I want to turn this into a web application, the JDBC has to go.
I am now putting TomEE in between and have created RESTful APIs for DB access (entity classes for the tables, with RESTful APIs on top of them). However, I am not sure how to do this where:
a) I need to execute a stored procedure,
b) the SQL query is a join across multiple tables, and
c) I need to insert a sequence value into one of the columns.
Any help is appreciated, especially regarding (a) and (b).
If you want a no-frills way of doing it, I'd use Dropwizard (http://www.dropwizard.io).
If you want more customization, build the stack using JAX-RS, Bean Validation, an EJB as a DAO, and a couple of JPA objects to represent your tables.
See examples here:
http://tomee.apache.org/examples-trunk/simple-rest/README.html
http://tomee.apache.org/examples-trunk/simple-stateless/README.html
http://tomee.apache.org/examples-trunk/injection-of-entitymanager/README.html

Creating Orders in Microsoft Dynamics NAV via web services or an API

I am tasked with creating an API that would allow 3rd-party customers to send orders into our Microsoft Dynamics NAV 5.0 SP1 system.
I want to be able to create a SalesOrder in Dynamics NAV not through the client but via an API, so I can allow a separate process to enter orders automatically.
Any help leading me in the right direction is appreciated.
Well, it depends on how complicated you want to make it. Do you need confirmation of the Sales Order creation in "real time"? If so, you'll need to use a Web Service and ensure that there is a network path from wherever customers will create orders (public internet, extranet) to your NAV Web Service - likely using a VPN tunnel, etc.
Alternatively, if you can live with a batch-type process, you can have your customers create SOs via a web-based form, etc., and then import those orders into NAV on a regular basis using Dataports or XMLports.
For example, you might have an online form where your customer can create an order, which places the order in a staging table in SQL (or even an XML or CSV file). Then you can run a process on a regular basis that imports these orders into NAV and creates the appropriate SalesOrders.
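A minimal sketch of the web-form side of that staging approach, using plain ADO.NET (the table and column names are invented for illustration):

```csharp
// Sketch: the order waits in a staging table until the scheduled
// NAV import (Dataport/XMLport) picks it up and creates the SalesOrder.
using System.Data.SqlClient;

class OrderStaging
{
    public static void StageOrder(string connString,
        string customerNo, string itemNo, int quantity)
    {
        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand(
            @"INSERT INTO SalesOrderStaging (CustomerNo, ItemNo, Quantity, CreatedAt)
              VALUES (@cust, @item, @qty, GETDATE())", conn))
        {
            cmd.Parameters.AddWithValue("@cust", customerNo);
            cmd.Parameters.AddWithValue("@item", itemNo);
            cmd.Parameters.AddWithValue("@qty", quantity);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```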
Presumably, you also need a way to expose your Item database to the Ordering interface so customers can select which Items to order (and therefore create SalesLines from).
Which type of scenario are you interested in?
Web services are the way to go; we have several applications with a similar requirement. I'd recommend building an interface (ASP, to utilise the web service from NAV) and having it talk to NAV that way.
Editing the database directly is not recommended, as it will cause locking and may result in deadlocks if you're not careful. Also, NAV can be quite sensitive when it comes to the database, so it's best not to write to it directly if possible :)
I'd recommend creating a codeunit that handles the sales order, in which you can create your functions, e.g. 'CreateOrder', and then exposing that via web services. Even if you're not planning to use a web-based interface, NAV speaks the SOAP protocol, and many libraries exist to let you connect to web services from other languages, Java for instance.
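As a sketch of what the consuming side could look like (every name here is invented; the proxy class is whatever gets generated from the WSDL of the codeunit you publish, and if I remember right native web service publishing only arrived with NAV 2009, so on 5.0 you'd need something like NAS or a connector in front of the codeunit):

```csharp
// Sketch: calling a hypothetical CreateOrder function on a NAV codeunit
// exposed as a SOAP web service. The URL, proxy type, and signature are
// all placeholders that depend on what you actually publish from NAV.
using System;

class OrderClient
{
    static void Main()
    {
        var service = new SalesOrder_Service   // generated SOAP proxy (assumption)
        {
            Url = "http://navserver:7047/DynamicsNAV/WS/MyCompany/Codeunit/SalesOrder",
            UseDefaultCredentials = true
        };

        string orderNo = service.CreateOrder("CUST-001", "ITEM-100", 5);
        Console.WriteLine("Created sales order " + orderNo);
    }
}
```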

Sync Framework Considerations for Smart Client app

Microsoft Sync Framework with SQL 2005 - is it possible? The documentation seems to hint that the out-of-the-box providers rely on SQL 2008 functionality.
I'm looking for some quick wins in relation to a sync project.
The client app will be offline for a number of days.
There will be a central server that MUST be SQL Server 2005.
I can use .net 3.5.
Basically the client app could go offline for a week. When it comes back online, it needs to sync its data. The good thing is that the data only needs to be pushed to the server; the stuff that syncs back to the client is just lookup data that the client never changes, so I don't care about sync collisions.
To simplify the scenario for you: this smart client goes offline and the user records survey data about some observations, entering the data into the system. When the laptop is reconnected to the network, it syncs all that data back to the server. There will be other clients doing the same thing, but no one ever touches anyone else's data. Then there are some reports on the server for viewing the data that has been pushed up. This also needs to use ClickOnce.
My biggest concern is that there may be an interim release while a client is offline. This release might require a new field in the database, and a new field to fill in on the survey.
Obviously that new field will be nullable, because we can't update old data; that's fine to set as an assumption. But when the client reconnects and its local data schema and the server schema don't match, will the Sync Framework be able to handle this? After the data is pushed to the server it is discarded locally.
Hope my problem makes sense.
I've been using the Microsoft Sync Framework for an application that has to deal with collisions, and it's miserable. The *.sdf file is locked for certain schema changes (like dropping a column).
Your scenario sounds like the Sync Framework would work for you out of the box... just don't enforce any referential integrity, since the Sync Framework will cause insert issues in those instances.
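For the upload-only piece, a rough sketch with Sync Services for ADO.NET (the offline provider pairing available on .NET 3.5); the table name is a placeholder, and the DbServerSyncProvider additionally needs a connection plus SyncAdapters with its insert/update/delete commands, omitted here:

```csharp
// Sketch: one-way (client -> server) sync of a survey table.
// Lookup tables that only flow server -> client would be added the
// same way with SyncDirection.DownloadOnly.
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServerCe;

class SurveySync
{
    public static void Run(string ceConnectionString)
    {
        var surveyTable = new SyncTable("Survey")   // placeholder table
        {
            SyncDirection = SyncDirection.UploadOnly
        };

        var agent = new SyncAgent
        {
            LocalProvider = new SqlCeClientSyncProvider(ceConnectionString),
            RemoteProvider = new DbServerSyncProvider() // needs SyncAdapters + commands
        };
        agent.Configuration.SyncTables.Add(surveyTable);

        SyncStatistics stats = agent.Synchronize();
    }
}
```

Because DbServerSyncProvider works through ordinary ADO.NET commands you supply yourself, the server can be SQL 2005; as far as I recall, the SQL 2008-specific change tracking only matters for the newer collaboration providers.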
As far as updating the schema goes, if you follow the Sync Framework forums, you are instructed to create a temporary table that looks the way the table should look at the end, copy your data over, and then drop the old table. Once done, rename the new table to what it should be and hook the SQL Server CE plumbing that drives the sync classes back up.
IMHO, I'm going to be working on removing the sync functionality from my application, because it is more of a hindrance than an aid.

Concurrency in RIA

This'll be my first question on this platform. I've done a lot of development using Flex, WebORB and ASP.NET. We have solved concurrency problems with messaging (pessimistic concurrency control). This works pretty well, but it also makes the whole application dependent on the messaging: no messaging, no concurrency control.
I know that ASP.NET has version control in DataSets, but how would you use that in an RIA? It seems hard to store each DataSet in the client's session... If the client needs all products, I would have to store that DataSet in the client's session; when the client changes a product and saves it, I could then update the DataSet (stored in the session) and try to save it...
That seems like a lot of work and a lot of memory: the products are kept in the client's memory anyway, and on top of that the DataSet has to be kept in the server-side session.
I think the easiest way would be to give every DTO a version number. When the client tries to save a DTO, I can compare its version number with the one in the database.
Lieven Cardoen
This is something I've done before: as the original data was coming from a SQL Server database, we just used a rowversion-typed column in each DTO to determine whether it had changed while the user was working on it.
At that point you can either barf on the error or try to figure out a way to merge the changes, but at least you can tell that it has changed underneath you :)
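In practice the version check can live in the UPDATE itself; a minimal sketch, with invented table and column names:

```csharp
// Sketch: optimistic concurrency via a rowversion column. The UPDATE
// only hits if the row is unchanged since the client read it;
// 0 rows affected means someone else saved first.
using System.Data.SqlClient;

class ProductRepository
{
    public static bool TrySave(string connString, int id, string name, byte[] rowVersion)
    {
        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand(
            @"UPDATE Product SET Name = @name
              WHERE Id = @id AND RowVersion = @rv", conn))
        {
            cmd.Parameters.AddWithValue("@name", name);
            cmd.Parameters.AddWithValue("@id", id);
            cmd.Parameters.AddWithValue("@rv", rowVersion);
            conn.Open();
            return cmd.ExecuteNonQuery() == 1; // false -> stale DTO, surface the conflict
        }
    }
}
```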