How can we create a data source in WSO2 Micro Integrator?

I am new to WSO2 Micro Integrator (MI). I just want to know how to create a data source for DB connection details, so that I can reuse the data source for further development.
Could someone help me with this?

As of now, you need to define data sources inline wherever you need them. Reusing them is not possible.

Related

Has anyone set up SFTP with Google Cloud Platform to trigger a Cloud Function?

So I was able to set up SFTP so that I'm able to send a file to my VM instance on GCP.
However, does anyone have any guidance on how to then have a Cloud Function run at the time the file is uploaded?
The objective is that our customers send booking requests via SFTP in a certain EDI format.
We would need a Cloud Function to then run off of what was sent.
Any direction would be greatly appreciated 🙏
As an idea to think about: you could implement an SFTP server on a VM in such a way that it uses Cloud Storage buckets for file management behind the externally exposed standard interface. In that case you may be able to use storage events to trigger Cloud Functions.
I think such implementations do exist, so it should be possible to find some examples.
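For example, here is a minimal sketch of such a storage-triggered function in Python (the function name, bucket, and EDI-handling step are placeholders, not a known working setup):

```python
# Minimal sketch of a Cloud Function triggered when an object is
# finalized (uploaded) in a Cloud Storage bucket. The EDI handling
# is a placeholder; only the trigger wiring is the point here.
from google.cloud import storage

def handle_upload(event, context):
    """Background function for google.storage.object.finalize events."""
    bucket_name = event["bucket"]
    file_name = event["name"]

    # Fetch the uploaded file so it can be parsed.
    blob = storage.Client().bucket(bucket_name).blob(file_name)
    contents = blob.download_as_bytes()

    # Placeholder: parse the EDI booking request and act on it.
    print(f"Received {file_name} ({len(contents)} bytes) from {bucket_name}")
```

A function like this would be deployed with something along the lines of `gcloud functions deploy handle_upload --runtime python311 --trigger-resource YOUR_BUCKET --trigger-event google.storage.object.finalize`, so it fires on every new upload.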
So I could never figure out a simple, clean way of doing it on GCP. I did find this new platform called Stedi, and I was able to meet my needs with it.
DISCLAIMER: I am not affiliated with Stedi in any way. I just want others to know that it's out there because it could save some developers some major headaches.

How do we dump data into Informatica?

I have to dump data from various sources into Informatica. The sources are some manual files which would be dumped via an SFTP server, some via APIs, and some via direct DB connections. In that case, how do we connect to the files on the server: via some kind of connection to the SFTP server, an API endpoint connection, or a DB connection via a DB endpoint? In these cases, how do we authenticate? I don't want to use username/password; is there a way to connect using Active Directory?
How does Informatica verify that the source of the files is genuine?
If you mean the source itself, then you need to decide whether the source is genuine before you create a connection to it.
If you mean how to secure the connection, then that is a property of the source and defined by the owner of the source. Informatica can use almost any industry-standard secure protocol and authentication method.
Any way to scan for malicious files?
Informatica can implement any business rules you want to define to determine whether the data in a file is malicious.
If you are asking whether there is a "magic button" you can press that will tell you if a file is malicious, then the answer is no.
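To make "business rules" concrete: outside of any particular tool, such checks are just ordinary validation code. Here is a generic sketch in Python (the allowed extensions, size cap, and content rule are made-up examples, not anything Informatica-specific):

```python
# Generic sketch of file-level sanity rules of the kind you might define;
# the thresholds and allowed extensions are arbitrary examples.
from pathlib import Path

ALLOWED_EXTENSIONS = {".csv", ".txt", ".xml"}
MAX_SIZE_BYTES = 50 * 1024 * 1024  # arbitrary 50 MB cap

def file_looks_suspect(path: Path) -> bool:
    """Return True if the file violates any of the example rules."""
    if path.suffix.lower() not in ALLOWED_EXTENSIONS:
        return True
    if path.stat().st_size > MAX_SIZE_BYTES:
        return True
    # Example content rule: the file must be readable as UTF-8 text.
    try:
        path.read_text(encoding="utf-8")
    except UnicodeDecodeError:
        return True
    return False
```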
Answer to Question about PocketETL
Once you've identified all the functionality required to implement your overall architecture, you have two basic options for how to satisfy these requirements:
1. Identify a single tool that covers as much of the functionality as possible and then fill in the gaps with other tools:
   - simplest to implement
   - should "just work"
   - unlikely to be "best of breed" in all areas
   - unlikely to be the cheapest solution
2. Implement point solutions for each area of functionality:
   - likely to be a better solution, for you, in each area
   - may be cheaper
   - but you have to get all the components working together, which is unlikely to be trivial
   - you need to know how to implement and configure multiple products, not just one
So you could use Informatica to do everything, or you could use PocketETL to do the first piece of data movement and then other tools to implement the rest of the data pipeline.

How to synchronize local DynamoDB and the Amazon DynamoDB web service

Hello, and thanks for viewing my question!
I am running Amazon DynamoDB locally and all databases are saved locally. With local DynamoDB, I have to inspect everything with a lot of code, but I feel the web service interface is much better: there I can perform operations and see the tables directly and clearly.
So may I ask how I can connect them, so that I can practice coding and check the status easily?
Looking forward to your reply and thank you so much!
Sincerely
You cannot connect them as they are completely separate databases. However, you can put a simple user interface on top of your local DynamoDB database.
I use the SQLite Browser: http://sqlitebrowser.org/. Once you have it installed, open the .db file located in the folder where you are running DynamoDBLocal.jar. You should be able to see all your tables and the data within them. You won't be able to see DynamoDB-specific things like your provisioned capacity, but I think this will give you enough of what you're looking for.
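If you'd rather check things from code, you can also point the AWS SDK at the local instance. A minimal sketch with boto3, assuming DynamoDB Local is listening on its default port 8000 (the credentials are dummies, which DynamoDB Local ignores):

```python
# Minimal sketch: point the AWS SDK at DynamoDB Local instead of the
# web service by overriding endpoint_url. Dropping endpoint_url makes
# the same code talk to the real service; they remain separate databases.
import boto3

dynamodb = boto3.client(
    "dynamodb",
    endpoint_url="http://localhost:8000",  # default DynamoDB Local port
    region_name="us-east-1",
    aws_access_key_id="dummy",
    aws_secret_access_key="dummy",
)

# List the tables held by the local instance.
print(dynamodb.list_tables()["TableNames"])
```

This underlines the point above: the local and hosted databases are reached through the same API, just different endpoints, so code you practice locally carries over.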
Does this help?

Does BizTalk Server support data exchange without the use of web services?

As I have very little knowledge of how ESBs work in tandem with databases, I'm asking how communication can take place between the two, hoping I'll at least be pointed in the right direction to search in!
SITUATION: We have two systems (one of them is the client's) on different networks, each with its own database. We are required to do a regular, real-time exchange of all data points present in our database with the other system. We are also required to have a provision to be able to import data into our system. This exchange has to follow SOA principles over a customer-provided BizTalk ESB. We are supposed to provide the exchange via ODBC.
QUESTION: Is it possible to integrate the databases with the ESB as endpoints, without making any use of web services or extra interfaces, and send the data over the ESB with a pull-push transfer mechanism?
I have tried searching the net for this situation but have not come up with many straightforward answers. Could someone please point me in the right direction?
The ESB Toolkit in BizTalk is not an ESB! It is just a small additional tool for some special cases.
Let's stop talking about the ESB; we need to solve the technical problem, right?
As I understand it, you have two SQL databases and want to integrate them.
The easiest way to do so with BizTalk is to use the WCF-SQL ports/adapters.
You start the wizard for this adapter and choose the tables/stored procedures which should provide or consume data; the wizard will generate all the needed XML schemas for you.
Then you use the BizTalk Mapper to create the XSLT maps, which will transform one SQL data format into another.
Then you create a pair of ports: one will consume data from one SQL database, and the second will insert data into the other SQL database. One of these ports will use the above-mentioned XSLT map.
If you need more processing, you can create an orchestration to manage additional processing, sophisticated error handling, etc.
I would recommend using MSMQ. There's a fairly detailed description of it here.

Simple ETL: Smooks or ETL product

I am fairly new to the subject and doing some research.
I have an ESB (using WSO2 ESB) and want to extract master data from the passing messages (like customers, orders, etc.) and store it in a DB to keep as reference data. The source data is XML coming from web services.
So there needs to be a component that can maintain the master data: insert new objects, delete old ones, and update changed ones (it would also be nice to have data events so the ESB can route data accordingly). Basically, the logic will be similar for any entity type, and it might be a good idea to autogenerate it for all new entity types...
Options as I see them now:
Use Smooks with either SQLExecutor or Hibernate for persistence, with all matching logic written either in the Smooks config or in DAO annotations
Use some open-source ETL tool (like Talend, Kettle, Clover, etc.), so the data will be passed to the ETL tool and all transformation logic is defined there. This could also accommodate future scenarios when they appear, or it could be overkill...
I would appreciate it if you shared your thoughts and pointed me in the right direction.
You'd better leave the database part to another tool.
If you have a fair amount of database interaction in your message flow, you can expect a serious decrease in performance.
However, you do not need an ETL tool for the use case you explained. You can simply do it using WSO2 DSS (Data Services Server) by creating services to insert or update your data inside the database.
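The insert-or-update logic such a service wraps is straightforward. Here is a generic sketch of the idea in Python with SQLite (the table, columns, and key are placeholders; a real DSS service would express this as SQL inside its service definition):

```python
# Generic sketch of "insert new, update changed" master-data logic;
# the table and column names are placeholders.
import sqlite3

conn = sqlite3.connect("reference.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS customers (id TEXT PRIMARY KEY, name TEXT)"
)

def upsert_customer(customer_id: str, name: str) -> None:
    """Insert the customer, or update the row if the key already exists."""
    conn.execute(
        "INSERT INTO customers (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        (customer_id, name),
    )
    conn.commit()

upsert_customer("C001", "Acme Ltd")
```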
We have been using this DSS approach for message logging purposes (into a DB) alongside the ESB and are happy with it. It's better to call the services in a non-blocking, fire-and-forget manner in your message flow within the ESB. Hope this helps.