We have a requirement to load data from a web service into the target database using Informatica. The web service will be initiated by the source application whenever there is a change on the source side. On the Informatica side, we have to trigger the loading job whenever we receive the web-service call, instead of relying on scheduled/batch jobs.
Please let me know if there is any option to achieve this using PowerExchange.
You could make use of the HTTP transformation to load the data from the web service.
There is a demo of this in the Informatica Marketplace. Download the file there to get the complete implementation steps: https://marketplace.informatica.com/solutions/mapping_web_service_using_http_transformation
As for triggering the workflows ad hoc, perhaps you can make use of file watchers. Whenever there is a web-service request, you can arrange to have a file transferred to your source location that indicates a new request (a sketch of such a relay follows below). I am not sure whether this is possible in your case; it would be great if you could provide more details. In any case, there is another demo here explaining how to implement a file watcher to auto-trigger your workflows that could help -
https://marketplace.informatica.com/solutions/mapping_email_on_non_occurrence_of_event
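For illustration, a minimal sketch of that relay, assuming a small helper outside Informatica is acceptable; the paths, port, and file names are placeholders:

```python
# Sketch: an HTTP endpoint that drops an indicator file whenever the
# source application sends its web-service call; the Informatica file
# watcher then picks the file up and triggers the workflow.
# WATCH_DIR, the port, and the file names are all assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path
import time

WATCH_DIR = Path("/data/infa/trigger")   # folder the file watcher monitors

class TriggerHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Persist the payload next to the indicator file so the
        # workflow can read it as its source.
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)
        stamp = time.strftime("%Y%m%d%H%M%S")
        (WATCH_DIR / f"request_{stamp}.xml").write_bytes(payload)
        (WATCH_DIR / "trigger.ind").touch()   # the file watcher waits on this
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), TriggerHandler).serve_forever()
```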
I have some experience with Google Cloud Functions (CF). I tried to deploy a CF function recently with a Python app, but it uses an NLP model so the 8GB memory limit is exceeded when the model is triggered. The function is triggered when a JSON file is uploaded to a bucket.
So, I plan to try Google Cloud Run but I have no experience with it. Also, I am not completely sure if it is the best course of action.
If it is, what is the best way of implementing it, given that the Cloud Run service will be triggered by a file uploaded to a bucket? In CF you can select the triggering event; in Cloud Run I didn't see anything like that. I could use some starting points, as I couldn't find my case covered in the GCP documentation.
Any help will be appreciated.
You can use at least these two approaches:
The legacy one: create a GCS notification to a Pub/Sub topic, then create a push subscription and set the Cloud Run URL as the HTTP push endpoint (see the sketch after this list)
A more recent way is to use Eventarc to invoke a Cloud Run endpoint directly from an event (it creates roughly the same thing, a Pub/Sub topic plus a push subscription, but it's fully configured for you)
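For the legacy approach, a minimal sketch with the Python client libraries; the project, bucket, topic, and service URL are assumptions:

```python
from google.cloud import pubsub_v1, storage

project = "my-project"                               # assumption
run_url = "https://my-service-xyz-uc.a.run.app/"     # assumption: your Cloud Run URL

# 1. Create the topic GCS will publish to (the GCS service account
#    also needs roles/pubsub.publisher on it).
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project, "gcs-uploads")
publisher.create_topic(request={"name": topic_path})

# 2. Have the bucket publish OBJECT_FINALIZE events to that topic.
bucket = storage.Client(project=project).bucket("my-upload-bucket")
bucket.notification(
    topic_name="gcs-uploads",
    event_types=["OBJECT_FINALIZE"],     # fires for each new/overwritten object
    payload_format="JSON_API_V1",
).create()

# 3. Push every message on the topic to the Cloud Run endpoint.
subscriber = pubsub_v1.SubscriberClient()
subscriber.create_subscription(
    request={
        "name": subscriber.subscription_path(project, "gcs-uploads-push"),
        "topic": topic_path,
        "push_config": {"push_endpoint": run_url},
    }
)
```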
EDIT 1
When you use push notifications, you receive a standard Pub/Sub message. The format is described in the documentation, both for the attributes and for the body content; keep in mind that the raw content is base64-encoded and you have to decode it to get the final format (a handler sketch follows below)
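For illustration, a minimal sketch of a Cloud Run handler (Flask) that does the decoding; the base64/JSON handling is the important part, the rest is placeholder:

```python
import base64
import json

from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle_push():
    envelope = request.get_json()
    message = envelope["message"]

    # Metadata (eventType, bucketId, objectId, ...) arrives as attributes.
    attributes = message.get("attributes", {})

    # The body is base64-encoded JSON describing the GCS object.
    payload = json.loads(base64.b64decode(message["data"]).decode("utf-8"))

    print("event:", attributes.get("eventType"),
          "object:", payload.get("bucket"), payload.get("name"))

    return ("", 204)   # ack the message so Pub/Sub doesn't retry
```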
I personally have a Cloud Run service that logs the contents of any request it receives, so I can find in the logs all the data I need to develop against. When I have a new message format, I point the push subscription at that Cloud Run endpoint and I automatically get the format
For Eventarc, the format will be added to the UI soon (I saw that feature in preview, but it's not yet available). For now, the best solution is to log the content so you know exactly what you get and what to do with it!
Just getting started with Wowza Streaming Engine.
Objective:
Set up a streaming server that live-streams existing video (from S3) on a pre-defined schedule (think of a TV channel that streams linearly; you're unable to seek through it).
Create a separate admin app that manages that schedule and updates the streaming app accordingly.
Accomplish this with as little custom Java as possible.
Questions:
Is it possible to fetch / update streamingschedule.smil with the Wowza Streaming Engine REST API?
There are methods to retrieve and update specific SMIL files via the REST API, but they only seem to apply to files created through the manager; streamingschedule.smil, after all, needs to be created by hand.
Alternatively, is it possible to reference a streamingschedule.smil that lives in an S3 bucket? (In a similar way to how footage can be linked from S3 buckets with the MediaCache module.)
A comment here (search for '3a') seems to indicate it's possible, but there's a lot of noise in that thread.
What I've done:
Set up Wowza Streaming Engine 4.4.1 on EC2
Enabled REST API documentation
Created a separate S3 bucket and filled it with pre-recorded footage
Enabled MediaCache on the server which points to the above S3 bucket
Created a customised VOD edge application, with AppType set to Live and StreamType set to live in order to be able to point to the above (as suggested here)
Created a StreamPublisher module with a streamingschedule.smil file
The above all works, and I have a working schedule with linearly streaming content pulled from an S3 bucket. I just need to be able to easily manipulate that schedule without having to manually edit the file via SSH (the file itself looks roughly like the sketch below).
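For reference, a minimal sketch of how such a schedule file could be generated programmatically; the element and attribute names follow the StreamPublisher examples, and the stream, playlist, and file names are placeholders:

```python
import xml.etree.ElementTree as ET

# Build a streamingschedule.smil; names below are placeholders.
smil = ET.Element("smil")
ET.SubElement(smil, "head")
body = ET.SubElement(smil, "body")

# The live stream the playlists are published on.
ET.SubElement(body, "stream", name="channel1")

# One scheduled block; 'scheduled' is the start time, 'length' is seconds.
playlist = ET.SubElement(
    body, "playlist",
    name="morning-block",
    playOnStream="channel1",
    repeat="false",
    scheduled="2016-06-01 06:00:00",
)
ET.SubElement(
    playlist, "video",
    src="mp4:mediacache/footage/show1.mp4",   # pulled from S3 via MediaCache
    start="0",
    length="1800",
)

ET.ElementTree(smil).write(
    "streamingschedule.smil", xml_declaration=True, encoding="UTF-8"
)
```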
So close! TIA
To answer your questions:
No. However, you can update it by creating an HTTP provider and having it handle the modifications to that schedule (see the sketch after these answers for what the admin-app side of that might look like). Should you want more flexibility here, you can even extend the scheduler module so it doesn't require that file at all.
Yes. You would have to modify the ServerListenerStreamPublisher solution to accomplish it. Currently it solely looks at the local filesystem to read the streamingschedule.smil file.
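To give an idea of the admin-app side, a minimal sketch of pushing an updated schedule to such an HTTP provider; the endpoint, port, and credentials are entirely hypothetical, and the provider itself still has to be implemented in Java on the Wowza side:

```python
import requests

# Entirely hypothetical endpoint: a custom HTTP provider on the Wowza
# side would accept the new schedule, rewrite streamingschedule.smil,
# and reload the scheduler when called.
WOWZA_PROVIDER_URL = "http://my-wowza-host:8086/schedule"   # assumption

with open("streamingschedule.smil", "rb") as f:
    resp = requests.post(
        WOWZA_PROVIDER_URL,
        data=f.read(),
        headers={"Content-Type": "application/xml"},
        auth=("adminUser", "adminPass"),   # placeholder credentials
        timeout=30,
    )
resp.raise_for_status()
```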
Thanks,
Matt
I need to create a service that will receive an XML feed at any given time, with data related to a content type.
Could someone please advise me which modules I should use to develop a solution?
So:
Another server will post an XML feed with instructions to add/delete/update content.
I will need to update the content type from the posted XML feed.
I have previously used the Migrate module, but that runs on my side through cron or manually. The main difference here is that I could receive the post from the other server at any given time, possibly as multiple concurrent posts.
This sounds like a job for the Services 3 module. You can write a custom resource module for it that parses your XML and does the work for you; the Services module handles the REST/RPC connections for you. (A sketch of what the posting side might send is below.)
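For illustration, a minimal sketch of what the posting server might send to such a resource; the endpoint path, action names, and field names are assumptions that must match whatever your custom resource expects:

```python
import requests

# Hypothetical payload: actions and fields must match your resource.
xml_payload = """<?xml version="1.0" encoding="UTF-8"?>
<feed>
  <item action="update" id="42">
    <title>Updated title</title>
    <body>Updated body text</body>
  </item>
  <item action="delete" id="43"/>
</feed>"""

resp = requests.post(
    "https://example.com/api/content-feed",   # custom Services resource
    data=xml_payload.encode("utf-8"),
    headers={"Content-Type": "application/xml"},
    timeout=30,
)
resp.raise_for_status()
```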
I am working with WSO2 Data Services Server version 2.6. In this version the tool for contract-first service generation has been removed due to a bug; it will be released again in the next version.
We need to change the way the results are returned. We would like to define our own WSDL and generate an empty data service from it, where we finally set up each operation and its data source and configure how the results are mapped to the WSDL schema types.
Is there any way to do this without the contract-first data service tool?
I'm afraid there's no straightforward way to do this. You would have to carefully create the data service yourself so that it generates a WSDL resembling the one you need.
Cheers,
Anjana.
Is there any way to call a web service in an SSIS package in order to insert some data into a table within SQL Server? How? Any sample or guidance, please?
I assume by your question that you are referring to using a web service as a destination for a data flow? I inherited a series of packages that integrate with our MS CRM site. As designed, these packages are a horrible fit for the SSIS paradigm, but that's my burden to bear...
These packages generally fit the form of a source (OLE DB or flat file) fed to a Script Component acting as a destination. I don't know that providing all the code of a particular task would be enlightening; it simply invokes the web service for each row sent into it. RBAR (row by agonizing row) is not what SSIS or set-based languages are made for, but you can certainly do it.
The Script transformation will have a web reference (ours is named CrmSdk) to the service.
Declare an instance of the service as a member of ScriptMain.
Instantiate that service in your script, passing credentials as needed, most likely in your PreExecute method.
Make calls to the web service in your Input0_ProcessInputRow method using the Row.Column1 notation. Do be aware of nulls and how the web service handles them. Our code uses service.CompanyName = Row.CompanyName_IsNull ? string.Empty : Row.CompanyName;
If your intention is to use a web service at the Control Flow level, be aware that the default Web Service Task has a hard-coded 5-minute timeout. Not sure if that's still the case, but in the 2005 package I was dealing with, we had to use a straight Script Task to communicate with our web service (it was cleansing millions of rows of address data in batch fashion) to bypass the timeout issue. Reference to timeout property