Download JSON in Informatica Application Integration - web-services

I am new to ICAI, and I have a requirement:
a. create a web service
b. the user will upload a JSON file using this web service
c. the JSON file will be downloaded and saved locally on the server.
The solution path I was taking: create a process which accepts 2 inputs (some generic text and the JSON file), which generated the below URL.
I tested the same in Postman and it works fine, but I am not able to download the JSON to any location on the Informatica server.

Final solution based on the feedback from Maciejg
Steps taken:
create a FileWriter app connection and set it up only for "eventtarget"
create a process
in Start, create an input field of type Attachment
in Start, create a temp field whose type is the FileWriter connection
add an Assignment task
in the Assignment task, add a field temp -> Content Format, of type Content -> Attachment
in the same Assignment task, add another field temp -> File Name, of type Formula
The above steps are enough to save the uploaded file; if required, other steps (checking the file type, authentication, etc.) can be added.
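For reference, invoking the finished process mirrors the Postman test above. Below is a minimal Python sketch; the process URL, input-field names, and credentials are hypothetical stand-ins for your actual ones:

```python
# pip install requests
import requests

# Hypothetical endpoint and field names: substitute the URL generated for your
# process, your actual input-field names, and your IICS credentials.
url = "https://<pod>.ai.dm-us.informaticacloud.com/active-bpel/rt/<your_process>"

with open("payload.json", "rb") as f:
    resp = requests.post(
        url,
        data={"genericText": "hello"},            # the generic text input (hypothetical name)
        files={"jsonFile": ("payload.json", f)},  # the JSON file as a multipart attachment (hypothetical name)
        auth=("user", "password"),                # or your configured auth scheme
    )

print(resp.status_code, resp.text)
```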

It seems you need to use a FileWriter Service. Check out this knowledgebase article for details.

Related

Azure Data Factory HDFS dataset preview error

I'm trying to connect to HDFS from ADF. I created a folder and a sample file (ORC format) and put the file in the newly created folder.
Then in ADF I successfully created a linked service for HDFS using my Windows credentials (the same user that was used for creating the sample file):
But when trying to browse the data through a dataset:
I'm getting the error: "The response content from the data store is not expected, and cannot be parsed."
Is there something I'm doing wrong, or is it some kind of permissions issue?
Please advise.
This appears to be a generic issue: you need to point the dataset at a file with an appropriate extension rather than at the folder itself. Also make sure you are using a supported data store and activity.
You can follow the official MS doc on using an HDFS server with Azure Data Factory.
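For illustration, the dataset should reference the file itself rather than its parent folder. Below is a minimal sketch of the relevant dataset properties, written as a Python dict; the property names follow the Microsoft HDFS connector documentation, but verify them against your ADF version:

```python
# Hedged sketch: the shape of an ADF HDFS dataset's typeProperties.
# Property names are taken from the Microsoft HDFS connector docs;
# confirm they match your Data Factory version before relying on them.
hdfs_dataset_type_properties = {
    "folderPath": "myfolder",         # the folder that holds the sample file
    "fileName": "sample.orc",         # point at the file itself, extension included
    "format": {"type": "OrcFormat"},  # ORC needs an explicit format type
}
```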

AWS AppFlow <-> Salesforce integration

I'm trying to set up a workflow to back up the Account & Contact objects from Salesforce to S3 via AWS AppFlow. So far, I'm able to set up the connection and back up the files on demand.
However, for restoration I would like to import the field mapping using a .csv file; below are the first 3 sample lines (comma-separated source & destination fields).
Name, Name
Type, Account Type
AccountNumber, Account Number
But AppFlow is unable to import it, failing with "Couldn't parse rows from the file". Am I missing something?
This was a bug on the AWS side and it has been taken up. The workaround is to do the mapping manually instead of importing an external CSV; make sure the source field attributes match the corresponding objects in Salesforce.

Connect BigQuery as a source to Data Fusion in another GCP project

I am trying to connect BigQuery in ProjectA to Data Fusion in ProjectB, and it's asking me to enter a service key file. I have tried uploading the service key file to Cloud Storage in ProjectB and providing the link, but it's asking me for a local file path.
Can someone help me with this?
Thanks in advance.
Can you try this: grant the BigQuery permissions of project A to the Data Fusion service accounts of project B:
service-<project_number>@gcp-sa-datafusion.iam.gserviceaccount.com
<project_number>-compute@developer.gserviceaccount.com
Steps:
1. Navigate to the customer project that contains the CDF instance and copy the project number (found on the Home page, in the Project Info card).
2. Navigate to the project that contains the resources you would like to interact with.
3. In the sidebar, click on 'IAM & Admin'.
4. Click on 'Add' at the top of the page.
5. Provide the first service account name from the list above; be sure to replace <project_number> with the actual number you obtained in step 1.
6. Grant the Admin role for the resource you would like to interact with, e.g. BigQuery Admin for reading/writing to BigQuery. For BigQuery, you will also need to grant the BigQuery Data Owner role.
7. Repeat steps 5 & 6 for the second service account in the list above.
8. In your pipeline, ensure you define the correct Project Id for the sources/sinks. Using 'auto-detect' will default to the customer project that contains the CDF instance.
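If you prefer to script the grant instead of using the console, here is a minimal, hedged sketch using the Cloud Resource Manager IAM API; the roles mirror step 6 and the service-account names are built from the project number as described above. Verify the client details against Google's current docs:

```python
# pip install google-api-python-client google-auth
# Hedged sketch: grant project A's BigQuery roles to the two Data Fusion
# service accounts from project B, via the Cloud Resource Manager IAM API.
from googleapiclient import discovery

PROJECT_A = "project-a-id"        # project that owns the BigQuery data
CDF_PROJECT_NUMBER = "123456789"  # project number copied in step 1

members = [
    f"serviceAccount:service-{CDF_PROJECT_NUMBER}@gcp-sa-datafusion.iam.gserviceaccount.com",
    f"serviceAccount:{CDF_PROJECT_NUMBER}-compute@developer.gserviceaccount.com",
]

crm = discovery.build("cloudresourcemanager", "v1")
policy = crm.projects().getIamPolicy(resource=PROJECT_A, body={}).execute()

# Add both accounts to the roles named in step 6.
for role in ("roles/bigquery.admin", "roles/bigquery.dataOwner"):
    policy.setdefault("bindings", []).append({"role": role, "members": members})

crm.projects().setIamPolicy(resource=PROJECT_A, body={"policy": policy}).execute()
```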
Alternatively, can you try downloading the service key JSON file to your local computer, putting the file in some folder, and providing the full path to that file in the BigQuery properties?

Informatica - Create & email a flat file if records exist, otherwise complete successfully

Hello everyone and thanks for taking the time to read this question!
What I am trying to do in Informatica is populate and email a file based on a SQL query. So right now I have:
DB source table ----> SQ with query -----> flat file target (emails the flat file to me every time) in a single-session workflow.
What I would like to do is email the file ONLY if it has records. I also don't want the workflow to fail if there are no records and no email is sent; it should just go on its merry way and wait for the next run.
Thank You!
Should be simple. Create two sessions/mappings like this:
Session 1 - reads using your SQL and writes the data into a dummy file.
Session 2 - runs depending on the target success rows of session 1. The link condition should be: if Session1.TgtSuccessRows > 0, then run session 2. Add your mail id to session 2's on-success email. You can use any dummy mapping here.
Please note you can use an Email task instead of session 2.
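Not Informatica code, but to make the control flow explicit, here is a minimal Python sketch of the branch that the link condition implements, with hypothetical stand-in helpers:

```python
# Toy sketch only: stand-ins for the two sessions. The branch mirrors the
# link condition Session1.TgtSuccessRows > 0.
def fetch_rows():
    # stand-in for session 1's SQL query; return [] to simulate "no records"
    return [("row1",), ("row2",)]

def write_and_email(rows):
    # stand-in for session 1's flat-file target plus session 2's success email
    print(f"wrote {len(rows)} rows and sent the email")

rows = fetch_rows()
if rows:  # Session1.TgtSuccessRows > 0 -> run session 2
    write_and_email(rows)
# no records: nothing more to do; the workflow still ends successfully
```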
Koushik

In JMeter I need to send a file path with a random file name; how do I get that random file name to pass in a request?

I am load testing a web application scenario in which:
1. First I upload an Excel file; it is parsed and its sheets (tabs in Excel) are loaded into a drop-down.
2. Then I select an option in the drop-down and click on the Import button.
3. After clicking on Import, a new pop-up opens with some data loaded in a grid and an "Update to database" button. I then click the "Update to database" button.
This is my load-test scenario; I need to load test the "import" and "update to database" requests.
In this scenario 4 requests are involved.
In point 1: the service URL is /xxx/xxx/uploadLoader.aspx
Input params: 2 viewstates, button, appuser, company, filename (with multipart data)
Response: it returns the HTML of the file upload control
Point 2 uses 2 requests:
a) service URL: /xxxx/xxxx.asmx/updateImportSubTableFormat
Input params:
{"as_connstring":["GET"],"alist_import":[{"__type":"ApplecartDS.DataLib.ImportTableMain","rowstate":"OLD","tableno":"019","oldtableno":"019","formatid":"I0000012","oldformatid":"I0000012","columnname":"pay_billrefno","oldcolumnname":"","columndesc":"Bill No","aormdesc":"Actual","excelcolumnname":"D","excelcolumnno":"3","excelcolumnnameadd1":"","excelcolumnnoadd1":-1,"excelcolumnnameadd2":"","excelcolumnnoadd2":-1,"serialno":"1","oldserialno":2,"formatname":"","startingrowno":0,"endingrowno":0,"keycolumn":"","keycolumndesc":"","payiddesc":null,"attributetype":"","paytype":"A",".............**some more output data here**.................... 1,"excelcolumnnameadd2":"","excelcolumnnoadd2":-1,"serialno":"3","oldserialno":3,"formatname":"","startingrowno":0,"endingrowno":0,"keycolumn":"","keycolumndesc":"","payiddesc":null,"attributetype":"","paytype":"A",""provision_actual":"P","dsexecstatus":"","dsmsg":"","slno":0,"as_fromatid":"I0000012","ai_xlcolumnname":"A","ai_xlcolumnno":"1","ai_startrow":"2","ai_endrow":0,"ai_header":"1","as_payid":"MEDRE","as_leaveid":"MEDRE","as_wfdcolname":"","ai_wfdcolno":0,"as_arrcolname":"","ai_arrcolno":0,"as_transafertype":"A"}
response:
{"d":[{"__type":"ApplecartDS.DataLib.ImportTableMain","rowstate":"OLD","tableno":"","oldtableno":"","formatid":"","oldformatid":"","columnname":"","oldcolumnname":"","columndesc":"","aormdesc":"","excelcolumnname":"","excelcolumnno":0,"excelcolumnnameadd1":"","excelcolumnnoadd1":-1,"excelcolumnnameadd2":"","excelcolumnnoadd2":-1,"serialno":0,"oldserialno":0,"formatname":"","startingrowno":0,"endingrowno":0,.............**some more output data here**...................."","valuefrom":"E","default_value":"","aorm":"A","benefitsid":"","licpayid":"","createmaster":"","allowzero":"N","groupid":"","groupdesc":"","reimbpayid":"","provision_actual":"P","dsexecstatus":"","dsmsg":"","slno":0}]}
Note: If I hit the same request with the same input params, it returns the same correct response.
b) service URL: /AAA/AAA.asmx/importXLData
Input params:
{"as_connstring":["GET"],"as_filename":"C:\\Websites\\xxxx\\xxxx\\admin\\001\\admin115.248.100.6825520164659134Import_Template.xlsx","ai_header":2,"as_tablename":"rm_claimsmain","as_sheetname":"'Reim claim$'","as_tableno":"019","as_fromatid":"I0000012","as_creditperiodid":"","isUpdate":false,"isupdatemonattribute":false,"alistUpdatedata":[],"as_excelfilepath":"C:\\Websites\\xxxx\\xxxx\\admin\\001\\admin115.248.100.6825520164659134Import_Template.xlsx"}
response
{"d":[{"__type":"ApplecartDS.DataLib.ExcelDataImport","CValues1":"","CValues2":"","CValues3":"","CValues4":"","CValues5":"","CValues6":"","CValues7":"","CValues8":"",..........**very long out put data here**.......
,"NValues50":0,"NValues51":0,3":0,"IValues4":0,"rowmodify":"N","rowstate":"OLD","dsexecstatus":null,"dsmsg":""}]}
Note: But if I hit the same request again, it returns a response different from the first-time one (below), because the file name changes every time. Manually this is handled by the application, but I am not able to get the new file name to pass into the JMeter request.
Second-time response:
{"d":[{"__type":"ApplecartDS.DataLib.ExcelDataImport","CValues1":"","CValues2":"","CValues3":"","CValues4":"","CValues5":"","CVal","CValues17":"","CValues18":"","CValues19":................................. dsexecstatus":"FAIL","dsmsg":"Exception : System.Data.OleDb.OleDbException (0x80004005): The Microsoft Office Access database engine could not find the object \u0027C:\\xxxx\\xxxx\\xxx\\admin\\001\\admin115.248.100.6825520165121629Import_Template_test_case_9May2016.xlsx\u0027. Make sure the object exists and that you spell its name and the path name correctly.
Point 3 uses 1 request:
service URL: /xxxx/xxxx.asmx/importXLData
Input params:
{"as_connstring":["GET"],"as_filename":"C:\\xxxx\\xxxx\\temp\\admin\\001\\admin115.248.100.682552016571324rm_claimsmain.xml;C:\\Websites\\cPayv8html\\temp\\admin\\001\\admin115.248.100.682552016571324rm_cliams.xml","ai_header":"1","as_tablename":"rm_claimsmain","as_sheetname":"'Reim claim$'","as_tableno":"019","as_fromatid":"I0000012","as_creditperiodid":"","isUpdate":true,"isupdatemonattribute":false,"alistUpdatedata":[{"__type":"ApplecartDS.DataLib.ExcelDataImport","CValues1":"Employee No;C;30;em_empno","CValues2":"Bill No;C;30;","CValues3":"Bill Date;D;30;","CVal/Date(-62135596800000)\/"..........**some more in data**..........,"DValues10":"\/Date(-62135596800000)\/","ls_msg1":"5","ls_msg2":"C:\\xxxx\\xxxx\\xx\\admin\\001\\admin115.248.100.682552016571324rm_claimsmain.xml;C:\\xxxx\\xxx\\xx\\admin\\001\\admin115.248.100.682552016571324rm_cliams.xml","ls_msg3":"","ls_msg4":"","NValues1":0,"NV4":0,"rowmodify":"N","rowstate":"OLD","dsexecstatus":null,"dsmsg":""}],"as_excelfilepath":"C:\\xxxx\\xxxx\\xxx\\admin\\001\\admin115.248.100.6825520164659134Import_Template_test_case_9May2016.xlsx"}
response
{"d":[{"__type":"ApplecartDS.DataLib.ExcelDataImport","CValues1":"","CValues2":"","CValues3":"","CValues4":"","CValues5":"","CValues6":"","CValues7":"","CValues8":"","CValues9":"","CValues10":"","CValues11":"","CValues12":"","CValues13":"","CValues14":"","CValues15":"","CValues16":"","CValues17":"","CValues18":"","................................ "rowmodify":"N","rowstate":"OLD","dsexecstatus":"SUCCESS","dsmsg":"Update to Database success ! New Rows added :0 \u003c\u003e Rows Updated : 49"}]}
If I hit the same request again, the second-time response is:
......................"dsexecstatus":"FAIL","dsmsg":"Exception : Could not find file \u0027C:\\Websites\\cPayv8html\\temp\\admin\\001\\admin115.248.100.682552016525264rm_claimsmain.xml\u0027."}]}
My issue is that the second-time response is not correct, because the file name changes randomly and I am not able to trace it to pass the changed file path in the request.
Is there any way to handle this? Please help me out: how can I load test it and get an authentic response?
${__RandomString(6,a12zeczclk, MYVAR)}
will return a random string of 6 characters picked from the a12zeczclk set and store the result in MYVAR; MYVAR will contain a string like 2z22ak or z11kce, …
See:
http://jmeter.apache.org/usermanual/functions.html#__RandomString
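For comparison, the same pick-with-repetition behaviour in plain Python, mirroring what __RandomString does with those arguments:

```python
import random

# Mirror of ${__RandomString(6,a12zeczclk,MYVAR)}: 6 characters drawn,
# with repetition, from the set "a12zeczclk".
myvar = "".join(random.choice("a12zeczclk") for _ in range(6))
print(myvar)  # e.g. "2z22ak" or "z11kce"
```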
Looking into
Could not find file 'C:\Websites\cPayv8html\temp\admin\001\admin115.248.100.682552016525264rm_claimsmain.xml'.
it seems that when you upload the file, it is stored in a temporary location and deleted once it has been imported into the database. So it's a classic "correlation" example, where you need to extract a dynamic parameter from the previous response and add it to the next request.
I would suggest carefully inspecting the server responses to get the temporary filename from there.
As your server responds with JSON, I would suggest using:
JSON Path PostProcessor - if you're using JMeter 3.0
JSON Path Extractor - if you're on a previous JMeter version
JSON Path syntax is very easy; check the following references (a worked extraction sketch follows the references):
JSONPath - XPath for JSON
JSONPath Online Evaluator
Advanced Usage of the JSON Path Extractor in JMeter
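To make the correlation concrete, here is a minimal Python sketch of the extraction the JSON Path PostProcessor would perform. The field name ls_msg2 is only a guess, taken from the sample payloads above; inspect your own responses to find the field that actually carries the generated file path:

```python
# pip install jsonpath-ng
import json
from jsonpath_ng import parse

# Trimmed stand-in for the server's JSON response (full samples above).
response_body = '{"d": [{"ls_msg2": "C:\\\\Websites\\\\app\\\\temp\\\\admin\\\\001\\\\admin123_rm_claimsmain.xml"}]}'

# $.d[0].ls_msg2 is the same expression you would put in the PostProcessor.
matches = parse("$.d[0].ls_msg2").find(json.loads(response_body))
temp_file = matches[0].value
print(temp_file)  # in JMeter this lands in a variable, e.g. ${temp_file}
```

In JMeter itself you would configure the PostProcessor with that expression and a variable name, then reference the variable in the as_filename / as_excelfilepath parameters of the follow-up request.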