Informatica PowerCenter - override value for user-defined workflow/worklet variable returning Null value

Workflow variable declared (screenshot)
Parameter file (screenshot)
Workflow session failed, as per the logs (screenshot)
Why is the value TestFile001 from the parameter file not passed when the workflow session is running?
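For context, a minimal parameter file sketch for assigning a user-defined workflow variable; the folder, workflow, and variable names below are assumptions, since the actual ones only appear in the screenshots. User-defined workflow variables use the $$ prefix and must sit under the section header of the workflow that declares them:

[MyFolder.WF:wf_MyWorkflow]
$$FileName=TestFile001

If the section header does not match the repository folder and workflow names exactly, the Integration Service will not pick the value up, and the variable typically falls back to its declared default (or to NULL/blank if no default is set).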

Related

When I am trying to create a second service after one is already created, some features are not highlighted in AWS SaaS Boost

When I am trying to create a second service after one is already created, some features are not highlighted in AWS SaaS Boost (screenshots). Can anyone help with how I can create two services and deploy two applications on SaaS Boost?

"Only numeric data" error on logs-base metric GCP

I used the Ops Agent to send logs to Cloud Logging (screenshot: uploaded logs).
Then I used these logs to create a logs-based metric with the field name jsonPayload.data (screenshot: create logs-based metric).
After that, I reviewed the logs of that metric to make sure the input data is correct (screenshot: review input data).
But in the end, Cloud Monitoring shows the error "Only numeric data can be drawn as a line chart" (screenshot: error). I checked at the "review logs" step and made sure that the input data is numeric. Can anyone help me explain that?
Sorry, I'm new to Stack Overflow, so I can't upload images directly.
You can see the Metric by changing the aligner to percentile.
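Not part of the original answer, but a sketch of the same idea in code: querying the logs-based metric through the Cloud Monitoring API with a percentile aligner. The project ID ("my-project-id") and metric name ("my_metric") are placeholders, not values from the question.

import time
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = "projects/my-project-id"  # placeholder project ID

now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": now},
        "start_time": {"seconds": now - 3600},  # last hour
    }
)

# Distribution-valued metrics cannot be drawn directly as a line chart;
# aligning each series to a percentile produces plain numeric points.
aggregation = monitoring_v3.Aggregation(
    {
        "alignment_period": {"seconds": 300},
        "per_series_aligner": monitoring_v3.Aggregation.Aligner.ALIGN_PERCENTILE_50,
    }
)

results = client.list_time_series(
    request={
        "name": project_name,
        "filter": 'metric.type = "logging.googleapis.com/user/my_metric"',
        "interval": interval,
        "aggregation": aggregation,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)
for series in results:
    for point in series.points:
        print(point.interval.end_time, point.value.double_value)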

Download JSON in Informatica Application Integration

I am new to ICAI, and I have a requirement:
a. Create a service.
b. The user will upload a JSON file using this web service.
c. The JSON file will be downloaded and saved locally.
The solution path I was taking: create a process which will accept two inputs (some generic text and the JSON file), which generated the below URL.
I tested the same in Postman and it is working fine, but I am not able to download the JSON onto the Informatica server in any location.
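For illustration only, a sketch of the kind of call tested in Postman, assuming the process is exposed as a REST endpoint that takes the text input and the JSON file as a multipart upload. The URL and field names here are placeholders, not the actual ones generated by the process.

import requests

# Placeholder endpoint and field names; the real URL is the one generated
# by the Application Integration process (not shown in the question).
url = "https://example.invalid/process/uploadJson"

with open("input.json", "rb") as f:
    response = requests.post(
        url,
        data={"genericText": "some value"},  # hypothetical text input field
        files={"jsonFile": ("input.json", f, "application/json")},
    )

print(response.status_code)
print(response.text)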
Final solution, based on the feedback from Maciejg.
Steps taken:
1. Create a FileWriter app connection and set it up only for "eventtarget".
2. Create a process.
3. In Start, create an input field of type Attachment.
4. In Start, create a temp field of type FileWriter connection.
5. Add an Assignment task.
6. In the Assignment task, add a field temp -> Content Format of type Content -> Attachment.
7. In the same Assignment task, add another field temp -> File Name of type Formula.
The above steps are enough to save the uploaded file; if required, other steps (check file type, authentication, etc.) can be added.
It seems you need to use a FileWriter Service. Check out this knowledgebase article for details.

How to know if a zero-byte video was uploaded to a GCP bucket?

As the title says, I need to fetch the size of the video/object I just uploaded to the bucket.
Every few seconds, an object is uploaded to my bucket in the form {id}/video1.mp4.
I want to make use of Google Cloud Storage triggers, which would alert me if a 0-byte video was added. Can someone please suggest how to access the size of the added object?
Farhan,
Assuming you know the basics of Cloud Functions, you can create a Cloud Function trigger that runs a script every time you create/finalize an object in a selected bucket.
The link you posted contains the tutorial, with the following Python script attached.
def hello_gcs_generic(data, context):
    """Background Cloud Function to be triggered by Cloud Storage.
    This generic function logs relevant data when a file is changed.
    Args:
        data (dict): The Cloud Functions event payload.
        context (google.cloud.functions.Context): Metadata of triggering event.
    Returns:
        None; the output is written to Stackdriver Logging
    """
    print('Event ID: {}'.format(context.event_id))
    print('Event type: {}'.format(context.event_type))
    print('Bucket: {}'.format(data['bucket']))
    print('File: {}'.format(data['name']))
    print('Metageneration: {}'.format(data['metageneration']))
    print('Created: {}'.format(data['timeCreated']))
    print('Updated: {}'.format(data['updated']))
In this example, we see data has multiple items such as name, timeCreated, etc.
What this example doesn't show, however, is that data has another item: size, listed as data['size'].
So now we have a Cloud Function that gets the file name and file size of whatever is uploaded, when it's uploaded. All we have to do now is add an if statement to do "something" if the file size is 0. It will look something like this in Python (apologies for syntax issues, but this is the gist of it):
def hello_gcs_generic(data, context):
    """Background Cloud Function to be triggered by Cloud Storage.
    This generic function logs relevant data when a file is changed.
    Args:
        data (dict): The Cloud Functions event payload.
        context (google.cloud.functions.Context): Metadata of triggering event.
    Returns:
        None; the output is written to Stackdriver Logging
    """
    print('File: {}'.format(data['name']))
    print('Size: {}'.format(data['size']))
    # The size in the event payload is a string, so convert it before comparing.
    size = int(data['size'])
    if size == 0:
        print("its 0!")
    else:
        print("its not 0!")
Hope this helps!

If line contains string, save unspecified number of characters

I am writing a small script for a client to take the output of Event Viewer security logs and list the number of times a user connects to a DC server over a week (Event Viewer is configured to output a .txt file on the first day of the week).
The script I currently have prints out the name and time of the connection; however, I would like to add functionality to record how many times each user connects.
The client has quite a frequent turnover of staff, so I would like to not have to edit the script every time a new user joins or leaves.
Is there a way to save an unspecified number of characters after a specified string?
The string I am trying to record is the username, where the string in question has the format "DOMAIN\USERNAME".
So, for example, if user Michael signs into the MICROSOFT domain (MICROSOFT\Michael), it will just save Michael into a tally.
Example 1:
Event Viewer will throw out about 200,000 of the entries below per week:
Authentication Package: MICROSOFT_AUTHENTICATION_PACKAGE_V1_0
Logon Account: Company\Admin
Source Workstation:
Error Code: 0xC000006A"
Audit Failure 15/01/2018 13:07:36 Microsoft-Windows-Security-Auditing 4776 Credential Validation "The computer attempted to validate the credentials for an account.Authentication Package: MICROSOFT_AUTHENTICATION_PACKAGE_V1_0
Logon Account: Company\User 1
Source Workstation:
Error Code: 0xC000006A"
Audit Failure 15/01/2018 13:07:36 Microsoft-Windows-Security-Auditing 4776 Credential Validation "The computer attempted to validate the credentials for an account.
I've written a bit of basic code to strip away the unneeded information; the below is a small part of it:
for line in file_name.readlines():
    if "Logon Account" in line:
        new_file.write(line)
Basically, when each line is read, if the user has not previously been recorded, add the user to a dictionary, and when the username is repeated later in the file, increment its count by one.
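A minimal sketch of that tally approach (not from the question; the input file name and the regex are assumptions based on the sample output above):

import re
from collections import defaultdict

counts = defaultdict(int)

with open("security_log.txt") as log_file:  # placeholder input file name
    for line in log_file:
        if "Logon Account" in line:
            # Grab whatever follows "DOMAIN\" up to the end of the line.
            match = re.search(r'Logon Account:\s+\S+\\(.+)', line)
            if match:
                counts[match.group(1).strip()] += 1

for user, count in counts.items():
    print(f"{user}: {count}")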
Use a regex for it, for example:
import re

# Look behind for the backslash in "DOMAIN\USERNAME" and capture the rest.
m = re.search(r'(?<=\\)(.*)', yourstring)
if m:
    print(m.group(1))