I am trying to run an Informatica workflow that checks a database table and writes the fetched data to a flat file on the server.
source: database table
target: flat file
transformations: none
This workflow runs fine when "run on demand", but I need to run it continuously, so I tried the INFA scheduler, set to run it every 5 minutes. When the scheduler is enabled, the workflow fails continuously. Kindly help with any ideas for running this on the scheduler.
Oops... sorted, this was my mistake. I had not checked in the Target I created for the flat file. My bad. Thanks all.
I am new to dbt. Currently I am trying to access an S3 bucket, which has Parquet files, via Glue and Athena. I have the configuration as per the dbt documentation; after running the dbt run command it reports how many models I am running and how many tasks there are, so up to this point it is good. But after that it hangs and after some time it times out. While checking dbt.log I found the query below running for quite a long time before eventually timing out. I am not sure why it is running or what configuration I have to check. I suspect it is coming from a macro, but there is no macro of mine that runs this query. Please let me know if you have any pointers. Thank you.
This query runs by default after the dbt run command, and I am not sure where it is coming from:
select table_catalog, table_schema,
       case when table_type = 'BASE_TABLE' then 'table'
            when table_type = 'VIEW' then 'view'
       end as table_type
from information_schema.table
where regexp_like(table_schema, '(?i)\A\A')
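One thing that stands out in the logged query is that the schema-name regexp, '(?i)\A\A', appears to contain an empty schema name, so the schema and database settings in the Athena profile are worth double-checking. For reference, a minimal profiles.yml for the dbt-athena adapter looks roughly like this (the bucket, region, and schema names below are placeholders):

my_project:
  target: dev
  outputs:
    dev:
      type: athena
      s3_staging_dir: s3://my-query-results-bucket/  # placeholder bucket for Athena query results
      region_name: us-east-1                         # placeholder region
      database: awsdatacatalog
      schema: my_glue_database                       # the Glue database that holds the Parquet tables
      threads: 4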
I am trying to invoke a campaign via a SOAP request, which is supposed to insert data into the data warehouse, but it is not happening. I am using DS2 code for it.
When I run the same campaign in test mode in SAS CI, it works.
I want the log that gets generated via the SOAP request. Can anyone let me know the path for the same?
Thanks in advance.
You need to check the log files on your application and mid-tier servers. Search the log files for the time you ran your request.
Start by checking the ObjectSpawner log to see if any connections were made to the SAS environment at that time, e.g. C:\SASHome\Lev1\ObjectSpawner\Logs
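For example, on Windows you could search all of the spawner logs for errors around that time (the path and search string below are illustrative):

cd /d C:\SASHome\Lev1\ObjectSpawner\Logs
findstr /s /i /c:"ERROR" *.log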
I am very new to Sitecore. I am trying to create a task; after creating the task I configured the command and the task in the Content Editor, but I still don't see a "Run now" option for my task there. Need help. I also want to know where the logs of scheduled jobs are written.
There are 2 places where you can define a custom task:
In the database
In a config file
If you decide to go with the 1st option, a task item must be created under /sitecore/system/tasks/schedules in the “core” database (default behavior).
No matter what schedule you set on that item, it may never be executed if you do not have the right DatabaseAgent looking after that task item.
The DatabaseAgent periodically checks the task items and, if a task must be executed (based on the value set in its Scheduling field), it executes the actual code. By default the DatabaseAgent is called every 10 minutes; its default registration is shown below.
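That registration lives in web.config and looks roughly like this (the exact markup varies between Sitecore versions):

<agent type="Sitecore.Tasks.DatabaseAgent" method="Run" interval="00:10:00">
  <param desc="database">core</param>
  <param desc="schedule root">/sitecore/system/tasks/schedules</param>
  <LogActivity>true</LogActivity>
</agent>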
If you decide to go with the 2nd option, check this article first.
In short, you need to define your task class and start method in the config files (check the /sitecore/admin/showconfig.aspx page to make sure your config changes have been applied successfully):
<scheduling>
  <!-- Time between checking for scheduled tasks waiting to execute -->
  <frequency>00:00:05</frequency>
  <agent type="NameSpace.TaskClass" method="MethodName" interval="00:10:00" />
</scheduling>
As specified in the other answers, you can use a config file or the database to execute your task. However, it seems that you want to run it manually.
I have a custom module on the Sitecore Marketplace which allows you to select the task you want to run. Here is the link.
In brief, you need to go to the Sitecore Control Panel, click on Administration, and lastly click on Run Agent.
It will open a window where you can select the task. I am assuming that the task you have implemented does not take the details of the item you are on when triggering the job.
I have been running Camunda with MariaDB and it's a good solution, but I have a problem. I saw that the Camunda User Guide describes the History and Audit Event Log, so I wrote some code as follows:
List<HistoricProcessInstance> historyList = historyService.createHistoricProcessInstanceQuery()
        .finished()
        .processDefinitionId("Sample1")
        .list();
int historySize = historyList.size();
LOGGER.info("historyList size=" + historySize);
I have finished the Sample1 process, but historySize is still zero. I think I am missing some configuration; what should I do?
Also, what's the difference between the Runtime Database and the History Database? Do I need to install two databases?
Thank you
I solved the problem by using processDefinitionKey.
I had passed a processDefinitionKey to processDefinitionId, so no wonder it couldn't get the finished instances of the process.
I can get all finished instances of a process using either the correct processDefinitionId or the processDefinitionKey.
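For reference, a minimal sketch of the corrected query, assuming "Sample1" is the process definition key from the BPMN model (a generated processDefinitionId looks like "Sample1:1:<id>" instead):

// import org.camunda.bpm.engine.history.HistoricProcessInstance;
// Query finished process instances by the key defined in the BPMN model.
List<HistoricProcessInstance> finished = historyService
        .createHistoricProcessInstanceQuery()
        .finished()
        .processDefinitionKey("Sample1")
        .list();
LOGGER.info("finished instances: " + finished.size());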
Hi all,
I have a job scheduled via Tivoli for an Informatica workflow.
I have checked the property to save workflow logs for 5 runs.
The job runs fine through Informatica, but if I try to run it from Tivoli using pmcmd, it fails to rename the workflow log file.
Please help; I am getting this error:
Cannot rename workflow log file [E:\Informatica\etl_d\WorkflowLogs\wf_T.log.bin] to [E:\Informatica\etl_d\WorkflowLogs\wf_T.log.4.bin]. Please check the Integration Service log for more information.
Disconnecting from Integration Service
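For context, a typical pmcmd invocation from an external scheduler such as Tivoli looks like this (the service, domain, folder, and credentials below are placeholders):

pmcmd startworkflow -sv INT_SVC -d Domain_Name -u Administrator -p password -f MY_FOLDER -wait wf_T

If the same workflow behaves differently under Tivoli than under Informatica's own scheduler, it may also be worth checking that the OS account Tivoli runs under has rename/delete permissions on the WorkflowLogs directory.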
Check the workflow log file name in the workflow's Edit options. Possibly you have the same workflow log file name set for multiple workflows.
HTH
Irfan