I have a problem with version 9.2 of Kettle.

When using the Kettle cluster, the transformation reported an error in the following format:
2021/12/29 10:28:57 - step1 - ERROR (version 9.2.0.0-290, build 9.2.0.0-290 from 2021-06-02 06.36.08 by buildguy) : Unexpected error
2021/12/29 10:28:57 - step1 - ERROR (version 9.2.0.0-290, build 9.2.0.0-290 from 2021-06-02 06.36.08 by buildguy) : org.pentaho.di.core.exception.KettleException:
2021/12/29 10:28:57 - step1 - This step will block all incoming rows until at least all defined steps finish! You must define at least one step.
2021/12/29 10:28:57 - step1 -
2021/12/29 10:28:57 - step1 - at org.pentaho.di.trans.steps.blockuntilstepsfinish.BlockUntilStepsFinish.processRow(BlockUntilStepsFinish.java:71)
2021/12/29 10:28:57 - step1 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2021/12/29 10:28:57 - step1 - at java.lang.Thread.run(Thread.java:748)
2021/12/29 10:28:57 - step1 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)


Cloud Build: export Cloud SQL with the date in the filename

Here is my pipeline :
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['sql', 'export', 'sql', $_DB_INSTANCE_NAME, gs://$_BUCKET_NAME/$_FILENAME.sql, '--database=$_DB_DATABASE']
options:
  dynamic_substitutions: true
  substitution_option: 'ALLOW_LOOSE'
timeout: 3600s
I declared the variable $_FILENAME inside the Cloud Build pipeline and set its value to Cloud_Export_$(date +%Y-%m-%d).
But I got this error: Compilation failed: [_FILENAME -> Cloud_Export_$(date +%Y-%m-%d)]: generic::invalid_argument: unable to fetch payload for date +%Y-%m-%d
So I tried removing the $() from my $_FILENAME, leaving Cloud_Export_date +%Y-%m-%d, and got:
Exporting Cloud SQL instance...
...failed.
ERROR: (gcloud.sql.export.sql) [ERROR_RDBMS] GCS URI "gs://export_bdd/Cloud_Export_date +%Y-%m-%d.sql" is empty or invalid
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/gcloud" failed: step exited with non-zero status: 1
How can I add the current date to my filename?
Edit
I tried to create another variable, _CURRENT_DATE, with the value date +%Y-%m-%d, then changed my $_FILENAME variable to Cloud_Export_${_CURRENT_DATE}.
Now I don't get any errors, but the date is empty: the filename is Cloud_Export_.sql.
I found a solution: I removed the $_FILENAME variable, switched to the sh entrypoint, and added a double $ to the date function, $$(date +%Y-%m-%d), and it works:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'sh'
  args: ['-c', 'gcloud sql export sql $_DB_INSTANCE gs://$_BUCKET_NAME/filename_$$(date +%Y-%m-%d).sql --database=$_DB_NAME']
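For context on why the double dollar sign works: Cloud Build performs its own substitution pass and treats $$ as a literal $, so $$(date +%Y-%m-%d) reaches the sh entrypoint intact and the shell expands it at build time. The same command substitution can be sanity-checked locally; this is a minimal sketch, with the filename_ prefix as a placeholder:

```shell
# Run the same quoted script through `sh -c`, just as the 'sh' entrypoint
# does inside the build step; the shell expands $(date +%Y-%m-%d) at run time.
fname=$(sh -c 'echo "filename_$(date +%Y-%m-%d).sql"')
echo "$fname"
```

This prints something like filename_2021-12-29.sql, which is what the gs:// URI receives in the build step above.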

Ember Upgrade from 3.23 to 3.24 failing at build

I was planning to migrate an Ember application from 3.23 to 3.24. During ember serve the following error appears. I have tried upgrading the broccoli and babel libs to the latest versions, but the issue persists.
ERROR Summary:
- broccoliBuilderErrorStack: TypeError: /home/abc/dev/sample-ui/ngcoreui/@ember/test-helpers/-internal/debug-info.js: path.isStaticBlock is not a function
at PluginPass.ClassBody (/home/abc/dev/sample-ui/node_modules/@babel/plugin-proposal-class-static-block/lib/index.js:56:21)
at newFn (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/visitors.js:177:21)
at NodePath._call (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/path/context.js:53:20)
at NodePath.call (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/path/context.js:40:17)
at NodePath.visit (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/path/context.js:100:31)
at TraversalContext.visitQueue (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/context.js:103:16)
at TraversalContext.visitSingle (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/context.js:77:19)
at TraversalContext.visit (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/context.js:131:19)
at Function.traverse.node (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/index.js:82:17)
at NodePath.visit (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/path/context.js:108:18)
- code: [undefined]
- codeFrame: /home/abc/dev/sample-ui/ngcoreui/@ember/test-helpers/-internal/debug-info.js: path.isStaticBlock is not a function
- errorMessage: @ember/test-helpers/-internal/debug-info.js: /home/abc/dev/sample-ui/ngcoreui/@ember/test-helpers/-internal/debug-info.js: path.isStaticBlock is not a function
in /home/abc/dev/sample-ui/node_modules/@ember/test-helpers/addon-test-support
at broccoli-persistent-filter:Babel > [Babel: @ember/test-helpers] (Babel: @ember/test-helpers)
- errorType: Build Error
- location:
- column: [undefined]
- file: @ember/test-helpers/-internal/debug-info.js
- line: [undefined]
- treeDir: /home/abc/dev/sample-ui/node_modules/@ember/test-helpers/addon-test-support
- message: @ember/test-helpers/-internal/debug-info.js: /home/abc/dev/sample-ui/ngcoreui/@ember/test-helpers/-internal/debug-info.js: path.isStaticBlock is not a function
in /home/abc/dev/sample-ui/node_modules/@ember/test-helpers/addon-test-support
at broccoli-persistent-filter:Babel > [Babel: @ember/test-helpers] (Babel: @ember/test-helpers)
- name: Error
- nodeAnnotation: Babel: @ember/test-helpers
- nodeName: broccoli-persistent-filter:Babel > [Babel: @ember/test-helpers]
- originalErrorMessage: /home/abc/dev/sample-ui/ngcoreui/@ember/test-helpers/-internal/debug-info.js: path.isStaticBlock is not a function
- stack: TypeError: /home/abc/dev/sample-ui/ngcoreui/@ember/test-helpers/-internal/debug-info.js: path.isStaticBlock is not a function
at PluginPass.ClassBody (/home/abc/dev/sample-ui/node_modules/@babel/plugin-proposal-class-static-block/lib/index.js:56:21)
at newFn (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/visitors.js:177:21)
at NodePath._call (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/path/context.js:53:20)
at NodePath.call (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/path/context.js:40:17)
at NodePath.visit (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/path/context.js:100:31)
at TraversalContext.visitQueue (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/context.js:103:16)
at TraversalContext.visitSingle (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/context.js:77:19)
at TraversalContext.visit (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/context.js:131:19)
at Function.traverse.node (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/index.js:82:17)
at NodePath.visit (/home/abc/dev/sample-ui/node_modules/@babel/traverse/lib/path/context.js:108:18)

if statement in .gitlab-ci.yml

In my .gitlab-ci.yml, the response status must be 200; otherwise the job should fail and print "transfer already started".
But I don't know how to write an if statement for that.
variables:
  NUGET_PATH: 'C:\Tools\Nuget\nuget.exe'
  MSBUILD_PATH: 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Current\Bin\amd64\msbuild.exe'
  SOLUTION_PATH: 'Textbox_ComboBox.sln'
stages:
  - build
  - job1
  - job2
before_script:
  - "cd Source"
build_job:
  stage: build
  except:
    - schedules
  script:
    - '& "$env:NUGET_PATH" restore'
    - '& "$env:MSBUILD_PATH" "$env:SOLUTION_PATH" /nologo /t:Rebuild /p:Configuration=Debug'
job1:
  stage: job1
  script:
    - 'curl adress1'
job2:
  stage: trigger_SAP_service
  when: delayed
  start_in: 5 minutes
  only:
    - schedules
  script:
    - 'curl adress2'
How can I add an if condition for status=200, and write "transfer already started" otherwise?
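One common pattern, assuming the runner executes scripts with a POSIX shell (the build_job above uses PowerShell syntax, where the branching would look different): have curl print only the HTTP status code with -w '%{http_code}', then branch on it. This is a sketch, not tested against the real endpoints; the check is wrapped in a small function so the branching logic is visible on its own:

```shell
# check_status enforces the rule from the question: any status other than
# 200 prints the message and returns non-zero, which fails the CI job.
check_status() {
  if [ "$1" -ne 200 ]; then
    echo "transfer already started"
    return 1
  fi
  return 0
}

# In job1's script, the status code would come from curl, e.g.:
#   status=$(curl -s -o /dev/null -w '%{http_code}' adress1)
#   check_status "$status"
check_status 200 && echo "status ok"
```

In .gitlab-ci.yml the function body can be inlined into the script: list of job1, or kept in a small script file committed to the repository.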

Create frozen graph from pretrained model

Hi, I am a newbie to TensorFlow. My aim is to convert a .pb file to .tflite from a pretrained model, for my own understanding. I have downloaded the mobilenet_v1_1.0_224 model. Below is the structure of the model:
mobilenet_v1_1.0_224.ckpt.data-00000-of-00001 - 66312kb
mobilenet_v1_1.0_224.ckpt.index - 20kb
mobilenet_v1_1.0_224.ckpt.meta - 3308kb
mobilenet_v1_1.0_224.tflite - 16505kb
mobilenet_v1_1.0_224_eval.pbtxt - 520kb
mobilenet_v1_1.0_224_frozen.pb - 16685kb
I know the model already has a .tflite file, but I am trying to convert it for my understanding.
My first step: creating the frozen graph file
import tensorflow as tf
from tensorflow.python.framework import graph_util  # needed for convert_variables_to_constants

imported_meta = tf.train.import_meta_graph(base_dir + model_folder_name + meta_file, clear_devices=True)
graph_ = tf.get_default_graph()
with tf.Session() as sess:
    imported_meta.restore(sess, base_dir + model_folder_name + checkpoint)
    graph_def = sess.graph.as_graph_def()
    output_graph_def = graph_util.convert_variables_to_constants(sess, graph_def, ['MobilenetV1/Predictions/Reshape_1'])
    with tf.gfile.GFile(base_dir + model_folder_name + './my_frozen.pb', "wb") as f:
        f.write(output_graph_def.SerializeToString())
I have successfully created my_frozen.pb (16,590 kb), but the original file is 16,685 kb, as is clearly visible in the folder structure above. So my first question: why is the file size different? Am I following a wrong path?
My second step: creating the .tflite file using a bazel command
bazel run --config=opt tensorflow/contrib/lite/toco:toco -- --input_file=/path_to_folder/my_frozen.pb --output_file=/path_to_folder/model.tflite --inference_type=FLOAT --input_shape=1,224,224,3 --input_array=input --output_array=MobilenetV1/Predictions/Reshape_1
This command gives me model.tflite (0 kb).
Traceback for the bazel command:
INFO: Analysed target //tensorflow/contrib/lite/toco:toco (0 packages loaded).
INFO: Found 1 target...
Target //tensorflow/contrib/lite/toco:toco up-to-date:
bazel-bin/tensorflow/contrib/lite/toco/toco
INFO: Elapsed time: 0.369s, Critical Path: 0.01s
INFO: Build completed successfully, 1 total action
INFO: Running command line: bazel-bin/tensorflow/contrib/lite/toco/toco '--input_file=/home/ubuntu/DEEP_LEARNING/Prashant/TensorflowBasic/mobilenet_v1_1.0_224/frozengraph.pb' '--output_file=/home/ubuntu/DEEP_LEARNING/Prashant/TensorflowBasic/mobilenet_v1_1.0_224/float_model.tflite' '--inference_type=FLOAT' '--input_shape=1,224,224,3' '--input_array=input' '--output_array=MobilenetV1/Predictions/Reshape_1'
2018-04-12 16:36:16.190375: I tensorflow/contrib/lite/toco/import_tensorflow.cc:1265] Converting unsupported operation: FIFOQueueV2
2018-04-12 16:36:16.190707: I tensorflow/contrib/lite/toco/import_tensorflow.cc:1265] Converting unsupported operation: QueueDequeueManyV2
2018-04-12 16:36:16.202293: I tensorflow/contrib/lite/toco/graph_transformations/graph_transformations.cc:39] Before Removing unused ops: 290 operators, 462 arrays (0 quantized)
2018-04-12 16:36:16.211322: I tensorflow/contrib/lite/toco/graph_transformations/graph_transformations.cc:39] Before general graph transformations: 290 operators, 462 arrays (0 quantized)
2018-04-12 16:36:16.211756: F tensorflow/contrib/lite/toco/graph_transformations/resolve_batch_normalization.cc:86] Check failed: mean_shape.dims() == multiplier_shape.dims()
Python Version - 2.7.6
Tensorflow Version - 1.5.0
Thanks In advance :)
The error Check failed: mean_shape.dims() == multiplier_shape.dims()
was an issue with the resolution of batch normalization and has been resolved in:
https://github.com/tensorflow/tensorflow/commit/460a8b6a5df176412c0d261d91eccdc32e9d39f1#diff-49ed2a40acc30ff6d11b7b326fbe56bc
In my case the error occurred using tensorflow v1.7.
The solution was to use tensorflow v1.15 (nightly):
toco --graph_def_file=/path_to_folder/my_frozen.pb \
  --input_format=TENSORFLOW_GRAPHDEF \
  --output_file=/path_to_folder/my_output_model.tflite \
  --input_shape=1,224,224,3 \
  --input_arrays=input \
  --output_format=TFLITE \
  --output_arrays=MobilenetV1/Predictions/Reshape_1 \
  --inference_type=FLOAT

Error posting API to API Manager (v1.10.0) from GREG (v5.1.0)

I have followed the instructions laid out in the documentation for GREG (5.1.0) to configure GREG to publish APIs to an external API Manager (1.10.0). I applied the configuration updates (modified the LifeCycle in configuration.xml) and promoted a REST API. Unfortunately, when I promote the API to Production I receive no feedback in the GREG Publisher UI, the API is not imported into API Manager, and the following errors appear in the GREG logs:
Note: I've scrubbed these logs for potentially sensitive information.
[2016-02-01 15:33:34,432] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - overview_name : xxxxxxxxxxxx
[2016-02-01 15:33:34,451] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - interface_transports : https
[2016-02-01 15:33:34,454] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - uritemplate_httpVerb : get
[2016-02-01 15:33:34,454] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - interface_swagger : /_system/governance/apimgt/applicationdata/api-docs/1.0.0/xxxxxxxxx
[2016-02-01 15:33:34,454] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - contacts_entry : Technical Owner: xxxxxxxxx
[2016-02-01 15:33:34,455] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - overview_endpointURL : http://xxxxxxxx/
[2016-02-01 15:33:34,456] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - uritemplate_authType : None
[2016-02-01 15:33:34,457] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - overview_description : xxxxxxxxx
[2016-02-01 15:33:34,457] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - overview_context : /xxxxxxxx/
[2016-02-01 15:33:34,457] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - overview_version : 0.0.4
[2016-02-01 15:33:34,460] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - uritemplate_urlPattern : /xxxxxx/{id}.xml
[2016-02-01 15:33:34,460] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - overview_provider : admin
[2016-02-01 15:33:34,460] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - endpoints_entry : Prod:https://xxxxxxxxxxxxxxx/xxxxxx/
[2016-02-01 15:33:34,460] ERROR {org.wso2.carbon.governance.registry.extensions.executors.apistore.RestServiceToAPIExecutor} - security_authenticationType : None
I have created a gist with the logs for both API Manager and GREG.