I don't know how to use the gerrit-trigger plugin in a DSL pipelineJob. According to the Job DSL plugin docs, triggers is deprecated for pipelineJobs, and from the wiki it was replaced by pipelineTriggers in 1.77. So I have changed my triggers section to:
properties {
    pipelineTriggers {
        triggers {
            gerrit {
                events {
                    patchsetCreated()
                }
                project('**My/Git/Repo', '**')
            }
        }
    }
}
However, when I use pipelineTriggers I get the following:
ERROR: (configure_seed_jobs.groovy, line 25) No signature of method: events() is applicable for argument types: (configure_seed_jobs$_run_closure1$_closure4$_closure9$_closure10$_closure11$_closure12) values: [configure_seed_jobs$_run_closure1$_closure4$_closure9$_closure10$_closure11$_closure12#3bcd6c54]
Possible solutions: gerritProjects(), buildFailureMessage(), buildNotBuiltMessage(), buildStartMessage(), buildSuccessfulMessage(), buildUnstableMessage(), buildUnsuccessfulFilepath(), changeSubjectParameterMode(), commentTextParameterMode(), commitMessageParameterMode(), customUrl(), dependencyJobsNames(), dynamicTriggerConfiguration(), escapeQuotes(), gerritBuildFailedCodeReviewValue(), gerritBuildFailedVerifiedValue(), gerritBuildNotBuiltCodeReviewValue(), gerritBuildNotBuiltVerifiedValue(), gerritBuildStartedCodeReviewValue(), gerritBuildStartedVerifiedValue(), gerritBuildSuccessfulCodeReviewValue(), gerritBuildSuccessfulVerifiedValue(), gerritBuildUnstableCodeReviewValue(), gerritBuildUnstableVerifiedValue(), gerritSlaveId(), nameAndEmailParameterMode(), notificationLevel(), serverName(), silentMode(), silentStartMode(), skipVote(), triggerConfigURL(), triggerOnEvents()
What am I missing?
I had the same problem. Neither events {..} nor project() is available to gerrit under pipelineTriggers; you should use triggerOnEvents {..} and gerritProjects {..} instead. For more details, see the Job DSL API viewer embedded in your Jenkins instance (e.g. http://0.0.0.0:8080/plugin/job-dsl/api-viewer/):
properties {
    pipelineTriggers {
        triggers {
            gerritTrigger {
                gerritProjects {
                    gerritProject {
                        compareType('PLAIN')
                        pattern('**My/Git/Repo')
                        branches {
                            branch {
                                compareType('PLAIN')
                                pattern('master')
                            }
                        }
                    }
                }
                triggerOnEvents {
                    changeMerged()
                }
            }
        }
    }
}
I have this action in its model file HandlQuestionTimeOut.model.bxb:
action (HandleQuestionTimeOut)
{
  type(Calculation)
  description (Handles Question Time Out.)
  collect
  {
    input (message)
    {
      type (core.Text)
      min (Required) max (One)
    }
  }
  output (core.Text)
}
This is in HandleQuestionTimeOut.js:
var console = require("console");
module.exports.function = function handleQuestionTimeOut (message)
{
  console.log("handleQuestionTimeOut -> message: " + message);
  return message;
}
This is in quiz.endpoints.bxb, inside the endpoints block:
action-endpoint (HandleQuestionTimeOut)
{
  accepted-inputs (message)
  local-endpoint (HandleQuestionTimeOut.js)
}
I am trying to call that action with refresh like this:
input-view
{
  match: Answer(this)
  {
    to-input: UpdateQuiz(action)
  }
  refresh
  {
    if(true)
    {
      spec
      {
        delay-seconds (3)
        with-request
        {
          intent
          {
            goal {HandleQuestionTimeOut}
            value: core.Text(Timeout)
          }
        }
      }
    }
  }
// code continues...
Can you please tell me what I am doing wrong? I don't get the HandleQuestionTimeOut log in the console.
Can you clarify your question?
Though I noticed a couple of things, based on my personal opinion:
1) correct module.exports.function -> module.export.function
2) In the refresh section, I think you need to specify a condition rather than 'true', or is it there for debugging purposes?
I've just verified that this issue is fixed in the 20B SDK release.
Please refer to the release notes for details about other changes.
I would like to extract requirements data from Capella using M2Doc. The requirements (SystemFunctionalRequirement) are located in a "RequirementsPkg" package in the System Analysis layer. Thanks to the "m:RequirementsPkg.eContents().summary" command I managed to retrieve the summary of all requirements, but I would like to retrieve the name and the summary of a specific requirement.
Can you help me ?
Thanks in advance
This mechanism is deprecated. You should use the requirement extension.
Starting from the root element, you can use something like:
{ m:system.ownedArchitectures->filter(la::LogicalArchitecture).ownedRequirementPkgs.ownedRequirements.name }
With the requirement extension the easiest way is to create a service:
public List<Requirement> getRequirements(ExtensibleElement element) {
    List<Requirement> res = new ArrayList<>();
    for (ElementExtension extension : element.getOwnedExtensions()) {
        if (extension instanceof Requirement) {
            res.add((Requirement) extension);
            break;
        } else if (extension instanceof CapellaOutgoingRelation) {
            res.add(((CapellaOutgoingRelation) extension).getTarget());
        }
    }
    return res;
}
and call it, for instance on a diagram:
{ m:for req | '[LAB] IFE System - All Components, CEs'.representationByName().eAllContents(viewpoint::DRepresentationElement).semanticElements->filter(emde::ExtensibleElement).getRequirements() }
{ m:req.ReqIFLongName }
{ m:endfor }
I found some interesting write-operation code while looking at SpannerIO, and I want to understand the reasons behind it.
In write (WriteToSpannerFn) with the REPORT_FAILURES failure mode, it seems to write failed mutations twice.
I think this is done so each mutation's exception can be logged individually. Is that a correct assumption, and is there any workaround?
Below, I removed some lines for simplicity.
public void processElement(ProcessContext c) {
    Iterable<MutationGroup> mutations = c.element();
    boolean tryIndividual = false;
    try {
        Iterable<Mutation> batch = Iterables.concat(mutations);
        spannerAccessor.getDatabaseClient().writeAtLeastOnce(batch);
    } catch (SpannerException e) {
        if (failureMode == FailureMode.REPORT_FAILURES) {
            tryIndividual = true;
        } else {
            ...
        }
    }
    if (tryIndividual) {
        for (MutationGroup mg : mutations) {
            try {
                spannerAccessor.getDatabaseClient().writeAtLeastOnce(mg);
            } catch (SpannerException e) {
                LOG.warn("Failed to submit the mutation group", e);
                c.output(failedTag, mg);
            }
        }
    }
}
So rather than writing each Mutation individually to the database, the SpannerIO.write() connector tries to write a batch of Mutations in a single transaction for efficiency.
If just one of the Mutations in the batch fails, then the whole transaction fails, so in REPORT_FAILURES mode the mutations are re-tried individually to find which Mutation(s) are the problematic ones...
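For reference, this is roughly how that failure mode looks from the pipeline side. This is a minimal sketch, assuming Beam's Java SDK, an existing PCollection<Mutation> named mutations inside a pipeline definition, and placeholder instance/database IDs; it is not taken from the connector's own tests:

import org.apache.beam.sdk.io.gcp.spanner.MutationGroup;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.io.gcp.spanner.SpannerWriteResult;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

// 'mutations' is assumed to be an existing PCollection<Mutation>;
// the instance and database IDs below are placeholders.
SpannerWriteResult result = mutations.apply(
    SpannerIO.write()
        .withInstanceId("my-instance")
        .withDatabaseId("my-database")
        .withFailureMode(SpannerIO.FailureMode.REPORT_FAILURES));

// Mutation groups that still fail after the individual retry shown above are
// reported here instead of failing the bundle, so they can be logged or re-routed.
PCollection<MutationGroup> failed = result.getFailedMutations();
failed.apply(MapElements.into(TypeDescriptors.strings())
    .via((MutationGroup mg) -> "failed mutation group: " + mg));

The per-group retry only happens once the batched transaction has already been rejected, so the happy path still writes each mutation exactly once; the second pass is what lets the connector pinpoint the offending group(s) and surface them via getFailedMutations().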
I'm trying to set up a Jenkins job with a custom config file. The original XML looks as follows (the relevant part):
<buildWrappers>
  <org.jenkinsci.plugins.configfiles.buildwrapper.ConfigFileBuildWrapper plugin="config-file-provider@2.11">
    <managedFiles>
      <org.jenkinsci.plugins.configfiles.buildwrapper.ManagedFile>
        <fileId>30de8d2f-621d-4c51-b644-4302b548fd15</fileId>
        <targetLocation>./src/</targetLocation>
        <variable/>
        <replaceTokens>false</replaceTokens>
      </org.jenkinsci.plugins.configfiles.buildwrapper.ManagedFile>
    </managedFiles>
  </org.jenkinsci.plugins.configfiles.buildwrapper.ConfigFileBuildWrapper>
</buildWrappers>
Here's my JobDSL attempt:
job('example') {
    configure {
        it / 'buildWrappers' << 'org.jenkinsci.plugins.configfiles.buildwrapper.ManagedFile' {
            managedFiles {
                org.jenkinsci.plugins.configfiles.buildwrapper.ManagedFile {
                    fileId '30de8d2f-621d-4c51-b644-4302b548fd15'
                    targetLocation './/src//'
                }
            }
        }
    }
}
What am I missing? Thanks!
You can use the built-in DSL: https://jenkinsci.github.io/job-dsl-plugin/#path/job-wrappers-configFiles
The built-in DSL will also resolve the fileId from the file name.
job('example') {
    wrappers {
        configFiles {
            file('myCustomConfigFile') {
                targetLocation('src')
            }
        }
    }
}
I am trying to build on the "WebSharingAppDemo-SqlProviderEndToEnd" MSDN sample application to create a custom MSF implementation. As part of that I added parameterized filters to the provisioning. I have been referencing http://jtabadero.wordpress.com/2010/09/02/sync-framework-provisioning/ for an idea of how to do this. Now that it is in place, when I re-initialize the "peer1" database and try to provision it, I get an error:
Scopes not created from a template cannot have FilterParameters.
Parameter '@my_param_name' was found on Table '[my_table_name]'.
Please ensure that no FilterParameters are being defined on a scope
that is not created from a template.
The only guess I have as to what a "template" is, is the provisioning templates that the Sync Toolkit's tools can work with, but I don't think that applies in the scenario I'm working with.
I have been unable to find anything that would indicate what I should do to fix this. So how can I get past this error but still provision my database with parameterized filters?
The below code is what I'm using to build the filtering into the provisioning (SqlSyncScopeProvisioning) object.
private void AddFiltersToProvisioning(IEnumerable<TableInfo> tables)
{
    IEnumerable<FilterColumn> filters = this.GetFilterColumnInfo();

    foreach (TableInfo tblInfo in tables)
    {
        this.AddFiltersForTable(tblInfo, filters);
    }
}

private void AddFiltersForTable(TableInfo tblInfo, IEnumerable<FilterColumn> filters)
{
    IEnumerable<FilterColumn> tblFilters;
    tblFilters = filters.Where(x => x.FilterLevelID == tblInfo.FilterLevelID);

    if (tblFilters != null && tblFilters.Count() > 0)
    {
        var tblDef = this.GetTableColumns(tblInfo.TableName);
        StringBuilder filterClause = new StringBuilder();

        foreach (FilterColumn column in tblFilters)
        {
            this.AddColumnFilter(tblDef, column.ColumnName, filterClause);
        }

        this.Provisioning.Tables[tblInfo.TableName].FilterClause = filterClause.ToString();
    }
}

private void AddColumnFilter(IEnumerable<TableColumnInfo> tblDef, string columnName, StringBuilder filterClause)
{
    TableColumnInfo columnInfo;
    columnInfo = tblDef.FirstOrDefault(x => x.ColumnName.Equals(columnName, StringComparison.CurrentCultureIgnoreCase));

    if (columnInfo != null)
    {
        this.FlagColumnForFiltering(columnInfo.TableName, columnInfo.ColumnName);
        this.BuildFilterClause(filterClause, columnInfo.ColumnName);
        this.AddParameter(columnInfo);
    }
}

private void FlagColumnForFiltering(string tableName, string columnName)
{
    this.Provisioning.Tables[tableName].AddFilterColumn(columnName);
}

private void BuildFilterClause(StringBuilder filterClause, string columnName)
{
    if (filterClause.Length > 0)
    {
        filterClause.Append(" AND ");
    }

    filterClause.AppendFormat("[base].[{0}] = @{0}", columnName);
}

private void AddParameter(TableColumnInfo columnInfo)
{
    SqlParameter parameter = new SqlParameter("@" + columnInfo.ColumnName, columnInfo.GetSqlDataType());

    if (columnInfo.DataTypeLength > 0)
    {
        parameter.Size = columnInfo.DataTypeLength;
    }

    this.Provisioning.Tables[columnInfo.TableName].FilterParameters.Add(parameter);
}
I guess the error is self-explanatory.
FilterParameters can only be set if the scope inherits from a filter template; you cannot set FilterParameters on a normal scope, only a FilterClause.
Using parameter-based filters is a two-step process: defining the filter/scope template, and then creating a scope based on that template (a minimal sketch of both steps follows below).
I suggest you re-read the blog entry and jump to the section Parameter-based Filters.
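For illustration only, here is a rough sketch of that two-step flow against a hypothetical Orders table with a CustomerType filter column; the table, column, scope and template names are placeholders rather than your schema, and it follows the pattern shown in the Sync Framework filtered-scope documentation and the blog post above:

using System.Data;
using System.Data.SqlClient;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

// 'serverConn' is an open SqlConnection to the database being provisioned.
var serverConn = new SqlConnection("<connection string>");

// Step 1: provision a filter TEMPLATE. This is the only place where the
// FilterClause and FilterParameters are allowed to be defined.
var scopeDesc = new DbSyncScopeDescription("Orders_Template");
scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("Orders", serverConn));

var template = new SqlSyncScopeProvisioning(serverConn, scopeDesc, SqlSyncScopeProvisioningType.Template);
template.Tables["Orders"].AddFilterColumn("CustomerType");
template.Tables["Orders"].FilterClause = "[side].[CustomerType] = @CustomerType";
template.Tables["Orders"].FilterParameters.Add(new SqlParameter("@CustomerType", SqlDbType.NVarChar, 100));
template.Apply();

// Step 2: create a concrete scope FROM the template and supply only the
// parameter value; no FilterParameters are defined on the scope itself.
var retailScope = new SqlSyncScopeProvisioning(serverConn);
retailScope.PopulateFromTemplate("Orders_Retail", "Orders_Template");
retailScope.Tables["Orders"].FilterParameters["@CustomerType"].Value = "Retail";
retailScope.Apply();

In other words, the FilterClause/FilterParameters pair you are currently setting directly on the scope's tables would move to the template step, and each provisioned scope then only supplies values for those parameters.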