How to prevent a build from plugin code in Jenkins

I'm trying to write a plugin that will prevent a build from occurring based on certain conditions. I've tried putting the conditional checks in the prebuild method (overridden), but from what I can see, the best I can hope to accomplish from there is setting the build status to Result.ABORTED or Result.FAILURE.
Does anyone know how to either:
remove a build from the build queue without it leaving any history (i.e., from within prebuild),
OR
find out what method is used to determine whether or not a build should be allowed?

At least one way is to extend QueueTaskDispatcher. With it you get the job and the node, and you can block the job from being built on that node at that time. You can of course ignore the node and simply always block the job. The method is called periodically for scheduled jobs, whenever Jenkins tries to find a node to build them on.
import hudson.Extension;
import hudson.model.AbstractProject;
import hudson.model.Node;
import hudson.model.Queue.BuildableItem;
import hudson.model.queue.CauseOfBlockage;
import hudson.model.queue.QueueTaskDispatcher;

@Extension
public class MyTaskDispatcher extends QueueTaskDispatcher {

    @Override
    public CauseOfBlockage canTake(Node node, BuildableItem item) {
        // only care about AbstractProject tasks
        if (!(item.task instanceof AbstractProject<?, ?>)) return null;

        AbstractProject<?, ?> proj = (AbstractProject<?, ?>) item.task;
        if (!proj.getName().contains(node.getNodeName())) {
            // returning a non-null CauseOfBlockage keeps the item in the queue
            return new CauseOfBlockage() {
                @Override
                public String getShortDescription() {
                    return "Job name does not contain node name";
                }
            };
        }
        return null;
    }
}

Related

How to configure unit testing for AnyLogic agent code?

How do you configure unit testing framework to help develop code that is part of AnyLogic agents?
To have a suitable test driven development rhythm, I need to be able to run all tests in a few seconds. I thought of exporting the project as a standalone application (jar) each time, but that's pretty slow.
I thought of trying to write all the code outside AnyLogic in separate classes, but there are many references to built-in AnyLogic classes, as well as various agents. My code would need to refer to these somehow, and I'm not sure how to do that except by writing the code inside AnyLogic.
I wonder if there's a way of adding the test runner as a dependency, and executing that test runner from within AnyLogic.
Does anyone have a setup that works nicely?
This definitely requires some advanced Java, but testing, and especially unit testing, is too often neglected when building good, robust models. I hope this simple example is enough to get you (and lots of other modellers) going.
For JUnit testing we make use of two libraries that you can add as dependencies to your model (with JUnit 4 these would typically be junit and hamcrest-core).
There are two main types of logic that you will want to test in simulation models:
1. Functions in Java classes
2. Model execution
Type 1: Suppose I have this very simple Java class
public class MyClass {

    public MyClass() {
    }

    public boolean getResult() {
        return true;
    }
}
And I want to test the function getResult(). I can simply create a new class with a function annotated with @Test, and make use of the assertEquals() method, which is standard in JUnit testing:
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class MyTestClass {

    @Test
    public void testMyClassFunction1() {
        boolean result = new MyClass().getResult();
        assertEquals("The value of the test class 1", true, result);
    }
}
Now comes the AnyLogic-specific implementation (there are other ways to do this, but this is the easiest and most useful, as you will see in a minute).
You need to create a custom experiment; a sketch of its code follows.
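The experiment's code is not shown in the original post, but a minimal sketch might look like the following, assuming JUnit 4's JUnitCore, a custom experiment named RunAllTests (the name used later in this answer), and AnyLogic's traceln() for console output; in AnyLogic the import statements go into the experiment's imports section:

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

// code area of the RunAllTests custom experiment (sketch)
Result result = JUnitCore.runClasses(MyTestClass.class);
for (Failure failure : result.getFailures()) {
    traceln(failure.toString());
}
traceln(result.wasSuccessful() ? "SUCCESS" : "FAILURE");
traceln("Run: " + result.getRunCount());
traceln("Failed: " + result.getFailureCount());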
Now if you run this from the Run Model button you will get this output
SUCCESS
Run: 1
Failed: 0
You can obviously update and change the output as to your liking
Type 2: Suppose we have this very simple model, where the function getResult() simply returns an int of 2.
Now we need to create another custom experiment to run this model; a sketch follows.
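The test further down calls new SingleRun(null).runExperiment(), so this sketch assumes a custom experiment named SingleRun whose additional class code defines runExperiment(); the top-level agent name (Main) and the engine calls follow AnyLogic's usual custom-experiment boilerplate, so treat the details as assumptions to adapt:

// additional class code of the SingleRun custom experiment (sketch)
public int runExperiment() {
    Engine engine = createEngine();
    engine.setStopTime(100); // assumed fixed stop time; adjust to your model
    Main root = new Main(engine, null, null);
    engine.start(root);
    engine.runFast(); // run to completion without animation
    int result = root.getResult();
    engine.stop();
    return result;
}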
Then we can write a test that runs this custom experiment and checks the result. Simply add the following to your MyTestClass:
@Test
public void testMyClassFunction2() {
    int result = new SingleRun(null).runExperiment();
    assertEquals("Value of a single run", 2, result);
}
And now if you run the RunAllTests custom experiment it will give you this output:
SUCCESS
Run: 2
Failed: 0
This is just the beginning; you can read up tons on using JUnit to your advantage.

Flink: Access operator state after execution is complete

Assuming I have a custom RichFunction with some raw state, how can I get the state (from every parallel instance of the operator) back to the main/driver code when the Flink job ends?
abstract class MyRichMap extends RichMapFunction[SomeType, Unit] {
  protected var engine: Engine = _

  override def open(parameters: Configuration): Unit = {
    // assume engine initialization here
    ....
  }

  override def map(value: SomeType): Unit = {
    engine.process(value)
  }
}

val env = StreamExecutionEnvironment.getExecutionEnvironment
...
someSource.map(new MyRichMap())
env.execute()
// How to get engine or some field of it here? (e.g., engine.someCounter)
What's the best way to approach this?
If you want to test MyRichMap(), then you'd start with unit tests - see https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/testing.html
If you want to test a complete workflow, a simple approach inside a single JVM (e.g. running locally from the command line or Eclipse) is to create a sink that captures results in a (thread-safe) singleton, and then check its contents. That implies your sources complete (are bounded), so the workflow will terminate.
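A minimal sketch of such a sink, written in Java (CollectingSink is a placeholder name, and SomeType stands for whatever your workflow emits):

import org.apache.flink.streaming.api.functions.sink.SinkFunction;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Collects every record into a JVM-wide queue; this only works while the
// whole job runs inside a single JVM (e.g. a local test environment).
public class CollectingSink implements SinkFunction<SomeType> {

    public static final Queue<SomeType> VALUES = new ConcurrentLinkedQueue<>();

    @Override
    public void invoke(SomeType value, Context context) {
        VALUES.add(value);
    }
}

After env.execute() returns, the driver can read CollectingSink.VALUES, which then holds the output of all parallel instances.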

Custom Execution Function in WSO2

I am trying to write a simple custom function extension for WSO2 (4.2.0). My function basically takes in a String and returns it in upper case. This is meant to be a first-step POC for a more advanced custom function.
I implemented a class that extends the org.wso2.siddhi.core.executor.function.FunctionExecutor class and created an ams.siddhiext file. I then packaged the class and the siddhiext file in a JAR using the maven-bundle-plugin.
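For context, a .siddhiext file is a plain properties file that maps function names to implementing classes; assuming the class shown below lives in a package such as com.example.ams (a placeholder), ams.siddhiext would contain a line like:

# ams.siddhiext - placeholder package name
findAnomaly=com.example.ams.AnomalyDetector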
My function class looks like this
import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.wso2.siddhi.core.config.ExecutionPlanContext;
import org.wso2.siddhi.core.executor.ExpressionExecutor;
import org.wso2.siddhi.core.executor.function.FunctionExecutor;
import org.wso2.siddhi.query.api.definition.Attribute;

public class AnomalyDetector extends FunctionExecutor {

    private final static Logger LOG = LoggerFactory.getLogger(AnomalyDetector.class);

    @Override
    protected void init(ExpressionExecutor[] expressionExecutors, ExecutionPlanContext executionPlanContext) {
        LOG.info("In AD:init()");
    }

    @Override
    protected Object execute(Object[] objects) {
        return null;
    }

    @Override
    protected Object execute(Object o) {
        LOG.info("In AD:process(" + o.toString() + ")");
        String eventData = (String) o;
        LOG.info("Event data : " + eventData);
        if (eventData != null) {
            return eventData.toUpperCase();
        } else {
            return "Null event data";
        }
    }

    @Override
    public void start() {
        LOG.info("In AD:start()");
    }

    @Override
    public void stop() {
    }

    @Override
    public Map<String, Object> currentState() {
        return null;
    }

    @Override
    public void restoreState(Map<String, Object> map) {
    }

    @Override
    public Attribute.Type getReturnType() {
        return Attribute.Type.STRING;
    }
}
I then put the jar in /repository/components/lib/ since /repository/components/dropins/ did not pick it up.
Two issues are currently blocking me.
I wanted to write a simple execution plan that takes a value from an input stream (String), invokes my custom function, and writes the output to an export stream.
@Plan:name('AMSExecutionPlan')

@Import('AMSStream:1.0.0')
define stream amsStream (metrics_json string);

@Export('AnomalyStream:1.0.0')
define stream anomalyStream (anomaly string);

from amsStream
select ams:findAnomaly(metrics_json) as anomaly
insert into anomalyStream;
I get the following validation error.
What could be wrong with my execution plan?
Whenever I change my custom function class, rebuild the jar, replace it on the WSO2 classpath, and restart WSO2, I don't see the changes reflected in WSO2. The log lines that I print out in my custom function class reflect an older version of the code. What should I do to make changes to my custom function class on a live WSO2 instance?
Thanks in advance!
Can you bundle the jar as an OSGi bundle and try? There can be an issue when converting your jar to an OSGi bundle.
The validation error you have pointed out suggests that your extension is not returning the return type properly. But I can see you have implemented getReturnType() correctly, so your source and the actually running code might not be in sync due to issue 2. So let's address that first.
In WSO2 servers the lib folder is used to add non-OSGi dependencies, and dropins is for OSGi dependencies. The fact that it works in lib and not in dropins suggests that your jar is not packed as a bundle. To achieve that, please follow the pom file from the String extension linked below. There are two things to note:
[1] Usage of bundle packaging
[2] Usage of bundle plugin
Update your pom referencing this, and then you will be able to add your bundle to dropins directly. This is also the reason why your changes are not reflected: when you add your jar to lib, the server internally converts it to an OSGi bundle and adds it to dropins. When you then update the jar in lib, the bundle in dropins does not get updated; it is still the old one, hence the changes are not reflected. This issue will also go away once you update the pom and build the bundle correctly. A sketch of the relevant pom parts follows after the links.
[1] https://github.com/wso2/siddhi/blob/v3.1.0/modules/siddhi-extensions/string/pom.xml#L29
[2] https://github.com/wso2/siddhi/blob/v3.1.0/modules/siddhi-extensions/string/pom.xml#L57
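For orientation, the relevant parts of such a pom look roughly like this (a sketch modelled on the linked String extension; the Export-Package value is a placeholder for your own package):

<project>
  ...
  <!-- [1] bundle packaging -->
  <packaging>bundle</packaging>

  <build>
    <plugins>
      <!-- [2] the Felix bundle plugin turns the jar into an OSGi bundle -->
      <plugin>
        <groupId>org.apache.felix</groupId>
        <artifactId>maven-bundle-plugin</artifactId>
        <extensions>true</extensions>
        <configuration>
          <instructions>
            <Export-Package>com.example.ams.*</Export-Package>
            <Import-Package>*</Import-Package>
          </instructions>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>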
Hope this helps!!

Exclude groovy slf4j logging from condition coverage in Sonar with Jacoco

We are using SonarQube 5.1 with the Jacoco Maven plugin 0.7.4, and all of our slf4j logging statements such as log.debug('Something happened') show that only 1 of 2 branches is covered. I understand that this is because slf4j internally does an "if debug" check, and that's great, but we don't want it to throw off our numbers. We aren't interested in testing slf4j, and we'd rather not run every test multiple times at different logging levels.
So, how can we tell Sonar and/or Jacoco to exclude these lines from coverage? Both have configurable file exclusions, but from what I can tell those only exclude your own classes from coverage (using the target dir), not imported libraries. I tried adding groovy.util.logging.* to the exclusion list anyway, but it didn't do anything.
The question "logger.isDebugEnabled() is killing my code coverage. I'm planning to exclude it while running cobertura" is similar, and it suggests that for Cobertura the 'ignore' property should be used instead of 'exclude'. I don't see anything like that for Jacoco or Sonar in the settings or documentation.
EDIT:
Example image from Eclipse attached, after running Jacoco coverage (Sonar shows the same thing in their GUI). This is actual code from one of our classes.
EDIT 2:
We are using the Slf4j annotation. Docs here:
http://docs.groovy-lang.org/next/html/gapi/groovy/util/logging/Slf4j.html
This local transform adds a logging ability to your program using LogBack logging. Every method call on an unbound variable named log will be mapped to a call to the logger. For this a log field will be inserted in the class. If the field already exists the usage of this transform will cause a compilation error. The method name will be used to determine what to call on the logger.
log.name(exp)
is mapped to
if (log.isNameLoggable()) {
log.name(exp)
}
Here name is a placeholder for info, debug, warning, error, etc. If the expression exp is a constant or only a variable access, the method call will not be transformed, but it will still cause a call on the injected logger.
Hopefully this clarifies what's going on. Our log statements become two-branch ifs to avoid expensive string building for log levels that aren't enabled (a common practice, as far as I know). But that means that to guarantee coverage of all these branches, we would have to run every test repeatedly for every logging level.
I did not find a general solution for excluding it, but if your codebase allows it, you can wrap your logging statements in a method annotated with an annotation containing "Generated" in its name.
A simple example:
package org.example.logging

import groovy.transform.Generated
import groovy.util.logging.Slf4j

@Slf4j
class Greeter {

    void greet(name) {
        logDebug("called greet for ${name}")
        println "Hello, ${name}!"
    }

    @Generated
    private logDebug(message) {
        log.debug message
    }
}
Unfortunately javax.annotation.Generated is not suitable, because it only has SOURCE retention; therefore I (ab)used groovy.transform.Generated here, but you can easily create your own annotation for that purpose.
I found that solution here: How would I add an annotation to exclude a method from a jacoco code coverage report?
UPDATE: In Groovy you can solve it most elegantly with a trait:
package org.example.logging

import groovy.transform.Generated
import groovy.util.logging.Slf4j

@Slf4j
trait LoggingTrait {

    @Generated
    void logDebug(String message) {
        log.debug message
    }
}
...and then...
package org.example.logging

import groovy.util.logging.Slf4j

@Slf4j
class Greeter implements LoggingTrait {

    void greet(name) {
        logDebug "called greet for ${name}"
        println "Hello, ${name}!"
    }
}
Unfortunately the property log is interpreted as a property of Greeter, not of LoggingTrait, so you must attach @Slf4j to both the trait and the class implementing it.
Nevertheless, doing so gives you the expected logger, namely the one of the implementing class:
14:25:09.932 [main] DEBUG org.example.logging.Greeter - called greet for world

Is there a way to configure log4j to ignore subpackages with a regular expression (REGEX)?

I am using log4j in my application.
If I want to turn on logging for a package I can simply do the below [in my log4j.properties file]:
log4j.logger.com.myorg.somepackage= DEBUG
This will cause log4j to log any messages from "com.myorg.somepackage" to my root logger.
My problem is, how do I stop logging from a package if I use plugins like maven shade?
For example, let's say you have package "com.myorg.somepackage" which is relocated (via maven shade plugin) to "com.someotherorg.dependency.com.myorg.somepackage".
If I wanted to set the level to warn I know I could do the below:
log4j.logger.com.someotherorg.dependency.com.myorg.somepackage= WARN
However, in my case, the dependency is shaded in multiple projects and I don't want to have to write:
log4j.logger.com.someotherorg.dependency.com.myorg.somepackage= WARN
log4j.logger.com.someotherorg1.dependency.com.myorg.somepackage= WARN
log4j.logger.com.someotherorg2.dependency.com.myorg.somepackage= WARN
...etc
So how can I have log4j ignore "com.myorg.somepackage" regardless of where it lies in the package name? Is there some sort of REGEX I don't know about for this?
I would like to do something along the lines of:
log4j.logger.*.com.myorg.somepackage= WARN
but that doesn't work.
I can't find default functionality to do this in log4j; however, log4j does allow you to create your own filters.
Here is what I did to resolve it:
import org.apache.log4j.Level;
import org.apache.log4j.spi.Filter;
import org.apache.log4j.spi.LoggingEvent;

public class CustomFilterer extends Filter {

    @Override
    public int decide(LoggingEvent event) {
        // Loggers outside the shaded package are unaffected
        if (!event.getLoggerName().contains("com.myorg.somepackage")) {
            return Filter.NEUTRAL;
        }
        // For the shaded package, let WARN and above through and drop the rest
        if (event.getLevel().isGreaterOrEqual(Level.WARN)) {
            return Filter.NEUTRAL;
        }
        return Filter.DENY;
    }
}
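One caveat: as far as I know, log4j 1.x filters can only be attached to appenders via XML configuration (PropertyConfigurator does not support them), so wiring this filter up might look like the following sketch, assuming the filter class lives in a package such as com.myorg.logging (a placeholder):

<!-- log4j.xml sketch: attach the custom filter to an appender -->
<appender name="console" class="org.apache.log4j.ConsoleAppender">
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d %-5p %c - %m%n"/>
  </layout>
  <filter class="com.myorg.logging.CustomFilterer"/>
</appender>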