Scala spec unit tests
I've got the following class and I want to write some Specs test cases, but I'm really new to this and I don't know how to start. My class looks like this:
class Board {
  val array = Array.fill(7)(Array.fill(6)(None: Option[Coin]))

  def move(x: Int, coin: Coin) {
    val y = array(x).indexOf(None)
    require(y >= 0)
    array(x)(y) = Some(coin)
  }

  def apply(x: Int, y: Int): Option[Coin] =
    if (0 <= x && x < 7 && 0 <= y && y < 6) array(x)(y)
    else None

  def winner: Option[Coin] = winner(Cross).orElse(winner(Naught))

  private def winner(coin: Coin): Option[Coin] = {
    val rows = (0 until 6).map(y => (0 until 7).map(x => apply(x, y)))
    val cols = (0 until 7).map(x => (0 until 6).map(y => apply(x, y)))
    val dia1 = (0 until 4).map(x => (0 until 6).map(y => apply(x + y, y)))
    val dia2 = (3 until 7).map(x => (0 until 6).map(y => apply(x - y, y)))
    val slice = List.fill(4)(Some(coin))
    if ((rows ++ cols ++ dia1 ++ dia2).exists(_.containsSlice(slice)))
      Some(coin)
    else None
  }

  override def toString = {
    val string = new StringBuilder
    for (y <- 5 to 0 by -1; x <- 0 to 6) {
      string.append(apply(x, y).getOrElse("_"))
      if (x == 6) string.append("\n")
      else string.append("|")
    }
    string.append("0 1 2 3 4 5 6\n").toString
  }
}
Thank you!
I can only second Daniel's suggestion, because you'll end up with a more practical API by using TDD.
I also think that your application could be nicely tested with a mix of specs2 and ScalaCheck. Here is a draft of a Specification to get you started:
import org.specs2._
import org.scalacheck.{Arbitrary, Gen}

class TestSpec extends Specification with ScalaCheck { def is =

  "moving a coin in a column moves the coin to the nearest empty slot" ! e1^
  "a coin wins if"                              ^
    "a row contains 4 consecutive coins"        ! e2^
    "a column contains 4 consecutive coins"     ! e3^
    "a diagonal contains 4 consecutive coins"   ! e4^
                                                end

  def e1 = check { (b: Board, x: Int, c: Coin) =>
    try { b.move(x, c) } catch { case e => () }
    // either there was a coin before somewhere in that column
    // or there is now after the move
    (0 until 6).exists(y => b(x, y).isDefined)
  }

  def e2 = pending
  def e3 = pending
  def e4 = pending

  /**
   * Random data for Coins, x position and Board
   */
  implicit def arbitraryCoin: Arbitrary[Coin] = Arbitrary { Gen.oneOf(Cross, Naught) }
  implicit def arbitraryXPosition: Arbitrary[Int] = Arbitrary { Gen.choose(0, 6) }

  implicit def arbitraryBoardMove: Arbitrary[(Int, Coin)] = Arbitrary {
    for {
      coin <- arbitraryCoin.arbitrary
      x    <- arbitraryXPosition.arbitrary
    } yield (x, coin)
  }

  implicit def arbitraryBoard: Arbitrary[Board] = Arbitrary {
    for {
      moves <- Gen.listOf1(arbitraryBoardMove.arbitrary)
    } yield {
      val board = new Board
      moves.foreach { case (x, coin) =>
        try { board.move(x, coin) } catch { case e => () } }
      board
    }
  }
}
object Cross extends Coin {
override def toString = "x"
}
object Naught extends Coin {
override def toString = "o"
}
sealed trait Coin
The e1 property I've implemented is not the real thing, because it doesn't really check that we moved the coin to the nearest empty slot, which is what your code and your API suggest. You will also want to change the generated data so that Boards are generated with an alternation of x and o. That should be a great way to learn how to use ScalaCheck!
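For reference, a stricter version of e1 could record the lowest empty slot before the move and then assert the coin ends up exactly there. This is only an untested sketch against the Board API above, not the finished property:

  def e1 = check { (b: Board, x: Int, c: Coin) =>
    // lowest empty slot in column x before the move, if any
    val firstEmpty = (0 until 6).find(y => b(x, y).isEmpty)
    val moved = try { b.move(x, c); true } catch { case e: Throwable => false }
    firstEmpty match {
      case Some(y) => moved && b(x, y) == Some(c) // the coin must land in that slot
      case None    => !moved                      // a full column must reject the move
    }
  }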
I suggest you throw all that code out -- well, save it somewhere, but start from zero using TDD.
The Specs2 site has plenty of examples of how to write tests, but use TDD -- test-driven development -- to do it. Adding tests after the fact is suboptimal, to say the least.
So, think of the simplest case of the simplest feature you want to handle, write a test for that, see it fail, then write the code to make it pass. Refactor if necessary, and repeat for the next simplest case.
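For instance, a first red/green cycle for this Board might start with nothing more than a single example (a hypothetical sketch in the same specs2 acceptance style used in the answer above):

  import org.specs2._

  class BoardSpec extends Specification { def is =

    "a new board has no winner" ! noWinner^
                                end

    def noWinner = new Board().winner must beNone
  }

Watch it fail (or refuse to compile), write the minimum Board code to make it pass, then move on to dropping coins and detecting wins.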
If you want help with how to do TDD in general, I heartily endorse the videos about TDD available on Clean Coders. At the very least, watch the second part where Bob Martin writes a whole class TDD-style, from design to end.
If you know how to do testing in general but are confused about Scala or Specs, please be much more specific about what your questions are.
Why does thinkscript throw these issues when I try to create a counter?
When I try to create a counter and increment it in an if-else statement, the thinkscript compiler throws confusing errors which tell me it's not allowed, yet I've seen this done in several examples. They even have a reserved word, rec, in order to allow for incrementing counters.

score = score + 1; produces:
# Already assigned: Score at...

rec score = score + 1; produces:
# identifier already used: score at ...
# not allowed inside an IF/THEN/ELSE statement

#
# TD Ameritrade IP Company, Inc. (c) 2017-2019
#

input price = close;
input length = 9;
input displace = 0;

def score = 0;

def smavrgg = Average(price[-displace], length);
def expMvAvrg = ExpAverage(price[-displace], length);

plot SMA = smavrgg;
SMA.SetDefaultColor(GetColor(1));
plot AvgExp = expMvAvrg;
AvgExp.SetDefaultColor(GetColor(1));

# 1 if uptrend, 0 if downtrend
def lastTrendisUp = (close[0] - close[1]) > 0;
def secondLastTrendisUP = (close[1] - close[2]) > 0;
def thirdLastTrendisUP = (close[2] - close[3]) > 0;
def fourthLastTrendisUP = (close[3] - close[4]) > 0;

input lookback = 5;

# defines intBool (array) that indicates whether one or the other crossed.
def bull_cross = SMA crosses above AvgExp;
def bear_cross = AvgExp crosses below SMA;

# returns the highest value in the data array for the lookback.
# so [0, 1, 0, 0] means a cross happened within the last units, and 1 will be returned.
if (bull_cross[0] or bear_cross[0]) then {
    if lastTrendisUp {
        # Already assigned: Score at...
        score = score + 1;
        # identifier already used: score at ...
        # not allowed inside an IF/THEN/ELSE statement
        rec score = score + 1;
    } else {
    }
} else if (bull_cross[1] or bear_cross[1]) {
    if secondLastTrendisUP {
    } else {
    }
} else if (bull_cross[2] or bear_cross[2]) {
    if thirdLastTrendisUP {
    } else {
    }
} else if (bull_cross[3] or bear_cross[3]) {
    if fourthLastTrendisUP {
    } else {
    }
} else if (bull_cross[4] or bear_cross[4]) {
} else {
}

# If most recent cross happened in the last 4
# and most recent cross occured on a green candle.
def bull_lookback = Highest(bull_cross, lookback);
def bear_lookback = Highest(bear_cross, lookback);

# def think = if bull_lookback or bear_lookback
plot signal = if bull_lookback then 2 else if bear_lookback then 1 else 0;
signal.AssignValueColor(if signal == 2 then Color.DARK_GREEN else if signal == 1 then Color.DARK_RED else Color.DARK_ORANGE);
AssignBackgroundColor(if signal == 2 then Color.DARK_GREEN else if signal == 1 then Color.DARK_RED else Color.DARK_ORANGE);
Once you define a variable in Thinkscript and assign it, it's only valid for one bar; it behaves as a constant, so it can't be reassigned. I'm pretty sure you can't even place a def statement inside a conditional, just like in most languages. In order to create a 'dynamic' score, you need to assign the dynamic value on the same line you instantiate it. You don't need def score = 0; since when you define the variable, it will have a zero value anyway. You also don't need the extra variables for the 'trendisup' placeholders, because secondLastTrendisUp is really the same as saying lastTrendisUp[1], since it was already computed on the previous bar. You can accomplish the counter without the extra variables using a fold statement, like this:

def score = fold index = 0 to 4 with p = 0 do p + ((bearcross[index] or bullcross[index]) and lastTrendisUp[index]);

This will add one to the score each time the conditions are true, and assign the total to the score variable. I think this is what you would like to accomplish; I can't tell, since you never show what you're doing with the score variable later on... If you are just looking to find out whether the bullcross or bearcross condition and also the lasttrendisup condition evaluate to true in any of the last five bars, then you add 'while p=0' above the with, and it will return a one to score as soon as it encounters the first true instance.
A counter to increase a variable by 1 on each bar: score = score[1] + 1; the [1] means: go and get the value of that variable from 1 bar ago.
The answer is that variables in thinkscript cannot be changed.
Scala memory issue on List vs. Vector
I wrote a solution to Project Euler problem #59 in Scala and I do not understand why switching between Vector and List adds what I think is a memory leak. Here is a working, brute-force solution using Vectors:

val code = scala.io.Source.fromFile("e59.txt").getLines()
  .flatMap(l => l.split(',')).map(_.toInt).toVector
val commonWords = scala.io.Source.fromFile("common_words.txt").getLines().toVector

def decode(k: Int)(code: Vector[Int])(pswd: Vector[Int]): Vector[Int] = {
  code.grouped(k).flatMap(cs => cs.toVector.zip(pswd).map(t => t._1 ^ t._2)).toVector
}

def scoreText(text: Vector[Int]): Int = {
  if (text.contains((c: Int) => (c < 0 || c > 128))) -1
  else {
    val words = text.map(_.toChar).mkString.toLowerCase.split(' ')
    words.length - words.diff(commonWords).length
  }
}

lazy val psswds = for {
  a <- (97 to 122); b <- (97 to 122); c <- (97 to 122)
} yield Vector(a, b, c)

val ans = psswds.toStream.map(decode(3)(code))
  .map(text => (text, scoreText(text)))
  .maxBy(_._2)._1.sum
println(ans)

I store the original code (a collection of ordered ints), each password and some common English words as Vectors. However, if I replace Vector with List, my program slows down with each checked password and eventually crashes:

val code = scala.io.Source.fromFile("e59.txt").getLines()
  .flatMap(l => l.split(',')).map(_.toInt).toList
val commonWords = scala.io.Source.fromFile("common_words.txt").getLines().toList

def decode(k: Int)(code: List[Int])(pswd: List[Int]): List[Int] = {
  println(pswd)
  code.grouped(k).flatMap(cs => cs.toList.zip(pswd).map(t => t._1 ^ t._2)).toList
}

def scoreText(text: List[Int]): Int = {
  if (text.contains((c: Int) => (c < 0 || c > 128))) -1
  else {
    val words = text.map(_.toChar).mkString.toLowerCase.split(' ')
    words.length - words.diff(commonWords).length
  }
}

lazy val psswds = for {
  a <- (97 to 122); b <- (97 to 122); c <- (97 to 122)
} yield List(a, b, c)

val ans = psswds.toStream.map(decode(3)(code))
  .map(text => (text, scoreText(text)))
  .maxBy(_._2)._1.sum
println(ans)

Error: java.lang.OutOfMemoryError: GC overhead limit exceeded at java.lang.String.valueOf(String.java:2861) at java.lang.Character.toString(Character.java:4439) at java.lang.String.valueOf(String.java:2847) at scala.collection.mutable.StringBuilder.append(StringBuilder.scala:200) at scala.collection.TraversableOnce$$anonfun$addString$1.apply(TraversableOnce.scala:349) at scala.collection.immutable.List.foreach(List.scala:381) at scala.collection.TraversableOnce$class.addString(TraversableOnce.scala:342) at scala.collection.AbstractTraversable.addString(Traversable.scala:104) at scala.collection.TraversableOnce$class.mkString(TraversableOnce.scala:308) at scala.collection.AbstractTraversable.mkString(Traversable.scala:104) at scala.collection.TraversableOnce$class.mkString(TraversableOnce.scala:310) at scala.collection.AbstractTraversable.mkString(Traversable.scala:104) at scala.collection.TraversableOnce$class.mkString(TraversableOnce.scala:312) at scala.collection.AbstractTraversable.mkString(Traversable.scala:104) at Main$$anon$1.Main$$anon$$scoreText(e59_list.scala:14) at Main$$anon$1$$anonfun$5.apply(e59_list.scala:26) at Main$$anon$1$$anonfun$5.apply(e59_list.scala:26) at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418) at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418) at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1222) at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1212) at scala.collection.immutable.Stream.foreach(Stream.scala:595) at 
scala.collection.TraversableOnce$class.maxBy(TraversableOnce.scala:227) at scala.collection.AbstractTraversable.maxBy(Traversable.scala:104) at Main$$anon$1.<init>(e59_list.scala:27) at Main$.main(e59_list.scala:1) at Main.main(e59_list.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at scala.reflect.internal.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:70) Files used: common_words.txt a able about across after all almost also am among an and any are as at be because been but by can cannot could dear did do does either else ever every for from get got had has have he her hers him his how however i if in into is it its just least let like likely may me might most must my neither no nor not of off often on only or other our own rather said say says she should since so some than that the their them then there these they this tis to too twas us wants was we were what when where which while who whom why will with would yet you your e59.txt 79,59,12,2,79,35,8,28,20,2,3,68,8,9,68,45,0,12,9,67,68,4,7,5,23,27,1,21,79,85,78,79,85,71,38,10,71,27,12,2,79,6,2,8,13,9,1,13,9,8,68,19,7,1,71,56,11,21,11,68,6,3,22,2,14,0,30,79,1,31,6,23,19,10,0,73,79,44,2,79,19,6,28,68,16,6,16,15,79,35,8,11,72,71,14,10,3,79,12,2,79,19,6,28,68,32,0,0,73,79,86,71,39,1,71,24,5,20,79,13,9,79,16,15,10,68,5,10,3,14,1,10,14,1,3,71,24,13,19,7,68,32,0,0,73,79,87,71,39,1,71,12,22,2,14,16,2,11,68,2,25,1,21,22,16,15,6,10,0,79,16,15,10,22,2,79,13,20,65,68,41,0,16,15,6,10,0,79,1,31,6,23,19,28,68,19,7,5,19,79,12,2,79,0,14,11,10,64,27,68,10,14,15,2,65,68,83,79,40,14,9,1,71,6,16,20,10,8,1,79,19,6,28,68,14,1,68,15,6,9,75,79,5,9,11,68,19,7,13,20,79,8,14,9,1,71,8,13,17,10,23,71,3,13,0,7,16,71,27,11,71,10,18,2,29,29,8,1,1,73,79,81,71,59,12,2,79,8,14,8,12,19,79,23,15,6,10,2,28,68,19,7,22,8,26,3,15,79,16,15,10,68,3,14,22,12,1,1,20,28,72,71,14,10,3,79,16,15,10,68,3,14,22,12,1,1,20,28,68,4,14,10,71,1,1,17,10,22,71,10,28,19,6,10,0,26,13,20,7,68,14,27,74,71,89,68,32,0,0,71,28,1,9,27,68,45,0,12,9,79,16,15,10,68,37,14,20,19,6,23,19,79,83,71,27,11,71,27,1,11,3,68,2,25,1,21,22,11,9,10,68,6,13,11,18,27,68,19,7,1,71,3,13,0,7,16,71,28,11,71,27,12,6,27,68,2,25,1,21,22,11,9,10,68,10,6,3,15,27,68,5,10,8,14,10,18,2,79,6,2,12,5,18,28,1,71,0,2,71,7,13,20,79,16,2,28,16,14,2,11,9,22,74,71,87,68,45,0,12,9,79,12,14,2,23,2,3,2,71,24,5,20,79,10,8,27,68,19,7,1,71,3,13,0,7,16,92,79,12,2,79,19,6,28,68,8,1,8,30,79,5,71,24,13,19,1,1,20,28,68,19,0,68,19,7,1,71,3,13,0,7,16,73,79,93,71,59,12,2,79,11,9,10,68,16,7,11,71,6,23,71,27,12,2,79,16,21,26,1,71,3,13,0,7,16,75,79,19,15,0,68,0,6,18,2,28,68,11,6,3,15,27,68,19,0,68,2,25,1,21,22,11,9,10,72,71,24,5,20,79,3,8,6,10,0,79,16,8,79,7,8,2,1,71,6,10,19,0,68,19,7,1,71,24,11,21,3,0,73,79,85,87,79,38,18,27,68,6,3,16,15,0,17,0,7,68,19,7,1,71,24,11,21,3,0,71,24,5,20,79,9,6,11,1,71,27,12,21,0,17,0,7,68,15,6,9,75,79,16,15,10,68,16,0,22,11,11,68,3,6,0,9,72,16,71,29,1,4,0,3,9,6,30,2,79,12,14,2,68,16,7,1,9,79,12,2,79,7,6,2,1,73,79,85,86,79,33,17,10,10,71,6,10,71,7,13,20,79,11,16,1,68,11,14,10,3,79,5,9,11,68,6,2,11,9,8,68,15,6,23,71,0,19,9,79,20,2,0,20,11,10,72,71,7,1,71,24,5,20,79,10,8,27,68,6,12,7,2,31,16,2,11,74,71,94,86,71,45,17,19,79,16,8,79,5,11,3,68,16,7,11,71,13,1,11,6,1,17,10,0,71,7,13,10,79,5,9,11,68,6,12,7,2,31,16,2,11,68,15,6,9,75,79,12,2,79,3,6,25,1,71,27,12,2,79,22,14,8,12
,19,79,16,8,79,6,2,12,11,10,10,68,4,7,13,11,11,22,2,1,68,8,9,68,32,0,0,73,79,85,84,79,48,15,10,29,71,14,22,2,79,22,2,13,11,21,1,69,71,59,12,14,28,68,14,28,68,9,0,16,71,14,68,23,7,29,20,6,7,6,3,68,5,6,22,19,7,68,21,10,23,18,3,16,14,1,3,71,9,22,8,2,68,15,26,9,6,1,68,23,14,23,20,6,11,9,79,11,21,79,20,11,14,10,75,79,16,15,6,23,71,29,1,5,6,22,19,7,68,4,0,9,2,28,68,1,29,11,10,79,35,8,11,74,86,91,68,52,0,68,19,7,1,71,56,11,21,11,68,5,10,7,6,2,1,71,7,17,10,14,10,71,14,10,3,79,8,14,25,1,3,79,12,2,29,1,71,0,10,71,10,5,21,27,12,71,14,9,8,1,3,71,26,23,73,79,44,2,79,19,6,28,68,1,26,8,11,79,11,1,79,17,9,9,5,14,3,13,9,8,68,11,0,18,2,79,5,9,11,68,1,14,13,19,7,2,18,3,10,2,28,23,73,79,37,9,11,68,16,10,68,15,14,18,2,79,23,2,10,10,71,7,13,20,79,3,11,0,22,30,67,68,19,7,1,71,8,8,8,29,29,71,0,2,71,27,12,2,79,11,9,3,29,71,60,11,9,79,11,1,79,16,15,10,68,33,14,16,15,10,22,73
Large numbers of Lists create more load on the GC compared to the same Vectors. But your problem is not about the right choice of collections; it is about the wrong use of Stream. Scala's streams can be very memory inefficient if used improperly. In your case, I assume, you were trying to use Stream to avoid eager computation of the transformed psswds collection, but you actually made things worse (Stream not only memoized your elements, it created extra overhead with Stream wrappers around those elements). What you had to do is just replace toStream with view. It will create a collection wrapper which makes nearly all transformations lazy (basically what you tried to achieve):

val ans = psswds.view.map(decode(3)(code))
  .map(text => (text, scoreText(text)))
  .maxBy(_._2)._1.sum

After this tiny fix your program runs fine even with -Xmx5m (I checked). There are also many other things to optimize in your program (try to avoid creating excessive collections), but I'll leave that to you.
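To see why toStream hurt here, compare how much each approach retains; a small illustration of the general behaviour (not taken from the original program):

  // A Stream memoizes: as long as `s` references the head, every element
  // that has been forced stays reachable and cannot be garbage collected.
  val s = (1 to 1000000).toStream.map(_ * 2)
  s.max

  // A view is only a recipe: elements are produced while traversing and can
  // be collected right away, so memory use stays flat.
  val v = (1 to 1000000).view.map(_ * 2)
  v.max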
How to maintain an immutable list when its objects are linked to each other and updated
I'm trying to code the fast non-dominated sorting algorithm (NDS) of Deb, used in NSGA2, in an immutable way using Scala. But the problem seems more difficult than I thought, so I've simplified it here to make a MWE. Imagine a population of Seq[A], where each A element is decorated (decoratedA) with a list which contains pointers to other elements of the population Seq[A]. A function evalA(a: decoratedA) takes the list of linkedA it contains and decrements the value of each. Next I take a subset list decoratedAPopulation of population A, and call evalA on each. I have a problem, because between each iteration on an element of this subset list decoratedAPopulation, I need to update my population of A with the new decoratedA and the new updated linkedA it contains... More problematic, each element of the population needs an update of linkedA to replace a linked element if it changes... As you can see, it seems complicated to maintain all the linked lists synchronized this way. I propose another solution below, which probably needs recursion to return, after each evalA, a new population with elements replaced. How can I do that correctly in an immutable way? It's easy to code in a mutable way, but I don't find a good way to do this immutably. Do you have a path or an idea to do that?

object test extends App {
  case class A(value: Int) { def decrement() = new A(value - 1) }
  case class decoratedA(oneAdecorated: A, listOfLinkedA: Seq[A])

  // We start the algorithm loop with A elements with value = 0
  val population = Seq(new A(0), new A(0), new A(8), new A(1))
  val decoratedApopulation = Seq(
    new decoratedA(population(1), Seq(population(2), population(3))),
    new decoratedA(population(2), Seq(population(1), population(3))))

  def evalA(a: decoratedA) = {
    val newListOfLinked = a.listOfLinkedA.map { e => e.decrement() }
    new decoratedA(a.oneAdecorated, newListOfLinked)
  }

  def run() = {
    //decoratedApopulation.map{
    //  ?
    //}
  }
}

Update 1:

About the input/output of the initial algorithm. The first part of Deb's algorithm (Step 1 to Step 3) analyses a list of individuals and computes for each A: (a) the domination count, the number of A which dominate me (the value attribute of A); (b) a list of the A I dominate (listOfLinkedA). So it returns a population of decoratedA fully initialized, and for the entry of Step 4 (my problem) I take the first non-dominated front, i.e. the subset of decoratedA elements with A value = 0. My problem starts here, with a list of decoratedA with A value = 0; I search for the next front inside this list by computing each listOfLinkedA of each of these A. At each iteration between Step 4 and Step 6, I need to compute a new subset list B of decoratedA with A value = 0. For each, I first decrement the domination count attribute of each element in listOfLinkedA, then I filter to get the elements equal to 0. At the end of Step 6, B is saved to a List[Seq[DecoratedA]], then I restart at Step 4 with B and compute a new C, etc.
Something like that in my code: I call explore() for each element of B, with Q equal at the end to the new subset of decoratedA with value (fitness here) = 0:

case class PopulationElement(popElement: Seq[Double]) {
  implicit def poptodouble(): Seq[Double] = {
    popElement
  }
}

class SolutionElement(values: PopulationElement, fitness: Double, dominates: Seq[SolutionElement]) {
  def decrement() =
    if (fitness == 0) this
    else new SolutionElement(values, fitness - 1, dominates)

  def explore(Q: Seq[SolutionElement]): Seq[SolutionElement] = {
    // return all dominated elements with fitness - 1
    val newSolutionSet = dominates.map { _.decrement() }
    val filteredSolution: Seq[SolutionElement] = newSolutionSet.filter { s => s.fitness == 0.0 }.diff(Q)
    filteredSolution
  }
}

At the end of the algorithm, I have a final list of sequences of decoratedA, List[Seq[DecoratedA]], which contains all the computed fronts.

Update 2

A sample of values extracted from this example. I take only the Pareto front (red) and the {f,h,l} next front with dominated count = 1.

case class p(x: Double, y: Double)
case class A(XY: p, value: Int) { def decrement() = new A(XY, value - 1) }
case class ARoot(node: A, children: Seq[A])

val a = A(p(3.5, 1.0), 0)
val b = A(p(3.0, 1.5), 0)
val c = A(p(2.0, 2.0), 0)
val d = A(p(1.0, 3.0), 0)
val e = A(p(0.5, 4.0), 0)
val f = A(p(0.5, 4.5), 1)
val h = A(p(1.5, 3.5), 1)
val l = A(p(4.5, 1.0), 1)

val population = Seq(
  ARoot(a, Seq(f, h, l)),
  ARoot(b, Seq(f, h, l)),
  ARoot(c, Seq(f, h, l)),
  ARoot(d, Seq(f, h, l)),
  ARoot(e, Seq(f, h, l)),
  ARoot(f, Nil),
  ARoot(h, Nil),
  ARoot(l, Nil))

The algorithm returns List(List(a,b,c,d,e), List(f,h,l)).

Update 3

After 2 hours, and some pattern matching problems (ahem...), I'm coming back with a complete example which computes the dominated counter and the children of each ARoot automatically.
But I have the same problem: my children list computation is not totally correct, because each element A is possibly a shared member of another ARoot's children list, so I need to think about your answer to modify it :/ At this time I only compute children lists of Seq[p], and I need lists of Seq[A]:

case class p(x: Double, y: Double) {
  def toSeq(): Seq[Double] = Seq(x, y)
}
case class A(XY: p, dominatedCounter: Int) { def decrement() = new A(XY, dominatedCounter - 1) }
case class ARoot(node: A, children: Seq[A])
case class ARootRaw(node: A, children: Seq[p])

object test_stackoverflow extends App {
  val a = new p(3.5, 1.0)
  val b = new p(3.0, 1.5)
  val c = new p(2.0, 2.0)
  val d = new p(1.0, 3.0)
  val e = new p(0.5, 4.0)
  val f = new p(0.5, 4.5)
  val g = new p(1.5, 4.5)
  val h = new p(1.5, 3.5)
  val i = new p(2.0, 3.5)
  val j = new p(2.5, 3.0)
  val k = new p(3.5, 2.0)
  val l = new p(4.5, 1.0)
  val m = new p(4.5, 2.5)
  val n = new p(4.0, 4.0)
  val o = new p(3.0, 4.0)
  val p = new p(5.0, 4.5)

  def isStriclyDominated(p1: p, p2: p): Boolean = {
    (p1.toSeq zip p2.toSeq).exists { case (g1, g2) => g1 < g2 }
  }

  def sortedByRank(population: Seq[p]) = {
    def paretoRanking(values: Set[p]) = {
      // comment from @dk14: I suppose the order of values doesn't matter here, otherwise use SortedSet
      values.map { v1 =>
        val t = (values - v1).filter(isStriclyDominated(v1, _)).toSeq
        val a = new A(v1, values.size - t.size - 1)
        val root = new ARootRaw(a, t)
        println("Root value ", root)
        root
      }
    }
    val listOfARootRaw = paretoRanking(population.toSet)
    // From @dk14: a map with the dominatedCounter for each point
    val dominations: Map[p, Int] = listOfARootRaw.map(a => a.node.XY -> a.node.dominatedCounter).toMap
    // From @dk14: here is the conversion from Seq[p] to Seq[A]
    val listOfARoot = listOfARootRaw.map(raw => ARoot(raw.node, raw.children.map(p => A(p, dominations.getOrElse(p, 0)))))
    listOfARoot.groupBy(_.node.dominatedCounter)
  }

  // Get the first front, a subset of ARoot, and start Step 4
  println(sortedByRank(Seq(a, b, c, d, e, f, g, h, i, j, k, l, m, n, o, p)).head)
}
Talking about your problem with distinguishing fronts (after update 2):

val (left, right) = population.partition(_.node.value == 0)
List(left, right.map(r => r.copy(node = r.node.copy(value = r.node.value - 1))))

No need for mutating anything here. copy will copy everything except the fields you specified with new values. Talking about the code, the new copy will be linked to the same list of children, but with new value = value - 1.

P.S. I have a feeling you may actually want to do something like this:

case class A(id: String, level: Int)
val a = A("a", 1)
val b = A("b", 2)
val c = A("c", 2)
val d = A("d", 3)
clusterize(List(a,b,c,d)) === List(List(a), List(b,c), List(d))

It's simple to implement:

def clusterize(list: List[A]) = list.groupBy(_.level).toList.sortBy(_._1).map(_._2)

Test:

scala> clusterize(List(A("a", 1), A("b", 2), A("c", 2), A("d", 3)))
res2: List[List[A]] = List(List(A(a,1)), List(A(b,2), A(c,2)), List(A(d,3)))

P.S.2. Please consider better naming conventions, like here.

Talking about "mutating" elements in some complex structure: the idea of "immutably mutating" some shared (between parts of a structure) value is to separate your "mutation" from the structure. Or, simply put, divide and conquer: calculate the changes in advance, then apply them. The code:

case class A(v: Int)
case class AA(a: A, seq: Seq[A]) // decoratedA

def update(input: Seq[AA]) = {
  // shows how to decrement each value wherever it is:
  val stats = input.map(_.a).groupBy(identity).mapValues(_.size) // domination count for each A
  def upd(a: A) = A(a.v - stats.getOrElse(a, 0))                 // apply decrement
  input.map(aa => aa.copy(a = upd(aa.a), seq = aa.seq.map(upd))) // traverse and "update" the original structure
}

So, I've introduced a new Map[A, Int] structure that shows how to modify the original one. This approach is based on a highly simplified version of the Applicative Functor concept. In the general case, it should be Map[A, A => A], or even Map[K, A => B], or even Map[K, Zipper[A] => B] as an applicative functor (input <*> map). Zipper (see 1, 2) could actually give you information about the current element's context.

Notes:
I assumed that As with the same value are the same; that's the default behaviour for case classes, otherwise you need to provide some additional ids (or redefine hashCode/equals).
If you need more levels - like AA(AA(AA(...))) - just make stats and upd recursive; if the decrement's weight depends on the nesting level, just add the nesting level as a parameter to your recursive function.
If the decrement depends on the parent node (like: decrement only the A(3)'s which belong to an A(3)), add the parent node(s) as part of stats's key and analyse it during upd.
If there is some dependency between the stats calculations (how much to decrement) of, let's say, input(1) and input(0), you should use foldLeft with the partial stats as the accumulator:

val stats = input.foldLeft(Map[A, Int]())((partialStats, elem) => partialStats ++ analize(partialStats, elem))

Btw, it takes O(N) here (linear memory and cpu usage).

Example:

scala> val population = Seq(A(3), A(6), A(8), A(3))
population: Seq[A] = List(A(3), A(6), A(8), A(3))

scala> val input = Seq(AA(population(1),Seq(population(2),population(3))), AA(population(2),Seq(population(1),population(3))))
input: Seq[AA] = List(AA(A(6),List(A(8), A(3))), AA(A(8),List(A(6), A(3))))

scala> update(input)
res34: Seq[AA] = List(AA(A(5),List(A(7), A(3))), AA(A(7),List(A(5), A(3))))
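As a rough illustration of that generalisation (my own sketch, not part of the answer above), the precomputed map can carry update functions instead of plain counts, and the traversal stays exactly the same:

case class A(v: Int)
case class AA(a: A, seq: Seq[A])

// changes computed in advance: which A to touch and how
def updateWith(input: Seq[AA], changes: Map[A, A => A]): Seq[AA] = {
  def upd(a: A) = changes.getOrElse(a, identity[A] _)(a)
  input.map(aa => aa.copy(a = upd(aa.a), seq = aa.seq.map(upd)))
}

// e.g. decrement only A(6) and A(8):
// updateWith(input, Map(A(6) -> ((a: A) => A(a.v - 1)), A(8) -> ((a: A) => A(a.v - 1))))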
Computing all values or stopping and returning just the best value if found
I have a list of items and for each item I am computing a value. Computing this value is a bit computationally intensive, so I want to minimise it as much as possible. The algorithm I need to implement is this:

1. I have a value X
2. For each item
   a. compute the value for it; if it is < 0 ignore it completely
   b. if (value > 0) && (value < X) return the pair (item, value)
3. Return all (item, value) pairs in a List (that have value > 0), ideally sorted by value

To make it a bit clearer, step 3 only happens if none of the items have a value less than X. In step 2, when we encounter the first item whose value is less than X we should not compute the rest and just return that item (we can obviously return it in a Set() by itself to match the return type). The code I have at the moment is as follows:

val itemValMap = items.foldLeft(Map[Item, Int]()) {
  (map: Map[Item, Int], key: Item) =>
    val value = computeValue(key)
    if (value >= 0) // we filter out negative ones
      map + (key -> value)
    else map
}

val bestItem = itemValMap.minBy(_._2)
if (bestItem._2 < bestX) {
  List(bestItem)
} else {
  itemValMap.toList.sortBy(_._2)
}

However, what this code is doing is computing all the values in the list and choosing the best one, rather than stopping as soon as a 'better' one is found. I suspect I have to use Streams in some way to achieve this?
OK, I'm not sure how your whole setup looks, but I tried to prepare a minimal example that mirrors your situation. Here it is then:

object StreamTest {
  case class Item(value: Int)

  def createItems() = List(Item(0), Item(3), Item(30), Item(8), Item(8), Item(4), Item(54), Item(-1), Item(23), Item(131))

  def computeValue(i: Item) = { Thread.sleep(3000); i.value * 2 - 2 }

  def process(minValue: Int)(items: Seq[Item]) = {
    val stream = Stream(items: _*).map(item => item -> computeValue(item)).filter(tuple => tuple._2 >= 0)
    stream.find(tuple => tuple._2 < minValue).map(List(_)).getOrElse(stream.sortBy(_._2).toList)
  }
}

Each calculation takes 3 seconds. Now let's see how it works:

val items = StreamTest.createItems()
val result = StreamTest.process(2)(items)
result.foreach(r => println("Original: " + r._1 + " , calculated: " + r._2))

Gives:

[info] Running Main
Original: Item(3) , calculated: 4
Original: Item(4) , calculated: 6
Original: Item(8) , calculated: 14
Original: Item(8) , calculated: 14
Original: Item(23) , calculated: 44
Original: Item(30) , calculated: 58
Original: Item(54) , calculated: 106
Original: Item(131) , calculated: 260
[success] Total time: 31 s, completed 2013-11-21 15:57:54

Since there's no value smaller than 2, we get a list ordered by the calculated value. Notice that two pairs are missing, because their calculated values are smaller than 0 and got filtered out. OK, now let's try with a different minimum cut-off point:

val result = StreamTest.process(5)(items)

Which gives:

[info] Running Main
Original: Item(3) , calculated: 4
[success] Total time: 7 s, completed 2013-11-21 15:55:20

Good, it returned a list with only one item: the first value (the second item in the original list) that was smaller than the 'minimal' value and not smaller than 0. I hope the example above is easily adaptable to your needs...
A simple way to avoid the computation of unneeded values is to make your collection lazy by using the view method:

val weigthedItems = items.view.map { i => i -> computeValue(i) }.filter(_._2 >= 0)
weigthedItems.find(_._2 < X).map(List(_)).getOrElse(weigthedItems.sortBy(_._2))

For example, here is a test in the REPL:

scala> :paste
// Entering paste mode (ctrl-D to finish)

type Item = String
def computeValue(item: Item): Int = {
  println("Computing " + item)
  item.toInt
}
val items = List[Item]("13", "1", "5", "-7", "12", "3", "-1", "15")
val X = 10
val weigthedItems = items.view.map { i => i -> computeValue(i) }.filter(_._2 >= 0)
weigthedItems.find(_._2 < X).map(List(_)).getOrElse(weigthedItems.sortBy(_._2))

// Exiting paste mode, now interpreting.

Computing 13
Computing 1
defined type alias Item
computeValue: (item: Item)Int
items: List[String] = List(13, 1, 5, -7, 12, 3, -1, 15)
X: Int = 10
weigthedItems: scala.collection.SeqView[(String, Int),Seq[_]] = SeqViewM(...)
res27: Seq[(String, Int)] = List((1,1))

As you can see, computeValue was only called up to the first value < X (that is, up to 1).
Scala objects not changing their internal state
I am seeing a problem with some Scala 2.7.7 code I'm working on that should not happen if the equivalent was written in Java. Loosely, the code creates a bunch of card players and assigns them to tables.

class Player(val playerNumber: Int)

class Table(val tableNumber: Int) {
  var players: List[Player] = List()

  def registerPlayer(player: Player) {
    println("Registering player " + player.playerNumber + " on table " + tableNumber)
    players = player :: players
  }
}

object PlayerRegistrar {
  def assignPlayersToTables(playSamplesToExecute: Int, playersPerTable: Int) = {
    val numTables = playSamplesToExecute / playersPerTable
    val tables = (1 to numTables).map(new Table(_))
    assert(tables.size == numTables)

    (0 until playSamplesToExecute).foreach { playSample =>
      val tableNumber: Int = playSample % numTables
      tables(tableNumber).registerPlayer(new Player(playSample))
    }
    tables
  }
}

The PlayerRegistrar assigns a number of players between tables. First, it works out how many tables it will need to break the players up between and creates a List of them. Then in the second part of the code, it works out which table a player should be assigned to, pulls that table from the list and registers a new player on that table. The list of players on a table is a var, and is overwritten each time registerPlayer() is called. I have checked that this works correctly through a simple TestNG test:

@Test def testRegisterPlayer_multiplePlayers() {
  val table = new Table(1)
  (1 to 10).foreach { playerNumber =>
    val player = new Player(playerNumber)
    table.registerPlayer(player)
    assert(table.players.contains(player))
    assert(table.players.length == playerNumber)
  }
}

I then test the table assignment:

@Test def testAssignPlayerToTables_1table() = {
  val tables = PlayerRegistrar.assignPlayersToTables(10, 10)
  assertEquals(tables.length, 1)
  assertEquals(tables(0).players.length, 10)
}

The test fails with "expected:<10> but was:<0>". I've been scratching my head, but can't work out why registerPlayer() isn't mutating the table in the list. Any help would be appreciated.
The reason is that in the assignPlayersToTables method, you are creating a new Table object each time. You can confirm this by adding some debugging into the loop:

val tableNumber: Int = playSample % numTables
println(tables(tableNumber))
tables(tableNumber).registerPlayer(new Player(playSample))

Yielding something like:

Main$$anon$1$Table@5c73a7ab
Registering player 0 on table 1
Main$$anon$1$Table@21f8c6df
Registering player 1 on table 1
Main$$anon$1$Table@53c86be5
Registering player 2 on table 1

Note how the memory address of the table is different for each call. The reason for this behaviour is that a Range is non-strict in Scala (until Scala 2.8, anyway). This means that the call to the range is not evaluated until it's needed. So you think you're getting back a list of Table objects, but actually you're getting back a range which is evaluated (instantiating a new Table object) each time you call it. Again, you can confirm this by adding some debugging:

val tables = (1 to numTables).map(new Table(_))
println(tables)

Which gives you:

RangeM(Main$$anon$1$Table@5492bbba)

To do what you want, add a toList to the end:

val tables = (1 to numTables).map(new Table(_)).toList
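In current Scala versions map on a Range is strict, so this exact trap is gone, but the same effect can still be reproduced with an explicit view; a small sketch for illustration (not from the original answer):

class Table(val n: Int) { var players: List[Int] = Nil }

// a lazy projection: indexing it builds a brand new Table every time
val lazyTables = (1 to 3).view.map(new Table(_))
lazyTables(0).players = List(42)
println(lazyTables(0).players)   // List() - the mutated Table was already thrown away

// forcing the collection once means the same objects are reused afterwards
val strictTables = (1 to 3).map(new Table(_)).toList
strictTables(0).players = List(42)
println(strictTables(0).players) // List(42)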
val tables = (1 to numTables).map(new Table(_))

This line seems to be causing all the trouble - mapping over 1 to n gives you a RandomAccessSeq.Projection, and to be honest, I don't know exactly how they work, but a less clever initialisation technique does the job:

var tables: Array[Table] = new Array(numTables)
for (i <- 0 until numTables) tables(i) = new Table(i)

Using the first initialisation method I wasn't able to change the objects (just like you), but using a simple array everything seems to be working.