Instantiating a Case Class from a Large Parameter List

I'm running into trouble when I have a large parameter list; with a few parameters it works perfectly. Does anyone have any idea what it might be?
Small parameter list, OK:
scala> case class Foo(a: Int, b: String, c: Double)
defined class Foo
scala> val params = Foo(1, "bar", 3.14).productIterator.toList
params: List[Any] = List(1, bar, 3.14)
scala> Foo.getClass.getMethods.find(x => x.getName == "apply" && x.isBridge).get.invoke(Foo, params map (_.asInstanceOf[AnyRef]): _*).asInstanceOf[Foo]
res0: Foo = Foo(1,bar,3.14)
scala> Foo(1, "bar", 3.14) == res0
res1: Boolean = true
When I have a very large parameter list, I get the error below:
scala> case class Foo(a1: String,a2: String,a3: String,a4: String,a5: String,a6: String,a7: String,a8: String,a9: String,a10: String,a12: String,a13: String,a14: String,a15: String,a16: String,a17: String,a18: String,a19: String,a20: String,a21: String,a22: String,a23: String,a24: String)
defined class Foo
scala> val params2 = Foo("bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar","bar").productIterator.toList
params2: List[Any] = List(bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar, bar)
scala> val test = Foo.getClass.getMethods.find(x => x.getName == "apply" && x.isBridge).get.invoke(Foo, params2 map (_.asInstanceOf[AnyRef]): _*).asInstanceOf[Foo]
java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:347)
at scala.None$.get(Option.scala:345)
... 46 elided

There is a 22-field limit on case classes. Since Scala 2.11, bigger case classes still compile, but with some limitations: the companion object no longer extends a FunctionN, so there is no bridge apply method (which is why your isBridge lookup returns None and you get None.get), and unapply and tupled are not generated.
https://underscore.io/blog/posts/2016/10/11/twenty-two.html
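As a sketch of a workaround under that limitation: since the companion of a >22-field case class has no bridge apply, you can look the method up by name alone. This assumes the companion defines exactly one apply overload; with several, you would also need to match on parameter count.

```scala
// Sketch of a workaround (assumes the companion has exactly one `apply`
// overload; with several, filter on getParameterCount as well).
val applyMethod = Foo.getClass.getMethods
  .find(_.getName == "apply")
  .getOrElse(sys.error("no apply method on companion"))

val rebuilt = applyMethod
  .invoke(Foo, params2.map(_.asInstanceOf[AnyRef]): _*)
  .asInstanceOf[Foo]
```
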

Related

Akka stream equivalent of flatmap operator in RxJava

I have a background in RxJava and now I am getting started using Akka Streams.
When I need to compose a stream with the result of the first stream in RxJava, I usually do the following:
val fooBarObservable = fooObservable.flatMap { foo ->
  barObservable.map { bar -> someOperation(foo, bar) }
}
In this example, fooObservable emits Foo, barObservable emits Bar, and fooBarObservable emits FooBar.
Note:
Observable is very similar to Source in Akka Streams.
barObservable will be a Flow
So what's the easy way to compose a stream like that in Akka?
You can use flatMapConcat, as follows:
import akka.NotUsed
import akka.stream.scaladsl.{Sink, Source}

final case class Foo(a: String)
final case class Bar(a: String)

val fooSource = Source(List(Foo("q-foo"), Foo("b-foo"), Foo("d-foo")))
val barSource = Source(List(Bar("q-bar"), Bar("b-bar"), Bar("d-bar")))

val value: Source[String, NotUsed] =
  fooSource.flatMapConcat(foo => barSource.map(bar => foo.a + " " + bar.a))
Running value.runWith(Sink.foreach(println)) will yield:
q-foo q-bar
q-foo b-bar
q-foo d-bar
b-foo q-bar
b-foo b-bar
b-foo d-bar
d-foo q-bar
d-foo b-bar
d-foo d-bar
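One caveat on semantics: RxJava's flatMap merges substreams eagerly, while flatMapConcat fully drains one substream before starting the next (it is closer to concatMap). If you want the merge semantics, flatMapMerge is the closer analogue. A sketch, reusing fooSource and barSource from the snippet above:

```scala
// flatMapMerge runs up to `breadth` substreams concurrently and merges
// their elements as they arrive (order across substreams is not guaranteed).
val merged: Source[String, NotUsed] =
  fooSource.flatMapMerge(breadth = 4, foo => barSource.map(bar => foo.a + " " + bar.a))
```
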
Caveat: I'm not quite familiar with RxJava, so I may misunderstand the semantics you are after.
Since a Flow is not a terminated graph, you can't really combine its output with a source in the style of fooSource.operation(barFlow); that wouldn't give you a runnable graph of stages back. You'd first need to provide a barSource and combine the barFlow with that.
You could compose it the other way around though, for example like this:
val barFlow: Flow[Bar, Bar, NotUsed] = ???
val fooSource: Source[Foo, NotUsed] = ???
val fooToBarFooPairsFlow: Flow[Bar, (Bar, Foo), NotUsed] =
barFlow.zip(fooSource)
This tuples up pairs of Bar and Foo values, letting you later run it with barSource.via(fooToBarFooPairsFlow).map { case (bar, foo) => op(bar, foo) }.
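A minimal runnable sketch of that zip approach; the concrete sources, the identity barFlow, and the actor system are assumptions made here for illustration (assuming Akka 2.6+, where an implicit ActorSystem provides the materializer):

```scala
import akka.NotUsed
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Flow, Sink, Source}

final case class Foo(a: String)
final case class Bar(a: String)

implicit val system: ActorSystem = ActorSystem("zip-demo")

val fooSource: Source[Foo, NotUsed] = Source(List(Foo("f1"), Foo("f2")))
val barSource: Source[Bar, NotUsed] = Source(List(Bar("b1"), Bar("b2")))
val barFlow: Flow[Bar, Bar, NotUsed] = Flow[Bar] // identity, stands in for real processing

// zip pairs each element leaving barFlow with the next element of fooSource
val fooToBarFooPairsFlow: Flow[Bar, (Bar, Foo), NotUsed] = barFlow.zip(fooSource)

barSource
  .via(fooToBarFooPairsFlow)
  .map { case (bar, foo) => bar.a + " " + foo.a }
  .runWith(Sink.foreach(println)) // prints "b1 f1", then "b2 f2"
```
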

How to have function returned value in a SML record

I am new to SML and exploring records and types, specifically how to have a function inside a record.
For example, I created below type-
type foo = {
  var1 : int,
  f1 : int -> int  (* want to have the result of function f1 here *)
};
Now if I declare a record rec1 of type foo:
val rec1 = { var1 = 10, .... }
I don't see how to populate the second field; f1(10) gives an error. Also, can we declare and define the function inside the type, like below?
type foo = {
  var1 : int,
  f1 (x) = x + x
};
Please share your opinion.
You need to use a function expression:
val rec1 = {var1 = 10, f1 = fn x => x}
No, you cannot define the value of a record field in a type definition. But you can define a little helper function as a "constructor":
fun new_foo i = {var1 = i, f1 = fn x => x + x}

Call collect on partial function as member of an object

I am using some functions that return Options, but I would like to replace them by PartialFunctions in order to use the collect function in Scala. In detail I am trying to call collect (and collectFirst) on a list of objects that contain a partial function in the following way:
class A(val bs : List[B]) {
def getAll(i : Int): List[Int] = bs.map(_.foo(i))
}
class B(val y : Int) {
val foo : PartialFunction[Int, Int] = {
case x if x > y => y
}
}
The above code compiles and does what I want to if the function foo is defined for all values in bs.
val a = new A(List(new B(1), new B(2), new B(3)));
println(a.getAll(5))
prints "List(1, 2, 3)", but of course I run into a MatchError if foo is not defined for a value. Hence, I want to replace map with collect:
def getAll(i : Int): List[Int] = bs.collect(_.foo(i))
In my understanding, collect should work pretty much the same as map, yet the above code does not compile (I get a type mismatch because foo cannot be resolved). What is the best way to handle this case?
collect expects to receive a partial function (not a function or lambda) that takes an element of your collection as input: a PartialFunction[B, Int]. But here you just want to call your PartialFunction[Int, Int] inside another function, so you aren't even passing a PartialFunction into collect; you're passing the ordinary function (_.foo(i)): B => Int.
Solution without collect (but still with PartialFunction as a member of B):
def getAll(i : Int): List[Int] = bs.flatMap(_.foo.lift(i))
lift raises your function from PartialFunction[Int, Int] to Int => Option[Int] here.
If you really really want to collect:
import Function._
def getAll(i : Int): List[Int] = bs.collect(unlift(_.foo.lift(i)))
unlift turns the B => Option[Int] back into a PartialFunction[B, Int].
The ugliest version is collect{ case x if x.foo.isDefinedAt(i) => x.foo(i) }
Better solution is to move your partial function outside of B:
case class A(bs : List[B]) {
def getAll(x : Int): List[Int] = bs.collect {
case B(y) if x > y => y
}
}
case class B(y : Int)
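Putting the lift-based version together as a runnable sketch (the sample values are assumed here for illustration):

```scala
class B(val y: Int) {
  val foo: PartialFunction[Int, Int] = { case x if x > y => y }
}

class A(val bs: List[B]) {
  // lift turns foo into Int => Option[Int]; flatMap drops the Nones
  def getAll(i: Int): List[Int] = bs.flatMap(_.foo.lift(i))
}

val a = new A(List(new B(1), new B(2), new B(7)))
println(a.getAll(5)) // B(7)'s foo is not defined at 5, so: List(1, 2)
```
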

Explicit polymorphic type in record

In OCaml, it is possible to define explicit polymorphic type in a record
type foo = { f : 'a. unit -> 'a };;
It seems we can assign only general values to f like
{ f = fun () -> failwith ""; }
or
{ f = fun () -> exit 1; }
How to use this language feature in real world? Is there any good practical example?
This isn't really connected with records. If you declare any function to have type 'a. unit -> 'a (takes nothing and returns whatever the caller wanted) then you can only use it for functions that don't return.
Here's a slightly more useful example: a record containing a function for finding the length of lists (of any type).
# type foo = { f : 'a. 'a list -> int };;
type foo = { f : 'a. 'a list -> int; }
# let foo = { f = List.length };;
val foo : foo = {f = <fun>}
# foo.f [1;2;3];;
- : int = 3
It can be useful if you want to pass a function like List.length as an argument to another function and have that function use it at multiple types:
Say we want to pass List.length to test. We can't do it directly:
# let test fn = fn [1;2;3] + fn ["a";"b";"c"];;
Error: This expression has type string but an expression was expected of type
int
But we can use a record:
# let test foo = foo.f [1;2;3] + foo.f ["a";"b";"c"];;
val test : foo -> int = <fun>
# test foo;;
- : int = 6

Record labels must be globally unique? [duplicate]

This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
Two fields of two records have same label in OCaml
In OCaml 3.12.0, is it necessary that the labels of a record have globally unique names?
# type foo = { a : int; b : char; };;
type foo = { a : int; b : char; }
# type bar = { a : int; b : string };;
type bar = { a : int; b : string; }
# {a=3; b='a'};;
Error: This expression has type char but an expression was expected of type
         string
I guess if the record is created anonymously, the only way for the compiler to know which type I'm referring to is the labels. Does declaring bar hide foo?
No, record labels don't have to be globally unique, but they do need to be unique within a module to be used unambiguously. Declaring bar doesn't hide the type foo itself, but it does shadow the labels a and b, so type inference now resolves b to bar's string field; that's why the char literal is rejected.
You can easily create submodules and use module names to distinguish between records with the same label:
module Foo = struct
type foo = {a: int; b: char}
end
module Bar = struct
type bar = {a: int; b: string}
end
let f = {Foo.a = 3; b = 'a'} (* Note that we only need to use module name once *)
let b = {Bar.a = 3; b = "a"}