Akka Streams inlets and outlets mismatch

I'm getting the following error:
java.lang.IllegalArgumentException: requirement failed: The inlets [] and outlets [BlockOut.out] must correspond to the inlets [] and outlets [BlockOut.out]
I have a very simple graph:
val g1 = GraphDSL.create() { implicit builder =>
  import GraphDSL.Implicits._
  val in: Source[ByteString, Any] = Source.single(ByteString(digest))
  val flow: GraphStage[FlowShape[ByteString, ByteString]] = new ReadBlockStage(dataStore, blockingExecutionContext)
  in ~> flow
  SourceShape(flow.shape.out)
}
val sourceGraph: Source[ByteString, NotUsed] = Source.fromGraph(g1)
and my flow is defined like this:
class ReadBlockStage(dataStore: DataStore, implicit val exceutionContext: ExecutionContext)
    extends GraphStage[FlowShape[ByteString, ByteString]] with DefaultJsonProtocol {

  val in = Inlet[ByteString]("DigestSpec.in")
  val out = Outlet[ByteString]("BlockOut.out")
  override val shape = FlowShape.of(in, out)
  ...
}
Why am I getting this error? The flow's "out" port is of type Outlet[ByteString], and my Source is Source[ByteString, NotUsed]. The error message is very confusing because it looks like the shape and the expected shape are the same.

I figured out the issue: I had forgotten to call builder.add() for each of the graph elements.
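For reference, the corrected graph can look roughly like this (just a sketch; digest, dataStore and blockingExecutionContext are the same values as in the original snippet):

import akka.NotUsed
import akka.stream.SourceShape
import akka.stream.scaladsl.{GraphDSL, Source}
import akka.util.ByteString

val g1 = GraphDSL.create() { implicit builder =>
  import GraphDSL.Implicits._

  // builder.add() registers each stage with this builder and returns the shape to wire up
  val in = builder.add(Source.single(ByteString(digest)))
  val flow = builder.add(new ReadBlockStage(dataStore, blockingExecutionContext))

  in.out ~> flow.in
  SourceShape(flow.out)
}

val sourceGraph: Source[ByteString, NotUsed] = Source.fromGraph(g1)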

Akka Stream Materialized Value

I want to reference the materialized value from the flow. Below is the code snippet, but it's not compiling; the error is:
type mismatch;
found : (akka.NotUsed, scala.concurrent.Future[akka.Done])
required: (Playground.DomainObj, scala.concurrent.Future[akka.Done])
Code:
import akka.actor.ActorSystem
import akka.stream.scaladsl._
import scala.concurrent.Future
import akka.NotUsed
import akka.Done
implicit val actorSystem = ActorSystem("example")
case class DomainObj(name: String, age: Int)
val customFlow1: Flow[String, DomainObj, NotUsed] = Flow[String].map(s => {
  DomainObj(s, 50)
})
val customFlow2 = Flow[DomainObj].map(s => {
  s.age + 10
})
val printAnySink: Sink[Any, Future[Done]] = Sink.foreach(println)
val c1 = Source.single("John")
  .viaMat(customFlow1)(Keep.right)
  .viaMat(customFlow2)(Keep.left)
  .toMat(printAnySink)(Keep.both)
val res: (DomainObj, Future[Done]) = c1.run()
Find the code in playground: https://scastie.scala-lang.org/P9iSx49cQcaOZfKtVCzTPA
I want to reference the DomainObj after the stream completes.
The materialized value of a Flow[String, DomainObj, NotUsed] is NotUsed, not a DomainObj, therefore c1's materialized value is (NotUsed, Future[Done]).
It looks like the intent here is to capture the DomainObj which is created in customFlow1. That can be accomplished with
val customFlow1: Flow[String, DomainObj, Future[DomainObj]] =
  Flow[String]
    .map { s => DomainObj(s, 50) }
    .alsoToMat(Sink.head[DomainObj])(Keep.right)
val res: (Future[DomainObj], Future[Done]) = c1.run()
Note that Sink.head effectively requires that customFlow1 can only be used downstream of something that only emits once.
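Putting it together with the rest of the pipeline from the question (customFlow2 and printAnySink unchanged; the final println is only illustrative):

import scala.concurrent.Future
import akka.Done

val c1 = Source.single("John")
  .viaMat(customFlow1)(Keep.right)   // keep customFlow1's Future[DomainObj]
  .viaMat(customFlow2)(Keep.left)
  .toMat(printAnySink)(Keep.both)

val (domainObjF: Future[DomainObj], doneF: Future[Done]) = c1.run()

import actorSystem.dispatcher
domainObjF.foreach(obj => println(s"Materialized DomainObj: $obj"))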

How to access metrics of an Alpakka CommittableSource with backoff?

Accessing the metrics of an Alpakka PlainSource seems fairly straightforward, but how can I do the same thing with a CommittableSource?
I currently have a simple consumer, something like this:
class Consumer(implicit val ma: ActorMaterializer, implicit val ec: ExecutionContext) extends Actor {

  private val settings = ConsumerSettings(
      context.system,
      new ByteArrayDeserializer,
      new StringDeserializer)
    .withProperties(...)

  override def receive: Receive = Actor.emptyBehavior

  RestartSource
    .withBackoff(minBackoff = 2.seconds, maxBackoff = 20.seconds, randomFactor = 0.2)(consumer)
    .runForeach { handleMessage }

  private def consumer() = {
    AkkaConsumer
      .committableSource(settings, Subscriptions.topics(Set(topic)))
      .log(getClass.getSimpleName)
      .withAttributes(ActorAttributes.supervisionStrategy(_ => Supervision.Resume))
  }

  private def handleMessage(message: CommittableMessage[Array[Byte], String]): Unit = {
    ...
  }
}
How can I get access to the consumer metrics in this case?
We are using the Java prometheus client and I solved my issue with a custom collector that fetches its metrics directly from JMX:
import java.lang.management.ManagementFactory
import java.util
import io.prometheus.client.Collector
import io.prometheus.client.Collector.MetricFamilySamples
import io.prometheus.client.CounterMetricFamily
import io.prometheus.client.GaugeMetricFamily
import javax.management.ObjectName
import scala.collection.JavaConverters._
import scala.collection.mutable
class ConsumerMetricsCollector(val labels: Map[String, String] = Map.empty) extends Collector {

  val metrics: mutable.Map[String, MetricFamilySamples] = mutable.Map.empty

  def collect: util.List[MetricFamilySamples] = {
    val server = ManagementFactory.getPlatformMBeanServer
    for {
      attrType <- List("consumer-metrics", "consumer-coordinator-metrics", "consumer-fetch-manager-metrics")
      name <- server.queryNames(new ObjectName(s"kafka.consumer:type=$attrType,client-id=*"), null).asScala
      attrInfo <- server.getMBeanInfo(name).getAttributes.filter { _.getType == "double" }
    } yield {
      val attrName = attrInfo.getName
      val metricLabels = name.toString.split(",").map(_.split("=").toList).collect {
        case "client-id" :: (id: String) :: Nil => ("client-id", id)
      }.toList ++ labels
      val metricName = "kafka_consumer_" + attrName.replaceAll(raw"""[^\p{Alnum}]+""", "_")
      val labelKeys = metricLabels.map(_._1).asJava
      val metric = metrics.getOrElseUpdate(metricName,
        if (metricName.endsWith("_total") || metricName.endsWith("_sum")) {
          new CounterMetricFamily(metricName, attrInfo.getDescription, labelKeys)
        } else {
          new GaugeMetricFamily(metricName, attrInfo.getDescription, labelKeys)
        }: MetricFamilySamples
      )
      val metricValue = server.getAttribute(name, attrName).asInstanceOf[Double]
      val labelValues = metricLabels.map(_._2).asJava
      metric match {
        case f: CounterMetricFamily => f.addMetric(labelValues, metricValue)
        case f: GaugeMetricFamily => f.addMetric(labelValues, metricValue)
        case _ =>
      }
    }
    metrics.values.toList.asJava
  }
}
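One thing not shown above: the collector still has to be registered and exposed. A minimal sketch, assuming the default registry and the simpleclient_httpserver module (the service label and the port are placeholders):

import io.prometheus.client.CollectorRegistry
import io.prometheus.client.exporter.HTTPServer

// Register the JMX-backed collector with the default registry so its samples
// are included in every scrape.
val kafkaCollector = new ConsumerMetricsCollector(Map("service" -> "my-consumer"))
CollectorRegistry.defaultRegistry.register(kafkaCollector)

// Expose the default registry over HTTP for Prometheus to scrape
// (requires the simpleclient_httpserver module; the port is only an example).
val metricsServer = new HTTPServer(9095)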

Can I apply Moshi @JsonQualifier to a type parameter?

I need to define the JsonAdapter for BigDecimal by the JsonQualifier annotation and use it on the items in a list.
@JsonQualifier
@Target(AnnotationTarget.TYPE)
@Retention(AnnotationRetention.RUNTIME)
annotation class JsonCoordinates

@JsonClass(generateAdapter = true)
data class LocationData(
    val coordinates: List<@JvmSuppressWildcards @JsonCoordinates BigDecimal>
)
class LocationDataAdapterTest {

    @Test
    fun toJsonWithQualifier() {
        val moshi: Moshi
        val adapter: JsonAdapter<LocationData>
        moshi = Moshi.Builder()
            .add(BigDecimal::class.java, JsonCoordinates::class.java, DecimalAdapter())
            .build()
        adapter = moshi.adapter(LocationData::class.java)
        val data = LocationData(listOf(BigDecimal(10), BigDecimal(20)))
        assertEquals("{\"coordinates\":[\"10\",\"20\"]}", adapter.toJson(data))
    }

    @Test
    fun toJsonWithoutQualifier() {
        val moshi: Moshi
        val adapter: JsonAdapter<LocationData>
        moshi = Moshi.Builder()
            .add(BigDecimal::class.java, DecimalAdapter())
            .build()
        adapter = moshi.adapter(LocationData::class.java)
        val data = LocationData(listOf(BigDecimal(10), BigDecimal(20)))
        assertEquals("{\"coordinates\":[\"10\",\"20\"]}", adapter.toJson(data))
    }
}
The test toJsonWithoutQualifier succeeds.
The test toJsonWithQualifier fails with the following exception:
java.lang.IllegalArgumentException: Platform class java.math.BigDecimal (with no annotations) requires explicit JsonAdapter to be registered
for class java.math.BigDecimal
for java.util.List<java.math.BigDecimal> coordinates
for class app.klosed.api.model.LocationData
Here is the complete code https://gist.github.com/dscoppelletti/a5937124e1b33e5e33771b7845b2ed9a

Unmarshal nested JSON in spray-json

Using spray-json (as I'm using spray-client), in order to get a latitude/longitude object from the Google Maps API I need to have the whole response structure set up:
case class AddrComponent(long_name: String, short_name: String, types: List[String])
case class Location(lat: Double, lng: Double)
case class ViewPort(northeast: Location, southwest: Location)
case class Geometry(location: Location, location_type: String, viewport: ViewPort)
case class EachResult(address_components: List[AddrComponent],
                      formatted_address: String,
                      geometry: Geometry,
                      types: List[String])
case class GoogleApiResult[T](status: String, results: List[T])

object AddressProtocol extends DefaultJsonProtocol {
  implicit val addrFormat = jsonFormat2(Location)
  implicit val locFormat = jsonFormat2(Location)
  implicit val viewPortFormat = jsonFormat2(ViewPort)
  implicit val geomFormat = jsonFormat3(Geometry)
  implicit val eachResFormat = jsonFormat4(EachResult)
  implicit def GoogleApiFormat[T: JsonFormat] = jsonFormat2(GoogleApiResult.apply[T])
}
import AddressProtocol._
Is there any way I can just get Location from the json in the response and avoid all this gumph?
The spray-client code:
implicit val system = ActorSystem("test-system")
import system.dispatcher
private val pipeline = sendReceive ~> unmarshal[GoogleApiResult[EachResult]]
def getPostcode(postcode: String): Point = {
  val url = s"http://maps.googleapis.com/maps/api/geocode/json?address=$postcode,+UK&sensor=true"
  val future = pipeline(Get(url))
  val result = Await.result(future, 10 seconds)
  result.results.size match {
    case 0 => throw new PostcodeNotFoundException(postcode)
    case x if x > 1 => throw new MultipleResultsException(postcode)
    case _ => {
      val location = result.results(0).geometry.location
      new Point(location.lng, location.lat)
    }
  }
}
Or alternatively how can I use jackson with spray-client?
Following jrudolph's advice to use json-lenses, I also did quite a bit of fiddling but finally got things to work. I found it quite difficult (as a newbie), and I'm sure this solution is far from the most elegant; nevertheless, I think it might help people or inspire improvements.
Given JSON:
{
  "status": 200,
  "code": 0,
  "message": "",
  "payload": {
    "statuses": {
      "emailConfirmation": "PENDING",
      "phoneConfirmation": "DONE"
    }
  }
}
And a case class for unmarshalling only the statuses:
case class UserStatus(emailConfirmation: String, phoneConfirmation: String)
One can do this to unmarshal the response:
import scala.concurrent.Future
import spray.http.HttpResponse
import spray.httpx.unmarshalling.{FromResponseUnmarshaller, MalformedContent}
import spray.json.DefaultJsonProtocol
import spray.json.lenses.JsonLenses._
import spray.client.pipelining._
object UserStatusJsonProtocol extends DefaultJsonProtocol {
  implicit val userStatusUnmarshaller = new FromResponseUnmarshaller[UserStatus] {
    implicit val userStatusJsonFormat = jsonFormat2(UserStatus)

    def apply(response: HttpResponse) = try {
      Right(response.entity.asString.extract[UserStatus]('payload / 'statuses))
    } catch {
      case x: Throwable =>
        Left(MalformedContent("Could not unmarshal user status.", x))
    }
  }
}
import UserStatusJsonProtocol._
def userStatus(userId: String): Future[UserStatus] = {
  val pipeline = sendReceive ~> unmarshal[UserStatus]
  pipeline(Get(s"/api/user/${userId}/status"))
}

How to mock spray-client response

I have a simple spray client:
val pipeline = sendReceive ~> unmarshal[GoogleApiResult[Elevation]]
val responseFuture = pipeline { Get("http://maps.googleapis.com/maps/api/elevation/json?locations=27.988056,86.925278&sensor=false") }
responseFuture onComplete {
  case Success(GoogleApiResult(_, Elevation(_, elevation) :: _)) =>
    log.info("The elevation of Mt. Everest is: {} m", elevation)
    shutdown()
  case Failure(error) =>
    log.error(error, "Couldn't get elevation")
    shutdown()
}
Full code can be found here.
I want to mock the response of the server to test the logic in the Success and Failure cases. The only relevant information I found was here, but I haven't been able to use the cake pattern to mock the sendReceive method.
Any suggestion or example would be greatly appreciated.
Here's an example of one way to mock it using specs2 for the test spec and mockito for the mocking. First, the Main object refactored into a class setup for mocking:
class ElevationClient {
  // we need an ActorSystem to host our application in
  implicit val system = ActorSystem("simple-spray-client")
  import system.dispatcher // execution context for futures below

  val log = Logging(system, getClass)
  log.info("Requesting the elevation of Mt. Everest from Googles Elevation API...")

  import ElevationJsonProtocol._
  import SprayJsonSupport._

  def sendAndReceive = sendReceive

  def elavation = {
    val pipeline = sendAndReceive ~> unmarshal[GoogleApiResult[Elevation]]
    pipeline {
      Get("http://maps.googleapis.com/maps/api/elevation/json?locations=27.988056,86.925278&sensor=false")
    }
  }

  def shutdown(): Unit = {
    IO(Http).ask(Http.CloseAll)(1.second).await
    system.shutdown()
  }
}
Then, the test spec:
class ElevationClientSpec extends Specification with Mockito {

  val mockResponse = mock[HttpResponse]
  val mockStatus = mock[StatusCode]
  mockResponse.status returns mockStatus
  mockStatus.isSuccess returns true

  val json = """
  {
    "results" : [
      {
        "elevation" : 8815.71582031250,
        "location" : {
          "lat" : 27.9880560,
          "lng" : 86.92527800000001
        },
        "resolution" : 152.7032318115234
      }
    ],
    "status" : "OK"
  }
  """
  val body = HttpEntity(ContentTypes.`application/json`, json.getBytes())
  mockResponse.entity returns body
  val client = new ElevationClient {
    override def sendAndReceive = {
      (req: HttpRequest) => Promise.successful(mockResponse).future
    }
  }

  "A request to get an elevation" should {
    "return an elevation result" in {
      val fut = client.elavation
      val el = Await.result(fut, Duration(2, TimeUnit.SECONDS))
      val expected = GoogleApiResult("OK", List(Elevation(Location(27.988056, 86.925278), 8815.7158203125)))
      el mustEqual expected
    }
  }
}
So my approach here was to first define an overridable function in the ElevationClient called sendAndReceive that just delegates to the spray sendReceive function. Then, in the test spec, I override that sendAndReceive function to return a function that returns a completed Future wrapping a mock HttpResponse. This is one approach for doing what you want to do. I hope this helps.
There's no need to introduce mocking in this case, as you can build an HttpResponse directly using the existing API:
val mockResponse = HttpResponse(StatusCodes.OK, HttpEntity(ContentTypes.`application/json`, json.getBytes))
(Sorry for posting this as another answer, but don't have enough karma to comment)
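For example, the override in the test spec above could hand back that real response instead of the Mockito mocks (same json value as in the spec; just a sketch):

val stubbedResponse = HttpResponse(
  StatusCodes.OK,
  HttpEntity(ContentTypes.`application/json`, json.getBytes)
)

val client = new ElevationClient {
  // Return the canned response instead of performing a real HTTP call.
  override def sendAndReceive =
    (req: HttpRequest) => Promise.successful(stubbedResponse).future
}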