Using Scala Primitives inside Java 8 Lambda is causing Serialization exception in Spark


I am working on a codebase using Java 8 and Spark 1.5. I need to interface with a Scala library that returns Function[A, B], so I wrote a little implicit that converts a Scala function A => B into a java.util.function.Function[A, B], as follows:

import java.util.function.Function

implicit def toJavaFunction[A, B](fn: A => B): Function[A, B] = new Function[A, B] {
  override def apply(t: A): B = fn(t)
}
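For concreteness, the converted function ends up stored in a field of a class that Spark serializes and ships to executors. Aggregator below is a hypothetical stand-in for our real class, just to show the shape:

class Aggregator[A, B](val fn: Function[A, B]) extends Serializable {
  // Spark serializes this whole instance, including the captured fn field
  def apply(a: A): B = fn.apply(a)
}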

When I pass such a function to Spark and B is an AnyVal such as Long or Int, it results in a serialization exception, whereas when I manually box to java.lang.Integer etc. it works fine.

So, for example, the code below results in an exception:

case class User(name: String, age: Int)
val fn = (e: User) => e.age

// implicitly converted to Java and passed to Spark
// Exception:
Caused by: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field com.a.b.v.aggregators.Aggregator$2.val$fn of type java.util.function.Function in instance of com.a.b.v.aggregators.Aggregator$2
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)

However, if I do the following, it works fine:

val fn = (e: User) => e.age: java.lang.Integer
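My reading (unverified) is that the type ascription only changes the inferred result type of the lambda, which in turn changes the B that the implicit conversion sees:

val unboxed: User => Int = (e: User) => e.age
val boxed: User => java.lang.Integer = (e: User) => e.age: java.lang.Integer

// via toJavaFunction these become, respectively:
//   Function[User, Int]               -- fails with the exception above
//   Function[User, java.lang.Integer] -- works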

Is there a way I can fix this or force autoboxing, so that I don't have to write e.age: java.lang.Integer every time I declare such a function?
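One direction I have considered is a more specific implicit per primitive result type that boxes at the conversion boundary. This is an untested sketch (toJavaIntFunction is a name I made up), and I am not sure whether implicit resolution will reliably prefer it over the generic one:

implicit def toJavaIntFunction[A](fn: A => Int): Function[A, java.lang.Integer] =
  new Function[A, java.lang.Integer] with Serializable {
    // Int.box forces the boxing once, at the wrapper boundary
    override def apply(t: A): java.lang.Integer = Int.box(fn(t))
  }

Even if it works, it would mean writing one such implicit per primitive, which I was hoping to avoid.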