I soon figured out that the Scala version in my Spark sbt project was 2.12.17, i.e. Scala 2. Looking at the Spark compatibility matrix, I saw that Spark doesn't support Scala 3 yet!
This is a known issue, but someone had already found a workaround: applying `.map(_.cross(CrossVersion.for3Use2_13))` to the Spark dependencies in the build.sbt file, so that the Scala 2.13 artifacts are used from Scala 3.
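A minimal sketch of what that workaround looks like in build.sbt. The Scala and Spark versions and the module list here are assumptions for illustration, not the original file:

```scala
// build.sbt — sketch only; versions and modules are assumptions
scalaVersion := "3.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.5.0",
  "org.apache.spark" %% "spark-sql"  % "3.5.0"
).map(_.cross(CrossVersion.for3Use2_13))
```

With `CrossVersion.for3Use2_13`, sbt resolves the `_2.13` artifacts instead of the (nonexistent) `_3` ones, relying on the binary compatibility between Scala 2.13 and Scala 3.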
This could be solved by adding the following dependency in my build.sbt file:
I updated my datasets.scala code as follows:
```scala
case class Usage(uid: Int, user: String, usage: Int)
case class UsageCost(uid: Int, user: String, usage: Int, cost: Double)

// Correct way to create TypeTags in Scala 3
import scala.reflect.runtime.universe._

implicit val usageTypeTag: TypeTag[Usage] = typeTag[Usage]
implicit val usageCostTypeTag: TypeTag[UsageCost] = typeTag[UsageCost]

// Create explicit encoders (these should work now with TypeTags)
import org.apache.spark.sql.{Encoder, Encoders}

implicit val usageEncoder: Encoder[Usage] = Encoders.product[Usage]
implicit val usageCostEncoder: Encoder[UsageCost] = Encoders.product[UsageCost]
```
The code now compiles, but I'm encountering an error: