10. Custom Mappings

In this chapter we learn how to use custom Meta instances to map arbitrary data types as single-column values; and how to use custom Composite instances to map arbitrary types across multiple columns.

Setting Up

The examples in this chapter require the contrib-postgresql add-on, as well as the argonaut JSON library, which you can add to your build thus:

libraryDependencies += "io.argonaut" %% "argonaut" % "6.1-M4" // as of date of publication

In our REPL we have the same setup as before, plus a few extra imports.

import doobie.imports._, scalaz._, Scalaz._, scalaz.concurrent.Task, java.awt.Point

val xa = DriverManagerTransactor[Task](
  "org.postgresql.Driver", "jdbc:postgresql:world", "postgres", ""
)

import xa.yolo._

import argonaut._, Argonaut._

import scala.reflect.runtime.universe.TypeTag

import org.postgresql.util.PGobject

Meta, Atom, and Composite

The doobie.free API provides constructors for JDBC actions like setString(1, "foo") and getBoolean(4), which operate on single columns specified by name or offset. Query parameters are set and resulting rows are read by repeated applications of these low-level actions.

The doobie.hi API abstracts the construction of these composite operations via the Composite typeclass, which provides actions to get or set a heterogeneous sequence of column values. For example, the following programs are equivalent:

// Using doobie.free
FPS.setString(1, "foo") >> FPS.setInt(2, 42)

// Using doobie.hi
HPS.set(1, ("foo", 42))

// Or leave the 1 out if you like, since we usually start there
HPS.set(("foo", 42))

// Which simply delegates to the Composite instance
Composite[(String,Int)].set(1, ("foo", 42))

doobie can derive Composite instances for primitive column types, as well as for tuples, HLists, and case classes whose elements have Composite instances. These primitive column types are identified by Atom instances, which describe null-safe column mappings. Atom instances are almost always derived from lower-level null-unsafe mappings specified by the Meta typeclass.

So our strategy for mapping custom types is to construct a new Meta instance (given Meta[A] you get Atom[A] and Atom[Option[A]] for free), and our strategy for multi-column mappings is to construct a new Composite instance. We consider both cases below.
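To make the invariant-map strategy concrete before we apply it, here is a toy, hypothetical single-column codec with an xmap operation. This is a simplification for illustration only: the real Meta carries JDBC get/set actions rather than string functions, but the shape of the derivation is the same.

```scala
// Toy stand-in for Meta: a single-column mapping via strings.
// Hypothetical simplification; real Meta wraps JDBC get/set actions.
trait ColumnCodec[A] { outer =>
  def decode(s: String): A
  def encode(a: A): String
  // Build a codec for B from an existing codec for A, given an
  // isomorphism A <=> B. This is the essence of (n)xmap.
  def xmap[B](f: A => B, g: B => A): ColumnCodec[B] =
    new ColumnCodec[B] {
      def decode(s: String): B = f(outer.decode(s))
      def encode(b: B): String = outer.encode(g(b))
    }
}

val stringCodec: ColumnCodec[String] = new ColumnCodec[String] {
  def decode(s: String): String = s
  def encode(a: String): String = a
}

// A derived codec, mirroring how we derive new Meta instances below.
val intCodec: ColumnCodec[Int] = stringCodec.xmap(_.toInt, _.toString)
```

The only extra wrinkle in doobie's nxmap is null-safety: it guarantees the provided functions never observe a null.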

Meta by Invariant Map

Let’s say we have a structured value that’s represented by a single string in a legacy database. We also have conversion methods to and from the legacy format.

case class PersonId(department: String, number: Int) {
  def toLegacy = department + ":" + number
}

object PersonId {
  def fromLegacy(s: String): Option[PersonId] =
    s.split(":") match {
      case Array(dept, num) => num.parseInt.toOption.map(new PersonId(dept, _))
      case _                => None
    }
  def unsafeFromLegacy(s: String): PersonId =
    fromLegacy(s).getOrElse(throw new RuntimeException("Invalid format: " + s))
}

val pid = PersonId.unsafeFromLegacy("sales:42")
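For the mapping we construct below to be lawful, toLegacy and fromLegacy must be mutual inverses on valid values. Here is a standalone round-trip check, restated with the standard library's Try in place of scalaz's parseInt (and a hypothetical LegacyId name to avoid clashing with the definition above):

```scala
import scala.util.Try

// Standalone restatement of the legacy codec for a round-trip check.
case class LegacyId(department: String, number: Int) {
  def toLegacy: String = department + ":" + number
}

def fromLegacy(s: String): Option[LegacyId] =
  s.split(":") match {
    case Array(dept, num) => Try(num.toInt).toOption.map(LegacyId(dept, _))
    case _                => None
  }

val id        = LegacyId("sales", 42)
val roundTrip = fromLegacy(id.toLegacy) // should be Some(id)
val bad       = fromLegacy("garbage")   // malformed input yields None
```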

Because PersonId is a case class of primitive column values, we can already map it across two columns. We can look at its Composite instance and see that its column span is two:

scala> Composite[PersonId].length
res8: Int = 2

However if we try to use this type for a single column value (i.e., as a query parameter, which requires an Atom instance), it doesn’t compile.

scala> sql"select * from person where id = $pid"
<console>:34: error: Could not find or construct Param[shapeless.::[PersonId,shapeless.HNil]].
Ensure that this type is an atomic type with an Atom instance in scope, or is an
HList whose members have Atom instances in scope.
       sql"select * from person where id = $pid"
                                          ^

According to the error message we need a Meta[PersonId] instance. So how do we get one? The simplest way is by basing it on an existing instance, using nxmap, which is like the invariant functor xmap but ensures that null values are never observed. So we simply provide String => PersonId and vice-versa and we’re good to go.

implicit val PersonIdMeta: Meta[PersonId] = 
  Meta[String].nxmap(PersonId.unsafeFromLegacy, _.toLegacy)

Now it compiles as a column value and as a Composite that maps to a single column:

scala> sql"select * from person where id = $pid"
res10: doobie.syntax.string.SqlInterpolator#Builder[shapeless.::[PersonId,shapeless.HNil]] = doobie.syntax.string$SqlInterpolator$Builder@4916a225

scala> Composite[PersonId].length
res11: Int = 1

scala> sql"select 'podiatry:123'".query[PersonId].quick.run
  PersonId(podiatry,123)

Note that the Composite width is now a single column. The rule is: if there exists an instance Meta[A] in scope, it will take precedence over any automatic derivation of Composite[A].

Meta by Construction

Some modern databases support a json column type that can store structured data as a JSON document, along with various SQL extensions to allow querying and selecting arbitrary sub-structures. So an obvious thing we might want to do is provide a mapping from Scala model objects to JSON columns, via some kind of JSON serialization library.

We can construct a Meta instance for the argonaut Json type by using the Meta.other constructor, which constructs a direct object mapping via JDBC's .getObject and .setObject. In the case of PostgreSQL the JSON values are marshalled via the PGobject type, which encapsulates an uninspiring (String, String) pair representing the schema type and its string value.

Here we go:

implicit val JsonMeta: Meta[Json] = 
  Meta.other[PGobject]("json").nxmap[Json](
    a => Parse.parse(a.getValue).leftMap[Json](sys.error).merge, // failure raises an exception
    a => new PGobject <| (_.setType("json")) <| (_.setValue(a.nospaces))
  )
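The `<|` used above is scalaz's kestrel combinator: it applies a side-effecting function and returns the original value, which is handy for initializing mutable JDBC objects. The same pattern exists in the standard library (Scala 2.13+) as tap. A sketch using a hypothetical stand-in for PGobject:

```scala
import scala.util.chaining._

// Hypothetical stand-in for PGobject, which is likewise a mutable type
// configured by setter calls.
class PGobjectStub {
  var pgType: String  = ""
  var pgValue: String = ""
}

// new PGobject <| (_.setType("json")) <| (_.setValue(json)) becomes:
val obj = new PGobjectStub()
  .tap(_.pgType = "json")
  .tap(_.pgValue = """{"name":"Steve"}""")
```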

Given this mapping to and from Json we can construct a further mapping to any type that has a CodecJson instance. Using nxmap constrains us to reference types and requires a TypeTag for diagnostics, so the full type constraint is A >: Null : CodecJson : TypeTag. On failure we throw an exception; this indicates a logic or schema problem.

def codecMeta[A >: Null : CodecJson: TypeTag]: Meta[A] =
  Meta[Json].nxmap[A](
    _.as[A].result.fold(p => sys.error(p._1), identity), 
    _.asJson
  )
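The decode side above folds a failure into an exception. The same shape with the standard library's Either, used here as a hypothetical stand-in for argonaut's DecodeResult (whose failure carries a message plus a cursor history):

```scala
// Left carries (message, history); Right carries the decoded value.
def unsafeDecode[A](result: Either[(String, List[String]), A]): A =
  result.fold(p => sys.error(p._1), identity)

val ok = unsafeDecode(Right(42)) // succeeds; a Left would throw
```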

Let’s make sure it works. Here is a simple data type with an argonaut serializer, taken straight from the website, and a Meta instance derived from the code above.

case class Person(name: String, age: Int, things: List[String])

implicit def PersonCodecJson =
  casecodec3(Person.apply, Person.unapply)("name", "age", "things")

implicit val PersonMeta = codecMeta[Person]

Now let’s create a table that has a json column to store a Person.

val drop = sql"DROP TABLE IF EXISTS pet".update.run

val create = 
  sql"""
    CREATE TABLE pet (
      id    SERIAL,
      name  VARCHAR NOT NULL UNIQUE,
      owner JSON    NOT NULL
    )
  """.update.run

(drop *> create).quick.run

Note that our check output now knows about the Json and Person mappings. This is a side-effect of constructing the instances above, which isn't a good design. We will revisit this for 0.3.0; this information is only used for diagnostics, so it's not critical.

scala> sql"select owner from pet".query[Int].check.run

  select owner from pet

  ✓ SQL Compiles and Typechecks
  ✕ C01 owner OTHER (json) NOT NULL  →  Int
    - OTHER (json) is not coercible to Int according to the JDBC specification or any
      defined mapping. Fix this by changing the schema type to INTEGER, or the Scala
      type to Person or Json or PGobject.

And we can now use Person as a parameter type and as a column type.

scala> val p = Person("Steve", 10, List("Train", "Ball"))
p: Person = Person(Steve,10,List(Train, Ball))

scala> (sql"insert into pet (name, owner) values ('Bob', $p)"
     |   .update.withUniqueGeneratedKeys[(Int, String, Person)]("id", "name", "owner")).quick.run
  (1,Bob,Person(Steve,10,List(Train, Ball)))

If we ask for the owner column as a string value we can see that it is in fact storing JSON data.

scala> sql"select name, owner from pet".query[(String,String)].quick.run
  (Bob,{"name":"Steve","age":10,"things":["Train","Ball"]})

Composite by Invariant Map

We get Composite[A] for free given Atom[A], or for tuples, HLists, and case classes whose fields have Composite instances. This covers a lot of cases, but we still need a way to map other types. For example, what if we wanted to map a java.awt.Point across two columns? Because it’s not a tuple or case class we can’t do it for free, but we can get there via xmap. Here we map Point to a pair of Int columns.

implicit val Point2DComposite: Composite[Point] = 
  Composite[(Int, Int)].xmap(
    (t: (Int,Int)) => new Point(t._1, t._2),
    (p: Point) => (p.x, p.y)
  )

And it works!

scala> sql"select 'foo', 12, 42, true".query[(String, Point, Boolean)].unique.quick.run
  (foo,java.awt.Point[x=12,y=42],true)
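As with nxmap, the two functions given to xmap should be mutual inverses. Since java.awt.Point lives in the JDK we can verify the round trip directly, outside of doobie:

```scala
import java.awt.Point

// The two halves of the isomorphism used in Point2DComposite above.
val toPoint: ((Int, Int)) => Point = { case (x, y) => new Point(x, y) }
val fromPoint: Point => (Int, Int) = p => (p.x, p.y)

val pt       = new Point(12, 42)
val viaTuple = toPoint(fromPoint(pt))       // should equal pt
val viaPair  = fromPoint(toPoint((12, 42))) // should be (12, 42)
```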