Wednesday, May 27, 2015

Porting a Scala Play 2.3 application with Slick 2.1.0 to Play 2.4 and Slick 3.0.0

We recently ported a smallish Scala web application using Play 2.3 and Slick 2.1.0 to Play 2.4.0 and Slick 3.0.0 and would like to share our experiences. The Play 2.4 migration guide covers many issues, but it still took us some time to figure everything out.

Bumping all versions

A.k.a.: the easy part.

First, we edit build.sbt:

scalaVersion := "2.11.6"  
libraryDependencies ++= Seq(  
  ...
  "com.typesafe.slick" %% "slick" % "3.0.0",
  "com.github.tminglei" %% "slick-pg" % "0.9.0", /* enum support, you might not need that */
  "com.typesafe.play" %% "play-slick" % "1.0.0",
  "com.typesafe.play" %% "play-slick-evolutions" % "1.0.0",
  "org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
  "org.slf4j" % "slf4j-nop" % "1.7.12"
)

Also, remove jdbc and anorm from your libraryDependencies.

Then project/plugins.sbt:

addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.0")

Then project/build.properties:

sbt.version=0.13.8  

Finally, do an sbt update clean compile and watch your carefully crafted codebase blow up in your face in a jumble of compile errors you wish you could unsee!

Play changes

Disclaimer: this is not an official instruction manual on porting Play apps, I am just sharing our own experiences :>

Missing implicit Messages

[error] ... could not find implicit value for parameter messages: play.api.i18n.Messages
[error] Messages("registration.email.registration.subject", queueInfo.event.eventName),
[error] ^

This error hit us quite hard, because it means that anything that uses Messages() has to have access to an implicit value of type Messages.

Fixing it meant that we had to

  • add (implicit messages: Messages) to every template that used Messages(), which meant that we had to
  • have every controller which made use of Messages or used views that made use of Messages implement the I18nSupport trait, which meant we had to
  • change all controllers from object to class and add the @Inject() annotation, which meant we had to
  • change the routesGenerator to InjectedRoutesGenerator in build.sbt

At that moment we started feeling like Jack :D

So then, in build.sbt we used:

routesGenerator := InjectedRoutesGenerator

When we finally realized that our routing wasn't actually broken at all, but that we were instead hitting an IntelliJ bug that caused parsing/highlighting in the routes file to fail, happiness returned to our faces and we moved on to Akka.

We also used Messages in Akka actors for sending emails, so we had to get those pesky implicits there too.

So our case classes used for messaging changed from

case class RegistrationMessage(queueInfo: QueueInfo)  

to

sealed trait RegMessage {  
  val messages: Messages
}

case class RegistrationMessage(queueInfo: QueueInfo)(implicit val messages: Messages) extends RegMessage 

and our Actors themselves changed from something like

  override def receive = {
    case RegistrationMessage(queueInfo) =>

to

  override def receive = {
    case message: RegMessage => message match {
      case RegistrationMessage(queueInfo) =>
        implicit val messages = message.messages
        ...
The world of Actors, Controllers and Templates made sense again, so we could move on.

Logging

Log configuration in application.conf is deprecated, so just create a new file conf/logback.xml with the following content:

<?xml version="1.0" encoding="UTF-8"?>  
<configuration>

    <conversionRule conversionWord="coloredLevel" converterClass="play.api.Logger$ColoredLevel"/>

    <appender name="FILE" class="ch.qos.logback.core.FileAppender">
        <file>${application.home}/logs/application.log</file>
        <encoder>
            <pattern>%date [%level] from %logger in %thread - %message%n%xException</pattern>
        </encoder>
    </appender>

    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%coloredLevel %logger{15} - %message%n%xException{10}</pattern>
        </encoder>
    </appender>

    <appender name="ASYNCFILE" class="ch.qos.logback.classic.AsyncAppender">
        <appender-ref ref="FILE"/>
    </appender>

    <appender name="ASYNCSTDOUT" class="ch.qos.logback.classic.AsyncAppender">
        <appender-ref ref="STDOUT"/>
    </appender>

    <logger name="play" level="INFO"/>
    <logger name="application" level="DEBUG"/>

    <!-- Turn these off as they are annoying, and anyway we manage configuration ourselves -->
    <logger name="com.avaje.ebean.config.PropertyMapLoader" level="OFF"/>
    <logger name="com.avaje.ebeaninternal.server.core.XmlConfigLoader" level="OFF"/>
    <logger name="com.avaje.ebeaninternal.server.lib.BackgroundThread" level="OFF"/>
    <logger name="com.gargoylesoftware.htmlunit.javascript" level="OFF"/>

    <logger name="slick.jdbc.JdbcBackend.statement" level="DEBUG"/>

    <root level="WARN">
        <appender-ref ref="ASYNCFILE"/>
        <appender-ref ref="ASYNCSTDOUT"/>
    </root>

</configuration>

Slick Changes

Firstly, the configuration format in application.conf changed:

This is actually not so much a change in Slick itself, but since the Slick documentation advises using the new Typesafe Config, I think it can be mentioned here. We somehow couldn't get Play evolutions to work with the Typesafe Config way of configuring the db, so we used the standard Slick way, which worked perfectly fine.

before:

db.default.driver = org.postgresql.Driver  
db.default.url = "jdbc:postgresql://localhost/ea"  
db.default.user = "ea"  
db.default.password = "secret" 

after:

# Database configuration
# ~~~~~
slick.dbs.eaDB.driver="slick.driver.PostgresDriver$" # You must provide the required Slick driver!  
slick.dbs.eaDB.db.driver=org.postgresql.Driver  
slick.dbs.eaDB.db.url="jdbc:postgresql://localhost:5432/ea"  
slick.dbs.eaDB.db.user=ea  
slick.dbs.eaDB.db.password="secret"  
slick.dbs.eaDB.db.numThreads = 10  
slick.dbs.eaDB.db.connectionTimeout = 5000  
slick.dbs.eaDB.db.validationTimeout = 5000

#play.evolutions.db.eaDB.autoApply=true
play.evolutions.db.eaDB.enabled=true ## probably not necessary but we like being explicit  
play.evolutions.db.eaDB.autoCommit=false  

Secondly, in Slick 2.1.0 you would usually define database-related methods like so:

  def findById(id: Int)(implicit s: Session): Option[EventType] =
    filter(_.eventTypeId === id).firstOption

or

  def findById(id: Int): Option[EventType] = {
    DB.withSession { implicit s: Session =>
      filter(_.eventTypeId === id).firstOption
    }
  }

Slick 3.0.0 comes with a new, composable and entirely asynchronous API that returns Futures for everything. I love it! It lets you do things like this:

val deleteAction = Tiles.delete  
val loadAction = Tiles ++= extractTilesFromDump(new FileInputStream(dumpFile))

val futureResult = db.run(deleteAction.zip(loadAction).transactionally)  
futureResult.onSuccess { case a => println(s"Successfully deleted ${a._1} and imported ${a._2.get} rows") }  
futureResult.onFailure { case a => println(s"Failed to import: $a") }

We didn't want to change all our controllers to accommodate this change right away though, so as a first step we modified our database classes to keep the same method signatures by hiding the asynchronous nature of the new API:

EaDB.scala:

import play.api.Play
import play.api.db.slick.DatabaseConfigProvider
import slick.dbio.{DBIOAction, NoStream}
import slick.driver.JdbcProfile
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

object EaDB {
  private val eadb: String = "eaDB"
  private val dbConfig = DatabaseConfigProvider.get[JdbcProfile](eadb)(Play.current)

  def result[R](a: DBIOAction[R, NoStream, Nothing]): R = Await.result(dbConfig.db.run(a), 1.second)

  def async[R](a: DBIOAction[R, NoStream, Nothing]): Future[R] = dbConfig.db.run(a)
}

Note that we saw connection leaks when using the old Database.forConfig method of acquiring a connection.
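Stripped of Slick and Play, the blocking/async pair boils down to something like this sketch (MiniDB, run and the one-second timeout are made-up stand-ins, not the real EaDB):

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

// Minimal analogue of EaDB: the same unit of work can be exposed either
// blocking (result, via Await) or asynchronously (async).
object MiniDB {
  private implicit val ec: ExecutionContext = ExecutionContext.global
  private def run[R](work: => R): Future[R] = Future(work)

  // blocks the calling thread for up to one second, like EaDB.result
  def result[R](work: => R): R = Await.result(run(work), 1.second)

  // hands the Future straight back, like EaDB.async
  def async[R](work: => R): Future[R] = run(work)
}
```

Callers keep their old synchronous signatures by using result, and can opt into Futures later by switching to async.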

EventType.scala:

  def findById(id: Int): Option[EventType] =
    EaDB.result(filter(_.eventTypeId === id).result.headOption)

Note that firstOption was changed to headOption; the same goes for first.

Thirdly, some of the old, lower-level APIs have been deprecated:

[warn] ... method list in trait Invoker is deprecated: Invoker convenience features will be removed. Invoker is intended for low-level JDBC use only.
[warn]     for (row <- q(event.eventId).list if currentPosition == -1) {
[warn]                                  ^
[warn] two warnings found

So this

import scala.slick.jdbc.{GetResult, StaticQuery => Q}  
...
implicit val resultMapping = GetResult[(Int, Participant)](r =>  
  (r.<<, Participant(r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<)))
val q = Q[Int, (Int, Participant)] + "select row_number() over() rn, a.* from (select * from participant where event_id = ? order by ts asc) a"  
...
for (row <- q(event.eventId).list if currentPosition == -1) {  
...

became

implicit val resultMapping = GetResult[(Int, Participant)](r =>  
  (r.<<, Participant(r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<, r.<<)))
val queryAction =
  sql"""select row_number() over() rn, a.* from
        (select * from participant where
         event_id = ${event.eventId} order by ts asc) a""".as[(Int, Participant)]

val result = EaDB.result(queryAction)  
...
for (row <- result if currentPosition == -1) {  
...

Note the neat sql interpolator that does parameter binding for you, all without question marks. (uuuuh!)
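To illustrate the idea (this is not Slick's actual implementation), here is a toy string interpolator with made-up names that shows how the literal parts can be kept as SQL text with placeholders while the interpolated values are collected as bind parameters:

```scala
object ToySqlDemo {
  // the StringContext extension gives us access to the literal parts of the
  // interpolated string and to the interpolated values separately
  implicit class ToySql(val sc: StringContext) {
    def tsql(args: Any*): (String, Seq[Any]) =
      (sc.parts.mkString("?"), args.toList)
  }

  def main(args: Array[String]): Unit = {
    val eventId = 42
    val (query, params) = tsql"select * from participant where event_id = $eventId"
    println(query)  // prints: select * from participant where event_id = ?
    println(params) // prints: List(42)
  }
}
```

The value never ends up in the SQL string itself, which is exactly why interpolated queries stay safe from SQL injection.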

Finally, we do want to use Slick 3's awesome powers of asynchronicity in some places.

To that end we change our database code from

object Tiles extends TableQuery(new Tiles(_)) {  
  def list(): Seq[Tile] = {
    EaDB.result(sortBy(_.sortOrder).result) // remember? we used Await.result in there, so this blocks!
  }
}

to

object Tiles extends TableQuery(new Tiles(_)) {  
  def list(): Future[Seq[Tile]] = {
    EaDB.async(sortBy(_.sortOrder).result) // here we just call db.run
  }
}

and our controller from

class TilesResource @Inject()(val messagesApi: MessagesApi) extends Controller with I18nSupport {  
  def list() = Action { implicit rs =>
    ...
    Ok(Json.toJson(Tiles.list()))
  }
}

to

class TilesResource @Inject()(val messagesApi: MessagesApi) extends Controller with I18nSupport {  
  def list() = Action.async { implicit rs => // note the .async here
    ...
    Tiles.list().map { result => Ok(Json.toJson(result)) }
  }
}
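The essential change is just mapping over the Future instead of blocking on it; stripped of Play and Json (all names below are made up), the pattern looks like this:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// The controller change in miniature: instead of blocking for the Seq and
// wrapping it, we map over the Future and return a Future of the response.
object AsyncListDemo {
  def list(): Future[Seq[Int]] = Future(Seq(1, 2, 3)) // stands in for Tiles.list()

  def response(): Future[String] =
    list().map(result => s"Ok(${result.mkString(",")})") // like Ok(Json.toJson(result))

  def main(args: Array[String]): Unit =
    println(Await.result(response(), 1.second)) // prints: Ok(1,2,3)
}
```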

Voilà! Play 2.4 and Slick 3.0.0!

Thursday, May 21, 2015

Jira Git Stats Collector utility on GitHub

We have just pushed a very small utility tool named Jira Git Stats Collector to GitHub. You're welcome to check it out :).

Tuesday, May 19, 2015

Meet us at this month's Scala Meetup@Vienna

We will attend tomorrow's Scala Meetup in Vienna, Austria. The program looks absolutely delicious :)

Java 8: Getting rid of checked Exceptions

The scenario

  • you have some DAO that can enrich a DTO with different pieces of information
  • the call site can specify which information options should be added to the DTO
  • there are many different types of options that could potentially be added
  • you hate switch/case but you love Java 8 streams
  • you want to play with Java 8

The problem

  • we want to collect enrichment-methods in a Map and execute them dynamically
  • our enrichment methods throw SQLException and (for some crazy reason) cannot be changed
  • we still want (for some other crazy reason) to propagate any SQLException up to the call site

The solution

  • we create a @FunctionalInterface representing something that consumes a T and throws a SQLException
  • we allow that interface to transform itself into a Function that takes a T and returns an Optional<SQLException>

Thus we can remove the checked exception that would otherwise have stopped us from using method references in options.stream().

import java.sql.SQLException;
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;

import com.google.common.collect.ImmutableMap;

import static java.util.Arrays.asList;

public class SomeDao<T> {  
    private final Map<With, OptionHandler<T>> optionHandlers;

    public SomeDao() {
        this.optionHandlers = ImmutableMap.of(
                With.THIS, this::enrichWithThis,
                With.THAT, this::enrichWithThat
        );
    }

    public void enrich(final T someDto, final With... options)
            throws SQLException {
        final Optional<SQLException> e = asList(options).stream()
                .map(optionHandlers::get)
                .map(OptionHandler::toFunction)
                .map(function -> function.apply(someDto))
                .filter(Optional::isPresent)
                .map(Optional::get)
                .findAny(); // this will short circuit execution
                            // in case a SQLException occurs

        if (e.isPresent()) {
            throw e.get();
        }
    }

    @FunctionalInterface
    private interface OptionHandler<T> {
        void accept(final T t) throws SQLException;

        default Function<T, Optional<SQLException>> toFunction() {
            return argument -> {
                try {
                    accept(argument);
                    return Optional.empty();
                } catch (SQLException e) {
                    return Optional.of(e);
                }
            };
        }
    }

    public enum With {
        THIS, THAT
    }

    private void enrichWithThis(final T dto) throws SQLException {
        // something
    }

    private void enrichWithThat(final T dto) throws SQLException {
        // something
    }
}

The call site would typically look something like this:

 
  private final SomeDao<MyDto> someDao;
  ...

  private void prepareMyDto(final MyDto myDto) throws SQLException {
    someDao.enrich(myDto, With.THIS, With.THAT);
  }
}