I'm using Slick with Play Framework 2.1 and I'm having some trouble.
Given the following entity...
package models

import scala.slick.driver.PostgresDriver.simple._

case class Account(id: Option[Long], email: String, password: String)

object Accounts extends Table[Account]("account") {
  def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def email = column[String]("email")
  def password = column[String]("password")
  def * = id.? ~ email ~ password <> (Account, Account.unapply _)
}
...I have to import a package for a specific database driver, but I want to use H2 for testing and PostgreSQL in production. How should I proceed?
I was able to work around this by overriding the driver settings in my unit test:
package test

import org.specs2.mutable._
import play.api.test._
import play.api.test.Helpers._
import scala.slick.driver.H2Driver.simple._
import Database.threadLocalSession
import models.{Accounts, Account}

class AccountSpec extends Specification {
  "An Account" should {
    "be creatable" in {
      Database.forURL("jdbc:h2:mem:test1", driver = "org.h2.Driver") withSession {
        Accounts.ddl.create
        Accounts.insert(Account(None, "[email protected]", "Password"))
        val account = for (account <- Accounts) yield account
        account.first.id.get mustEqual 1
      }
    }
  }
}
I don't like this solution, and I'm wondering whether there is an elegant way to write DB-agnostic code so that two different database engines can be used: one for testing and another for production.
I don't want to use evolutions either; I prefer to let Slick create the database tables for me:
import play.api.Application
import play.api.GlobalSettings
import play.api.Play.current
import play.api.db.DB
import scala.slick.driver.PostgresDriver.simple._
import Database.threadLocalSession
import models.Accounts

object Global extends GlobalSettings {
  override def onStart(app: Application) {
    lazy val database = Database.forDataSource(DB.getDataSource())
    database withSession {
      Accounts.ddl.create
    }
  }
}
The first time I start the application, everything works fine... then, of course, the second time I start the application, it crashes because the tables already exist in the PostgreSQL database.
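I could probably avoid the crash by checking the database metadata before creating the schema. Here is a rough, untested sketch using Slick's MTable (assuming it behaves the same against H2 and PostgreSQL), but it still leaves the driver import hard-coded:

import scala.slick.jdbc.meta.MTable

override def onStart(app: Application) {
  lazy val database = Database.forDataSource(DB.getDataSource())
  database withSession {
    // only create the schema if the "account" table does not exist yet
    if (MTable.getTables("account").list.isEmpty) {
      Accounts.ddl.create
    }
  }
}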
That said, my last two questions are: how can I create the database tables only if they do not already exist, and how can I make the onStart method above DB-agnostic so that I can test my application with FakeApplication?

You can find an example of how to use the cake pattern / dependency injection to decouple the Slick driver from the database access layer here: https://github.com/slick/slick-examples.
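The core idea there, roughly sketched below (my own condensed version, not code taken verbatim from that repository), is to depend on an abstract profile instead of importing a concrete driver, and to choose the driver only where the data access layer is instantiated:

import scala.slick.driver.ExtendedProfile

case class Account(id: Option[Long], email: String, password: String)

// the component only sees an abstract profile, not a concrete driver
trait Profile {
  val profile: ExtendedProfile
}

trait AccountComponent { self: Profile =>
  import profile.simple._

  object Accounts extends Table[Account]("account") {
    def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
    def email = column[String]("email")
    def password = column[String]("password")
    def * = id.? ~ email ~ password <> (Account, Account.unapply _)
  }
}

// the concrete driver is chosen in exactly one place, e.g.
//   new DAL(scala.slick.driver.PostgresDriver) in production
//   new DAL(scala.slick.driver.H2Driver) in tests
class DAL(override val profile: ExtendedProfile) extends AccountComponent with Profile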
A few days ago I wrote a Slick integration library for Play, which moves the driver dependency to the application.conf of the Play project: https://github.com/danieldietrich/slick-integration.
With the help of this library, your example would be implemented as follows:
1) Add the dependency to project/Build.scala
"net.danieldietrich" %% "slick-integration" % "1.0-SNAPSHOT"
Add snapshot repository
resolvers += "Daniel's Repository" at "http://danieldietrich.net/repository/snapshots"
Or local repository, if slick-integration is published locally
resolvers += Resolver.mavenLocal
2) Add the Slick driver to conf/application.conf
slick.default.driver=scala.slick.driver.H2Driver
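For production, the same key would point at the PostgreSQL driver instead (this line is my assumption, mirroring the driver class used later in this answer; the library does not prescribe it):

slick.default.driver=scala.slick.driver.PostgresDriver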
3) Implement app/models/Account.scala
In the case of slick-integration, it is assumed that you use primary keys of type Long which are auto-incremented, and that the primary key column is named 'id'. The Table/Mapper implementation provides default methods (delete, findAll, findById, insert, update). Your entities have to implement 'withId', which is needed by the 'insert' method.
package models

import scala.slick.integration._

case class Account(id: Option[Long], email: String, password: String)
    extends Entity[Account] {
  // currently needed by Mapper.create to set the auto generated id
  def withId(id: Long): Account = copy(id = Some(id))
}

// use cake pattern to 'inject' the Slick driver
trait AccountComponent extends _Component { self: Profile =>

  import profile.simple._

  object Accounts extends Mapper[Account]("account") {
    // def id is defined in Mapper
    def email = column[String]("email")
    def password = column[String]("password")
    def * = id.? ~ email ~ password <> (Account, Account.unapply _)
  }
}
4) Implement app/models/DAL.scala
This is the Data Access Layer (DAL) which is used by the controllers to access the database. Transactions are handled by the Table/Mapper implementation within the corresponding Component.
package models

import scala.slick.integration.PlayProfile
import scala.slick.integration._DAL
import scala.slick.lifted.DDL
import play.api.Play.current

class DAL(dbName: String) extends _DAL with AccountComponent
    /* with FooBarBazComponent */ with PlayProfile {

  // trait Profile implementation
  val profile = loadProfile(dbName)
  def db = dbProvider(dbName)

  // _DAL.ddl implementation
  lazy val ddl: DDL = Accounts.ddl // ++ FooBarBazs.ddl
}

object DAL extends DAL("default")
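A controller could then use the DAL through the default Mapper methods mentioned above. The following is a hypothetical sketch (the controller name and actions are mine, and it assumes the Mapper methods handle sessions internally, as described in step 4):

package controllers

import models.Account
import models.DAL._
import play.api.mvc._

object AccountController extends Controller {

  // list all account email addresses
  def list = Action {
    Ok(Accounts.findAll.map(_.email).mkString("\n"))
  }

  // create a new account and echo the generated id
  def create(email: String, password: String) = Action {
    val saved = Accounts.insert(Account(None, email, password))
    Ok("created account " + saved.id.get)
  }
}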
5) Implement test/test/AccountSpec.scala
package test

import models._
import models.DAL._
import org.specs2.mutable.Specification
import play.api.test.FakeApplication
import play.api.test.Helpers._
import scala.slick.session.Session

class AccountSpec extends Specification {

  def fakeApp[T](block: => T): T =
    running(FakeApplication(additionalConfiguration = inMemoryDatabase() ++
        Map("slick.default.driver" -> "scala.slick.driver.H2Driver",
            "evolutionplugin" -> "disabled"))) {
      db.withSession { implicit s: Session =>
        try {
          create
          block
        } finally {
          drop
        }
      }
    }

  "An Account" should {
    "be creatable" in fakeApp {
      val account = Accounts.insert(Account(None, "[email protected]", "Password"))
      val id = account.id
      id mustNotEqual None
      Accounts.findById(id.get) mustEqual Some(account)
    }
  }
}
I cannot give you a sufficient answer to this question...

...but perhaps this is not really something you want to do. What if you add an attribute to a table, say Account.active? If you want to preserve the data currently stored in your tables, then an alter script would do the job. Currently, such an alter script has to be written by hand. The DAL.ddl.createStatements method could be used to retrieve the create statements. They should be sorted to be better comparable with previous versions. Then a diff (with the previous version) is used to manually create the alter script. Here, evolutions are used to alter the db schema.
Here's an example of how to generate (the first) evolution:
object EvolutionGenerator extends App {

  import models.DAL
  import play.api.test._
  import play.api.test.Helpers._

  running(FakeApplication(additionalConfiguration = inMemoryDatabase() ++
      Map("slick.default.driver" -> "scala.slick.driver.PostgresDriver",
          "evolutionplugin" -> "disabled"))) {

    val evolution = (
      """|# --- !Ups
         |""" + DAL.ddl.createStatements.mkString("\n", ";\n\n", ";\n") +
      """|
         |# --- !Downs
         |""" + DAL.ddl.dropStatements.mkString("\n", ";\n\n", ";\n")).stripMargin

    println(evolution)
  }
}
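The printed statements can then be saved as the first evolution script (conf/evolutions/default/1.sql for the default database), so that the schema is created and dropped via evolutions instead of Global.onStart.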