Custom DAO Implementation
The plugin supports loading a custom DAO for the journal and snapshot store. You should implement a custom Data Access Object (DAO) if you wish to alter the default persistence strategy in any way while reusing the logic that the plugin already has in place, e.g. the Apache Pekko Persistence Query API. For reference, the default persistence strategy serializes journal and snapshot messages using a serializer of your choice and stores them as byte arrays in the database.
A DAO can be configured by means of configuration in application.conf; the default DAOs are shown below:
jdbc-journal {
dao = "org.apache.pekko.persistence.jdbc.journal.dao.DefaultJournalDao"
}
jdbc-snapshot-store {
dao = "org.apache.pekko.persistence.jdbc.snapshot.dao.DefaultSnapshotDao"
}
jdbc-read-journal {
dao = "org.apache.pekko.persistence.jdbc.query.dao.DefaultReadJournalDao"
}
Storing messages as byte arrays in BLOBs is not the only way to store information in a database. For example, you could store messages with full type information as normal database rows, with each event type having its own table: a journal log table holding the persistenceId, the sequenceNumber and an event type discriminator field, and per-type tables storing the event data with full typing.
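As a sketch of that idea, the snippet below models a journal log row plus one typed table per event type. The row classes and the toRows helper are hypothetical illustrations, not part of the plugin:

```scala
// Hypothetical sketch of a typed-row persistence strategy.
// One journal log row per event, carrying a type discriminator:
final case class JournalLogRow(persistenceId: String, sequenceNumber: Long, eventType: String)

sealed trait Event
final case class UserRegistered(userId: String, email: String) extends Event

// A fully typed row for UserRegistered events; each event type
// would get its own table (and row class) in this scheme.
final case class UserRegisteredRow(persistenceId: String, sequenceNumber: Long, userId: String, email: String)

// Map an event to its journal log entry plus its typed row.
def toRows(persistenceId: String, seqNr: Long, event: Event): (JournalLogRow, Product) =
  event match {
    case UserRegistered(userId, email) =>
      (JournalLogRow(persistenceId, seqNr, "UserRegistered"),
       UserRegisteredRow(persistenceId, seqNr, userId, email))
  }
```

A custom JournalDao would then insert both rows in one transaction on write, and join them back together on replay.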
You only have to implement one or both of the interfaces org.apache.pekko.persistence.jdbc.journal.dao.JournalDao and org.apache.pekko.persistence.jdbc.snapshot.dao.SnapshotDao.
For example, take a look at the following two custom DAOs:
class MyCustomJournalDao(db: Database, val profile: JdbcProfile, journalConfig: JournalConfig, serialization: Serialization)(implicit ec: ExecutionContext, mat: Materializer) extends JournalDao {
// snip
}
class MyCustomSnapshotDao(db: JdbcBackend#Database, val profile: JdbcProfile, snapshotConfig: SnapshotConfig, serialization: Serialization)(implicit ec: ExecutionContext, val mat: Materializer) extends SnapshotDao {
// snip
}
As you can see, the custom DAOs are injected with a Slick database, a Slick profile, the journal or snapshot configuration, an org.apache.pekko.serialization.Serialization, an ExecutionContext and a Materializer at construction time. You should register the Fully Qualified Class Name in application.conf so that the custom DAOs will be used.
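For illustration, registering the example DAOs above might look like the following, assuming they live in a hypothetical com.example package:

```
jdbc-journal {
  dao = "com.example.MyCustomJournalDao"
}

jdbc-snapshot-store {
  dao = "com.example.MyCustomSnapshotDao"
}
```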
For more information, please review the two default implementations org.apache.pekko.persistence.jdbc.dao.bytea.journal.ByteArrayJournalDao and org.apache.pekko.persistence.jdbc.dao.bytea.snapshot.ByteArraySnapshotDao, or the custom DAO example from the demo-akka-persistence site.
The APIs for custom DAOs are not guaranteed to be binary backwards compatible between major versions of the plugin; for example, 4.0.0 is not binary backwards compatible with 3.5.x. There may also be source-incompatible changes to the APIs for custom DAOs if new capabilities must be added to the traits.