We have a database here on SQL Server 2014 Enterprise that is absolutely perfect for delayed durability. We can tolerate data loss because the application loads from files, and we can simply reload a file if anything ever goes wrong. What we really need is performance. I work in retail, and around the holidays in particular, we can have a lot of transactions rung across the chain in a single day.
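For context, delayed durability in SQL Server 2014 can be switched on database-wide, so every commit returns without waiting for the log flush. A minimal sketch, assuming a hypothetical database name `Sales`:

```sql
-- Force every transaction in this database to use delayed durability:
-- COMMIT returns before the log records are hardened to disk.
-- (Database name "Sales" is hypothetical.)
ALTER DATABASE Sales SET DELAYED_DURABILITY = FORCED;
```

`FORCED` trades a small window of potential log-record loss for faster commits, which fits the "we can just reload the file" recovery story above.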
The setup is simple. There is a data file, a data mapping application that talks to SQL Server, and SQL Server itself. The mapper reads from the data file and loads transactions into the two main databases (one for inventory, one for sales). Easy enough.
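If forcing delayed durability database-wide feels too broad, the mapper's load transactions could opt in per commit instead. A sketch under those assumptions (the database is set to `ALLOWED`, and the table and column names are hypothetical):

```sql
-- One-time setting: let individual transactions opt in to delayed durability.
ALTER DATABASE Sales SET DELAYED_DURABILITY = ALLOWED;

-- In the mapper's load path (dbo.SalesLineItem and its columns are hypothetical):
BEGIN TRANSACTION;
    INSERT INTO dbo.SalesLineItem (TransactionID, SKU, Qty, Amount)
    VALUES (@TransactionID, @SKU, @Qty, @Amount);
-- Commit without waiting for the log flush; only these commits are affected.
COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);
```

This keeps fully durable behavior for everything else touching those databases while the bulk load gets the fast commits.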
To test, I decided to reload the file from 12/23/16 - one of the busiest days of the year. We rang over 800,000 transactions that day, so you're looking at multiple millions of line items, ...