To elaborate on a few details from the excellent blog post by Jan Alm:
A very common migration scenario is at the database level, where a SQL Server farm hosts a number of different databases serving several applications. The migration is then often performed in waves, bringing the databases for each application one by one over to the new SQL Server environment.
* DBCC CHECKDB with DATA_PURITY
I have been wondering how to verify that a database has already been checked by running DBCC CHECKDB with DATA_PURITY; after all, it might be a database that has been migrated and upgraded a couple of times, and its history might not be documented.
So the database might originally have been created on SQL Server 2000 or an earlier version (the DATA_PURITY issue was addressed in SQL Server 2005 and onwards).
Gladly, Paul Randal has described in one of his blog posts that it is possible to check the dbi_dbccFlags field in the database boot page.
The actual command that hopefully has already been run:
DBCC CHECKDB (N'testDatabase') WITH DATA_PURITY, NO_INFOMSGS;
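A sketch of how the boot-page check described by Paul Randal can be performed, using testDatabase as an example database name (DBCC DBINFO is an undocumented but widely used command):

```sql
-- Dump the boot page fields as a result set
DBCC DBINFO (N'testDatabase') WITH TABLERESULTS, NO_INFOMSGS;

-- In the output, look for the row where Field = 'dbi_dbccFlags':
--   0 = CHECKDB WITH DATA_PURITY has never completed on this database
--   2 = data-purity checks have completed and now run as part of CHECKDB
```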
* Database Compatibility Level
A very common finding, if the database has been migrated from an earlier version via the backup-and-restore or detach-and-attach method, is that the Compatibility Level of the database hasn't been changed to the new version.
There are rather few reasons to keep the database at the old Compatibility Level for backward compatibility, unless there is some specific application functionality that still relies on old syntax.
The best thing is to check with the application vendor and adjust the T-SQL code to use the new version's functionality, which usually performs faster as well.
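A minimal sketch for checking and raising the Compatibility Level, assuming a database named testDatabase and SQL Server 2008 (level 100) as the target version:

```sql
-- Check the current Compatibility Level
SELECT name, compatibility_level
FROM sys.databases
WHERE name = N'testDatabase';

-- Raise it to the new server's level (100 = SQL Server 2008, as an example)
ALTER DATABASE testDatabase SET COMPATIBILITY_LEVEL = 100;
```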
* Database Page Verify setting to CHECKSUM
Also very common, if the database has been migrated from an earlier version, is that the Page Verify setting hasn't been adjusted, so it might still be running the SQL Server 2000 default (i.e. TORN_PAGE_DETECTION).
There are very few reasons not to set the database to CHECKSUM, since it adds almost no load but gives more protection and earlier warnings against database corruption.
One thing to note, however, is that changing the setting to CHECKSUM doesn't affect data that has already been written; a checksum is only written when pages in the database are subsequently updated or inserted.
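A sketch of how the setting can be inspected and changed, again with testDatabase as an example name:

```sql
-- Check the current Page Verify setting
SELECT name, page_verify_option_desc
FROM sys.databases
WHERE name = N'testDatabase';

-- Switch to CHECKSUM; existing pages only get a checksum when they are rewritten
ALTER DATABASE testDatabase SET PAGE_VERIFY CHECKSUM;
```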
* Database log file fragmentation
The database log file settings, or earlier problems if the log has been growing rather wildly, also tag along with a migration, so the database might not perform that well despite the new and better-performing environment.
Verify the log file fragmentation with the command: DBCC LOGINFO ('testDatabase')
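A sketch of the check and one common remedy, assuming testDatabase with a logical log file name of testDatabase_log (both are example names; verify the logical name with sp_helpfile first):

```sql
-- Each row returned is one virtual log file (VLF);
-- a high row count indicates log fragmentation
DBCC LOGINFO (N'testDatabase');

-- Common remedy: shrink the log, then regrow it to the desired size in one step
USE testDatabase;
DBCC SHRINKFILE (testDatabase_log, 1);
ALTER DATABASE testDatabase
MODIFY FILE (NAME = testDatabase_log, SIZE = 1024MB);
```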
* Database Diagrams
The database might have been migrated with an account that no longer exists, so the owner field of the database is blank.
The symptom of that is an error message when trying to expand the Database Diagrams folder for the database in Management Studio.
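A sketch for verifying and fixing the owner, using testDatabase and the sa login as example names:

```sql
-- Check the current owner; a NULL or unresolvable owner
-- is what triggers the Database Diagrams error
SELECT name, SUSER_SNAME(owner_sid) AS owner_name
FROM sys.databases
WHERE name = N'testDatabase';

-- Assign a valid owner (sa is just an example login)
ALTER AUTHORIZATION ON DATABASE::testDatabase TO sa;
```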
And of course the database settings Auto Close and Auto Shrink should not be used, since those are really bad settings.
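For completeness, turning both settings off for the example database:

```sql
-- Auto Close causes constant open/close overhead; Auto Shrink causes
-- fragmentation and repeated grow/shrink cycles
ALTER DATABASE testDatabase SET AUTO_CLOSE OFF;
ALTER DATABASE testDatabase SET AUTO_SHRINK OFF;
```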