Treat the Database like Code

  • steve.hansen (4/5/2012)


    This is an excellent (and timely) topic. We're pondering this same question right now, but we have a challenge that we haven't quite figured out how to overcome. Any suggestions would be greatly appreciated.

    Here's the deal. We have a third-party app that we use. The database is open to us for modification (and we do modify the vendor's scripts a little bit, and we also have a fair chunk of our own scripts). It would be a fairly trivial task to use a source control tool for our own stuff, but how do we incorporate new versions from the vendor?

    Keep in mind, when the vendor supplies a new version, it's in the form of a complete build, with potentially hundreds of changed sprocs, views, functions, and indexes. Anybody have any brilliant ideas?

    Brilliant? Heck no. Ideas? Sure.

    Branches. One for the original vendor code, and one for your updates. Then, when they come out with new versions, you update their branch and then you have to compare branch-to-branch to arrive at what parts of your code can be immediately re-implemented and what parts will have to be refactored. Somewhat manual process, but I'm assuming their releases are probably largely incremental in nature and not that frequent.

    ----------------------------------------------------
    The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood... Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software

  • As Grant said in an earlier post, Microsoft & Red Gate both have great tools for doing database development & maintaining database scripts. Both of them have excellent options for performing Schema Compares and generating the delta scripts against different targets for deployments. I have used both Microsoft's & Red Gate's Schema Compare tools and like them both.

    I have only used the Red Gate Source Control tool on a limited basis at a recent job to maintain the versions of the SQL objects, but I have not tried the Migration features in the tool. The product seemed to be a great option for maintaining source control and migrating your deltas to different environments, though I would need more time to play around with it to understand it better.

    For the last four years, at a couple of different jobs, I have been using Microsoft's database development toolset, which was nicknamed Datadude in many of its past iterations. I have worked with Visual Studio 2008 Team System Database Tools (VSTSDB), the VSTSDB GDR version, the VS 2010 Database Tools, and most recently have begun working with the new SQL 2012 version of the tools, now called SSDT (SQL Server Data Tools). These tools have been awesome for maintaining the team's database source code and have made deployments to different environments much easier. Since using these tools, it is very hard to go back to just coding in SSMS without them. I just started a new job this week and I am trying to get access to these great tools again for my database development. I miss them :(.

    I wouldn't have been able to keep up with the different environment changes on an agile project at my last position without them. They made maintaining the different database states (DEV, SmokeTest, QA, Performance Testing, Demo, Production) so much easier.

    If some of you haven't tried the Red Gate tools or the Microsoft toolset for database development, now is the time. In fact, trying the Microsoft toolset is easier than ever: unlike past versions, which required a copy of Visual Studio Premium or Ultimate, the SSDT tools are now included with your SQL 2012 install for FREE.

    Have fun with them; both Red Gate's & Microsoft's are great options in our toolbelts.

    John

  • Where I work, we have a C# Console EXE that downloads the scripts from each environment (SANDBOX, INT, QA, PROD) to the build server, applies them to the build server database, and then compares them at different intervals.

    SANDBOX to Source Control is done daily and if the changes aren't checked into the Main Branch, then the manager of the database is e-mailed, along with any off-shore Leads.

    SANDBOX to INT is done whenever a build goes to INT.

    INT to QA is done whenever a build goes to QA.

    QA to PROD is done whenever a build goes to PROD.

    In all comparisons, an e-mail is sent out with any discrepancies. While this is an "after-the-fact" sort of notification, it makes it very easy to show when developers aren't checking in code. A simple telephone call to their manager tends to get things back in sync.

    We maintain a current deployment build, which is whatever version we are currently working on, and a branch for each version of the deployment build. These are maintained for DR: if we can't get backups, we have files from which we can reconstruct the data, one load at a time.

    In addition, each version has a Build branch consisting of all of the ALTER scripts necessary to build that version. When you change something, you write the ALTER script to apply the change, and then the DROP/CREATE script to apply to the current deployment build.

    We also require that developers provide a Rollback as part of the build. If you make a change via ALTER, you should be able to reverse the change.
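
    As a minimal sketch of what such a change/rollback pair might look like (the table and column names here are made up purely for illustration):

    -- Change script (hypothetical example): add a new nullable column
    alter table dbo.Customer
        add PreferredContactMethod varchar(20) null;

    -- Matching rollback script: reverses the change if the deployment has to be backed out
    alter table dbo.Customer
        drop column PreferredContactMethod;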

    It may sound like extra work, but over the last two years, I've found that it takes a few extra minutes, and the Rollback scripts have quieted System Engineer concerns about command file deployments.

  • Grant Fritchey (4/5/2012)


    steve.hansen (4/5/2012)


    This is an excellent (and timely) topic. We're pondering this same question right now, but we have a challenge that we haven't quite figured out how to overcome. Any suggestions would be greatly appreciated.

    Here's the deal. We have a third-party app that we use. The database is open to us for modification (and we do modify the vendor's scripts a little bit, and we also have a fair chunk of our own scripts). It would be a fairly trivial task to use a source control tool for our own stuff, but how do we incorporate new versions from the vendor?

    Keep in mind, when the vendor supplies a new version, it's in the form of a complete build, with potentially hundreds of changed sprocs, views, functions, and indexes. Anybody have any brilliant ideas?

    Brilliant? Heck no. Ideas? Sure.

    Branches. One for the original vendor code, and one for your updates. Then, when they come out with new versions, you update their branch and then you have to compare branch-to-branch to arrive at what parts of your code can be immediately re-implemented and what parts will have to be refactored. Somewhat manual process, but I'm assuming their releases are probably largely incremental in nature and not that frequent.

    I have more experience with the Microsoft toolset, so I will speak to this one. You could maintain a DB project for the vendor code & a DB project for your custom code and include them in the same solution. Then, when you build and deploy your overall solution, both projects' code will be compared to the target and the deltas rolled out accordingly. I believe the concept is called Composite Projects.

    Since you are receiving the Vendor code as a complete build, I would probably keep a schema-only copy of the Vendor database on an instance, and I would also have a DB project of the Vendor code. When the Vendor sent a new build, I would deploy it to the schema-only copy of the Vendor database. Then, I would do a Schema Compare with the Vendor schema-only database as the source & my DB project as the target to reverse engineer the deltas into my Vendor project. Next, I would make any necessary changes in my custom-code DB project as a result of their new version. Then, I would build and deploy (publish, for the new SSDT tools) the solution to my target environments.

    Hope that helps you on your path to a possible solution. Let us know what you come up with.

    John

  • I always keep a DDL store of some sort or another, to be able to deploy a copy of a database. I also keep DML statements for creating lookup table contents and that kind of thing.

    Those kinds of tables are one of the reasons it's hard to store a database in source control, since data can be part of "the code". If you need a list of US states, for example, in order for FKs in an Addresses table to work, then the contents of that USStates table are functionally "part of the code", not "part of the data". Differentiating the two is often tricky. (Which is part of why I make a practice of storing common data of that sort in a Common database, so I don't have to store it in source control.)
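
    To make the point concrete, here is a minimal sketch (the column definitions are illustrative, not from any actual schema):

    -- The table definitions are obviously "code"...
    create table dbo.USStates
    (
        StateCode char(2) not null primary key,
        StateName varchar(50) not null
    );

    create table dbo.Addresses
    (
        AddressID int identity(1,1) primary key,
        Street    varchar(100) not null,
        StateCode char(2) not null references dbo.USStates (StateCode)
    );

    -- ...but the FK is useless until these rows exist, so the rows are effectively code too.
    insert into dbo.USStates (StateCode, StateName)
    values ('AL', 'Alabama'), ('AK', 'Alaska'); -- and so on for the remaining states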

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • For "data as part of code", we actually include a directory called "Data" with the scripts to reload the data.

    We have also created stored procedures to TRUNCATE and INSERT the data.

    Either methodology has worked for us.
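
    As a rough sketch of the stored-procedure approach (the table, columns, and procedure name below are invented for illustration):

    create procedure dbo.Reload_OrderStatus
    as
    begin
        set nocount on;

        -- Wipe and reload the reference rows so every environment ends up identical.
        -- (TRUNCATE requires that no foreign keys reference the table; use DELETE otherwise.)
        truncate table dbo.OrderStatus;

        insert into dbo.OrderStatus (StatusCode, StatusName)
        values (1, 'Open'),
               (2, 'Shipped'),
               (3, 'Closed');
    end;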

  • GSquared (4/5/2012)


    I always keep a DDL store of some sort or another, to be able to deploy a copy of a database. I also keep DML statements for creating lookup table contents and that kind of thing.

    Those kinds of tables are one of the reasons it's hard to store a database in source control, since data can be part of "the code". If you need a list of US states, for example, in order for FKs in an Addresses table to work, then the contents of that USStates table are functionally "part of the code", not "part of the data". Differentiating the two is often tricky. (Which is part of why I make a practice of storing common data of that sort in a Common database, so I don't have to store it in source control.)

    I personally have gone back and forth on this subject, trying to figure out the most efficient way to store this type of data. I like to call it domain data or system data. Currently, I have been storing this type of data as Post-Deployment scripts in my Visual Studio DB projects. I have been using the MERGE statement in the scripts, where I create a @table variable to hold my source data and make sure my target table is in the same state as the script's source table when the code is deployed. I only include the scripts in a deployment if there was a change to that particular data. Though, you could probably include them in every deploy to make sure they were always in sync.
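
    A stripped-down sketch of that kind of post-deployment MERGE (the table and column names are hypothetical; the real scripts would use your own domain data):

    -- Post-Deployment script: keep the domain/system data in sync with the source list
    declare @Source table
    (
        StatusCode int         not null primary key,
        StatusName varchar(50) not null
    );

    insert into @Source (StatusCode, StatusName)
    values (1, 'Open'), (2, 'Shipped'), (3, 'Closed');

    merge dbo.ReferralStatus as target
    using @Source as source
        on target.StatusCode = source.StatusCode
    when matched and target.StatusName <> source.StatusName then
        update set StatusName = source.StatusName
    when not matched by target then
        insert (StatusCode, StatusName)
        values (source.StatusCode, source.StatusName)
    when not matched by source then
        delete;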

  • I maintain DDL scripts for each object in the database. I don't maintain a series of incremental change scripts; rather, there is one script per object. Typically, the script contains a CREATE followed by incremental ALTER / DROP statements that are applied as needed by querying the schema metadata. This makes the scripts self-documenting and deployments less problematic.

    For example:

    -- TFS24662 2009/03/05: Created
    create table dm.referral
    (
        patient_id        int not null,
        referral_datetime datetime not null,
        icd_code          varchar(7) not null,
        provider_id       int not null,
        constraint pk_referral primary key clustered ( patient_id, referral_datetime )
    );

    -- TFS84211 2010/07/15: Add index on referral.provider_id
    if not exists
    (
        select 1 from sys.indexes
        where object_id = object_id('dm.referral')
          and name = 'ix_referral_provider_id'
    )
    begin
        create index ix_referral_provider_id on dm.referral (provider_id);
    end;

    -- TFS78821 2011/10/23: Increase length of icd_code from 5 to 7.
    if exists
    (
        select 1 from information_schema.columns
        where table_schema = 'dm'
          and table_name = 'referral'
          and column_name = 'icd_code'
          and character_maximum_length != 7
    )
    begin
        alter table dm.referral
            alter column icd_code varchar(7) not null;
    end;

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Most shops that I have been in are already using Visual SourceSafe (VSS). So, this is kind of old news for me, but as someone already stated, Red Gate now has a source control application of their own, and their ads are plastered all over SQLServerCentral. It's kind of interesting that I have never read a negative review or comment posted on their site about any of their products. They must all be perfect. 😀

    "Technology is a weird thing. It brings you great gifts with one hand, and it stabs you in the back with the other. ...:-D"

  • chrisn-585491 (4/5/2012)


    Steve,

    I could be cynical, but the number of source/version control editorials and articles on SQLServerCentral and SimpleTalk has increased greatly since Red Gate started making SQL Source Control. The fact that many of them cover SVN instead of better choices like git or Mercurial disappoints me. But the fact that the topic is being brought up and discussed is a good thing for the community overall.

    Greatly? I guess if going from 1-2 to 4-5 is greatly, you might be right. It's an interesting topic to me, so I write on it. I use Subversion locally since I'm working solo, so that's what I mention. No slant against Git, Mercurial, or anything else, but I just don't use them. Red Gate supports them, so there isn't any intention to try and get people to use SVN over another one. It's just the one that comes to mind.

  • TravisDBA (4/5/2012)


    Most shops that I have been in are already using Visual SourceSafe (VSS). So, this is kind of old news for me, but as someone already stated, Red Gate now has a source control application of their own, and their ads are plastered all over SQLServerCentral. It's kind of interesting that I have never read a negative review or comment posted on their site about any of their products. They must all be perfect. 😀

    The product is new, and doesn't have a lot of competition, so maybe people aren't complaining because they have something?

    SQL Source Control isn't a VCS in itself. It's a layer that integrates SSMS with your VCS (git, TFS, SVN, VSS, Hg, etc.). Someone can work in SSMS and do the check-in/check-out process. It handles most, but not all, of the VCS stuff you'd want. For example, it doesn't do branching; you need your VCS console/app for that.

  • The product is new, and doesn't have a lot of competition, so maybe people aren't complaining because they have something?

    Note that I said no one is complaining on their site about ANY of their products, not just this one. :-D

    "Technology is a weird thing. It brings you great gifts with one hand, and it stabs you in the back with the other. ...:-D"

  • Steve Jones - SSC Editor (4/5/2012)


    chrisn-585491 (4/5/2012)


    Steve,

    I could be cynical, but the number of source/version control editorials and articles on SQLServerCentral and SimpleTalk has increased greatly since Red Gate started making SQL Source Control. The fact that many of them cover SVN instead of better choices like git or Mercurial disappoints me. But the fact that the topic is being brought up and discussed is a good thing for the community overall.

    Greatly? I guess if going from 1-2 to 4-5 is greatly, you might be right. It's an interesting topic to me, so I write on it. I use Subversion locally since I'm working solo, so that's what I mention. No slant against Git, Mercurial, or anything else, but I just don't use them. Red Gate supports them, so there isn't any intention to try and get people to use SVN over another one. It's just the one that comes to mind.

    Because I don't want to install TFS locally, I've tried several different options: SVN, Vault, Git, Mercurial, and a couple of others (which I won't name because they stunk). Mercurial was very good, if a little weird to work with. But SVN is just too easy, so I went with that. I'm using UberSVN right now and it's pretty great: a nice UI in a web page and a pretty standard SVN command line. It's primarily a path-of-least-resistance thing for me.

    ----------------------------------------------------
    The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood... Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software

  • TravisDBA (4/5/2012)


    Most shops that I have been in are already using Visual SourceSafe (VSS). So, this is kind of old news for me, but as someone already stated, Red Gate now has a source control application of their own, and their ads are plastered all over SQLServerCentral. It's kind of interesting that I have never read a negative review or comment posted on their site about any of their products. They must all be perfect. 😀

    Any time you want to see the warts & all presentation, come watch mine. I don't hold back. We have a ton of advantages over Visual Studio. They have a ton of advantages over us. Both tools are functional, and that's the important part.

    For example, one advantage in Visual Studio is the ability to have pre/post deployment scripts. SQL Source Control doesn't support that kind of thing... yet. So we're not perfect, not even close. We're just striving to do a good job and get better.

    ----------------------------------------------------
    The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood... Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software

  • I use TFS, but up until about three years ago, I was still using Visual Source Safe 6.0. It was good enough for versioned source control, which is why it has persisted for so long. It's what, 15 years old now?

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

