Version Control

The point with git/hg init, though, is that you’re up and running with version control on your local machine instantly. Setting up a local repo takes about a second, and you can push it to a remote server whenever you want - really easily - and then use it in much the same way you’d use svn.
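To make that concrete, here’s roughly what the whole flow looks like (the project name and remote URL are made-up examples):

```shell
# Create a local repo and make a first commit -- no server involved.
git init myproject
cd myproject
echo "hello" > README
git add README
git commit -m "Initial commit"

# Later, whenever you want a remote copy (URL here is illustrative):
git remote add origin git@example.com:me/myproject.git
git push -u origin master
```

Everything up to the push happens entirely on your own machine.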

Here’s another article that has info on the differences (including branching): https://git.wiki.kernel.org/index.php/GitSvnComparison

Here’s an article that does a side by side comparison of branching between git and subversion: http://markmcb.com/2008/10/18/3-reasons-to-switch-to-git-from-subversion/

The way git/mercurial handles branching and merging is different to svn, and it’s far better in the DVCS model. Take a look at the above article.

I’m guessing you’ve not really spent much time using git or mercurial, which may be why you think the way svn handles branching is acceptable. All I can say is that I was forced to handle svn recently and it felt like a major step backwards - it’s just not very nice to use - it’s slow, cumbersome and gets in the way. I used to like it when I used it before I switched to mercurial, but since I understood how branching and merging work in mercurial, there’s just no way I could go back to svn.

And I still think it’s easier to use git or mercurial than it is to use svn - to use it locally for one developer, you don’t even need a server - you’re up and running in seconds with no messing about.

I’d recommend just giving mercurial a try in your spare time. Have an experiment with branching, and see what you think.

Check this site out: http://hginit.com

The only immediate benefit I find in git/mercurial is that you don’t really have to think about your branch structure, as most merges are treated like, or act like, a baseless merge. SVN, on the other hand, works better with a structured branch/merge workflow: you go from this branch to that one, then the next, and finally prod. The same doesn’t hold true for git/mercurial, and I think that’s what everyone enjoys most: you don’t have to follow a structured branch workflow. You can go from branch X to prod without any of the in-between branches being involved. Granted, you can do this in other version control systems too; it just isn’t as obvious.

Unfortunately, you’ll never sell me on the idea that creating repos locally is a good thing. I just don’t see the purpose, as you aren’t getting the benefit you really need: your source in a remote location, so that if something happens locally, you aren’t screwed. That won’t sell me on git or mercurial, because it fails to protect you and your code; the fact that it’s so easy just means people will keep doing it and potentially lose their changes/data for never pushing them to a remote location.

Oh well, I’ll leave my two cents at that and simply say: all of the pages you link to merely show that those teams needed a distributed system for their workflow. That isn’t a clear-cut case that everyone needs it. There are specific times when it makes sense to make that move, and there are clear indicators that you don’t need to as well. Just because you can use it doesn’t mean you need it. If you don’t have a defined workflow in place, you are using a DVCS purely because you don’t understand your workflow and it stops you from having to think about it. That is still dangerous in my opinion, and although it may have made your life easier, your development cycle might not be as refined as it could be.

The place I worked at previously used TFS (which, as I’ve already mentioned, is similar to SVN), and we would create 30+ branches a month! We’d merge them, rarely having to deal with a conflict, because our workflow defined the structure and the structure was set up for our development cycle. We never had conflicts or collisions, and we could spin up instances of our apps quickly without having to alter configuration files or manually change local files. It was seamless, but it definitely wasn’t a DVCS. The Linux kernel team needed a DVCS: their development cycle called for it, and they have a workflow in place that properly supports their DVCS initiative. I’ve worked in SVN repos that would benefit from a DVCS model, and I’ve worked in ones that just needed to be restructured because they failed to reflect their development cycle properly.

For a one man team, an SVN repo is more than sufficient and usually a very good start, especially from a beginner perspective. You need to learn a LOT of lessons before moving into a system that lets you do anything you want. SVN will teach those lessons and eventually show/tell you when you’ve out-grown its system and need to move up.

Well, with regards to the local repo thing - I’d argue the opposite actually. If you’re interested in making sure your code is safe, the local repo + server model works far better. Look at it like this - say you have 5 developers working on a project, with one centralised server - with DVCS, this means you have a minimum of 6 copies of the entire repo. If the server has a hard drive failure, that’s just one node, which means you have 5 other copies all ready to replace it. Subversion is actually LESS protective of your code and source control history, because even if you have 100 developers working on a project, you still have a single point of failure - lose that server, and you lose your entire version control history - so it’s actually worse for protecting your code.

What’s more, being able to commit locally means the experience of using version control is MUCH faster. Every operation is more or less instant. Using svn means you call a command and have to wait like 5-10 seconds for everything. It’s not a good user experience.

It also means you can really easily do experiments - you can clone your own repo onto your local machine again if you want and do all kinds of experiments with it - if it works, you can push back to the server, and if it doesn’t you can just remove that entire clone. This means you can use version control and have all the benefits without having to connect to the server each time, but when you’re ready to share your work, you do a push and it’s there for all to see. The distributed model is better in every single use case. I don’t think you could give me a single example where the purely centralised model is better.
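That clone-for-experiments workflow is as simple as this (paths are illustrative):

```shell
# Clone your existing local repo into a throwaway copy.
git clone ~/projects/myapp ~/projects/myapp-experiment
cd ~/projects/myapp-experiment
# ...hack away, commit freely, rewrite history, whatever...

# If the experiment worked out, push the commits back:
git push origin master

# If it didn't, just delete the whole directory:
rm -rf ~/projects/myapp-experiment
```

The original repo never knows the experiment happened unless you push.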

Even for a single man project, I still think the DVCS is better, because it’s capable of handling a simple workflow too, and when you want to move upwards, it’s just a matter of learning how to do so. Svn requires a server even for a one man project, which doesn’t really make much sense to me. If you want the server you can do that anyway, but it’s great to be able to get up and running really quickly. You lose nothing - if you want a server it’s easy to do, if not that’s easy to do too.

The idea of needing to connect to a server just to see how a file has changed over time seems so old fashioned and cumbersome to me. It’s painful, and it doesn’t need to work that way.

Another significant gripe with svn - those horrible .svn directories you get in every single directory in your project. Always used to hate that too - but again, this just doesn’t happen with git/mercurial.

I can’t see any advantage to using svn under any conditions - honestly, I think git/mercurial win in every use case I can imagine.

Again, from the above article:

As of the time of writing this article, I’ve done 34 merges in about 2 weeks – I sit down in the morning and merge in all the branches that look good. As an example, during the last merge session I inspected and merged 5 separate branches in 13 minutes. Once again, I will leave it as an exercise to the reader to contemplate how that would have gone in Subversion.

I’ll probably just leave it there now.

This conversation exploded while I was away. :xeye:

Ditto, but the opposite.

Yup. I first learned distributed VCS with Mercurial, and I used Git on a project about a year ago.

If you were making intermediate commits during those two weeks, then you would have created a branch right from the get-go. Once your feature is ready to go live, you can merge that branch into the trunk. Not much more process to describe here. The merge would complete automagically.

Literally exactly the same. Branch from the trunk. Commit to the branch as normal. When it’s time to merge, just run merge. Done.

To start a repo in SVN… “svnadmin create”.
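Spelled out, the local-only svn flow described above looks roughly like this (all paths and names are made up; note that file:// access needs no server daemon):

```shell
# Create a purely local svn repo.
svnadmin create "$HOME/repos/demo"
svn mkdir -m "standard layout" \
    "file://$HOME/repos/demo/trunk" \
    "file://$HOME/repos/demo/branches"

svn checkout "file://$HOME/repos/demo/trunk" demo-wc
cd demo-wc
echo "v1" > app.txt
svn add app.txt
svn commit -m "first commit"

# Branch from trunk, work on it, then merge back:
svn copy ^/trunk ^/branches/feature -m "create feature branch"
svn switch ^/branches/feature
echo "v2" > app.txt
svn commit -m "work on feature"
svn switch ^/trunk
svn merge ^/branches/feature
svn commit -m "merge feature into trunk"
```

So svn can be run server-free too; the difference is in how heavy the branch/merge steps feel day to day.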

Your assumption is that you pushed it to other locations. If you only created it locally and never pushed that branch anywhere else, you don’t have that at all - you have a single point of failure. My concern is that people will create it locally and not push it until it’s ready (I could be wrong here, as maybe it does push the branch immediately to all nodes, but I personally wouldn’t think so; please correct me if I’m wrong :)), whereas with SVN you are required to have it remote immediately.

By and large, I actually agree with this statement. Although, I think your experiences with SVN are atypical and, frankly, more likely due to a misunderstanding and misuse of the tool. Nonetheless, I fully agree and acknowledge that Git/Hg can do everything SVN can do, plus more.

The reason I personally haven’t made the switch yet comes down to one very specific reason: I significantly prefer GUIs to the command line, especially for VCS. When I was using Git about a year and a half ago, the best GUI client I could find was TortoiseGit, and it was atrocious. It was ridiculously slow, and if I needed to commit a large number of files, such as when adding Symfony or Zend Framework, it was 50/50 whether TortoiseGit could even complete the operation. Oftentimes, it would simply crash.

Though, I did notice your link to the client SourceTree, and that there’s a Windows beta. Perhaps I’ll once again browse around and see how well the GUI client situation has improved.

Yeah, it’s interesting you should mention the lack of decent gui support for windows. I’ve been using sourcetree on the mac for over a year now (and love it), but haven’t used a windows machine in a few years.

As I stated above, we recently gave a graphic designer access to our source control, so he could make design related changes (our flow dictates that he can only push to me and my colleague directly - he’s not able to push to the main server, so we vet everything he does to make sure he doesn’t do anything silly like edit some php or anything like that). As part of this process, I needed to create a vagrant box for him, so he could have a reliable environment on windows. While testing on windows myself, I searched for a decent gui, assuming that there would be one somewhere, and just couldn’t find anything. Funnily enough, literally the day after struggling to find a gui for windows (and bear in mind, this was only last week), sourcetree for windows was released in beta form. It currently only supports Git (the Mac version supports Git + Mercurial), but it’s well worth your time. It’s largely due to this gui tool that I “got” DVCS, and I still use it every day in work. Take a look at it! :slight_smile:

I do see your point, but even if you only ever keep things local, you have the same single point of failure as svn. The moment you push to a remote location, you instantly have two full copies of the repo, and the count increases every time someone clones it. With svn, it always stays the same. Again, in an svn-based system, 100 developers working on a project = 1 point of failure, whereas with a DVCS, 100 devs = at least 101 complete copies of the repo - and there could easily be more, as some devs will clone their local repo for experiments and the like.

Generally btw I tend to push to a remote location more or less instantly, usually on the first commit. However, sometimes, if I’m just doing something quick like say a small script or say a bug fix on another system (in which case maybe I’ll just download the files directly to my machine before creating a local repo), just locally doing it is fine and still gives you the benefits of version control for the brief amount of time you require it. I can still branch and merge and flip backward in time + see all changes etc directly on my local machine.

Depends what I’m working on, basically. For the vast majority of use-cases I’ll push to a remote location very quickly, and from that point on I already have twice as many full copies of the repo as svn would give me - and the count only grows from there.

Here’s a nice intro video to sourcetree (3 mins long): https://www.youtube.com/watch?v=lDUOvLE8T4o

No, I have two with SVN from the start: my local copy and the server copy. Plus, anyone else who checks out the repo has a copy… sure, they can’t sync up any more if the server goes down, but that’s okay - I wouldn’t want them to. I then really only need to back up one location; a good, proven backup system will save my single server and let me restore it fairly quickly. I don’t really see your point, personally. There’s a reason this worked fine for years, and continues to work fine for many companies for many more years – they don’t have an immediate need for a DVCS.

DVCS isn’t right for everyone and seriously, if you are a one man team, a DVCS makes little to no sense. You will likely have a remote off-site server you push to and your local machine. You basically have SVN.

Ah hang on, but when you talk about people checking out - you understand the difference there between a checkout in svn and a clone in git/hg? When you have a copy on your local machine in svn, you only have the working copy - that is, you’re not actually cloning the entire repo and all the vc history. Whenever someone clones a repo in git/hg, they don’t just get the working copy - they get the entire vc history of the entire project, so you have a complete copy of the repo. Yes, it’s true that in svn you’d have copies of the latest working copy, but in the case of a hard drive failure on the server, you’d lose all the vc history, whereas in git/hg, you wouldn’t. This is what I mean when I say your source is actually safer using git/hg.
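You can see the difference directly - a git clone carries the full history, so it stays browsable even with the server gone (the remote URL here is a made-up example):

```shell
# A git clone pulls the ENTIRE history down into .git:
git clone git@example.com:team/project.git
cd project

git log --oneline    # works offline -- no server round-trip
git show HEAD~1      # inspect an earlier commit, still offline

# An svn checkout, by contrast, only has the working copy;
# history queries like `svn log` go to the server every time.
```

If the central server dies, any one of those clones contains everything needed to rebuild it.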

That is a fair point, but I’d argue I rarely care about the entire vc history if I know the trunk to be true and already in prod. For feature branches, that “may” be useful, but I’d still question its usefulness. Please don’t take this link as a hit against Git, as that’s not my intention; rather, the setup/workflow of your Git or SVN is very important. Be sure you are 1) using the right tool for the given job and 2) using the chosen version control properly, otherwise you may run into dire consequences (unfortunately, the KDE team set up Git wrong in a very bad way).

Fair enough. Look, if it works for you then it works for you. I can only say that in my experience I much prefer git and mercurial over svn, and that when I had to use svn recently I really didn’t like it. A lot of that is just down to general user experience stuff too - like the fact that you have to wait after issuing any command. I’m so used to stuff happening instantly that even that felt like a step back to me.

There’s no point arguing over stuff like this though really - fact is, you have a workflow that works for you, and I’ve got a workflow that works for me :slight_smile: