Check in your NuGet packages

NuGet is really nice. It’s almost magical. Just run a simple command and all kinds of packages get added to your project. Template code files are added. Your web.config is modified in just the right way. You don’t even have to check the packages in to source control, thanks to automatic package restore.

You can even create your own packages and set up your own private package sources.


But before we get carried away with this infatuation, let’s all remember that first loves rarely last. More than once I’ve been in the situation that a build depended on NuGet packages residing on company-internal NuGet repos. Working as a consultant on that project, or hiring a consultant to help? You get access to the source code, but still can’t build. Working from home? Forget it. Or how about simply working offline on a project with ONLY public NuGet packages? Use git to clean your working area and then rebuild. Sorry, git might allow you to work offline, but NuGet reintroduces that network connectivity dependency that we all thought we left behind when we turned our backs on Subversion, TFS or (the horror) SourceSafe.

There is also the question of code maintainability. The code that we write will have to be maintained (http://www.mit.edu/~xela/tao.html). You could easily have to maintain it 10 years into the future.
That is no problem of course, because NuGet will be here tomorrow as well, right? Well, there was a time when we thought ICQ was here to stay; later on, Second Life was the future. Less than 10 years ago I worked at a company vastly dominant within its industry. We outright laughed when a certain fruit company made their first feeble attempts. Now, less than 10 years later, 50,000 employees have been laid off, and everything we thought was set in stone about how the market worked has been turned upside down. I see no reason to bet my professional reputation as someone who can maintain my code in the future on a dependency on some third-party online package repository.

I’ve also been on projects where the continuous integration environment suddenly broke because a Bower package got removed by the developers behind it. “Well, you can’t remove NuGet packages once they are published!” Really? Maybe not today, but are you willing to bet that it doesn’t happen tomorrow?

Also, nuget.org doesn’t have a 100% uptime guarantee. It actually DOES go down every now and then. And then you can’t program for an undefined period of time due to circumstances beyond your control.

Just do it

The solution to this problem is really simple. Check in your NuGet packages. Clean the /packages directory, build (so NuGet does the automatic package restore), check in whatever NuGet added to /packages and you are in the safe zone. NuGet restore will not attempt to go online as long as the /packages directory contains the packages you need.
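One practical gotcha if you try this: the common Visual Studio .gitignore template contains a blanket rule excluding the packages folder, so git will silently skip everything NuGet restored. A minimal sketch of the relevant .gitignore lines, assuming /packages sits next to your solution file:

```
# Delete any blanket "packages/" rule inherited from a default template,
# so the restored package contents actually get tracked.
# You can still keep the zipped .nupkg archives themselves out of the repo:
packages/**/*.nupkg
```

Alternatively, `git add -f packages/` forces the files past an existing ignore rule; once tracked, they stay tracked.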

This is not “best practice”, you might say. I say, “So what?”

The cost of adding /packages to source control is marginal at worst. There are NO downsides to it! What would you say if a colleague gave you access to a source code repo with all of his source files except one, which he insists on storing in a public OneDrive folder and restoring through a custom build step instead of checking it in?

So pretty please. With sugar on top. Check in your NuGet packages.


This Post Has 10 Comments

  1. Anders Poulsen

    Yeah, what he said!

  2. Josh

Those are some good points you raise, and I like the OneDrive analogy. That’s definitely given me something to think about. It’s easy to get caught up in blindly following ‘best practice’ without reflecting enough on what you’re doing.

  3. Daniel Marbach

You forget one important thing:

Source control, especially git, is really bad at handling binaries. This has a large impact on your dev cycle for every operation you do (fork, fetch, branch, merge…). You can use a simpler approach: download the packages to a network drive, add a nuget.config to your sources which points to that drive, and back up the drive regularly in your org. The packages themselves are already versioned. Done.
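For readers who want to try the network-drive approach, a minimal nuget.config sketch might look like this (the source name and share path are made up; point them at your own backed-up drive):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Clear inherited sources so restore only hits the internal share -->
    <clear />
    <!-- Hypothetical network share holding the downloaded .nupkg files -->
    <add key="CompanyShare" value="\\fileserver\nuget-packages" />
  </packageSources>
</configuration>
```

Placing this nuget.config next to the solution file makes restore use the share for everyone who clones the repo.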

  4. Anders Poulsen

    Yes, I’ve heard the “git doesn’t do large binaries very well” argument before. I don’t buy it.
    First of all, the binaries you’d be checking in are rarely more than a few hundred kB in size. Remember, I’m not suggesting that you check in the .nupkg files. Those are zip archives containing lots of files. You would be checking in a folder structure with some config files, template files and, yes, a few small dlls. No different from the rest of what you have in source control. Surely you are not suggesting that I’m not allowed to add my graphics assets to my source control? Or that I should avoid using image sprites because git likes it better if I use many small pngs?
    Second, the dev cycle operations that I do most often (check in, push, pull) are not affected in any way! The problems with handling binaries are biggest if you are checking in binaries that change often. You’d never, ever change any existing files in /packages. So no problem there. Branching and merging a lot is not a good practice anyway IMHO, so even if it were affected, I’d still not be concerned. And forking… Well, how often do you do that? Once per project, and then a hundred check-in/push operations, right?
    Finally, downloading the packages to a network drive simply replaces the dependency on nuget.org with a dependency on a company-internal private repo. I believe I covered that one in the original post, but just for the summary: that’s actually even worse.

  5. Anders Poulsen

    Sorry ’bout the rant…
    I guess what I really meant to say was: You shouldn’t let a technical implementation detail in your source control system be the deciding factor, when you make important strategic decisions.

  6. Nathan Smith

    I couldn’t agree with this more. I just had someone wipe out everything in /packages and Nuget attempted to work its Package Restore disaster, err…magic. Guess what? Things blew up. I really don’t understand the insatiable desire to keep dependencies out of source control. Further, as someone who travels (and is disconnected) a lot, it’s just downright irritating. Trying to pull down a !@#)(ton of dependencies on a plane’s wireless connection isn’t up there on the list of things I consider pleasant.

  7. Björn Ali Göransson

    You might need a .tfignore on solution level containing:

    !packages/*.dll

  8. Jeff

    In my opinion, a good approach would be a hybrid solution where you:
    * check in the parts of your NuGet packages that are necessary (for example: platform- or version-specific dlls, config files, etc.);
    * still use the versioning capabilities of NuGet package restore from a private store to force your developers to stay up to date with the latest versions of the packages YOU consider to be the ones to use in your projects.

    This approach is applicable to your own NuGet packages (those you create) as well as public/external packages (copied into your private store).
