Go screwed over the indie game Haunts



  • This story is several months old, so apologies if this has been posted already. I thought about posting this in the Go discussion from last month but decided against necro'ing old threads.

    A game called Haunts that raised $28,000 on Kickstarter is dead in the water. Here's the second-to-last Kickstarter update:

    I’ve owed you all an update for a while, but frustratingly there hasn’t been much news to report. We got off to a roaring good start with community involvement in the open source Haunts effort, but we pretty quickly ran into a software roadblock that stymied all our efforts. Resolving it has proved quite a challenge. Almost as challenging was actually understanding why we were having so much trouble. As a non-programmer, I could grasp the issue in theory but I didn’t think I could explain it well. So, I asked Michael, who’s an old friend and project backer who has been working on Haunts a lot, to explain it:

    “Haunts is written in Go, a new and quickly evolving programming language developed at Google to address certain programming challenges that are important to the kind of work they do. There is nothing special in Go that makes it particularly well suited to video games, but on the other hand there is nothing inherent in the language that makes it a bad choice, except for the fact that it's new and not well supported. Anyway, one of the neat features of Go is that it is designed to take advantage of distributed code development and reuse of code developed by other people. It's a slightly hacky feature, IMO, but it works.

    Basically, there is no master "script" that tells the compiler how all the bits of code relate to each other. Instead, that information is embedded directly in the files themselves. So rather than, for example, specifying a folder that contains your "headers", you actually embed in your program an absolute path to github, google, or some other file-sharing repository, and the Go compiler will find the code and build it into your program. If you want to use OpenGL for rendering graphics, you can search the internet for someone who has already built a version of OpenGL for Go, and put import ("github.com/go-gl/opengl/gl") directly into your code. Voilà! Thus hilarity ensues.

    Because of course that's not how it really works. Instead, it makes a local copy of that external library. And if you have a local copy, you can edit it. And meanwhile, that person you copied the OpenGL code from? They are also changing their version (fixing bugs, adding features, adding bugs). Therefore there is no guarantee that your version and their version remain compatible: it takes work to maintain compatibility. So when you decide to share development of your project with other people, while "in theory" Go lets you just share the main code, which tells you where to get the imported code, in practice it takes more work than that, because we have to ensure we get the same versions. And that's where Go's feature is kinda hacky. Because it doesn't have a mechanism for doing that.

    Why does this matter? Well, when the original programmer on Haunts handed off the code to the community for development, the first problem we ran into is that the code seemed to be.... wrong. We know that there is a "working" version of the game, albeit buggy and incomplete. However, we can't use the source code that we have to recreate that game. Firstly, the source simply doesn't compile. It has syntax errors, so there is literally no output: nothing to "run". But the community jumped right in, fixed the problems, and got the game to compile. Now it "runs", but it's not really a game at this point. First, it's horribly slow. Second, it crashes as soon as you try to play the actual "game" (as opposed to navigating the startup menus). Now, this is the kind of thing one expects when developing a game (or any program). There is a bug (or several) that can be found and fixed to get us back to the same point as the "original". But this is the part the community is struggling with: why? We know it once worked. So now we're in a kind of no-man's-land, with the original programmer not being able to help during his transition to a new job and a new city, and none of the new volunteers having enough time to jump into the potentially very hairy problem of figuring out what is wrong in a program they didn't write, in a language they probably don't know, and with the problems potentially being in one or more of a dozen "sub-programs" that are developed by yet other people that we have no connection with.

    Can it be done? Yes. The original programmer has now dug up all the original code his build was using and the community has it. Preliminary inspection reveals potentially significant differences, so we’re hopeful that we can make some real progress in the coming weeks.”

    And that’s where we’re at. Not what we’d hoped, not even close, but not dead yet. Once we get this conundrum finally solved (which we will), then we’ll put out the call again and try and get things rolling forward at a good speed in the new year. Until then, Happy Holidays! Rick

    So basically, the way Go handles dependencies like headers is to hardcode a path to a github repo that has the header you want, then download a local copy of that header so you can edit it independently of the repo author, because all Go programmers are enlightened perfect angels and this could never cause dependency problems ever.
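    For the non-Go people, here's roughly what that looks like in source. This is just a sketch (the go-gl path is the one from the update above, and the blank import is only there to force the dependency); the point is that nothing in the file names a version, tag, or commit:

    [code]package main

    import (
        // Remote import path: "go get" clones this repo into GOPATH/src and the
        // compiler builds whatever revision happens to be checked out there.
        _ "github.com/go-gl/opengl/gl" // blank import, purely to pull in the dependency

        "fmt"
    )

    func main() {
        fmt.Println("built against whatever revision of go-gl the cache holds today")
    }[/code]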

    Here's part of the most recent update, which was posted in March, 3 months after the above one:

    Things are not going well. There has been no forward progress on the game, although there has been hard work put in trying to get the project back on track. Michael, a backer and friend and programmer, has been volunteering his time to dig into the code and try and see why no one has been able to get a version of the game to compile that even matches the unfinished state it was in when Jonathan (the original programmer) left.

    Michael has been unable to get a working version of the game to compile. He reached out directly to Jonathan for help. When Jonathan tried to get the game working from the current code, he also failed. He could not get a version of the game to work either, even using his own computer and expertise. He has not responded to additional requests for aid or insight into solving this very huge problem.

    I don’t have the technical expertise to understand exactly how that could be. Michael, who does have the technical expertise, is also at a bit of a loss, other than the fact that the shifting code base of the Go programming language combined with either changes in Lua or OpenGL libraries created some crippling incompatibility that Michael can’t track down and which apparently foiled even Jonathan’s attempt to create a working version.

    Yes. Not even the original programmer, with the original files on his original dev machine, can compile the source anymore. Woohoo!



  • Wow. Glad you started a new thread for this.



  • Go fuck itself?


  • Considered Harmful

    The obvious solution to me would be for it to not link to a repository but rather to a given tag (eg v1.2). This at least would be deterministic as to what code you pull. Also, hide the local cache from meddling users; make them fork the project in github if they want to make changes.

    This actually sounds like a neat feature if it was implemented correctly, one that might elicit a response other than "ugh, yet another language." One of the issues I struggled with when trying to start a FOSS project was how to bundle your dependencies without making it painful for other developers.
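    Something like a version-suffixed import path would get most of the way there: a small redirector service could map the suffix onto the matching tag, so "go get" at least pulls a known major version. A sketch, with a made-up path:

    [code]import (
        // The ".v1" suffix pins the major version via a hypothetical redirector;
        // still no lockfile, so minor/patch drift is possible.
        gl "example.org/go-gl/gl.v1"
    )[/code]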



  • @joe.edwards said:

    The obvious solution to me would be for it to not link to a repository but rather to a given tag (eg v1.2).

    No. It's a terrible idea. For many, many reasons.

    Here's a short list:
    1. A programming language should not require people to use a *commercial* website for a particular feature to work (and yes GitHub is a commercial website, it's not some fru-fru happy smiling dolphin gift from heaven-- they're in it to make money)
    2. Even if you disagree with the above, you must agree it should not require people to use a single commercial website-- either you support *all* code hosting websites or *none*
    3. Similarly, it should support either *all* source control systems or *none*
    4. Repos can be deleted or made private without the main developer even noticing (since they are working off the cache)
    5. (if you accept the tag idea:) tags can be deleted without the main developer even noticing (since they are working off the cache)

    I mean I already think Nuget on Windows/Visual Studio is a bad idea, but it's nothing compared to this Go bullshit. Not even in the same arena of bad ideas. Not even in the same continent of bad ideas.

    If your code relies on a library, MAKE A COPY OF THAT LIBRARY AND PUT IT IN YOUR PROJECT. Sure, go ahead and keep a list of the URLs you downloaded it from, and occasionally check for updates, no arguments there. But shit like Nuget or this Go feature will just cause pain 6 months down the road.
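    In Go terms that just means the import points at a copy inside your own tree instead of at somebody else's GitHub account. A rough sketch (the project path is made up):

    [code]import (
        // The dependency was copied into our own repo once and gets updated
        // deliberately, so the build never reaches out to github at all.
        "haunts/third_party/gl"
    )[/code]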

    BTW if you are hemming and hawing about how to send other developers all your dependencies, you have too many goddamned dependencies. Use a language with a better library (like C#) so you don't need 40,000 dependencies to check an email account.

    BTW, the secondary WTF: how is it possible for Go to fail to compile a program, but also fail to spit out any kind of error message?


  • Considered Harmful

    @blakeyrat said:

    1. A programming language should not require people to use a commercial website for a particular feature to work

    2. Even if you disagree with the above, you must agree it should not require people to use a single commercial website-- either you support all code hosting websites or none

    I was under the (mistaken?) impression that it was repo-agnostic and just took a git URL of some kind.

    @blakeyrat said:
    3. Similarly, it should support either *all* source control systems or *none*

    Virtually impossible, but maybe a plugin system would work here.

    @blakeyrat said:
    4. Repos can be deleted or made private without the main developer even noticing (since they are working off the cache)
    You've got bigger problems if one of your dependency repos vanish (probably time to find a new library anyway at that point). I consider this an acceptable risk.

    @blakeyrat said:
    5. (if you accept the tag idea:) tags can be deleted without the main developer even noticing (since they are working off the cache)

    Maybe possible, but I believe it's considered bad etiquette to purge history without a damn good reason.


  • @joe.edwards said:

    @blakeyrat said:
    4. Repos can be deleted or made private without the main developer even noticing (since they are working off the cache)

    You've got bigger problems if one of your dependency repos vanish (probably time to find a new library anyway at that point). I consider this an acceptable risk.

    It's an issue because, while it still works on your machine due to the cache, you can't send the code to anybody else now. If you had downloaded your own local copy of the library and put it in your project, this would not be an issue.

    I don't see why "dependency repo vanishes from GitHub" is a reason to find a new library. Unless you assume GitHub is the *only place ever in the history of code ever* someone might want to store their library. The thought of, "oh they might have just moved it to another site" never occurred to you?

    @joe.edwards said:

    @blakeyrat said:
    5. (if you accept the tag idea:) tags can be deleted without the main developer even noticing (since they are working off the cache)

    Maybe possible, but I believe it's considered bad etiquette to purge history without a damn good reason.

    Fuck etiquette, we're talking ENGINEERING. We care only for CORRECTNESS, not politeness.

    And as long as you can delete tags (and you can BTW, I think Git is worse than Satan's armpit and even I know that), then Go's code here is not correct. Just like you can't write a programming language based on one site having a fru-fru happy dolphin image, you also can't write a programming language that relies on etiquette to create correct programs. I mean duh.



  • Wow. I don't know much about Go, but it's starting to sound suspiciously like GLOBOL

    @GLOBOL said:

    PREMISE: GLOBOL is a fantastic language that takes advantage of global variables.
    GLOBOL takes the hassle and confusion out of figuring out scope. No longer need you ask "Which NUMBEROFPICKLES am I looking at right now?"

    ...
    In essence, GLOBOL forces all variables to be global. I do mean global in the sense that there is (almost) no sense of scope: the values are stored on a server somewhere and everyone uses the same pool of variables. If I define COUNTER to be 7, and then someone in China decides COUNTER should equal TRUE, the next time I read the value of COUNTER, it will be TRUE.



  • @blakeyrat said:

    If your code relies on a library, MAKE A COPY OF THAT LIBRARY AND PUT IT IN YOUR PROJECT. Sure, go ahead and keep a list of the URLs you downloaded it from, and occasionally check for updates, no arguments there. But shit like Nuget or this Go feature will just cause pain 6 months down the road.
     

    Agreed.  A compiler is a compiler.  Source control is source control.  Conflating the two is... a really bad idea.

     



  • I don't use Go, but looking at the documentation for writing Go code, these people are just full of it.

    Go specifies libraries by name, or optionally subdirectory and name.  Like

    [code]import (
        // standard-library packages, imported by plain path
        "flag"
        "html/template"
        "log"
        "net/http"
    )[/code]


  • ♿ (Parody)

    @Mason Wheeler said:

    Agreed.  A compiler is a compiler.  Source control is source control.  Conflating the two is... a really bad idea.

    Yes. This sounds to me like they looked at Maven and took it to the next level or something. I've never spent the time to figure out Maven. Its dependency stuff as I "understand" it sounds like an interesting idea, but every time I start to try to use it I'm overwhelmed by WTF. I get similar feelings about Go. A lot of the stuff they've tried sounds interesting and worth experimenting with, but, like the results of most experiments, should ultimately be rejected by anyone with sense.



  • @blakeyrat said:

    @joe.edwards said:
    The obvious solution to me would be for it to not link to a repository but rather to a given tag (eg v1.2).

    No. It's a terrible idea. For many, many reasons.

    I wouldn't completely agree. The PHP Composer tool works on a similar principle: there is a JSON file which tells Composer which library and which version/tag to download (you can say something like "2.0.*" or ">= 1.6.0 and < 2.0.0").
    We're using Composer where I work and I find it to be a pretty cool thing.
    We got bit once when a 3rd-party library introduced a bug, but it was caught in the testing process and we simply pinned the library to the last working version.



  • @Nelle said:

    @blakeyrat said:
    @joe.edwards said:
    The obvious solution to me would be for it to not link to a repository but rather to a given tag (eg v1.2).

    No. It's a terrible idea. For many, many reasons.

    I wouldn't completely agree. The PHP Composer tool works on a similar principle: there is a JSON file which tells Composer which library and which version/tag to download (you can say something like "2.0.*" or ">= 1.6.0 and < 2.0.0").

    Did you seriously just use "PHP does something similar" as an attempt to argue that something is not a terrible idea?



  • Correction: extremely poor decisions screwed over the indie game Haunted.



  • @The_Assimilator said:

    Correction: extremely poor decisions screwed over the indie game Haunted.

    Yes, but that's not mutually exclusive with: Go's designers made extremely poor decisions that screwed the usability of the language. Both apply here.



  • @blakeyrat said:

    If your code relies on a library, MAKE A COPY OF THAT LIBRARY AND PUT IT IN YOUR PROJECT.

    Yes. I can't tell you how hard it is to get people to understand this. (I actually create a new repo so I don't have to make a copy of the library for each project I'm using it in, but same idea..)

    Maven has this problem: you're supposed to specify how to download your dependencies in the pom.xml file, but it basically puts you at the mercy of the developers. Now, for a big project like the Apache Java libs, they're probably not going to be like "eff you" and close down the repo. But who knows, maybe they get sued and a temporary injunction is issued making them close down the repo. Maybe they run out of money. Who knows? The point is, now your software stopped working because you relied on this "neat" feature to download dependencies for you.


  • Considered Harmful

    @morbiuswilters said:

    The point is, now your software stopped working because you relied on this "neat" feature to download dependencies for you.

    Eh, Go is a toy language. TRWTF is using it for something people pay for.



  • @joe.edwards said:

    This actually sounds like a neat feature if it was implemented correctly...

    Nooo!! You should always have local copies of external dependencies.

    The language this is a real pain with is C. If you're using some library and it has subtly different APIs between versions, you have to keep copies of each of these so you can cross-compile for each target platform. Or maybe the API is the same, but the ABI is ever-so-slightly different. And then your build machine has to have copies of the so libs from each target platform you are linking against. This is especially fun when you're not just doing this for different distros, but for two adjacent, minor releases of the same distro. Goooo Team FOSS!



  • @joe.edwards said:

    @blakeyrat said:
    1. A programming language should not require people to use a commercial website for a particular feature to work

    2. Even if you disagree with the above, you must agree it should not require people to use a single commercial website-- either you support all code hosting websites or none

    I was under the (mistaken?) impression that it was repo-agnostic and just took a git URL of some kind.

    @blakeyrat said:
    3. Similarly, it should support either *all* source control systems or *none*

    Virtually impossible, but maybe a plugin system would work here.

    @blakeyrat said:
    4. Repos can be deleted or made private without the main developer even noticing (since they are working off the cache)
    You've got bigger problems if one of your dependency repos vanish (probably time to find a new library anyway at that point). I consider this an acceptable risk.

    @blakeyrat said:
    5. (if you accept the tag idea:) tags can be deleted without the main developer even noticing (since they are working off the cache)

    Maybe possible, but I believe it's considered bad etiquette to purge history without a damn good reason.

    As usual, blakey's posterior has learned to speak.


  • @blakeyrat said:

    @The_Assimilator said:
    Correction: extremely poor decisions screwed over the indie game Haunted.

    Yes, but that's not mutually exclusive with: Go's designers made extremely poor decisions that screwed the usability of the language. Both apply here.

    Exactly. It's like deciding to save a few bucks and buy condoms made in Slovenia: there's plenty of fail to go around.



  • @Ben L. said:

    As usual, blakey's posterior has learned to speak.

    If we're being honest, this is worse to me, because the more VCSes they support, the more likely people are to use this awful, awful "feature". Dynamically pulling in your dependencies is one of those features that sounds so neat and elegant when you first hear it, that it's only after you've used it for a real project and had it blow up in your face that you realize what a disaster it is.



  • @morbiuswilters said:

    @Ben L. said:
    As usual, blakey's posterior has learned to speak.

    If we're being honest, this is worse to me, because the more VCSes they support, the more likely people are to use this awful, awful "feature". Dynamically pulling in your dependencies is one of those features that sounds so neat and elegant when you first hear it, that it's only after you've used it for a real project and had it blow up in your face that you realize what a disaster it is.

    Which is why you should always make a clone of the repository you're using on your server if you don't trust the person maintaining the repository. You've already said that. I don't see why you can't read your own advice. It's good advice.



  • @Ben L. said:

    As usual, blakey's posterior has learned to speak.

    AHA and as you see TFS is nowhere to be seen!

    Of course I bet the developers of Go would rather eat the contents of Roseanne's liposuction bag than use a Microsoft product even for 10 seconds, but still.


  • Considered Harmful

    We use TFS at work and I can't stand it. I'd jump at any opportunity to switch to a proper DVCS.


  • ♿ (Parody)

    @morbiuswilters said:

    The language this is a real pain with is C. If you're using some library and it has subtly different APIs between versions, you have to keep copies of each of these so you can cross-compile for each target platform. Or maybe the API is the same, but the ABI is ever-so-slightly different. And then your build machine has to have copies of the so libs from each target platform you are linking against. This is especially fun when you're not just doing this for different distros, but for two adjacent, minor releases of the same distro. Goooo Team ~~FOSS~~ MSVCRT!



  • @Ben L. said:

    Which is why you should always make a clone of the repository you're using on your server if you don't trust the person maintaining the repository. You've already said that. I don't see why you can't read your own advice. It's good advice.

    Right, but why have a feature to load from a remote repository at all? It just seems to set people up to get screwed over. This is worse than what it replaces, because by default programmers are usually going to make a copy of the library somewhere. With the "pull from repos" feature, you have to be smart enough to know you don't actually want to pull from a remote repo, but instead need to set up local repos mirroring the remote ones. Seems dodgy.

    And I don't trust any repository. I keep copies of the JVM just in case Oracle goes out of business tomorrow.



  • @blakeyrat said:

    @Ben L. said:
    As usual, blakey's posterior has learned to speak.

    AHA and as you see TFS is nowhere to be seen!

    Of course I bet the developers of Go would rather eat the contents of Roseanne's liposuction bag than use a Microsoft product even for 10 seconds, but still.

    If you really really want to use a piece of software that only has a Windows client (that I know of) to develop a cross-platform application, go right ahead. You aren't forced to use the default go tool.



  • @morbiuswilters said:

    why have a feature to load from a remote repository at all? It just seems to set people up to get screwed over.

    Never mind the risks, feel the convenience!

    Seriously though, I can see no in-principle reason why this is inherently a bad idea even though the way Go does it looks horrible. To make it work reliably you'd need to use content-addressing for the remote stuff (i.e. something akin to a Magnet link) and then automatically cache it locally. Relying on remote components identified by any scheme not inherently derived from the content will inevitably cause the kind of breakage described.

    As far as I know, git is the only available DVCS based on reliable content addressing.
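    For the sake of argument, the content-addressed half of that isn't much code: the lockfile entry is a digest rather than a URL or tag, and where the bytes came from stops mattering. A sketch (the file name and digest below are placeholders):

    [code]package main

    import (
        "crypto/sha256"
        "encoding/hex"
        "fmt"
        "io"
        "log"
        "os"
    )

    // verify checks a downloaded dependency archive against a pinned digest.
    // Whether the bytes arrived from github, a mirror, or a coworker's USB
    // stick is irrelevant: only the content matters.
    func verify(path, wantHex string) error {
        f, err := os.Open(path)
        if err != nil {
            return err
        }
        defer f.Close()

        h := sha256.New()
        if _, err := io.Copy(h, f); err != nil {
            return err
        }
        if got := hex.EncodeToString(h.Sum(nil)); got != wantHex {
            return fmt.Errorf("digest mismatch: got %s, want %s", got, wantHex)
        }
        return nil
    }

    func main() {
        // Placeholder values for illustration only.
        if err := verify("go-gl-opengl.tar.gz", "0123abcd..."); err != nil {
            log.Fatal(err)
        }
        fmt.Println("dependency matches the pinned digest")
    }[/code]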



  • @flabdablet said:

    Seriously though, I can see no in-principle reason why this is inherently a bad idea even though the way Go does it looks horrible. To make it work reliably you'd need to use content-addressing for the remote stuff (i.e. something akin to a Magnet link) and then automatically cache it locally. Relying on remote components identified by any scheme not inherently derived from the content will inevitably cause the kind of breakage described.

    The problem is that you're still relying on an external source for your project to build. What happens if that source is down? Or gone forever?

    "Oh, it's cached." Okay, so you set up a new build environment and "Damn, guess that project went extinct a year ago."

    "Just go get it from the cache." So basically we're now relying on the "Let's check each developer's workstation to see if they have a copy" backup scheme.



  • @morbiuswilters said:

    What happens if that source is down?

    If you're an individual developer, you've got a cached copy; so unless your own backup procedures truly suck, the only time you'll care about that is on the first build.

    @morbiuswilters said:

    Or gone forever?

    If you're an organization, then you'd want to be making an org-level cache of all the content-addressable stuff used on projects inside your org. If the org is really, really simple then "Let's check each developer's workstation to see if they have a copy" might in fact be adequate.

    @morbiuswilters said:

    "Damn, guess that project went extinct a year ago."

    That's a general risk you run by relying on externally generated code in any form whatsoever; building support for external library links into the language doesn't necessarily make it any worse, it just moves the point at which you find out you're screwed from build environment setup to an actual build.



  • @Ben L. said:

    @blakeyrat said:
    @Ben L. said:
    As usual, blakey's posterior has learned to speak.

    AHA and as you see TFS is nowhere to be seen!

    Of course I bet the developers of Go would rather eat the contents of Roseanne's liposuction bag than use a Microsoft product even for 10 seconds, but still.

    If you really really want to use a piece of software that only has a Windows client (that I know of) to develop a cross-platform application, go right ahead. You aren't forced to use the default go tool.

    TFS also supports Eclipse and other platforms via the command line.

    http://msdn.microsoft.com/en-us/vstudio/jj158939.aspx

    It doesn't make TFS any less shitty; it is basically SourceSafe on steroids. I really don't get along with it, but it's better than SourceSafe (I've always worked in Microsoft shops). If you've got to use it, install the TFS Power Tools.

    As for tools like NuGet, NPM, Gems, etc., you can set up your own local repository. I think they are good when you are starting a project, and they allow you to get up and running quickly; however, you should always keep your own version of the libraries.



  • @flabdablet said:

    If you're an organization, then you'd want to be making an org-level cache of all the content-addressable stuff used on projects inside your org. If the org is really, really simple then "Let's check each developer's workstation to see if they have a copy" might in fact be adequate.

    I don't think that's ever an adequate solution. Even if I was the only developer, I'd make sure there was a central repository containing every dependency my project had.

    @flabdablet said:

    That's a general risk you run by relying on externally generated code in any form whatsoever; building support for external library links into the language doesn't necessarily make it any worse, it just moves the point at which you find out you're screwed from build environment setup to an actual build.

    Look, there's a way to deal with this, and that's putting all of your dependencies into source control. Even relying on a central cache seems hacky: the very name "cache" doesn't give me a lot of confidence about its durability.



  • It sounds like the obvious way is to have a "dependencies.go" file with the list of all libraries you need to download. When you start working on the project you run "go-get-deps" or something and it downloads everything to a "dependencies" folder (or a version control system). Then you can make a backup of that folder. So it's the same thing you'd have done manually but slightly more automated.
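    The tool itself would be tiny. A toy sketch of that hypothetical "go-get-deps" (the deps.txt file and the whole tool are made up; it just shells out to the real go get for each listed path):

    [code]package main

    import (
        "bufio"
        "log"
        "os"
        "os/exec"
        "strings"
    )

    func main() {
        // One import path per line; blank lines and #-comments are skipped.
        f, err := os.Open("deps.txt")
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        scanner := bufio.NewScanner(f)
        for scanner.Scan() {
            dep := strings.TrimSpace(scanner.Text())
            if dep == "" || strings.HasPrefix(dep, "#") {
                continue
            }
            cmd := exec.Command("go", "get", dep)
            cmd.Stdout = os.Stdout
            cmd.Stderr = os.Stderr
            if err := cmd.Run(); err != nil {
                log.Fatalf("go get %s: %v", dep, err)
            }
        }
        if err := scanner.Err(); err != nil {
            log.Fatal(err)
        }
    }[/code]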

    It's worth mentioning that many languages come with their own "package managers". Python has pip (not official, but widely used), PHP has PEAR, Perl has CPAN. Go apparently tried to make that more automatic and decentralized, which seems not to be a very good idea.



  • @morbiuswilters said:

    Look, there's a way to deal with this, and that's putting all of your dependencies into source control.

    Sure. My point is that there's no inherent reason why that has to be something you do by hand while setting up a build environment, but that if you do want to have it happen automatically as part of the build process, you'd better design it around content-addressable primitives. If you don't, it will almost certainly end up failing in any of several completely predictable ways.



  • @morbiuswilters said:

    @flabdablet said:
    If you're an organization, then you'd want to be making an org-level cache of all the content-addressable stuff used on projects inside your org. If the org is really, really simple then "Let's check each developer's workstation to see if they have a copy" might in fact be adequate.

    I don't think that's ever an adequate solution. Even if I was the only developer, I'd make sure there was a central repository containing every dependency my project had.

    That's why we run a Maven proxy at my shop. Every developer pulls his artifacts through this proxy (well, unless they change their proxy/mirror settings, in which case you grab the cluebat), so it allows us to make a backup copy from just one location. I've seen software pop up to proxy Nuget as well...

    @morbiuswilters said:

    @flabdablet said:
    That's a general risk you run by relying on externally generated code in any form whatsoever; building support for external library links into the language doesn't necessarily make it any worse, it just moves the point at which you find out you're screwed from build environment setup to an actual build.

    Look, there's a way to deal with this, and that's putting all of your dependencies into source control. Even relying on a central cache seems hacky: the very name "cache" doesn't give me a lot of confidence about its durability.

    That's all fine until you have a couple of GB checked in. Binaries aren't the same as source code: they never change, they only get replaced by a file with a different name. So just make sure it's available somewhere outside of one developer's machine, and has backups, and leave source control alone. Unless you really have nothing else that is backed up, in which case I'd start panicking.



  • @spamcourt said:

    It sounds like the obvious way is to have a "dependencies.go" file with the list of all libraries you need to download. When you start working on the project you run "go-get-deps" or something and it downloads everything to a "dependencies" folder (or a version control system). Then you can make a backup of that folder. So it's the same thing you'd have done manually but slightly more automated.

    This is exactly how NPM works. I haven't used other package managers in anger, but there is nothing to stop you from putting a DLL into source control after installing it with NuGet.



  • TRWTF -- attempting to build a potentially commercial project with an experimental language.

    Also, Morbs, when I read that you keep local copies of the JVM because Oracle might go out of business tomorrow a tiny gleam of hope opened up in my heart....



  • @spamcourt said:

    It's worth mentioning that many languages come with their own "package managers". Python has pip (not official, but widely used), PHP has PEAR, Perl has CPAN. Go apparently tried to make that more automatic and decentralized, which seems not to be a very good idea.

    And C# has Nuget.

    And it's a terrible idea. And you shouldn't use it. (Except perhaps to do the initial download of the library before you add it to your project.)


  • Considered Harmful

    @Medezark said:

    Also, Morbs, when I read that you keep local copies of the JVM because Oracle might go out of business tomorrow a tiny gleam of hope opened up in my heart....

    Industry insider: "Oracle goes out of business tomorrow"; stock plummets



  • @flabdablet said:

    @morbiuswilters said:
    Look, there's a way to deal with this, and that's putting all of your dependencies into source control.

    Sure. My point is that there's no inherent reason why that has to be something you do by hand while setting up a build environment, but that if you do want to have it happen automatically as part of the build process, you'd better design it around content-addressable primitives. If you don't, it will almost certainly end up failing in any of several completely predictable ways.

    Fair enough.



  • @JBert said:

    That's why we run a Maven proxy at my shop. Every developer pulls his artifacts through this proxy (well, unless they change their proxy/mirror settings, in which case you grab the cluebat), so it allows us to make a backup copy from just one location. I've seen software pop up to proxy Nuget as well...

    And does your IT staff keep perpetual backups?

    @JBert said:

    That's all fine until you have a couple of GB checked in. Binaries isn't the same as source code: they never change, they only get replaced by a file with a different name. So just make sure it's available somewhere outside of one developer's machine, and has backups, and leave source control alone. Unless you really have nothing else that is backuped, in which case I'd start panicking.

    If you have that many dependencies, you can put them onto a backed-up network drive and be done with it. The only benefit Maven has is that it's auto-magical, which I consider riskier than just managing deps myself.



  • @Medezark said:

    Also, Morbs, when I read that you keep local copies of the JVM because Oracle might go out of business tomorrow a tiny gleam of hope opened up in my heart....

    I don't want to get stuck using OpenJDK.. shudder

    Although, I think nowadays it uses Hot Spot so it's probably somewhat less awful.



  • @morbiuswilters said:

    @JBert said:
    That's why we run a Maven proxy at my shop. Every developer pulls his artifacts through this proxy (well, unless they change their proxy/mirror settings, in which case you grab the cluebat), so it allows us to make a backup copy from just one location. I've seen software pop up to proxy Nuget as well...

    And does your IT staff keep perpetual backups?

    @JBert said:

    That's all fine until you have a couple of GB checked in. Binaries isn't the same as source code: they never change, they only get replaced by a file with a different name. So just make sure it's available somewhere outside of one developer's machine, and has backups, and leave source control alone. Unless you really have nothing else that is backuped, in which case I'd start panicking.

    If you have that many dependencies, you can put them onto a backed-up network drive and be done with it. The only benefit Maven has is it's auto-magical, which I consider riskier than just managing deps myself.

    That's the thing: the proxy automatically dumps the whole thing to a backed-up network drive to let the IT staff make a new (incremental / full) backup every night, so at least in theory I should be able to build our sources from 5-6 years ago completely cut off from the Internet. Granted, I haven't been bored enough to try.



  • @JBert said:

    That's the thing: the proxy automatically dumps the whole thing to a backed-up network drive to let the IT staff make a new (incremental / full) backup every night, so at least in theory I should be able to build our sources from 5-6 years ago completely cut off from the Internet. Granted, I haven't been bored enough to try.

    So they keep daily backups forever? If so, then the strategy seems fine for you. I will point out most places using Maven do nothing of the sort. And it still seems to me that simply downloading dependencies and sticking 'em somewhere is simpler, but to each his own.



  • @JBert said:

    That's the thing: the proxy automatically dumps the whole thing to a backed-up network drive to let the IT staff make a new (incremental / full) backup every night, so at least in theory I should be able to build our sources from 5-6 years ago completely cut off from the Internet. Granted, I haven't been bored enough to try.
     

    Has anyone else actually tried?

    This sounds like a "set it and forget it" backup strategy, usually as a result of the recovery process itself being overlooked.



  • @lettucemode said:

    A game called Haunts that raised $28,000 on Kickstarter

    . . . when the original programmer on Haunts handed off the code to the community for development

    . . . none of the new volunteers with enough time to jump into the potentially very hairy problem of figuring out what is wrong in a program they didn't write, in a language they probably don't know

    . . . the original programmer not being able to help during his transition to a new job and a new city,

    Original programmer has left for a new job.  Development is left up to "volunteers" in the "community".  So what was the $28,000 for?  Lunch?  A new car?  Purple dildos?


  • Considered Harmful

    @El_Heffe said:

    @lettucemode said:

    A game called Haunts that raised $28,000 on Kickstarter

    . . . when the original programmer on Haunts handed off the code to the community for development

    . . . none of the new volunteers with enough time to jump into the potentially very hairy problem of figuring out what is wrong in a program they didn't write, in a language they probably don't know

    . . . the original programmer not being able to help during his transition to a new job and a new city,

    Original programmer has left for a new job.  Development is left up to "volunteers" in the "community".  So what was the $28,000 for?  Lunch?  A new car?  Purple dildos?

    $28k is enough to keep a decent programmer employed for a few months, tops. It's less than peanuts compared to the budget of the average game, especially since coding is just a small part of the overall project.


  • Considered Harmful

    Though I see now (video relevant) that they were plugging it as an already-playable game that just needed some polish and bugfixes.

    It actually looks like a game I wouldn't mind playing; maybe I can donate a little time to reviving it, though I don't know what to make of the silly toy language they wrote it in.



  • @joe.edwards said:

    $28k is enough to keep a decent programmer employed for a few months, tops.

    Yeah, exactly. Although you could outsource to India and buy, like, 100 programmers for $28k. Of course, they're going to do Nagesh-quality work..



  • @El_Heffe said:

    Purple dildos?

    No! You spoke the words! Now you've summoned him! That faint blue outline....the penetrating eyes....he'll be here soon. shudder

