Posted by idk in I2P

For over a decade, I2P has relied on the venerable Monotone service to support its version-control needs, but during the past few years, most of the world has moved on to the now almost-universal Git version-control system. In that same time, the I2P Network has become faster and more reliable, and accessible workarounds to Git's non-resumability have been developed.

Today marks a significant occasion for I2P: we switched off the old mtn i2p.i2p branch and officially moved development of the core Java I2P libraries from Monotone to Git.

While our use of mtn has been questioned in the past, and it hasn't always been a popular choice, I'd like to take this moment, as perhaps the very last project to use Monotone, to thank the Monotone developers, current and former, wherever they are, for the software they created.


Comments


Google wrote

In that same time, the I2P Network has become faster and more reliable, and accessible workarounds to Git's non-resumability have been developed.

What's that?


idk OP wrote

The reason we stuck with mtn for so long is that mtn checkouts are always resumable. You can get partial ones and pick up right where you left off. It was an incredibly important feature, and one that Git still does not implement for git clone. Doing a successful checkout over the old crypto required cloning at --depth 1, and even then it could take multiple attempts. Things are much better now, but it's still slower and less convenient to check out a ~389MB repository over git than it was with mtn. But git is resumable once you have at least a shallow clone, and moreover git can produce a git bundle, which is a file that works exactly like a git repository for the purposes of cloning.

So what you do to turn the crisis (git non-resumability) into an opportunity (redundant copies of the whole damn repository history everywhere) is start generating git bundles of i2p.i2p at regular intervals (I recommend the tenth of the month) and distributing them with bittorrent-over-I2P. That way, you can download a near-complete copy of the repository from many peers and spend a minute or two --unshallowing the result, rather than cloning at --depth 1 and repeatedly --unshallowing until you have a complete repository.
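Concretely, the consumer side would look something like the sketch below: clone from the bundle you downloaded over the torrent (which already carries the full history up to the day it was made), then point origin at the real remote and fetch the small difference. The bundle name, directory, and remote URL here are placeholders, not our actual setup:

    #!/usr/bin/env python3
    # Sketch: clone from a locally downloaded git bundle, then point origin at
    # the real in-network remote and fetch only what landed since the bundle
    # was generated. All paths and URLs below are hypothetical placeholders.
    import subprocess

    BUNDLE = "i2p.i2p.bundle"                        # file downloaded via bittorrent-over-I2P
    DEST = "i2p.i2p"                                 # where the working clone should go
    UPSTREAM = "http://git.example.i2p/i2p.i2p.git"  # placeholder for the real remote

    def run(*args):
        subprocess.run(list(args), check=True)

    def clone_from_bundle():
        # A bundle acts like a read-only repository for cloning purposes.
        run("git", "clone", BUNDLE, DEST)
        # Swap the bundle out for the real remote and top up the history;
        # this is the "minute or two" of fetching instead of a full clone.
        run("git", "-C", DEST, "remote", "set-url", "origin", UPSTREAM)
        run("git", "-C", DEST, "fetch", "origin", "--tags")
        # From here you merge or rebase onto origin's branch as usual.

    if __name__ == "__main__":
        clone_from_bundle()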

There are actually even cooler things you can do if you apply some git-transport magic and a way of providing a single memorable alias to a series of infohashes. There are some systems that do this already; it's just a matter of porting their dependencies into the I2P network or replacing them.
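To sketch what I mean by git-transport magic: git will run any executable named git-remote-<scheme> it finds on PATH when it sees a <scheme>:// URL, and that helper can hand git a connection to a repository it obtained however it likes. Something like the following, where the i2ptorrent:// scheme, the alias file, and the assumption that a torrent layer keeps a local mirror up to date are all invented for illustration:

    #!/usr/bin/env python3
    # Hypothetical git remote helper sketch. Installed on PATH as
    # git-remote-i2ptorrent, it would let `git clone i2ptorrent://i2p.i2p`
    # read from a local mirror that some torrent layer is assumed to maintain.
    # The scheme, alias file, and mirror path are all invented for this example.
    import json
    import os
    import pathlib
    import sys

    # Assumed alias map, e.g. {"i2p.i2p": "/var/i2ptorrent/mirrors/i2p.i2p.git"}
    ALIASES = pathlib.Path.home() / ".i2ptorrent" / "aliases.json"

    def resolve(url):
        name = url.split("://", 1)[1].strip("/")
        with open(ALIASES) as f:
            return json.load(f)[name]

    def main():
        # git invokes the helper as: git-remote-i2ptorrent <remote> <url>
        url = sys.argv[2] if len(sys.argv) > 2 else sys.argv[1]
        repo = resolve(url)
        for line in sys.stdin:
            command = line.strip()
            if command == "capabilities":
                # Advertise only "connect": we give git a raw channel to
                # git-upload-pack running against the local mirror.
                sys.stdout.write("connect\n\n")
                sys.stdout.flush()
            elif command == "connect git-upload-pack":
                sys.stdout.write("\n")  # blank line: connection established
                sys.stdout.flush()
                # Replace this process with upload-pack; git now speaks its
                # normal fetch protocol over our stdin/stdout.
                os.execvp("git", ["git", "upload-pack", repo])
            elif command == "":
                return
            else:
                sys.exit("unsupported command: " + command)

    if __name__ == "__main__":
        main()

Resolving the alias to the right infohashes in a decentralized way is of course the hard part.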


idk OP wrote

I guess since somebody has to generate and seed the bundle, and since for the time being this is a scheduled rather than automatic task, yes, this is not a purely client-side solution yet. In a realistic future where we are able to do a similar thing, but with a git-transport that talks to the torrent client directly rather than by downloading a periodically generated bundle manually, the person seeding the periodic bundle could become much less important, assuming that most of the people seeding i2p.i2p through this hypothetical gittorrent-like system are updating to the latest code frequently enough that they're usually seeding pretty much the latest version. Swarm Merging would also be a huge help here, I think.

Then the only (hypothetically) centralized point you would have left is whatever you use to provide the human-readable alias for fetching the latest version of the corresponding (i2p.i2p) torrent. This is the part I don't quite get yet. I guess what other people (the gittorrent folks) have done is use a blockchain to distribute a list of infohashes associated with a given name that's registered by performing some kind of transaction? Not sure. I haven't had time to read up on that yet, and every time somebody says "applied blockchain" people groan and say "are you sure?" I'm no different.
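To give an idea of how small the scheduled part is, something like the sketch below could run from cron on the tenth of the month and drop a fresh bundle wherever the torrent client seeds from. The paths are made up, not a description of our actual setup:

    #!/usr/bin/env python3
    # Sketch of the scheduled bundle job: refresh a local clone of i2p.i2p,
    # pack its full history into a dated bundle, and verify the result before
    # handing it to the torrent client. Both directories are placeholders.
    import datetime
    import pathlib
    import subprocess

    REPO = pathlib.Path.home() / "src" / "i2p.i2p"   # assumed up-to-date local clone
    SEED_DIR = pathlib.Path("/var/torrents/seed")    # assumed torrent seeding directory

    def make_bundle():
        # Bring the local clone up to date first.
        subprocess.run(["git", "-C", str(REPO), "fetch", "--all", "--tags"], check=True)
        stamp = datetime.date.today().strftime("%Y-%m-%d")
        bundle = SEED_DIR / ("i2p.i2p-" + stamp + ".bundle")
        # Include HEAD plus all branches and tags so a plain clone of the
        # bundle checks something out.
        subprocess.run(["git", "-C", str(REPO), "bundle", "create", str(bundle),
                        "HEAD", "--all"], check=True)
        # Make sure the bundle is self-contained before seeding it.
        subprocess.run(["git", "-C", str(REPO), "bundle", "verify", str(bundle)], check=True)
        return bundle

    if __name__ == "__main__":
        print(make_bundle())

Making a torrent out of the bundle and seeding it over I2P is left to whatever bittorrent client you already run.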
