
For repos to be independent, they need to have a defined interface, be able to test themselves against that interface as well as the expected behavior, and specify interdependencies with versioning (semver helps reduce effort on this point).
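As a minimal sketch of how semver reduces that effort, here is how a compatible-release pin behaves, using Python's `packaging` library (the package name and versions are invented for illustration):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Compatible-release pin: any 1.x release at or above 1.4 satisfies it,
# but a new major version (a signalled breaking change) does not.
spec = SpecifierSet("~=1.4")

print(Version("1.4.2") in spec)  # True: patch release, compatible
print(Version("1.9.0") in spec)  # True: minor release, additive
print(Version("2.0.0") in spec)  # False: major bump signals a breaking change
```

The point is that the dependency declares, via the version number alone, whether a dependent can take the update without reading the changelog.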

You can run integration tests in the dependent project across the interface of the dependency project, but tests in the dependency project should find 99% of breaking changes in the dependency.
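A rough sketch of what such a contract test looks like, living in the dependency's own suite so breaking changes surface before any consumer upgrades (the function and keys are hypothetical):

```python
def lookup_user(user_id: int) -> dict:
    """The dependency's public interface: returns a record with fixed keys."""
    return {"id": user_id, "name": "unknown", "active": True}

def test_lookup_user_contract():
    record = lookup_user(42)
    # The contract: these keys and types are what dependents may rely on.
    assert record["id"] == 42
    assert isinstance(record["name"], str)
    assert isinstance(record["active"], bool)

test_lookup_user_contract()
```

If a change to `lookup_user` drops a key or changes a type, this test fails inside the dependency repo itself, with no dependent project involved.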

So you can do whatever you want in any project, then test those changes. If your tests point out that you've broken your contract with other projects, you decide either to solve that issue before releasing the change or to cut a new major revision. Behavior that isn't covered by tests should basically be considered undefined; teams of dependent projects should contribute tests for things they want to rely on.

Either way the other project doesn't break -- it either updates to the new compatible version or stays behind on the last compatible release until it's been updated for the incompatible change. You should be confident enough to have a robot automatically test each project against updated dependency versions and commit the bump when the tests pass.
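A sketch of such a robot, assuming a `requirements.txt`-style pin file, pytest, and git (the file layout, commands, and names are assumptions, not a real tool):

```python
import subprocess

def bump_pin(lines: list, requirement: str, new_version: str) -> list:
    """Rewrite one requirement's compatible-release pin; leave the rest alone."""
    return [f"{requirement}~={new_version}"
            if line.split("~=")[0] == requirement else line
            for line in lines]

def bump_and_test(requirement: str, new_version: str) -> bool:
    with open("requirements.txt") as f:
        lines = f.read().splitlines()
    with open("requirements.txt", "w") as f:
        f.write("\n".join(bump_pin(lines, requirement, new_version)) + "\n")
    # The dependent's own test suite is the gate: green means commit the bump.
    if subprocess.run(["python", "-m", "pytest"]).returncode == 0:
        subprocess.run(["git", "commit", "-am",
                        f"Bump {requirement} to ~={new_version}"])
        return True
    return False
```

Dependabot and Renovate are real-world versions of this loop; the sketch just shows why it's safe when the contract is tested.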

Of course, it's not always possible or desirable to have this degree of engineering around projects, but then the reality is that these aren't independent codebases, they are a single interdependent project with a single history and should be versioned/tested/released as such.



I understand your logic in theory, but not in practice. I maintain some 20 large applications that all use a shared data access layer. That DAL is safely cordoned off behind an interface layer, but I still have the same problem as the parent poster. If you make enough changes to any one of the subscribing applications, you will need to update the DAL. If you update the DAL, you will need to make updates in the 19 other applications and run the tests for all of them, if only to ensure they compile.

I could break the DAL further apart into 20 different pieces, one per application, but there is so much shared data access functionality between the applications, that it doesn't make sense.


There's no "logic in theory" and "logic in practice"; there's just logic, and people failing to apply it in practice.

If you can't come up with a small, stable interface between the 20 applications and the shared component, then you don't have 20 applications plus a shared component; you have one big app that is painfully maintained as separate repos across a large, arbitrary cross-sectional boundary.
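For what "small, stable interface" can mean concretely, here is one hedged sketch using a structural `typing.Protocol` -- all names are invented; the real DAL would satisfy the same shape:

```python
from typing import Optional, Protocol

class UserStore(Protocol):
    """The *entire* surface the applications may touch."""
    def get_user(self, user_id: int) -> Optional[dict]: ...
    def save_user(self, record: dict) -> None: ...

class InMemoryUserStore:
    """A trivial implementation; the real DAL would satisfy the same Protocol."""
    def __init__(self) -> None:
        self._rows: dict = {}
    def get_user(self, user_id: int) -> Optional[dict]:
        return self._rows.get(user_id)
    def save_user(self, record: dict) -> None:
        self._rows[record["id"]] = record

def greet(store: UserStore, user_id: int) -> str:
    # Application code depends only on the Protocol, never on DAL internals.
    user = store.get_user(user_id)
    return f"hello, {user['name']}" if user else "who?"
```

If the 20 applications can each be written against a boundary this narrow, the DAL can change freely behind it; if they can't, that's evidence the boundary is in the wrong place.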

If the history of the project is so interdependent as to be singular (everything is changed in lockstep across repos, effectively a single history) why not have a monorepo? If versioning them under the pretense that they are independent modules costs you so much effort, why do you labor to do so?



