Well, just like sand won't pile up on its own and will happily slide downhill, without some kind of intervention what you describe is the more likely outcome. I think code generally goes through slow, long phases of reusability/modularity degradation, and is then improved/restored in relatively short, intense phases of refactoring. Often the latter can only be driven by those who are willing to do more than the absolute minimum for what is immediately needed.
I was thinking about it more in evolutionary terms: consider components to be memes. Which are going to survive? Those that are so nice that people really do want to use them. Those that are too hairy to get rid of once you are contaminated. Etc.
What mechanisms can be used to control the evil components? Can we turn to artificial breeding instead of natural selection? Are there any lessons to be learned from population genetics? Epidemiology should provide some hints for controlling the spread. Etc.
Well, the problem seems to be that there are two different times at which we pay two different costs:
a) Writing the component.
b) Changing the component.
An opportunistic agent will prefer the short-sighted path of minimal cost, taking into account only cost (a) and never cost (b).
The problem is non-existent when different components are required by different people. The Apache Lucene project is a very good example of this (many different languages and needs).
In general, from previous experience with different projects, we have learned that changing components is important. Thus, rules are introduced to make components reusable.
The best solution, in my opinion, is to create a rule that every component must be replaceable as soon as it is created. In other words, there should be at least two implementations of each component, and the programmer of the software has to pay both cost (a) and cost (b) at the same time.
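To make that concrete, here is a minimal sketch in Java of what the rule could look like (all the names here, Storage, MapStorage, FileStorage, are hypothetical): the second implementation is written alongside the first, so the boundary is proven replaceable on day one.

    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.HashMap;
    import java.util.Map;

    // The boundary: callers only ever see this interface.
    interface Storage {
        void put(String key, String value);
        String get(String key);
    }

    // First implementation: what the application needs today (cost a).
    class MapStorage implements Storage {
        private final Map<String, String> data = new HashMap<>();
        public void put(String key, String value) { data.put(key, value); }
        public String get(String key) { return data.get(key); }
    }

    // Second implementation, written at the same time (cost b paid up front),
    // proving that nothing outside the interface leaks through the boundary.
    class FileStorage implements Storage {
        private final Path dir;
        FileStorage(Path dir) { this.dir = dir; }
        public void put(String key, String value) {
            try {
                Files.writeString(dir.resolve(key), value);
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
        public String get(String key) {
            try {
                return Files.readString(dir.resolve(key));
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
    }

    class Demo {
        public static void main(String[] args) throws IOException {
            // Either implementation can be swapped in without touching callers.
            Storage storage = args.length > 0
                    ? new FileStorage(Files.createTempDirectory("demo"))
                    : new MapStorage();
            storage.put("greeting", "hello");
            System.out.println(storage.get("greeting"));
        }
    }

Swapping one implementation for the other is then a one-line change in the wiring code that never touches the callers.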
Using XML documents to describe the software, and then generating it from the components chosen in the XML document, would be a very good way to enforce such a rule.
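As a rough sketch of that XML-driven enforcement (the descriptor format, file name, and class names are made up for illustration), a loader could read the document and instantiate whichever implementation it names:

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    // Reads a descriptor like:
    //   <components>
    //     <component interface="Storage" implementation="com.example.MapStorage"/>
    //   </components>
    // and instantiates the chosen implementation reflectively, so swapping a
    // component is an edit to the XML rather than to the calling code.
    class ComponentLoader {
        static Object load(File descriptor, String interfaceName) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(descriptor);
            NodeList nodes = doc.getElementsByTagName("component");
            for (int i = 0; i < nodes.getLength(); i++) {
                Element e = (Element) nodes.item(i);
                if (interfaceName.equals(e.getAttribute("interface"))) {
                    return Class.forName(e.getAttribute("implementation"))
                            .getDeclaredConstructor()
                            .newInstance();
                }
            }
            throw new IllegalArgumentException("No binding for " + interfaceName);
        }
    }

This is essentially what the XML-configured dependency-injection containers did: the choice of component lives in the descriptor, not in the calling code.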
The ecological destruction of the environment could be addressed in a similar way.
He explains that you achieve simplicity by pulling components apart conceptually.
If you google "Simple Made Easy", you can find his phenomenal talk on simplicity at InfoQ.
It's worth your time if you want simplicity and thus reusability.
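As a tiny, made-up illustration of the "pulling apart" point: the first version below complects computation with presentation, the second separates them so the computation is reusable on its own.

    import java.util.List;

    // Complected: computing the average and printing it are tangled together,
    // so callers who only want the number are forced to take the printing too.
    class ReportComplected {
        static void report(List<Integer> values) {
            int sum = 0;
            for (int v : values) sum += v;
            System.out.println("average = " + (double) sum / values.size());
        }
    }

    // Pulled apart: the computation stands alone and can be reused anywhere;
    // printing is a separate, equally replaceable concern.
    class ReportSeparated {
        static double average(List<Integer> values) {
            int sum = 0;
            for (int v : values) sum += v;
            return (double) sum / values.size();
        }

        static void print(double average) {
            System.out.println("average = " + average);
        }
    }

    class ReportDemo {
        public static void main(String[] args) {
            List<Integer> values = List.of(1, 2, 3, 4);
            ReportComplected.report(values);
            ReportSeparated.print(ReportSeparated.average(values));
        }
    }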
I believe you have a logical error. It's only easy to replace one nice component with another nice one, not with a hairy one. So replacing components is not how projects rot. In fact, replacing components is a form of refactoring, and thus code maintenance.
Hairy components grow naturally in place; it's not like you manufacture one and then plant it in a project.
Interesting view. This sometimes causes another parallel effect on the team, which accelerates the problem: as the codebase gets worse, the best developers tend to leave and be replaced with worse (and harder to replace!) developers.
Making a component easily separable from the rest of the application also makes the component easier to maintain, reason about, test, etc. In other words, it increases its quality in a number of significant areas. The more a software project is composed of such components, the more useful, robust, and scalable it will tend to be. So the selective pressure that favors usefulness will give such components an edge.
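For example (hypothetical names), a component that depends only on a small interface can be tested in complete isolation, with a one-line stub standing in for the rest of the application:

    // The only thing Greeter knows about its surroundings.
    interface Clock {
        int hourOfDay();
    }

    class Greeter {
        private final Clock clock;
        Greeter(Clock clock) { this.clock = clock; }
        String greeting() {
            return clock.hourOfDay() < 12 ? "Good morning" : "Good afternoon";
        }
    }

    class GreeterTest {
        public static void main(String[] args) {
            // No application, no real clock: a lambda stub is enough.
            Greeter morning = new Greeter(() -> 9);
            if (!"Good morning".equals(morning.greeting())) {
                throw new AssertionError("expected a morning greeting");
            }
            System.out.println("ok");
        }
    }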
I was not arguing that the resulting application will be extremely nice or useful. It very likely won't be. However, the component doesn't care, or doesn't care too much. It's like a parasite: don't be evil to the point of killing the host, but short of that, do suck it dry.
Parasites that help the host be more successful would outcompete the ones that make their hosts less successful, wouldn't they?
We call those symbionts, and yes, it's a viable survival strategy.
Then there are true parasites that make the host less fit. For example, the first thing a lot of them do is castrate the host, thus bringing its fitness to zero. That's a strategy that works fine as well.