I'd cynically add for the OP that the proliferation of languages is also due in part to companies attempting to establish market share, and to developer types seeking to perpetuate their kind by creating yet another language or API. Many modern languages are easier to componentize, and APIs exist today (OpenGL, DirectX) that didn't exist 20 years ago. Manageability of humongous amounts of code (hundreds of thousands of lines) is better. But for many applications, programming languages and platforms haven't made the task at hand one iota easier... and in fact have often made life much harder.
A good example would be a web-based data-entry form (who cares what it's written in) that is often far less efficient than your standard 5250 green-screen terminal data-entry program at the program's intended purpose: letting data-entry personnel get information into the system as quickly as possible.
Or maybe it's how so many modern IDEs take 5, 10, 20, 30, even 45 seconds to start up on my Raptor-fueled Core 2, when good ol' text-mode Microsoft Professional Basic or Professional C editors started up in a second on machines with maybe 1% of the compute power of modern systems, and still featured full syntax highlighting, code lookups, rich help systems, etc.
Or maybe it's the fact that abstraction has given us programming languages that let you accomplish a task in any of a dozen ways, leaving new programmers genuinely confused about which way they're supposed to use to get a job done.
And every couple of years, another group arrives with another paradigm (XP, agile, pair programming, OOP, KISS, write-once-run-anywhere, waterfall, blah blah blah), advocating a reinvention of how people do their jobs in the name of some allegedly improved process.
This has been going on since the early '80s (maybe longer, I'm not THAT old!) and shows no sign of abating. Watching the flow of things over the years has me reflecting on how programming regularly changes, but doesn't always get better.