Slashdot had an article in January 2008 titled "Professors Slam Java As 'Damaging' To Students". In case the article is gone, it went something like this:
From my point of view, this represents a microcosm of the changes that are beginning to convulse computer science. We have the "old style" – very systems-oriented and emphasizing low-level programming, efficiency over all other criteria, connections with hardware, and details of how programming languages are implemented.
Contrast that with the "new style" – more abstraction; emphasis on a spectrum of criteria including reliability, maintenance, usability, security; connections with other software components and people; and software development that is independent of programming language.
Which is more important? Both, of course!
For every engineer that industry needs with an "old style" education, it needs 100 engineers with a "new style" education. So it's easy to argue that a "new style" education is more important.
On the other hand, if we give somebody a "new style" education, it's very hard for that person to pick up the system-level "old style" skills later. But if we teach somebody in an "old style" manner, that person can learn the relevant "new style" skills on the job. So it's easy to argue that the "old style" education is more important. (Unless you're a program manager, in which case you complain to the universities that we aren't preparing people for actual work!)
The obvious compromise is that we need both. Just as Physics split into Physics and Engineering a century ago, Computing will split into a small, very technical, and very hard field, and a much larger applied field (or several fields).
Pinning this macro change in computing on a single language, Java, is naive and narrow. By the way, very similar arguments were used to disparage C in the 1980s and Fortran in the 1960s.
– Jeff Offutt, January 2008