Most of the really successful open-source projects have been tools and platforms, not end-user applications; infrastructure, in other words.
Fifteen years ago, yes, but not any more. As an example, an Ubuntu distribution comes with a pretty complete set of end-user applications. They are not necessarily the best in their class, but they are sufficient for the needs of a LOT of people.
having that infrastructure enables more programming jobs than it prevents.
This is not self-evident to me. I can see arguments on both sides and don’t know which way the balance tilts.
Note that if we are looking at this purely from the selfish guild/union interests of programmers, the proliferation of open source is a bad thing as it demolishes the barriers to entry and increases the supply of labor.
Generally speaking, I see “displacing jobs” as a good thing. People have been trained by the media to think of jobs as benefits. They are not—they are costs. Costs of producing what we really want: value. If we can produce the same value with fewer jobs, that’s excellent—that’s what’s known as increasing the productivity of labor, and it’s a very good thing.
I acknowledge that it exists, but I would not describe Ubuntu’s out-of-box application suite as “really successful”; I’m talking more about the likes of Apache, gcc, OpenSSL, Linux as a server operating system. Firefox is about the only open-source end-user app that I can think of with that kind of success, and the Mozilla Foundation isn’t exactly running on a traditional OSS contribution model.
Leaving the definition of success aside, on Windows I can think of three areas where free software is noticeably inferior: MS Office, Photoshop, and games. But otherwise the market for small programs, utilities, etc. that existed fifteen years ago is essentially dead now. Partly that’s because so much moved into the browser, but partly it’s because good free alternatives exist (look at media players, for example).