It's just not true!
Ok, now before you go and start flaming and trolling me with everything you have, I should state that I am (among other things) a Linux administrator. We have around 4,000 systems running various Linux distributions and versions. Some are very big, some quite small, but all of them run into problems when you put them into large environments.
Linux is great when it comes to doing simple things reliably. Run a relatively simple webserver, a mail server, or other plain and simple workloads and you will have a great time with Linux: you will usually get good uptimes and just enjoy the experience.
Then you will start mulling and thinking, "Hey, we could do this for other purposes". And that's where the trouble usually starts. You probably need some sort of support agreement with the distributor. You are no longer that flexible, because once you want support you have to keep in mind everything that could potentially throw you out of the support structures: tainted kernel modules, HCLs and the like.
Then try explaining that you need expert knowledge in large environments. There is a large community out there, but most of the problems you encounter are pretty unique, starting with things like kernel modules that are shared by every hardware interface that uses them. Want an example? Try adding new LUNs from a SAN online. If your HBA finds them but the SCSI stack of the OS won't detect them, you have two options: reload some modules or just reboot the server. Neither option will look good when you have to explain it to the customer.
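To be fair, on reasonably modern kernels there is a middle ground between reloading modules and rebooting: you can ask the SCSI mid-layer to rescan each host adapter through sysfs. Here is a minimal sketch of that, assuming the usual /sys/class/scsi_host/hostN/scan layout; the SYSFS_ROOT override and the rescan_all_hosts name are my own additions so the script can be exercised safely outside a real system:

```shell
#!/bin/sh
# Sketch: trigger an online SCSI rescan instead of reloading modules
# or rebooting. Needs root on a real system. SYSFS_ROOT is an
# assumption added here so the logic can be tested against a fake tree.
SYSFS_ROOT="${SYSFS_ROOT:-/sys}"

rescan_all_hosts() {
    for host in "$SYSFS_ROOT"/class/scsi_host/host*; do
        [ -e "$host/scan" ] || continue
        # "- - -" is the wildcard triple (channel, target, LUN):
        # it asks the mid-layer to probe everything behind this HBA.
        echo "- - -" > "$host/scan"
        echo "rescanned ${host##*/}"
    done
}

# Usage (as root): rescan_all_hosts
```

Whether the new LUNs actually show up afterwards still depends on the HBA driver cooperating, which is exactly the lottery described above.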
I like Linux. I actually like it a lot, and I enjoy the challenges that come with administering the servers. And I can tell you that each operating system has its own problems in various areas; none of them is perfect.
But... some are better suited than others for large environments where uptime is critical. Linux is not one of them, in my opinion, but I would love to hear your comments on my ideas and any other opinions (that are actually based on something).