Storage provisioning: Do you really really really need that much?

By Renegade on Wednesday 9 September 2009 09:46 - Comments (6)
Categories: EMC, General, Storage, Views: 3.496

I received a link to an article with an interview with Symantec's Mathew Lodge on their view of data deduplication. I couldn't help noticing the following quote:
According to a recent survey by Applied Research, more than half of all organizations expect to spend more on storage in 2009 than they did in 2008. But at the same time, the latest Symantec State of the Data Center Report indicates that storage utilisation hovers at just 50%.
Now, that got me thinking about a couple of things. First off, I tried to look up this survey. Unfortunately, the results from Applied Research appear to be beyond my Google skills. On the other hand, they seem to be the standard company Symantec uses for surveys, which somehow always produce results aligned with Symantec's product portfolio. Talk about a coincidence!
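Taking that 50% utilisation figure at face value, a quick back-of-the-envelope calculation shows why it stings. The price below is a made-up illustration, not survey data:

```python
# If only half of your provisioned capacity holds actual data, every
# terabyte you really use effectively costs you double the raw price.
# price_per_tb is a hypothetical number for illustration only.

price_per_tb = 1000.0   # assumed cost per provisioned TB
utilisation = 0.50      # fraction of provisioned capacity actually used

effective_price_per_used_tb = price_per_tb / utilisation
print(effective_price_per_used_tb)  # -> 2000.0 per TB of real data
```

Whatever the real per-terabyte price is, at 50% utilisation you are paying it twice for every terabyte that matters.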

Anyway, since they said that "more than half of all organizations expect to spend more on storage in 2009 than they did in 2008", I was pondering how this could be. We are seeing technologies like the deduplication mentioned in the article, and almost all vendors can offer something similar. The same can be said about thin or virtual provisioning. Heck, thanks to the effort in the blogosphere and feedback from partners and customers, EMC even decided to change its policy and make virtual provisioning free for the V-Max, DMX4 and DMX3.
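For readers who haven't run into it yet, the idea behind thin (virtual) provisioning can be sketched in a few lines. This is a toy model with made-up names, not any vendor's API: the LUN advertises its full virtual size to the host, but physical capacity is only drawn from a shared pool when data is actually written.

```python
# Toy sketch of thin provisioning: promise big, consume small.

class StoragePool:
    """Shared physical capacity behind many thin LUNs."""
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb
        self.consumed_gb = 0

    def allocate(self, gb):
        # Physical space is only taken here, at write time.
        if self.consumed_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: time to buy real disks")
        self.consumed_gb += gb

class ThinLUN:
    def __init__(self, pool, virtual_gb):
        self.pool = pool
        self.virtual_gb = virtual_gb   # what the host sees
        self.written_gb = 0            # what is physically backed

    def write(self, gb):
        self.pool.allocate(gb)
        self.written_gb += gb

pool = StoragePool(physical_gb=1000)
lun = ThinLUN(pool, virtual_gb=2000)   # promise 2 TB against a 1 TB pool
lun.write(150)                         # the user actually writes 150 GB
print(lun.virtual_gb, pool.consumed_gb)  # host sees 2000, pool gave up 150
```

The catch, of course, is that if everyone actually writes what they were promised, the pool runs dry, which is exactly why knowing what users really need still matters.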

It seems a bit odd that almost all storage vendors are delivering methods to reduce the disk space footprint in their SAN and NAS offerings, yet we still see an increase in expenditure. Sure enough, the licensing costs for such new features have to be factored in, and perhaps you even need to buy new hardware to fully utilize them. But all of the big vendors are quick to tell us about the return on investment when we purchase new stuff. So that can't be it, right?

And you know what? They are right!

Simple enough: we don't know how much disk space our users need! Hell, most of the time the users themselves don't even know! And then there's the fact that it's far too easy to get new storage.

We provision like there's no tomorrow. Not just disk space, but also computational power. You need to test something? Here, have a VM and go right ahead. What? You're on Solaris? No problem, here's a sparkling new zone, just for you. How much disk space do you need? Two terabytes? No wonder we called it "tera"; those are monstrous amounts of disk space.

I know the dilemma: when you ask your users if they really need all of that, you usually get a blank look, outrage - how dare you ask me that, isn't it obvious? - or perhaps an educated guess. Some will even give you forecasts... if you are lucky.

Things will get better with technologies like thin provisioning and dedupe, and things will get worse as we move to new technologies like cloud. But the fact of the matter is that we have made provisioning too easy, and we've somehow lost the art of asking whether they really, really, really need it. Usually, the honest answer to a provisioning request is a simple "no".