By Michelle Maisto, Geeknet Contributing Editor on August 1st, 2012
“Oh, cloud!” were the first words Microsoft CEO Steve Ballmer said during his keynote address at the Microsoft Worldwide Partner Conference—in 2010.
In early July 2012, Microsoft held its 2012 WPC and the cloud was again a major focus. But Ballmer’s 2010 opening remarks could well have stood in for this year’s (more collaborative and arguably less inspirational) keynote.
Ballmer followed his 2010 exclamation by saying he’d been talking about the cloud for four years and was excited to be talking about it during “a year in which I think it’s been clear that the opportunity and the transition to the cloud for enterprise and business customers … is absolutely clear.”
He went on to say that the move to the cloud can seem scary, since it “makes us reinvent our business models, yours and ours. But it’s a change that’s inevitable.”
Two years later, that comment can seem overly optimistic, or prescient. I asked four analysts what they thought about the following question: Today, is the move to the cloud “inevitable” for businesses of any size to remain competitive?
“I believe we’ve already seen that ‘a’ move to the cloud is inevitable, as it’s already occurred to a significant extent among both SMBs and larger enterprises. But is it ‘the’ move, or the end game? That depends,” says Christian Perry, a senior analyst with Technology Business Research (TBR).
“It’s more sensible to consider the impact of the cloud concept within traditional IT infrastructures,” Perry explains. “From a cost perspective, customers are finding that it’s good business sense to at least consider offloading archived or otherwise cold data to cloud storage. Businesses with widespread agility requirements also are increasingly considering cloud-based applications, but the cloud remains primarily an adjunct to traditional, brick-and-mortar efforts.”
Perry adds that the cloud concept’s impact is also moving into the datacenter, where private clouds deliver the same benefits with greater security and control.
“If you’re asking if ‘the’ move to the cloud is inevitable,” states Perry, “the answer is yes if you’re lumping private clouds into the discussion—and we do.”
Charles King, principal analyst with Pund-IT, says that if Ballmer was referring to highly virtualized, increasingly automated IT infrastructure processes and management, then he agrees.
“I believe IT is heading toward (or has already arrived at) a point where traditional approaches are untenable,” says King. “The state of data storage is a good example. Just a decade ago, the amount of information a single storage admin could effectively manage was pegged at a few hundred gigabytes. Today, that number is counted in terabytes, but how can enterprises whose storage infrastructures are growing toward/beyond the petabyte range afford that many personnel?”
King goes on to say, “You could argue that companies that refuse to seriously consider cloud will eventually reach a point where the burden – in terms of cost and inefficiencies – of supporting IT infrastructures will increasingly outweigh the inherent benefits those technologies are meant to provide.”
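King’s staffing argument is easy to sketch as back-of-envelope arithmetic. The per-admin figures below are hypothetical placeholders (the article gives only orders of magnitude), but they show how linearly headcount scales with capacity under the traditional approach:

```python
import math

# Illustrative only: the per-admin capacities are assumptions, not figures
# from the article. If each admin can manage a fixed amount of storage,
# required headcount grows in lockstep with total capacity.
def admins_needed(total_tb, tb_per_admin):
    """Admins required if each can effectively manage tb_per_admin terabytes."""
    return math.ceil(total_tb / tb_per_admin)

PETABYTE_TB = 1024  # one petabyte expressed in terabytes

# A decade ago: a few hundred gigabytes per admin (say 0.3 TB).
# Today: tens of terabytes per admin (say 50 TB).
print(admins_needed(PETABYTE_TB, 0.3))  # thousands of admins at decade-old productivity
print(admins_needed(PETABYTE_TB, 50))   # a few dozen at today's productivity
```

Even at today’s per-admin productivity, staffing still grows with capacity, which is exactly the burden King argues highly automated, cloud-style infrastructure is meant to break.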
Roger Kay, founder of Endpoint Technologies, still sees some road ahead.
“The problem with cloud is that it relies on fast, ubiquitous, reliable, and secure communications, which we don’t have today, particularly in the United States,” Kay asserts. “Some European countries, Japan, and Korea have much better broadband infrastructure than we do. But 4G is coming, and that will improve things here.”
On the plus side, the cloud offers flexibility, simplified management, portability on the client side, security, and arguably cost benefits, Kay explains. On the minus side are latency, availability, and productivity issues.
“If comms were perfect, cloud would be unarguably the way to go,” Kay states. “But they’re not, yet.”
Ken Hyers, also a TBR senior analyst, points out that enterprises can be cautious about moving critical core functions to the cloud due to the perceived loss of control.
“Questions about data protection and vulnerability are also of concern,” says Hyers. “A real-world example came a couple of weeks ago, when heavy storms along the Eastern Seaboard took out Amazon’s cloud due to power outages, directly impacting the ability of several companies, including Netflix, to deliver their services.”
However, Hyers adds, “I think that the cost savings of cloud computing, storage and services, and the increased flexibility and scalability that the cloud provides from a services delivery standpoint makes it increasingly appealing to all but the most risk-averse companies. So they’ll continue to move more functions, starting with non-core and non-critical ones, and move towards more complex ones as they grow comfortable with the cloud.”
What are your thoughts? Is 2012 the year your company makes the move, even partially, to the cloud? Let me know in the comments section below.
Michelle Maisto is a Geeknet contributing editor who has been covering mobility and enterprise technology for more than a decade.
By Bob Hunt, Senior IT Pro Evangelist, Microsoft on July 11th, 2012
As an IT professional begins to peel back the layers of the private cloud and understand all of the parts of the IT infrastructure that it can automate and impact, a rational conclusion would be that the private cloud has the ability to be the new center of the IT universe. If you’re relatively new to the private cloud architecture (yes, the private cloud truly is an IT architecture), I can understand that this may seem like an optimistic viewpoint, but when you step through the complete set of capabilities, you may be pleasantly surprised.
Reach across multiple platforms
To make my case, we need some background information. A great place to start is a video I posted a while back titled “The IT Pro’s Heaven – The Private Cloud,” which provides a thorough overview of the full gamut of technologies that encompass the private cloud architecture. To summarize the video in a sentence: with Microsoft System Center at its core, a private cloud can automate, manage, and deploy physical, virtual, and cloud-based resources across multiple platforms for your organization, as well as provide self-service for your customers. This means that if I have a traditional datacenter with the majority of my IT infrastructure running on bare metal (i.e., not virtualized), a private cloud can interact with those assets on all levels. The same holds true if your infrastructure is largely virtual with Microsoft, VMware, or Xen-based virtualization: the private cloud can handle all aspects of that environment.
Private cloud for public cloud-based resources
What might be more surprising to some is that the private cloud can do the same for public cloud-based resources. As I mentioned in a previous blog post here, there are multiple cloud architectures available in the public cloud. When IT pros think of moving IT assets to the public cloud, the initial reservation is the amount of control that is lost when moving a resource they are responsible for there. This is where the private cloud actually shines. The private cloud can interact with, manage, automate, and deploy applications hosted in Windows Azure; deploy software onto VMs, or deploy the VMs themselves, in Windows Azure; or manage an Office 365 implementation.
Total infrastructure control
To summarize: through a single pane of glass, the private cloud gives you control of your entire infrastructure regardless of where it’s living (physical, virtual, or cloud), and the workload is a first-class citizen operationally, as it can be automated, managed, and deployed in the manner that makes the most sense for your business.
So, back to my original point. With this knowledge, it makes sense to evaluate your plans for using the private cloud in a manner that adds operational efficiency while letting you leverage all aspects of your existing IT infrastructure. The private cloud has the ability to be the new center of the IT universe, with no forced upgrade to all-virtual assets and no major changes required to run your servers or applications in the public cloud, all while maintaining the control you seek over your infrastructure as an IT pro.
Ready to adapt
Even better news is that with Windows Server 2012, your IT infrastructure is ready to adapt to your business requirements better than ever. Whether it’s higher density on your VM hosts, better performance, or improved interaction with cloud-based resources, this is a significant paradigm shift worth looking into as well.
Looking to get started with the private cloud? Resources to begin to download evaluation software can be found here. A blog post on the resources needed to build a private cloud lab environment can be found on the Slashdot site here. Additional information on private cloud offerings can be found here.
By Stephen Wellman, Vice President & Editor in Chief at Geeknet on July 2nd, 2012
IDC expects worldwide spending on cloud services to exceed $55 billion by 2014. Despite that momentum, many IT decision makers are still unsure of their options and of the financial implications those options could have. Because of this, the firm has developed a Cloud Decision Framework Tool to help IT organizations evaluate their cloud strategies.
“The painstaking evaluation struggles that once plagued the cloud decision-making process have been all but eliminated,” an IDC vice-president said in a press announcement June 14.
At the CloudExpo 2012 event in New York City the day before, this theme was also present, with Kevin Hanes, a Dell executive director, delivering a lunchtime keynote offering advice on helping customers to “drive business change.” Hanes made three points — reiterations of industry themes, really — that are as worth repeating for potential cloud customers as for the people trying to sell to them.
The first point was that the cloud “isn’t a single path, it’s a transformation.” Microsoft’s cloud team has likewise repeated in webinars that the cloud is “a journey.” That can mean, generally, that it’s a thing in flux, to be learned from and tweaked over time.
More literally it could mean, as a Microsoft UK tech specialist laid out in a June 11 blog post, taking advantage of Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) features in Windows Azure to upgrade an application in stages — first the front end, then the middle tier and then the backend database.
“Over this journey,” he wrote, “you have gradually increased the value you get from your cloud, on your time frames, in your terms.”
Hanes’ second point was that the cloud can be overwhelming, but a pragmatic solution can help a customer to move forward. He described breaking down a project for a client — a university with limited funds — into manageable stages. Portioning the job not only made it seem less overwhelming but actually enabled the client to begin realizing new savings before it moved on to the next bit.
This made me think of a recent blog post by Forrester analyst James Staten. Alongside an inverted triangle diagram, Staten explained that an appeal of cloud IaaS is that it can offer specific expertise in areas that a majority of professionals are lacking. It’s another example of the cloud simplifying complexities (and stresses) and helping customers move forward.
Finally, Hanes talked about “meeting customers where they are,” or “recognizing that they’ve made investments,” which oftentimes leads to hybrid solutions.
“I think investors sometimes put companies in a box and say, ‘You’re either in private cloud or you’re in public cloud,’” Bill Koefoed, general manager of Investor Relations, said during Microsoft’s April 19 earnings call, resisting such labels.
David Linthicum, writing for InfoWorld, has also bristled at this either-or. The focus, he said in a March 30 article, needs to be on the “architecture and the right-fitting enabling technology, including both private and public cloud technology, and not gratuitous opinions.”
He added that companies shouldn’t put a limit on their possible solution options, whether that means private or public or both. All of the above are fine, he wrote, “as long as you do your requirements homework and can validate that you have chosen the right solution.”
That homework is an important part of this. Cloud solution providers will indeed hold your hand, if that’s what you need. But first you need to decide if you need your hand held.
Have you put off investing in a cloud solution due to feeling overwhelmed or frustrated about where to get started? If so, please let us know by adding a comment below. I am sure our readers would love to help you out.
By Bob Hunt, Senior IT Pro Evangelist, Microsoft on June 13th, 2012
As IT professionals begin to look at the various aspects of cloud computing, there is no shortage of starting points to evaluate.
When I’ve spoken with IT pros at the events where I’ve delivered presentations, many have said they were able to delay or avoid using cloud computing in their IT environments altogether because their project lists didn’t explicitly require it. As time has gone by, not only have the cloud computing options multiplied, but so have the applications and the requests from project owners and management asking IT to use or incorporate cloud technologies in their projects.
This crossroads in IT professionals’ careers, where they must take on one or more aspects of cloud computing within the environments they are responsible for, is uncharted territory: it adds complexity and risk to projects from which, in the past, they were able to engineer cloud computing out.
When the time comes to start to learn one or more aspects of cloud computing as it relates to the IT environment an IT professional supports, it’s important to understand where to start the journey. I’m thus going to outline several places to get started in the areas of public, private, hybrid, and hosted cloud computing options.
Public cloud options are possibly the most well-known cloud computing offerings, but depending on the solution chosen, they offer the least amount of customization for IT professionals and/or the organizations that they support. The large pool of public cloud resources, available on demand, is an appealing offering to quickly replace a workload in the datacenter. The risk for an IT pro with most public cloud offerings is that, depending on the vendor chosen, this offering replaces significant portions of the IT pro’s job responsibilities, potentially at a lower cost due to the massive scale of public cloud offerings.
Hybrid cloud functionality, which offers the opportunity to use both on-premise and public cloud resources, is an appealing way for the IT pro to get his or her feet wet in public cloud computing without outsourcing an entire workload. With the hybrid cloud, the IT pro can choose how much or how little of the cloud to include in a solution, but this flexibility doesn’t typically offer the cost savings of a public cloud-only solution.
Hosted server solutions offer the ability to replace the hardware a server runs on with a virtual server hosted in the cloud, typically at a lower cost than running the same server on-premise. Hosted server solutions can also increase server resiliency, as they offer standard high-availability and disaster-recovery options that small and medium-sized businesses often can’t afford. Hosted server offerings do, however, add another site for IT pros to manage, and they sometimes lack features, such as security configurations comparable to those of on-premise servers, which adds administration and server-management complexity.
Unlike these cloud computing options, the private cloud offers what may be the best mix of control and customization the IT pro is looking for, while leveraging the flexible architecture of a public cloud solution. With the private cloud, an on-premise pool of resources can be created that are available to both IT pros and application owners on-demand in a self-service fashion. This pool of resources lives on premise under the control of the IT pro, but is assembled using an architecture not unlike a public cloud with IT pro and application owner self-service enabled by default. Through a single pane of glass, the IT pro can deploy and monitor the workload and get deep application insight, but still maintain complete control of the solution.
In addition, the ability to track resource usage back to an individual department for charge-back or “look-back” offers enhanced reporting that may not have existed previously. After understanding this architecture, private cloud offerings seem to be where most IT pros can agree the best benefits of cloud computing options exist.
Looking to get started with the private cloud? Resources to begin to download evaluation software can be found here. A blog post on the resources needed to build a private cloud lab environment can be found on the Slashdot site. Additional information on private cloud offerings can be found here.
By Christopher Yeich, Director of Strategic Content for Geeknet Media on June 1st, 2012
Cloud computing was the topic of a recent Greenpeace report that described the growth and scale of investments in the cloud as “truly mind-blowing,” and cited an expected 50-fold increase in the amount of digital information being created by 2020.
The report’s truer purpose, however, was to call attention to the less-than-green energy sources powering the data centers behind some clouds — and to make headlines. For example, it called out Apple as a major offender, while noting in fine print that it didn’t have all the facts on the company. Uncharacteristically, Apple responded that it’s currently building the world’s greenest data center and its next project will, better still, run on 100 percent renewable energy.
The report serves as a prompt, too, for discussing the environmental benefits of cloud solutions, whether public, private, or, as is most often the case, a hybrid of the two.
“The concept they had was great — let’s take a look and make sure things are being done right. But I think maybe they were stretching it a bit,” Ian Campbell, CEO of Nucleus Research, offered with a kind laugh during a private interview.
“The point I’d be rallying for instead,” he said, “is that if you’re not cloud-based in some way, you’re not as efficient as you could be.”
Nucleus Research has found cloud-based applications to use 91 percent less energy than traditional on-premise applications. In a survey of a major cloud-based solutions provider, Nucleus found its customers were saving the energy equivalent of 11 barrels of oil every hour.
Additionally, Accenture — commissioned by Microsoft to study whether there is a true net benefit to moving to the cloud, or whether one is just “outsourcing” one’s environmental impact — analyzed the use of three applications in cloud-based versus premise-based solutions and found the benefits to be greater the smaller the company. The decrease in CO2 emissions for the cloud-based apps, it found, was more than 90 percent for companies with approximately 100 users, between 60 and 90 percent for companies with closer to 1,000 users, and between 30 and 60 percent for those with 10,000 users.
Which isn’t to say that only small companies can see a major impact. Unilever, for example, has stated that it plans to use cloud-based technology to double its business without increasing its environmental footprint at all.
Even premise-based cloud solutions can realize environmental benefits.
“With a private cloud, if you’re using virtual machines,” said Nucleus’ Campbell, “you’re definitely getting a significant benefit by doing that.”
During intense spikes in traffic, server farms are able to create efficiencies by spreading out the load. Private clouds, explained Campbell, are a “good way to take a step in that same direction.”
The Accenture study reached the same conclusion.
“Virtualization offers a strategy to improve server utilization for both cloud and on-premise scenarios by allowing applications to run in an environment separated from the underlying physical servers,” it stated. “Multiple virtual machines can share a physical server running at high utilization, which reduces the number of physical servers required to meet the same demand.”
In this way, it continued, IT departments can “narrow the efficiency gap between an on-premise deployment and a multi-tenant cloud service.”
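The consolidation arithmetic behind that quote can be sketched with a toy first-fit packing model. The per-VM loads and the 80% utilization ceiling below are invented for illustration, not taken from the Accenture study:

```python
# Toy consolidation model: pack per-VM CPU demands (in percent) onto
# physical hosts with first-fit, versus running one app per box.
def hosts_needed(vm_loads, host_capacity):
    """First-fit packing of per-VM utilization onto hosts of given capacity."""
    hosts = []  # remaining capacity on each physical host
    for load in vm_loads:
        for i, free in enumerate(hosts):
            if load <= free:
                hosts[i] -= load  # place VM on the first host with room
                break
        else:
            hosts.append(host_capacity - load)  # no room anywhere: add a host
    return len(hosts)

# Twelve lightly loaded apps (CPU % each) that would otherwise occupy a server apiece.
loads = [15, 10, 20, 5, 25, 10, 15, 30, 10, 20, 5, 15]
unvirtualized = len(loads)             # one app per box: 12 physical servers
virtualized = hosts_needed(loads, 80)  # hosts allowed to run at up to 80% CPU
print(unvirtualized, virtualized)      # 12 physical servers collapse to 3 hosts
```

The packing heuristic is deliberately crude, but it captures the study’s mechanism: when many applications each idle at a fraction of a server, letting them share hardware at high utilization shrinks the physical footprint severalfold.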
Campbell noted that companies consistently perform cost analyses, balancing costs against rewards. Reducing hardware and so reducing energy demands, he said, “is more of a benefit analysis. You can be green and still save money.”
He added, wryly, “Even the cold, black heart of the CFO can be satisfied.”
Has a desire for a reduced carbon footprint — or to be “greener” — at all motivated your own cloud-solution decisions? Let me know in the Comments section below.
Christopher Yeich is Director of Strategic Content for Geeknet Media.
By Christopher Yeich on May 2nd, 2012
Enterprises deploying private clouds with a primary focus on cost savings may be missing the mark, according to a recent blog post by Thomas Bittman, a Gartner vice president and distinguished analyst. While reduced costs are likely to follow, private clouds should be pursued in the interest of business requirements.
“Enterprises engaged in private cloud projects to reduce their costs will usually fail to meet objectives,” writes Bittman, “as well as miss the mark on potential business benefits.”
Gartner expects 2012 to be a major year for private clouds, with the technology moving from hype or pilot programs to mainstream deployments. In that rush from A to B, however, it also expects some casualties.
“Staying on top of best practices and learning from early adopters is a must,” Bittman emphasizes.
Below are highly simplified versions of three very different deployments, each a success and each motivated by more strategic reasons than frugality:
1. Eyes-Road, an e-commerce platform for France’s optical industry, needed a more agile and reliable platform. Its revenue depends on the volume of transactions made by its subscribing members — businesses such as eyeglass makers and contact lens laboratories — and service interruptions could cause members to cancel their subscriptions, at a direct loss to Eyes-Road. A major interruption in service at a high-traffic time had previously cost the company 30% of its subscribers.
It needed to guarantee members service availability 24 hours a day, seven days a week, and it needed to be able to scale to handle consistently high-traffic times of day.
Eyes-Road chose a combination of Hyper-V and Microsoft cloud services, and now it promises 99.98 percent availability, can scale as traffic demands, can more quickly address new member processing protocols, and didn’t lose any time in transitioning to the new solution.
2. Turkish bank DenizBank was struggling with its growing success. It had hundreds of stand-alone servers across three data centers, and increasing cooling costs and employee numbers to support it all. Also becoming more sprawling were its server maintenance and software licensing costs. Compounding all of this was the growing popularity of mobile banking and the pressure to have its systems be agile and always available.
DenizBank needed greater control of its infrastructure and costs, as well as to reduce the numbers of servers it was buying, the number of people necessary to run it all, and the amount of time it took to get things done.
It decided to virtualize its servers (with Hyper-V technology), and then configured its data centers as a private cloud environment using System Center 2012. Today, its entire business runs on 64 host servers that can scale to 1,500 virtual machines. It has saved $7 million through server consolidation, avoided $12 million in data center costs, and anticipates a 20% reduction in IT staff costs. Further, services can be deployed faster, critical apps are always available and — something the company equally celebrates — its more efficient data center has helped it save 3.3 gigawatt-hours of energy and the carbon-emissions equivalent of taking 400 cars off the road.
3. In Brazil, Dotz Marketing needed more advanced business intelligence (BI) tools. The company runs a rewards program for a number of clients whose customers receive dotz — awards points that are said to have almost become a second currency in Brazil.
Every second, more than 500 dotz are awarded, and every day Dotz was processing 1.5 million transactions from across 6,000 locations. It needed a scalable database solution that could better manage that volume.
It deployed a database solution in a private-cloud environment and opted for a feature called Integration Services, to handle the volumes of data created by those million-plus daily transactions, and another called Power View, which enables non-technical users to create rich BI reports.
While before, it took weeks for reports to be created, as the data had to be gathered and analyzed, they can now be created almost instantly, enabling stores or brands to better plan marketing campaigns and promotions, and to more quickly detect and act on trends. Plus, with less time now spent on report preparation, employees can spend more time on what Dotz calls “strategic pursuits.”
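As a quick sanity check on the DenizBank deployment in example 2 (the 64-host and 1,500-VM figures come from the case study; the arithmetic is mine):

```python
# Average consolidation density implied by the DenizBank figures above.
hosts = 64      # physical host servers after consolidation
max_vms = 1500  # virtual machines the environment can scale to

density = max_vms / hosts
print(f"{density:.1f} VMs per host at full scale")
```

Roughly a 23:1 consolidation ratio at full scale, which is the kind of density that makes the cited savings in servers, power, and staffing plausible.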
While each of these companies turned to a Microsoft-based solution, it’s the variety of their needs and the possibilities for addressing them that’s being stressed here. Private clouds can be business savers — if they’re deployed with care and in pursuit of business-critical goals.
Christopher Yeich is Director of Strategic Content for Geeknet Media.
By Christopher Yeich on April 12th, 2012
IT decision makers are having a moment. Enterprises are in a time of tremendous change, evolution, and possibility that’s not for the faint of spirit.
Forrester Research VP and Principal Analyst James Staten has remarked that the role of the IT professional is no longer about controlling everything — which isn’t even a possibility anymore — but rather that IT leaders should shoot to be “the translators between IT and business.”
Two re-imagined aspects of the traditional enterprise that arguably embody these changes more than any others are private cloud computing and bring-your-own-device (BYOD) policies. It perhaps explains why to learn about one is to hear constant echoes of the other. Both represent 180-degree changes to once-central tenets, are sometimes met with resistance and, once accepted, can deliver outcomes well beyond expectations.
While keeping a focus on the cloud, it’s worthwhile to more deeply consider three pieces of advice that may be applied to either trend. Who knows — change your thinking once, and you may wind up benefitting twice.
One: It’s unstoppable; there’s no use fighting it.
Just as applications have become central to the mobile consumer experience, so have they become intrinsic to enterprises, and private clouds offer the best control yet. They make apps simpler to manage and update, offer smarter views into how they’re performing, and can cull actionable analytics from them.
“According to our surveys, the majority of enterprises are planning to build private clouds over the next few years,” Gartner VP Tom Bittman says in a video on the research firm’s site. “Their goal is that they’ll be achieving faster deployment of services … and reducing their costs.”
Bittman adds that private cloud “is not something to ignore,” as its effects extend beyond IT to the business. He explains: “Businesses are looking for new ways, new techniques, to increase their speed and their agility, and private cloud can deliver that.”
Two: A shift in the organization’s cultural mindset is required and shouldn’t be underestimated.
The Internet changed the way people communicate and interact with information. The private cloud arguably represents IT’s “big introduction” to the Internet. It enables businesses to treat IT more like a service — something that end-users can more easily help themselves to, instead of feeling beholden to.
Big changes, however, require conversations and training — inevitably there will be the heel-draggers, those resistant to or unsure about the change.
Forrester’s Staten has advised: “Knowing how each [executive] role in your organization feels about cloud computing will affect your cloud strategy, and being mindful of the attitudes they bring to your efforts could be the difference between a successful implementation and a catastrophic failure.”
Three: When done correctly, it can be fantastic for business.
Writing in the Harvard Business Review, Andrew McAfee, a principal research scientist at MIT’s Center for Digital Business, called cloud computing a “sea change — a deep and permanent shift in how computing power is generated and consumed,” and added that there’s a common pattern in the introduction of such “novel techniques.”
“The unanticipated benefits,” he wrote, “often outweigh the intended ones.”
The widely considered benefits of the private cloud are security and control; the cost-effectiveness of being able to make use of current IT assets; and the business agility they enable, with IT able to scale resources up or down as necessary.
Still other benefits of the private cloud, as the Big Fat Finance Blog has pointed out, “include increased IT productivity and efficiency, the ability of business users to self-provision the desired IT resources, and an increased ability to monitor and measure IT consumption for the purposes of chargeback or, as is more likely, show back.”
Should IT decision makers be trendsetters? No way. Change makers? Absolutely.
Christopher Yeich is Director of Strategic Content for Geeknet Media.
By John Jainschigg on March 6th, 2012
Part of my job entails running research computing projects on Microsoft Windows Server 2008/R2 virtual machines, which my specialized hosts custom-configure to support changing requirements. While the superficials might differ, the core challenges I face are the same as those confronted by any enterprise IT leader in determining how best to virtualize a diverse array of applications. The key to success, in both cases, is freedom to experiment – and ultimately to embrace diverse configurations – without undue cost.
This occurred to me today as I was re-reading Microsoft’s January white paper on the Microsoft Private Cloud (MPC), which undertakes an exhaustive technology breakdown and cost comparison between MPC and VMware. The paper shows that Microsoft’s decision to license System Center on a per-processor (rather than per-VM or memory-footprint) basis, and to integrate licensing for multiple products underneath, helps customers build densely efficient private and hybrid clouds much more cost-effectively. Here, the freedom to explore diverse VM configurations cheaply makes for a more robust operation, better performance, and happier end users.
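A toy comparison makes the licensing point concrete. The prices below are invented for illustration; only the per-processor-versus-per-VM structure reflects the white paper’s argument:

```python
# Hypothetical license prices -- not real vendor figures.
PRICE_PER_PROC = 3600  # assumed management-license cost per processor
PRICE_PER_VM = 600     # assumed management-license cost per VM

def per_processor_cost(processors):
    """Flat cost: the host is licensed once, however many VMs it runs."""
    return processors * PRICE_PER_PROC

def per_vm_cost(vms):
    """Scaling cost: every additional VM adds a license."""
    return vms * PRICE_PER_VM

# One dual-socket host at increasing VM densities.
for vms in (4, 12, 24):
    print(vms, per_processor_cost(2), per_vm_cost(vms))
```

On this hypothetical host the two models cross over at 12 VMs; every VM added beyond that widens per-processor licensing’s advantage, which is why density-friendly licensing leaves room to experiment with configurations without being ‘taxed’ per machine.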
Being able to play around with VMs (without being ‘taxed’) is what makes clouds revolutionary: the ability to tune resources to tasks, dial in lots of power and privilege to mission-critical apps, and experiment with thinning-out support for more-peripheral functionality until you find settings that work well and produce consistent ROI on cloud investment.
Microsoft’s Harold Wong, an IT Pro Evangelist for the U.S. Southwest and a frequent contributor on TechNet, has had a lot to say about this recently from several perspectives relevant to MS Private Cloud. On February 20, he tackled the commonsense mechanics of virtualizing Tier 1 biz-critical applications (http://blogs.technet.com/b/haroldwong/archive/2012/02/20/virtualization-is-it-really-possible-to-virtualize-tier-1-business-critical-apps.aspx), including in the process a lot of very useful reminders about how server hardware gets specified in enterprises (often by the database team, which may or may not end up providing ideal platforms for virtualization hosts).
A few days later, he followed with a detailed discussion of how a business might go about configuring VMs to serve required workloads (http://blogs.technet.com/b/haroldwong/archive/2012/02/24/right-sizing-virtual-machines-is-it-really-important.aspx), including Exchange Server 2010. He makes the point (actually, he includes numerous datapoints) that for most businesses, diverse configurations will be the norm, and that – before you’ve evaluated the workloads, the hardware, the requirements exhaustively (and played around) – you can’t know precisely in advance what’s going to end up being the ideal balance point between performance (and/or simple ‘reliable functionality’) and ROI.
For these and other reasons, the freedom to experiment, monitor, and optimize without cost impacts is a good and necessary thing, and it is even more critical to smaller and mid-sized businesses than to large enterprises. A very large datacenter will inevitably provide economies of scale, whereas a smaller one, for a smaller business, needs to do all the same things (more or less) on a smaller hardware footprint, so it will typically show more statistical diversity of configuration across any span of hardware and VMs than its larger cousin.
John Jainschigg is a contributing editor to Slashdot and SourceForge.