Head in the cloud? Here are 5 concerns to address before making the move

From the mHealthNews archive
By Benjamin Harris
10:35 am

With all the talk about switching from data centers to cloud-based computing, it seems as if the cloud is an ethereal magic bullet for every problem that healthcare IT might face, from reduced costs to improved flexibility. That reasoning is especially noticeable in mHealth, where the cloud is an enticing solution to the issue of storing and managing data that can't reside on mobile devices or with doctors and patients on the move.

Not so fast, says Steve Jacobs, president of Velocity Data Centers, a firm that provides private cloud solutions.

While "there are some definite business advantages to operating in a cloud IT environment, the risks are very real and concerning," he says. And for all of the pros of cloud-based solutions floating around, there are some cons that can affect any organization that relies as much on data as healthcare does.

Jacobs offers five pain points that providers should consider in deciding whether to make the jump to the cloud.

1. Data security. While proponents of cloud computing tout government- and industry-level standards of encryption and security, that doesn't mean an organization becomes impenetrable once it migrates to the cloud. Mighty as a cloud company's barriers might be, Jacobs says, data breaches and security failures are still common. Should the unthinkable happen, the healthcare organization – not the cloud provider – is going to have to take the hit.

"When a breach occurs, the responsibility to handle that fallout falls upon the healthcare provider," says Jacobs. "Even though the public cloud was the one that had the breach, it's the hospital that has to stand there and say, 'Oops, we messed up.'"

Jacobs also notes that who is responsible for the mistake is of little concern to the end consumer – which, in a healthcare environment, is ultimately the patient. Patients who believe the security of their data is inadequate will lose trust in their provider.

2. Service levels. One need only look at recent news for examples of the sometimes fragile nature of a public cloud. When Hurricane Sandy lashed the East Coast, "there were data centers that were flooded, (and) generators that ran out of fuel," says Jacobs. "Whose systems are returned to service first?" The dark joke in the IT world, he says, is this: "He who holds the biggest contract is returned to service first."

Cloud providers do not necessarily have the staff or resources to get every client back online within a reasonable amount of time, and pure economics dictate that clients with a larger service agreement (whose contracts would put the provider on the hook for more downtime reimbursement) are going to see a faster return to service. While that may not be a problem for large institutions, smaller hospitals may not see the same speedy recovery as their larger counterparts.

3. Performance. "You hear a lot of things about the public cloud being elastic," says Jacobs, meaning that an organization can grow and shrink its computing footprint as it needs by simply buying more or less cloud space. This works well in principle, he says, but what happens when another client begins to gobble up space rapidly? Jacobs notes that cloud providers "do not build network and server and storage capacity for the maximum possible usage; they build to the minimum." In this model, a provider knows that its clients won't be using the maximum amount of storage or bandwidth they've signed up for, and it will build accordingly.

This isn't an issue until someone needs a massive amount of cloud resources immediately. The provider will shuffle around what is available, and that can slow down performance for others. If one large organization has a big day in the cloud, Jacobs says, that can tax the resources of a cloud provider and negatively impact other clients of the same cloud – including hospitals. "You have no control" over how the cloud and one's computing is run, says Jacobs. "Your system slows down and you have no idea why or how to fix it."
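The "build to the minimum" model Jacobs describes can be illustrated with a toy calculation. The following sketch is not from the article, and all of its numbers and names are hypothetical; it simply shows how a provider that provisions less capacity than it has sold runs out of headroom the moment one client bursts to its full commitment:

```python
# Hypothetical sketch of cloud oversubscription: a provider "builds to
# the minimum," counting on clients not using everything they bought.

def available_headroom(built_capacity, active_usage):
    """Capacity left for clients who suddenly need to scale up (negative
    means the cloud is oversubscribed and performance suffers)."""
    return built_capacity - sum(active_usage)

# Provider sells 100 TB of commitments to four clients but provisions 60 TB.
sold_commitments = [40, 30, 20, 10]   # TB sold per client (hypothetical)
built_capacity = 60                   # TB actually built (hypothetical)

# Typical day: each client uses roughly half of what it bought.
typical_usage = [c * 0.5 for c in sold_commitments]
print(available_headroom(built_capacity, typical_usage))  # 10.0 TB to spare

# "Big day in the cloud": the largest client bursts to its full commitment.
burst_usage = [40] + typical_usage[1:]
print(available_headroom(built_capacity, burst_usage))    # -10.0 TB short
```

In the second case the provider must shuffle resources away from its other clients – the slowdown Jacobs warns about, which those clients can neither see nor fix.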

4. Energy. Jacobs says public clouds fall far behind when it comes to energy efficiency. The magic bullet is data center infrastructure management (DCIM) – software that integrates hardware with building controls such as air conditioning. Normally, a cloud's A/C system chases its use pattern, he points out: Servers get used more, heat rises, and the A/C kicks in as a response. If an IT team knows when its servers are used the hardest, Jacobs says, it can use DCIM software to "match up that usage curve" with energy-intensive processes like temperature control, so the system is ready for its normal spikes and doesn't have to work in overdrive to respond to temperature increases.
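The difference between cooling that chases the load and cooling matched to a known usage curve can be sketched in a few lines. This is a hypothetical toy model, not anything from the article: the load pattern and the 20% overshoot penalty are invented for illustration.

```python
# Hypothetical sketch: reactive cooling vs. DCIM-style cooling that is
# matched in advance to a known daily usage curve.

HOURLY_LOAD = [30, 30, 90, 90, 30, 30]  # % server utilization (invented)

def reactive_cooling(load):
    # The A/C only responds after heat has already risen, so each time
    # the load jumps it must overshoot (here, a flat 20% penalty) to
    # pull temperatures back down.
    schedule, previous = [], load[0]
    for current in load:
        overshoot = 20 if current > previous else 0
        schedule.append(current + overshoot)
        previous = current
    return schedule

def dcim_cooling(load):
    # With the usage curve known in advance, cooling is pre-scheduled to
    # match the load exactly, so no overshoot is needed.
    return list(load)

print(sum(reactive_cooling(HOURLY_LOAD)))  # 320 units of cooling effort
print(sum(dcim_cooling(HOURLY_LOAD)))      # 300 units of cooling effort
```

The savings come entirely from predictability – which is exactly what Jacobs argues a public cloud's mixed workload takes away.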

Because of the wide variety of user loads seen inside a public cloud, it's much harder to predict when usage levels will rise, and Jacobs says public cloud operators don't have the same opportunities to optimize their environments to be as energy-efficient as possible. "You'll never find that efficiency in a public cloud," he says. "You have no control over the facility or the usage level."

5. Emotions. "Perception is reality," says Jacobs. "If a patient or a doctor doesn't believe that the public cloud is ready, then it doesn't really matter." He cites several examples of public clouds experiencing failures, data breaches and performance issues – and that stigma carries over to a patient's willingness to trust an organization that relies on a public cloud.

If companies decide to invest in their own data center, though, there are many more variables that can be controlled. "If we own data centers and we own our own hardware, we can manage to our service levels," Jacobs says. "At the end of the day I can go to my boss and I can say I know where my data is."