Risk assessments leave hospitals hamstrung
Lisa Gallagher, HIMSS senior director of privacy and security, isn't particularly optimistic about providers' ability to prove patient data is safe. Asked to assign a letter grade to the security situation in U.S. hospitals, she says, "probably a C."
The question, then, is how many extra credit assignments must be undertaken to bump that up to a B+?
Gallagher says it's the basic assignment that's the big problem for most providers.
"The fundamental activity that has to happen for organizations to be compliant with HIPAA and HITECH and every other regulation that comes from outside the industry - as well as your Stage 1 meaningful use - is to conduct a security risk assessment, and to do ongoing security risk analysis," she says.
"But if you look at the numbers [see chart], you see they're pretty low," Gallagher adds. "We have organizations trying to meet meaningful use Stage 1, and they're calling me and saying, 'We can meet all of the requirements of Stage 1, except the risk analysis requirement.'"
That, she says, "is very concerning, because our mission is to enable these organizations to do what they need to do" to reach meaningful use. "And we don't want a single security requirement to be a barrier."
Why is the assessment so problematic? Mostly because doing one is so far outside the areas of most health professionals' expertise. "Security and security risk assessment is a discipline that this industry just does not have a handle on," says Gallagher. "They don't understand it, they don't have people on their staff who can do it, they know they need to hire a consultant and they don't always have the time and the budget to do that."
Moreover, the requirement has long been swept under the rug. "For a long time, they thought, 'No one's monitoring HIPAA compliance, so we can put this off. We don't understand it, it requires resources we don't have, so we're just not going to do it.'"
Those days are over. Now, providers understand the stakes, at least. "I think there's a lot more awareness than there was before," says Gallagher. "Folks are understanding that, with their use of technology, it's not only username and password; they understand that there are security policies and procedures they need to follow."
But even if organizations have some resources to put toward a security risk assessment, she says, "they don't know how to do it, and they're confused about the threshold to compliance."
HHS has never come right out and said, specifically, "If you do the following, you are compliant," says Gallagher. "They said, 'Do a risk assessment, document it and make sure you mitigate any findings that you have.' But there's no standard for what is minimum to be compliant. And that's causing the industry a lot of stress."
Unfortunately, she doesn't see many signs of improvement. The numbers have "been flat for the past few years, and I am really concerned," says Gallagher. "There's not an answer that is easily reachable right now. I don't see in the next year that we will make much progress."
That's too bad. Because it "shouldn't be a barrier; it should enable your organization to practice sound security management," she says.
But the stipulation has obviously proven difficult for many providers to handle. "The crisis point," says Gallagher, "is going to be if we see that people can't meet Stage 1 meaningful use because of one single requirement."
So why is HHS so vague in its calls for security assessments? Why can't they be more prescriptive?
"It goes all the way back to when HIPAA was written," says Gallagher. "It's hard to be prescriptive when you have such a variation in the size, scope and nature of the covered entities it applies to."
A small physician practice with just a couple of docs can do only a basic security assessment, and that would be enough, in other words. "For a larger enterprise, it would be a completely different exercise."
There are providers, there are clearinghouses ... a single standard is simply too difficult to define.
So HHS left plenty of wiggle room. "They said, 'What is compliant is actually doing a security risk assessment, and not doing it to a certain set of requirements,'" says Gallagher. "They left it flexible and they thought they were doing the industry a favor."
Clearly, when it comes to meaningful use, at least, that's not quite the favor they thought it was.
In a recent webinar hosted by Healthcare IT News, Ravila White, director of enterprise security services and architecture at Seattle-based Providence Health & Services, described the health system's new approach to privacy and security after a couple of high-profile breaches.
"We've taken a look and said, 'We're going to make sure that information security and protection, from a technical perspective, is actually driven by the business. With it driven by the business, we can be sure we're in line with what our clinicians need, what our patients need, as well as the folks who are supporting the various business applications. ...
A part of that is adopting a set of guiding principles … and those two overarching principles really rely on interoperability – if a solution is not interoperable, then we have to have a pretty good reason for choosing it.
The other part is making sure we apply the principle of reapplication. Meaning, anything that's working very well, we're going to take that as a lesson learned and reapply it. Our team is pretty strapped. We have a lot of area to cover. We're across five states. We have varying degrees of compliance requirements. And so we have to try to find out what has worked best in one area that we can reapply for all the other regions and ministries we have.
Any time we do a vulnerability scan, or if there's an engineering project, an impact assessment is performed, which then is provided to enterprise security, and then they're able to use that to put together a risk assessment that would help inform our risk posture."