
Some HIPAA regulations that pre-date the rise of shared virtual servers in "the cloud" are quite outdated and cause quite a bit of trouble for no real benefit.



> Some HIPAA regulations that pre-date the rise of shared virtual servers in "the cloud" are quite outdated and cause quite a bit of trouble for no real benefit.

What HIPAA regulations are you talking about? Other than HITECH guidance (which can sort-of be seen as a "HIPAA regulation"), HIPAA regulations don't generally specify technologies at all, and I can't think of any that I would describe as outdated or troublesome due to the rise of shared virtual servers and "the cloud", whether they predate it or not.


The biggest thing is that we can't run software with unencrypted PHI on physical hardware that is simultaneously running other people's code. In practical terms this means that we have to pay AWS some $ to get dedicated instances, and also that we can't use ELBs in the standard (easy) way. There are some other things as well.
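Concretely, the dedicated-instance part is just a launch option on the instance. A minimal boto3 sketch (placeholder AMI/subnet IDs, and it assumes your BAA with AWS is already in place):

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # "dedicated" tenancy keeps other customers' workloads off the
    # physical host this instance lands on.
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",       # placeholder AMI
        InstanceType="m5.large",
        MinCount=1,
        MaxCount=1,
        SubnetId="subnet-0123456789abcdef0",   # placeholder subnet
        Placement={"Tenancy": "dedicated"},
    )
    print(resp["Instances"][0]["InstanceId"])

The "some $" is the per-region hourly fee AWS charges whenever any dedicated instance is running, on top of the normal instance cost.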


> In practical terms this means that we have to pay AWS some $ to get dedicated instances

This is a feature, not a bug. It also is neither HITECH nor HIPAA; it is instead AWS's requirement in order to sign your BAA.

> we can't use ELBs in the standard (easy) way

Also neither HITECH nor HIPAA. ELBs are used in a PHI-related scenario identically to any other scenario. Unless you are referring to using it as an SSL terminator, in which case I would say "the standard (easy) way is always wrong".
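To spell that out: the standard (easy) setup terminates TLS at the load balancer, which means traffic is decrypted there. The usual alternative is a plain TCP pass-through listener so TLS terminates on your own instances. Roughly, with boto3 against a classic ELB (names and IDs here are made up for illustration):

    import boto3

    elb = boto3.client("elb", region_name="us-east-1")

    # TCP pass-through on 443: the ELB forwards encrypted bytes as-is,
    # and TLS is terminated on the backend instances instead.
    elb.create_load_balancer(
        LoadBalancerName="phi-app-elb",           # hypothetical name
        Listeners=[{
            "Protocol": "TCP",
            "LoadBalancerPort": 443,
            "InstanceProtocol": "TCP",
            "InstancePort": 443,
        }],
        Subnets=["subnet-0123456789abcdef0"],     # placeholder subnet
        SecurityGroups=["sg-0123456789abcdef0"],  # placeholder SG
    )

The trade-off is that you give up the layer-7 conveniences of an HTTPS listener, which is exactly why the terminate-at-the-ELB setup is the "easy" one.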


> The biggest thing is that we can't run software with unencrypted PHI on physical hardware that is simultaneously running other people's code.

There is, AFAICT, no regulation under HIPAA or related law that requires this. Certain service providers may have determined that they cannot provide guarantees of privacy/security without this technical restriction.


That seems like a fairly reasonable thing given you're talking about unencrypted PHI... it's some extra $ for a considerable reduction in overall attack surface when processing the most sensitive type of personal data.

I don't think this meets OP's definition of "wrong".


In practical terms I don't agree that the threat of someone doing all of the following things is worth worrying about (in comparison to many other more likely failures):

1) determine what physical hardware in AWS the target is running code on

2) somehow get the AWS virtual machine manager to let the attacker run their malicious code on the same hardware

3) somehow pierce the protections of the virtual machine to read memory being used by the target application

4) figure out how the data is stored in memory in order to make sense of anything that was read


> In practical terms I don't agree that the threat of someone doing all of the following things is worth worrying about

In AWS's case, this is an AWS rule about when they will sign a HIPAA BAA, even though there is no HIPAA regulation that specifically prohibits the arrangement at issue. AWS clearly thinks it is worth worrying about.

When you run your own public cloud, you can determine what risks are worth accepting potential liability for.


Yes, I agree that Amazon is behaving perfectly rationally given the legal environment. My point is that the legal environment has been designed in a suboptimal way from a technical perspective. Identifying such a situation was rayiner's request.


> Yes, I agree that Amazon is behaving perfectly rationally given the legal environment.

I'm not commenting on Amazon's rationality (I haven't actually evaluated the security concerns that would determine that.)

> My point is that the legal environment has been designed in a suboptimal way from a technical perspective.

And you haven't pointed to anything in the legal environment that is suboptimal from a technical perspective. You haven't even pointed to anything in the legal environment at all.

Amazon (as a business associate under a BAA) has certain responsibilities for putting administrative and technical safeguards in place to prevent breaches, and certain obligations and liabilities in the case of breaches. HIPAA and related laws and regulations do not specify the particular administrative or technical safeguards, though they do specify areas that must be addressed.

Amazon has decided that the particular technical arrangement you prefer is too high a risk. But you haven't pointed to anything indicating that this is the result of an outdated regulation forcing poor technical choices, rather than of technology-neutral regulation combined with a reasonable evaluation of the security concerns of the arrangement you would prefer.


People said the same thing about cold boot attacks against encryption keys. Yet today the police and others are using that and other NAND attacks regularly.

HIPAA is a very easy compliance standard to meet. If it seems difficult to meet those requirements with your standard tool configurations, you should think about what that means with respect to the integrity of your data.


I would like to see a case of a cold boot attack by the police.


Memory forensics is a thing.

Google around with terms like forensics and "Volatility" or "Volatility toolkit" and you should find some presentations and other references.


I know what memory forensics is; I use Volatility, Second Look, and quite a few other things pretty often. I asked specifically about an instance of a cold boot attack, which you hyperbolically claimed is used regularly (or at all) by the police.

You know what, I don't even need a case. Just find me a jurisdiction in which cold boot attacks have passed forensic certification; for example, a link to the process from a body equivalent to the ASTM (https://www.astm.org/Standards/forensic-science-standards.ht...) would suffice.


I asked my doctor to email me my records and was told it is illegal due to HIPAA. It makes no fucking sense, but that's what it is.


It isn't illegal to email records under HIPAA. But your doctor probably doesn't have a system set up to securely email records (such things do exist), and their practice has probably adopted privacy policies that don't allow emailing for that reason. Doctors aren't generally compliance experts; they are much more likely to know what the policies of their place of work allow than the distinction between what HIPAA allows and what their employer has adopted as policy based on the particular technology it has chosen, its level of risk tolerance, and other factors.


Such as? HIPAA generally has to do with organizational access controls and not specific technologies.


Also, certain provisions of FERPA that preclude the use of cloud accounts for holding student data. I think those may be the archetypal examples.



