ARE GOOGLE HOME AND ALEXA HIPAA COMPLIANT?
Most people probably wouldn’t trust a virtual doctor, but there does seem to be potential for the use of AI assistants in healthcare.
As virtual assistant programs become more common, we are bound to run into certain issues. AI is a new frontier, and virtual assistants sit squarely within it. In healthcare, if nothing else, they could handle tasks like patient screening, initial evaluation, and general advice. That being said, there are some legal issues that must be considered here.
WHAT IS HIPAA?
HIPAA is a law whose name stands for "Health Insurance Portability and Accountability Act." That's a mouthful, but it tells you a lot. The law was enacted in 1996 in response to growing concerns about the ways in which health information is shared. Patient confidentiality is very important, so it makes sense that these standards have existed for so long.
Like most laws, this one is long and complex. However, you only need to understand two aspects: the "Privacy Rule" and the "Security Rule." To clarify, each of these is actually a set of standards, but each set is usually referenced as a single rule.
The Privacy Rule dictates who can access your personal health information and who cannot. Obviously, you are permitted to see your own records, as is your doctor. In most other situations, your health information cannot be disclosed without your permission.
The Security Rule, on the other hand, deals with safeguards. It sets obligations that every healthcare provider must follow, and any company that handles medical records is likewise bound by law to meet its standards. Those standards require that anyone dealing with sensitive health information keep it secure. If that information is improperly disclosed, the institution that collected it can face substantial fines and other penalties.
IS GOOGLE HOME HIPAA-COMPLIANT?
To answer this question, let's look at a respected HIPAA compliance journal. Publications like this track the law more closely than almost anyone, so they are well placed to answer. According to this source, Google Home and Google Assistant are definitely not HIPAA-compliant.
One reason is that HIPAA treats voice recordings as a potential biometric identifier, and it demands that a person be informed if any such identifier has been added to their personal health information. When a device like this is brought into the doctor's office, there is no way to know who might be listening or how the recordings will be handled. Because of this, the security of your personal health information cannot be guaranteed, and using these devices to handle patient information would run afoul of the Security Rule.
IS ALEXA HIPAA-COMPLIANT?
In the beginning, Alexa was no more compliant than Google, but that has since changed somewhat. Even so, Alexa is nowhere near full compliance. According to the same HIPAA journal we consulted before, Alexa now has at least six skills that comply fully with HIPAA guidelines.
Thanks to these changes, patients can now use Alexa to:
Check the status of prescriptions
Enroll in a Cigna health plan
Send health status reports to parents
Check blood sugar and analyze trends
Find emergency care in the area
Book an emergency same-day appointment
Amazon has gone to considerable trouble to ensure that these activities comply with the privacy and security standards laid out in HIPAA. Most professionals seem satisfied with how this has been done, but it should be noted that Alexa is only authorized for the six tasks on this list. Any disclosure of health information outside this list would probably violate the law.
THE GREY AREAS
As with anything related to new technology, there are bound to be some legal grey areas. For now, you should just remember that any unauthorized recording of health information will probably not be in compliance. That includes anything recorded by a virtual assistant unless it falls under one of the six exceptions listed above.
But what happens if you encounter a situation that isn't covered by the rules? If you find yourself in that position, ask yourself the following questions:
Does the virtual assistant require you to approve a privacy statement? If so, what is in that statement?
Can these recordings be legally subpoenaed?
Are the provisions of HIPAA’s privacy rule being met?
Are the provisions of HIPAA’s security rule being met?
Has the patient consented to disclosure with full awareness?
In most cases, virtual assistants like Google Home and Alexa are not HIPAA-compliant. As we have seen, only one of them is capable of compliance, and only under six specific circumstances. Thus, it might be wise to disallow these devices in hospitals, clinics, and doctors' offices. Even if they offer some interesting new possibilities, it is hard to justify the privacy risks that come with unregulated listening devices.
If you take one thing away from all this, it should be that personal assistants can also function as surveillance devices over which you have little to no control. That is why they are not capable of full compliance. We hope this has answered your question; if so, you can reward our efforts by filling out the contact form.