When is a healthcare chatbot GDPR-compliant?


by Vagelis H. 12/31/2020

This article focuses on healthcare chatbots, so we will discuss how to make a HIPAA-compliant chatbot GDPR-compliant as well. This is necessary if a healthcare entity plans to interact with EU patients in addition to US patients (HIPAA is a US regulation).

For a discussion on how to make a bot HIPAA-compliant see our previous article.

Similar to HIPAA, GDPR is a set of guidelines rather than technical directives, so ensuring GDPR compliance is a joint responsibility of the chatbot designer and the chatbot platform. For example, the chatbot must request permission to store and use data provided by the user; this is generally the responsibility of the chatbot designer. On the other hand, securing the data using encryption or other techniques is primarily the responsibility of the platform.

Here are the key additional measures you have to take to make your HIPAA-compliant chatbot also GDPR-compliant:

1. (platform responsibility) Report data breaches within 72 hours of becoming aware of them; this is a much tighter window than HIPAA's, which allows up to 60 days.

2. (platform responsibility) Right to erasure (also known as "the right to be forgotten"): have a mechanism in place so users may request that all their private data be deleted. You must delete the data within one month of receiving the request.
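A minimal sketch of how a platform might implement this, with a hypothetical in-memory data store and request record (real platforms would delete from databases, backups, and logs as well):

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical store of chatbot conversation data, keyed by user ID.
user_data: dict[str, list[str]] = {}

@dataclass
class ErasureRequest:
    user_id: str
    received_at: datetime = field(default_factory=datetime.utcnow)

    @property
    def deadline(self) -> datetime:
        # GDPR requires acting within one month of receiving the request.
        return self.received_at + timedelta(days=30)

def process_erasure(request: ErasureRequest) -> bool:
    """Delete all stored data for the user; report whether the deadline was met."""
    user_data.pop(request.user_id, None)
    return datetime.utcnow() <= request.deadline
```

Tracking the deadline explicitly lets the platform surface overdue requests before they become violations.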

3. (bot designer responsibility) Explicit consent for storing data and for using it beyond direct patient care. For example, the bot could start with a message: “Do you agree to have your responses stored and used to improve your user experience? Yes/No”

A link to a terms page is generally not adequate.

This is similar to the ubiquitous cookies consent messages we see in almost all Web sites recently, to comply with GDPR.

The platform could force adding a predefined message at the beginning of each chatbot, but the bot designer is in a better position to make the message more meaningful and insert it in the right position, given the purpose of a chatbot. Also, some chatbots may not need consent, as they don’t collect any personal information, for example, anonymous FAQ bots.
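A consent gate of this kind can be sketched in a few lines; the function names and session structure here are hypothetical, but the point is that nothing is persisted until the user explicitly answers "Yes":

```python
CONSENT_PROMPT = (
    "Do you agree to have your responses stored and used "
    "to improve your user experience? Yes/No"
)

def consent_given(reply: str) -> bool:
    """Interpret the user's free-text answer to the consent prompt."""
    return reply.strip().lower() in {"yes", "y"}

def handle_reply(reply: str, session: dict, store: list) -> None:
    """Persist the reply only if this session has recorded explicit consent."""
    if session.get("consented"):
        store.append(reply)
```

Recording the consent decision in the session (rather than just asking once and forgetting) also gives you an audit trail, which GDPR expects from data controllers.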

4. (joint responsibility) Consent for transferring data outside the EU. The simplest way to handle this is to store the chatbot data in the EU, for example, in EU data centers of AWS. If you prefer to store the data outside the EU, you need to get consent from users for that. You could add this to the consent message for the purpose of the chatbot (Measure 3).
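One way to enforce the "EU by default, elsewhere only with consent" rule is a small region-selection helper at the storage layer. This is a sketch with an illustrative (incomplete) list of AWS EU regions; the function name and default region are assumptions:

```python
# Sample AWS EU regions; a real deployment would list all regions it uses.
EU_REGIONS = {"eu-west-1", "eu-central-1", "eu-north-1"}

def storage_region(preferred: str, consented_to_transfer: bool) -> str:
    """Use the preferred region only if it is in the EU or the user consented
    to a transfer outside the EU; otherwise fall back to an EU default."""
    if preferred in EU_REGIONS or consented_to_transfer:
        return preferred
    return "eu-central-1"  # safe default: keep the data in the EU
```

The result can then be passed as the region when creating the storage client, so a missing consent can never silently route data outside the EU.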

Business agreement: Similar to how companies must sign a Business Associate Agreement (BAA) for HIPAA, GDPR has an equivalent Data Processing Agreement (see template).

Chatbot media: As discussed in a previous article, only Web-based chatbots are HIPAA-compliant, due to the third-party-in-the-middle complication. This restriction does not necessarily apply to GDPR. For example, there can be a GDPR-compliant bot on Messenger. However, the company (Data Controller), and not the chatbot platform (Data Processor), would typically be responsible for ensuring that Messenger deletes the user data when requested (right to erasure).
