In addition to industry-standard security and privacy policies, SmartBot360™ uses proprietary state-of-the-art technology to exchange private information directly between the customer and the business. This bypasses the "chat log stored on user device" and "man-in-the-middle" vulnerabilities presented by Facebook Messenger, SMS, and other chat media.
Read below about our secure architecture and the issues it addresses or contact us for more information and a free quote.
The SmartBot360™ Secure Architecture
A chat can start on a non-compliant medium such as Facebook Messenger or SMS. When sensitive information must be exchanged, a secure link is sent to the user so the conversation continues securely and seamlessly.
No registration or passwords are required, making the communication frictionless yet secure.
An agent using the SmartBot360™ Management Dashboard™ can also manually switch to a HIPAA-compliant chat with the press of a button.
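The secure handoff described above can be sketched as minting a single-use, expiring link and sending it to the user over the original medium. This is an illustrative sketch only, not SmartBot360's actual implementation: the function names, URL, and in-memory token store are assumptions (a real deployment would persist tokens in an encrypted database).

```python
import secrets
import time

# Illustrative in-memory token store; a production system would use
# an encrypted, persistent store with the same expiry semantics.
_tokens = {}

TOKEN_TTL_SECONDS = 15 * 60  # link expires after 15 minutes (assumed policy)

def create_secure_link(session_id, base_url="https://chat.example.com/secure"):
    """Mint an unguessable single-use token and return the secure chat URL."""
    token = secrets.token_urlsafe(32)  # cryptographically random, URL-safe
    _tokens[token] = (session_id, time.time() + TOKEN_TTL_SECONDS)
    return f"{base_url}/{token}"

def redeem_token(token):
    """Redeem a token exactly once; returns the session id, or None if the
    token is unknown, already used, or expired."""
    entry = _tokens.pop(token, None)  # pop enforces single use
    if entry is None:
        return None
    session_id, expires_at = entry
    if time.time() > expires_at:
        return None
    return session_id
```

Because the token is random and single-use, intercepting the link after it has been redeemed gains an attacker nothing, and no account registration is needed on the user's side.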
Common Security & Privacy Concerns Addressed by SmartBot360™
The above figure shows the main modules involved in a chatbot deployment. These are the main security concerns:
- Man-in-the-middle: If a chatbot is deployed on one of the chat media (Facebook Messenger, Slack, Skype, and so on) or on standard mobile texting (SMS), then the owner of the medium (Facebook or the mobile carrier) has access to the conversation. This automatically makes the conversation non-HIPAA-compliant.
- Chat log stored on user device: For example, if you use SMS to exchange sensitive data, anybody with access to your phone could read those messages, as they are stored on the device unencrypted and without password protection.
- Encryption of messages in transit: Fortunately, most media and connections are SSL-encrypted, so this is not a concern with most chatbot platforms. An exception is SMS, which is unencrypted. Nevertheless, this is something to always check.
- Encryption of data at rest: The Conversation Management Engine should use an encrypted database to store the chat log.
- Use of external NLP services: If a chatbot platform relies on external libraries or services to analyze the user text, e.g., extract a date or a phone number, then this communication must be secured. A reasonable approach is to never send any personally identifiable information (e.g., name or address) or any session information to such services, so they cannot associate messages with users. Another approach is to use only libraries that run inside the Conversation Management Engine and do not communicate with outside entities.
- Logging and access rights: This is more relevant for HIPAA compliance, which requires that the chatbot platform log all actions. It also requires that the chatbot platform follow strict policies on who is granted access to what data; in general, employees should be given access to sensitive data only if they sign the right forms and have a genuine need to access it.
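The external-NLP concern above can be illustrated with a small redaction pass that strips common PII patterns before any text leaves the Conversation Management Engine. This is a minimal sketch under assumed patterns (SSN, US phone number, email), not a complete or production-grade scrubber:

```python
import re

# Illustrative PII patterns; a real system would cover many more formats
# and likely use a dedicated de-identification library.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b(?:\+?1[-. ]?)?\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}\b"), "[PHONE]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
]

def redact(text):
    """Replace common PII patterns with placeholders so the text can be
    forwarded to an external NLP service without identifying the user."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text
```

Run before every outbound NLP call, a pass like this (combined with omitting session identifiers) prevents the external service from linking message content to a specific user.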