by Matthew Klahre, Andrew Baer, and Jay Gantz
Chatbots like Siri, Google Assistant, and Alexa, along with other personal assistants, raise a host of novel privacy concerns as their capabilities continue to expand rapidly. One such concern is that by directly integrating third-party services (e.g., Uber, Pandora), personal assistants are moving away from traditional downloaded apps and the privacy best practices associated with them. To help avoid litigation or regulation, chatbot operators should consider creating and following privacy best practices for this type of integration, keeping specifically in mind the notice requirement of the Fair Information Practice Principles.
In general, these third-party integrations can be grouped into two broad categories: registered services (which require a user to register before using, like Uber or Pandora), and unregistered services (for which user registration is not required, like CNN Flash Briefing or Akinator the Genie). We’ll discuss each in turn.
A user is required to authenticate themselves before using a registered service through a chatbot platform, typically by logging in to their third-party service account via the chatbot platform. For example, before using Uber via Google Home (the standalone speaker for Google Assistant), a user must first use the Google Home smartphone app to log in to their Uber account. This authentication process is an opportunity for the third-party service operator to give conspicuous notice of its privacy practices with respect to information collected via the chatbot platform. In fact, some authentication protocols, such as OAuth, specifically provide for this type of notice. Chatbot platform operators should consider contractually requiring third-party service operators to take advantage of it (for example, by including the requirement in the chatbot platform developer agreement). In addition, to the extent that the user has not previously accepted a “Terms of Service” (or similar agreement), such as when creating their account either for the chatbot platform or the integrated third-party service, each operator should consider using this opportunity to obtain acceptance.
A user is not typically required to take an additional authentication step before using an unregistered service, which makes developing a privacy best practice more complex than for registered services. For example, Akinator the Genie is a guessing game that is available on Google Home and Amazon Echo, and does not require a user to have an account to use the application. Chatbot platform operators should consider following one of two approaches for these types of services – either an app-store-like or an in-context approach.
With an app-store-like approach, a chatbot platform operator would not enable any third-party services by default, and would instead maintain a store, directory, or other visual interface through which a user would discover and activate third-party services. For example, Amazon Echo requires users to manually enable Alexa’s third-party “skills” through the Alexa smartphone app. This enabling process is an opportunity for the third-party service operator to give conspicuous notice of its privacy practices with respect to information collected via the chatbot platform. Chatbot platform operators with a store, directory, or other visual interface should consider including a conspicuous space within their design templates for such information, and should also consider contractually requiring third-party service operators to take advantage of it (for example, by including the requirement in the chatbot platform developer agreement).
On the other hand, chatbot platform operators may not always require users to discover and activate third-party integrated services. For example, Akinator the Genie is available on Google Home without further action from the user. In these cases, the chatbot platform operator should look for other opportunities to make the privacy practices of third-party integrated services conspicuously available to end users. For example, Google requires all third-party Actions developed for Google Assistant to post links to their privacy policies in Google’s Directory, where users browse Actions. Google also encourages Actions developers to make their privacy policies available on their own websites, and further recommends creating a Google Sites page or publishing a public Google Doc for developers that do not already have a website. In addition, Google requires that Action developers obtain in-context permission before accessing certain types of user data on the Google Home device.
Another option would be for third-party service operators to inform users about privacy practices through the conversation functionality of the chatbot. During a user’s first interaction with the service, the chatbot could inform the user that the third party’s terms and privacy practices are available for review online at a succinct, easy-to-remember URL. In addition, a chatbot could explicitly tell users about some of the third-party service’s sensitive privacy practices – for example, making users aware that geolocation is used in order to provide clothing recommendations based on the weather. The chatbot operator could require users to indicate acceptance of these terms with a verbal command that is logged and stored by either the chatbot operator or the third-party service operator as a record of notice.
There is no one-size-fits-all approach, and as chatbot platforms continue to innovate, so should the means by which they provide their users with conspicuous notice of their own and their partners’ privacy practices.
Any chatbot platform or integrated third-party service operator with questions or concerns about privacy issues should not hesitate to contact us.