New Media Doesn’t Mean New Rules: The Challenges of Chatbots
While chatbots can help brands become closer to consumers, they can also create legal and ethical challenges.
In 2017, more than 1.3 billion people used Facebook Messenger. Brands are now leveraging Messenger and other chatbot platforms to communicate directly with consumers and deliver relevant content using Big Data and AI technology.
However, while chatbots can help brands become closer to consumers, they can also create legal and ethical challenges. These were discussed at Social Media Week Los Angeles by panelists Hannah Taylor and Daniel Goldberg from Frankfurt Kurnit Klein & Selz, with Audrey Wu, Co-Founder & CEO, CONVRG.
What is a Chatbot?
Wikipedia describes a chatbot as a computer program designed to simulate conversations with human users, via a chat interface.
They are becoming increasingly popular among businesses with an online presence who want to create a better customer experience.
In the Messenger platform's first year, through April 2017, developers created 100,000 bots that exchanged two billion messages per month.
On Facebook Messenger, there are three message types:
- Standard messaging – has a 24-hour messaging window and must be initiated by the consumer, i.e. you opt in to talk to a chatbot
- Subscription messaging – has no 24-hour window, but messages must be transactional in nature. No advertising or promotional content can be communicated.
- Sponsored messaging – can be sent outside the 24-hour standard messaging window, but only if the consumer has interacted with you before.
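The distinction between in-window and out-of-window messages shows up directly in how messages are sent through Facebook's Send API. The sketch below builds a request payload that enforces the rule; the `messaging_type` and `tag` field names follow the Send API, but treat this as an illustration and check the current platform documentation before relying on it.

```python
def build_messenger_payload(recipient_id, text, within_24h, tag=None):
    """Build a Messenger Send API payload (illustrative sketch).

    Inside the 24-hour window a standard "RESPONSE" message is
    allowed; outside it, a message tag is required and the content
    must be transactional rather than promotional.
    """
    payload = {
        "recipient": {"id": recipient_id},
        "message": {"text": text},
    }
    if within_24h:
        payload["messaging_type"] = "RESPONSE"
    else:
        if tag is None:
            # Outside the window, untagged messages are not permitted.
            raise ValueError("A message tag is required outside the 24-hour window")
        payload["messaging_type"] = "MESSAGE_TAG"
        payload["tag"] = tag
    return payload
```

For example, a post-purchase shipping update sent after the window would use a tag such as `POST_PURCHASE_UPDATE`, while a promotional offer could not be sent at all.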
If you wish to use chatbot messaging on Facebook, you must comply with its platform policy and usage guidelines.
Do consumers know they are talking to chatbots?
The key concern among regulators and legislators is whether consumers actually know they are talking to a chatbot. Failing to make the use of a chatbot clear is potentially deceptive.
“If a platform does not provide an opportunity to make proper disclosures, then it should not be used to disseminate advertisements that require such disclosures.” Federal Trade Commission (FTC)
Facebook also says you must ensure that you provide all necessary disclosures to people using Messenger.
“Disclosures that are an integral part of a claim or inseparable from it should not be communicated through a hyperlink. Instead, they should be placed on the same page and immediately next to the claim and be sufficiently prominent so that the claim and the disclosure are read at the same time, without referring the consumer somewhere else to obtain this important information.” Federal Trade Commission (FTC)
From a marketing perspective this is “super clunky,” said Wu. The solution is to make chatbot conversations more of a teaser pointing toward further information, rather than providing direct answers with all the disclosure details in the chat window.
For example, if a consumer asks a pricing question, the chatbot could respond with “great question, take a look at…” rather than “The price is $x,” which would require all the disclosure detail alongside the response.
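The teaser pattern Wu describes can be sketched as a simple routing rule: answer in-chat only when no disclosure is required, otherwise link out to a page where the claim and its disclosures sit together. The function and category names below are hypothetical, purely to illustrate the pattern.

```python
# Hypothetical set of question categories whose answers carry
# claims that require disclosures (e.g. pricing, offers).
REQUIRES_DISCLOSURE = {"pricing", "promotion"}

def chatbot_reply(category, direct_answers, details_url):
    """Return a chat reply: direct answer when safe, teaser otherwise.

    category       -- the classified topic of the consumer's question
    direct_answers -- canned answers for disclosure-free topics
    details_url    -- page where the claim appears next to its disclosures
    """
    if category not in REQUIRES_DISCLOSURE and category in direct_answers:
        # No disclosure needed: answer directly in the chat window.
        return direct_answers[category]
    # Disclosure needed: tease and send the consumer to the page
    # where claim and disclosure are read together.
    return f"Great question! Take a look at {details_url} for full details."
```

The design choice here is deliberate: rather than cramming FTC-mandated disclosure text into a chat bubble, the bot never states the claim in-chat at all, so the disclosure-placement requirement is satisfied on the destination page instead.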
Chatbots and privacy issues
New media doesn’t mean new rules and Goldberg emphasized the need to apply FTC privacy principles to the use of chatbots:
- Provide reasonable security for consumer data
- Limit data collection and retention
- Take special care with sensitive data
- Respect consumers’ reasonable expectations
US companies should also be aware of the new General Data Protection Regulation (GDPR) law on data protection and privacy for all individuals within the European Union.