
Exploring age differences in motivations and technology acceptance for chatbots within a customer service context

Tyler Pilgrim, 11571659

University of Amsterdam
Graduate School of Communication
Research Master's Communication Science

Supervisor: Margot van der Goot
June 28, 2019


Abstract: A qualitative study was employed in order to explore age differences in chatbot usage in a customer service context. Using socioemotional selectivity theory and technology acceptance factors, age differences in motivations as well as in the perceived complexity and perceived security of chatbots were examined among older and younger adults. While no major differences were found in motivations for using chatbots for customer service, there were some age-specific differences in how older adults assessed the perceived complexity and perceived security of chatbots. Additionally, older adults tended to rate the perceived security of chatbots higher than younger adults did, opposite to what was expected.

Introduction

Chatbots in commercial communication

Just over a decade ago, before the rise of new media such as social media, commercial enterprises often relied on more traditional forms of communication with their customers. When customers needed to reach a commercial entity, they would have to call a phone line on which an agent could direct their enquiry, whether for support or for transactional interactions such as making a payment. Today, as commercial entities operate at global scale, many have opted for scalable, cost-saving techniques such as web care (Van Noort, 2012): proactive and reactive online support via communication tools such as social media. One newer form of customer support communication, which is at the crux of this study, is chatbots.

Chatbots, also known as disembodied conversational agents (DCAs), are designed to perform simple tasks such as sending airline tickets or providing healthcare tips or shopping advice for customers with questions (Araujo, 2018). One of the key properties of chatbots is that they are disembodied agents, programmed to perform actions based on language processing with no actual human involved in the informational transaction (Araujo, 2017). To this day, chatbots are an understudied form of communication in the communication science field due to their infancy in the market. Chatbots have exploded onto the scene, with over 100,000 chatbots being implemented in one year on the Facebook Messenger platform alone (Araujo, 2017). Many giants, such as KLM Royal Dutch Airlines, Twitter Inc., and Amazon, have invested in this technology as a way for customers to receive answers quickly without having to pick up the phone, visit a physical location, or send an email.

Chatbots and older adults

Currently there is a gap in research regarding what motivates older adults to interact with customer service chatbots, as well as how accepting they are of the technology. Various studies have discussed chatbots, but not in terms of age differences in motivations and technology acceptance (Araujo, 2018; Hill et al., 2015; Xu et al., 2017; Kramer et al.; Yaghoubzadeh et al., 2013). With 67% of adults older than 65 using the internet, older adults are an important target group (Cotton, 2017).

As many organizations using chatbot technologies serve a wide variety of demographics, older adults are an important target group to study, since chatbots should be designed to accommodate all ages. Studying older adults may help narrow the 'technological divide' between digital immigrants, older adults who did not grow up with advanced telecommunication technologies, and digital natives, younger adults who have been immersed in advanced communication technologies from a young age, making them more 'fluent' in the technology (Prensky, 2001).


This study seeks to expand the literature on age differences in the use of communication technologies, specifically chatbots for customer service. There is currently no known literature exploring age differences in motivations and technology acceptance for chatbots in a customer service context. It is expected that there will be age differences in motivational drivers for using a chatbot, based on Carstensen's (1991) Socioemotional Selectivity Theory, which outlines how goals and motivations alter over a person's lifespan. It is also expected that there will be age differences in technology acceptance of using a chatbot for customer service purposes. Technology acceptance can be understood as the adoption of a new technology into everyday life (Mitzner, 2010). Age differences are predicted because prior studies have shown age differences in the use of other communication technologies. For example, one study found significant differences in reasons for using the internet among young adults and older adults: young adults were more likely than older adults to use the internet in order to communicate with friends or strangers (Thayer & Ray, 2006).

Additionally, this study aims to supply companies and organizations using chatbots for customer service with insights into the needs and motivations that are unique to older adults. With more research in this space, companies will have a better idea of how to enable their chatbots to provide a service that meets the needs of their diverse customer base.

A qualitative study was employed in order to begin exploring age differences in chatbot communication in a customer service context. A qualitative method was selected because this study is explorative in nature and is meant to help set in place frameworks to be used in future research.


Theoretical Background

Motivations

As discussed above, differences in motivation towards chatbot usage between older and younger adults will be a key concept investigated in this study. One theory of motivational changes across the lifespan is the Socioemotional Selectivity Theory (SST) (Carstensen, 1991). The SST has been around for a few decades and was introduced by Carstensen (1991) in order to explain the evolution of goals and motivations in humans over the course of their lifespans. SST holds that the perceived time left in an individual's life will impact their goals, preferences, and even cognitive abilities (Carstensen, 2006). Carstensen states that there are two main categories of motivational drivers that influence people's goals and preferences: acquisition of knowledge and regulation of emotional states (Carstensen, 2006). These two categories tend to switch in prominence as people perceive their lifespans to shrink and become more finite.

Carstensen argues that young adults will seek activities that lead to more knowledge acquisition, while older adults will tend to seek activities that provide an emotional reward (Carstensen, 2006). In a study by Carstensen and Fung (2003), younger adults preferred advertisements whose slogans promised knowledge acquisition, while older adults preferred advertisements with slogans promising emotional rewards. Using the SST, one would argue that older adults will be motivated to use chatbots in a customer service context to fulfill an emotional need or because they promise an emotional experience, while young adults will be motivated to use chatbots for information seeking.

SST also implies predictions about social interactions. Carstensen explains that "reduced rates of interaction in late life are viewed as the result of lifelong selection processes by which people strategically and adaptively cultivate their social networks to maximize social and emotional gains and minimize social and emotional risks" (Carstensen, 1987, 1991). In other words, as adults go through life, their social circles become smaller as they start to value and seek more meaningful social interactions, while younger adults cast a wider social net for the benefit of gaining more information. In a study by Carstensen (1991), results showed that as age increases, satisfaction with interactions with acquaintances decreases, while satisfaction and emotional closeness in interactions with close friends and family increase.

While the SST concerns goals, motivations, and the selection of social interactions, the original SST makes no mention of technology or chatbots. It is argued in this paper that chatbots do represent a form of social interaction. Many studies in recent years have focused on how people interact with computers and unconsciously treat computers as social actors, a finding known as the Computers as Social Actors (CASA) paradigm (Nass & Moon, 2000). One could argue that the diminishing of social circles described by the SST will also be present in older adults' motivation to use a chatbot for customer service, since chatbots may be seen as social actors. One could expect this to manifest in older adults lacking motivation to use chatbots, instead relying on friends or family to help them with their customer support needs.

In summary, the SST states that goals and motivations change as people advance through their lifespans. The SST will be applied as a framework to understand whether this change in motivations is evident in older versus younger adults' chatbot interactions in a customer service context. Thus, the following research question has been formulated to explore motivational factors for using chatbots among older adults in comparison with younger adults:


RQ1: What motivates older adults (compared with younger adults) to utilize chatbots in a customer service context?

Technology acceptance

In addition to expecting differences in motivations for using chatbots for customer service purposes, I also argue that a chatbot user's technology acceptance, or whether an individual is willing to adopt a new technology, will depend on their age. Mitzner (2010) suggests that a number of factors, such as the perceived complexity of the technology as well as individual characteristics, will impact an older adult's acceptance of a technology.

Perceived complexity is whether the individual perceives a technology, in our case chatbots, as complex (Caine et al., 2006). Caine et al. (2006) go on to explain that the design of features in a product should minimize perceived complexity as much as possible for an individual to be more likely to accept a new technology. While Mitzner's (2010) study looks at older adults' attitudes towards technology in regard to complexity, there was no comparison with younger adults to see if there are age differences in perceived complexity across multiple age groups. Additionally, it has been found that as people age, there are unique needs due to diminishing physical and cognitive abilities that should be accounted for in technology design (Wagner et al., 2010). For example, declines in vision, psychomotor skills, memory span, or spatial abilities may present new challenges for older adults accepting a technology (Wagner et al., 2010). Thus, one could argue that there will be age differences in the perceived complexity of using a chatbot for customer service due to the above factors.

One other aspect to be explored is how secure older adults perceive chatbots to be in comparison with younger adults. Mitzner (2010) found that security concerns were often cited by older adults as a reason they disliked using technology, ultimately impacting their technology acceptance. In this study, it will be explored whether there are differences between older and younger adults in how secure they perceive chatbots to be. With the above in mind, the following research question has been developed:

RQ2: How does technology acceptance differ between older and younger adults for chatbot usage in a customer service context?

Methods

Sample

Interviews were performed among two distinct age groups. The first group, the older adults, included adults 55 years and older (N = 7). Fifty-five was chosen as the cut-off age because the National Health Interview Survey (NHIS) in the United States defines older age as 55 years or above (Schoenborn & Heyman, 2009). Ages in this group ranged from 55 to 81. The sample consisted of 5 women and 2 men. They lived in various regions of the United States. Education varied, ranging from a high school degree to the post-doctorate level.

The second group, the young adults, consisted of adults 30 and under (N = 7). Ages in this group ranged from 19 to 30. This group was used to explore whether there are age differences relative to the older adults. The young adult group represents a different generation who grew up with technology, or what Prensky (2001) would call digital natives. The sample consisted of 3 women and 4 men. Three participants in this group lived in the Netherlands and four participants lived in the United States. Education in this group varied from a high school degree to the master's degree level. A full participant table can be found in Item 4 in the appendix.

Participants were recruited from my personal network as well as by soliciting on Facebook.

Interviews

The study was approved by the Ethics Review Board of the University of Amsterdam prior to conducting interviews (2019-PC-10430). Interviews conducted in the Netherlands (3 in total) were done face-to-face, while interviews with participants from the United States were done over video chat via Google Hangouts. Participants who were interviewed face-to-face were given a consent form to sign as well as two documents: (1) a blobtree image (blobtree.com, 2019) and (2) a document containing a list of chatbots currently used by companies for customer service purposes, ranging from the travel industry to a telecommunications company (Item 1 in appendix). Video chat participants had fewer bots to choose from because bots using Facebook Messenger had to be removed from the list, since those participants were unable to log in remotely. For interviews via video chat, documents were sent beforehand.

Participants were instructed to interact with two of the chatbots from the chatbot list by asking customer service related questions using a computer. Once the chatbot connected to a live agent or the participant acknowledged that they were finished, an in-depth interview followed.

An interview guide was developed (Item 2 in appendix) in order to ensure consistency among the different interviews as well as to warrant that the necessary information was obtained. The interview guide consisted of four main topics: (1) their experience with the chatbot, (2) technology acceptance, (3) motivation, and (4) general reflection. Topics 1-3 were employed directly after interacting with a chatbot, for immediate recall of the experience of a specific chatbot. Topic 4 followed after participants had finished interacting with the two chatbots, in order to zoom out and reflect on chatbots at a more general level.

Topic 1 of the interview was designed in order to capture their immediate thoughts after using a chatbot in a customer service context. This section included a blobtree (Item 3 in appendix) in which the participant had to choose from a number of characters to best describe how they were feeling after the chatbot interaction. Participants were also asked how the interaction molded their perception of the brand or company.

Topic 2 served to obtain information from the participant regarding perceived complexity and how secure they felt the chatbot interaction was. Participants were asked open-ended questions, such as how the company could improve the chatbot in the future. Participants were also given two characteristics to rate, Complex and Secure, on a scale from 1 to 10 adapted from Mitzner et al. (2010), to obtain levels of perceived complexity as well as how secure they perceived the chatbot to be. A 10 indicated that the characteristic described the chatbot interaction very well, while a 1 indicated that the characteristic did not describe the chatbot interaction well. Participants were asked to give a numerical score and were then probed on why they assigned that specific score.

Topics 3 and 4 were used to derive motivational drivers for using a chatbot. Topic 3 followed directly after interacting with one chatbot, asking the participant to explain their expectations before using the chatbot. Topic 4 followed after using the two chatbots, to obtain information from the participants on the values of using chatbots for customer service, how chatbots can be useful for customer service, and what types of services or products chatbots are best suited for.


Analysis

The first step of the analysis was transcription of the interviews using a professional AI transcription service, AmberScript; transcripts were then edited by hand to ensure they were correct. The transcriptions were divided into two groups, older adults and young adults, followed by open coding (Strauss & Corbin, 1998). Codes were added to the transcriptions using Atlas.ti. The interviews were read several times in their entirety to ensure understanding of the context of the responses. After coding, quotations were divided into a total of 8 networks, 4 per age group, including: expectations, what chatbots are useful for, what chatbots are not useful for, and the most valuable reasons for using a chatbot for customer service. After consulting with a supervisor, the networks were collapsed into two main concepts: (1) motivational drivers for using chatbots for customer service and (2) technology acceptance of chatbots for customer service.

Motivational drivers were interpreted from interviewee responses by assigning the responses to three dimensions in Atlas.ti: (1) reasons to use a chatbot, (2) areas in which they are motivated to use a chatbot, and (3) expectancy violations. (1) Reasons consisted of responses in which participants expressed their expectations of using a chatbot and the values of using a chatbot, both in a customer service context. (2) Areas consisted of responses in which participants indicated the services, products, and industries for which they were motivated to use a chatbot for customer service. Lastly, (3) expectancy violations were interpreted from responses in which the chatbot technology failed in terms of what the participant was expecting from the interaction. Expectancy violations can be understood as the individual having an expectation of a behavior, in our case the behavior of chatbots, where the actual behavior does not perform in the way the individual anticipated (Burgoon, 1993).

Technology acceptance of a chatbot for customer service was interpreted from participants' responses by assigning the responses to two dimensions using coding in Atlas.ti: (1) perceived complexity and (2) perceived security. The perceived complexity dimension was built from responses in which participants rated how complex the interaction was as well as their reasoning behind the rating. Perceived security likewise came from asking the participants to rate how secure they felt their interaction with the chatbot to be, as well as their reasoning behind the rating. See Figures 1-4 below for visual representations.

Results

Motivations

Interviewee responses were divided into three dimensions: (1) reasons to use a chatbot, (2) areas in which they are motivated to use a chatbot, and (3) expectancy violations. For each dimension, the related indicators will be described with interviewee quotations, as shown in Figure 1. Older adults will be presented first, with younger adults to follow. Motivations will later be compared between older and younger adults in the discussion section.


Motivations of older adults to use chatbots for customer service

Figure 1: Results of older adults' motivations to use chatbots for customer service. *Indicators not present in young adults' motivations.

Motivations: Reasons

The most commonly cited motivation by participants was to get an answer to their question. When asked what they were expecting from the chatbot interaction, Participant 8 said: "to have my question answered". Participant 12 said: "Well to get to get an answer." While many indicated that they were expecting an answer, some interviewees mentioned they use chatbots to ask simple 'black and white' questions. Participant 9 said: "I think binary questions are good or questions that have clear cut data that can be entered like numbers or yes no. Is it this or is it that, yeah. I think that things that require explanation. Or that are nuanced are not good for chatboxes at all". Participant 11 said: "I guess chatbots again it's good for more simple basic not too complicated if you know what you're asking for then it's a useful tool". Interviewees also noted that they used chatbots so they could multitask while asking a question. Participant 8 said: "Kind of allows me to do other stuff again while I'm working on the chatbox. You know like multitasking."

Another prevalent motivation for using a chatbot for customer service was the speed with which they can be helped. The interviewees mentioning this saw a chatbot as a customer service medium that is faster than calling or emailing, where they might have to wait on hold or anticipate long response times. Participant 9 said: "But I think that, I think you just no one likes to wait on hold. Yeah. And I think these respond quickly". Additionally, some cited that they did not have to use the phone at all. Participant 14 said: "But I want to say also I hate calling companies so I really do like the idea of having a chat person that you know because if you call you're gonna be number 34 and you're going to wait three to five minutes or 20. And so I'm actually pretty pro chat I guess."

Interviewees also alluded to the fact that chatbots are easy and convenient forms of communication for customer service. Participant 11 said that the interaction was: "Quick easy succinct. Saves time." Participant 12 said that "convenience" is the most valuable reason for using a chatbot for customer service.

Some interviewees mentioned that they use chatbots as a bridge to connect to live customer service agents. Participant 10 said: "Sometimes these companies will hide behind the virtual assistant. There is no phone number or any contact information. The only way to get through is through a virtual assistant that can be sometimes that can be frustrating. But I don't mind them if they get you to where you need to be."

Lastly, a motivation cited by the interviewees for using a chatbot for customer service is to help them navigate a website. Participant 10 said: “Because I would say there's a lot of times when I'm internet shopping for example and I can't find something and I know it's there. Like the navy-blue shoes. Now I know they have it I saw it in the catalog and so I can even give them the catalog number. But the website keeps coming back saying no. I don’t see them anywhere or I don't know what a catalog is. So that's when I typically will open assistance thing and say help me. Yeah I'm in for this very specific item and in those cases it's usually very successful.”

Motivations: In which areas are older adults motivated to use a chatbot

One area that interviewees noted where they were motivated to use chatbots for customer service was e-commerce support. E-commerce support is defined here as the interviewee having bought something online and needing help from the company, for example with shipping. Participant 14 said: "I use them a lot. I use them with Amazon and I use them with clothing. And I buy candles. Don't tell (NAME REDACTED) online and I find them useful. If I did not receive my product." Technical support was another area noted. Participant 13 said: "I have found chats useful when dealing with customer service on technical issues with computer software or internet like the cable company to schedule a service call with a chatbot. I will contact Frontier or just some place to make an appointment with the chatbot or if I'm having trouble like with a web site." Non-urgent services was the next area. Participant 9 said: "I think non-urgent services. So, I think it would not be useful for health care for example". The last area was making appointments.

Motivations: Older adults’ expectancy violations

Expectancy violations are situations in which the chatbot did not behave as the individual anticipated. Some participants mentioned an expectancy violation in not being able to connect to a live agent from the chatbot. Participant 13 said, when asked how they would improve their experience: "The first question would be if you'd like to bypass this chat with a computer click here to get sent directly to a person." Additionally, several interviewees mentioned their frustration that their expectation of getting their question answered was not met. Participant 14 said: "It did not meet my needs and it didn't address what I was looking for." A similar but distinct expectancy violation was that the chatbot did not understand the participant's question. Participant 8 noted: "I thought my question was very basic in nature. If I had spoken to a live person there wouldn't have been a question as to what my question was. No confusion I'd say."

Lastly, some interviewees mentioned that the chatbot did not support a broad enough range of issues, so it was not able to help with their specific question. Participant 10 said: "I would make more branches to the tree you know, I think. Let's see what was my first question. It had sent me to four possibilities none of which was, none of the above."


Motivations of young adults to use chatbots for customer service

Figure 2: Results of young adults' motivations to use chatbots for customer service. *Indicators not present in older adults' motivations.

Motivations: Reasons young adults use chatbots

By far the most cited reason for using a chatbot for customer service was not needing to use the phone. Participant 5 said: "I think like some people just don't like talking to people on the phone. I don't want to talk to people on the phone all the time. So, providing an easy way for people to just kind of like type in their question and like get it reasonably answered." Many interviewees cited that they were motivated to use chatbots for customer service because they are fast and convenient. Starting with fast, Participant 3 indicated: "I expected that he gave me a fast answer to my problem." Participant 7 said that they found value in chatbots saving time: "Time. My time is important". Convenience also appeared in interviewee responses; Participant 2 said: "It's super-fast it's fast super-fast and convenient." Somewhat related, some interviewees mentioned their motivation for using chatbots was because they have busy lives. Participant 5 said: "I think it's useful for people who are like busy and couldn't like don't have like 45 [minutes]. Like I've spent an hour and a half on the phone with with an airline". Interviewees also said using a chatbot for customer service enables them to multitask. Participant 7 said: "I can multitask. I can feed the baby and I can ask the question."

Interviewees also indicated using chatbots as an efficient way to gain information such as finding information on a website. Participant 6 said: “Thought that it was pretty like enjoyable experience and I think that I would have gotten information that I needed without having to just sort of like fish. It was just like spoon fed to me in a way and that was a better experience than the last.” Participant 5 said: “ … it feels like an easier way to navigate their help center without actually like going through and searching through a bunch articles. It felt really easy way that I could go through if I was a Verizon customer and just find a bunch of different topics that. I could easily get to as opposed to not trying to navigate some kind of complicated help center”.

Some interviewees mentioned that they use chatbots because it removes needing to talk to a human completely. Participant 1 said: "You have your days where you're like I don't want to deal with any humans so I just want to have a chatbot that I can like ask questions and I can just be myself. You know sometimes when you're interacting with humans you have to be friendly. You have to be polite and. Maybe you had the worst day in your life when you still have to be polite whereas a chatbot you're like. You don't have to pretend or act in a way that you don't want. You can just be yourself be natural in the mood that you are."

Interviewees also indicated that they used chatbots to answer a question that they had. Participant 14 said: "Yeah I mean I expected it to answer my questions…." One note regarding this indicator was that many interviewees mentioned that they used chatbots for simple questions. For example, Participant 4 said: "It's kind of like an FAQ where you can just like type in your question and answer it. I definitely find them to be useful in that aspect but anything more complex I think it's a little harder for them."

Lastly, chatbots provide interviewees a way to ask questions outside of normal business hours. Participant 1 said: "It's like for me it's like I really just use a chatbot if I have a specific question that I cannot look myself up on the website of the company itself. Yeah it's like for a specific question that I have and you cannot call them because they're like closed or whatever."

Motivations: In which areas are young adults motivated to use a chatbot

E-commerce support was one area mentioned. Participant 4 said: "Like definitely any retail companies would be really good for just because like wanting to track an order or just like linking to the shipping rates or the cost of an item I think a chatbot would be really good for…" Another area that was mentioned was banking. Participant 5 said: "Yeah I think like utilities and like banks and stuff are pretty good for chatbots like where I need to just know something right away." Travel also came up as an industry where chatbots were useful for customer service. Participant 2 said: "…anything that has to do with customer experience customer support. Flights makes sense...". Government organizations were also mentioned as useful for chatbots. For example, Participant 3 said: "for example for a government page because things are really difficult to understand and you have to search information and go from this to from there. So if you just use a virtual assistant it's [snaps fingers]. She'll tell you what you need to search and then it's faster."

Lastly, booking appointments with companies was noted as another area. Participant 6 said: “Recently I had to I had some issues with my internet service. And I used the chatbot, like the sonic chatbot, to schedule like a technician to come. And like well first they tried to like troubleshoot it with me like over the chat and we weren't able to fix anything so I was able to like actually schedule a person like a person to come to my house. Which was cool. I liked being able to use that for that.”

Motivations: Young adults’ expectancy violations

Young adults mentioned several expectancy violations during their chatbot interactions. The most frequently mentioned was being unable to get their question answered. Participant 2 said: "They didn't meet my expectations I was expecting to get an answer for what I was looking for and I just got something completely different." Interviewees also said that the chatbot sometimes did not understand their questions correctly. Participant 2 said: "So I think I made a question that was easy to understand for a chatbot and still couldn't."

Some participants mentioned that the answers they got were too robotic. Participant 6 said: “I expected to, like even if even if it was a robot, I like was expecting it to be more like hi like what can I help you with today kind of a thing. But it was more like it was more like a search bar.” Lastly, one participant mentioned that the chatbot’s responses were not as fast as they would have liked. Participant 4 said: “Disliked the delay in the welcome message…”.


Technology acceptance

Interviewee responses were divided into two main dimensions that make up technology acceptance: perceived complexity and perceived security. For each dimension, indicators were divided into those in support of the dimension and those not in support of the dimension. Figure 3 shows an overview of the indicators that make up each dimension of technology acceptance for chatbot use in a customer service context. Older adults will be presented first, with younger adults to follow. Technology acceptance will later be compared between older and younger adults in the discussion section.

Technology Acceptance of older adults for chatbot use in a customer service context

Figure 3: Technology acceptance of older adults. *Indicators not present in young adults' technology acceptance results.

Technology Acceptance: Older Adults’ Perceived Complexity

Interviewees' comments indicating that the chatbots were perceived as complex often had to do with how information was organized within the customer service chatbot's responses. One interviewee mentioned that there were too many types of services to choose from when they started to interact with the chatbot. Participant 8 said: "There are so many components in terms of internet versus phone et cetera et cetera. And then billing just a lot more components to choose from. That's the only reason it would make it more difficult because of the more components." Another interviewee mentioned that there were too many steps combined into one response, as opposed to a back-and-forth sequence like in a conversation. Participant 9 said: "Yeah I think break I think breaking down the steps required to get a response. And then we create sequence instead of trying to do it all in one box." It was also mentioned that there were too many layers of incorrect responses to get to the information the interviewee wanted to receive. Participant 14 said: "It just didn't answer my question and gave me things that I wasn't looking for. And then when I chose something new from the list, it still didn't answer it. So, it was it was complex. It was too many layers."

Some interviewees mentioned that it was difficult to find the chatbot on the company's website. There were several times in which the interviewer had to give detailed steps on where to find it. Participant 14 said: "It was a little complex because I couldn't find her right away." Lastly, interviewees mentioned that the interaction was complex simply because the chatbot was not able to understand their question. Participant 14 said: "it was difficult because it said to ask the question. And when I asked the question it misinterpreted it and then it gave me things to choose from."

Interviewees also mentioned several things which made the chatbot interaction not complex, indicating a low level of perceived complexity. Interviewees mentioned that the chatbot gave clear instructions of what to do, making it less complex. Participant 8 said: "There is absolutely no misunderstanding of what I was doing or challenge. I guess it was clear very clear." Another area that came up was that the chatbot was well-written, making things less complicated. One interviewee explained: "…get into tech support that's run by someone who is using English as a second language and it can get complicated fairly quickly. This seems to be very, it's well written." Additionally, interviewees noted that being able to ask questions in natural language to a chatbot made things easy. Participant 8 said: "Yeah I asked very English language kinds of questions. And it was able to decipher them."

Lastly, an interviewee mentioned that using the chatbot was not complex because all the information was there in one box in front of them, without having to open multiple windows on the computer. Participant 8 said: "It was easy. It was very user friendly. It is not too difficult to ascertain. It was basic. What it needs is to be basic, you didn't have to open a bunch of items to get to where you needed to go its right there." Appendix item 2 contains the full list of older adults' complexity scores.

Technology Acceptance: Older Adults' Perceived Security

Older adults cited two reasons why they perceived the chatbot interaction as secure. The first was that they simply assumed it was secure. When asked why they had given the interaction a highly secure score, Participant 8 said: "a 10. But I don't know that, I'm just going to make that assumption." Another interviewee was unsure of how secure it was, so they rated it as highly secure. Participant 11 said: "Well I I'm not sure how secure it is. I Guess 10." The second reason they perceived the chatbots as secure was that the interviewee had used the company previously. Participant 10 said: "I've done it on Amtrak before. So yeah I'm giving it 10."

There were more reasons why older adults felt that using chatbots for customer service was not secure. The first was that there is no way to tell whether a chatbot is secure or not. Participant 9 said: "There's no way to display it. So, when I think of secure that to me means that the information that I share with them including my I.D. Yeah it I'm not, is it secure and there is no way to assess that." It was also mentioned that the chatbot was so unintelligible that it made the interaction feel not secure. Participant 13 said: "No, there's no way I would type in my credit card information to a chat where it doesn't even know what I'm saying." Another participant's anti-virus software was set off when entering the chat, making them feel like it was not secure. Participant 10 said: "However the second part as I said I was gonna be a two-part answer. Second part of that is when we started our discussion I think it was the English interviews and I popped open the Macy's thing. It set off my Microsoft XP antivirus. So, I've got, I would have to ding you on trust there but I don't know if that's because we're doing an experiment and it's not the real Macy's or what triggered that but that's going to color my perception." Lastly, one interviewee felt that the company would have access to the transcripts from their support conversation. Participant 13 said: "Anybody that works there can pull up that transcript and see it." Appendix item 3 contains the full list of older adults' security scores.


Technology Acceptance of young adults for chatbot use in a customer service context

Figure 4: Technology acceptance of young adults. *Indicators not present in older adults' technology acceptance results.

Technology Acceptance: Young Adults’ Perceived Complexity

Young adult interviewees perceived the chatbots as complex for two main reasons. The first was that the interviewee had to adjust the way they would normally speak in order to accommodate the chatbot. Participant 2 said: "The only thing I had to keep in mind that my sentences were, the way I was speaking had to be adjusted a little bit. And even for example trying to be less social or less kind that I usually try to be when I speak with another human being since this was machine I just tried to go to the point." Another interviewee mentioned that it was easy to get lost using the chatbot. Participant 5 said: "I think it would be pretty easy to get lost if you didn't really know what you were doing. But for somebody who does know what they're doing they can probably find the right answer without having to talk to a person."

There were far more reasons cited by the interviewees for why interacting with the chatbot was not perceived as complex. The first and most noted was that the chatbot had pre-existing options to choose from. For example, if the chatbot asked 'Did I solve your problem?', instead of typing yes or no, the interviewee could click yes or no. Participant 4 said: "I don't find this one complex at all just because it's so streamlined on how you can easily get there. You answer either by having the option to just click on one of the three options it gave, or just entering in your question." Some participants found that the chatbots were not complex because there was a level of familiarity with other services that they use. Participant 7 said: "Pretty easy. It just popped up asked you to ask a question. It was like, like back in the day when we used aim" (AOL Instant Messenger). Another interviewee said using the chatbot was not complex because it was like talking to another person. Participant 5 said: "It was it was so easy for me just to type in a few things it was almost like talking to my wife just saying like can we book a ticket to Oakland on Amtrak to leave on the twenty third and come back on the 24th. Like it was it was that kind of easy." It was also noted that there are few barriers to entry to start a conversation with a chatbot, making it less complex. Participant 4 said: "I find it to be easy. Just because it's pretty straightforward. Once you click on the link it takes you right there. And all you have to do is in your name and your email address and make your question." Lastly, it was mentioned that when the chatbot asked for feedback such as, 'Is this what you meant?', it made the interaction less complex. Participant 6 said: "Easier. I think that part of it has to do with like the real-time feedback. So, it was like continue to ask me like do this give you the information that you needed. Like are you satisfied with this information." Appendix item 4 contains the full list of young adults' complexity scores.

Technology Acceptance: Young Adults' Perceived Security

Some young adults simply assumed that the interaction they were having with the chatbot was secure. Participant 2 said: "I don't have any reason to think that it is not secure." Others said the interaction was secure because they were interacting with a big company. Participant 3 said: "Also because it's a big company. You just trust them." One interviewee said they felt secure about the interaction because, when it came time to provide personal information, it directed them to a website instead of having them input it in the chatbot. Participant 5 said: "So I said it would be easier if they would make me book in in the chat but I felt it was more secure and it took me to like an actual booking page on Amtrak." Others mentioned that they felt the interaction was secure because the chatbot didn't ask them for personal information. Participant 4 said: "Because just because I don't really know it I didn't really have like an account it couldn't really ask for those kinds of sensitive questions about like money or anything. So, I just didn't experience it being secure but I didn't experience it not being secure." Lastly, one interviewee mentioned that they would provide sensitive information to a chatbot but would not necessarily feel good about it. Participant 6 said: "Yeah. Like I probably would do it [give sensitive information] but I wouldn't feel like great about it."

(29)

Various reasons why interviewees felt the chatbot interaction was not secure were provided. Interviewees mentioned that they are unable to assess the security just from using a chatbot. Participant 7 said: "I don't really know. So, a five." It was mentioned that the interaction did not feel secure because it would feel like giving sensitive information to a person. Participant 6 said: "I think it's because it feels like it's a person. You know even though it's like probably a robot and It seemed. It seems like weird though because like I would put my information like into an account or something you know." Another reason for not feeling that the interaction was secure was being unfamiliar with the company. Participant 1 said: "Well because I mean to be honest I don't know the company well so I don't want to like underrate them. But I also don't want it to be like over rated." Lastly, an interviewee mentioned that they felt less secure simply because they were unaware of what a chatbot could do with their information. Participant 7 said: "especially just because I know it's a bot so who knows what they could do." Appendix item 5 contains the full list of young adults' security scores.

Discussion

Age differences in motivations for using a chatbot for customer service

The aim of the current study was to explore motivational and technology acceptance differences in chatbot usage in a customer service context. The first dimension explored was the reasons older and younger adults were motivated to use customer service chatbots. Based on Socioemotional Selectivity Theory (SST) (Carstensen, 1991), it was expected that older adults would use chatbots in a customer service context to fulfill an emotional need, while young adults would be motivated to use chatbots for information seeking (Carstensen, 2006). In the current interview study, this expectation was not observed for the older adults. There was no evidence supporting that older adults were using chatbots in a customer service context in order to fulfill some form of emotional need. In fact, many of the motivational reasons for using a chatbot overlapped with those of the younger adults, such as getting an answer to a question, asking simple questions, multitasking, getting a fast response, not having to call, convenience, and finding information on a website. Different from the young adults, a distinctive motivational reason for using a chatbot in the older adult group was to connect to a live agent. It is interesting to note that older adults were using chatbots to connect to a live agent, which may indicate that they were looking for customer support with more of a 'human touch', although it remains undetermined whether this was to achieve an emotional need.

On the other hand, applying the SST (Carstensen, 2006), it was expected that younger adults would use chatbots as an information seeking tool. This was observed, with young adults citing that they use customer service chatbots like search engines to locate information on a website which may be hard to find on their own. Motivational indicators unique to younger adults revolved around having a busy life and being able to contact customer service at convenient times, which may be outside of the organization's business hours. Lastly, the young adult group said they would use a chatbot so they did not have to interact with a human at all, something that did differ from the older adults, as older adults often cited that they used chatbots in a customer service context in order to reach a live agent.

The second dimension explored was the areas in which older and younger adults are motivated to use chatbots. Older adults mentioned they would use chatbots for e-commerce support and making appointments (both of which were also found among younger adults). Older adults also said they would use customer service chatbots for technical support and non-urgent services (non-urgent service was defined by one older interviewee as not an emergency healthcare service), both of which were not found in the younger adults. Younger adults additionally indicated that they would use chatbots for banking, travel, and government related questions. Again, applying SST, there was no evidence to support that older adults were motivated to use chatbots to fulfill emotional needs within their chosen areas.

Lastly, the third dimension of motivations explored was expectancy violations in chatbot usage. There were some small age differences in expectancy violations, but nothing that supported older adults using chatbots in order to fulfill emotional needs (as proposed based on SST). Expectancy violations in the older adult group often had to do with the performance of the chatbot, in not being able to answer their question or not understanding their question. Additionally, not being able to connect to a live agent and the chatbot not being broad enough to cover their issue were specific to the older adult group. Young adults expressed similar expectancy violations, with the chatbot not answering their question as well as not understanding their question. Young adults also indicated that the conversation felt too robotic and that the chatbot was too slow.

It was also expected, based on SST, that older adults might lack motivation to use chatbots for customer service, instead relying on friends and family to obtain customer service answers on their behalf due to their shrinking social circles (Carstensen, 1991). This was also not found to be the case. All of the older interviewees, with the exception of one, had experience using chatbots for customer service needs and expressed that they would use chatbots for customer service again in the future. In none of the interviews did any individuals mention relying on friends or family to help with customer service requests instead of using a chatbot.


In general, no evidence was found to support that older adults were motivated to use customer service chatbots in order to fulfill an emotional need. On the other hand, there was some evidence to suggest that young adults are motivated to use chatbots as information seeking tools. There was quite a bit of overlap in the motivational reasons, areas, and expectancy violations between the two groups, suggesting that in this sample there were no major differences in motivations for using a chatbot for customer service.

Age differences in technology acceptance for using a chatbot for customer service

In terms of technology acceptance, it was expected that there would be age differences in perceived complexity and perceived security for chatbots in a customer service context. It was expected that older adults would perceive chatbots as more complex and less secure due to older adults being part of the 'digital immigrant' generation, that is, not having grown up with the technology present today (Prensky, 2001). It was also expected that older adults would find chatbots more complex due to age-related declines in cognitive abilities (Wagner et al., 2010). In the present interview study, there was no evidence that older adults overall rated chatbots as more complex (see item 5 in appendix). However, there were differences in how they assessed the complexity of a chatbot. For example, when giving reasons why chatbots were complex, older adults tended to focus on how the information was provided or displayed. Older adults mentioned that there were too many components in the chatbot responses, that there were too many steps in a single response, and that there were too many layers. Older adults also cited that they had trouble when the chatbot did not understand their question and that they had issues finding the chatbot. Younger adults, on the other hand, had fewer reasons for the chatbot being complex, citing that they had to adapt the way they spoke to the chatbot so it would understand them and that the chatbot was sometimes easy to get lost in. In sum, there was no evidence to support that older adults overall perceive chatbots as more complex, but they do rate complexity using different factors than young adults.

Lastly, it was expected that older adults would rate their interactions with the chatbots as less secure. This was also not the case; in fact, it was slightly the opposite. Older adults tended to rate the chatbot interaction as more secure (see item 6 in appendix) than young adults (see item 8 in appendix). Both groups had a difficult time assessing how secure the chatbot interaction actually was. When older adults did not know how to assess the security of the chatbot, they often rated the interaction as more secure, while younger adults tended to do the opposite, being more critical and rating the interaction as less secure. This may be due to younger adults being more familiar with newer forms of communication technology and having a better sense of the risks involved in using a chatbot for customer service.

Practical recommendations

There are some practical recommendations for companies and organizations using chatbots for customer service based on the results of this study. First, based on this sample, chatbots do seem to be in demand for customer service among both older and younger adults. Both groups see the value in chatbots and are willing to use chatbot technology as a first step to get in contact with a company or organization. If the company's user base includes older adults, it would be recommended to create a chatbot experience that does not have long and complicated text responses. Older adults seemed to value simpler text that did not contain too many steps and was more of a sequential conversation. Older adults also had a harder time finding chatbots on the company's website. Organizations utilizing chatbots for customer service should be sure to place the chatbot option somewhere obvious, so the user does not have to search for it. Additionally, older adults in this sample often mentioned that they were using a chatbot as a bridge to get to a real person. It would be recommended to have some way to connect to an agent if the initial interaction does not answer the user's question. Or, if there is no live agent via chat, include other contact methods so the user can easily continue receiving support through another communication mode.

The practical recommendations for younger adults tend to center on security. If the interaction needs to collect some form of sensitive information, the chatbot itself may not be the best place to gather it; instead, have the chatbot link the user to a secure website so the user can input the information on a full website rather than within the chatbot box. Additionally, young adults seemed to use chatbots as an information directory. Younger adults saw value in using a chatbot to find information on a website. Instead of using a traditional search bar on a customer service help webpage, it may be worth trying a chatbot there to provide helpful links or resources. Lastly, young adults from this sample acknowledged that they need to change the way they type in order to help the chatbot understand their request. To overcome this, practitioners can include clear instructions that tell the user how to interact with the chatbot. For example, if a chatbot can only read dates entered as numbers, make sure to tell the user that clearly.

In exploring age differences, the study does not suggest that improving chatbots for customer service requires separate experiences for older and younger adults. The recommendations would be advantageous for both older and younger adults, and not at the expense of either group.


Limitations and future research

There are some limitations that must be discussed for this study. The first is that there may have been some level of interview bias. Most of the interviewees were to some degree familiar with the interviewer, which may have skewed some of the answers. Often, participants mentioned that they expected the chatbot to answer their question; the chatbot would then not fully answer it, yet the participant would still say that the chatbot met their expectations. There were several cases in which the interviewer noticed that the chatbot was not actually providing the correct answer, but the participant still rated the interaction as not complex and as meeting their expectations.

The next limitation is that the older adults in this sample may not be a true representation of 'digital immigrants' (Prensky, 2001). As all older adult interviews in this study were performed over video call, most of the older adults had at least a basic level of computer skills and experience with computers. Additionally, all older adult participants, with the exception of one, had experience using chatbots. One could argue that adults who are able to do interviews via video call and have experience with chatbots may not be truly representative of 'digital immigrants'.

Lastly, participants were asked to interact with chatbots in an 'unnatural' way. For example, participants had to formulate questions on the spot about companies that they may not necessarily use in their personal lives, although interviewees were instructed to ask the chatbot a question they might actually ask if they were on their own. Because participants may not have actually been experiencing the issue they were chatting about, the outcome may have differed from natural use: had participants had a real question that they needed a customer service chatbot to answer, the failure of the chatbot to produce a correct or appropriate answer might have resulted in harsher ratings of complexity, security, or expectations.

With the above in mind, there are some interesting opportunities for future research in this space. It would be worthwhile to continue investigating age differences in how people perceive the security of new communication technologies such as chatbots. In this sample, younger adults tended to be more critical, while older adults seemed to assume the interactions were secure. Additional qualitative research exploring how individuals assess the security of a chatbot is suggested in order to understand the mechanisms behind that assessment; such insight would help organizations design chatbots in a way that is perceived as more secure.

Additionally, some older adults in this study mentioned not being able to tell whether a chatbot they were interacting with was a person or not. While none of the bots in this study caused that confusion, older adults noted it when describing their past experiences with chatbots; this misunderstanding did not appear in the younger adult group. An experimental study in which older and younger adults interact with customer service chat (one bot and one live agent) could determine whether there are age differences in the ability to assess whether the agent is a person or a robot. Further qualitative research could also examine which cues adults use to assess whether there is a human behind the screen.

Conclusion

In conclusion, a qualitative study was conducted in order to explore differences between older and younger adults in their experiences of customer service chatbots, specifically regarding motivations and technology acceptance. While there was overlap in motivations for using a customer service chatbot among older and younger adults, there were still some age-specific differences in why they use them, in which areas they use them, and in their expectancy violations of how a chatbot should work. There was no evidence that older adults were motivated to use chatbots to gain an emotional experience, as the SST would suggest (Carstensen, 1991). Additionally, age differences in technology acceptance of customer service chatbots were expected. While there were no major differences in how complex the two groups perceived the chatbots to be, there were age-specific criteria that the two groups applied to assess whether the chatbot interaction was complex. Additionally, contrary to expectation, older adults perceived customer service chatbots as more secure than younger adults, who tended to be more critical when rating perceived security.


References:

Araujo, T. (2018). Living up to the chatbot hype: The influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Computers in Human Behavior, 85, 183-189. doi:10.1016/j.chb.2018.03.051

Blob Shop - Home of the Blob Tree Communication Tools. (2019). Retrieved from https://www.blobtree.com/

Burgoon, J. K. (1993). Interpersonal expectations, expectancy violations, and emotional communication. Journal of Language and Social Psychology, 12(1-2), 30-48. doi:10.1177/0261927x93121003

Caine, K. E., O'Brien, M., Park, S., Rogers, W. A., Fisk, A. D., Ittersum, K. V., . . . Parsons, L. J. (2006). Understanding acceptance of high technology products: 50 years of research. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 50(18), 2148-2152. doi:10.1177/154193120605001807

Carstensen, L. L. (1992). Social and emotional patterns in adulthood: Support for socioemotional selectivity theory. Psychology and Aging, 7(3), 331-338. doi:10.1037/0882-7974.7.3.331

Carstensen, L. L. (2006). The influence of a sense of time on human development. Science, 312(5782), 1913-1915. doi:10.1126/science.1127488

Chung, J. E., Park, N., Wang, H., Fulk, J., & McLaughlin, M. (2010). Age differences in perceptions of online community participation among non-users: An extension of the Technology Acceptance Model. Computers in Human Behavior, 26(6), 1674-1684. doi:10.1016/j.chb.2010.06.016

Cotten, S. R. (2017). Examining the roles of technology in aging and quality of life. The Journals of Gerontology: Series B, 72(5), 823-826. doi:10.1093/geronb/gbx109

Fung, H. H., & Carstensen, L. L. (2003). Sending memorable messages to the old: Age differences in preferences and memory for advertisements. Journal of Personality and Social Psychology, 85(1), 163-178. doi:10.1037/0022-3514.85.1.163


Hill, J., Ford, W. R., & Farreras, I. G. (2015). Real conversations with artificial intelligence: A comparison between human–human online conversations and human–chatbot conversations. Computers in Human Behavior, 49, 245-250. doi:10.1016/j.chb.2015.02.026

Krämer, N. C., Simons, N., & Kopp, S. (n.d.). The effects of an embodied conversational agent's nonverbal behavior on user's evaluation and behavioral mimicry. Intelligent Virtual Agents, Lecture Notes in Computer Science, 238-251. doi:10.1007/978-3-540-74997-4_22

Mitzner, T. L., Boron, J. B., Bailey Fausset, C., Adams, A. E., Charness, N., Czaja, S. J., et al. (2010). Older adults talk technology: Technology usage and attitudes. Computers in Human Behavior, 26(6), 1710-1721.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103. doi:10.1111/0022-4537.00153

Noort, G. V., & Willemsen, L. M. (2012). Online damage control: The effects of proactive versus reactive webcare interventions in consumer-generated and brand-generated platforms. Journal of Interactive Marketing, 26(3), 131-140. doi:10.1016/j.intmar.2011.07.001

Prensky, M. (2001). Digital natives, digital immigrants. From Digital Natives to Digital Wisdom: Hopeful Essays for 21st Century Learning, 67-85. doi:10.4135/9781483387765.n6

Schoenborn, C. A., & Heyman, K. M. (2009). Health characteristics of adults aged 55 years and over: United States, 2004-2007. PsycEXTRA Dataset. doi:10.1037/e623972009-001

Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

Thayer, S. E., & Ray, S. (2006). Online communication preferences across age, gender, and duration of Internet use. CyberPsychology & Behavior, 9(4), 432-440. doi:10.1089/cpb.2006.9.432

Thompson, R. A. (1990). Emotion and self-regulation. In R. A. Thompson (Ed.), Socioemotional development: Nebraska Symposium on Motivation, 1988 (Vol. 39, pp. 367-467). Lincoln: University of Nebraska Press.


Wagner, N., Hassanein, K., & Head, M. (2010). Computer use by older adults: A multi-disciplinary review. Computers in Human Behavior, 26(5), 870-882. doi:10.1016/j.chb.2010.03.029

Xu, A., Liu, Z., Guo, Y., Sinha, V., & Akkiraju, R. (2017). A new chatbot for customer service on social media. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). doi:10.1145/3025453.3025496

Yaghoubzadeh, R., Kramer, M., Pitsch, K., & Kopp, S. (2013). Virtual agents as daily assistants for elderly or cognitively impaired people. Intelligent Virtual Agents, Lecture Notes in Computer Science.


Appendix

Item 1: Chatbot list

List of chatbots for the interviews

Organization: KLM*
Where to find bot: https://www.facebook.com/messages/t/KLM
Scenarios: ask for ticket prices; book a ticket

Organization: Hipmunk*
Where to find bot: https://www.messenger.com/t/hipmunk
Scenarios: book a trip; get vacation ideas

Organization: Kayak*
Where to find bot: https://www.messenger.com/t/90811893045/
Scenarios: book a trip

Organization: Verizon
Where to find bot: https://www.verizon.com/vzbot/vzbotproxy/web?b=wav
Scenarios: phone plans; broken phone; general wireless plan information

Organization: Amtrak
Where to find bot: https://www.amtrak.com/ ("ask Julie" at top right)
Scenarios: book a trip; general questions about routes; information on seating

Organization: Macy's
Where to find bot: https://www.virtualagent-macys.com/Chat/Client?pageUrl=https%3A%2F%2Fwww.customerservice-macys.com%2Fapp%2Fcontact%3Fcm_sp%3Dnavigation-_-bottom_nav-_-contact_us%26lid%3Dglbbtmnav_contact_us-intl&area=oracle_cs&uniqueID=JVE618DZ7TORW
Scenarios: general FAQs; how to return an order; track a package

Organization: Tommy Hilfiger*
Where to find bot: https://www.facebook.com/messages/t/KLM
Scenarios: refunds; find a store; damaged product

*Facebook Messenger bots were removed for video chat interviews since interviewees could not log into Facebook remotely.


Item 2: Interview Guide

Introduction:

- The interview is about the chat function in customer service.
- You will chat with the customer service of several organizations. On the laptop, I have the chat options open, and you will chat with two of them. For each, I will ask you some questions. My questions are very open-ended because we are exploring a wide range of experiences; we have not yet narrowed it down.
- I will record the interviews; are you ok with that? You will remain anonymous: I will not mention your name on any of the documents.
- Do you have any questions before we start?
- This is the informed consent form.

1. Experience of...
- Please share with me all your experiences during this chat (anything!).
- Blob tree: Which character on this picture describes this experience during the chat best? Feel free to use more than one. Can you explain why you chose these ones?
- Company perceptions: Taking this experience with the chat into consideration, how do you see this company? What does this tell you about the company?

2. Technology acceptance
- What did you like about the chatbot? What did you dislike?
- Was using the chatbot easy or difficult? Can you explain your reasoning?
- If you were to help the company make a new version of this chatbot, what other elements or features would you include? What elements or features would you remove?
- How well does each of these characteristics describe the experience with the chatbot: complex, reliable, secure? Please explain. (Rated 1-10; 10 describes the experience very well.)

3. Motivation
- What did you expect when starting the chat? Did the chatbot meet your expectations?
- Would you use this chatbot in the future? Why or why not?

After the specific chatbots:

4. General reflection
- Can you tell me ways that chatbots could be useful for customer service? (Types of organizations? Types of questions?)
- Can you tell me ways that chatbots would not be useful for customer service?
- What types of products or services do you think chatbots would be best suited for?
- What do you find to be the most valuable reason for using a chatbot for customer service?
- When you have a question for customer service again, would you contact them this way again or not? Why?

5. Demographics
- Gender
- Age
- Education
- Country of residence


Item 3: Blob tree image


Item 4: Overview of participants

#    Age  Gender  Education    Residence  Medium        Experience with chatbots?  Notes
1    28   M       Bachelor's   NL         Face-to-face  Y
2    30   M       Master's     NL         Face-to-face  Y
3    19   M       High School  NL         Face-to-face  Y
4    30   F       Bachelor's   US         Video chat    Y
5    29   M       Bachelor's   US         Video chat    Y
6    29   F       Master's     US         Video chat    Y
7    27   F       Bachelor's   US         Video chat    Y
8    60   F       Bachelor's   US         Video chat    Y
9    67   M       Postdoc      US         Video chat    Y
10   63   M       Bachelor's   US         Video chat    Y
11   65   F       Bachelor's   US         Video chat    Y
12   81   F       High School  US         Video chat    N     Needed assistance in typing and opening chatbot
13   54   F       Bachelor's   US         Video chat    Y
14   59   F       Bachelor's   US         Video chat    Y


Older Adult Complex Scores
[Table: complexity ratings (scale 1-10) given by participants 8-14 for chatbot 1 and chatbot 2; the position of each rating on the scale could not be recovered from the source layout. Participant 12 rated only chatbot 1.]

Item 5: Older adults' complexity scores given directly after using the chatbots. Participants were asked to rate from 1 to 10 how complex their interaction with the chatbot was; 1 indicates least complex and 10 indicates most complex.


Older Adult Secure Scores
[Table: security ratings (scale 0-10 as displayed) given by participants 8-14 for chatbot 1 and chatbot 2; the position of each rating on the scale could not be recovered from the source layout. Participant 9 gave no ratings; participant 12 rated only chatbot 1.]

Item 6: Older adults' security scores given directly after using the chatbots. Participants were asked to rate from 1 to 10 how secure they perceived their interaction with the chatbot to be; 1 indicates least secure and 10 indicates most secure. *Participant 9 would not answer, indicating they were unable to rate how secure an interaction with a chatbot was. **Participant 13 insisted on rating security as 0 even though the scale was 1 to 10.


Young Adult Complex Scores
[Table: complexity ratings (scale 1-10) given by participants 1-7 for chatbot 1 and chatbot 2; the position of each rating on the scale could not be recovered from the source layout.]

Item 7: Young adults' complexity scores given directly after using the chatbots. Participants were asked to rate from 1 to 10 how complex their interaction with the chatbot was; 1 indicates least complex and 10 indicates most complex.
