Description
Voice assistants (VAs) are devices that use AI, machine learning, and NLP to enable users to perform diverse tasks by voice. VAs are also "always on": they continuously analyze ambient sound and begin interacting with users once they recognize a wake word such as "Hey Siri" or "Okay Google", which means VAs must listen to their surroundings at all times. This raises privacy concerns in the form of perceived surveillance. This study assesses how perceived surveillance affects the continuance usage intention of VA users in Indonesia, with personal information disclosure as a mediator. The surveillance effect model was used to measure perceived surveillance, and the model was estimated with PLS-SEM on data from an online survey (N = 222) distributed via social media. The results show that perceived surveillance negatively affects continuance usage intention of VAs and that this effect is partially mediated by personal information disclosure. The results also confirm that trust, perceived risk, and prior negative experience are predictors of perceived surveillance. VA companies should therefore be mindful of how much perceived surveillance their customers feel, since it shapes continuance usage intention. (2023-09-21)
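For readers mapping the hypothesized paths, a minimal sketch of the structural relations implied by the abstract is given below. The variable labels (TR, PR, PNE, PS, PID, CUI) and coefficient symbols are illustrative shorthand, not notation taken from the paper itself.

\[
\begin{aligned}
\mathrm{PS}  &= \gamma_{1}\,\mathrm{TR} + \gamma_{2}\,\mathrm{PR} + \gamma_{3}\,\mathrm{PNE} + \zeta_{1},\\
\mathrm{PID} &= \beta_{1}\,\mathrm{PS} + \zeta_{2},\\
\mathrm{CUI} &= \beta_{2}\,\mathrm{PS} + \beta_{3}\,\mathrm{PID} + \zeta_{3},
\end{aligned}
\]

where TR = trust, PR = perceived risk, PNE = prior negative experience, PS = perceived surveillance, PID = personal information disclosure, and CUI = continuance usage intention. Under this reading, the reported partial mediation corresponds to a significant indirect effect \(\beta_{1}\beta_{3}\) alongside a significant (negative) direct effect \(\beta_{2}\) of perceived surveillance on continuance usage intention.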