Augmented Reality (AR) technology is rapidly embedding itself in various online retail applications, supporting interactive and immersive shopping experiences. While these capabilities enhance consumer engagement and adoption, they rely heavily on confidential user data, thereby intensifying privacy concerns. This creates a personalization–privacy paradox, where consumers’ desire for hyper-relevant experiences is tempered by heightened vulnerability to data misuse. Although extensively studied in traditional e-commerce and social media, this paradox remains underexplored in AR contexts, where immersive features further elevate perceived risks. Grounded in Privacy Calculus Theory (PCT), this study investigates how consumers weigh perceived personalization value against privacy risks. Data were collected from 276 respondents who had used AR apps/platforms and analysed using Python 3.11. The findings contribute to theory by extending PCT to immersive AR environments, and to practice by guiding businesses in balancing personalization with privacy protection to foster consumer trust. Additionally, the study offers implications for technology managers tasked with regulating emerging AR technologies to safeguard user interests without stifling innovation.
To provide customers with hyper-relevant experiences, many online retail firms have incorporated Augmented Reality (AR) features in their applications (apps)/platforms (Pandey & Pandey, 2025). AR lends online apps/platforms features such as immersiveness, interactivity, improved visual imagery, and personalization, thereby improving customer adoption of AR (Jiang et al., 2021). To deliver such hyper-realistic experiences, AR apps/platforms access confidential user information such as geolocation, biometrics, and spatial data (Cruz et al., 2025; Cunha & Krupsky, 2025; Khan, 2024). Customers thus receive realistic and personalised experiences at the cost of their privacy.
This data dependence of AR-powered apps/platforms has given rise to the personalization–privacy paradox. The paradox is intensified in AR, where users are torn between their desire for personalization on one hand and their privacy concerns on the other (Zimmermann et al., 2023; Patel, 2024). The very features that enhance realism require access to highly sensitive and personal data, thereby elevating the perceived stakes of data collection (Smink et al., 2019). The benefit of hyper-relevance is inextricably linked to a state of hyper-vulnerability. Despite extensive research on the personalization–privacy paradox in traditional e-commerce and social media, its implications within immersive AR environments remain significantly underexplored (Herriger et al., 2025; Mehmood et al., 2025). Little effort has been made to build an understanding of how consumers cognitively compare AR benefits against the perceived risk of sharing personal data (Qin et al., 2024; Rauschnabel et al., 2018; Lebeck, 2018). Likewise, few researchers have attempted to apply PCT to the AR shopping context. Understanding this trade-off will not merely help academic researchers but is also critical for businesses seeking to leverage AR for competitive advantage without eroding consumer trust, and for policymakers tasked with crafting regulations for these emerging technologies.
To address this gap, this study employs Privacy Calculus Theory (PCT) as its foundational framework. The study posits that a user’s decision to use AR apps/platforms is the result of a rational calculation in which perceived personalization value is weighed against perceived privacy risk, with trust acting as an antecedent in this decision-making process. Based on these discussions, the objectives of this research are threefold:
Obj1: To examine the influence of privacy concern on perceived privacy risk and, in turn, on trust and intention to use.
Obj2: To determine the role of trust as a key driver of intention to use, establishing its direct relationship with adoption behaviour.
Obj3: To examine the effect of perceived personalization value on trust and intention to use.
Augmented Reality technology enhances consumer visualization of products/services (Tan & Reddy, 2022). AR enhances customer engagement by providing real-world context and interactivity, reduces uncertainty, and helps consumers make better decisions (Thakkar & Kachhela, 2023). Studies have also shown that the AR-supported try-on feature improves purchase confidence, reduces return rates, and impacts consumers’ behavioural intention (Bezawada, 2025; Sekri et al., 2024). The efficacy of AR technologies stems from their ability to provide hyper-realistic experiences that traditional online platforms cannot match (Nair et al., 2024). However, this immersive capability is computationally intensive and data-reliant, setting the stage for the central paradox this study examines.
Privacy Concern and Perceived Privacy Risk
Privacy concern in AR refers to users’ apprehension about how AR collects, processes, and potentially misuses sensitive information such as biometric, facial, or location data (Herriger et al., 2025; Cowan et al., 2021). These concerns arise from a perceived loss of control over personal information and influence users’ trust, the perceived usefulness of the app, the immersive experience, and ultimately their willingness to adopt or recommend such technologies. An individual’s pre-existing disposition towards privacy is a key determinant of how they perceive specific risks (Riaz, 2023). Studies consistently show that users with high privacy concern are more likely to perceive higher risks in specific online transactions and are more resistant to data disclosure (Beldad et al., 2011; Harborth & Pape, 2021). This study positions privacy concern as an antecedent to AR-specific perceived privacy risk, hypothesizing that a user’s general privacy disposition will directly influence their assessment of the risks associated with a specific AR application. Based on this, the following hypothesis is formulated:
Hyp1: Privacy concern is positively associated with perceived privacy risk in the context of the AR app/platform.
Perceived Privacy Risk and Trust
Perceived privacy risk in AR apps arises when users feel vulnerable to misuse of their personal environment or identity information (for example, apps capturing biometrics, users’ movements, or personal information) (O'Hagan et al., 2023). These risks often undermine trust, which represents confidence that the AR provider will safeguard data responsibly (Chandrika & Gupta, 2024). If users believe an AR app could share or sell sensitive data, their trust in the platform decreases, reducing willingness to engage (Taub et al., 2023). Based on this, the following hypothesis is formulated:
Hyp2: Perceived privacy risk is negatively associated with trust in the AR app/platform.
Perceived Personalization Value and Trust
Perceived personalization value is an important determinant of consumer trust in AR applications/platforms. When users experience tailored content and relevant recommendations, they perceive higher utility and relevance, which fosters a sense of reliability in the platform. This personalized engagement signals that the technology understands user preferences, thereby enhancing confidence in its use. However, personalization must be perceived as beneficial rather than intrusive. If users believe personalization relies on invasive data collection, they may discount its benefits, perceiving the trade-off as unfavourable (Van Buggenhout et al., 2023). Trust emerges when consumers believe that data-driven personalization enhances value without compromising privacy. Thus, perceived personalization value directly strengthens trust, serving as a critical driver of sustained AR usage. Based on this, the following hypothesis is formulated:
Hyp3: Perceived personalization value is positively associated with trust in the AR app/platform.
Trust and Intention to Use
Trust is essential for AR app/platform usage because these apps/platforms often require access to users’ personal data such as camera feeds, body movements, biometrics, or private environments. When users trust AR applications and platforms to safeguard their data, they demonstrate greater willingness to adopt the technology despite its inherent risks (Elsotouhy et al., 2024). Extant literature has concluded that trust in AR apps/platforms directly influences users’ willingness to use them for virtual try-on and 3-D spatial visualization (Kang et al., 2023; Koppens, 2021). Hence, trust not only reduces uncertainty but also improves perceptions of reliability, which are critical for sustained user engagement (Barta et al., 2023). Based on this, the following hypothesis is formulated:
Hyp4: Trust is positively associated with intention to use the AR app/platform.
Perceived Privacy Risk and Intention to Use
In AR apps/platforms, risks such as the collection of personal data (e.g., facial biomarkers, past history, behavioural metrics, and user profiling) can significantly lower users’ intention to use (Lehman et al., 2022). For example, users may hesitate to use AR filters if they suspect the data collected is analysed without consent. Moreover, repeated reports of misuse or a lack of transparency in data-handling practices amplify these risks, further discouraging potential usage of the technology (Bodhwani & Sharma, 2023). This negative relationship demonstrates the importance of clear communication regarding the privacy and data-usage policies of apps/platforms using AR. Based on this, the following hypothesis is formulated:
Hyp5: Perceived privacy risk is negatively associated with intention to use the AR app/platform.
Perceived Personalization Value and Intention to Use
Perceived personalization value enhances adoption of AR apps. Users engage more with an AR app/platform when it offers meaningful, customized, and immersive experiences, such as recommending clothes, types of makeup, or different types of furniture (Ahmed, 2025). Prior research shows that high personalization increases perceived usefulness and enjoyment, directly boosting intention to use AR technologies (Adawiyah et al., 2024; Holdack et al., 2022; Hung et al., 2021). Personalization also fosters emotional connection and relevance, creating experiences that feel unique to the user and making them more willing to overlook potential privacy concerns in favour of convenience and engagement (Abtahi et al., 2023). Based on this, the following hypothesis is formulated:
Hyp6: Perceived personalization value is positively associated with intention to use the AR app/platform.
Theoretical Background
The study applies Privacy Calculus Theory (Culnan & Armstrong, 1999) in the context of AR apps/platforms to understand users’ intention to use them. The theory reveals the delicate balance users strike between risks and benefits. Privacy concern increases perceived risks, undermining trust and discouraging intention to use. Personalization value strengthens perceived benefits, motivating use despite risks. Trust acts as an enabler of adoption when users feel confident their sensitive AR data will be protected. This perspective explains why some users avoid AR apps due to privacy fears, while others adopt them eagerly when personalization value is high and trust in the provider is strong. Furthermore, the framework highlights that effective risk communication and transparent privacy policies can play a decisive role in tipping the balance toward adoption. Based on this, the following conceptual model is proposed (Figure 1).
Figure 1: Conceptual Model of the Study
Source: Authors’ own
The methodology followed in this research is described below.
Research Design
The study uses a quantitative research design. Data were collected from respondents who had used an AR application or platform in the past year for any of the following: making a personal purchase, creating a social media post, education, or playing an immersive game.
Measures and Instrumentation
Data were collected using a self-structured questionnaire (Annexure I). Items related to privacy concern were adapted from Xu et al. (2012) and Smith et al. (1996); perceived privacy risk from Malhotra et al. (2004) and Dinev & Hart (2006); trust in AR apps from Gefen et al. (2012) and McKnight et al. (2003); perceived personalization value from Awad & Krishnan (2006) and Bleier & Eisenbeiss (2015); and intention to use from Venkatesh et al. (2003) and Rauschnabel et al. (2019). All items were measured on a five-point Likert scale, where 1 means strongly disagree and 5 means strongly agree.
Sampling and Data Collection
The study employed purposive sampling to ensure that respondents had relevant experience with AR apps/platforms. Participants were required to have used an AR-based platform or application for one of the specified purposes within the past year. This criterion ensured that respondents were familiar with both the personalization benefits and the privacy implications of AR usage. The questionnaire was distributed via social media channels, email invitations, and professional networks. A total of 276 valid responses were obtained, which was deemed adequate for structural equation modelling (SEM).
Data Analysis Strategy
The data were analysed using Python 3.11 on Google Colab. The libraries used were pandas for data preprocessing and numpy for numerical computations; pingouin was used for exploratory factor analysis (EFA) and semopy for confirmatory factor analysis (CFA).
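To make this pipeline concrete, the following is a minimal sketch of how the measurement and structural model could be specified and fitted with semopy. The lavaan-style model description, the construct and item names (mirroring the labels in Tables 2–5), and the input file name are illustrative assumptions, not the study’s actual code.

```python
import pandas as pd
from semopy import Model, calc_stats

# Measurement model (five constructs, four items each) followed by the
# structural paths corresponding to Hyp1-Hyp6.
MODEL_DESC = """
PC  =~ PC1 + PC2 + PC3 + PC4
PPR =~ PPR1 + PPR2 + PPR3 + PPR4
PPV =~ PPV1 + PPV2 + PPV3 + PPV4
TR  =~ TR1 + TR2 + TR3 + TR4
ITU =~ ITU1 + ITU2 + ITU3 + ITU4
PPR ~ PC
TR  ~ PPR + PPV
ITU ~ TR + PPR + PPV
"""

df = pd.read_csv("ar_survey_responses.csv")   # hypothetical survey file

model = Model(MODEL_DESC)
model.fit(df)                                 # maximum-likelihood estimation

estimates = model.inspect(std_est=True)       # loadings and path estimates
fit_indices = calc_stats(model)               # chi2, GFI, CFI, TLI, RMSEA, ...
print(fit_indices.T)
```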
The results of the data analysis (Table 1) show that, of the sample of 276 respondents, 45.65% were aged 18-24 years, 44.20% were aged 25-34 years, and 10.14% were aged 35 and above. Table 2 presents the results of the reliability and validity assessment for the measurement model. The KMO value and the factor loadings for all items exceed the recommended threshold of 0.70, demonstrating strong item reliability. Cronbach’s alpha values for all constructs are above 0.70, indicating satisfactory internal consistency. Composite Reliability (CR) values, ranging from 0.867 to 0.964, also exceed the 0.70 benchmark, confirming construct reliability (Hair et al., 2022). Average Variance Extracted (AVE) values for all constructs are above 0.50, establishing convergent validity (Fornell & Larcker, 1981). These results collectively suggest that the measurement model exhibits robust reliability and validity, supporting its use in subsequent analysis.
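As an illustration of the internal-consistency check, Cronbach’s alpha for a single construct can be computed with pingouin; the item columns (PC1–PC4, as labelled in Table 2) and the data frame `df` from the sketch above are assumptions for this example.

```python
import pingouin as pg

# Cronbach's alpha for the Privacy Concern items (labels as in Table 2)
alpha, ci = pg.cronbach_alpha(data=df[["PC1", "PC2", "PC3", "PC4"]])
print(f"alpha = {alpha:.3f}, 95% CI = {ci}")
```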
Table 1: Demographic Profile of Respondents

| Variable | Category | Frequency | Percentage (%) |
|---|---|---|---|
| Age | 18-24 years | 126 | 45.65 |
|  | 25-34 years | 122 | 44.20 |
|  | 35 and above | 28 | 10.14 |
| Gender | Male | 140 | 50.72 |
|  | Female | 132 | 47.83 |
|  | Other / Prefer not to say | 4 | 1.45 |
| AR app type | Social Media | 76 | 27.54 |
|  | Education | 32 | 11.59 |
|  | Gaming | 28 | 10.14 |
|  | Shopping | 140 | 50.72 |

Source: Authors’ own
Table 3 reports the results of the discriminant validity assessment using the Heterotrait-Monotrait (HTMT) ratio of correlations. All HTMT values are below the acceptable threshold of 0.85 (Henseler et al., 2015), indicating that the constructs are empirically distinct from one another, thereby establishing discriminant validity.
Table 2: Analysis of Measurement Model: Reliability and Validity

| Construct | Items | Factor Loadings | Cronbach's Alpha (α) | Composite Reliability (CR) | Average Variance Extracted (AVE) |
|---|---|---|---|---|---|
| Perceived Personalization Value (PPV) | PPV1 | 0.966 | 0.970 | 0.964 | 0.87 |
|  | PPV2 | 0.902 |  |  |  |
|  | PPV3 | 0.921 |  |  |  |
|  | PPV4 | 0.942 |  |  |  |
| Perceived Privacy Risk (PPR) | PPR1 | 0.857 | 0.943 | 0.867 | 0.621 |
|  | PPR2 | 0.742 |  |  |  |
|  | PPR3 | 0.793 |  |  |  |
|  | PPR4 | 0.756 |  |  |  |
| Trust (TR) | TR1 | 0.774 | 0.890 | 0.867 | 0.621 |
|  | TR2 | 0.709 |  |  |  |
|  | TR3 | 0.906 |  |  |  |
|  | TR4 | 0.849 |  |  |  |
| Intention to Use (ITU) | ITU1 | 0.952 | 0.965 | 0.831 | 0.831 |
|  | ITU2 | 0.929 |  |  |  |
|  | ITU3 | 0.924 |  |  |  |
|  | ITU4 | 0.837 |  |  |  |
| Privacy Concern (PC) | PC1 | 0.844 | 0.947 | 0.779 | 0.779 |
|  | PC2 | 0.876 |  |  |  |
|  | PC3 | 0.896 |  |  |  |
|  | PC4 | 0.913 |  |  |  |

KMO = 0.876, significance value = 0.003
Source: Output from Python
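For readers who wish to reproduce the CR and AVE figures in Table 2, the sketch below computes both from standardized factor loadings using the standard formulas (Fornell & Larcker, 1981); the helper function name and the use of the PPV loadings as input are illustrative, not part of the study’s code.

```python
import numpy as np

def cr_and_ave(loadings):
    """Composite reliability and AVE from standardized loadings.

    Assumes standardized items, so each error variance is 1 - loading**2
    (Fornell & Larcker, 1981).
    """
    lam = np.asarray(loadings, dtype=float)
    err = 1.0 - lam ** 2
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + err.sum())
    ave = (lam ** 2).mean()
    return cr, ave

# PPV loadings from Table 2: reproduces CR ≈ 0.964 and AVE ≈ 0.87
cr, ave = cr_and_ave([0.966, 0.902, 0.921, 0.942])
print(f"CR = {cr:.3f}, AVE = {ave:.3f}")
```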
The measurement model (Table 4) demonstrated a favourable fit on all indices, i.e., χ²/df, GFI, CFI, TLI, and RMSEA, with values of 2.201, 0.956, 0.995, 0.994, and 0.021 respectively (Kline, 2015; Tabachnick & Fidell, 2012).
Table 3: Discriminant Validity (HTMT Ratio)

| Construct | ITU | PC | PPR | PPV | TR |
|---|---|---|---|---|---|
| ITU | 1 |  |  |  |  |
| PC | 0.238 | 1 |  |  |  |
| PPR | 0.475 | 0.459 | 1 |  |  |
| PPV | 0.162 | 0.206 | 0.263 | 1 |  |
| TR | 0.285 | 0.270 | 0.659 | 0.237 | 1 |

Source: Output from Python
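As a sketch of how the HTMT ratios in Table 3 can be derived from the raw item data, the function below implements the ratio of the mean heterotrait-heteromethod correlation to the geometric mean of the within-construct (monotrait) correlations (Henseler et al., 2015); the function name and item lists are illustrative assumptions.

```python
import numpy as np
import pandas as pd

def htmt(df: pd.DataFrame, items_a: list, items_b: list) -> float:
    """HTMT ratio for two constructs (Henseler et al., 2015)."""
    corr = df[items_a + items_b].corr().abs()
    # mean correlation between items of construct A and construct B
    hetero = corr.loc[items_a, items_b].to_numpy().mean()
    # mean within-construct correlations (off-diagonal entries only)
    mono_a = corr.loc[items_a, items_a].to_numpy()
    mono_b = corr.loc[items_b, items_b].to_numpy()
    mono_a = mono_a[np.triu_indices_from(mono_a, k=1)].mean()
    mono_b = mono_b[np.triu_indices_from(mono_b, k=1)].mean()
    return hetero / np.sqrt(mono_a * mono_b)

# e.g. htmt(df, ["TR1", "TR2", "TR3", "TR4"], ["PPR1", "PPR2", "PPR3", "PPR4"])
```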
The corresponding indices for the structural model (χ²/df, GFI, CFI, TLI, and RMSEA) were 2.221, 0.954, 0.993, 0.991, and 0.025 respectively. Thus, the model fit values support the acceptance of the proposed model.
Table 4: Goodness-of-Fit Indices for the Measurement and Structural Models

| Fit Index | Recommended Value | Measurement Model | Structural Model | Result |
|---|---|---|---|---|
| χ²/df | < 3.0 | 2.201 | 2.221 | Good |
| Standardized Root Mean Square Residual (SRMR) | < 0.08 | 0.040 | 0.045 | Good |
| Goodness of Fit Index (GFI) | > 0.90 | 0.956 | 0.954 | Good |
| Comparative Fit Index (CFI) | > 0.90 | 0.995 | 0.993 | Good |
| Tucker-Lewis Index (TLI) | > 0.90 | 0.994 | 0.991 | Good |
| Root Mean Square Error of Approximation (RMSEA) | < 0.08 | 0.021 | 0.025 | Good |

Source: Output from Python
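A brief sketch of how the indices in Table 4 can be pulled from a fitted semopy model and checked against the recommended cut-offs; the `model` object is the one fitted in the earlier sketch, and the threshold values mirror Table 4 (note that semopy’s `calc_stats` does not report SRMR, so it is omitted here).

```python
from semopy import calc_stats

stats = calc_stats(model).iloc[0]          # single 'Value' row as a Series
chi2_df = stats["chi2"] / stats["DoF"]     # relative chi-square

checks = {
    "chi2/df < 3.0": chi2_df < 3.0,
    "GFI > 0.90":    stats["GFI"] > 0.90,
    "CFI > 0.90":    stats["CFI"] > 0.90,
    "TLI > 0.90":    stats["TLI"] > 0.90,
    "RMSEA < 0.08":  stats["RMSEA"] < 0.08,
}
for rule, ok in checks.items():
    print(f"{rule}: {'Good' if ok else 'Poor'}")
```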
Table 5 presents the results of the structural path analysis. All six hypothesized relationships were statistically significant at p < 0.001, supporting the proposed model. Privacy Concern had a strong positive effect on Perceived Privacy Risk (H1: β = 0.743, C.R. = 11.203), confirming that users with high privacy concerns are more likely to perceive risks when engaging with AR apps/platforms. In turn, Perceived Privacy Risk negatively influenced Trust (H2: β = -0.424, C.R. = -7.367), showing that when risks are salient, consumer trust is undermined. At the same time, Perceived Personalization Value exerted a strong positive effect on Trust (H3: β = 0.631, C.R. = 11.230), highlighting the role of personalization in building trust in AR app usage. Trust strongly predicted Intention to Use (H4: β = 0.771, C.R. = 7.762), reinforcing its position as a key driver of usage intention. Moreover, Perceived Privacy Risk had a significant negative effect on Intention to Use (H5: β = -0.297, C.R. = -3.822), indicating that privacy risks directly discourage AR usage. Conversely, Perceived Personalization Value positively influenced Intention to Use (H6: β = 0.547, C.R. = 6.213), showing that users are willing to trade off privacy concerns in exchange for personalized experiences.
Table 5: Hypothesis Testing

| Hypothesis | Path | Std. Estimate (β) | S.E. | C.R. | p-value | Result |
|---|---|---|---|---|---|---|
| H1 | PC → PPR | 0.743 | 0.066 | 11.203 | *** | Supported |
| H2 | PPR → TR | -0.424 | 0.058 | -7.367 | *** | Supported |
| H3 | PPV → TR | 0.631 | 0.056 | 11.230 | *** | Supported |
| H4 | TR → ITU | 0.771 | 0.099 | 7.762 | *** | Supported |
| H5 | PPR → ITU | -0.297 | 0.077 | -3.822 | *** | Supported |
| H6 | PPV → ITU | 0.547 | 0.088 | 6.213 | *** | Supported |

***p < 0.001
Source: Output from Python
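The path estimates in Table 5 correspond to the regression rows of semopy’s parameter table; the following is a minimal sketch of extracting them from the fitted model, assuming semopy’s default `inspect` column labels and the construct names used earlier.

```python
# Standardized structural paths from the fitted semopy model
# (continuing the earlier sketch).
latents = {"PC", "PPR", "PPV", "TR", "ITU"}
estimates = model.inspect(std_est=True)

# semopy lists loadings and regressions alike under op '~', so keep
# only rows where both sides are latent constructs (the Hyp1-Hyp6 paths).
paths = estimates[(estimates["op"] == "~")
                  & estimates["lval"].isin(latents)
                  & estimates["rval"].isin(latents)]
print(paths[["lval", "rval", "Est. Std", "Std. Err", "z-value", "p-value"]])
```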
The results of this study empirically support the personalization–privacy paradox in the context of augmented reality (AR) applications/platforms. Specifically, the findings reveal a dual dynamic in consumer decision-making: while privacy concerns intensify perceived risks, thereby eroding trust and discouraging adoption, the perceived benefits of personalization enhance both trust and behavioural intention to use AR platforms.
The positive relationship between privacy concern and perceived privacy risk reinforces prior work on online privacy concern, which suggests that consumers who are more sensitive to personal data misuse are more likely to perceive risks in data-driven technologies (Roesner & Kohno, 2021). In AR contexts, these risks are magnified because the type of data collected by AR apps is more personal. These findings are consistent with those of Yap et al. (2021), who outline that to ensure greater usage of AR apps/platforms, firms need to address the privacy issues emerging with technology usage.
The study highlights that perceived privacy risk negatively influences trust and intention to use. These results align with those of Zhang (2024) and Almaiah et al. (2023), who concluded that elevated perceptions of privacy risk hinder users’ willingness to adopt technology. In the AR context, this suggests that no matter how sophisticated the personalization features may be, if consumers perceive the platform as unsafe or intrusive, their trust in the platform declines and their likelihood of using the service is hampered. This emphasizes the fragility of consumer trust in digital ecosystems, particularly those operating in immersive environments where personal data collection is highly visible.
The results also demonstrate that perceived personalization value exerts a strong positive effect on both trust and intention to use. AR applications thrive on their ability to deliver hyper-realistic experiences, such as virtual try-ons and the spatial positioning of digital products in physical spaces. When consumers perceive these personalized interactions as valuable, their willingness to accept potential privacy risks increases (Liu & Tao, 2022). This suggests that personalization can function as a counterweight to risk perceptions, shifting the cost–benefit calculation in favour of adoption.
Furthermore, users’ trust in the AR app/platform emerges as the strongest predictor of intention to use. Users are more willing to disclose personal information if they believe the AR app/platform will handle it responsibly and transparently (Sun et al., 2025). This reinforces the view that trust is not just another variable in adoption models but a central enabler of digital engagement in privacy-sensitive contexts.
Taken together, these findings extend our understanding of the personalization–privacy paradox by showing how these opposing forces play out in the AR environment. Unlike traditional online platforms, AR applications operate in highly immersive and data-intensive ecosystems, making the trade-offs more visible and more consequential. The results suggest that AR adoption depends on firms’ ability to simultaneously manage consumer fears about data vulnerability and deliver personalization experiences that are both meaningful and tangible (Pandey & Pandey, 2025).
Finally, the study concludes that users are not passive victims of personalization or privacy threats but active evaluators of trade-offs (Malik, 2024). Their technology usage is shaped by a weighing of risks and benefits. This reflects a broader cultural shift in customer engagement on online platforms, where users assess the value of personalization against the vulnerabilities of data privacy.
The study has several implications for managers. Technology managers need to minimize users’ privacy concerns regarding AR-powered apps/platforms by building transparency into data-sharing guidelines and simplifying consent mechanisms and security guidelines. Simultaneously, disclosure regarding what data are required, how they are stored, and how they are further used to provide hyper-relevant AR experiences will further strengthen users’ trust. Ultimately, managers who successfully balance privacy protection with meaningful personalization will not only mitigate consumer skepticism but also foster stronger adoption intentions, thereby sustaining long-term user engagement.
While the study sheds light on the personalization–privacy paradox in AR applications, it has certain limitations. First, the data were self-reported and there was no control over the type of AR usage. Future studies can adopt an experimental research design wherein users are first provided an AR experience and data are then collected. Second, the sample was restricted to users who had engaged with AR applications within the past year. While this ensures relevance, it may limit generalizability to non-users or to contexts not covered here, such as healthcare. Third, incorporating additional data such as eye tracking, economic status, and technical knowledge can strengthen the robustness of the findings. Finally, the research was confined to a single cultural and geographical setting, which may influence privacy perceptions and personalization preferences. Comparative studies across different cultural settings (e.g., across the digital divide) could provide deeper insights into how context moderates the personalization–privacy paradox.