2016 9th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI) 2016
DOI: 10.1109/cisp-bmei.2016.7852920
Interaction design in Augmented Reality on the smartphone

Cited by 4 publications (3 citation statements); references 2 publications.
“…Table 2 lists the examples of sensory dimensions used in the extant literature. Based on prior research on human-computer interaction, especially in fields related to computational intelligence, the widely adopted sensory interactions in smartphone-based AR systems include visual, haptic and movement sensations (Lv et al., 2016). Accordingly, in this study, we use image, motion and touchscreen interactions to reflect visual-, kinesthetic- and haptic-based sensation stimulations, respectively (e.g.…”
Section: Sensory Dimensions of AR Interaction Characteristics
confidence: 99%
“…clothes) would appear on oneself
II3: I was able to see how it looks when I wear the virtual objects (i.e. clothes)

Motion interaction (Lv et al., 2016; Nizam et al., 2018)
MI1: I can control the directions in the virtual world by my movement
MI2: I can place the angles of virtual objects with real-time movement
MI3: I can point the virtual objects in a specific direction by my movement

Touchscreen interaction (Heller et al., 2019; Kinzinger et al., 2022)
TI1: I can select the virtual objects using my hands/fingers
TI2: I can move the virtual objects using my hands/fingers
TI3: I can use my hands/fingers to interact with virtual products

Multisensory experience (Gao and Lan, 2020)
ME1: This online shopping environment mobilizes many of my senses
ME2: This online shopping environment provides me with a lot of sensory stimulation
ME3: This online shopping environment has few sensory elements

Information overload (Chen et al., 2009; Ding et al., 2017)
IO1: I often find that the information on online shopping platforms is too much and overwhelming
IO2: There was too much information about a certain product on this e-store, so I was burdened in handling it
IO3: I often find that there are so many choices that I do not want to make the effort to compare and choose
IO4: Because of the abundant information on this e-store, I found it difficult to acquire all of it

Spatial presence (Smink et al., 2020)
SP1: was similar to experiencing the products in reality
SP2: felt like it was in reality
SP3: was as realistic to me as in the real world

Product uncertainty (Liu et al., 2021)
PU1: I feel uncertain that I have fully understood everything I need to know about the product I have chosen
PU2: I am uncertain that the product I have chosen will be the same as I expect it to be
PU3: I feel that purchasing the products involves a high degree of uncertainty about the actual quality

Intention to purchase (Fenko et al., 2016)
After using the AR app, the chance that I will buy a product from this AR app is:
IP1: Uncertain/certain
IP2: Unlikely/likely
IP3: Improbable/probable

Table A1. Measurement…”
Section: Appendix
confidence: 99%