Affiliations
Ahmad Hoirul Basori
Faculty of Computing and Information Technology Rabigh, King Abdulaziz University, Kingdom of Saudi Arabia
Hani Moaiteq Abdullah AlJahdali
King Abdulaziz University, Kingdom of Saudi Arabia
TOU-AR: Touchable Interface for Interactive Interaction in Augmented Reality Environment
Abstract
A touchable interface is one of the future interfaces that can be implemented on almost any medium, such as water, a table, or even sand. The term multi-touch refers to the ability to distinguish between two or more fingers touching a touch-sensing surface, such as a touch screen or a touch pad. The proposed interface tracks the interaction area with a depth camera and projects the user interface onto the medium. Such interfaces are widely used in augmented reality environments: the interface is projected onto a real-world medium, and the user's hand is tracked simultaneously while it touches the projected area. As a result, users can interact more freely and as naturally as they do in daily life.
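As a rough illustration of the depth-camera touch tracking described above, the sketch below flags a pixel as "touching" when its height above a calibrated surface depth map falls inside a narrow band. The function names, thresholds, and frame format are assumptions made for illustration only; this is not the TOU-AR implementation.

import numpy as np

def detect_touch(surface_depth: np.ndarray,
                 live_depth: np.ndarray,
                 min_height_mm: float = 10.0,
                 max_height_mm: float = 30.0) -> np.ndarray:
    """Return a boolean mask of pixels considered to be touching the surface.

    surface_depth: calibrated depth map of the empty projection surface (mm).
    live_depth:    current depth frame from the depth camera (mm).
    """
    # Height of each pixel above the calibrated surface; positive values mean
    # something (e.g. a fingertip) lies between the camera and the surface.
    height = surface_depth.astype(np.float32) - live_depth.astype(np.float32)
    # A touch is a pixel whose height sits in a narrow band above the surface:
    # high enough to reject sensor noise, low enough to reject a hovering hand.
    return (height >= min_height_mm) & (height <= max_height_mm)

if __name__ == "__main__":
    # Synthetic example: a flat surface 1000 mm away and one 10x10-pixel
    # "fingertip" region 20 mm above it.
    surface = np.full((240, 320), 1000.0)
    live = surface.copy()
    live[100:110, 150:160] -= 20.0           # fingertip 20 mm above the surface
    mask = detect_touch(surface, live)
    print("touch pixels:", int(mask.sum()))  # -> 100

In a full system the resulting mask would be cleaned up (e.g. with connected-component filtering) and the touch coordinates mapped into the projected interface's coordinate space before being dispatched as touch events.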