Manipulation of a robot arm through eye tracking

Keywords: eye tracking, robot manipulator, control, graphical interface

Abstract

Currently there is little literature and research on eye tracking for the manipulation of robots or mechatronic devices. The articles available today focus on the study of the human eye, especially in medicine, marketing, and gaming; few are devoted to the control or manipulation of technology (home automation; terrestrial, aerial, or aquatic mobile robots; robotic arms; mechatronic devices; among others). This article therefore focuses on the design and construction of a mechatronic device that records human eye movements in order to manipulate a robot arm with three degrees of freedom, and on the development of a user-friendly human-machine graphical interface for visualizing the human eye and the commands for manipulating the robot.
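As a rough illustration of the pipeline the abstract describes (camera image of the eye → gaze estimate → discrete command for a three-degree-of-freedom arm), the following is a minimal sketch assuming a webcam-based setup and OpenCV. The thresholds, screen-region layout, and command names are hypothetical assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: map a detected pupil position to discrete commands
# for a 3-DOF robot arm. Threshold values, region boundaries, and command
# names are illustrative assumptions, not the paper's actual system.
import cv2

def pupil_center(gray):
    """Return the (x, y) centroid of the darkest blob, assumed to be the pupil."""
    _, thresh = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(thresh)
    if m["m00"] == 0:
        return None  # no dark region found in this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def gaze_to_command(center, width, height):
    """Quantize the gaze point into one discrete command per image region."""
    if center is None:
        return "STOP"
    x, y = center
    if x < width / 3:
        return "LEFT"    # e.g., rotate the base joint counterclockwise
    if x > 2 * width / 3:
        return "RIGHT"   # e.g., rotate the base joint clockwise
    if y < height / 3:
        return "UP"      # e.g., raise the arm
    if y > 2 * height / 3:
        return "DOWN"    # e.g., lower the arm
    return "STOP"        # center region: hold position

cap = cv2.VideoCapture(0)  # assumed eye-facing camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cmd = gaze_to_command(pupil_center(gray), frame.shape[1], frame.shape[0])
    print(cmd)  # placeholder: a real system would send this to the arm controller
    cv2.imshow("eye", frame)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

A production system would replace the fixed brightness threshold with a per-user calibration step and smooth the gaze estimate over several frames before issuing a command, so that blinks do not trigger spurious motion.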

Author Biographies

Jorge Gudiño-Lau, Universidad de Colima

Professor-Researcher at the Facultad de Ingeniería Electromecánica, Universidad de Colima

Cesar Llamas-Woodward, Universidad de Colima

Ninth-semester student in the Electronic Technology Engineering program at the Facultad de Ingeniería Electromecánica, Universidad de Colima

Janeth Alcalá-Rodríguez, Universidad de Colima

Professor-Researcher at the Facultad de Ingeniería Electromecánica, Universidad de Colima

Alejandro Jarillo-Silva, Universidad de la Sierra Sur

Professor-Researcher at the Universidad de la Sierra Sur

Miguel Duran-Fonseca, Universidad de Colima

Professor-Researcher at the Facultad de Ingeniería Electromecánica, Universidad de Colima

Published
2022-11-11
How to cite
Gudiño-Lau, J., Llamas-Woodward, C., Alcalá-Rodríguez, J., Charre-Ibarra, S., Jarillo-Silva, A., & Duran-Fonseca, M. (2022). Manipulación de un brazo robot mediante el seguimiento de los ojos. Pädi Boletín Científico De Ciencias Básicas E Ingenierías Del ICBI, 10(Especial5), 114-120. https://doi.org/10.29057/icbi.v10iEspecial5.10105
