Sentiment analysis using machine learning: Applied to job interviews
DOI: https://doi.org/10.37467/gka-revtechno.v8.2116

Keywords: Machine learning

Abstract
In this work, a sentiment analysis model applied to job interviews using machine learning is proposed. Gaze fixations were recorded using eye-tracking techniques. Different machine learning algorithms for sentiment analysis were then compared, and supervised learning with artificial neural networks was selected. Once trained, the model can be applied to job interviews for staff selection in organizations through the interpretation of eye accessing cues. The job interview is an important part of staff selection and serves multiple purposes, including the evaluation of personality.
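The pipeline described in the abstract (fixation features in, sentiment class out, via a supervised neural network) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the feature set (mean fixation duration and gaze quadrants), the synthetic labels, and the tiny one-hidden-layer network are hypothetical stand-ins, not the authors' actual data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixation features per interview answer:
# [mean fixation duration, gaze-x quadrant, gaze-y quadrant], all scaled to [0, 1).
# Labels (0 = negative, 1 = positive sentiment) follow a synthetic rule,
# purely so the sketch has something learnable.
X = rng.random((200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0.9).astype(float)

# Minimal one-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)            # hidden layer
    p = sigmoid(h @ W2 + b2).ravel()    # predicted probability of "positive"
    gz2 = (p - y)[:, None] / len(X)     # cross-entropy gradient at the output
    gW2 = h.T @ gz2
    gb2 = gz2.sum(axis=0)
    gh = (gz2 @ W2.T) * (1.0 - h**2)    # backprop through tanh
    gW1 = X.T @ gh
    gb1 = gh.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Final forward pass and training accuracy.
h = np.tanh(X @ W1 + b1)
p = sigmoid(h @ W2 + b2).ravel()
acc = float(((p > 0.5) == y.astype(bool)).mean())
```

In a real system the feature extraction step (mapping raw eye-tracker samples to fixations and then to eye accessing cues) is the hard part; the classifier itself, as the sketch shows, can be quite small.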
License
Authors who publish in this journal accept the following terms:
- Authors will keep the moral rights of the work and will transfer the commercial rights.
- One year after publication, the work will become open access on our website; the authors will retain copyright.
- Authors who wish to assign a Creative Commons (CC) license may request it by writing to publishing@eagora.org