Prof. Dr. Evie Malaia

University of Alabama
Linguistics

External Senior Fellow
October 2017 - July 2018

Room 01 030

CV

Evie Malaia trained as a computational linguist and neuroscientist at Purdue University. Since graduating in 2005, she has worked in academic research and in science and education policy. Throughout her research career she has investigated different aspects of the co-evolution of cognitive and linguistic processing, both in atypically developing populations (gifted individuals, those on the autism spectrum, and the Deaf) and in Artificial Intelligence (AI) applications. Her research uses complex systems and machine learning methods to model the co-development of cognition and linguistic ability in complex environments. While in residence, Dr. Malaia will collaborate with research groups in Freiburg, the USA, and Austria on several projects concerned with the acquisition and evolution of linguistic features arising from brain-environment interaction, and with their computational modeling using AI.

Selected Publications

  • Malaia, E., Cockerham, D., Rublein, K. (2017). Visual integration of fear and anger emotional cues by children on autism spectrum and neurotypical peers: an EEG study. Neuropsychologia.
  • Malaia, E., Borneman, J.D., Wilbur, R.B. (2016). Assessment of information content in visual signal: analysis of optical flow fractal complexity. Visual Cognition, 24(3), 246-251.  
  • Malaia, E. (2014). It Still Isn't Over: Event Boundaries in Language and Perception. Language and Linguistics Compass, 8(3), 89-98. 
  • Malaia, E., Ranaweera, R., Wilbur, R.B., Talavage, T.M. (2012). Event segmentation in a visual language: Neural bases of processing American Sign Language predicates. Neuroimage, 59(4), 4094-4101.
  • Malaia, E., Wilbur, R., Weber-Fox, C. (2009). ERP evidence for telicity effects on syntactic processing in garden-path sentences. Brain and Language, 108(3), 145-158. 

FRIAS Project

Role of visual and linguistic complexity in language development

The proposed project will use the tools of information theory to test the hypothesis that information throughput capacity (the ability to extract complex information from a linguistically meaningful visual signal) increases along the continuum 'biological motion – gesture – sign language' in both perception and production. Using a visual language (sign language) in the analysis allows a direct comparison of linguistic (sign language) and non-linguistic (gesture) means of communication with non-communicative human motion, while remaining within the same perceptual domain (vision). We then plan to develop convergent algorithms for the analysis of information transfer based on biological signals across different data types (motion in gesture and sign language; neural activity data from fMRI). The long-term goal is to use these algorithms to investigate the influence of complex stimuli (i.e. visual language) on brain development during language acquisition.
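
As a rough illustration of how information-theoretic measures can be applied to motion signals of differing complexity, the sketch below computes the Shannon entropy of the magnitude distribution of two synthetic motion traces. The signals, bin count, and choice of entropy as the measure are illustrative assumptions for this page only, not the project's actual algorithms or data.

# Minimal illustrative sketch (hypothetical signals and parameters):
# estimating the information content of a motion trace via the Shannon
# entropy of its empirical magnitude distribution.
import numpy as np

def shannon_entropy(values, bins=64):
    """Shannon entropy (in bits) of the histogram of `values`."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 2000)

# Hypothetical stand-ins for points on the continuum of visual signals:
# a smooth periodic trace (lower complexity) versus a trace with richer,
# multi-scale structure (higher complexity).
simple_motion = np.abs(np.sin(t))
complex_motion = np.abs(np.sin(t) + 0.5 * np.sin(7 * t)
                        + 0.2 * rng.standard_normal(t.size))

print(f"entropy(simple)  = {shannon_entropy(simple_motion):.2f} bits")
print(f"entropy(complex) = {shannon_entropy(complex_motion):.2f} bits")

Under these assumptions, the more structured trace yields a higher entropy estimate, which is the kind of ordering the project's hypothesis predicts along the biological motion – gesture – sign language continuum.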