Projects

“Relational discrimination in non-human primates”

A large body of research examines how relations between associated concepts or objects are understood, yet we still lack a satisfactory explanation of how this capacity for abstract thought became possible. This study explores these questions by presenting touch-screen games to young children and non-human primates.


Project Humans

Led by Dr. Jon Willits and developed through the Language and Learning Lab, this project builds a tool that makes integrating neural networks into a simulated 3D environment, such as one created in Unity, more approachable for non-programmers.
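As a rough sketch of the kind of workflow the tool is meant to simplify, the snippet below wires a small neural network to an agent in a toy simulated world. Every name, parameter, and the environment itself are hypothetical illustrations, not Project Humans' actual interface.

```python
# Hypothetical illustration: a config-driven agent loop of the kind the tool
# aims to make accessible to non-programmers; no names here come from Project Humans.
import numpy as np

config = {
    "observation_size": 4,  # agent position and target position in a 2D stand-in world
    "hidden_size": 8,
    "action_size": 2,       # a movement vector applied in the simulated world
}

class TinyPolicyNetwork:
    """A minimal feed-forward network mapping observations to actions."""
    def __init__(self, cfg):
        rng = np.random.default_rng(0)
        self.w1 = rng.normal(0, 0.1, (cfg["observation_size"], cfg["hidden_size"]))
        self.w2 = rng.normal(0, 0.1, (cfg["hidden_size"], cfg["action_size"]))

    def act(self, observation):
        hidden = np.tanh(observation @ self.w1)
        return np.tanh(hidden @ self.w2)

# Stand-in for the 3D engine: the agent moves around a flat 2D plane.
agent_pos, target_pos = np.zeros(2), np.array([3.0, 4.0])
policy = TinyPolicyNetwork(config)
for _ in range(100):
    observation = np.concatenate([agent_pos, target_pos])
    agent_pos = agent_pos + policy.act(observation)  # apply the network's action each tick
print("final distance to target:", np.linalg.norm(target_pos - agent_pos))
```

The network is untrained, so the point is only the wiring: a plain configuration drives the model, and the environment loop stands in for what a game engine such as Unity would normally supply.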


“Evaluating the contribution of visual feature heterogeneity in conjunction search performance”

In recent years, research in the Vision Lab has focused on parallel processing in visual search and on how variability in the accumulation of information about non-target distractors directs attention toward the goal stimulus. Continuing the work of Lleras, Wang, Ng, Ballew, Xu, and Buetti (2020), this project investigates whether visual search response times in heterogeneous displays can be predicted from each distractor type's homogeneous search trend.
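To make the prediction question concrete, here is a simplified, purely illustrative sketch: per-distractor logarithmic slopes from homogeneous searches are combined to predict response times for mixed displays, and the predictions are compared against simulated observations. The distractor names, slope values, intercept, and additive combination rule are all assumptions for illustration; they are not the model tested in Lleras et al. (2020).

```python
# Illustrative sketch: predict heterogeneous-display RTs from homogeneous search trends.
# All numbers and the combination rule are hypothetical; observed RTs are simulated.
import numpy as np

# Logarithmic slopes (ms per log unit) estimated from homogeneous searches,
# assuming RT = intercept + slope * log(set_size + 1) for each distractor type.
homogeneous_slopes = {"orange_diamond": 12.0, "blue_circle": 28.0, "yellow_triangle": 5.0}
baseline_rt = 450.0  # hypothetical shared intercept in ms

def predict_heterogeneous_rt(display_counts):
    """Sum each distractor type's logarithmic contribution (illustrative additive rule)."""
    return baseline_rt + sum(
        homogeneous_slopes[kind] * np.log(count + 1)
        for kind, count in display_counts.items()
    )

# Hypothetical heterogeneous display compositions (counts of each distractor type).
displays = [
    {"orange_diamond": 4, "blue_circle": 4},
    {"blue_circle": 8, "yellow_triangle": 8},
    {"orange_diamond": 2, "blue_circle": 2, "yellow_triangle": 12},
    {"orange_diamond": 12, "yellow_triangle": 4},
]

predicted = np.array([predict_heterogeneous_rt(d) for d in displays])
# Simulated "observed" mean RTs: predictions plus noise, only to demonstrate the comparison.
observed = predicted + np.random.default_rng(1).normal(0, 15, size=predicted.shape)

# How well the homogeneous trends account for the heterogeneous RTs; a full analysis
# would fit and cross-validate a regression rather than report a single correlation.
r = np.corrcoef(predicted, observed)[0, 1]
print("predicted (ms):", predicted.round(1))
print("correlation with simulated observations:", round(r, 2))
```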


“Benefits of meditative practice in Virtual Reality environments on mood, stress, and cognitive functions in non-expert meditators”

While there is evidence that guided meditation can improve mindfulness measures and overall cognitive function, these benefits remain difficult to achieve for people less inclined to dedicate time to meditative practice. Studies such as Navarro-Haro et al. (2017) show that meditation in VR can encourage mindfulness and a greater sense of immersion; this project aims to extend that literature by testing whether VR-based meditation improves cognitive performance on attention tasks such as visual search.