Research

Research Overview

Research in the Neuroplasticity and Language Lab (NLL) focuses on the role of early neuroplasticity in language and brain development. We investigate the longitudinal development, ultimate attainment, brain anatomical structures, and real-time processing of American Sign Language (ASL) and Chinese Sign Language (CSL) syntactic structures by deaf and hearing signers with varying early language environments. Other interests in the lab include (bimodal) bilingual language development and processing, the role of sign language modality in language processing and perceptual learning, and the interaction between language and cognitive development. You can read more about our ongoing and completed projects and access the research output below.


Early language experience shapes deaf language and brain development 

Behavioral studies:

In a series of experiments, we examined the sentence processing strategies used by post-childhood first language (L1) signers of ASL when comprehending simple transitive events with absurd meanings, e.g., an apple biting a man. We found that the use of basic word order is not always robust among post-childhood L1 signers, especially when it conflicts with non-structural cues such as animacy or event plausibility. Furthermore, some deaf signers with early exposure to signing systems (e.g., Signing Exact English, Manually Coded English) also experience difficulties at the basic clausal level.
 

Cheng, Q. (2024, Nov). American Sign Language transitive sentence comprehension strategies by deaf English-ASL bilinguals: the role of early language environment. Talk presentation at the 49th Boston University Conference on Language Development. 

Cheng, Q., & Mayberry, R. I. (under review). American Sign Language basic clause comprehension strategies used by late first-language learners: Plausibility, animacy, and linguistic structure.

Mayberry, R. I., Hatrak, M., Ilbasaran, D., Cheng, Q., Huang, Y., & Hall, M. L. (2024). Impoverished language in early childhood affects the development of complex sentence structure. Developmental Science, 27(1), e13416. [link]

Cheng, Q., & Zeng, J. (2023). Grammatical attainment by deaf English learners with and without early sign language support as compared to hearing L2 learners. Proceedings of the 47th Boston University Conference on Language Development. Somerville, MA: Cascadilla Press. [pdf]

Cheng, Q., & Mayberry, R. I. (2021). When event knowledge overrides word order in sentence comprehension: Learning a first language after childhood. Developmental Science, e13073. [link] [pdf]

Cheng, Q., & Mayberry, R. I. (2019). Acquiring a first language in adolescence: the case of basic word order in American Sign Language. Journal of Child Language, 46(2), 214-240. [link] [PubMed] [ICSCL poster]

Neural studies:

Using Diffusion Tensor Imaging (DTI) and structural Magnetic Resonance Imaging (MRI), we examined how brain anatomical features are affected when early language experience is severely impoverished. We found decreased connectivity in the left arcuate fasciculus, a fiber bundle connecting Broca’s area and Wernicke’s area (Cheng et al., 2019). In a separate study, we found that a number of core language-relevant regions, including the left Broca’s area and the left Middle Temporal Gyrus, showed reduced volume and/or cortical thickness as a function of longer delays in ASL onset (Cheng et al., 2023).
 

Cheng, Q., Roth, A., Halgren, E., Klein, D., Chen, J. K., & Mayberry, R. I. (2023). Restricted language access during childhood affects adult brain structure in selective language regions. Proceedings of the National Academy of Sciences, 120(7), e2215423120. [open access link] [SNL poster]

Cheng, Q., Roth, A., Halgren, E., & Mayberry, R. I. (2019). Effects of early language deprivation on brain connectivity: Language pathways in deaf native and late first-language learners of American Sign Language. Frontiers in Human Neuroscience, 13, 320. [open access link] [SNL poster]


ASL visual mismatch responses (vMMR) using MEG

[Image: MEG machine]
Auditory mismatch responses (aMMR) have been used in spoken language research to examine the automatic detection of linguistic anomalies and changes at the phonetic, phonological, lexical, and morpho-syntactic levels. To date, very few studies have addressed similar questions for sign languages using visual mismatch responses (vMMR). Comparing the localization of lexical MMR effects in spoken and sign languages can provide further insight into cross-modal neural mechanisms during lexical access. In the current study, we aim to examine lexical mismatch responses in ASL and to localize the vMMR responses using MEG. This study is in collaboration with Dr. Christina Zhao (SPHSC, UW).
 

Cheng, Q., Zhang, Y., Cheng, T., & Zhao, C. (2024, April). Examining MEG visual mismatch responses to American Sign Language by hearing signers and non-signers. Poster presentation at the Cognitive Neuroscience Society (CNS) 2024 Annual Meeting, Toronto, Canada. [poster]


Chinese Sign Language (CSL) word order

Using both production and comprehension methods, we are examining the word order patterns used by deaf native and late CSL signers in Shanghai and Jiaxing, China. This study is in collaboration with Dr. Hao Lin (Linguistics, Shanghai International Studies University).
 

Zhang, Y., Lin, H., & Cheng, Q. (2024, Nov). Structure flexibility in description of transitive events among native and late CSL signers. Talk presentation at the 49th Boston University Conference on Language Development. 

Cheng, Q., Lin, H., & Snedeker, J. (2023, Mar). Word order in illiterate and semi-illiterate deaf signers with late onset of Chinese Sign Language. Poster presentation at the 36th Annual Conference on Human Sentence Processing, Pittsburgh, PA. [poster]

Lin, H., Cheng, Q., & Snedeker, J. (in prep). Word order in Chinese Sign Language.


(Bimodal) Bilingual cognitive control

 

[Image: 苹果 (píngguǒ), "apple" in Chinese]
In collaboration with Dr. Chuchu Li (Psychiatry, UCSD), we are exploring the role of cognitive control during bilingual language switching among Chinese-English, Korean-English, and English-ASL bilinguals, with a focus on the different cognitive control mechanisms involved in unimodal vs. bimodal language switching. 
 

Li, C., & Cheng, Q. (in revision). Language cues during perception and contexts affect language switching from comprehension to production.


Resolving syntactic-semantic conflicts during written sentence comprehension by deaf and L2 readers

 
Combining comprehension judgments and self-paced reading, we examine the processing mechanisms used by proficient hearing Chinese readers and by deaf readers with insufficient early language experience. The findings suggest that deaf readers generally incur a greater processing burden when resolving conflicting syntactic and semantic cues, and this increased burden may contribute to an overall tendency to rely on semantic cues. We also designed an experiment with interwoven flanker and sentence comprehension tasks to explore the relationship between cognitive control and L2 processing effort during English sentence comprehension.
 

Cheng, Q., & Oh, Y. (in prep). Does dynamic cognitive control engagement reduce L2 shallow processing? Evidence from an interwoven flanker and self-paced sentence reading task.

Cheng, Q., Yan, X., Yang, L., & Lin, H. (2024). Resolving syntactic-semantic conflicts: comprehension and processing patterns by deaf Chinese readers. Journal of Deaf Studies and Deaf Education, enae008. [paper link] [HSP slides]


Other selected publications & collaborative projects

Lnu, A., Hauptman, M., Sampson, M., Cheng, Q., & Bedny, M. (2024, Oct). Fronto-temporal language network highly selective for sign language relative to action semantics, regardless of iconicity. Poster presentation at the 16th Annual Meeting of the Society for the Neurobiology of Language (SNL).

Ghosh, M., Lowe-Hines, S., Crandall, A., Cheng, Q., Weaver, K., Ko, A., Ojemann, J., & Grannan, B. (2024, Oct). Network interactions mediate one-shot learning in the human brain. Poster presentation at the 54th Annual Meeting of the Society for Neuroscience, Chicago, IL.

Cheng, Q., Silvano, E., & Bedny, M. (2020). Sensitive periods in cortical specialization for language: Insights from studies with deaf and blind individuals. Current Opinion in Behavioral Sciences, 36, 169-176. [link] [full text access]

Caballero, G., & Cheng, Q. (2020). Person marking in Ja’a Kumiai (Yuman). Amerindia, 42, 23-47.