Higher phone dependence was observed in females than in males. Based on these results, we propose a model of how smartphone dependence may be linked to aberrations in brain networks, sex, sleep disturbances, and depression in adolescents.

We have previously shown that invasive vagus nerve stimulation (VNS) improves attention and working memory and alters emotion-attention interaction in patients with refractory epilepsy, suggesting that VNS might be useful in the treatment of cognitive impairment. The current research focuses on whether non-invasive, transcutaneous vagus nerve stimulation (tVNS) has effects similar to those of VNS. Furthermore, we aimed to assess whether tVNS has an impact on cognitive control in general, or on the underlying brain physiology, in a task that mimics everyday-life demands, where multiple executive functions are engaged while intervening emotional stimuli are encountered. Event-related potentials (ERPs) evoked in such a task, specifically the centro-parietal P3 and the frontal N2, were used as biomarkers for attention allocation and the cognitive control required to carry out the task. A single-blinded, sham-controlled, within-subject study on healthy subjects (n = 25) was conducted using the Executive Reaction Time Test (RT-test), a Go/NoGo task engaging multiple executive functions. The ERP findings, along with unaltered task performance during tVNS, suggest that fewer cognitive control resources were required to successfully withhold a prepotent response. Though caution is warranted, we suggest that tVNS may lead to more efficient neural processing, with fewer resources needed for successful cognitive control, showing promise for its potential use in cognitive enhancement.

It has long been hypothesized that pretend play is beneficial to social and cognitive development. However, there is little evidence regarding the neural regions that are active while children engage in pretend play. We examined the activation of prefrontal and posterior superior temporal sulcus (pSTS) regions using near-infrared spectroscopy while 42 children aged 4 to 8 years freely played with dolls or tablet games, either with a social partner or by themselves. Social play activated right prefrontal regions more than solo play did. Children engaged the pSTS during solo doll play but not during solo tablet play, suggesting that they were rehearsing social cognitive skills more with dolls. These findings suggest that social play engages multiple neural regions and highlight how doll play can achieve similar patterns of activation, even when children play by themselves. Doll play may provide a unique opportunity for children to practice social interactions important for developing social-emotional skills, such as empathy.

This paper addresses how impairments in prediction in young adults with autism spectrum disorder (ASD) relate to their behavior during collaboration. To assess this, we developed a task in which participants collaborate with a synthetic agent to maximize their score. The agent's behavior changes during the different phases of the game, requiring participants to model the agent's sensorimotor contingencies in order to play collaboratively. Our results (n = 30, 15 per group) show differences between autistic and neurotypical individuals in their behavioral adaptation to the partner. In contrast, there are no differences in self-reports of that collaboration.

With auditory attentional modulation, the attended speech stream can be detected robustly and decoded from electroencephalographic (EEG) data, even in adverse auditory scenarios.
Speech segmentation based on relative root-mean-square (RMS) intensity can be used to estimate segmental contributions to perception in noisy conditions, and high-RMS-level segments contain crucial information for speech perception. Hence, this study aimed to investigate the effect of high-RMS-level speech segments on auditory attention decoding performance under various signal-to-noise ratio (SNR) conditions. Scalp EEG signals were recorded while subjects listened to the attended speech stream in a mixture narrated concurrently by two Mandarin speakers. The temporal response function was used to identify the attended speech from EEG responses tracking the temporal envelopes of intact speech and of high-RMS-level speech segments alone, respectively. Auditory decoding performance was then analyzed under various SNR conditions by comparing EEG correlations with the attended and ignored speech streams. The accuracy of auditory attention decoding based on the temporal envelope of high-RMS-level speech segments was not inferior to that based on the temporal envelope of intact speech. Cortical activity correlated more strongly with attended than with ignored speech under the different SNR conditions. These results suggest that EEG recordings corresponding to high-RMS-level speech segments carry crucial information for the identification and tracking of attended speech in the presence of background noise. The study also showed that, with the modulation of auditory attention, attended speech can be decoded more robustly from neural activity than from behavioral measures across a wide range of SNRs.

Fast, online control of movement is an essential component of human motor skills, as it allows automatic correction of inaccurate planning. The present study explores the role of two types of concurrent signals in error correction: predicted visual reafferences, derived from an internal representation of the hand, and actual visual feedback from the hand. While the role of sensory feedback in these corrections is well established, much less is known about sensory prediction. The relative contributions of these two types of signals remain a subject of debate, as they are naturally interconnected. We address the issue in a study that compares online correction of an artificially induced, undetected planning error. Two conditions are tested, which differ only in the accuracy of the predicted visual reafferences. In the first, "Prism" experiment, a planning error is introduced by prisms that laterally displace the seen hand prior to movement onset. The prism-induced conflict between visual and proprioceptive inputs of the hand also generates an erroneous prediction of the visual reafferences of the moving hand.
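As a reading aid for the auditory attention decoding study summarized above, the following Python sketch shows one way its two ingredients could fit together: segmenting a speech envelope by relative RMS level, and reconstructing the envelope from time-lagged EEG with a ridge-regression (backward) decoder before comparing correlations with the attended and ignored streams on intact versus high-RMS-only samples. This is a minimal sketch on synthetic data; the sampling rate, frame length, dB criterion, lag range, regularization, and the backward-decoder formulation are all assumptions for illustration, not the study's actual pipeline or parameters.

```python
# Illustrative sketch only: synthetic signals stand in for real speech envelopes and EEG,
# and all parameters below are assumptions rather than the study's settings.
import numpy as np

rng = np.random.default_rng(0)
fs = 64                                   # envelope/EEG rate in Hz (assumed)
n_samples, n_channels = fs * 60, 32       # one minute of data, 32 EEG channels

def fake_envelope():
    """Smoothed rectified noise as a stand-in for a speech temporal envelope."""
    return np.convolve(np.abs(rng.standard_normal(n_samples)), np.ones(8) / 8, mode="same")

attended, ignored = fake_envelope(), fake_envelope()
# Simulated EEG: a delayed, noisy projection of the attended envelope.
mixing = rng.standard_normal(n_channels)
eeg = np.outer(np.roll(attended, 8), mixing) + rng.standard_normal((n_samples, n_channels))

def high_rms_mask(envelope, frame_len, threshold_db=0.0):
    """Flag frames whose RMS is at least `threshold_db` relative to the overall RMS.

    Published RMS-based segmentations define level bands with specific dB cut-offs
    relative to the utterance's peak or overall RMS; 0 dB here is only a placeholder.
    """
    n_frames = len(envelope) // frame_len
    frames = envelope[: n_frames * frame_len].reshape(n_frames, frame_len)
    frame_rms = np.sqrt((frames ** 2).mean(axis=1))
    overall_rms = np.sqrt((envelope ** 2).mean())
    keep = 20 * np.log10(frame_rms / overall_rms + 1e-12) >= threshold_db
    return np.repeat(keep, frame_len)      # sample-wise boolean mask

def ridge_reconstruct(eeg, target, max_lag=16, alpha=1e2):
    """Reconstruct a stimulus envelope from time-lagged EEG using ridge regression."""
    # EEG follows the stimulus, so features at time t are EEG samples t .. t + max_lag - 1.
    X = np.concatenate([np.roll(eeg, -lag, axis=0) for lag in range(max_lag)], axis=1)
    w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ target)
    return X @ w

recon = ridge_reconstruct(eeg, attended)   # no train/test split: demonstration only
mask = high_rms_mask(attended, frame_len=fs // 4)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

print("intact envelope :  attended r=%.3f   ignored r=%.3f"
      % (corr(recon, attended), corr(recon, ignored)))
print("high-RMS frames :  attended r=%.3f   ignored r=%.3f"
      % (corr(recon[mask], attended[mask]), corr(recon[mask], ignored[mask])))
```

In a real analysis the decoder would be trained and evaluated on separate data, and attention would be decoded by checking whether the reconstruction correlates more strongly with the attended or the ignored envelope; the sketch only contrasts correlations computed over all samples with those restricted to high-RMS segments.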