High-resolution μECoG-recording of auditory-evoked responses on the core and belt auditory cortex of the sheep
Published: 2 June 2015
Objective: Micro-electrocorticography (μECoG) is one of the most promising techniques for recording cortical activity with high spatial and temporal resolution for brain-machine interface (BMI) applications. We recently developed a sheep model for the chronic testing of μECoG-based BMI devices using somatosensory evoked potentials (SEP). The aim of the present study was to record auditory evoked potentials (AEP) on the sheep's cortex with a novel high-resolution μECoG array and to evaluate the use of AEPs for less stressful long-term assessment of implantable BMI devices.
Method: A 0.4 mm thick, 19 mm x 32 mm medical silicone grid with 32 platinum-iridium electrodes (contact diameter 1.1 mm, center-to-center distance 4 mm) was connected to a g.USBamp data acquisition system (Guger Technologies) and positioned on the temporal auditory cortex of three anesthetized sheep. Auditory stimulation was performed with 3 s signals consisting of 125 Hz bandwidth noise around a peak frequency of 8 kHz. μECoG data was recorded at 24-bit resolution with a 4.8 kHz sampling rate.
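The narrowband stimulus described above can be approximated by bandpass-filtering white noise around the peak frequency. The following is a minimal sketch, not the authors' actual stimulus code; the 48 kHz audio sampling rate, filter order, and normalization are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def narrowband_noise(center_hz=8000.0, bandwidth_hz=125.0,
                     duration_s=3.0, fs=48000, seed=0):
    """Generate band-limited noise: white noise bandpass-filtered
    around center_hz with the given bandwidth (values from the Methods;
    fs and filter order are illustrative assumptions)."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(int(duration_s * fs))
    low = center_hz - bandwidth_hz / 2.0
    high = center_hz + bandwidth_hz / 2.0
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    x = sosfiltfilt(sos, white)
    return x / np.max(np.abs(x))  # normalize to +/-1 full scale

stim = narrowband_noise()  # 3 s of 125 Hz wide noise centered at 8 kHz
```

A spectral check confirms that the energy is concentrated around 8 kHz, matching the stimulus specification.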
Results: Distinctive auditory-evoked responses were detectable in the alpha and gamma bands on both the auditory core and the belt region of the sheep's brain. We were able to differentiate between the belt region's transient activity at signal onset and offset and the core region's persistent activity during signal presentation.
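Band-specific responses like those described above are typically quantified by bandpass-filtering each channel and taking the amplitude envelope. The sketch below illustrates this generic approach on a synthetic single-channel trace; it is not the authors' analysis pipeline, and the conventional alpha (8-12 Hz) and gamma (30-100 Hz) band limits are assumptions, since the abstract does not specify them.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

FS = 4800  # sampling rate (Hz), matching the 4.8 kHz stated in the Methods

def band_envelope(x, low, high, fs=FS):
    """Bandpass-filter one channel and return its analytic amplitude envelope."""
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    return np.abs(hilbert(sosfiltfilt(sos, x)))

# Synthetic trace standing in for one μECoG channel: a 10 Hz (alpha-band)
# oscillation plus low-amplitude broadband noise.
rng = np.random.default_rng(1)
t = np.arange(int(2 * FS)) / FS
trace = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

alpha = band_envelope(trace, 8, 12)    # assumed alpha band
gamma = band_envelope(trace, 30, 100)  # assumed gamma band
```

For the synthetic trace, the alpha envelope dominates the gamma envelope, as expected for a 10 Hz oscillation; on real recordings the same envelopes would be averaged across trials and compared between core and belt electrodes.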
Conclusions: AEPs can be recorded on the sheep's auditory cortex with very high resolution using μECoG. This technique allows differentiation between the activity of the core and the belt regions of the sheep's auditory cortex. μECoG recording of AEPs may hold the key to stress-free long-term testing of BMI devices in sheep by avoiding the use of painful electrical stimuli to elicit somatosensory evoked potentials for functional evaluation. This work was funded by BMBF grant "Braincon" (0316064C) and supported by the BrainLinks-BrainTools Cluster of Excellence, funded by the German Research Foundation (DFG), grant number EXC 1086.