


Speech enhancement using U-nets with wide-context units


[ 1 ] Institute of Automation and Robotics, Faculty of Automation, Robotics and Electrical Engineering, Poznań University of Technology (Politechnika Poznańska)

Scientific discipline (Law 2.0)

[2.2] Automation, electronics and electrical engineering

Year of publication

2022

Published in

Multimedia Tools and Applications

Journal year: 2022 | Journal volume: vol. 81 | Journal number: iss. 13

Article type

scientific article

Publication language

English

Keywords

  • speech enhancement
  • U-nets
  • DNN

EN In this article a new neural network for speech enhancement is proposed, in which single-channel noisy speech is processed to improve its intelligibility and quality. The network is based on the U-net architecture, i.e. it is composed of two main blocks, an encoder and a decoder, with skip connections linking some of the corresponding encoder and decoder layers. In most encoder-decoder neural networks for speech enhancement known from the literature, the time-frequency resolution of the hidden feature maps is reduced. The main strategy of the presented approach is to maintain the time-frequency resolution of the feature maps at all levels of the network while retaining a large receptive field. To obtain features that depend on a wide context, neural network units based on recurrent cells or dilated convolutions are proposed. The proposed neural network was evaluated using WSJ0 and TIMIT speech data mixed with noises from Noisex, DCASE and field recordings from the Freesound online database. The results showed an improvement over baseline networks based on gated dilated convolutions or long short-term memory (LSTM) in terms of the scale-invariant signal-to-distortion ratio (SI-SDR), short-time objective intelligibility (STOI) and perceptual evaluation of speech quality (PESQ) measures.
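The abstract's central trade-off is that stride-1 (resolution-preserving) convolutions normally have a small receptive field, while dilated convolutions enlarge it without downsampling. A minimal sketch of that bookkeeping, assuming stride-1 layers; the kernel sizes and dilation schedules below are illustrative choices, not taken from the paper:

```python
def receptive_field(kernel_sizes, dilations):
    """Receptive field (in samples) of a stack of stride-1 convolutions.

    With stride 1 the time-frequency resolution of the feature maps is
    preserved at every layer, while a layer with kernel size k and
    dilation d widens the receptive field by (k - 1) * d samples.
    """
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += (k - 1) * d
    return rf

# Five plain 3-tap layers grow the context only linearly,
# while exponentially increasing dilations grow it geometrically:
plain = receptive_field([3] * 5, [1, 1, 1, 1, 1])      # 11 samples
dilated = receptive_field([3] * 5, [1, 2, 4, 8, 16])   # 63 samples
```

This illustrates why dilated stacks can cover a wide context while every feature map keeps the input's time-frequency resolution.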
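Of the reported measures, SI-SDR is straightforward to sketch from its common definition (project the estimate onto the reference, then compare target and distortion energies). A hedged reference implementation, assuming that standard definition; the function and variable names are mine, not the paper's:

```python
import numpy as np

def si_sdr(estimate, reference):
    """Scale-invariant signal-to-distortion ratio in dB.

    The reference is rescaled by the optimal projection factor, so the
    score does not change if the estimate is multiplied by a constant.
    """
    estimate = np.asarray(estimate, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Optimal scaling factor projecting the estimate onto the reference
    alpha = np.dot(estimate, reference) / np.dot(reference, reference)
    target = alpha * reference
    distortion = estimate - target
    return 10.0 * np.log10(np.dot(target, target) / np.dot(distortion, distortion))
```

Because of the projection step, rescaling the estimate leaves the score unchanged, while adding more noise lowers it, which is the behavior an enhancement benchmark needs.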

Pages (from - to)

18617 - 18639




Impact Factor

2.757 [List 2020]
