
Talk on Thursday

Sep 23, 2021 at 12:00
Place: Zoom Seminar
Series: Talk

Claudio Gallicchio, University of Pisa, Italy


Abstract:

Deep Neural Networks (DNNs) are a fundamental tool in the modern development of Machine Learning. Beyond the merits of the training algorithms, a great part of DNNs' success is due to the inherent properties of their layered architectures, i.e., to the introduced architectural biases. This talk explores recent classes of DNN models in which the majority of connections are untrained, i.e., randomized or, more generally, fixed according to some specific heuristic.
Restricting the training algorithms to a reduced set of weights has intriguing implications. Among them, the extreme efficiency of the learning process is undoubtedly a striking advantage over fully trained counterparts. Moreover, despite the involved simplifications, randomized neural systems possess remarkable properties both in practice, achieving state-of-the-art results in multiple domains, and in theory, allowing us to analyze intrinsic properties of neural architectures.
This talk will cover the major aspects regarding Deep Randomized Neural Networks, with a particular focus on Deep Reservoir Computers for time-series and graphs.
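The central idea of the talk, keeping the recurrent connections fixed and random while training only a linear readout, can be sketched with a minimal Echo State Network in NumPy. This is a toy illustration, not the speaker's implementation; the reservoir size, weight ranges, spectral radius, washout length, and ridge coefficient below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.2 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Reservoir: random, fixed (untrained) weights; only the readout is learned.
n_res = 100
W_in = rng.uniform(-0.5, 0.5, size=n_res)        # input weights, untrained
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))  # recurrent weights, untrained
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius to 0.9

# Drive the reservoir with the input and collect its states.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in * inputs[t] + W @ x)
    states[t] = x

# Train only the linear readout, via ridge regression, after a washout period.
washout = 100
S, y = states[washout:], targets[washout:]
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)

pred = S @ W_out
mse = np.mean((pred - y) ** 2)
print(f"readout MSE: {mse:.2e}")
```

Because the recurrent weights are never updated, "training" reduces to a single linear solve, which is the efficiency advantage over fully trained recurrent networks mentioned in the abstract.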

 

Zoom link: https://uibuniversitat.zoom.us/j/83043347066

Miguel C. Soriano
TEL: 971 17 13 14
E-mail: miguel@ifisc.uib-csic.es

 

If you are not a member of IFISC and want to unsubscribe from this list, send an email to semfis-unsubscribe@ifisc.uib-csic.es and then reply to the confirmation email.