BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//132.216.98.100//NONSGML kigkonsult.se iCalcreator 2.20.4//
BEGIN:VEVENT
UID:20251105T093114EST-4668BfwTix@132.216.98.100
DTSTAMP:20251105T143114Z
DESCRIPTION:Title: A Central Limit Theorem for Deep Neural Networks and Products of Random Matrices\n\nAbstract: We study the output-input Jacobian matrix of deep ReLU neural networks initialized with random weights. We reduce the problem to studying certain products of random matrices and show that the norms of the columns of this matrix are approximately log-normally distributed. The result holds for a large class of random weights. The variance depends on the depth-to-width aspect ratio of the network\; this result explains why very deep networks can suffer from the 'vanishing and exploding' gradient problem that makes such networks difficult to train. Based on joint work with Boris Hanin.\n
DTSTART:20181112T190000Z
DTEND:20181112T200000Z
LOCATION:Room 1205\, Burnside Hall\, 805 rue Sherbrooke Ouest\, Montreal\, QC\, H3A 0B9\, CA
SUMMARY:Mihai Nica (University of Toronto)
URL:/mathstat/channels/event/mihai-nica-university-toronto-291431
END:VEVENT
END:VCALENDAR