BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//132.216.98.100//NONSGML kigkonsult.se iCalcreator 2.20.4//
BEGIN:VEVENT
UID:20250918T055436EDT-1021fSkTsv@132.216.98.100
DTSTAMP:20250918T095436Z
DESCRIPTION:Title:\n\nFeature Learning in Two-layer Neural Networks: The Effect of Data Covariance\n\nAbstract:\n\nWe study the effect of gradient-based optimization on feature learning in two-layer neural networks. We consider a setting where the number of samples is of the same order as the input dimension and show that\, when the input data is isotropic\, gradient descent always improves upon the initial random features model in terms of prediction risk\, for a certain class of targets. Further leveraging the practical observation that data often contains additional structure\, i.e.\, the input covariance has non-trivial alignment with the target\, we prove that the class of learnable targets can be significantly extended\, demonstrating a clear separation between kernel methods and two-layer neural networks in this regime.\n
DTSTART:20231002T200000Z
DTEND:20231002T210000Z
LOCATION:Room 1104\, Burnside Hall\, 805 rue Sherbrooke Ouest\, Montreal\, QC H3A 0B9\, CA
SUMMARY:Murat A. Erdogdu (University of Toronto)
URL:/mathstat/channels/event/murat-erdogdu-university-toronto-351399
END:VEVENT
END:VCALENDAR