BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//132.216.98.100//NONSGML kigkonsult.se iCalcreator 2.20.4//
BEGIN:VEVENT
UID:20250709T203638EDT-7694R6hunb@132.216.98.100
DTSTAMP:20250710T003638Z
DESCRIPTION:Title: Universal Approximation Theorems\n\nAbstract: The universal approximation theorem established the density of specific families of neural networks in the space of continuous functions and in certain Bochner-Lebesgue spaces\, defined between any two Euclidean spaces. We extend and refine this result by proving that there exist dense neural network architectures on a larger class of function spaces\, and that these architectures may be written down using only a small number of functions. Refinements of the classical results of Hornik (1989) are also obtained. We prove that\, upon appropriately randomly selecting the neural network architecture's activation function\, we may still obtain a dense set of neural networks with positive probability. This last result is used to overcome the difficulty of appropriately selecting an activation function in more exotic architectures.\n
DTSTART:20191016T193000Z
DTEND:20191016T203000Z
LOCATION:Room LB 921-4\, CA\, Seminar Statistique Concordia
SUMMARY:Anastasis Kratsios\, ETH Zurich
URL:/mathstat/channels/event/anastasis-kratsios-eth-zurich-301625
END:VEVENT
END:VCALENDAR