BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//talks.is.ed.ac.uk//EN
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:19700329T010000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:19701025T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:All Mathematics
SUMMARY:Analysis of p-Laplacian Regularization in Semi-Supervised Learning - Matthew Thorpe (Cambridge)
DTSTART;TZID=Europe/London:20171004T160000
DTEND;TZID=Europe/London:20171004T170000
UID:TALK1116
URL:http://talks.is.ed.ac.uk/talk/1116/show
DESCRIPTION:The talk concerns a family of regression problems in a semi-supervised setting. The task is to assign real-valued labels to a set of n sample points, given a small training subset of N labelled points. A goal of semi-supervised learning is to take advantage of the (geometric) structure provided by the large number of unlabelled data points when assigning labels. In this talk the geometry is represented by the random geometric graph model with connection radius r(n). The framework is to consider objective functions which reward the regularity of the estimator function and impose or reward agreement with the training data; more specifically, we will consider discrete p-Laplacian regularization.\nThe talk concerns the asymptotic behaviour in the limit where the number of unlabelled points increases while the number of training points remains fixed. The results uncover a delicate interplay between the regularizing nature of the functionals considered and the nonlocality inherent to the graph constructions. I will give almost optimal ranges for the scaling of r(n) under which asymptotic consistency holds. For the standard approaches used thus far there is a restrictive upper bound on how quickly r(n) must converge to zero as n goes to infinity. Furthermore, I will present a new model which overcomes this restriction. It is as simple as the standard models, but converges as soon as r(n) -> 0.\nThis is joint work with Dejan Slepcev (CMU)
LOCATION:JCMB 5323
CONTACT:Kostas Zygalakis (kzygalak)
END:VEVENT
BEGIN:VEVENT
CATEGORIES:All Mathematics
SUMMARY:Heat kernels in graphs: A journey from random walks to geometry, and back - He Sun (Edinburgh)
DTSTART;TZID=Europe/London:20171011T160000
DTEND;TZID=Europe/London:20171011T170000
UID:TALK1145
URL:http://talks.is.ed.ac.uk/talk/1145/show
DESCRIPTION:Heat kernels are one of the most fundamental concepts in physics and mathematics. In physics, the heat kernel is a fundamental solution of the heat equation and connects the Laplacian operator to the rate of heat dissipation. In spectral geometry, many fundamental techniques are based on heat kernels. In finite Markov chain theory, heat kernels correspond to continuous-time random walks and constitute one of the most powerful techniques for estimating the mixing time.\nIn this talk, we will briefly discuss this line of research and its relation to heat kernels in graphs. In particular, we will see how heat kernels are used to design the first nearly-linear time algorithm for finding clusters in real-world graphs. Some interesting open questions will be addressed at the end.
LOCATION:JCMB 5323
CONTACT:Kostas Zygalakis (kzygalak)
END:VEVENT
BEGIN:VEVENT
CATEGORIES:All Mathematics
SUMMARY:TBA - Catherine Powell (Manchester)
DTSTART;TZID=Europe/London:20171018T160000
DTEND;TZID=Europe/London:20171018T170000
UID:TALK1146
URL:http://talks.is.ed.ac.uk/talk/1146/show
DESCRIPTION:
LOCATION:JCMB 5323
CONTACT:Kostas Zygalakis (kzygalak)
END:VEVENT
END:VCALENDAR