Conformal prediction provides finite-sample uncertainty guarantees, yet its reliability and sharpness are challenged by distribution shifts and by structure in high-dimensional outputs. In this talk, I will provide a high-level overview of recent work addressing these two challenges, including some of my own contributions.
First, I'll discuss work on modeling distribution shifts using Lévy–Prokhorov ambiguity sets, which capture both local noise and global mass shifts. Propagating these shifts through the score function yields closed-form robust quantiles and prediction sets with explicit coverage–robustness tradeoffs.
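To make this concrete, here is a minimal Python sketch of split conformal prediction with an LP-robust quantile. The parameter convention, where a global mass-shift budget rho inflates the quantile level and a local perturbation radius eps shifts the resulting threshold, is my illustrative assumption rather than the paper's exact formulation:

```python
import numpy as np

def lp_robust_quantile(scores, alpha, eps, rho):
    """Split-conformal quantile made robust to an assumed LP(eps, rho)
    ambiguity set: rho (global mass shift) raises the quantile level,
    eps (local perturbation) shifts the resulting threshold."""
    n = len(scores)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha + rho)) / n)
    return np.quantile(scores, level, method="higher") + eps

# Toy usage: calibration scores s_i = |y_i - f(x_i)|; the robust
# prediction set at x is then {y : |y - f(x)| <= q_rob}.
rng = np.random.default_rng(0)
cal_scores = np.abs(rng.normal(size=500))
q_rob = lp_robust_quantile(cal_scores, alpha=0.1, eps=0.05, rho=0.02)
print(f"robust threshold: {q_rob:.3f}")
```

The tradeoff is visible directly in the formula: larger rho or eps buys robustness to larger shifts at the cost of wider prediction sets.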
Second, I'll present CP4Gen, a conformal prediction method for conditional generative models based on clustering-driven density estimation of generated samples. CP4Gen yields compact, interpretable prediction sets and shows promising performance across synthetic, real-world, and climate datasets.
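The sketch below is my hypothetical reconstruction of this kind of scoring, not the released implementation: sample from the conditional generative model, fit a clustering-driven density to the samples, and conformalize the negative density. The placeholders gen_sample_fn, k, and m are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.stats import gaussian_kde

def mixture_density(samples, k=5):
    """Clustering-driven density estimate: k-means the generated
    samples, fit one KDE per cluster, mix by cluster weight."""
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(samples)
    comps = [(np.mean(labels == c), gaussian_kde(samples[labels == c].T))
             for c in range(k)
             if np.sum(labels == c) > samples.shape[1]]  # KDE needs n > d
    def density(y):
        pts = np.atleast_2d(y).T                         # shape (d, n_pts)
        return sum((w * kde(pts) for w, kde in comps),
                   np.zeros(pts.shape[1]))
    return density

def calibrate(gen_sample_fn, X_cal, y_cal, alpha=0.1, m=200):
    """Split-conformal calibration with score = -density(y | x);
    gen_sample_fn(x, m) is a placeholder for the generative model."""
    scores = [-mixture_density(gen_sample_fn(x, m))(y)[0]
              for x, y in zip(X_cal, y_cal)]
    n = len(scores)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return np.quantile(scores, level, method="higher")
```

Under this construction, the prediction set at a test point x is {y : -density(y | x) <= threshold}, a union of high-density regions around the clusters, which is why such sets stay compact and interpretable even for multimodal outputs.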
Together, these results suggest a path toward more reliable and robust uncertainty quantification in modern machine learning systems.