Modeling the risk of extreme weather events in a changing climate is essential for developing effective adaptation and mitigation strategies. Accurate risk and impact assessment often demands high-resolution detail about future conditions, which is expensive for climate models to resolve with quantified certainty. Climate science's grand computing challenges thus emerge from nonlinearity, high dimensionality, and uncertainty in mapping, climate projection, downscaling, and decision programming. In this talk, I discuss how machine learning improves our ability to estimate time-dependent climate risk, including the search for rare events. I then focus on two problems.

The first is downscaling, familiar as super-resolution in image processing and computer vision. I will show that with a little statistical-physical priming, in the form of rudimentary physics and an active conditional Gaussian process, generative adversarial learning is quite effective at spatially downscaling extreme mid-latitude precipitation, a key driver of flood risk, from 0.25° to 0.01° resolution, capturing both the spatial detail and the distribution of rainfall extremes.

The second problem concerns effective high-resolution projections that incorporate dynamics learned from data into numerical models; we call such hybrid learning machines Neural Dynamical Systems (NDS). By treating learning as a stochastic process, we show that an ensemble approximation to the attendant Fokker-Planck equations yields an adjoint-free, fast, and parallel Ensemble Kalman Learning algorithm that also quantifies uncertainty. Paired with assessments of information flow and information gain, it becomes feasible to select data and prioritize parameter updates for online learning, and to synthesize stable, structure-optimized NDS. This opens an exciting new frontier: learning governing equations directly for climate risk.
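To make the first problem concrete, below is a minimal sketch of a conditional-GAN super-resolution scaffold in PyTorch. It shows only the generic adversarial downscaling setup; the talk's statistical-physical priming and active conditional Gaussian process are not reproduced, and the architecture sizes, upsampling factor, loss weights, and synthetic stand-in data are illustrative assumptions.

```python
# Hedged sketch: a generic conditional-GAN scaffold for precipitation
# super-resolution. The physics priming and conditional Gaussian process
# from the talk are NOT reproduced; sizes and names are assumptions.
import torch
import torch.nn as nn

UPSCALE = 4  # assumed per-stage upsampling factor (0.25° -> 0.01° is ~25x overall)

class Generator(nn.Module):
    """Maps a coarse precipitation field to a finer one."""
    def __init__(self, channels=1, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width * UPSCALE**2, 3, padding=1),
            nn.PixelShuffle(UPSCALE),                  # learned upsampling
            nn.Conv2d(width, channels, 3, padding=1),
            nn.Softplus(),                             # precipitation is non-negative
        )
    def forward(self, coarse):
        return self.net(coarse)

class Discriminator(nn.Module):
    """Scores fine fields conditioned on the upsampled coarse field."""
    def __init__(self, channels=1, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2 * channels, width, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(width, width, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(width, 1),
        )
    def forward(self, coarse, fine):
        cond = nn.functional.interpolate(coarse, size=fine.shape[-2:], mode="bilinear")
        return self.net(torch.cat([cond, fine], dim=1))

# One adversarial training step on synthetic stand-in data.
G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

coarse = torch.rand(8, 1, 16, 16)                    # stand-in coarse patches
fine = torch.rand(8, 1, 16 * UPSCALE, 16 * UPSCALE)  # stand-in high-res truth

# Discriminator update: real pairs vs. generated pairs.
fake = G(coarse).detach()
loss_d = bce(D(coarse, fine), torch.ones(8, 1)) + bce(D(coarse, fake), torch.zeros(8, 1))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Generator update: fool the discriminator, plus an L1 content penalty.
gen = G(coarse)
loss_g = bce(D(coarse, gen), torch.ones(8, 1)) + 10.0 * nn.functional.l1_loss(gen, fine)
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```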
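For context on "treating learning as a stochastic process," the following is the generic connection between a parameter ensemble and a Fokker-Planck equation, not necessarily the talk's exact formulation: if learned parameters $\theta_t$ evolve under a stochastic differential equation, their density obeys a Fokker-Planck equation, and an ensemble of samples furnishes a derivative-free empirical approximation to that density.

\[
d\theta_t = f(\theta_t)\,dt + \Sigma^{1/2}\,dW_t
\;\Longrightarrow\;
\partial_t \rho(\theta,t) = -\nabla\!\cdot\!\big(f(\theta)\,\rho\big) + \tfrac{1}{2}\,\nabla\!\cdot\!\big(\Sigma\,\nabla\rho\big),
\qquad
\rho(\theta,t) \approx \frac{1}{J}\sum_{j=1}^{J}\delta\big(\theta - \theta_t^{(j)}\big).
\]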
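For the second problem, here is a hedged sketch of a standard ensemble Kalman inversion loop (in the style of Iglesias, Law, and Stuart) standing in for the talk's Ensemble Kalman Learning: it updates parameters with no gradients or adjoints, runs the forward model independently across the ensemble (hence parallel), and reads uncertainty off the ensemble spread. The toy forward model, ensemble size, noise level, and iteration count are assumptions; the talk's information-flow and information-gain machinery for data selection is not shown.

```python
# Hedged sketch: adjoint-free Ensemble Kalman Inversion (EKI) for learning
# model parameters from data; the toy model and tuning are assumptions.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 20)

def forward(theta):
    """Toy 'dynamics': an oscillation with theta = (amplitude, frequency)."""
    A, omega = theta
    return A * np.sin(omega * t)

# Synthetic observations from a hidden truth, with observation noise.
theta_true = np.array([2.0, 3.0])
gamma = 0.05
y = forward(theta_true) + gamma * rng.standard_normal(t.size)

# Initial ensemble of parameter guesses; updates need no gradients/adjoints.
J = 100
ensemble = rng.normal([1.0, 2.0], 0.5, size=(J, 2))

for _ in range(20):
    G = np.array([forward(th) for th in ensemble])      # embarrassingly parallel model runs
    theta_mean, g_mean = ensemble.mean(0), G.mean(0)
    dtheta, dg = ensemble - theta_mean, G - g_mean
    C_tg = dtheta.T @ dg / (J - 1)                      # cross-covariance C^{theta,g}
    C_gg = dg.T @ dg / (J - 1)                          # output covariance C^{g,g}
    K = C_tg @ np.linalg.inv(C_gg + gamma**2 * np.eye(t.size))  # Kalman-style gain
    y_pert = y + gamma * rng.standard_normal((J, t.size))       # perturbed observations
    ensemble = ensemble + (y_pert - G) @ K.T            # derivative-free update

print("posterior mean:", ensemble.mean(0))              # should approach theta_true
print("posterior std :", ensemble.std(0))               # spread quantifies uncertainty
```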