The posterior in probabilistic programs with stochastic support decomposes as a weighted sum of the local posterior distributions associated with each possible program path. We show that making predictions with this full posterior implicitly performs a Bayesian model averaging (BMA) over paths. This is potentially problematic, as BMA weights can be unstable due to model misspecification or inference approximations, in turn leading to sub-optimal predictions. To remedy this issue, we propose alternative mechanisms for path weighting: one based on stacking and one based on ideas from PAC-Bayes. We show how both can be implemented as a cheap post-processing step on top of existing inference engines. In our experiments, we find that they are more robust and lead to better predictions than the default BMA weights.
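
As a sketch of the setup (the notation here is illustrative, not taken from the original): write $\theta$ for the latent variables, $\mathcal{D}$ for the data, and $A_k$ for the event that path $k$ of $K$ possible paths is taken. The path decomposition, the implied BMA predictive, and the standard stacking objective (which replaces the BMA weights with weights chosen for held-out predictive performance) then take the form
\[
  p(\theta \mid \mathcal{D})
    = \sum_{k=1}^{K} \underbrace{p(A_k \mid \mathcal{D})}_{w_k^{\mathrm{BMA}}}\,
      p(\theta \mid \mathcal{D}, A_k),
  \qquad
  p(y^{*} \mid \mathcal{D})
    = \sum_{k=1}^{K} w_k^{\mathrm{BMA}}\, p(y^{*} \mid \mathcal{D}, A_k),
\]
\[
  w^{\mathrm{stack}}
    = \operatorname*{arg\,max}_{w \in \Delta^{K}}
      \sum_{i=1}^{n} \log \sum_{k=1}^{K}
        w_k\, p(y_i \mid \mathcal{D}_{-i}, A_k),
\]
where $\Delta^{K}$ is the probability simplex and $\mathcal{D}_{-i}$ denotes the data with observation $i$ held out. Both post-processing schemes only re-weight the per-path predictive densities, which is why they can sit cheaply on top of an existing inference engine.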