One of the most important functions of Voting Advice Applications (VAAs) is to offer a set of results through which users can gauge how closely their stated political preferences match those of parties/candidates. Such results range from a simple bar chart of overall agreement to spatial maps that, in most cases, assume a two-dimensional political space. In addition, many VAAs offer policy-specific information comparing users' issue preferences with those of parties/candidates. Much VAA-related research, especially research concerned with VAA effects, is predicated on the assumption that users do, in fact, engage with the results the VAA provides. To date, however, this assumption has received scant attention in the literature beyond users' self-reports of satisfaction. In this paper we report findings from a VAA setup that allows us to examine how users engage with the information these tools provide. In addition to presenting descriptive statistics of user engagement, we investigate how individual-level characteristics (such as demographics), technology-related variables (such as the type of device used), and stimulus characteristics of the "advice" provided (e.g., whether it is discrepant with self-reported preferences) can affect users' engagement with their results. Our findings are of interest not only for VAA design but also for a broader literature on the effects of political information provision.