This paper reports on how emotional states elicited by affective sounds can be effectively recognized by means of estimates of Autonomic Nervous System (ANS) dynamics. Specifically, emotional states are modeled as a combination of arousal and valence dimensions according to the well-known circumplex model of affect, whereas ANS dynamics are estimated exclusively through standard and nonlinear analyses of heart rate variability (HRV) derived from the electrocardiogram (ECG). In addition, lagged Poincaré plots of the HRV series were taken into account. The affective sounds were gathered from the International Affective Digitized Sound System and grouped into four levels of arousal (intensity) and two levels of valence (unpleasant and pleasant). These standardized stimuli were administered to a group of 27 healthy volunteers while their ECG signals were continuously recorded. The HRV features showing significant changes (p<0.05 in statistical tests) across the arousal and valence dimensions were then used as input to an automatic classification system for recognizing the four classes of arousal and the two classes of valence. Experimental results demonstrate that a quadratic discriminant classifier, evaluated through a Leave-One-Subject-Out procedure, achieved a recognition accuracy of 84.72% on the valence dimension and 84.26% on the arousal dimension.
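
The sketch below illustrates, under stated assumptions, two of the steps mentioned above: the SD1/SD2 descriptors of a lag-m Poincaré plot computed from an RR-interval series, and the Leave-One-Subject-Out evaluation of a quadratic discriminant classifier. It is not the authors' implementation, and the variable names (`rr_ms`, `X`, `y`, `subject_ids`) and the reduced feature set are illustrative placeholders; the paper's full standard and nonlinear HRV feature extraction and statistical feature selection are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.metrics import accuracy_score
from sklearn.model_selection import LeaveOneGroupOut


def lagged_poincare_sd(rr_ms, lag=1):
    """SD1/SD2 descriptors of the lag-m Poincare plot of an RR-interval series (ms)."""
    rr_ms = np.asarray(rr_ms, dtype=float)
    x, y = rr_ms[:-lag], rr_ms[lag:]            # pairs (RR_n, RR_{n+lag})
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)  # dispersion across the identity line (short-term)
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)  # dispersion along the identity line (long-term)
    return sd1, sd2


def loso_qda_accuracy(X, y, subject_ids):
    """Leave-One-Subject-Out accuracy of a quadratic discriminant classifier."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    y_pred = np.empty_like(y)
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subject_ids):
        clf = QuadraticDiscriminantAnalysis()
        clf.fit(X[train_idx], y[train_idx])            # train on all subjects but one
        y_pred[test_idx] = clf.predict(X[test_idx])    # predict the held-out subject's trials
    return accuracy_score(y, y_pred)
```

In such a scheme, each per-stimulus feature vector in `X` (e.g., SD1/SD2 at several lags together with other HRV indices) carries an arousal or valence label in `y`, and grouping folds by `subject_ids` ensures that no trial from the test subject leaks into the training set.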