Prior computational studies have examined hundreds of visual characteristics related to color, texture, and composition in an attempt to predict human emotional responses. Beyond the myriad features examined in computer science, roundness, angularity, and visual complexity have also been shown to evoke emotions in human perceivers, as demonstrated in psychological studies of facial expressions, dance poses, and even simple synthetic visual patterns. Capturing these characteristics algorithmically for use in computational studies, however, has proven difficult. Here we expand the scope of previous computer vision work by examining these three visual characteristics in computer analysis of complex scenes and comparing the results against the hundreds of visual qualities previously examined. A large collection of ecologically valid stimuli (i.e., photos that humans regularly encounter on the web), named the EmoSet and comprising more than 40,000 images crawled from web albums, was assembled and annotated with human emotion ratings through crowdsourcing. We developed computational methods to compute separate indices of roundness, angularity, and complexity, thereby establishing three new computational constructs. Critically, these three physically interpretable visual constructs achieve classification accuracy comparable to that of the hundreds of shape, texture, composition, and facial feature characteristics previously examined. In addition, our experimental results show that color features related most strongly to the positivity of perceived emotions, texture features related more to calmness or excitement, and roundness, angularity, and complexity related similarly to both of these emotion dimensions.
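To make the notion of a shape index concrete, the following is a minimal sketch of how roundness and angularity could be quantified for a polygonal contour. These particular formulas (the isoperimetric ratio for roundness and the mean turning angle for angularity) are illustrative assumptions and not the feature definitions used in the study itself:

```python
import math

def shoelace_area(pts):
    # Polygon area via the shoelace formula (absolute value).
    n = len(pts)
    s = 0.0
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def perimeter(pts):
    # Sum of edge lengths around the closed polygon.
    n = len(pts)
    return sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

def roundness(pts):
    # Isoperimetric ratio 4*pi*A / P^2: 1.0 for a circle, lower for
    # more angular shapes (illustrative index, not the paper's method).
    p = perimeter(pts)
    return 4.0 * math.pi * shoelace_area(pts) / (p * p)

def angularity(pts):
    # Mean absolute turning angle (radians) at each vertex: high for
    # sharp corners, near zero for a smooth, circle-like contour.
    n = len(pts)
    total = 0.0
    for i in range(n):
        ax, ay = pts[i - 1]
        bx, by = pts[i]
        cx, cy = pts[(i + 1) % n]
        ux, uy = bx - ax, by - ay
        vx, vy = cx - bx, cy - by
        total += abs(math.atan2(ux * vy - uy * vx, ux * vx + uy * vy))
    return total / n

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
# Regular 64-gon approximating a unit circle.
poly = [(math.cos(2 * math.pi * k / 64), math.sin(2 * math.pi * k / 64))
        for k in range(64)]

print(round(roundness(square), 3))   # pi/4, about 0.785
print(round(roundness(poly), 3))     # close to 1.0
```

A circle maximizes the isoperimetric ratio, so the square scores markedly lower than the 64-gon on roundness while scoring higher on angularity, matching the intuition that angular shapes and round shapes occupy opposite ends of these indices.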