In clinical decision-making, relevant scientific publications and their associated medical images can provide valuable insight. However, effectively searching across both text and image data is a demanding task. Within image search specifically, finding similar images (or regions within images) poses a further hurdle to effective knowledge dissemination. We therefore propose a method that uses local regions within images to perform and refine medical image retrieval. In our first example, we define and extract large, characteristic regions within an image and show how these regions can be used to match a query image to similar content. In our second example, we enable the formulation of a mixed query combining text, image, and region information to better capture the end user's search intent. Using this new framework for region-based queries, we demonstrate an improved set of similar search results.