Although users may not have a definite idea of what images they would be interested in seeing, or are unable to communicate their preferences given the available metadata attributes, they may instead be able to iteratively refine selections to find images of interest. The presentation of a sample from a large set of images can stimulate memories; users can then follow paths through photo space by indicating that they would like to see more images “similar” to one or more of those displayed. Rich similarity metrics are essential for effective navigation by this means. This style of interaction has much in common with Bates’ “berry picking” model of information retrieval (Bates, 1989). In this model, users wander through an information space, finding results and modifying their queries as they go; the user’s final goal adapts as they move through the results of each previous query. This approach is well suited to developing BCI tools for photo browsing. The idea is to combine BCI with simple image search techniques: using a P300-based BCI, users mentally select pictures representing possible categories in their photo archives, and image search techniques then retrieve similar pictures. In fact, there is some preliminary work that follows this P300 approach (Touyama, 2008). Also of interest is the use of rapid serial visual presentation (RSVP) paradigms for image triage (Gerson et al., 2006). In this approach, users watch many images presented at a high rate (e.g., 4 Hz), and the presence of a P300 evoked potential indicates images of interest, which are ranked at the top of the final selection.
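To make the two pipelines concrete, the sketch below shows (a) one “berry picking” step in which the image eliciting the strongest P300 response seeds a similarity search over the archive, and (b) RSVP-style triage in which all presented images are ranked by their P300 scores. This is a minimal illustration, not the method of the cited works: the function names (`p300_scores`, `browse_step`, `rsvp_triage`), the use of a linear discriminant as the P300 classifier, the pre-extracted EEG epochs, and the Euclidean distance over precomputed image descriptors are all assumptions standing in for calibrated EEG processing and richer similarity metrics.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def p300_scores(epochs, clf):
    """Score each EEG epoch (channels x samples) for P300-likeness;
    higher scores suggest the user attended to that image."""
    return clf.decision_function(epochs.reshape(len(epochs), -1))


def browse_step(displayed_ids, epochs, clf, features, k=8):
    """One berry-picking step: take the displayed image with the strongest
    P300 response and return the k most similar images from the archive."""
    target = displayed_ids[int(np.argmax(p300_scores(epochs, clf)))]
    dists = np.linalg.norm(features - features[target], axis=1)
    ranked = np.argsort(dists)
    return [int(i) for i in ranked if i != target][:k]


def rsvp_triage(image_ids, epochs, clf):
    """RSVP-style triage: rank all presented images by P300 score,
    images of interest first."""
    order = np.argsort(-p300_scores(epochs, clf))
    return [image_ids[i] for i in order]


# Toy usage with random data; a real system would use a classifier
# calibrated on the user's EEG and learned image descriptors.
rng = np.random.default_rng(0)
train_X = rng.normal(size=(200, 8 * 64))            # 8 channels x 64 samples per epoch
train_y = rng.integers(0, 2, 200)                   # target vs. non-target epochs
clf = LinearDiscriminantAnalysis().fit(train_X, train_y)

epochs = rng.normal(size=(12, 8, 64))               # 12 flashed images
features = rng.normal(size=(1000, 128))             # 1000-image archive, 128-d descriptors
print(browse_step(list(range(12)), epochs, clf, features))
print(rsvp_triage(list(range(12)), epochs, clf))
```

In this framing, `browse_step` would be called once per display round, so the user wanders through photo space by repeatedly attending to one image and receiving its nearest neighbours, while `rsvp_triage` corresponds to the single-pass ranking used in RSVP image triage.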