Abstract
Evaluating the effectiveness of Geographic Information Retrieval (GIR) systems is challenging and time consuming. We describe an approach to such evaluations in which we use user-generated content (UGC), in the form of text and associated metadata, to build a large test collection automatically. We show that the UGC test collection is useful for evaluating and exploring critical aspects of a GIR system, for instance by submitting large numbers of queries.