
      Automatic Ranking of Retrieval Systems in Imperfect Environments

      Author(s)
      Nuray, Rabia
      Can, Fazlı
      Date
      2003-07-08
      Source Title
SIGIR '03: Proceedings of the 26th annual international ACM SIGIR conference on Research and development in information retrieval
      Publisher
      ACM
      Pages
      379 - 380
      Language
      English
      Type
      Conference Paper
Item Usage Stats
217 views
235 downloads
      Abstract
The empirical investigation of the effectiveness of information retrieval (IR) systems requires a test collection, a set of query topics, and a set of relevance judgments made by human assessors for each query. Previous experiments show that differences in human relevance assessments do not affect the relative performance of retrieval systems. Based on this observation, we propose and evaluate a new approach that replaces human relevance judgments with an automatic method. The ranking of retrieval systems produced by our method correlates positively and significantly with that of human-based evaluations. In the experiments, we assume a Web-like imperfect environment: the indexing information for all documents is available for ranking, but some documents may not be available for retrieval, for example because of document deletions or network problems. Our method of simulating imperfect environments can be used to assess Web search engines and to estimate the effects of network conditions (e.g., network unreliability) on IR system performance.
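The abstract describes the pipeline only at a high level: the systems' own retrieval results supply pseudo-relevance judgments, each system is scored against those judgments, the resulting ranking is compared with the human-based ranking by rank correlation, and imperfect environments are simulated by making some indexed documents unretrievable. The Python sketch below is one plausible reading of that pipeline, not the paper's actual algorithm; the majority-vote pooling rule, the pool depth, precision at k as the effectiveness measure, and all function names are illustrative assumptions.

def pseudo_judgments(runs, depth=10):
    # Pool: treat a document as pseudo-relevant if it appears in the
    # top `depth` results of more than half of the systems (one simple
    # pooling rule; the paper's exact fusion method may differ).
    votes = {}
    for run in runs.values():
        for doc in run[:depth]:
            votes[doc] = votes.get(doc, 0) + 1
    return {doc for doc, v in votes.items() if v > len(runs) / 2}

def precision_at_k(run, relevant, k=10):
    # Fraction of the top-k retrieved documents that are (pseudo-)relevant.
    return sum(1 for doc in run[:k] if doc in relevant) / k

def kendall_tau(rank_a, rank_b):
    # Kendall's tau between two rankings of the same systems, given as
    # ordered lists of system names: (concordant - discordant) pairs
    # divided by the total number of pairs.
    pos = {s: i for i, s in enumerate(rank_b)}
    concordant = discordant = 0
    for i in range(len(rank_a)):
        for j in range(i + 1, len(rank_a)):
            if pos[rank_a[i]] < pos[rank_a[j]]:
                concordant += 1
            else:
                discordant += 1
    n = len(rank_a)
    return (concordant - discordant) / (n * (n - 1) / 2)

def simulate_imperfect(run, unavailable):
    # Web-like imperfect environment: documents that were indexed but can
    # no longer be retrieved (deleted or unreachable) drop out of the list.
    return [doc for doc in run if doc not in unavailable]

# Toy example: three hypothetical systems, one query.
runs = {
    "sysA": ["d1", "d2", "d3", "d4"],
    "sysB": ["d2", "d1", "d5", "d3"],
    "sysC": ["d9", "d8", "d2", "d1"],
}
rel = pseudo_judgments(runs, depth=4)          # {'d1', 'd2', 'd3'}
auto = sorted(runs, key=lambda s: precision_at_k(runs[s], rel, k=4),
              reverse=True)                    # automatic ranking
human = ["sysB", "sysA", "sysC"]               # hypothetical human-based ranking
print(kendall_tau(auto, human))                # agreement of the two rankings

# Imperfect-environment variant: make d3 unretrievable before scoring.
degraded = {s: simulate_imperfect(r, {"d3"}) for s, r in runs.items()}

Kendall's tau is a standard choice for comparing system rankings in this kind of meta-evaluation; a value near 1 would mean the automatic ranking nearly reproduces the human-based one.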
      Keywords
Automatic performance evaluation
Information retrieval (IR) evaluation
Automation
Computer simulation
Correlation methods
Database systems
Query languages
Search engines
World Wide Web
Information retrieval systems
      Permalink
      http://hdl.handle.net/11693/27506
      Published Version (Please cite this version)
      https://doi.org/10.1145/860435.860510
      Collections
• Department of Computer Engineering