SwePub

Search results for "WFRF:(Braschler Martin)"


  • Results 1-4 of 4
1.
  • Braschler, Martin, et al. (author)
  • A PROMISE for Experimental Evaluation
  • 2010. - 11
  • Conference paper (peer-reviewed), abstract:
    • The Participative Research labOratory for Multimedia and Multilingual Information Systems Evaluation (PROMISE) is a Network of Excellence, starting in conjunction with this first independent CLEF 2010 conference. It is designed to support and develop the evaluation of multilingual and multimedia information access systems, largely through the activities taking place in the Cross-Language Evaluation Forum (CLEF) today, and to take that work forward in important new ways. PROMISE is coordinated by the University of Padua and comprises 10 partners: the Swedish Institute for Computer Science, the University of Amsterdam, Sapienza University of Rome, University of Applied Sciences of Western Switzerland, the Information Retrieval Facility, the Zurich University of Applied Sciences, the Humboldt University of Berlin, the Evaluation and Language Resources Distribution Agency, and the Centre for the Evaluation of Language Communication Technologies. The single most important step forward for multilingual and multimedia information access which PROMISE will work towards is an open evaluation infrastructure that supports automation and collaboration in the evaluation process.
2.
  • Ferro, Nicola, et al. (author)
  • PROMISE Retreat Report Prospects and Opportunities for Information Access Evaluation
  • 2013
  • In: ACM SIGIR Forum. - Association for Computing Machinery (ACM). - 0163-5840 .- 1558-0229 ; 46:2, pp. 60-84
  • Journal article (other academic/artistic), abstract:
    • The PROMISE network of excellence organized a two-day brainstorming workshop on 30th and 31st May 2012 in Padua, Italy, to discuss and envisage future directions and perspectives for the evaluation of information access and retrieval systems in multiple languages and multiple media. This document reports on the outcomes of the event and provides details on the six envisaged research lines: search applications; contextual evaluation; challenges in test collection design and exploitation; component-based evaluation; ongoing evaluation; and signal-aware evaluation. The ultimate goal of the PROMISE retreat is to stimulate and involve the research community along these research lines and to provide funding agencies with effective and scientifically sound ideas for coordinating and supporting information access research.
3.
  • Forner, Pamela, et al. (author)
  • PROMISE Technology Transfer Day: Spreading the Word on Information Access Evaluation at an Industrial Event : Workshop Report
  • 2013
  • In: SIGIR Forum. - 0163-5840 .- 1558-0229 ; 47:1, pp. 53-58
  • Journal article (peer-reviewed), abstract:
    • The Technology Transfer Day was held at CeBIT 2013, from March 5 to March 9, at the Deutsche Messe in Hannover, Germany. PROMISE presented three events at CeBIT: a panel in the CeBIT Global Conference (CGC) - Power Stage, a one-day workshop hosted in the CeBIT Convention Center, and a stand "EU Language & Big Data Projects" in Hall 9. The whole program included 4 panelists, 12 invited talks, and discussions among the speakers and with the public. This report gives an overview of the aims and contents of the events and outlines the major outcomes.
4.
  • Imhof, Melanie, et al. (author)
  • Evaluation for operational IR applications : generalizability and automation
  • 2013
  • In: LivingLab '13: Proceedings of the 2013 Workshop on Living Labs for Information Retrieval Evaluation. - New York : Association for Computing Machinery (ACM). - 9781450324205 ; pp. 11-12
  • Conference paper (peer-reviewed), abstract:
    • Black-box information retrieval (IR) application evaluation allows practitioners to measure the quality of their IR application. Instead of evaluating specific components, e.g., solely the search engine, the complete IR application, including the user's perspective, is evaluated. The evaluation methodology is designed to be applicable to operational IR applications. The black-box evaluation methodology could be packaged into an evaluation and monitoring tool, making it usable for industry stakeholders. The tool should lead practitioners through the evaluation process and maintain the test results for the manual and automatic tests. This paper shows that the methodology is generalizable, even though the diversity of IR applications is high. The challenges in automating tests are the simulation of tasks that require intellectual effort and the handling of different visualizations of the same concept.
