Federal Bid

Last Updated on 24 Feb 2016 at 9 AM
Sources Sought
Location Unknown

TREC/TRECVID/TAC Support Services

Solicitation ID AMDTC-16-0007
Posted Date 28 Jan 2016 at 9 PM
Archive Date 24 Feb 2016 at 5 AM
Set Aside No Set-Aside Used
Contracting Office Department of Commerce, NIST
Agency Department of Commerce

This is a SOURCES SOUGHT NOTICE for market research purposes. THIS IS NOT A REQUEST FOR QUOTATIONS. INFORMATION RECEIVED WILL BE USED FOR MARKET RESEARCH PURPOSES ONLY.

This Notice is for planning purposes only and is not a Request for Proposal, Request for Quotation, or an obligation on the part of the National Institute of Standards and Technology (NIST) for conducting a follow-on acquisition. NIST does not intend to award a contract on the basis of this Notice, or otherwise pay for the information requested. No entitlement or payment of direct or indirect costs or charges by NIST will arise as a result of submission of responses to this Notice and NIST use of such information.

The National Institute of Standards and Technology (NIST) is seeking to identify sources capable of providing support services for the Text Retrieval Conference (TREC), Text Retrieval Conference Video Retrieval Evaluation (TRECVID), and Text Analysis Conference (TAC).

Background Information

The National Institute of Standards and Technology (NIST) Information Technology Laboratory's (ITL) Information Access Division's Retrieval Group strives to promote the use of more effective and efficient techniques for manipulating information not specifically structured for automatic use, especially the browsing, searching, and presentation of that information. Accurate evaluation of the capabilities of current information access systems is a prerequisite to improving the systems' capabilities. Human-created annotations are essential for the evaluation of information systems. Examples of these annotations include: identification of documents that are relevant to a searcher's need, hand-written summaries of documents and sets of documents, and answers to questions. These annotations are critical for comparing the performance of automatic information systems and for understanding differences within the range of performance that people achieve at the same tasks.

The evaluation of search systems is performed within a series of workshops known as the Text Retrieval Conference (TREC). TREC requires the building of reference material for use in evaluation. This reference material includes a set of test questions plus judgments on whether text extracts answer the questions.

The Text Retrieval Conference Video Retrieval Evaluation (TRECVID) extends this paradigm to the evaluation of systems that use multimedia queries to search large collections of digital video.

For the Text Analysis Conference (TAC), researchers are developing automatic summarization systems that will help a person track or follow up on a news story by giving the person summaries that focus on what is new or different as the story develops over time.

The TREC, TRECVID, and TAC conferences are held annually.

NOTE: Please reference the attached PDF document for the complete notice.

Bid Protests Not Available