GEMS 2010
GEometrical Models of Natural Language Semantics
Endorsed by:

July 16, 2010
Venue A, Room II
Uppsala University, Sweden

The GEMS 2010 workshop on semantic spaces will be held in conjunction with the 48th Annual Meeting of the Association for Computational Linguistics (ACL-10), which will take place in Uppsala, Sweden, on July 11-16, 2010.

19 June: The program is now available here
May: Katrin Erk will be the GEMS 2010 invited speaker; her abstract is available here

GEMS 2010 is the second edition of the GEMS workshop series, focusing on distributional methods and word space models for lexical semantics. The first GEMS workshop, held in conjunction with EACL-2009, drew more than 60 attendees and more than 20 paper submissions. This year, we aim to repeat the success of the first edition. Our main goal is once again to stimulate research on semantic spaces and distributional methods for NLP by adopting an interdisciplinary approach, allowing a proper exchange of ideas, results and resources among often independent communities.

In this second edition, GEMS will broaden its focus to practical and industrial applications of distributional models. In recent years, many Web companies such as Microsoft, Google and Yahoo! have embraced semantic processing and effectively integrated into their infrastructure components that compute distributional similarity among entities, queries, web pages and user click patterns. The workshop aims to stimulate interaction between the academic and corporate research sectors, and to discuss how far, and in which ways, distributional techniques are applied in Web search.
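To make the core idea concrete, here is a minimal, purely illustrative sketch of distributional similarity: each word is represented by its co-occurrence counts within a fixed context window, and words are compared with cosine similarity. The toy corpus and the window size are assumptions chosen for illustration, not part of any system described above.

```python
# Toy distributional similarity: co-occurrence vectors + cosine.
# Corpus and window size are illustrative assumptions.
from collections import defaultdict
import math

corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the mouse ate the cheese",
    "the dog ate the bone",
]

window = 2  # context window size (assumption)

# Build sparse co-occurrence count vectors: word -> {context word: count}
vectors = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                vectors[word][tokens[j]] += 1

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in set(u) & set(v))
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Words appearing in similar contexts score higher than unrelated ones.
print(cosine(vectors["cat"], vectors["dog"]))
print(cosine(vectors["cat"], vectors["cheese"]))
```

Real systems replace the raw counts with association weights (e.g. PMI) and scale to web-sized corpora, but the vector-comparison principle is the same.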

The workshop aims to gather contemporary contributions to large-scale problems in meaning representation, acquisition and use, based on distributional and vector space models. The workshop will also explore the impact of such techniques on complex linguistic tasks, such as linguistic knowledge acquisition, semantic role labeling, textual entailment recognition, question answering, document understanding/summarization and ontology learning.