CrowdRec 2013 @ ACM RecSys 2013 - Hong Kong
  Workshop photos have been uploaded.

  The workshop concluded successfully.
  Papers and slides will be available on the programme page.

  Notifications have been sent out to authors.

  The submission deadline has been extended to 26th July, 2013.

  CrowdRec 2013 Website launched.
We are pleased to announce the first workshop on crowdsourcing and human computation for recommender systems. We use "recommender systems" to refer to a broad spectrum of applications involving recommendation, information valuation, filtering, summarization, etc., in contexts ranging from e-commerce to social networking and mobile applications. This workshop aims to provide a scholarly venue for researchers and practitioners to exchange advances in crowdsourcing and human computation technologies and applications, with an emphasis on applications in recommender systems. The potential and advantages of crowdsourcing and human computation have already been explored in a number of areas, such as computer-human interaction and information retrieval; we believe that these advances can also benefit research on recommender systems at large.
Call For Papers
Submissions are invited on theories, experiments, and explorations of how humans (or the crowd) can be involved in the process of information valuation and recommendation. Topics include, but are not limited to, the use of crowds and/or human computation in the following areas of research:

Ground-truth annotation
  • Annotator user interface
  • Games with a purpose (GWAP) or other annotation-as-byproduct designs
  • Effective learning from crowd inputs
  • Mining from social media
Large-scale evaluation of recommender systems
  • Recommender evaluation metrics and studies
  • Personalization support
  • User modeling and profiling
Crowdsourcing and human computation methodologies
  • Security, privacy, trust, and reputation management
  • Quality assurance and cheat detection
  • Economics and incentive structures
  • Programming languages, tools and platforms providing enhanced support
  • Empirical and case studies of using crowdsourcing for recommendation
  • Policy and cultural issues related to crowd recommendation
The CFP can be downloaded here.
Each accepted paper requires at least one of its authors to register for the workshop. In addition, one of the authors (or a qualified substitute) must present the paper at the workshop.