Community evaluation of crowd-sourced ideas

CHF 57.55
In stock
SKU
HNM544J6S5U
Stock: 1 available
Delivery between Tue, 05.05.2026 and Wed, 06.05.2026

Details

In 2008, Google initiated an ideation challenge called 'Project 10 to the 100'. It asked quite openly 'What would help most?' and received more than 150,000 ideas from people all over the globe. As Google's 'Project 10 to the 100' and many other real-life examples show (e.g. Cisco's 'I-Prize' or Nokia's 'Tune Remake'), the number of submissions to crowdsourcing contests can be stunning. However, the majority of submitted ideas are usually of very low quality. According to Sturgeon's law (1958), 90% of everything is crap. Firms are often overloaded by the 'noise' generated on crowdsourcing platforms: they face the problem of not being able to filter and select the best ideas, or of only being able to do so with substantial effort. Recently, scholars have proposed integrating the crowdsourcing community itself into the evaluation process as a promising method of filtering out high-quality ideas. In this explorative study, Georg Terlecki-Zaniewicz analyzed eleven crowdsourcing platforms and derived a framework that makes the evaluation of crowd-sourced ideas through community evaluation both more efficient and more accurate.
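The community-evaluation idea described above can be illustrated with a toy sketch. This is not the framework from the study — the function name, the rating scale, and the keep-fraction threshold are all illustrative assumptions — it only shows the basic mechanism: community members rate submissions, and the crowd's aggregated ratings pre-filter the idea pool before any expert review.

```python
# Toy sketch of community-based idea filtering (illustrative only, not the
# study's framework): rank ideas by mean community rating and keep the top
# fraction, discarding the low-quality bulk that Sturgeon's law predicts.

def rank_ideas(ratings_by_idea, keep_fraction=0.1):
    """Return idea names sorted by mean community rating, top share only."""
    # Mean rating per idea; ideas without ratings are skipped.
    means = {idea: sum(r) / len(r) for idea, r in ratings_by_idea.items() if r}
    ranked = sorted(means, key=means.get, reverse=True)
    # Always keep at least one idea, even for small pools.
    n_keep = max(1, round(len(ranked) * keep_fraction))
    return ranked[:n_keep]

# Hypothetical contest with three submissions rated on a 1-5 scale.
ratings = {
    "solar kiosk": [5, 4, 5],
    "idea spam":   [1, 2, 1],
    "bus tracker": [4, 4, 3],
}
print(rank_ideas(ratings, keep_fraction=0.34))  # → ['solar kiosk']
```

In practice, platforms refine this simple mean with measures the study concerns itself with — prediction accuracy of the crowd's votes — but the pre-filtering step shown here is what relieves firms of sifting through every submission themselves.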

About the author

Georg Terlecki-Zaniewicz studied at the Vienna University of Economics and Business. His main research focused on crowdsourcing and other open-innovation topics. Since 2012, Georg Terlecki-Zaniewicz has worked as a consultant and innovation manager for several medium-sized and large corporations.



Further information

  • General information
    • Language: English
    • Number of pages: 92
    • Publisher: AV Akademikerverlag
    • Weight: 155 g
    • Subtitle: An explorative study on how to improve the prediction accuracy of crowdsourcing communities
    • Author: Georg Terlecki-Zaniewicz
    • Title: Community evaluation of crowd-sourced ideas
    • Publication date: 15.05.2017
    • ISBN: 3330509732
    • Format: Paperback
    • EAN: 9783330509733
    • Year: 2017
    • Dimensions: H 220 mm × W 150 mm × D 6 mm
    • GTIN: 09783330509733
