
Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments: Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 – 27, 2015, Revised Contributions

Saved in:

Persons and corporate bodies: Hoßfeld, Tobias (editor), Archambault, Daniel (editor), Purchase, Helen (editor)
Title: Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments: Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 – 27, 2015, Revised Contributions / edited by Daniel Archambault, Helen Purchase, Tobias Hoßfeld
Format: E-Book
Language: English
Published: Cham: Springer, 2017
Series: SpringerLink
Springer eBook Collection
Lecture notes in computer science ; 10264
Subjects:
Source: Verbunddaten SWB
LEADER 04640cam a22008292 4500
001 0-1656222876
003 DE-627
005 20240122104310.0
007 cr uuu---uuuuu
008 171002s2017 gw |||||o 00| ||eng c
020 |a 9783319664354  |9 978-3-319-66435-4 
024 7 |a 10.1007/978-3-319-66435-4  |2 doi 
035 |a (DE-627)1656222876 
035 |a (DE-576)494011327 
035 |a (DE-599)BSZ494011327 
035 |a (OCoLC)1009857338 
035 |a (DE-He213)978-3-319-66435-4 
035 |a (EBP)04057458X 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
044 |c XA-DE 
050 0 |a QA76.9.U83  |a QA76.9.H85 
050 0 |a QA76.9.U83 
050 0 |a QA76.9.H85 
072 7 |a UYZG  |2 bicssc 
072 7 |a COM070000  |2 bisacsh 
072 7 |a UYZ  |2 bicssc 
245 1 0 |a Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments  |b Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 – 27, 2015, Revised Contributions  |c edited by Daniel Archambault, Helen Purchase, Tobias Hoßfeld 
264 1 |a Cham  |b Springer  |c 2017 
300 |a Online-Ressource (VII, 191 p. 15 illus, online resource) 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
490 1 |a Lecture Notes in Computer Science  |v 10264 
490 0 |a SpringerLink  |a Bücher 
490 0 |a Springer eBook Collection  |a Computer Science 
520 |a As the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human-centered experiments. The focus of this Dagstuhl seminar, held in Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments to test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies, comparisons between crowdsourcing and lab experiments, the use of crowdsourcing for visualisation, psychology, QoE and HCI empirical studies, and finally the nature of crowdworkers and their work, their motivation and demographic background, as well as the relationships among people forming the crowdsourcing community. 
520 |a Crowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments 
650 0 |a Computer science 
650 0 |a Computer communication systems 
650 0 |a Economic theory 
650 0 |a Computer Science 
650 0 |a User interfaces (Computer systems) 
650 0 |a Human-computer interaction. 
650 0 |a Computer networks. 
650 0 |a Application software. 
650 0 |a Econometrics. 
700 1 |a Hoßfeld, Tobias  |e Hrsg.  |4 edt 
700 1 |a Archambault, Daniel  |e Hrsg.  |4 edt 
700 1 |a Purchase, Helen  |e Hrsg.  |4 edt 
776 1 |z 9783319664347 
776 0 8 |i Druckausg.  |z 978-3-319-66434-7 
776 0 8 |i Printed edition  |z 9783319664347 
830 0 |a Lecture notes in computer science  |v 10264  |9 10264  |w (DE-627)316228877  |w (DE-576)093890923  |w (DE-600)2018930-8  |x 1611-3349  |7 ns 
856 4 0 |u https://doi.org/10.1007/978-3-319-66435-4  |m B:SPRINGER  |x Verlag  |z lizenzpflichtig  |3 Volltext 
856 4 2 |u https://swbplus.bsz-bw.de/bsz494011327cov.jpg  |m V:DE-576  |m X:springer  |q image/jpeg  |v 20171120121724  |3 Cover 
889 |w (DE-627)898997011 
912 |a ZDB-2-LNC  |b 2017 
912 |a ZDB-2-SCS  |b 2017 
912 |a ZDB-2-SEB 
912 |a ZDB-2-SXCS  |b 2017 
912 |a ZDB-2-SEB  |b 2017 
951 |a BO 
856 4 0 |u http://dx.doi.org/10.1007/978-3-319-66435-4  |9 DE-14 
852 |a DE-14  |x epn:3386742947  |z 2017-10-02T11:24:05Z 
856 4 0 |u http://dx.doi.org/10.1007/978-3-319-66435-4  |9 DE-Ch1 
852 |a DE-Ch1  |x epn:3386743366  |z 2017-10-02T11:24:05Z 
975 |o Springer E-Book 
975 |k Elektronischer Volltext - Campuslizenz 
856 4 0 |u http://dx.doi.org/10.1007/978-3-319-66435-4  |9 DE-Zwi2 
852 |a DE-Zwi2  |x epn:3386743552  |z 2017-10-02T11:24:05Z 
980 |a 1656222876  |b 0  |k 1656222876  |o 494011327 