We have been conducting research into crowdsourcing for a few years at XRCE. Initially, our research was guided by the idea that crowdsourcing could serve as a mechanism for fulfilling work that is currently outsourced. We began by focusing on the outsourced business process of the digitisation of healthcare insurance claims.
These documents, whether handwritten or printed, are digitised and structured in outsourced operations before being delivered to the client for processing. To inform the design of technologies and processes to support this, we studied two outsourced operations: one in-office in India and one with @home workers in the US. Our findings fed into the design of a prototype system developed at Xerox Research Centre India.
Our key findings were that even this so-called low-skill work involved considerable skill: learning and expertise in understanding the relevant rules and policies, and the ability to carry out such work in a timely manner to the required levels of quality. It also involves effective orchestration of the workforce: selection, training, and ongoing management and communication. This led us to understand that in a crowdsourcing marketplace for Business Process Outsourcing (BPO) work, it was highly likely that many of the same rules and criteria would apply, albeit with different specificities. Since then we have been pursuing a theme of Relationship-Based Crowdsourcing (RBC) in our research.
During this initial research we realised that there was rather scant data of any qualitative depth on who the actual crowdworkers were, why they worked in these marketplaces, how they understood them, how they found and did work, and what their problems and concerns were. They were largely invisible and often talked about in rather machine-like terms. To gain a deeper understanding of the requirements for fleshing out RBC, we chose to study these workers, focusing on those who work on Amazon Mechanical Turk (mTurk), the best-known crowdsourcing micro-task market. We have done two major studies in this area: the first was an ethnomethodological study of US Turkers through an analysis of posts and discussions on a key forum, Turker Nation; the second an ethnographic study (interviews and observations) of Indian Turkers, carried out by our PhD student, Neha Gupta.
The study of Indian Turkers has yielded rich data that allows some interesting comparisons with the US Turkers, as well as an understanding of the implications of a transnational open market for workers in different locations. We have not yet published this material, so cannot share much at the moment. However, one thing is clear in the data: while the two constituencies share quite a few similarities, a key differentiator is that Indian Turkers can earn comparatively good wages. That in turn changes a number of ways in which they both do and view the work.
All of this data has fed into and reinforced our ideas about the importance of RBC. This year we are focusing on developing technologies to support Turkers in the work they do to make Turking work, providing them with tools and information to make better choices in the market, more quickly. This is only one side of the relationship, and we are also looking to develop other tools to enable better information and communication on both sides: helping workers find employers they want to work for and trust, and vice versa.
Crowdsourcing is a fascinating topic because it focuses on emerging, digitally instantiated and supported global, or at least transnational, markets which are currently only partially regulated. More and more work is being carried out by these means: some of it new and very much a product of the Internet, some more recognisable as traditional homeworking moved into the Internet age. These markets are being born and morphing at a high rate, and so it is highly interesting to research emerging rather than stable phenomena and to try to influence how they may develop in a sustainable, equitable and profitable fashion.