Amazon Mechanical Turk for LabelMe

Want to outsource your labeling task to the internet? Amazon Mechanical Turk allows access to many internet users who are ready to perform tasks for a fixed price.

The idea is simple: you provide a task and a selling price. Internet workers perform the task and are subsequently paid. In Mechanical Turk terminology, tasks are called "HITs", people requesting work are called "Requesters", and people who do the work are called "Workers".

This page describes how to set up LabelMe annotation tasks on Mechanical Turk. The process is simple: we provide scripts for creating and sending LabelMe annotation tasks to Mechanical Turk. All you have to do is follow the instructions below and pay workers on Mechanical Turk to label images. We collect the annotations, which are immediately available for download. In this way, everybody wins: Mechanical Turk workers get paid, you get your images annotated, and the computer vision community gets access to more hand-labeled data.

Instructions for setting up LabelMe on Mechanical Turk

Setting up LabelMe on Mechanical Turk is easy; simply follow the steps below.

1. Upload images onto LabelMe

2. Set up an Amazon Mechanical Turk account

You will need to set up an account as a Requester on Mechanical Turk. Instructions for setting up an account are here. Once you have created an account, sign in and make sure you can access it, along with the sandbox (used for debugging).

3. Download and install Amazon Mechanical Turk Command Line Tools

You will need to install the Amazon Mechanical Turk Command Line Tools. The tools provide the backbone for communicating with the Mechanical Turk servers. To start, you first need to request your access key and secret key. These are different from your username and password. To obtain them, create an Amazon Web Services account. Once the account is created, go to "Your Account->Access Identifiers", located at the top of the page. There, you will find your access key and secret key.

Next, download the Amazon Mechanical Turk Command Line Tools (note that so far, we have only tested the Linux/Mac version). Unzip the file and follow the instructions inside the directory to install the Command Line Tools.

Once installed, open and modify ./aws-mturk-clt-1.3.0/bin/ to include your access key and secret key. Also, make sure you set the following environment variables (e.g. using the command "export VAR=/path/to/file" in bash):

a) $MTURK_CMD_HOME - this should point to the location of your Amazon Mechanical Turk Command Line Tools (root directory).

b) $JAVA_HOME - this should point to the location of your Java installation.
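For instance, the two variables can be set in bash as sketched below; the paths shown are hypothetical, so substitute the actual locations on your machine:

```shell
# Hypothetical install locations -- replace with the real paths on your machine.
export MTURK_CMD_HOME="$HOME/aws-mturk-clt-1.3.0"
export JAVA_HOME="/usr/lib/jvm/default-java"

# Quick sanity check that both variables are now set.
echo "MTURK_CMD_HOME=$MTURK_CMD_HOME"
echo "JAVA_HOME=$JAVA_HOME"
```

To make the settings persist across sessions, add the two export lines to your shell startup file (e.g. ~/.bashrc).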

4. Download the LabelMe-Mechanical-Turk toolbox

We provide a set of scripts that interact with Mechanical Turk and control how the task is performed (e.g. how much the workers earn, the list of jobs, etc.). There are two ways to download the scripts for sending LabelMe jobs to Mechanical Turk (currently, the scripts are available for Linux/Mac only):

A. Github repository

We maintain the latest version of the code on GitHub. To pull the latest version, make sure that "git" is installed on your machine and then run "git clone" on the command line. You can refresh your copy to the latest version by running "git pull" from inside the project directory.
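As a concrete sketch of that workflow, the session below clones a repository and later refreshes it with "git pull". A throwaway local repository stands in for the project's GitHub URL, which you would use in practice:

```shell
set -e
# Stand-in "remote": a throwaway local repository. In real use, pass the
# project's GitHub URL to git clone instead.
tmp=$(mktemp -d)
git init -q "$tmp/upstream"
git -C "$tmp/upstream" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"

# Pull down a copy of the code.
git clone -q "$tmp/upstream" "$tmp/toolbox"

# Later, refresh the copy to the latest version from inside the project directory.
git -C "$tmp/toolbox" pull -q
echo "clone and pull succeeded"
```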

If you have an idea for a new feature and want to implement it, then let us know! With GitHub, you can fork the code and send us a pull request. If we like your feature and implementation, we will incorporate it into the main code.

B. Zip file

The zip file is a snapshot of the latest source code on github.

5. Submit jobs to Mechanical Turk

To submit jobs to Mechanical Turk, follow the remaining instructions in demo.m, located inside the LabelMe-Mechanical-Turk toolbox.

Sample results and cost considerations

The quality of the annotations provided by Mechanical Turk workers is in general quite good. The following are example annotations provided by the workers:

The following are statistics for the tasks that we submitted to Mechanical Turk.

Task | Price per image | Task description                               | # images | # annotations | Time elapsed | # workers
1    | $0.01           | Label at least one object in the image         | 237      | 678           | 13.5         | 37
2    | $0.01           | Label at least five objects in the image       | 271      | 1492          | 23.13        | 43
3    | $0.01           | Label as many objects as you wish in the image | 271      | 627           | 9.08         | 28

We also received feedback from the workers; all of it is reproduced below:

Positive                   | Negative                          | Other
fine                       | this very heavy work for this hit | one of them is all messed up
good                       | $0.01 for five objects now?       | Sorry if anything is miss spelt.
Very interesting idea      |                                   |
fun trying this            |                                   |
No feedback. This was fun! |                                   |

In general, workers seem to enjoy performing the task. Sometimes they also provided useful feedback on the task (e.g. workers could see polygons that other labelers had provided; one commented on the quality of a polygon drawn by another worker).

The following paper provides additional information for labeling tasks on Mechanical Turk:

A. Sorokin and D. Forsyth. Utility data annotation with Amazon Mechanical Turk. First IEEE Workshop on Internet Vision at CVPR, 2008.

Let us know!

We are very excited about the annotation possibilities using Mechanical Turk with LabelMe. Please let us know if you are thinking of using this. We are curious about how you set the cost of the HIT, as well as the quality of the annotations. If you have any feedback on any part of the system (instructions, annotation tool, etc.), please let us know!

Advanced features

This section describes the structure of the labelme.input file, which specifies the set of images to annotate (the MATLAB function generateLabelMeInputFile.m can be used to generate this file). The file starts with the keyword urls and then lists, one per line, the URL of the annotation tool for each image to annotate.
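A minimal example with two images might look like this (the angle-bracket lines are placeholders for the actual annotation-tool URLs):

```
urls
<annotation tool URL for image 1>
<annotation tool URL for image 2>
```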


Make sure that each image is listed only once. This is important because LabelMe currently does not handle concurrency: multiple people labeling the same image will overwrite each other's work.

We have added a few extra variables to the LabelMe annotation tool URL to customize the annotation process. Simply append &VAR=VAL to the URL as needed. The following is a list of variables:

Setting Meaning
&mt_sandbox=true Use Mechanical Turk sandbox mode. This mode is used for debugging on Mechanical Turk. You may want to start with this variable set to make sure everything works.
&N=5 The worker is required to label at least 5 polygons. Use N=inf to allow the worker to label as many as they want.
&mt_intro= You may customize the instructions that the worker sees. By default, a standard set of instructions is shown to the workers.
&mt_instructions=Place your instructions here You may customize the one-line instructions that the worker sees at the top of the labeling task. By default, the instructions are: Please label as many objects as you want in this image.
&actions=n This controls what actions the user is allowed to do. The following are possible actions:

n - create and edit new polygons
r - rename existing objects
m - modify control points on existing objects
d - delete existing objects
a - allow all actions
v - view polygons only (do not allow any editing)

To set the desired actions, use any combination of the letters above. For example, to allow the rename, modify control points, and delete actions, set &actions=rmd. By default, &actions=n.
&viewobj=e This will control which objects the user sees. Use one of the following possible options:

e - view new and previously labeled objects
n - view new objects only
d - view new and deleted objects
a - view all objects (new, existing, deleted)

By default, &viewobj=e. Note that deleted objects are shown in gray, and their names in the object list are italicized.
&objlist=visible This controls whether the object list on the right side is visible or not. Use &objlist=hidden to make it hidden.
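Combining several of the settings above, a customized URL might look like the following sketch, which enables sandbox mode, requires at least five polygons, and hides the object list (the base URL is a placeholder for an entry in your labelme.input file):

```
<annotation tool URL>&mt_sandbox=true&N=5&objlist=hidden
```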