Abstract. A method is presented to help users look up the meaning of an unknown American Sign Language (ASL) sign. The user submits a video of the unknown sign as a query, and the system retrieves the most similar signs from a database of sign videos. The user then reviews the retrieved videos to identify the one showing the sign of interest. Hands are detected semi-automatically: the system performs hand detection and tracking, and the user can verify and correct the detected hand locations. Features are extracted from hand motion and hand appearance. Similarity between signs is measured by combining dynamic time warping (DTW) scores, which are based on hand motion, with a simple similarity measure based on hand appearance. In user-independent experiments on a system vocabulary of 1,113 signs, the correct sign was included in the top 10 matches for 78% of the test queries.
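
To make the retrieval idea concrete, the following is a minimal sketch, not the paper's implementation, of how a DTW distance over hand-centroid trajectories might be combined with a simple appearance distance to rank database signs. The feature layout, the weight `alpha`, and all function names are illustrative assumptions.

```python
# Illustrative sketch only: rank database signs by a weighted combination of
# a DTW distance over hand-motion trajectories and a hypothetical
# hand-appearance distance. Field names and the weight are assumptions.
import numpy as np

def dtw_distance(query: np.ndarray, candidate: np.ndarray) -> float:
    """DTW distance between two trajectories of shape (T, 2), one (x, y) per frame."""
    n, m = len(query), len(candidate)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(query[i - 1] - candidate[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],       # insertion
                                 cost[i, j - 1],       # deletion
                                 cost[i - 1, j - 1])   # match
    return float(cost[n, m])

def combined_score(query_traj, query_app, cand_traj, cand_app, alpha=0.8):
    """Weighted sum of a motion (DTW) distance and an appearance distance.
    query_app / cand_app are assumed to be fixed-length appearance feature
    vectors; alpha is an arbitrary illustrative weight."""
    motion = dtw_distance(np.asarray(query_traj), np.asarray(cand_traj))
    appearance = float(np.linalg.norm(np.asarray(query_app) - np.asarray(cand_app)))
    return alpha * motion + (1.0 - alpha) * appearance

def retrieve_top_k(query, database, k=10):
    """Return the k database entries with the lowest combined score."""
    scored = [(combined_score(query["traj"], query["app"],
                              entry["traj"], entry["app"]), entry["gloss"])
              for entry in database]
    scored.sort(key=lambda pair: pair[0])
    return scored[:k]
```

In practice the motion and appearance distances would need to be normalized to comparable scales before weighting; the sketch above omits that step for brevity.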