The terms 'accuracy' and 'precision' are commonly used in the scientific literature to describe the quality of, for example, measurements or methods. Traditional definitions associate 'accuracy' with systematic errors and 'precision' with random errors; however, several different definitions exist, and the meaning of the terms varies across fields. In a risk analysis context, which this article studies in detail, new methods and risk results are commonly discussed in terms of 'accuracy' or 'precision', yet a proper clarification of what these terms actually mean is often missing. Many authors mix the terms or fail to differentiate between them, some treating them as synonyms for which a contrast is unnecessary. As the concept of risk is itself subject to several interpretations, this can cause confusion and lead to unintended conclusions. A simple example from the oil and gas industry is used to illustrate the situation. The article clarifies the current use of the terms, gives specific suggestions for how to interpret them in risk analysis, and concludes with recommendations on how to define the terms in a risk analysis context.
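To make the traditional distinction concrete, one may consider a standard measurement-error model (a minimal sketch for illustration, not a model taken from the article itself): repeated measurements $X_i$ of a true value $\mu$ are written as

\[
X_i = \mu + b + \varepsilon_i, \qquad \varepsilon_i \sim \mathcal{N}(0, \sigma^2),
\]

where $b$ is a fixed systematic error (bias) and the $\varepsilon_i$ are independent random errors. Under the traditional definitions referred to above, accuracy then reflects the smallness of the systematic error $|b| = |\mathrm{E}[X_i] - \mu|$, while precision reflects the smallness of the random-error spread $\sigma$; a method can thus be precise but inaccurate (small $\sigma$, large $|b|$), or accurate but imprecise (small $|b|$, large $\sigma$).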