Arguments are an essential feature of academic texts such as the final course project reports written by engineering students. Instructors' revisions reveal students' mistakes, such as a lack of argumentation; however, the depth of these revisions is often limited by the instructor's available time. In this paper, we present a system for argument assessment, embedded in an online Moodle course through the Learning Tools Interoperability (LTI) standard, that helps students improve the argumentation in the problem statement, justification, and conclusion sections of their final project reports. The system identifies argumentative paragraphs, rates their argumentation level, and provides recommendations to improve the writing. The analysis employs machine learning techniques with lexical features, trained on an annotated collection of student writings. We performed a pilot test comparing a control group and an experimental group, both consisting of undergraduate students in computer engineering programs at two different universities. We observed that using the argument assessment system increased the number of arguments produced by the experimental group. We conclude by discussing further improvements to the system.