Disease symptoms often contain features that patients do not routinely recognize but that medical professionals can identify through indirect inspection or diagnosis. Telemedicine requires sufficient information to aid doctors' diagnoses, and this has primarily been provided by clinical decision support systems (CDSSs) that utilize visual information. However, additional medical diagnostic tools are needed to improve CDSSs. Moreover, since the COVID-19 pandemic, telemedicine has garnered increasing attention, and basic diagnostic tools (e.g., classical examination) have become the most important components of a comprehensive framework. This study proposes a conceptual system, iApp, that can collect and analyze quantified data based on automatically performed inspection, auscultation, percussion, and palpation. The proposed iApp system consists of an auscultation sensor, a camera for inspection, and custom-built hardware for automatic percussion and palpation. Experiments were designed to categorize the eight abdominal divisions of healthy subjects based on the system's multi-modal data. A deep multi-modal learning model, which yields a single prediction from multi-modal inputs, was designed to learn the distinctive features of the eight abdominal divisions. The model's performance was evaluated in terms of classification accuracy, sensitivity, positive predictive value, and F-measure, using epoch-wise and subject-wise methods. The results demonstrate that the iApp system can successfully categorize abdominal divisions, with a test accuracy of 89.46%. Through the automatic examination performed by the iApp system, this proof-of-concept study demonstrates sophisticated classification by extracting distinct features of the different abdominal divisions where different organs are located.
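The evaluation metrics named above (classification accuracy, sensitivity, positive predictive value, and F-measure) can all be derived from a multi-class confusion matrix. The following is a minimal illustrative sketch of that computation; the function name, labels, and use of NumPy are our own assumptions, not details from the paper.

```python
import numpy as np

def per_class_metrics(y_true, y_pred, n_classes):
    """Compute accuracy plus per-class sensitivity (recall), positive
    predictive value (precision), and F-measure from label arrays.
    Illustrative sketch only; not the authors' evaluation code."""
    # Build the confusion matrix: rows = true class, columns = predicted class.
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1

    tp = np.diag(cm).astype(float)
    # Sensitivity = TP / (TP + FN); PPV = TP / (TP + FP).
    sensitivity = tp / np.maximum(cm.sum(axis=1), 1)
    ppv = tp / np.maximum(cm.sum(axis=0), 1)
    # F-measure is the harmonic mean of sensitivity and PPV.
    f_measure = np.where(
        sensitivity + ppv > 0,
        2 * sensitivity * ppv / (sensitivity + ppv),
        0.0,
    )
    accuracy = tp.sum() / cm.sum()
    return accuracy, sensitivity, ppv, f_measure
```

In a study such as this one, `n_classes` would be 8 (one per abdominal division), and the metrics would be aggregated either per epoch (epoch-wise) or per held-out subject (subject-wise).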
In future work, we intend to capture the distinguishing features between normal and abnormal tissues while securing patient data, and to demonstrate the feasibility of a fully telediagnostic system that can support abnormality diagnosis.