Randomization-based neural networks have gained wide acceptance in the scientific community owing to their algorithmic simplicity and generalization capabilities. Random vector functional link (RVFL) networks and their variants are a class of randomization-based neural networks that have shown promising results in classification, regression, and clustering problems. In real-world applications, the constant growth of large-scale datasets requires learning algorithms that can incorporate new samples into previously trained models. Various online sequential algorithms, commonly comprising an initial learning phase followed by a sequential learning phase, have been proposed to address this need. This paper presents a training algorithm based on multiple online sequential random vector functional link (OS-RVFL) networks for large-scale databases using a shared-memory architecture. The training dataset is distributed among p OS-RVFL networks, which are trained in parallel using p threads. Subsequently, the test samples are classified by each trained OS-RVFL network, and a frequency criterion is applied to the individual results to determine the final classification. Additionally, an equation is derived to predict the total training time of the proposed algorithm from the learning time of the initial phase and a time scaling factor relative to the sequential learning phase. The results demonstrate a drastic reduction in training time owing to the data distribution, as well as an improvement in accuracy owing to the adoption of the frequency criterion.
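
The sketch below is a minimal illustration, not the authors' implementation, of the overall scheme described above: the training set is split into p partitions, one simplified OS-RVFL classifier is trained per partition by a thread in a shared-memory pool, and the test predictions are combined by majority vote (the frequency criterion). The class name `OSRVFL`, the tanh activation, the regularization term, the chunk sizes, and the use of `ThreadPoolExecutor` are all assumptions made for this example.

```python
# Illustrative sketch only: simplified OS-RVFL learners trained in parallel on
# disjoint partitions, with test predictions combined by a frequency (majority
# vote) criterion. Names and hyperparameters are assumptions, not the paper's.
import numpy as np
from concurrent.futures import ThreadPoolExecutor


class OSRVFL:
    """Minimal online sequential RVFL classifier with one-hot targets."""

    def __init__(self, n_features, n_classes, n_hidden=100, lam=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.uniform(-1.0, 1.0, (n_features, n_hidden))  # random input weights
        self.b = rng.uniform(-1.0, 1.0, n_hidden)                # random biases
        self.n_classes = n_classes
        self.lam = lam
        self.beta = None  # output weights
        self.P = None     # inverse matrix reused by the sequential updates

    def _hidden(self, X):
        # Direct links (X itself) concatenated with the random nonlinear expansion.
        return np.hstack([X, np.tanh(X @ self.W + self.b)])

    def _one_hot(self, y):
        T = np.zeros((len(y), self.n_classes))
        T[np.arange(len(y)), y] = 1.0
        return T

    def initial_fit(self, X0, y0):
        # Initial learning phase: regularized least squares on the first block.
        H, T = self._hidden(X0), self._one_hot(y0)
        self.P = np.linalg.inv(H.T @ H + self.lam * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ T

    def partial_fit(self, Xk, yk):
        # Sequential learning phase: recursive least-squares update per new chunk.
        H, T = self._hidden(Xk), self._one_hot(yk)
        S = np.linalg.inv(np.eye(len(Xk)) + H @ self.P @ H.T)
        self.P -= self.P @ H.T @ S @ H @ self.P
        self.beta += self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)


def train_partition(part, n_classes, init_size=200, chunk=100, seed=0):
    # Train one OS-RVFL network on its own partition of the training set.
    X, y = part
    model = OSRVFL(X.shape[1], n_classes, seed=seed)
    model.initial_fit(X[:init_size], y[:init_size])
    for i in range(init_size, len(X), chunk):
        model.partial_fit(X[i:i + chunk], y[i:i + chunk])
    return model


def parallel_osrvfl(X_train, y_train, X_test, p=4, n_classes=None):
    n_classes = n_classes or int(y_train.max()) + 1
    parts = list(zip(np.array_split(X_train, p), np.array_split(y_train, p)))
    # p partitions trained concurrently by p threads (shared-memory parallelism).
    with ThreadPoolExecutor(max_workers=p) as pool:
        models = list(pool.map(
            lambda a: train_partition(a[1], n_classes, seed=a[0]), enumerate(parts)))
    # Frequency criterion: each trained network votes; the most frequent label wins.
    votes = np.stack([m.predict(X_test) for m in models])  # shape (p, n_test)
    return np.apply_along_axis(
        lambda v: np.bincount(v, minlength=n_classes).argmax(), 0, votes)
```

Because each partition is trained independently, per-network training time shrinks roughly with the number of partitions, while the vote over p differently initialized networks is what the frequency criterion exploits to recover (and in the reported results improve) accuracy.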