Hearing loss is an increasingly prevalent condition resulting from damage to the inner ear that reduces speech intelligibility. The societal need for assistive hearing devices has grown substantially over the past two decades; however, human performance with such devices has seen only modest gains relative to advancements in digital signal processing (DSP) technology. A major challenge with clinical hearing technologies is their limited ability to run complex signal processing algorithms that require high computational power. The CCi-MOBILE platform, developed at UT-Dallas, provides the research community with an open-source, flexible, easy-to-use, software-mediated research interface with the computing power to conduct a wide variety of listening experiments. The platform supports cochlear implants (CIs) and hearing aids (HAs) independently, as well as bimodal hearing (i.e., a CI in one ear and an HA in the contralateral ear). It is ideally suited to hearing research in both quiet and naturalistic noisy conditions, as well as studies of sound localization and lateralization. The platform uses commercially available smartphone/tablet devices as portable sound processors and can provide bilateral electric and acoustic stimulation. The hardware components, firmware, and software suite are presented to demonstrate safety to the speech scientist and CI/HA user, highlight user-specific customization, and outline various research applications of the platform.