Electrocardiography (ECG) and photoplethysmography (PPG) enable non-invasive cardiovascular monitoring; however, the relationship between their waveforms, which exhibit strong cycle-to-cycle correlation, remains underexplored. This study estimates ECG signals from PPG data using an array of Deep Neural Networks (DNNs) across several transformed feature domains, positioning PPG measurement as a more convenient, lower-effort alternative to ECG acquisition. A novel, subject-specific deep learning model, termed ConvBiLSTM, is introduced; it combines Convolutional Neural Network (CNN) and bidirectional Long Short-Term Memory (BiLSTM) architectures to provide an automatic method for ECG signal reconstruction. To make the model robust to waveform deformation, spatial characteristics are first extracted with the CNN, and temporal characteristics are then extracted from the CNN output by the BiLSTM. The BiLSTM mitigates the vanishing- and exploding-gradient problems of traditional RNNs and LSTMs without compromising accuracy. Four feature domains, namely the Time Domain (TD), Discrete Cosine Transform (DCT), Discrete Wavelet Transform (DWT), and Wavelet Scattering Transform (WST), are evaluated for their efficacy in reconstructing ECG signals from PPG data with the ConvBiLSTM model. Comparisons of ConvBiLSTM against the individual DNNs demonstrate the superiority of the combined architecture. Simulation results show that the proposed method achieves lower root mean square error (RMSE) in ECG signal reconstruction across all feature domains; RMSE was chosen as the key evaluation criterion because of its widespread use in ECG monitoring. The combination of WST features for the PPG signal and DWT features for the ECG signal yielded the lowest RMSE of 0.0654, indicating the potential of this approach for effective ECG reconstruction from PPG data.
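
For concreteness, the following is a minimal sketch of a CNN-followed-by-BiLSTM regressor of the kind described above, written in PyTorch. All hyperparameters (channel counts, kernel widths, hidden size, segment length) and the helper names are illustrative assumptions, not the configuration used in this work; the sketch also operates on raw time-domain samples, whereas the proposed pipeline would feed transform-domain coefficients (e.g., WST of the PPG cycle and DWT of the ECG cycle) to the network.

```python
# Illustrative ConvBiLSTM-style PPG-to-ECG regressor (assumed hyperparameters).
import torch
import torch.nn as nn

class ConvBiLSTM(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        # 1-D convolutions extract local (spatial) features from the PPG segment
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # Bidirectional LSTM models temporal dependencies in the CNN feature maps
        self.bilstm = nn.LSTM(input_size=64, hidden_size=hidden_size,
                              batch_first=True, bidirectional=True)
        # Per-time-step linear head maps BiLSTM features to one ECG sample
        self.head = nn.Linear(2 * hidden_size, 1)

    def forward(self, ppg: torch.Tensor) -> torch.Tensor:
        # ppg: (batch, samples) -> (batch, 1, samples) for Conv1d
        feats = self.cnn(ppg.unsqueeze(1))      # (batch, 64, samples)
        feats = feats.permute(0, 2, 1)          # (batch, samples, 64)
        out, _ = self.bilstm(feats)             # (batch, samples, 2*hidden)
        return self.head(out).squeeze(-1)       # (batch, samples)

if __name__ == "__main__":
    model = ConvBiLSTM()
    ppg_segment = torch.randn(8, 256)   # synthetic PPG segments, 256 samples each
    ecg_target = torch.randn(8, 256)    # matching synthetic ECG targets
    ecg_hat = model(ppg_segment)
    # RMSE, the evaluation criterion used in this study
    rmse = torch.sqrt(torch.mean((ecg_hat - ecg_target) ** 2))
    print(f"RMSE on random data: {rmse.item():.4f}")
```

In this sketch the BiLSTM processes the convolutional feature sequence in both directions, which is what allows the temporal model to use context on either side of each sample when reconstructing the ECG waveform.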