<p>Partially Observable Markov Decision Processes (POMDPs) are a modeling framework that allows decision makers to act in uncertain environments. The framework is particularly well suited to sequential decision-making problems in which uncertainty is revealed in stages. Constrained POMDPs (CPOMDPs) extend this framework by additionally accounting for restrictions in the environment. This PhD thesis focuses on POMDPs, CPOMDPs, and their extensions; it proposes novel solution methods and studies applications of POMDPs in disease screening. First, we focus on developing efficient versions of well-known POMDP algorithms. POMDPs are notoriously difficult to solve optimally, and exact solution algorithms typically fail on instances with more than a few core states. We therefore consider scalable algorithms for solving POMDPs. Specifically, we propose distributed and parallel versions of exact solution methods, such as Monahan's algorithm and the incremental pruning algorithm, as well as of heuristics such as Lovejoy's upper and lower bound methods. Next, we study CPOMDPs and propose linear programming-based solution methods as a flexible approach for solving them. We perform a detailed numerical study on five problem instances to demonstrate the viability of the proposed solution approach for CPOMDPs. Lastly, we propose multi-objective CPOMDP models for the breast cancer screening problem. These models extend previously proposed POMDP models for breast cancer screening by considering different screening modalities and by simultaneously optimizing two objectives: maximizing expected total quality-adjusted life years and minimizing lifetime risk.</p>
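<p>For context, a generic CPOMDP can be sketched as a policy optimization problem with expected-cost constraints, where decisions depend only on the belief state. The notation below (discount factor \(\gamma\), reward \(r\), cost functions \(c_k\), budgets \(d_k\)) is standard textbook notation and is not necessarily the specific formulation used in the thesis:</p>
\[
\max_{\pi} \; \mathbb{E}_{\pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t}\, r(s_t, a_t)\right]
\quad \text{subject to} \quad
\mathbb{E}_{\pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t}\, c_k(s_t, a_t)\right] \le d_k, \qquad k = 1, \dots, K,
\]
<p>where the policy \(\pi\) maps belief states \(b_t\) (posterior distributions over the hidden state given the history of actions and observations) to actions. The unconstrained POMDP is recovered when there are no cost constraints (\(K = 0\)).</p>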