This paper proposes two sequential metamodel-based methods for level-set estimation (LSE) that leverage the uniform bound built on stochastic kriging: predictive variance reduction (PVR) and expected classification improvement (ECI). We show that PVR and ECI possess desirable theoretical performance guarantees and derive closed-form expressions for their sequential sampling criteria, which select the next design point at which to perform simulation runs and enable computationally efficient one-iteration look-ahead updates. To enhance understanding, we reveal the connection between the sequential sampling criteria of PVR and ECI. Additionally, we propose integrating a budget allocation feature with PVR and ECI, which improves computational efficiency and potentially enhances robustness to heteroscedasticity. Numerical studies demonstrate the superior performance of the proposed methods over state-of-the-art benchmark approaches under a fixed simulation budget, highlighting their effectiveness for LSE problems.
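To illustrate the sequential sampling paradigm described above, the sketch below shows a generic metamodel-based LSE loop in Python. It is an illustrative assumption only: the simulator, kernel settings, and the simple uncertainty-based sampling criterion are hypothetical placeholders, not the closed-form PVR or ECI criteria developed in the paper; per-point noise variances passed through `alpha` stand in for a stochastic-kriging-style metamodel.

```python
# Generic sequential metamodel-based LSE loop (illustrative sketch only).
# The criterion below is a simple uncertainty heuristic, NOT the paper's PVR/ECI.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulate(x, n_reps=10, rng=None):
    """Hypothetical noisy simulator: returns replication mean and variance of the mean."""
    rng = np.random.default_rng() if rng is None else rng
    reps = np.sin(3 * x) + x**2 + rng.normal(0.0, 0.5, size=n_reps)
    return reps.mean(), reps.var(ddof=1) / n_reps

level = 0.5                       # threshold defining the level set {x : f(x) >= level}
rng = np.random.default_rng(0)
candidates = np.linspace(-1.0, 1.0, 201)

# Initial space-filling design with replicated simulation runs.
X = list(np.linspace(-1.0, 1.0, 5))
y, noise_var = map(list, zip(*(simulate(x, rng=rng) for x in X)))

for _ in range(20):               # fixed budget of additional design points
    # Stochastic-kriging-style metamodel: heteroscedastic noise via per-point `alpha`.
    gp = GaussianProcessRegressor(
        kernel=ConstantKernel() * RBF(length_scale=0.3),
        alpha=np.array(noise_var),
        normalize_y=True,
    )
    gp.fit(np.array(X).reshape(-1, 1), np.array(y))
    mean, std = gp.predict(candidates.reshape(-1, 1), return_std=True)

    # Placeholder sampling criterion: sample where classification relative to
    # the level is most uncertain (small |mean - level| relative to std).
    criterion = std - np.abs(mean - level)
    x_next = candidates[np.argmax(criterion)]

    m, v = simulate(x_next, rng=rng)
    X.append(x_next); y.append(m); noise_var.append(v)

# Classify candidate points using the last fitted metamodel.
in_level_set = mean >= level
```

In this sketch, the argmax step plays the role of a one-iteration look-ahead sampling decision, and allocating replications per design point (here a fixed `n_reps`) is where a budget allocation feature, as proposed in the paper, would intervene.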