We consider stochastic unconstrained bilevel optimization problems when only first-order gradient oracles are available. While numerous optimization methods have been proposed for tackling bilevel problems, existing methods either tend to require possibly expensive calculations involving Hessians of lower-level objectives, or lack rigorous finite-time performance guarantees. In this work, we propose a Fully First-order Stochastic Approximation (F²SA) method, and study its non-asymptotic convergence properties. Specifically, we show that F²SA converges to an ε-stationary solution of the bilevel problem after O(ε^{-7/2}), O(ε^{-5/2}), or O(ε^{-3/2}) iterations (each iteration using O(1) samples) when stochastic noise is present in both level objectives, only in the upper-level objective, or not present (deterministic settings), respectively. We further show that if we employ momentum-assisted gradient estimators, the iteration complexities can be improved to O(ε^{-5/2}), O(ε^{-4/2}), or O(ε^{-3/2}), respectively. We demonstrate the superior practical performance of the proposed method over existing second-order-based approaches on MNIST data-hypercleaning experiments.
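To illustrate the flavor of a fully first-order bilevel update, the sketch below runs an F²SA-style iteration on a toy deterministic quadratic problem. This is a hedged illustration, not the authors' exact algorithm or step-size schedule: the functions f, g, all step sizes (eta, gamma, delta), and the multiplier schedule are assumptions chosen so the example is self-contained. The key point it demonstrates is that the x-update uses only gradients of f and g (no Hessians), combining an auxiliary variable z that tracks the lower-level solution with a variable y that tracks the minimizer of the penalized objective λ·g + f, while the multiplier λ slowly increases.

```python
# Toy bilevel problem (assumed for illustration only):
#   upper level: f(x, y) = 0.5 * y**2
#   lower level: g(x, y) = 0.5 * (y - x)**2, so y*(x) = x
# The hyperobjective is then F(x) = 0.5 * x**2, whose stationary point is x = 0,
# so a correct method should drive x toward 0.

def grad_f(x, y):
    """Return (df/dx, df/dy) for f(x, y) = 0.5 * y**2."""
    return 0.0, y

def grad_g(x, y):
    """Return (dg/dx, dg/dy) for g(x, y) = 0.5 * (y - x)**2."""
    return x - y, y - x

def f2sa_sketch(x0=2.0, iters=2000, eta=0.05, gamma=0.5, lam0=1.0, delta=0.01):
    x, y, z, lam = x0, 0.0, 0.0, lam0
    for _ in range(iters):
        # z tracks the lower-level solution y*(x) via a gradient step on g
        z = z - gamma * grad_g(x, z)[1]
        # y tracks the minimizer of lam * g(x, .) + f(x, .);
        # the 1/(lam + 1) step matches this quadratic's curvature
        y = y - (1.0 / (lam + 1.0)) * (lam * grad_g(x, y)[1] + grad_f(x, y)[1])
        # fully first-order hypergradient surrogate: no Hessians anywhere
        hyper = grad_f(x, y)[0] + lam * (grad_g(x, y)[0] - grad_g(x, z)[0])
        x = x - eta * hyper
        lam += delta  # slowly increase the penalty multiplier
    return x

x_final = f2sa_sketch()
```

On this toy instance the surrogate λ(∇ₓg(x, y) − ∇ₓg(x, z)) + ∇ₓf(x, y) approaches the true hypergradient F'(x) = x as λ grows, so the iterate converges to the stationary point x = 0.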