The Restricted Access Window (RAW) mechanism proposed in IEEE 802.11ah promises to address one of the major problems of the Internet of Things (IoT): high channel contention in large-scale, densely deployed sensor networks. RAW allows the Access Point (AP) to divide stations into groups, so that only the stations within the same group contend for the channel at the same time. Existing station grouping strategies only support homogeneous scenarios, in which all sensor stations have the same fixed data transmission interval, modulation and coding scheme (MCS), and packet size. In this paper, we present two contributions to address this issue. First, we propose a surrogate model that predicts RAW performance for given network conditions and RAW configuration parameters; it is fast to train and can be evaluated in real time. Second, we propose the Model-Based RAW Optimization Algorithm (MoROA), which uses the surrogate model to determine the optimal RAW configuration in real time for heterogeneous stations and dynamic traffic. We validate the accuracy of the surrogate model against simulation results and compare the performance of MoROA to that of existing RAW optimization algorithms and traditional 802.11 channel access methods. The results show that the trained surrogate model accurately predicts RAW performance, with a relative error below 7% for 95% of RAW configurations and below 10% for 98% of them. MoROA achieves up to twice the throughput of traditional 802.11 channel access methods in dense heterogeneous networks.
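As a rough illustration of the model-based search the abstract describes (not the authors' actual implementation), the sketch below shows how a trained surrogate could be used to pick a RAW configuration: the surrogate predicts throughput for each candidate configuration under the observed traffic state, and the best-scoring candidate is selected. The configuration fields, the surrogate interface, and the toy scoring function are all illustrative assumptions.

```python
# Hypothetical sketch of surrogate-driven RAW configuration selection.
# The real surrogate in the paper is a trained model; here a toy function
# stands in so the example runs on its own.

from dataclasses import dataclass
from itertools import product
from typing import Callable, Dict, List, Tuple

@dataclass(frozen=True)
class RawConfig:
    num_groups: int        # number of RAW groups the AP announces (assumed parameter)
    slots_per_group: int   # RAW slots assigned to each group (assumed parameter)
    slot_duration_us: int  # duration of one RAW slot in microseconds (assumed parameter)

def choose_raw_config(
    surrogate: Callable[[RawConfig, Dict[str, float]], float],
    traffic_state: Dict[str, float],
    candidates: List[RawConfig],
) -> Tuple[RawConfig, float]:
    """Return the candidate configuration with the highest predicted throughput."""
    best_cfg, best_throughput = candidates[0], float("-inf")
    for cfg in candidates:
        predicted = surrogate(cfg, traffic_state)  # fast surrogate evaluation, no simulation
        if predicted > best_throughput:
            best_cfg, best_throughput = cfg, predicted
    return best_cfg, best_throughput

def toy_surrogate(cfg: RawConfig, traffic: Dict[str, float]) -> float:
    # Placeholder for a trained model: penalize heavy per-group contention
    # and per-group signaling overhead relative to the offered load.
    stations_per_group = traffic["num_stations"] / cfg.num_groups
    contention_penalty = stations_per_group ** 1.5
    overhead_penalty = 0.5 * cfg.num_groups
    return traffic["offered_load_mbps"] - 0.01 * (contention_penalty + overhead_penalty)

if __name__ == "__main__":
    grid = [
        RawConfig(num_groups=g, slots_per_group=s, slot_duration_us=500)
        for g, s in product([2, 4, 8, 16, 32], [1, 2, 4])
    ]
    state = {"num_stations": 512.0, "offered_load_mbps": 4.0}
    cfg, thr = choose_raw_config(toy_surrogate, state, grid)
    print(f"selected {cfg} with predicted throughput {thr:.3f} Mbps")
```

Because the surrogate replaces packet-level simulation when scoring a candidate, this kind of search can plausibly run within a single beacon interval; the exhaustive grid above is only one possible way to traverse candidate configurations.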