Lean burn natural gas engines offer lower particulate emissions than their diesel counterparts and provide higher efficiency than stoichiometric operation. However, the lean burn strategy forfeits three-way catalyst (TWC) compatibility because of the oxygen-rich exhaust stream. In comparison, the exhaust gas recirculation (EGR) dilution strategy maintains compatibility with emission after-treatment systems. The maximum tolerable EGR level, however, is limited by the degradation of combustion stability resulting from the unfavorable mixture composition. Prechamber spark ignition (PCSI) systems, known to increase dilution tolerance in SI engines under lean conditions, were evaluated as a means to improve EGR dilution tolerance. Scavenging of residuals within the prechamber is typically a concern with these systems, and as such, studies of these systems operating across a range of EGR ratios are rare. In this work, an unscavenged (also termed unfueled or passive) PCSI system installed in a medium-duty natural gas engine is modeled using the CONVERGE CFD code. Simulation results are compared against experimental data in terms of in-cylinder pressure and heat release rate at low to high (10% to 22%) EGR levels. The predictive capabilities of two combustion models, a multi-zone well-stirred reactor (MZ-WSR) model and a flamelet-based combustion model, i.e., the G-equation model, are compared and evaluated under these conditions within the RANS framework. The G-equation model predictions agreed well with the experiments up to the 18.8% EGR dilution level. In comparison, the MZ-WSR model predicted slow prechamber combustion at all dilution levels, which influenced the main chamber combustion phasing.