Purpose: To compare perceptual and physiological training load responses during various basketball training modes.

Methods: Eight semi-professional male basketball players (age: 26.3 ± 6.7 years; height: 188.1 ± 6.2 cm; body mass: 92.0 ± 13.8 kg) were monitored across a 10-week period in the preparatory phase of the training plan. Session ratings of perceived exertion (sRPE) and heart rate (HR) responses were gathered from players across base, specific, and tactical/game-play training modes. Pearson correlations were used to determine the relationships between the sRPE model and two HR-based models: the training impulse (TRIMP) and summated heart rate zones (SHRZ). One-way ANOVAs were used to compare training loads between training modes for each model.

Results: Stronger relationships between the perceptual and physiological models were evident during base conditioning (sRPE-TRIMP: r = 0.53, P < 0.05; sRPE-SHRZ: r = 0.75, P < 0.05) and tactical/game-play conditioning (sRPE-TRIMP: r = 0.60, P < 0.05; sRPE-SHRZ: r = 0.63, P < 0.05) than during specific conditioning (sRPE-TRIMP: r = 0.38, P < 0.05; sRPE-SHRZ: r = 0.52, P < 0.05). Furthermore, the sRPE model detected greater increases in training load (126-429 AU) than the TRIMP (15-65 AU) and SHRZ (27-170 AU) models when transitioning between training modes.

Conclusions: While the training load models were significantly correlated during each training mode, weaker relationships were observed during specific conditioning. These comparisons suggest the HR-based models were less effective at detecting periodized increases in training load, particularly during court-based, intermittent, multidirectional drills. The practical benefits and sensitivity of the sRPE model support its use across different basketball training modes.
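
For reference, the three load models can be quantified as in the minimal sketch below, assuming the standard formulations (Foster's sRPE method, Banister's TRIMP, and Edwards' five-zone SHRZ weighting); the abstract does not state which exact variants were used, and all function names and example values are hypothetical.

# Minimal illustrative sketch of the three training load models (assumed
# formulations: Foster sRPE, Banister TRIMP, Edwards SHRZ); names and
# example values are hypothetical, not taken from the study.
import math

def srpe_load(rpe_cr10, duration_min):
    # Session RPE load (AU): CR-10 rating multiplied by session duration (min).
    return rpe_cr10 * duration_min

def trimp_load(duration_min, hr_mean, hr_rest, hr_max):
    # Banister TRIMP (AU) using the male weighting constants (0.64, 1.92).
    hr_ratio = (hr_mean - hr_rest) / (hr_max - hr_rest)  # fraction of HR reserve
    return duration_min * hr_ratio * 0.64 * math.exp(1.92 * hr_ratio)

def shrz_load(minutes_in_zone):
    # Edwards SHRZ (AU): minutes spent in each of five HR zones
    # (50-60% ... 90-100% HRmax), weighted by zone number 1-5 and summed.
    return sum(weight * minutes
               for weight, minutes in enumerate(minutes_in_zone, start=1))

# Example: a 90-minute session rated 6 on the CR-10 scale.
print(srpe_load(6, 90))                  # 540 AU
print(trimp_load(90, 155, 60, 190))      # ~171 AU with these assumed HR values
print(shrz_load([10, 20, 30, 20, 10]))   # 270 AU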