This paper addresses the quantification problem posed by the hue wheel of the HSL colour space, whose two ends are connected, by constructing a colour preference feature extraction model. A non-uniform sampling grid is used to generate a library of 57 test schemes covering the full colour gamut, and visual aesthetic parameters are quantified through eye-tracking experiments. An intelligent colour-matching algorithm is then proposed by integrating the Pix2Pix image-translation model with an SE-Inception V3 aesthetic scoring network. At the level of spatial perception, four core elements (interface, path, node, and focus) are identified, and optimisation is carried out using seven regions in Area A as a case study. For colour extraction, the SSIM reaches 0.676, an improvement of 8.1% to 8.8% over the MCM, K-Means, and OM baselines, and the PSNR reaches 22.16 dB, an improvement of 7.1% to 18.3%. For colour coordination, the mean normalised colour difference is 0.264, outperforming professional designers (0.227), with a subjective score of 4.69/5.0. The entropy-weighted TOPSIS model reveals a polarisation in spatial perception, with a comprehensive index of 0.8685 for Zone C versus 0.1492 for Zone G. IPA behavioural analysis shows that culture-experience-oriented spaces achieve a culture-driven behavioural satisfaction score of 4.34. The study indicates that, by quantifying artistic elements and spatial perception metrics, intelligent algorithms can significantly enhance the scientific rigour and experiential quality of environmental art design.
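The wrap-around issue mentioned above (0° and 360° denote the same hue, so raw hue values cannot be compared by simple subtraction) is commonly handled by mapping hue onto the unit circle. The following Python sketch illustrates that general technique under assumed conventions (degree-valued hue, hypothetical function names); it is a minimal illustration, not the paper's actual feature extraction model.

```python
import numpy as np

def hue_to_unit_circle(hue_deg):
    """Map HSL hue angles (degrees) onto the unit circle so that
    0 deg and 360 deg coincide, removing the wrap-around discontinuity."""
    theta = np.deg2rad(np.asarray(hue_deg, dtype=float))
    return np.stack([np.cos(theta), np.sin(theta)], axis=-1)

def circular_hue_distance(h1_deg, h2_deg):
    """Shortest angular distance between two hues, in degrees (0..180)."""
    d = np.abs(np.asarray(h1_deg, dtype=float)
               - np.asarray(h2_deg, dtype=float)) % 360.0
    return np.minimum(d, 360.0 - d)

# Hues of 350 deg and 10 deg are perceptually adjacent even though
# their raw numeric difference is 340.
print(circular_hue_distance(350, 10))   # -> 20.0
print(hue_to_unit_circle([350, 10]))    # two nearby points on the circle
```

With this encoding, distances and averages over hue become well defined across the red boundary, which is the prerequisite for building hue-based preference features such as those described in the abstract.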