Effusion cooling represents the state of the art in liner cooling technology for modern combustion chambers, combining more uniform film protection of the wall with a significant heat-sink effect due to forced convection through a very large number of small holes. From a numerical point of view, a conjugate computational fluid dynamics (CFD) analysis of an entire combustor demands a high computational cost for a proper discretization of the effusion holes in order to obtain accurate liner temperature and effectiveness distributions. Consequently, simplified CFD approaches that model the associated phenomena are required, especially during the design process. For this purpose, 2D boundary source models are attractive: each effusion hole is replaced by an inlet patch (hot side) and an outlet patch (cold side) to account for the coolant injection. However, proper velocity profiles at the inlet patch, together with the correct mass flow rate, are mandatory to accurately predict the interaction and mixing between coolant air and hot gases, as well as the temperature and effectiveness distributions on the liners. In this sense, reduced-order (RO) modeling techniques from the machine learning framework can be used to derive a surrogate model (SM) that predicts these velocity profiles at reduced computational cost, starting from a limited number of CFD simulations of a single effusion hole at different operating conditions. In this work, such an approach is applied to model the effusion system of a non-reactive single-sector linear combustor simulator equipped with a swirler and a multi-perforated plate, combining ANSYS Fluent with a MATLAB code. The surrogate model is trained on a set of CFD simulations of the single effusion hole with operating conditions sampled in the model parameter space, and is subsequently assessed on a separate validation set.
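To illustrate the general idea of a velocity-profile surrogate (the abstract does not specify the actual RO technique, and the paper's implementation is in MATLAB), the sketch below uses a simple inverse-distance-weighted blend of training profiles over the operating-condition space. All names, the two-parameter condition vector, and the interpolation scheme are illustrative assumptions, not the authors' method:

```python
import math

def idw_surrogate(train_params, train_profiles, query, power=2.0):
    """Hypothetical surrogate: predict an effusion-hole velocity profile
    at a new operating condition by inverse-distance weighting of the
    profiles obtained from training CFD simulations.

    train_params   -- list of operating-condition tuples (e.g. pressure
                      ratio, coolant-to-gas temperature ratio)
    train_profiles -- list of velocity profiles (one list of samples per
                      training condition, all the same length)
    query          -- operating condition at which to predict
    """
    weights = []
    for k, p in enumerate(train_params):
        d = math.dist(p, query)
        if d == 0.0:
            # Exact hit on a training condition: return that profile.
            return list(train_profiles[k])
        weights.append(1.0 / d ** power)
    total = sum(weights)
    n = len(train_profiles[0])
    # Weighted average of the training profiles, sample by sample.
    return [
        sum(w * prof[i] for w, prof in zip(weights, train_profiles)) / total
        for i in range(n)
    ]

# Usage: two training conditions, prediction at an intermediate one.
params = [(1.0, 0.5), (2.0, 0.5)]
profiles = [[0.0, 1.0, 0.5], [0.0, 2.0, 1.0]]
predicted = idw_surrogate(params, profiles, (1.5, 0.5))
```

A practical surrogate for this task would more likely combine a modal reduction of the profiles (e.g. POD) with a regressor over the reduced coefficients; the scheme above only shows the training-set-to-prediction workflow the abstract describes.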