Multiple observations of phenomena that cannot be explained within the Standard Model indicate that we need new physics beyond what we already know. The proliferation of theories in recent decades and the absence of direct observations of new phenomena at the LHC call for analysis strategies that are as model-independent as possible, yet suitable for experimental searches at current and next-generation colliders. A critical point in performing this kind of phenomenological analysis is that large parameter scans are very often necessary: intensive and often redundant Monte Carlo (MC) simulations have to be performed to cover the relevant regions of signal parameter space and to determine signal features with sufficient accuracy. On the other hand, disk space and computing time are often limited, and the environmental impact of such computations is almost never taken into consideration. There is a growing need to devise strategies to optimise data production and share resources across the HEP community, in both theory and experiment. I will describe a framework that enables such an approach, in which simulated signal samples are deconstructed into complete sets of basic elements that can be combined a posteriori to perform different analyses. The framework is modular, collaborative, flexible and resource-friendly. I will illustrate it through concrete examples of specific phenomenological analyses and indicate possible short- and long-term developments and applications.