
Data compilation is understood as the set of statistical procedures performed on collected data to derive new information according to a given set of rules, resulting in intermediate data and final statistical outputs. It includes, among other things, the integration of data from different sources, the use of weighting schemes, methods for imputing missing values or source data, statistical adjustment, balancing and cross-checking techniques, and the relevant characteristics of the specific methods applied. Part III starts with an introduction to and overview of data compilation within the statistical framework for the modes of services supply (chapter 12). It then discusses the integration of data from different sources (chapter 13), the compilation of resident/non-resident trade in services statistics (chapter 14), the compilation of foreign affiliates statistics (FATS) (chapter 15) and the compilation of additional indicators on the international supply of services (chapter 16), and concludes with a description of the estimation and modelling of missing data, and of forecasting and back-casting (chapter 17).

In this part: