A.  Summary of good practices

17.2.        To carry out the imputations needed for filling data gaps and for data editing purposes, it is good practice for compilers first to check and load data from primary and secondary sources, and then to integrate, process and analyse those data. If the compiler believes that the source data may need to be edited, the source data provider should be contacted and asked to edit or adjust the data as necessary.

17.3.        To deal with transactions that are unreported, or reported only in aggregate, because of thresholds applied in an ITRS, different approaches can be used, such as collecting information on small-value transactions from periodic sample surveys or from an analysis of small transactions before the thresholds were raised. If a threshold is established, it is good practice for transactions below the threshold to be reported as an aggregated amount and classified using the appropriate code. Alternatively, statistical estimates can be applied and updated periodically with actual data.
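For illustration only, the sketch below shows one simple way of grossing up reported values for transactions falling below an ITRS threshold, using a coverage ratio estimated from a periodic sample survey or from data for the period before the threshold was raised. The function name, the coverage ratio and the figures are hypothetical and are not prescribed by the present Guide.

```python
# Illustrative sketch: grossing up ITRS data for transactions below the
# reporting threshold, using a coverage ratio estimated from a periodic
# sample survey or from pre-threshold data. All names and figures are
# hypothetical.

def gross_up_itrs(reported_above_threshold: float, coverage_ratio: float) -> float:
    """Estimate total transactions from above-threshold reports.

    coverage_ratio is the share of total value captured above the
    threshold, e.g. 0.94 if the survey suggests 6% of value falls below it.
    """
    if not 0 < coverage_ratio <= 1:
        raise ValueError("coverage ratio must lie in (0, 1]")
    return reported_above_threshold / coverage_ratio

# Example: 950 million reported above the threshold and a survey-based
# coverage ratio of 0.95 give an estimated total of 1,000 million.
estimated_total = gross_up_itrs(950.0, 0.95)
below_threshold_estimate = estimated_total - 950.0
```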

17.4.        When data are not available on a sufficiently timely basis, the compiler may extrapolate certain series from earlier periods or interpolate new data points within a range of known data points. The choice of extrapolation or interpolation method should be based on the characteristics of the past series and on the range of information available at the time of compilation. If other, more frequent, indicators provide evidence of seasonality in the series, then the data models and interpolation techniques should take that seasonality into account.
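As a simple illustration, and not as a prescribed method, the sketch below shows a linear interpolation for an interior gap and a naive seasonal extrapolation that carries forward the year-on-year growth of the latest observation. Production systems would normally rely on more elaborate time-series models; the function names and assumptions in the sketch are hypothetical.

```python
# Illustrative sketch: linear interpolation for an interior gap and a
# simple seasonal extrapolation for a quarterly series. Hypothetical
# names; real systems would typically use seasonal time-series models.

def interpolate_gap(prev_value: float, next_value: float, steps: int, step: int) -> float:
    """Linearly interpolate the value `step` periods after `prev_value`,
    when the next observed value lies `steps` periods ahead."""
    return prev_value + (next_value - prev_value) * step / steps

def seasonal_extrapolate(series: list[float], periods_per_year: int = 4) -> float:
    """Extrapolate one period ahead by applying the most recent
    year-on-year growth to the same quarter of the previous year."""
    yoy_growth = series[-1] / series[-1 - periods_per_year]
    return series[-periods_per_year] * yoy_growth
```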

17.5.        For back-casting, in the absence of relevant indirect measures, compilers may consider applying a constant percentage change between the start and end points, if no better approximation is feasible. It is good practice to analyse such relationships to verify that they hold over time, so that the compiler can determine the appropriate back-casting period. Again, if other, more frequent, indicators provide evidence of seasonality in the series, then the back-casting techniques should take that into account.
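The constant percentage change described above amounts to applying an implied per-period growth rate of r = (end/start)^(1/n) - 1 over n periods. A minimal sketch of the corresponding calculation is given below; the figures are hypothetical.

```python
# Illustrative sketch: back-casting by assuming a constant percentage
# change between a known start point and a known end point, as described
# in paragraph 17.5. Hypothetical values.

def backcast_constant_growth(start_value: float, end_value: float, n_periods: int) -> list[float]:
    """Return the full path from start to end filled with a constant growth rate."""
    rate = (end_value / start_value) ** (1.0 / n_periods)
    return [start_value * rate ** k for k in range(n_periods + 1)]

# Example: an old benchmark of 80 and the first regularly compiled value
# of 100, four periods apart, imply growth of about 5.7% per period.
path = backcast_constant_growth(80.0, 100.0, 4)
```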

17.6.        Model-based estimates can be used for various statistics on the international supply of services. For instance, for estimating travel services, a model could be constructed using primarily the number of visitors and other short-term travellers (partly available from tourism statistics or from transportation operators) and estimates of per capita expenditure obtained from occasional surveys of persons who travel. Additionally, for estimating the value of mode 4 and the number of persons moving under mode 4, model-based estimates could be developed using existing statistics on the international supply of services, travel information (including model-based estimates of the number of travellers), as well as existing data from, for example, tourism, migration or employment statistics. In that context, it is advised that compilers analyse available metadata, familiarize themselves with the methodologies behind data from other statistical frameworks, and cooperate closely and exchange the relevant microdata with the other statistical domains and institutions involved.
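By way of illustration, a minimal sketch of such a model-based travel estimate is shown below, multiplying visitor numbers by survey-based per capita expenditure for each traveller segment. The class, function and figures are hypothetical and serve only to make the structure of the calculation explicit.

```python
# Illustrative sketch: travel credits estimated as the number of
# visitors, split by traveller segment, multiplied by per capita
# expenditure from occasional surveys. Hypothetical names and figures.

from dataclasses import dataclass

@dataclass
class TravelSegment:
    visitors: int                  # e.g. from tourism or transport statistics
    expenditure_per_capita: float  # e.g. from a periodic survey of travellers

def estimate_travel_credits(segments: dict[str, TravelSegment]) -> float:
    """Sum visitors multiplied by average expenditure across segments."""
    return sum(s.visitors * s.expenditure_per_capita for s in segments.values())

# Example with hypothetical figures (expenditure in national currency).
segments = {
    "business": TravelSegment(visitors=120_000, expenditure_per_capita=900.0),
    "personal": TravelSegment(visitors=480_000, expenditure_per_capita=650.0),
}
travel_credits = estimate_travel_credits(segments)
```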

17.7.        Compilers should strive to systematically allocate all services transactions to the relevant individual EBOPS services categories, at the most detailed level possible, as well as to the appropriate trading partners. If diverse transactions are bundled into a single payment or receipt, the compiler should estimate the shares attributable to the individual services transactions and allocate the estimates to the relevant items and partners.
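For illustration, the sketch below splits a single bundled receipt across EBOPS categories and partner countries using estimated shares, which might be derived, for example, from a survey or from comparable unbundled transactions. The shares, category names and figures are hypothetical.

```python
# Illustrative sketch: allocating a bundled receipt across
# (EBOPS category, partner country) pairs using estimated shares that
# sum to 1. Hypothetical shares and figures.

def allocate_bundled_payment(total: float, shares: dict[tuple[str, str], float]) -> dict[tuple[str, str], float]:
    """Split `total` over (EBOPS category, partner) pairs by estimated share."""
    if abs(sum(shares.values()) - 1.0) > 1e-9:
        raise ValueError("estimated shares must sum to 1")
    return {key: total * share for key, share in shares.items()}

# Example: a single receipt of 10.0 covering services supplied to two partners.
allocation = allocate_bundled_payment(
    10.0,
    {
        ("Computer services", "Country A"): 0.5,
        ("Other business services", "Country A"): 0.3,
        ("Computer services", "Country B"): 0.2,
    },
)
```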

 
