Integrating Research Data into Official Statistics
1. Outcome
This Circular provides guidance on bridging the gap between scientific research data and official statistics for the compilation of ocean accounts. Ocean accounting requires diverse data sources that extend beyond traditional statistical surveys and administrative records to include oceanographic observations, biodiversity monitoring, ecosystem assessments, and other forms of research data generated by universities, research institutions, and international scientific programmes. This Circular addresses the systematic integration of research data into statistical production, maintaining the quality standards expected of official statistics while leveraging the unique capabilities of scientific research organisations.
By implementing the guidance in this Circular, practitioners will be able to identify research data sources relevant to ocean accounting applications, assess their fitness for statistical purposes using quality frameworks adapted from the United Nations National Quality Assurance Framework (UN NQAF), establish institutional partnerships with research organisations, and compile ocean accounts that combine traditional statistical sources with research data in a transparent and methodologically sound manner. The specific applications addressed include the use of oceanographic research data for ecosystem condition accounts (see TG-3.5 Ecosystem Condition), stock assessment data for fisheries accounts (see TG-6.7 Fisheries Stock Assessment), and biodiversity monitoring data for extent accounts (see TG-2.9 Ecosystem Extent).
The quality assessment dimensions presented here build on the overarching quality framework established in TG-0.7 Quality Assurance, while the data harmonisation techniques needed to reconcile research data with other sources are detailed in TG-4.6 Data Harmonisation. Key terms used in this Circular are defined in TG-0.6 Glossary.
2. Requirements
This Circular requires familiarity with:
- TG-0.1 General Introduction to Ocean Accounts -- provides foundational understanding of ocean accounts components and the relationship between environmental and economic accounting frameworks, including the conceptual basis for integrating diverse data sources into a coherent accounting system.
- TG-0.7 Quality Assurance -- establishes the overarching quality framework applicable to ocean accounting data, including the quality dimensions and assessment procedures that this Circular applies specifically to research data sources.
3. Guidance Material
The compilation of ocean accounts frequently requires data from scientific research programmes that were not originally designed for statistical purposes. This presents both opportunities and challenges. Research data can fill critical gaps in ocean observation and ecosystem monitoring that traditional statistical sources cannot address. However, integrating such data into official statistics requires careful attention to quality assessment, metadata documentation, and institutional coordination. This Circular provides a systematic approach to navigating these challenges.
The quality considerations discussed here should be understood within the broader quality assurance framework described in TG-0.7 Quality Assurance. For guidance on reconciling data from multiple sources with different classifications and spatial boundaries, see TG-4.6 Data Harmonisation.
3.1 Decision Use Cases for Research Data
Research data integration supports specific decision-making applications in ocean accounting. This section identifies the primary use cases where research data sources provide essential inputs that traditional statistical sources cannot supply.
3.1.1 Ecosystem condition accounts
Ecosystem condition accounts require measurements of biophysical and chemical variables that characterise the state of marine ecosystems[1]. For ocean environments, many of these measurements are only available through oceanographic research programmes:
Oceanographic surveys provide temperature, salinity, dissolved oxygen, nutrient concentrations, and pH measurements throughout the water column. The Global Ocean Observing System (GOOS) coordinates international efforts to standardise observation methods and improve data sharing[2]. For condition accounts in pelagic waters (see TG-6.5 Pelagic and Open Ocean Accounting), research vessel surveys and autonomous profiling floats (such as the Argo network) are the primary sources for subsurface condition data.
Acoustic surveys estimate biomass of schooling pelagic fish and benthic invertebrates using scientific echosounders. Regional fisheries management organisations (RFMOs) often conduct acoustic surveys to assess stock condition, providing data that can be integrated into ecosystem condition accounts where fish biomass serves as a condition indicator[3].
Water quality monitoring by environmental research agencies tracks pollution levels, turbidity, and harmful algal blooms. For coastal condition accounts, research monitoring of nitrogen and phosphorus loading provides essential data on eutrophication pressure.
The SEEA Ecosystem Accounting framework identifies specific condition characteristics that typically require research data inputs, including biotic characteristics (species diversity, biomass, community composition), abiotic characteristics (water temperature, pH, salinity), and functional characteristics (primary productivity, nutrient cycling)[4].
3.1.2 Fisheries stock assessment
Stock assessment for commercial and subsistence fisheries depends heavily on scientific research data[5]:
Catch-at-age data from research vessel surveys provide independent estimates of population abundance and age structure, complementing catch data reported by commercial vessels. For highly migratory species such as tuna, international research programmes coordinated through RFMOs are often the only source of systematic catch-at-age information across the species' range.
Tagging programmes using electronic tags, acoustic telemetry, and genetic markers reveal migration patterns, population structure, and survival rates. Close-kin mark-recapture methods using genetic analysis allow estimation of absolute abundance for high-value species such as southern bluefin tuna, where traditional survey methods are impractical[6].
Life history parameters including growth rates, natural mortality, and fecundity are typically derived from research laboratory studies and field sampling programmes rather than routine statistical collection.
National Statistical Offices (NSOs) compiling fisheries accounts rely on stock assessments produced by fisheries research institutes to estimate the physical and monetary value of aquatic resources, as described in TG-3.1 Asset Accounts.
3.1.3 Biodiversity and extent mapping
Research programmes provide essential data for ecosystem extent accounts and biodiversity indicators:
Benthic habitat mapping using multibeam echosounder surveys, remotely operated vehicles (ROVs), and drop cameras classifies seabed substrates and identifies ecosystem types such as cold-water coral reefs, sponge beds, and seagrass meadows. National hydrographic offices and marine research institutes are typically the custodians of these data.
Species occurrence records from biodiversity surveys, museum collections, and citizen science programmes are aggregated in global repositories such as the Ocean Biodiversity Information System (OBIS) and the Global Biodiversity Information Facility (GBIF)[7]. For ecosystem extent accounts based on the IUCN Global Ecosystem Typology, these occurrence data support delineation of ecosystem functional groups.
Coral reef monitoring through the Global Coral Reef Monitoring Network and regional programmes such as the Coral Triangle Initiative provides systematic assessments of coral cover, bleaching events, and reef condition that underpin extent and condition accounts for coral reef ecosystems (see TG-6.1 Coral Reef Ecosystem Accounting).
The Framework for the Development of Environment Statistics (FDES) notes that "scientific research data can be used to address data gaps" in environmental statistics, particularly for parameters that require specialised measurement techniques[8].
3.2 Types of Research Data
Research data relevant to ocean accounting encompasses a diverse range of sources, each with distinct characteristics, collection methodologies, and quality considerations. Understanding these distinctions is essential for identifying appropriate data sources and assessing their fitness for accounting purposes.
3.2.1 Oceanographic observation data
Oceanographic observation systems generate continuous or periodic measurements of physical, chemical, and biological ocean parameters[9]. These include temperature, salinity, currents, dissolved oxygen, nutrient concentrations, and chlorophyll levels. Major international programmes such as the Global Ocean Observing System (GOOS), the Argo float network, and regional ocean observing systems (e.g., the Integrated Marine Observing System in Australia) provide standardised data streams that can support ecosystem condition accounts[10]. The Intergovernmental Oceanographic Commission (IOC) of UNESCO coordinates international efforts to improve ocean observation capacity and data sharing[11].
Such data are typically collected using instrumented platforms including research vessels, moored buoys, autonomous underwater vehicles, satellite remote sensing, and profiling floats. The primary advantages include: systematic temporal coverage enabling trend analysis; standardised measurement protocols developed through international scientific consensus; and increasingly open access through global data repositories. However, spatial coverage can be uneven, with data density varying significantly between coastal and open ocean areas, and between developed and developing country waters[12].
Among the most directly relevant oceanographic variables for ecosystem condition accounts are the Essential Ocean Variables (EOVs) defined by GOOS. The EOV framework organises ocean observations into physics, biogeochemistry, and biology/ecosystems domains[13]. For ecosystem condition accounts, priority EOVs include sea surface temperature, dissolved oxygen, inorganic carbon, ocean colour (as a proxy for phytoplankton biomass), and marine habitat properties. For ecosystem extent accounts, relevant EOVs include hard coral cover, seagrass cover, mangrove cover, and macroalgal canopy cover. Practitioners should consult the current GOOS EOV specification sheets, which document readiness levels, observation requirements, and data product availability for each variable, to determine which EOVs are feasible data sources for their national accounting context.
3.2.2 Biodiversity survey data
Biodiversity surveys document species occurrence, abundance, and distribution through structured sampling programmes[14]. For marine environments, these include fish stock assessments, invertebrate surveys, marine mammal and seabird censuses, coral reef monitoring, and seagrass mapping exercises. Such data are fundamental for ecosystem extent accounts (mapping ecosystem types), ecosystem condition accounts (assessing biodiversity indicators), and ecosystem services accounts (quantifying provisioning services such as fisheries).
Biodiversity data are collected through diverse methodologies including visual census, acoustic surveys, environmental DNA (eDNA) sampling, and citizen science programmes[15]. The Global Biodiversity Information Facility (GBIF) aggregates species occurrence records from research institutions worldwide and provides standardised access through its data portal[16]. The Ocean Biodiversity Information System (OBIS) focuses specifically on marine biodiversity data, aggregating observations through a network of regional nodes[17].
A key consideration is that biodiversity surveys often follow sampling designs optimised for scientific research questions rather than comprehensive spatial coverage. This can result in sampling bias towards accessible locations or areas of particular scientific interest, which must be addressed when using such data for area-based ecosystem accounts[18].
3.2.3 Ecosystem monitoring programmes
Long-term ecosystem monitoring programmes track changes in ecosystem structure, function, and condition over time. Examples relevant to ocean accounting include coral reef monitoring networks (e.g., the Global Coral Reef Monitoring Network), mangrove forest assessments, seagrass habitat mapping, and kelp forest monitoring programmes[19]. These programmes typically combine remote sensing data with ground-truthing surveys to map ecosystem extent and assess condition indicators.
The SEEA Ecosystem Accounting framework recommends using a tiered approach for ecosystem monitoring, with Tier 1 using globally available default data, Tier 2 using regionally appropriate data, and Tier 3 using nationally collected data with full spatial and temporal coverage[20]. Research monitoring programmes often provide the foundation for Tier 2 and Tier 3 approaches, particularly for marine ecosystems where routine statistical collection is limited.
3.2.4 Remote sensing and Earth observation data
Satellite-based Earth observation provides systematic, repeated coverage of ocean and coastal areas at scales relevant to national accounting[21]. Key parameters measurable from space include sea surface temperature, ocean colour (indicative of chlorophyll and primary productivity), sea level, surface currents, coastal land cover change, and wetland extent. The European Union's Copernicus Marine Service and NASA's Ocean Biology Processing Group provide validated ocean data products derived from multiple satellite sensors[22].
Remote sensing data are particularly valuable for their temporal frequency (enabling detection of change) and spatial comprehensiveness (enabling complete national coverage). However, limitations include cloud cover interference, the inability to observe below the sea surface, and the requirement for ground-truthing to validate derived products. The SEEA Technical Guidance on Biophysical Modelling notes that remote sensing offers "enormous opportunities to disseminate data with very short time-lags and high-frequency"[23].
Practitioners should be aware of the spatial and temporal resolution characteristics of commonly used satellite sensors. Sentinel-2 provides optical imagery at 10-metre resolution with a five-day revisit time, suitable for mapping coastal habitats such as mangroves and seagrass beds. Landsat (currently Landsat 8 and 9) offers 30-metre resolution imagery with a 16-day revisit cycle and a continuous archive extending to 1972, making it valuable for long-term change detection of coastal land cover. MODIS provides daily global coverage at 250-metre to 1-kilometre resolution, suitable for broad-scale ocean colour and sea surface temperature monitoring. Sentinel-1 synthetic aperture radar operates independently of cloud cover at 5-20 metre resolution, enabling monitoring in persistently cloudy tropical coastal regions. The choice of sensor depends on the accounting application: ecosystem extent mapping typically requires higher spatial resolution (Sentinel-2, Landsat), while condition monitoring over large areas may use coarser but more frequent coverage (MODIS, VIIRS)[24]. For comprehensive guidance on satellite data sources, see TG-4.1 Remote Sensing and Geospatial Data.
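As a quick planning aid, the sensor characteristics above can be kept in a small lookup structure. The sketch below summarises the figures from this paragraph (the Sentinel-1 revisit time is an added assumption, not stated in the text) together with an illustrative selection rule; it is not prescribed guidance.

```python
# Sensor characteristics summarised from the paragraph above (approximate values).
# The Sentinel-1 revisit time is an assumption; it is not stated in the text.
SENSORS = {
    "Sentinel-2":  {"resolution_m": 10,  "revisit_days": 5,  "radar": False},
    "Landsat 8/9": {"resolution_m": 30,  "revisit_days": 16, "radar": False},
    "MODIS":       {"resolution_m": 250, "revisit_days": 1,  "radar": False},
    "Sentinel-1":  {"resolution_m": 20,  "revisit_days": 12, "radar": True},
}

def shortlist(max_resolution_m, max_revisit_days, need_cloud_penetration=False):
    """Shortlist sensors by resolution, revisit time and cloud penetration (assumed rule)."""
    return [
        name for name, s in SENSORS.items()
        if s["resolution_m"] <= max_resolution_m
        and s["revisit_days"] <= max_revisit_days
        and (s["radar"] or not need_cloud_penetration)
    ]

# Coastal extent mapping: fine resolution, monthly revisit is sufficient
print(shortlist(max_resolution_m=30, max_revisit_days=30))
# Broad-scale condition monitoring: near-daily coverage matters more than resolution
print(shortlist(max_resolution_m=1000, max_revisit_days=2))
```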
3.2.5 Scientific research publications and datasets
Peer-reviewed scientific publications represent an important source of coefficients, conversion factors, and methodological parameters for ecosystem service modelling[25]. For example, estimates of carbon sequestration rates in mangroves, nutrient retention by seagrass meadows, or coastal protection values from coral reefs are frequently derived from published research rather than direct national measurement. While individual studies may be site-specific, synthesis studies and meta-analyses can provide generalised values applicable across similar ecosystem types.
Research datasets underlying publications are increasingly required to be deposited in open repositories as a condition of publication or funding. The Framework for the Development of Environment Statistics (FDES) notes that scientific research data "are usually available at no or low cost" and "can be used to address data gaps"[26]. However, the FDES also cautions that such data "often use terms and definitions that differ from those used in statistics", may have "limited scope", and are "often available on a one-time basis only"[27].
3.3 Quality Assessment Frameworks
Research data must be assessed for fitness for statistical purposes before incorporation into ocean accounts. The quality dimensions applied to research data differ in emphasis from those applied to survey data, reflecting the distinct characteristics of scientific data production. For the overarching quality framework applicable to all ocean accounting data, see TG-0.7 Quality Assurance.
3.3.1 Dimensions of data quality
The United Nations National Quality Assurance Framework (UN NQAF) identifies quality dimensions applicable to official statistics including relevance, accuracy, reliability, timeliness, punctuality, accessibility, clarity, coherence, and comparability[28]. When evaluating research data for ocean accounting, particular attention should be given to:
Relevance -- The degree to which data meet the needs of users. For ocean accounting, relevance assessment should consider whether the research data address the specific ecosystem components, condition characteristics, or service flows required by the account structure. The UN NQAF notes that "relevance is concerned with whether the available statistics meet the needs of users" and that "assessing relevance is a subjective matter dependent upon the varying needs of users"[29]. For research data, relevance questions include: Does the spatial coverage align with the accounting area? Does the temporal resolution match the accounting period? Are the measured variables directly usable or do they require transformation?
Accuracy and reliability -- The degree to which data correctly measure the phenomena they are designed to measure. For research data, this assessment should consider sampling design adequacy, measurement methodology validation, error propagation in derived products, and replication of results[30]. The SEEA Technical Guidance on Biophysical Modelling provides detailed guidance on accuracy assessment for modelled data, including approaches for validating look-up tables, process-based models, and machine learning outputs[31].
Coherence -- The degree to which data can be reliably combined with other data from different sources. Research data often use classifications, definitions, and spatial boundaries that differ from statistical standards. Achieving coherence requires mapping research classifications to standard statistical classifications (such as the IUCN Global Ecosystem Typology for ecosystem types) and reconciling spatial units[32]. See TG-4.6 Data Harmonisation for detailed guidance on harmonisation approaches.
Comparability -- The degree to which data are comparable over time and across space. Scientific monitoring programmes may change methodologies as measurement technology improves, creating breaks in time series. Documentation of methodological changes and development of bridging factors may be required to maintain temporal comparability[33].
Accessibility -- The ease with which data can be obtained and used. The UN NQAF emphasises that "accessibility refers to the physical conditions under which users can obtain data" and includes "the ease with which the existence of information can be ascertained, as well as the suitability of the form or medium through which the information can be accessed"[34]. For research data, accessibility considerations include licensing restrictions, data format compatibility, and the existence of documented data access protocols. The FAIR principles (Findable, Accessible, Interoperable, Reusable) provide a complementary framework addressed in Section 3.4.1.
3.3.2 Reproducibility, replicability, and scientific rigour
Research data quality is traditionally evaluated through the scientific peer review process. Two related but distinct concepts are relevant to assessing research data for accounting purposes.
Reproducibility refers to the ability to obtain consistent results using the same input data and the same methods[35]. It verifies that the computational or analytical pipeline produces identical outputs when re-executed. For ocean accounting, reproducibility is essential because accounts must be updatable on a regular basis using consistent methods. If a research dataset cannot be reproduced from documented inputs and procedures, the accounting compilation becomes dependent on individual researchers and is vulnerable to disruption.
Replicability refers to the ability to obtain consistent results when new data are collected using the same or similar methods[36]. It tests whether the underlying phenomenon is measured consistently across different sampling events. For ocean accounting, replicability matters when research-derived coefficients or modelled parameters are applied across multiple accounting periods or geographic areas. A replicable finding provides greater confidence that a value derived from one study site or time period can be reasonably applied in another context.
The SEEA Technical Guidance on Biophysical Modelling emphasises that "transparency of approaches is essential" and recommends that "not only the models and data sources, but also workflow, conceptual development, and approaches for qualitatively and quantitatively describing uncertainty are properly traced and available for examination"[37]. Code repository systems such as GitHub facilitate version control and reproducibility for modelled outputs.
3.3.3 Uncertainty quantification
Research data are characterised by various sources of uncertainty that must be documented and, where possible, quantified. Uncertainty can arise from sampling variability, measurement error, model parameter uncertainty, and structural model uncertainty. The SEEA Technical Guidance on Biophysical Modelling notes that "uncertainty matrices, which outline possible sources of uncertainty for each model" provide a basic approach to uncertainty documentation[38].
For ecosystem accounts derived from biophysical models, "outputs should be seen as best estimates, rather than absolute values" unless detailed parameterisation and validation has been conducted[39]. Statistical agencies should communicate uncertainty alongside point estimates, enabling users to assess fitness for their specific purposes. The tiered approach recommended in SEEA EA supports this, with lower tiers acknowledged to have greater uncertainty but serving valuable purposes for awareness-raising and broad trend analysis[40].
The UN NQAF recommends that "statistical agencies should publish information on the quality of the statistics they compile and disseminate" and that "quality information should include measures of accuracy"[41]. For research data integrated into ocean accounts, this translates to publishing confidence intervals, standard errors, or qualitative uncertainty assessments alongside the data values used in the accounts.
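As a concrete illustration of combining uncertainty sources, the sketch below computes an annual mean from quarterly observations and reports a single combined uncertainty. It is a minimal example under a simple assumption of independent sampling and measurement errors; the figures are placeholders, not values from any programme.

```python
import math

def annual_mean_with_uncertainty(quarterly_values, measurement_sd):
    """Annual mean of quarterly observations with a simple combined uncertainty.

    Sampling uncertainty is the standard error of the mean; measurement
    uncertainty is combined in quadrature (assumed independent). This is a
    minimal sketch, not a prescribed error model.
    """
    n = len(quarterly_values)
    mean = sum(quarterly_values) / n
    # Sample variance of the quarterly values
    var = sum((v - mean) ** 2 for v in quarterly_values) / (n - 1)
    sampling_se = math.sqrt(var / n)
    combined = math.sqrt(sampling_se ** 2 + measurement_sd ** 2)
    return mean, combined

# Placeholder dissolved oxygen values (mg/L) for four quarterly surveys
mean_do, u = annual_mean_with_uncertainty([6.9, 6.4, 5.8, 6.3], measurement_sd=0.15)
print(f"annual mean = {mean_do:.2f} mg/L, uncertainty = ±{u:.2f} mg/L")
```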
3.4 Metadata Standards
Comprehensive metadata documentation is essential for integrating research data into statistical systems. Metadata enable data discovery, support quality assessment, and provide the documentation required for reproducible compilation of accounts.
3.4.1 FAIR principles for research data
The FAIR Guiding Principles -- Findable, Accessible, Interoperable, and Reusable -- provide a framework for data management that facilitates integration across sources[42]. The SEEA Technical Guidance on Biophysical Modelling endorses FAIR approaches as "especially important for primary data sources such as statistical offices"[43]. The principles are:
- Findable: Data are assigned globally unique and persistent identifiers; data are described with rich metadata; metadata clearly include the identifier of the data they describe; data are registered in searchable resources[44].
- Accessible: Data are retrievable by their identifier using standardised protocols; protocols allow for authentication where necessary; metadata remain accessible even when data are no longer available[44:1].
- Interoperable: Data use formal, accessible, shared languages for knowledge representation; data use vocabularies that follow FAIR principles; data include qualified references to other data[44:2].
- Reusable: Data are richly described with accurate and relevant attributes; data are released with clear usage licences; data are associated with detailed provenance; data meet domain-relevant community standards[44:3].
NSOs should prioritise data sources that adhere to FAIR principles and should advocate for FAIR practices in partnerships with research organisations.
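One lightweight way to operationalise this prioritisation is to screen candidate sources against a minimal set of FAIR-related metadata elements. The sketch below is an assumed internal triage checklist drawn from the four principles listed above; the field names are illustrative and it is not a formal FAIR assessment tool.

```python
# Minimal FAIR screening of a candidate research data source.
# Required elements are an assumed mapping of the four principles above.
REQUIRED = {
    "Findable": ["persistent_identifier", "metadata_record"],
    "Accessible": ["access_protocol"],
    "Interoperable": ["standard_format", "controlled_vocabulary"],
    "Reusable": ["licence", "provenance"],
}

def fair_screen(source):
    """Return the FAIR principles for which required metadata elements are missing."""
    gaps = {}
    for principle, fields in REQUIRED.items():
        missing = [f for f in fields if not source.get(f)]
        if missing:
            gaps[principle] = missing
    return gaps

candidate = {
    "persistent_identifier": "doi:10.xxxx/example",   # hypothetical identifier
    "metadata_record": True,
    "access_protocol": "HTTPS download with registration",
    "standard_format": "NetCDF (CF conventions)",
    "controlled_vocabulary": None,                     # gap to raise with the provider
    "licence": "CC-BY 4.0",
    "provenance": True,
}
print(fair_screen(candidate))   # {'Interoperable': ['controlled_vocabulary']}
```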
3.4.2 Domain-specific metadata standards
Several domain-specific metadata standards are relevant for ocean accounting:
ISO 19115 Geographic Information - Metadata is the international standard for geospatial metadata, providing a schema for describing geographic datasets and services[45]. ISO 19115 covers identification, extent, quality, spatial reference, content, distribution, and other properties of geographic data. For spatially-referenced ocean data, ISO 19115-compliant metadata should be a requirement. The Global Statistical Geospatial Framework (GSGF) notes that ISO 19115 "provides information about the identification, the extent, the quality, the spatial and temporal aspects, the content, the spatial reference, the portrayal, distribution, and other properties of digital geographic data and services"[46].
Darwin Core is a metadata standard for biodiversity data, providing terms for describing species occurrence records[47]. Darwin Core is the standard used by GBIF and OBIS for aggregating biodiversity observations from distributed sources. Ocean accounting projects drawing on biodiversity survey data should ensure compatibility with Darwin Core terms.
Climate and Forecast (CF) Conventions provide standards for describing climate and forecast data, particularly for gridded data in NetCDF format[48]. CF conventions are widely used in oceanography for describing variables, coordinates, and attributes of ocean model outputs and observational products.
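For gridded products delivered as CF-compliant NetCDF, the xarray library can read the file and expose CF attributes directly. The sketch below assumes a hypothetical file and variable name (`dissolved_oxygen.nc`, variable `o2`) and an illustrative bounding box; actual variable, coordinate, and attribute names depend on the product.

```python
# Reading a CF-compliant NetCDF product (hypothetical file and variable names).
import xarray as xr

ds = xr.open_dataset("dissolved_oxygen.nc")  # assumed local copy of a gridded product

o2 = ds["o2"]                                # variable name varies by product
print(o2.attrs.get("standard_name"))         # CF standard name, if present
print(o2.attrs.get("units"))                 # e.g. a mass or molar concentration unit

# Subset to the accounting area and period before any aggregation
subset = o2.sel(
    time=slice("2020-01-01", "2020-12-31"),
    lat=slice(-12.0, -8.0),                  # illustrative bounding box
    lon=slice(140.0, 146.0),
)
annual_mean = subset.mean(dim="time")        # simple temporal aggregation
```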
SDMX (Statistical Data and Metadata eXchange) is the international standard for statistical data and metadata exchange[49]. SDMX provides data structure definitions, code lists, and metadata schemas that enable interoperability between statistical systems. As research data are brought into statistical production, conversion to SDMX-conformant formats facilitates integration with national statistical systems. The SDMX framework "sets standards that can facilitate the exchange of statistical data and metadata using modern information technology"[50].
S-100 Universal Hydrographic Data Model is the data framework maintained by the International Hydrographic Organization (IHO) for hydrographic, maritime, and related geospatial data[51]. S-100 provides a registry-based architecture for defining data products including bathymetry (S-102), surface currents (S-111), and water level information (S-104). For ocean accounting in coastal and maritime zones, S-100-conformant data products offer standardised representations of seabed topography, tidal conditions, and coastal morphology that can support both physical asset accounts and ecosystem extent mapping. Where national hydrographic offices produce S-100 products, these should be considered as a primary geospatial data source alongside remote sensing and survey-based inputs.
3.4.3 Data provenance documentation
Provenance documentation tracks the history of a dataset through its processing chain, enabling users to understand how data have been transformed from raw observations to derived products[52]. For research data used in ocean accounting, provenance should document:
- Original data sources and collection methods
- Processing steps and algorithms applied
- Software and version numbers used
- Personnel responsible for processing
- Dates of processing steps
- Quality control procedures applied
The SEEA Technical Guidance on Biophysical Modelling recommends maintaining "a data provenance system" that "improves users' ability to understand the fitness for purpose of data sets"[53].
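A provenance record covering the elements listed above can be kept as a structured document alongside the dataset. The sketch below uses an assumed JSON-style layout with illustrative field names; it is not a prescribed schema.

```python
import json

# Assumed provenance record layout covering the elements listed above;
# field names and contents are illustrative.
provenance = {
    "source": {
        "dataset": "Quarterly shelf survey, dissolved oxygen",
        "collection_method": "Winkler titration of Niskin bottle samples",
    },
    "processing_steps": [
        {"step": "unit conversion and temperature/salinity correction",
         "software": "in-house Python scripts", "version": "v1.3"},
        {"step": "quality control flagging",
         "software": "in-house Python scripts", "version": "v1.3"},
    ],
    "responsible_party": "Research institute data manager",
    "processing_dates": ["2021-02-10", "2021-02-12"],
    "quality_control": "Range, climatology and spike tests applied",
}

with open("do_provenance.json", "w") as f:
    json.dump(provenance, f, indent=2)
```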
3.5 Compilation Procedure
This section outlines a systematic procedure for assessing, acquiring, and integrating research data into ocean accounting programmes. The procedure consists of four phases: research data assessment, metadata alignment, quality assurance, and account integration.
3.5.1 Phase 1: Research data assessment
The first phase involves identifying candidate research data sources and conducting a preliminary fitness assessment:
Step 1.1: Identify data requirements -- Determine which components of the ocean accounts require research data inputs. This assessment should be guided by the account structure and the availability of alternative data sources. For example, if ecosystem condition accounts for pelagic waters are planned (see TG-6.5 Pelagic and Open Ocean Accounting), identify which condition variables (dissolved oxygen, chlorophyll-a, sea surface temperature) are available from research programmes versus traditional statistical sources.
Step 1.2: Survey available research data -- Conduct a systematic survey of research data sources within the accounting domain. This survey should cover:
- National research institutions (marine laboratories, oceanographic institutes, fisheries research centres)
- International research programmes (GOOS, Argo, regional ocean observing systems)
- Global data repositories (OBIS, GBIF, Copernicus Marine Service)
- Published scientific literature and associated datasets
Step 1.3: Apply integration checklist -- For each candidate data source, complete the integration checklist presented in Table 3.5.1. This checklist draws together the quality, metadata, and institutional considerations discussed in Sections 3.3 and 3.4.
Table 3.5.1: Research Data Integration Checklist
| Integration Criterion | Assessment Questions | Documentation Required |
|---|---|---|
| Spatial coverage | Does it cover the accounting area? | Geographic metadata (bounding box, coordinate system) |
| Temporal alignment | Does it match accounting periods? | Date/time stamps, temporal resolution |
| Methodological consistency | Are methods comparable to official statistics? | Methods documentation, peer-reviewed publications |
| Quality assurance | What QA procedures were applied? | Quality reports, validation studies |
| Institutional access | Can the NSO access/use the data? | Data sharing agreement, licensing terms |
| Classification alignment | Are categories mappable to SEEA/ISIC? | Classification concordance or crosswalk |
| Metadata completeness | Are ISO 19115/FAIR metadata available? | Metadata catalogue entry |
| Reproducibility | Can results be reproduced from documented inputs? | Code repository, processing documentation |
A data source that fails one or more criteria is not necessarily excluded; rather, the assessment identifies areas where additional work is needed (for example, developing classification concordances or negotiating data sharing agreements).
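Where many candidate sources are screened, the checklist in Table 3.5.1 can be applied programmatically so that assessment results are recorded consistently. The sketch below uses an assumed three-way outcome (pass / gap / fail) per criterion and a helper that lists gaps as follow-up actions rather than exclusions; the structure is illustrative, not prescribed.

```python
from dataclasses import dataclass, field

CRITERIA = [
    "Spatial coverage", "Temporal alignment", "Methodological consistency",
    "Quality assurance", "Institutional access", "Classification alignment",
    "Metadata completeness", "Reproducibility",
]

@dataclass
class IntegrationAssessment:
    """Checklist record for one candidate research data source (assumed structure)."""
    source_name: str
    results: dict = field(default_factory=dict)   # criterion -> (outcome, note)

    def record(self, criterion, outcome, note=""):
        assert criterion in CRITERIA and outcome in ("pass", "gap", "fail")
        self.results[criterion] = (outcome, note)

    def follow_up_actions(self):
        """Criteria marked as gaps: not exclusions, but items needing further work."""
        return {c: note for c, (outcome, note) in self.results.items() if outcome == "gap"}

assessment = IntegrationAssessment("National oceanographic survey (hypothetical)")
assessment.record("Spatial coverage", "pass", "Stations cover the accounting area")
assessment.record("Institutional access", "gap", "Data sharing agreement to be negotiated")
print(assessment.follow_up_actions())
```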
3.5.2 Phase 2: Metadata alignment
Once candidate data sources have been identified, the second phase addresses metadata harmonisation:
Step 2.1: Extract existing metadata -- Retrieve available metadata from the research data source. This may exist in ISO 19115 format (for geospatial data), Darwin Core format (for biodiversity observations), or NetCDF-CF format (for oceanographic model outputs).
Step 2.2: Assess metadata completeness -- Compare existing metadata against the requirements for statistical use. The UN NQAF identifies essential metadata elements including: identification (title, abstract, keywords); temporal extent (reference period, temporal resolution, update frequency); spatial extent (geographic coverage, coordinate reference system); data quality (accuracy, completeness, consistency); lineage (data sources, processing steps); distribution (access constraints, usage licences); and contact information (data custodian, responsible party)[54].
Step 2.3: Fill metadata gaps -- Where research data lack statistical metadata elements, work with the data provider to document missing information. Priority gaps include: correspondence to statistical classifications (e.g., mapping research ecosystem types to IUCN GET categories), uncertainty quantification (confidence intervals, accuracy assessments), and update schedules (will data be available on a recurring basis to support time series accounts?).
Step 2.4: Document provenance -- Create or enhance provenance documentation following the framework in Section 3.4.3. For research data that have undergone multiple processing steps (e.g., satellite imagery processed to ocean colour products, acoustic survey data processed to biomass estimates), the provenance chain must be fully documented to support reproducibility.
3.5.3 Phase 3: Quality assurance
The third phase applies the quality assessment framework from Section 3.3:
Step 3.1: Assess relevance -- Verify that the research data address the specific accounting requirements identified in Phase 1. Relevance assessment should consider both conceptual relevance (do the measured variables correspond to the accounting concepts?) and practical relevance (are the data sufficiently timely, granular, and complete for the intended use?).
Step 3.2: Evaluate accuracy -- Assess the accuracy of research data using available validation studies, inter-comparison exercises, or ground-truthing campaigns. For satellite-derived products, consult published accuracy assessments from the product specification or validation reports. For field survey data, review sampling design adequacy and measurement precision. For modelled outputs, assess the model's skill metrics against independent observations.
Step 3.3: Test coherence -- Verify that research data can be combined with other data sources used in the accounts. Coherence testing should identify discrepancies in spatial boundaries, temporal reference periods, or measurement units that require harmonisation (see TG-4.6 Data Harmonisation).
Step 3.4: Check comparability -- Assess whether research data are comparable across time and space. For time series accounts, document any methodological changes that create breaks in comparability. For accounts covering multiple regions, verify that research data use consistent methods across the geographic domain.
Step 3.5: Quantify uncertainty -- Where feasible, quantify the uncertainty associated with research data values. This may take the form of standard errors (for survey-based estimates), confidence intervals (for modelled values), or qualitative uncertainty categories (high/medium/low confidence). Uncertainty estimates should be documented in metadata and, where appropriate, published alongside account values.
3.5.4 Phase 4: Account integration
The final phase integrates quality-assured research data into ocean accounts:
Step 4.1: Apply classification concordances -- Where research data use different classifications from statistical standards, apply the concordances or crosswalks developed in Phase 2. For example, if research biodiversity data use scientific taxonomic names but the account structure requires aggregation to functional groups, apply the taxonomic-to-functional-group mapping.
Step 4.2: Reconcile spatial and temporal boundaries -- Align research data to the spatial and temporal structure of the accounts. This may require spatial aggregation (from fine-resolution survey points to accounting spatial units), temporal aggregation (from monthly observations to annual accounting periods), or gap-filling (interpolating missing values).
Step 4.3: Document data sources -- Record the use of research data in the account compilation metadata. Documentation should identify: the research data source (with citation and persistent identifier), the account components that use the research data, the processing steps applied, and the quality assessment results. This documentation supports transparency and reproducibility.
Step 4.4: Establish update procedures -- Where research data will be used on a recurring basis for time series accounts, establish procedures for data updates. Coordinate with research data providers to understand their publication schedule and arrange for regular data transfers. Monitor for methodological changes that may affect comparability across accounting periods.
3.6 Worked Example: Integrating Oceanographic Survey Data into Condition Accounts
This worked example demonstrates the application of the compilation procedure to a realistic scenario: a National Statistical Office seeking to compile ecosystem condition accounts for coastal shelf waters using dissolved oxygen data from a national oceanographic research programme.
Setting: A coastal nation with 150,000 km² of exclusive economic zone (EEZ) shelf waters (depths <200m) seeks to compile annual condition accounts for the Marine Shelf (M1) ecosystem type following the IUCN Global Ecosystem Typology. One of the selected condition variables is dissolved oxygen concentration, which serves as an indicator of ecosystem health and hypoxia risk. The NSO has identified the National Oceanographic Research Institute (NORI) as a potential data provider.
Phase 1: Research data assessment
Step 1.1: Identify data requirements -- The condition account requires dissolved oxygen measurements representative of the shelf ecosystem. Following SEEA EA guidance, the reference condition is defined as the dissolved oxygen level corresponding to a healthy, well-mixed shelf ecosystem (typically 6-8 mg/L). The account structure requires annual average values aggregated to ecosystem asset spatial units.
Step 1.2: Survey available data -- NORI conducts quarterly oceanographic surveys at 45 fixed stations distributed across the shelf. Each station is sampled at 5 depth intervals (surface, 25m, 50m, 75m, 100m). Dissolved oxygen is measured using calibrated Winkler titration (precision ±0.1 mg/L). The programme has operated continuously since 2010 with consistent methodology. Data are archived in NORI's institutional repository.
Step 1.3: Apply integration checklist -- Applying Table 3.5.1:
| Criterion | Assessment Result | Documentation |
|---|---|---|
| Spatial coverage | 45 stations cover 150,000 km² shelf area; spatial interpolation required | Station coordinates in WGS84 |
| Temporal alignment | Quarterly surveys provide seasonal coverage; annual averaging feasible | Survey dates documented per cruise |
| Methodological consistency | Winkler titration is standard oceanographic method; consistent with international best practice | NORI Standard Operating Procedures manual |
| Quality assurance | Inter-laboratory comparison with national metrology institute every 2 years shows bias <0.15 mg/L | QA reports available 2012, 2014, 2016, 2018, 2020 |
| Institutional access | NORI willing to share data; Memorandum of Understanding required | Draft MoU provided by NORI legal office |
| Classification alignment | Dissolved oxygen is a standard SEEA EA condition variable; no classification mapping required | SEEA EA Table 5.3 (abiotic chemical characteristics) |
| Metadata completeness | Station metadata exist; cruise-level metadata incomplete | ISO 19115 records for stations; cruise metadata to be created |
| Reproducibility | Raw titration data archived; processing code not version-controlled | NORI agrees to deposit processing scripts in GitHub repository |
Finding: The data source meets most integration criteria. Key actions required: negotiate MoU, complete cruise metadata, establish code repository for processing scripts, and develop spatial interpolation approach for station-to-asset aggregation.
Phase 2: Metadata alignment
Step 2.1: Extract existing metadata -- NORI provides station-level metadata in CSV format including: station ID, latitude, longitude, depth, seafloor substrate type, and sampling history. Dissolved oxygen data are provided in a separate CSV with fields: station ID, cruise ID, date, depth, dissolved oxygen (mg/L), temperature (°C), salinity (PSU).
Step 2.2: Assess metadata completeness -- Existing metadata lack: temporal coverage (survey start/end dates per cruise), spatial reference system documentation (assumed WGS84 but not stated), quality flags (outlier values not flagged), and provenance (processing steps from raw titration to reported mg/L not documented).
Step 2.3: Fill metadata gaps -- NSO and NORI jointly develop enhanced metadata including:
- Temporal coverage: Date range for each quarterly cruise added to cruise metadata table
- Spatial reference: Coordinate reference system explicitly documented as WGS84 (EPSG:4326)
- Quality flags: NORI applies automated QC checks (range test, climatology test, spike test) following GOOS recommendations and adds QC flags to the data file (1=good, 2=probably good, 3=probably bad, 4=bad); a minimal illustration of such checks follows this list
- Provenance: Processing workflow documented: raw titration volume → dissolved oxygen calculation using modified Winkler equation → temperature and salinity correction → final value in mg/L
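The range and spike tests referenced in the quality-flag item above can be implemented very simply. The sketch below is a minimal illustration in which the thresholds are assumptions chosen for demonstration, not the GOOS-recommended limits.

```python
def qc_flags(values, valid_min=0.0, valid_max=12.0, spike_threshold=2.0):
    """Assign simple QC flags (1=good, 3=probably bad, 4=bad) to a series.

    Range test: values outside [valid_min, valid_max] are flagged bad.
    Spike test: interior values departing sharply from both neighbours are
    flagged probably bad. Thresholds here are illustrative assumptions.
    """
    n = len(values)
    flags = [1] * n
    # Range test first: out-of-range values are flagged bad
    for i, v in enumerate(values):
        if not (valid_min <= v <= valid_max):
            flags[i] = 4
    # Spike test on interior points whose neighbours passed the range test
    for i in range(1, n - 1):
        if flags[i] == 4 or flags[i - 1] == 4 or flags[i + 1] == 4:
            continue
        spike = abs(values[i] - (values[i - 1] + values[i + 1]) / 2)
        if spike > spike_threshold:
            flags[i] = 3
    return flags

# Dissolved oxygen series (mg/L) with one out-of-range value and one spike
print(qc_flags([6.8, 6.5, 14.2, 6.1, 2.9, 6.3]))  # -> [1, 1, 4, 1, 3, 1]
```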
Step 2.4: Document provenance -- Provenance record created:
Dissolved oxygen values are measured using Winkler titration following GOOS Biogeochemistry Panel recommendations. Seawater samples are collected using Niskin bottles mounted on a CTD rosette. Titration is performed shipboard within 6 hours of collection. Raw titration volumes are converted to dissolved oxygen concentration using the modified Winkler equation with temperature and salinity corrections applied. Processing code (Python) is version-controlled at https://github.com/NORI/oceanography/DO-processing [fictional URL]. Quality control follows GOOS Real-Time Quality Control procedures (GOOS, 2021).
Phase 3: Quality assurance
Step 3.1: Assess relevance -- The dissolved oxygen data directly address the condition account requirement for chemical state characteristics. Quarterly temporal resolution provides adequate seasonal coverage for annual aggregation. Spatial coverage (45 stations across 150,000 km²) is sparser than ideal but sufficient for broad-scale condition assessment given the relatively homogeneous shelf environment.
Step 3.2: Evaluate accuracy -- NORI's QA reports show inter-laboratory comparison results within ±0.15 mg/L of reference standards (typical accuracy requirement is ±0.2 mg/L for oceanographic work). Sampling precision (replicate measurements at same station) averages ±0.08 mg/L. Overall, accuracy is fit for purpose for condition accounting.
Step 3.3: Test coherence -- NSO compares dissolved oxygen data against coastal water quality monitoring data from the environmental protection agency at 12 co-located stations. Average difference is 0.12 mg/L (within measurement uncertainty), confirming coherence.
Step 3.4: Check comparability -- NORI methodology has remained unchanged since programme inception (2010). All data are directly comparable over time. Spatial comparability verified by consistent station locations and sampling protocols.
Step 3.5: Quantify uncertainty -- Based on QA assessment, dissolved oxygen values carry an overall uncertainty of ±0.15 mg/L (combining measurement precision and inter-laboratory comparison). This uncertainty is propagated through spatial interpolation, yielding spatially varying uncertainty estimates for condition index values.
Phase 4: Account integration
Step 4.1: Apply classification concordances -- No classification mapping required; dissolved oxygen is used directly.
Step 4.2: Reconcile spatial and temporal boundaries -- The accounting area is divided into 500 ecosystem asset spatial units (300 km² each) based on seabed substrate type. Dissolved oxygen data from 45 stations are spatially interpolated to the centroid of each asset unit using inverse distance weighting with a search radius of 30 km. Quarterly values are averaged to produce annual mean dissolved oxygen per asset unit. Interpolation uncertainty is quantified using cross-validation (leave-one-out approach).
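The interpolation and cross-validation steps described above can be sketched as follows. This is a minimal illustration on synthetic data, assuming station coordinates have already been projected to metres; the real workflow would use the documented coordinate reference system and the deposited processing scripts, and this is not the NSO's actual code.

```python
import numpy as np

def idw(xy_stations, values, xy_targets, search_radius=30_000.0, power=2.0):
    """Inverse distance weighting within a fixed search radius (coordinates in metres)."""
    out = np.full(len(xy_targets), np.nan)
    for j, t in enumerate(xy_targets):
        d = np.hypot(xy_stations[:, 0] - t[0], xy_stations[:, 1] - t[1])
        inside = d <= search_radius
        if not inside.any():
            continue                               # no station within radius: left as a gap
        w = 1.0 / np.maximum(d[inside], 1.0) ** power
        out[j] = np.sum(w * values[inside]) / np.sum(w)
    return out

def loo_rmse(xy_stations, values, **kwargs):
    """Leave-one-out cross-validation: predict each station from the others."""
    errs = []
    for i in range(len(values)):
        mask = np.arange(len(values)) != i
        pred = idw(xy_stations[mask], values[mask], xy_stations[i:i + 1], **kwargs)[0]
        if not np.isnan(pred):
            errs.append(pred - values[i])
    return float(np.sqrt(np.mean(np.square(errs))))

# Synthetic example: 45 stations with annual-mean dissolved oxygen (mg/L)
rng = np.random.default_rng(0)
stations = rng.uniform(0, 400_000, size=(45, 2))    # metres, illustrative extent
do_values = 6.0 + 0.8 * rng.standard_normal(45)
centroids = rng.uniform(0, 400_000, size=(500, 2))  # asset unit centroids

do_per_unit = idw(stations, do_values, centroids)
print("interpolation RMSE (leave-one-out):", round(loo_rmse(stations, do_values), 2), "mg/L")
```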
Step 4.3: Document data sources -- Account compilation metadata records:
Dissolved oxygen condition data sourced from National Oceanographic Research Institute Quarterly Shelf Survey (NORI-QSS), 2015-2020. Data access via MoU between NSO and NORI dated 2021-03-15. Data processing conducted by NSO Environmental Accounts Unit using scripts deposited at https://github.com/NSO/ocean-accounts/condition-processing [fictional URL]. Spatial interpolation: inverse distance weighting, 30 km search radius. Temporal aggregation: arithmetic mean of quarterly values. Uncertainty: ±0.15 mg/L measurement uncertainty plus ±0.3 mg/L spatial interpolation uncertainty (varies by distance to nearest station).
Step 4.4: Establish update procedures -- NSO and NORI agree that NORI will provide annual data extracts by 31 March each year (covering the previous calendar year). NSO will re-run spatial interpolation and update condition accounts by 30 June. NORI will notify NSO of any methodological changes at least 6 months prior to implementation.
Resulting condition account entry (example for one asset unit):
| Accounting year | Dissolved oxygen (mg/L) | Indicator value | Uncertainty |
|---|---|---|---|
| 2015 | 6.8 | 0.85 (good condition) | ±0.35 mg/L |
| 2016 | 6.5 | 0.81 (good condition) | ±0.33 mg/L |
| 2017 | 5.9 | 0.74 (moderate condition) | ±0.38 mg/L |
| 2018 | 6.2 | 0.78 (good condition) | ±0.36 mg/L |
| 2019 | 5.7 | 0.71 (moderate condition) | ±0.40 mg/L |
| 2020 | 5.4 | 0.68 (moderate condition) | ±0.42 mg/L |
Note: Indicator value calculated as observed DO / reference level, where the reference level = 8.0 mg/L (fully oxygenated shelf water); for example, 6.8 / 8.0 = 0.85 for 2015. The hypoxia threshold of 2.0 mg/L (an indicator value of 0.25) marks the onset of poor condition.
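The indicator column can be reproduced directly from the note. The minimal sketch below assumes class boundaries of 0.75 (good/moderate) and the hypoxia ratio of 0.25 (moderate/poor); these boundaries are illustrative assumptions, since the example only fixes the reference level and hypoxia threshold.

```python
REFERENCE = 8.0          # mg/L, fully oxygenated shelf water (from the note above)
HYPOXIA = 2.0            # mg/L, hypoxia threshold (from the note above)

def condition_indicator(do_mg_l):
    """Indicator value as the ratio of observed DO to the reference level."""
    return round(do_mg_l / REFERENCE, 2)

def condition_class(indicator):
    """Assumed class boundaries for illustration only (not prescribed in the text)."""
    if indicator >= 0.75:
        return "good"
    if indicator > HYPOXIA / REFERENCE:   # above the hypoxia ratio of 0.25
        return "moderate"
    return "poor"

for year, do in [(2015, 6.8), (2016, 6.5), (2017, 5.9),
                 (2018, 6.2), (2019, 5.7), (2020, 5.4)]:
    ind = condition_indicator(do)
    print(year, ind, condition_class(ind))
```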
Outcome: The NSO has successfully integrated research data from NORI into the ecosystem condition account. The dissolved oxygen time series reveals a declining trend from good to moderate condition over 2015-2020, prompting policy attention to potential eutrophication drivers. The documented uncertainty estimates and provenance information support transparent communication of account results and reproducible updates in future accounting periods.
3.7 Institutional Arrangements
Effective integration of research data into official statistics requires institutional arrangements that bridge the different cultures, incentives, and practices of statistical offices and research organisations.
3.7.1 Roles of National Statistical Offices
The SEEA Technical Recommendations identify several roles that NSOs can play in ecosystem accounting that are relevant to research data integration[55]:
- Data organisation: NSOs have expertise in collecting and organising data from diverse sources, building coherent pictures from varied inputs.
- Standards stewardship: NSOs establish and maintain definitions, concepts, and classifications, addressing the multiple definitions common in research contexts.
- Data integration: NSOs integrate data from various sources within national and international statistical frameworks.
- Quality frameworks: NSOs apply data quality frameworks enabling consistent assessment and accreditation of information sources.
- National coverage: NSOs create national pictures, applying techniques for scaling information to national level.
- Authority: NSOs present an authoritative voice through application of standard measurement approaches and quality frameworks.
While NSOs may not have deep expertise in marine science, they bring essential capabilities for transforming research data into official statistics[56].
3.7.2 Roles of research institutions
Research institutions contribute domain expertise, data collection infrastructure, and methodological innovation. The SEEA Technical Recommendations note that "agencies that lead work on geographic and spatial data -- particularly the mapping of environmental data and the use of remote sensing information -- including for spatial and temporal modelling of ecosystem services" play important roles[57]. For ocean accounting, relevant research institutions include:
- Universities with marine science programmes
- National oceanographic and hydrographic agencies
- Fisheries research institutes
- Environmental monitoring agencies
- International research programmes (e.g., IOC, ICES)
These institutions are often the primary custodians of ocean observation data, biodiversity records, and ecosystem assessments needed for ocean accounts.
3.7.3 Establishing partnerships
The Global Statistical Geospatial Framework (GSGF) provides guidance on establishing collaboration between statistical offices and geospatial agencies that is applicable to research partnerships more broadly[58]. Key elements include:
Memoranda of Understanding (MoUs): Formal agreements define respective responsibilities, data sharing arrangements, and collaboration mechanisms. The UN-GGIM provides a template for MoUs between NSOs and National Geospatial Information Agencies[59].
Data sharing agreements: These should specify data to be shared, formats and standards to be used, update frequency, access restrictions, citation requirements, and responsibilities for quality assurance.
Communities of practice: Regular engagement through working groups or committees maintains relationships and addresses emerging issues. The SEEA Technical Recommendations emphasise that "appropriate institutional arrangements and resourcing to support ongoing engagement and communication are also required"[60].
Capacity building: Joint training and skill-sharing activities build mutual understanding between statistical and research communities. Research personnel may need orientation on statistical concepts and quality frameworks; statistical personnel may need training on oceanographic data and methods.
Several countries provide practical models for NSO-research institution partnerships in ocean accounting. In Australia, the Australian Bureau of Statistics (ABS) has collaborated with CSIRO (the Commonwealth Scientific and Industrial Research Organisation) and the Bureau of Meteorology to integrate ocean observation and ecosystem monitoring data into environmental-economic accounts, drawing on CSIRO's marine research infrastructure and the Integrated Marine Observing System (IMOS) for oceanographic data that ABS would not independently collect[61]. In the Netherlands, Statistics Netherlands (CBS) has worked with research institutes including Wageningen University and NIOZ (Royal Netherlands Institute for Sea Research) to compile experimental natural capital accounts for the North Sea, combining statistical data with research-derived biophysical models and monitoring data[61:1]. These examples demonstrate that successful partnerships typically require sustained engagement over multiple accounting cycles, clear allocation of responsibilities for data processing and quality assurance, and mutual recognition that research and statistical institutions bring complementary capabilities.
3.7.4 Data transfer protocols
The SEEA Technical Note on Air Emission Accounts recommends establishing "data transfer protocols" given that "data may be acquired from a number of institutions or agencies"[62]. Such protocols should address:
- Data formats and transmission methods
- Timing and frequency of data provision
- Procedures for handling system changes and upgrades
- Metadata to be provided with each data transfer
- Feedback mechanisms for data quality issues
Establishing robust protocols prevents disruption to statistical production when research systems are upgraded or personnel change.
4. Acknowledgements
This Circular has been approved for public circulation and comment by the GOAP Technical Experts Group in accordance with the Circular Publication Procedure.
Authors: GOAP Secretariat
Reviewers: To be confirmed
5. References
SEEA EA, para. 5.14-5.30. Ecosystem condition accounts record "the quality of an ecosystem" through biophysical and chemical characteristics. ↩︎
Intergovernmental Oceanographic Commission. (2019). The Global Ocean Observing System 2030 Strategy. Paris: UNESCO-IOC. Available from: https://www.goosocean.org/ ↩︎
Simmonds, J., & MacLennan, D.N. (2005). Fisheries Acoustics: Theory and Practice, 2nd ed. Oxford: Blackwell Science. Acoustic surveys provide fishery-independent biomass estimates used in stock assessment. ↩︎
SEEA EA, Table 5.3. The ecosystem condition typology identifies abiotic (physical state, chemical state), biotic (compositional, structural, functional), and landscape characteristics. ↩︎
Hilborn, R., & Walters, C.J. (1992). Quantitative Fisheries Stock Assessment: Choice, Dynamics and Uncertainty. New York: Chapman and Hall. Stock assessment integrates fishery-dependent catch data with fishery-independent survey data. ↩︎
Davies, C.R., et al. (2015). "Close-kin mark-recapture: a new method for estimating population abundance from genetic data." Molecular Ecology, 24(2), 289-300. https://doi.org/10.1111/mec.13011 ↩︎
OBIS. (2025). Ocean Biodiversity Information System. Available from: https://obis.org/ -- GBIF. (2025). Global Biodiversity Information Facility. Available from: https://www.gbif.org/ ↩︎
FDES 2013, para. 1.32-1.33. Scientific research and special projects "can be used to address data gaps" but "often use terms and definitions that differ from those used in statistics." ↩︎
Intergovernmental Oceanographic Commission. (2019). The Global Ocean Observing System 2030 Strategy. Paris: UNESCO-IOC. ↩︎
Roemmich, D., et al. (2019). "On the future of Argo: A global, full-depth, multi-disciplinary array." Frontiers in Marine Science, 6, 439. https://doi.org/10.3389/fmars.2019.00439 ↩︎
SDG Framework. SDG Target 14.a: "Increase scientific knowledge, develop research capacity and transfer marine technology, taking into account the Intergovernmental Oceanographic Commission Criteria and Guidelines on the Transfer of Marine Technology." ↩︎
GOOS. (2023). GOOS Essential Ocean Variables. Available from: https://www.goosocean.org/eov -- Observation coverage is denser in developed country waters and coastal zones. ↩︎
GOOS. (2023). GOOS Essential Ocean Variables. Available from: https://www.goosocean.org/eov ↩︎
IUCN. (2020). IUCN Global Ecosystem Typology 2.0: Descriptive profiles for biomes and ecosystem functional groups. Gland: IUCN. https://doi.org/10.2305/IUCN.CH.2020.13.en ↩︎
Thomsen, P.F., & Willerslev, E. (2015). "Environmental DNA - An emerging tool in conservation for monitoring past and present biodiversity." Biological Conservation, 183, 4-18. ↩︎
GBIF. (2025). Global Biodiversity Information Facility. Available from: https://www.gbif.org/ ↩︎
OBIS. (2025). Ocean Biodiversity Information System. Available from: https://obis.org/ ↩︎
SEEA Technical Recommendations, para. 1.34. Fully spatial approaches "will generally be more resource intensive and implementation will require more ecological and geo-spatial expertise." ↩︎
GCRMN. (2020). Status of Coral Reefs of the World: 2020. Available from: https://gcrmn.net/ ↩︎
SEEA EA, para. 12.15 on tiered approaches to measurement. ↩︎
SEEA Biophysical Modelling, para. 370. "Remote sensing data and modelling approaches provides enormous opportunities to disseminate data with very short time-lags and high-frequency." ↩︎
Copernicus Marine Service. (2025). Available from: https://marine.copernicus.eu/ ↩︎
SEEA Biophysical Modelling, para. 370. ↩︎
SEEA Biophysical Modelling, para. 370. Remote sensing provides "very short time-lags and high-frequency" data but requires validation against in situ measurements. ↩︎
SEEA Biophysical Modelling, para. 363. "The accuracy of modelled data can be assessed, although different approaches may be needed depending on the type of model used." ↩︎
FDES 2013, para. 1.32. Scientific research data "are usually available at no or low cost." ↩︎
FDES 2013, para. 1.33. Research data "often use terms and definitions that differ from those used in statistics", have "limited scope", and are "often available on a one-time basis only." ↩︎
United Nations. (2019). United Nations National Quality Assurance Frameworks Manual for Official Statistics. New York: United Nations Statistics Division. Available from: https://unstats.un.org/unsd/methodology/dataquality/unnqaf-manual/ ↩︎
UN NQAF Manual, para. on relevance (Level D quality dimensions). Relevance is context-dependent and requires user consultation. ↩︎
SEEA Biophysical Modelling, paras. 361-362 on accuracy of input data. ↩︎
SEEA Biophysical Modelling, paras. 363-369 on model validation approaches. ↩︎
GSGF v2, para. on Principle 4. "Interoperability between statistical and geospatial data and metadata standards is needed to overcome structural, semantic, and syntactic barriers." ↩︎
SEEA Biophysical Modelling, para. 373. "Modelling approaches have been rapidly improving...This creates challenges in including data produced from biophysical models into accounts." ↩︎
UN NQAF Manual, para. on accessibility (Level D quality dimensions). Accessibility covers both discoverability and ease of access. ↩︎
Wilkinson, M.D., et al. (2016). "The FAIR Guiding Principles for scientific data management and stewardship." Scientific Data, 3, 160018. https://doi.org/10.1038/sdata.2016.18 ↩︎
SEEA Biophysical Modelling, para. 378. Reproducibility and replicability are distinct but related concepts for assessing scientific rigour. ↩︎
SEEA Biophysical Modelling, para. 378. Transparency of workflow and uncertainty quantification are essential for reproducibility. ↩︎
SEEA Biophysical Modelling, para. 360. Uncertainty matrices outline possible sources of uncertainty for each model. ↩︎
SEEA Biophysical Modelling, para. 369. Model outputs should be seen as best estimates unless detailed validation has been conducted. ↩︎
SEEA Biophysical Modelling, para. 379. "Tier 1 and Tier 2 approaches may be best for awareness raising or analysis of broad spatiotemporal trends." ↩︎
UN NQAF Manual, recommendation on quality reporting. Statistical agencies should publish quality information including accuracy measures. ↩︎
Wilkinson et al. (2016). "The FAIR Guiding Principles." ↩︎
SEEA Biophysical Modelling, para. 372. FAIR principles are "especially important for primary data sources such as statistical offices." ↩︎
SEEA Biophysical Modelling, Table 27 (after para. 372). Definitions of FAIR guiding principles from Wilkinson et al. (2016). ↩︎ ↩︎ ↩︎ ↩︎
ISO 19115-1:2014. Geographic information -- Metadata -- Part 1: Fundamentals. International Organization for Standardization. ↩︎
GSGF v2 and ISO 19115-1:2014. "The ISO 19115 metadata standard broadly provides a conceptual schema on metadata presented as UML diagrams. It provides information about the identification, the extent, the quality, the spatial and temporal aspects, the content, the spatial reference, the portrayal, distribution, and other properties of digital geographic data and services." ↩︎
Darwin Core Maintenance Group. (2021). Darwin Core Quick Reference Guide. Available from: https://dwc.tdwg.org/terms/ ↩︎
CF Conventions Committee. (2023). CF Conventions and Metadata. Available from: https://cfconventions.org/ ↩︎
SDMX. (2021). SDMX 3.0 Technical Standards. Available from: https://sdmx.org/ ↩︎
SDMX. (2021). SDMX 3.0 Technical Standards. "The Statistical Data and Metadata Exchange (SDMX) initiative sets standards that can facilitate the exchange of statistical data and metadata using modern information technology." ↩︎
International Hydrographic Organization. (2022). S-100 Universal Hydrographic Data Model. Edition 5.1.0. Monaco: IHO. Available from: https://iho.int/en/s-100-universal-hydrographic-data-model ↩︎
SEEA Biophysical Modelling, para. 374 on data provenance systems. ↩︎
SEEA Biophysical Modelling, para. 374. Data provenance systems "improve users' ability to understand the fitness for purpose of data sets." ↩︎
UN NQAF Manual, Level D (Managing statistical outputs). Essential metadata elements for statistical dissemination. ↩︎
SEEA Technical Recommendations, Box 1.2. Potential roles of National Statistical Offices in ecosystem accounting. ↩︎
SEEA Technical Recommendations, paras. 1.60-1.61 on roles of NSOs and non-NSO agencies. ↩︎
SEEA Technical Recommendations, para. 1.61. Agencies with geospatial and remote sensing expertise play important roles. ↩︎
GSGF v2, Section on Principle 1. "Establishing strong communication and institutional collaboration mechanisms between NSOs and NGIAs is essential. This can be facilitated by, for example, country-level laws and policies, Memorandum of Understandings (MoUs), data sharing agreements, and other communities of practice." ↩︎
UN-GGIM. MoU template for joint work between the NSO and NGIAs. Available from: https://ggim.un.org/ ↩︎
SEEA Technical Recommendations, para. 1.57. "Given the need for involving many areas of expertise, an important aspect of implementation is the allocation of resources to co-ordination, data sharing and communication." ↩︎
SEEA Technical Recommendations, paras. 1.55-1.62 on institutional arrangements for ecosystem accounting, including examples of multi-agency collaboration models. ↩︎ ↩︎
SEEA Technical Note: Air Emission Accounts, para. 81. "Given that data may be acquired from a number of institutions or agencies, it is important to establish data transfer protocols." ↩︎