National Data Coordination Architectures
1. Outcome
After completing this Circular, practitioners will be able to design and establish national data coordination architectures for ocean accounting, including the data-sharing arrangements, technical working groups, and coordination mechanisms required to bring together diverse ocean data streams into coherent accounting frameworks. The Circular addresses how national statistical offices (NSOs) and ocean-related agencies can collaborate effectively through formal data sharing agreements, joint technical committees, and shared data platforms to produce and maintain ocean accounts.
The scope of this Circular is deliberately focused on data-specific coordination: the memoranda of understanding (MOUs) that govern data exchange between agencies, the technical working groups that resolve methodological questions about data integration, and the platform architectures that enable shared access to accounting data. Broader questions of governance design -- including how ocean accounts are embedded within national planning frameworks, inter-ministerial coordination models, and strategies for building domestic champions -- are addressed in TG-1.10 OA and National Planning Integration.
Ocean accounting is inherently multi-institutional. As the SEEA Ecosystem Accounting framework recognises, "given the high degree to which ecosystem accounting is cross-cutting and spatial in nature, implementation necessitates a highly collaborative approach and the active participation of representatives of many different agencies and disciplines"[1]. The data coordination architectures described in this Circular provide the institutional infrastructure through which this collaboration is realised at the operational level.
This Circular complements TG-0.8 Implementation Readiness Assessment, which provides the broader context for evaluating a country's institutional, data, and human capacity prerequisites for ocean accounting. Where TG-0.8 helps countries assess their readiness across all dimensions of implementation, the present Circular provides detailed operational guidance on establishing and managing the data coordination mechanisms that readiness assessment identifies as necessary.
The data harmonisation standards and interoperability protocols described in TG-4.6 Data Harmonisation and Interoperability provide the technical foundation upon which the coordination architectures in this Circular are built. While TG-4.6 addresses what standards and formats to use, TG-4.7 addresses how agencies organise themselves to apply those standards collaboratively.
2. Requirements
Essential prerequisites:
- TG-0.1 General Introduction to Ocean Accounts -- for the conceptual framework and key components of Ocean Accounts, including the relationship between environmental and economic accounting frameworks that data coordination must bridge.
- TG-4.6 Data Harmonisation and Interoperability -- for the data standards, exchange formats, and interoperability protocols that coordination architectures operationalise.
Helpful background:
- TG-0.7 Quality Assurance Principles -- for the quality dimensions and assessment frameworks that data coordination arrangements must uphold, particularly the institutional environment dimension.
This Circular addresses the institutional infrastructure for data coordination across all ocean accounting domains. Data coordination arrangements support every edge in the Ocean Accounts Framework (TG-0.1 Figure 0.1.2), as each connection between accounting components requires data flows from multiple custodian agencies. The coordination mechanisms described here are prerequisites for the data harmonisation workflows described in TG-4.6.
3. Guidance Material
Ocean accounts draw upon data from national statistical offices (economic statistics, national accounts), environmental agencies (ecosystem monitoring, biodiversity surveys), hydrographic organisations (bathymetry, tides, currents), fisheries management authorities (catch data, stock assessments), maritime administrations (vessel tracking, port statistics), and research institutions (earth observation, oceanographic surveys). Each of these agencies operates within its own legislative framework, with distinct mandates, data collection cycles, confidentiality requirements, and institutional cultures. The fundamental challenge for data coordination is to establish durable institutional mechanisms that enable these agencies to share data systematically while respecting their individual mandates and constraints.
The Global Statistical Geospatial Framework (GSGF) identifies five key elements that support implementation of its principles: governance and institutional capacity, policy and legal, human resources and capability, data and interoperability, and technology and infrastructure[2]. These elements provide a useful analytical lens for ocean accounting data coordination, recognising that effective data sharing requires not only technical standards but also institutional arrangements, legal frameworks, and skilled personnel.
This section examines six dimensions of national data coordination architecture: committee models and mandates; data sharing agreements; trust frameworks; integrated platform design; modular implementation strategies; and time-series data investment planning. Together, these elements form the institutional infrastructure for sustained ocean accounting data production.
Figure 4.7.1: National data coordination architecture for ocean accounts[3]
3.1 National Data Coordination Committee Models and Mandates
The establishment of a dedicated data coordination committee is the foundational institutional step for ocean accounting. This committee provides the governance structure through which agencies negotiate data sharing arrangements, resolve methodological disagreements, and coordinate production schedules. The committee's design should reflect the country's existing institutional landscape and statistical system architecture, building upon rather than duplicating existing coordination mechanisms.
The SEEA EA emphasises that "work is needed to fulfil a key objective, namely, the appropriate institutionalization of the processes (including data sharing), roles and responsibilities underpinning the compilation of ecosystem accounts"[4]. For ocean accounting specifically, this institutionalisation requires a committee structure that brings together data custodians from across the statistical, environmental, and marine science domains.
Three committee models are commonly observed in environmental-economic accounting programmes, each with distinct strengths and limitations:
Model A: NSO-led steering committee. The national statistical office establishes and chairs the committee, drawing authority from its statistical legislation mandate. This model benefits from the NSO's institutional independence and established relationships with government agencies. The NSO's role as data steward -- as described in SEEA EA, where NSOs have shifted "from functioning solely as statistics producers to acting also as service providers, which entails both facilitating a collaborative approach to data and statistics across different data and statistics communities and providing oversight and governance"[5] -- provides a natural basis for convening data-sharing discussions. However, this model may underweight the perspectives of environmental and marine science agencies whose data are essential inputs. It works best where the NSO has an existing environmental accounting programme and established relationships with marine data providers.
Model B: Multi-agency technical committee. A joint committee is established with rotating or co-chair arrangements between the NSO and a lead ocean/environmental agency. This model distributes authority more evenly and may generate stronger buy-in from agencies that see themselves as equal partners rather than data suppliers to the statistical office. The model requires clear terms of reference to avoid ambiguity about decision-making authority, particularly when methodological disagreements arise. It works best in countries with strong inter-agency collaboration traditions and where no single agency has dominant authority over ocean data.
Model C: Nested committee structure. An overarching ocean data coordination committee establishes domain-specific sub-committees for distinct data streams (e.g., fisheries data, ecosystem condition data, marine economic data). Each sub-committee handles the technical details of data sharing within its domain, while the overarching committee ensures cross-domain consistency and resolves inter-domain conflicts. This model scales well to large national statistical systems with many data custodians but introduces coordination overhead. It works best in countries with complex institutional landscapes where ocean data are dispersed across many agencies.
Table 3.1.1: Coordination committee model selection matrix
| Factor | Model A (NSO-led) | Model B (Multi-agency) | Model C (Nested) |
|---|---|---|---|
| Number of data custodian agencies | Few (3--5) | Moderate (4--8) | Many (8+) |
| NSO capacity for environmental data | Strong | Moderate | Variable |
| Existing inter-agency mechanisms | Weak or none | Moderate | Strong |
| Legislative backing | Statistical Act | MOU or executive order | Framework legislation |
| Decision-making speed | Faster | Moderate | Slower |
| Agency buy-in | Lower for non-NSO agencies | Higher | Highest |
| Coordination overhead | Low | Moderate | High |
Regardless of model, the committee mandate should specify at minimum: the scope of data coordination (which data streams and account types are covered); the authority to request data from member agencies; the decision-making process for resolving methodological disputes; the production schedule and reporting obligations; and the review cycle for data sharing arrangements. The mandate should also clarify the committee's relationship to any existing national statistics coordination bodies to avoid duplication.
Committee membership should include representatives with both subject-matter expertise and institutional authority to commit their agencies to data sharing arrangements. A common failure mode is to populate committees with technical staff who lack authority to agree to data release, or conversely with senior officials who lack the technical knowledge to resolve data integration questions. Effective committees typically include both levels, with a senior sponsoring official from each agency and a designated technical focal point.
The GSGF recommends that countries "scope the data sharing landscape in place at country level to support discussions and establish the needs for new arrangements"[6]. Before establishing new committee structures, countries should inventory existing data coordination mechanisms -- such as inter-agency statistics committees, environmental data networks, or marine science consortia -- and assess whether ocean accounting coordination can be accommodated within or alongside these existing structures. Creating entirely new coordination bodies when existing mechanisms can be adapted risks institutional fatigue and competition for senior attention.
3.2 NSO-Ocean Agency Data Sharing Agreements
Formal data sharing agreements provide the legal and operational foundation for regular, systematic exchange of data between agencies. Unlike ad hoc data requests -- which depend on personal relationships, are vulnerable to staff turnover, and typically deliver data in inconsistent formats -- formal agreements establish ongoing obligations, standard formats, and quality expectations that support sustained accounting production.
The GSGF notes that "establishing strong communication and institutional collaboration mechanisms between NSOs and NGIAs is essential" and that "this can be facilitated by, for example, country-level laws and policies, Memorandum of Understandings (MoUs), data sharing agreements, and other communities of practice"[7]. The UN Expert Group on the Integration of Statistical and Geospatial Information has developed a template MOU for joint work between NSOs and national geospatial information authorities that provides a useful starting point for ocean accounting data sharing agreements[8].
Data sharing agreements for ocean accounting should address the following elements:
Scope and content. The agreement should specify precisely which data variables are to be shared, at what level of disaggregation, and covering which geographic areas and time periods. For ocean accounting, this typically involves specifying the statistical unit (e.g., individual vessel, fishing zone, port, ecosystem asset), the variables (e.g., catch weight, landing value, ecosystem condition indicator), and the classification system to be applied (e.g., ISIC Rev.5 for economic activities, FAO ASFIS codes for species, IUCN GET for ecosystem types). Reference to the data harmonisation standards in TG-4.6 Data Harmonisation and Interoperability should be included to establish shared expectations about exchange formats.
Frequency and timeliness. The agreement should specify the data delivery schedule aligned with the ocean accounting production calendar. This includes the reference period (e.g., calendar year), the expected delivery date (e.g., T+6 months for preliminary data, T+18 months for final data), and any interim or provisional data releases. Aligning delivery schedules across multiple agencies is one of the most persistent operational challenges in environmental-economic accounting, as different agencies operate on different production cycles.
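The delivery-schedule convention above can be made concrete in a small sketch. The function below computes preliminary and final delivery deadlines counted from the end of a calendar-year reference period; the T+6 and T+18 month lags follow the example in the text, while the function names and the month-end clamping rule are illustrative assumptions, not a prescribed method.

```python
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    """Return the date `months` later, clamping the day to the month's length."""
    total = d.month - 1 + months
    year, month = d.year + total // 12, total % 12 + 1
    return date(year, month, min(d.day, calendar.monthrange(year, month)[1]))

def delivery_deadlines(reference_year: int,
                       preliminary_lag: int = 6, final_lag: int = 18) -> dict:
    """Deadlines counted from the end of the reference period (T)."""
    t = date(reference_year, 12, 31)  # T = end of the calendar-year reference period
    return {"preliminary": add_months(t, preliminary_lag),
            "final": add_months(t, final_lag)}
```

For the 2024 reference year this yields a preliminary deadline of 30 June 2025 and a final deadline of 30 June 2026, which the agreement's production calendar would then record for each contributing agency.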
Quality standards. The agreement should reference the quality assurance framework in TG-0.7 Quality Assurance Principles and specify the quality documentation expected to accompany each data delivery. At minimum, this should include metadata describing data sources, coverage, estimation methods, and known limitations. Where data are subject to revision, the agreement should specify revision policies and notification procedures.
Confidentiality and access controls. Many ocean data sources contain commercially sensitive information (vessel-level catch data, company-level economic data) or are subject to statistical confidentiality legislation. The agreement must specify how confidentiality is maintained when data are shared between agencies, including access controls, aggregation rules, and permitted uses. Statistical confidentiality requirements may necessitate the provision of aggregated rather than unit-record data, with implications for the level of detail achievable in ocean accounts.
Format and transmission. The agreement should specify the exchange format (referencing SDMX, GeoJSON, or other standards as appropriate from TG-4.6), the transmission method (secure file transfer, API access, shared platform), and the metadata standards to be applied (ISO 19115 for geospatial metadata, SDMX structural metadata for statistical data).
Dispute resolution and review. The agreement should include procedures for resolving disagreements about data quality, timeliness, or interpretation, and should specify a review cycle (typically every two to three years) to ensure the agreement remains fit for purpose as accounting requirements evolve.
Table 3.2.1: Data sharing agreement checklist
| Element | Key Questions | Reference |
|---|---|---|
| Scope | What variables, units, areas, periods? | Account-specific requirements |
| Classifications | Which classification versions and concordances? | TG-4.6 Section 3.4 |
| Frequency | Annual, quarterly, ad hoc? Delivery timeline? | Production calendar |
| Quality | What metadata and documentation accompany data? | TG-0.7 |
| Confidentiality | What access controls, aggregation rules? | Statistical legislation |
| Format | SDMX-CSV, GeoJSON, custom? | TG-4.6 Section 3.2 |
| Transmission | Secure file transfer, API, shared platform? | IT security requirements |
| Review | How often is the agreement updated? | Typically 2--3 year cycle |
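The checklist in Table 3.2.1 can also be held as a machine-readable template, so that draft agreements are checked for completeness before negotiation concludes. The sketch below is illustrative: the element names mirror the table, but the record structure and example content are assumptions rather than a prescribed schema.

```python
# Checklist elements from Table 3.2.1, in the order they appear.
REQUIRED_ELEMENTS = ["scope", "classifications", "frequency", "quality",
                     "confidentiality", "format", "transmission", "review"]

def missing_elements(agreement: dict) -> list:
    """Return checklist elements that are absent or left empty in a draft."""
    return [e for e in REQUIRED_ELEMENTS if not agreement.get(e)]

# A partially drafted agreement (content is illustrative).
draft = {
    "scope": "Annual catch weight and landing value by fishing zone",
    "classifications": "FAO ASFIS species codes; ISIC Rev.5 activities",
    "frequency": "Annual; T+6 months preliminary, T+18 months final",
    "format": "SDMX-CSV with ISO 19115 geospatial metadata",
}
print(missing_elements(draft))  # → ['quality', 'confidentiality', 'transmission', 'review']
```

A committee secretariat could run such a check on every draft before it is tabled, ensuring no element of the checklist is silently omitted.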
Countries at early stages of ocean accounting may find it pragmatic to begin with lightweight data sharing arrangements -- such as a letter of intent or an annex to an existing inter-agency agreement -- rather than negotiating comprehensive MOUs from the outset. Initial arrangements can be formalised as the accounting programme matures and data requirements become better defined. The key requirement is that arrangements be documented and institutional rather than dependent on individual relationships.
3.3 Trust Frameworks for Data Sharing
Trust is the essential precondition for sustained data sharing between agencies. Even where formal agreements exist, effective data sharing requires that data custodians trust that their data will be used appropriately, that their contributions will be acknowledged, that quality concerns will be addressed constructively, and that shared data will not be used in ways that undermine their agency's mandate or reputation. Trust frameworks formalise these expectations and provide mechanisms for building and maintaining inter-agency confidence over time.
The GSGF identifies four dimensions of interoperability derived from the European Interoperability Framework, of which organisational and legal interoperability are directly relevant to trust[9]. Organisational interoperability concerns the alignment of business processes and responsibilities between agencies, while legal interoperability ensures that agencies operating under different legislative frameworks can share data lawfully.
Trust frameworks for ocean accounting data sharing should address four dimensions:
Data provenance and attribution. Agencies are more willing to share data when they are confident that their role as data custodians will be acknowledged in published accounts. Trust frameworks should specify how source agencies are credited in published ocean accounts, including acknowledgement in methodology documentation, inclusion in data provenance metadata, and recognition in public communications. The SEEA EA notes the importance of NSOs providing "oversight and governance through provision of an independent and expert opinion on data"[10], and trust frameworks should clarify how this oversight role interacts with the custodian role of contributing agencies.
Quality feedback loops. Data sharing generates opportunities for quality improvement when receiving agencies identify anomalies, inconsistencies, or gaps in source data. Trust frameworks should establish constructive feedback channels through which the compiling agency can communicate quality concerns to source agencies without undermining confidence or creating adversarial dynamics. Framing quality feedback as a mutual benefit -- where integration reveals patterns invisible to individual agencies -- rather than as criticism encourages continued participation. Technical working groups (see Section 3.1) provide a natural forum for these discussions.
Graduated access models. Not all data need to be shared at the same level of detail or with the same access conditions. Trust frameworks can establish graduated access tiers that allow agencies to share aggregated data broadly while restricting access to unit-record data to authorised compilers under specific conditions. This approach allows data sharing to proceed where full disclosure of microdata is not feasible, while preserving the option for deeper integration as trust develops over time. Graduated access may be structured as follows:
Table 3.3.1: Graduated data access tiers
| Tier | Access Level | Typical Content | Governance |
|---|---|---|---|
| Public | Open access | Published aggregates, indicators | Standard open data licence |
| Statistical | Authorised researchers | Disaggregated tables, subnational data | Data sharing agreement |
| Restricted | Designated compilers only | Unit-record data, confidential microdata | MOU with security protocols |
| Embargoed | Compiling agency only | Pre-release data, provisional estimates | Time-limited embargo clause |
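The tiers in Table 3.3.1 translate directly into a simple clearance check, which a shared platform can enforce as role-based access control. The sketch below assumes hypothetical role names; the ordering of tiers follows the table.

```python
# Tier ranks follow Table 3.3.1, from most open to most restricted.
TIER_RANK = {"public": 0, "statistical": 1, "restricted": 2, "embargoed": 3}

# Hypothetical roles mapped to the highest tier they may access.
ROLE_CLEARANCE = {
    "anonymous": "public",
    "authorised_researcher": "statistical",
    "designated_compiler": "restricted",
    "compiling_agency": "embargoed",
}

def can_access(role: str, dataset_tier: str) -> bool:
    """A role may access any dataset at or below its clearance tier."""
    return TIER_RANK[ROLE_CLEARANCE[role]] >= TIER_RANK[dataset_tier]
```

Under this rule an authorised researcher can retrieve published aggregates and disaggregated statistical tables, but requests for unit-record data are refused unless the role is upgraded under a separate MOU.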
Mutual benefit demonstration. Trust is sustained when all participating agencies derive benefit from the data coordination arrangement. Trust frameworks should identify and communicate the specific benefits that each agency receives from participation -- whether improved data quality through cross-source validation, access to integrated datasets that support their own analytical needs, or enhanced visibility through association with a high-profile national accounting programme. Countries that have successfully sustained inter-agency data sharing for environmental-economic accounting often point to reciprocal data flows as a critical success factor: the NSO provides economic context data to environmental agencies, while environmental agencies provide ecosystem monitoring data to the NSO.
Building trust takes time and requires sustained investment in relationship management. The SEEA EA acknowledges that "traditionally, NSOs have less experience working with those types of environmental data" and that "collaboration with environmental policy and associated technical research agencies in the development of ecosystem accounts should be expected"[11]. Countries should anticipate a trust-building period of one to two years during which data sharing arrangements may operate informally or at limited scope before more comprehensive formal agreements are feasible.
3.4 Integrated Platform Architectures
Integrated data platforms provide the technical infrastructure through which agencies share, access, and manage ocean accounting data. Platform design choices have significant implications for the sustainability, scalability, and accessibility of ocean accounting data coordination. The platform architecture should align with the interoperability standards described in TG-4.6 Data Harmonisation and Interoperability and the quality assurance requirements in TG-0.7 Quality Assurance Principles.
The GSGF recommends that organisations "host appropriate technical infrastructures which support the use of the relevant standards where systems and services are linked through standard interfaces, services, and data formats"[12]. For ocean accounting, this translates to three broad platform architecture options:
Centralised repository. A single platform, typically hosted by the NSO or a designated data management agency, receives, stores, and manages all ocean accounting data. Contributing agencies upload data to the repository according to agreed schedules and formats. The compiling team accesses all data through the central platform. This architecture simplifies data management, ensures version control, and provides a single point of access for compilation. However, it requires significant hosting infrastructure, places the maintenance burden on a single agency, and may raise concerns about data ownership among contributing agencies. Centralised repositories work well for small-to-medium scale ocean accounting programmes where the NSO has strong IT capacity.
Federated data network. Each agency maintains its own data systems and exposes relevant datasets through standardised APIs (such as SDMX REST services or OGC web services). The ocean accounting compilation system queries these distributed data sources and integrates responses into accounting tables. This architecture respects agency autonomy over their data systems, distributes the infrastructure burden, and ensures that agencies always provide the most current version of their data. However, it requires that all participating agencies maintain API endpoints conforming to common standards, introduces dependency on multiple agencies' system availability, and may complicate version management. Federated networks work well where agencies have mature data management systems and strong IT capacity. The SDMX registry architecture, with its support for "automated processing" through subscription and notification services[13], provides a model for federated data access in the statistical domain.
Hybrid approach. The most common architecture in practice combines elements of both models. Core accounting data are maintained in a central repository, while large or frequently updated datasets (e.g., satellite imagery, real-time environmental monitoring) are accessed through federated services. This approach balances the governance benefits of centralisation with the flexibility and scalability of federation. The hybrid model allows agencies to contribute data through whichever channel best suits their capacity: agencies with mature API infrastructure expose data through web services, while agencies with limited IT capacity upload data files to the central repository.
Table 3.4.1: Platform architecture comparison
| Criterion | Centralised | Federated | Hybrid |
|---|---|---|---|
| Data governance | Single point of control | Distributed governance | Mixed |
| Infrastructure cost | Concentrated in one agency | Distributed across agencies | Distributed with central core |
| Technical prerequisite | Central repository + upload mechanisms | API endpoints at each agency | Central repository + selective APIs |
| Version control | Straightforward | Requires coordination | Moderate complexity |
| Data currency | Depends on upload frequency | Always current at source | Mixed |
| Scalability | Limited by central capacity | Scales with network | Scales selectively |
| Best suited to | Small--medium programmes | Mature IT environments | Most national contexts |
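To make the federated option concrete: under the SDMX 2.1 REST convention, a compilation system addresses each agency's endpoint through a standard data-query URL. The sketch below only constructs such a URL (no request is sent); the host, agency, and dataflow identifiers are hypothetical, while the `/data/{flowRef}/{key}` path pattern and `startPeriod`/`endPeriod` parameters follow the SDMX REST convention.

```python
def sdmx_data_url(host: str, flow_ref: str, key: str,
                  start: str, end: str) -> str:
    """Build an SDMX 2.1 REST data-query URL (no request is performed here)."""
    return (f"https://{host}/rest/data/{flow_ref}/{key}"
            f"?startPeriod={start}&endPeriod={end}&format=csv")

# Hypothetical query: annual tuna catch for one fishing zone, 2020-2024.
url = sdmx_data_url("fisheries.example.gov", "FIS,DF_CATCH,1.0",
                    "A.TUNA.ZONE01", "2020", "2024")
```

In a hybrid architecture, the compilation system would issue such queries to agencies with API infrastructure while reading file uploads from the central repository for the rest.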
Regardless of architecture, the platform should support the following functional requirements:
- Metadata management: Storage and retrieval of data provenance, quality documentation, and structural metadata conforming to SDMX and ISO 19115 standards
- Version control: Tracking of data revisions with timestamped snapshots enabling reproduction of previously published accounts
- Access control: Role-based access consistent with the graduated access tiers described in Section 3.3
- Data validation: Automated quality checks at data ingestion, implementing the validation procedures described in TG-0.7 Quality Assurance Principles
- Audit trail: Logging of data access, modifications, and exports for accountability
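Three of these requirements -- validation at ingestion, versioned snapshots, and an audit trail -- can be sketched together in a few lines. This is a minimal illustration of the pattern, not a prescribed implementation: the field names and the mandatory-field check are assumptions.

```python
import hashlib
import json
from datetime import datetime, timezone

audit_log: list = []   # audit trail: who supplied what, and when
snapshots: dict = {}   # version control: content-addressed data snapshots

def ingest(agency: str, records: list) -> str:
    """Validate incoming records, store a versioned snapshot, and log the event."""
    # Data validation: reject records missing mandatory fields (illustrative rule).
    for r in records:
        if not {"period", "value"} <= r.keys():
            raise ValueError(f"record missing mandatory fields: {r}")
    # Version control: a content hash identifies each snapshot reproducibly.
    payload = json.dumps(records, sort_keys=True).encode()
    version = hashlib.sha256(payload).hexdigest()[:12]
    snapshots[version] = records
    # Audit trail: timestamped record of the delivery.
    audit_log.append({"agency": agency, "version": version,
                      "received": datetime.now(timezone.utc).isoformat()})
    return version

v = ingest("fisheries_agency", [{"period": "2024", "value": 1520.5}])
```

Because the version identifier is derived from the content itself, republishing a previously compiled account only requires retrieving the snapshot recorded in that account's metadata.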
Platform investment should be proportionate to the scale of the ocean accounting programme. Countries at early stages may begin with a shared file system or spreadsheet-based workflow, progressing to database-backed platforms and API integration as the programme matures and data volumes grow. The critical requirement is not technological sophistication but rather that data management procedures are documented, reproducible, and not dependent on a single individual's knowledge.
3.5 Modular Implementation Strategies
Given the institutional complexity of multi-agency data coordination, a modular implementation strategy allows countries to establish coordination mechanisms incrementally, building upon initial successes rather than attempting comprehensive coordination from the outset. Modular implementation recognises that different data streams present different coordination challenges and that agencies vary in their readiness to participate in formal data sharing arrangements.
The SEEA EA describes a spectrum of compilation approaches, noting that "in practice, the approach to compilation of ecosystem accounts lies between these two ends of the spectrum, with implementation being dependent on (a) policy focus; (b) availability of source data; and (c) resources available for compilation"[14]. This same pragmatism should guide data coordination: begin with the data streams that are most readily available and where coordination relationships are strongest, then extend to more challenging domains as institutional capacity develops.
A modular strategy typically proceeds through three phases:
Phase 1: Anchor partnership. Establish a data sharing arrangement between two agencies for a single, well-defined data stream. The most productive anchor partnerships typically involve the NSO and a fisheries management authority (where catch and economic data are relatively well-structured) or the NSO and an environmental monitoring agency (where ecosystem extent data from remote sensing are readily available). The anchor partnership serves as a proof of concept, demonstrating the feasibility and value of formal data coordination and providing a template for subsequent agreements. During this phase, the coordination mechanism may be informal -- a working-level agreement between technical staff -- with formal MOU development deferred until the collaboration has demonstrated its value.
Phase 2: Network expansion. Building on the anchor partnership, extend data coordination to additional agencies and data streams. Each new partnership can draw upon the templates, workflows, and lessons learned from Phase 1. Priority for expansion should be guided by the accounting priorities identified through the implementation readiness assessment (see TG-0.8 Implementation Readiness Assessment). During this phase, the coordination committee (Section 3.1) is formalised, data sharing agreements (Section 3.2) are documented, and platform requirements (Section 3.4) are assessed. The network should aim to establish stable data flows for the minimum set of account types identified as priorities.
Phase 3: System consolidation. With multiple bilateral data sharing arrangements in place, the coordination architecture can be consolidated into a coherent system. This involves standardising data sharing agreements across agencies, establishing a shared platform architecture, formalising quality assurance procedures, and embedding the coordination arrangements in institutional mandates and budgets. During this phase, the focus shifts from establishing new partnerships to optimising existing ones: improving data timeliness, increasing spatial or temporal resolution, and extending time series.
Table 3.5.1: Modular implementation pathway
| Phase | Duration | Activities | Outputs |
|---|---|---|---|
| 1. Anchor partnership | 6--12 months | Identify lead agencies; negotiate initial data sharing; compile pilot account | Working data exchange; pilot account; lessons learned |
| 2. Network expansion | 12--24 months | Extend to additional agencies; formalise committee; document agreements | Multiple data sharing MOUs; formal committee terms of reference; expanded account coverage |
| 3. System consolidation | 24--36 months | Standardise agreements; deploy shared platform; embed in budgets | Operational coordination system; regular production cycle; sustained institutional commitment |
Countries should resist the temptation to proceed directly to Phase 3 without the experiential learning provided by Phases 1 and 2. Data coordination arrangements that are designed comprehensively on paper but lack the foundation of practical inter-agency collaboration often fail to achieve sustained data flows. The modular approach ensures that institutional relationships, technical workflows, and governance mechanisms are tested and refined incrementally.
Risk management is important throughout all phases. Common risks include loss of key personnel (mitigated by documenting procedures and distributing knowledge across multiple staff), changes in agency leadership or priorities (mitigated by embedding coordination in formal mandates rather than personal commitments), and budget constraints (mitigated by demonstrating the value of coordination to decision-makers through early publication of useful accounting outputs). Countries should identify these risks explicitly in their implementation plans and develop mitigation strategies as part of the coordination committee's work programme.
3.6 Time-Series Data Investment Planning
Ocean accounts are most valuable when they provide consistent time-series data that reveal trends and support analysis of policy effectiveness. The SEEA EA emphasises that "ecosystem accounts are most informative when they are not compiled as one-off, irregular or short-term studies" and that "progressively, long time series of ecosystem accounting data can be established"[15]. Achieving this requires deliberate investment in data coordination arrangements that sustain data flows over time, rather than project-based arrangements that lapse when initial funding expires.
Time-series data investment planning addresses three challenges: ensuring continuity of data supply from contributing agencies; managing methodological changes that affect comparability over time; and building the institutional memory needed to maintain consistent production practices.
Continuity of data supply. Data coordination arrangements should be designed for permanence rather than project duration. This means embedding data sharing obligations in agencies' recurrent work programmes and budgets rather than relying on project funding. The coordination committee should maintain a data supply register that documents, for each data stream, the custodian agency, the legal basis for data collection, the funding source, and any known risks to continuity. Where data streams are at risk -- for example, because a monitoring programme faces budget cuts -- the committee can advocate for continued investment by demonstrating how the data contribute to multiple accounting and policy uses.
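The data supply register described above can be sketched as a simple structured record. The following is a minimal illustration only; the field names, example entries, and the `at_risk` helper are hypothetical, not a prescribed schema:

```python
from dataclasses import dataclass, field

@dataclass
class DataSupplyEntry:
    """One row of the data supply register (illustrative fields only)."""
    data_stream: str        # e.g. "commercial fisheries catch"
    custodian_agency: str   # agency responsible for the data
    legal_basis: str        # statute or mandate under which data are collected
    funding_source: str     # e.g. "recurrent budget", "project grant"
    continuity_risks: list[str] = field(default_factory=list)

def at_risk(entry: DataSupplyEntry) -> bool:
    """Flag entries the coordination committee should prioritise:
    streams outside recurrent budgets, or with recorded risks."""
    return bool(entry.continuity_risks) or entry.funding_source != "recurrent budget"

register = [
    DataSupplyEntry("fisheries catch", "Fisheries Agency",
                    "Fisheries Act s.12", "recurrent budget"),
    DataSupplyEntry("coastal water quality", "Environment Agency",
                    "environmental monitoring mandate", "project grant",
                    ["monitoring programme funding ends 2026"]),
]

# Streams the committee should advocate for first.
priority = [e.data_stream for e in register if at_risk(e)]
```

Even a spreadsheet with these columns serves the purpose; the point is that continuity risks are recorded systematically rather than discovered when a data stream lapses.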
Countries should also identify opportunities to align ocean accounting data requirements with existing data collection mandates. Fisheries agencies, for instance, are typically required by law to collect catch data for stock management purposes; ocean accounting coordination leverages this existing obligation rather than creating new data collection burdens. Similarly, environmental agencies may be required to report on ecosystem condition for international conventions (such as the Convention on Biological Diversity or the Ramsar Convention on Wetlands), and ocean accounting can draw upon these reporting data streams. Where alignment exists, data sharing agreements should reference the existing legal mandate to strengthen the case for continued data supply.
Managing methodological change. Over the lifetime of an ocean accounting programme, contributing agencies will revise their data collection methods, classification systems, and estimation procedures. These changes can disrupt time-series comparability if not managed carefully. The coordination committee should establish a change notification protocol through which agencies inform the compiling team of planned methodological changes before they take effect. This advance notice allows the compiling team to assess the impact on accounting time series and, where necessary, develop bridging estimates or concordance procedures.
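The change notification protocol can be captured in a simple record that checks whether sufficient lead time has been given. A minimal sketch, assuming a hypothetical committee-agreed minimum notice period of 180 days (the field names and agency example are illustrative):

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical minimum notice period agreed by the coordination committee.
MIN_NOTICE = timedelta(days=180)

@dataclass
class ChangeNotification:
    """Advance notice of a methodological change (illustrative fields)."""
    agency: str
    data_stream: str
    description: str      # nature of the change
    effective_from: date  # when the new method takes effect
    notified_on: date     # when the compiling team was informed

    def sufficient_notice(self) -> bool:
        """True if the compiling team has enough lead time to assess
        time-series impact and prepare bridging estimates if needed."""
        return self.effective_from - self.notified_on >= MIN_NOTICE

notice = ChangeNotification(
    agency="Fisheries Agency",
    data_stream="catch statistics",
    description="revised species classification",
    effective_from=date(2026, 7, 1),
    notified_on=date(2026, 1, 2),
)
```

Notifications that fail the lead-time check would be escalated to the coordination committee so that the effective date, or the compilation timetable, can be adjusted.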
The classification concordance guidance in TG-4.6 Data Harmonisation and Interoperability Section 3.4 provides the technical framework for managing classification transitions. At the institutional level, the coordination committee should maintain documentation of all methodological changes that affect time-series comparability, including the date of change, the nature of the change, the concordance or bridging procedure applied, and any residual comparability limitations. This documentation should be published alongside the accounting data as a "changes and revisions" annex.
Institutional memory. Sustained time-series production depends on institutional memory -- the accumulated knowledge about data sources, compilation methods, quality issues, and inter-agency relationships that resides within the compiling team. Staff turnover poses a significant risk to institutional memory, particularly in small statistical offices where ocean accounting may be the responsibility of one or two individuals. Mitigation strategies include comprehensive documentation of compilation procedures (including data source contacts, file locations, processing scripts, and known data quality issues); cross-training of staff so that multiple individuals can perform each compilation step; and periodic compilation reviews where the full production process is documented and assessed.
The coordination committee should also maintain a shared knowledge base that records decisions made about data integration issues, methodological choices, and their rationale. This knowledge base serves both as institutional memory and as a transparency tool, enabling external reviewers to understand how compilation decisions were made. Over time, this knowledge base becomes a valuable resource for countries that are establishing similar coordination arrangements, supporting the knowledge exchange objectives of the broader ocean accounting community.
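The shared knowledge base can be as simple as an append-only log of decision records in a plain-text format such as JSON lines. A minimal sketch (the structure, field names, and example decision are illustrative, not a prescribed schema):

```python
import json
from datetime import date

def decision_record(issue: str, decision: str, rationale: str,
                    decided_on: date) -> str:
    """Serialise one committee decision as a JSON-lines entry,
    preserving the rationale for later reviewers."""
    return json.dumps({
        "date": decided_on.isoformat(),
        "issue": issue,
        "decision": decision,
        "rationale": rationale,
    })

entry = decision_record(
    issue="treatment of aquaculture in the ocean economy boundary",
    decision="include marine aquaculture; exclude freshwater",
    rationale="aligns with the scope agreed at the March committee meeting",
    decided_on=date(2025, 3, 14),
)
# Each entry is appended as one line to a shared log file.
```

A plain, append-only format keeps the knowledge base durable across staff turnover and software changes, and makes it straightforward to publish as part of the transparency documentation.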
Investment sequencing. Countries should plan their data investment strategically, recognising that different data streams require different levels of investment and deliver value at different timescales. A practical approach sequences investment as follows:
Table 3.6.1: Data investment sequencing framework
| Priority | Data Domain | Typical Sources | Investment Type | Value Horizon |
|---|---|---|---|---|
| Immediate | Economic activity (ocean economy) | National accounts, business surveys | Coordination with NSO economic statistics division | Short-term (1--2 years) |
| Near-term | Ecosystem extent | Remote sensing, land/sea cover maps | Coordination with geospatial/environmental agency | Medium-term (2--3 years) |
| Medium-term | Ecosystem condition | Environmental monitoring networks | Coordination with multiple environmental agencies | Medium-term (3--5 years) |
| Longer-term | Ecosystem services (physical) | Biophysical modelling, survey data | Coordination with research institutions | Longer-term (3--5+ years) |
| Sustained | Monetary valuation | Economic modelling, market data | Integration across all coordination partners | Ongoing |
This sequencing reflects both the relative availability of data and the dependency structure of ocean accounts, where ecosystem extent accounts provide the spatial framework upon which condition and service accounts are built (see TG-3.3 Ecosystem Accounts and TG-3.1 Asset Accounts).
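The dependency structure noted above (extent accounts underpin condition and service accounts) can be represented as a small directed graph, from which a valid compilation order follows automatically. A minimal sketch using hypothetical account names:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: each account lists the accounts it builds on.
dependencies = {
    "ocean_economy": set(),
    "ecosystem_extent": set(),
    "ecosystem_condition": {"ecosystem_extent"},
    "ecosystem_services_physical": {"ecosystem_extent", "ecosystem_condition"},
    "monetary_valuation": {"ecosystem_services_physical", "ocean_economy"},
}

# A valid compilation order: every account appears after its prerequisites.
order = list(TopologicalSorter(dependencies).static_order())
```

Making the dependencies explicit in this way helps the coordination committee sequence investment so that foundational data streams are secured before the accounts that depend on them are attempted.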
4. Summary
National data coordination architectures provide the institutional foundation for sustained ocean accounting. The key elements are:
- Coordination committee models should be selected based on the number of data custodian agencies, existing inter-agency mechanisms, and NSO capacity, with a clear mandate specifying scope, authority, and decision-making processes
- Data sharing agreements should formalise the scope, frequency, quality standards, confidentiality provisions, and exchange formats for each bilateral data flow, referencing the harmonisation standards in TG-4.6
- Trust frameworks should address data provenance and attribution, quality feedback loops, graduated access models, and mutual benefit demonstration to sustain agency participation over time
- Platform architectures should be selected to match national IT capacity and programme scale, with centralised, federated, or hybrid options available, all supporting metadata management, version control, and access control
- Modular implementation should proceed through anchor partnership, network expansion, and system consolidation phases, building institutional capacity incrementally rather than attempting comprehensive coordination from the outset
- Time-series investment planning should address continuity of data supply, methodological change management, and institutional memory to ensure that ocean accounting data coordination is sustained beyond initial project cycles
The broader implementation readiness context that complements this data-specific coordination focus is provided in TG-0.8 Implementation Readiness Assessment, which guides countries through evaluating institutional, data, and human capacity prerequisites across all dimensions of ocean accounting implementation.
Data coordination for ocean accounting should build upon existing national statistical system coordination mechanisms wherever possible, adapting proven models from economic statistics, agricultural statistics, and environmental-economic accounting. The GSGF's key elements -- governance and institutional capacity, policy and legal, human resources and capacity, data and interoperability, and technology and infrastructure -- provide a useful checklist for assessing the completeness of coordination arrangements[16].
Guidance on the broader governance context for embedding ocean accounts in national planning is provided in TG-1.10 OA and National Planning Integration. Data source-specific integration guidance is provided in the companion Circulars: TG-4.1 Remote Sensing and Geospatial Data, TG-4.2 Survey Methods for Ocean Economic Activity, TG-4.3 Administrative Data Sources, TG-4.4 Citizen Science, and TG-4.5 Research Data.
Implementation Considerations
For minimum institutional capacity, data infrastructure, and human skills requirements for implementing these data methods, see TG-0.8 Implementation Readiness Assessment.
5. Acknowledgements
This Circular has been approved for public circulation and comment by the GOAP Technical Experts Group in accordance with the Circular Publication Procedure.
Authors: [To be confirmed]
Reviewers: [To be confirmed]
6. References
United Nations, "System of Environmental-Economic Accounting -- Ecosystem Accounting" (2021), para 1.52. ↩︎
United Nations Expert Group on the Integration of Statistical and Geospatial Information, "Global Statistical Geospatial Framework (GSGF), Version 2.0" (2025), Stage 2: The Key Elements. The five key elements are: Governance and Institutional Capacity; Policy and Legal; Human Resources and Capacity; Data and Interoperability; and Technology and Infrastructure. ↩︎
Figure 4.7.1 illustrates the architecture by which multiple data custodian agencies contribute to ocean accounts through a technical working group that governs a shared data platform. ↩︎
SEEA EA (2021), para 1.52. ↩︎
SEEA EA (2021), para 1.51. ↩︎
GSGF v2 (2025), Principle 1, Policy and Legal section. ↩︎
GSGF v2 (2025), Principle 1, Policy and Legal section. ↩︎
United Nations Expert Group on the Integration of Statistical and Geospatial Information, "MoU template for joint work between the NSO and the NGIAs for the integration of statistical and geospatial information." Available at https://ggim.un.org/meetings/GGIM-committee/14th-Session/documents/Background_document_Institutional_agreement_for_GSGF.pdf ↩︎
GSGF v2 (2025), Principle 4: Statistical and geospatial interoperability. The four interoperability dimensions (legal, organisational, semantic, technical) are derived from the European Interoperability Framework (EIF). ↩︎
SEEA EA (2021), para 1.54. ↩︎
SEEA EA (2021), para 1.53. ↩︎
GSGF v2 (2025), Principle 4, Technology and Infrastructure section. ↩︎
Statistical Data and Metadata Exchange (SDMX), "SDMX Standards: Section 1 - Framework for SDMX Technical Standards, Version 3.1" (May 2025), Section 3.5. ↩︎
SEEA EA (2021), para 1.58. ↩︎
SEEA EA (2021), para 1.56. ↩︎
GSGF v2 (2025), Stage 2: The Key Elements. The GSGF maturity self-assessment tool, adapted from the UN-IGIF baseline assessment methodology, provides a structured approach to evaluating institutional readiness across these dimensions. ↩︎