Implementation Readiness Assessment

| Field | Value |
| --- | --- |
| Circular ID | TG-0.8 |
| Version | 1.0 |
| Badge | Applied |
| Status | Draft |
| Last Updated | April 2026 |

1. Outcome

This Circular provides a country readiness self-assessment framework for launching or scaling ocean accounts. It guides countries through evaluating their institutional, data, and human capacity prerequisites, and outlines phased implementation pathways appropriate to different starting conditions. After reading this Circular, national statistical offices, ocean agencies, and inter-ministerial coordination bodies will be able to diagnose their current readiness level, identify priority gaps, design a sequenced implementation plan, and mobilise the institutional arrangements necessary to sustain ocean accounting as a regular statistical programme.

2. Requirements

Essential prerequisites:

Helpful background:

This Circular sits at the start of any country's implementation journey. It evaluates readiness across institutional capacity, data infrastructure, human skills, and policy demand -- the four pillars that determine whether a country can sustain ocean accounting as a regular statistical programme. The assessment informs the choice of implementation pathway and connects to TG-4.7 Data Coordination for institutional architecture and TG-3.11 Sub-National Accounts for multi-level implementation.

3. Guidance Material

3.1 Country Readiness Self-Assessment Framework

3.1.1 Purpose and scope

Implementation readiness assessment is a structured process through which a country evaluates its capacity to initiate, compile, and sustain ocean accounts. The concept draws on established practice in the international statistical community, including the UN Statistical Commission's endorsed strategy for implementing the 2025 SNA, which calls on countries to "self-assess their preparedness" using standardised tools that inform planning, priority-setting, and resource mobilisation[1]. Similar assessment approaches are embedded in National Strategies for the Development of Statistics (NSDS) guidelines and the UN National Quality Assurance Frameworks Manual (UN-NQAF)[2][3].

The readiness assessment framework presented here adapts these principles to the specific requirements of ocean accounting. Ocean accounts draw on data and expertise from environmental, economic, and social domains and therefore demand multi-agency coordination that goes beyond the typical scope of national accounts compilation. The assessment must therefore evaluate capacity across all three domains of the Ocean Accounts Framework described in TG-0.1, as well as the institutional arrangements that connect them.

The framework is designed to be applied iteratively. An initial assessment establishes a baseline and informs the choice of implementation pathway (Section 3.2). Subsequent assessments, conducted at regular intervals (typically annually), track progress, identify emerging gaps, and support adaptive management of the implementation programme. Countries at any stage of ocean accounting development -- from initial scoping to regular production -- can use this framework to evaluate their position and plan next steps.

3.1.2 Assessment dimensions

The readiness assessment covers four interconnected dimensions, each of which is essential for sustainable ocean accounting:

| Dimension | Description | Key questions |
| --- | --- | --- |
| Institutional capacity | Legal mandates, inter-agency coordination mechanisms, political commitment, and resource allocation | Is there a lead agency with a clear mandate? Are data-sharing agreements in place? Is ocean accounting embedded in national statistical plans? |
| Data infrastructure | Availability, accessibility, and quality of environmental, economic, and social data relevant to ocean accounting | What ocean-related data exist? Are they accessible in machine-readable formats? Do spatial and temporal coverages meet minimum requirements? |
| Human skills and knowledge | Technical expertise in statistical compilation, environmental science, economic accounting, and geospatial analysis | Are there staff with SEEA compilation experience? Is there familiarity with ecosystem classification and condition assessment? Are GIS capabilities available? |
| Policy demand and use | Existence of policy frameworks, planning processes, and reporting commitments that create demand for ocean account outputs | Are there national ocean or blue economy strategies? Does the country report on SDG 14? Are there active marine spatial planning processes? |

These dimensions correspond broadly to the capacity areas identified in Chapter 14 of the SEEA EA, which emphasises that the derivation of coherent indicators from accounts requires not only technical measurement capacity but also institutional arrangements that support regular data integration across environmental and economic domains[4]. They also align with the five principles of the Global Statistical Geospatial Framework (GSGF), which stresses the importance of fundamental geospatial infrastructure, interoperability standards, and institutional cooperation for producing geospatially enabled statistics[5].

3.1.3 Readiness scoring rubric

Each dimension is scored on a four-level maturity scale. The levels are cumulative -- achieving a higher level implies that the requirements of lower levels have also been met:

| Level | Label | General description |
| --- | --- | --- |
| 1 | Initial | Ad hoc or absent. No systematic arrangements. Activities depend on individual initiative or project-based funding. |
| 2 | Developing | Some foundational elements in place. Pilot activities may have been conducted. Formal arrangements are emerging but not yet institutionalised. |
| 3 | Established | Systematic arrangements are in place and operational. Regular production occurs, though coverage or quality may be incomplete. |
| 4 | Optimising | Mature and regularly reviewed. Continuous improvement processes are embedded. Outputs are integrated into decision-making. |

The detailed scoring criteria for each dimension are as follows:

Institutional capacity scoring:

| Level | Criteria |
| --- | --- |
| 1 -- Initial | No designated lead agency. Ocean accounting is not referenced in national statistical plans. Inter-agency coordination for ocean data is informal or absent. |
| 2 -- Developing | A lead agency has been identified (NSO, ocean ministry, or environment agency). Initial discussions on inter-agency coordination have occurred. Ocean accounting is referenced in at least one national strategy or plan. |
| 3 -- Established | A formal inter-agency coordination mechanism (committee, working group, or MoU network) is operational. The lead agency has allocated recurrent budget for ocean accounting. Data-sharing agreements cover at least the core data providers. |
| 4 -- Optimising | Ocean accounting is embedded in the National Strategy for the Development of Statistics (NSDS) or equivalent. The coordination mechanism has a formal mandate and meets regularly. Production is funded through recurrent budget lines. Quality assurance procedures are documented and applied. |

Data infrastructure scoring:

| Level | Criteria |
| --- | --- |
| 1 -- Initial | Key datasets (marine area delineation, fisheries production, coastal land use) are unavailable or inaccessible. No data inventory for ocean-relevant holdings has been conducted. |
| 2 -- Developing | A data inventory has been completed identifying available ocean-relevant datasets. Some key datasets are accessible but may lack spatial referencing, consistent time series, or documentation. Global or regional datasets are used to fill gaps. |
| 3 -- Established | Core datasets for at least one account type (e.g., extent, economic flows) are available with adequate spatial and temporal coverage. A common spatial framework (e.g., ecosystem accounting areas, statistical geography) is defined. Metadata standards are applied. |
| 4 -- Optimising | Comprehensive data holdings support multiple account types. Data pipelines are automated or semi-automated. Data quality is regularly assessed against documented standards. New data sources (remote sensing, citizen science, administrative records) are actively integrated. |

Human skills scoring:

| Level | Criteria |
| --- | --- |
| 1 -- Initial | No staff have received training in environmental-economic accounting or SEEA methods. Relevant expertise (ecology, marine science, economic statistics, GIS) exists in separate agencies but has not been mobilised for ocean accounting. |
| 2 -- Developing | At least one staff member has received SEEA or ocean accounting training. A skills gap analysis has been conducted. Expertise in relevant disciplines has been identified across agencies. |
| 3 -- Established | A core team (3--5 staff) has practical experience in compiling at least one type of ocean account. The team includes or has access to expertise in ecological classification, economic accounting, and geospatial analysis. Training plans are in place. |
| 4 -- Optimising | A multi-disciplinary team is in place with documented skills across all required domains. Staff participate in international peer networks. Knowledge management systems (documentation, handover procedures, training curricula) ensure institutional memory. |

Policy demand scoring:

| Level | Criteria |
| --- | --- |
| 1 -- Initial | No explicit policy demand for ocean accounts. Ocean governance relies on sector-specific data without integration across domains. |
| 2 -- Developing | At least one policy process (national ocean strategy, marine spatial plan, SDG reporting) has identified ocean accounts as a potential input. Initial engagement with policy users has occurred. |
| 3 -- Established | Ocean account outputs are referenced in at least one active policy process. Regular dialogue between account compilers and policy users is established. Indicator needs have been specified. |
| 4 -- Optimising | Ocean accounts are routinely used in multiple policy processes (budgeting, spatial planning, international reporting). Feedback loops between users and producers are formalised. Account outputs are cited in policy documents. |

3.1.4 Conducting the assessment

The self-assessment should be conducted by a small team that includes representatives from the national statistical office, the lead ocean or environment agency, and at least one major data provider. Where possible, a policy user (from a planning or finance ministry) should also participate to ensure the policy demand dimension is accurately scored. The assessment can be completed in a structured workshop of one to two days, following these steps:

  1. Preparation: Assemble background documentation including the national statistical plan, any existing data inventories, relevant policy documents, and organisational charts showing institutional responsibilities for ocean-related data.
  2. Scoring: Work through each dimension and score against the rubric. Document the evidence supporting each score and note any uncertainties.
  3. Gap identification: For each dimension where the score falls below the target level for the intended implementation pathway (Section 3.2), identify specific gaps and their causes.
  4. Priority setting: Rank gaps by their impact on the chosen pathway and the feasibility of addressing them. Use the requirements matrix in Section 3.3 to map gaps to specific account types.
  5. Action planning: Develop a time-bound action plan to address priority gaps, assigning responsibilities and identifying resource requirements.

The completed assessment should be documented in a standardised format and shared with relevant stakeholders, including potential international partners and technical assistance providers. Countries are encouraged to share their assessments through the GOAP network to support peer learning and to help the partnership target support effectively.
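For teams that record their assessment results digitally, the scoring and gap-identification steps above can be sketched as a small data structure. This is an illustrative sketch, not part of the framework: the short dimension names, the score encoding, and the target levels in the example are assumptions.

```python
from dataclasses import dataclass, field

# Dimension keys and the 1-4 maturity scale follow Section 3.1;
# the short names below are illustrative assumptions.
DIMENSIONS = ("institutional", "data", "skills", "policy_demand")

@dataclass
class ReadinessAssessment:
    scores: dict                                   # dimension -> level (1-4)
    evidence: dict = field(default_factory=dict)   # dimension -> notes supporting the score

    def gaps(self, targets: dict) -> dict:
        """Step 3: dimensions scoring below the target level, with the shortfall."""
        return {d: targets[d] - self.scores[d]
                for d in DIMENSIONS
                if self.scores.get(d, 1) < targets.get(d, 1)}

# Example: a country whose skills capacity lags its intended pathway.
assessment = ReadinessAssessment(
    scores={"institutional": 2, "data": 3, "skills": 1, "policy_demand": 2},
    evidence={"skills": "No staff trained in SEEA methods to date"})
targets = {"institutional": 3, "data": 3, "skills": 2, "policy_demand": 2}
print(assessment.gaps(targets))   # {'institutional': 1, 'skills': 1}
```

Recording the supporting evidence alongside each score, as in the `evidence` field, mirrors the documentation requirement in the scoring step and makes subsequent annual reassessments comparable.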

3.2 Phased Implementation Pathway

3.2.1 Rationale for phased implementation

Ocean accounting is a multi-year undertaking. Global experience with SEEA implementation demonstrates that countries achieve the most sustainable results when they adopt a phased approach that builds incrementally from simpler to more complex accounts, expanding coverage and sophistication over time[6]. The 2025 SNA implementation strategy similarly recommends that countries plan implementation in stages -- scoping, adaptation, estimation, and follow-up -- with explicit milestones and performance metrics[1].

A phased approach offers several practical advantages. It allows countries to demonstrate value to policy users early, building the political support and institutional commitment needed to sustain longer-term investment. It enables learning-by-doing, so that technical challenges encountered in compiling initial accounts inform the design of more complex ones. And it aligns resource requirements with available capacity, avoiding the risk of overcommitment that leads to incomplete or abandoned programmes.

3.2.2 Three-phase pathway

The following three-phase pathway is recommended as a general model. Countries should adapt it to their specific circumstances, taking account of their readiness assessment results, policy priorities, and available resources. The phases are not rigidly sequential -- countries may pursue elements of later phases in parallel with earlier ones where capacity permits.

Phase 1: Foundation (typically 12--18 months)

The foundation phase establishes the institutional and technical prerequisites for ocean accounting and delivers initial account products. Key activities include:

Phase 2: Expansion (typically 18--36 months)

The expansion phase broadens account coverage, deepens analytical capability, and strengthens institutional arrangements. Key activities include:

Phase 3: Integration and sustainability (ongoing)

The integration phase embeds ocean accounts into regular statistical production and active policy use. Key activities include:

3.2.3 Aligning the pathway with readiness levels

The readiness assessment results inform where a country enters the pathway and how rapidly it can progress:

| Overall readiness profile | Recommended entry point | Expected timeframe to Phase 3 |
| --- | --- | --- |
| Predominantly Level 1 across dimensions | Begin with Phase 1 institutional setup and data inventory. Focus on a single pilot account. | 4--6 years |
| Mixed Levels 1--2 with at least one dimension at Level 2 | Enter Phase 1 with accelerated timeline, leveraging existing strengths. | 3--5 years |
| Predominantly Level 2 with some Level 3 | Begin Phase 2 activities in areas of strength while completing Phase 1 in others. | 2--4 years |
| Predominantly Level 3 or higher | Focus on Phase 3 integration and sustainability. Expand account coverage and deepen policy use. | 1--2 years |
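The profile-to-entry-point mapping can be expressed as a small helper. The thresholds below are one illustrative reading of "predominantly" (taken here as the lower median level across the four dimensions); the framework itself leaves the aggregation to judgement, so the output should be treated as a starting point for discussion, not a prescription.

```python
def recommended_entry(scores: dict) -> str:
    """Suggest a pathway entry point from a readiness profile (1-4 scale).
    Reading 'predominantly' as the lower median level is an assumption."""
    levels = sorted(scores.values())
    majority = levels[(len(levels) - 1) // 2]   # lower median
    if majority >= 3:
        return "Phase 3: integration and sustainability"
    if majority == 2 and max(levels) >= 3:
        return "Phase 2 in areas of strength while completing Phase 1"
    if max(levels) >= 2:
        return "Phase 1 with an accelerated timeline"
    return "Phase 1: institutional setup, data inventory, single pilot account"

print(recommended_entry({"institutional": 2, "data": 2,
                         "skills": 3, "policy_demand": 2}))
# -> Phase 2 in areas of strength while completing Phase 1
```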

Countries with existing SEEA implementation experience (whether for land, water, energy, or forest accounts) will often find that their readiness levels are higher across multiple dimensions, because institutional arrangements, data infrastructure, and human skills developed for other environmental-economic accounts are substantially transferable to ocean accounting.

3.3 Implementation Requirements Matrix

3.3.1 Purpose

This section provides a comprehensive requirements matrix that maps the institutional capacity, data infrastructure, and human skills needed across major ocean account types. The matrix serves as a diagnostic tool: by reading down a column for a particular account type, compilers can identify the full set of prerequisites; by reading across a row for a particular requirement category, they can see how requirements scale across account types and plan resources accordingly.

The matrix extends the "minimum data requirements" concept to encompass all three capacity dimensions, reflecting the principle that successful account compilation depends not only on data availability but equally on institutional arrangements and human expertise. This integrated view draws on the SEEA EA's emphasis on reconciling and harmonising data from disparate sources (SEEA EA para. 14.13), which in practice requires institutional cooperation and multi-disciplinary skills[4].

3.3.2 Institutional capacity requirements

| Requirement | Ocean economic accounts | Ecosystem extent accounts | Ecosystem condition accounts | Ecosystem service flow accounts | Ecosystem asset accounts | Social and governance accounts |
| --- | --- | --- | --- | --- | --- | --- |
| Lead agency identified | NSO (national accounts division) | NSO or environment agency | Environment agency with NSO support | NSO and environment agency jointly | NSO and environment agency jointly | NSO (social statistics division) |
| Inter-agency coordination | Basic: NSO + fisheries/maritime agencies | Moderate: environment agency + geospatial agency + marine research | High: environment agency + multiple research institutions + monitoring agencies | High: NSO + environment + sectoral agencies (fisheries, tourism, water) | High: all of the above | High: NSO + governance + community development agencies |
| Data-sharing agreements | Standard statistical data sharing | Geospatial data licensing + research data access | Research data sharing + monitoring network agreements | Cross-domain agreements spanning environment and economic agencies | Comprehensive agreements covering all data sources | Agreements with governance and community organisations |
| Budget commitment | Marginal additional cost if SNA exists | Moderate: geospatial data acquisition, mapping | Moderate to high: field monitoring, laboratory analysis | Moderate: modelling, valuation expertise | Moderate: builds on service flow and condition accounts | Moderate: survey instruments, community engagement |
| Legal/mandate basis | Statistics Act (typically sufficient) | Environmental legislation + statistics mandate | Environmental monitoring legislation | Cross-cutting mandate desirable | Cross-cutting mandate desirable | Social statistics mandate + governance assessment framework |

3.3.3 Data infrastructure requirements

| Requirement | Ocean economic accounts | Ecosystem extent accounts | Ecosystem condition accounts | Ecosystem service flow accounts | Ecosystem asset accounts | Social and governance accounts |
| --- | --- | --- | --- | --- | --- | --- |
| Spatial framework | Administrative boundaries; port/coastal zone delineation | Ecosystem accounting areas; ecosystem type map at IUCN GET EFG level | Same as extent + monitoring site locations | Same as extent + user locations (households, industries) | Same as extent, condition, and services | Administrative + community boundaries |
| Core datasets | National accounts (SUT, GDP by industry); business register; trade statistics; fisheries production | Satellite imagery (Landsat, Sentinel); habitat maps; bathymetry; land cover | Water quality; biodiversity surveys; physical oceanography; benthic monitoring | Fisheries catch; coastal protection models; recreation surveys; carbon flux data | Discount rates and ecosystem service projections (for NPV calculations) | Governance indicators; community surveys; regulatory records |
| Temporal requirements | Annual (aligned with national accounts cycle) | Multi-year baseline + periodic update (3--5 years) | Annual or sub-annual for key indicators | Annual (aligned with national accounts) | Annual or aligned with condition assessment cycle | Periodic (2--5 year cycle typical) |
| Quality standards | SNA compilation standards; DQAF | Accuracy assessment of classification; positional accuracy of maps | Measurement uncertainty quantification; inter-calibration of monitoring networks | Model validation; sensitivity analysis | Propagation of uncertainty from underlying accounts | Survey methodology standards; response rate thresholds |
| Metadata requirements | SDMX-compliant metadata | ISO 19115 geospatial metadata; lineage documentation | Measurement protocols; QA/QC records | Model documentation; assumption registers | Valuation methodology documentation | Survey instruments; sampling design documentation |

3.3.4 Human skills requirements

| Skill domain | Ocean economic accounts | Ecosystem extent accounts | Ecosystem condition accounts | Ecosystem service flow accounts | Ecosystem asset accounts | Social and governance accounts |
| --- | --- | --- | --- | --- | --- | --- |
| National accounting | Essential: SNA compilation, industry classification (ISIC) | Useful for integration with economic data | Useful for integration with economic data | Essential: supply-use framework, linking ecosystem services to industries | Essential: asset valuation, NPV methods, balance sheets | Useful for linking social data to SNA framework |
| Environmental science | Useful for defining ocean economy boundaries | Essential: ecology, remote sensing, ecosystem classification (IUCN GET) | Essential: marine ecology, water chemistry, biodiversity assessment | Essential: ecosystem service modelling, biophysical quantification | Essential: understanding ecosystem dynamics and sustainability thresholds | Useful for understanding environmental context |
| Geospatial analysis | Moderate: spatial delineation of coastal zones | Essential: GIS, remote sensing, spatial analysis | Essential: spatial interpolation, monitoring network design | Essential: spatial modelling of service flows and use | Essential: spatially explicit valuation | Moderate: mapping governance jurisdictions |
| Statistical methods | Essential: sampling, estimation, seasonal adjustment | Moderate: accuracy assessment, area estimation | Moderate to high: index construction, composite indicators | High: modelling, uncertainty analysis | High: discounting, time series, sensitivity analysis | Essential: survey design, social statistics |
| Monetary valuation | Not required (market data used directly) | Not required | Not required | Required for monetary accounts: market price methods, replacement cost, avoided damage cost | Essential: NPV, discount rate selection, welfare economics | Not typically required |
| Data management | Essential: database management, data integration | Essential: geospatial databases, raster/vector data management | Essential: time series databases, monitoring data management | Essential: multi-source data integration | Essential: all of the above | Essential: survey data management, confidentiality |

3.3.5 Using the requirements matrix

The matrix should be used in conjunction with the readiness assessment (Section 3.1) as follows:

  1. Identify target accounts: Based on the implementation pathway selected (Section 3.2) and policy priorities, identify which account types will be compiled in each phase.
  2. Read requirements: For each target account type, read down the relevant column across all three requirement tables (institutional, data, human skills) to identify the full set of prerequisites.
  3. Compare to readiness assessment: Map the identified requirements against the country's current readiness levels. Where requirements exceed current capacity, these represent gaps to be addressed.
  4. Sequence investment: Use the comparison to sequence capacity-building investments. Some requirements are shared across multiple account types (e.g., spatial framework, inter-agency coordination) and should be prioritised as foundational investments that enable multiple downstream products.
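Steps 2--4 above amount to a column-wise lookup and comparison, which can be sketched as follows. The matrix slice and the numeric requirement levels are hypothetical stand-ins for a country's own reading of the tables in Sections 3.3.2--3.3.4.

```python
# Hypothetical requirement levels (on the 1-4 readiness scale) per dimension
# for three account types; in practice these would be derived from the
# tables in Sections 3.3.2-3.3.4.
REQUIREMENTS = {
    "ocean_economic":      {"institutional": 2, "data": 2, "skills": 2},
    "ecosystem_extent":    {"institutional": 2, "data": 3, "skills": 3},
    "ecosystem_condition": {"institutional": 3, "data": 3, "skills": 3},
}

def capacity_gaps(target_accounts: list, readiness: dict) -> dict:
    """Take the column-wise maximum requirement across the chosen account
    types (shared prerequisites count once), then flag dimensions where
    current readiness falls below that requirement."""
    needed: dict = {}
    for account in target_accounts:
        for dim, level in REQUIREMENTS[account].items():
            needed[dim] = max(needed.get(dim, 0), level)
    return {dim: lvl for dim, lvl in needed.items() if readiness.get(dim, 1) < lvl}

gaps = capacity_gaps(["ocean_economic", "ecosystem_extent"],
                     {"institutional": 2, "data": 2, "skills": 3})
print(gaps)   # {'data': 3} -- data infrastructure falls short of the extent requirement
```

Taking the maximum across target account types reflects the sequencing point in step 4: a shared prerequisite such as the spatial framework needs to be funded once at the highest required level, not once per account.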

Countries will note that requirements become progressively more demanding as they move from left to right across the matrix -- from ocean economic accounts (which build substantially on existing SNA infrastructure) through ecosystem accounts (which require environmental science and geospatial capabilities) to social and governance accounts (which require additional survey and community engagement capabilities). This progression informs the sequencing recommendations in Section 3.2.

3.4 Human Capacity Requirements and Team Composition

3.4.1 Core team structure

Sustained ocean accounting requires a dedicated core team that combines expertise across multiple disciplines. The composition and size of this team will vary with the scope of the implementation programme, but the following structure provides a general model for countries at Phase 2 or later:

| Role | Key responsibilities | Typical background | FTE allocation |
| --- | --- | --- | --- |
| Programme coordinator | Overall management; inter-agency liaison; stakeholder engagement; quality assurance | Senior statistician or environmental economist with management experience | 0.5--1.0 |
| Economic accountant | Ocean economy satellite accounts; SNA-SEEA linkages; monetary ecosystem service accounts | National accounts compilation; economic statistics | 0.5--1.0 |
| Environmental scientist | Ecosystem classification; condition assessment; ecosystem service quantification | Marine ecology, environmental science, or conservation biology | 0.5--1.0 |
| Geospatial analyst | Spatial framework management; remote sensing; map production; spatial data integration | GIS, remote sensing, cartography | 0.5--1.0 |
| Data manager | Data acquisition; quality control; database management; metadata documentation | Information technology, data science, statistics | 0.5--1.0 |

During Phase 1 (foundation), a smaller team of two to three staff may be sufficient, provided they have access to specialist advice from other agencies or international partners. The programme coordinator role is critical from the outset, as the primary challenge in Phase 1 is institutional coordination rather than technical compilation.

3.4.2 Extended expertise network

Beyond the core team, ocean accounting requires access to specialised expertise that is typically distributed across multiple agencies and institutions. Effective implementation depends on establishing an extended network that includes:

Building and maintaining this network is a function of the institutional arrangements discussed in Section 3.5. Countries should map existing expertise early in the implementation process and develop formal or informal mechanisms (secondments, advisory panels, memoranda of understanding) to access it on a sustained basis.

3.4.3 Capacity building strategy

A systematic approach to building and sustaining human capacity should address three time horizons:

Short-term (0--12 months): Targeted training for the core team in SEEA concepts, ocean accounting methods, and relevant tools. International training courses offered by UNSD, the GOAP partnership, and regional bodies provide foundational knowledge. Where possible, training should be combined with practical application to real data, through mentored pilot compilation exercises.

Medium-term (1--3 years): Deepening expertise through learning-by-doing, peer exchanges with more experienced countries, and participation in international expert networks. Staff should attend relevant conferences and workshops to stay current with methodological developments. Formal recognition of ocean accounting competencies within institutional human resource frameworks helps retain skilled staff.

Long-term (3+ years): Embedding ocean accounting in university curricula and national training institutions ensures a pipeline of qualified graduates. Partnerships with universities for research projects, graduate internships, and joint publications support both knowledge generation and recruitment. Documentation of institutional knowledge -- compilation manuals, standard operating procedures, and lesson-learned reports -- protects against the loss of capacity through staff turnover.

The UN-NQAF emphasises that quality assurance of statistical outputs depends fundamentally on the competence and motivation of staff[3]. Countries should therefore ensure that capacity building is not limited to technical training but also addresses career development, professional recognition, and working conditions that support retention.

3.5 Institutional Arrangements for Implementation

3.5.1 Why institutional arrangements matter

The SEEA EA recognises that compiling ecosystem accounts involves "the collation and integration of a wide variety of types of data, many of which may be unfamiliar to statistical offices" (SEEA EA, Chapter 15 research agenda)[4]. This makes institutional arrangements -- the formal and informal structures through which agencies cooperate, share data, and coordinate production -- a critical determinant of implementation success. Without effective institutional arrangements, even countries with strong data and skills will struggle to produce coherent, integrated accounts.

The 2025 SNA implementation strategy similarly identifies governance as a foundational element, recommending that countries "establish appropriate governance mechanisms to provide oversight" and "appropriate mechanisms to ensure coordination" across relevant agencies[1]. For ocean accounting, the coordination challenge is amplified by the number and diversity of agencies involved: national statistical offices, environment ministries, fisheries agencies, maritime authorities, navy hydrographic offices, port authorities, research institutions, and geospatial agencies may all hold relevant data and expertise.

3.5.2 Coordination models

Countries have adopted a range of coordination models for environmental-economic accounting, each with distinct advantages and limitations. The choice of model should reflect the country's administrative structure, existing inter-agency mechanisms, and the political economy of ocean governance. TG-1.10 OA and National Planning Integration provides detailed guidance on coordination models and their relationship to national planning processes; this section provides a summary checklist for implementation readiness purposes.

The principal models observed in practice include:

| Model | Description | Advantages | Limitations |
| --- | --- | --- | --- |
| NSO-led | The national statistical office leads coordination and production, with other agencies as data providers. | Aligns with statistical mandate; ensures quality standards; leverages existing SNA infrastructure. | NSO may lack environmental expertise; ocean may be low priority relative to other statistical programmes. |
| Environment agency-led | The environment ministry or a dedicated ocean agency leads coordination, with NSO providing economic data and statistical guidance. | Strong domain knowledge; direct link to environmental policy; access to monitoring data. | May lack statistical compilation expertise; risk of perceived bias; may not carry statistical authority. |
| Joint steering committee | A multi-agency committee provides strategic direction, with a technical working group handling production. | Shared ownership; diverse expertise; broader political support. | Requires strong secretariat; risk of slow decision-making; coordination costs. |
| Dedicated unit | A specialist ocean accounting unit is established, either within an existing agency or as a standalone body. | Focused mandate; clear accountability; can recruit specialist staff. | Higher setup cost; risk of isolation from broader statistical and environmental programmes. |

No single model is universally superior. Many countries use hybrid approaches that combine elements of multiple models. The critical success factors, regardless of model, are: (a) a clear mandate for ocean accounting that is recognised by all participating agencies, (b) designated staff with allocated time, (c) formal data-sharing arrangements, and (d) recurrent funding that does not depend on project-based support alone.

3.5.3 Implementation readiness checklist for institutional arrangements

The following checklist summarises the institutional prerequisites for sustainable ocean accounting. Countries should aim to address all items by the end of Phase 1, though some (particularly items 7 and 8) may extend into Phase 2:

For detailed guidance on coordination models, building domestic champions, and leveraging regional platforms, see TG-1.10 OA and National Planning Integration. For guidance on data coordination architectures specifically, see TG-4.7 National Data Coordination Architectures.

3.6 Country Commitment Framework

3.6.1 From assessment to commitment

The readiness assessment and implementation pathway provide the analytical foundation for ocean accounting, but sustained implementation requires explicit commitment at the institutional and political level. Experience with SEEA implementation demonstrates that countries that formalise their commitment -- through inclusion in national statistical plans, ministerial endorsements, or legislative provisions -- achieve more durable results than those that rely on informal arrangements or project-based support alone[6].

The 2025 SNA implementation strategy underscores this point, recommending that "implementation of the 2025 SNA/BPM7 within a country should be built into the strategic plans of the national statistical office" with "strong visibility within the strategic plan, with concrete objectives and timing"[1]. The same principle applies to ocean accounting: embedding the commitment in official planning instruments signals institutional seriousness, creates accountability, and facilitates resource allocation.

3.6.2 Elements of a country commitment

A country commitment framework for ocean accounting should address the following elements:

Political endorsement: A statement of commitment from an appropriate authority (minister, chief statistician, or inter-ministerial body) that establishes ocean accounting as a national priority. This endorsement should reference the policy objectives that ocean accounts will support and the institutional arrangements that will be put in place.

Strategic plan integration: Inclusion of ocean accounting in the National Strategy for the Development of Statistics (NSDS) or equivalent national statistical plan. This ensures that ocean accounting is planned, resourced, and monitored alongside other statistical programmes. For countries that do not yet have an NSDS, ocean accounting can be included in the strategic plan of the lead agency.

Resource commitment: An explicit commitment to allocate resources (financial, human, and infrastructure) to ocean accounting. The commitment should distinguish between initial investment (which may draw on international technical assistance) and recurrent production costs (which should transition to domestic funding). A realistic budget estimate, based on the requirements matrix in Section 3.3, should accompany the resource commitment.

Timeline and milestones: A time-bound implementation plan with clear milestones aligned to the phased pathway in Section 3.2. Milestones should be specific, measurable, and linked to deliverables (e.g., "Complete data inventory by Q2 2027," "Publish first ecosystem extent account by Q4 2028").

Accountability mechanism: A designated person or body responsible for monitoring implementation progress and reporting to the coordination mechanism and to senior leadership. Regular progress reporting (at least annually) ensures that implementation stays on track and that emerging challenges are addressed promptly.
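The timeline and accountability elements above lend themselves to simple, transparent tracking. As a purely illustrative sketch (the milestone names and dates below are hypothetical, not drawn from this Circular), a coordination secretariat could record time-bound milestones and flag overdue deliverables for the annual progress report:

```python
from dataclasses import dataclass

@dataclass
class Milestone:
    deliverable: str   # specific, measurable output
    due: str           # target quarter in "YYYY-Qn" form
    done: bool = False

def overdue(milestones, current_quarter):
    """Return deliverables past their target quarter and not yet complete.

    Quarters written as "YYYY-Qn" sort correctly as plain strings,
    so a lexicographic comparison suffices.
    """
    return [m.deliverable for m in milestones
            if not m.done and m.due < current_quarter]

# Hypothetical implementation plan
plan = [
    Milestone("Complete data inventory", "2027-Q2", done=True),
    Milestone("Sign inter-agency data-sharing MoU", "2027-Q4"),
    Milestone("Publish first ecosystem extent account", "2028-Q4"),
]

print(overdue(plan, "2028-Q2"))
```

A listing like this is deliberately minimal: the point is that milestones are specific, dated, and checkable, so the accountability body can report slippage rather than impressions.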

3.6.3 Sustaining commitment through demonstrated value

Political and institutional commitment is more easily sustained when ocean accounts demonstrably contribute to policy and decision-making. Countries should therefore plan early wins -- policy-relevant outputs that can be delivered within the first 12--18 months of the implementation programme. Examples of early wins include:

These early products serve a dual purpose: they deliver immediate value to policy users, and they build the case for continued investment in the more comprehensive and analytically powerful accounts of later phases.

Countries should also invest in communication and outreach to ensure that account outputs reach their intended audiences. The SEEA EA emphasises that indicators derived from accounts are useful tools for "tracking progress," "mainstreaming" environmental issues into public policy, and "promoting the sustainable use of ecosystems and ecosystem services" -- but these benefits are realised only when outputs are communicated effectively to decision-makers and the public (SEEA EA para. 14.10)[4].

4. Limitations and Considerations

4.1 Context sensitivity

The readiness assessment framework and phased pathway presented in this Circular provide a general model that must be adapted to country-specific circumstances. Small island developing states, for example, may face distinctive challenges related to limited institutional capacity and data infrastructure but may also benefit from relatively simpler ocean geographies and strong policy demand. Large maritime nations may have abundant data and institutional resources but face coordination challenges across multiple agencies and levels of government. The framework should be applied with sufficient flexibility to accommodate these differences.

4.2 Assessment subjectivity

Self-assessment inherently involves subjective judgement. Scores may be influenced by the composition of the assessment team, institutional incentives, or incomplete knowledge of data holdings across agencies. Countries can mitigate this risk by including diverse perspectives in the assessment team, requiring evidence to support scores, and inviting external review (e.g., through GOAP peer review or technical assistance missions).
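The mitigations above can be made operational in a very simple way. The sketch below is illustrative only (the pillar names follow the four pillars in Section 2, but the scores and the disagreement threshold are hypothetical): it averages scores from several assessors and flags any pillar where assessor spread is large, signalling that the score should be backed by evidence or referred for external review.

```python
from statistics import mean, stdev

# Hypothetical 1-5 readiness scores from a four-person assessment team.
scores = {
    "Institutional capacity": [3, 4, 2, 3],
    "Data infrastructure":    [2, 2, 3, 2],
    "Human skills":           [4, 1, 4, 2],
    "Policy demand":          [5, 4, 5, 5],
}

def summarise(scores, disagreement_threshold=1.0):
    """Mean score per pillar, plus a review flag where the sample
    standard deviation across assessors exceeds the threshold."""
    return {
        pillar: {
            "mean": round(mean(vals), 2),
            "review": stdev(vals) > disagreement_threshold,
        }
        for pillar, vals in scores.items()
    }

for pillar, s in summarise(scores).items():
    flag = "  <- high disagreement: review evidence" if s["review"] else ""
    print(f"{pillar}: {s['mean']}{flag}")
```

The design choice here is to surface disagreement rather than hide it in an average: a pillar where assessors diverge widely is exactly where institutional incentives or incomplete knowledge of data holdings may be distorting the self-assessment.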

4.3 Dynamic nature of readiness

Readiness is not a fixed state. Changes in government priorities, staffing, funding, or data availability can shift readiness levels in either direction. The assessment should therefore be treated as a living instrument that is updated regularly and used to inform adaptive management of the implementation programme, rather than a one-time diagnostic.

5. Acknowledgements

Authors: [To be confirmed]

Reviewers: [To be confirmed]

This Circular draws on the institutional and methodological frameworks established by the System of Environmental-Economic Accounting -- Ecosystem Accounting (SEEA EA), the Strategy for Implementing 2025 SNA and BPM7, the UN National Quality Assurance Frameworks Manual, the Global Statistical Geospatial Framework (2nd edition), and NSDS guidelines. The guidance also benefits from the practical experience of GOAP partner countries in implementing ocean accounts across diverse institutional contexts.

6. References


  1. Inter-secretariat Working Group on National Accounts. (2025). Strategy for Implementing 2025 SNA and BPM7. Endorsed by the UN Statistical Commission at its 56th Session, March 2025.

  2. Partnership in Statistics for Development in the 21st Century (PARIS21). (2018). Guidelines for the Preparation of a National Strategy for the Development of Statistics (NSDS). PARIS21 Secretariat.

  3. United Nations. (2019). United Nations National Quality Assurance Frameworks Manual for Official Statistics. Studies in Methods, Series M, No. 100. New York: United Nations.

  4. United Nations. (2021). System of Environmental-Economic Accounting -- Ecosystem Accounting. New York: United Nations. See especially Chapter 14 (Indicators and combined presentations) and Chapter 15 (Research agenda, topics on methods and implementation).

  5. United Nations. (2025). The Global Statistical Geospatial Framework, 2nd edition. UN Committee of Experts on Global Geospatial Information Management (UN-GGIM).

  6. United Nations. (2021). System of Environmental-Economic Accounting -- Ecosystem Accounting: Implementation Guide. New York: United Nations.