Strategic Whitepaper

CALIFORNIA'S AI OPPORTUNITY

Prepared by SAP for the California Council on Science and Technology
AI Applications and Innovation Showcase • March 4, 2026

Interoperability, Enterprise Planning, and the Logistics Network Government Didn't Know It Had
Jason Shearer, Chief Architect & AI Lead, U.S. Public Services
Business Transformation & Architecture, North America


Executive Summary

California's public agencies face a convergence of mandates. All share a common dependency: interoperability.

GASB 104 takes effect this fiscal year. It requires every state and local government to disclose capital asset categories they never had to break out before. GASB 96 already put every IT subscription on the balance sheet as a right-to-use asset. The federal SAMOSA Act passed the House in December 2025. It mandates software asset inventories across all agencies. NIEM, the National Information Exchange Model, is moving from policy preference toward procurement requirement for federally funded programs in emergency management, homeland security, and defense. California's own Envision 2026 commits the state to interoperability, digital identity, and procurement change.

You cannot meet these mandates in isolation. Asset disclosure requires cost allocation. Cost allocation requires integrated planning. Planning requires data that moves across organizational boundaries. Data exchange requires identity verification. Identity verification requires open standards. Each mandate pulls on the same thread. Systems, organizations, and data must work together across boundaries of scale, maturity, and jurisdiction.

This paper makes a specific claim. The infrastructure to meet these mandates exists. California agencies are already running it. What they have not done is connect it.

Most California state, local, and education (SLED) customers already run an enterprise platform for finance, HR, and procurement. Many already use Ariba, which puts them adjacent to a business network connecting utilities, food distributors, logistics providers, and warehouse operators: over 15 million trading partners. Through it they can participate in logistics orchestration, emergency supply staging, and cross-agency coordination without buying dedicated systems. The network is there. It has not been turned on.

Three moves unlock it:

1. Connect the network. Activate Business Network logistics for coordination across agencies, utilities, and suppliers.

2. Add planning. Extend financial planning into integrated planning across procurement, logistics, assets, and workforce.

3. Build prototypes. Deploy pilots that show edge AI, cost transparency, and citizen services on infrastructure that already exists.

The Department of Defense's Task Force Lima validated this approach in December 2024. It was the largest military generative AI experiment in history. Its finding: roughly half of viable AI use cases lived in traditional enterprise systems, not frontier model applications. Classification. Routing. Anomaly detection. Planning optimization. The capabilities described in this paper. The aftermath tells the real story. The Chief Digital and Artificial Intelligence Office (CDAO) lost nearly 60% of its staff. The $15B Advana contract was cancelled. 18F dissolved. The lesson: build on open, federated foundations that survive political transitions, not centralized platforms.

Seven pilots are proposed. Each can start within six months. Each addresses a real California problem.

About This Showcase

This paper was prepared for the California Council on Science and Technology's AI Applications and Innovation Showcase on March 4, 2026.

CCST builds AI fluency among California's legislative staff. This paper supports that mission. It presents working systems already deployed in California. It proposes seven pilots that address real compliance mandates. They teach how enterprise AI works in practice.

The Loot Locker pilot shows this approach. It fits in a community center. It teaches edge AI, business networks, digital identity, and circular economy in a single deployment. Legislative staff can scan a donated coat. Watch AI classify it. See routing decisions. Understand what enterprise AI does and does not do. That is fluency through experience.

This is not vendor advocacy. This paper was prepared by SAP. The infrastructure runs on open source and standard protocols. Cloud Foundry. Eclipse Tractus-X. W3C Verifiable Credentials. NIEM. California agencies are already running much of it. This is not a procurement proposal. It is a map of what sits adjacent to what California already owns.

The question is not whether California needs this capability. The question is whether California realizes it already has it.

Chapter 1: The Interoperability Problem

The Coordinator's Problem

Carlos works for a California water utility. At 3 AM a main breaks. He photographs the damaged fitting. An AI model classifies the failure mode. Checks inventory. Generates a work order. The work routes to the nearest crew with the right equipment. Carlos never puts his coffee down.

Fifteen miles away, the county emergency operations center cannot see what Carlos sees. They cannot access the utility's asset data, crew availability, or mutual aid capacity. The systems are incompatible.

During the 2025 Los Angeles fires, emergency coordinators discovered this gap at scale. Critical infrastructure data existed in utility systems. It could not be shared with agencies coordinating response.

This is not a technology problem. Carlos's utility runs strong enterprise systems. The county runs its own platforms. The gap is not capability. It is connection.

The systems were never designed to work across organizational boundaries. Investment at the operational level has not reached the coordination layer.

Not because the technology doesn't exist, but because it has never been activated for cross-boundary coordination.

Across infrastructure, housing, and public services, the same structural problem repeats. Large organizations run advanced digital systems. Smaller organizations, which often serve the most vulnerable populations, have minimal or no digital capability. During emergencies, this gap becomes a coordination failure.

This is not specific to California. It appears wherever infrastructure responsibility is distributed across organizations of different size and digital maturity. But California, with its wildfire exposure, seismic risk, housing pressure, and utility complexity, faces this pattern at a scale few other jurisdictions match.

California's Moment

Three priorities converge now. Wildfire resilience. Housing crisis response. Digital government modernization.

After the 2025 Los Angeles fires, the state deployed its first statewide LiDAR mapping effort. It expanded the world's largest aerial firefighting fleet. It nearly doubled CAL FIRE's protection budget from $2 billion to $3.8 billion. In February 2026, a new CPUC President was named with a clear mandate for wildfire safety spending, grid hardening, and utility accountability.

Beneath this investment lies a structural coordination gap. During the fires, emergency coordination entities discovered that critical spatial data about infrastructure assets could not be shared between small utilities and large providers. The systems were never designed for cross-organizational work. Investment at the top had not reached the coordination layer at the bottom.

The housing challenge follows the same pattern. More than 540 jurisdictions submit Annual Progress Reports through fragmented, incompatible systems. The citizen services challenge follows it too. One in five Californians lacks reliable internet access, while digital services assume broadband connectivity.

In each case, the barrier is not a lack of technology at any single organization. It is the absence of an interoperability layer that allows technology to work across organizational boundaries.

California's AI Legislative Leadership

California has established itself as the nation's leading AI regulator. In the first half of the 2025–2026 session, lawmakers introduced more than 33 AI and privacy bills.

Legislation | Description | Significance
SB 53 | Transparency in Frontier AI Act | First state law regulating frontier AI model developers
AB 316 | AI Civil Liability | Bars AI autonomy as a defense in civil actions
AB 853 | AI Transparency Act (Amended) | Expands content provenance requirements; effective August 2026

The 2026 session is expected to advance bills targeting employment AI systems, biometric data protections, and automated decision-making in high-stakes contexts.

This Has Been Solved Before

California is not the first jurisdiction to face this challenge. Europe has built, tested, and deployed a working model. The pattern matters because the same architecture applies directly to California's problem.

The Catena-X initiative was founded by BMW, Deutsche Telekom, Bosch, SAP, Siemens, and others. It created the first large-scale federated data space for industry. Its open-source implementation, Eclipse Tractus-X, enables secure, sovereign data exchange between organizations that do not share common systems. Three principles matter here:

Data sovereignty. Each participant retains control of their data. Small organizations share only what is necessary for specific coordination scenarios, nothing more. Participants verify each other's identity through self-sovereign credentials rather than a central broker. Chapter 2 describes the technical mechanism.

Federated integration. The Eclipse Dataspace Connector enables exchange regardless of internal platform. An organization that still works on paper can participate alongside one running an advanced GIS.

Open standards. Built on the Gaia-X trust framework and International Data Spaces Association principles. No single vendor controls the infrastructure.
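The data-sovereignty principle can be sketched concretely. In the minimal sketch below (all names, fields, and figures invented; this illustrates the access pattern a dataspace connector enforces, not the Eclipse Dataspace Connector's actual API), a utility releases only the fields named in an explicit sharing agreement, and everything else stays home:

```python
# Illustrative sketch of the data-sovereignty principle: a provider
# releases only the fields named in an explicit sharing agreement.
# All names, fields, and figures are invented; this shows the access
# pattern a dataspace connector enforces, not the Eclipse Dataspace
# Connector's actual API.

AGREEMENTS = {
    # (consumer, scenario) -> fields the provider has agreed to share
    ("county-eoc", "mutual-aid"): {"asset_id", "condition", "crew_available"},
}

def share(record: dict, consumer: str, scenario: str) -> dict:
    """Return only the fields covered by an explicit agreement."""
    allowed = AGREEMENTS.get((consumer, scenario))
    if allowed is None:
        raise PermissionError(f"no agreement for {consumer}/{scenario}")
    return {k: v for k, v in record.items() if k in allowed}

utility_record = {
    "asset_id": "PUMP-0042",
    "condition": "degraded",
    "crew_available": True,
    "maintenance_cost": 18_500,  # never leaves the utility: not in the agreement
}

print(share(utility_record, "county-eoc", "mutual-aid"))
```

The point is structural: the small utility's internal record stays whole, only the agreed subset crosses the boundary, and an unlisted consumer is refused outright.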

In parallel, the European Commission's Data Space for Smart and Sustainable Cities and Communities (DS4SSCC) is deploying federated data spaces for municipal governments. The initiative is led by the Open & Agile Smart Cities network (OASC), Eurocities, and the FIWARE Foundation. FIWARE provides the operational data models and NGSI-LD API schemas that define how mobility, energy, environment, and public safety data is structured and exchanged within these spaces. FIWARE Smart Data Models are to the data space what NIEM vocabularies are to U.S. government exchange: the shared language that makes interoperability work.
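For readers who want to see what "shared language" means in practice, here is a minimal sketch of an NGSI-LD-style entity. The id/type/Property shape follows the NGSI-LD pattern; the specific entity and attribute names are illustrative, not copied from a published Smart Data Model:

```python
import json

# Minimal NGSI-LD-style entity of the kind FIWARE-based data spaces
# exchange. The id/type/Property shape follows the NGSI-LD pattern;
# the entity and attribute names are illustrative, not copied from a
# published Smart Data Model.
entity = {
    "id": "urn:ngsi-ld:AirQualityObserved:LA:sensor-17",
    "type": "AirQualityObserved",
    "pm25": {"type": "Property", "value": 34.2, "unitCode": "GQ"},
    "dateObserved": {"type": "Property", "value": "2026-01-12T08:30:00Z"},
    "@context": [
        "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld"
    ],
}

payload = json.dumps(entity, indent=2)
print(payload)
```

Because every participant wraps attributes the same way and resolves names through a shared context, a city's consumer application can parse a reading from any provider without bilateral format negotiation.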

DS4SSCC-DEP (the deployment phase) runs through September 2026 with an €18 million budget. Three rounds of open calls have selected pilots across Europe. The sectors map directly to California's challenges. Cross-border mobility and pollution management between Szeged and Timișoara. Positive energy districts in Spain and Bulgaria, where neighborhoods produce more energy than they consume. Participatory urban planning across Eindhoven, Oulu, and Álava. Each pilot validates the same blueprint. Resource-constrained local governments participating in data ecosystems alongside well-resourced organizations, without adopting expensive new platforms.

The U.S. Parallel: NIEM and the Coming Mandate

The United States has its own interoperability standard. Its trajectory toward mandate is accelerating.

The National Information Exchange Model (NIEM) defines a common vocabulary and exchange format for sharing data across jurisdictions and agencies. It originated in 2005 as a partnership between the Departments of Justice and Homeland Security. It now spans DoJ, DHS, DoD, and HHS. In 2013, DoD adopted a "NIEM first" policy requiring NIEM-conformant data exchange for defense information sharing. NIEM is cited as a key enabler in the Joint All-Domain Command and Control (JADC2) Reference Architecture. In 2023, NIEM transitioned to NIEMOpen under OASIS, formalizing it as an international standard eligible for inclusion in procurement requirements worldwide.

NIEM already defines domains for emergency management, infrastructure protection, justice, human services, and military operations. Its Information Exchange Package Documentation (IEPD) format specifies how data elements are structured, validated, and exchanged between systems that have never communicated before. This ships today. Over 7,000 DoD staff use NIEM-based systems through the Warfighter Mission Area Architecture Federation and Integration Portal. Law enforcement uses NIEM through LInX, one of the largest information sharing systems in the country. The Maritime Information Sharing Environment (MISE) uses NIEM across 12 federal and defense partners.
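The IEPD idea, namespaced vocabularies assembled into a documented exchange, can be sketched in a few lines. The namespace URI and element names below are invented to illustrate the pattern; a real exchange validates against published NIEM schemas:

```python
import xml.etree.ElementTree as ET

# Sketch of the NIEM exchange pattern: shared vocabularies live in XML
# namespaces, and an IEPD documents which elements an exchange uses.
# The namespace URI and element names here are invented for illustration;
# a real exchange validates against published NIEM schemas.
NC = "http://example.org/niem-core-illustrative"
ET.register_namespace("nc", NC)

incident = ET.Element(f"{{{NC}}}Incident")
location = ET.SubElement(incident, f"{{{NC}}}IncidentLocation")
ET.SubElement(location, f"{{{NC}}}LocationCityName").text = "Altadena"
ET.SubElement(incident, f"{{{NC}}}IncidentCategoryText").text = "WaterMainBreak"

xml_text = ET.tostring(incident, encoding="unicode")
print(xml_text)
```

Two systems that have never communicated can both produce and consume this payload because the vocabulary, not the systems, defines the contract.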

For California, the NIEM trajectory creates both an obligation and an opportunity.

The obligation: as NIEM compliance moves from policy preference to procurement requirement across federal programs, California agencies that receive federal emergency management funding, homeland security grants, or defense-related contracts will need NIEM-conformant data exchange. The state's utility coordination infrastructure, its emergency response data, and its cross-jurisdictional service delivery will all face this requirement.

The opportunity: California's largest utilities are already SAP customers. Southern California Edison, SDG&E, and the Department of Water Resources all run SAP for core operations. The data that needs to become NIEM-conformant (outage reports, asset conditions, crew availability, mutual aid requests) already exists in these systems. A data space architecture built on SAP's Data Space Integration and Decentralized Identity Verification can produce NIEM-conformant exchange packages from enterprise data that is already being captured. The small utilities that cannot afford dedicated systems get NIEM compliance through the shared data layer, the same way they get logistics coordination through the Business Network.

Tractus-X and NIEM solve the same problem from different starting points. Europe started with industry consortia and built toward government. The U.S. started with government and is building toward open standards. The architectural patterns converge: federated data exchange, sovereign identity verification, standard vocabularies, no central broker. SAP's position in both ecosystems is the bridge.

Chapter 2: Connect. The Business Network

When FEMA coordinates disaster response with a California county, the exchange runs on phone calls, email, and PDF attachments. When CalOES stages emergency supplies, visibility into warehouse capacity depends on personal relationships, not systems.

The coordination layer that could close this gap already exists. This chapter describes it: the Business Network itself, the transformation toolchain around it, the open-source foundations underneath it, and the planning discipline they enable together.

The integration layer underneath this is SAP Integration Suite. Gartner has named it a Magic Quadrant Leader for iPaaS for five consecutive years, scoring highest in Ability to Execute. Event-driven architectures, API management, and B2B integration run on one service. This is the same platform that powers the Tractus-X data spaces running in production across Europe.

The Transformation Toolchain

A legislator asking "what evidence supports this?" deserves a direct answer. Each capability referenced in this paper has been independently evaluated.

Capability | Recognition | What It Does
Integration Suite | Gartner iPaaS Leader, 5 consecutive years (highest in Ability to Execute) | Connects systems across organizational boundaries: the plumbing of interoperability
SAP Signavio | Gartner Process Mining Leader, 3 consecutive years | Maps how processes actually run vs. how they should, exposing bottlenecks and compliance gaps
SAP LeanIX | Gartner EA Tools Leader, 5 consecutive years (highest in both Vision and Execution; first SAP solution to achieve this) | Maps every application, its cost, its owner, and its retirement date: the x-ray of the IT landscape
WalkMe | Gartner Customers' Choice for Digital Adoption Platforms; only vendor meeting all Gartner-evaluated capabilities | Guides users through new systems in real time: the difference between software deployed and software used
Analytics Cloud | BARC Market Leader for integrated BI and analytics; broadest planning and analytics capability in a single platform | Connects financial planning, operational analytics, and predictive models: the xP&A engine
LeanIX + Signavio + WalkMe | QKS SPARK Matrix Digital Twin of an Organization Leader, 5 consecutive years (highest in Technology Excellence and Customer Impact) | A live, queryable model of how the enterprise operates: its processes, applications, people, and data

These capabilities ship today. Analysts have tracked them for years. When this paper describes a digital twin, integrated planning, or process optimization, it describes capabilities that independent analysts have evaluated and ranked against every competitor in the market.

The Open Source Foundation

A Decade of Building

A legislator reviewing this paper will ask a reasonable question: does this lock California into a single vendor? The answer matters. The state's Envision 2026 strategy requires open standards and procurement change. The answer is documented across a decade of open source contributions that form the foundation of every interoperability capability described here.

Year | Initiative | Significance
2014–2018 | Cloud Foundry Foundation | Founding platinum member. Open-source cloud application platform now used across government and enterprise.
2018 | Kyma Runtime | Open-source Kubernetes-based extensibility platform, donated to the community. Now the foundation for SAP BTP extensions.
2019 | Eclipse Foundation membership | Open governance for enterprise software.
2021 | Catena-X / Tractus-X | Co-founded the automotive data space. Eclipse Tractus-X released as open-source reference implementation.
2022–2024 | ABAP Cloud SDK, CAP Framework | Open developer tooling on GitHub for building enterprise applications.
2023–2026 | DS4SSCC / FIWARE Foundation | Co-participant in the Data Space for Smart and Sustainable Cities and Communities. FIWARE provides NGSI-LD API schemas and Smart Data Models for cross-sector urban data exchange.
2025 | Gaia-X and IPCEI-CIS alignment | Participation in European sovereign cloud and digital identity infrastructure.
2025 | Decentralized Identity Verification (DIV) | BTP service for self-sovereign identity: verifiable credentials, DID:Web, Decentralized Claims Protocol. In production at Catena-X.

SAP has spent a decade building connective tissue between enterprise systems and open ecosystems. That connective tissue is what California's interoperability challenge requires.

The Task Force Lima Validation

In December 2024, the Department of Defense sunsetted Task Force Lima. It was the largest military generative AI experiment in history. Over twelve months, the task force analyzed hundreds of AI workflows. They categorized them into fifteen areas across two categories: warfighting functions and enterprise management functions.

The finding that matters for this paper: roughly half of those use cases lived in enterprise management. Financial management. Healthcare information management. Logistics. Human resources. Not frontier model problems. Enterprise application problems. Classification. Routing. Anomaly detection. Planning optimization. The capabilities described in Chapters 3 through 5.

Task Force Lima drew a distinction between what it called Mode 1 experimentation and Mode 2. Mode 1: exploring frontier capabilities like large language models. Mode 2: applying AI to existing enterprise workflows where data, processes, and governance already exist. Mode 2 produced faster, more measurable results because it ran on established infrastructure. The insight was not that frontier AI lacks value. It was that organizations underinvest in the enterprise AI that delivers returns now.

The institutional trajectory that followed validates the architectural choices in this paper more powerfully than any technical argument could.

Task Force Lima's parent organization, the Chief Digital and Artificial Intelligence Office, has since lost nearly 60 percent of its staff. The $15 billion contract to scale its Advana analytics platform was cancelled. The Advana program officer, Alex O'Toole, departed for Databricks. CDAO itself has been reorganized under the Under Secretary of Defense for Research and Engineering, with a 120-day review deadline to determine the future of both Advana and the Maven AI program. Advana is being split into three pieces. Key leaders dispersed. Stuart Wagner became the Navy's Chief Data and AI Officer. Radha Plumb, who led CDAO through Lima's sunset, departed at the start of the new administration.

Meanwhile, 18F has been dissolved. 18F was the federal digital services team that built Cloud.gov, the government's platform-as-a-service. Cloud.gov ran on Cloud Foundry, the same open-source technology SAP has contributed to as a founding platinum member since 2014. The platform persists. The institutional home does not.

The pattern could not be clearer. Centralized platforms built within institutional structures are vulnerable to reorganization, budget cuts, and political transition. Federated, open-source infrastructure survives because no single entity owns it. The Cloud Foundry Foundation that SAP helped build continues independent of any agency's budget cycle. The Tractus-X data space architecture has no single node to decommission. The W3C standards underlying Decentralized Identity Verification do not require Senate confirmation to remain operational.

California should learn from this sequence. The federal government built excellent AI and analytics infrastructure. It staffed that infrastructure with talented people. A political transition dispersed both. The lesson is not that the work was wrong. The lesson is that the architecture was fragile. Build on open foundations. Federate the infrastructure. Make it survive the next reorganization. The next reorganization is always coming.

Sovereign Cloud and Data Sovereignty

For government customers, where data lives and who controls it is not negotiable. SAP's sovereign cloud positioning includes dedicated government cloud instances, FedRAMP authorization pathways, and architectural alignment with Gaia-X data sovereignty principles.

The federated data space model described in Chapter 1 does not require centralization. Each participating organization retains sovereignty over its data. They share only what is needed under explicit, auditable agreements. This model supports local autonomy. Jurisdictions retain control of their systems while participating in shared standards.

Why This Matters for SLED

State, local, and education (SLED) customers interact with SAP in the ERP room. Finance. HR. Procurement. They have not been shown the open source integration layer, the data space architecture, or the business network. This paper aims to change that. Not by proposing new procurement. By showing what already sits adjacent to what they own.

The Open Source Way

SAP's approach to open source is documented in the SAP Open Source Way podcast. It explores how enterprise software companies build sustainable open source communities while maintaining commercial viability. The podcast features conversations with maintainers, contributors, and ecosystem partners across SAP's open source portfolio, from Cloud Foundry to Eclipse Tractus-X.

This matters for California. The infrastructure proposed in this paper rests on these foundations. When a legislator asks whether this architecture will survive vendor transitions or budget cycles, the answer is in the governance models these communities have built. The same Cloud Foundry technology powered the federal government's Cloud.gov platform, built by 18F. When 18F was dissolved and Cloud.gov's future became uncertain, the open-source foundation continued. No single institution owned it.

The Logistics Partner Government Didn't Know It Had

The coordination failures that open this chapter (FEMA-county exchanges running on phone calls, email, and PDF attachments; supply staging that depends on personal relationships rather than system integration) persist because no shared coordination layer connects the parties. The interoperability mandates described in Chapter 1 exist for the same reason.

That layer does not need to be built. SAP Business Network connects over 15 million trading partners across logistics, procurement, and asset management. A California city using Ariba for procurement is already connected to it, and likely does not know that the same network supports:

Freight Collaboration. Visibility into shipments, carrier performance, and transportation capacity across logistics providers. The city does not need to own or operate a transportation management system.

Asset Collaboration. Shared visibility into asset condition, maintenance schedules, and lifecycle data across organizations that co-manage infrastructure. Utilities. Public works. Contractors.

Supply Chain Collaboration. Real-time coordination with suppliers, distributors, and third-party logistics providers for everything from emergency supplies to school food distribution.

SLED customers do not need to buy TM (Transportation Management) or EWM (Extended Warehouse Management) to participate in logistics orchestration. They can connect through the Business Network to utilities, food distributors, 3PLs, and warehouse operators who are already there. The coordination layer exists. It has not been activated for government use cases.

Example: A county emergency management office coordinates supply staging after a wildfire. Today, the process runs on phone calls and spreadsheets. Through the Business Network, the office sees available warehouse capacity from regional 3PLs. Routes freight through carriers already on the network. Tracks delivery in real time. No new logistics system is required. The county connects through its existing Ariba instance.

For organizations that do need physical warehouse capability (a regional distribution center, a school district food hub, a disaster relief staging area) a targeted EWM deployment provides full warehouse management. But many coordination use cases require only the network, not the full stack.

Extended Planning and Analysis (xP&A) for the Public Sector

xP&A replaces siloed planning with integrated planning. Instead of finance planning in one system, supply chain in another, and workforce in a third, organizations plan across all domains in a single environment. Financial forecasts, procurement spend, logistics capacity, asset maintenance, and workforce availability inform each other in real time.

In enterprise, xP&A has produced measurable results. In state and local government, the discipline barely exists. The ERP conversations in government are still about core finance and HCM. No one has walked SLED customers from their financial planning into integrated planning across logistics, assets, and services.

When a city sees freight collaboration data next to its financial plan, next to procurement spend, next to asset maintenance schedules, that is xP&A applied to public infrastructure. Budget cycles compress. Resource allocation improves.

Planning Domain | Traditional SLED Approach | xP&A Integrated Approach
Financial Planning | Annual budget cycle in isolation | Continuous planning informed by logistics, procurement, and asset data
Procurement | Purchase orders disconnected from supply chain visibility | Procurement linked to supplier capacity and delivery forecasts
Logistics | Not managed; delegated to individual departments | Network-connected freight and supply chain visibility feeding the financial plan
Asset Management | Reactive maintenance tracked in spreadsheets | Predictive maintenance integrated with capital planning
Workforce | Headcount budgeting disconnected from service demand | Workforce planning shaped by service volumes, seasonal patterns, and field data
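The mechanics behind integrated planning are plain: planning domains share keys so each plan can react to the others. A toy sketch, with all programs and figures invented:

```python
# Toy sketch of the xP&A idea: planning domains share keys so each
# plan can inform the others. All programs and figures are invented.
financial_plan = {"fleet-maintenance": 1_200_000, "food-distribution": 800_000}
asset_forecast = {"fleet-maintenance": 1_450_000}   # predictive maintenance model
service_demand = {"food-distribution": 1.15}        # seasonal demand multiplier

def integrated_plan(financial, assets, demand):
    plan = {}
    for program, budget in financial.items():
        # Asset forecasts override static budgets where available...
        adjusted = assets.get(program, budget)
        # ...and service-demand signals scale the result.
        plan[program] = round(adjusted * demand.get(program, 1.0))
    return plan

print(integrated_plan(financial_plan, asset_forecast, service_demand))
```

The fleet budget rises to match the maintenance forecast, and food distribution scales with seasonal demand: operational data reshaping the financial plan instead of sitting in a separate system.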

Cost Transparency: The Foundation xP&A Requires

Integrated planning without cost transparency is planning without a foundation. A city can connect its financial plan to procurement and logistics data, but if it cannot show where shared costs land, the planning produces numbers no one trusts.

Government budgets are full of pooled costs. Facility overhead. IT infrastructure. Shared administrative services. Fleet maintenance. These costs are real. They are also, in most agencies, invisible at the program level. A legislator asks what it costs to deliver wildfire coordination, or school food distribution, or homeless outreach, fully burdened, including the infrastructure underneath. The honest answer today, in most jurisdictions: we do not know.

SAP Profitability and Performance Management (PaPM) is an allocation engine built for this problem. It takes pooled costs and distributes them through multi-step, auditable allocation models. Driver-based spreads. Hierarchical waterfalls. Weighted distributions. Iterative cycles. In defense contracting, PaPM produces the indirect rate calculations (fringe, overhead, G&A) that survive a DCAA audit. In state government, the same engine can produce the fully burdened cost of any service, program, or jurisdiction.

The allocation models are not simple division. PaPM handles sequential allocation steps where the output of one pool feeds the input of the next. It runs what-if simulations before posting actuals. It pulls from multiple data sources. S/4HANA for financial actuals. The Business Network for logistics costs. EPPM for project costs. The result is a fully attributed cost model where every dollar of shared infrastructure is assigned to the programs it supports, through logic that can be examined, questioned, and defended.
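A sketch makes the waterfall concrete. The two-step, driver-based allocation below illustrates the pattern PaPM models (a facilities pool allocates first, and the IT pool's base includes what it received); all cost centers, drivers, and figures are invented, and this is not PaPM's actual modeling interface:

```python
# Sketch of a two-step, driver-based allocation waterfall of the kind
# PaPM models: the facilities pool allocates first, and the IT pool's
# allocation base includes what it received from facilities.
# All cost centers, drivers, and figures are invented.

def allocate(pool_cost, drivers):
    """Spread a pooled cost across receivers in proportion to a driver."""
    total = sum(drivers.values())
    return {k: pool_cost * v / total for k, v in drivers.items()}

# Step 1: facilities cost spreads by square footage (IT also occupies space).
facilities = allocate(300_000, {"it": 1_000, "wildfire_coord": 3_000, "food_hub": 2_000})

# Step 2: IT's direct cost PLUS its facilities share spreads by support tickets.
it_pool = 500_000 + facilities["it"]
it_alloc = allocate(it_pool, {"wildfire_coord": 60, "food_hub": 40})

fully_burdened = {
    p: facilities[p] + it_alloc[p] for p in ("wildfire_coord", "food_hub")
}
print(fully_burdened)
```

Every dollar of both pools lands on a program, each step has an explicit driver, and the chain can be examined end to end. That is the auditability the text describes, reduced to its skeleton.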

The interface is a CAP application built on SAP's open-source Cloud Application Programming model. CAP runs on BTP. The application surfaces PaPM's allocation results as a clean, queryable cost transparency dashboard. An agency head drills from a total budget down to allocated cost per service. A legislative budget analyst compares fully burdened delivery costs across programs. A program manager sees the real cost of the infrastructure underneath their operation, not just the direct charges on their ledger.

This connects the xP&A story to accountability. Integrated planning tells you where resources should go. Cost transparency tells you where they went, and what they cost. Together, they give government something it has rarely had: a planning system backed by auditable truth about the cost of delivering services.

For the CCST audience, cost transparency also addresses a political reality. Every dollar of state investment in AI, interoperability, or coordination infrastructure will face scrutiny. PaPM provides the mechanism to show, line by line, what that investment produced and what it cost. That is not a technical capability. That is a governance requirement.

The Compliance Pressure That Makes This Urgent

Two accounting standards converge to make capital asset transparency a mandate, not a preference.

GASB 96 (effective 2022) reclassified subscription-based IT arrangements as right-to-use assets on the balance sheet. Every cloud subscription, every SaaS contract, every managed service agreement a state agency holds is now a capital asset that must be tracked, amortized, and disclosed. GASB 104 (effective for fiscal years beginning after June 15, 2025) goes further. It requires state and local governments to separately disclose these right-to-use assets from owned capital assets in their financial statement notes. Assets held for sale get their own disclosure requirements. The Governmental Accounting Standards Board is also developing new infrastructure asset reporting requirements that would mandate disclosure of assets more than 80% depreciated.

At the federal level, the SAMOSA Act (Strengthening Agency Management and Oversight of Software Assets) passed the House in December 2025. It awaits Senate action. It mandates software asset inventories for all federal agencies within 18 months of enactment, with independent assessments of license management practices. The Federal CIO has already directed agencies to begin inventorying their top software entitlements. A Senate committee estimated annual savings of $750 million from eliminating waste the inventories would reveal.

Federal mandates flow downhill. States that receive federal funding face audit requirements that align with federal standards. California agencies already subject to GASB 96 and 104 will face increasing pressure to show not just that they hold these assets, but what each asset costs to operate, maintain, and retire.

PaPM closes this loop. The asset master data already exists in SAP for agencies running S/4HANA. GASB 104 requires that assets be categorized and disclosed by type. PaPM allocates the shared costs (facility, IT infrastructure, administrative overhead) that make each asset category's total cost of ownership visible. The CAP dashboard surfaces the result in a format that satisfies both the accounting disclosure and the legislative analyst asking the harder question: what did we get for this money?

A state that deploys PaPM-based cost transparency is not just building a planning tool. It is building a compliance infrastructure that meets GASB disclosure requirements, prepares for SAMOSA-aligned state mandates, and produces the auditable cost evidence that legislators and auditors will increasingly demand.

Digital Identity and Citizen Services

The Same Pattern, Applied to People

The interoperability challenge appears again in citizen services. A California resident displaced by wildfires needs DMV replacement, emergency benefits, utility transfers, and housing assistance. Each requires separate identity verification through separate systems. For the one in five Californians who lack reliable internet, these barriers compound.

The California Identity Gateway, part of the Envision 2026 strategy, addresses identity fragmentation. But identity is not just a citizen services problem. It is the missing piece of the data space architecture described in Chapter 1. Before two organizations can share data in a federated model, they must verify each other's identity without relying on a central authority.

How It Works

Europe solved this. Rather than building centralized identity systems, European data space initiatives use self-sovereign identity. Organizations and individuals hold verifiable credentials that can be checked by any party that trusts the issuer, without calling home to a central broker. The W3C standards for Decentralized Identifiers (DIDs) and Verifiable Credentials provide the mechanism. No blockchain required. Standard web protocols.

SAP's Decentralized Identity Verification service implements this model on BTP. It already runs in production at Catena-X. When a data consumer requests data from a provider, the consumer presents credentials. The provider checks them cryptographically. If valid, the exchange proceeds. No central identity broker is involved.

The same architecture applies directly to California. When a small utility shares asset data with a large IOU during an emergency, both parties verify identity through credentials issued by a trusted authority. The CPUC. A coordination entity. The data space operator. The small utility does not hand over control to the large utility. It presents a credential. The credential is verified. The data flows. When the coordination ends, the sharing stops.

For citizen services, the Identity Gateway does not need to be a centralized system. It can be a trust framework. Credential schemas. Trusted issuers. Verification policies that allow agencies to verify residents through portable credentials rather than point-to-point integrations. A resident proves eligibility for emergency benefits with a credential that also works at the DMV, without either agency sharing backend systems. Privacy-preserving selective disclosure (proving a claim without revealing the data behind it) is built into the credential standard.
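The selective-disclosure idea can be sketched with a simplified salted-hash commitment scheme: the issuer signs digests of every claim, and the holder reveals only the claims they choose. This is a conceptual illustration in the spirit of SD-JWT-style credentials, not the W3C or BBS+ mechanism; the claim names are invented, and a symmetric key stands in for what would be an asymmetric signature in production.

```python
# Conceptual sketch of selective disclosure: the issuer signs salted
# hashes of each claim; the holder later reveals only chosen claims;
# the verifier checks each revealed claim against the signed digests.
# HMAC with a shared demo key stands in for a real digital signature.
import hashlib, hmac, json, os

ISSUER_KEY = b"demo-issuer-key"  # stand-in for the issuer's signing key

def digest(claim, value, salt):
    return hashlib.sha256(salt + f"{claim}={value}".encode()).hexdigest()

def issue(claims):
    """Issuer: salt and hash every claim, sign the digest list."""
    salts = {k: os.urandom(16) for k in claims}
    digests = sorted(digest(k, v, salts[k]) for k, v in claims.items())
    signature = hmac.new(ISSUER_KEY, json.dumps(digests).encode(),
                         hashlib.sha256).hexdigest()
    return {"digests": digests, "signature": signature}, salts

def present(claims, salts, reveal):
    """Holder: disclose only the chosen claims, with their salts."""
    return [(k, claims[k], salts[k]) for k in reveal]

def verify(credential, disclosed):
    """Verifier: check the signature, then each disclosed digest."""
    expected = hmac.new(ISSUER_KEY,
                        json.dumps(credential["digests"]).encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, credential["signature"]):
        return False
    return all(digest(k, v, s) in credential["digests"]
               for k, v, s in disclosed)

claims = {"eligible_for_benefits": True, "birthdate": "1990-01-01"}
credential, salts = issue(claims)
# The resident proves eligibility WITHOUT revealing their birthdate.
disclosed = present(claims, salts, reveal=["eligible_for_benefits"])
```

The verifier learns that the eligibility claim was issued by a trusted authority, and nothing else.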

The technical architecture, including the DARPA concerns that stalled U.S. decentralized identity efforts, zero-knowledge proofs, and the MOSA (Modular Open Systems Approach) parallel, is detailed in Appendix C.

Multi-Channel Service Delivery

Multi-channel service delivery (where a single backend supports web, phone, SMS, kiosks, and offline-capable mobile) is a proven pattern. The design principle: digital inclusion is an architectural requirement, not an add-on. Services must work for a resident on gigabit fiber and for a resident on intermittent cellular in a rural area that has just been evacuated.

The equity dimension connects to the infrastructure story. The same rural communities served by under-resourced utilities are the most digitally excluded and the most vulnerable to wildfire.

Chapter 3: Plan. Cost Transparency and Integrated Planning

Cost Transparency: What Does It Cost?

Government budgets are full of pooled costs. Facility overhead. IT infrastructure. Shared administration. Fleet maintenance. These costs are real but invisible at the program level. A legislator asks what it costs to deliver wildfire coordination, or school food distribution, fully burdened. The honest answer in most jurisdictions: we do not know.

SAP Profitability and Performance Management (PaPM) distributes pooled costs through multi-step, auditable allocation models. In defense contracting, PaPM produces the indirect rate calculations that survive a DCAA audit. In state government, the same engine produces the fully burdened cost of any service. The result surfaces through a CAP application dashboard where an agency head drills from total budget to allocated cost per service. A legislative analyst compares delivery costs across programs.

This matters now because of converging compliance pressure. GASB 96 reclassified IT subscriptions as balance-sheet assets. GASB 104 requires separate disclosure by asset category. The SAMOSA Act mandates software inventories. PaPM closes the loop. Asset data already in SAP gets categorized, allocated, and disclosed in a format that satisfies both accounting requirements and the harder legislative question: what did we get for this money?

Integrated Planning (xP&A)

Cost transparency tells you where resources went. Integrated planning tells you where they should go. Instead of finance planning in one system, supply chain in another, and workforce in a third, xP&A connects them. Financial forecasts, procurement spend, logistics capacity, and workforce availability inform each other in real time.

When a city sees freight collaboration data next to its financial plan, next to procurement spend, next to asset maintenance schedules, that is xP&A applied to public infrastructure. Budget cycles compress. Resource allocation improves.

Planning Domain Traditional SLED Approach xP&A Integrated Approach
Financial Planning Annual budget cycle in isolation Continuous planning informed by logistics, procurement, and asset data
Procurement Purchase orders disconnected from supply chain visibility Procurement linked to supplier capacity and delivery forecasts
Logistics Not managed; delegated to individual departments Network-connected freight and supply chain visibility feeding the financial plan
Asset Management Reactive maintenance tracked in spreadsheets Predictive maintenance integrated with capital planning
Workforce Headcount budgeting disconnected from service demand Workforce planning shaped by service volumes, seasonal patterns, and field data
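One way to picture the integrated approach in the table above is a single plan row per month, where supplier lead time and workforce capacity reshape the financial forecast instead of living in separate systems. A toy sketch with invented numbers:

```python
# Toy sketch of xP&A-style integration: service demand, procurement
# lead time, and workforce capacity feed one plan. All numbers and
# field names are invented for illustration.

def integrated_plan(demand_by_month, unit_cost, lead_time_months,
                    staff_capacity):
    """Cap service by workforce; shift orders earlier by lead time."""
    plan = []
    for month, demand in enumerate(demand_by_month):
        served = min(demand, staff_capacity)            # workforce constraint
        order_month = max(0, month - lead_time_months)  # order ahead
        plan.append({"month": month, "served": served,
                     "order_month": order_month,
                     "spend": served * unit_cost,
                     "backlog": demand - served})
    return plan

plan = integrated_plan(demand_by_month=[80, 120, 150, 90],
                       unit_cost=40, lead_time_months=1,
                       staff_capacity=100)
```

Even at this toy scale, the plan surfaces what siloed systems hide: month 2's demand spike exceeds staffing, and its supplies must be ordered a month ahead.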

Cost Transparency: The Foundation xP&A Requires

Integrated planning without cost transparency is planning without a foundation. A city can connect its financial plan to procurement and logistics data, but if it cannot show where shared costs land, the planning produces numbers no one trusts.


Chapter 4: Build. Prototypes That Teach

Planning and coordination produce better decisions. But decisions happen in the field. On a utility pole. At a school loading dock. At a shelter intake desk. This chapter describes what happens when the interoperability layer reaches the people who do the work.

SAP Build: The Mobile App with AI That Integrates Everything

The interoperability story becomes tangible when a field worker picks up a phone.

SAP Build is a low-code/no-code platform for building mobile applications that integrate with the full enterprise stack. ERP. Business Network. AI services. Edge devices including cameras. For SLED customers, a single development environment produces apps for different field scenarios. All feeding data back into the same planning and coordination layer.

Utility field service. A crew member photographs a damaged transformer. The app captures the image, geolocation, and condition assessment. The work order routes through Field Service Management. The Business Network notifies the parts supplier. The planning layer adjusts the maintenance forecast.

Public works. A pothole detection app runs on a phone mounted to a dashboard. The camera photographs damage during routine driving. AI classifies severity. A work order is created. The planning layer prioritizes repair based on traffic volume, safety risk, and budget.

Homeless outreach. A case worker checks real-time shelter bed availability. Routes a person to the nearest open bed. Logs the interaction. The data feeds service demand forecasting.

School food logistics. A cafeteria manager scans incoming deliveries. The app verifies quantities against purchase orders. Flags discrepancies. The day's menu plan updates.

In every case, the app is built once in SAP Build. Connects through standard APIs. Generates data that feeds planning. The field worker needs a phone and five minutes of training.

Digital Manufacturing for Same-Day Logistics

SAP Digital Manufacturing was designed for factory floors. Production schedules. Quality checks. Material flows. Equipment management. But a school district central kitchen preparing 50,000 meals a day is a manufacturing operation. A disaster relief staging area sorting and routing supplies is a manufacturing operation. A donation center classifying and distributing contributed goods is a manufacturing operation.

The same platform that manages an automotive assembly line can manage food flow from a central kitchen to 200 school cafeterias. It schedules production runs. Tracks ingredients through preparation. Coordinates delivery routes through the Business Network. Confirms receipt at each school. Camera systems at the loading dock count pallets and verify shipments. Quality checks are logged. Exceptions flag in real time.
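The receipt-verification step can be sketched simply: scanned quantities checked against the purchase order, with discrepancies flagged for the planning layer. The item names and tolerance are invented for illustration; this is the pattern, not the product's logic.

```python
# Sketch of receiving-dock verification: scanned delivery counts
# checked against the purchase order, exceptions flagged in real
# time. Item names and the 2% tolerance are invented.

def verify_delivery(purchase_order, scanned, tolerance=0.02):
    """Compare scanned counts to ordered counts; flag exceptions."""
    exceptions = []
    for item, ordered in purchase_order.items():
        received = scanned.get(item, 0)
        if abs(received - ordered) > ordered * tolerance:
            exceptions.append({"item": item, "ordered": ordered,
                               "received": received})
    return exceptions

po = {"milk_cartons": 2000, "produce_pallets": 12, "bread_loaves": 800}
scan = {"milk_cartons": 1998, "produce_pallets": 10, "bread_loaves": 800}
flags = verify_delivery(po, scan)  # only the produce shortfall flags
```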

On-Device Vision: CliffordNet and California-Specific AI

The camera capabilities described above require vision AI that runs at the edge, on-device, without cloud latency or connectivity dependencies.

CliffordNet is a geometric-algebra-based neural architecture that uses Clifford Algebraic Networks (CANs) to encode spatial relationships through the geometric product rather than standard convolution. Small model variants deliver accuracy per parameter well above their size class. A sparse rolling attention pattern keeps the interaction mechanism at linear complexity, making on-device deployment practical even on commodity hardware.
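CliffordNet's internals are not reproduced here, but the geometric product itself is simple to state. As background only, a minimal sketch for the 2D Clifford algebra Cl(2), where a multivector has scalar, e1, e2, and e12 parts:

```python
# Minimal geometric product for the 2D Clifford algebra Cl(2).
# A multivector is (scalar, e1, e2, e12), with e1*e1 = e2*e2 = 1,
# e1*e2 = -e2*e1 = e12, and e12*e12 = -1. Background illustration
# only; this says nothing about CliffordNet's actual implementation.

def geometric_product(a, b):
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 (bivector) part
    )

e1  = (0, 1, 0, 0)
e2  = (0, 0, 1, 0)
e12 = geometric_product(e1, e2)  # the unit bivector, e1*e2
```

The product of two basis vectors carries both magnitude and orientation, which is why geometric-algebra layers can encode spatial relationships that a plain dot product discards.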

What makes this California-specific is the training data. Every photograph taken at a donation center. Every pallet counted at a school loading dock. Every pothole captured by a public works app. Each generates labeled training data. Over time, a vision system trained on California warehouse images, California road conditions, and California infrastructure assets becomes measurably more accurate for California use cases. The people using it are training it, with real data in real conditions.

This is sovereign AI in the most practical sense. A model trained on local data. Running on local devices. Improving through local use. Owned by the public agencies that generated the training data.

Loot Locker: A Pilot That Teaches Everything

Loot Locker is an AI-powered donation intake system built on SAP Business Technology Platform. It originated at a hackathon. A person walks up to a donation event. Photographs what they are donating. Describes it. The system handles classification, condition scoring, and routing. Resell. Refurbish. Recycle. Dispose.

When the vision model lacks confidence, it asks for help. "What size shoes?" "Remove the clothes from the bag and photograph them separately." "Rescan that QR code." The system knows what it needs.
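The confidence-gated behavior can be sketched as a simple routing rule: above a threshold, classify and route; below it, ask a targeted follow-up. The categories, prompts, and threshold here are invented for illustration.

```python
# Sketch of confidence-gated intake: below a threshold, the system
# asks a targeted follow-up instead of guessing. Categories, prompts,
# and the 0.85 threshold are invented for illustration.

FOLLOW_UPS = {
    "footwear": "What size are the shoes?",
    "bagged_clothing": "Remove the clothes from the bag and "
                       "photograph them separately.",
    "unreadable_code": "Rescan that QR code.",
}

def classify_or_ask(prediction, confidence, threshold=0.85):
    """Return a routing decision or a clarifying question."""
    if confidence >= threshold:
        return {"action": "route", "category": prediction}
    return {"action": "ask",
            "question": FOLLOW_UPS.get(prediction,
                                       "Can you add another photo?")}

sure = classify_or_ask("footwear", 0.93)    # routed directly
unsure = classify_or_ask("footwear", 0.52)  # asks for the size
```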

As a pilot, Loot Locker teaches every concept in this paper at low cost and high visibility.

Concept How Loot Locker Teaches It
Business Network Routes donations to resale partners, recyclers, and distribution points through network collaboration
Edge AI / CliffordNet On-device vision classification in variable lighting and conditions
SAP Build Mobile app built on low-code platform with camera integration
Digital Manufacturing Classification, quality grading, and routing as a production workflow
xP&A Donation volume forecasting, capacity planning, distribution optimization
Training Data Every interaction produces labeled data for model improvement
Workforce Development Participants learn AI-assisted classification, logistics coordination, data quality

The classification and routing logic scales. At a community drive, you sort shoes. At a defense receiving dock, you determine whether an inbound shipment routes to a COMSEC cage, a flight line, or a depot in theater. Same pattern. Different consequences.

Chapter 5: Responsible AI and Governance

California introduced more than 33 AI bills in the first half of the 2025-2026 session. SB 53 regulates frontier model developers. AB 316 bars AI autonomy as a defense in civil actions. AB 853 expands content provenance requirements. Every capability in this paper is designed to operate within this framework and the responsible AI principles established by the Governor's Frontier AI Working Group.

Human Oversight and Accountability

All AI recommendations require human review before action. In utility coordination, AI surfaces priorities. Human operators decide deployment. In logistics, AI optimizes routes. Human dispatchers confirm. In citizen services, AI handles routine inquiries. Complex cases escalate to trained staff. Every AI decision is logged with an auditable trail.
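The oversight pattern reduces to a gate plus a log: the AI recommends, a named human approves or rejects, and both steps are recorded. A minimal sketch; the field names and the example action are invented.

```python
# Sketch of a human-in-the-loop gate with an audit trail: the AI
# recommends, a named human decides, and every step is logged with a
# timestamp. Field names and the example action are invented.
from datetime import datetime, timezone

audit_trail = []

def log(event, **details):
    audit_trail.append({"ts": datetime.now(timezone.utc).isoformat(),
                        "event": event, **details})

def recommend(action, confidence):
    log("ai_recommendation", action=action, confidence=confidence)
    return {"action": action, "confidence": confidence}

def review(recommendation, reviewer, approved):
    """Nothing executes unless a named human approves."""
    log("human_review", reviewer=reviewer, approved=approved,
        action=recommendation["action"])
    return recommendation["action"] if approved else None

rec = recommend("dispatch_crew_to_sector_7", confidence=0.91)
decision = review(rec, reviewer="ops_supervisor", approved=True)
```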

Transparency and Explainability

All proposed systems are designed for transparency. This aligns with SB 53 and AB 853. Model capabilities and limitations are documented. Users can understand why the AI made a recommendation. AI-generated content is labeled. Risk assessments include confidence indicators.

Digital Inclusion as Design Principle

Multi-channel access reaches residents regardless of connectivity. Multi-lingual support reflects California's diversity. Shared service models provide capability to under-resourced organizations. Offline-capable solutions serve communities where broadband has not arrived or has been destroyed.

Data Sovereignty and Privacy

The Tractus-X, DS4SSCC, and FIWARE data space models show that federated data sharing does not require centralized data collection. Each organization retains sovereignty over its data. Jurisdictions retain control of their systems while participating in shared standards. This is a design principle, not a concession.

Chapter 6: Seven Pilots

The mandates described in this paper (GASB 104, NIEM conformance, SAMOSA, Envision 2026) are on calendars, not wish lists. The progression (connect, equip, sense, plan, twin, simulate) is not a monolithic program. It is a series of pilots. Each delivers value independently. Each meets a specific compliance or coordination requirement. Together, they build toward a digital twin of California's public infrastructure.

Pilot What It Proves Entry Point Timeline
Business Network Logistics SLED customers can coordinate freight, supplies, and warehouse capacity through the network without procuring TM/EWM One city or county with Ariba 6 months
Loot Locker AI classification, edge vision, training data generation, and workforce development in a single deployment One community partner or school district 3–6 months
xP&A Planning Integrated planning across finance, procurement, logistics, and assets gives public agencies predictive capability One state agency on S/4HANA 6–12 months
Edge Vision for Field Service On-device CliffordNet models trained on California imagery deliver value for pothole detection, asset inspection, or shelter routing One public works or outreach department 6 months
Utility Data Space Small utilities share critical asset data with large IOUs and emergency management through a NIEM-conformant, federated layer modeled on Tractus-X 3–5 small utilities in fire-prone regions 6–12 months
School Kitchen Digital Manufacturing Same-day food logistics orchestrated through Digital Manufacturing with camera-based quality verification One school district central kitchen 6 months
Cost Transparency (PaPM + CAP) Fully burdened cost-per-service model with auditable allocation logic, meeting GASB 104 capital asset disclosure requirements and surfaced through an open-source CAP application One state agency modeling the true cost of a specific service 6 months

Workforce Development

Every pilot creates a training pathway. Loot Locker trains participants on AI-assisted classification. The warehouse vision system trains staff on edge deployment. The Business Network pilot trains procurement teams on logistics collaboration. The xP&A pilot trains financial analysts on integrated planning. The cost transparency pilot trains budget staff on allocation modeling, auditable cost attribution, and GASB compliance reporting. For state and local government, workforce development matters as much as operational outcomes. It should be measured alongside them.

Proposed Immediate Actions

1. Convene a working session with utility coordination entities, CPUC, and emergency management stakeholders to scope the utility data space pilot. Use the Tractus-X governance model and NIEM exchange standards as the foundation.

2. Identify one city or county on Ariba to pilot Business Network logistics coordination for emergency supply staging or routine distribution.

3. Deploy Loot Locker as a pilot that teaches edge AI, training data generation, and workforce development at the same time.

4. Engage a state agency on S/4HANA for an xP&A proof of concept across finance, procurement, and assets.

5. Model the true cost of one service using PaPM allocation logic and a CAP application dashboard. Produce GASB 104-compliant asset disclosures alongside fully burdened cost transparency from shared infrastructure through program delivery.

"Enterprise Architects have a unique opportunity to interact with customers not transactionally, but transformationally."

Jason Shearer

The infrastructure is closer than anyone in the room thinks. California has the opportunity to lead.

Appendix A: Global Evidence Base

The following deployments provide evidence that the capabilities described in this paper are proven in production. Each entry links to its published Innovation Award submission.

California and U.S. Government

Customer Key Outcome Case Study
CA Dept of Water Resources Billing automation: days to hours View →
Southern California Edison 48.6% process efficiency; 19.2% IT cost reduction View →
SDG&E 2.7M users; 5-star mobile platform View →
DTE Energy (Michigan) 397,000+ shutoffs prevented; $88M+ in assistance View →
National Grid US 30% training engagement increase via GenAI View →
Defense Logistics Agency 25% productivity increase; $46B supply chain View →
U.S. Dept of the Interior 87 legacy systems decommissioned; 26M transactions View →
U.S. Dept of Justice 30% IT cost reduction; 9.5/10 user satisfaction View →
Lockheed Martin (1LMX) Unified AI across 50+ apps; 120K employees View →
Commonwealth of PA 10x reporting speed improvement View →

Innovative City and State Deployments (Global)

Customer Key Outcome Case Study
City of Hagen (Germany) 90% budget evaluation time reduction; €1B budget View →
City of Vancouver Budget cycle: 7 weeks to 3 weeks View →
City of Antibes (France) 30% workload reduction; AI-powered procurement View →
German Federal Foreign Office 50% manual reduction; 83.2% same-day resolution View →
Spain Public Employment Service 125,000 monthly benefits; 20% productivity gain View →
Canton Zurich Budget sources: 120 to 1; full compliance View →
Transport for NSW A$211B asset portfolio on unified platform View →

Utility and Energy

Customer Key Outcome Case Study
Thames Water (UK) 85% efficiency gain; £2.2M combined savings View →
E.ON (Germany) GenAI-powered finance; 40% efficiency increase View →
Essent (Netherlands) Monthly processing: 36+ hours to <10 hours View →
GS Inima (Spain) Digital twin: 20% cost reduction; 30% resource gain View →
Tata Power (India) 50% financial closure improvement; 30% IT cost cut View →

Enterprise AI and Supply Chain

Customer Key Outcome Case Study
Bosch Power Tools 1.5M tickets/year; Joule AI: 35% accuracy gain View →
AutoScheduler.AI Warehouse planning: 8 hours to 20 minutes View →
SBB AG (Swiss Rail) AI predictive maintenance; lifecycle extension View →
Embraer 41,000 man-hours saved; 5% FTE productivity gain View →
Honeywell 94% on-time payment (from 74%); 96% first-pass yield View →

Appendix B: Enterprise AI Capabilities Reference

AI capabilities available today in enterprise platforms, mapped to roadmap items and Discovery Center pages. For existing California government and utility customers, many may be included in current license entitlements.

Legend: GA = Generally Available; Planned = on roadmap, not yet released.

Utility Operations & Field Service

Capability What It Does Reference Status
Intelligent Scheduling AI optimizes technician routes and job assignments by skills, location, priority Roadmap GA
Predictive Maintenance Predicts equipment failures using sensor and historical data Roadmap GA
Customer Self-Service Agent Natural language service requests routed to resolution; ~45% FTE cost reduction, ~90% handling cost reduction, 10x ROI Discovery Center GA
Smart Meter Integration High-volume meter data processing for billing and consumption analytics Roadmap GA
Consumption Forecasting Predicts utility demand based on weather, history, and events Roadmap GA

Emergency Response & Logistics

Capability What It Does Reference Status
Demand Sensing Predicts short-term demand changes from external signals Roadmap GA
Route Optimization Calculates optimal delivery routes with real-time constraints Roadmap GA
Intelligent Warehouse Slotting Optimizes storage locations based on movement patterns Roadmap GA
Supplier Risk Monitoring Flags supply chain disruptions before impact Roadmap GA
Supply Chain Control Tower Single real-time view across suppliers, logistics, inventory Roadmap GA

Document Intelligence & Compliance

Capability What It Does Reference Status
Document Information Extraction Extracts structured data from invoices and forms Discovery Center GA
Document Summarization Summarizes documents into key points and action items Discovery Center GA
Compliance Monitoring Monitors transactions for regulatory compliance flags Roadmap GA
Custom Document Processing Build extraction pipelines for specific document formats Roadmap GA

Citizen Services & AI Assistants

Capability What It Does Reference Status
Joule AI Copilot Natural language assistant embedded across enterprise applications Discovery Center GA
AI Case Classification Categorizes and routes inquiries to correct resolution Discovery Center Planned
Sentiment Analysis Detects citizen/customer sentiment in communications Discovery Center GA
Personalization Engine Tailors digital experiences based on user context and history Roadmap GA
Decentralized Identity Verification (DIV) Self-sovereign identity for inter-organizational verification; verifiable credentials, DID:Web, Decentralized Claims Protocol Product Page GA
Data Space Integration Implements Dataspace Protocol for federated data exchange; NIEM-conformant exchange packages; works with DIV for verified participant identity Product Page GA

Planning & Analytics

Capability What It Does Reference Status
Smart Insights Surfaces anomalies and trends humans would miss Roadmap GA
Predictive Planning Forecasts based on historical patterns with what-if scenarios Discovery Center GA
Workforce Analytics Predicts attrition risk and identifies skill gaps Roadmap GA
GenAI Hub Access to foundational models for custom AI development Roadmap GA

Responsible AI

Capability Responsible AI Dimension Reference Status
Audit Trail Logging Accountability: all AI actions recorded and reviewable Platform capability (SAP AI Core / SAP BTP) GA
Confidence Scores Transparency: users see AI certainty level Roadmap GA
Human-in-the-Loop Controls Oversight: critical actions require human approval Roadmap Planned
Data Masking Privacy: sensitive data protected in AI processing Roadmap GA
Decision Explainability Transparency: reasoning behind AI recommendations Discovery Center GA

Platform Scale

Metric Value
AI features on current roadmap 2,916
Products with AI capabilities 196
Available today (GA) ~60% of roadmap
Utility-specific AI features 150+
Public sector AI features 100+
Logistics/supply chain AI features 300+
Published Innovation Award use cases 375+
Business Network trading partners 15M+

Appendix C: Technical Architecture Detail

This appendix provides engineering depth for readers who want to understand the mechanisms behind the capabilities described in the main text.

Digital Twin Architecture

SAP Business Transformation Management (BTM) provides the framework for a digital twin of the enterprise: a functional model of how the organization operates — its processes, data flows, resource allocations, and performance against plan. This is not a visualization. It is queryable, analyzable, and usable for scenario planning.

Integrated Product Development (IPD) extends the twin into physical asset lifecycle management. Originating in aerospace and manufacturing, IPD manages complex assets through design, production, operation, maintenance, and retirement. A bridge has a design specification, a construction history, an operational profile, a maintenance schedule, and an end-of-life plan. So does a water treatment plant, a school building, and a power substation.

NVIDIA Omniverse Integration. When the digital twin requires spatial visualization, SAP's partnership with NVIDIA Omniverse connects enterprise operational data to physically accurate 3D simulation environments. Operational data (asset conditions, logistics flows, workforce positions) comes from SAP. Spatial rendering comes from Omniverse. The result: a simulation where policy decisions can be tested before implementation, with real data.

Decentralized Identity: Technical Foundation

Why the U.S. Stalled. In 2022, a DARPA-commissioned study by Trail of Bits found that blockchain implementations contained structural weaknesses: unintended centralities, vulnerable node software, traffic concentration through a small number of ISPs. The cryptography was sound. The infrastructure was not. Since blockchain had been the leading U.S. candidate for decentralized identity, the report froze government momentum.

How Europe Responded. European initiatives separated self-sovereign identity from blockchain. SSI requires three things: decentralized identifiers (DIDs), verifiable credentials issued against those identifiers, and a way to verify credentials without calling home to a central authority. The W3C standards provide all three. The DID:Web method uses standard web protocols. No chain. No mining. No unintended centralities. This is the identity model running in production at Catena-X.

SAP Decentralized Identity Verification (DIV). A BTP service implementing SSI for inter-organizational communication. Organizations sign, verify, and manage verifiable credentials through an administration interface and API. In Catena-X, DIV handles the identification step of the Decentralized Claims Protocol. When a data consumer requests data, it presents verifiable credentials issued by a trusted authority. The provider's Data Space Integration service checks credentials against access policies. If valid, the contract negotiation and data exchange proceed. No central identity broker.
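The verification step in that flow reduces to a policy check before any data moves: the consumer presents credentials, the provider checks them against its access policy, and only then does the exchange proceed. A minimal sketch; the credential and policy names are invented, and real implementations follow the Dataspace Protocol rather than this toy logic.

```python
# Sketch of the identification step in a federated exchange: the
# consumer presents credentials, the provider checks them against an
# access policy before releasing data. Credential strings, the policy,
# and the dataset name are invented for illustration.

def satisfies(credentials, required):
    """Does the presented credential set satisfy the policy?"""
    return required.issubset(credentials)

def request_data(consumer_credentials, access_policy, dataset):
    if not satisfies(consumer_credentials, access_policy):
        return {"granted": False, "data": None}
    return {"granted": True, "data": dataset}

policy = {"membership:ca_utility_dataspace",
          "role:emergency_coordination"}
ok = request_data({"membership:ca_utility_dataspace",
                   "role:emergency_coordination",
                   "org:small_utility_14"},
                  policy, dataset="asset_registry")
denied = request_data({"org:unknown_party"},
                      policy, dataset="asset_registry")
```

No central identity broker appears anywhere in the flow: the provider trusts the credential issuer, not the consumer's home system.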

Zero-Knowledge Proofs (ZKPs). ZKPs allow one party to prove a statement is true without revealing the data behind it. In the SSI context: selective disclosure. A resident proves they are over 18 without revealing their birthdate. An agency proves it holds a valid operating license without exposing financial statements. A utility proves CPUC safety compliance without sharing proprietary grid data. W3C Verifiable Credentials support ZKP-based selective disclosure through BBS+ signatures. The European Digital Identity Wallet program is building ZKP capabilities to comply with GDPR data minimization. SAP DIV, built on W3C standards, provides the infrastructure.
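The data-minimization shape of selective disclosure can be illustrated without the cryptography. In the toy sketch below, the holder derives only the predicate a verifier needs ("over 18") and the raw birthdate never leaves the wallet; a real BBS+ presentation proves this cryptographically, which this sketch does not attempt:

```python
# Concept sketch of selective disclosure: the holder presents only the
# derived predicate a verifier asked for, never the underlying attribute.
# Real ZKP-based disclosure (e.g. BBS+) makes this cryptographically
# verifiable; this toy version only shows the data-minimization shape.

from datetime import date

def derive_over_18(birthdate: date, today: date) -> dict:
    """Build a minimal presentation containing only the derived predicate."""
    years = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return {"predicate": "age_over_18", "satisfied": years >= 18}

presentation = derive_over_18(date(1990, 6, 15), date(2026, 3, 4))
print(presentation)  # {'predicate': 'age_over_18', 'satisfied': True}
assert "birthdate" not in presentation  # raw attribute is never disclosed
```

The same pattern generalizes to the agency and utility examples above: the presentation carries the compliance fact, not the sensitive records behind it.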

The MOSA Principle Applied to Identity. For technologists familiar with defense acquisition: the Modular Open Systems Approach (MOSA) mandates composable, interoperable, vendor-independent systems. Traditional IAM is monolithic. Federated identity (SAML, OIDC) distributes authentication but relies on bilateral trust relationships that become brittle at scale. Self-sovereign identity makes identity itself a composable component. An organization's identity is a credential verifiable by any party that trusts the issuer, not locked into any provider's IAM system. In Europe, this is operational. In the U.S., where the DARPA report created justified blockchain skepticism, it represents a path forward: decentralized identity on standard web protocols, not blockchain infrastructure.

Extended: Task Force Lima and the Validation of Enterprise AI

In December 2024, the Department of Defense sunsetted Task Force Lima, the largest military generative AI experimentation effort in history. Over twelve months, the task force analyzed hundreds of AI workflows and categorized them into fifteen areas across two categories: warfighting functions and enterprise management functions.

The finding that matters for this paper: roughly half of those use cases resided in enterprise management — financial management, healthcare information management, logistics, human resources. Not frontier model problems. Enterprise application problems. Classification, routing, anomaly detection, planning optimization.

Task Force Lima drew a distinction between Mode 1 experimentation — exploring frontier capabilities like large language models — and Mode 2 — applying AI to existing enterprise workflows where data, processes, and governance already exist. Mode 2 produced faster, more measurable results precisely because it operated on established infrastructure.

The institutional trajectory that followed validates the architectural choices in this paper. Task Force Lima's parent organization, the Chief Digital and Artificial Intelligence Office (CDAO), has since lost nearly 60 percent of its staff. The $15 billion contract to scale its Advana analytics platform was cancelled. The Advana program officer, Alex O'Toole, departed for Databricks. CDAO itself has been reorganized under the Under Secretary of Defense for Research and Engineering, with a 120-day review deadline to determine the future of both Advana and the Maven AI program. Key leaders dispersed: Stuart Wagner became the Navy's Chief Data and AI Officer. Radha Plumb, who led CDAO through Lima's sunset, departed at the start of the new administration.

Meanwhile, 18F — the federal digital services team that built Cloud.gov, the government's platform-as-a-service — has been dissolved. Cloud.gov ran on Cloud Foundry, the same open-source technology SAP has contributed to as a founding platinum member since 2014. The platform persists. The institutional home does not.

The pattern could not be clearer. Centralized platforms built within institutional structures are vulnerable to reorganization, budget cuts, and political transition. Federated, open-source infrastructure survives because no single entity owns it.

Extended: CliffordNet Architecture

CliffordNet is a geometric-algebra-based neural architecture that uses Clifford Algebraic Networks (CANs) to encode spatial relationships through the geometric product rather than standard convolution, a design suited to resource-constrained edge deployment.
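The geometric product named above is ordinary algebra, not a neural primitive unique to any one system. A minimal sketch in the 2D Clifford algebra Cl(2,0), where a multivector has a scalar, two vector, and one bivector component:

```python
# Sketch of the geometric product in the 2D Clifford algebra Cl(2,0).
# A multivector is (scalar, e1, e2, e12), with e1*e1 = e2*e2 = 1 and
# e12 = e1*e2. This is the algebraic primitive a geometric-algebra
# network layer applies where a conventional layer applies convolution.

def geometric_product(a, b):
    """Geometric product of two multivectors (s, x, y, b) in Cl(2,0)."""
    s1, x1, y1, b1 = a
    s2, x2, y2, b2 = b
    return (
        s1*s2 + x1*x2 + y1*y2 - b1*b2,   # scalar part
        s1*x2 + x1*s2 - y1*b2 + b1*y2,   # e1 part
        s1*y2 + x1*b2 + y1*s2 - b1*x2,   # e2 part
        s1*b2 + x1*y2 - y1*x2 + b1*s2,   # e12 (bivector) part
    )

# The product of two pure vectors encodes both their dot product (scalar
# part) and the oriented area they span (bivector part) -- spatial
# relationships a convolution kernel must instead learn from data.
e1 = (0, 1, 0, 0)
e2 = (0, 0, 1, 0)
print(geometric_product(e1, e2))  # (0, 0, 0, 1): e1 * e2 = e12
```

Because orientation and area fall out of the algebra for free, the network spends its parameters on the task rather than on rediscovering geometry.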

For California-specific applications, the training strategy matters as much as the architecture. Every photograph taken at a donation center, every pallet counted at a loading dock, every pothole captured generates labeled training data. Over time, a vision system trained on California warehouse images, California road conditions, and California infrastructure assets becomes measurably more accurate for California use cases. The people using it are training it, with real data in real conditions. This is sovereign AI in the most practical sense: a model trained on local data, running on local devices, improving through local use, owned by the public agencies that generated the training data.

Extended: NIEM Technical Detail

The National Information Exchange Model (NIEM) defines a common vocabulary for sharing data across jurisdictions. It originated in 2005 through a partnership between the Department of Justice and Department of Homeland Security, and now spans DoD, HHS, and state/local participants.

NIEM is not a system. It is a standard for building data exchange packages that can be understood by any participant regardless of their internal systems. A NIEM-conformant outage report from a small California utility contains geospatial coordinates, asset identifiers, crew status, and mutual aid capacity in a format that emergency management software can parse without custom integration.

As NIEM compliance moves toward procurement requirement for federally funded programs in emergency management, homeland security, and defense, California agencies receiving federal funding will increasingly need NIEM-conformant data exchange. The data that needs to become NIEM-conformant — outage reports, asset conditions, crew availability, logistics status — already exists in the SAP systems California utilities and agencies run today. The gap is not data availability. It is the mapping layer between internal data structures and NIEM exchange packages.

SAP Integration Suite provides this mapping layer through its B2B integration capabilities. The same service that powers Eclipse Tractus-X data spaces in Europe can produce NIEM-conformant exchange packages for U.S. federal coordination. This is not a new integration project. It is configuration work on infrastructure that already exists.
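The mapping layer described above is, at its core, a structural transformation: an internal record rendered into a shared exchange vocabulary. A minimal sketch, with the caveat that the element and namespace names below are invented for illustration and real conformance requires the published NIEM reference schemas:

```python
# Sketch of the mapping layer described above: an internal outage record
# (a dict standing in for an ERP extract) rendered as a NIEM-style XML
# exchange package. Element and namespace names are illustrative, not
# the actual NIEM schemas.

import xml.etree.ElementTree as ET

NS = "urn:example:niem-style:outage"  # hypothetical namespace

def to_exchange_package(record: dict) -> str:
    ET.register_namespace("ex", NS)
    root = ET.Element(f"{{{NS}}}OutageReport")
    loc = ET.SubElement(root, f"{{{NS}}}Location")
    ET.SubElement(loc, f"{{{NS}}}LatitudeDegree").text = str(record["lat"])
    ET.SubElement(loc, f"{{{NS}}}LongitudeDegree").text = str(record["lon"])
    ET.SubElement(root, f"{{{NS}}}AssetID").text = record["asset_id"]
    ET.SubElement(root, f"{{{NS}}}CrewStatus").text = record["crew_status"]
    return ET.tostring(root, encoding="unicode")

package = to_exchange_package({
    "lat": 38.58, "lon": -121.49,
    "asset_id": "SUB-0417", "crew_status": "dispatched",
})
print(package)
```

Any receiving system that understands the shared vocabulary can parse the package without knowing anything about the sender's internal data model, which is the entire point of the standard.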

Extended: European Data Space Architecture

DS4SSCC (Data Space for Smart and Sustainable Cities and Communities) is deploying federated data infrastructure for European municipal governments. The sectors map directly to California priorities: cross-border mobility, energy management, participatory urban planning, environmental monitoring.

The architecture is built on open standards rather than proprietary interfaces.

Eclipse Tractus-X provides the reference implementation. Originally built for automotive supply chains (BMW, SAP, Siemens, and others), Tractus-X demonstrates federated data exchange at industrial scale. Three principles: data sovereignty (each participant controls their data), federated integration (any organization can participate regardless of internal platform), and open standards (no single vendor controls the infrastructure).

The governance model matters as much as the technology. Tractus-X operates as an Eclipse Foundation project with open governance, transparent roadmaps, and vendor-neutral leadership. When California considers building a data space for utility coordination or housing data, the governance precedent exists. The technical stack is proven. The political risk — that a single vendor or agency controls the infrastructure — has been designed out through open governance.

The Digital Twin Progression

A legislator asks: if we fund this interoperability work, what do we get?

The answer is a working model of how California's public infrastructure actually operates. Not a dashboard. Not a report. A model you can test decisions against before committing resources.

An organization that has connected to the Business Network, equipped field workers with AI-enabled mobile apps, deployed edge sensors, and integrated all of it into a unified planning environment has built the foundation of a digital twin without necessarily naming it. Each chapter in this paper is a layer of that twin:

Layer    | What It Provides                              | Chapter
---------|-----------------------------------------------|--------
Connect  | Data flowing across organizational boundaries | Ch. 1–3
Equip    | Field workers generating structured data      | Ch. 4
Sense    | Edge AI capturing conditions in real time     | Ch. 4
Plan     | Integrated planning across all resource types | Ch. 3
Twin     | Queryable model of operations                 | Ch. 5
Simulate | Test decisions before making them             | Ch. 5

For a city government, a digital twin means answering questions that are currently unanswerable. What happens to emergency logistics capacity if a primary warehouse floods? How does a 15% increase in school enrollment affect the food distribution network? If three maintenance crews are redirected to wildfire recovery, what deferred maintenance risk accumulates on water infrastructure?
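Questions like these reduce to what-if queries against a model of operations. A toy sketch of the first one, with invented figures, to show the shape of the query a twin makes routine:

```python
# Toy what-if query of the kind a digital twin answers: if the primary
# warehouse floods, how much emergency staging capacity remains, and is
# surge demand still covered? All figures are invented for illustration.

warehouses = {"primary": 12000, "north": 4000, "inland": 5000}  # pallets
demand = 9500  # pallets needed under a surge scenario

def remaining_capacity(lost):
    """Total staging capacity across warehouses not in the lost set."""
    return sum(cap for name, cap in warehouses.items() if name not in lost)

baseline = remaining_capacity(set())
flooded = remaining_capacity({"primary"})
print(f"baseline: {baseline}, after flood: {flooded}")   # 21000 vs 9000
print("shortfall:", max(0, demand - flooded))            # shortfall: 500
```

With real operational data behind the model, the same query answers the enrollment and deferred-maintenance questions; only the entities and constraints change.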

Public infrastructure is a complex physical asset with a lifecycle: design, construction, operation, maintenance, retirement. Managing these assets the way aerospace manages aircraft — with full lifecycle digital twins and predictive maintenance — is not aspirational. The technology exists in production. SAP's partnership with NVIDIA Omniverse extends the model into physically accurate 3D simulation where policy decisions can be tested with real operational data.

The technical architecture for digital twins and the Omniverse integration is detailed in Appendix C for readers who want depth on the engineering.

DRAFT | CCST AI Applications and Innovation Showcase, March 4, 2026

This document supports evidence-based policy dialogue and does not constitute a commercial proposal.

Document version: 4.2
Last updated: February 27, 2026