Date: 13 January 2026
Phillipa Martin is Salsa's Rules as Code Practice Lead

The AI temptation for automating rules

Governments everywhere are exploring how artificial intelligence (AI) might make public services faster and smarter. AI can process vast amounts of information, find patterns and even generate recommendations — qualities that seem perfect for managing complex rules such as welfare or taxation rules.

But when it comes to the law, AI doesn’t truly understand what a rule means; it simply infers likely outcomes from data. At a Rules as Code (RaC) Guild presentation in June 2025, Rules as Code expert Pia Andrews talked about how RaC can be used as a technical guardrail for AI systems. She believes (and we agree) that AI alone should NOT be used for government decision-making because: “... machine-learning systems do not follow rules.”

To demonstrate this point, Pia talked about an experiment in which ChatGPT played chess against a rules-based chess engine. “ChatGPT, of course, just made stuff up.”

Robodebt: when automation went wrong

Another factor influencing rules automation in Australia is, of course, Robodebt. The Robodebt Scheme was introduced with the promise of efficiency — automating the detection of income discrepancies among welfare recipients. But its underlying logic wasn’t derived from the legislation itself. Instead, it relied on averaging ATO income data over long periods, producing debt calculations that were often legally baseless and inaccurate.

The result: unlawful debts, enormous distress, and a Royal Commission into the Robodebt Scheme. You can read the full Robodebt Royal Commission report, including the list of 57 recommendations.

The horrific outcomes of Robodebt have, understandably, tainted rules automation in this country. This also has implications for AI. As The Mandarin article AI adoption in the shadow of robodebt puts it: “...AI adoption is not starting from a clean slate. AI adoption is happening under the long shadow cast by robodebt.” The article discusses Robodebt’s effect on AI adoption in general, rather than rules automation specifically, but it’s well worth a read.

The problem with AI for rules automation

At the June 2025 RaC Guild presentation, Pia Andrews said:

“Of course, a lot of people are playing around with machine learning for decision-making from a government perspective. I have a very firm position on this. And I'll just say it's not necessarily one that's shared by everyone, but my position is that we should not be using machine learning for any administrative decisions in government.”

The black-box, probabilistic nature of AI could lead to similar failures that are even harder to detect and explain. For rules automation to succeed, it’s essential that the logic can be explained, tested and traced back to the law.

There’s a better way: transparent rule automation

Rules automation can still be a force for good — but it must be explicit, explainable and legally grounded. That’s where Rules as Code (RaC) and OpenFisca come in.

OpenFisca is an open-source rules engine for coding legislation. Instead of learning patterns from data, it encodes rules as clear, machine-readable formulas. The outcomes are deterministic and auditable: the same input will always produce the same result and, when implemented correctly, every result can be traced back to its legal source.

What is OpenFisca?

The French Government created OpenFisca in 2011 to make tax and welfare calculations more transparent. It allows policymakers, researchers and developers to represent legislation as structured code, defining variables, formulas and dependencies in a way that computers can understand.

Today, OpenFisca is used around the world to simulate how policy changes affect people and budgets.

More about OpenFisca

How it works

  • Inputs: Real-world data (like income, household structure or age)
  • Logic: A transparent, rule-based model of the legislation written in Python
  • Outputs: Precise calculations such as how much a person is eligible to receive under current law

Each rule in OpenFisca can be coded with references to the source legislation. Using OpenFisca, you can build a digital twin of the law, designed for clarity rather than prediction.
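As a dependency-free sketch of that idea (in OpenFisca itself, a rule is a Python `Variable` class registered with the engine; the benefit, Act name and parameters below are entirely hypothetical), a coded rule carries its legal source alongside its formula:

```python
from dataclasses import dataclass

# Hypothetical rule for illustration only. In OpenFisca, a rule is a
# Variable class with a formula and a `reference` pointing at its legal
# source; this plain-Python sketch captures the same structure.

@dataclass(frozen=True)
class Person:
    monthly_income: float
    age: int

def housing_allowance(person: Person) -> float:
    """Monthly housing allowance (hypothetical).

    reference: "Housing Assistance Act 2020, s 12(1)" (illustrative)
    """
    INCOME_CEILING = 2000.0   # s 12(1)(a) - illustrative parameter
    RATE = 0.25               # s 12(1)(b) - illustrative parameter
    if person.monthly_income >= INCOME_CEILING:
        return 0.0
    return round((INCOME_CEILING - person.monthly_income) * RATE, 2)

# Deterministic: the same input always yields the same result.
print(housing_allowance(Person(monthly_income=1200.0, age=34)))  # 200.0
```

Because the formula and its legislative reference live together, an auditor can check the code against the Act clause by clause.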

OpenFisca vs AI: two different approaches to automation

| Aspect | OpenFisca | AI / machine learning |
| --- | --- | --- |
| Foundation | Logic-based, rule-driven | Data-driven, pattern-based |
| Transparency | Fully auditable, interpretable | Often opaque (black box) |
| Consistency | Deterministic results | Probabilistic outputs |
| Legal traceability | Direct link to legislation | Indirect or absent |
| Risk profile | Low (if rules are coded correctly) | High (depends on training data) |

AI and machine learning excel at classification, prediction and discovery — but they’re not designed to implement the law. In contrast, OpenFisca is built for logic and repeatability, making it ideal for modelling rights, obligations and entitlements where fairness and accuracy matter most.

OpenFisca also brings other benefits to Rules as Code, such as out-of-the-box simulations and the ability to keep all versions of the rules for point-in-time auditability.

How to automate rules without another Robodebt

1. Start with the law, not the data

Coding rules should begin with the legislative text. Translating legal rules into explicit, computable logic ensures every calculation has a clear legal basis. Data informs the model — it doesn’t define it.
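For example, a hypothetical clause such as “a person qualifies if they have reached age 65 and their annual income is less than $30,000” translates directly into computable logic, with each condition traceable to its subsection (the clause, thresholds and names here are made up for illustration):

```python
# Hypothetical clause, for illustration only:
#   "A person qualifies for the seniors supplement if:
#    (a) they have reached age 65; and
#    (b) their annual income is less than $30,000."

AGE_THRESHOLD = 65       # condition (a)
INCOME_LIMIT = 30_000    # condition (b)

def qualifies_for_supplement(age: int, annual_income: float) -> bool:
    # Each boolean term maps one-to-one to a subsection of the clause,
    # so the legal basis of any outcome can be read straight off the code.
    meets_age = age >= AGE_THRESHOLD             # (a)
    meets_income = annual_income < INCOME_LIMIT  # (b)
    return meets_age and meets_income

print(qualifies_for_supplement(67, 25_000))  # True
print(qualifies_for_supplement(67, 35_000))  # False
```

Real case data is then used to test this model, not to derive it, which is exactly the inversion that was missing in Robodebt.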

2. Test continuously

Every rule model should be tested against real-world cases and edge scenarios. This makes unintended consequences visible long before they reach production. We use test-driven development, which ensures the code is constantly tested against the rules. In addition, OpenFisca tests can be automated to verify that changes to one rule don’t break another.
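A minimal sketch of that practice (the pension-age rule below is hypothetical; OpenFisca typically expresses such tests as YAML scenarios run by its test command, but plain Python assertions show the same idea), deliberately probing the boundary where errors hide:

```python
def is_pension_age(age: int) -> bool:
    """Hypothetical rule: pension age is 67 or over (illustrative only)."""
    return age >= 67

def test_one_year_under():
    assert is_pension_age(66) is False   # edge case: just below the line

def test_exact_boundary():
    assert is_pension_age(67) is True    # edge case: exactly on the line

def test_well_above():
    assert is_pension_age(80) is True    # ordinary case

# Run the tests without any framework installed:
for test in (test_one_year_under, test_exact_boundary, test_well_above):
    test()
print("all tests passed")
```

Boundary cases like 66-versus-67 are where averaged or inferred logic quietly produces wrong answers; explicit tests make such failures loud instead.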

3. Maintain transparency

Publish the logic. Document the assumptions. Allow others to inspect and critique the model so it can be improved. Transparency transforms rule automation from a source of risk into a source of trust.

4. Include human oversight

Automation and Rules as Code should support, not replace, human judgment. Keeping a ‘human in the loop’ preserves compassion and is essential wherever the rules aren’t black and white. People must be able to intervene. This is another key learning from Robodebt.

5. Design for accountability

Every automated outcome should be explainable: which rule was applied, what data was used, and how the result was calculated. This ensures not only fairness but also legal defensibility.
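One way to sketch this (the rebate rule, Act name and field names are hypothetical) is to return the calculation’s trace alongside its result, so every outcome answers “which rule, what data, how computed”:

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    result: float
    rule: str                                   # which rule was applied
    inputs: dict                                # what data was used
    steps: list = field(default_factory=list)   # how the result was calculated

def calculate_rebate(income: float) -> Decision:
    """Hypothetical rebate rule, returned with its full audit trail."""
    decision = Decision(result=0.0,
                        rule="Energy Rebate Act (illustrative), s 4",
                        inputs={"income": income})
    threshold = 50_000.0
    decision.steps.append(f"compare income {income} to threshold {threshold}")
    if income < threshold:
        decision.result = 250.0
        decision.steps.append("income below threshold: flat rebate 250.0 applies")
    else:
        decision.steps.append("income at or above threshold: no rebate")
    return decision

decision = calculate_rebate(42_000.0)
print(decision.result)  # 250.0
print(decision.steps)
```

An outcome packaged this way can be explained to the affected person, reviewed by a caseworker, and defended in court, because nothing about how it was reached is hidden.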

6. Use AI but with reduced risk

AI does have a place in rules automation. However, at the moment its power lies in how it can help with the Rules as Code process. AI can be used to:

  • Help business analysts and policy experts convert legislation into IF-AND/OR-THEN statements or logic diagrams
  • Help convert IF-AND/OR-THEN statements into code
  • Create test cases
  • Provide a user-friendly chat interface for citizens

However, the rules engine itself should be a RaC implementation that performs the deterministic evaluation steps, so you get auditability, transparency and consistency.
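As a toy sketch of that hand-off (the statement and names are invented), an analyst’s IF-AND/OR-THEN statement becomes deterministic code, and any AI chat layer only gathers inputs and relays the engine’s answer:

```python
# Analyst's statement (hypothetical):
#   IF resident AND (carer OR age >= 70) THEN eligible

def eligible(resident: bool, carer: bool, age: int) -> bool:
    # Deterministic evaluation: this function, not an AI model, decides
    # the outcome. A chat interface may collect these inputs and explain
    # the result in plain language, but it never computes the decision.
    return resident and (carer or age >= 70)

print(eligible(resident=True, carer=False, age=72))   # True
print(eligible(resident=False, carer=True, age=72))   # False
```

The AI helps produce and wrap the logic; the logic itself stays explicit, testable and traceable.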

Note: Future AI technology may change AI’s role in rules automation. But for the moment, deterministic logic is still needed, and safer, for users.

Rules as Code: the broader vision

Rules as Code is the movement behind the shift to rules automation — the idea that government rules should exist as both readable text and executable code. OpenFisca is one of the most mature implementations of that vision.

When legislation, policy and service delivery are all aligned to the same digital rule set, everyone benefits:

  • Citizens get clearer, faster answers
  • Policy teams can model impacts before laws are passed
  • Developers can build consistent, reliable services

Building trustworthy rule automation

Robodebt showed what happens when automation outruns accountability. But automation itself isn’t the enemy. Done right, it can make government simpler, faster and fairer.

OpenFisca and the broader Rules as Code approach demonstrate that rule automation can be transparent, ethical and precise. It’s about making the law computable without making it invisible.

If Robodebt showed what happens when automation goes wrong, Rules as Code and OpenFisca have the power to show how to get it right.

Want to find out more?

At Salsa Digital, we help governments and organisations bring Rules as Code to life — turning complex legislation into simple, testable and transparent digital logic.

Whether you’re exploring OpenFisca for policy modelling, eligibility checking or compliance automation, our team can help you design solutions that are legally sound, technically robust and citizen-focused.

Learn more about our Rules as Code services and how we can support your journey towards trustworthy rule automation.