Regulators still publish laws and regulations as text, just as they have for thousands of years. Regulation as code represents a radical paradigm shift away from this practice.
Is it possible to transform laws into algorithms? Why should anyone want to do this? How should you get ready for this as a regulator, manufacturer, authority, or notified body?
This article provides answers.
A law is clear and unambiguous if the facts and legal consequences it describes are evident to everyone, with no room for competing interpretations.
Many laws – lawyers usually speak of "legal norms" – comprise two parts: the facts and the legal consequences.
The facts specify the "if": the criteria and conditions that must be fulfilled for the legal consequences, the "then", to ensue.
Section 1 German Civil Code: "The legal capacity of a human being begins on the completion of birth." Here, the legal consequence is legal capacity; the fact is that a human being's birth must have been completed.
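This if-then structure is exactly what algorithms express. A minimal sketch, with `Person` and `has_legal_capacity` as illustrative names chosen here, not terms from any standard:

```python
from dataclasses import dataclass

@dataclass
class Person:
    birth_completed: bool  # the fact, the "if"

def has_legal_capacity(p: Person) -> bool:
    """Section 1 German Civil Code: the legal consequence ("then"),
    legal capacity, ensues once birth is completed ("if")."""
    return p.birth_completed

print(has_legal_capacity(Person(birth_completed=True)))   # True
print(has_legal_capacity(Person(birth_completed=False)))  # False
```

The norm's fact side becomes the function's input, its consequence side the return value; nothing is left to interpretation.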
Legal texts, in particular the facts, are often not phrased using adequately precise, comprehensible, and unambiguous language. Sometimes a misplaced comma, other times an error in translation, and sometimes a lack of definition renders the meaning of a sentence inconsistent with the lawmakers' intent.
Numerous publications bear witness to this lack of clarity, which they attempt to remedy. Some such examples include:
Regulatory requirements published as machine-readable algorithms would eliminate these inconsistencies and ambiguities.
A computer algorithm with a unique ID can be identified and referenced more precisely than is possible with legal texts.
This would make references such as the following obsolete:
"MDR Annex XIV, Part A, first sentence, subparagraph a), eighth indent, last subordinate clause".
Regulatory requirements may overlap not only across different regulations but also within a single regulation:
Risk management requirements are described in the MDR, ISO 13485, ISO 14971, and hundreds of other standards, all of which require, for example, the establishment of acceptance criteria.
There are also redundancies in many of the individual regulations. For example, the wording of ISO 13485 calling for the validation of computerized systems is almost identical in three places.
Regulation as code would eliminate such redundancies.
Contradictions in legal requirements are not easily recognized as long as they are published in text form. Unfortunately, these contradictions are found not only between requirements stemming from different areas of law, but also within each respective area of law.
Incoherent logic can be found, for example, in IEC 62304, Team-NB documents, and state agency interpretations.
Algorithms suitable for formal testing can be used to detect and eliminate such inconsistencies.
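As a hedged sketch of what "formal testing" can mean in the simplest case: with rules expressed as predicates over boolean facts, a contradiction can be found by exhaustive search. The facts and rules below are invented for illustration and taken from no real regulation.

```python
from itertools import product

# Illustrative facts and rules, not drawn from any actual regulation.
FACTS = ("class_iii", "legacy_device")

# Rule 1: class III devices require a clinical investigation.
requires_ci = lambda f: f["class_iii"]
# Rule 2: legacy devices are exempt from clinical investigations.
forbids_ci = lambda f: f["legacy_device"]

def find_contradiction(requires, forbids, facts=FACTS):
    """Exhaustively search fact assignments for a case in which one
    rule demands an action that another rule rules out."""
    for values in product([False, True], repeat=len(facts)):
        assignment = dict(zip(facts, values))
        if requires(assignment) and forbids(assignment):
            return assignment  # formal inconsistency found
    return None

print(find_contradiction(requires_ci, forbids_ci))
# -> {'class_iii': True, 'legacy_device': True}
```

For realistic rule sets, a SAT or SMT solver would replace the brute-force loop, but the principle is the same: a machine, not a committee, finds the conflicting case.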
Laws often do not even claim to cover the relevant facts completely. And where they do make that claim, they frequently fail to live up to it.
Regulation as code would identify such gaps.
The capability to automate the verification of all of the above quality aspects (completeness, consistency, etc.) represents another benefit of laws articulated as algorithms.
Automated checking of facts for conformity with these regulatory requirements is just as pertinent. For example, algorithms can check whether
This capability for automation is a prerequisite for "real-time regulation", i.e., continuous, automated conformity checking taking place whenever inputs (e.g., documents) are changed.
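A minimal sketch of this idea: every change to an input document immediately re-triggers the conformity checks. The checks and field names below are placeholders, not real regulatory requirements.

```python
# Placeholder check; a real system would hold hundreds of these.
def check_has_intended_purpose(doc: dict) -> bool:
    return bool(doc.get("intended_purpose"))

CHECKS = [check_has_intended_purpose]

class Dossier:
    """Re-evaluates all conformity checks on every change."""
    def __init__(self):
        self.doc, self.findings = {}, []

    def update(self, field: str, value) -> None:
        self.doc[field] = value
        self.findings = [c.__name__ for c in CHECKS if not c(self.doc)]

d = Dossier()
d.update("risk_file", "v1")         # intended purpose still missing
print(d.findings)                    # ['check_has_intended_purpose']
d.update("intended_purpose", "wound care")
print(d.findings)                    # []
```

Instead of an audit snapshot once a year, the conformity status is recomputed at every edit.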
To fully verify the compliance of medical devices and economic operators, all regulatory requirements must be transformed into algorithms.
These regulations include
The more specific the requirements, the easier they can be mapped into algorithms. In other words, regulation as code works more easily for checklists than for general regulatory frameworks.
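A checklist maps almost one-to-one onto predicate functions, whereas a broad framework requirement first needs interpretation. A sketch with invented checklist items:

```python
# Each checklist item becomes one testable predicate.
# Items and field names are illustrative assumptions.
CHECKLIST = {
    "label names the manufacturer": lambda dev: bool(dev.get("manufacturer")),
    "UDI is assigned":              lambda dev: bool(dev.get("udi")),
}

def run_checklist(device: dict) -> dict:
    """Evaluate every checklist item against the device record."""
    return {item: check(device) for item, check in CHECKLIST.items()}

result = run_checklist({"manufacturer": "Acme Med", "udi": ""})
print(result)
# {'label names the manufacturer': True, 'UDI is assigned': False}
```

A requirement like "the device must be safe", by contrast, offers no such direct translation; it must first be broken down into checkable criteria.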
In addition to these legal and normative requirements, authorities and notified bodies have the right to judge within a latitude of assessment and discretion, setting forth additional specific requirements. These additional requirements must also be mapped into algorithms.
The latitude of assessment relates to the factual side (the "if"). This means that the authority has leeway in deciding whether a fact has been fulfilled, e.g., whether a process has been defined with adequate precision.
In the same way, authorities have a latitude of discretion. For example, a school may determine a student's grade within given limits or decide whether to expel a student for a given set of facts.
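Such latitude does not prevent encoding; the algorithm simply has to make the permitted leeway explicit rather than fix a single outcome. A sketch using the grading example, with illustrative limits:

```python
def grade_within_discretion(proposed: int, lower: int = 2, upper: int = 4) -> int:
    """The authority may choose any grade, but only inside the
    legally permitted corridor (the latitude of discretion).
    The limits here are illustrative assumptions."""
    if not lower <= proposed <= upper:
        raise ValueError(
            f"grade {proposed} outside permitted range [{lower}, {upper}]"
        )
    return proposed

print(grade_within_discretion(3))  # 3 -- a lawful discretionary choice
```

The rule does not dictate the decision; it only rejects decisions outside the corridor, which is precisely what a court reviewing discretion would do.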
The scope of regulation as code also includes defining which processes in which organizations must be checked for conformity.
For example, Johner Institute includes the following as processes for which regulatory requirements should be transformed into algorithms:
In many places, lawmakers have deliberately avoided providing clear guidance. The law employs, for example, so-called indeterminate legal terms.
Legislators are aware that laws with their limited means (language) cannot cover all conceivable circumstances of life. The less determinate the terms, the more leeway there is in applying the law. So this is an advantage.
However, the less determinate the terms are, the less clear is the actual intent in case of doubt. This, on the other hand, is a disadvantage.
This fuzziness hurts medical device manufacturers primarily in terms of the latitude of assessment (the "if" side): Discussions are held internally and with notified bodies and authorities to determine whether a document or device meets regulatory requirements.
Final standards are one category of indeterminate laws. These requirements only state the objectives, but unlike conditional standards, they do not contain if-then rules.
An example of a final standard's requirement:
"The EU is committed to ensuring that all patients have access to safe, affordable, and effective medical devices."
Regulation as code allows checking whether
Uncertainties often arise because laws are inadvertently phrased in such a way that it is not exactly clear what the facts and legal consequences imply:
Resolving this fuzziness and transforming it into algorithms is only possible if
In any case, the objective must be to eliminate inadvertent fuzziness.
The gaps and "fuzziness" must be eliminated in order to map the regulations into algorithms.
Legislators and standards bodies are currently unable to close the gaps and clarify the requirements with adequate swiftness. However, if someone else does so and thus assumes interpretive authority, a dispute may arise as to how binding these additions and clarifications are.
This would necessitate bodies to which this authority is delegated.
Writing a sentence containing a requirement is easy. Transforming that requirement into an algorithm with mathematical precision is hard. Many authors cannot, or do not want to, commit themselves to such exactness.
Moreover, the authors of such algorithms must be able to think in terms of different hierarchy levels and design consistent and coherent inheritance hierarchies. For example, a requirement of a national authority must comply with national laws, and the latter with European regulations.
This is not unique to regulation as code. But when requirements are phrased as text, such inconsistencies and contradictions are far less apparent.
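Expressed as code, the hierarchy check becomes mechanical. One possible (and deliberately simplified) model treats each rule as the set of options it permits; a subordinate rule may narrow, but never widen, its parent. All values are illustrative.

```python
# Rules modeled as sets of permitted options (illustrative values).
EU_REGULATION   = {"class_i", "class_iia", "class_iib", "class_iii"}
NATIONAL_LAW    = {"class_i", "class_iia"}             # narrows the EU rule
AGENCY_GUIDANCE = {"class_i", "class_iia", "special"}  # adds an option!

def complies_with_parent(child: set, parent: set) -> bool:
    """A subordinate rule is consistent if it never permits
    anything its parent forbids (subset relation)."""
    return child <= parent

print(complies_with_parent(NATIONAL_LAW, EU_REGULATION))    # True
print(complies_with_parent(AGENCY_GUIDANCE, NATIONAL_LAW))  # False
```

The second check exposes exactly the kind of inconsistency that, in text form, tends to surface only during an audit or in court.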
Even if legislators could and would describe their rules as code, the effort required to transform laws and regulations into algorithms would be high, and not just initially. Ongoing refinement is also time-consuming, because the algorithms must reflect the constantly evolving state of the art.
However, this does not imply that the effort required to transform regulations into algorithms is greater than the effort required by thousands of medical device manufacturers to verify the compliance of their devices and organizations.
Quite the opposite is true: The algorithms would automate these verifications.
And publishing future laws as algorithms from the outset (regulation as code) should require no more effort than publishing them as text.
After all, the authors can be assumed to have thought through and phrased all of their requirements precisely.
All algorithms are worthless when not used. Algorithms can only be used if they are provided with the necessary input. And this input must be available as data in the right formats and data structures.
In other words, manufacturers, authorities, and notified bodies need IT systems that access these algorithms and feed them with the appropriate input.
Formal languages, in particular domain-specific languages (DSLs), are suitable for mapping regulatory requirements. One example is Catala, a language that has been used to describe parts of the U.S. tax code.
Rule engines allow the definition of rules and control of their dependencies.
A technological framework must not only allow rule hierarchies, but also let users add their own rules and override existing rules. It must also verify internal rule consistency.
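As a hedged sketch of the first two framework properties, rule registration and deliberate overriding, here is a minimal rule engine. The class, its API, and the rule names are assumptions made for illustration, not an existing tool; consistency verification would be an additional layer on top.

```python
class RuleEngine:
    """Toy rule engine: registered rules are predicates over facts;
    overriding an existing rule must be requested explicitly."""

    def __init__(self):
        self.rules = {}  # rule_id -> predicate

    def add(self, rule_id: str, predicate, override: bool = False):
        if rule_id in self.rules and not override:
            raise KeyError(f"{rule_id} already exists; pass override=True")
        self.rules[rule_id] = predicate

    def evaluate(self, facts: dict) -> dict:
        return {rid: rule(facts) for rid, rule in self.rules.items()}

engine = RuleEngine()
engine.add("risk-file-exists", lambda f: f.get("risk_file") is not None)
# A notified body tightens the rule for its own audits:
engine.add("risk-file-exists",
           lambda f: f.get("risk_file") == "approved", override=True)

print(engine.evaluate({"risk_file": "draft"}))
# {'risk-file-exists': False}
```

Making overrides explicit, rather than letting a later rule silently shadow an earlier one, is what keeps the hierarchy auditable.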
Such a framework would also create an important foundation for an entire ecosystem of "algorithm providers."
Lawmakers, along with standards bodies, for example, should begin expressing selected requirements as algorithms in order to get a feel for the feasibility, the effort, and the design of the processes.
It is not enough(!) to break regulatory requirements down to the level of individual sentences. Rather, the content of these sentences must be expressed in formal languages.
This implementation requires concurrent research in order to
A step-by-step implementation is thus possible. There is no need for a "big bang."
Manufacturers, authorities, and notified bodies do not have to invest as much into research as lawmakers, but can directly initiate the application:
Many companies boast that they are already data-centric and are building an enterprise-wide data model.
As long as this data model is unable to provide the algorithms with the appropriate data, it is only of limited use for automating the abovementioned processes.
Johner Institute supports the digital transformation of manufacturers, authorities, and notified bodies, in particular regarding their relevant regulatory processes:
Regulation as code, or the transformation of regulations into algorithms, may sound like a pipe dream. But implementation has already started. Johner Institute, for example, is already transforming regulations into algorithms.
It is evident that the Pareto principle should also apply here: A large part of manual verification activities can be automated with reasonable effort.
The task now is to support manufacturers in their digital transformation and to encourage lawmakers to follow this path. This will be an iterative process accompanied by research.
The precision of algorithms not only forces precision in the phrasing of regulatory requirements. It also reveals mistakes quickly and mercilessly.
Therein lies the opportunity to correct these errors just as quickly – and to respond to technological trends.
However, this requires the ability to
In turn, this requires a regulatory meta-framework including processes and systems to
Not by coincidence, this resembles best practices from software development.
As a rule, unclear, ambiguous, and difficult-to-understand laws, standards, and guidelines do not reflect their authors' attempt to do justice to an incalculable number of situations (facts). Rather, they testify to missing definitions, imprecise thinking, and a lack of willingness (or ability) to think the requirements through rigorously when phrasing them.
The result is evident: A fuzziness of regulation that has become a burden for all those who have to follow and enforce these regulations.
Regulation as code brings an end to this fuzziness.
At the same time, regulation as code is the prerequisite for ensuring that regulation can keep pace with technology in terms of speed and precision and does not become a stumbling block to innovation.
Johner Institute is looking for sponsors for LegalTech research projects.