Adding Consistency
Consistency is the "C" in ACID. It ensures that every transaction moves data from one valid state to another. This means that all changes made to the data must satisfy a defined set of validations.
To achieve consistency in Mendix, developers must clearly define these validations and strategically implement checks to validate data entering the system from any entry point.
1. Validation types
Validations can be grouped into two types based on whether they depend on the time or context of the transaction: Invariant Validations and Variant Validations.
1.1. Invariant Validations
Invariant Validations enforce Data Integrity. Their purpose is to guarantee that stored data is never in an invalid state. The outcome of an invariant validation remains consistent, no matter the time or place it is checked.
The Order_Amount must be ≥ €100.
This holds at all times, regardless of when or where it is checked.
Invariant validations are implemented using the VAL microflow typology. VAL microflows must always be executed by the Commit Microflow Typology (CMT) before objects are committed. Although invariant validations may be called from touchpoint microflows (ACT, SCE, PUB) to provide early feedback to the user, they must still be included in the CMT microflow execution, because the CMT is ultimately responsible for guaranteeing data integrity.
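Mendix microflows are modeled visually rather than written as code, but the contract of a VAL microflow can be sketched in Python. The `Order` entity, its `amount` attribute, and the €100 threshold are illustrative assumptions, not part of any Mendix API:

```python
# Sketch of a VAL microflow: a pure invariant check over an Order object.
from dataclasses import dataclass

MINIMUM_ORDER_AMOUNT = 100.0  # EUR; illustrative business rule

@dataclass
class Order:
    amount: float

def val_order(order: Order) -> list[str]:
    """Return a list of validation messages; an empty list means valid.

    The result depends only on the object's state, never on when or
    where the check runs -- that is what makes it an invariant.
    """
    errors = []
    if order.amount < MINIMUM_ORDER_AMOUNT:
        errors.append(f"Order_Amount must be >= EUR {MINIMUM_ORDER_AMOUNT:.0f}.")
    return errors
```

Because the check is side-effect free, the same logic can safely be called early from a touchpoint microflow for user feedback and again from the CMT just before commit.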
If an application frequently stores data but has few VAL microflows, it suggests that invariant validations might be incorrectly implemented as variant rules within Orchestration (ORC) microflows.
1.2. Variant Validations
A variant validation enforces Process Validation. Its goal is to confirm that a specific action is valid at the moment of execution.
On order placement, the user must have a valid credit card.
This is only checked at the moment the user clicks Place Order.
Variant rules are implemented using decision activities or by calling RULE microflows.
Design Principle: Do not rely on Variant Rules to enforce Data Integrity. Spreading context-dependent checks throughout the application logic results in weak enforcement and poor testability. Invariant Rules must be universally applied.
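The time-dependent nature of a variant rule can be illustrated with a sketch. The credit-card check and all names (`CreditCard`, `expiry`, `rule_can_place_order`) are illustrative assumptions:

```python
# Sketch of a variant rule: valid only at the moment of a specific action.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CreditCard:
    expiry: date

@dataclass
class User:
    card: Optional[CreditCard]

def rule_can_place_order(user: User, today: date) -> bool:
    """RULE-style check: is this user allowed to place an order *now*?

    The outcome depends on the current date, so it must not be reused
    to guarantee stored data integrity -- the same user object may
    pass today and fail next month.
    """
    return user.card is not None and user.card.expiry >= today
```

The same `User` object yields different outcomes on different days, which is exactly why variant rules cannot substitute for invariant validations.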
2. Layered Validation Strategy
A layered approach to validation is necessary to ensure the application is robust and provides a good user experience. Three primary layers are used:
- Client-Side Validation (UX Focus)
- Server-Side Validation (Integrity Focus)
- Scheduled Validation (Integrity Focus)
2.1. Client-Side Validation (UX Focus)
Goal: To give the user immediate feedback about values and basic data formatting.
Mendix Location: Implemented within the Touchpoint microflow as a precondition, as a call to a VAL microflow, or directly using input widget properties.
Crucial Note: Client-side validation is solely for user experience. It can be easily bypassed and should never be trusted for ensuring data security or integrity.
2.2. Server-Side Validation (Integrity Focus)
Goal: To perform the final, definitive check of all validations before any data is committed to the database.
Mendix Location: Performed by dedicated Validation Microflows (VAL), which are executed by the Commit Microflow Typology (CMT).
Crucial Note: The back-end (server-side) layer is the only layer that can truly guarantee consistency. Therefore, all server-side validations must run before the Mendix Commit actions inside the CMT microflow.
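A minimal sketch of how a CMT could sequence validation and commit. The `ValidationError` class, the validator functions, and the `commit` callable are assumptions for illustration; in Mendix this flow is modeled as microflow activities, not written as code:

```python
# Sketch of a Commit Microflow Typology (CMT): all VAL-style checks
# run first; any failure aborts the transaction before the commit.

class ValidationError(Exception):
    """Raised when an invariant is violated; the commit never happens."""

def cmt_commit(obj, validators, commit):
    """obj: the object to persist.
    validators: VAL-style functions returning a list of error messages.
    commit: callable performing the actual database commit.
    """
    errors = [msg for val in validators for msg in val(obj)]
    if errors:
        # The object never reaches the database in an invalid state.
        raise ValidationError("; ".join(errors))
    commit(obj)
```

The key design choice is that the commit action is unreachable unless every validator returns an empty result, mirroring the rule that the CMT alone guarantees data integrity.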
2.3. Scheduled Validation (Integrity Focus)
Goal: To maintain the long-term integrity of data by regularly checking the entire dataset against the system’s Invariant Validations. Because this validation runs outside the immediate user transaction, it is well-suited for large-scale data quality audits.
This validation layer is important for catching data inconsistencies that might be introduced by:
- Validation Changes: Data that is created before a new validation was implemented (e.g., a "Minimum Order Amount" rule increases).
- External Sources: Data imported via APIs or batch jobs that bypassed the usual real-time validation layers.
- System Drift: Unexpected corruption or rare errors that were not detected by real-time checks.
Mendix Location: This layer is typically implemented as a scheduled event touchpoint microflow (SCE) that triggers an orchestration microflow (ORC). The ORC microflow then retrieves the relevant data and executes the VAL microflows.
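The SCE-to-ORC pattern could be sketched as follows. The batch size, the `retrieve_batch` and `report` callables, and the return value are assumptions made for illustration:

```python
# Sketch of a scheduled data-quality audit: an SCE-style entry point
# triggers ORC-style logic that validates the full dataset in batches.

def orc_audit(retrieve_batch, validators, report, batch_size=1000):
    """retrieve_batch(offset, limit) -> list of objects (empty when done).
    validators: VAL-style functions returning lists of error messages.
    report(obj, msgs): records each violation for follow-up.
    Returns the number of invalid objects found.
    """
    offset, invalid = 0, 0
    while True:
        batch = retrieve_batch(offset, batch_size)
        if not batch:
            return invalid
        for obj in batch:
            msgs = [m for val in validators for m in val(obj)]
            if msgs:
                report(obj, msgs)
                invalid += 1
        offset += batch_size
```

Batching keeps memory usage bounded on large datasets, which is what makes this layer suitable for full-dataset audits that a per-transaction check could never afford.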
Crucial Note: If the dataset contains historical objects that are purposely exempt from new validations, this approach can be combined with Data Offloading & Cleansing to archive the older data.