Structured Data with Custom Validation
Use Case: Weather Data Aggregation for Insurance Payouts
In parametric insurance, payouts are automatically triggered when predefined weather conditions are met, such as rainfall above a certain level or wind speeds exceeding a specific threshold. The data for these events is collected from various weather stations or meteorological sources, which may report slightly different readings due to location or equipment sensitivity. Aggregating this data using averages or medians can be problematic when a specific threshold must be met for an insurance payout to occur.
Why Traditional Oracles Are Not Suitable
Traditional oracles that average or take the median across weather stations can obscure the very events that trigger payouts. For example, if one weather station reports rainfall just above the payout threshold but the others report lower amounts, averaging the data could pull the aggregated value below the threshold and prevent the payout. Similarly, the median would discard the station reporting the crucial reading, leading to a missed payout for policyholders.
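As a minimal numeric illustration (the 50 mm threshold and the station readings below are made-up values, not taken from any real policy), both the mean and the median sit under the threshold even though one station has clearly exceeded it:

```python
import statistics

# Hypothetical readings (mm) from three weather stations; assumed payout threshold = 50 mm.
THRESHOLD_MM = 50.0
readings = [52.0, 46.0, 44.0]  # one station exceeds the threshold

mean_value = statistics.mean(readings)      # 47.33 -> below threshold
median_value = statistics.median(readings)  # 46.0  -> below threshold

print(f"mean={mean_value:.2f}, median={median_value}")
print("payout by mean?  ", mean_value >= THRESHOLD_MM)    # False
print("payout by median?", median_value >= THRESHOLD_MM)  # False
```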
How IntelliX Solves This
IntelliX enables custom validation of weather data from different sources. Instead of using an average or median, IntelliX can apply rules that ensure the correct threshold is respected, regardless of slight variations across stations. For example, a custom rule could be set to trigger the insurance payout if any one weather station reports rainfall above the threshold, ensuring that policyholders receive their payout even if other stations report lower values. This makes IntelliX well-suited for scenarios where specific conditions must be met for automated contract execution.
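The exact validation interface is not specified in this section, so the following is only a sketch of what such a custom rule might look like; the `StationReading` structure, the `rainfall_any_above` function, and the `min_reporting_stations` parameter are illustrative names, not part of the IntelliX API.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class StationReading:
    station_id: str
    rainfall_mm: float

def rainfall_any_above(readings: Iterable[StationReading],
                       threshold_mm: float,
                       min_reporting_stations: int = 1) -> Optional[bool]:
    """Custom aggregation rule: the payout condition is met if ANY station
    exceeds the threshold, instead of averaging the readings.

    Returns None when too few stations reported to make a decision.
    """
    readings = list(readings)
    if len(readings) < min_reporting_stations:
        return None  # not enough data to validate
    return any(r.rainfall_mm >= threshold_mm for r in readings)

# Example: one station above the assumed 50 mm threshold is enough to trigger the payout.
readings = [
    StationReading("station-a", 52.0),
    StationReading("station-b", 46.0),
    StationReading("station-c", 44.0),
]
print(rainfall_any_above(readings, threshold_mm=50.0))  # True
```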
Use Case: Location Data Aggregation for Asset Tracking
In asset tracking systems, such as monitoring the movement of vehicles, shipments, or valuable equipment, location data is constantly collected. However, publishing every small movement on-chain may be unnecessary and could result in an overload of data. Instead, it is often more efficient to filter the data and publish only significant movements, ensuring that on-chain records reflect meaningful changes rather than minor shifts.
Why Traditional Oracles Are Not Suitable
Traditional oracles generally publish all data points or use simple aggregation methods without the ability to filter out small, irrelevant changes in position. For example, a vehicle might report minor movements throughout the day as it shifts slightly while parked. If this location data is aggregated and published continuously, it could clutter the blockchain with unnecessary records and make it difficult to track meaningful changes.
How IntelliX Solves This
IntelliX allows validators to implement custom rules for filtering location data. Instead of publishing every small movement, the system can be set to only publish substantial movements—such as a vehicle moving more than a specific distance or crossing a defined geographical boundary. This filtering ensures that only significant changes in location are recorded on-chain, reducing data noise while still providing a reliable record of important movements.
For example, in a fleet management system, the location of delivery trucks can be tracked continuously, but only substantial movements (e.g., leaving a city or entering a delivery area) are published on-chain. This reduces unnecessary data while ensuring the relevant tracking information is available for auditing or reporting purposes.
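As a rough sketch of such a filtering rule (the 500-metre threshold, the bounding-box geofence, and the helper names below are assumptions for illustration, not IntelliX-defined behavior), a validator could publish a new location only when the asset has moved beyond a minimum distance or crossed a defined boundary:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6_371_000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_zone(lat, lon, zone):
    """Simple bounding-box geofence check; zone = (min_lat, min_lon, max_lat, max_lon)."""
    min_lat, min_lon, max_lat, max_lon = zone
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

def should_publish(prev, curr, min_move_m=500.0, zone=None):
    """Publish only if the asset moved farther than min_move_m metres,
    or if it crossed the geofence boundary since the last published fix."""
    if prev is None:
        return True  # always publish the first fix
    moved = haversine_m(prev[0], prev[1], curr[0], curr[1]) >= min_move_m
    crossed = zone is not None and in_zone(*prev, zone) != in_zone(*curr, zone)
    return moved or crossed

# Example: a truck shifting a few metres while parked is filtered out,
# while entering the (hypothetical) delivery area is published on-chain.
zone = (40.70, -74.02, 40.72, -74.00)
print(should_publish((40.6900, -74.0300), (40.6901, -74.0300), zone=zone))  # False (minor shift)
print(should_publish((40.6900, -74.0300), (40.7100, -74.0100), zone=zone))  # True (enters zone)
```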