Data Mappings
Understanding Data Transformation
Mappings are JavaScript or Python scripts, each with an assigned name and description, that bridge raw data source responses and actionable decision inputs. Each script processes information retrieved from a data source, extracts specific values, transforms them as needed, and returns a named value, called a feature, that decision flows use during scoring.
Core Mapping Concepts
A mapping functions as a data transformation mechanism:
Input: Raw response from a data source
Processing: Extraction and transformation logic
Output: Named feature value for use in workflows
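In its simplest form, a mapping is a single expression whose value becomes the feature. A minimal sketch, assuming a hypothetical data source bound as source_converted:

// Hypothetical example: extract the applicant's account age in months
source_converted.accountSummary.ageInMonths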
Users determine the set of required mappings based on the data elements that must be analyzed during the scoring process. This user-driven approach ensures that only necessary transformations are implemented and maintained.
Technical Requirements
Mappings must adhere to specific technical constraints:
Return Types
Valid return types include:
String values
Null (for missing or invalid data)
Boolean (true/false)
Integer numbers
Decimal numbers
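A mapping returns one of these types as the value of its final expression. A minimal sketch, assuming a hypothetical source_converted response object:

// Hypothetical boolean feature: does the applicant hold any active account?
// Returns null when the accounts section is absent from the response
const accounts = source_converted.accounts;
accounts ? accounts.activeCount > 0 : null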
Data Handling Requirements
Must return flat, anonymized data
Avoid personally identifiable information
Focus on behavioral indicators and calculated metrics
Compliance Considerations
While it is technically possible to return personal data, proper scoring methodology requires adherence to privacy principles:
Use anonymized behavioral indicators
Comply with jurisdiction-specific regulations
Maintain fairness principles in decision-making
Avoid storing sensitive personal information
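As a sketch of these principles (field names hypothetical), a mapping can return a derived behavioral metric instead of raw identifying fields:

// Return a calculated debt-to-income ratio rather than raw salary or identity data
const income = source_converted.income.monthlyNet;
const obligations = source_converted.liabilities.monthlyTotal;
income > 0 ? obligations / income : null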
Implementation Examples
The system documentation provides these practical examples:
Example 1: Experian Monthly Obligations
// Monthly Credit Commitments (SP & SPA)
experian_converted.Response.ConsumerSummary.CATO.spEDI08
This simple mapping extracts a specific field from the Experian response structure, returning the monthly credit commitment value directly.
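If intermediate nodes can be absent from a response, a defensive variant of the same extraction (a sketch, not part of the documented example) can guard each step and fall back to null:

// Null-safe extraction of the same Experian field
const response = experian_converted.Response;
const summary = response && response.ConsumerSummary;
summary && summary.CATO ? summary.CATO.spEDI08 : null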
Example 2: Tink Average Expenses Calculation
// Import Java's RoundingMode via interop for precise monetary rounding
const RoundingMode = Java.type('java.math.RoundingMode');

// Total the last twelve months of each expense category
const insuranceExpenses = getTotalFor12Months(tink_converted.expenses.insurance);
const transportationExpenses = getTotalFor12Months(tink_converted.expenses.transportation);
const utilitiesExpenses = getTotalFor12Months(tink_converted.expenses.utilities);
const housingExpenses = getTotalFor12Months(tink_converted.expenses.housing);
const childRelatedExpenses = getTotalFor12Months(tink_converted.expenses.childRelated);

// Aggregate all categories and take the absolute value
const expenses = insuranceExpenses
    .add(transportationExpenses)
    .add(utilitiesExpenses)
    .add(housingExpenses)
    .add(childRelatedExpenses)
    .abs();

// The mapping's result: the last evaluated expression
expenses.doubleValue()

// Sum a single expense category over the last twelve months
function getTotalFor12Months(category) {
    return unscale(category.summaries.summariesByMonth.lastTwelveMonths.total);
}

// Convert Tink's exact-number representation (unscaled value plus scale)
// into a decimal rounded to two places, half up
function unscale(exactNumber) {
    const scale = exactNumber.scale === 0 ? 0 : -exactNumber.scale;
    return exactNumber.unscaledValue
        .scaleByPowerOfTen(scale)
        .setScale(2, RoundingMode.HALF_UP);
}
This complex mapping demonstrates:
Multiple data point aggregation
Mathematical operations on financial data
Precision handling for monetary values
Reusable function definitions
Mapping Development Process
Step 1: Identify Required Features
Analyze your scoring requirements to determine:
What data elements influence decisions
How raw data must be transformed
What calculations are needed
Expected output formats
Step 2: Understand Data Source Responses
Before creating mappings:
Review data source documentation
Examine response structures
Identify relevant fields
Note data formats and types
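For example, the Tink mapping shown earlier implies a response shape along these lines (a reconstruction for illustration, not an official Tink schema):

{
  "expenses": {
    "insurance": {
      "summaries": {
        "summariesByMonth": {
          "lastTwelveMonths": {
            "total": { "unscaledValue": 123456, "scale": 2 }
          }
        }
      }
    }
  }
}

Knowing that total carries an unscaled value and a scale (and that, in the execution context, unscaledValue behaves as a Java decimal object supporting scaleByPowerOfTen) tells you the mapping must rescale it before arithmetic, exactly as the unscale helper does.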
Step 3: Design Transformation Logic
Plan your mapping approach:
Simple field extraction for direct values
Complex calculations for derived metrics
Aggregations for summary data
Conditional logic for varied scenarios
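A sketch of conditional logic in a mapping, using hypothetical field and category names:

// Hypothetical conditional mapping: categorize employment stability
const months = source_converted.income.monthsOnCurrentJob;
months == null ? null
    : months >= 24 ? 'STABLE'
    : months >= 6 ? 'MODERATE'
    : 'NEW'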
Step 4: Implement and Test
Create the mapping script:
Write clear, maintainable code
Handle edge cases and null values
Test with sample data
Validate output accuracy
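A null-handling sketch, assuming a hypothetical source_converted response that exposes plain JavaScript numbers:

// Guard against missing sections and non-numeric values
function safeNumber(value) {
    return typeof value === 'number' && isFinite(value) ? value : null;
}
const summary = source_converted.summary;
summary ? safeNumber(summary.averageBalance) : null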
Step 5: Deploy and Monitor
Put mappings into production:
Associate with data sources
Configure in workflows
Monitor execution success
Track feature quality
Mapping Implementation Principles
Based on the documented examples, mappings demonstrate:
Simple Extraction (Experian Example)
Direct field access from structured responses
Navigation through nested data structures
Return of specific values without transformation
Complex Aggregation (Tink Example)
Multiple category processing
Mathematical operations on financial data
Precision handling for monetary calculations
Helper function usage for code reusability
System Requirements and Constraints
Framework Enforcement
The system requires mappings to:
Return only valid data types (string, null, boolean, integer, decimal)
Process data source responses into named features
Execute within workflow contexts
Maintain flat data structures
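For instance, a feature must resolve to one flat scalar, not a nested object (a hypothetical sketch):

// Invalid as a feature: returning a structure such as { balance: ..., currency: ... }
// Valid: returning a single flat value
source_converted.summary.balance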
Operational Behavior
Each mapping has an assigned name and description
Modifications tracked through versioning
Integration with workflows via Expression nodes
Output feeds profile building for offer calculation
Integration with Workflows
Mappings integrate seamlessly with the decision framework:
Data Flow Sequence
1. Workflow triggers a data source call
2. Data source returns a raw response
3. Mappings process the response into features
4. Features feed into workflow expressions
5. Decisions are based on the transformed data
Feature Usage in Workflows
Expression nodes access mapping outputs
Switch nodes evaluate feature values
Save to Profile nodes store results
Decision logic uses normalized data
Configuration Management
Version Control
Each mapping modification should be:
Tracked with version numbers
Documented with change reasons
Tested before deployment
Rolled back if issues arise
Testing Strategy
Comprehensive testing includes:
Unit tests for individual mappings
Integration tests with data sources
End-to-end workflow validation
Performance benchmarking
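A minimal unit-test sketch, assuming the mapping logic is factored into a plain function and run under Node.js outside the production engine:

const assert = require('node:assert');

// Hypothetical extraction of the Experian mapping's logic into a testable function
function monthlyObligations(response) {
    const r = response && response.Response;
    const summary = r && r.ConsumerSummary;
    return summary && summary.CATO ? summary.CATO.spEDI08 : null;
}

assert.strictEqual(
    monthlyObligations({ Response: { ConsumerSummary: { CATO: { spEDI08: 450 } } } }),
    450
);
assert.strictEqual(monthlyObligations({}), null);
console.log('mapping tests passed');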
Documentation Requirements
Maintain clear documentation:
Mapping purpose and logic
Input/output specifications
Business rule explanations
Update history
Implementation Resources
Through the Admin Panel (Step 1)
Access mapping configuration:
Feature Store - Mapping management interface
Through the SDK
Mappings have no SDK-level configuration; they are managed entirely through the Admin Panel.
Framework Integration
Mapping Relationships
The system establishes clear relationships:
Data Sources: Provide raw responses for processing
Mappings: Transform responses into features
Workflows: Consume features for decisions
Profiles: Store feature values for offers
Execution Context
Mappings operate within defined constraints:
JavaScript or Python language support
Access to data source response structures
Return type validation by framework
Integration with workflow Expression nodes
Related Topics
Data Sources - Input providers for mappings
Workflow Tool - Feature consumption in decisions
Manual Review - Handling mapping exceptions
Profiles - Feature storage destination
timveroOS: Precision data transformation for intelligent lending decisions