Data Sources

Overview

timveroOS provides comprehensive data integration capabilities to support informed lending decisions. The platform enables connections to various external data providers and internal systems through a flexible integration framework.

Integration Architecture

timveroOS uses a Subject-based pattern for data source integration:

How It Works:

  1. Subject Definition: Each data source requires a "subject" that defines what data to fetch (e.g., participant ID, credit application)

  2. Data Retrieval: The system calls the data source with the subject information

  3. Parsing: Retrieved data is automatically parsed into structured objects

  4. Feature Creation: Parsed data becomes available as features for workflows

Custom Integration Pattern: New data sources can be integrated by implementing the MappedDataSource interface, which requires the following (a minimal sketch follows the list):

  • getData() method for fetching raw data

  • parseRecord() method for converting to structured data

  • Error handling for unavailable data scenarios
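
The sketch below shows what such an implementation might look like in Java. It is illustrative only: the interface generics and method signatures are assumptions, and the VehicleSubject, VehicleValuation, MarketCheckClient, and DataUnavailableException types are hypothetical stand-ins rather than actual SDK classes.

```java
// Illustrative only: a custom data source following the getData()/parseRecord()
// split described above. All types other than MappedDataSource are hypothetical,
// and the real SDK signatures may differ.
public class VehicleValuationDataSource
        implements MappedDataSource<VehicleSubject, VehicleValuation> {

    private final MarketCheckClient client; // hypothetical API client wrapper

    public VehicleValuationDataSource(MarketCheckClient client) {
        this.client = client;
    }

    @Override
    public String getData(VehicleSubject subject) {
        // Fetch the raw payload for the subject, here identified by VIN.
        String response = client.lookupByVin(subject.getVin());
        if (response == null || response.isBlank()) {
            // Handle the "data unavailable" scenario explicitly
            // (DataUnavailableException is assumed to be an unchecked exception).
            throw new DataUnavailableException("No valuation found for VIN " + subject.getVin());
        }
        return response;
    }

    @Override
    public VehicleValuation parseRecord(String rawData) {
        // Convert the raw response into the structured target object.
        return VehicleValuation.fromJson(rawData);
    }
}
```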

Performance Considerations:

  • Data is cached according to configured TTL values

  • Failed requests are logged for monitoring

  • Retry logic is configurable per data source

Data Source Categories

Credit Bureau Integration

The system supports integration with major credit bureaus:

  • Experian: Credit reports, identity verification, business credit

  • Equifax: Consumer credit data, verification services

  • TransUnion: Credit monitoring, fraud detection

  • Specialized Bureaus: Additional data providers as configured

Banking and Financial Data

Connect to banking data providers for:

  • Account verification

  • Transaction analysis

  • Income verification

  • Cash flow assessment

Integrated Providers:

  • Plaid: Bank account verification, transaction data

  • Tink: Open banking connectivity

  • TrueLayer: Payment initiation, account information

Business Data Sources

Access business financial information through:

  • Boss Insights: Business analytics and insights

  • Zoho: Books API, Expense API, Inventory API

  • Bud: Financial data aggregation

Specialized Data Providers

  • MarketCheck: Vehicle valuation for auto lending

  • Plumery: Vendor data services

  • Mambu: Core banking integration

Pre-Integrated Providers

timveroOS includes ready-to-deploy integrations with the following providers:

Data Provider | Available Services | Common Use Cases
Experian | ConsumerView API, Delphi Select V2, Microcells API, WorldView API, PowerCurve, Business Decisioning, Commercial Credit, KYC, Business Ownership, Business Profile | Credit assessment, identity verification, business credit analysis
Boss Insights | Business financial data and analytics | Commercial lending decisions, cash flow analysis
Mambu | Core banking platform integration | Account data, transaction history
Plumery | Vendor data services | Various data verification needs
Plaid | Payments and funding API, Financial insights API, Credit and underwriting | Income verification, account ownership, transaction analysis
Tink | Open banking services | Account aggregation, payment initiation
Zoho | Books API, Expense API, Inventory API, Invoice API, Billing API | Business financial management data
TrueLayer | Open banking and payments | Account information, payment processing
Bud | Financial data intelligence | Transaction categorization, insights
MarketCheck | Vehicle data and valuations | Auto loan underwriting, collateral valuation

Integration Process

Configuration Steps

  1. Enable Provider

    • Select from available integrations

    • Configure authentication credentials

    • Set up data permissions

  2. Map Data Fields

    • Link provider fields to system attributes

    • Configure data transformations

    • Set validation rules

  3. Test Integration

    • Verify connectivity

    • Test data retrieval

    • Validate field mappings

  4. Deploy to Production

    • Enable for live processing

    • Monitor performance

    • Track usage metrics

Data Usage in Workflows

Feature Creation

When integrated data is retrieved, it can be transformed into features through the Feature Store for use in workflow decision logic. See Feature Store for detailed configuration instructions.

Features support:

  • Credit scoring algorithms

  • Risk assessment models

  • Eligibility determinations

  • Workflow decision logic

Workflow Integration

Data sources connect to workflows through:

  • Load Datasource tasks in workflow designer

  • Automatic data retrieval triggers

  • Conditional data fetching based on rules

  • Real-time or batch processing options

Data Transformation

The system supports transforming raw data into usable features, as illustrated in the sketch after this list:

  • Direct value usage (e.g., credit scores)

  • Calculated metrics (e.g., debt ratios)

  • Aggregated values (e.g., transaction summaries)

  • Derived indicators (e.g., stability scores)
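
As a concrete instance of the "calculated metrics" case, the snippet below derives a debt-to-income ratio from two parsed values. It is a standalone illustration; the field names and the way values reach the Feature Store are assumptions, not SDK contracts.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Illustrative only: deriving a "calculated metric" feature from parsed data.
public final class DebtRatio {

    /** Monthly debt obligations divided by monthly income, or null when income is unknown. */
    public static BigDecimal debtToIncome(BigDecimal monthlyDebt, BigDecimal monthlyIncome) {
        if (monthlyDebt == null || monthlyIncome == null || monthlyIncome.signum() <= 0) {
            return null; // missing or invalid input: let downstream rules treat it as unavailable
        }
        return monthlyDebt.divide(monthlyIncome, 4, RoundingMode.HALF_UP);
    }
}
```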

Configuration Management

Authentication

Each data source requires appropriate authentication (a credential-loading sketch follows the list):

  • API keys

  • OAuth credentials

  • Certificate-based authentication

  • IP whitelisting where required
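
A minimal credential-loading sketch is shown below. It assumes credentials are supplied through environment variables; in practice they may come from a secrets manager, and the variable name used here is hypothetical.

```java
// Illustrative only: resolve a provider API key at startup instead of hardcoding it.
public final class ProviderCredentials {

    public static String requiredEnv(String name) {
        String value = System.getenv(name);
        if (value == null || value.isBlank()) {
            throw new IllegalStateException("Missing credential: " + name);
        }
        return value;
    }
}

// Hypothetical usage:
// String experianKey = ProviderCredentials.requiredEnv("EXPERIAN_API_KEY");
```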

Data Freshness

Configure how often data should be refreshed; a simple caching sketch follows the list:

  • Real-time queries for critical decisions

  • Cached data for frequently accessed information

  • Scheduled updates for batch processing

  • Event-triggered refreshes
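
The sketch below shows the idea behind the cached option: a value is reused until its time-to-live expires, then fetched again. timveroOS manages caching through its own configuration; this standalone class only illustrates the mechanism.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Illustrative only: reuse a fetched value until its TTL expires, then refetch.
public final class TtlCache<K, V> {

    private record Entry<T>(T value, Instant expiresAt) {}

    private final Map<K, Entry<V>> entries = new ConcurrentHashMap<>();
    private final Duration ttl;

    public TtlCache(Duration ttl) {
        this.ttl = ttl;
    }

    public V get(K key, Function<K, V> fetch) {
        Entry<V> cached = entries.get(key);
        if (cached != null && Instant.now().isBefore(cached.expiresAt())) {
            return cached.value(); // still fresh: serve from cache
        }
        V value = fetch.apply(key); // stale or missing: call the data source again
        entries.put(key, new Entry<>(value, Instant.now().plus(ttl)));
        return value;
    }
}
```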

Error Handling

The system provides robust error handling (a retry-and-fallback sketch follows the list):

  • Connection retry mechanisms

  • Fallback data sources

  • Error logging and alerts

  • Manual intervention workflows
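
A generic retry-then-fallback sketch follows. Attempt counts and delays would normally come from the per-source configuration mentioned above; the supplier arguments stand in for the primary and fallback data source calls.

```java
import java.util.function.Supplier;

// Illustrative only: retry the primary call, then fall back to a secondary source.
public final class Retries {

    public static <T> T withRetryAndFallback(Supplier<T> primary, Supplier<T> fallback,
                                             int maxAttempts, long delayMillis) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return primary.get();
            } catch (RuntimeException e) {
                // Log each failure so monitoring can pick it up.
                System.err.println("Attempt " + attempt + " failed: " + e.getMessage());
                try {
                    Thread.sleep(delayMillis);
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    break;
                }
            }
        }
        // Retries exhausted (or interrupted): use the fallback source.
        return fallback.get();
    }
}
```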

Security and Compliance

Data Security

All integrations implement:

  • Encrypted data transmission

  • Secure credential storage

  • Access logging

  • Data retention policies

Compliance Features

  • Consent management

  • Audit trail maintenance

  • Data minimization

  • Right to deletion support

Performance Considerations

Optimization Strategies

  • Efficient data caching

  • Parallel processing where applicable (sketched after this list)

  • Query optimization

  • Resource management
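
For the parallel-processing point, the sketch below fetches two independent sources concurrently so overall latency is roughly that of the slower call rather than the sum of both. The source objects and result types (CreditReport, BankSummary, RiskInputs) are hypothetical.

```java
import java.util.concurrent.CompletableFuture;

// Illustrative only: run two independent data source calls in parallel.
// creditBureauSource and bankingSource are assumed fields on the enclosing class.
public RiskInputs fetchInParallel(String applicantId) {
    CompletableFuture<CreditReport> credit =
            CompletableFuture.supplyAsync(() -> creditBureauSource.fetch(applicantId));
    CompletableFuture<BankSummary> banking =
            CompletableFuture.supplyAsync(() -> bankingSource.fetch(applicantId));
    // Block until both complete and combine the results for downstream scoring.
    return credit.thenCombine(banking, RiskInputs::new).join();
}
```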

Monitoring

Track integration performance through the following; a timing sketch follows the list:

  • Response time metrics

  • Success rate monitoring

  • Data quality indicators

  • Usage analytics
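
One simple way to capture response-time and success-rate data is to time each call and hand the result to whatever metrics sink is in use. The MetricsRecorder interface below is hypothetical.

```java
// Illustrative only: time a data source call and record success or failure.
public <T> T timed(String sourceName, java.util.function.Supplier<T> call, MetricsRecorder metrics) {
    long start = System.nanoTime();
    try {
        T result = call.get();
        metrics.success(sourceName, System.nanoTime() - start);
        return result;
    } catch (RuntimeException e) {
        metrics.failure(sourceName, System.nanoTime() - start);
        throw e;
    }
}
```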

Technical Implementation

Data sources configured in the admin interface correspond to SDK implementations of the DataSource or MappedDataSource interfaces. Each integration requires the following (a Subject and Target sketch follows the list):

  • Subject interface defining required input data

  • Target class representing the parsed response

  • Service implementation handling API calls

  • Error handling for data unavailability
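
The pair below sketches what a Subject and Target might look like for a bank-statement integration. The base interfaces, naming conventions, and fields are assumptions made for illustration; each type would normally live in its own file.

```java
import java.math.BigDecimal;

// Illustrative only: the Subject declares what input the data source needs.
interface BankStatementSubject {
    String getParticipantId(); // whose data to fetch
    String getAccountToken();  // provider-side account reference
}

// Illustrative only: the Target holds the parsed response used to build features.
class BankStatementSummary {
    private BigDecimal averageMonthlyInflow;
    private BigDecimal averageMonthlyOutflow;
    private int monthsCovered;
    // getters and setters omitted for brevity
}
```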

For technical details on implementing custom data sources, consult your development team or SDK documentation.

Best Practices

Integration Design

  • Start with essential data sources

  • Test thoroughly before production use

  • Document field mappings

  • Monitor usage patterns

Data Quality

  • Validate incoming data

  • Handle missing values appropriately

  • Track data accuracy

  • Regular quality audits

Operational Management

  • Regular credential updates

  • Performance monitoring

  • Cost optimization

  • Capacity planning

Common Use Cases

Credit Application Processing

  1. Retrieve credit bureau data

  2. Verify bank account ownership

  3. Analyze income patterns

  4. Calculate risk metrics

Business Lending

  1. Access business financial data

  2. Analyze cash flow patterns

  3. Verify business ownership

  4. Assess business creditworthiness

Auto Lending

  1. Verify applicant identity

  2. Check credit history

  3. Value vehicle collateral

  4. Confirm insurance coverage

Adding New Integrations

While timveroOS includes many pre-integrated providers, additional data sources can be connected:

  1. Identify Requirements

    • Data types needed

    • Integration method

    • Security requirements

  2. Development Process

    • API documentation review

    • Integration development

    • Testing and validation

  3. Deployment

    • Configuration setup

    • Production testing

    • Monitoring establishment

Note: New integrations can typically be added within hours through the integration interface.

Troubleshooting

Connection Issues

  • Verify credentials are current

  • Check network connectivity

  • Review API status

  • Examine error logs

Data Quality Issues

  • Validate field mappings

  • Check transformation rules

  • Review source data quality

  • Adjust validation rules

Performance Problems

  • Monitor API response times

  • Check rate limits

  • Review caching settings

  • Optimize query patterns

Next Steps

With data sources configured, explore the Feature Store and workflow designer to put the retrieved data to work in decision logic.


For additional data integration support, consult your implementation team or system documentation.
