
Data Mapper Pattern

Overview

The Data Mapper pattern is a fundamental transformation pattern in enterprise integration architectures that provides sophisticated mapping and conversion capabilities between different data formats, structures, and representations. Like a skilled translator who not only converts words but also adapts cultural context and meaning, the Data Mapper pattern transforms data while preserving semantic integrity and handling complex structural differences between source and target systems.

Theoretical Foundation

The Data Mapper pattern is grounded in data transformation theory and semantic mapping principles, specifically addressing the challenge of structural and semantic impedance mismatch between different data representations in distributed systems. The pattern embodies the principle of "lossless transformation" - converting data between formats while preserving all essential information and maintaining data integrity throughout the transformation process.

Core Principles

1. Bidirectional Data Transformation

The Data Mapper provides comprehensive transformation capabilities:

- Format conversion - transforming between JSON, XML, CSV, binary formats, and custom protocols
- Structural mapping - converting between different object hierarchies and data structures
- Schema adaptation - adapting data to match target system schema requirements
- Protocol mediation - bridging different communication protocols and data exchange patterns

2. Semantic Preservation

During transformation, the pattern maintains data meaning and integrity:

- Business rule enforcement - applying transformation rules that preserve business logic
- Data validation - ensuring transformed data meets target system validation requirements
- Semantic enrichment - adding contextual information during transformation
- Constraint preservation - maintaining data constraints and relationships across transformations
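For instance, the data-validation point can be folded directly into the mapping step. A minimal stdlib sketch (class, field, and rule names here are hypothetical) that rejects records failing target-system constraints before they are emitted:

```java
import java.util.Map;
import java.util.function.Predicate;

// Hypothetical sketch: validation rules enforced inside the mapping step,
// so records violating target-system constraints never reach the target.
public class ValidatingMapper {

    // Assumed target constraints: non-blank name, plausible email shape
    private static final Predicate<Map<String, Object>> HAS_NAME =
        m -> m.get("name") instanceof String s && !s.isBlank();
    private static final Predicate<Map<String, Object>> HAS_EMAIL =
        m -> m.get("email") instanceof String s
             && s.matches("[^@\\s]+@[^@\\s]+\\.[^@\\s]+");

    public static Map<String, Object> mapAndValidate(Map<String, Object> source) {
        if (!HAS_NAME.test(source) || !HAS_EMAIL.test(source)) {
            throw new IllegalArgumentException("Record fails target validation");
        }
        // Normalization during mapping (semantic enrichment could go here too)
        return Map.of(
            "fullName", source.get("name"),
            "email", ((String) source.get("email")).toLowerCase().trim());
    }
}
```

Validating at the mapping boundary keeps bad records out of the target system instead of surfacing them later as downstream failures.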

3. Complex Mapping Logic

The pattern supports sophisticated mapping scenarios:

- One-to-many mapping - splitting single source fields into multiple target fields
- Many-to-one mapping - combining multiple source fields into single target fields
- Conditional mapping - applying different transformation logic based on data conditions
- Computed fields - generating target fields through calculations or business logic
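These scenarios can be illustrated in a few lines of plain Java (the field names and tier thresholds below are hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of one-to-many, many-to-one, and computed-field mappings.
public class ComplexMappings {

    // One-to-many: split a single "fullName" field into firstName and lastName
    public static Map<String, String> splitName(String fullName) {
        String[] parts = fullName.trim().split("\\s+", 2);
        Map<String, String> target = new HashMap<>();
        target.put("firstName", parts[0]);
        target.put("lastName", parts.length > 1 ? parts[1] : "");
        return target;
    }

    // Many-to-one: combine street, postal code, and city into one address line
    public static String combineAddress(Map<String, String> source) {
        return String.join(", ",
            source.get("street"),
            source.get("postalCode") + " " + source.get("city"));
    }

    // Computed field with conditional logic: derive a tier from an order count
    public static String customerTier(int orderCount) {
        return orderCount > 10 ? "REGULAR" : "NEW";
    }
}
```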

4. Performance and Scalability

The Data Mapper optimizes transformation performance:

- Streaming transformation - processing large datasets without loading entirely into memory
- Parallel mapping - concurrent transformation of independent data elements
- Caching strategies - caching transformation rules and lookup data for better performance
- Lazy evaluation - performing transformations only when data is actually needed
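As a concrete illustration of lazy evaluation, Java Streams pull records through the mapping stage only when a terminal operation demands them. The sketch below (hypothetical names) counts how many records are actually transformed when only two are requested:

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Stream;

// Hypothetical sketch: the map stage is lazy, so limit(2) short-circuits
// and elements beyond the first two are never transformed at all.
public class LazyMapping {

    public static final AtomicInteger transformCount = new AtomicInteger();

    static String transform(String record) {
        transformCount.incrementAndGet(); // count records that really got mapped
        return record.toUpperCase();
    }

    public static List<String> firstTwoTransformed(Stream<String> source) {
        return source.map(LazyMapping::transform).limit(2).toList();
    }
}
```

The same pipeline can be parallelized with `parallelStream()` when the per-record transformations are independent.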

Why Data Mappers are Essential in Integration Architecture

1. System Heterogeneity Management

In diverse technology environments, data mappers address:

- Legacy system integration - bridging modern and legacy data formats
- Multi-vendor environments - adapting between different vendor data standards
- Technology evolution - supporting migration between different technology stacks
- API versioning - maintaining compatibility across different API versions

2. Enterprise Application Integration

For connecting enterprise systems:

- ERP integration - mapping between different ERP system data models
- CRM synchronization - transforming customer data between CRM and other systems
- Financial system integration - converting financial data between accounting standards
- Supply chain coordination - standardizing product and order data across partners

3. Cloud and Hybrid Architecture

In cloud integration scenarios:

- Cloud migration - transforming on-premises data for cloud systems
- Multi-cloud integration - standardizing data across different cloud providers
- SaaS integration - adapting enterprise data for SaaS application consumption
- Hybrid data flows - maintaining data consistency across hybrid environments

4. Data Governance and Compliance

For regulatory and governance requirements:

- Data standardization - ensuring consistent data formats across the organization
- Privacy protection - transforming data to comply with privacy regulations
- Audit trail maintenance - preserving transformation history for compliance
- Data quality improvement - cleansing and enriching data during transformation
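As one concrete example of privacy protection during mapping, PII fields can be masked in flight so downstream systems never see raw values. A small stdlib sketch with hypothetical masking rules:

```java
// Hypothetical sketch: masking PII fields as part of the mapping step.
public class PiiMaskingMapper {

    // Keep the first character of the local part, mask the rest, keep the domain
    public static String maskEmail(String email) {
        int at = email.indexOf('@');
        if (at <= 1) return email; // nothing maskable before the domain
        return email.charAt(0) + "*".repeat(at - 1) + email.substring(at);
    }

    // Keep only the last four characters of an identifier
    public static String maskId(String id) {
        if (id.length() <= 4) return id;
        return "*".repeat(id.length() - 4) + id.substring(id.length() - 4);
    }
}
```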

Benefits in Integration Contexts

1. System Decoupling

2. Data Quality Enhancement

3. Business Agility

4. Operational Efficiency

Integration Architecture Applications

1. Enterprise Service Bus (ESB)

Data Mappers in ESB environments provide:

- Message transformation - converting messages between different service formats
- Protocol adaptation - transforming data for different communication protocols
- Content enrichment - adding metadata and context during transformation
- Format standardization - ensuring consistent message formats across the bus

2. API Management and Gateway

In API gateway implementations:

- Request transformation - adapting incoming requests to backend service formats
- Response transformation - converting backend responses to API specification formats
- Version mediation - supporting multiple API versions through transformation
- Client adaptation - customizing responses for different client types

3. Data Integration and ETL

For data pipeline implementations:

- Source system extraction - transforming data from various source formats
- Data warehouse loading - converting operational data to analytical formats
- Real-time streaming - transforming streaming data for real-time analytics
- Master data management - standardizing data across multiple master data domains

4. Event-Driven Architecture

In event streaming systems:

- Event transformation - converting events between different schemas
- Event enrichment - adding contextual data to events during transformation
- Event filtering - selecting and transforming relevant event data
- Stream processing - transforming streaming data for downstream consumers

Implementation Patterns

1. Object-to-Object Mapper

// Direct object mapping with field transformation
@Component
public class CustomerDataMapper {

    public TargetCustomer mapToTarget(SourceCustomer source) {
        return TargetCustomer.builder()
            .customerId(source.getId())
            .fullName(combineNames(source.getFirstName(), source.getLastName()))
            .emailAddress(normalizeEmail(source.getEmail()))
            .phoneNumber(formatPhoneNumber(source.getPhone()))
            .address(mapAddress(source.getAddress()))
            .createdDate(source.getCreationTimestamp().toLocalDate())
            .isActive(source.getStatus() == CustomerStatus.ACTIVE)
            .customerType(mapCustomerType(source))
            .build();
    }

    public SourceCustomer mapFromTarget(TargetCustomer target) {
        String[] nameParts = splitFullName(target.getFullName());

        return SourceCustomer.builder()
            .id(target.getCustomerId())
            .firstName(nameParts[0])
            .lastName(nameParts[1])
            .email(target.getEmailAddress())
            .phone(target.getPhoneNumber())
            .address(mapAddressReverse(target.getAddress()))
            .creationTimestamp(target.getCreatedDate().atStartOfDay())
            .status(target.isActive() ? CustomerStatus.ACTIVE : CustomerStatus.INACTIVE)
            .build();
    }

    private String combineNames(String firstName, String lastName) {
        return (firstName + " " + lastName).trim();
    }

    private String[] splitFullName(String fullName) {
        // Inverse of combineNames: split on the first whitespace run;
        // a single-token name maps to an empty last name
        if (fullName == null || fullName.isBlank()) {
            return new String[] {"", ""};
        }
        String[] parts = fullName.trim().split("\\s+", 2);
        return parts.length == 2 ? parts : new String[] {parts[0], ""};
    }

    private String normalizeEmail(String email) {
        return email != null ? email.toLowerCase().trim() : null;
    }

    private String formatPhoneNumber(String phone) {
        if (phone == null) return null;
        // Strip non-digits and normalize to international form
        String digits = phone.replaceAll("\\D", "");
        if (digits.startsWith("0")) {
            digits = "358" + digits.substring(1); // assume a Finnish national number
        }
        return "+" + digits;
    }

    private CustomerType mapCustomerType(SourceCustomer source) {
        if (source.isPremium()) {
            return CustomerType.PREMIUM;
        } else if (source.getOrderCount() > 10) {
            return CustomerType.REGULAR;
        }
        return CustomerType.NEW;
    }
}

2. Configuration-Driven Mapper

// Flexible mapping based on external configuration
@Component
public class ConfigurableDataMapper {

    @Autowired
    private MappingConfiguration mappingConfig;

    @Autowired
    private ExpressionEvaluator expressionEvaluator;

    public Map<String, Object> transform(Map<String, Object> sourceData, String mappingName) {
        Map<String, Object> targetData = new HashMap<>();
        MappingRules rules = mappingConfig.getMappingRules(mappingName);

        for (FieldMapping fieldMapping : rules.getFieldMappings()) {
            Object value = extractValue(sourceData, fieldMapping);
            Object transformedValue = applyTransformations(value, fieldMapping.getTransformations());

            if (transformedValue != null || fieldMapping.isIncludeNulls()) {
                setNestedValue(targetData, fieldMapping.getTargetField(), transformedValue);
            }
        }

        return targetData;
    }

    private Object extractValue(Map<String, Object> sourceData, FieldMapping fieldMapping) {
        if (fieldMapping.isExpression()) {
            return expressionEvaluator.evaluate(fieldMapping.getSourceExpression(), sourceData);
        } else {
            return getNestedValue(sourceData, fieldMapping.getSourceField());
        }
    }

    private Object applyTransformations(Object value, List<Transformation> transformations) {
        Object result = value;

        for (Transformation transformation : transformations) {
            result = transformation.apply(result);
        }

        return result;
    }

    private Object getNestedValue(Map<String, Object> data, String path) {
        String[] pathParts = path.split("\\.");
        Object current = data;

        for (String part : pathParts) {
            if (current instanceof Map) {
                current = ((Map<?, ?>) current).get(part);
            } else {
                return null;
            }
        }

        return current;
    }

    @SuppressWarnings("unchecked")
    private void setNestedValue(Map<String, Object> data, String path, Object value) {
        String[] pathParts = path.split("\\.");
        Map<String, Object> current = data;

        // Create intermediate maps as needed, then set the leaf value
        for (int i = 0; i < pathParts.length - 1; i++) {
            current = (Map<String, Object>) current
                .computeIfAbsent(pathParts[i], k -> new HashMap<String, Object>());
        }
        current.put(pathParts[pathParts.length - 1], value);
    }
}

// Configuration classes
@Configuration
@ConfigurationProperties(prefix = "mapping")
public class MappingConfiguration {
    private Map<String, MappingRules> rules = new HashMap<>();

    public MappingRules getMappingRules(String mappingName) {
        return rules.get(mappingName);
    }

    // getters and setters
}

public class MappingRules {
    private List<FieldMapping> fieldMappings;
    private List<ValidationRule> validationRules;

    // getters and setters
}

public class FieldMapping {
    private String sourceField;
    private String targetField;
    private String sourceExpression;
    private List<Transformation> transformations;
    private boolean isExpression;
    private boolean includeNulls;

    // getters and setters
}
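The `Transformation` objects referenced by `FieldMapping` can be modeled as plain composable functions. A minimal sketch, assuming `Transformation` is essentially a `Function<Object, Object>` and a field's chain is applied by folding over the list:

```java
import java.util.List;
import java.util.function.Function;

// Hypothetical sketch: each Transformation is a Function<Object, Object>,
// and a field's transformation chain is applied left to right.
public class TransformationChain {

    public static Object apply(Object value, List<Function<Object, Object>> transformations) {
        Object result = value;
        for (Function<Object, Object> transformation : transformations) {
            result = transformation.apply(result);
        }
        return result;
    }

    // Example reusable transformations that a configuration could refer to by name
    public static final Function<Object, Object> TRIM = v -> ((String) v).trim();
    public static final Function<Object, Object> LOWERCASE = v -> ((String) v).toLowerCase();
}
```

Chains built this way can be declared in external configuration and resolved by name at runtime, which is what makes the mapper configuration-driven rather than hard-coded.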

3. Stream Processing Mapper

// High-performance streaming data transformation
@Component
public class StreamingDataMapper {

    @Autowired
    private TransformationEngine transformationEngine;

    public Flux<TransformedRecord> transformStream(Flux<SourceRecord> sourceStream, 
                                                   String transformationSpec) {
        return sourceStream
            .buffer(1000) // Process in batches for efficiency
            .flatMap(batch -> transformBatch(batch, transformationSpec))
            .onErrorContinue((error, record) -> {
                log.error("Transformation error for record: {}", record, error);
                // Send to error queue or dead letter topic
            });
    }

    private Flux<TransformedRecord> transformBatch(List<SourceRecord> batch, 
                                                   String transformationSpec) {
        return Flux.fromIterable(batch)
            .parallel()
            .runOn(Schedulers.parallel())
            .map(record -> transformSingleRecord(record, transformationSpec))
            .filter(Objects::nonNull)
            .sequential();
    }

    private TransformedRecord transformSingleRecord(SourceRecord source, String transformationSpec) {
        try {
            return transformationEngine.transform(source, transformationSpec);
        } catch (Exception e) {
            log.warn("Failed to transform record: {}", source.getId(), e);
            return null; // Will be filtered out
        }
    }
}

@Service
public class TransformationEngine {

    private final Map<String, Transformer> transformers = new HashMap<>();

    @PostConstruct
    public void initializeTransformers() {
        transformers.put("customer", new CustomerTransformer());
        transformers.put("order", new OrderTransformer());
        transformers.put("product", new ProductTransformer());
    }

    public TransformedRecord transform(SourceRecord source, String transformationSpec) {
        Transformer transformer = transformers.get(transformationSpec);
        if (transformer == null) {
            throw new IllegalArgumentException("Unknown transformation spec: " + transformationSpec);
        }

        return transformer.transform(source);
    }
}
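The `Transformer` type the engine depends on is not shown; a minimal registry sketch of what it might look like (all names hypothetical, using plain Maps in place of the `SourceRecord`/`TransformedRecord` types):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the Transformer abstraction assumed by the
// TransformationEngine: a registry of named transformers keyed by spec name.
public class TransformerRegistry {

    public interface Transformer {
        Map<String, Object> transform(Map<String, Object> source);
    }

    private final Map<String, Transformer> transformers = new HashMap<>();

    public void register(String spec, Transformer transformer) {
        transformers.put(spec, transformer);
    }

    public Map<String, Object> transform(String spec, Map<String, Object> source) {
        Transformer transformer = transformers.get(spec);
        if (transformer == null) {
            throw new IllegalArgumentException("Unknown transformation spec: " + spec);
        }
        return transformer.transform(source);
    }
}
```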

4. Multi-Format Converter

// Convert between different data formats (JSON, XML, CSV, etc.)
@Component
public class MultiFormatDataMapper {

    @Autowired
    private ObjectMapper jsonMapper;

    @Autowired
    private XmlMapper xmlMapper;

    public String convertFormat(String inputData, DataFormat sourceFormat, 
                               DataFormat targetFormat, String mappingSchema) throws Exception {

        // Parse input data to intermediate representation
        Object intermediateData = parseInput(inputData, sourceFormat);

        // Apply transformation if mapping schema is provided
        if (mappingSchema != null) {
            intermediateData = applyMapping(intermediateData, mappingSchema);
        }

        // Convert to target format
        return formatOutput(intermediateData, targetFormat);
    }

    private Object parseInput(String inputData, DataFormat sourceFormat) throws Exception {
        return switch (sourceFormat) {
            case JSON -> jsonMapper.readValue(inputData, Object.class);
            case XML -> xmlMapper.readValue(inputData, Object.class);
            case CSV -> parseCSV(inputData);
            case YAML -> parseYAML(inputData);
            default -> throw new UnsupportedOperationException("Unsupported source format: " + sourceFormat);
        };
    }

    private String formatOutput(Object data, DataFormat targetFormat) throws Exception {
        return switch (targetFormat) {
            case JSON -> jsonMapper.writeValueAsString(data);
            case XML -> xmlMapper.writeValueAsString(data);
            case CSV -> formatAsCSV(data);
            case YAML -> formatAsYAML(data);
            default -> throw new UnsupportedOperationException("Unsupported target format: " + targetFormat);
        };
    }

    private Object applyMapping(Object data, String mappingSchema) {
        // Apply JSONPath or XPath transformations
        // This could use a rule engine or transformation library
        return data; // Simplified implementation
    }

    private List<Map<String, Object>> parseCSV(String csvData) {
        List<Map<String, Object>> records = new ArrayList<>();
        String[] lines = csvData.split("\\r?\\n"); // tolerate Windows line endings

        if (lines.length < 2) return records;

        String[] headers = lines[0].split(",");

        for (int i = 1; i < lines.length; i++) {
            String[] values = lines[i].split(",");
            Map<String, Object> record = new HashMap<>();

            for (int j = 0; j < headers.length && j < values.length; j++) {
                record.put(headers[j].trim(), values[j].trim());
            }

            records.add(record);
        }

        return records;
    }
}

public enum DataFormat {
    JSON, XML, CSV, YAML, BINARY
}

Apache Camel Implementation

1. Simple Data Transformation Route

@Component
public class DataMappingRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("direct:transformData")
            .routeId("data-transformation")
            .log("Starting data transformation: ${header.transformationType}")
            .choice()
                .when(header("transformationType").isEqualTo("CUSTOMER"))
                    .to("direct:transformCustomer")
                .when(header("transformationType").isEqualTo("ORDER"))
                    .to("direct:transformOrder")
                .when(header("transformationType").isEqualTo("PRODUCT"))
                    .to("direct:transformProduct")
                .otherwise()
                    .to("direct:genericTransform")
            .end();

        from("direct:transformCustomer")
            .log("Transforming customer data")
            .process(exchange -> {
                SourceCustomer source = exchange.getIn().getBody(SourceCustomer.class);
                TargetCustomer target = mapCustomer(source);
                exchange.getIn().setBody(target);
            })
            .marshal().json()
            .to("direct:sendTransformed");
    }

    private TargetCustomer mapCustomer(SourceCustomer source) {
        return TargetCustomer.builder()
            .id(source.getCustomerId())
            .name(source.getFirstName() + " " + source.getLastName())
            .email(source.getEmailAddress())
            .active(source.isEnabled())
            .build();
    }
}

2. JSON to XML Transformation

@Component
public class JSONToXMLMappingRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("direct:jsonToXml")
            .routeId("json-to-xml-mapper")
            .log("Converting JSON to XML")
            .unmarshal().json()
            .process(exchange -> {
                Map<String, Object> jsonData = exchange.getIn().getBody(Map.class);

                // Apply business transformations
                Map<String, Object> transformedData = applyBusinessRules(jsonData);

                // Convert to XML-friendly structure
                XMLData xmlData = createXMLStructure(transformedData);
                exchange.getIn().setBody(xmlData);
            })
            .marshal().jaxb()
            .log("XML transformation completed")
            .to("direct:sendXMLData");
    }

    private Map<String, Object> applyBusinessRules(Map<String, Object> data) {
        Map<String, Object> transformed = new HashMap<>(data);

        // Apply date formatting
        if (data.containsKey("timestamp")) {
            String isoDate = formatDateForXML((String) data.get("timestamp"));
            transformed.put("formattedDate", isoDate);
        }

        // Normalize phone numbers
        if (data.containsKey("phone")) {
            String normalizedPhone = normalizePhoneNumber((String) data.get("phone"));
            transformed.put("phone", normalizedPhone);
        }

        return transformed;
    }
}
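The `formatDateForXML` and `normalizePhoneNumber` helpers are left undefined in the route above; one possible java.time-based sketch, assuming the inbound timestamp arrives as `yyyy-MM-dd HH:mm:ss` text and the XML side expects the ISO-8601 `xsd:dateTime` shape:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Hypothetical helpers for the JSON-to-XML route; the source timestamp
// pattern is an assumption, not something the route itself specifies.
public class XmlMappingHelpers {

    private static final DateTimeFormatter SOURCE_FORMAT =
        DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
    private static final DateTimeFormatter XML_FORMAT =
        DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss");

    public static String formatDateForXML(String timestamp) {
        return LocalDateTime.parse(timestamp, SOURCE_FORMAT).format(XML_FORMAT);
    }

    public static String normalizePhoneNumber(String phone) {
        // Strip everything but digits, then present in international form
        String digits = phone.replaceAll("[^0-9+]", "");
        return digits.startsWith("+") ? digits : "+" + digits;
    }
}
```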

3. Database Result Set Mapping

@Component
public class DatabaseMappingRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("direct:mapDatabaseResults")
            .routeId("database-result-mapper")
            .log("Mapping database results")
            .process(exchange -> {
                List<Map<String, Object>> dbResults = exchange.getIn().getBody(List.class);
                List<APIResponseDto> apiResponses = new ArrayList<>();

                for (Map<String, Object> row : dbResults) {
                    APIResponseDto response = mapDatabaseRow(row);
                    apiResponses.add(response);
                }

                exchange.getIn().setBody(apiResponses);
            })
            .marshal().json()
            .to("direct:sendAPIResponse");
    }

    private APIResponseDto mapDatabaseRow(Map<String, Object> row) {
        return APIResponseDto.builder()
            .id((String) row.get("ID"))
            .name((String) row.get("FULL_NAME"))
            .email((String) row.get("EMAIL_ADDRESS"))
            .createdAt(formatTimestamp((Timestamp) row.get("CREATED_DATE")))
            .status(mapStatus((String) row.get("STATUS_CODE")))
            .build();
    }

    private String mapStatus(String statusCode) {
        return switch (statusCode) {
            case "A" -> "ACTIVE";
            case "I" -> "INACTIVE";
            case "P" -> "PENDING";
            default -> "UNKNOWN";
        };
    }
}

4. Dynamic Field Mapping Route

@Component
public class DynamicMappingRoute extends RouteBuilder {

    @Autowired
    private FieldMappingService fieldMappingService;

    @Override
    public void configure() throws Exception {
        from("direct:dynamicMapping")
            .routeId("dynamic-field-mapper")
            .log("Applying dynamic field mapping")
            .process(exchange -> {
                Map<String, Object> sourceData = exchange.getIn().getBody(Map.class);
                String mappingProfile = exchange.getIn().getHeader("mappingProfile", String.class);

                Map<String, Object> targetData = applyDynamicMapping(sourceData, mappingProfile);
                exchange.getIn().setBody(targetData);
            })
            .choice()
                .when(header("outputFormat").isEqualTo("JSON"))
                    .marshal().json()
                .when(header("outputFormat").isEqualTo("XML"))
                    .marshal().jaxb()
                .otherwise()
                    .log("Using default output format")
            .end()
            .to("direct:sendMappedData");
    }

    private Map<String, Object> applyDynamicMapping(Map<String, Object> sourceData, 
                                                    String mappingProfile) {

        Map<String, Object> targetData = new HashMap<>();
        MappingConfiguration config = fieldMappingService.getMappingConfiguration(mappingProfile);

        for (FieldMapping mapping : config.getFieldMappings()) {
            Object sourceValue = extractSourceValue(sourceData, mapping.getSourcePath());
            Object transformedValue = applyFieldTransformations(sourceValue, mapping.getTransformations());

            if (transformedValue != null) {
                setTargetValue(targetData, mapping.getTargetPath(), transformedValue);
            }
        }

        return targetData;
    }

    private Object extractSourceValue(Map<String, Object> data, String path) {
        String[] pathSegments = path.split("\\.");
        Object current = data;

        for (String segment : pathSegments) {
            if (current instanceof Map) {
                current = ((Map<?, ?>) current).get(segment);
            } else {
                return null;
            }
        }

        return current;
    }
}

5. Batch File Processing with Mapping

@Component
public class BatchFileMappingRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("file:input?include=*.csv")
            .routeId("batch-file-mapper")
            .log("Processing batch file: ${header.CamelFileName}")
            .process(exchange -> {
                String fileName = exchange.getIn().getHeader("CamelFileName", String.class);
                String mappingType = determineMappingType(fileName);
                exchange.getIn().setHeader("mappingType", mappingType);
            })
            .split(body().tokenize("\n"))
                .skipFirst(1) // Skip header
                .streaming()
                .process(exchange -> {
                    String csvLine = exchange.getIn().getBody(String.class);
                    String mappingType = exchange.getIn().getHeader("mappingType", String.class);

                    Map<String, Object> record = parseCSVLine(csvLine);
                    Map<String, Object> mappedRecord = applyRecordMapping(record, mappingType);

                    exchange.getIn().setBody(mappedRecord);
                })
                .marshal().json()
                .to("direct:sendBatchRecord")
            .end()
            .log("Batch file processing completed: ${header.CamelFileName}");
    }

    private String determineMappingType(String fileName) {
        if (fileName.startsWith("customer_")) {
            return "CUSTOMER_IMPORT";
        } else if (fileName.startsWith("order_")) {
            return "ORDER_IMPORT";
        }
        return "GENERIC_IMPORT";
    }

    private Map<String, Object> parseCSVLine(String csvLine) {
        String[] fields = csvLine.split(",");
        Map<String, Object> record = new HashMap<>();

        // Basic CSV parsing - in real implementation, use proper CSV library
        for (int i = 0; i < fields.length; i++) {
            record.put("field_" + i, fields[i].trim());
        }

        return record;
    }

    private Map<String, Object> applyRecordMapping(Map<String, Object> record, String mappingType) {
        // Apply mapping based on type
        return switch (mappingType) {
            case "CUSTOMER_IMPORT" -> mapCustomerRecord(record);
            case "ORDER_IMPORT" -> mapOrderRecord(record);
            default -> record;
        };
    }
}
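As the comment in `parseCSVLine` notes, a naive `split(",")` breaks on quoted fields. A minimal quote-aware splitter sketch (a production route should still prefer a real CSV library such as Commons CSV, or Camel's CSV data format):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal quote-aware CSV field splitter: handles commas inside quoted
// fields and doubled quotes as escapes, per common CSV conventions.
public class CsvLineParser {

    public static List<String> splitLine(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        boolean inQuotes = false;

        for (int i = 0; i < line.length(); i++) {
            char c = line.charAt(i);
            if (c == '"') {
                if (inQuotes && i + 1 < line.length() && line.charAt(i + 1) == '"') {
                    current.append('"'); // escaped quote inside a quoted field
                    i++;
                } else {
                    inQuotes = !inQuotes; // opening or closing quote
                }
            } else if (c == ',' && !inQuotes) {
                fields.add(current.toString());
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        fields.add(current.toString());
        return fields;
    }
}
```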

Best Practices

1. Mapping Design and Maintainability

2. Performance Optimization

3. Error Handling and Data Quality

4. Flexibility and Extensibility

5. Security and Compliance

The Data Mapper pattern is essential for building robust, flexible integration solutions that can handle the complexity of data transformation in heterogeneous enterprise environments while maintaining data integrity and supporting business requirements.
