# Extended Configuration

## Table of Contents
- Connection Configuration
- Performance Settings
- Debug Options
- Column Handling
- Data Transformation
- Retry Configuration
- Rate Limit Settings
- Example Configuration
## Connection Configuration

Settings that control connection pooling and management.

```python
connection_config = {
    "advanced.connection.max_connections_per_sp": 50,  # Maximum connections per service principal (1-100)
    "advanced.connection.blocking": True,               # Whether connection requests block when the pool is full
    "advanced.connection.use_multi_pool": False,        # Use separate pools for each service principal
}
```
Setting | Type | Default | Description |
---|---|---|---|
max_connections_per_sp | int | 50 | Maximum connections per service principal; the Dynamics 365 limit is 52 |
blocking | bool | True | If True, waits for a connection when the pool is full; if False, raises an error |
use_multi_pool | bool | False | When True, creates separate connection pools for each service principal |
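As a rough illustration of how these settings fit together, the sketch below raises concurrency across several service principals. It assumes the `dynamics_client` object and `update_config` call shown in the example at the end of this page, and the specific numbers are placeholders rather than recommendations.

```python
# Illustrative only: values are placeholders, and `dynamics_client` is assumed
# to be an already-constructed client as in the final example on this page.
connection_config = {
    "advanced.connection.use_multi_pool": True,         # one pool per service principal
    "advanced.connection.max_connections_per_sp": 40,   # stay under the Dynamics 365 limit of 52
    "advanced.connection.blocking": True,                # queue requests instead of raising when the pool is full
}

dynamics_client.update_config(connection_config)
```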
## Performance Settings

Settings that affect processing performance and resource utilization.

```python
performance_config = {
    "advanced.performance.num_workers": 32,                      # Thread pool size
    "advanced.performance.batching_enabled": True,               # Enable batch processing
    "advanced.performance.batch_size": 10_000,                   # Records per batch
    "advanced.performance.deferred_retry_enabled": False,        # Enable deferred retry mechanism
    "advanced.performance.deferred_batch_size": 1000,            # Records per deferred batch
    "advanced.performance.max_deferred_retries": 3,              # Maximum retry attempts
    "advanced.performance.deferred_retry_delay": 300,            # Initial retry delay (seconds)
    "advanced.performance.pushdown_filter_enabled": True,        # Use key-based filtering
    "advanced.performance.max_pushdown_filter": 100_000,         # Maximum records for pushdown
    "advanced.performance.bulk_insert_with_fallback": False,     # Try bulk insert before fallback
    "advanced.performance.query_builder_caching_enabled": True,  # Cache query builders
    "advanced.performance.id_based_strategy_enabled": False,     # Use ID-based strategy
}
```
Setting | Type | Default | Description |
---|---|---|---|
num_workers | int | 32 | Number of concurrent worker threads |
batching_enabled | bool | True | Enable processing in batches |
batch_size | int | 10,000 | Number of records per batch |
deferred_retry_enabled | bool | False | Enable retry mechanism for failed operations |
deferred_batch_size | int | 1,000 | Number of records per retry batch |
max_deferred_retries | int | 3 | Maximum number of retry attempts |
deferred_retry_delay | int | 300 | Delay between retry attempts (seconds) |
pushdown_filter_enabled | bool | True | Enable database-side filtering |
max_pushdown_filter | int | 100,000 | Maximum records for pushdown filtering |
bulk_insert_with_fallback | bool | False | Attempt bulk insert first, falling back to single-record inserts if needed |
query_builder_caching_enabled | bool | True | Cache query templates |
id_based_strategy_enabled | bool | False | Use ID-based upsert strategy |
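As a tuning sketch only (the numbers below are illustrative assumptions, not recommendations), the settings combine larger batches with the deferred retry mechanism, applied through the same `update_config` call used in the final example on this page.

```python
# Illustrative tuning: all numbers are assumptions, not recommended values.
performance_config = {
    "advanced.performance.num_workers": 64,                # more concurrent workers
    "advanced.performance.batch_size": 20_000,             # larger batches
    "advanced.performance.deferred_retry_enabled": True,   # re-queue failed records for later attempts
    "advanced.performance.deferred_batch_size": 2_000,     # smaller batches on the retry pass
    "advanced.performance.max_deferred_retries": 5,        # allow a few extra attempts
}

dynamics_client.update_config(performance_config)
```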
## Debug Options

Settings for troubleshooting and debugging.

```python
debug_config = {
    "advanced.debug.cache_queries": False,        # Cache executed queries
    "advanced.debug.disable_helper_text": False,  # Disable helper text output
    "advanced.debug.raise_on_failure": False,     # Raise exceptions on failures
    "advanced.debug.log_queries": False,          # Log executed queries
    "advanced.debug.log_batch_mapping": False,    # Log batch mapping details
    "advanced.debug.log_skips_missed": False,     # Log skipped records
    "advanced.debug.sampling_enabled": False,     # Enable data sampling
    "advanced.debug.sample_condition": None,      # SQL condition for sampling
    "advanced.debug.sample_size": None,           # Number of records to sample
    "advanced.debug.sample_seed": None,           # Random seed for sampling
    "advanced.debug.profiling_enabled": False,    # Enable performance profiling
}
```
Setting | Type | Default | Description |
---|---|---|---|
cache_queries | bool | False | Store executed queries in memory |
disable_helper_text | bool | False | Disable informational messages |
raise_on_failure | bool | False | Raise exceptions instead of logging |
log_queries | bool | False | Log SQL queries to table |
log_batch_mapping | bool | False | Log batch processing details |
log_skips_missed | bool | False | Log records that should have been skipped |
sampling_enabled | bool | False | Enable data sampling |
sample_condition | str | None | WHERE clause for sampling |
sample_size | int | None | Number of records to sample |
sample_seed | int | None | Seed for reproducible sampling |
profiling_enabled | bool | False | Enable performance profiling |
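When investigating a specific slice of data, the sampling and logging options can be combined as in the sketch below. The sample condition and sizes are made-up values, and the condition syntax is assumed to follow the SQL-style WHERE clause described above.

```python
# Hypothetical debugging session: sample a small, reproducible subset and fail fast.
debug_config = {
    "advanced.debug.sampling_enabled": True,
    "advanced.debug.sample_condition": "statecode = 0",  # assumed WHERE-style condition (placeholder)
    "advanced.debug.sample_size": 500,
    "advanced.debug.sample_seed": 42,                     # fixed seed for reproducible samples
    "advanced.debug.log_queries": True,                   # keep executed queries for inspection
    "advanced.debug.raise_on_failure": True,              # surface errors immediately while investigating
}

dynamics_client.update_config(debug_config)
```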
## Column Handling

Settings for column management and processing.

```python
column_config = {
    "advanced.columns.target_pk_column": "id",                                    # Primary key column
    "advanced.columns.target_prefix": "target__",                                 # Prefix for target columns
    "advanced.columns.target_read_columns": ["id"],                               # Required target columns
    "advanced.columns.target_conditional_columns": ["statecode", "statuscode"],   # State columns
    "advanced.columns.exclude_columns": [],                                       # Columns to exclude
    "advanced.columns.include_only_columns": [],                                  # Columns to include
    "advanced.columns.legacy_compare_values": False,                              # Use legacy comparison
    "advanced.columns.compare_case_insensitive": False,                           # Case-insensitive comparison
    "advanced.columns.use_extended_state_codes": False,                           # Use extended state codes
}
```
Setting | Type | Default | Description |
---|---|---|---|
target_pk_column | str | "id" | Primary key column name |
target_prefix | str | "target__" | Prefix for target comparison columns |
target_read_columns | List[str] | ["id"] | Required columns in target queries |
target_conditional_columns | List[str] | ["statecode", "statuscode"] | State management columns |
exclude_columns | List[str] | [] | Columns to exclude from processing |
include_only_columns | List[str] | [] | Only process these columns |
legacy_compare_values | bool | False | Use legacy value comparison |
compare_case_insensitive | bool | False | Ignore case in comparisons |
use_extended_state_codes | bool | False | Use extended state code mapping |
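The sketch below shows one way to narrow processing to a handful of columns; the column names are placeholders, not fields guaranteed to exist on your entity.

```python
# Placeholder column names, for illustration only.
column_config = {
    "advanced.columns.include_only_columns": ["name", "emailaddress1", "statecode", "statuscode"],
    "advanced.columns.compare_case_insensitive": True,   # ignore case when comparing source and target values
}

dynamics_client.update_config(column_config)
```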
## Data Transformation

Settings for data transformation and sanitization.

```python
data_transform_config = {
    "advanced.data_transformation.sanitization_rules": [],         # Data cleaning rules
    "advanced.data_transformation.normalize_owner_columns": True,  # Normalize ownership columns
    "advanced.data_transformation.case_insensitive_columns": [],   # Case-insensitive columns
}
```
Setting | Type | Default | Description |
---|---|---|---|
sanitization_rules | List[Rule] | [] | Data cleaning rules |
normalize_owner_columns | bool | True | Normalize ownership columns |
case_insensitive_columns | List[str] | [] | Case-insensitive columns |
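A minimal sketch of these settings is shown below. It leaves `sanitization_rules` empty because the rule format is not documented on this page, and the column name is a placeholder.

```python
# Minimal sketch: "emailaddress1" is a placeholder column name.
data_transform_config = {
    "advanced.data_transformation.normalize_owner_columns": True,
    "advanced.data_transformation.case_insensitive_columns": ["emailaddress1"],
}

dynamics_client.update_config(data_transform_config)
```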
## Retry Configuration

Settings for operation retry behavior.

```python
retry_config = {
    "advanced.retry.enabled": False,   # Enable retry mechanism
    "advanced.retry.max_attempts": 3,  # Maximum retry attempts
    "advanced.retry.base_delay": 2,    # Base delay (seconds)
    "advanced.retry.max_delay": 60,    # Maximum delay (seconds)
    "advanced.retry.jitter": 1.5,      # Jitter factor
}
```
Setting | Type | Default | Description |
---|---|---|---|
enabled | bool | False | Enable automatic retries |
max_attempts | int | 3 | Maximum retry attempts |
base_delay | int | 2 | Base delay between retries (seconds) |
max_delay | int | 60 | Maximum retry delay (seconds) |
jitter | float | 1.5 | Random jitter factor |
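To give a feel for how `base_delay`, `max_delay`, and `jitter` typically interact, the snippet below prints a common exponential-backoff-with-jitter schedule using the defaults above. This is an illustration of the usual pattern, not necessarily the library's exact formula.

```python
import random

# Common exponential backoff with jitter, shown for intuition only;
# this is not necessarily the exact formula the library uses.
base_delay, max_delay, jitter, max_attempts = 2, 60, 1.5, 3

for attempt in range(max_attempts):
    delay = min(base_delay * (2 ** attempt), max_delay)  # exponential growth, capped at max_delay
    delay *= random.uniform(1.0, jitter)                 # random jitter spreads retries apart
    print(f"attempt {attempt + 1}: wait roughly {delay:.1f}s before retrying")
```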
## Rate Limit Settings

Settings for rate limit simulation and handling.

```python
rate_limit_config = {
    "advanced.rate_limit_simulation.enabled": False,                  # Enable simulation
    "advanced.rate_limit_simulation.error_frequency": 0.1,            # Error frequency
    "advanced.rate_limit_simulation.retry_after_range": (3.0, 10.0),  # Retry-after range
    "advanced.rate_limit_simulation.sequential_errors": 0,            # Sequential errors
    "advanced.rate_limit_simulation.error_types_sequence": None,      # Error sequence
}
```
Setting | Type | Default | Description |
---|---|---|---|
enabled | bool | False | Enable rate limit simulation |
error_frequency | float | 0.1 | Frequency of simulated errors |
retry_after_range | tuple | (3.0, 10.0) | Range for retry delays |
sequential_errors | int | 0 | Number of sequential errors |
error_types_sequence | List[str] | None | Specific error sequence |
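Since the simulation is presumably there to exercise throttling behavior in tests, it pairs naturally with the retry settings above. The sketch below is one such test setup; the values are illustrative assumptions.

```python
# Illustrative test setup: simulate occasional rate-limit errors and let the
# retry mechanism handle them. Values are examples, not recommendations.
test_config = {
    "advanced.rate_limit_simulation.enabled": True,
    "advanced.rate_limit_simulation.error_frequency": 0.2,           # simulate errors more often than the default
    "advanced.rate_limit_simulation.retry_after_range": (1.0, 3.0),  # keep simulated waits short in tests
    "advanced.retry.enabled": True,
    "advanced.retry.max_attempts": 5,
}

dynamics_client.update_config(test_config)
```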
## Example Configuration

Here's an example of a comprehensive configuration:

```python
config = {
    # Connection settings
    "advanced.connection.use_multi_pool": True,
    "advanced.connection.max_connections_per_sp": 50,

    # Performance settings
    "advanced.performance.num_workers": 1600,
    "advanced.performance.batch_size": 50_000,
    "advanced.performance.id_based_strategy_enabled": True,

    # Debug settings
    "advanced.debug.log_queries": True,
    "advanced.debug.log_batch_mapping": True,

    # Column settings
    "advanced.columns.exclude_columns": ["owningteam_id", "owningteam_logicalname"],

    # Retry settings
    "advanced.retry.enabled": True,
    "advanced.performance.deferred_retry_enabled": True,
}

dynamics_client.update_config(config)
```