Conditional Flows Now Available in Public Beta
Conditional Flows extend your data pipelines with intelligent automation such as dynamic branching, smart retries, delays, and runtime logic for workflows that adapt to real-world conditions.
Key Features
- Conditional Branching: Route workflows based on task outcomes, runtime data, or external conditions like time and date. Handle different failure types with appropriate responses instead of stopping pipeline execution.
- Smart Retry Logic: Configure maximum retry attempts and intelligent failure handling. Retry only on specific error codes while defining fallback actions when all retries are exhausted.
- Variables & Runtime Context: Use static variables for configuration and dynamic variables populated from task outputs. Make routing decisions based on execution context like number of output tables or processing duration.
- Visual Flow Builder: Design complex conditional logic through an interactive visual canvas. Build and debug workflows visually without writing code.
Main Use Cases
API Integration with Smart Retries
Third-party API integrations are inherently unreliable, with failure rates up to 20% due to rate limits, temporary outages, or network issues. Standard flows treat these failures as hard stops, requiring manual investigation and retry attempts that can take hours to resolve.
Conditional Flows handle these scenarios automatically through intelligent retry strategies. When an API call fails with a specific error code such as 429 (rate limited) or 503 (service unavailable), the flow can wait for a specified duration and retry the failed task. You can configure the maximum number of retry attempts and define fallback actions, such as continuing to downstream tasks where applicable or alerting the engineering team, when all retries are exhausted.
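The retry pattern described above can be sketched in plain Python. This is an illustrative sketch of the logic, not Keboola's actual API: the `call_api` callable, its `(status, payload)` return shape, and the `on_exhausted` fallback hook are all hypothetical names chosen for the example.

```python
import time

RETRYABLE_STATUS_CODES = {429, 503}  # rate limited, service unavailable

def run_with_retries(call_api, max_attempts=3, wait_seconds=60, on_exhausted=None):
    """Retry a task only on transient status codes, with a fallback action.

    `call_api` is a hypothetical callable returning (status_code, payload).
    """
    for attempt in range(1, max_attempts + 1):
        status, payload = call_api()
        if status == 200:
            return payload                      # success: flow continues
        if status not in RETRYABLE_STATUS_CODES:
            raise RuntimeError(f"non-retryable error {status}")
        if attempt < max_attempts:
            time.sleep(wait_seconds)            # delay before the next attempt
    # all retries exhausted: run the fallback branch (e.g. alert engineering)
    if on_exhausted:
        on_exhausted()
    return None
```

The key design point is that only transient codes trigger a retry; a 401 or 400, which retrying cannot fix, fails fast instead of burning attempts.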
This approach resolves the vast majority of transient API failures without human intervention, turning what used to be 3 AM wake-up calls into background processes that handle themselves. Your pipelines become resilient to external service instability while maintaining data freshness when services are available.
Time-Based Business Logic
Data validation requirements often vary based on business context. Strict validation rules that make sense during business hours can generate false alarms on weekends when data volumes are naturally lower or when certain upstream systems are offline for maintenance.
With Conditional Flows, you can build time-aware logic directly into your pipelines. The same validation task can apply different thresholds based on whether it's a weekend, holiday, or specific day of the month. During weekdays, strict validation rules ensure data quality standards for business-critical reports. On weekends, relaxed rules prevent unnecessary alerts while still catching genuine data issues.
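The time-aware thresholding described above amounts to picking a validation rule from the run's calendar context. A minimal sketch, assuming hypothetical row-count thresholds and a caller-supplied holiday set (none of these values come from the product itself):

```python
from datetime import date

WEEKDAY_MIN_ROWS = 10_000  # strict threshold for business-critical weekday loads
WEEKEND_MIN_ROWS = 1_000   # relaxed threshold for naturally low-volume days

def min_expected_rows(run_date: date, holidays=frozenset()) -> int:
    """Pick a validation threshold based on the run date.

    Weekends and holidays get the relaxed threshold; all other days
    get the strict one. Thresholds here are illustrative assumptions.
    """
    relaxed = run_date.weekday() >= 5 or run_date in holidays  # Sat/Sun or holiday
    return WEEKEND_MIN_ROWS if relaxed else WEEKDAY_MIN_ROWS
```

The same validation task then compares its actual row count against `min_expected_rows(date.today())`, so one flow serves both weekday and weekend contexts.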
This contextual approach dramatically reduces alert fatigue for on-call engineers while maintaining appropriate data quality controls. Your validation logic becomes as sophisticated as your business requirements, not limited by the rigid structure of traditional linear workflows.
Data-Driven Pipeline Routing
Data processing requirements aren't one-size-fits-all. Some datasets require distributed processing with extended timeouts, while others can be handled using standard processing paths for faster turnaround. Managing these decisions manually leads to bottlenecks, delays, or inefficient resource usage.
Conditional Flows enable dynamic branching based on runtime metadata - such as the number of tables produced in an output mapping. This allows your pipeline to evaluate specific outcomes of previous steps and decide the next path accordingly. For instance, if no tables are produced, you can skip downstream tasks.
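The routing decision above can be expressed as a small branching function over runtime metadata. The branch names and the table-count cutoff below are hypothetical placeholders for flow phases, chosen only to illustrate the pattern:

```python
def next_step(output_tables: list) -> str:
    """Choose the downstream branch from a task's runtime output.

    An empty output short-circuits the flow; a large output is routed
    to a heavier processing path. The 50-table cutoff is an assumed value.
    """
    if not output_tables:
        return "skip_downstream"         # nothing produced: skip later tasks
    if len(output_tables) > 50:
        return "distributed_processing"  # large output: extended timeouts
    return "standard_processing"         # typical output: fast path
```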
This intelligent routing optimizes both performance and resource utilization without requiring manual intervention or separate pipeline definitions. Your data processing becomes adaptive, scaling processing strategies to match the actual characteristics of each dataset rather than assuming worst-case scenarios for every run.
Notifications with Context – Only When It Matters
Conditional Flows introduce context-aware notifications, allowing you to configure alerts that only fire under specific, meaningful conditions. For example, you can define rules like:
- “Send Slack alert only if a retry fails after 3 attempts”
- “Trigger an email only if the output mapping produced zero tables”
- “Notify the data team only if a task exceeds 10 minutes”
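The three example rules above are essentially predicates over a flow event. A minimal sketch of that idea, where the event keys (`retries_exhausted`, `output_table_count`, `duration_minutes`) and channel names are illustrative assumptions rather than the product's actual schema:

```python
def should_alert(event: dict):
    """Map a flow event to a notification channel, or None for no alert.

    Mirrors the three example rules: Slack after exhausted retries,
    email on zero output tables, data-team ping on slow tasks.
    """
    if event.get("retries_exhausted") and event.get("attempts", 0) >= 3:
        return "slack"
    if event.get("output_table_count") == 0:
        return "email"
    if event.get("duration_minutes", 0) > 10:
        return "data_team"
    return None  # nothing noteworthy: stay silent
```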
By embedding notification logic into your flow conditions, you reduce noise and ensure that alerts reach the right people only when action is actually needed. This increases focus, accelerates response time for critical issues, and restores trust in your alerting system.
The result: your team isn’t overwhelmed with generic errors; instead, they get relevant, real-time signals that drive faster decisions and cleaner operations.
Beta Status & Limitations
Important: Conditional Flows are a distinct feature from standard Flows and are not interchangeable. While both orchestrate task execution, Conditional Flows use a different execution engine optimized for conditional logic and dynamic decision-making. Migration tooling is planned for a future release to help transition existing standard Flows to conditional logic where appropriate.
Currently not supported in the Conditional Flows beta: Several features available in standard Flows are not yet available in Conditional Flows, including:
- CLI tool integration for programmatic flow management
- The flow templates library for quick setup
- Trigger Components for event-driven execution
- Advanced features like Run Selected Tasks and Re-run Failed Tasks for partial pipeline execution
What's included in beta: The core conditional logic engine supports all branching, retry, and delay functionality. The visual flow builder provides full design and testing capabilities. Basic notification systems work for email and webhook alerts. Variables and runtime context are fully supported for dynamic decision-making.
Getting Started
Navigate to Flows → Conditional Flows → Create Flow to access the new builder interface. You'll land directly in the visual editor where you can start designing your first conditional workflow. Choose from several common patterns or build your logic from scratch using the drag-and-drop interface.
The builder includes testing capabilities that let you simulate different execution scenarios before deploying your flow. You can validate your conditional logic, test retry strategies, and verify notification settings using sample data and mock failures.
Documentation: Complete guides and API references are available at help.keboola.com/flows/conditional-flows
Feedback: As we continue developing this feature, your feedback is essential for prioritizing improvements and additional capabilities. Share suggestions or report issues through our standard support channels.