Overview
The Flow Caller node lets you execute other flows in your workspace while passing data between them. It acts as a bridge for building modular flow architectures in which specialized flows can be called and reused across different contexts. To exchange data, the target flow must contain Data Input nodes (to receive data) and/or Data Output nodes (to return results).
Input Configuration
Select Flow to Run
Choose the target flow to execute:
Purpose: Identify which flow in your workspace to execute
Selection Method: Use the dropdown to browse available flows
Default State: Shows "Select Flow" until a flow is chosen
Flow Requirements: Target flow must contain Data Input and/or Data Output nodes for data exchange
Flow Execution Mode
Control how the flow execution integrates with the current flow:
Return after Flow:
Purpose: Execute the target flow and return its output to continue the current flow
Behavior: Waits for target flow completion and retrieves results
Use Case: When you need the target flow's output for further processing
Trigger Only:
Purpose: Execute the target flow without waiting for it to complete or return data
Behavior: Starts target flow execution and immediately continues current flow
Use Case: When triggering background processes or fire-and-forget operations
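The difference between the two modes is only whether the calling flow blocks for results. A minimal sketch of the distinction, assuming a hypothetical run_flow helper (the real execution is handled by the node itself, not by code you write):

```python
# Hypothetical sketch: run_flow(flow_id, data, wait=...) is an assumed
# signature used only to illustrate the two modes, not a documented API.

def call_and_return(run_flow, flow_id, data):
    # "Return after Flow": block until the target flow finishes, then use
    # its Data Output results in the current flow.
    return run_flow(flow_id, data, wait=True)

def trigger_only(run_flow, flow_id, data):
    # "Trigger Only": start the target flow and continue immediately;
    # no output is collected (fire-and-forget).
    run_flow(flow_id, data, wait=False)
```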
Data Exchange Configuration
File Input Section
Pass file data to the target flow:
File Input Field: Upload or connect files to pass to Data Input nodes in the target flow
Connection Point: Can receive file data from other flow nodes
Target Mapping: Files are passed to corresponding Data Input nodes in the called flow
File Types: Accepted file types depend on the requirements of the target flow's Data Input nodes
Data Input Section
Pass text or structured data to the target flow:
Data Field: Enter text, JSON, or other structured data to pass to the target flow
Connection Point: Can receive data from other flow nodes
Format Flexibility: Accepts various data formats as required by the target flow's Data Input nodes
Multiple Inputs: If the target flow has multiple Data Input nodes, the provided data is distributed across them
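How data is matched to individual Data Input nodes depends on the target flow's design. As an illustration only, a payload for a flow with two Data Input nodes might be organized as a JSON object keyed by node name (the names below are invented):

```python
import json

# Hypothetical payload for a target flow with two Data Input nodes.
# "customer_record" and "processing_options" are example node names.
payload = {
    "customer_record": {"id": 1042, "name": "Acme Corp", "tier": "gold"},
    "processing_options": {"normalize": True, "locale": "en-US"},
}

# Serialized form suitable for pasting into the Data field.
print(json.dumps(payload, indent=2))
```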
Output Configuration
File Output Section
File Output Display: Shows file outputs returned from Data Output nodes in the executed flow
File Access: Retrieved files can be downloaded or passed to other nodes
Multiple Files: Displays all files returned by the target flow's Data Output nodes
Flow Results
Output Display: Shows text and data results from the target flow's execution
Data Format: Preserves original format and structure of returned data
Multiple Outputs: If the target flow has multiple Data Output nodes, all of their outputs are displayed
Processing Status: Indicates execution progress and completion
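In Return after Flow mode, the calling side works with whatever the Data Output nodes send back. A rough sketch, assuming results arrive keyed by Data Output node name (the structure below is an assumption for illustration, not the documented format):

```python
# Assumed result structure; keys mirror the target flow's Data Output
# node names ("summary_text", "error_report" are examples).
result = {
    "status": "completed",
    "outputs": {
        "summary_text": "Processed 120 records, 3 skipped.",
        "error_report": None,
    },
}

if result["status"] == "completed":
    for node_name, value in result["outputs"].items():
        if value is not None:        # only display non-empty outputs
            print(f"{node_name}: {value}")
else:
    print(f"Flow did not complete: {result['status']}")
```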
Execution Control
Run Flow Button
Location: Top-right corner of the interface
Function: Initiates execution of the selected target flow
Data Passing: Automatically passes the configured input data and files to the target flow
Output Retrieval: Collects and displays results when execution completes
Flow Architecture Requirements
Target Flow Setup
Data Input Nodes: The target flow must contain Data Input nodes to receive data from the Flow Caller
Data Output Nodes: The target flow must contain Data Output nodes to return results to the Flow Caller
Node Mapping: Multiple Data Input/Output nodes in the target flow correspond to multiple inputs/outputs in the Flow Caller
Flow Design: Target flows should be designed with clear input/output interfaces for reusability
Data Flow Process
Input Distribution: The Flow Caller distributes the input data to all Data Input nodes in the target flow
Flow Execution: The target flow runs, processing the provided input data
Output Collection: The Flow Caller collects results from all Data Output nodes in the target flow
Result Display: Retrieved outputs are displayed in corresponding sections
Best Practices
Flow Design
Modular Architecture: Design target flows as reusable modules with clear input/output interfaces
Documentation: Document expected input formats and output structures for target flows
Error Handling: Implement error handling in target flows to provide meaningful feedback (a sketch follows this list)
Testing: Test flow calling relationships thoroughly before production use
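For the error-handling point above, one pattern is to have the target flow always return a structured result through a Data Output node, so the calling flow can tell success from failure. A sketch with invented field names:

```python
# Sketch of error handling inside a target flow's processing step.
# The "ok" / "error" fields are illustrative conventions, not product fields.

def process_records(records):
    try:
        if not isinstance(records, list):
            raise ValueError("expected a list of records")
        processed = [r for r in records if r.get("valid", True)]
        # Success payload routed to a Data Output node.
        return {"ok": True, "count": len(processed), "error": None}
    except Exception as exc:
        # Failure payload: the Flow Caller sees why the target flow failed
        # instead of receiving an empty or missing output.
        return {"ok": False, "count": 0, "error": str(exc)}
```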
Data Management
Input Validation: Ensure input data matches the target flow's expected format and structure (see the validation sketch after this list)
Output Processing: Plan for handling various output types and potential error conditions
Data Size: Consider data volume limitations and processing time for large datasets
Format Consistency: Maintain consistent data formats across related flows
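A lightweight check before handing data to the Flow Caller can catch format mismatches early. This sketch assumes the invented payload shape from the earlier Data Input example:

```python
# Minimal pre-flight validation; REQUIRED_FIELDS is an assumption for
# illustration -- use whatever the target flow's Data Input nodes expect.
REQUIRED_FIELDS = {"customer_record", "processing_options"}

def validate_payload(payload: dict) -> list:
    """Return a list of problems; an empty list means the payload looks usable."""
    problems = []
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if not isinstance(payload.get("customer_record"), dict):
        problems.append("customer_record should be an object")
    return problems

print(validate_payload({"customer_record": {"id": 1}}))  # reports the missing field
```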
Performance Considerations
Execution Mode: Choose appropriate execution mode based on whether you need return data
Flow Complexity: Consider target flow execution time and resource requirements
Parallel Execution: Plan for cases where multiple Flow Caller nodes execute simultaneously (see the sketch after this list)
Resource Management: Monitor system resources when calling computationally intensive flows
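When several flow calls can run at once, bounding concurrency is one simple safeguard. A conceptual sketch with a stand-in run_flow function (the real mechanism is the node itself):

```python
from concurrent.futures import ThreadPoolExecutor

def run_flow(flow_id, data):
    # Stand-in for the real execution mechanism; returns a dummy result.
    return {"flow": flow_id, "outputs": {}}

calls = [
    ("clean_data", {"batch": 1}),
    ("clean_data", {"batch": 2}),
    ("score_records", {"batch": 1}),
]

# max_workers caps how many flows execute simultaneously, keeping
# computationally heavy target flows from exhausting resources.
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(run_flow, flow_id, data) for flow_id, data in calls]
    results = [f.result() for f in futures]

print(results)
```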
Integration Considerations
Flow Orchestration
Flow Dependencies: Map dependencies between flows to avoid circular references (see the cycle-check sketch after this list)
Execution Order: Plan execution sequences for complex multi-flow operations
State Management: Consider how data state is maintained across flow boundaries
Error Propagation: Plan how errors in target flows should affect calling flows
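Circular call chains (flow A calls B, B calls A) are easy to introduce as a workspace grows. A small sketch of checking a hand-maintained dependency map for cycles; the flow names are examples:

```python
# Which flows each flow calls via Flow Caller nodes (example data).
calls_map = {
    "ingest": ["clean"],
    "clean": ["score"],
    "score": ["ingest"],   # this edge closes a cycle
}

def has_cycle(graph):
    visiting, done = set(), set()

    def visit(node):
        if node in done:
            return False
        if node in visiting:
            return True            # back edge -> circular reference
        visiting.add(node)
        if any(visit(n) for n in graph.get(node, [])):
            return True
        visiting.remove(node)
        done.add(node)
        return False

    return any(visit(n) for n in graph)

print(has_cycle(calls_map))  # True for the map above
```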
Use Cases
Data Processing Pipelines: Break complex processing into specialized, reusable flows
Microservice Architecture: Implement flow-based microservices with clear interfaces
Template Flows: Create reusable template flows for common operations
Conditional Processing: Execute different flows based on input conditions or business logic (see the sketch after this list)
Background Tasks: Trigger long-running processes without blocking main flow execution
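For conditional processing, the selection logic lives in the calling flow, and the input decides which target flow runs. A sketch with invented flow names and routing rules:

```python
def choose_flow(record: dict) -> str:
    # Routing rules are examples; real conditions depend on your business logic.
    if record.get("priority") == "high":
        return "expedited_processing"
    if record.get("region") == "EU":
        return "eu_compliance_pipeline"
    return "standard_processing"

print(choose_flow({"priority": "high"}))  # expedited_processing
print(choose_flow({"region": "EU"}))      # eu_compliance_pipeline
```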
Security Considerations
Access Control: Ensure appropriate permissions for cross-flow execution
Data Privacy: Consider data sensitivity when passing information between flows
Flow Isolation: Design flows to prevent unintended data leakage between executions
Audit Trail: Track flow execution relationships for compliance and debugging
The Flow Caller enables sophisticated flow orchestration patterns, allowing you to build scalable, modular automation systems with clear separation of concerns and reusable components.