Vertex AI Component
Drag & Drop LLM Component
Overview
A drag-and-drop component for integrating Google Cloud's Vertex AI models into your workflow. Configure model parameters and connect inputs/outputs to other components.
Component Configuration
Basic Parameters
- Input: Text input for the model
- System Message: System prompt to guide model behavior
- Stream: Toggle for streaming responses
- Model Name: The model to use (e.g., gemini-1.5-pro)
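The sketch below illustrates how these basic parameters could map onto the Vertex AI Python SDK. The component handles this wiring internally; the prompt, system message, project, and region values here are placeholders.

```python
# Illustrative mapping of the basic parameters to the Vertex AI SDK.
# Assumes the project settings described below have already been configured.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder project/region

model = GenerativeModel(
    "gemini-1.5-pro",                                   # Model Name
    system_instruction="You are a helpful assistant.",  # System Message
)

# With Stream enabled, the response arrives as chunks rather than one block.
for chunk in model.generate_content("Summarize Vertex AI.", stream=True):  # Input
    print(chunk.text, end="")
```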
Project Settings
- Credentials: Google Cloud credentials file
- Project: Google Cloud project ID
- Location: Region (e.g., us-central1)
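As a rough illustration, these project settings correspond to the arguments of `vertexai.init` when using a service account. The file path, project ID, and region below are placeholders.

```python
# Sketch of initializing Vertex AI with a service-account credentials file.
import vertexai
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    "service-account.json"     # Credentials: Google Cloud credentials file
)
vertexai.init(
    project="my-gcp-project",  # Project: Google Cloud project ID
    location="us-central1",    # Location: region for the API calls
    credentials=creds,
)
```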
Model Parameters
- Max Output Tokens: Maximum number of tokens to generate
- Max Retries: Number of retry attempts (default: 1)
- Temperature: Creativity control (default: 0)
- Top K: Top-k sampling parameter
- Top P: Nucleus sampling parameter (default: 0.95)
- Verbose: Toggle detailed output logging
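For reference, a sketch of how the sampling parameters might be expressed as a `GenerationConfig` in the SDK. Max Retries and Verbose are client/component settings rather than generation parameters, and the Max Output Tokens and Top K values below are arbitrary examples.

```python
# Example GenerationConfig using the defaults listed above where available.
from vertexai.generative_models import GenerationConfig

config = GenerationConfig(
    max_output_tokens=1024,  # Max Output Tokens (example value)
    temperature=0.0,         # Temperature default: deterministic output
    top_k=40,                # Top K (example value)
    top_p=0.95,              # Top P default
)
```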
Output Connections
- Text: Generated text output
- Language Model: Model information and metadata
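The snippet below is one way to inspect what these two outputs contain when calling the SDK directly; it assumes `vertexai.init` has already been run as shown earlier.

```python
# Inspecting the generated text and the usage metadata on a response.
from vertexai.generative_models import GenerativeModel

model = GenerativeModel("gemini-1.5-pro")
response = model.generate_content("Hello")

print(response.text)            # Text: generated text output
print(response.usage_metadata)  # metadata such as token counts
```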
Usage Tips
- Use service account credentials for authentication
- Set a region close to your users for lower latency
- Enable streaming for real-time responses
- Adjust temperature to match the task (lower for factual output, higher for creative output)
Best Practices
- Handle the credentials file securely
- Monitor API quotas and usage
- Implement proper error handling
- Test with small token limits first
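A minimal sketch of the last two practices, assuming the `google-api-core` exception types; a real workflow would log and retry rather than just print.

```python
# Basic error handling around a Vertex AI call, with a small token limit for testing.
from google.api_core.exceptions import GoogleAPICallError
from vertexai.generative_models import GenerativeModel, GenerationConfig

model = GenerativeModel("gemini-1.5-pro")
try:
    response = model.generate_content(
        "Test prompt",
        generation_config=GenerationConfig(max_output_tokens=64),  # small limit first
    )
    print(response.text)
except GoogleAPICallError as err:
    # Surface the failure to the workflow instead of failing silently.
    print(f"Vertex AI request failed: {err}")
```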