Groq Component
Drag & Drop LLM Component
Overview
A drag-and-drop component for integrating Groq's high-performance LLM inference into your workflow. Configure model parameters and connect inputs/outputs to other components.
Component Configuration
Basic Parameters
Input
Text input for the model
System Message
System prompt to guide model behavior
Stream
Toggle for streaming responses
Model
Model name to use, e.g., llama-3.1-8b-instant
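The basic parameters map onto a chat-completion request against Groq's OpenAI-compatible API. A minimal sketch of how the component might assemble them (the payload field names follow the OpenAI chat format; the exact internal wiring of the drag-and-drop component is an assumption here):

```python
def build_payload(user_input, system_message, model, stream=False):
    """Assemble a chat-completion payload from the component's basic parameters.

    Field names follow Groq's OpenAI-compatible chat format; how the
    component wires these internally is assumed for illustration.
    """
    messages = []
    if system_message:
        messages.append({"role": "system", "content": system_message})
    messages.append({"role": "user", "content": user_input})
    return {"model": model, "messages": messages, "stream": stream}

payload = build_payload(
    "Summarize this paragraph.",
    "You are a concise assistant.",
    "llama-3.1-8b-instant",
    stream=True,
)
```

The system message, when present, is sent as the first entry in `messages`, ahead of the user input.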
API Configuration
Groq API Key
Your API authentication key
Groq API Base
https://api.groq.com
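In code, this configuration amounts to an API key (ideally read from an environment variable, per the tips below) plus the base URL. A sketch of building the request URL and headers; the `/openai/v1/chat/completions` path follows Groq's OpenAI-compatible layout, and `GROQ_API_KEY` is a conventional variable name, not one mandated by the component:

```python
import os

GROQ_API_BASE = "https://api.groq.com"

def build_request(path="/openai/v1/chat/completions"):
    # Read the key from the environment rather than hard-coding it.
    api_key = os.environ.get("GROQ_API_KEY", "")
    url = GROQ_API_BASE + path
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return url, headers
```

Overriding `GROQ_API_BASE` lets you point the same configuration at a proxy or gateway without touching the rest of the workflow.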
Model Parameters
Max Output Tokens
Maximum number of tokens to generate
Temperature
Creativity control (0.1 default)
N
Number of completions to generate
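These knobs correspond to standard sampling fields in the request body (`max_tokens`, `temperature`, `n` in the OpenAI-compatible format; the 0–2 temperature range is the usual convention for such APIs and is assumed here). A sketch with basic validation before sending:

```python
def sampling_params(max_tokens=256, temperature=0.1, n=1):
    """Build the sampling portion of the request body, validating ranges.

    The 0-2 temperature range follows the common OpenAI-style convention;
    defaults mirror the component's (temperature 0.1, single completion).
    """
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be in [0, 2]")
    if n < 1:
        raise ValueError("n must be at least 1")
    return {"max_tokens": max_tokens, "temperature": temperature, "n": n}
```

A request payload would merge this dict with the model and messages before sending.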
Output Connections
Text
Generated text output
Language Model
Model information and metadata
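Both outputs correspond to fields of the JSON response. A sketch of extracting them, assuming the OpenAI-style response shape (the `sample` dict below is illustrative data, not a recorded Groq response):

```python
def extract_outputs(response):
    """Split a chat-completion response into the component's two outputs."""
    # "Text" output: the generated completion.
    text = response["choices"][0]["message"]["content"]
    # "Language Model" output: model name plus usage metadata.
    model_info = {
        "model": response.get("model"),
        "usage": response.get("usage", {}),
    }
    return text, model_info

sample = {
    "model": "llama-3.1-8b-instant",
    "choices": [{"message": {"role": "assistant", "content": "Hello!"}}],
    "usage": {"prompt_tokens": 10, "completion_tokens": 2},
}
text, info = extract_outputs(sample)
```

Downstream components would consume `text` for display or chaining, and `info` for logging and token accounting.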
Usage Tips
- Store the API key in an environment variable instead of hard-coding it
- Enable streaming for faster perceived response times
- Start with the default temperature (0.1) and raise it only if outputs are too repetitive
- Test with single completions (N = 1) before generating multiple
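When Stream is enabled, the response arrives as server-sent events: one `data:` line per chunk, terminated by `data: [DONE]`, in the OpenAI-compatible streaming format (the chunk shape below is assumed from that convention). A minimal parser sketch that reassembles the full text:

```python
import json

def collect_stream(lines):
    """Accumulate delta content from SSE 'data:' lines into the full text."""
    parts = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"].get("content", "")
        parts.append(delta)
    return "".join(parts)

# Illustrative chunks, not a recorded Groq stream.
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
```

In a real workflow each delta would be forwarded to the UI as it arrives, which is what makes streaming feel faster.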
Best Practices
- Handle the API key securely; never commit it to source control
- Monitor token usage to control costs
- Implement error handling for rate limits and timeouts
- Set token limits appropriate to your use case
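Error handling in particular is worth automating: rate-limit responses (HTTP 429) and transient server errors can be retried with exponential backoff. A generic sketch (the status codes and retry policy are illustrative conventions, not something the component or Groq mandates):

```python
import time

def call_with_retries(send, max_retries=3, base_delay=1.0):
    """Call `send` (which returns (status_code, body)), retrying 429/5xx.

    Uses exponential backoff: base_delay, 2*base_delay, 4*base_delay, ...
    Non-retryable statuses raise immediately.
    """
    for attempt in range(max_retries + 1):
        status, body = send()
        if status == 200:
            return body
        if (status == 429 or status >= 500) and attempt < max_retries:
            time.sleep(base_delay * (2 ** attempt))
            continue
        raise RuntimeError(f"request failed with status {status}")
```

Wrapping the component's HTTP call this way also gives you one place to log failures for the monitoring suggested above.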