Hello LLM
Initialization
CLI Command: para init --template hello_llm
Why this project?
This project demonstrates LLM integration with Paranet and provides sample approaches for building production LLM applications.
What this kit showcases
- Direct LLM integration using Azure OpenAI via Python actors.
- Three interaction patterns:
  - Raw conversational interface
  - Medical assistant with specialized prompts
  - Template-based email classification and routing
- Actor-based LLM orchestration using Paraflow skills.
- Environment-based configuration for secure API management.
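The three patterns above differ mainly in the system context sent to the model. A minimal sketch of that idea, assuming a chat-completion message format; the pattern names follow the skill names mentioned later, but the prompt text and the `build_messages` helper are illustrative, not the kit's actual code:

```python
# Hypothetical system prompts per interaction pattern (illustrative only).
SYSTEM_PROMPTS = {
    "direct_chat": None,  # raw conversation: no system context at all
    "medical_chat": (
        "You are a professional medical assistant. Answer healthcare "
        "questions carefully and suggest consulting a clinician when needed."
    ),
    "prioritize": (
        "Classify the following email as one of: urgent, routine, spam. "
        "Reply with the category only."
    ),
}

def build_messages(pattern, user_text):
    """Assemble a chat-completion message list for the given pattern."""
    messages = []
    system = SYSTEM_PROMPTS.get(pattern)
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": user_text})
    return messages
```

The point of the design is that one LLM actor can serve all three patterns by swapping only the system message.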
How to run it
1. Set up environment variables
Create a .env file with Azure OpenAI credentials:
AZURE_OPENAI_API_KEY=<your key here>
AZURE_OPENAI_ENDPOINT=<your endpoint here>
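A Python actor can then pick these credentials up from the environment. The loader below is a minimal sketch using only the standard library (the kit may instead rely on a package such as python-dotenv; the function name is illustrative):

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: sets KEY=value lines into os.environ.

    Existing environment variables are not overwritten.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

# Actor code can then read the credentials, e.g.:
#   api_key = os.environ["AZURE_OPENAI_API_KEY"]
#   endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]
```

Keeping credentials out of source files and in the environment is what "environment-based configuration" refers to above.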
2. Deploy Paranet services
para devkit login
para docker deploy node
para docker deploy package
3. Test interaction patterns
- Navigate to Paracord UI → Actor Hub.
- Try each pattern:
Direct LLM Conversation (chatgpt/direct_chat):
- Enter any question in the prompt field
- Get raw conversational responses without system context
Medical Assistant (chatgpt/medical_chat):
- Enter healthcare-related questions
- Receive professional medical assistant responses with specialized context
Email Classification (email_classifier/prioritize):
- Fill in email details (subject and body)
- The LLM categorizes the email, and Paraflow routes it based on that classification
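The email pattern above combines a prompt template with a routing table. A sketch of both halves, assuming the three-label scheme used earlier; the template wording, label set, and route names are all illustrative, not the kit's actual configuration:

```python
# Hypothetical prompt template: the actor fills in the email fields.
EMAIL_TEMPLATE = (
    "Classify this email and reply with exactly one label "
    "(urgent, routine, or spam).\n\n"
    "Subject: {subject}\n"
    "Body: {body}"
)

def classification_prompt(subject, body):
    """Render the template with the email details from the UI form."""
    return EMAIL_TEMPLATE.format(subject=subject, body=body)

# Hypothetical routing table: Paraflow would dispatch on the model's label.
ROUTES = {"urgent": "oncall_queue", "routine": "inbox", "spam": "quarantine"}

def route(label):
    """Map the LLM's label to a downstream route, tolerating stray whitespace/case."""
    return ROUTES.get(label.strip().lower(), "inbox")
```

Normalizing the model's reply before the lookup (strip and lowercase, with a default route) matters because LLM output is not guaranteed to match the label set exactly.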