Workflow 1: Basic Batch Processing
```bash
# 1. Create JSONL file
echo '{"key": "req1", "request": {"contents": [{"parts": [{"text": "Explain photosynthesis"}]}]}}' > requests.jsonl
echo '{"key": "req2", "request": {"contents": [{"parts": [{"text": "What is gravity?"}]}]}}' >> requests.jsonl
# 2. Create batch job
python scripts/create_batch.py requests.jsonl --name "science-questions"
# 3. Check status
python scripts/check_status.py --wait
# 4. Get results
python scripts/get_results.py --output results.jsonl
```
- Best for: Basic bulk processing, cost efficiency
- Typical time: Minutes to hours depending on job size
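Malformed lines make the whole batch fail, so it can pay to validate the JSONL file before submitting. A minimal sketch that checks each line is valid JSON and carries the `key` and `request.contents` fields used in the requests above:

```python
import json

def validate_jsonl(path):
    """Return a list of (line_number, error) tuples; an empty list means the file looks valid."""
    errors = []
    with open(path) as f:
        for n, line in enumerate(f, 1):
            line = line.strip()
            if not line:
                continue  # skip blank lines
            try:
                obj = json.loads(line)
            except json.JSONDecodeError as e:
                errors.append((n, f"invalid JSON: {e}"))
                continue
            if "key" not in obj:
                errors.append((n, "missing 'key'"))
            if "contents" not in obj.get("request", {}):
                errors.append((n, "missing 'request.contents'"))
    return errors
```

Run it against `requests.jsonl` before step 2; any non-empty result points at the offending line number.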
Workflow 2: Bulk Content Generation
```bash
# 1. Generate JSONL with content requests
python3 << 'EOF'
import json
topics = ["sustainable energy", "AI in healthcare", "space exploration"]
with open("content-requests.jsonl", "w") as f:
with open("content-requests.jsonl", "w") as f:
    for i, topic in enumerate(topics):
        req = {
            "key": f"blog-{i}",
            "request": {
                "contents": [{
                    "parts": [{
                        "text": f"Write a 500-word blog post about {topic}"
                    }]
                }]
            }
        }
        f.write(json.dumps(req) + "\n")
EOF
# 2. Process batch
python scripts/create_batch.py content-requests.jsonl --name "blog-posts" --model gemini-3-flash-preview
python scripts/check_status.py --wait
python scripts/get_results.py --output blog-posts.jsonl
```
- Best for: Blog generation, article creation, bulk writing
- Combines with: gemini-text for content needs
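Each line of the results file pairs a request `key` with its response. Assuming the standard `response.candidates[0].content.parts[0].text` layout (the exact shape may vary by API version), the generated text can be pulled out like this:

```python
import json

def extract_texts(path):
    """Map each request key to its generated text (None if the response had no candidates)."""
    out = {}
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue
            row = json.loads(line)
            candidates = row.get("response", {}).get("candidates", [])
            text = None
            if candidates:
                parts = candidates[0].get("content", {}).get("parts", [])
                text = "".join(p.get("text", "") for p in parts)
            out[row["key"]] = text
    return out
```

For the workflow above, `extract_texts("blog-posts.jsonl")` yields one blog post per `blog-{i}` key.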
Workflow 3: Dataset Processing
```bash
# 1. Load dataset and create batch requests
python3 << 'EOF'
import json
# Your dataset
data = [
{"product": "laptop", "features": ["fast", "lightweight"]},
{"product": "headphones", "features": ["wireless", "noise-cancelling"]},
]
with open("product-descriptions.jsonl", "w") as f:
with open("product-descriptions.jsonl", "w") as f:
    for item in data:
        features = ", ".join(item["features"])
        prompt = f"Write a product description for {item['product']} with these features: {features}"
        req = {
            "key": item["product"],
            "request": {
                "contents": [{"parts": [{"text": prompt}]}]
            }
        }
        f.write(json.dumps(req) + "\n")
EOF
# 2. Process
python scripts/create_batch.py product-descriptions.jsonl
python scripts/check_status.py --wait
python scripts/get_results.py --output results.jsonl
```
- Best for: Product descriptions, dataset enrichment, bulk analysis
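Because each result line echoes the request `key` (here, the product name), results can be joined back onto the source dataset. A sketch, assuming each result line pairs the `key` with a `response` object:

```python
import json

def merge_results(data, results_path, key_field="product"):
    """Attach each item's raw response (matched by key) under a 'response' field."""
    by_key = {}
    with open(results_path) as f:
        for line in f:
            if line.strip():
                row = json.loads(line)
                by_key[row["key"]] = row.get("response")
    # Items with no matching result get response=None
    return [dict(item, response=by_key.get(item[key_field])) for item in data]
```

This keeps the original dataset ordering, which matters when the batch service returns results in a different order than they were submitted.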
Workflow 4: Email Campaign Generation
```bash
# 1. Create personalized email requests
python3 << 'EOF'
import json
customers = [
{"name": "Alice", "product": "premium plan"},
{"name": "Bob", "product": "basic plan"},
]
with open("emails.jsonl", "w") as f:
with open("emails.jsonl", "w") as f:
    for cust in customers:
        prompt = f"Write a personalized email to {cust['name']} about upgrading to our {cust['product']}"
        req = {
            "key": f"email-{cust['name'].lower()}",
            "request": {
                "contents": [{"parts": [{"text": prompt}]}]
            }
        }
        f.write(json.dumps(req) + "\n")
EOF
# 2. Process batch
python scripts/create_batch.py emails.jsonl --name "email-campaign"
python scripts/check_status.py --wait
python scripts/get_results.py --output email-results.jsonl
```
- Best for: Marketing campaigns, personalized outreach
- Combines with: gemini-text for email content
Workflow 5: Async Job Monitoring
```bash
# 1. Create job
python scripts/create_batch.py large-batch.jsonl --name "big-job"
# 2. Check status periodically (interrupt with Ctrl+C once the job completes)
while true; do
  python scripts/check_status.py
  sleep 60  # Check every minute
done
# 3. Get results when done
python scripts/get_results.py --output final-results.jsonl
```
- Best for: Long-running jobs, background processing
- Use when: You don't need immediate results
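For very long jobs, exponential backoff wastes fewer status calls than a fixed one-minute loop. A sketch with a hypothetical `job_is_done` callable standing in for whatever `check_status.py` reports:

```python
import time

def wait_for_job(job_is_done, initial=30, factor=2, max_interval=600,
                 max_wait=86400, sleep=time.sleep):
    """Poll job_is_done() with exponential backoff.

    Returns True once the job completes, False if max_wait seconds elapse first.
    """
    waited, interval = 0, initial
    while waited < max_wait:
        if job_is_done():
            return True
        sleep(interval)
        waited += interval
        interval = min(interval * factor, max_interval)  # cap the backoff
    return False
```

The `sleep` parameter is injectable so the loop can be tested (or throttled) without real waiting.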
Workflow 6: Cost-Optimized Bulk Processing
```bash
# 1. Use flash model for cost efficiency
python scripts/create_batch.py requests.jsonl --model gemini-3-flash-preview --name "cost-optimized"
# 2. Monitor and retrieve
python scripts/check_status.py --wait
python scripts/get_results.py
```
- Best for: High-volume, cost-sensitive applications
- Savings: Batch API typically 50%+ cheaper than real-time
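Since batch cost scales with input volume, a quick tally of the request file helps sanity-check spend before submitting. This counts characters only; actual billing is per token, which this sketch does not compute:

```python
import json

def batch_stats(path):
    """Count requests and total prompt characters in a batch JSONL file."""
    requests = chars = 0
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue
            req = json.loads(line)["request"]
            requests += 1
            for content in req.get("contents", []):
                for part in content.get("parts", []):
                    chars += len(part.get("text", ""))
    return {"requests": requests, "prompt_chars": chars}
```

Comparing `prompt_chars` across runs also catches accidentally duplicated or truncated request files.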
Workflow 7: Multi-Stage Pipeline
```bash
# Stage 1: Generate content
python scripts/create_batch.py content-requests.jsonl --name "stage1-content"
python scripts/check_status.py --wait
# Stage 2: Summarize content
python scripts/create_batch.py summaries.jsonl --name "stage2-summaries"
python scripts/check_status.py --wait
# Stage 3: Convert to audio (gemini-tts)
# Process results from stage 2
```
- Best for: Complex workflows, multi-step processing
- Combines with: Other Gemini skills for complete pipelines
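Stage 2's `summaries.jsonl` has to be built from stage 1's output between the two batch runs. A sketch that turns each stage-1 result into a summarization request, assuming the standard `response.candidates[0].content.parts[0].text` result layout:

```python
import json

def build_stage2_requests(stage1_results, stage2_path):
    """Turn each stage-1 generated text into a stage-2 summarization request."""
    with open(stage1_results) as src, open(stage2_path, "w") as out:
        for line in src:
            if not line.strip():
                continue
            row = json.loads(line)
            candidates = row.get("response", {}).get("candidates", [])
            if not candidates:
                continue  # skip failed or empty results
            parts = candidates[0].get("content", {}).get("parts", [])
            text = "".join(p.get("text", "") for p in parts)
            if not text:
                continue
            req = {
                "key": f"summary-{row['key']}",
                "request": {
                    "contents": [{"parts": [{
                        "text": f"Summarize in two sentences:\n\n{text}"
                    }]}]
                },
            }
            out.write(json.dumps(req) + "\n")
```

Prefixing the stage-1 key (`summary-{row['key']}`) keeps results traceable back through every stage of the pipeline.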