---
name: rag-agent
description: Retrieval-Augmented Generation for project knowledge management using ChromaDB
---
# RAG Agent

## Purpose

Stores and retrieves project artifacts with semantic search capabilities.
## When to Use This Skill
- **Context Retrieval** - get relevant info for LLM queries
- **Documentation Search** - find relevant docs
- **Prompt Management** - store and retrieve prompts
- **Code Examples** - find similar implementations
## Responsibilities
- **Store artifacts** (prompts, code, docs) with embeddings
- **Semantic search** across project knowledge
- **Context retrieval** for LLM queries
- **Version management** for artifacts
- **Integration** with the Knowledge Graph
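Semantic search boils down to comparing embedding vectors. Below is a minimal, self-contained sketch of the idea, using toy bag-of-words vectors in place of ChromaDB's real embedding model; `embed`, `cosine`, and `search` are illustrative helpers, not the agent's actual API:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words term-frequency vector.
    # ChromaDB would use a real embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def search(query: str, docs: dict[str, str], top_k: int = 3) -> list[tuple[str, float]]:
    # Rank stored artifacts by similarity to the query.
    q = embed(query)
    scored = [(doc_id, cosine(q, embed(text))) for doc_id, text in docs.items()]
    return sorted(scored, key=lambda x: x[1], reverse=True)[:top_k]

docs = {
    "auth-doc": "how to implement authentication with tokens",
    "db-doc": "database schema migration guide",
}
top_hit = search("implement authentication", docs)[0]
```

The real store replaces the toy vectors with learned embeddings, but the ranking step is the same: score every candidate against the query and return the top-k.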
## Integration with Pipeline

### Communication

**Receives:**
- Artifacts to store (prompts, docs, code)
- Search queries from other agents
- Context retrieval requests for LLMs
**Sends:**
- Relevant artifacts based on semantic similarity
- Context for LLM queries
- Search results with relevance scores
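The exact payload format is not specified here; one plausible shape for a search result sent back to another agent (the keys are assumptions, not the agent's actual schema):

```python
# Hypothetical search-result payload; field names are illustrative only.
result = {
    "content": "Use OAuth2 with PKCE for public clients...",
    "metadata": {"type": "documentation", "version": "1.0"},
    "score": 0.87,          # semantic relevance, higher is more similar
    "collection": "documentation",
}

# Consumers can sort or threshold on the relevance score:
relevant = [r for r in [result] if r["score"] >= 0.5]
```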
## Usage Examples

### Standalone Usage

```bash
python3 rag_agent.py \
  --operation store \
  --content-file prompt.txt \
  --collection prompts \
  --metadata '{"type": "developer_prompt", "version": "1.0"}'
```
### Programmatic Usage

```python
from rag_agent import RAGAgent

rag = RAGAgent(persist_directory="./rag_data")

# Store artifact
rag.store_artifact(
    content=prompt_text,
    collection_name="prompts",
    metadata={"type": "developer_prompt"},
)

# Retrieve context
results = rag.query(
    query_text="How to implement authentication?",
    collection_name="documentation",
    top_k=5,
)
for doc in results:
    print(f"Relevance: {doc['score']:.2f}")
    print(f"Content: {doc['content'][:200]}...")
```
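Retrieved chunks are typically stitched into a single context string before being handed to an LLM. A sketch of that assembly step (the result dicts mirror the `score`/`content` shape above; the character budget and `build_context` helper are assumptions, not part of the agent's API):

```python
def build_context(results: list[dict], max_chars: int = 2000) -> str:
    # Concatenate retrieved chunks, highest relevance first,
    # stopping once a rough character budget is exhausted.
    parts: list[str] = []
    used = 0
    for doc in sorted(results, key=lambda d: d["score"], reverse=True):
        chunk = doc["content"].strip()
        if used + len(chunk) > max_chars:
            break
        parts.append(chunk)
        used += len(chunk)
    return "\n\n---\n\n".join(parts)

results = [
    {"score": 0.9, "content": "Use token-based auth."},
    {"score": 0.4, "content": "Hash passwords with bcrypt."},
]
prompt_context = build_context(results)
```

A budget like this keeps the assembled context inside the model's window; a production version would count tokens rather than characters.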
## Configuration

### Environment Variables

```bash
# Agent-specific configuration
ARTEMIS_RAG_AGENT_ENABLED=true
ARTEMIS_LLM_PROVIDER=openai
ARTEMIS_LLM_MODEL=gpt-4o
```
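A typical way to consume these variables from Python (the fallback defaults here are assumptions, not documented behavior):

```python
import os

def load_rag_config() -> dict:
    # Read the ARTEMIS_* variables with fallbacks.
    return {
        "enabled": os.environ.get("ARTEMIS_RAG_AGENT_ENABLED", "false").lower() == "true",
        "provider": os.environ.get("ARTEMIS_LLM_PROVIDER", "openai"),
        "model": os.environ.get("ARTEMIS_LLM_MODEL", "gpt-4o"),
    }

os.environ["ARTEMIS_RAG_AGENT_ENABLED"] = "true"
cfg = load_rag_config()
```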
### Hydra Configuration (if applicable)

```yaml
rag_agent:
  enabled: true
  llm:
    provider: openai
    model: gpt-4o
```
## Best Practices

- **Organize collections** - keep prompts, docs, and code in separate collections
- **Rich metadata** - tag artifacts for better filtering
- **Regular cleanup** - archive old or unused artifacts
- **Monitor size** - the ChromaDB store can grow large
- **Back up regularly** - the persist directory is critical
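Rich metadata pays off at query time. A pure-Python sketch of metadata-based filtering over stored artifacts (ChromaDB exposes similar filtering natively via `where` clauses on queries; `filter_by_metadata` is an illustrative helper, not the agent's API):

```python
def filter_by_metadata(artifacts: list[dict], **criteria) -> list[dict]:
    # Keep only artifacts whose metadata matches every criterion.
    return [
        a for a in artifacts
        if all(a.get("metadata", {}).get(k) == v for k, v in criteria.items())
    ]

artifacts = [
    {"content": "auth prompt", "metadata": {"type": "developer_prompt", "version": "1.0"}},
    {"content": "api docs",    "metadata": {"type": "documentation",    "version": "2.0"}},
]
prompts = filter_by_metadata(artifacts, type="developer_prompt")
```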
## Cost Considerations
Typical cost: $0.05-0.20 per operation depending on complexity
## Limitations

- Retrieval quality depends on the quality of the underlying embedding/LLM models
- Context window limits cap how much retrieved content can be used per query
- May require multiple query iterations to surface the right artifacts
---

**Version**: 1.0.0
**Maintained By**: Artemis Pipeline Team
**Last Updated**: October 24, 2025