This handbook provides a comprehensive overview of all tools available to agents within the epic_news project, including native CrewAI tools, custom-built tools, and integrations from the Composio library.
Factory Pattern: Use factory functions to organize and provide tools. This allows for centralized management and easy modification.
Domain Separation: Group tools by functionality into separate files (e.g., web_tools.py, document_tools.py).
Graceful Degradation: If a tool fails to initialize (e.g., because of a missing API key), the factory should handle the failure gracefully (e.g., by returning an empty list) so the application can proceed without that tool, as in the example below.
```python
# Example: src/epic_news/tools/document_tools.py
import os

from crewai_tools import FileReadTool, FileWriteTool, PDFSearchTool


def get_document_tools():
    return [FileReadTool(), FileWriteTool(), PDFSearchTool()]


# Example: Graceful degradation
def get_github_tools():
    try:
        from crewai_tools import GithubSearchTool

        return [GithubSearchTool(gh_token=os.getenv("GITHUB_TOKEN"))]
    except Exception:  # covers ImportError, missing keys, etc.
        return []  # Return an empty list if dependencies or keys are missing
```
Asynchronous Execution: Enable async for independent, I/O-bound tasks.
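A minimal sketch of this in CrewAI, using the async_execution flag on a task; the agent and task shown here are illustrative, not taken from the epic_news codebase:

```python
from crewai import Agent, Task

# Illustrative agent; any I/O-bound research agent fits this pattern.
researcher = Agent(
    role="News Researcher",
    goal="Collect headlines from independent sources",
    backstory="Scans feeds and APIs for fresh stories.",
)

fetch_task = Task(
    description="Fetch the latest headlines from several news feeds.",
    expected_output="A list of headlines with URLs.",
    agent=researcher,
    async_execution=True,  # independent and I/O-bound, so safe to run concurrently
)
```

Tasks flagged this way run concurrently; a downstream synchronous task can consume their output through its context parameter.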
Caching: Implement caching for expensive or frequently repeated API calls.
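One simple way to do this is to wrap the expensive call in functools.lru_cache inside a custom tool. This is a sketch: the QuoteTool class, fetch_quote helper, and endpoint URL are hypothetical, and in recent CrewAI versions BaseTool is imported from crewai.tools (older releases exposed it from crewai_tools):

```python
from functools import lru_cache

import requests
from crewai.tools import BaseTool


@lru_cache(maxsize=256)
def fetch_quote(symbol: str) -> str:
    # Hypothetical endpoint; repeated symbols are served from the cache.
    return requests.get(f"https://api.example.com/quote/{symbol}", timeout=10).text


class QuoteTool(BaseTool):
    name: str = "Stock Quote Tool"
    description: str = "Returns the latest quote for a ticker symbol."

    def _run(self, symbol: str) -> str:
        return fetch_quote(symbol)
```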
Testing: Write unit tests for tool factories and integration tests for tool combinations. Mock external APIs to avoid costs and dependencies during testing.
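A minimal pytest sketch, assuming the factories shown above; the patch target and the stubbed JSON are illustrative:

```python
# tests/test_tool_factories.py
from unittest import mock

from epic_news.tools.document_tools import get_document_tools


def test_document_tools_factory_returns_all_tools():
    tools = get_document_tools()
    assert len(tools) == 3  # FileReadTool, FileWriteTool, PDFSearchTool


@mock.patch("epic_news.tools.scraper_factory.get_scraper")
def test_scraper_can_be_mocked(fake_get_scraper):
    # Stub the scraper so tests stay offline and incur no API costs.
    fake_get_scraper.return_value.run.return_value = '{"title": "stub"}'
    scraper = fake_get_scraper()
    assert scraper.run({"url": "https://example.com"}) == '{"title": "stub"}'
```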
What it does: Centralizes selection of the web scraper provider via src/epic_news/tools/scraper_factory.py::get_scraper().
Default: ScrapeNinjaTool.
Switching providers: Set WEB_SCRAPER_PROVIDER in your .env to scrapeninja (default) or firecrawl; a sketch of the selection logic follows the key list below.
Required keys:
RAPIDAPI_KEY for ScrapeNinja
FIRECRAWL_API_KEY for Firecrawl
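For reference, a minimal sketch of what get_scraper() could look like; the real implementation may differ. The ScrapeNinjaTool import path and its constructor are assumptions, while FirecrawlScrapeWebsiteTool is the Firecrawl integration shipped with crewai_tools:

```python
# Sketch of src/epic_news/tools/scraper_factory.py
import os


def get_scraper():
    """Return the scraper tool selected by WEB_SCRAPER_PROVIDER."""
    provider = os.getenv("WEB_SCRAPER_PROVIDER", "scrapeninja").lower()
    if provider == "firecrawl":
        from crewai_tools import FirecrawlScrapeWebsiteTool

        return FirecrawlScrapeWebsiteTool(api_key=os.getenv("FIRECRAWL_API_KEY"))
    # Default: ScrapeNinja via RapidAPI
    from epic_news.tools.scrapeninja_tool import ScrapeNinjaTool  # hypothetical path

    return ScrapeNinjaTool(api_key=os.getenv("RAPIDAPI_KEY"))  # constructor assumed
```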
Usage example:
```python
from epic_news.tools.scraper_factory import get_scraper

scraper = get_scraper()  # selected by WEB_SCRAPER_PROVIDER
result_json = scraper.run({"url": "https://example.com"})
```