Development update - Saturday, April 19th, 2025
LLM Configuration & In-App Documentation
The refactoring mentioned in the demo video is now complete.
Setting up the LLM
The new Settings page allows you to select the preferred LLM model for both:
The manager (orchestrator of the workflow)
The agent store (which contains all agents associated with that workflow)
In this architecture, agents dedicated to a given workflow are grouped within a single file for clarity and reusability.
All LLM-related logic has been centralized into a single file, providing a clean separation of concerns. This structure was designed with serialization in mind — enabling the system to adapt to new tasks by drawing analogies from existing, well-functioning workflows. This kind of adaptation can be efficiently performed via LLM pair-coding.
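To illustrate the idea, here is a minimal sketch of what a centralized LLM configuration module could look like. The file name, model identifiers, and field names are assumptions for illustration only, not the project's actual code.

```python
# llm_config.py -- hypothetical sketch of a centralized LLM configuration module
from dataclasses import dataclass, asdict
import json


@dataclass
class LLMSettings:
    """Models selected on the Settings page for one workflow (names are placeholders)."""
    manager_model: str = "gpt-4o"       # orchestrates the workflow
    agent_model: str = "gpt-4o-mini"    # shared by every agent in the workflow's agent store

    def to_json(self) -> str:
        """Serialize the configuration so it can be persisted or reused for a new workflow."""
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, raw: str) -> "LLMSettings":
        """Rebuild a configuration from its serialized form."""
        return cls(**json.loads(raw))


def build_models(settings: LLMSettings):
    """Placeholder factory: in a real app this would instantiate the concrete
    LLM clients used by the manager and by the agent store."""
    return settings.manager_model, settings.agent_model


if __name__ == "__main__":
    settings = LLMSettings(manager_model="gpt-4o", agent_model="gpt-4o-mini")
    print(settings.to_json())
```

Keeping the configuration serializable like this is what makes it straightforward to copy a working workflow's setup and adapt it to a new task.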
Here’s a snapshot of the current Settings page:
Documentation Page
The Documentation page now summarizes the technical specification of the entire application. It provides a clear, structured overview of all four workflows, detailing:
Each flow’s purpose
Its operational steps
Expected inputs & outputs
This page acts as both a technical reference and a user-facing guide.
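As a rough illustration of the kind of structure that could drive such a page, here is a hypothetical workflow specification record. The type name, fields, and the example workflow are assumptions for illustration, not the application's actual schema.

```python
# workflow_spec.py -- hypothetical sketch of the data behind the Documentation page
from dataclasses import dataclass, field
from typing import List


@dataclass
class WorkflowSpec:
    """One Documentation page entry: a workflow's purpose, steps, and expected I/O."""
    name: str
    purpose: str
    steps: List[str] = field(default_factory=list)
    inputs: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)

    def render(self) -> str:
        """Produce the plain-text summary shown on the Documentation page."""
        lines = [f"{self.name}: {self.purpose}"]
        lines += [f"  step {i + 1}. {s}" for i, s in enumerate(self.steps)]
        lines.append(f"  inputs:  {', '.join(self.inputs)}")
        lines.append(f"  outputs: {', '.join(self.outputs)}")
        return "\n".join(lines)


# Example (hypothetical workflow, for illustration only):
research = WorkflowSpec(
    name="Research",
    purpose="Gather and summarize information for a user query",
    steps=["plan the query", "run the agents", "aggregate results"],
    inputs=["user query"],
    outputs=["summary", "insights"],
)
print(research.render())
```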
Next steps
Notion integration is currently underway, allowing direct publishing of summaries and insights.
QuantConnect LEAN integration will follow for backtesting and deployment.
Authentication and user management will be handled via Firebase for scalable, secure access.
A CI/CD pipeline is on the roadmap to streamline testing, deployment, and versioning.
Known Bugs & Limitations
Logging not visible in the frontend — though fully functional in the backend console.
Basic quote chart rendering — advanced chart libraries introduce unwanted vulnerabilities and have been temporarily excluded.
Minor rendering issues in the Chat with Fundamentals interface.
The research flow breaks on single-word queries (fix planned).
Development is active and ongoing. An open-source release is expected in June 2025.
S.M.L