Conversation
Pull request overview
This PR adds CLI support for running Caspion's scraper in headless mode using the --scrape flag. This enables automated scraping via cron jobs without requiring the GUI to be open.
Changes:
- Added a new scripts/scrape.js file that checks if source files have changed and rebuilds if necessary before launching the scraper (see the sketch after this list)
- Modified the main entry point to detect CLI mode and bypass single-instance checks, allowing the scraper to run alongside the GUI
- Changed the default maxConcurrency from 1 to 3 for better performance
- Added comprehensive documentation in README.md with platform-specific examples and cron job configuration
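
As an illustration of the rebuild-only-when-needed idea behind scripts/scrape.js, a build-aware launcher could compare source and build timestamps along these lines. This is a hedged sketch, not the actual script: the directory layout, the `npm run build` command, and the electron invocation are assumptions.

```ts
// Illustrative sketch only; paths, the build command, and the launch command are assumptions.
import { execSync, spawn } from 'child_process';
import { existsSync, readdirSync, statSync } from 'fs';
import { join } from 'path';

// Hypothetical locations; the real script may use different paths.
const SRC_DIR = join(__dirname, '..', 'packages', 'main', 'src');
const BUILD_OUTPUT = join(__dirname, '..', 'dist', 'main', 'index.js');

/** Newest modification time of any file under a directory (recursive). */
function newestMtime(dir: string): number {
  return readdirSync(dir, { withFileTypes: true }).reduce((newest, entry) => {
    const full = join(dir, entry.name);
    const mtime = entry.isDirectory() ? newestMtime(full) : statSync(full).mtimeMs;
    return Math.max(newest, mtime);
  }, 0);
}

// Rebuild only if the build output is missing or older than the newest source file.
const buildIsStale =
  !existsSync(BUILD_OUTPUT) || newestMtime(SRC_DIR) > statSync(BUILD_OUTPUT).mtimeMs;

if (buildIsStale) {
  execSync('npm run build', { stdio: 'inherit' }); // hypothetical build command
}

// Launch the built app in headless scraping mode.
spawn('npx', ['electron', BUILD_OUTPUT, '--scrape'], { stdio: 'inherit' });
```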
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 4 comments.
Summary per file:
| File | Description |
|---|---|
| scripts/scrape.js | New build-aware launcher script for CLI scraping in development mode |
| packages/main/src/index.ts | Added CLI mode detection and scraping flow; bypasses the single-instance lock in CLI mode (see the sketch after this table) |
| packages/main/src/backend/import/importTransactions.ts | Updated default maxConcurrency from 1 to 3 |
| package.json | Added scrape script command |
| README.md | Added comprehensive CLI mode documentation with platform-specific examples |
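
For illustration, the CLI-mode detection and single-instance bypass described above could look roughly like the following minimal sketch. It assumes the flag arrives via process.argv; the surrounding wiring (the `whenReady` flow and the placement of the lock check) is an assumption beyond what the table and the code excerpt further down show.

```ts
// Sketch of CLI-mode detection in the main entry point; details are illustrative.
import { app } from 'electron';

const isCliScrape = process.argv.includes('--scrape');

// Keep the single-instance lock only for the normal GUI flow, so a
// cron-driven CLI scrape can run even while the GUI instance is open.
if (!isCliScrape && !app.requestSingleInstanceLock()) {
  app.quit();
}

app.whenReady().then(async () => {
  if (isCliScrape) {
    // CLI flow: run the scraper headlessly (see the excerpt quoted below),
    // then exit without creating any window.
    // await scrapeAndUpdateOutputVendors(config, eventPublisher);
    app.quit();
    return;
  }
  // Normal GUI flow: create the main window, register IPC handlers, etc.
});
```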
Although I left this requirement behind since I found the daniel-hauser/moneyman project, I will review it.
```ts
    eventPublisher.onAny((eventName, eventData) => {
      console.log(`[${eventName}]`, eventData?.message ?? '');
    });
    await scrapeAndUpdateOutputVendors(config, eventPublisher);
    logAppEvent('CLI_SCRAPE_SUCCESS');
    app.quit();
  } catch (error) {
    logAppEvent('CLI_SCRAPE_FAILED', { errorMessage: (error as Error).message });
```
I'm on mobile, but what is the difference between console.log and logAppEvent?
If logAppEvent is for special lifecycle keys, why aren't those keys an enum?
logAppEvent writes structured logs to electron-log (persistent log files), while console.log here outputs scraping progress to stdout for CLI users running from terminal/cron. They serve different purposes — logAppEvent for persistent diagnostics, console for real-time CLI feedback.
Regarding enums — good point, but all existing event keys throughout the codebase (APP_READY, APP_QUIT, UPDATE_CHECK_START, etc.) are plain strings. We can create an enum for all of them in a separate PR to keep this one focused.
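
As a sketch of what such an enum might look like (the member names mirror the string keys mentioned in this thread; everything else is illustrative):

```ts
// Hypothetical enum collecting the string keys passed to logAppEvent.
// Only the keys named in this discussion are taken from the codebase;
// the overall shape is an assumption.
export enum AppEvent {
  APP_READY = 'APP_READY',
  APP_QUIT = 'APP_QUIT',
  UPDATE_CHECK_START = 'UPDATE_CHECK_START',
  CLI_SCRAPE_SUCCESS = 'CLI_SCRAPE_SUCCESS',
  CLI_SCRAPE_FAILED = 'CLI_SCRAPE_FAILED',
}

// Call sites would then read, e.g., logAppEvent(AppEvent.CLI_SCRAPE_SUCCESS)
// instead of passing a bare string.
```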
What is this file? Is it for those who really want to cron it from source, so you're trying to avoid recompiling every time and only rebuild when needed?
@baruchiro I've opened a new pull request, #725, to work on those changes. Once the pull request is ready, I'll request review from you.
Addresses code review feedback on #724 requesting removal of two inline comments that don't add value beyond what the code clearly expresses.

**Changes:**
- Removed `// Check for CLI mode` comment before `isCliScrape` variable declaration
- Removed `// CLI mode: run scraping and exit` comment at start of CLI execution block

Both comments were redundant given the self-documenting variable names and control flow.

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: baruchiro <17686879+baruchiro@users.noreply.github.com>
🎉 This PR is included in version 2.16.2 🎉

The release is available on GitHub release.

Your semantic-release bot 📦🚀
This allows running caspion from the command line for scraping.
I chose a lean approach rather than a full refactor. This is a very beneficial feature and, as you can see, it can be supported with a relatively small change.
Fixes #183