```mermaid
graph TD
    A[Add bid_history table] --> B[Add watch_count + estimates]
    B --> C[Create market_indices]
    C --> D[Add condition + year fields]
    D --> E[Build comparable matching]
    E --> F[Enrich with auction house data]
    F --> G[Add AI image analysis]
```
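The first step in the diagram can be sketched as a concrete schema. This is a minimal sketch using SQLite; the table and column names (`lot_id`, `bid_amount`, `bid_time`, `bidder_hash`) are assumptions inferred from the diagram, not a fixed design.

```python
import sqlite3

# Hypothetical bid_history schema -- column names are assumptions,
# chosen to support velocity and sniping analysis later.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE bid_history (
    id          INTEGER PRIMARY KEY,
    lot_id      INTEGER NOT NULL,
    bid_amount  REAL    NOT NULL,
    bid_time    TEXT    NOT NULL,  -- ISO-8601 timestamp of the bid
    bidder_hash TEXT               -- anonymized bidder identifier
);
CREATE INDEX idx_bid_history_lot ON bid_history (lot_id, bid_time);
""")

conn.execute(
    "INSERT INTO bid_history (lot_id, bid_amount, bid_time) VALUES (?, ?, ?)",
    (42, 150.0, "2024-05-01T12:00:00"),
)
rows = conn.execute("SELECT lot_id, bid_amount FROM bid_history").fetchall()
print(rows)  # [(42, 150.0)]
```

The `(lot_id, bid_time)` index keeps per-lot history queries cheap, which matters once every bid update is recorded rather than one snapshot per hour.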
| Current Practice       | New Requirement                 | Why                       |
|------------------------|---------------------------------|---------------------------|
| Scrape once per hour   | **Scrape every bid update**     | Capture velocity & timing |
| Save only current bid  | **Save full bid history**       | Detect patterns & sniping |
| Ignore watchers        | **Track watch\_count**          | Predict competition       |
| Skip auction metadata  | **Capture house estimates**     | Anchor valuations         |
| No historical data     | **Store sold prices**           | Train prediction models   |
| Basic text scraping    | **Parse condition/serial/year** | Enable comparables        |
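The first two rows of the table are what make velocity and sniping analysis possible: a full, timestamped bid history instead of a single current-bid snapshot. A minimal sketch of both metrics, assuming the history arrives as time-sorted `(timestamp, amount)` tuples (the function names and the 2-minute sniping window are illustrative choices, not fixed requirements):

```python
from datetime import datetime, timedelta

def bid_velocity(bids):
    """Bids per hour over the lot's active bidding window.

    `bids` is a time-sorted list of (timestamp, amount) tuples --
    the shape that storing the full bid history enables.
    """
    if len(bids) < 2:
        return 0.0
    span_hours = (bids[-1][0] - bids[0][0]).total_seconds() / 3600
    return len(bids) / span_hours if span_hours else float("inf")

def looks_like_sniping(bids, close_time, window_minutes=2):
    """True if the final bid landed inside the closing window."""
    if not bids:
        return False
    return close_time - bids[-1][0] <= timedelta(minutes=window_minutes)

t0 = datetime(2024, 5, 1, 12, 0)
bids = [(t0, 100.0),
        (t0 + timedelta(minutes=30), 120.0),
        (t0 + timedelta(minutes=59), 140.0)]
close = t0 + timedelta(hours=1)

print(round(bid_velocity(bids), 2))     # 3.05 bids/hour
print(looks_like_sniping(bids, close))  # True -- last bid 1 min before close
```

Neither metric is computable from an hourly snapshot of the current bid alone, which is the point of the stricter scraping requirement.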
|
```bazaar
|
|
Week 1-2: Foundation
|
|
Implement bid_history scraping (most critical)
|
|
Add watch_count, starting_bid, estimated_min/max fields
|
|
Calculate basic bid_velocity
|
|
Week 3-4: Valuation
|
|
Extract year_manufactured, manufacturer, condition_description
|
|
Create market_indices (manually or via external API)
|
|
Build comparable lot matching logic
|
|
Week 5-6: Intelligence Layer
|
|
Add auction house performance tracking
|
|
Implement undervaluation detection algorithm
|
|
Create price alert system
|
|
Week 7-8: Automation
|
|
Integrate image analysis API
|
|
Add economic indicator tracking
|
|
Refine ML-based price predictions
|
|
``` |
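The Week 3-6 items (comparable lot matching feeding undervaluation detection) can be sketched together. This is a simple heuristic under stated assumptions: the field names (`manufacturer`, `condition`, `year`, `sold_price`, `current_bid`), the ±3-year match tolerance, and the 0.7 discount threshold are all illustrative, not part of the plan above.

```python
def find_comparables(lot, sold_lots, year_tolerance=3):
    """Match sold lots on manufacturer, condition, and manufacture year.

    Field names mirror the Week 3-4 extraction step; adjust to the
    real schema.
    """
    return [
        s for s in sold_lots
        if s["manufacturer"] == lot["manufacturer"]
        and s["condition"] == lot["condition"]
        and abs(s["year"] - lot["year"]) <= year_tolerance
    ]

def is_undervalued(lot, sold_lots, discount=0.7):
    """Flag a lot whose current bid is below `discount` x the median
    sold price of its comparables -- one possible detection heuristic."""
    comps = find_comparables(lot, sold_lots)
    if not comps:
        return False  # no evidence either way
    prices = sorted(c["sold_price"] for c in comps)
    median = prices[len(prices) // 2]
    return lot["current_bid"] < discount * median

# Illustrative data only.
sold = [
    {"manufacturer": "Rolex", "condition": "good", "year": 1985, "sold_price": 5000},
    {"manufacturer": "Rolex", "condition": "good", "year": 1987, "sold_price": 5400},
    {"manufacturer": "Omega", "condition": "fair", "year": 1985, "sold_price": 900},
]
lot = {"manufacturer": "Rolex", "condition": "good", "year": 1986, "current_bid": 3000}

print(is_undervalued(lot, sold))  # True: 3000 < 0.7 * 5400
```

A real version would weight comparables by recency and auction house performance (the Week 5-6 tracking step) rather than treating all sold prices equally.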