Marketplace Assortment Tracking
Marketplace assortments change quickly and usually without announcement. New sellers appear, products go out of stock, prices shift multiple times a day, and review counts climb at different rates across competitors. ScrapingLab automates the collection of these signals so your ecommerce team can act on real-time market data instead of quarterly reports.
Why assortment tracking matters
For brands selling on marketplaces like Amazon, Walmart, or niche vertical platforms, the competitive landscape is defined by three things: which products are available, at what price, and how they are perceived by buyers. Missing any of these signals means reacting to market changes instead of anticipating them.
Stock visibility drives buy box share. On Amazon, the buy box accounts for the vast majority of sales. When a competitor goes out of stock, your product gets more visibility. When you go out of stock, competitors capture your share. Knowing stock status across your category in real time lets you adjust ad spend, pricing, and inventory planning accordingly.
Price volatility affects margins. Marketplace sellers frequently adjust prices using repricing tools. A competitor dropping price by 5% can trigger a race to the bottom if you are not monitoring. Conversely, knowing that a competitor raised prices gives you room to maintain margins without losing share.
Review velocity signals product-market fit. A new competitor product gaining 50 reviews per week is a threat worth watching. A product with declining review velocity may be losing momentum. These signals inform product development, marketing investment, and category strategy decisions.
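Both of these signals fall out of a simple comparison between consecutive daily snapshots. Here is a minimal sketch; the record shape and both thresholds are illustrative assumptions, not a ScrapingLab export format:

```python
# Sketch: flag price moves and review velocity between two daily snapshots.
# The {product_id: {...}} shape and both thresholds are illustrative.
def diff_snapshots(yesterday: dict, today: dict, price_alert_pct: float = 5.0):
    """Compare {pid: {"price": float, "review_count": int}} maps from two runs."""
    alerts = []
    for pid, now in today.items():
        prev = yesterday.get(pid)
        if prev is None:
            alerts.append((pid, "new listing"))
            continue
        pct = (now["price"] - prev["price"]) / prev["price"] * 100
        if abs(pct) >= price_alert_pct:
            alerts.append((pid, f"price moved {pct:+.1f}%"))
        per_week = (now["review_count"] - prev["review_count"]) * 7
        if per_week >= 50:  # the ~50 reviews/week threshold mentioned above
            alerts.append((pid, f"review velocity ~{per_week}/week"))
    return alerts
```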
How ScrapingLab automates assortment tracking
Step 1: Define your category scope
Start by identifying the marketplace categories, search terms, and product pages you want to monitor. Common scoping approaches include:
- Category pages — Monitor the top 50-100 results in your primary category
- Search result pages — Track rankings for your most important keywords
- Competitor brand pages — Watch specific seller storefronts or brand pages
- Your own listings — Ensure your products display correctly and competitively
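Before building any workflows, it can help to write the scope down as plain data so the team agrees on what is being watched. A sketch, where every URL, keyword, and SKU is a placeholder:

```python
# Sketch: a monitoring scope written down as plain data before it becomes
# workflows. Every URL, keyword, and SKU here is a placeholder.
MONITORING_SCOPE = [
    {"type": "category", "url": "https://marketplace.example/c/espresso-machines", "depth": 100},
    {"type": "search", "keyword": "burr coffee grinder", "depth": 50},
    {"type": "seller", "url": "https://marketplace.example/s/rival-brand"},
    {"type": "own", "skus": ["SKU-1001", "SKU-1002"]},
]
```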
Step 2: Build extraction workflows
Create ScrapingLab workflows for each monitoring dimension. The visual builder handles JavaScript-heavy marketplace pages with dynamic loading, lazy images, and infinite scroll patterns.
Category and search monitoring workflow:
| Data point | What it captures | Strategic value |
|---|---|---|
| Product title | SKU identification | Assortment mapping |
| Price | Current list price | Pricing intelligence |
| Rating | Star rating (e.g., 4.3/5) | Quality perception |
| Review count | Total reviews | Social proof strength |
| Seller name | Who holds the listing | Competitive landscape |
| Stock status | In Stock / Out of Stock | Availability monitoring |
| Position | Rank in search or category | Visibility tracking |
| Sponsored flag | Whether the listing is an ad | Ad spend intelligence |
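Seen from the output side, each run produces one row per listing with these fields. A typed sketch of a single row; the field names are assumptions to align with your actual export:

```python
# Sketch: one extracted row, typed to match the table above.
# Field names are assumptions; align them with your actual export.
from dataclasses import dataclass

@dataclass
class ListingSnapshot:
    captured_at: str      # ISO date of the run
    product_title: str    # SKU identification
    price: float          # current list price
    rating: float         # e.g., 4.3
    review_count: int     # total reviews
    seller_name: str      # who holds the listing
    in_stock: bool        # availability
    position: int         # rank in search or category results
    sponsored: bool       # whether the listing is an ad
```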
Product detail monitoring workflow:
For your top competitors and your own listings, create deeper extraction workflows that capture:
- Full pricing including discounts, coupons, and subscribe-and-save pricing
- Bullet points and description text (changes indicate repositioning; a fingerprint sketch follows this list)
- Variant availability (sizes, colors, configurations)
- Seller count (number of sellers offering the same product)
- Best Seller Rank in relevant categories
- “Frequently bought together” and “Customers also viewed” relationships
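The copy-change detection mentioned above can be as simple as hashing the bullets and description on each run and comparing fingerprints. A sketch (this helper is hypothetical, not part of ScrapingLab):

```python
# Sketch: detect copy changes by hashing bullets + description each run.
# A changed fingerprint is a cheap repositioning signal; diff the stored
# raw text afterwards to see what actually changed.
import hashlib

def copy_fingerprint(bullets: list[str], description: str) -> str:
    payload = "\n".join(bullets) + "\n---\n" + description
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```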
Step 3: Handle marketplace-specific challenges
Marketplaces are among the more technically challenging targets for scraping because they actively defend against automated access. ScrapingLab addresses these challenges with built-in capabilities:
Proxy rotation. ScrapingLab automatically rotates through residential and datacenter proxies to avoid IP-based blocking. For marketplace monitoring, residential proxies provide the most reliable access because they mimic real shopper traffic.
CAPTCHA handling. When a marketplace presents a CAPTCHA challenge, ScrapingLab’s built-in CAPTCHA solving resolves it automatically without interrupting your workflow.
Dynamic content rendering. Marketplace pages load product data dynamically through JavaScript. ScrapingLab renders pages in a real browser, waiting for all content to load before extracting data, so you capture what actual shoppers see.
Rate management. Overly aggressive scraping triggers bot detection. ScrapingLab lets you configure delays between requests and spread runs across time windows to maintain reliable access.
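The idea behind rate management is worth seeing outside the product: randomized, human-ish pacing between requests rather than a fixed interval. A minimal sketch, where the 4-12 second window is an arbitrary example and not a recommended setting:

```python
# Sketch: the pacing idea behind rate management, independent of any
# product. The 4-12 s window is an arbitrary example, not a recommendation.
import random
import time

def polite_pause(min_s: float = 4.0, max_s: float = 12.0) -> None:
    """Sleep a randomized, human-ish interval between requests."""
    time.sleep(random.uniform(min_s, max_s))

def crawl(urls, fetch):
    """Fetch each URL with jittered delays; `fetch` is whatever client you use."""
    for url in urls:
        fetch(url)
        polite_pause()
```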
Step 4: Schedule and analyze
Set monitoring frequency based on how dynamic your category is:
- High-velocity categories (electronics, fashion, supplements) — Daily or twice daily
- Moderate categories (home goods, tools, pet supplies) — Every 2-3 days
- Stable categories (industrial supplies, specialty equipment) — Weekly
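If your scheduler takes standard five-field cron syntax, these cadences translate roughly as follows. Note that "every 2-3 days" is approximated with a day-of-month step, which resets at month boundaries:

```python
# Sketch: the cadences above as five-field cron lines. "Every 2-3 days"
# is approximated with a day-of-month step, which resets each month.
SCHEDULES = {
    "electronics": "0 6,18 * * *",       # twice daily, 06:00 and 18:00
    "home_goods": "0 6 */3 * *",         # roughly every third day
    "industrial_supplies": "0 6 * * 1",  # weekly, Monday morning
}
```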
Export data to your analytics stack via CSV, JSON, or webhook. Over time, this data builds into a competitive intelligence database that reveals seasonal pricing patterns, new entrant traction, assortment gaps, and review trajectory trends.
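A minimal way to start that database is to fold each day's CSV export into a local SQLite table. The column names below mirror the earlier table and are assumptions about the export format, not a documented schema:

```python
# Sketch: fold each day's CSV export into a local SQLite table.
# Column names mirror the earlier table; the "true"/"false" stock
# encoding is an assumption about the export, not a documented format.
import csv
import sqlite3

def ingest_csv(db_path: str, csv_path: str, run_date: str) -> None:
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS listings (
        run_date TEXT, product_title TEXT, price REAL, rating REAL,
        review_count INTEGER, seller_name TEXT, in_stock INTEGER,
        position INTEGER)""")
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            conn.execute(
                "INSERT INTO listings VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
                (run_date, row["product_title"], float(row["price"]),
                 float(row["rating"]), int(row["review_count"]),
                 row["seller_name"], int(row["in_stock"] == "true"),
                 int(row["position"])))
    conn.commit()
    conn.close()
```

Once a few weeks of runs have accumulated, the same table answers the longer-horizon questions: seasonal pricing patterns, new-entrant traction, and review trajectories.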
Real outcomes for ecommerce teams
Smarter repricing decisions. Instead of blindly matching competitor prices, teams use assortment data to identify when competitors are out of stock (no need to lower price) or when a new entrant is underpricing to gain reviews (temporary threat, do not overreact).
Better inventory planning. When data shows competitors frequently going out of stock in a category, that signals supply chain constraints you can exploit by maintaining deeper inventory and capturing their displaced demand.
Informed product launches. Before launching a new product, assortment data reveals the exact competitive landscape: how many sellers, at what price points, with what review counts. This data shapes launch pricing, advertising budget, and positioning decisions.
Category expansion intelligence. When evaluating a new marketplace category, historical assortment data shows category size, price distribution, competitive density, and growth trajectory — all from public data that is available before you invest in inventory.
Getting started
- Pick your primary marketplace category (start with one)
- Create a ScrapingLab workflow targeting category or search result pages
- Extract product title, price, rating, review count, seller, and stock status
- Add a second workflow for 10-20 key competitor product detail pages
- Schedule daily runs and export to a spreadsheet or data warehouse
- Build a simple dashboard that highlights day-over-day changes
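The day-over-day comparison behind that dashboard can start as a single query plus a pivot. This sketch reads from the SQLite table assumed in the ingest example earlier:

```python
# Sketch: day-over-day deltas from the SQLite table assumed earlier;
# the output is the raw material for a simple change dashboard.
import sqlite3
import pandas as pd

def day_over_day(db_path: str, today: str, yesterday: str) -> pd.DataFrame:
    conn = sqlite3.connect(db_path)
    df = pd.read_sql("SELECT * FROM listings WHERE run_date IN (?, ?)",
                     conn, params=(today, yesterday))
    conn.close()
    wide = df.pivot_table(index="product_title", columns="run_date",
                          values=["price", "review_count"], aggfunc="first")
    wide[("price", "delta")] = wide[("price", today)] - wide[("price", yesterday)]
    wide[("review_count", "delta")] = (wide[("review_count", today)]
                                       - wide[("review_count", yesterday)])
    return wide.sort_values(("price", "delta"))
```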
Most teams have their first marketplace monitoring workflow producing data within 45 minutes. Start with broad category monitoring and add detail-level tracking for the products that matter most to your strategy.