[Bug]: enable_stealth=True is a silent no-op — StealthAdapter imports symbols that don't exist in playwright-stealth 2.x #1959

@NaabZer

Description

crawl4ai version

0.8.6 (also reproducible on main @ 1debe5f)

Expected Behavior

BrowserConfig(enable_stealth=True) should apply playwright-stealth evasions to crawled pages — masking navigator.webdriver, populating navigator.plugins, etc. — as documented in docs/md_v2/advanced/advanced-features.md and docs/examples/stealth_mode_example.py.

Current Behavior

No stealth shim is ever applied, even with enable_stealth=True, and no exception is thrown. navigator.webdriver stays true, navigator.plugins.length stays 0, and the resulting fingerprint is indistinguishable from a vanilla headless Chromium.

Root cause: StealthAdapter._check_stealth_availability() in crawl4ai/browser_adapter.py:158-171 imports stealth_async / stealth_sync from playwright_stealth. These symbols existed in tf-playwright-stealth 1.x but do not exist in playwright-stealth 2.x (the currently declared dependency, set by #1714). Both ImportErrors are swallowed, so _stealth_function stays None and apply_stealth(page) short-circuits silently. playwright-stealth 2.0.3 exposes Stealth (with apply_stealth_async(page_or_context)), but the adapter never reaches for it.
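For reference, a version-tolerant lookup along these lines would restore the shim. This is a sketch only: resolve_stealth_function is a hypothetical name, not the actual crawl4ai patch; the imported symbols (Stealth in 2.x, stealth_async in 1.x) are the real ones.

```python
# Sketch of a version-tolerant replacement for the import logic in
# StealthAdapter._check_stealth_availability(). Function name and
# structure are illustrative; only the imported symbols are real.
def resolve_stealth_function():
    """Return an async callable(page) that applies stealth, or None."""
    try:
        # playwright-stealth 2.x: class-based API,
        # Stealth().apply_stealth_async(page_or_context)
        from playwright_stealth import Stealth
        return Stealth().apply_stealth_async
    except ImportError:
        pass
    try:
        # tf-playwright-stealth 1.x: function-based API
        # (what the adapter currently expects)
        from playwright_stealth import stealth_async
        return stealth_async
    except ImportError:
        # Neither API generation is installed
        return None
```

With this shape, the adapter could also log a warning instead of silently setting _stealth_function = None when both imports fail.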

Is this reproducible?

Yes

Inputs Causing the Bug

- URL(s): https://bot.sannysoft.com/ (or any fingerprinting test page)
- Settings: BrowserConfig(enable_stealth=True, headless=True)
- Versions: crawl4ai==0.8.6, playwright-stealth==2.0.3

Steps to Reproduce

1. pip install crawl4ai==0.8.6  # pulls playwright-stealth>=2.0.0
2. Confirm the missing symbols:
   python -c "from playwright_stealth import stealth_async"
   # ImportError: cannot import name 'stealth_async' from 'playwright_stealth'
3. Run the snippet below.
4. Inspect the result: navigator.webdriver is still true and
   navigator.plugins.length is still 0, so no stealth was applied.
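As a complement to step 2, a stdlib-only probe can report which playwright-stealth API generation (if any) is importable. The helper name stealth_api_generation is mine, not part of crawl4ai; the attribute checks mirror the symbols discussed above.

```python
# Hypothetical triage helper: report which playwright-stealth API
# generation is importable in the current environment.
import importlib.util


def stealth_api_generation():
    """Return '2.x', '1.x', or None for the installed playwright-stealth."""
    if importlib.util.find_spec("playwright_stealth") is None:
        return None  # package not installed at all
    import playwright_stealth
    if hasattr(playwright_stealth, "Stealth"):
        return "2.x"  # class-based API: Stealth().apply_stealth_async(page)
    if hasattr(playwright_stealth, "stealth_async"):
        return "1.x"  # function-based API crawl4ai currently imports
    return None


print(stealth_api_generation())
```

On the versions listed above (playwright-stealth==2.0.3) this should report "2.x", which is exactly the generation the adapter never tries.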

Code snippets

import asyncio
from crawl4ai import AsyncWebCrawler, BrowserConfig, CrawlerRunConfig

async def main():
    # Per the docs, this should apply playwright-stealth evasions to every page
    browser_config = BrowserConfig(enable_stealth=True, headless=True)
    async with AsyncWebCrawler(config=browser_config) as crawler:
        result = await crawler.arun(
            url="https://bot.sannysoft.com/",
            config=CrawlerRunConfig(screenshot=True),
        )
        # The fingerprint table in result.html still shows webdriver=true
        print(result.html[:2000])

asyncio.run(main())

Supporting Information

OS: Linux (Ubuntu 24.04)
Python version: 3.12
Browser: Chromium (bundled by Playwright)
Browser version: N/A

Error logs & Screenshots (if applicable)

No exception is raised — the failure is silent. The only signal is the unmasked navigator.webdriver / plugin-count rows on bot.sannysoft.com.
