description: >
  Selenium Community Live - Episode 2
---
The second episode of Selenium Community Live took place on January 21st, 2025, with speaker **<a href="https://www.linkedin.com/in/theautomatedtester/" target="_blank">David Burns</a>**, Selenium Project Leadership member, Chair of the W3C Browser Testing and Tools Working Group, and Head of Open Source and Developer Advocacy at BrowserStack. The event was hosted by **<a href="https://www.linkedin.com/in/musepallavi/" target="_blank">Pallavi Sharma</a>**, and its topic was Browsers, Browser Engines, and why they are not the same.
You can watch the episode here: **<a href="https://www.youtube.com/watch?v=0W_rYPxVIgA" target="_blank">Selenium Community Live - Episode 2</a>**
# Browser Engines vs Real Browsers: Why They're Not Equal
The Selenium community recently hosted an enlightening session with David Burns, who shared crucial insights about browser testing that every automation engineer should understand.
## The Foundation: Web Standards Matter
David began by emphasizing the importance of web specifications, particularly the work done by the W3C Browser Testing and Tools Working Group. This group maintains three critical specifications:
- **WebDriver Classic/HTTP**: The standard WebDriver protocol we use daily
- **WebDriver BiDi**: A bidirectional protocol enabling event-driven APIs for network interception and DOM mutations
- **AT Driver**: Built on WebDriver BiDi for driving accessibility tools like screen readers
The key takeaway: standards create a level playing field, but the devil is in the details. The difference between "MUST" and "SHOULD" in a specification can create significant bugs across different browser implementations.
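To make the protocol concrete, here is a sketch (my illustration, not code from the talk) of the JSON body a WebDriver Classic client sends in its "New Session" HTTP request. The helper name and the headless flag are assumptions for the example; vendor-prefixed capabilities like `goog:chromeOptions` are exactly the kind of extension point where implementations diverge.

```python
import json

def new_session_payload(browser_name, headless=False):
    """Build a W3C WebDriver "New Session" request body.

    Per the spec, `alwaysMatch` capabilities must all be satisfied,
    while `firstMatch` entries are tried in order until one matches.
    """
    always_match = {"browserName": browser_name}
    if headless and browser_name == "chrome":
        # Vendor-prefixed capabilities are not part of the core standard,
        # which is one way "equivalent" browsers end up behaving differently.
        always_match["goog:chromeOptions"] = {"args": ["--headless=new"]}
    return {"capabilities": {"alwaysMatch": always_match, "firstMatch": [{}]}}

payload = new_session_payload("chrome", headless=True)
print(json.dumps(payload, indent=2))
```

A driver such as chromedriver would receive this body via `POST /session`; the sketch only builds the payload and does not talk to a real driver.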
## Real User Testing: Beyond Surface-Level Automation
One of David's most compelling points centered on the concept of "real user testing." When Selenium executes a click, it goes through the browser's backend, creating trusted events that the browser recognizes as legitimate user interactions. This is crucial for:
- Banking iframes
- Third-party authentication (like Okta)
- Any security-sensitive operations
Tools that execute events through the document (frontend) create synthetic events marked as `isTrusted: false`, which security-conscious applications will reject.
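The distinction can be sketched with a toy model (illustrative Python, not browser internals): backend-driven input produces trusted events, document-level dispatch produces synthetic ones, and a security-conscious handler accepts only the former.

```python
from dataclasses import dataclass

@dataclass
class ClickEvent:
    target: str
    is_trusted: bool  # browsers expose this as the read-only `isTrusted` flag

def browser_backend_click(target):
    # What WebDriver does: input is injected at the browser backend,
    # so the resulting event looks like a genuine user interaction.
    return ClickEvent(target, is_trusted=True)

def document_dispatch_click(target):
    # What document-level dispatch (e.g. dispatchEvent in JS) produces:
    # a synthetic event the page can detect and reject.
    return ClickEvent(target, is_trusted=False)

def security_sensitive_handler(event):
    # e.g. a banking iframe or a third-party auth widget
    return "accepted" if event.is_trusted else "rejected"

print(security_sensitive_handler(browser_backend_click("#pay")))    # accepted
print(security_sensitive_handler(document_dispatch_click("#pay")))  # rejected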
## The Headless vs Headful Reality Check
David revealed a startling discovery made by Mathias Bynens (Engineering Manager at Google): for years, Chrome's headless mode wasn't actually using the same rendering engine (Blink) as regular Chrome. It was essentially a different browser altogether.
This revelation led to the creation of **Chrome for Testing**, providing a stable, consistent testing environment that actually matches what users experience.
> "Headless and headful is not necessarily the same... it is literally apples to oranges."
## Browser Engines vs Real Browsers: The Critical Difference
Using Chromium instead of actual browsers like Chrome, Edge, Brave, or Opera might seem equivalent, but David highlighted crucial differences:
### Third-Party Cookie Handling
Different browsers handle cookies differently. Brave's privacy-focused approach differs significantly from Chrome's implementation, affecting:
- Session management
- Login/logout flows
- Cross-site functionality
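One concrete axis of divergence is how a cookie's `SameSite` attribute interacts with cross-site requests. The sketch below encodes the baseline rules major browsers share, plus a flag approximating a privacy-focused browser that blocks third-party cookies outright; the function and flag are illustrative simplifications, not any browser's actual engine logic.

```python
def third_party_cookie_sent(same_site, secure, blocks_third_party=False):
    """Would this cookie accompany a cross-site subresource request?

    Baseline rules:
      - SameSite=Strict / Lax cookies are withheld from cross-site subrequests.
      - SameSite=None is only honored when the cookie is also Secure.
    `blocks_third_party` approximates a privacy-focused browser (Brave-style)
    that refuses third-party cookies regardless of attributes.
    """
    if blocks_third_party:
        return False
    if same_site in ("Strict", "Lax"):
        return False
    return same_site == "None" and secure

# The same cookie behaves differently depending on the browser policy:
print(third_party_cookie_sent("None", secure=True))                            # True
print(third_party_cookie_sent("None", secure=True, blocks_third_party=True))   # False
print(third_party_cookie_sent("Lax", secure=True))                             # False
```

A login flow that passes against a default Chromium build can therefore still fail in Brave, purely because of cookie policy.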
### Real-World Example: Safari's IndexedDB Bug
A particularly illustrative case was Safari's IndexedDB bug that affected desktop Safari but not:
- WebKit (the engine)
- iOS Safari
- Safari Tech Preview
Testing with WebKit alone would have missed this critical bug that could break login functionality for real users.
## Mobile vs Desktop: More Than Just Screen Size
Simply resizing a desktop browser to mobile dimensions doesn't replicate mobile browsing:
### Operating System Differences
- Mobile and desktop use different operating systems
- Display rendering works differently
- Resource constraints affect performance
### Device Pixel Ratio Issues
Mobile devices have different pixel density requirements that can't be accurately simulated by browser resizing, leading to rendering inconsistencies in graphics-intensive applications.
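The gap is easy to quantify: a page is rasterized at roughly CSS pixels times `devicePixelRatio` in each dimension, so a desktop window resized to a phone's CSS viewport at DPR 1 exercises far less rendering work than a real phone at DPR 3. The numbers below are illustrative.

```python
def physical_pixels(css_width, css_height, dpr):
    # Browsers rasterize at (css_size * devicePixelRatio) physical pixels
    # in each dimension.
    return int(css_width * dpr) * int(css_height * dpr)

# Same 390x844 CSS viewport, very different rendering workloads:
desktop_resized = physical_pixels(390, 844, dpr=1.0)  # desktop window shrunk to "mobile" size
real_phone = physical_pixels(390, 844, dpr=3.0)       # a 3x high-density phone screen

print(desktop_resized, real_phone)  # 329160 2962440
```

At DPR 3 the device pushes nine times the physical pixels, which is why canvas- and graphics-heavy rendering bugs can appear only on real devices.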
## Risk Management: Making Informed Decisions
David's presentation wasn't about mandating specific tools but about understanding trade-offs:
### Low Risk Scenarios
- Simple web forms
- Basic functionality testing
- Limited third-party integrations
### High Risk Scenarios
- Canvas/graphics-heavy applications
- Complex authentication flows
- Mobile-specific interactions
- Security-sensitive operations
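One way to operationalize these lists is a small decision helper. The trait names and function below are an illustrative sketch of the risk-based reasoning, not a prescribed Selenium API.

```python
# High-risk traits drawn from the scenarios above
HIGH_RISK_TRAITS = {
    "canvas_heavy",        # graphics-intensive rendering
    "complex_auth",        # third-party auth flows (e.g. Okta)
    "mobile_specific",     # touch and viewport behavior
    "security_sensitive",  # trusted-event checks, banking iframes
}

def recommended_targets(app_traits):
    """Suggest where to run tests: real browsers (and devices) when any
    high-risk trait is present; otherwise an engine build may suffice."""
    if HIGH_RISK_TRAITS & set(app_traits):
        return ["real browsers", "real devices"]
    return ["browser engine (e.g. Chromium / Chrome for Testing)"]

print(recommended_targets({"simple_forms"}))
print(recommended_targets({"complex_auth", "simple_forms"}))
```

The point is not the code but the habit: enumerate your application's risky traits first, then pick the test environment, rather than defaulting to whatever is most convenient.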
## Practical Recommendations
1. **Understand Your User Base**: Test where your users actually are
2. **Know Your Risk Profile**: Complex applications require more realistic testing environments
3. **Choose Tools Wisely**: Understand what your testing framework actually provides

David also shared insights about Selenium's future direction:
- Continued focus on WebDriver BiDi implementation
- More "batteries included" features like Selenium Manager
- Enhanced APIs for network interception and advanced automation
The Selenium team remains committed to conservative changes, prioritizing stability while adding powerful new capabilities.
## Conclusion
David's presentation reminds us that effective testing requires understanding the nuances of web browsers and making informed decisions about our testing strategies. While convenience tools have their place, knowing when and how to test with real browsers can be the difference between catching critical bugs and shipping broken experiences to users.
The key message is clear: there's no one-size-fits-all solution, but with proper knowledge of the risks and differences between testing approaches, teams can make intelligent choices that balance practicality with coverage.
---
## Watch the Recording
Couldn't join us live? Watch the entire episode here: **<a href="https://www.youtube.com/watch?v=0W_rYPxVIgA" target="_blank">Selenium Community Live - Episode 2</a>**