Episodes

  • Mastering Automatability for Test Automation
    2025/12/12

    The answers given during a BrowserStack Community AMA session held on Discord on the 11th of December 2025, following a live LinkedIn video stream. The session focused on "Mastering Automatability for Test Automation". The main theme is the concept of Automatability, which I view as the ability to automate: a personal skill that is more critical than reliance on specific tools. The discussion covers various topics, including how to separate automation problems from application design issues, dealing with slow UIs and automation-unfriendly third-party widgets, evaluating automation readiness, and addressing common architectural failings related to large-scale UI automation.


    00:00:00 Introduction

    00:01:27 key early lesson about automatability?

    00:01:56 separating automation issues vs. design issues?

    00:03:49 is slow UI a testability or automatability problem?

    00:06:50 handling non-automatable third-party widgets?

    00:09:20 assessing automation readiness - any framework?

    00:11:23 common architectural patterns that break at scale?

    00:13:37 prioritizing testability vs. automation in sprints?

    00:16:51 do modern tools reduce the need for good design?

    00:19:32 explaining automatability as an investment?

    00:21:44 how do AI agents handle dynamic/third-party elements?

    00:23:17 early signs a feature will be flaky when automated?

    00:26:10 which microservice layers to automate first?

    00:29:16 high-ROI automatability fixes for small budgets?

    00:30:55 early dev–test collaboration to prevent rework?

    00:34:08 thinking about automatability in continuous delivery?

    Join the BrowserStack Discord community and discover more AMA sessions https://www.browserstack.com/community

    42 min
  • Test Code Migration not Test Cases
    2025/10/07

    Should you use AI to help you migrate test automation code? And what should you actually migrate, given that the test coverage hasn't changed? In this episode we discuss how abstractions and AI can be used to migrate... and when you shouldn't.

    Welcome to The Evil Tester Show! In this episode, host Alan Richardson dives into the complex world of test automation migrations. Have you ever wondered what it really takes to move your automated test execution code from one tool or language to another—like switching from WebDriver to Playwright, or migrating from Java to TypeScript? Alan breaks down the pitfalls, challenges, and best practices you need to consider before taking the leap. He explains why migrating isn't just about copying test cases, how abstraction layers can save you time and headaches, and why using AI and solid design principles can streamline your transition. Whether you're facing unsupported tools, evolving frameworks, or strategic changes in your testing approach, this episode offers practical advice to plan and execute a seamless migration, without carrying old problems into your new stack.

    00:00 Migration Challenges

    02:43 Tool Evaluation

    04:05 Migrating to Playwright: Considerations

    06:00 Migration Process

    06:25 Migrate: Easy First, Hardest Next

    09:37 Effective Migration Strategies for Tests

    10:23 Focusing Abstractions

    14:39 Optimize Test Code Migration

    15:44 Focus on Abstraction, Not Auto-Healing

    **1. Why Migrate—And When You Really Shouldn’t** Before any big move, Alan urges teams to get their “why” straight. Is your current tool unsupported? Is your framework truly incompatible, or are you missing some hidden potential? Migrate for the right reasons and make sure your decision isn’t just papering over problems that could follow you to the next tool.


    **2. Don’t Confuse Migration with a Rewrite** Too many teams treat migration like a rewrite—often with disastrous results. Alan emphasizes the importance of planning ahead, solving existing flakiness and coverage issues _before_ you move, and carefully evaluating all options (not just the shiny new tool you think you want).


    **3. The Secret Weapon: Abstraction Layers** The podcast's biggest takeaway: Don't migrate "test cases"—migrate _abstractions_. If your tests are full of direct calls like `webdriver.openPage()`, you've got work to do. Build out robust abstraction layers (think page objects or logical user flows) and keep your tests clean. When it comes time to migrate, you'll only need to move those underlying layers, not thousands of individual test case scripts. (A minimal sketch of this idea follows the takeaways below.)


    **4. Taming Flakiness and the Risks of Retries** Migration is not the time to rely on self-healing tests or retries. Any test flakiness _must_ be rooted out and fixed before porting code. Bringing instability into a new stack only multiplies headaches later.


    **5. Harnessing AI—But Stay in Control** AI-assisted migration really shines at mapping old code to new, but Alan warns against “agentic” (hands-off) approaches. Use AI as a powerful tool, not as the driver—you need understanding and control to ensure things work reliably in CI/CD pipelines.


    **6. Learn Fast: Tackle the Hardest Stuff Early** Pro tip: Once you’re ready, start your migration with the simplest test, just to get going—then dive into the hardest, flakiest, most complex workflows. You’ll uncover potential blockers early and kick-start team learning.


    “We’re not migrating test cases when we change a tool. We’re migrating the physical interaction layer with our application... ”
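
    To make takeaway 3 concrete, here is a minimal TypeScript sketch of the abstraction-layer idea. It is not code from the episode, and all names (`BrowserDriver`, `LoginPage`, `loginTest`) are illustrative assumptions: the test talks only to a page-object style abstraction, so a WebDriver-to-Playwright migration would only touch the driver-facing implementation.

    ```typescript
    // Hypothetical names throughout; this is a sketch of the abstraction-layer
    // idea, not the episode's code or any specific framework's API.

    // A thin interface over the "physical interaction layer" with the application.
    interface BrowserDriver {
      goto(url: string): Promise<void>;
      fill(selector: string, value: string): Promise<void>;
      click(selector: string): Promise<void>;
      textOf(selector: string): Promise<string>;
    }

    // A page object / logical user flow built only on the abstraction.
    class LoginPage {
      constructor(private driver: BrowserDriver) {}

      async login(user: string, password: string): Promise<void> {
        await this.driver.goto("https://example.com/login");
        await this.driver.fill("#username", user);
        await this.driver.fill("#password", password);
        await this.driver.click("#submit");
      }

      welcomeMessage(): Promise<string> {
        return this.driver.textOf("#welcome");
      }
    }

    // The test never mentions WebDriver or Playwright, so changing tools
    // means porting a BrowserDriver implementation, not this test.
    async function loginTest(driver: BrowserDriver): Promise<void> {
      const page = new LoginPage(driver);
      await page.login("alan", "secret");
      const message = await page.welcomeMessage();
      if (!message.includes("Welcome")) {
        throw new Error(`Unexpected welcome message: ${message}`);
      }
    }
    ```

    A Playwright-backed implementation of `BrowserDriver` would wrap calls such as `page.goto`, `page.fill`, and `page.click`; the `LoginPage` flows and the tests built on them would stay as they are.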

    17 min
  • Building a Job-Hunting Portfolio for Software Development and Testing
    2025/09/18

    Should you have an online portfolio showcasing your Software Development and Testing skills to help get a job?

    It really depends on the recruitment process. But... if I'm recruiting and you have a profile, then I will have looked at it. So it better be good.

    Most Software Developers and Testers don't have public portfolios, so you can really stand out.

    We'll cover a breakdown of the different project types: Learning Projects, Personal Projects, and Portfolio Projects.

    Lots of tips on how to adjust your GitHub profile and promote your projects.

    00:00 Value of Portfolio

    02:59 Stand Out Skills

    09:19 Project Types

    12:27 Showcase Projects

    19:39 Promoting Yourself

    21:44 Final Advice

    23 min
  • Respect in Software Testing and Development
    2025/09/06

    Software Testing deserves respect. Doesn't it? But so does every role in Software Development: managers, testers, QA, programmers, Product, everyone. This is for you.

    Ever feel like you’re not getting the respect that you deserve in your job? This episode dives deep into the topic of Respect in tech, especially focusing on software testing versus programming.

    We look at why some roles seem to earn more respect, what that means for workplace culture, and how you can change things for yourself and your team. Respect isn’t just about manners or titles - it’s about how the system works and how we show up in our roles.

    If you've worked in agile projects, you might have heard, "Everyone is a developer." But some roles seem to get more recognition than others. Is this because of how we define our jobs, or is it just baked into the way our workplaces run? This episode is a call to action, urging everyone to look at respect at a personal, process, and craft level.

    We’re breaking down the difference between self-respect, respect for others, and respect built into your team’s process. You'll see why just doing your job isn’t enough. You have to own your craft, communicate what you do, and make your contributions visible to earn genuine respect. By the end of this episode, you'll have practical steps to make respect part of your daily work, whether you’re writing code, testing, building products, or managing.


    00:00 Respect Dilemma

    02:41 Human Level Respect

    06:31 Self-Respect First

    10:17 Respect Cycle

    15:37 Knowledge Sharing

    18:53 Respectful Organizations

    21:26 Final Thoughts

    22 min
  • Software Testing Strategy vs Planning The Strategy Episode
    2025/08/07

    Software Testing typically confuses a Test Strategy Document with the process of strategising. Alan Richardson simplifies the over-complicated world of test strategy. Drawing on years of experience creating test strategies and plans, Alan explains the real difference between strategy, approach, and plan, arguing that what really matters isn't following templates or writing elaborate documents, but actually thinking through problems, understanding risks, and communicating those ideas clearly.

    30 min
  • Software Testing Job Market with Jack Cole
    2025/06/28

    Are you trying to figure out how to break into the software testing job market or make your next big move? This episode of the Evil Tester Show dives deep into the realities of tech recruitment, job search strategies, and career planning for testers, with recruitment veteran Jack Cole from WEDOTech.uk. Whether you're an experienced Test Manager, expert Tester, junior QA, or even a programmer, Jack's decades of Software Testing and Development industry experience will give you strategies and tips about what works in today's competitive job-seeking world.

    In this packed hour-long conversation, we cover everything from market trends, LinkedIn networking, and the recruitment pipeline, to building a career roadmap and even the AI hype machine. Grab your notebook, settle in, and get ready for real insights you can use – plus a few stories from the trenches and actionable tips for every step of your job hunt.

    1 hour
  • Practicing Software Testing - Guest James Lyndsay
    2025/03/18

    Software Testing is a skill, and like all skills it requires practice; that's what makes you a practitioner of Software Testing. In this episode we're diving into the world of practice with James Lyndsay.

    In this conversation, your host Alan Richardson chats with James about the essence of practice in software testing, exploring how exercises and real-world scenarios can enrich our skills. James shares insights on his weekly online practice sessions and the interactive Test Lab concept, offering a dynamic playground for testers.

    Discover how practice blends with rehearsal and learning, and delve into the intriguing intersection of testing and development. With firsthand experiences in software experiments, fencing, and scientific investigation, James and Alan discuss the art of modeling and exploring software systems. Whether you're refining your testing techniques or embracing new perspectives with AI, this episode offers a wealth of wisdom for testers at all levels.

    Join us as we learn, laugh, and explore the world of testing practice. We hope you find inspiration for your own practice sessions. Don't forget to check out James's resources at https://workroom-productions.com for more testing challenges and exercises.

    52 min
  • Context in Context Driven Software Testing
    2025/01/04

    Effective Software Testing is highly contextual: we adapt what we do to the project and the process.

    In this episode of The Evil Tester Show, host Alan Richardson describes context-driven testing. Is there really such a thing as context-driven testing, or is it just a phrase we use to describe our testing approach? Alan explores the intricacies of context in testing, discussing its evolving nature, the impact of context on testing practices, and the challenges in defining it.

    From the origins of the term by James Bach, Brian Marick, Bret Pettichord, and Cem Kaner, to Alan's personal insights on systems within systems and how context impacts our testing methodologies, this episode provides a comprehensive look at how context affects software testing. Alan also critiques the principles of context-driven testing and emphasizes the importance of adapting to projects without being swayed by ideologies.

    We explore how to navigate context in testing environments, adapt our approaches, and effectively challenge and evolve systems. Discover the importance of context-driven testing in software development, exploring models, adaptability, and useful practices.

    25 min