
Tuesday, 31 March 2015

Peak-End, or how to deliver in, then leave, a testing role.


The curious nature of our monkey brains never ceases to amaze me. One model for our thinking that caught my attention a few years ago was an idea called Peak-End. It's the simple idea that we remember only the peak of any experience and the end. There is a complication here: what we remember is the memory of the event, not the experience itself. There's been a lot said about the remembering self and the experiencing self; I'll let you Google it.

The significance here is that your employer is also an owner of a monkey brain which possesses a remembering and experiencing self. You might want to keep that concept to yourself though.

We’d like to think that when we leave a role, the employer or client will remember the experience of us working for them. We hope they’ll be mindful of the many days we delivered consistently, provided all those reports on time, finished the testing on time every time, worked late, etc. I’m here to tell you they won’t.

What they will remember is the big event, the high point or possibly the low point. Think back to your prior employments and contract engagements: what do you remember? I remember big launches, major defects, moments when unexpected change happened. We remember these because everything else, all those day-to-day things that were just getting-the-job-done, was just like any other day and not a significant experience worth remembering.

That day-to-day work still matters, of course: the core of a good delivery is the consistent and steady achievement of the task you've been set. If we do nothing else, we need to achieve what we were asked to do.

However, we’re looking to be remembered, right? To do that we need to work the Peak-End rule a little.

Do one thing amazing, then celebrate success
During a project or contract engagement work hard to do at least one thing really, really, really well. Do it so well that it re-shapes your thinking about how that one thing is done. Push yourself, put the hours in, do the research, deliver something no one will forget. Do this and you’ll create a peak in the experience that will form part of the ‘remembering’ later on.
This could be anything: finding a show-stopping defect, finishing all the tests when it looked impossible, fixing a deployment issue and safeguarding the release. The best tactic is just to keep doing the best you can, keep pushing for excellence and you will hit on that one amazing thing.

Next, celebrate success. It's no good doing awesome work if no one recognises it for what it is. SPINACH – Self Promotion Is Not A Crime Here :) If you're a manager, call out the successes of your team; if you're a team member, make your great work known. Maybe that's to the team in stand-ups, weekly reports or conversations with your manager.

End on a Crescendo
Eventually, we all leave jobs or engagements. Whether you're getting pushed off a role or leaving on your own, treat it the same and push yourself even harder to do good work.

Knowing about Peak-End, we know it's critical to end well. You must end on a high, even if that's just a really solid wrap-up and handover. How many people have left a role with a half-arsed handover? What do you think of that looking back? Not positively, I'd guess, and it likely clouds your opinion of that person.

Now if you can, think of someone that left on a high, doing great work, still putting in (even more) energy. What do you think of them? Someone you’d like to hire again or work with in the future? That’s the value of the End being strong and not just fizzling weakly out.

Hey, all's well that ends well, as they say.

Mark.








Monday, 30 March 2015

Actually, there ARE Best Practices.

Being the contrarian that I am, I of course think there ARE best practices. However, as is often the case, it comes down to a matter of what's meant by the term and the position you take on using the phrase. I'll admit that, before writing the below, I've stated there are no best practices, but I was lying slightly.

As I recall it, about 5 or 6 years ago, when the re-thinking around Best Practice kicked off in the test community, it was timely. We were definitely in an era when the Big Consultancies had to be given a firm 'No' to the restrictive tools and process models that were becoming the norm. I don't think it's an overstatement to say that if the push-back hadn't happened, we'd be under the heel of consultancies driving the definition of Best Practice and rolling out the tools to carry it out.

It was already happening: both HP and Microsoft had their respective tools, and companies like PWC were adding in processes they would impose on engagements. Best Practice had become defined by whatever those with the strongest grip on senior management said it was. What they said it was: a way to ensure success. What it actually was: whatever increased their footprint and made money, of course. It won't take you long to recall from memory glorious failures of the big consultancies. A quick Google search will turn up more.

Sadly, even within the test community itself it looked like that was happening too. ISTQB, splitting out into a set of exams, was starting to position itself as the de facto exam set that defined the testing profession's education. Some luminaries of the testing profession helped put the original Foundation exam together, then stepped back and started to attack it when they saw what the real agenda might be.

The backlash and war cries were understandable. Every revolution starts with trashing the old ways and thinking, and with attempts to reshape or reclaim certain ideas and words.

However, oftentimes the things being fought against are still there. You can shout "There's no such thing as best practice!" all you like, but there still will be, and the world will still roll on stating it's so, even though I recall blogging the opposite about 9 months ago. There's a caveat to this argument, as always.

Intellectual Vs Pragmatic
You have a choice as a consultant and a testing professional. You can fight and resist or you can go with the flow and subvert the thinking to achieve your aims. I think that’s in the Art of War or something. Feel free to hit the forums and blogs and engage in the robust intellectual arguments that go on about best practice, testing v checking, waterfall over scrum over agile over whatever this week. In fact, please do. Don’t be intellectually lazy. 

You HAVE TO understand the various viewpoints to get good at what we do.

As a test professional, you have an obligation to yourself and the community to understand as thoroughly as possible, the arguments and thinking, so you can apply them in practice. You have to gain as much understanding as possible and then subvert it for your own aims.

You also have an obligation to be pragmatic.

Taking a rigid, positional stance that there is no best practice is where this line of argument hits the rails for me. When I go and meet a potential client for the first time (assuming I'm not hearing otherwise from them), you'd better believe I talk about best practice, I say testing when I mean checking, heavyweight agile and more. If I didn't, I'd be out of the room before the conversation got started, never mind getting engaged to deliver. You want to open client discussions by correcting terms, or berate an existing client for using terms we know are inaccurate? Bye-bye job prospects.

You'd better get good at hearing what people think they mean when they say best practice, testing, performance, security, etc. Find that out by asking probing questions. You might have to talk in terms of best practice; perhaps you can avoid doing so. Later on, when you have their confidence, then is the time to bust out the real truth of matters and reveal the rich depth of nuance. Be pragmatic: you can win the argument and change people's minds only when you've secured a relationship with them first.

So, what’s Best Practice?
Best Practice is anything that’s been tried and proven to work. That’s it. You know about writing a test plan, test schedule, test cases, checklists, raising defects, tracking progress, regression testing, issuing a test completion report, managing a backlog, creating a burn-down, etc? You know how these things are pretty much approached in the same way, give or take a few details? Best Practice.

Best Practice is a model, an approach to a problem that is known to make it more likely you’ll resolve the problem or complete the task.

Did you notice what I omitted there? What you shouldn't be looking for is prescriptive detail on top of this: the exact content of a status report, the precise steps to perform analysis, the specific choice of colours for reporting. It's nonsense to think it could be any other way; not even IEEE 829 or the new IEEE 29119 goes that far, so stop looking.

Best Practice is what is in your box of tricks, that you've learned to apply over the years and that you know how to modify to fit the client’s needs.

You are building that personal tool kit, right? It's best practice, don't you know.


Mark.



Sunday, 29 March 2015

Understand the context of the Testing Problem

In various other posts and papers, I've said that testing is testing: if we're suitably skilled and experienced in our profession, then we can test in most any environment and industry. I believe this remains as true as when I first said it.

To quote directly from the FX Kickstart for Testers paper:

As a competent test professional you’ll know that fundamentally you can test in any domain. That domain could be travel, healthcare, engineering, retail, etc. and of course finance. Ultimately we can say with sincerity, it’s all software testing and the skills and knowledge of our profession that we possess, apply no matter what the domain.

However, there are two very real caveats with this perspective that will limit our effectiveness in a given domain until overcome:

  • We will initially lack understanding of the testing challenges in the domain, thereby limiting our application of contextually relevant testing techniques and approaches
  • The domain will have unique vocabulary and concepts that must be understood if we are to translate its meaning and communicate effectively
Once we address these caveats we will understand the unique testing needs of the domain in a contextually relevant way and be able to communicate with our colleagues effectively.

The point is, to deliver effective and pragmatic testing we need to understand the client’s business. Let’s look more closely at the caveats above.

A lack of understanding of the testing challenges in the domain
A mistake inexperienced consultants make is to roll into a client's site with a preconceived idea of what a testing solution will look like, despite having scant knowledge of the problems the client is facing.

We see this with many of the larger consultancies. Their team arrives with all the answers already cooked up: the direction a solution will move in, the shape it will take, the tools to be used, etc. are all mostly known in advance. That's not always a bad thing. If it's known the testing problem is the integration of new development with existing systems, then certain approaches for this will be known. Maybe the problem is specific, a lack of security testing for example, in which case there are some industry-standard, yes even best practice, approaches to take.

On a new engagement it’s important to take the time to put off solutionising and ensure we’re clear on such things as:
  • What the nature of the problem actually is
  • What problems it is causing for current business operations and what limitations it imposes
  • What is preventing the client from solving the problem themselves
  • What attempts have been made to solve the problem and what the outcomes were
  • The timescales for solving the problem and why, and whether anything is dependent on the test work
Unique Vocabulary and Concepts
I stated above that this applied to our ability to understand the language of the business, ensuring clear communication in a dialect the business would understand. It's more than that, however. Just saying an environment is System Integration instead of Development Integration, or the other way around, is not the big issue, though it certainly helps to get that right too.

The big issue is with the concepts, and I would expand that to be quite inclusive: the ideas, ways of thinking, ingrained culture, internal politics, external pressures, etc. Understanding these is just as important to the success of your consulting engagement as knowing what the testing problem actually is and how they talk testing in the business.

We should also check whether we need to understand other aspects such as:
  • What kind of environment the client has in place culturally; open, communicative, blaming
  • How they work technically; adaptive / descriptive, predictive / prescriptive, hybrid
  • If they are subject to professional or legal compliance and auditing
  • Any issues with public perception or on a branch / regional basis
Have a look at the free preview pages for Flawless Consulting and scroll down to where it says 'Consulting Skills Preview'. This is valuable additional reading.

Testing is just that – software testing. We can be confident in that.

What changes is the setting in which the testing problem we need to solve exists. To ensure our testing is relevant and effective, we serve the client best when we understand the context of their testing problem.

Mark.



Replacing QA with 100% automation

A question came up on one of the forums recently that echoes a question I, and I'm sure you, have been asked at some point: something along the lines of reaching 100% automation. Groan… yes, that old chestnut.

As experienced testers we already know the answer; for the avoidance of doubt, the answer is: No.

Thanks for reading, now on with my day!

Mark.


OK, OK… we wish the ill-informed question could be dismissed so easily.

However, though we know it's asked by the ill-informed, we can't blame people for asking. To senior management and non-technical, non-testing staff it's a reasonable question. It's our job to respond in a way that ensures we leave them informed.

To wit… if asked, here are a few thoughts on why we can't achieve 100% coverage via automated testing. (We'll skip the QA vs Testing perspective for now; hey, they're ill-informed, just roll with it.)

You could also send them this link to Case Studies on Automation, by Dorothy Graham (RTFM…? :)

Coverage
It isn't possible to achieve 100% test coverage of the application/system with manual testing, so strictly speaking it can't be possible to do the same with automation either. Though interestingly, automation might let us expand coverage…
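
To make this concrete for the person asking, a quick back-of-the-envelope sum helps. The short Python sketch below is purely illustrative; the form fields and their ranges are hypothetical ones I've invented, not taken from any real system:

# Hypothetical sign-up form: one 8-character username (letters and digits),
# an age field and a country picker. How many input combinations are there?
charset = 26 + 26 + 10            # upper case, lower case, digits
usernames = charset ** 8          # usernames of exactly 8 characters
ages = 120                        # ages 1..120
countries = 195

total = usernames * ages * countries
print(f"{total:,} input combinations")   # on the order of 10**18

Even this trivial example gives a number of combinations on the order of a quintillion, before we consider other field lengths, invalid input, state, timing, browsers or environments. Exhaustive coverage isn't on the table for manual or automated testing; what matters is choosing which slices of that space are worth covering.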

Technical Limitations
There are aspects of the system you can check and test manually that can't be automated 'technically'. Examples include Accessibility and Usability; others include complex end-to-end scenarios that might require exploratory or dynamic testing dependent on variable results.

Automation of what?
The different levels of testing (unit, integration, system, etc.) mean we'd need to have suites of different automation tests, focusing on different aspects of the system / application.

Tools and Team Skills
If we wanted to 'automate everything' we'd need a host of tools: web front end, application interface, API testing, webserver/DB/CDN testing. Maybe 100% automation is just one of these?

Assuming we define what 100% means in this context, do we even have the tools in place or the staff who know how to use them? If not, there will be some manual testing going on.
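
To illustrate how narrow each tool's reach is, here's a minimal sketch of a single automated API check in Python using the requests library. The endpoint, URL and response fields are invented for illustration; the point is that a whole suite of checks like this still only exercises the API slice of the system, none of the UI, accessibility, usability or exploratory concerns above:

import requests

def check_get_order(base_url: str, order_id: int) -> None:
    # One narrow, repeatable check against a (hypothetical) orders endpoint.
    response = requests.get(f"{base_url}/orders/{order_id}", timeout=10)
    assert response.status_code == 200, f"Unexpected status {response.status_code}"
    body = response.json()
    assert body.get("id") == order_id, "Returned order id does not match the request"
    assert "total" in body, "Order total missing from the response"

if __name__ == "__main__":
    check_get_order("https://example.test/api", 1234)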

Don’t have time / money
The Return on Investment from automating everything just won't be there. So, even if we had the tools, time, skills and a limited automation remit – it's highly unlikely the time and money needed would be worth it. Automating all of a payment system that'll be there for 10 years is more likely to pay off than automating a website that'll be trashed in 6 months.

What other reasons can you think of?


Mark.



Site Update - Testing and Technology Library

Just a quick update, as it would be easy to miss...

The eagle-eyed among you might have noticed I've redesigned the Books page of the main website, with a similar widget appearing on the right of the Blog.

Instead of just a few books I've reviewed, which made it barely worth visiting, the Books section is now a Library of testing books with a smattering of technology that'll help with those testing jobs you get landed with.

What does it contain?
The Library is a collection of books that either a) I've read over the years or b) come with strong recommendations. As such, it's not just a bucket full of books; you should recognize them. For example, check out the book by Lisa on Agile testing or the one by William Perry – recognize those?

Good, all items in the Library should be ones that have proven themselves in our profession. That applies to both the books and the hardware; have you got one of these or one of these on your desk? I rest my case then ;)
  • Do you have a favourite book you want included here?
  • Do you have anything good or bad to say about the books in my Library?

Leave a comment and let me know!


Mark.

P.S. Challenge - Join Goodreads and set yourself a 2015 Reading Challenge






Saturday, 28 March 2015

Testing Books - 4 of the classics



There are a few 'classic' books about software testing that should be on every serious tester's shelf. Though written a few years ago, they still contain timely advice that's as applicable now as it was when first published.

Agile Testing
I would suggest that if you read nothing else, you should be reading through Agile Testing: A Practical Guide for Testers and Agile Teams, by Lisa Crispin and Janet Gregory.

Since its first publication the book has had rave reviews and Janet and Lisa remain today as leading lights in defining what agile testing is and how it can be applied to projects.

As the blurb for the book states, readers will come away from this book understanding
  • How to get testers engaged in agile development
  • Where testers and QA managers fit on an agile team
  • What to look for when hiring an agile tester
  • How to transition from a traditional cycle to agile development
  • How to complete testing activities in short iterations
  • How to use tests to successfully guide development
  • How to overcome barriers to test automation
The book is not just a wall of text, but provides a number of diagrams and mind-maps you’ll keep referring to time and again. The authors draw on their experience greatly, sharing examples of situations they’ve encountered and what they did to overcome them.

Grab a copy of Agile Testing: A Practical Guide for Testers and Agile Teams

Lessons Learned
If you're aware of the work of Michael Bolton, James Bach and Cem Kaner around context-driven and rapid software testing, then you should already have read the book Lessons Learned in Software Testing. Written by Cem Kaner, James Bach and Bret Pettichord, it's guaranteed to be packed full of useful information and ideas.

Broken down into 293 lessons, the book allows you to dip in and read just what's of interest at that time. Think of this as a collection of related essays, where you can read around one topic or read across topics to get a fuller sense of a particular area.

Whether you're a veteran tester or just starting out, this is a great reference book to have on your shelf.
Get your copy of Lessons Learned in Software Testing

Test Design
It feels like this book by Lee Copeland has been around forever. When it was published, it pretty much became the oracle source on test design. A Practitioner's Guide to Software Test Design remains one of the most comprehensive books about test design, a critical skill that we need to apply almost every day.

While the book does focus on the IEEE 829 view of the world, this is no bad thing. Copeland uses that as a framework around which to provide a background to the testing context and then move on to design techniques and their effects in a way that is extremely applicable.

If you ever wanted to really apply Equivalence Class Partitioning, Boundary Value Analysis, Domain Analysis, Use Case Testing and other powerful techniques, this is the book for you.

Upgrade your test design skills by getting a copy of A Practitioner's Guide to Software Test Design today.


Systematic Software Testing
I don't believe this book is as well known as it should be. Like Copeland's book, this is a heavyweight gem that deserves to sit on all testers' shelves. Unlike Copeland's book, which focuses on test design for execution, Systematic Software Testing focuses on the process and practice fundamentals of software testing.

The book focuses on process and practice, makes reference to the IEEE standards, but then applies years of experience and know-how to subvert that process and show you how to really build a testing practice that'll work.

There's guidance on the information you'll need, the people and the documents. The author relates the testing process and practice to the wider development context and ensures the reader is set to deliver well technically, as well as being an effective member of the business.

Another must read, so get Systematic Software Testing and round off your understanding.

Mark.





Excel - Best. Test. Management. Tool. Ever.

OK, I may be talking that one up, Excel is not the "best test management tool ever", but I bet at least one of the following rings true:
  •          It's as widely used as most of the big name test management tools
  •          It’s what gets rolled out when there is no tool in place
  •          You just did some test/testing management stuff with it recently
There are a number of good quality, free and low-cost test management tools out there, but even so we'll end up using whatever:
  •          The customer already has in place, paid/free/other
  •          Is easiest for the test team and stakeholder community to work with
I'll bet you again that where a tool isn't in place, whatever-test-management-tool-is-easiest-to-work-with is Excel.

Given this, we need to make sure we’re up to speed on how to make useful Excel documents that can provide, capture and report testing information and status.


Let me emphasise this again: you ARE going to use Excel extensively in your testing career – so invest some time learning it! Now, this use may not be so prevalent in a more agilistic environment, but don't bank on it. Just because "we're agile here" doesn't mean the need to define what tests will be run and what the outcome was suddenly goes away.

Use it for what?
What might we use Excel for?

  • Test Cases – defining them and capturing test run information
  •  Coverage Matrix – defining system coverage by test cases and tracking status of coverage
  • Test Status Tracker – a dashboard of status for test preparation and execution
  • Status Dashboard – analyse data, conditions, status, etc. and present a view to the world
  • Burn Down Dashboard – Number of test cases over time against actual? Excel will map this too
The above might be separate Excel workbooks or combined; they might be cross-linked or standalone. You decide what's best.

As always, there’s no hard and fast rule of what you’ll need to use or how you’ll use it. Take the templates and ideas as just that, then build them out to meet the specific needs of your software testing project.
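
If you'd rather script the skeleton of a tracker than build it by hand each time, Excel files are easy to generate programmatically too. Here's a minimal sketch using Python and the openpyxl library; the column headings, example rows and file name are my own illustration rather than one of the site's templates:

from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws.title = "Test Status"

# Header row for the tracker
ws.append(["Test ID", "Description", "Priority", "Status", "Tester", "Notes"])

# A few placeholder rows to show the layout
ws.append(["TC-001", "Login with valid credentials", "High", "Passed", "MC", ""])
ws.append(["TC-002", "Login with locked account", "High", "Failed", "MC", "Defect D-101 raised"])
ws.append(["TC-003", "Password reset email received", "Medium", "Not Run", "", ""])

# A tiny dashboard area driven by Excel's own COUNTIF formulas, so the counts
# stay live when testers later update the Status column by hand.
ws["H1"] = "Passed"
ws["I1"] = '=COUNTIF(D:D,"Passed")'
ws["H2"] = "Failed"
ws["I2"] = '=COUNTIF(D:D,"Failed")'
ws["H3"] = "Not Run"
ws["I3"] = '=COUNTIF(D:D,"Not Run")'

wb.save("test_status_tracker.xlsx")

Because the dashboard cells hold formulas rather than hard-coded counts, the numbers keep themselves up to date as the sheet is edited, which is exactly the kind of low-effort status reporting Excel is good at.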

But why EXCEL?
Most of what we deal with is the presentation, capture and representation of data, with some guiding narrative text. That sits best in Excel. Sure, you could have Test Cases in Word, for example. But how are you going to analyse the numbers and conditions of the data sets? How will you create pretty charts and dashboards?

Just stick with Excel and use other tools for the management extracts you might be called on to provide for the senior levels in the business.

Problems…
Using Excel is not without issues. The biggest problem is you'll most likely end up emailing the document around. Uh oh… filling everyone's email box up and, worst of all, a version control nightmare. My advice is don't do it. Stick it on a shared drive or, better still, use some extra tool like SharePoint, assuming the client has this and they're a Microsoft house. You can at least check out the document then, so people know you're editing it.

The middle ground is to have the document on a shared network drive and use the 'Share Workbook' feature. You might want to practise using this before doing so in anger.

Be VERY careful clicking that orange highlighted check box...

At worst, stick with an agreed document versioning method and sort out How You Do Documents™. Have a look at my paper on that very topic: A Documentation Model

Where to Get Excel
So, you're sat here starry-eyed at the thought of all the amazing Excel in your future, but what's that, you don't have access to a copy?! I would be stunned if this was the case these days. Apple computers haven't taken over from Windows, and Office is usually there somewhere on a Windows machine.

If not… then hit www.live.com or possibly www.outlook.com and sign up for an account. With that you'll have online access to the Office suite and email.

If you need a personal install, go grab a copy from Amazon, using my affiliate link of course (you get extra karma points for that).



Templates
Be sure to have a look over the site and review the various Excel templates available there:


Have fun!

Mark




Friday, 27 March 2015

Keep what you’ve got, by giving it all away


I remember working at an electronics manufacturer many years ago. The managing director there was an interesting guy, interesting in that his attitude was still a little bit back in the 1970s.

One conversation I had with him was about training. My argument was that we should implement a simple training course, more a skills check, for the staff on the line. The idea being that we could ensure a consistency of approach and spot any training needs. After all, getting staff into the business, a new start up at that time, then trained on the products, was a big investment. It was surely wise to ensure everyone was around the same skill level for the core skills.

His response was straight out of the How Not to Manage handbook. Yep, “If we train them, they could leave.” As the retort goes, “… and if we don’t train them and they stay?”
This attitude has no place in the modern professional environment. It's a small-minded and fearful attitude, one that reeks of 'lack' instead of 'abundance'. It's an attitude that, if applied to other matters, will help kill off any team, business and even your personal life. When consulting or working for an employer in the software testing field, this attitude will be your death knell.

Small Minded Consultants
I recently encountered a similar attitude to this again, in my current testing consultancy role.

The scenario here was a conversation with another consultant. I proposed that we should collaborate on some White Papers proposing service offerings our client could deliver: showing them how to structure the services, the tools needed, likely costs, blockers and enablers. All you'd expect in something that was essentially a business proposal.

The response from the consultant, 15+ years on since the encounter at the electronics company, was like hearing an echo. This time the complaint was that if we show them what they could do in such detail, they wouldn’t need us, they would go off and do it themselves.

Why this is wrong, wrong
This is an attitude of fear, of small-mindedness, and a reflection that this person's consultative mindset is way off.

First off, I absolutely believe you cannot keep secret everything you know, in an attempt to keep it rarified, believing it will keep you employable. In doing so, if it's something new for your client, you will fail to ignite interest in what you can do and fail to place yourself amongst those who are known to know about whatever it is you're hiding. That means you can't engage them and have them pay you for the work.

If you have a business, would you hide your products and services or describe them as fully as possible to engage potential clients? Why is it any different when you’re an IT contractor, especially if you’re working for a testing consultancy? It’s not. You need to help your client understand their options and what you can deliver. Then engage them in that and bill them.
By being open and giving away what you know, you have the potential to get more back.

What, not How.
There is a caveat here though, a little nuance in the approach that any wise consultant will understand.

When schooling your client in whatever it is you believe they may be interested in and that you could deliver – you need to tell them WHAT it is, WHAT the benefits are, etc. You don't want to tell them HOW. That Know How, combined with prior experience, is why you're needed. This is the part of the game that's understood: every company shares WHAT it has on offer; the exact How To is why we need them afterwards.

Don't worry about this approach. When you present something your client is interested in, explaining your prior successes and how you can help guarantee their success – of course the first person they will look to hire to do this thing is you. Any smart business owner will, at any rate. If they're not smart, then you're on a route to nothing presenting ideas and opportunities anyway!


Keep an attitude of abundance, not scarcity. And share what you know, tell of the benefits and your experience, give your client options to explore; then wait for it to come back around to you.

Mark








Monday, 2 March 2015

Next Gen DevOps and Software Testing

In his book, Next Gen DevOps, Grant outlines the historic path along which DevOps emerged and then describes how the way it is currently performed is fundamentally flawed. He describes a number of commonly experienced frustrations and inhibitors, both internal and external to the DevOps team, which people in other IT areas will unfortunately recognise. He shows how these impact the ability of the DevOps movement and its practitioners to drive forward their vision of what DevOps would ideally evolve into. Common issues other practice areas will recognise include: a lack of understanding of DevOps resourcing profiles by HR; verbal agreement to the concepts of DevOps by senior management, but a fear of committing to the corporate and operational change needed to realise the vision; and a continuing siloed approach that prevents the establishment of the cross-functional, integrative, product-based teams that are central to Grant's view of modern DevOps practice.

Much of what Grant outlines in his book will ring familiar to software test practitioners. I and others have long espoused the value, and indeed the criticality, of positioning software test practitioners as an embedded part of the 'application development team', ensuring cross-team process synergy [1]. That's a term I use in preference to names such as the Dev team, the Test team, the Ops team, the Support team, etc., which serve only to reinforce the 'silo' us-and-them perspective. The concept of having these as teams who operate in a non-integrated way is less and less meaningful in the context of today's perspectives on efficient development approaches. Clearly defined practice domains remain important, as the sheer scale of today's IT profession requires a level of specialisation, but this is not the same as being siloed.

Taking this further, as we mature the adaptive, pragmatic, delivery-focused approaches that are justifiably popular at this time, and possibly emerge into a post-Agile paradigm, approaches that were established in an era where predictive development models were overlaid onto functionally siloed teams are, I would suggest, as good as irrelevant.

Services or Products?
Except for the most trivial of application development related work, it simply isn't possible to deliver anything meaningful, from a technology, business or market perspective, without cross-domain collaboration. This is true because of two key factors: a) large-scale application complexity is now so high that no single person or team can perform all practices effectively and, b) the integrated nature of technology and the cross-over of practice areas mean that practitioners in one field will already be working in a cross-domain manner.

However, we also need to shift perspective and come back to another key point in Grant's Next Gen DevOps book, one that rings true for us as software test practitioners, and ask: are we delivering a discrete set of software testing services in support of some application development work, or are we providing a suite of testing practices, alongside other practice areas, in broader support of the delivery of a software product requested by the business?

If we think more broadly than our technology domains, considering also other domains across the wider business and reflecting on why we're employed by the business, it will be evident that we're not really engaged in testing some development output; instead we're testing an aspect of a product the business has requested. With even a trivial reflection, it's apparent that all of these practice areas for given domains need to be drawn together to support not just application development, but product development from conception to retirement, in the context not of the Software Development Life Cycle, but of the Product Development Life Cycle [2].

Conclusion
In conclusion on these limited points: while Next Gen DevOps is proposed as a model for DevOps, it discusses many concepts that run parallel to our area of concern, that of the role of software testing practice in the broader business context when delivering software products requested by the business.

Mark.

--------------------------
Learn More...

You can get a copy of Grant's book on the Amazon Store
If you're on twitter, follow Grant at: https://twitter.com/grjsmith with hashtag #DevOps
While you're about it, pay a visit to his website over at: http://nextgendevops.com/ 

--------------------------




References

[1] Crowther, Mark, (2005) “Cross Team Process Synergy” [online] Available at: http://cyreath.co.uk/papers/Cyreath_Cross_Team_Process_Synergy.pdf [Accessed 02-mar-15]

[2] Crowther, Mark, (2009) “Life Cycles – Course 1, Session 1”, pp. 3-4