
Tuesday, 12 April 2016

Code for the Tool Free VBScript Automation Framework

Following on from the last post, where we looked at the design of a simple helper automation framework, now we look at the files and code.

Grab the files here: https://github.com/MarkCTest/script-bucket, open git-test-001.zip and click ‘View Raw’ to download the .zip file.

In this working example, we navigate to and log in to GitHub, then take and save a screen shot. In the background a test log is written out to a text file for test evidence, alongside the screen shot.

There are four files in this example:

1) git-test-controller.vbs
This file is used to control the flow of execution, provide the core methods/Subs and call in the other files needed. It contains the following Subs (methods):

# Sub WaitForLoad
This Sub checks if IE is busy loading a webpage and, if so, delays further execution for 500 milliseconds (0.5 seconds) before checking again. This is far better than placing hard-coded waits of arbitrary lengths in the hope they are long enough for a process to finish; don’t do that!
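A minimal sketch of the pattern, assuming the IE automation object is held in a variable called objIE (the variable name in the actual files may differ):

```vbscript
' Poll IE until the page has finished loading, sleeping 500ms between checks.
' ReadyState 4 is READYSTATE_COMPLETE.
Sub WaitForLoad
    Do While objIE.Busy Or objIE.ReadyState <> 4
        WScript.Sleep 500
    Loop
End Sub
```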

# Sub Find
Used to work through the loaded webpage and read all the elements of the page so we can use them later in our test cases.

# Sub Print
This writes out a line of text to the text-log.txt file object opened at the start of the script. Keeping a text log is a good way to trap errors and to have test evidence showing what steps were taken and how long they took.
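A sketch of what such a Sub might look like, assuming the log file has been opened into a TextStream object named objLog via the FileSystemObject (names are illustrative):

```vbscript
' Write a timestamped line to the open test log.
Sub Print(strText)
    objLog.WriteLine Now & " - " & strText
End Sub
```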

# Sub Include
This is our magic method, used to include all the other files and tests the framework needs. It is a technique I use often to build VBScript frameworks and it mimics Ruby’s include method. Using it keeps the controller file much shorter, and it is the key to having a modular framework.
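The usual way to mimic an include in VBScript is to read the target file and run its contents with ExecuteGlobal, roughly like this:

```vbscript
' Read a file and execute its contents in the global scope,
' making its Subs, Functions and variables available to the caller.
Sub Include(strFile)
    Dim objFSO, objFile
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set objFile = objFSO.OpenTextFile(strFile, 1) ' 1 = ForReading
    ExecuteGlobal objFile.ReadAll
    objFile.Close
End Sub
```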

You could argue that these Subs should live in a separate file, such as a subs.vbs, called in by the git-test-controller.vbs file. Feel free to refactor the framework with that edit.

2) git-test-001.vbs
First we print to our test log file that the test has started, then print out that each step has started. We navigate to GitHub, hit the log-in button and log in. Log-in credentials are pulled in from a file which would sit under the ‘data’ element of our framework design, namely creds.vbs, which holds the username and password. See below.

Once logged in we open Paint, bring the browser (with GitHub) to the fore so it’s in focus, then take a screen shot, saving it down for test evidence. That’s achieved by calling screen-shot.vbs, described below.

3) creds.vbs
A simple file that contains the credentials used to log-in to the system under test. You’ll see these are assigned to variables that are used in the git-test-001.vbs test script file. Replace the placeholder text with your own username and password.
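The contents would be along these lines (placeholder values; the variable names may differ from the actual file):

```vbscript
' creds.vbs - credentials for the system under test.
' Replace the placeholders with your own details.
strUsername = "your-username-here"
strPassword = "your-password-here"
```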

4) screen-shot.vbs
Called in by git-test-controller.vbs, this takes a screen shot and saves it to the last save location Paint used. There is a big caveat with this file, as I mentioned in a YouTube video: sometimes it doesn’t work. Have a look at the video here:

The single biggest thing to get right here is the name of the browser window. If it’s not correct, the browser won’t be brought to the fore and your screen shot will be of whatever the front/active window is, or the script will fail. Here’s the line, including the title of the browser window:

WShShell.AppActivate "GitHub - Windows Internet Explorer"

You may find the title is different on your machine. For example "GitHub - Windows Internet Explorer provided by your IT team or company" is a common thing to see. Check it carefully.


Test Log
The test log is written out as a .txt file in the same location as your script; you can change the location and file name in the git-test-controller.vbs file. All being well, it should look something like this:



Screen Shot
The screen shot is saved in the last location you saved something to using Paint, as testing.jpg; this is set in screen-shot.vbs and can be changed as needed.

That’s it: grab the code off GitHub, add your credentials to creds.vbs and see how it works for you.

Mark.

Check out the first post for the design of this framework
http://cyreath.blogspot.co.uk/2016/04/tool-free-vbscript-automation-framework.html


------------------------------------------------------------------
Enjoyed this post?
Say thanks by sharing, clicking an advert or checking out the links above!
Costs you nothing, means a lot.
------------------------------------------------------------------

Sunday, 10 April 2016

Tool Free VBScript Automation Framework Design

In a surprise turn I’ve been back to working with VBScript as my automation technology of choice on a recent engagement. Despite trying to move away from it, VBScript proves consistently easy to get going with and perfectly sufficient to achieve the automation, or at least manumation, activities that I need to carry out.

Not surprisingly, very often what’s needed is a little help to drive tests to a certain point, usually the same point via the same steps, so a lightweight VBScript automation framework with a few test cases is just the answer: to addressing repetition, a lack of in-house tools, environments, budgets and so on. If you have a Windows system you have all the tools you need on board; no messy installations and confusing set-up. Did I mention a great thing about VBScript is how accessible it is?

So what could a VBScript-based automation framework look like? In this post we’ll look at the design; in the next we’ll look at the code.

Windows OS
As we’re running VBScript we’ll be running on a Windows box. I haven’t encountered any Windows system that won’t run VBScript; if you know differently then be sure to let me know. For the diligent who’d like to test, add the following to a text file, save it as test.vbs, then double-click to run it.

MsgBox "Well Well…" & vbCrLf & "I see an exciting future for the two of us.", vbOKOnly, "Result"

All being well you got a message box pop up. If so you’re good to go. If not, see above caveat.

Windows Script Host (WSH)
This is a host for scripts on the Windows system; bet you didn’t guess that? It provides the ability to run batch-like scripts on a Windows system, but with the capability to do much more. With WSH and VBScript there’s little on the system you can’t do something with. In combination they are the key way to get automation done on a Windows system.

Core Elements
Naturally you can set up your automation framework files in whatever way you prefer, but I suggest there are some basic coding/design rules to follow that just make good sense.
• DRY – Don’t Repeat Yourself. Wherever it makes sense, anything that could be reused should be split out into a file or Sub that can be called, not copy/pasted many times
• Separation of Concerns – Split your framework files out into sections (files) that address separate areas of concern; we’ll see examples shortly

So what are the suggested core elements?

1) Controller File
While we can think of a VBScript as a glorified batch script that gets executed all at once, let’s have a file that controls the flow of execution. This file sets up some subroutines we’ll want to re-use (though maybe they should be split out, actually…), calls tests when needed, and stops and starts testing.
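A controller file might look something like this skeleton (a sketch only; the file names and the Include and Print Subs are illustrative, and Include itself would need to be defined in this file first, since it’s what pulls everything else in):

```vbscript
' Hypothetical controller skeleton: set up, run tests, tear down.
Include "subs.vbs"      ' shared helper Subs (Print, WaitForLoad, etc.)
Include "creds.vbs"     ' test data / credentials

Print "Test run started"
Include "test-001.vbs"  ' each Include here runs one test script
Print "Test run finished"
```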

2) Test Scripts
Each script we run should have its own file. These will be called by the Controller File. Following our two rules above, each script should do something unique. If there’s anything un-DRY then consider splitting this out into a utility script (see below) and calling it in your test script.

3) Data
Always get into the habit of splitting data out into separate files. Polluting your test script with reams of data just confuses things. Keep it simple, smarty. The format of the data in our example here would be a simple .txt or .csv file for ease of use. However, using Excel is very common and we’ll look at that in a later post.
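For a .txt or .csv data file, pulling the values into a test script takes only a few lines; a minimal sketch, assuming a data.csv sat next to the script:

```vbscript
' Read comma-separated test data line by line.
Dim objFSO, objData, arrFields
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objData = objFSO.OpenTextFile("data.csv", 1) ' 1 = ForReading
Do Until objData.AtEndOfStream
    arrFields = Split(objData.ReadLine, ",")
    ' arrFields(0), arrFields(1), ... now hold one row of test data
Loop
objData.Close
```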

Utility Scripts
I also call these helper scripts. Essentially whenever you want some task carrying out that isn’t the test script proper, throw it in a utility script and call it from the test script. A good example is taking and saving a screen shot or writing out a text file.

Outputs
Not part of the set-up but part of the end result. These might include test logs and screen shots.

Here’s what the above would look like drawn up as a diagram:



That’s the framework example; in the next post we’ll look at the code that goes behind this and in future posts we’ll look at a more complex version.


Mark.

Thursday, 7 April 2016

How Every Tester can do Performance Testing

Performance testing is often passed on to a 3rd party provider of testing services in its entirety. That’s usually because the test team don’t feel they have the ability, experience or perhaps the tools to carry out the testing.

Yet, just like Security testing, we can break Performance testing down into a set of discrete test types under the overall label of Performance. In doing this we give the test team more opportunity to deliver a level of Performance testing that draws on their understanding of the system or application under test.
 
Let’s take the example of Performance testing a website, as it’s easy to get access to those and practice the techniques described.
 
Most Performance testing is either benchmark testing, because the site is new, or comparative, because changes have been made and we want to ensure the site is as performant as before. However, that only covers performance from the user-facing perspective. To get a complete picture we need to do Performance testing of the infrastructure too. This testing would include both the underlying infrastructure and connected network devices, plus the site exposed to users and the actions they perform.
 
In summary then we could break-down Performance testing to the following types:
 
Comparative Performance
• Response Time
• Throughput
• Resource Utilisation

Full System Performance
• Load
• Stress
• Soak
 
For the purposes of this post, I’m going to ignore Full System Performance and suggest that in this scenario we need to get a 3rd party in to help us out. The comparative Performance testing of the website, however, is perfectly doable by the test team. Let’s see what and how.

---

Response Time Comparison
The user’s perception of the time it takes the service to respond to a request they make, such as loading a web page or responding to a search, is the basis for Response Time comparison testing.

Measuring Response Time

Response time should be measured from the start of an action a user performs to when the results of that action are perceived to have completed, for a single task. The measurement must be taken from the point a state change is triggered by the start of an action, such as clicking a link to navigate from one page to another, submitting a search string or confirming a filter just configured on data already returned.
 
For Services with a web front end, use the F12 developer tools in IE (for example) to monitor timings from request to completion.
 
1. Open IE, hit F12 and select ‘Network’, then click on the green > to record
 
 
2. Enter the target URL and capture the network information
3. Click on ‘Details’ and record the total time taken
 
 
 
Test Evidence
A timing in seconds should be taken and recorded as the result in the test case. Multiple time recordings are advisable, to ensure there were no lulls or spikes in performance that skew the average result.
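If you want a scripted cross-check of the F12 timings, a rough stopwatch can be put together with the same IE automation used in the VBScript posts above (a sketch; the URL is just an example). Note this measures to the browser’s document-complete event rather than the user’s perceived completion, so treat it as a supporting measure, not a replacement for the manual timings.

```vbscript
' Time a page load from navigation start to document complete.
Dim objIE, sngStart
Set objIE = CreateObject("InternetExplorer.Application")
objIE.Visible = True
sngStart = Timer
objIE.Navigate "https://github.com"
Do While objIE.Busy Or objIE.ReadyState <> 4 ' 4 = READYSTATE_COMPLETE
    WScript.Sleep 100
Loop
WScript.Echo "Page load took " & Round(Timer - sngStart, 2) & " seconds"
objIE.Quit
```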
 
---

Throughput Comparison
This measure is the time it takes to perform a number of concurrent transactions. This could be performing a database search across multiple tables or generating a series of reports.

Measuring Throughput

Measuring Throughput from the user’s perspective is very similar to measuring response time, but in this case Throughput is concerned with measuring the time taken to perform several tasks at once. As with Response time, the measurement should be taken from the start of an action to its perceived end. A suitable action for Throughput might include the generation of weekly/monthly/yearly reports where data is drawn from multiple tables or calculations are performed on the data before a set of reports are produced.
 
Monitor system responses in the same way as for Response Time comparison above, but also include checks of dates and timings on artefacts or data produced as part of the test. In this way the user-facing timings plus the system-level timings can be analysed and a full end-to-end timing derived.

Test Evidence
Careful recording of the time taken to complete the task is needed, as with throughput tests it may not always be obvious when a task has completed. For example, if outputting a series of files, check the created date and time of the first and last files to ensure the total duration is known. Record the results in the relevant test cases, ideally over several runs as suggested for Response Time.
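The first/last file check can itself be scripted; a sketch, where the folder and file names are purely hypothetical:

```vbscript
' Derive total duration from the created times of the first and last output files.
Dim objFSO, dtFirst, dtLast
Set objFSO = CreateObject("Scripting.FileSystemObject")
dtFirst = objFSO.GetFile("C:\reports\report-001.csv").DateCreated
dtLast  = objFSO.GetFile("C:\reports\report-050.csv").DateCreated
WScript.Echo "Total duration: " & DateDiff("s", dtFirst, dtLast) & " seconds"
```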
 
---

Resource Utilisation
When the service is under a given workload, system resources will be used, e.g. processor, memory, disk and network I/O. It’s essential to assess the expected level of usage to ensure there’s no unacceptable degradation in performance.

Measuring Resource Utilisation
Unlike Response and Throughput comparisons, Resource Utilisation measurement can only be done with tools on the test system that can capture the usage of system resources as tests take place. As testing will not generally need to prove the service’s ability to use resources directly, it’s expected this testing will be combined with the execution of other test types, such as Response and Throughput, to assess the use of resources when running agreed tests. Given this, the testing would ideally be done at the same time as Response and Throughput.
 
One example way to monitor resource usage is the Performance Monitoring tools in the Windows OS. To allow us to go back to the configuration of monitors we set up, it’s actually best to use the Microsoft Management Console. Here’s how:

1. Open the Start/Windows search field and enter MMC to open the Microsoft Management Console
2. In MMC add Performance Monitor snap-in via File > Add/Remove Snap-in...
 


3. Load up the template .msc file that includes the suggested monitors by going to File > Open and adding the .msc file

To do this, save a copy of the file from GitHub

4. The monitoring of system resources will start straight away.
5. To change the time scale being recorded, right-click on 'Performance Monitor', select 'Properties' and change the duration to slightly beyond the length of the test you're running.
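As a lightweight supplement to Performance Monitor, resource usage can also be sampled from a script via WMI; a rough sketch:

```vbscript
' Sample CPU load once a second for 10 seconds via WMI.
Dim objWMI, colCPU, objItem, i
Set objWMI = GetObject("winmgmts:\\.\root\cimv2")
For i = 1 To 10
    Set colCPU = objWMI.ExecQuery("SELECT LoadPercentage FROM Win32_Processor")
    For Each objItem In colCPU
        WScript.Echo Now & "  CPU load: " & objItem.LoadPercentage & "%"
    Next
    WScript.Sleep 1000
Next
```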
 

Test Evidence
Where possible, extracted logs and screen shots should be kept and added to the test case as test evidence. Some analysis of the results will need to be done and, as with other comparative test types, several runs are suggested.

----

So there we go; it’s easy to do simple performance checks that can then inform the full system performance testing, or stand on their own if that’s all you need.
 
Mark.

Wednesday, 6 April 2016

Selenium Webdriver with C# - Cheat Sheet

Hey All,

I've been on a client site where we're using Visual Studio, Selenium WebDriver and C# for not only web front end but more system level automation testing.

As part of getting tooled up and informed as to how our favourite tools work with C# the team and I put together a Cheat Sheet to get us all started quickly. I thought I'd share that with you in a brief post.

Be warned: completing the below takes maybe an hour to get set up and then about 2 to 3 days full-on to go through the material. If you’re working, have family commitments, etc., expect a week, even with great focus.

One of the biggest challenges with adopting a new technology set is simply getting started. How often do we wish for a guiding hand to get us through the first baby steps and off building tests? Well, if you're a Test Architect like me, pretty much all the time! I hope the below helps.

1. Install Selenium IDE on Firefox
No, really. As I've said before, the IDE is great for doing web page node discovery and grabbing those names, IDs, CSS classes, etc. quickly and easily. This allows you to do a rough Proof of Concept script to prove the automation flow and then export the Selenese commands, as C# in this case.

You'll then strip out of the code the elements you want and discard the rest. The alternative is to right-click, Inspect Element and read the code. Just use the IDE.

Get it from the Firefox Add-ons site or here:


2. Get Visual Studio
In order to structure and build out your C# code you'll want to grab a copy of Visual Studio. There are many flavours, and if your company is a Microsoft house go get IT or whomever to provide you a copy. Failing that, or if you're suffering budget restrictions, you can grab a free version.

The best I've found is Visual Studio Community Edition. Once installed you'll need to sign in with a Microsoft email, part of the universal account/ID approach they now use.

Get Community Edition from here:


3. Learn C# Basics
If you're new to C# then you'll need to learn a little. There's a great resource over on the Microsoft Virtual Academy which you can take for free:

I've been told the link can sometimes say the course has expired. If you see that, just hit YouTube: https://www.youtube.com/watch?v=bFdP3_TF7Ks 


4. Practice Selenium C#
If you want to jump straight in, rather than mastering C# before building out a framework, then the site you want is this one: http://toolsqa.com/selenium-c-sharp/

Or, possibly better still, watch the Learning Selenium Testing channel on YouTube:


5. Practice, Practice, Practice
Once you’re set up and running with your first basic tests, be sure to practice, practice and practice some more. Here are some great sites to practice against:


If you need a book then get the only book out there that has pretty much all the answers you need in one place: Selenium Recipes by Zhimin Zhan

Get the book



Good luck!

Mark.



Tuesday, 5 April 2016

My earliest computers

Gerald Weinberg recently posted about his earliest computers and some of the early influences that got him into computing. Check his post out here:

http://secretsofconsulting.blogspot.co.uk/2016/04/my-earliest-computers.html

That got me thinking about how I arrived here, at a 16+ year career in software testing. Now, clearly I arrived a bit later than Gerald, so I can tell you I have never and no doubt will never use a slide rule. In truth, I doubt I even really know what one is.

Being amazed by a calculator aside, and the amazing things you could do with one (2318008), the earliest computing thing I remember was getting an Oric Atmos. I can't even recall how it was programmed. I do remember plugging it in and nothing appearing on screen, then discovering we had to tune in the portable TV my Mum had bought me to see the stunning output this thing could generate.
Oric Atmos


The next marvel I encountered in junior school: the world-changing ZX81. How many of you remember those things? My two friends Chris Duignan and Shweb Ali and I formed the CAD computer club and blasted our way through many lunchtimes typing in the printed programs we got from computer magazines. The problem was they were copies of printouts done on thermal paper. Consequently, they never worked first time; a ; or : is very hard to see on copied thermal printouts! Larger programs went onto the 16K RAM pack, so long as it didn't move accidentally and lose all your work.

Sinclair ZX81


Now at this point the home computing market started to introduce serious competition. Vying for attention at the same time were the Amstrad CPC464, Sinclair ZX Spectrum and the Commodore 64, if I remember correctly. My friends and I switched to the Spectrum camp pretty solidly.

Mine was a 48K Spectrum with those dandy rubber keys, so special.
Sinclair ZX Spectrum and the Commodore 64 



I also recall at some point getting my hands on a VIC 20. Can't remember what I ever did with this one though!

Vic 20
(http://www.old-computers.com/museum/photos/commodore_vic20_1.jpg)

From here on it was all "IBM" PCs, as they used to get called. That was the way forward. Many a DOS disc load later and productivity was sky high. The time, of course, being spent on my first video game: Alone in the Dark.

Ah, those were the days. I'm just glad they're over and I can hardly remember what I ever did with these things or used them for. Give me my Win 7 and 10 boxes, MS tech and Office with an ethernet connection any day!

Mark