Wednesday, 23 July 2014

02:10 | by Dragan Mestrovik
Technical recruiters are always looking for automation engineers and testers who can make a valuable contribution to the company by saving time and resources. Although different companies use different approaches to recruit automation engineers, most now follow the model that Google has implemented successfully to hire its engineers.
As explained in “How Google Tests Software”, James Whittaker, Jason Arbon and Jeff Carollo describe the three roles that Google created to make engineers accountable while keeping them productive and quality-minded. These roles are described below.
  • Software Engineer (SWE): SWEs play the traditional developer role, writing the functional code that is shipped to users. They also create design documentation and choose data structures and the overall architecture. They are involved in writing test code, test-driven design and unit tests, and they own quality for everything they touch, whether they wrote it, fixed it or modified it.
  • Software Engineer in Test (SET): SETs also wear the developer hat but focus on testability and general test infrastructure. SETs review designs and look closely at code quality and risk. They refactor code to make it more testable, and write unit testing frameworks and automation.
  • Test Engineer (TE): The TE role is similar to the SET role but puts testing on behalf of users first and developers second. TEs write code that drives usage scenarios mimicking the user. In short, TEs are product experts, quality advisers and analyzers of risk.
Although the three role definitions seem self-explanatory, and we are more interested in the latter two, many people, myself included, still wonder which engineering hat they wear every day.
To answer the question, James Whittaker, Jason Arbon and Jeff Carollo have summarized a list of questions that can help you decide whether you belong in the SET or the TE camp.
You might be a SET if
  • You can take a specification, a clean whiteboard, and code up a solid and efficient solution.
  • When you code, you guiltily think of all the unit tests you should be writing. Then, you end up thinking of all the ways to generate the test code and validation instead of hand crafting each unit test.
  • You think an end user is someone making an API call.
  • You get cranky looking at poorly written API documentation, but sometimes forget why the API is interesting in the first place.
  • You find yourself geeking out with people about optimizations in code or about looking for race conditions.
  • You prefer to communicate with other human beings via IRC or comments in check-ins.
  • You prefer a command line to a GUI and rarely touch the mouse.
  • You dream of your code executing across thousands of machines, beating up and testing algorithms, showing their correctness through sheer numbers of CPU cycles and network packets.
  • You have never noticed or customized your desktop background.
  • Seeing compiler warnings makes you anxious.
  • When asked to test a product, you open up the source code and start thinking about what needs to be mocked out.
  • Your idea of leadership is building out a great low-level unit test framework that everyone leverages, or that is exercised millions of times a day by a test server.
  • When asked if the product is ready to ship, you might just say, “All tests are passing.”
You might be a TE if
  • You can take existing code, look for errors, and immediately understand the likely failure modes of that software, but don’t much care about coding it from scratch or making the change. You prefer reading Slashdot to reading other people’s code all day.
  • When you read a spec for a half-baked product, you take it upon yourself to fill in all the gaps and just merge this into the document.
  • You dream of working on a product that makes a huge impact on people’s lives, and people recognize the product you work on.
  • You find yourself appalled by some websites’ UI and wonder how they could ever have users.
  • You get excited about visualizing data.
  • You find yourself wanting to talk to humans in meatspace.
  • You don’t understand why you have to type “i” to start typing in a certain text editor.
  • Your idea of leadership is nurturing other engineers’ ideas and challenging their ideas with an order of magnitude more scale.
  • When asked if the product is ready to ship, you might say, “I think it’s ready.”
02:09 | by Dragan Mestrovik
As technology has changed drastically, the importance of automation has grown exponentially. At Salesforce, reportedly about 1 million browser tests run every day across 50,000 VMs. That is a lot of automated tests for one day, but at the scale of Salesforce’s cloud computing business it is a necessity. During the Selenium meet-up in San Jose, Greg Wester stated that quality assurance at Salesforce depends entirely on automation, meaning no manual testing is done at all, with a few exceptions.
Now and then, colleagues ask me about the best programming language to learn for automation or to apply at work. My immediate answer always tends to be: “It depends”. It is a complicated question that depends on a lot of factors, which I have attempted to describe below.
  1. Tool: Every automation tool is built on particular programming languages, so you need to learn a language the tool supports. For instance, if you are using Quick Test Pro (QTP), the primary language to learn is VBScript. Similarly, if you are using Test Studio, you will need C# or VB.NET. If you use Selenium, however, you have a variety of options: C#, Java, PHP, Python, Ruby and JavaScript. Each language has its own advantages and disadvantages, so it is very hard to say that one is better than another.
  2. Project Framework: Although this factor is debatable, many companies tend to use the same language that their developers use to build the application. In most companies Java is the most commonly used language, so QA managers often choose it on the grounds that help is available from the developers when needed. Some QA teams, however, choose whichever language they think suits them best, disregarding the language used to build the application.
  3. Team Knowledge: In my opinion, this factor plays a great role in determining the tool and language for automation. If all or most of the QA automation engineers feel comfortable with a particular language, that language should be chosen, and new automation engineers can be recruited for it. Personally, I have worked with many teams that used a variety of languages, such as C#, Java and Ruby, because each team opted for the language the whole team was comfortable with.
  4. Automation Framework Support: Another factor to consider when choosing a programming language for automation is the support available. For instance, since Java is the most commonly used language for automation, Selenium has a large community of Java users and hence more automation support. On the other hand, Python and C# probably have less community support for Selenium, so it may be harder to find help when needed. Unless the automation engineers are experts in a particular language, it is very important to weigh this support, especially for open-source tools such as Selenium, when deciding on a language.
Considering all the factors highlighted above can be very useful in deciding on a favorable programming language for the team. Since people favor different programming languages, there is no single jackpot-winner language that satisfies all your needs.

Tuesday, 22 July 2014

23:28 | by Dragan Mestrovik
I've built up quite a comprehensive list of things to think about when test planning. I usually spend an hour or so going through these with the project manager at the start of the project to make sure we have a shared understanding (in my experience the PMs tend to find this really useful).
NB I work for a fairly small organisation that takes on a wide variety of development projects, so the testing needs are often different for each project; hence we need to ask quite a few questions each time.

  • What's the project?
  • Why is this project important to the customer - what are their goals and priorities?
  • What's not in scope? (Anything that we are not planning to test)

  • Do we have wireframes, acceptance criteria broken down by story etc?
  • Can we use static analysis tools?
  • What's the code coverage target for unit testing?
  • What are the main integration points with internal or external systems which might need particular integration testing? e.g. emails, payment providers, data migration etc
  • What are the most important high-level functional and non-functional requirements for system testing? (e.g. performance and reliability might be particularly critical for a certain system, or the user must be able to make a booking, etc)
  • Non-functional requirements checklist: do we need to specifically test any of the following? (This is probably the most important and useful aspect of our test planning, as we often uncover unclear or implicit requirements!): security, accessibility, usability, performance, reliability, software/hardware compatibility (e.g. browsers, OS, mobile devices), resource usage (memory/CPU/battery), installation, backup & restore, maintainability (logging etc)
  • What's the plan for UAT?

Test Environment
  • How are we going to get realistic test data? (This is probably the second most useful aspect of our test planning as it can be a challenge, but is also really important, so needs early planning)
  • What CI/test/staging environments are we going to use?

Schedule, Budget & Reporting
  • What test deliverables and reports do we need to give to the customer?
  • What's the budget, how will progress be tracked?
  • Has the tester been invited to all the relevant team meetings?
  • What are the key test/release milestones?

Risks & dependencies
  • What are the project risks we're aware of? e.g. unrealistic timelines at the end of the project, missing 3rd party dependencies blocking integration testing
  • What are the product risks we're aware of? e.g. areas where spec is unclear, anything particularly hard to cover with automated tests
  • What can we do to mitigate these?

Friday, 4 July 2014

22:37 | by Dragan Mestrovik
Well, here are some tips to create a good database test plan:
1. Database testing can get complex. It may be worth your while if you create a separate test plan specifically for database testing.
2. Look for database related requirements in your requirements documentation. You should specifically look for requirements related to data migration or database performance. A good source for eliciting database requirements is the database design documents.
3. You should plan for testing both the schema and the data.
4. Limit the scope of your database test. Your obvious focus should be on the important test items from a business point of view. For example, if your application is of a financial nature, data accuracy may be critical. If your application is a heavily used web application, the speed and concurrency of database transactions may be very important.
5. Your test environment should include a copy of the database. You may want to design your tests with a test database of small size. However, you should execute your tests on a test database of realistic size and complexity. Further, changes to the test database should be controlled.
6. The team members designing the database tests should be familiar with SQL and database tools specific to your database technology.
7. I find it productive to jot down the main points to cover in the test plan first. Then, I write the test plan. While writing it, if I remember any point that I would like to cover in the test plan, I just add it to my list. Once I cover all the points in the list, I review the test plan section by section. Then, I review the test plan as a whole and submit it for review to others. Others may come back with comments that I then address in the test plan.
8. It is useful to begin with the common sections of the test plan. However, the test plan should be totally customized for its readers and users. Include and exclude information as appropriate. For example, if your defect management process never changes from project to project, you may want to leave it out of the test plan. If you think that query coding standards are applicable to your project, you may want to include it in the test plan (either in the main plan or as an annexure).

Now, let us create a sample database test plan. Realize that it is only a sample. Do not use it as it is. Add or remove sections as appropriate to your project, company or client. Enter as much detail as you think valuable but no more.

For the purpose of our sample, we will choose a database supporting a POS (point of sale) application. We will call our database MyItemsPriceDatabase.


This is the test plan for testing the MyItemsPriceDatabase. MyItemsPriceDatabase is used in our POS application to provide the current prices of the items. There are other databases used by our application e.g. inventory database but these other databases are out of scope of this test.

The purpose of this test plan is to:
1. Outline the overall test approach
2. Identify the activities required in our database test
3. Define deliverables


We have identified that the following items are critical to the success of the MyItemsPriceDatabase:
1. The accuracy of uploaded price information (for accuracy of financial calculations)
2. Its speed (in order to provide quick checkouts)
3. Small size (given the restricted local hard disk space on the POS workstation)

Due to limitation of time, we will not test the pricing reports run on the database. Further, since it is a single-user database, we will not test database security.

Test Approach

1. Price upload test
Price upload tests will focus on the accuracy with which the new prices are updated in the database. Tests will be designed to compare all prices in the incoming XML with the final prices stored in the database. Only the new prices should change in the database after the upload process. The tests will also measure the time per single price update and compare it with the last benchmark.
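Such a comparison can be sketched as follows. This is a minimal, hypothetical illustration: the XML feed format and the `prices` table are assumptions, and an in-memory SQLite database stands in for MyItemsPriceDatabase.

```python
import sqlite3
import xml.etree.ElementTree as ET

def mismatched_prices(xml_text, conn):
    """Compare each item price in the incoming XML feed with the price
    stored in the database; return the ids of items that differ."""
    incoming = {
        item.get("id"): float(item.get("price"))
        for item in ET.fromstring(xml_text).iter("item")
    }
    mismatches = []
    for item_id, expected in incoming.items():
        row = conn.execute(
            "SELECT price FROM prices WHERE item_id = ?", (item_id,)
        ).fetchone()
        if row is None or abs(row[0] - expected) > 1e-9:
            mismatches.append(item_id)
    return mismatches

# Self-contained demo: an in-memory database stands in for the real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (item_id TEXT PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [("A1", 2.50), ("B2", 9.99)])
feed = '<prices><item id="A1" price="2.50"/><item id="B2" price="10.99"/></prices>'
print(mismatched_prices(feed, conn))  # B2 was not updated correctly -> ['B2']
```

A real test would run the upload process first and also assert that no price outside the feed changed.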

2. Speed test
After analyzing the data provided to us from the field, we have identified the following n queries that are used most of the time. We will run the queries individually (10 times each) and compare their mean execution times with the last benchmark. Further, we will also run all the queries concurrently (in sets of 2 and 3 (based on the maximum number of concurrent checkouts)) to find out any locking issues.
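The timing loop for one query might look like the sketch below; the table, the query and the benchmark figure are hypothetical stand-ins.

```python
import sqlite3
import statistics
import time

def mean_query_time(conn, sql, params=(), runs=10):
    """Execute one query `runs` times and return the mean wall-clock time."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        conn.execute(sql, params).fetchall()  # fetch so the query fully runs
        timings.append(time.perf_counter() - start)
    return statistics.mean(timings)

# Demo against an in-memory stand-in database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (item_id TEXT PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [(f"I{i}", i * 0.1) for i in range(1000)])

benchmark = 0.05  # seconds; hypothetical figure from the last benchmark run
mean = mean_query_time(conn, "SELECT * FROM prices WHERE price > ?", (50,))
print(f"mean={mean:.6f}s, within benchmark: {mean <= benchmark}")
```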

3. Size test
Using SQL queries, we will review the application queries and find out the following:
a. Items which are never used (e.g. tables, views, queries (stored procedures, in-line queries and dynamic queries))
b. Duplicate data in any table
c. Excessive field width in any table
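The duplicate-data check (item b) is typically a GROUP BY ... HAVING query. A sketch against an in-memory SQLite stand-in, with a hypothetical `prices` table:

```python
import sqlite3

def duplicate_rows(conn, table, columns):
    """Return value tuples that appear more than once in `table`,
    using the GROUP BY ... HAVING COUNT(*) > 1 idiom."""
    cols = ", ".join(columns)
    sql = (f"SELECT {cols}, COUNT(*) FROM {table} "
           f"GROUP BY {cols} HAVING COUNT(*) > 1")
    return [row[:-1] for row in conn.execute(sql)]  # drop the count column

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (item_id TEXT, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [("A1", 2.5), ("A1", 2.5), ("B2", 9.99)])
print(duplicate_rows(conn, "prices", ["item_id", "price"]))  # [('A1', 2.5)]
```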

Test Environment

The xyz tool will be used to design and execute all database tests. The tests will be executed on the local tester workstations (p in all).

Test Activities and Schedule
1. Review requirements xx/xx/xxxx (start) and xx/xx/xxxx (end)
2. Develop test queries
3. Review test queries
4. Execute size test
5. Execute price upload test
6. Execute speed test
7. Report test results (daily)
8. Submit bug reports and re-test (as required)
9. Submit final test report

1. Test lead: Responsible for creating this test plan, assigning and reviewing work, reviewing test queries, reviewing and compiling test results, and reviewing bug reports
2. Tester: Responsible for reviewing requirements, developing and testing test queries, executing tests, preparing individual test results, submitting bug reports and re-testing


The testers will produce the following deliverables:
1. Test queries
2. Test results (describing the tests run, run time and pass/ fail for each test)
3. Bug reports


The risks to the successful implementation of this test plan, and their mitigation, are as follows:

       Name        Role        Signature        Date
1. ____________________________________________________________
2. ____________________________________________________________
3. ____________________________________________________________
22:36 | by Dragan Mestrovik
Database migration testing is needed when you move data from the old database(s) to a new database. The old database is called the legacy database or the source database, and the new database is called the target database or the destination database. Database migration may be done manually, but it is more common to use an automated ETL (Extract-Transform-Load) process to move the data. In addition to mapping the old data structure to the new one, the ETL tool may incorporate certain business rules to increase the quality of the data moved to the target database.

Now, the question arises regarding the scope of your database migration testing. Here are the things that you may want to test.
1. All the live (not expired) entities e.g. customer records, order records are loaded into the target database. Each entity should be loaded just once i.e. there should not be a duplication of entities.
2. Every attribute (present in the source database) of every entity (present in the source database) is loaded into the target database.
3. All data related to a particular entity is loaded in each relevant table in the target database.
4. Each required business rule is implemented correctly in the ETL tool.
5. The data migration process performs reasonably fast and without any major bottleneck.
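The first two checks, every live entity loaded and loaded exactly once, can be sketched as below. Two in-memory SQLite databases act as hypothetical source and target, and the `customers` table is an assumption.

```python
import sqlite3

def check_entity_migration(src, dst, table, key):
    """Return (missing_keys, duplicated_keys) for one migrated entity table."""
    src_keys = {r[0] for r in src.execute(f"SELECT {key} FROM {table}")}
    dst_rows = [r[0] for r in dst.execute(f"SELECT {key} FROM {table}")]
    missing = sorted(src_keys - set(dst_rows))           # check 1
    duplicated = sorted({k for k in dst_rows             # check 2
                         if dst_rows.count(k) > 1})
    return missing, duplicated

# Demo: customer 3 never arrived, customer 1 was loaded twice.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Ann"), (2, "Bob"), (3, "Cid")])
dst.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Ann"), (1, "Ann"), (2, "Bob")])
print(check_entity_migration(src, dst, "customers", "id"))  # ([3], [1])
```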

Next, let us see the challenges that you may face in database migration testing.
1. The data in the source database(s) changes during the test.
2. Some source data is corrupt.
3. The mappings between the tables/fields of the source database(s) and the target database are changed by the database development/migration team.
4. A part of the data is rejected by the target database.
5. Due to the slow database migration process or the large size of the source data, it takes a long time for the data to be migrated.

The test approach for database migration testing consists of the following activities:

I. Design the validation tests
In order to test database migration, you need to use SQL queries (created either by hand or using a tool e.g. a query creator). You need to create the validation queries to run against both the source as well as the target databases. Your validation queries should cover the scope defined by you. It is common to arrange the validation queries in a hierarchy e.g. you want to test if all the Orders records have migrated before you test for all OrderDetails records. Put logging statements within your queries for the purpose of effective analysis and bug reporting later.
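A hierarchical validation runner with logging might be sketched like this. The table names (`orders`, `order_details`) and the count-based checks are hypothetical; a real suite would compare much more than row counts.

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

def run_validations(src, dst, checks):
    """Run (name, sql) checks in hierarchical order against both databases;
    stop at the first failure, since child checks are meaningless if the
    parent entity did not migrate."""
    for name, sql in checks:
        src_count = src.execute(sql).fetchone()[0]
        dst_count = dst.execute(sql).fetchone()[0]
        if src_count != dst_count:
            logging.error("%s: source=%d target=%d", name, src_count, dst_count)
            return name  # first failing check
        logging.info("%s: OK (%d rows)", name, src_count)
    return None

# Demo: Orders are validated before OrderDetails.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE orders (id INTEGER)")
    db.execute("CREATE TABLE order_details (id INTEGER)")
src.executemany("INSERT INTO orders VALUES (?)", [(1,), (2,)])
dst.execute("INSERT INTO orders VALUES (1)")  # one order missing
failed = run_validations(src, dst, [
    ("Orders row count", "SELECT COUNT(*) FROM orders"),
    ("OrderDetails row count", "SELECT COUNT(*) FROM order_details"),
])
print(failed)  # Orders row count
```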

II. Set up the test environment
The test environment should contain a copy of the source database, the ETL tool (if applicable) and a clean copy of the target database. You should isolate the test environment so that it does not change externally.

III. Run your validation tests
Depending on your test design, you need not wait for the database migration process to finish before you start your tests.

IV. Report the bugs
You should report the following data for each failed test:
    a. Name of the entity that failed the test
    b. Number of rows or columns that failed the test
    c. If applicable, the database error details (error number and error description)
    d. Validation query
    e. User account under which you ran your validation test
    f. Date and time the test was run

Keep the tips below in mind to refine your test approach:

1. You should take a backup of the current copies of the source and target databases. This would help you in case you need to re-start your test. This would also help you in reproducing any bugs.
2. If some source data is corrupt (e.g. unreadable or incomplete), you should find out if the ETL tool takes any action on such data. If so, your validation tests should confirm these actions. The ETL tool should not simply accept the corrupt data as such.
3. If the mappings between the tables/ fields of the source and target databases are changed frequently, you should first test the stable mappings.
4. In order to find out the point of failure quickly, you should create modular validation tests. If your tests are modular, it may be possible for you to execute some of your tests before the data migration process finishes. Running some tests while the data migration process is still running would save you time.
5. If the database migration process is manual, you have to run your validation queries externally. However, if the process uses an ETL tool, you have the choice to integrate your validation queries within the ETL tool.

I hope that you are now comfortable with the concept of database migration testing, whether the data is migrated between binary files and an RDBMS or between RDBMSs (Oracle, SQL Server, Informix or Sybase). In your view, what is the main problem faced while testing database migration? What is a good way to handle it?
22:14 | by Dragan Mestrovik
Many (but not all) applications under test use one or more databases. The purposes of using a database include long-term storage of data in an accessible and organized form. Many people have only a vague idea about database testing.

Firstly, we need to understand what database testing is. As you know, a database has two main parts: the data structures (the schema) that store the data, and the data itself. Let us discuss them one by one.

The data is stored in the database in tables. However, tables may not be the only objects in the database; a database may also have other objects like views, stored procedures and functions, which help users access the data in the required forms. Database testing involves finding out the answers to the following questions:

Questions related to database structure
1. Is the data organized well logically?
2. Does the database perform well?
3. Do the database objects like views, triggers, stored procedures, functions and jobs work correctly?
4. Does the database implement constraints to allow only correct data to be stored in it?
5. Is the data secure from unauthorized access?

Questions related to data
1. Is the data complete?
2. Is all data factually correct i.e. in sync with its source, for example the data entered by a user via the application UI?
3. Is there any unnecessary data present?

Now that we understand database testing, it is important to know about the 5 common challenges seen before or during database testing:

1. Large scope of testing
It is important to identify the test items in database testing. Otherwise, you may not have a clear understanding of what you would and would not test, and you could run out of time long before finishing the database test.
Once you have the list of test items, you should estimate the effort required to design the tests and execute the tests for each test item. Depending on their design and data size, some database tests may take a long time to execute. Look at the test estimates in light of the available time. If you do not have enough time, you should select only the important test items for your database test.

2. Incorrect/ scaled-down test databases
You may be given a copy of the development database to test. This database may have only a little data (what is required to run the application, plus some sample data to show in the application UI). Testing the development, test or staging databases may not be sufficient; you should also test a copy of the production database.

3. Changes in database schema and data
This is a particularly nasty challenge. You may find that after you design a test (or even after you execute a test), the database structure (the schema) has been changed. This means that you should be aware of the changes made to the database during testing. Once the database structure changes, you should analyze the impact of the changes and modify any impacted tests.
Further, if your test database is being used by other users, you would not be sure about your test results. Therefore, you should ensure that the test database is used for testing purpose only.
You may also see this problem if you run multiple tests at the same time. You should run one test at a time at least for the performance tests. You do not want your database performing multiple tasks and under-reporting performance.

4. Messy testing
Database testing may get complex. You do not want to be executing tests partially or repeating tests unnecessarily. You should create a test plan and proceed accordingly while carefully noting your progress.

5. Lack of skills
The lack of the required skills may really slow things down. In order to perform database testing effectively, you should be comfortable with SQL queries and the required database management tools.

Next, let us discuss the approach for database testing. You should keep the scope of your test as well as the challenges in mind while designing your particular test design and test execution approach. Note the following 10 tips:

1. List all database-specific requirements. You should gather the requirements from all sources, particularly technical requirements. It is quite possible that some requirements are at a high level. Break down those requirements into small, testable requirements.

2. Create test scenarios for each requirement as suggested below.

3. In order to check the logical database design, ensure that each entity in the application (e.g. actors, system configuration) is represented in the database. An application entity may be represented in one or more tables in the database. The database should contain only those tables that are required to represent the application entities, and no more.

4. In order to check the database performance, you may focus on its throughput and response times. For example, if the database is supposed to insert 1000 customer records per minute, you may design a query that inserts 1000 customer records and print/store the time taken to do so. If the database is supposed to execute a stored procedure in under 5 seconds, you may design a query to execute the stored procedure with sample test data multiple times and note each time.
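The first throughput check might be sketched as below, using Python's sqlite3 module with an in-memory database and a hypothetical `customers` table; the 1000-records-per-minute figure is the example requirement from the text.

```python
import sqlite3
import time

def insert_throughput(conn, n=1000):
    """Insert n customer records in one transaction; return rows per second."""
    rows = [(i, f"customer-{i}") for i in range(n)]
    start = time.perf_counter()
    with conn:  # commits on success
        conn.executemany("INSERT INTO customers VALUES (?, ?)", rows)
    elapsed = time.perf_counter() - start
    return n / elapsed

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
rate = insert_throughput(conn)
print(f"{rate:.0f} inserts/second")
# Requirement check: at least 1000 records per minute.
assert rate * 60 >= 1000
```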

5. If you wish to test the database objects e.g. stored procedures, you should remember that a stored procedure may be thought of as a simple program that (optionally) accepts certain input(s) and produces some output. You should design test data to exercise the stored procedure in interesting ways and predict the output of the stored procedure for every test data set.

6. In order to check database constraints, you should design invalid test data sets and then try to insert/ update them in the database. An example of an invalid data set is an order for a customer that does not exist. Another example is a customer test data set with an invalid ZIP code.
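Both example checks, the order for a non-existent customer and the invalid ZIP code, can be sketched with SQLite; the schema below is a hypothetical stand-in.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, "
             "zip TEXT CHECK (length(zip) = 5))")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
             "customer_id INTEGER REFERENCES customers(id))")
conn.execute("INSERT INTO customers VALUES (1, '90210')")

def rejects(sql, params):
    """True if the database rejects the row -- the expected result
    for invalid test data."""
    try:
        conn.execute(sql, params)
        return False
    except sqlite3.IntegrityError:
        return True

# An order for a customer that does not exist must be rejected...
print(rejects("INSERT INTO orders VALUES (1, ?)", (999,)))       # True
# ...and so must a customer with an invalid ZIP code.
print(rejects("INSERT INTO customers VALUES (2, ?)", ("123",)))  # True
```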

7. In order to check database security, you should design tests that mimic unauthorized access. For example, log in to the database as a user with restricted access and check whether you can view, modify or delete restricted database objects, or view and update restricted data. It is important to back up your database before executing any database security tests; otherwise, you may render your database unusable.
You should also check that any confidential data in the database, e.g. credit card numbers, is either encrypted or obfuscated (masked).

8. In order to test data integrity, you should design valid test data sets for each application entity. Insert or update a valid test data set (for example, a customer) and check that the data has been stored in the correct table(s) and columns. Each item in the test data set should have been inserted or updated in the database. Further, the test data set should be inserted only once, and there should be no other change to the remaining data.
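Such a round-trip integrity check can be sketched as follows, with a hypothetical `customers` table in an in-memory SQLite database.

```python
import sqlite3

def insert_customer(conn, customer):
    """Application-style insert for one customer entity."""
    with conn:
        conn.execute("INSERT INTO customers (id, name, zip) VALUES (?, ?, ?)",
                     (customer["id"], customer["name"], customer["zip"]))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, "
             "name TEXT, zip TEXT)")

before = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
insert_customer(conn, {"id": 1, "name": "Ann", "zip": "90210"})

# Every field landed in the right column...
row = conn.execute("SELECT id, name, zip FROM customers WHERE id = 1").fetchone()
print(row)             # (1, 'Ann', '90210')
# ...and exactly one row was added.
after = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(after - before)  # 1
```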

9. Since your test design requires creating SQL queries, try to keep your queries as simple as possible to prevent defects in them. It is a good idea for someone other than the author to review the queries. You should also dynamically test each query. One way is to modify the query so that it just shows the result set and does not perform the actual operation (e.g. insert or delete). Another way is to run the query for a couple of iterations and verify the results.
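The first technique, previewing the rows a destructive statement would touch, might look like this sketch (hypothetical `customers` table, SQLite):

```python
import sqlite3

def preview(conn, where_clause, params):
    """Dry-run a DELETE by showing the rows its WHERE clause would match."""
    return conn.execute(
        f"SELECT * FROM customers WHERE {where_clause}", params).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, active INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, 0), (2, 1), (3, 0)])

# Inspect what "DELETE FROM customers WHERE active = ?" would remove
# before ever running the destructive statement.
doomed = preview(conn, "active = ?", (0,))
print(doomed)  # [(1, 0), (3, 0)]
```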

10. If you are going to have a large number of tests, you should pay special attention to organizing them. You should also consider at least partial automation of frequently run tests.

Now you should know what database testing is all about, the problems that you are likely to face while doing database testing and how to design a good database test approach for the scope decided by you.
22:11 | by Dragan Mestrovik
TestComplete is one of the best automation tools available in the market. With increasing demand and changes in technology, the tool has secured a significant place, and we now see lots of openings for automation engineers with TestComplete skills. This post contains some commonly asked TestComplete questions from different multinational organizations. We hope it helps you prepare.
Please note that some of the questions here assume VBScript as the scripting language in TestComplete.
1. A general question could be: What is TestComplete and how does it work? Or: What do you know about TestComplete?
2. Explain the name mapping concept in TestComplete.
3. Which versions of TestComplete have you used?
4. Have you worked with QTP as well? If yes, what are the advantages of TestComplete over QTP?
5. What is the use of the USEUNIT statement in TestComplete?
6. How can descriptive programming be done in TestComplete?
7. Suppose an object on a web page, say a button, has a hierarchy that changes every time you open the page. Apart from descriptive programming, is there any way to identify that object on the page? If yes, how?
8. Elaborate Object Identification Mechanism in TestComplete.
9. Difference between the Find and FindAll methods.
10. If an unexpected window pops up while your script is running in TestComplete, how can you handle it?
11. What is distributed testing and how can it be achieved using TestComplete?
12. Regular expressions in TestComplete.
13. Which scripting languages can be used in TestComplete?
14. What is the purpose of TestedApps in TestComplete?
15. How can you launch an application that has been added to TestedApps from your scripts?
16. Different ways of capturing objects in TestComplete.
17. Is it possible to record and play back tests in TestComplete?
18. Which browsers does TestComplete support to date?
19. Syntax for highlighting objects on a web page.
20. Steps for calling functions located in one script file from another.
Hope these questions are helpful to you.
Happy Learning!!!