How to Set Up an Automated Reminder in Tick Spot


Timesheets are mandatory in most organizations, and they help reduce the frequency of disputes between employees and supervisors/employers. They are also a very effective way to track cost, making business accounting more accurate and error-free.

Incomplete timesheets slow down our business. Employees spend longer trying to recall their jobs and hours, managers spend time chasing down incomplete timesheets, and accounting gets delayed. I understand these problems, and that is why I came up with this technique for using the Task Scheduler to open Tick Spot automatically, reminding people to complete timesheets on time. Follow a few simple steps to set it up on your PC or laptop. In one of my projects at Zen Test Labs, I implemented this automated timesheet reminder; it helped not only the individuals working on the project but also the business in making correct effort calculations.

Setting up an Automated Reminder in Tick Spot:  

Once it is set up, every evening (at the EOD time you specify), your browser opens with the Tick Spot login screen.

Step 1

  • Create “timesheet.bat”
  • Open Notepad and type the following (the URL is a placeholder; replace it with your company’s Tick Spot login page):
    • @echo off
    • start /max iexplore.exe https://yourcompany.tickspot.com
  • Save the file as “timesheet.bat”

Step 2

Copy the file “timesheet.bat” to your preferred location

Step 3

Open the Run window and type “taskschd.msc” to launch Task Scheduler


Step 4: Add Batch Action

Create a new task, select the “Actions” tab and click “New”


Step 5: Add Batch File

To add the batch file, click “Browse” and select “timesheet.bat” from the location where you saved it


Step 6: Setup Trigger

Select the “Triggers” tab and configure the daily schedule:

Click “New” –> select the “Daily” radio button –> set the time –> check the “Enabled” checkbox –> click “OK”


Your automated reminder has been set up! The great thing about this approach is that it requires no manual intervention and places negligible load on system resources. You can extend this technique to other reminders too.
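
If you would rather not hard-code Internet Explorer, here is a minimal Python sketch that does the same job; point the Task Scheduler action at this script instead of the batch file. The URL is a placeholder, not a real account address:

    # timesheet_reminder.py - open the Tick Spot login page in the default browser
    import webbrowser

    TICK_SPOT_URL = "https://yourcompany.tickspot.com"  # placeholder account URL

    if __name__ == "__main__":
        webbrowser.open(TICK_SPOT_URL)  # launches the system default browser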

Mukund Wangikar | Zen Test Labs


Communication - A Key Skill to Excel at Testing

Enabling better communication is not a one-time activity; it requires continuous effort across the company. Good internal and external communication is extremely important to a business’s success. In order to work together effectively, there must be clear and coherent communication among all departments.

Here are a few scenarios wherein communication gaps may arise and lead to poor quality:

1. Continuously Changing Requirements:
At times, requirement changes are implemented directly without updating the specification document. In such cases, there is a chance that the changed requirements remain untested or are tested incorrectly.
Any change in the requirements should be communicated clearly to all stakeholders, and the specification document should be updated in a timely manner.

2. Configurations:
Lack of clarity from the stakeholders on the configurations to be tested can lead to wasted effort and extra work. Configuration testing can be expensive and time consuming. Investment in hardware and software is required to test the different permutations and combinations. There is also the cost of test execution, test reporting and managing the infrastructure.
Increasing communication between development, QA and stakeholders can help deal with these challenges.

3. Team Size:
When team sizes are large, some members of the team may miss changes in requirements or may not receive updates on project activities. This can lead to severe problems or even project failure. Each team member should be kept abreast of project activities through a log or other means.

4. Changes in Application Behavior Not Communicated:
Continuous changes in application behavior may lead to requirements being tested incorrectly. All the functionality implemented in the application should be frozen while testing; if any changes are made to the functionality, they should be communicated to the testing team in a timely manner.

5. Unclear Requirements:
Complex requirements that contain insufficient data may be difficult to understand and therefore, may lead to improper testing. The functional/technical specification documents should be clear and easy to understand; they should contain a glossary, screenshots and examples wherever necessary.

The path to project success lies in eliminating small communication problems before they build up, so that every message is delivered correctly and completely. Instead of merely discovering problems, we should figure out how to stop them from appearing in the first place.

Poonam Rathi | Test Consultant | Zen Test Labs

Reducing dependence on automation engineers to manage test automation!

I have always wondered what it would be like to separate test automation from automation engineers, considering that test automation has always been treated as the holy grail of testing! Enterprises that have achieved high levels of automation in the testing process have enhanced productivity exponentially while improving coverage and thus reducing risk. This has translated into automation engineers holding design approaches close to their hearts and controlling scripting tightly. Given this dynamic, adoption of test automation has remained low over the years.

Test automation today has transitioned from a “Record and Playback” mode to a virtually “Scriptless” mode, enabling rapid, on-the-go test automation.

This has allowed enterprises to automate testing without worrying about tool-specific coding, making automation suites maintainable and resource independent. However, scriptless automation frameworks still have many missing links. For example, most of them demand extensive business-user involvement, particularly to test the technology enablement, and time to market can stretch beyond what is acceptable. One major cause is extensive manual testing of the solution, which creates heavy dependence on business analysts (from business or IT) for test design and execution. There is also a strong dependence on skilled and expensive technical resources for automation, and spikes in demand for QA resources must be managed, which drives up QA costs.

Considering these dynamics, the next stage in the evolution of test automation is moving in the direction of business-process-model-based test automation, which aims at synchronizing the Operations, Product Management and Quality functions.

At Zen Test Labs we are innovating with multiple products in this space. Our flagship test automation framework, ZenFRAME, is one such example. ZenFRAME improves BA and business tester productivity while reducing dependence on technology teams by up to 40%. The GUI enables non-technical users to create automated test cases faster, resulting in close to 33% less creation time. Read our whitepaper to learn how you can implement a business process model for your QA environment. Would love to hear thoughts from everyone…

Ravikiran Indore | Sr. Consultant | Zen Test Labs


Top 6 solutions for software testing failures

Software testing is still not valued at its worth. Although it is a critical investment, companies avoid spending on testing because they do not see the ROI on testing and/or a quantifiable cost of quality. The most common complaints against testing that we repeatedly hear are:

  • It is a necessary evil that stalls a project the closer it gets to a release
  • It is too costly, time consuming without any guaranteed outcome
  • Regression testing is often not effective at identifying new defects

Having worked on a number of testing projects over the past 12 years, I understand why there is such a strong tendency to look at testing with a skeptical eye. I would like to share what we have learnt over time. The top six points, in our view, to improve the effectiveness of manual testing are:

6. Reducing effort and time in Test Documentation

A lot of teams spend unnecessary time during the planning phase detailing test scenarios that are rarely referred to after two or three rounds of testing. This increases maintenance overhead and reduces flexibility and coverage in the long run, resulting in inefficient testing. After the initial 6-8 months, a large percentage of test scenarios are outdated and require the same effort to update. Instead of detailing each and every step for every test scenario, one can cover it with test conditions and expected results. For example, “login with an expired password → system prompts for a reset” captures as a single condition what might otherwise be a ten-step script.

5. Focusing on breadth and depth of testing

When execution is not prioritized, the depth of testing often takes the lead over breadth. Breadth refers to covering the positive, critical cases across the application that end users exercise most frequently; depth refers to covering all the test cases for a module. By aiming to cover more breadth first, we align testing with business objectives: the team aims at being effective first and efficient second.

4. Testing, a continuous activity

Many companies look at testing as a one-time investment. They outsource or execute it in-house once during the start of product development and then rarely test during the maintenance phases. The reason is invariably budget driven, and it goes on to harm the quality of the product when newer versions are not tested. For every minor release, one should ensure all the regression test cases are executed, and for every major release, all the high and medium priority test cases should be executed at least once.

3. Remembering the objective of testing

The key objective of testing is to break the system, not to prove that the system works as per the requirements. This has a direct impact on testing effectiveness and the number of defects one will find. It is often observed that many senior testers habitually set out to prove that the system works as per the requirements, which is against the primary objective of testing.

2. Strategizing test optimization

Coverage is important, but not at the cost of redundant test cases. Test optimization is an intelligent way to ensure test coverage in less time, which is why testing teams need to collaborate more with development teams. Understanding the high-level design and structure of the application makes testing more effective. One of the main principles followed in development is reuse, and we can apply the same principle when testing reused code: optimize by testing the class/object once, and then test only the implementation of that class/object on other screens/modules, as sketched below. If the test cases are reusable, maintainable and scalable, it is an additional advantage in rolling out on time and under budget.
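
Here is a minimal sketch of this idea using pytest, with a hypothetical DateWidget component and hypothetical screen names: the shared class gets one deep test suite, while each screen that reuses it gets only a shallow wiring check.

    # test_reuse.py - optimizing tests around a reused component (requires pytest)
    import pytest
    from datetime import datetime

    class DateWidget:                               # stand-in for a reused UI component
        def parse(self, text):
            return datetime.strptime(text, "%Y-%m-%d").date()

    @pytest.mark.parametrize("text", ["2024-01-31", "1999-12-01"])
    def test_date_widget_parses_valid_dates(text):  # deep tests, written once
        assert DateWidget().parse(text).isoformat() == text

    def test_date_widget_rejects_garbage():         # deep negative test, written once
        with pytest.raises(ValueError):
            DateWidget().parse("not-a-date")

    @pytest.mark.parametrize("screen", ["billing", "reports", "profile"])
    def test_screen_wires_in_date_widget(screen):   # shallow per-screen check only
        # In a real suite this would open the screen and confirm the widget is
        # present; the heavy parsing logic is already covered by the tests above.
        assert DateWidget().parse("2024-01-31").year == 2024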

1. Focusing on the Business for which you are testing

Testing cannot be done in isolation. Business priorities and challenges are equally important, and in most cases more important, than testing needs. One thing I have learnt is that testing cannot drive business decisions; business drives testing most of the time. Aligning testing to business requirements results in a disciplined, ready-to-market, high-quality product.

These are some of the solutions with which I have overcome testing failures. Do share yours if you have other solutions or methods.

Mukesh Mulchandani | CTO | Zen Test Labs

Verifying 800 Million data sets in record time!

I was recently fortunate to be part of a unique project at Zen Test Labs: a post-merger scenario wherein the acquirer (a bank) had to consolidate the customer information systems of two banks into a single system. This meant mapping the acquired bank’s product, service and customer portfolio to a new and modified version of the acquirer’s products and services.

Among many other factors, ensuring seamless service to the acquired bank’s existing customers meant that those customers should not face an undue increase in service charges. Processing customer data on the enhanced systems required that service fees stay within the threshold a customer would expect in the normal course of business. Testing for “go live” was tricky, since for each acquired customer the bank had to compare the “go live” results with that customer’s historical data. With hundreds of thousands of customers and millions of transactions in a month, manual verification was a gigantic, potentially impossible task.

Zen Test Labs addressed this situation creatively by leveraging its data migration testing framework and extending it to include customer-specific scenarios. Each data component of the source and target data files was mapped, rules were applied, and everything was integrated into the testing framework. A utility was then designed to pick each record from the source, apply the migration logic, and check whether the corresponding value in the target file was within the tolerance level defined by that logic. During execution, the selected components from the imported source and target data were compared and flagged if they did not meet the tolerance levels. Once all the records were compared, the utility reported:

  1. All transactions migrated as per the logic
  2. All transactions which did not meet the tolerance criteria
  3. Transactions in the target database which did not have any relation with the migration process
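
A minimal sketch of such a record-by-record comparison, with hypothetical field names, file layout and tolerance (the actual framework mapped many data components per record):

    # compare_migration.py - sketch of a source-vs-target migration check
    import csv

    TOLERANCE = 0.05                      # assumed 5% band on migrated fees

    def migrate_fee(source_fee):
        """Stand-in for the documented migration logic."""
        return source_fee * 1.02          # hypothetical rule: 2% uplift

    def compare(source_path, target_path):
        with open(source_path, newline="") as s, open(target_path, newline="") as t:
            source = {r["customer_id"]: float(r["fee"]) for r in csv.DictReader(s)}
            target = {r["customer_id"]: float(r["fee"]) for r in csv.DictReader(t)}

        passed, failed = [], []
        for cid, fee in source.items():
            expected = migrate_fee(fee)
            actual = target.get(cid)
            if actual is not None and abs(actual - expected) <= TOLERANCE * expected:
                passed.append(cid)        # migrated as per the logic
            else:
                failed.append(cid)        # outside tolerance, or missing in target
        orphans = [cid for cid in target if cid not in source]  # unrelated to migration
        return passed, failed, orphans    # the three buckets reported above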

Testing of the framework and the utility itself adopted a three-layer approach:

  1. Utility testing using dummy data for source, target and the mapping
  2. Sampling of output files and manual verification with real data
  3. Verification against “thumb rules”; for example, the number of pass records plus the number of fail records should equal the count of primary keys in the source data
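
In terms of the sketch above, that thumb rule reduces to a one-line assertion over the utility’s output:

    passed, failed, orphans = compare("source.csv", "target.csv")
    with open("source.csv", newline="") as s:
        source_count = sum(1 for _ in s) - 1          # data rows, minus header
    assert len(passed) + len(failed) == source_count, "records unaccounted for"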

Overall, I found this project very challenging and interesting. Leveraging the data migration testing framework, we created a comprehensive utility in approximately three weeks. The utility was so fast that it compared one data component across 600,000 to 700,000 records in 10 to 12 minutes. The total number of data values verified in this project was over 800 million in a span of 30 days, which is as good as verifying at least one data value for every person in the European Union! With our output files we also provided the bank a great deal of profiled data on migrated customers, which was used to understand their behavioral patterns and the performance of the products after migration.

Ravikiran Indore | Sr. Consultant | Zen Test Labs

Control charts for beginners

In this blog, I am going to attempt to keep the topic of control charts very simple.

In my view, the very definition one starts from can go a long way in determining one’s approach to control charts.

A complicated definition of a control chart is: “A run chart with confidence intervals calculated and drawn in. These statistical control limits form the trip wires which enable us to determine when a process characteristic is operating under the influence of a special cause.”

What I mean by a simple definition is this: a control chart is a set of data points plotted chronologically in time sequence, with three horizontal lines drawn on the chart; i.e.,

  • a centre line, drawn at the process mean;
  • an upper control limit (also called an upper natural process limit), drawn three standard deviations above the centre line;
  • a lower control limit (also called a lower natural process limit), drawn three standard deviations below the centre line.

The above definition is for an X-bar control chart, also called a Shewhart control chart (named after its inventor). The whole idea of such a chart using 3-sigma control limits is that data points will fall inside the 3-sigma limits 99.7% of the time when the process is in control. Thus one can say that anything beyond the control limits requires investigation. So how do you draw one?

Let us assume you test weekly builds: you receive a build on Monday and have to finish by Friday EOD. You notice that sometimes you are late by two days and sometimes early by a day, and you have 30 weeks of such data. How can you know what your process variation is? Every process varies, so you may say it is all right to be one day late or one day early. But mathematically, how do you decide whether the allowance should be one day or two? And, more importantly, was there a week where something abnormal happened in your process that requires more investigation? This is where a control chart can help.
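
Here is a minimal sketch of the calculation with hypothetical slippage data for 30 weekly builds. (A textbook individuals chart estimates sigma from the moving range; this simplified version uses the sample standard deviation, in line with the 3-sigma description above.)

    # control_limits.py - 3-sigma control limits for weekly schedule slippage
    import statistics

    # hypothetical slippage in days for 30 weekly builds (+ late, - early)
    slippage = [1, -1, 0, 2, 1, 0, -2, 1, 0, 1, 15, 0, -1, 1, 2,
                0, 1, -1, 0, 2, 1, 0, -14, 1, 0, 1, 2, -1, 0, 1]

    mean = statistics.mean(slippage)          # centre line
    sigma = statistics.stdev(slippage)        # sample standard deviation
    ucl = mean + 3 * sigma                    # upper control limit
    lcl = mean - 3 * sigma                    # lower control limit

    out_of_control = [(week + 1, days) for week, days in enumerate(slippage)
                      if days > ucl or days < lcl]
    print(f"mean={mean:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
    print("weeks to investigate:", out_of_control)   # the special-cause weeks

Rerunning the same calculation with the flagged weeks removed shows how much the limits tighten once special causes are taken out.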

The sample chart below was drawn in Excel, without any special add-on.

[sample_control_chart]

From the chart you will notice that there are two data points (two weeks) beyond the control limits, and those are what you should investigate further.

As you plot the chart for further weeks, your process is stable if all data points stay within the control limits; it is capable if you are able to narrow the control limits (reduce the variation). Going by the first 30 weeks of data, the variation is so wide that you could be as much as 15 days late or early, which is clearly unacceptable. So draw the control chart again with the special causes (the two abnormal weeks) removed, and see how capable your process is and whether the variation reduces over time.

This was just a precursor. Read up on the types of control charts and on common and special causes of variation; the internet is full of examples. Then apply control charts to metrics other than schedule slippage to bring your testing process under control.

Krishna Iyer | CEO | Zen Test Labs