Wednesday, December 29, 2010

Sayonara 2010 - Welcome 2011

2010 - it's been quite an eventful year, both personally and professionally. The year started with something quite probing, i.e. testing a middleware application in a span of five days. I was asked to test the service layer of the application, and I am glad I did, as I got to learn a few more tools like SoapUI for testing WSDL-based and RESTful web services. If 2009 was about learning RapidSQL, then 2010 was about getting comfortable with DBArtisan. I also picked up Business Process Testing (BPT) using Quality Center and QuickTest Professional, both from HP.
In the last 4-5 months, I got to learn the iTKO LISA test automation tool, which I found good for middleware and back-end test automation (or automated checks). It's not easy to use, for sure; at first it looks like a configuration management tool, and to make learning a little harder, the newer version of the tool does not ship updated help documents. I hope that going into the new year the tool owners will improve the help documentation, because if they want a large user base they will have to help users feel at ease with LISA.
Then came Agile, and for the first two sprints I was kind of confused, so I picked up a book titled "Agile Testing: A Practical Guide for Testers and Agile Teams" by Lisa Crispin and Janet Gregory. It proved to be of immense value and helped me BIG time. So now I have experience in an Agile environment (and methodology) as well. A few things I really liked about Agile are the quick feedback and the lighter documentation. In a few weeks, I'll put up a separate post on my experience with Agile as a tester.
These are a few of the important things I learned in 2010. More importantly, I networked heavily and read a number of books recommended by James Marcus Bach. I hope to finish reading (and learning from) all of them by the end of 2011.
There have been some disturbances on the personal front as well, but all's well that ends well. I am sure to start 2011 with a bang; a lot of things will change with the start of the new year on both of the 'P' fronts. I hope to network with a lot more people in the testing arena and share and learn with them. Oh yes, I have a few New Year resolutions too: I will blog more frequently than I have been doing in the last 1.5 years. There are a few more that I'll not talk about.
Wishing you all a very happy and prosperous 2011. This is Rahul Gupta signing off for 2010.
C Ya!!!

Sunday, December 26, 2010

Will Moolya redefine meaning of "Value"?


A few days back, Pradeep Soundararajan officially launched Moolya Brainual Software Testing Services Private Limited on his famous blog. Moolya means Value. I was one of the few people who had been aware of this development for at least two months; I got an opportunity to meet Pradeep at this year's GTAC in Hyderabad. So, is it just another software testing company that promises big? No, at least not to me and to the many others who have been following Pradeep's blog for over two years now.
I am no expert on business opportunities, market conditions, services offered and so on, but I certainly know that there are plenty of money-spinning machines in the world of software testing. Think of anything related to software testing and a few names will pop up in your head. Here is a simple exercise: certifications in software testing, easy-to-use automation tools, quality/standards certifying boards (think of two names for each). See, you could rightly guess a few names there.
I have visited Moolya's site many times since its public appearance. Every time I visit, some innovative (read: different) approach becomes apparent, whether it is the hiring process or the header of the About Us page, which boldly says "Curious to know about us?".
I spent considerable time thinking about this venture of Pradeep and Santhosh. It looks promising on many fronts: I feel a fresher there will not spend too much time mugging up definitions; it will be learning on the job and practicing that learning on weekends. People will not have managers who ask them to go for some sort of certification to prove their worth. People will not spend hours filling up Excel sheets with irrelevant metrics to prove test coverage (or branch coverage and line coverage); rather, they will win clients' confidence by making them aware of the current health of the project. I can go on and on for hours... but it's not all going to be a bed of roses, because Moolya promises to be a change agent and it will face challenges from different quarters. I am expecting a few anonymous bullies to comment on this post, though that may not happen, as BugMagnate is not a famous blog.
Pradeep has put a lot of emphasis on the culture of the company, saying, "They are the Microsoft & Apple of 1970's. They are the Google of the 1990's." I don't think such parallels should be drawn in terms of where it is starting, because when any of the aforementioned companies started, they were unrivalled in their market segment; the same cannot be said about Moolya. Moolya will need to collaborate, share, make its presence felt and, above all, build a solid client list.
So what's gonna work in Moolya's favor at this point? Word of mouth... yes, and by means of this blog post I am starting in that direction. Let's spread the word.
All the best, Pradeep and Santhosh... I wish that one day I'll take pride in saying, "I know one of the founders of Moolya." And I am sure Moolya will prove Moolyavaan (of great value) to the entire software testing industry and will redefine the meaning of "Value".

Friday, September 10, 2010

What to Blog about?

I have been a little confused for the last few weeks. I seriously wanted to write something but was too occupied with work, and at the same time I could not think of what I should write about. I am not sure if any of the bloggers whom I follow face the same problem.

I wanted to participate in the BlogSTAR competition of the EuroSTAR conference this year, but I could not make genuine progress on that front. And when the list of finalists was disclosed, my name did not figure there. I understood that I didn't have enough credit, reputation or blog posts to prove myself a suitable contender for the coveted competition. No regrets, as I got to know a few more people in the software testing field. I do make it a point to read all their posts daily and to comment whenever I feel the need or something intrigues me.

I am running short of topics, so I asked my friends and colleagues who are somehow related to the software industry what they would like to read about testing in the next few weeks. The answers I received were really encouraging. People don't want to know much about definitions; they want to know how testing adds value to the organization (seems like a typical management question). They want me to share the day-to-day challenges that I or my team face in testing products of different natures. So, after getting some inputs, I prepared a list of things that I would like to talk about in the coming weeks.

There was one interesting input that I received from a guy who has just started his career in the software testing field.
He said, "Rahul, you asked me to read blog posts from Michael Bolton, Dr. Cem Kaner, James Bach etc., but I find it very difficult to understand what they exactly mean." I was kind of confused, as that guy has very good communication skills, both written and verbal. I asked myself, "Did I ask too much of him when he is just three months into software testing?" I didn't know the answer. I went back to the little chap and asked, "What is it that you find difficult to understand?" He replied, "I found a few terminologies and ideas difficult to understand, like heuristics." I understood where he was coming from; I faced almost similar difficulties when I started reading these blogs two years back. He asked for my help in understanding the things that are not directly about testing but whose lessons can be applied to testing, "the passion we share".

Let's see what I can do to help him.

C Ya !!!
Rahul

Monday, July 5, 2010

Test Environment to Production Environment

Many a time, I have faced a situation where the application under test was working fine in the test environment, but when it was released to the production environment (to real users), it went down. There have been many reasons why this happened, functional as well as non-functional. Following some frequent prod incidents in one of the areas of work I am aware of, I posted a comment on my LinkedIn profile. The responses to that comment led me to write this post.

My comment was, "To save the production environment... we need a better test environment, ideally a mirror image of the production environment." I think one of the reasons for having a testing team is to save the production environment from any failure, whether functional or non-functional. There may be many points to counter the above statement, but let's agree that organizations spend a huge sum of money to rest assured that the application will not cause any trouble once it is tested and deployed to end users.

What are the probable causes of Prod incidents?
No application works on its own; there are always some downstream and upstream apps as well. The incidents that I have experienced were a result of one or more of the points mentioned below.
1) Two interdependent applications are tested separately in different environments (e.g. QA and UAT); both work fine, but when deployed to prod, all hell breaks loose.
2) The QA environment may not have a data size similar to the prod environment, leading to performance issues, especially when there are jobs running and supplying data to other jobs.
3) An urgent requirement change is deployed to prod assuming it will not affect anything else and there is no need to test. The change was urgent because the business was getting impacted; now, after deployment, the business is severely impacted because of an untested minor code change.
There can be many more reasons for the failure of an application in the prod environment. The last one is the most frequent one for me, and it is the one a minimal automated smoke check, like the sketch below, can at least partly guard against.
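Here is a minimal sketch of such a pre-deployment smoke check, written with pytest and the requests library. The host name, endpoints and response field are hypothetical placeholders I have made up for illustration, not anything from a real project.

# smoke_test.py - a minimal pre-deployment smoke check (hypothetical endpoints)
import requests

BASE_URL = "https://qa.example.internal"  # placeholder QA host

def test_application_health():
    # The application itself should answer its health endpoint quickly.
    resp = requests.get(f"{BASE_URL}/health", timeout=5)
    assert resp.status_code == 200

def test_upstream_feed_status():
    # The upstream system the change depends on should also respond;
    # otherwise the integration part of an "urgent" change goes out untested.
    resp = requests.get(f"{BASE_URL}/feeds/orders/status", timeout=5)
    assert resp.status_code == 200
    assert resp.json().get("lastBatchProcessed") is not None

Even a check this small, run with "pytest smoke_test.py" before every deployment, turns "there is no need to test" into a deliberate decision rather than an accident.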

What did I mean by asking for a better test environment?
An environment where all interdependent applications work the same way (in terms of data flow and data size) as they are supposed to work in prod.
I have faced situations where an application was tested in the test environment and no issues were found in the UAT environment, but when the application was released to prod, there were some serious issues. When the root cause analysis was done, it was found that when the application was tested in QA or UAT, it wasn't receiving data from one of the other applications, so the integration was never tested. In prod, the application started receiving data, and BOOM. The problem, as I see it, was the communication gap between the teams involved.
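One cheap way to surface such a gap earlier is to check, as part of environment setup or a daily health run, whether the upstream feed is actually delivering anything in QA/UAT. The sketch below assumes a file-based feed landing in a directory; the path and the freshness window are invented for illustration and would need to be adapted to the real feed.

import os
import sys
import time

FEED_DIR = "/data/qa/incoming/orders"  # hypothetical landing directory for the upstream feed
MAX_AGE_SECONDS = 24 * 60 * 60         # expect at least one file per day

def newest_file_age(directory):
    """Return the age in seconds of the most recently modified file, or None if empty."""
    paths = [os.path.join(directory, name) for name in os.listdir(directory)]
    files = [p for p in paths if os.path.isfile(p)]
    if not files:
        return None
    newest = max(os.path.getmtime(p) for p in files)
    return time.time() - newest

if __name__ == "__main__":
    age = newest_file_age(FEED_DIR)
    if age is None:
        sys.exit(f"FAIL: no feed files in {FEED_DIR}; the integration path is not being exercised")
    if age > MAX_AGE_SECONDS:
        sys.exit(f"FAIL: newest feed file is {age / 3600:.1f} hours old; upstream may be silent")
    print(f"OK: upstream feed looks live (newest file is {age / 3600:.1f} hours old)")

If a check like this fails, the conversation with the upstream team starts before the release, not after the prod incident.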

An environment where the dataset and the user base are similar to what is supposed to be there in prod.
An application broke in prod because the data files it used to receive in the test environment were only a few KBs in size, but once in prod it had to receive files of several MBs and it couldn't handle the load. Kedar Kulkarni, a friend whose expertise is in performance testing, would agree with this.
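A rough way to expose that size gap before release is to pad a real QA sample file up to a production-like volume and feed that to the application instead. The file names and the target size in this sketch are made-up examples; an anonymized production extract or a proper data generator would be better where available.

# scale_feed.py - pad a small QA sample file up to a production-like size (hypothetical file names)
SAMPLE_FILE = "orders_sample_qa.csv"   # the small file QA normally receives
SCALED_FILE = "orders_prod_sized.csv"  # the production-sized file we will feed the application
TARGET_BYTES = 50 * 1024 * 1024        # roughly 50 MB, a stand-in for a production batch

with open(SAMPLE_FILE, "r", encoding="utf-8") as src:
    header, *rows = src.readlines()    # assumes a CSV with a single header line

if not rows:
    raise SystemExit(f"{SAMPLE_FILE} has no data rows to repeat")

with open(SCALED_FILE, "w", encoding="utf-8") as out:
    out.write(header)
    written = len(header)
    while written < TARGET_BYTES:      # repeat the sample rows until the target size is reached
        for row in rows:
            out.write(row)
            written += len(row)        # approximate: counts characters, close enough for ASCII data
            if written >= TARGET_BYTES:
                break

print(f"Wrote roughly {written / (1024 * 1024):.1f} MB to {SCALED_FILE}")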

In general, a test environment is shared by many applications, so timely availability to the different teams becomes an issue, and environment management becomes an issue along with it. It is an irony that people who want their prod environment healthy don't pay enough attention to their test environment, which can be disastrous.

Can a better Test Environment Save the Prod Environment?
My friend Parthiban disagreed with my comment as mentioned at the beginning of the post. His idea is to have a stable test environment and then mirror the same to prod. Initially I didn't like, or probably didn't understand, this thought. Let's evaluate:
If we can have all the interdependent applications stabilized in the test environment in terms of data dependency, data flow and data size, then yes, we can mirror it to prod. If we can remove the problem of communication, which is generally a result of ego clashes, then yes, we can.

I would request readers to share their views on the same and let's learn from each other’s experiences.

Saturday, June 12, 2010

Job Satisfaction

I read the book Outliers by Malcolm Gladwell. It talks about why some people are so accomplished, so extraordinary and so far outside ordinary experience that they are as puzzling to the rest of us as a cold day in August. Malcolm mentions three factors that define job satisfaction for an individual, irrespective of the field s/he works in.
Autonomy: You get a say in deciding what you do every day. Even if you can't always decide exactly what you do, you can choose how to get it done.

Complexity: It must be an intellectually stimulating challenge. As the book states, it should “engage both your mind and imagination.”
Connection between Effort and Reward: The harder you work, the greater your income or recognition (at least eventually).

I found the explanation quite good. So if I were to map my understanding of the aforementioned terms onto my own area of work, i.e. software testing, what should I consider?
I started my career in testing in the year 2006 and now it is 2010, so I thought of baselining this write-up on those two years.

Year 2006

Number of defects found: At the beginning of my career, the more defects I raised, the more satisfied I was, but that's no longer the case. Quantity has been replaced by quality.
Automation: I thought that I should know automation and be aware of the various tools available in the market. But I realized that if you don't keep in regular touch with these ever-changing tools and methods, you tend to forget them and fall behind.
Domain Knowledge: Initially I spent hours/weeks/months understanding the business function of the applications under test. Not all applications require domain knowledge, but it is always good to know the basics.
Certifications: Oh, that's a pain point for many. I did a few certifications and believe they helped me immensely in learning the terminology and communicating freely with colleagues and clients.
The points mentioned above now have broader meanings.

Year 2010

Where the gaps are: I realized that issues in software aren't intentional; no one actually wants to build a faulty system, yet we still find issues. For example, issues may be the result of a lack of communication or information flow. Now I try to understand the bigger picture: I like to question why something is being done one way and not another, once I understand what it is.
Learning: It's a continuous process. Initially I used to think, "If I clear this certification, I'll become that," and so on. But that's not the case; as time passes, one needs to continuously upgrade his/her skills. The only way to stay in demand is to keep improving your skills. This is a hard-learnt lesson and, to date, the most important one.
A new tool, a new scripting language, it can be anything. But the issue I face is that if I don't use what I have learnt, I forget even the simple menu options of a tool. If I can relate the learning to my current project, though, it becomes long-lasting.
How can I contribute? I am not a know-it-all guy, but there is nothing wrong in trying my hand at an issue (maybe silently) so that I may learn something. My limited technical skills do not allow me to contribute directly to fixing every issue, but I do get ideas: how to install something on my machine; what I can do to reduce the onboarding time for a new joiner and make them productive as soon as possible without overburdening them; whether the QA team beside me is planning to automate a set of their tests, and if so, whether there is a way to learn something from that if time permits. Taking this approach helps me revisit things I haven't used in a long time, like setting something up in Quality Center or QTP.



Now, let's get back to the three points above... is there a relation?

Autonomy
I have started believing that no one gives you autonomy; it has to be earned. One can get it by building trust with stakeholders, developers and fellow testers. Trust, coupled with hard/smart work and the motive to deliver a quality product on time, will definitely give you the freedom to do something the way you know is right.
Complexity
We may not start with complex tasks in a new field, but if we can prove our mettle in handling relatively simple tasks, we can do wonders. I hope to shine, if not tomorrow then probably the day after.
Connection between Effort and Reward
Rewards can come in many forms. They are the result of continuous improvement in the assigned task, or they can come from excellent work done over a period of time. I am not worried about rewards as of now; my biggest concern is not to lose my passion for testing, learning, writing and reading.

Sunday, April 4, 2010

Test Managers

A Test Manager is a person who monitors deliverables, who bridges the gap between the client and the test engineers, who works as a motivator and mentor, and who decides your rating at the end of the year.

This is the most generic definition I could think of; it may be incomplete, and readers may have different versions. In the last five years, my friends and I have worked with many people at different levels, and here, for simplicity, I will (on behalf of my friends as well) call all of those who have more experience than us "Managers". Here, experience means they started their software testing career before we started ours. We find this definition of work experience incomplete (or wrong), though it is widely accepted across the industry.
They were all different in their approaches. A few were so involved that we felt like birds caged in a small compartment; they were all followers of a micro-level management style. A few others were completely aloof. Then there were some who allowed their test engineers complete freedom, with periodic reviews of deliverables, say once a week or once a fortnight as the case may be. I really liked working with the managers of this last category.

Why did I like the third type of Managers?
These managers discussed problems without imposing too many solutions on their resources. They allowed their resources to explore different approaches, allowed them to disagree with anything and listened to their inputs whenever they had any. These managers were not obsessed with test case preparation because they understood one fact very clearly: "It is not the test cases that will unearth the issues; it is the tester who will do the job." They were not too worried about metrics at the beginning for a new AUT, but they recognized the importance of the data that would be available after one round of testing. These were the managers who allowed their resources to learn new technologies and also emphasized applying that learning in day-to-day work wherever applicable. These managers were open enough to share client expectations clearly and communicate them freely without wasting any time.

It's not just positives with the third type of manager; there are negatives as well. I personally experienced that when people are really good at their work, they tend to become perfectionists, and that can lead to aggression. A few of them were a bit aggressive at times, but that was OK because their aggression was only about deliverables.

What was the problem with the other types of Managers, in our perception?
The aloof ones were not involved with anything throughout the year, but when it came to appraisals, they had all the time in the world to point out a thousand things in their subordinates' work. Fortunately, I never had to work with such managers, but my friends were not so lucky.
The micro-managers were always too involved. So involved that they never realized the problems their resources might have. This may sound a bit exaggerated, but read on; here are the things they never understood.
1) Test case preparation is not as important as the testing done during execution. We are not against documentation, but stakeholders should realize that not everything can be documented.
2) Test engineers are the ones working on the projects; trust their ability to perform well. Monitoring should be there, but it should not turn into smothering at any point.
3) Automation is good to have, but it does not make sense to rely on it completely.
4) There are no best practices; one should understand the context rather than blindly following something for the sake of marketing at a higher level.