Saturday, November 7, 2009

Most Common Web Server Error Messages.

Here is the list of most common web server error messages a web tester must know about:
1. 400 Bad File Request
The syntax used in the URL is incorrect, e.g. an uppercase letter where a lowercase one is expected, or wrong punctuation marks.

2. 401 Unauthorized
The server did not receive valid credentials from the client, or the wrong password may have been entered.

3. 403 Forbidden/Access Denied
Special permission is needed to access the site -- a password and/or username.

4. 404 File Not Found
The server cannot find the file you requested. The file has either been moved or deleted, or you entered the wrong URL or document name.

5. 408 Request Timeout
The client stopped the request before the server finished retrieving it. A user may hit the stop button, close the browser, or click a link before the page loads. This usually occurs when servers are slow or file sizes are large.

6. 500 Internal Server Error
The server couldn't retrieve the HTML document because of server-configuration problems. Contact the site administrator.

7. 501 Not Implemented
Web server doesn't support a requested feature.

8. 502 Bad Gateway (Service Temporarily Overloaded)
Server congestion: too many connections or high traffic. Keep trying until the page loads.

9. 503 Service Unavailable
The server is busy, the site may have moved, or you lost your dial-up Internet connection.

10. Connection Refused by Host
Either you do not have permission to access the site or your password is incorrect.

11. File Contains No Data
The page is there but is not showing anything. The error occurs in the document itself, usually attributed to bad table formatting or stripped header information.

12. Bad File Request
Browser may not support the form or other coding you're trying to access.

13. Failed DNS Lookup
The Domain Name Server can't translate your domain request into a valid Internet address. The server may be busy or down, or an incorrect URL was entered.

14. Host Unavailable
The host server is down. Hit reload or visit the site later.

15. Unable to Locate Host
The host server is down, the Internet connection is lost, or the URL was typed incorrectly.

16. Network Connection Refused by the Server
The Web server is busy.
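
In modern HTTP terms, most of these messages correspond to status codes that a test script can check programmatically. Here is a minimal sketch using only Python's standard library; the `describe_status` helper and the code-to-message table are illustrative, and the sketch deliberately ignores network-level failures (`URLError`) for brevity:

```python
# Minimal sketch: request a URL and map the HTTP status code to the
# common error messages listed above. Illustrative, not exhaustive.
from urllib import request
from urllib.error import HTTPError

COMMON_ERRORS = {
    400: "Bad File Request",
    401: "Unauthorized",
    403: "Forbidden/Access Denied",
    404: "File Not Found",
    408: "Request Timeout",
    500: "Internal Server Error",
    501: "Not Implemented",
    503: "Service Unavailable",
}

def describe_status(url):
    """Return (status_code, short description) for a URL."""
    try:
        with request.urlopen(url, timeout=10) as resp:
            return resp.status, "OK"
    except HTTPError as err:
        # 4xx/5xx responses raise HTTPError; look up a friendly name.
        return err.code, COMMON_ERRORS.get(err.code, "Other error")
```

A tester could loop `describe_status` over a list of URLs and flag anything that doesn't come back with a 2xx status.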

Must-have application: Xenu Link Checker


Xenu's Link Sleuth is a multi-threaded spidering tool that checks Web sites for broken links. Link verification is done on "normal" links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts and Java applets. The program displays a continuously updated list of URLs which you can sort by different criteria. A full report can be produced at any time. Xenu also allows you to include/exclude certain links from verification.
(http://home.snafu.de/tilman/xenulink.html)
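
The core of what a link checker does -- collect every link on a page, then request each one and flag non-success responses -- can be sketched in a few lines. This toy, single-threaded version uses Python's standard `html.parser`; the `LinkCollector` class and the sample page are illustrative, not part of Xenu:

```python
# Toy sketch of the first half of a link checker: harvest href/src
# attributes from an HTML page. A real tool (like Xenu) would then
# request each URL, possibly in multiple threads, and report failures.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect URLs from href and src attributes while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

page = '<a href="/about.html">About</a><img src="logo.png">'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/about.html', 'logo.png']
```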

Sunday, August 30, 2009

F-Shaped Pattern Reading

A very interesting article on how users read the content of a website.

F-Shaped Pattern For Reading Web Content:-
  • Users won't read your text thoroughly in a word-by-word manner. Exhaustive reading is rare, especially when prospective customers are conducting their initial research to compile a shortlist of vendors. Yes, some people will read more, but most won't.
  • The first two paragraphs must state the most important information. There's some hope that users will actually read this material, though they'll probably read more of the first paragraph than the second.
  • Start subheads, paragraphs, and bullet points with information-carrying words that users will notice when scanning down the left side of your content in the final stem of their F-behavior. They'll read the third word on a line much less often than the first two words.
http://www.useit.com/alertbox/reading_pattern.html

Monday, August 24, 2009

Quality Metrics
Person Month (PM):
If a person works for one month = 1 PM.
So if 5 people work for 12 months, the total will be
5 x 12 = 60 PM.
So if we know the complexity of a project in PM, we can easily calculate the total effort required for the execution of the project.

Product Metrics:
Used to estimate the size of the project. There are two different types of methods available
(I) KDSI - Kilo or Thousand Delivered Source Instructions.
(II) KLOC - Kilo Lines of Code.
On the basis of this we categorize a project in small, intermediate, medium and large.
Small <= 2KDSI
Intermediate > 2 & <= 8 KDSI
Medium > 8 & <= 32 KDSI
Large > 32 & <= 128 KDSI
Very Large > 128 KDSI

Productivity Metrics:
DSI = delivered source instructions
Calculated as the number of lines written by the programmer per hour.
We can calculate the size of a project using KDSI and DSI.
Time required for the project (in hours) = (Total KDSI of the project x 1000) / Avg. DSI per hour
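
The calculations above are simple enough to sketch directly. The function names and the size thresholds below mirror the figures in this post:

```python
# Sketch of the effort and size metrics described above.

def person_months(people, months):
    """Effort in person-months: 5 people for 12 months = 60 PM."""
    return people * months

def project_category(kdsi):
    """Categorize a project by size in KDSI, per the ranges above."""
    if kdsi <= 2:
        return "Small"
    if kdsi <= 8:
        return "Intermediate"
    if kdsi <= 32:
        return "Medium"
    if kdsi <= 128:
        return "Large"
    return "Very Large"

def project_hours(total_kdsi, avg_dsi_per_hour):
    """Time required: total instructions / productivity rate."""
    return (total_kdsi * 1000) / avg_dsi_per_hour

print(person_months(5, 12))    # 60
print(project_category(10))    # Medium
print(project_hours(2, 40))    # 50.0
```
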
Q. When was the term "Software Engg." first introduced?
Ans. The term "Software Engg." was first coined in 1968, at a conference sponsored by the NATO Science Committee.

Wednesday, August 19, 2009

Most Common HR Questions

Here I have compiled some of the most common HR interview questions.

Q1. Tell me something about yourself:
The most common interview question. You need to have a short statement prepared in your mind, but it should not sound rehearsed.

Q2. Why did you leave your previous job?
Stay positive regardless of the circumstances. Never refer to a major problem with
management and never speak ill of supervisors, co-workers or the organization. If you do, you will be the one looking bad. Keep smiling and talk about leaving for a positive reason such as an opportunity, a chance to do something special or other forward-looking reasons.

Q3. What experience you have in this field?
Be specific and try to focus on the work profile you are applying for.


Q4. Do you consider yourself successful?

Always answer yes and explain in detail why. Try to explain your goals and how you achieved them.


Q5. What do your colleagues think about you?

Be prepared with a quote or two from colleagues. Either a specific statement or a paraphrase will work.

Q6. What do you know about this organization/Company?

Always do some research work before the interview. Find out where they have been and where they are going. What are the current issues and who are the major players?

Q7. What have you done to improve your knowledge?
Try to include improvement activities that relate to the job. A wide variety of activities can be mentioned as positive self-improvement. Have some good ones handy to mention.

Q8. Are you looking for other jobs?
Be honest but do not spend a lot of time in this area. Keep the focus on this job and what you can do for this organization. Anything else is a distraction.

Q9. Why do you want to work with us?
This may take some thought and certainly, should be based on the research you have done on the organization. Relate it to your long-term career goals.

Q10. Would you like to question anything from my side?
If you have any doubts, definitely ask. For example, if you didn't know the answer to one of the interview questions, ask about it; it shows the interviewer that you want to learn and don't want to repeat that mistake.

Q11. Do you have any salary expectation?
Never answer it directly. Instead, you can say something like: "That's a tough question. Can you tell me the range for this position?" In most cases, the interviewer, taken off guard, will tell you. If not, say that it can depend on the details of the job, then give a wide range.


Q12. Are you a good team player?
Yes, of course. Give some real examples, such as how you handled the XYZ project with your team.

Q13. How long do you expect to work with us if hired?
Specifics here are not good. Something like this should work: "I'd like it to be a long time," or "As long as we both feel I'm doing a good job."

Q14. Tell me about the most fun you have had on the job.
Talk about having fun by accomplishing something for the organization.

Q15. What is your philosophy towards work?

The interviewer is not looking for a long or flowery dissertation here. Do you have strong feelings that the job gets done? Yes. That's the type of answer that works best here. Short and positive, showing a benefit to the organization.

Q16. If you had enough money to retire right now, would you?
Answer yes if you would. But since you need to work, this is the type of work you prefer. Do not say yes if you do not mean it.

Q17. Have you ever been asked to leave a position?
If you have not, say no. If you have, be honest, brief and avoid saying negative things about the people or organization involved.

Q18. Explain how you would be an asset to this organization?
You should be eager for this question. It gives you a chance to highlight your best points as they relate to the position being discussed. Give a little advance thought to this relationship.

Q19. Why should we hire you?
Point out how your assets meet what the organization needs. Do not mention any other
candidates to make a comparison.

Q20. Tell me about a suggestion you have made
Have a good one ready. Be sure and use a suggestion that was accepted and was then
considered successful. One related to the type of work applied for is a real plus.

Q21. What irritates you about co-workers?
This is a trap question. Think real hard but fail to come up with anything that irritates you. A short statement that you seem to get along with folks is great.

Q22. What is your greatest strength?
Numerous answers are good, just stay positive. A few good examples:
Your ability to prioritize, Your problem-solving skills, Your ability to work under pressure, Your ability to focus on projects, Your professional expertise, Your leadership skills, Your positive attitude .

Q23. Tell me about your dream job.
Stay away from a specific job. You cannot win. If you say the job you are contending for is it, you strain credibility. If you say another job is it, you plant the suspicion that you will be dissatisfied with this position if hired. The best is to stay generic and say something like: a job where I love the work, like the people, can contribute and can't wait to get to work.

Q24. Why do you think you would do well at this job?
Give several reasons and include skills, experience and interest.

Q25. What are you looking for in a job?
See answer # 23

Q26. What kind of person would you refuse to work with?
Do not be trivial. It would take disloyalty to the organization, violence or lawbreaking to get you to object. Minor objections will label you as a whiner.

Q27. What is more important to you: the money or the work?
Money is always important, but the work is the most important. There is no better answer.

Q28. What would your previous supervisor say your strongest point is?
There are numerous good possibilities:
Loyalty, Energy, Positive attitude, Leadership, Team player, Expertise, Initiative, Patience, Hard work, Creativity, Problem solver

Q29. Tell me about a problem you had with a supervisor
Biggest trap of all. This is a test to see if you will speak ill of your boss. If you fall for it and tell about a problem with a former boss, you may well blow the interview right there. Stay positive and develop a poor memory about any trouble with a supervisor.

Q30. What has disappointed you about a job?
Don't get trivial or negative. Safe areas are few but can include: not enough of a challenge; you were laid off in a reduction; the company did not win a contract that would have given you more responsibility.

Q31. Tell me about your ability to work under pressure.
You may say that you thrive under certain types of pressure. Give an example that relates to the type of position applied for.

Q32. Do your skills match this job or another job more closely?
Probably this one. Do not give fuel to the suspicion that you may want another job more than this one.

Q33. What motivates you to do your best on the job?
This is a personal trait that only you can say, but good examples are: Challenge, Achievement, Recognition

Q34. Are you willing to work overtime? Nights? Weekends?
This is up to you. Be totally honest.

Q35. How would you know you were successful on this job?
Several ways are good measures:
You set high standards for yourself and meet them. Your outcomes are a success. Your boss tells you that you are successful.

Q36. Would you be willing to relocate if required?
You should be clear on this with your family prior to the interview if you think there is a chance it may come up. Do not say yes just to get the job if the real answer is no. This can create a lot of problems later on in your career. Be honest at this point and save yourself future grief.

Q37. Are you willing to put the interests of the organization ahead of your own?
This is a straight loyalty and dedication question. Do not worry about the deep ethical and philosophical implications. Just say yes.

Q38. Describe your management style.
Try to avoid labels. Some of the more common labels, like progressive, salesman or
consensus, can have several meanings or descriptions depending on which management
expert you listen to. The situational style is safe, because it says you will manage according to the situation, instead of one size fits all.

Q39. What have you learned from mistakes on the job?
Here you have to come up with something or you strain credibility. Make it a small, well-intentioned mistake with a positive lesson learned. An example would be working too far ahead of colleagues on a project and thus throwing coordination off.

Q40. Do you have any blind spots?
Trick question. If you know about blind spots, they are no longer blind spots. Do not reveal any personal areas of concern here. Let them do their own discovery on your bad points. Do not hand it to them.

Q41. If you were hiring a person for this job, what would you look for?
Be careful to mention traits that are needed and that you have.

Q42. Do you think you are overqualified for this position?
Regardless of your qualifications, state that you are very well qualified for the position.

Q43. How do you propose to compensate for your lack of experience?
First, if you have experience that the interviewer does not know about, bring that up. Then, point out (if true) that you are a hard-working quick learner.

Q44. What qualities do you look for in a boss?
Be generic and positive. Safe qualities are knowledgeable, a sense of humor, fair, loyal to subordinates and holder of high standards. All bosses think they have these traits.

Q45. Tell me about a time when you helped resolve a dispute between others.
Pick a specific incident. Concentrate on your problem solving technique and not the dispute you settled.

Q46. What position do you prefer on a team working on a project?
Be honest. If you are comfortable in different roles, point that out.

Q47. Describe your work ethic.
Emphasize benefits to the organization. Things like, determination to get the job done and work hard but enjoy your work are good.

Q48. What has been your biggest professional disappointment?
Be sure that you refer to something that was beyond your control. Show acceptance and no negative feelings.

Tuesday, August 18, 2009

Software Testing Dictionary

Software Testing Dictionary
A
Acceptance Test
Formal tests (often performed by a customer) to determine whether or not a system has satisfied predetermined acceptance criteria. These tests are often used to enable the customer (either internal or external) to determine whether or not to accept a system.

Ad Hoc Testing
Testing carried out using no recognised test case design technique.

Alpha Testing
Testing of a software product or system conducted at the developer's site by the customer.

Assertion Testing
A dynamic analysis technique which inserts assertions about the relationship between program variables into the program code. The truth of the assertions is determined as the program executes.
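
In Python, the built-in `assert` statement does exactly this. The sketch below is a made-up example: the `transfer` function and its invariant are illustrative, not from any particular library:

```python
# Assertion testing illustrated: assertions about the relationship
# between program variables are embedded in the code and checked as
# the program executes.

def transfer(balance_a, balance_b, amount):
    """Move `amount` from account A to B; total money is invariant."""
    total_before = balance_a + balance_b
    assert amount >= 0, "amount must be non-negative"
    balance_a -= amount
    balance_b += amount
    # The asserted relationship: transfers never create or destroy money.
    assert balance_a + balance_b == total_before
    return balance_a, balance_b

print(transfer(100, 50, 30))  # (70, 80)
```

If the logic were buggy (say, the credit line were dropped), the second assertion would fire during execution and pinpoint the fault.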

Automated Testing
Software testing which is assisted with software technology that does not require operator (tester) input, analysis, or evaluation.

B
Beta Testing

Testing conducted at one or more customer sites by the end-user of a delivered software
product or system.

Big-Bang Testing
Integration testing where no incremental testing takes place prior to all the system's components being combined to form the system.

Black Box Testing
A testing method where the application under test is viewed as a black box and the internal behavior of the program is completely ignored. Testing occurs based upon the external specifications. Also known as behavioral testing, since only the external behaviors of the program are evaluated and analyzed.

Boundary Value Analysis (BVA)
BVA is different from equivalence partitioning in that it focuses on "corner cases": values at or just outside the edges of the range defined by the specification. This means that if a function expects all values in the range of -100 to +1000, test inputs would include -101 and +1001. BVA is often used as a technique for stress, load or volume testing. This type of validation is usually performed after positive functional validation has completed (successfully) using requirements specifications and user documentation.
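
The -100 to +1000 example above can be spelled out concretely. The `in_range` function below is a stand-in for whatever behavior is under test; the boundary inputs are exactly the "corner cases" BVA prescribes:

```python
# Boundary value analysis for a function accepting values in [-100, 1000]:
# test inputs cluster just outside, on, and just inside each boundary.

def in_range(value, low=-100, high=1000):
    """Stand-in for the behavior under test: accept values in [low, high]."""
    return low <= value <= high

bva_inputs = [-101, -100, -99, 999, 1000, 1001]
results = {v: in_range(v) for v in bva_inputs}
print(results)
# {-101: False, -100: True, -99: True, 999: True, 1000: True, 1001: False}
```

Off-by-one errors (e.g. writing `<` instead of `<=`) are precisely what these six inputs would catch.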
Breadth Testing
A test suite that exercises the full scope of a system from a top-down perspective, but does not test any aspect in detail.

C

Cause Effect Graphing
Test data selection technique. The input and output domains are partitioned into classes and analysis is performed to determine which input classes cause which effect. A minimal set of inputs is chosen which will cover the entire effect set. A systematic method of generating test cases representing combinations of conditions. See: testing, functional

Clean Test
A test whose primary purpose is validation; that is, tests designed to demonstrate the software's correct working. Syn: positive test.

Code Walkthrough
A manual testing [error detection] technique where program [source code] logic [structure] is traced manually [mentally] by a group with a small set of test cases, while the state of program variables is manually monitored, to analyze the programmer's logic and assumptions. Contrast with code audit, code inspection, code review.

Compatibility Testing
The process of determining the ability of two or more systems to exchange information. In a situation where the developed software replaces an already working program, an investigation should be conducted to assess possible compatibility problems between the new software and other programs or systems.

Condition Coverage
A test coverage criteria requiring enough test cases such that each condition in a decision takes on all possible outcomes at least once, and each point of entry to a program or subroutine is invoked at least once. Contrast with branch coverage, decision coverage, multiple condition coverage, path coverage, statement coverage.

Conformance Directed Testing
Testing that seeks to establish conformance to requirements or specification.

CRUD Testing
Build a CRUD matrix and test all object creation, reads, updates, and deletions.

D
Data Flow Testing

Testing in which test cases are designed based on variable usage within the code.

Database Testing

Check the integrity of database field values.

Depth Testing
A test case that exercises some part of a system to a significant level of detail.

Decision Coverage
A test coverage criteria requiring enough test cases such that each decision has a true and false result at least once, and that each statement is executed at least once. Syn: branch coverage. Contrast with condition coverage, multiple condition coverage, path coverage, statement coverage.

Dirty Testing
Same as negative testing.

Dynamic Testing
Testing, based on specific test cases, by execution of the test object or running programs.

E
End-to-End testing

Similar to system testing; the 'macro' end of the test scale; involves testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems if appropriate.

Error Guessing
A test case design technique where the experience of the tester is used to postulate what faults exist, and to design tests specially to expose them.

Error Seeding
The purposeful introduction of faults into a program to test effectiveness of a test suite or other quality assurance program.

Exception Testing
Identify error messages and exception handling processes and the conditions that trigger them.

Exhaustive Testing
Executing the program with all possible combinations of values for program variables. Feasible only for small, simple programs.

Exploratory Testing
An interactive process of concurrent product exploration, test design, and test execution. The heart of exploratory testing can be stated simply: The outcome of this test influences the design of the next test.

F
Formal Testing
Testing conducted in accordance with test plans and procedures that have been reviewed and approved by a customer, user, or designated level of management.

Free Form Testing
Ad hoc or brainstorming using intuition to define test cases.

Functional testing
Application of test data derived from the specified functional requirements without regard to the final program structure. Also known as black-box testing.

G
Gray Box Testing

A testing technique in which we combine both black-box and white-box testing.

H

High-Level Testing
These tests involve testing whole, complete products.

I

Integration Testing
Testing conducted after unit and feature testing. The intent is to expose faults in the interactions between software modules and functions. Either top-down or bottom-up approaches can be used. A bottom-up method is preferred, since it leads to earlier unit testing (step-level integration). This method is contrary to the big-bang approach, where all source modules are combined and tested in one step. The big-bang approach to integration should be discouraged.

L
Lateral Testing
A test design technique based on lateral thinking principles, to identify faults.


Load Testing
Testing an application under heavy loads, such as testing of a web site under a range of loads to determine at what point the system's response time degrades or fails.

Load-Stability Testing
Test design to determine whether a Web application will remain serviceable over an extended time span.

Load Isolation Testing
The workload for this type of test is designed to contain only the subset of test cases that caused the problem in previous testing.

M
Monkey Testing

Inputs are generated from probability distributions that reflect actual expected usage statistics -- e.g., from user profiles. There are different levels of IQ in smart monkey testing. In the simplest, each input is considered independent of the other inputs; that is, if a given test requires an input vector with five components, in low-IQ testing these would be generated independently. In high-IQ monkey testing, the correlation (e.g., the covariance) between these input distributions is taken into account. In all branches of smart monkey testing, the input is considered as a single event.

Maximum Simultaneous Connection testing
This is a test performed to determine the number of connections which the firewall or Web server is capable of handling.

Mutation testing
Mutation testing is required to ensure that the software doesn't fail. It is also a good debugging mechanism: after the software works correctly, mutation testing can be done to simulate wrong inputs.
In mutation testing, the program is modified or its logic is changed. The different mutants are tested with the same test cases. If the mutants fail while the actual program works correctly, the test cases are considered to pass.
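
A tiny worked example makes the idea concrete. The functions below are illustrative: `is_adult` is the original program, `is_adult_mutant` is a mutant with one operator changed, and a good test suite should "kill" the mutant by failing against it:

```python
# Toy mutation testing: mutate one operator in the program, rerun the
# same test cases, and check that the mutant is detected (killed).

def is_adult(age):
    return age >= 18          # original program

def is_adult_mutant(age):
    return age > 18           # mutant: >= changed to >

test_cases = [(17, False), (18, True), (30, True)]

def run(program):
    """True if every test case passes against the given program."""
    return all(program(age) == expected for age, expected in test_cases)

print(run(is_adult))          # True  -- original passes all cases
print(run(is_adult_mutant))   # False -- the age=18 case kills the mutant
```

If the test suite lacked the boundary case `(18, True)`, the mutant would survive, revealing a gap in the test cases.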

Multiple Condition Coverage
A test coverage criteria which requires enough test cases such that all possible combinations of condition outcomes in each decision, and all points of entry, are invoked at least once. Contrast with branch coverage, condition coverage, decision coverage, path coverage, statement coverage.

N
Negative test

A test whose primary purpose is falsification; that is, tests designed to break the software.

O
Orthogonal Array Testing
Mathematical technique to determine which variations of parameters need to be tested.

P

Parallel Testing
Testing a new or an alternate data processing system with the same source data that is used in another system. The other system is considered as the standard of comparison. Syn: parallel run.

Performance Testing

Testing conducted to evaluate the compliance of a system or component with specific performance requirements.

Prior Defect History Testing
Test cases are created or rerun for every defect found in prior tests of the system.

R
Recovery testing
Testing how well a system recovers from crashes, hardware failures, or other catastrophic problems.

Regression Testing
Testing that is performed after making a functional improvement or repair to the program. Its purpose is to determine if the change has regressed other aspects of the program.

Reference testing
A way of deriving expected outcomes by manually validating a set of actual outcomes. A less rigorous alternative to predicting expected outcomes in advance of test execution.

Reliability testing
Verify the probability of failure free operation of a computer program in a specified environment for a specified time.

Range Testing
For each input, identify the range over which the system behavior should be the same.

Robust test
A test that compares a small amount of information, so that unexpected side effects are less likely to affect whether the test passes or fails.

S
Sanity Testing
Typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is often crashing systems, bogging down systems to a crawl, or destroying databases, the software may not be in a 'sane' enough condition to warrant further testing in its current state.

Sensitive Testing
A test that compares a large amount of information, so that it is more likely to detect unexpected differences between the actual and expected outcomes of the test.

Specification Based Testing
A test whose inputs are derived from a specification.

State-Based Testing
Testing with test cases developed by modeling the system under test as a state machine.

State Transition Testing
A technique in which the states of a system are first identified, and then test cases are written to test the triggers that cause a transition from one state to another.

Static Testing
Source code analysis. Analysis of source code to expose potential defects.

Statistical Testing
A test case design technique in which a model of the statistical distribution of the input is used to construct representative test cases.

Storage Testing
Study how memory and space are used by the program, either in resident memory or on disk. If there are limits on these amounts, storage tests attempt to prove that the program will exceed them.

Stress / Load / Volume Testing
Tests that provide a high degree of activity, either using boundary conditions as inputs or multiple copies of a program executing in parallel as examples.

Structural Testing
Testing that takes into account the internal mechanism [structure] of a system or component. Types include branch testing, path testing, statement testing. Testing to ensure each program statement is made to execute during testing and that each program statement performs its intended function. Contrast with functional testing. Syn: white-box testing, glass-box testing, logic driven testing.

System Testing
Black-box type testing that is based on overall requirements specifications; covers all combined parts of a system.

T
Table Testing

Test access, security, and data integrity of table entries.

U
Unit Testing

Testing performed to isolate and expose faults and failures as soon as the source code is available, regardless of the external interfaces that may be required. Oftentimes, the detailed design and requirements documents are used as a basis to compare how and what the unit is able to perform. White and black-box testing methods are combined during unit testing.

Usability Testing
Testing for 'user-friendliness'. Clearly this is subjective, and will depend on the targeted end-user or customer.

V
Volume Testing
Testing where the system is subjected to large volumes of data.[BS7925-1]

W
White Box Testing
Testing done under a structural testing strategy, requiring complete access to the object's structure -- that is, the source code. Also known as Glass Box Testing.