Notes:
• This is a continuation of the blog post about Development Test http://soa-java.blogspot.nl/2012/09/development-test.html
• For these checklist items I sometimes use questions instead of mandatory compliance checks (e.g. "how to set up test data" instead of "checklist: test data should always go via the database"). The goal of the checklist is to prompt us to be aware of certain issues, not to force a specific/narrow solution. The "best" choice depends on the project context (e.g. test goal, security environment, etc.).
• The symbol "»" at the beginning of a line means that the item is relatively important.
Test Plan template
• Date, version, test objectives & scopes (e.g. functional requirements acceptance test, security penetration system test, performance unit test)
• Process definitions (these can be defined at the dev-team level, so they don't have to be written in each test plan): metrics (e.g. #bugs & severity), defect classification, exit criteria, defect management, defect reporting (e.g. Trac), deliverables (e.g. test case library), whether review or approval is needed for this test plan (e.g. by the test manager, clients).
• Assumptions (e.g. firewall and server configurations mimic the production environment)
• Preconditions for the whole set of test cases, e.g. licenses for software, the production database is cloned into the LabManager/virtual-machine test environment
• » For each test case:
o Test case name and short description
o Traceability with requirement/usecase docs (i.e. the requirement ID)
o Preconditions for this test case (e.g. certain data states in the database, certain inputs from mock web services)
o Test steps and inter-dependencies with other test cases, e.g. fill in employees' salaries; steps: ....., dependency: add new employees (test case #1)
o Input data, e.g. birth date 31-2-1980 (which is an invalid date)
o Expected results
o Part of system (e.g. GUI/presentation tier)
o Area (e.g. security, functional, performance)
o Test method (e.g. manual, unit test)
o Priority / risk / test effort / test coverage (e.g. high, low)
• » Resources:
o roles, who will build/execute the tests and how many man-hours are needed (including external resources & training needed due to skill gaps)
o server/database/software/tools/hardware needed
• Schedule/plan
Test Report template
• » Test date, version, tester name, artifact (which jar, svn revision), test environment (which server/LabManager), test code version (svn rev)
• » Test objectives & scopes (e.g. functional requirements acceptance test, security penetration system test, performance unit test)
• » For each test result:
o Test result ID number
o Traceability (test case ID number in the test plan, requirement ID number in the requirement docs)
o Expected result, e.g. web service response time below 2 seconds (average) and 5 seconds (max)
o Actual result and impact, e.g. result: the web service response time is 90 seconds; impact: the user waiting time in the GUI is 2 minutes (unacceptable according to the SLA)
o Status:
 Ok/green: tested ok
 Bug/red (high priority)/yellow (low priority): defects, a ticket has to be made in Bugzilla/Trac (with priority level & targeted version/milestone)
 No-bug/gray: won't fix, false positive
 Hasn't been tested/white
o Follow-up actions (e.g. reworks by developers)
o Part of system (e.g. GUI/presentation tier)
o Area (e.g. security, functional, performance)
o Priority / risk (e.g. high, low)
• Root cause analysis and recommendations, e.g. excessive bugs in the authentication classes; root cause: inadequate knowledge; recommendation: training, code review.
• Resources (roles, planned & actual man-hours)
• List non-testable requirements, e.g. "the GUI should be beautiful".
Weekly Status report
Please see http://soa-java.blogspot.nl/2012/09/weekly-status-report-template.html
Test data
• » How to set up test input data (e.g. via a database copy or DDL/DML database scripts) each time we set up a new LabManager/test environment
• » Make test cases for: too little data (e.g. empty input, null), too much data, invalid data (wrong format, out of range), and boundary cases; see the sketch after this list
• » Make sure the positive cases have correct data (e.g. validated according to xml schema, LDAP attributes & tree structures are correct)
• » How to mask sensitive test data (e.g. password, bank account)
• » How realistic are the data?
• How to collect/create test input data (e.g. sampling actual traffic from a JMS topic or populating fake customer data using PL/SQL)
• How to recover/reinitialize data after a test (to fulfill the preconditions for the next test)
• How to maintain/version test data (i.e. test data for the current version and for the next software version)
• How to collect and save the test result data if needed (for further tests or analysis)
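A minimal JUnit 4 sketch of the too-little/too-much/invalid/boundary data items above. BirthDateValidator and its expected outcomes are hypothetical assumptions for illustration, not an existing class:

import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

// Sketch: boundary & invalid test data for a hypothetical birth-date validator.
@RunWith(Parameterized.class)
public class BirthDateValidationTest {

    @Parameters
    public static Collection<Object[]> data() {
        return Arrays.asList(new Object[][] {
            { null,        false }, // too little data: null
            { "",          false }, // too little data: empty input
            { "31-2-1980", false }, // invalid data: 31 February does not exist
            { "1-1-1800",  false }, // out of range (assumed lower bound)
            { "29-2-2000", true  }, // boundary case: leap day, valid
            { "15-6-1980", true  }  // positive case with correct data
        });
    }

    private final String input;
    private final boolean expectedValid;

    public BirthDateValidationTest(String input, boolean expectedValid) {
        this.input = input;
        this.expectedValid = expectedValid;
    }

    @Test
    public void validatesBirthDate() {
        // BirthDateValidator is a hypothetical class under test.
        assertEquals(expectedValid, BirthDateValidator.isValid(input));
    }
}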
Functional & Design
• » Test that the product correctly implements (every) requirement and use case (including alternative use cases)
• » Test that the product works according to the design and its assumptions (e.g. deployment environment, security environment, performance loads)
• » Test the conformance to relevant standards: the company standard/guideline as well as common standard such as Sarbanes-Oxley (US) / WBP (Netherlands)
• Test that (every) function gives correct results, including rounding errors in numerical functions (see the sketch after this list)
• Test (every piece of) application logic (e.g. flow control, business rules)
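For the rounding-error item, a small JUnit sketch: compare floating-point results with an explicit tolerance and use BigDecimal for monetary values (the HALF_UP rule here is an assumed business rule, not a given):

import static org.junit.Assert.assertEquals;

import java.math.BigDecimal;
import java.math.RoundingMode;

import org.junit.Test;

public class RoundingTest {

    @Test
    public void floatingPointNeedsTolerance() {
        // Never compare doubles with ==; always give an explicit delta.
        double sum = 0.1 + 0.2; // actually 0.30000000000000004
        assertEquals(0.3, sum, 1e-9);
    }

    @Test
    public void moneyShouldUseBigDecimal() {
        // Assumed business rule: round 2.305 euro to whole cents, HALF_UP.
        BigDecimal amount = new BigDecimal("2.305")
                .setScale(2, RoundingMode.HALF_UP);
        assertEquals(new BigDecimal("2.31"), amount);
    }
}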
Performance test
• » Find out the typical data traffic (size, frequency, format) & number of users/connections in production
• » Response time (UI, web service) / throughput (web service, database) meet the requirements/SLA; see the sketch after this list
• » Load test: at what load the performance degrades or fails
• » Stress test: run the system for a long time under realistic high loads while monitoring resource utilization (CPU/memory/storage/network), e.g. to check for memory leaks and unclosed connections, and to tune timeouts and thread pools.
• In case of unacceptable performance: profile the system parts that affect performance (e.g. database, queue/messaging, file storage, networks).
• Scale-out (capacity planning for the future), e.g. 3x today's peak usage
• Test the completion time of offline operations (e.g. OLAP/ETL bulk jobs scheduled every night). Is the processing time scalable? What to do if the bulk operation hasn't finished by 8.00/working hours?
• Rerun the performance test periodically in case of changes in usage patterns (e.g. a growing number of users), changed configurations, or the addition of new modules/services, so we can plan capacity ahead and prevent problems before they happen.
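As a first-order sanity check of the response-time SLA above (2 seconds average, 5 seconds max), a minimal JUnit sketch; callWebService is a placeholder for the real client invocation, and for serious load/stress testing a dedicated tool (e.g. JMeter) remains the better fit:

import static org.junit.Assert.assertTrue;

import java.util.concurrent.TimeUnit;

import org.junit.Test;

public class ResponseTimeTest {

    private static final long MAX_MILLIS = 5000; // SLA: max response time
    private static final long AVG_MILLIS = 2000; // SLA: average response time
    private static final int CALLS = 50;

    @Test
    public void responseTimeMeetsSla() throws Exception {
        long total = 0;
        long worst = 0;
        for (int i = 0; i < CALLS; i++) {
            long start = System.nanoTime();
            callWebService(); // placeholder for the real web service call
            long elapsed = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);
            total += elapsed;
            worst = Math.max(worst, elapsed);
        }
        assertTrue("max response time exceeds SLA", worst <= MAX_MILLIS);
        assertTrue("average response time exceeds SLA", total / CALLS <= AVG_MILLIS);
    }

    private void callWebService() throws Exception {
        // invoke the web service client here
    }
}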
Reliability test
• Test (every) fault possibility; test the behaviour & error messages when an exception/failure occurs (e.g. simulate a network failure or a url-endpoint connection error in the configuration plan)
• Test that faults don't compromise data integrity (e.g. compensation, rolling back the transaction) or security; data loss should be prevented whenever possible. See the sketch after this list.
• Test failover mechanism, check the data integrity after failover.
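A sketch of the integrity-after-fault idea: simulate a fault halfway through a transfer and assert that the earlier withdrawal is rolled back. Account, TransferService and TransferException are hypothetical names for the system under test:

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.fail;

import org.junit.Test;

public class TransferRollbackTest {

    @Test
    public void failedTransferLeavesBalancesIntact() {
        // Hypothetical accounts and transfer service under test.
        Account from = new Account("A", 100);
        Account to = new Account("B", 0);
        TransferService service = new TransferService();

        try {
            // Simulated fault: the deposit leg fails after the withdrawal.
            service.transfer(from, to, 50, true /* failAfterWithdraw */);
            fail("expected the simulated fault to propagate");
        } catch (TransferException expected) {
            // the fault occurred as intended
        }

        // Data integrity: the withdrawal must have been rolled back.
        assertEquals(100, from.getBalance());
        assertEquals(0, to.getBalance());
    }
}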
Environment/compatibility test:
• » Test with different browsers (for UI projects), application servers (e.g. vendor, version), databases (e.g. vendor, version), hardware (memory, cpu, networks), and OS (& version)
• » Test with different encodings (e.g. UTF-8 中文), different time zones, and different locales (currencies, language, format), e.g. 2,30 euro vs $ 2.30; test conversions between different components (e.g. database and LDAP servers can have different date formats). See the sketch after this list.
• » Test file-system permissions using different process owners (e.g. generate files as the oracle user & consume the files as the weblogic user during application integration)
• » Test if the configuration files (e.g. deployment plan, web.xml, log4j-config.xml) work
• Integration test: the connections between components (e.g. the endpoints in the configuration plan)
• Install & uninstall, deployment documentation
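A small JUnit sketch of the locale/time-zone items using only the standard java.text and java.util APIs; the regex that strips currency symbols just keeps the assertions independent of the locale's symbol placement:

import static org.junit.Assert.assertEquals;

import java.text.NumberFormat;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

import org.junit.Test;

public class LocaleCompatibilityTest {

    @Test
    public void currencyFormatDiffersPerLocale() {
        NumberFormat nl = NumberFormat.getCurrencyInstance(new Locale("nl", "NL"));
        NumberFormat us = NumberFormat.getCurrencyInstance(Locale.US);
        // 2,30 euro vs $2.30: the same amount, rendered differently.
        assertEquals("2,30", nl.format(2.30).replaceAll("[^0-9,.]", ""));
        assertEquals("2.30", us.format(2.30).replaceAll("[^0-9,.]", ""));
    }

    @Test
    public void sameInstantRendersDifferentlyPerTimeZone() {
        Date instant = new Date(0); // 1970-01-01T00:00:00 UTC
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        assertEquals("1970-01-01 00:00", fmt.format(instant));
        fmt.setTimeZone(TimeZone.getTimeZone("Asia/Shanghai")); // UTC+8
        assertEquals("1970-01-01 08:00", fmt.format(instant));
    }
}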
GUI
• » All GUI messages (including error messages) are clear/understandable to end users and match the users' terminology
• » How frequent are the errors? How does the system react to user errors (e.g. invalid input, invalid workflow)? How do users recover from errors?
• All navigation/menus/links are correct (see the sketch after this list)
• Check whether all GUI components (menus/commands/buttons) described in the user instructions exist
• The fonts are readable
• The GUI is consistent with the user environment (e.g. the web style in your organization)
• The software state is visible to the users (e.g. waiting for the backend response, error state, waiting for user input/action)
• Validate the (X)HTML and CSS: Doctype, syntax/structure are valid
• Another GUI testing checklist: http://www.sitepoint.com/ultimate-testing-checklist/
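A rough sketch of an automated navigation/link check with Selenium WebDriver; the URL is a placeholder, and in practice you would also follow each link and verify the response:

import java.util.List;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;

// Sketch: report links on a page that have no target at all.
public class NavigationLinkCheck {

    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://example.org/app"); // placeholder URL
            List<WebElement> links = driver.findElements(By.tagName("a"));
            for (WebElement link : links) {
                String href = link.getAttribute("href");
                if (href == null || href.trim().isEmpty()) {
                    System.out.println("Link without target: " + link.getText());
                }
            }
        } finally {
            driver.quit();
        }
    }
}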
Tips for organizing a usability test
• Identify the test subjects
• Provide a simple test guideline & result questionnaire; beware that your test subjects may not be very technical
• Is the software intuitive and easy to use? How much training is needed when you roll out this product to production?
• Is online help or a reference to the user documentation available? The user documentation should be complete enough and easy to understand for the intended audience
• Attend at least one test as a test participant
Coding
• Test that variables are correctly initialized
• Test multi-threading scenarios (race conditions, deadlocks); see the sketch below
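A sketch of a race-condition test: hammer an unsynchronized counter from several threads at once and assert that no update is lost. Counter is a stand-in for the real code under test; the test is expected to fail until increment() is made atomic (e.g. synchronized or AtomicInteger):

import static org.junit.Assert.assertEquals;

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.junit.Test;

public class CounterRaceConditionTest {

    // Hypothetical class under test; value++ is not atomic.
    static class Counter {
        private int value;
        void increment() { value++; }
        int get() { return value; }
    }

    @Test
    public void concurrentIncrementsShouldNotLoseUpdates() throws Exception {
        final Counter counter = new Counter();
        final int threads = 8;
        final int perThread = 10000;
        final CountDownLatch start = new CountDownLatch(1);
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.execute(new Runnable() {
                public void run() {
                    try {
                        start.await(); // all threads start together: max contention
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                    for (int i = 0; i < perThread; i++) {
                        counter.increment();
                    }
                }
            });
        }
        start.countDown();
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
        assertEquals(threads * perThread, counter.get());
    }
}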
Tools selection
• do any team members already have experience with this tool?
• ease of use
• customer reviews, popularity, how active the discussion groups/blogs are (for learning)
• maturity
• support
• how active the development
• memory, processor requirement
• price/open-source
• easy to install/configure
• functionality: does this tool meet the requirements of the company's tests?
• demo/try before buy
Security
• Authentication: login, logout, guest, password strength (see the sketch after this list)
• Authorisation: permissions, admin functions
• Data overflow, huge input attack/DOS
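A sketch of password-strength and huge-input tests; PasswordPolicy and its rules (minimum length 8, mixed case, a digit) are hypothetical assumptions, not a standard:

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class PasswordStrengthTest {

    @Test
    public void rejectsWeakPasswords() {
        assertFalse(PasswordPolicy.isStrong(""));              // empty
        assertFalse(PasswordPolicy.isStrong("secret"));        // too short
        assertFalse(PasswordPolicy.isStrong("alllowercase1")); // no upper case
    }

    @Test
    public void acceptsStrongPassword() {
        assertTrue(PasswordPolicy.isStrong("S3cure-Passw0rd"));
    }

    @Test
    public void survivesOversizedInput() {
        // Huge-input/DoS check: a multi-megabyte "password" must be
        // rejected quickly instead of exhausting memory or CPU.
        StringBuilder huge = new StringBuilder();
        for (int i = 0; i < 5 * 1000 * 1000; i++) {
            huge.append('a');
        }
        assertFalse(PasswordPolicy.isStrong(huge.toString()));
    }
}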
For more complete security checklists see http://soa-java.blogspot.nl/2012/09/security-checklists.html
Source: Steve's blogs http://soa-java.blogspot.com/
Any comments are welcome :)
References:
• Software Testing and Continuous Quality Improvement by Lewis
• Code Complete by McConnell
• Department Of Health And Human Services, Enterprise Performance Life Cycle Framework, Checklist.