Database System Testing Process:
When testing a newer version of an existing application, it is relatively easy to compare the test results with performance data from the current production environment. That flexibility may be nonexistent when introducing a totally new application that has no older version already in production. In that case you need to establish clear baseline performance metrics, collect instance data, analyze execution plans and data access patterns, collect operating system and hardware data, and then begin the tuning process (a minimal baseline-collection sketch follows this paragraph).
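As a minimal sketch of capturing such a baseline, the Python fragment below snapshots a few cumulative instance statistics from Oracle's V$SYSSTAT view, assuming the python-oracledb driver is available; the connection details and the particular statistics chosen are placeholders rather than part of any prescribed method.

import oracledb

def collect_baseline(dsn: str, user: str, password: str) -> dict:
    """Snapshot a few cumulative system statistics from V$SYSSTAT."""
    stats_of_interest = (
        "user commits", "user rollbacks", "execute count",
        "physical reads", "physical writes", "parse count (total)",
    )
    # Build positional bind placeholders (:1, :2, ...) for the IN list.
    binds = ",".join(":" + str(i + 1) for i in range(len(stats_of_interest)))
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT name, value FROM v$sysstat WHERE name IN (" + binds + ")",
                stats_of_interest,
            )
            return dict(cur.fetchall())

if __name__ == "__main__":
    # Placeholder connection details for the non-production test instance.
    baseline = collect_baseline("dbhost/testpdb1", "perf_user", "perf_pw")
    for name, value in sorted(baseline.items()):
        print(f"{name:30s} {value}")

Saving such a snapshot before any change is made gives the "before" numbers against which later tuning runs can be compared.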
  • The test system should be in a non-production environment, and tuning must be performed in accordance with the performance tuning goals, which outline how changes are to be evaluated and implemented.
  • Establish clear benchmarks to understand the load and throughput the database system can handle (a throughput-measurement sketch follows this list).
  • Perform extensive unit (module-by-module) testing to identify and eliminate execution errors that could cause the application programs to terminate abnormally, as well as logic errors that may have been missed during desk checking or structured walk-throughs in the development phase (a unit-test sketch follows this list).
  • Perform a final, overall functional test of the new database system, following syntax error checking, the structured walk-through code review, the design walk-through, and unit/module testing.
  • Perform application testing (OLTP, web application, etc.) using many simulated online users, multiple application programs performing different functions, random access patterns, and continuous activity, while monitoring both peak and low utilization periods (a simulated-user sketch follows this list).
  • Simulate the entire application environment (servers, application tier, database layer, etc.) with as many users as possible.
  • Run load tests against the database using production data to measure the performance of each unit and of the overall system with a third-party I/O stress tool such as LoadRunner, PostMark, IOzone, or Oracle ORION; this validates aspects of the application that functional testing cannot (a stress-tool invocation sketch follows this list).
  • Monitor and compare the results with those obtained earlier through desk checking or structured walk-through techniques, or with quality assurance (QA) test results.
  • Document real-time and historical (collected-metrics) analysis that will provide the basis for determining and developing proper action plans and a troubleshooting task list (a metrics-logging sketch follows this list).
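
The throughput benchmark mentioned above can be as simple as driving a representative transaction for a fixed window and counting completions. In this minimal Python sketch, run_transaction is a hypothetical stand-in for one real business transaction against the test database.

import time

def benchmark(run_transaction, duration_s: float = 60.0) -> float:
    """Run the workload for a fixed window and report transactions per second."""
    completed = 0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        run_transaction()
        completed += 1
    return completed / duration_s

if __name__ == "__main__":
    # Stand-in workload; replace with a call into the real application tier.
    tps = benchmark(lambda: time.sleep(0.01), duration_s=5.0)
    print(f"Throughput: {tps:.1f} transactions/second")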
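
For the unit (module-by-module) testing step, a minimal sketch using Python's standard unittest framework is shown below; calculate_order_total is a hypothetical application routine standing in for a real module under test.

import unittest

def calculate_order_total(prices, tax_rate):
    """Example application routine: sum line items and apply tax."""
    if tax_rate < 0:
        raise ValueError("tax_rate must be non-negative")
    return round(sum(prices) * (1 + tax_rate), 2)

class OrderTotalTests(unittest.TestCase):
    def test_normal_order(self):
        self.assertEqual(calculate_order_total([10.0, 5.0], 0.1), 16.5)

    def test_empty_order(self):
        self.assertEqual(calculate_order_total([], 0.1), 0.0)

    def test_rejects_negative_tax(self):
        # A logic error such as silently accepting a negative tax rate is
        # caught here instead of surfacing as an abnormal termination later.
        with self.assertRaises(ValueError):
            calculate_order_total([10.0], -0.1)

if __name__ == "__main__":
    unittest.main()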
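
For the application-testing step with many simulated online users, the following minimal Python sketch spawns concurrent virtual users that issue a random mix of operations with think time between requests; submit_request and the operation names are hypothetical placeholders for calls into the real OLTP or web application.

import random
import time
from concurrent.futures import ThreadPoolExecutor

OPERATIONS = ("lookup_customer", "place_order", "update_inventory", "run_report")

def submit_request(operation: str) -> float:
    """Stand-in for one application call; returns a simulated response time."""
    elapsed = random.uniform(0.005, 0.050)
    time.sleep(elapsed)
    return elapsed

def simulated_user(user_id: int, requests: int = 20) -> list:
    """Each virtual user issues a random mix of operations with think time."""
    timings = []
    for _ in range(requests):
        timings.append(submit_request(random.choice(OPERATIONS)))
        time.sleep(random.uniform(0.0, 0.02))  # think time between requests
    return timings

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=50) as pool:
        results = list(pool.map(simulated_user, range(50)))
    all_times = [t for user in results for t in user]
    avg_ms = sum(all_times) / len(all_times) * 1000
    print(f"{len(all_times)} requests, average response {avg_ms:.1f} ms")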
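
For the load-testing step with a third-party I/O stress tool, the sketch below simply launches the tool as an external process and captures its output for later comparison; the ORION command line shown is illustrative only and should be verified against the tool's documentation for your platform and version.

import subprocess
from pathlib import Path

def run_stress_tool(command: list, log_path: Path) -> int:
    """Run the tool, save its combined output to a log file, return its exit code."""
    result = subprocess.run(command, capture_output=True, text=True)
    log_path.write_text(result.stdout + result.stderr)
    return result.returncode

if __name__ == "__main__":
    # Illustrative ORION invocation; check flags against the installed version.
    rc = run_stress_tool(
        ["./orion", "-run", "simple", "-testname", "mytest", "-num_disks", "4"],
        Path("orion_mytest.log"),
    )
    print(f"Stress tool exited with status {rc}; see orion_mytest.log")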
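
For the documentation step, the following minimal sketch appends each collected metrics snapshot to a CSV file with a timestamp so that both real-time values and historical trends can be analyzed later; the file name and metric names are placeholders, and the snapshot could come from a collector such as the baseline sketch shown earlier.

import csv
from datetime import datetime, timezone
from pathlib import Path

def append_metrics(csv_path: Path, metrics: dict) -> None:
    """Append one timestamped row so historical trends can be analyzed later."""
    fieldnames = ["timestamp"] + sorted(metrics)
    write_header = not csv_path.exists()
    with csv_path.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        if write_header:
            writer.writeheader()
        writer.writerow({"timestamp": datetime.now(timezone.utc).isoformat(), **metrics})

if __name__ == "__main__":
    append_metrics(Path("perf_history.csv"),
                   {"user commits": 1523, "physical reads": 88412})

A growing history file of this kind is what turns one-off observations into the collected metrics needed for action plans and a troubleshooting task list.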



