Design of a Performance Evaluation Tool for
Multimedia Databases with Special Reference
to Oracle




Submitted in fulfilment of the requirements for the degree of


MASTER OF SCIENCE


By


Tonia Stakemire




Computer Science Department
Rhodes University
Grahamstown
South Africa


April 2003


ABSTRACT

Increased production and use of multimedia data has led to the development of more advanced
Database Management Systems (DBMSs), such as the Object Relational Database Management
System (ORDBMS). These advanced databases are necessitated by the structural complexity of
multimedia data and the functionality it requires. Unfortunately, no suitable benchmarks exist
with which to test the performance of databases when handling multimedia data. This thesis
describes the design of a benchmark to measure the performance of the basic functionality
found in multimedia databases.

The benchmark, called MORD (Multimedia Object Relational Databases), targets Oracle, a
well-known commercial ORDBMS that can handle multimedia data. Although MORD targets Oracle,
it can easily be applied to other Multimedia Database Management Systems (MMDBMSs), because
its design stresses portability and simplicity. MORD consists of a database schema, test data,
and code to simulate representative queries on multimedia databases.
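
As an illustration only (the MORD schema and workload are defined in the body of the thesis),
a representative multimedia table and retrieval query might be sketched in Oracle SQL as
follows; the table and column names are hypothetical:

    CREATE TABLE image_store (
        image_id    NUMBER PRIMARY KEY,   -- surrogate key for each multimedia object
        description VARCHAR2(200),        -- descriptive metadata
        image_data  BLOB                  -- the multimedia content itself
    );

    -- Retrieve a single multimedia object by its identifier
    SELECT image_data
    FROM   image_store
    WHERE  image_id = 42;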

A number of experiments are described that validate MORD, ensuring that its design is correct
and that its objectives are met. A by-product of these experiments is an initial understanding
of the performance of multimedia databases. The experiments show that with multimedia data the
buffer cache should be at least large enough to hold the largest dataset, that a bigger block
size improves performance, and that turning off logging and caching for bulk loading improves
performance. MORD can be used to compare different ORDBMSs or to assist in the configuration
of a specific database.
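
As a minimal sketch only, and not the exact statements used in the experiments, the buffer
cache, logging and caching settings referred to above might be expressed in Oracle roughly as
follows, reusing the hypothetical image_store table from the earlier sketch (the block size
itself can only be chosen when the database is created):

    -- Size the buffer cache so that it can hold the largest test dataset
    -- (512M is an illustrative value, not one taken from the experiments)
    ALTER SYSTEM SET db_cache_size = 512M;

    -- Disable caching and redo logging for the multimedia LOB column
    -- before a bulk load, which reduces loading time
    ALTER TABLE image_store MODIFY LOB (image_data) (NOCACHE NOLOGGING);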


ACKNOWLEDGEMENTS

I would like to thank my supervisors, Professor David Sewry and Alfredo Terzoli, for their
continuous effort and support throughout the duration of my degree. I would also like to thank
Stefan Krueger for his encouragement and for proofreading various drafts of the thesis.
Finally, I would like to thank the members of the Rhodes University Computer Science Department
and my friends, who have helped in many different ways and have made my time here enjoyable.


CONTENTS

CHAPTER 1 : INTRODUCTION
1.1 MOTIVATION
1.2 MULTIMEDIA DATABASES AND THEIR PERFORMANCE
1.3 PROBLEM STATEMENT
1.4 AIM
1.5 FOCUS AND SCOPE
1.6 ORGANISATION OF THESIS

CHAPTER 2 : TUNING AND PERFORMANCE EVALUATION OF MULTIMEDIA DATABASES: A SURVEY
2.1 MULTIMEDIA DATABASES
2.1.1 Multimedia Data
2.1.2 Multimedia Databases
2.1.3 Multimedia Applications
2.2 DATABASE TUNING AND CONFIGURATION
2.2.1 Database Tuning
2.2.2 Operating System Tuning
2.2.3 Core Component Tuning Areas
2.2.4 Additional Database Tuning Areas
2.3 PERFORMANCE EVALUATION
2.3.1 Value of Performance Evaluations
2.3.2 Measuring Performance
2.3.3 Calculating Performance
2.4 SUMMARY

CHAPTER 3 : BENCHMARKING DATABASES: A SURVEY
3.1 OVERVIEW OF BENCHMARK
3.2 EXISTING BENCHMARKS
3.2.1 Generic Benchmarks
3.2.1.1 RDBMS - Wisconsin
3.2.1.2 ORDBMS - BUCKY
3.2.1.3 OODBMS - 007 and predecessors
3.2.2 Architecture Benchmarks
3.2.2.1 AS3AP
3.2.2.2 SPEC Benchmark
3.2.2.3 MediaBench
3.2.2.4 Other Architecture Benchmarks
3.2.3 Business Benchmarks
3.2.3.1 TPC Suites
3.2.4 Engineering Benchmarks
3.2.4.1 Sun Benchmark, 007 and Derivatives
3.2.4.2 EDB
3.2.5 Advanced Database Benchmarks
3.2.5.1 Semantic Benchmark
3.2.5.2 Michigan
3.2.5.3 SEQUOIA 2000
3.2.5.4 FTR Benchmark
3.3 CRITERIA FOR BENCHMARK
3.4 DESIGN OF BENCHMARK
3.4.1 Schema Design
3.4.2 Test Data
3.4.3 Workload
3.4.4 Performance Measurement
3.4.5 Programming Language
3.5 PROBLEMS WITH EXISTING BENCHMARKS
3.5.1 Schema Problems
3.5.2 Test Data Problems
3.5.3 Workload Problems
3.5.4 Programming Language
3.6 REQUIRED BENCHMARK
3.7 APPLICATION OF BENCHMARK
3.8 SUMMARY

CHAPTER 4 : ORACLE DBMS
4.1 INTRODUCTION TO ORACLE
4.2 ARCHITECTURE OF ORACLE
4.3 ORACLE COMPONENTS
4.3.1 Overall Structure
4.3.2 Oracle Processes
4.3.3 Server Processes
4.3.4 Physical Database Files ...