Webcast: Geek Sync | Tips for Data Warehouses and Other Very Large Databases
Whether on-premises or in the cloud, DBAs are often asked to create and manage optimal database designs for data warehouses, data lakes, and many other very large databases (VLDBs) built on relational database management systems. These databases are used for business intelligence, data mining, and data analytics, and they are radically different from traditional online transaction processing (OLTP) systems. So what special design concerns will you face? Which database editions and features should you rely on? What kinds of query execution plans should you aim for?
Join IDERA and Bert Scalzo as he covers the pertinent issues, some of which may even be considered best practices, for these highly specialized database requirements. While the basic concepts are universally applicable, examples will be primarily in Oracle, with some also in MySQL, SQL Server, and PostgreSQL.
A data warehouse is a large, centralized repository used to store, analyze, and manage vast amounts of structured and semi-structured data from sources across an organization. It supports efficient querying and reporting, enabling businesses to make data-driven decisions. Key features of a data warehouse include:
* Integration: Data from different sources is integrated into a single, unified format, making it easier to analyze and compare data across systems.
* Scalability: Data warehouses are built to handle large volumes of data, often in the terabytes or petabytes range, and can scale up as the organization’s data needs grow.
* Historical data storage: Data warehouses store historical data, allowing organizations to analyze trends and changes over time.
* Data quality and consistency: Data is cleaned, transformed, and standardized before being stored in the data warehouse, ensuring high quality and consistency across the organization.
* Support for complex queries: Data warehouses are optimized for running complex analytical queries, allowing users to extract valuable insights from the data (see the sketch after this list).
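To make the last point concrete, here is a minimal, self-contained sketch of the star-schema access pattern this kind of webcast typically discusses. It is not taken from the presentation: the table and column names (dim_date, dim_product, fact_sales) are purely illustrative, and Python's built-in sqlite3 module stands in for a real warehouse engine only so the example runs without a database server.

```python
# Illustrative sketch of a tiny star schema and an analytical query against it.
# Names are hypothetical; SQLite is used only so this runs self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes; the fact table holds the
# measures plus foreign keys to each dimension.
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_year INTEGER, calendar_month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold INTEGER,
    revenue REAL
);
""")

# A few sample rows so the query below returns something.
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(20240101, 1, 10, 100.0), (20240101, 2, 5, 75.0),
                 (20240201, 1, 7, 70.0)])

# A typical analytical query: join the fact table to its dimensions and
# aggregate revenue by month and category. This join-and-aggregate pattern
# is what star-schema designs and warehouse features such as partitioning
# and columnar or bitmap indexing are built to serve.
for row in cur.execute("""
    SELECT d.calendar_year, d.calendar_month, p.category,
           SUM(f.units_sold) AS total_units,
           SUM(f.revenue)    AS total_revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.calendar_year, d.calendar_month, p.category
    ORDER BY d.calendar_year, d.calendar_month
"""):
    print(row)

conn.close()
```

The same query shape applies on Oracle, MySQL, SQL Server, or PostgreSQL; what changes at VLDB scale is how the execution plan satisfies it, which is the focus of the webcast.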
About the Presenter
Bert Scalzo is an Oracle ACE, author, speaker, consultant, and a major contributor to many popular database tools used by millions of people worldwide. He has 30+ years of database experience and has worked for several major database vendors. He holds BS, MS, and PhD degrees in computer science, plus an MBA, and has presented at numerous events and webcasts. His areas of key interest include data modeling, database benchmarking, database tuning, SQL optimization, “star schema” data warehousing, running databases on Linux or VMware, and using NVMe flash-based technology to speed up database performance.
Please register to view the webcast replay.