Database and systems performance analysis has traditionally relied on queueing theory to identify performance bottlenecks. While this type of analysis quickly pinpoints short-term obstructions, it evaluates only a single metric per cycle and fails to consider the environment as a whole.

A successful alternative to this approach is the scorecard analysis.

Scorecards evaluate multiple metrics simultaneously in each analysis cycle, providing a more holistic view of the environment. After all, performance problems are rarely the result of a single metric. Identifying multiple corrective actions per analysis cycle makes better use of your analytical resources, improves operational efficiency, and delivers value-added benefits to the organization faster.

Multi-metric scorecards can identify high-value problem areas faster and more easily than traditional methods, whether you’re running a performance analysis for an application, business, database or system. A key feature of this methodology is its flexibility to adapt to environmental changes quickly and easily: Key Performance Indicators (KPIs) can be added, removed or altered based upon each KPI’s value to the analysis process.

Scorecard results can be easily tracked and reported over time, offering a historical perspective. If past performance data have been retained, historical cycles can be replayed with revised metrics to yield new insights into performance.

The scorecard methodology is an iterative process that can be used in both proactive and reactive environments. A proactive implementation requires extra planning and increased automation, and is suited to a dynamic environment. In contrast, highly structured, static environments can benefit from a reactive approach.

Here are four steps you should follow when using the Scorecard approach:

1. Define/Refine KPIs

The initial step is to identify relevant, meaningful KPIs that directly measure the performance of the targeted subject. A KPI can be directly measurable or a value calculated from other measured metrics. In addition to defining which KPIs to use, you must decide at the outset how often the data will be evaluated.
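The distinction between directly measured and calculated KPIs can be sketched in code. The example below is illustrative only; the KPI names, fields, and formulas are assumptions, not part of the methodology described here.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class KPI:
    """A KPI is either read directly from collected measurements or
    derived from other measured metrics; each carries an evaluation interval."""
    name: str
    eval_interval_minutes: int
    derive: Optional[Callable[[dict], float]] = None  # None => directly measured

    def value(self, measurements: dict) -> float:
        if self.derive is None:
            return measurements[self.name]   # directly measurable KPI
        return self.derive(measurements)     # KPI calculated from other metrics

# Hypothetical example KPIs:
cpu_busy = KPI("cpu_busy_pct", eval_interval_minutes=15)
cache_hit = KPI("cache_hit_ratio", eval_interval_minutes=15,
                derive=lambda m: m["cache_hits"] / (m["cache_hits"] + m["cache_misses"]))

sample = {"cpu_busy_pct": 87.0, "cache_hits": 940, "cache_misses": 60}
print(cpu_busy.value(sample))   # 87.0
print(cache_hit.value(sample))  # 0.94
```

Keeping the evaluation interval on the KPI itself makes the "how often" decision explicit from the start, and calculated KPIs can be redefined later without touching the collection layer.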

Because Scorecard analysis is iterative, each metric must be reviewed periodically to determine its value to the overall process. Obsolete or irrelevant metrics can be removed, and newly identified metrics can be added.

Retention of historical data allows for changed metrics to be reanalyzed quickly and easily.

2. Collect, Rank, Analyze Performance Data

Automation tools should be deployed to ensure consistent and repeatable data collection and retention.

There are numerous ranking methodologies available, such as dense, ordinal, fractional, standard competition and modified competition ranking. Select the methodology that best represents the relationship between metrics, and be sure each metric is ranked independently of the others.

The analysis phase should produce a list of ranked metrics in which the worst-performing attributes are easily identifiable.
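Two of the ranking methodologies named above can be contrasted with a short sketch. The severity values below are invented for illustration; the point is how ties are handled in standard competition ("1224") versus dense ("1223") ranking, with rank 1 marking the worst offender.

```python
def standard_competition_rank(values):
    """Standard competition ranking: ties share a rank, and a gap
    follows (1, 1, 3, ...). Higher value = worse = rank 1."""
    ordered = sorted(values, reverse=True)
    return [ordered.index(v) + 1 for v in values]

def dense_rank(values):
    """Dense ranking: ties share a rank with no gap after (1, 1, 2, ...)."""
    distinct = sorted(set(values), reverse=True)
    return [distinct.index(v) + 1 for v in values]

# Hypothetical severity scores for one metric across four servers:
severity = [40, 75, 75, 20]
print(standard_competition_rank(severity))  # [3, 1, 1, 4]
print(dense_rank(severity))                 # [2, 1, 1, 3]
```

Running each metric through the chosen ranking independently, then sorting by rank, yields the list in which the worst-performing attributes surface at the top.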

3. Identify and Implement Corrective Action(s)

Once the poorly performing attributes have been found, corrective action(s) must be identified. Because this is a multi-metric analysis approach, there will likely be multiple corrective actions.

4. Repeat Analysis Process

This process repeats on your predetermined analysis cycle; however, it is imperative that corrective actions be implemented before each new cycle so that the next set of poorly performing attributes can surface.
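The four steps can be tied together in a minimal loop. Everything here is an illustrative stub (the metric names, the collection logic, and the stand-in for corrective action are all assumptions), but it shows the key point of step 4: once the worst attribute is addressed, the next cycle surfaces a different offender.

```python
def collect(history):
    # Stand-in for automated collection (step 2). Prior corrective
    # actions (recorded in `history`) lower the targeted metric.
    return {"cpu_busy_pct": 90 - 86 * len(history), "lock_waits": 5}

def rank(readings):
    # Rank metrics independently; here simply by raw value, worst first.
    return sorted(readings, key=readings.get, reverse=True)

def run_cycles(n):
    history = []
    for _ in range(n):                # step 4: repeat each analysis cycle
        readings = collect(history)   # step 2: collect performance data
        worst = rank(readings)[0]     # step 2/3: worst-performing attribute
        history.append(worst)         # step 3: corrective action targets `worst`
        # (step 1 refinement of the KPI set would occur between cycles)
    return history

print(run_cycles(2))  # ['cpu_busy_pct', 'lock_waits']
```

In the first cycle the CPU metric is the top offender; after its "fix", the second cycle exposes lock waits, which would have stayed hidden had the corrective action not been implemented.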

How Odyssey can help

Your business’ data is its most valuable commodity. Our team of experts can help you keep it safe, keep it accurate, and keep it performing. From daily operations to performance design to backup and recovery, we will incorporate 20 years of experience keeping our clients’ databases running smoothly and always available. Click here to learn more.

About the Author

Greg Hunt
Solutions Architect
Greg Hunt is a technology professional with more than 30 years of industry experience in programming, systems management, data analytics, and query/database design and performance across a variety of platforms. After working for nearly a decade at HPE, Greg brought his extensive knowledge of architectural practices, data management and database platforms to Odyssey’s team in 2016, where he helps our clients improve overall data utilization and database performance.