The Feature Matrix and Other Tall Tales
Once upon a time, a marketing manager was hired to make their company's product look a lot better than anything else on the market. This marketing person said to themselves, "I know what to do; let's get creative. Let's create a list of features, or groups of features of our choosing, using specially crafted words to show how awesome we are." Ever since, the feature matrix has been a staple of marketing literature.
This may seem a little over the top, childish perhaps, but if you look closely at examples on the internet using your favourite search engine, you will soon be able to find your own. Below is a cropped image to show the kind of thing I mean. The companies involved are not relevant; it's the message and the tactics used that are important.
A prime example
In the image below, we can see some rather worrying, and outright confusing "statistics." At first glance, you may take this as legitimate. Vendor 4 clearly has a superior product. Then you may wonder what each of the features actually means. There were no further descriptions on the page that I could see to explain what each of these headings meant. This is confusing!
Let's take a tongue-in-cheek look at what the opposition currently offer (or don't offer). In each category, vendor 4 is the only vendor to have a complete offering.
- Personalization - In my view, this is a binary attribute; you can either personalize or you can't. The level of personalization is a configuration option and may or may not be appropriate for your needs.
- Policy - Again, only 1 vendor makes the grade, while 2 vendors have half "a policy."
- Security & Compliance - As a former DBA, this scares me; there is no security or compliance at all for 3 of these vendors. Installing via CD-ROM may lead to radioactive burns for all we know, as they are not compliant with any law in any land.
- Performance - 3 of the vendors have no levels of performance at all! This might, however, mean that the "independent" company performing this test couldn't actually start it. Vendor 4, of course, has full performance, whatever that means.
- File Sync & Share - Only vendor 4 can do both of these things. Strangely, as there are 2 features listed here, the other 3 vendors are rated at either 1/4 or 3/4. One would expect half or nothing.
- Analytics - Reporting on your operational data. Thankfully, vendor 4 offers what we can assume is at least one report, while vendor 1 can only manage to publish 1/4 of a report. Vendors 2 & 3 failed miserably; shame on them. Perhaps they offered a "Statistics" feature rather than an "Analytics" feature. It's hard to say.
Once you take a closer look at these, you realize that a feature matrix is not all that it could be. Maybe vendor 1 actually had all these features, but they were added the day after this matrix went live. Perhaps the matrix is several years old. It's possible that things have been worded in such a way as to discount the exact same feature available in another product. I specifically chose an example from a field I knew nothing about to make sure there was no bias for or against any vendor on the matrix.
SQL Server Professionals aren't so gullible
This scenario came to mind because I have seen similar claims made in feature matrices for products relating to SQL Server and the rest of the Microsoft data platform. Without naming names, in one specific example they call out another SQL Server vendor for omissions in its feature matrix. I can confirm that those features were not actually missing from products produced at my former and current employers, and that the matrix was grossly out of date the moment it was published.
Did they hire somebody independent to do this? I honestly don't know, so it would be unfair to speculate or imply anything misleading was done on their part. And this is merely one example among many.
The exception to the rule
The one time you should look at a feature matrix, and know that it can be trusted, is when a company is comparing its own products and nobody else's. You will undoubtedly see this on numerous sites where a company offers a free and premium service. Hosting companies are a good example of this.
OK, there may actually be two exceptions, the other being where the matrix is dated, version numbers for each product have been used, and where the comparison has been performed by a completely independent body. A good example of this might be the Gartner Magic Quadrant reports. The caveat here is that, although it is independent, each vendor listed has paid to be there. And you can't always use those to help evaluate specific types of software. At the time of writing, there is no Magic Quadrant for SQL Server monitoring tools, for example.
Creating your own feature matrix
Because of how fallible the feature matrix can be, it is important that you do not pin a large part of your evaluation process on such a construct. You can read more about what you should be concentrating on in this post: How to Successfully Evaluate a Monitoring Tool.
What I will say is that you should come up with your own list of what you need to monitor: which technologies do you use and need to monitor? A subset of the following would be a good bet for most SQL Server professionals:
- SQL Server
- Analysis Services (SSAS)
- Reporting Services (SSRS)
- Integration Services (SSIS)
- Hyper-V / VMware
- Azure SQL Database
- Analytics Platform System (APS)
- Azure SQL Data Warehouse (SQL DW)
Which features or events do you need to monitor?
These should include, but not be limited to:
- Long running queries
- Waits and Queues information
- High Availability technologies
- Disk contention
- Baseline information
- Query plan analysis
- Other product integration
- Different SLAs for different servers
- Different SLAs for different databases
- Custom alerts
A tick-box approach is one way of looking at things; however, you should probably use a scoring system in order to find the most comprehensive product. For example, all systems are likely to have an alerting engine, but which is the most flexible and the most comprehensive?
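To make the scoring idea concrete, here is a minimal sketch in Python. The tool names, requirements, weights, and scores are all invented for illustration; substitute the requirements and ratings from your own evaluation.

```python
# A minimal weighted-scoring sketch for comparing monitoring tools.
# All names, weights, and scores below are hypothetical examples.

# Weight each requirement by how important it is to YOUR business (0-5).
weights = {
    "Long running queries": 5,
    "Waits and Queues information": 4,
    "Disk contention": 3,
    "Custom alerts": 4,
}

# Rate each candidate tool per requirement (0 = absent, 5 = excellent).
# This is where scoring beats tick boxes: a 1 here means
# "present but barely usable", which a simple tick would hide.
scores = {
    "Tool A": {"Long running queries": 5, "Waits and Queues information": 3,
               "Disk contention": 4, "Custom alerts": 2},
    "Tool B": {"Long running queries": 4, "Waits and Queues information": 5,
               "Disk contention": 2, "Custom alerts": 5},
}

def weighted_total(tool_scores, weights):
    """Sum of (score * weight) across all weighted requirements."""
    return sum(tool_scores.get(feature, 0) * weight
               for feature, weight in weights.items())

# Rank the candidates from best to worst overall fit.
for tool in sorted(scores, key=lambda t: weighted_total(scores[t], weights),
                   reverse=True):
    print(f"{tool}: {weighted_total(scores[tool], weights)}")
```

With these made-up numbers, a tool that lacks one feature entirely can still win overall if it excels at the features you weighted most heavily, which is exactly the nuance a binary tick-box matrix throws away.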
I'll discuss more regarding scoring systems in an upcoming post about RFPs.
Why it's unfair to ask for a comparison
If I had a dollar for each time somebody asked me how feature X from product Y compares to feature A from product B, then:
- I would have a lot of dollars.
- It would be annoying because I wouldn't be able to spend them in my country.
Anybody who works for a company making products will, intentionally or unintentionally, put a positive spin on their own product. At the end of the day, it is that person's job to promote their solutions over ours. Thankfully, I am blessed to work at a company whose tools I believe in; solutions that I myself have used in the field in a previous role. The most you will ever hear from me on this matter is "to my knowledge, this is where we differ," and I'll only say that if I know the other tool well. If I do not, then I will simply say so. It would be unethical of me to do otherwise.
In this post in the series on choosing the correct monitoring tool, we have been talking about the dangers of accepting information in a feature matrix at face value. A feature matrix may provide a good starting point to kick off your own investigations, but the data in it should be verified against the current versions of the solutions from the other vendors.
Ideally, you should create your own matrix consisting of the things that are important to your business needs, some of which have been listed above as a starting point.
Don't forget that this is not a tick box exercise; refer back to my post on How to Successfully Evaluate a Monitoring Tool for further information.
In a future post, we will be expanding upon the concept of the feature matrix to create a Request for Proposal (RFP).
Richard (@SQLRich) is a Principal Solutions Engineer at SentryOne, specializing in our SQL Server portfolio offering in EMEA. He has worked with SQL Server since version 7.0 in various developer and DBA roles and holds a number of Microsoft certifications. Richard is a keen member of the SQL Server community; previously he ran a PASS Chapter in the UK and served on the organizing committee for SQLRelay.