How to Successfully Evaluate a Monitoring Tool
In a previous post, I covered how to decide whether to "Build or Buy" a monitoring solution. A key part of that process is being able to evaluate the offerings on the market. To keep things readable, I decided to split the whole process into multiple parts. In another post, I wrote about comparing feature matrices, along with a few other subjects you might find useful when deciding whether to purchase any kind of new monitoring software. In this post, we will talk about the best ways to evaluate your choices.
Back to the subject of this post: evaluating a monitoring tool. I chose this title very deliberately. I could have gone with "How to Evaluate Monitoring Tools," but to me that suggests there is no clear goal. You, as the evaluator, need to know what the outcome should be, and that outcome should be based on the factors you want a solution like this to address.
A question that often comes up in meetings is, "What would success look like?" To me, it's my favourite football team, Spurs, winning the English Premier League! That is never a popular answer with the person asking the question, but it generally raises a few smiles and lightens the mood. You, however, are more likely interested in monitoring software and what success means in that scenario. As I see it, success means reaching a conclusion that is beyond doubt. Now, success could mean deciding that the software you are evaluating is not as good as the current incumbent. That is still a successful outcome: you have decided that you already own the best solution for you. Congratulations! It can also mean that a particular solution meets all of the criteria your business needs in order to solve technical issues and grow.
Deciding how and when to evaluate
Deciding how and when to evaluate really does depend on what you are trying to achieve and the resources you have to evaluate with. More on the required resources in the next main section.
The best way to evaluate, in my opinion, is to be hands-on. This does involve some investment of time on your part. Time can be expensive, not only from an operational cost perspective, but also because SQL Server DBAs tend to be busy people. Studies suggest that the typical SQL Server DBA looks after more instances than their Oracle counterpart, which makes time a scarce commodity.
If you are short on time, ask a vendor for a demo. Requesting a demo before installing the software has pros and cons. On the one hand, you could save yourself a lot of time: you might decide during the demo that the tool is not for you, dramatically reducing the amount of time needed to (not) evaluate. A demo first also helps with configuration, as the vendor on the call can walk you through the architecture and make installation easier for you.
The benefit of having a demo after installing is that you can ask specific questions around things that you have noticed through using the product. Some vendors will be happy for you to have more than one demo if you feel you would benefit from that, or not all your team members were available, perhaps due to time zone differences.
If you would like a demo of any of our solutions, you can book an appointment with an engineer through our online booking system: https://sentryone.com/BookADemo.
If there are no times that suit your availability, check to see if there is a live webinar happening soon. There are also recorded webinars at https://www.sentryone.com/resources.
What happens in a demo?
The answer will vary from vendor to vendor, and possibly with whoever is conducting the demo and their skill level. Obviously, I can only speak from my own experience, so I'll outline the format of my own proof of concept (POC) discussions.
- Introductions - This is the time for you to say what your expectations are from the session.
- Talk about the overall product offerings - There is no point discussing solutions that do not fit your business needs, so feel free to stop the demo if you feel the tool will not help you.
- Talk about the architecture - This is especially important if you have not installed the tool yet.
- Discuss key features
A good sales engineer will take their lead from you and try to make sure you have everything that you need to perform a successful evaluation. As monitoring products can be quite feature intensive, do not expect to be able to master everything in one session. You may want to have another session later on to make sure that you are on the right track.
At SQL Sentry, the first technical demo is just that. We are not the sales team; we wish to make sure that you have the knowledge to make an informed decision. If you want to know about pricing, we are pretty up front about the basic pricing structures; if your situation is more complex, or you are up for a competitive upgrade, then we will introduce you to a member of the sales team. There is no hard sell.
Proactive vs Reactive, vs Feel the Pain
I've lost count of the number of times I have heard phrases like:
- "We want solutions not problems"
- "We need to be proactive rather than reactive"
- "We need to do more with less"
If you hear any of these phrases, they are clear signs that your manager and your business are crying out for efficiencies, and efficiencies are something monitoring tools can provide in spades. One of the core benefits of a monitoring tool is that it offloads manual effort and produces repeatable benefits across every target you wish to monitor.
If being proactive is such a good thing, why did I mention "Feel the Pain" in the subtitle? Well, unfortunately, while businesses are keen to say that we should be proactive, when money is tight there is a tendency to spend only on those things deemed necessary. Unless there is a significant pain that you can show you can reduce with whichever solution you are evaluating, your chances of a successful purchase diminish. We are in danger of rushing into my post on building a business case here; suffice it to say that, as part of your evaluation, you need to quantify in fiscal terms how feature x of product y would help your business.
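A back-of-the-envelope calculation can help frame that fiscal conversation. The sketch below is a minimal example; every figure in it (incident counts, hours, rates, the assumed reduction) is a hypothetical placeholder you would replace with your own numbers.

```python
# Rough annual-savings estimate for a monitoring tool.
# All figures below are hypothetical placeholders.

def annual_savings(incidents_per_month, hours_per_incident,
                   hourly_rate, reduction_factor):
    """Estimated money saved per year if the tool cuts
    incident-handling time by `reduction_factor` (0.0 to 1.0)."""
    hours_saved = incidents_per_month * 12 * hours_per_incident * reduction_factor
    return hours_saved * hourly_rate

# Example: 10 incidents/month, 3 hours each, $75/hour of DBA time,
# and an assumed 40% reduction in time-to-resolution.
savings = annual_savings(10, 3, 75, 0.40)
print(f"Estimated annual savings: ${savings:,.0f}")
```

Weigh the result against the license cost of the tool; if the savings comfortably exceed it, you have the beginnings of a business case.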
Using an RFP approach
RFP stands for Request for Proposal. In some circumstances, companies will invite others to tender for business, then pick those they wish to evaluate based on the answers given in the proposal. This approach can reduce the time needed to find suitable candidates, as the company will not need to install and evaluate every piece of software on the market; they can simply cherry-pick the top two or three for a head-to-head based on the proposal scores. RFPs tend to be more prevalent when working with government organizations, though you may find that your company is bound to this approach by law or internal policy. I'll cover RFPs in more detail in another post.
What resources are required for a successful evaluation?
In my time as a sales engineer for both SQL Sentry and Quest/Dell Software, I have seen evaluations that I knew would fail from the very beginning. This had nothing to do with my competency, the software, or indeed the person performing the trial; it was usually a matter of not having the correct political support from the start.
In order to successfully evaluate any kind of software, you need to be able to simulate what you will be doing with it in a live environment. I realize that there is a risk to just letting things loose on production servers, as you cannot quantify what kind of workload could be put on an instance of SQL Server. Here at SQL Sentry we can quantify this; we're the only company to my knowledge that has created a white paper to outline the typical overhead. You can read more about this in our overhead analysis paper.
A lack of that support results in people installing a copy of a vendor solution on their desktop or a server somewhere and pointing it at something unimportant, perhaps a server with little activity. All that can come out of this scenario is familiarity with the interface; you gain little insight into the capabilities of the offering.
To gauge the usefulness of a monitoring tool, you need to be able to monitor something. You need to know that there will be errors and that those errors can be caught and reported on. You need to know that, if there is an error, an action could be performed; you need to test how those actions are performed and if those actions could be integrated into other systems that you may already possess or plan to implement.
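One way to make those action tests concrete is to script the logic you expect the tool to reproduce: raise a known condition on a test target and verify that an alert fires and its action runs. The sketch below is purely illustrative; the rule names, thresholds, and the `send_email` stub are my own inventions, not any vendor's actual API.

```python
# Minimal sketch of alert rules firing actions (all names hypothetical).

def send_email(subject, body):
    # Stub action: a real one might call smtplib, a webhook, or a ticketing API.
    print(f"EMAIL: {subject}")

RULES = [
    # (rule name, predicate over a metrics sample, action to run)
    ("High CPU", lambda m: m["cpu_percent"] > 90, send_email),
    ("Blocking", lambda m: m["blocked_sessions"] > 0, send_email),
]

def evaluate(metrics):
    """Run every rule against a metrics sample; return the names that fired."""
    fired = []
    for name, predicate, action in RULES:
        if predicate(metrics):
            action(f"[ALERT] {name}", f"Sample: {metrics}")
            fired.append(name)
    return fired

# Simulate a known-bad sample during the trial, then confirm both rules fire.
sample = {"cpu_percent": 97, "blocked_sessions": 2}
print(evaluate(sample))
```

If you know in advance which conditions should fire which actions, you can check each candidate tool against the same script of scenarios and compare like for like.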
I've had some fun with unusual integrations. Have a look at the following posts for some interesting integration options:
Make a list of the core problems you see on a regular basis: deadlocks, blocking, high CPU, long-running code, and so on. Then check what offerings the vendor has to monitor and diagnose those areas. Whilst doing that, the solutions might even highlight other areas that you had no idea were a problem!
From an implementation perspective, you need to make sure that you have the right hardware available. In this day and age of virtualization, this is not as much of an issue as it used to be. The main problem tends to be of understanding the architecture and the permissions that you will require to successfully connect to a target. Again, to make this easier on yourself, think about booking a demo.
The evaluation process
As mentioned before, booking a demo is a great start; it's entirely up to you if you want to download a vendor solution prior to a demo or after. Some people will be tempted not to go down the route of requesting a demo, as they do not want to give their real e-mail address. We're not spammers; we want to make sure that you have the right solution for your needs. If that's us, great. If not, we'll strive to be further down the line. We won't aggressively contact you about it though.
Vendors will generally have an evaluation version that you can download for free. Most of these will have full functionality, although they may have limits on certain parameters, such as the number of targets that you can monitor. If you wish to monitor a larger number, ask the vendor. Likewise, if the trial is not long enough, ask the vendor for an extension. Most will be happy to oblige.
I'll be talking about comparing features in another post; there are some feature matrices available online, but my recommendation is to create your own. Start with some of the top issues you face now. Which vendors' tools can see them? Which help you diagnose why they happened? After that, you can start to look at the added value they provide. Is this a tool you see yourself outgrowing? Create a spreadsheet or database to keep all these results in. If you create a database, don't forget to back it up :-)
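If a spreadsheet feels too informal, a few lines of code can keep your matrix honest. This is a minimal sketch of a weighted feature matrix; the vendor names, features, weights, and scores are all made up for illustration, so substitute the issues and ratings from your own evaluation.

```python
# Hypothetical weighted feature matrix; all weights and scores are illustrative.

# Weight each requirement by how much the issue hurts you today (1-5).
weights = {"deadlock analysis": 5, "blocking detection": 4,
           "baselining": 3, "alert integrations": 2}

# Score each vendor per feature from your hands-on evaluation (0-5).
scores = {
    "Vendor A": {"deadlock analysis": 4, "blocking detection": 5,
                 "baselining": 2, "alert integrations": 3},
    "Vendor B": {"deadlock analysis": 3, "blocking detection": 3,
                 "baselining": 4, "alert integrations": 5},
}

def weighted_total(vendor_scores):
    """Sum of (feature weight x vendor score) across all requirements."""
    return sum(weights[f] * vendor_scores[f] for f in weights)

# Rank vendors by weighted total, highest first.
for vendor, s in sorted(scores.items(), key=lambda kv: -weighted_total(kv[1])):
    print(f"{vendor}: {weighted_total(s)}")
```

Because the weights encode your priorities, two vendors with similar raw feature counts can rank very differently once your real pain points are taken into account.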
You are not just evaluating the software
There are myriad other aspects that you need to consider.
- What version number is the software on? A higher version number typically suggests more established software, which often means fewer bugs, though numbering schemes vary between vendors.
- When was the last release made? How frequently are new updates released? Does this company seem innovative, or just playing catch-up?
- What is the technical documentation like? What is support like?
- Are there tutorials available online? Do they offer courses?
- How do you feel about the quality of communication between yourselves and this company?
- Are these people you can work with?
- What do your peers think?
If you are looking to check out a holiday destination, you would look at TripAdvisor; for general merchandise, you would probably read reviews on Amazon. For software, you can go to Trustradius.com and read what your peers think about the SQL Server monitoring tools on the market.
Hopefully, this has provided you with some food for thought. The key takeaways are:
- Identify your needs
- Identify vendor solutions in that space
- Compare the features against your needs through an RFP, demo or POC
- Evaluate the company as well as the software
- Make your recommendations
Look out for more posts in the series, where I'll be offering advice on creating your own feature matrices, RFPs, and more.
Richard (@SQLRich) is a Principal Solutions Engineer at SentryOne, specializing in our SQL Server portfolio offering in EMEA. He has worked with SQL Server since version 7.0 in various developer and DBA roles and holds a number of Microsoft certifications. Richard is a keen member of the SQL Server community; previously he ran a PASS Chapter in the UK and served on the organizing committee for SQLRelay.