White Paper

3 Hacks to Improve SSAS Memory Efficiency

 

Memory is everything—especially when it comes to SQL Server Analysis Services (SSAS), where memory constraints are a common issue. Because a Tabular-mode database is designed to reside entirely in memory, performance bottlenecks in that mode can be particularly challenging.

Although every environment is different, in SSAS Tabular mode the in-memory database is typically compressed to as little as one-tenth of its size on disk. But if the system isn’t configured properly, various data-processing actions can gum up the works (so to speak). A memory-efficient data model relieves that memory pressure when processing data in SSAS.

Three specific configuration hacks can turn your data model into the picture of efficiency. These hacks help you optimize SSAS performance, reduce bottlenecks, and boost speed.

 

1. Adjust Memory Limits

 

SSAS uses configurable memory limits to instruct its cleaner on the handling of both low and high total system memory. If you run SSAS on the same machine as a relational SQL Server instance, you can encounter conflicts when these two programs compete for resources. The cleaning process can also affect performance in and of itself, so keeping track of these settings is important.

The three most important memory limits in Tabular mode are low, high, and VertiPaq.

  • Low memory defaults to 65% of the physical memory on the server. When the low memory limit is exceeded, the SSAS cleaner begins to evict items from cache and looks for other opportunities to free up memory.
  • High (or total) memory defaults to 80% of the physical memory. When the total memory limit is exceeded, the cleaner more aggressively removes items from cache and does whatever it can to reduce memory use to below the configured limit.
  • VertiPaq (technically the xVelocity in-memory storage engine) memory defaults to 60% of the physical memory and is further defined by SSAS paging policies. In Tabular mode only, this setting defines the memory limits specific to the in-memory storage engine.


You can use these settings to allocate more memory to SSAS (if free memory is available on the server). Be aware of how the values are interpreted: a setting of 100 or below is treated as a percentage of total physical memory, while a setting above 100 is treated as an absolute number of bytes. Don’t let this distinction trip you up.
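The interaction between the percentage-based defaults and the bytes-above-100 rule can be illustrated with a minimal sketch. The function name here is hypothetical, and real server behavior may differ in detail; this simply shows how the limits described above resolve to byte counts:

```python
def resolve_memory_limit(setting: float, physical_bytes: int) -> int:
    """Interpret an SSAS-style memory limit setting.

    Values of 100 or below are treated as a percentage of physical
    memory; values above 100 are treated as an absolute byte count.
    """
    if setting <= 100:
        return int(physical_bytes * setting / 100)
    return int(setting)

# Example: a 64GB server with the default Tabular limits
physical = 64 * 1024**3
low      = resolve_memory_limit(65, physical)  # low memory limit default
high     = resolve_memory_limit(80, physical)  # total memory limit default
vertipaq = resolve_memory_limit(60, physical)  # VertiPaq memory limit default
print(low, high, vertipaq)
```

A setting of 2684354560 (above 100) would instead be read as exactly 2,684,354,560 bytes—roughly 2.5GB—which is why an intended percentage accidentally entered as a large number behaves very differently.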

Two strategies can lead to SSAS performance benefits:

  • Monitor for paging. By default, Tabular mode allows for paging, so although the entire database is supposed to be compressed in-memory, the system can still page to disk—slowing performance. (It can be difficult to determine within Perfmon whether paging is occurring. In the SentryOne BI Sentry dashboard, you can correlate various metrics to make this determination.) Don’t assume that no paging is occurring and negatively impacting performance, simply because there is no obvious messaging from the server to that effect.

 

Memory Usage Shown on BI Sentry Dashboard


 

  • Set memory limits to at least 2.5x the compressed data size. With BI Sentry, you can quickly determine memory use and allocation and whether chronic memory issues are occurring. To prevent such issues, allow memory limits of at least 2.5x the compressed size of a full copy of the database. The Process Full method creates a shadow copy of your data each time it’s run, so this headroom accommodates that copy plus additional overhead from requests arriving while data is being processed. For example, a 10GB dataset on disk might compress to 1GB in memory; set the VertiPaq memory limit to at least 2.5GB to avoid errors or paging. (Also consider using an alternative processing method, when applicable, to further relieve memory pressure.)
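The 2.5x sizing guideline can be expressed as a quick back-of-the-envelope calculation. The function and the ~10x compression ratio are illustrative assumptions (actual compression varies by model), matching the worked example above:

```python
def recommended_vertipaq_limit(on_disk_gb: float,
                               compression_ratio: float = 10.0,
                               headroom: float = 2.5) -> float:
    """Estimate a minimum VertiPaq memory limit (in GB) for Process Full.

    compression_ratio: assumed in-memory compression (~10x is typical,
    but varies by model).
    headroom: 2.5x the compressed size covers the shadow copy that
    Process Full creates, plus overhead from concurrent requests.
    """
    compressed_gb = on_disk_gb / compression_ratio
    return compressed_gb * headroom

print(recommended_vertipaq_limit(10))  # 10GB on disk -> ~1GB compressed -> 2.5
```

Treat the result as a floor, not a target—query workloads running during processing will push actual memory needs higher.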

 

2. Include Only Necessary Columns

 

An easy way to decrease the data model size—thus making it more memory-efficient—is to limit the columns that are included in the data model. By including only the necessary columns from the relational data source, you can save memory during columnar compression. BI Sentry automatically collects object memory usage data that you can use to identify memory-intensive columns that you might not need.
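The principle is simply to source explicit column lists rather than entire tables. This sketch uses an in-memory SQLite table with hypothetical column names to contrast the two approaches—the same idea applies to the query feeding your Tabular model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE Sales (
    OrderID INTEGER, OrderDate TEXT, Amount REAL,
    InternalNotes TEXT, RowChecksum TEXT)""")

# SELECT * pulls every column into the model, needed or not
wide = [d[0] for d in conn.execute("SELECT * FROM Sales").description]

# Sourcing only the columns the model actually needs keeps
# unneeded data out of columnar compression entirely
narrow_query = "SELECT OrderID, OrderDate, Amount FROM Sales"
narrow = [d[0] for d in conn.execute(narrow_query).description]

print(wide)    # ['OrderID', 'OrderDate', 'Amount', 'InternalNotes', 'RowChecksum']
print(narrow)  # ['OrderID', 'OrderDate', 'Amount']
```

Columns such as audit notes or checksums that never appear in a report are typical candidates to drop at the source query.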

 

Object Memory Usage


 

3. Limit Cardinality

 

The third hack is also a simple one: When possible, limit cardinality. Columns with many unique values compress minimally (or not at all), reducing memory efficiency. You can improve compression by modifying or splitting columns that have high cardinality. DateTime columns are a great example: Consider removing the Time element if it isn’t needed, or split the column into separate Date and Time columns, decreasing cardinality and thus increasing compression. You can also use BI Sentry to identify high-cardinality columns.
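The effect of the DateTime split can be seen with a short sketch using synthetic timestamps (the data here is invented purely for illustration). One combined DateTime column has a distinct value per row, while the split Date and Time columns each carry far fewer distinct values for the engine to encode:

```python
from datetime import datetime, timedelta

# Hypothetical fact rows: 100,000 minute-resolution timestamps
start = datetime(2023, 1, 1)
stamps = [start + timedelta(minutes=7 * i) for i in range(100_000)]

combined_cardinality = len(set(stamps))            # one value per row
date_cardinality = len({s.date() for s in stamps}) # distinct calendar dates
time_cardinality = len({s.time() for s in stamps}) # distinct times of day

print(combined_cardinality)                # 100000
print(date_cardinality, time_cardinality)  # 487 1440
```

Here the combined column would need 100,000 dictionary entries, while the split columns need only 487 and 1,440 respectively—each far easier to compress.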

 

Get Better Insight for Memory Efficiency

 

These three little hacks can make a big difference in the efficiency of your data model—and SSAS performance. The changes discussed here are even easier to implement with the insight that BI Sentry provides.