Encryption performance depends on many parameters, such as I/O throughput, network latency, caching, processor performance, application design, and anti-virus configuration. That makes it hard to measure. Pure I/O benchmarks usually do not reflect real-life situations, and they are hard to interpret even for engineers. Depending on their parameterization and caching constellation, they may show anything from 80% performance degradation to a 10% acceleration, the latter being physically impossible and a sign of how misleading such numbers can be.
To understand file and folder encryption performance, you need to know a few principles about file systems and filter drivers. Two things are relevant: performance and reliability.
Reliability is very important, but it is not the focus of this KBA. Suffice it to say that filter drivers must not degrade a file system's stability or corrupt files. For this reason, programmers cannot apply every conceivable tweak: some would improve performance, but at the risk of blue screens or data loss.
High-performance file access is achieved through a combination of measures. In the following paragraphs we discuss the most relevant aspects of performance.
Adding a file header to each encrypted file naturally has a cost: an additional 4 kB of data is read for each file, though usually only once, thanks to caching. These 4 kB hardly matter for large files, but they can become a significant overhead when a huge number of small files is involved. You may therefore notice a performance reduction with applications that encrypt many small files (e.g. temporary files).
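To get a feel for the numbers, here is a small illustrative sketch (not SafeGuard code) of the relative overhead of a fixed 4 kB header for different file sizes:

```python
# Illustrative only: relative overhead of a fixed 4 kB encryption header.
HEADER_BYTES = 4 * 1024  # 4 kB header prepended to each encrypted file

def header_overhead(file_size_bytes: int) -> float:
    """Fraction of extra data read because of the header."""
    return HEADER_BYTES / file_size_bytes

# For a 100 MB document the header is negligible.
print(f"100 MB file: {header_overhead(100 * 1024**2):.4%} overhead")

# For a 1 kB temporary file the header quadruples the data read.
print(f"1 kB file:   {header_overhead(1024):.0%} overhead")
```

Copying thousands of tiny files therefore multiplies the read volume, while a folder of large files barely notices the header.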
Some applications are poorly implemented and cause unusual performance issues. What should applications do to avoid performance degradation? Here are a few tips (from real examples):
Poor application design often goes unnoticed because it does not noticeably harm a plain file system. But once a filter driver is involved, a normally imperceptible delay of, say, half a second may turn into an annoying 10 seconds. When this happens, it is time to use performance measurement tools (e.g. Process Monitor from Microsoft's Sysinternals suite) to identify the main bottleneck.
The filter driver is involved every time a new file handle is created. It compares the file's name and location against the encryption rules, which is essentially a pattern matching process. In addition, if the file already exists, the driver checks whether it is encrypted by reading the first 4 kB (the file header). If the file turns out not to be encrypted, this read is overhead, but with almost no impact: the block is cached and is not read again when the application requests it.
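The rule lookup can be pictured roughly as follows. This is a sketch, not the driver's actual logic: the path patterns and the magic marker are invented for illustration, since the real rule format and header layout are not documented here.

```python
# Hypothetical sketch of the per-handle checks a filter driver performs.
# Patterns and the magic value are invented examples, not SafeGuard's.
import fnmatch

ENCRYPTION_RULES = [  # (path pattern, encrypt?) - first match wins
    (r"C:\Users\*\Documents\*", True),
    (r"C:\Windows\*", False),
]

MAGIC = b"SGNENC"  # invented header marker; the real layout is proprietary

def rule_says_encrypt(path: str) -> bool:
    """Pattern-match the file's name and location against the rules."""
    for pattern, encrypt in ENCRYPTION_RULES:
        if fnmatch.fnmatch(path, pattern):
            return encrypt
    return False  # no rule matched: leave the file in plaintext

def is_encrypted(first_4k: bytes) -> bool:
    # Reading the first 4 kB once answers "is this file encrypted?".
    # The block is cached, so the application will not read it twice.
    return first_4k[: len(MAGIC)] == MAGIC
```

In the kernel this matching must be fast, because it runs on every handle creation; caching the verdict per file avoids repeating it.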
On network file systems, multiple users can potentially access a file at the same time, so caching for write operations is turned off. (It stays on in exclusive mode, which would be sufficient for most use cases, but applications tend to open files in shared mode, and the filter driver must not override that.) The reason is reliability: our top objective is to avoid corrupted files, especially in shared-use scenarios. We therefore decided to bite the bullet and switch the network write cache off.
To be more specific, this means when a file is opened in the corresponding mode:
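As a rough illustration of the trade-off, here is a POSIX-style sketch in Python using `O_SYNC`, which, like write-through on Windows, forces each write to reach stable storage before returning. The real filter driver operates in the kernel and does nothing like this; the sketch only shows why uncached writes cost throughput but protect the file.

```python
# Sketch of the reliability trade-off: synchronous (uncached) writes mean a
# crash cannot leave a half-written file, at the price of slower writes.
# os.O_SYNC is a POSIX analogue of Windows' write-through behavior.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "shared.doc")

fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_SYNC, 0o600)
try:
    # Each write is pushed to stable storage before the call returns.
    os.write(fd, b"payload")
finally:
    os.close(fd)
```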
Besides that, users rarely perceive network data access as the slow part. It is usually an application launch or the Windows boot process that makes a system feel sluggish. However, system files and applications reside on the local disk, they can be cached, and SafeGuard applies its performance optimizations there. If you want, you can create additional "encryption off" policies that tell the filter driver not to inspect certain applications, drives, or folders. You then cannot work with encrypted files in those locations, but they should not be there anyway, and the speed remains high.
There are worst-case scenarios, such as copying thousands of files of only a few bytes each, which lead to huge performance degradation. Such scenarios are artificial, but they still appear in some pure disk access benchmarks.
In situations with heavy file I/O (e.g. copying a folder to another location) you can expect a performance degradation of between 30 and 50%. Influencing parameters are file and block sizes, CPU speed, the availability of the AES-NI instruction set for hardware-accelerated encryption, network latency, and so on. Additional side effects come from cache sizes, network bandwidth, disk I/O rates, and similar factors.
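A quick back-of-envelope calculation shows what such a degradation means in practice; the 60-second baseline is an invented example figure, not a benchmark result.

```python
# Back-of-envelope: what a 30-50% throughput degradation means for a copy.
baseline_seconds = 60.0  # assumed time to copy a folder without encryption

for degradation in (0.30, 0.50):
    # If throughput drops by a fraction d, the same copy takes
    # 1 / (1 - d) times as long.
    slowed = baseline_seconds / (1.0 - degradation)
    print(f"{degradation:.0%} degradation: {slowed:.0f} s instead of 60 s")
```

So a copy that took one minute in plaintext can take roughly one and a half to two minutes with encryption, depending on where in the 30-50% range the workload lands.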
For typical office use cases this goes down to between 15 and 30%. Loading or saving a document does become slower than without encryption, but it normally stays within acceptable bounds, and for most daily work there is not much data transfer anyway.