Industry Voice: 7 criteria for data encryption

Encrypting data provides a critical last line of defense for information that becomes compromised. However, improperly implemented encryption strategies can actually create additional vulnerabilities. To ensure adequate protection, we recommend following these seven criteria for encrypting data:

1. Know your encryption options

The basic options for encrypting data in transit are FTPS, SFTP and HTTPS. FTPS is the fastest, but it is also the most complex, with both implicit and explicit modes and a wide range of data ports that must be kept open. SFTP, by contrast, requires only a single port for an encrypted session. HTTPS is often used to secure interactive, human-driven transfers from web interfaces. While all three methods are routinely deployed to encrypt data and protect it from being sniffed as it traverses the Internet, it is important to choose the method that best suits your specific needs.
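
For illustration, here is a minimal sketch of the SFTP case using the Python paramiko library. The host name, account, key path and file paths are placeholders, not real endpoints, and a production deployment would manage host keys and credentials centrally.

```python
# Minimal SFTP upload sketch: one encrypted channel over a single port (22).
# Host, credentials and paths below are illustrative placeholders.
import paramiko

with paramiko.SSHClient() as client:
    client.load_system_host_keys()  # verify the server against known host keys
    client.connect(
        "sftp.example.com",
        port=22,
        username="transfer_user",
        key_filename="/path/to/id_ed25519",
    )
    with client.open_sftp() as sftp:
        # The file travels inside the encrypted SSH session end to end.
        sftp.put("payroll.csv", "/inbound/payroll.csv")
```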

2. Always encrypt data at rest

Most people focus on securing data during a transfer, but it is critical that data at rest be encrypted as well. Data exchange files are especially vulnerable because they are stored in an easily parsed, readily consumable format, and web-based file transfer servers are attacked more often than their more secure, on-premises counterparts.
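
As a hedged illustration, the sketch below encrypts a file at rest with Fernet (authenticated symmetric encryption) from the Python cryptography package. The file names are hypothetical, and key storage is deliberately out of scope here (see criterion 5).

```python
# Minimal sketch: encrypting a data-exchange file at rest so a stolen copy
# is unreadable. Requires the 'cryptography' package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store in a key vault, never beside the data
fernet = Fernet(key)

with open("orders.csv", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("orders.csv.enc", "wb") as f:
    f.write(ciphertext)  # only the encrypted copy should remain on disk
```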

3. …especially with data that may be accessed by or shared with third parties

When a company shares a file with another company, it typically goes through a storage vendor that automatically encrypts the file and authenticates the receiver before granting access. However, there will be times when a non-authenticated party needs a file. Companies need a strategy for managing these “exceptions” while data is in motion and at rest.
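
One possible way to handle such an exception – sketched below under our own assumptions rather than any particular vendor's workflow – is to encrypt the file under a key derived from a passphrase conveyed out of band, so the recipient needs no account. The passphrase, KDF parameters and payload are all illustrative.

```python
# Hedged sketch: passphrase-based encryption for a non-authenticated recipient.
# Uses PBKDF2 and Fernet from the 'cryptography' package.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000
    )
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))

salt = os.urandom(16)  # stored alongside the ciphertext; only the passphrase is secret
key = key_from_passphrase("correct horse battery staple", salt)  # shared by phone
token = Fernet(key).encrypt(b"quarterly-report contents")
```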

4. Pretty Good Privacy (PGP) alone is not good enough to manage file security

Most organizations have a PGP policy in place to ensure that uploaded files are encrypted in such a way that the receiver does not need an advanced degree to open them. Even so, at the first sign of trouble, recipients tend to share their login credentials to get help from a more tech-savvy “friend.” There is also the possibility that the system will fail and leave files unencrypted and exposed. A PGP policy is a start, but it is not an all-encompassing solution.

5. It’s less about the type of encryption and more about how it’s executed

Regardless of the encryption methodology, companies need to ensure that encryption and security protocols are implemented seamlessly across the board. If they are too difficult to follow and allow too many exceptions, there is a greater chance that an unencrypted file will end up in a public or less secure domain. Clearly defined workflows and tight key management – along with tools for simplifying the process – will go a long way toward ensuring that all employees, customers, partners and vendors comply on a daily basis.
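
Key rotation is one concrete piece of that key-management discipline. The sketch below – assuming the Python cryptography package – uses MultiFernet to introduce a new key while older ciphertext stays readable; in practice the keys would come from a key-management system rather than being generated inline.

```python
# Minimal key-rotation sketch: MultiFernet decrypts with any listed key and
# re-encrypts under the first (newest) one, so there is no "flag day".
from cryptography.fernet import Fernet, MultiFernet

old = Fernet(Fernet.generate_key())
new = Fernet(Fernet.generate_key())
rotator = MultiFernet([new, old])  # newest key first

token = old.encrypt(b"record encrypted last year")
token = rotator.rotate(token)      # now under the new key
print(rotator.decrypt(token))      # old and new tokens both remain readable
```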

6. Establish and protect data integrity

Validating an unbroken chain of custody for any and all transfers will further protect important data. There are a variety of methods – manual checksums, PGP signature review, SHA-1 hash functions – and tools for determining whether the data has been accessed or corrupted along the way. Maintaining comprehensive user activity logs will help administrators audit systems accurately whenever there is any doubt.
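
As a simple illustration of the checksum approach, the sketch below streams a file through SHA-256 (a stronger successor to SHA-1) and compares the digest recorded by the sender with the one recomputed by the receiver; the file paths are hypothetical.

```python
# Minimal integrity check: a mismatch means the file was altered or corrupted.
import hashlib

def file_digest(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):  # stream large files
            h.update(chunk)
    return h.hexdigest()

sent = file_digest("outbound/orders.csv")      # recorded by the sender
received = file_digest("inbound/orders.csv")   # recomputed on arrival
if sent != received:
    raise ValueError("file was altered or corrupted in transit")
```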

7. Fortify access control

In most FTP implementations, once someone gets past the first layer of security, they have access to every file on that server. Administrators must therefore go beyond rudimentary access control and authentication to regulate who can access what. Validating that the authentication process itself is robust is the first step; implementing strong password management and lockout protocols is just as critical.
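
A lockout protocol can be as simple as the sketch below, which locks an account after repeated failures inside a time window. The thresholds and the in-memory store are illustrative assumptions; a production system would persist this state and log every attempt for auditing.

```python
# Hedged lockout sketch: 5 failures within 15 minutes locks the account.
import time

MAX_FAILURES = 5
LOCKOUT_SECONDS = 900
failures: dict[str, list[float]] = {}  # username -> recent failure timestamps

def record_failure(user: str) -> None:
    failures.setdefault(user, []).append(time.time())

def is_locked(user: str) -> bool:
    cutoff = time.time() - LOCKOUT_SECONDS
    recent = [t for t in failures.get(user, []) if t > cutoff]
    failures[user] = recent  # drop stale entries
    return len(recent) >= MAX_FAILURES
```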

Adhering to these best-practice recommendations will help ensure that confidential data stands a chance of remaining confidential even if it ends up in the wrong hands.