Is it safe? Trust me

A law firm is an attractive target for computer-based crime. It has lots of confidential information, lots of people with access to the core systems and lots to lose if the information is disclosed. Information needs to be protected, but it also needs to be accessible over networks. So how can you make sure you trust the user or the computer that is accessing it?

Article published in Legal Week in May 2008.

In the past, computer security was mainly about using good password practices, stopping viruses and blocking access at the firewall. But now computer crime is more purposeful, and the protection needs to be more sophisticated. If an Eastern European gang can fit a card reader into a cash machine without anyone noticing, just think what they might be able to do inside a law firm.

So the questions are: when a user account authenticates to a system, how can I know it really is the person it is supposed to be? And when I trust a computer, by storing information on it or by allowing it to connect to the network, how can I know it is the computer I think it is?

Typically, you start by having good password policies and tightly restricting administrator-level access. Obviously, no-one should work day-to-day as an administrator of their own computer. This is a good deterrent for opportunistic attack, but not for purposeful crime. There are a number of ways in which good policies can be subverted. Bear in mind that a password only has to leak once, and if that leak is not detected it can be used maliciously for a long time.

A keylogger records keystrokes on the computer and so can be used to capture logon details. A hardware keylogger requires no rights to install. It can be hidden inside the keyboard. If you think how easy it is for equipment to be stolen from offices, you can see how easy it is for equipment to be subverted instead.

Normally you will have different levels of administrative access, but a junior IT support person can escalate their rights in several ways. The simplest would be to install software on a workstation and to wait for a more senior administrator to log on to it. With the captured logon details they can log on to a server and repeat the escalation up to enterprise administrator.

Passwords can also just ‘leak’. The domain administrator password might be fiendishly difficult, but a password for another administrator account might be commonly known. The end result is that when a password is supplied for access to a service, you cannot assume the user is who they claim to be.

It has come to the point where two-factor authentication should be required for any administrator account and for any account with access to highly confidential information. It is routinely used for banking transactions and for remote access, so it is not a strange idea. Two-factor authentication inside the organisation is now much easier to implement than it used to be, and is almost as easy as password authentication.

The most familiar form of two-factor authentication is the one-time password (OTP), used for example by RSA SecurID. A hardware device generates a unique number that, when combined with a password, authenticates the user. Many services such as SAP and Citrix support it. However, using it to protect the logon to computers within the office requires changing the Windows authentication mechanism, which is a big step. It can also be quite expensive for widespread use.
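The idea behind an OTP token can be sketched in a few lines of code. The example below implements the open HOTP standard (RFC 4226), which many OTP tokens follow; RSA SecurID uses its own proprietary time-based algorithm, so treat this as an illustration of the principle rather than that product's internals. The secret key is shared between the token and the authentication server, and both sides compute the same code from a moving counter.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over a moving counter, dynamically truncated."""
    # The counter is encoded as an 8-byte big-endian integer.
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: the low nibble of the last byte picks an offset,
    # and 31 bits starting there become the one-time code.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vector: this secret with counter 0 gives "755224".
print(hotp(b"12345678901234567890", 0))
```

Because both sides can derive the code but an eavesdropper cannot reuse it (the counter moves on), a captured code is worthless moments later, which is exactly the property a leaked password lacks.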

The other main form of two-factor authentication is the ‘smart card’. This is the same technology as bank card chip and PIN. The chip is a secure microprocessor which holds a digital certificate. Entering the PIN unlocks the certificate’s private key, which signs a challenge and so authenticates the user.

Unlike OTP, it is now comparatively easy and inexpensive to implement smart card authentication for Windows. Windows Vista and Server 2008 support smart card authentication built-in. In Active Directory Group Policy you can specify that a user account requires a smart card for interactive logon, or you can specify that a server requires smart card authentication. The user certificate is stored in Active Directory as part of the account properties. You can use the built-in Windows Certificate Authority to generate certificates, or you can use a third-party certificate authority.

It used to be that you needed to have a smart card reader with accompanying driver, but smart card devices from companies such as Gemalto now come in a USB format and the driver is built in to Vista. So you simply plug in the USB stick, enter a PIN and you are authenticated. Many non-IT professionals will find this easier than remembering complex and expiring passwords. I would certainly offer it as an alternative to those who dislike passwords.

Once you adopt smart card authentication, it opens some interesting avenues. You can use the smart card to store certificates from a public certificate authority, for example VeriSign, and this enables you to authenticate uniquely outside the organisation as well as inside. You can combine the smart card with storage on the same USB drive and use it to encrypt the data on the drive. You can give a smart card to your clients so they can authenticate securely to your extranet and access client confidential information.

You can even combine the smart card with building access systems. Gemalto smart cards, for example, can be integrated with the Mifare chip, which is commonly used for swipe cards and is used in the London Oyster card.

But what about trusting the computer? Why do you need to do that?

The information on a computer is protected by a logon but it can be accessed directly from the disk simply by booting with a different operating system. There is no need to steal the computer. Just boot a computer from a USB drive and read or write what you want. You could drop a batch file into the startup folder of an administrator account so it runs silently when they log on. This applies to a laptop left in a meeting room, a desktop at night, or a server in a remote office.

To protect the whole disk, you need something outside the disk. The trusted platform module (TPM) is the answer. A TPM chip is effectively a smart card on the motherboard of the computer. TPM began life primarily as a means of protecting digital assets against piracy, but it has a number of other useful security functions.

You can use it to: prevent tampering because the state of the computer can be stored in the TPM chip and checked when the computer starts; secure encryption keys, so that information is only unlocked if the TPM allows it; and allow or disallow a computer to connect to the network (or connect remotely over VPN, for example).

Windows Vista and Server 2008 use the TPM chip to provide the security for BitLocker whole disk encryption. You do not even need to supply a password. As long as the boot sequence of the computer is unchanged, the disk can be unlocked. Then, once the system has started, you can authenticate the user with a smart card. Active Directory provides the infrastructure for key recovery and both TPM and BitLocker can be managed through Group Policy.

There are a few general points about these technologies. They require an infrastructure to manage them, so you need to plan carefully how you are going to use them and for the impact they will have. They are not expensive to implement. In my view they are not inconvenient and may even be more convenient than current password practices. But, if the firm’s culture is not entirely welcoming to new IT security measures, IT people may be reluctant to recommend them. The question the managing partner or IT director needs to ask is: ‘Is it safe?’.

From The Times July 2, 2008

Body Shop ‘snoop’ John Shevlin fined for insider dealing

A former IT technician at Body Shop, the ethical retailer, has been fined for market abuse in a rare victory for the Financial Services Authority in its battle against insider dealing.

The City regulator said yesterday that it had fined John Shevlin £85,000 after he was found to have gained inside knowledge by snooping on confidential e-mails between executives.

Mr Shevlin, who worked at the beauty company’s head office in London, borrowed more than his annual salary to bet that Body Shop’s share price would fall, having obtained a sneak preview of an unexpectedly bleak Christmas trading update.

As an IT technician, it is likely that Mr Shevlin had privileged access to executives’ passwords, enabling him to access their computers without their knowledge, the FSA said.
