Note: Several of the highest-voted answers here recommend setting `ServicePointManager.SecurityProtocol`, but Microsoft explicitly advises against doing that. Below, I go into the typical cause of this issue and the best practices for resolving it.
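For reference, the pattern in question (which the rest of this answer argues you should avoid) is a fragment like this:

```csharp
using System.Net;

// The discouraged pattern: hard-coding the protocol list pins the app to
// today's protocols and opts it out of newer ones the OS may support later.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
```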
One of the biggest causes of this issue is the .NET Framework version in effect at runtime, which determines which security protocols are enabled by default.
- In ASP.NET sites, the framework runtime version is often specified in web.config. (see below)
- In other apps, the runtime version is usually the version for which the project was built, regardless of whether it is running on a machine with a newer .NET version.
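If you want to confirm which framework your process actually targets (and therefore which row of the table below applies), a quick sketch:

```csharp
using System;

// Prints the framework the current process targets, e.g.
// ".NETFramework,Version=v4.7.2" (may be null in some hosting scenarios).
// This target version, not the installed runtime, is what drives the
// default protocol list described here.
Console.WriteLine(AppDomain.CurrentDomain.SetupInformation.TargetFrameworkName);
```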
There doesn't seem to be authoritative documentation on exactly how this works across versions, but the defaults appear to be roughly as follows:
| Framework Version | Default Protocols |
|---|---|
| 4.5 and earlier | SSL 3.0, TLS 1.0 |
| 4.6.x | TLS 1.0, 1.1, 1.2, 1.3 |
| 4.7+ | System (OS) Defaults |
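You can observe which default you got by reading (not setting) `ServicePointManager.SecurityProtocol` at startup:

```csharp
using System;
using System.Net;

// Reading the property is harmless and shows the default the runtime chose
// for your target framework: e.g. "Tls, Tls11, Tls12" on a 4.6.x target,
// "SystemDefault" on a 4.7+ target.
Console.WriteLine(ServicePointManager.SecurityProtocol);
```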
For the older versions, your mileage may vary based on which .NET runtimes are installed on the system: for example, a very old framework may not support TLS 1.0 at all, and a 4.6.x app may find TLS 1.3 unsupported.
Microsoft's documentation strongly advises using 4.7+ and the system defaults:
> We recommend that you:
>
> - Target .NET Framework 4.7 or later versions on your apps. Target .NET Framework 4.7.1 or later versions on your WCF apps.
> - Do not specify the TLS version. Configure your code to let the OS decide on the TLS version.
> - Perform a thorough code audit to verify you're not specifying a TLS or SSL version.
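In practice, code that follows these recommendations simply never mentions a protocol version. A minimal sketch (the URL is a placeholder):

```csharp
using System;
using System.Net.Http;

class Program
{
    static void Main()
    {
        // No ServicePointManager.SecurityProtocol assignment anywhere:
        // on a 4.7+ target, the TLS version is negotiated by the OS.
        using (var client = new HttpClient())
        {
            var body = client.GetStringAsync("https://example.com").GetAwaiter().GetResult();
            Console.WriteLine(body.Length);
        }
    }
}
```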
For ASP.NET sites: check the `targetFramework` attribute on the `<httpRuntime>` element in web.config, as this (when present) determines the runtime version actually used by your site:
<httpRuntime targetFramework="4.5" />
Better:
<httpRuntime targetFramework="4.7" />