I'm fairly experienced in .NET development but today I was forced to wrap my head around something I'd never thought about before:
How do the installed .NET Framework, the .NET Framework target in Visual Studio and the C# compiler work together?
Concrete example: System.dll contains the enum System.Net.SecurityProtocolType. On .NET 4.5, this enum contains the member Tls12; with .NET 4.7, the member SystemDefault was added.
So, targeting .NET 4.7.x, this code compiles fine:
var p = SecurityProtocolType.SystemDefault;
However, when I target .NET 4.5.x, this code does not compile (as one would expect).
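To make the compile-time difference concrete, here is a minimal sketch (the CS0117 error code shown in the comment is what the C# compiler typically reports for a missing enum member):

```csharp
using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Compiles against any target framework from .NET 4.5 upwards:
        var tls12 = SecurityProtocolType.Tls12;

        // Compiles only when the project targets .NET 4.7 or later.
        // Against a 4.5.x target the compiler rejects it (error CS0117:
        // 'SecurityProtocolType' does not contain a definition for 'SystemDefault').
        var sysDefault = SecurityProtocolType.SystemDefault;

        Console.WriteLine($"{tls12}, {sysDefault}");
    }
}
```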
What puzzles me here is why this works, considering that .NET 4.7 is an in-place update to .NET 4.5 (i.e. when installing .NET 4.7, the System.dll of .NET 4.5 is replaced with that of .NET 4.7).
How does the compiler know that I can't use SystemDefault on .NET 4.5 but can use it on 4.7? Is this done via some kind of API file known to the compiler?
Side fact: When I target .NET 4.5 and have .NET 4.7 installed, a call to Enum.GetValues(typeof(SecurityProtocolType)) will give me SecurityProtocolType.SystemDefault. So I'm fairly certain that my .NET 4.5 application uses the .NET 4.7 System.dll at runtime.
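One way to observe this runtime behavior is to enumerate the values reflectively; because the string-to-value mapping is read from the System.dll actually loaded at runtime, a machine with .NET 4.7 installed should list SystemDefault even for an assembly compiled against the 4.5 target:

```csharp
using System;
using System.Net;

class EnumProbe
{
    static void Main()
    {
        // Enumerate SecurityProtocolType members as the *runtime* sees them.
        // With .NET 4.7 installed, SystemDefault (value 0) appears in the
        // output even if this code was compiled targeting .NET 4.5.
        foreach (SecurityProtocolType value in
                 Enum.GetValues(typeof(SecurityProtocolType)))
        {
            Console.WriteLine($"{value} = {(int)value}");
        }
    }
}
```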
How does the compiler know that I can't use SystemDefault on .NET 4.5 but can use it on 4.7? Is this done via some kind of API file known to the compiler?
Yes, I'd expect it to be done via reference assemblies. A reference assembly is, effectively, an assembly that contains just the accessible API definitions for a particular framework version, with no implementation.
On my Windows machine, these are in
C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework
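Listing that directory shows one folder per targetable framework version; the exact set depends on which targeting packs are installed, so the folder names below are only a typical example:

```shell
# Inspect the reference assembly folders (Windows, Developer Command Prompt).
dir "C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework"
# Typical output includes version folders such as:
#   v4.5  v4.5.2  v4.6.1  v4.7  v4.7.2
# Each folder carries its own System.dll describing only that version's
# API surface, which is what the compiler resolves against when you
# choose a target framework in Visual Studio.
```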
See more on this question at Stackoverflow