The concept of null can easily be traced back to C, but that's not where the problem lies.
My everyday language of choice is C#, and I would keep null, with one difference. C# has two kinds of types: values and references. Values can never be null, but there are times when I'd like to be able to express that "no value" is perfectly fine. To do this, C# uses nullable types: int is the value type and int? is its nullable counterpart. This is how I think reference types should work as well.
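A minimal sketch of that distinction in C# (int? is shorthand for Nullable&lt;int&gt;; the compiler rejects assigning null to a plain value type, while the nullable wrapper makes "no value" explicit and checkable):

```csharp
using System;

class Program
{
    static void Main()
    {
        int a = 42;          // value type: can never be null
        // int b = null;     // compile-time error: cannot convert null to 'int'

        int? c = null;       // Nullable<int>: "no value" is expressible
        Console.WriteLine(c.HasValue);            // False
        Console.WriteLine(c.GetValueOrDefault()); // 0

        c = a;               // implicit conversion int -> int?
        Console.WriteLine(c.Value);               // 42
    }
}
```

The point of the answer is that references could work the same way: a plain reference type would be guaranteed non-null, and you would opt in to nullability explicitly when "no value" is a legitimate state.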
See also "Null reference may not be a mistake":
Null references are helpful and sometimes indispensable (consider how much trouble if you may or may not return a string in C++). The mistake really is not in the existence of the null pointers, but in how the type system handles them. Unfortunately most languages (C++, Java, C#) don’t handle them correctly.