My understanding is that null was a necessary construct in order to abstract programming languages out of assembly.1 Programmers needed the ability to indicate that a pointer or register value was not a valid value and null became the common term for that meaning.

Reinforcing the point that null is just a convention to represent a concept, the actual value used for null has varied, and can still vary, based upon the programming language and platform.
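
To illustrate in C terms: the null pointer constant you write in source is 0 (usually spelled via the NULL macro), but the in-memory representation of a null pointer is implementation-defined and is not required to be all zero bytes, even though it happens to be on most mainstream platforms. A minimal sketch, assuming a hosted C99 environment:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        int *p = NULL;    /* NULL is a macro; commonly defined as ((void *)0) */

        /* At the source level a null pointer is guaranteed to compare equal
           to the integer constant 0 ...                                      */
        if (p == 0) {
            puts("p is the null pointer");
        }

        /* ... but its object representation is implementation-defined and is
           not required to be all zero bytes on every platform.               */
        unsigned char bytes[sizeof p];
        memcpy(bytes, &p, sizeof p);
        for (size_t i = 0; i < sizeof p; i++) {
            printf("%02x ", (unsigned)bytes[i]);
        }
        putchar('\n');
        return 0;
    }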

If you were designing a new language and wanted to avoid null but use maybe instead, then I would encourage a more descriptive term such as not a valid value or navv to indicate the intent. But the name of that non-value is a separate concept from whether you should allow non-values to even exist in your language.

Before you can decide on either of those two points, you need to define what maybe would mean for your system. You may find it's just a rename of null's meaning of not a valid value, or you may find it has different semantics in your language.
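
As a rough sketch of what that decision can look like, here is one way to make "not a valid value" explicit in the type rather than reserving a sentinel such as NULL or -1. It is approximated in C with a tagged struct; the names (maybe_int, find_index) are purely illustrative and not from any particular library:

    #include <stdbool.h>
    #include <stdio.h>

    /* Illustrative only: a "maybe int" that carries the not-a-valid-value
       case in the type itself instead of reserving a sentinel value.      */
    typedef struct {
        bool has_value;   /* false == "not a valid value" (navv)           */
        int  value;       /* only meaningful when has_value is true        */
    } maybe_int;

    static maybe_int find_index(const int *xs, int len, int target) {
        for (int i = 0; i < len; i++) {
            if (xs[i] == target) {
                return (maybe_int){ .has_value = true, .value = i };
            }
        }
        return (maybe_int){ .has_value = false };   /* nothing found       */
    }

    int main(void) {
        int data[] = { 3, 1, 4 };
        maybe_int r = find_index(data, 3, 9);

        /* The caller has to ask "is there a value?" before using it.      */
        if (r.has_value) {
            printf("found at %d\n", r.value);
        } else {
            puts("no valid value");
        }
        return 0;
    }

Whatever you call it, the semantic question is the same: the caller must check has_value before touching value, which is exactly the behavior you have to define for your maybe.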

Likewise, whether to check accesses against a null pointer or reference is another design decision for your language.

To provide a bit of history, C had an implicit assumption that programmers understood what they were attempting to do when manipulating memory. As it was a superior abstraction to assembly and the imperative languages that preceded it, I would venture that the thought of safeguarding the programmer from an incorrect reference hadn't crossed its designers' minds.

I believe that some compilers, or their additional tooling, can provide a measure of checking against invalid pointer access. So others have noted this potential issue and taken measures to protect against it.
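
For example, the following is the kind of access C itself will compile without complaint; whether it gets flagged depends on the toolchain. A static analysis pass such as GCC's -fanalyzer or Clang's --analyze may report the possible null dereference, but treat the exact flags and their behavior as assumptions about your particular toolchain:

    #include <stdio.h>

    /* Dereferencing a null pointer is undefined behavior in C; the language
       itself imposes no check, and a plain compile typically accepts this.  */
    static int read_through(const int *p) {
        return *p;                /* no guard against p being NULL           */
    }

    int main(void) {
        int *maybe_null = NULL;
        printf("%d\n", read_through(maybe_null));  /* likely crashes at run time */
        return 0;
    }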

Whether or not you should allow it depends upon what you want your language to accomplish and what degree of responsibility you want to push to users of your language. It also depends upon your ability to craft a compiler to restrict that type of behavior.

So to answer your questions:

  1. "What kind of arguments…" - Well, it depends upon what you want the language to do. If you want to simulate bare-metal access then you may want to allow it.

  2. "is it just historical baggage?" Perhaps, perhaps not. null certainly had / has meaning for a number of languages and helps drive the expression of those languages. Historical precedent may have affected more recent languages and their allowing of null but it's a bit much to wave your hands and declare the concept useless historical baggage.


1 See this Wikipedia article, although credit is given to Hoare for null values and object-oriented languages. I believe the imperative languages progressed along a different family tree than Algol did.
