Let's consider an analogy to grasp the essence of the question. In the context of counting floors:

 a) In some countries, like the US, floors are labeled as 'first floor,' 'second floor,' and so on.
 b) In other countries, such as the UK, counting begins from the 'ground floor,' followed by the 'first floor,' and so forth.

These methods parallel the debate over whether computers should count from 1 or 0. So, which counting method is better? In the case of computers, one method may prove slightly more advantageous than the other.

Human counting intuitively revolves around two different cases:

  1. The total element count in a list or a collection:

    We do so by assigning '1' to the first element and counting upwards (rarely, we may choose to assign '0' to the first element, as in the UK floor-counting system, where the lowest floor is called the 'ground floor'). Example: counting the total apples in a bag by starting the count from 1.

  2. The total distance traveled from the starting position:

    We do so by assigning some value to the starting position, and then adding an offset to it as we traverse a distance. Now, what value should the starting position get to simplify the calculation? As you have probably guessed, assigning 0 to the starting position makes the calculation much easier. Example: measuring a distance means placing the measuring tape so that the "zero" reading is at the starting position, as others have pointed out as well (see the sketch just after this list).
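
Here is a tiny sketch of the second case (an illustration of my own, with made-up numbers): when the tape's first mark reads 0, the reading at the stopping point is the distance itself; if the first mark read 1, every reading would need a correction first:

```c
#include <stdio.h>

int main(void) {
    int reading = 7;  /* the tape mark where we stopped */

    /* Tape whose first mark reads 0: the reading IS the distance. */
    int distance_zero_start = reading - 0;

    /* Hypothetical tape whose first mark reads 1: every reading
       needs a -1 correction before it means anything. */
    int distance_one_start = reading - 1;

    printf("0-start tape: %d units\n", distance_zero_start); /* 7 */
    printf("1-start tape: %d units\n", distance_one_start);  /* 6 */
    return 0;
}
```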

Let's get to the world of computers. When we talk about the first element of an array being called '0', we are not talking about the element count, but rather the position in the array, just like the second case above. It's essential to recognize this distinction, as our intuition might lead us to compare computer numbering with the first case of human counting. Assigning '0' to the first value of the array simplifies indexing, because in a computer's memory the address of any element can be determined with a simple offset rule, and the rule is easier to compute under option 'a' (where counting begins from 0) than under option 'b' (where counting begins from 1):

 a) element(n) = address + n * size_of_the_element
 b) element(n) = address + (n-1) * size_of_the_element
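
To make the offset rule concrete, here is a minimal C sketch (the array and variable names are my own illustration, not from the original): a 0-based index plugs straight into the address calculation, while a 1-based convention would need a correction on every access:

```c
#include <stdio.h>

int main(void) {
    int arr[4] = {10, 20, 30, 40};
    char *address = (char *)arr;   /* base address of the array */
    size_t size = sizeof arr[0];   /* size_of_the_element */

    for (int n = 0; n < 4; n++) {
        /* option 'a': element(n) = address + n * size_of_the_element */
        int *element = (int *)(address + n * size);
        printf("element(%d) at %p = %d\n", n, (void *)element, *element);
    }

    /* option 'b' (1-based) would pay a correction on every access:
       element(n) = address + (n - 1) * size_of_the_element */
    return 0;
}
```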

While compiler optimizations might well have been implemented to accommodate a 1-based convention, the influence of languages that index from '0', such as C and Python (unlike FORTRAN and COBOL, which start from '1'), has solidified this approach. Just as there is no need to re-mark a measuring tape so that its reading starts at 1 instead of 0 (the human brain is already "optimized", that is, smart enough to calculate the distance either way), there is no pressing need to change the 0-based convention. You are free to build your own language that starts its indexing from -1 all the way up, but it will probably add more to the confusion, and it probably won't be efficient.
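
As a final sketch (a minimal illustration of my own; the AT1 macro is hypothetical, not a standard C facility), this is roughly how FORTRAN-style 1-based indexing can be emulated on top of C's 0-based arrays. Note that every access pays the (n - 1) correction that 0-based indexing avoids:

```c
#include <stdio.h>

/* Hypothetical 1-based accessor: elements are numbered 1..N,
   and each access performs the (n - 1) correction internally. */
#define AT1(arr, n) ((arr)[(n) - 1])

int main(void) {
    int scores[3] = {70, 80, 90};

    for (int n = 1; n <= 3; n++)   /* 1-based loop, FORTRAN style */
        printf("score #%d = %d\n", n, AT1(scores, n));

    return 0;
}
```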
