Questions tagged [history]
For questions about the history of programming and computing.
356 questions
6 votes
1 answer
475 views
Tcl/Tk: Why is it an error for a window/widget's name to start with an uppercase letter?
I was going to ask this on Stack Overflow, but after doing some reading, I guess history questions are considered off-topic there and should be asked here instead? Anyway, as to the question: Perhaps ...
2 votes
2 answers
380 views
Why is the 8-bit exponent of the IEEE 754 32-bit float not byte-aligned?
There is already a prior question dealing with why certain bit-widths were chosen (although I do find it somewhat insufficient, but that's another topic), but what strikes me as unusual is how the ...
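The layout the question refers to: a binary32 value packs 1 sign bit, 8 exponent bits, and 23 fraction bits, so the exponent field occupies bits 30..23 and straddles the top two bytes rather than aligning to a byte boundary. A minimal Python sketch of that field extraction:

```python
import struct

def fields(x: float) -> tuple[int, int, int]:
    """Split a binary32 value into its (sign, exponent, fraction) bit fields."""
    (bits,) = struct.unpack(">I", struct.pack(">f", x))
    sign = bits >> 31                # bit 31
    exponent = (bits >> 23) & 0xFF   # bits 30..23 — spans the top two bytes
    fraction = bits & 0x7FFFFF       # bits 22..0
    return sign, exponent, fraction

print(fields(1.0))   # (0, 127, 0) — the exponent bias is 127
```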
8 votes
1 answer
2k views
What was COBOL's syntax first described in?
Nowadays, it's very common to use BNF (or extensions thereof) to describe the syntaxes of various programming languages or their constructs. What was the situation like 60+ years ago? COBOL and BNF ...
3 votes
1 answer
440 views
Who first defined a "library" as software you call and a "framework" as software that calls you?
The distinction between "library" and "framework" is said to be that you call a library but a framework calls you. "Hollywood principle" and "inversion of control"...
1 vote
1 answer
279 views
Does SQL Server Agent predate Windows Task Scheduler?
An old guru once told me that SQL Server Agent was created because Windows Task Scheduler did not exist at the time. However, all of my research shows that they were both released in 1995. For Task ...
0 votes
2 answers
444 views
Origins of Unit Testing in hardware?
According to the Wikipedia entry for Unit Testing, it is defined as a technique for testing components of a system in strict isolation from each other, and it is described as having been expressly ...
0 votes
2 answers
528 views
Where does the practice of naming variables with the prefix "my" come from?
I recognize that there are situations in which "my" is semantically useful, but I have met multiple professional programmers who have a habit of using this everywhere that it's not - "...
-3 votes
1 answer
300 views
What does the filename "gmon.out" stand for?
The GNU compiler toolset has a profiler "gprof" for performance analysis of C++ programs. With the -pg switch, the gcc compiler will make executables that write a data file, "gmon.out"...
5 votes
3 answers
791 views
What was the first company to make a drag-and-drop GUI designer like Visual Basic?
When Visual Basic came out, it was revolutionary for its drag-and-drop GUI designer, allowing users to quickly create GUI programs. This video shows Bill Gates introducing it in 1991. Did drag-and-...
10 votes
5 answers
13k views
Why is int in C in practice at least a 32 bit type today, despite it being developed on/for the PDP-11, a 16 bit machine?
For background, the question is to prepare some training material, which should also explain a bit why things are the way they are. I tried to get some idea of how C began based on this ...
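The situation the question describes can be checked directly: on mainstream 64-bit platforms (LP64 Unix, LLP64 Windows), C's int stayed at 32 bits while long and pointers widened. A quick probe via Python's ctypes, assuming a current desktop platform:

```python
import ctypes

# On today's common ABIs, int is 32 bits even though the machine word is 64;
# only long (on LP64) and pointers grew with the word size.
print(ctypes.sizeof(ctypes.c_int) * 8)     # typically 32 on current desktops
print(ctypes.sizeof(ctypes.c_void_p) * 8)  # 64 on a 64-bit platform
```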
2 votes
1 answer
333 views
Did the term "decorator" originate with OOP design patterns?
The Decorator pattern allows behaviour to be dynamically added to an existing object, effectively "decorating" it with new behaviour. While the pattern as formalised and named seems to have ...
0 votes
2 answers
374 views
Why does jl test for the second operand of cmp to be less than the first, instead of the other way around?
Something like this cmp $0, %eax jl exit jumps to the exit: label if the content of register eax is less than 0. So it's as if jl applied the < operator to the operands of cmp, but in reverse ...
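The behaviour in the excerpt follows from AT&T operand order: cmp $0, %eax computes %eax - 0 (second minus first) and sets flags, and jl jumps when that result is signed-negative, i.e. when the second operand is less than the first. A simplified Python model of that reading (ignoring overflow/flag subtleties):

```python
def cmp_jl(first: int, second: int) -> bool:
    """Model AT&T `cmp first, second` followed by `jl`:
    cmp computes second - first, and jl takes the jump when the
    result is signed-negative — i.e. when second < first."""
    return (second - first) < 0   # simplified; real jl tests SF != OF

# `cmp $0, %eax; jl exit` jumps exactly when eax is negative:
print(cmp_jl(0, -5))   # True  — jump taken
print(cmp_jl(0, 5))    # False — jump not taken
```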
1 vote
2 answers
366 views
Why did TC39 name JavaScript's array predicate functions `some` and `every` instead of `any` and `all`?
Python, Ruby, Rust, Haskell, Kotlin, C#, C++, Perl, MATLAB, SQL, and R all call their respective array predicate checking functions any and all. Is there any record of why JavaScript's designers ...
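The correspondence the question draws on: JavaScript's arr.some(pred) and arr.every(pred) behave exactly like Python's any/all applied to the predicate's results. A small sketch of that equivalence:

```python
# JavaScript's arr.some(pred) / arr.every(pred) expressed with
# Python's differently named any/all built-ins.
def some(arr, pred):
    return any(pred(x) for x in arr)

def every(arr, pred):
    return all(pred(x) for x in arr)

nums = [1, 2, 3, 4]
print(some(nums, lambda x: x > 3))    # True  — at least one element matches
print(every(nums, lambda x: x > 0))   # True  — all elements match
print(every(nums, lambda x: x > 3))   # False
```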
1 vote
1 answer
393 views
What is the first language to implement asterisk * as wild card character?
I have been researching and trying to find the oldest possible implementation of wildcards and how the asterisk * became the (almost) global standard when it comes to representing a wildcard character....
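For reference, the semantics in question: in shell-style globbing the asterisk matches any sequence of characters, including the empty one. Python's fnmatch module implements this convention:

```python
import fnmatch

# Shell-style matching: * matches any (possibly empty) run of characters.
print(fnmatch.fnmatch("report.txt", "*.txt"))   # True
print(fnmatch.fnmatch("report.txt", "r*t"))     # True — * spans "eport.tx"
print(fnmatch.fnmatch("report.txt", "*.md"))    # False
```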
7 votes
1 answer
337 views
Why is the highest NUMERIC precision in most RDBMS 38?
SQL-92 says: 16)For the <exact numeric type>s DECIMAL and NUMERIC: a) The maximum value of <precision> is implementation-defined. <precision> ...
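A commonly cited rationale (an assumption here, not stated in the excerpt) is that 38 is the largest decimal precision that always fits in a signed 128-bit integer: 2^127 - 1 has 39 decimal digits, so every 38-digit value fits but not every 39-digit one. The arithmetic:

```python
# A signed 128-bit integer tops out at 2**127 - 1, which has 39 decimal
# digits — so all 38-digit decimals fit, but some 39-digit decimals do not.
max_i128 = 2**127 - 1
print(len(str(max_i128)))        # 39
print(10**38 - 1 <= max_i128)    # True  — largest 38-digit value fits
print(10**39 - 1 <= max_i128)    # False — largest 39-digit value does not
```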