  • Here's a reference for the above: historyofinformation.com/expanded.php?id=546 Commented Jun 3, 2016 at 13:34
  • There are ongoing discussions about whether Ada Lovelace was the first programmer, but she definitely understood that computers could solve whole classes of algorithmic problems, not just the specific instances that Babbage was interested in. Her translation of the Menabrea paper made it clear that she grasped what could be done. Commented Jun 3, 2016 at 15:04
  • It's also worth noting (adding to the comment about decimal vs. binary) that binary arithmetic was in use long before digital computers were even conceived. It was used for fun and profit, partly because many problems naturally lend themselves to a power-of-two notation, but also because the idea of being able to represent any number with only two symbols was seen as pretty awesome. Commented Jun 4, 2016 at 23:43