
New answers tagged

1 vote

Can a tree data structure be empty?

This confusion arises from the use of ambiguous terminology to describe distinct combinatorial structures. A general tree is classified as an ordered tree, whereas a binary tree is considered a ...
by Ervin Varga
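The distinction in the answer above can be made concrete in code. A minimal sketch (the type and function names are mine, not from the answer): a binary tree where the empty tree is a representable value, namely a `None` root.

```python
# Sketch: letting "empty tree" be a valid value by allowing the root
# to be None, so every operation must handle the zero-node case.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    value: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def size(root: Optional[Node]) -> int:
    """Number of nodes; the empty tree (root is None) has size 0."""
    if root is None:
        return 0
    return 1 + size(root.left) + size(root.right)
```

Under this convention the question resolves to "yes": `size(None)` is simply 0. Definitions that require at least a root node would instead make `None` an invalid tree.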
-1 votes

How do I download binary files of a GitHub release?

Here are some tools that can download released executables and even act as a version manager / updater, sorted by popularity:

Repository  Stars  Last Commit
eget        1.9k   2 years ago
aqua        1.6k   recently
...
by mxmlnkn (2,199 rep)
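Independently of the tools in the answer, the GitHub REST API exposes release assets at `/repos/{owner}/{repo}/releases/latest`, each carrying a `browser_download_url`. A hedged sketch (the helper name and substring-matching rule are my own); it only selects an asset from the already-decoded JSON, so it needs no network access:

```python
# Sketch: pick the download URL of a release asset from the JSON that
# GET /repos/{owner}/{repo}/releases/latest returns. Fetching the JSON
# itself (e.g. with urllib.request) is deliberately left out.
def pick_asset_url(release_json: dict, substring: str) -> str:
    """Return the browser_download_url of the first asset whose
    name contains substring (e.g. 'linux-amd64')."""
    for asset in release_json.get("assets", []):
        if substring in asset["name"]:
            return asset["browser_download_url"]
    raise LookupError(f"no asset matching {substring!r}")
```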
Advice · 0 votes · 0 replies · 0 views

Number of bits used for representing different types vs what the CPU uses

I appreciate that you took the time to answer, but this post and your reply do not answer my question. I was looking for something more hardware-based, at the level of logic gates, and not a debate of ...
by pcrt (1 rep)
Advice · 0 votes · 0 replies · 0 views

Number of bits used for representing different types vs what the CPU uses

Off-core transfers are whole cache lines (into / from per-core private L1/L2 caches), except for uncacheable memory regions. (Write-back caches are nearly universal, rather than write-through.) Semi-...
by Peter Cordes
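A toy model of the whole-cache-line point (64 bytes is the common x86 line size; that constant is an assumption on my part, not from the comment):

```python
# Toy model: with 64-byte lines, an access at address A touches the whole
# line starting at A & ~63, so even a 1-byte load moves a full line
# between cache levels.
LINE = 64

def line_base(addr: int) -> int:
    """Start address of the cache line containing addr."""
    return addr & ~(LINE - 1)

def lines_touched(addr: int, nbytes: int) -> int:
    """How many distinct lines an access to [addr, addr+nbytes) spans."""
    return (line_base(addr + nbytes - 1) - line_base(addr)) // LINE + 1
```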
Advice · 0 votes · 0 replies · 0 views

Number of bits used for representing different types vs what the CPU uses

Thanks, your reply was very helpful, unlike some others. Carry-lookahead and carry-select adders were things I didn't know about, and are topics that answer my question. I will look into it. Also by any ...
by pcrt (1 rep)
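The carry-lookahead idea mentioned above can be sketched in software. Each bit i computes generate g_i = a_i AND b_i and propagate p_i = a_i XOR b_i; the carry recurrence c_{i+1} = g_i OR (p_i AND c_i) can then be expanded so hardware computes all carries in parallel rather than rippling. The Python loop below models the same recurrence sequentially, so it demonstrates the g/p formulation, not the parallelism:

```python
# Software model of a generate/propagate adder. The Python loop ripples,
# but the point is the recurrence: hardware flattens
#   c_{i+1} = g_i | (p_i & c_i)
# into parallel logic over the g and p vectors.
def carry_lookahead_add(a: int, b: int, width: int = 8) -> int:
    g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(width)]  # generate
    p = [((a >> i) & 1) ^ ((b >> i) & 1) for i in range(width)]  # propagate
    carry = 0
    result = 0
    for i in range(width):
        result |= (p[i] ^ carry) << i
        carry = g[i] | (p[i] & carry)
    return result  # sum modulo 2**width
```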
1 vote

Why is there 16 possible functions when there are 2 inputs?

AB=11  AB=10  AB=01  AB=00  Function
  0      0      0      0    NEVER
  0      0      0      1    A NOR B
  0      0      1      0    B AND (NOT A)
  0      0      1      1    NOT A
  0      1      0      0    A AND (NOT B)
  0      1      0      1    NOT B
  0      1      1      0    A XOR B
  0      1      1      1    A NAND B
  1      0      0      0    A AND B
  1      0      0      1    A XNOR B
  1      0      1      0    B
  1 ...
by Michael allen
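The table above can also be generated mechanically: a 2-input boolean function is fully determined by its four outputs, so there are exactly 2^4 = 16 of them. A sketch (the function name is mine):

```python
# Enumerate all 16 two-input boolean functions as output tuples over the
# input columns (1,1), (1,0), (0,1), (0,0) -- the same column order as
# the table above. Function n's outputs are just the 4 bits of n.
def two_input_functions():
    funcs = []
    for n in range(16):
        outputs = tuple((n >> (3 - i)) & 1 for i in range(4))
        funcs.append(outputs)
    return funcs
```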
Advice · 2 votes · 0 replies · 0 views

Number of bits used for representing different types vs what the CPU uses

Very few CPUs have single-cycle latency for integer multiply, just low-clock-speed things like ARM Cortex-M0+ with the optional fast multiplier. But probably you're talking about throughput? On a ...
by Peter Cordes
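The latency-versus-throughput distinction above, in a toy cycle model (the numbers are illustrative, not from any real CPU): a pipelined multiplier with latency L and a throughput of one result per cycle finishes n independent multiplies in L + (n - 1) cycles, while n dependent multiplies, each waiting on the previous result, take n * L cycles.

```python
# Toy pipeline model: latency vs throughput for n multiplies.
def independent_cycles(n: int, latency: int) -> int:
    """n independent multiplies issued back to back, one per cycle."""
    return latency + (n - 1)

def dependent_cycles(n: int, latency: int) -> int:
    """n multiplies in a dependency chain: each waits the full latency."""
    return n * latency
```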
Advice · 0 votes · 0 replies · 0 views

Number of bits used for representing different types vs what the CPU uses

(SSE2, available on Intel hardware in 2000, looks like it could use 4 32-bit floats or 2 64-bit double-precision floats in parallel in a single operation using 128-bit registers.)
by David Maze (166k rep)
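Not SSE2 itself, but the register widths involved can be checked from Python: four 32-bit floats and two 64-bit doubles pack into the same 16 bytes (128 bits) that one SSE2 XMM register holds.

```python
# 128 bits = 16 bytes holds either 4 single-precision or
# 2 double-precision IEEE-754 values.
import struct

four_singles = struct.pack("<4f", 1.0, 2.0, 3.0, 4.0)
two_doubles = struct.pack("<2d", 1.0, 2.0)
```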
Advice · 0 votes · 0 replies · 0 views

Number of bits used for representing different types vs what the CPU uses

In 2026, it makes almost no practical difference for most applications, and you should use whatever bit width or representation is most natural in your language. In C I'd use an int (32 or 64 bits) ...
by David Maze (166k rep)
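A small check of the "use whatever is natural" point: Python's own int is arbitrary-precision, so the width question mostly disappears at the language level, but the struct module shows the fixed storage widths that C-style types occupy (the "=" prefix forces standard sizes):

```python
# Standard-size struct formats: signed 8-, 16-, 32-, 64-bit integers.
import struct

widths = {fmt: struct.calcsize(fmt) * 8 for fmt in ("=b", "=h", "=i", "=q")}
# widths == {'=b': 8, '=h': 16, '=i': 32, '=q': 64}
```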
Advice · 1 vote · 0 replies · 0 views

Number of bits used for representing different types vs what the CPU uses

If I could, I would flag this as a poor duplicate of Float vs Double Performance (vintage 2009). Some of what it says there is outdated: the original x87 numeric coprocessor could do 80-bit float ...
by Martin Brown (3,895 rep)
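A quick way to see the precision side of float-vs-double from Python (the 80-bit x87 path is not exposed here): 0.1 survives a 64-bit round-trip exactly, but storage in 32 bits perturbs it.

```python
# Round-trip a value through 32-bit IEEE-754 storage; Python floats are
# 64-bit, so any difference is the precision lost to float32.
import struct

def through_float32(x: float) -> float:
    return struct.unpack("<f", struct.pack("<f", x))[0]
```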
3 votes

How to return binary from AWS Lambda with Function URL

After more digging around, I figured out that the problem is actually unrelated to AWS Lambda Function URLs. The problem is the console editor (which is now VSCode, not Cloud9). If I upload a binary ...
by falsePockets (4,483 rep)
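Separately from the upload issue, the shape of a binary response from a Lambda Function URL is worth pinning down: the body must be base64-encoded and isBase64Encoded set to True. A minimal handler sketch (the payload bytes and handler name are illustrative):

```python
# Minimal Function URL handler returning binary: Lambda decodes the
# base64 body before sending it when isBase64Encoded is True.
import base64

def lambda_handler(event, context):
    binary_payload = bytes([0x89, 0x50, 0x4E, 0x47])  # e.g. PNG magic bytes
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/octet-stream"},
        "isBase64Encoded": True,
        "body": base64.b64encode(binary_payload).decode("ascii"),
    }
```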

Top 50 recent answers are included