Toby Speight

Removing numbers greater than the target harms the correctness of this program: they could still be added to negative numbers to reach the target.

You could get a performance improvement by finding the difference between each element and the target, then looking to see if that difference is in the list. This doesn't in itself reduce the computational complexity (it's still O(n²)), but we can build on that: if the list is first sorted and we then use a binary search to test membership, we get to O(n log n); if we convert to a structure with fast lookup (such as a collections.Counter, which has amortized O(1) insertion and lookup), then we come down to O(n).

If we have a Counter, then we can account for all combinations of a number and its complement by multiplying one count by the other (but we'll need to consider the special case where the number is exactly half the target).
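
As a sketch of that idea (the function name `count_pairs` is my invention here, since the reviewed code isn't reproduced in this answer):

```python
from collections import Counter

def count_pairs(target, numbers):
    """Count index pairs (i, j), i < j, with numbers[i] + numbers[j] == target."""
    counts = Counter(numbers)
    total = 0
    for value, count in counts.items():
        complement = target - value
        if complement == value:
            # Special case: the number is exactly half the target, so we
            # choose 2 of its `count` occurrences: count * (count - 1) / 2.
            total += count * (count - 1) // 2
        elif complement > value:
            # Visit each (value, complement) pair from one side only,
            # so it isn't double-counted.
            total += count * counts[complement]
    return total
```

Building the Counter is O(n), and the loop visits each distinct value once, which is where the near-linear behaviour comes from.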

We could do with some automated tests. Consider importing the doctest module and using it. Some good test cases to include:

  • 1, [] → 0
  • 1, [1] → 0
  • 1, [0, 1] → 1
  • 0, [-1, 1] → 1
  • 0, [0, 1] → 0
  • 4, [1, 4, 3, 0] → 2
  • 4, [1, 1, 3, 3] → 4
  • 4, [2, 2, 2, 2] → 6
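
With doctest, those cases can live directly in the docstring (again, `count_pairs` is a hypothetical name, and the quadratic body below is just a stand-in for the reviewed code; any of the faster approaches should pass the same tests):

```python
def count_pairs(target, numbers):
    """Count index pairs (i, j), i < j, where numbers[i] + numbers[j] == target.

    >>> count_pairs(1, [])
    0
    >>> count_pairs(1, [1])
    0
    >>> count_pairs(1, [0, 1])
    1
    >>> count_pairs(0, [-1, 1])
    1
    >>> count_pairs(0, [0, 1])
    0
    >>> count_pairs(4, [1, 4, 3, 0])
    2
    >>> count_pairs(4, [1, 1, 3, 3])
    4
    >>> count_pairs(4, [2, 2, 2, 2])
    6
    """
    # Naive quadratic reference implementation.
    return sum(numbers[i] + numbers[j] == target
               for i in range(len(numbers))
               for j in range(i + 1, len(numbers)))

if __name__ == "__main__":
    import doctest
    doctest.testmod()
```

Running the module then checks every example in the docstring and reports any mismatches.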
