I have two problems regarding determining the time complexity of two algorithms.
Problem 1
Determining the smallest 3 numbers from a set of n distinct numbers using comparisons.
- These 3 elements can be determined using O(log₂n) comparisons.
- O(log₂n) comparisons do not suffice; however, they can be determined using n + O(1) comparisons.
- n + O(1) comparisons do not suffice; however, they can be determined using n + O(log n) comparisons.
- n + O(log n) comparisons do not suffice; however, they can be determined using O(n) comparisons.
- None of the above.
Here is how I thought about it: take three variables MIN1, MIN2 and MIN3 (MIN1 being the smallest and MIN3 the largest of the three), initialize them with the first 3 elements of the list ordered so that MIN1 ≤ MIN2 ≤ MIN3, and scan the rest of the list once. For each number x in the list we have the following 4 cases:
- if x < MIN1 then MIN3 = MIN2; MIN2 = MIN1; MIN1 = x;
- else if MIN1 < x < MIN2 then MIN3 = MIN2; MIN2 = x;
- else if MIN2 < x < MIN3 then MIN3 = x;
- else if MIN3 < x then do nothing.
So, basically it requires at most 3 comparisons per scanned element, i.e. about 3n comparisons in the worst case, and one comparison per scanned element (about n) in the best case, since every x must be compared against MIN1 at least once.
Correct me if it can be done with fewer comparisons. I'm mainly confused between options 3 and 4.
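For reference, the scan described above can be sketched in Python. This is just my reading of the case analysis in the question; the function name and the initial `sorted` call on the first three elements are my own additions:

```python
def three_smallest(nums):
    """Single-pass scan keeping the three smallest elements seen so far.

    Invariant: min1 <= min2 <= min3 at all times. Each new element costs
    between 1 and 3 comparisons, giving O(n) comparisons overall.
    """
    assert len(nums) >= 3
    # Order the first three elements (a constant number of comparisons).
    min1, min2, min3 = sorted(nums[:3])
    for x in nums[3:]:
        if x < min1:                      # case 1: new overall minimum
            min3, min2, min1 = min2, min1, x
        elif x < min2:                    # case 2: between min1 and min2
            min3, min2 = min2, x
        elif x < min3:                    # case 3: between min2 and min3
            min3 = x
        # case 4: x > min3 -- do nothing
    return min1, min2, min3
```

For example, `three_smallest([5, 1, 9, 3, 7, 2, 8])` returns `(1, 2, 3)`.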
Problem 2
Determining both the smallest and the largest number from a set of n distinct numbers using comparisons.
- These 2 elements can be determined using O(log₁₀₀n) comparisons.
- O(log₁₀₀n) comparisons do not suffice; however, they can be determined using n + O(log n) comparisons.
- n + O(log n) comparisons do not suffice; however, they can be determined using 3⋅⌈n/2⌉ comparisons.
- 3⋅⌈n/2⌉ comparisons do not suffice; however, they can be determined using 2(n−1) comparisons.
- None of the above.
Using an argument analogous to the one above (a single scan comparing each element against the running minimum and maximum), I've come up with the answer 2(n−1), although I'm still confused between options 2 and 4.
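As a comparison point for option 3, here is a sketch of the standard pairwise technique (not described in the question itself): compare each pair of elements first, then compare the smaller one against the running minimum and the larger one against the running maximum. That is 3 comparisons per 2 elements instead of 2 per element, so it stays within 3⋅⌈n/2⌉ comparisons. The comparison counter is mine, added to make the count visible:

```python
def min_max(nums):
    """Find min and max using at most 3*ceil(n/2) comparisons.

    Elements are processed in pairs: one comparison orders the pair,
    one updates the running min, one updates the running max.
    Returns (minimum, maximum, number_of_comparisons).
    """
    n = len(nums)
    assert n >= 1
    comparisons = 0
    if n % 2:                 # odd length: seed with the first element
        lo = hi = nums[0]
        start = 1
    else:                     # even length: seed with the first pair
        comparisons += 1
        lo, hi = (nums[0], nums[1]) if nums[0] < nums[1] else (nums[1], nums[0])
        start = 2
    for i in range(start, n, 2):
        a, b = nums[i], nums[i + 1]
        comparisons += 1
        small, big = (a, b) if a < b else (b, a)
        comparisons += 1
        if small < lo:
            lo = small
        comparisons += 1
        if big > hi:
            hi = big
    return lo, hi, comparisons
```

For even n this performs 3n/2 − 2 comparisons (e.g. 7 for n = 6), versus 2(n − 1) = 10 for the naive scan that compares every element against both the running minimum and maximum.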