Analyzing the Runtime of Shuffling Algorithm

The following is pseudocode used to shuffle the contents of an array $A$ of length $n$. As a subroutine for Shuffle, there is a call to Random$(m)$, which takes $O(m^2)$ time for an input $m$. Determine the runtime of the following algorithms.

Algorithm 1:

function Shuffle(A)
    Split A into two equal pieces A and B (this takes constant time)
    A = Shuffle(A)
    B = Shuffle(B)
    for i = 0 to len(A)/2 − 1:
        for j = 0 to i − 1:
            B[j] = B[j] − A[i] + Random(10)
            A[i] = A[j] − B[i] + Random(10)
    B = Shuffle(B)
    A = Shuffle(A)
    Combine A = A + B (this takes constant time)
    return A
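
To check that I am reading the pseudocode correctly, here is a minimal Python sketch of Algorithm 1 as I understand it. The helper random_value is a hypothetical stand-in for Random$(m)$ that just returns a random integer (its $O(m^2)$ running time only matters for the analysis), and I added a base case for single-element arrays, which the pseudocode does not spell out:

import random

def random_value(m):
    # Hypothetical stand-in for Random(m): returns a random integer in [0, m).
    # Its O(m^2) running time is only relevant to the analysis, not to correctness.
    return random.randrange(m)

def shuffle(arr):
    n = len(arr)
    if n <= 1:                          # base case (assumed; not in the pseudocode)
        return arr
    a, b = arr[:n // 2], arr[n // 2:]   # split into two (roughly) equal pieces
    a = shuffle(a)
    b = shuffle(b)
    for i in range(n // 2):             # i = 0 .. len(A)/2 - 1
        for j in range(i):              # j = 0 .. i - 1
            b[j] = b[j] - a[i] + random_value(10)
            a[i] = a[j] - b[i] + random_value(10)
    b = shuffle(b)
    a = shuffle(a)
    return a + b                        # combine the two pieces

if __name__ == "__main__":
    print(shuffle(list(range(8))))      # e.g. a length-8 input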

Algorithm 2:

function Shuffle1(A)
    for i = 0 to len(A) − 1:
        for j = 0 to i − 1:
            for k = 0 to j − 1:
                A[k] = A[k] + A[j] + A[i] + Random(m)
    return A
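
And here is the corresponding Python sketch of Algorithm 2 as I read it; I am treating m as a fixed argument handed to Random that does not depend on the loop indices, and the default value 10 below is just an arbitrary placeholder:

import random

def shuffle1(arr, m=10):
    # m is the fixed argument passed to Random; 10 is an arbitrary placeholder.
    n = len(arr)
    for i in range(n):          # i = 0 .. n - 1
        for j in range(i):      # j = 0 .. i - 1
            for k in range(j):  # k = 0 .. j - 1
                arr[k] = arr[k] + arr[j] + arr[i] + random.randrange(m)
    return arr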

After a good amount of work, I believe the runtime of Shuffle1$(A)$ is $O(n^3 \cdot m^2)$: there are three nested loops, and in the innermost loop the algorithm does a constant-time array access plus the $O(m^2)$ work of the call to Random$(m)$. But I am not quite sure. I am also having difficulty tackling the runtime of Shuffle$(A)$. Should I be using the Master Theorem in any way? Any advice and help would be greatly appreciated!
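
To add a bit more detail about my reasoning: every execution of the innermost line of Shuffle1 does $O(1)$ array work plus one call to Random$(m)$ costing $O(m^2)$, and $m$ does not depend on the loop indices, so the count I am trying to justify is

$$\sum_{i=0}^{n-1} \sum_{j=0}^{i-1} \sum_{k=0}^{j-1} O(m^2) \;=\; \binom{n}{3} \cdot O(m^2) \;=\; O(n^3 \cdot m^2).$$

For Shuffle, on the other hand, each call Random(10) only costs $O(10^2) = O(1)$ if I read the cost model correctly, so my real difficulty is setting up and solving the recurrence for the four recursive calls on half-size pieces.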
