**Cause of the speed-up**

This is definitely not memoization. The reason for the speed-up is that when you work with very large arrays (around 10^8 elements), memory clean-up operations start to take a significant amount of time. If you skip the clean-up, you save that time.

Here is an example:

Let's create a large array, perform a calculation, and discard the array:

 AbsoluteTiming[
 Total[ConstantArray[0, 10^8]];
 ]

> {0.422509, Null}

Let's now do the same thing, but keep the array in memory:

 AbsoluteTiming[
 Total[garbage = ConstantArray[0, 10^8]];
 ]

> {0.366755, Null}

It is noticeably faster.

Let's check how long it takes to remove the large array:

 AbsoluteTiming[
 Remove[garbage]
 ]

> {0.061982, Null}

This roughly accounts for the difference between the two timings above.
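
As a quick sanity check, here is a minimal sketch showing that the kept array really does occupy memory until it is removed:

 garbage = ConstantArray[0, 10^8];
 ByteCount[garbage] (* roughly 8*10^8 bytes: the packed array stays in memory *)
 Remove[garbage]    (* only now is that memory released, at the cost measured above *)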

**Your example**

In the example you provide, removing the ``Unitize@data[[All, n]]`` array from memory takes a significant amount of time. If one stores this array in a redundant variable, one avoids the memory clean-up and gains some time.
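
To connect this to your code, here is a minimal sketch mirroring the ``garbage`` experiment above with the ``Pick``/``Unitize`` pattern; the variable name ``mask`` and the array size are arbitrary choices of mine:

 data = RandomInteger[{0, 10}, {5*10^7, 3}];

 (* the Unitize result is temporary, so its clean-up happens inside the timing *)
 AbsoluteTiming[Pick[data, Unitize@data[[All, -1]], 1];]

 (* the Unitize result is kept in a variable, so its clean-up is skipped *)
 AbsoluteTiming[Pick[data, mask = Unitize@data[[All, -1]], 1];]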

**How to make the test representative?**

You should put ``Clear[pick, unitize]`` _inside_ your test (here, inside the ``Table`` loop), so that every run starts from a clean state. This shows that the pseudo-memoization technique is actually slower:

 Table[
 Clear[data];
 data=RandomInteger[{0,10},{i*10^7,3}];
 {
 Pick[data,Unitize@data[[All,-1]],1]; // AbsoluteTiming // First
 ,
 Clear[pick,unitize];
 unitize[x_]:=unitize[x]=Unitize[x];
 pick[xs_,sel_,patt_]:=pick[xs,sel,patt]=Pick[xs,sel,patt];
 pick[data,unitize@data[[All,-1]],1]; // AbsoluteTiming // First
 },
 {i,5}]

 (*
 {{0.534744, 0.469538},
 {1.03776, 1.05842},
 {1.58536, 1.65404},
 {2.10422, 2.11284},
 {2.48129, 2.71405}}
 *)
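
The same bookkeeping explains where the "saved" time goes in the un-cleared test: the ``pick`` and ``unitize`` definitions keep references to the large arrays, so nothing can be deallocated until ``Clear`` is called. A minimal sketch (the array size is an arbitrary choice):

 unitize[x_] := unitize[x] = Unitize[x];
 unitize[RandomInteger[{0, 10}, 10^7]];
 ByteCount[DownValues[unitize]] (* the cached rule holds both the argument and the result *)
 Clear[unitize]                 (* only now can those arrays be deallocated *)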