ThinkJet is right that some of the other answers don't cater for the 'keep together' requirement. However, I think this can be done without resorting to a user-defined aggregate.
Sample data
create table test (empno number, ename varchar2(20), trandate date, amt number);

insert into test values (100, 'Alison', to_date('21-MAR-1996'), 45000);
insert into test values (100, 'Alison', to_date('12-DEC-1978'), 23000);
insert into test values (100, 'Alison', to_date('24-OCT-1982'), 11000);
insert into test values (101, 'Linda',  to_date('15-JAN-1984'), 16000);
insert into test values (101, 'Linda',  to_date('30-JUL-1987'), 17000);
insert into test values (102, 'Celia',  to_date('31-DEC-1990'), 78000);
insert into test values (102, 'Celia',  to_date('17-SEP-1996'), 21000);
insert into test values (103, 'James',  to_date('21-MAR-1996'), 45000);
insert into test values (103, 'James',  to_date('12-DEC-1978'), 23000);
insert into test values (103, 'James',  to_date('24-OCT-1982'), 11000);
insert into test values (104, 'Robert', to_date('15-JAN-1984'), 16000);
insert into test values (104, 'Robert', to_date('30-JUL-1987'), 17000);
Now, determine the end row of each empno segment, using RANK to find the start of the segment and COUNT ... PARTITION BY to find the number of rows in it.
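To see what those analytics return before any division is applied, you can run something like the following against the sample data (the seg_* aliases are mine; strictly speaking the sum lands one row past the segment's end, which is fine for the ceil bucketing that follows).

select empno,
       ename,
       rank() over (order by empno)            as seg_start, -- position of the segment's first row
       count(1) over (partition by empno)      as seg_rows,  -- rows in the segment
       rank() over (order by empno)
         + count(1) over (partition by empno)  as seg_end    -- one past the segment's last row
from   test
order  by empno;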
Then use the ceil division from APC's solution to group them into their 'pages'. Again, as ThinkJet pointed out, there is a gap in the specification: it doesn't cater for an empno 'keep together' segment holding more records than can fit in a page.
select empno,
       ename,
       ceil((rank() over (order by empno)
             + count(1) over (partition by empno)) / 6) as chunk
from   test
order  by 1;
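Against the twelve sample rows with a page size of 6, that should come out along these lines (condensing the per-row output to one line per employee): each empno stays in a single chunk, and no chunk here exceeds six rows.

     EMPNO ENAME    CHUNK
       100 Alison       1   (3 rows)
       101 Linda        1   (2 rows)
       102 Celia        2   (2 rows)
       103 James        2   (3 rows)
       104 Robert       3   (2 rows)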
As ThinkJet pointed out, though, this solution isn't bulletproof, so it is worth stress-testing the edge case:
drop table test purge;

create table test (empno number, ename varchar2(20), trandate date, amt number);

declare
  cursor csr_name is
    select rownum emp_id,
           decode(rownum, 1,'Alan',   2,'Brian',  3,'Clare',   4,'David',  5,'Edgar',
                          6,'Fred',   7,'Greg',   8,'Harry',   9,'Imran', 10,'John',
                         11,'Kevin', 12,'Lewis', 13,'Morris', 14,'Nigel', 15,'Oliver',
                         16,'Peter', 17,'Quentin', 18,'Richard', 19,'Simon', 20,'Terry',
                         21,'Uther', 22,'Victor', 23,'Wally', 24,'Xander',
                         25,'Yasmin', 26,'Zac') emp_name
    from   dual
    connect by level <= 26;
begin
  for c_name in csr_name loop
    for i in 1..11 loop
      insert into test values
        (c_name.emp_id, c_name.emp_name,
         (date '2010-01-01') + i,
         to_char(sysdate,'SS') * 1000);
    end loop;
  end loop;
end;
/

select chunk, count(*)
from   (select empno,
               ename,
               ceil((rank() over (order by empno)
                     + count(1) over (partition by empno)) / 25) as chunk
        from   test)
group by chunk
order by chunk;
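With 26 groups of 11 rows each (286 in total) and a chunk size of 25, the counts should come out like this: normally two 11-row groups per chunk, but every so often the boundary arithmetic lets a third group in.

     CHUNK   COUNT(*)
         1         22
         2         22
         3         22
         4         33
         5         22
         6         22
         7         22
         8         33
         9         22
        10         22
        11         22
        12         22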
So with a chunk size of 25 and a group size of 11, we get jumps where a chunk ends up holding 33 rows despite the 25-row limit. A large chunk size relative to the group size should make this infrequent, but you'd want to allow some leeway; so maybe set the chunk size to 65,000 rather than going all the way to 65,536.
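As a belt-and-braces check (assuming the 65,536 figure is a hard per-file row cap, as in older Excel formats), a query along these lines would flag any chunk that still overflows despite the leeway; the two numbers are just the ones from the paragraph above.

-- List any chunk that exceeds the hard limit despite the 65,000 leeway.
select chunk, count(*) as rows_in_chunk
from   (select empno,
               ceil((rank() over (order by empno)
                     + count(1) over (partition by empno)) / 65000) as chunk
        from   test)
group by chunk
having count(*) > 65536
order by chunk;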