
I'm trying to extract a large number of files stored in a BLOB column in a SQL Server database. The files are being saved correctly to a directory, but after processing approximately 30,000 files, the program throws the error:

Out of Memory

I believe I've cleaned up all resources properly after each file, and I've also added fetch options to handle large datasets. Despite this, the memory usage keeps increasing until the program crashes.

Does anyone know what might be causing this issue or how I can better manage memory in this situation?

 procedure TForm1.ExtractInvoicesFromBlobs(AMinInvNo, AMaxInvNo: Integer);
 var
   PrintDate, FileName, BaseFolder, TargetFolder: string;
   FileStream: TFileStream;
   BlobStream: TStream;
   RecordCount: Integer;
 begin
   // init
   RecordCount := 0;
   Memo1.Clear;

   // retrieve data from db
   FDQuery1.SQL.Clear;
   FDQuery1.SQL.Text :=
     'SELECT ID, PrintedFile, Filename, PrintDateTime ' +
     'FROM InvoiceStorage ' +
     'WHERE Filename BETWEEN :MinFilename AND :MaxFilename';
   FDQuery1.ParamByName('MinFilename').AsString := Format('%d.pdf', [AMinInvNo]);
   FDQuery1.ParamByName('MaxFilename').AsString := Format('%d.pdf', [AMaxInvNo]);
   FDQuery1.Open;

   // these 2 lines are to prevent "out of memory" after approximately 30,000 pdf
   FDQuery1.FetchOptions.Mode := fmAll;
   FDQuery1.FetchOptions.Unidirectional := True;

   while not FDQuery1.Eof do
   begin
     try
       PrintDate := Copy(FDQuery1.FieldByName('PrintDateTime').AsString, 1, 4);
       BaseFolder := IncludeTrailingPathDelimiter(Edit3.Text);
       TargetFolder := Format(BaseFolder + '%s', [PrintDate]);
       ForceDirectories(TargetFolder);

       BlobStream := FDQuery1.CreateBlobStream(FDQuery1.FieldByName('PrintedFile'), bmRead);
       try
         FileName := IncludeTrailingPathDelimiter(TargetFolder) +
           FDQuery1.FieldByName('Filename').AsString;
         FileStream := TFileStream.Create(FileName, fmCreate);
         try
           FileStream.CopyFrom(BlobStream, BlobStream.Size);
         finally
           FileStream.Free;
         end;
       finally
         BlobStream.Free;
         BlobStream := nil;
       end;

       // Each 1,000 files: log and provide some breathing air to the system
       Inc(RecordCount);
       if RecordCount mod 1000 = 0 then
       begin
         Memo1.Lines.Add(Format('%d files processed...', [RecordCount]));
         TrimMemo;
         Application.ProcessMessages;
         Sleep(100);
       end;
     except
       on E: Exception do
         Memo1.Lines.Add(Format('Error while processing record %d: %s',
           [RecordCount, E.Message]));
     end;
     FDQuery1.Next;
   end;
   FDQuery1.Close;
 end;
  • Isn't fmAll the actual problem, making things worse? It sounds like it fetches everything; what happens when you use fmOnDemand? Commented Jun 3 at 20:23
  • Without understanding or solving the root problem, you might still be able to work around it by fetching only, say, 10,000 files at a time, in a loop. Your code looks correct and beautiful to me. Commented Jun 4 at 6:11
  • Do you compile it as 32-bit or 64-bit? Compiling as 64-bit may help a bit if you are not doing that already. Commented Jun 4 at 11:06

1 Answer


You need to set:

 FDQuery1.FetchOptions.Unidirectional := True; 

before calling:

 FDQuery1.Open; 

Remove:

 FDQuery1.FetchOptions.Mode := fmAll; 
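
Put together, the fix is mostly a matter of ordering (a sketch of the relevant part only; with Mode left at FireDAC's default, fmOnDemand, rows are fetched as the cursor advances instead of the whole result set being buffered):

 // Set fetch options BEFORE opening the query: a unidirectional
 // cursor lets FireDAC discard rows (and their BLOBs) as you move
 // past them instead of caching all of them in memory.
 FDQuery1.FetchOptions.Unidirectional := True;
 // do NOT force FDQuery1.FetchOptions.Mode := fmAll here
 FDQuery1.Open;

 while not FDQuery1.Eof do
 begin
   // ... extract the BLOB for the current row as before ...
   FDQuery1.Next;
 end;
 FDQuery1.Close;

Setting the options after Open has no effect on the already-opened cursor, which is why the original code still ran out of memory.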

2 Comments

Pieter B, it seems to work this way. Thanks!
Another option: remove the PrintedFile field from FDQuery1 to reduce memory usage, and create an FDQuery2 that reads only the PrintedFile field by ID, fetching that one column inside the while loop. You could use MasterSource/MasterFields and open FDQuery2 when you open FDQuery1, so you do not have to manually set the ID and open/close FDQuery2. (This could slow the process down, but memory usage should drop.)
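
That master/detail alternative could be sketched like this (a hypothetical setup: FDQuery2 and a DataSource1 pointing at FDQuery1 are assumptions, not components from the question):

 // FDQuery1 no longer selects the BLOB column:
 FDQuery1.SQL.Text :=
   'SELECT ID, Filename, PrintDateTime FROM InvoiceStorage ' +
   'WHERE Filename BETWEEN :MinFilename AND :MaxFilename';

 // FDQuery2 fetches only the BLOB for the current master row.
 // Its :ID parameter is filled from FDQuery1's ID field via the
 // master/detail link (DataSource1.DataSet is assumed to be FDQuery1).
 FDQuery2.SQL.Text := 'SELECT PrintedFile FROM InvoiceStorage WHERE ID = :ID';
 FDQuery2.MasterSource := DataSource1;
 FDQuery2.MasterFields := 'ID';

 FDQuery1.Open;
 FDQuery2.Open;  // re-queries automatically as FDQuery1 scrolls

 // inside the while loop, read the BLOB from the detail query:
 BlobStream := FDQuery2.CreateBlobStream(
   FDQuery2.FieldByName('PrintedFile'), bmRead);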
