I'm trying to extract a large number of files stored in a BLOB column in a SQL Server database. The files are being saved correctly to a directory, but after processing approximately 30,000 files, the program throws the error:
Out of Memory
I believe I've cleaned up all resources properly after each file, and I've also set fetch options to handle large datasets. Despite this, memory usage keeps increasing until the program crashes.
Does anyone know what might be causing this issue or how I can better manage memory in this situation?
procedure TForm1.ExtractInvoicesFromBlobs(AMinInvNo, AMaxInvNo: Integer);
var
  PrintDate, FileName, BaseFolder, TargetFolder: string;
  FileStream: TFileStream;
  BlobStream: TStream;
  RecordCount: Integer;
begin
  // init
  RecordCount := 0;
  Memo1.Clear;

  // retrieve data from db
  FDQuery1.SQL.Clear;
  FDQuery1.SQL.Text :=
    'SELECT ID, PrintedFile, Filename, PrintDateTime ' +
    'FROM InvoiceStorage ' +
    'WHERE Filename BETWEEN :MinFilename AND :MaxFilename';
  FDQuery1.ParamByName('MinFilename').AsString := Format('%d.pdf', [AMinInvNo]);
  FDQuery1.ParamByName('MaxFilename').AsString := Format('%d.pdf', [AMaxInvNo]);
  FDQuery1.Open;

  // these 2 lines are meant to prevent "out of memory" after approximately 30,000 PDFs
  FDQuery1.FetchOptions.Mode := fmAll;
  FDQuery1.FetchOptions.Unidirectional := True;

  while not FDQuery1.Eof do
  begin
    try
      PrintDate := Copy(FDQuery1.FieldByName('PrintDateTime').AsString, 1, 4);
      BaseFolder := IncludeTrailingPathDelimiter(Edit3.Text);
      TargetFolder := Format(BaseFolder + '%s', [PrintDate]);
      ForceDirectories(TargetFolder);

      BlobStream := FDQuery1.CreateBlobStream(FDQuery1.FieldByName('PrintedFile'), bmRead);
      try
        FileName := IncludeTrailingPathDelimiter(TargetFolder) +
          FDQuery1.FieldByName('Filename').AsString;
        FileStream := TFileStream.Create(FileName, fmCreate);
        try
          FileStream.CopyFrom(BlobStream, BlobStream.Size);
        finally
          FileStream.Free;
        end;
      finally
        BlobStream.Free;
        BlobStream := nil;
      end;

      // Every 1,000 files: log progress and give the system some breathing room
      Inc(RecordCount);
      if RecordCount mod 1000 = 0 then
      begin
        Memo1.Lines.Add(Format('%d files processed...', [RecordCount]));
        TrimMemo;
        Application.ProcessMessages;
        Sleep(100);
      end;
    except
      on E: Exception do
        Memo1.Lines.Add(Format('Error while processing record %d: %s',
          [RecordCount, E.Message]));
    end;
    FDQuery1.Next;
  end;
  FDQuery1.Close;
end;
Is fmAll the actual problem, making things worse? It sounds like it fetches everything into memory. What happens when you use fmOnDemand?
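For reference, a minimal sketch of what that change might look like, using the same FDQuery1 component as in the question. Note that FireDAC fetch options only affect a cursor if they are set before Open is called, whereas the posted code sets them afterwards; the RowsetSize value here is an assumed example, not a recommendation:

```delphi
// Sketch only: configure fetching BEFORE opening the dataset;
// options set after Open have no effect on the current cursor.
FDQuery1.FetchOptions.Mode := fmOnDemand;      // fetch rows in batches as needed,
                                               // instead of loading the whole result set
FDQuery1.FetchOptions.Unidirectional := True;  // allow FireDAC to discard rows already read
FDQuery1.FetchOptions.RowsetSize := 100;       // rows per fetch round-trip (assumed value)
FDQuery1.Open;
```

With fmOnDemand plus Unidirectional, only a small window of rows (and their BLOBs) should be resident at any time, which is usually what you want when streaming tens of thousands of BLOBs to disk.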