On SQL Server (our database is 520 GB), one filestore table holds a huge number of files; 65-70% of the 520 GB sits in that single table. So essentially it's a 200 GB database with 320 GB of files in this one table.
I ran a script that removed about 120 GB of old records from this table. The database file is still 520 GB, but there is now a 120 GB "hole" of free space inside it that is available for new writes.
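For reference, this is roughly how I'm verifying the freed space (standard system procedure/view; the table name `dbo.Filestore` is just a placeholder for our actual filestore table):

```sql
-- Database-level: total size vs. unallocated (free) space inside the files
EXEC sp_spaceused;

-- Table-level: reserved vs. actually used space for the filestore table
EXEC sp_spaceused N'dbo.Filestore';

-- Per-file breakdown of allocated vs. unused pages in the current database
SELECT file_id,
       total_page_count * 8 / 1024       AS total_mb,
       allocated_extent_page_count * 8 / 1024 AS allocated_mb,
       unallocated_extent_page_count * 8 / 1024 AS free_mb
FROM sys.dm_db_file_space_usage;
```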
Should I expect a significant improvement in overall database performance after freeing up this much space inside the database?
And how does having a huge amount of data in one table affect the performance of queries that have nothing to do with that table?