I had a meeting with a software vendor today about their recommended infrastructure for deploying a particular application. The application needs two servers: an app server for serving web pages (.NET, Windows), and a database server (SQL Server). The vendor claimed that these two servers had to have "bit parity". What they meant by this is that if the app server was 32-bit, SQL Server should be 32-bit, and if the app server was 64-bit, SQL Server should be 64-bit. Otherwise, they said, performance would be negatively impacted.
This seems ludicrous to me. The servers are independent and only communicate over a network. Network protocols have nothing to do with the "bit-ness" of the processor on either server.
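To illustrate why I think the architecture of each box is invisible to the other, here is a minimal sketch of how the app server talks to the database (the host name DBSERVER, database name, and connection string are just placeholders I made up). The query and its result travel as TDS packets over TCP, which is the same byte stream regardless of whether either process is 32-bit or 64-bit:

```csharp
using System;
using System.Data.SqlClient;

class BitnessDemo
{
    static void Main()
    {
        // Hypothetical connection string; "DBSERVER" stands in for the real DB host.
        var connectionString = "Server=DBSERVER;Database=AppDb;Integrated Security=true;";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var cmd = new SqlCommand("SELECT @@VERSION;", conn))
            {
                // The query and result cross the network as TDS packets over TCP,
                // a wire format that does not encode or depend on the processor
                // architecture of either the client or the server.
                Console.WriteLine(cmd.ExecuteScalar());
            }
        }
    }
}
```

Nothing in that exchange appears to care what architecture either end was compiled for, which is why the "bit parity" requirement strikes me as odd.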
Am I in the wrong? Is there any scenario in which such a mismatch actually could hurt performance?
NOTE: I know that certain apps might run faster or slower as 32-bit vs. 64-bit. But the vendor was saying that the mismatch between the web server and the DB server itself causes a problem. That is the statement I'm questioning.