Alright so here's the situation...
All of my coworkers are used to RDP'ing into a Windows server and doing all of their work there, whether it's coding, SQL work, etc. Multiple developers connect to the same Windows Server (either 2012 or 2016 currently) and run all the same applications. Their workstations almost act like thin clients, except that they run email on them.
We're working on supporting/customizing/deploying a new software package that's written in C#, so we're using Visual Studio. I've been stressing that instead of purchasing a high-performance server for everyone to connect to and work there, we could just have a cheaper server hosting the application snapshots (purely development environments) and then run Visual Studio on our workstations, accessing the projects over the network. Everything is in-house; the servers are in a rack about 20 feet from my desk.
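Before I push this any harder I want to sanity-check the "access the projects over the network" part, since Visual Studio and MSBuild hammer thousands of small files during a build. Something like the quick Python sketch below is what I have in mind for comparing a local checkout against the same tree on the share; the paths in it are just hypothetical placeholders, not our real layout.

```python
# Rough comparison of small-file read performance: local folder vs. UNC share.
# Point the paths at a real checkout and its share equivalent before trusting the numbers.
import time
from pathlib import Path

def read_tree(root: str):
    """Read every file under root once; return (file_count, elapsed_seconds)."""
    start = time.perf_counter()
    count = 0
    for path in Path(root).rglob("*"):
        if path.is_file():
            path.read_bytes()
            count += 1
    return count, time.perf_counter() - start

if __name__ == "__main__":
    # Hypothetical paths -- substitute the actual project folder and its UNC path.
    for label, root in [("local", r"C:\dev\project"),
                        ("share", r"\\devserver\projects\project")]:
        files, seconds = read_tree(root)
        print(f"{label}: read {files} files in {seconds:.1f}s")
```

It's crude (no cold/warm cache handling), but it should show whether the share adds enough latency to matter for builds.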
I'm now in charge of speccing out a server and laying out the pros and cons of all developers connecting to a single server and working on it (this could be 1-8 people working simultaneously in Visual Studio) vs. just running Visual Studio on our individual workstations.
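To have something concrete for that write-up, I started with a back-of-the-envelope sizing sketch like the one below. Every per-user figure in it is a placeholder assumption, not a measurement; the idea is to swap in numbers pulled from perfmon on real Visual Studio sessions during a full rebuild.

```python
# Back-of-the-envelope sizing for a shared box hosting everyone's Visual Studio session.
# All per-user figures are placeholder assumptions -- replace with measured values.

def size_terminal_server(concurrent_users: int,
                         ram_gb_per_user: float = 8.0,    # assumed VS + SQL tooling working set
                         cores_per_user: float = 2.0,     # assumed average load; builds spike higher
                         os_overhead_ram_gb: float = 8.0, # Windows Server / session host overhead
                         cpu_overcommit: float = 1.5):    # assumed tolerance for CPU contention
    """Return a rough RAM/core spec for one server hosting all developer sessions."""
    ram = os_overhead_ram_gb + concurrent_users * ram_gb_per_user
    cores = max(4, round(concurrent_users * cores_per_user / cpu_overcommit))
    return {"ram_gb": ram, "physical_cores": cores}

if __name__ == "__main__":
    for users in (1, 4, 8):
        spec = size_terminal_server(users)
        print(f"{users} concurrent VS sessions -> ~{spec['ram_gb']:.0f} GB RAM, "
              f"~{spec['physical_cores']} cores (placeholder assumptions)")
```

The output is only as good as the assumptions, but it at least gives us real numbers to argue about when comparing one big server against individual workstations.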
Can anyone perhaps give me some other points of view here? My mind is setting off alarms all over the place with this project...
> purchasing a high-performance server for everyone to connect to and work there

You mean multiple high-end servers and beefy network connections, because a terminal server (which is what you're describing here) can only handle so many concurrent sessions, and developer workloads are really heavy.

> except that they run email on them

So it's not about security, then. Why is this setup used? Instead of one big terminal server you could use more modern technology like application virtualization through e.g. App-V. Applications are still locked down, controlled, and distributed centrally, but they run in their own isolated virtual environment on the client.