Ok, first of all, if I'm asking in the wrong place, don't shoot me. I fully intend to obey the rules of Stack Exchange, so just let me know. Secondly, in case the question sounds weird, here is the explanation that actually makes sense:
I am a (junior) programmer and know very little about server configuration. Until today I was provided with the working environment and my job was just to code. So, I co-developed a project that was supposed to run on a Debian or Fedora server. The problem is that, it turns out, the server is Windows Server 2008, and I'm pretty sure the project will need major changes, best case, to work at all. Moreover, I'm not even sure I could make those changes, because the part of the project that demands a Unix-based machine was not coded by me, and the guy who wrote it has gone MIA. The project managers are not going to give up Windows for my sake, since they have other projects running there. So I was considering running the project on a virtual machine. I'm not sure if that's a decent idea, and I'm quite open to any other ideas you might offer. I know how to work with a virtual machine from trying it out on my PC (never on a server), and I assume it must be pretty much the same thing.
Thanks in advance
Another detail: chances are that the Windows environment provided to me already resides in a VM (i.e., they have built a cluster of Windows servers inside their main Windows server). Wouldn't it hurt performance if I were to create a Debian virtual server inside a Windows virtual server? At the very least, I imagine there must be some hardware or software requirements for nesting VMs like that.
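For what it's worth, one thing I picked up while experimenting on my PC: nested virtualization usually works well only if the CPU's hardware virtualization extensions (Intel VT-x, shown as the `vmx` flag, or AMD-V, shown as `svm`) are passed through to the inner VM. This is just a sketch of the check I would run from inside a Linux VM to see whether those flags are visible; without them, the inner Debian VM would fall back to much slower software emulation:

```shell
# Check whether the CPU (as seen by this Linux system) exposes
# hardware virtualization extensions: vmx = Intel VT-x, svm = AMD-V.
# If neither flag is visible, a nested VM would run unaccelerated.
if grep -Eq 'vmx|svm' /proc/cpuinfo 2>/dev/null; then
    echo "hardware virtualization extensions present"
else
    echo "no hardware virtualization extensions visible"
fi
```

If the outer hypervisor doesn't expose these flags to the Windows VM, I assume that would answer my performance question by itself.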