Is coding important to be good at computer science? Should one implement an algorithm to know it well?
I remember one CS professor's remark: "I never code."
You won't really know the algorithm well until you code it.
Coding is not important to your professor, but you need to keep in mind that he is not paid to DO things. He is paid to SAY things (and WRITE things).
I'm a former math professor, so I understand this dynamic well.
If you want to follow his path, and to be a theoretical computer scientist, then yes, coding is of lesser importance. But if you do, remember to maintain humility, knowing that your salary is paid by resources earned by those who chose to DO things.
Computer Science is no more about computers than astronomy is about telescopes
— Edsger Dijkstra
I tend to agree.
If you're talking about being a pure Computer Science academic specializing in abstract, foundational Computer Science concepts, then not necessarily.
To bend an analogy: this is a bit like asking if every rocket scientist at NASA should have to fly in space to be a "good rocket scientist". Of course not. Being an astronaut is part of the space flight industry, and a very hands-on part, but it doesn't mean that ground scientists aren't just as important in their own way.
That said, it's probably a good idea to APPLY the algorithm he created, if not actually write it in a real programming language. In this sense, you can think of algorithm design as a branch of mathematics.
Coding is not super important for being a true computer scientist. And thinking in code can constrain thinking as computer scientists seek to develop useful abstract concepts. Most excellent coders do not have the intellectual chops to analyze complex algorithms, or to develop concepts such as programming languages, advanced searching and sorting algorithms, finite automata theory, distributed computing theory, R-Trees, fault-tolerance protocols, reliable communication protocols, digital signal processing algorithms, cryptographic theory, performance analysis and optimization, efficient caching, map-reduce, reliable security protocols, etc. Excellent coders and computer engineers can usually use these theories in the systems they are trying to build, and do so quite effectively, but that is really the realm of the computer systems engineer or computer programmer.
Coding is critically important to being a computer programmer. Understanding how to encode the useful abstract concepts produced by the computer scientists into working code is also useful.
One big problem in computer science is that computer scientists often have to find solutions to math problems that have little utility in solving today's programming problems. Even if they coded a solution, nobody would really be able to use it. Think about digital signal processing theory. It was developed by folks like Fourier, Hilbert and Shannon, but applying it to computerized DSP problems was not widely practical until about 20 years ago.
The big problem in computer education is that most people taught by computer scientists will not become computer scientists. But too many computer scientists don't get this. Coding may not be important to them, but if you are in their class, it is almost certainly going to be important to you.
Another big problem in computer education is that many true computer scientists lack the industrial experience to be useful in teaching software development. They are essentially trying to teach something that they really don't know. That causes them to lose credibility. Things that are important in an industrial setting just don't often register with some of these computer scientists.
The long and the short of it: coding is important for most people who become "computer scientists", because most of those people will become computer programmers and computer systems engineers.
Depends on the subfield the professor is in.
Anybody competent in numerical analysis is probably a Fortran whiz. Any AI professor will code in Lisp or Prolog or something like that.
In some of the more mathematical areas, there really isn't a need to code. I'd still be a touch suspicious, myself.
Sounds like he's more of a discrete mathematics kind of guy... just into the math and theory behind computer science. Take what these types of professors have to say with a grain of salt.
I would have to say "Logic is important"
You can get away with understanding the theory only, but I always found I understood algorithms and such 1000x better after having coded them (bubble sort vs. quicksort, for example: it is great to know the Big-O, but seeing it in practice with large data sets gives you a certain real-world appreciation for measuring computational complexity).
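To make that concrete, here is a rough Python sketch; the input size and the simple list-comprehension quicksort are my own arbitrary choices, just to show the O(n^2) vs. O(n log n) gap on the same data:

```python
import random
import time

def bubble_sort(data):
    # O(n^2): repeatedly swap adjacent out-of-order elements.
    a = list(data)
    for i in range(len(a)):
        swapped = False
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break
    return a

def quicksort(data):
    # O(n log n) on average: partition around a pivot and recurse.
    if len(data) <= 1:
        return list(data)
    pivot = data[len(data) // 2]
    return (quicksort([x for x in data if x < pivot])
            + [x for x in data if x == pivot]
            + quicksort([x for x in data if x > pivot]))

if __name__ == "__main__":
    data = [random.randint(0, 1_000_000) for _ in range(5_000)]
    for name, sort in (("bubble sort", bubble_sort), ("quicksort", quicksort)):
        start = time.perf_counter()
        sort(data)
        print(f"{name}: {time.perf_counter() - start:.3f}s")
```

Even at a few thousand elements the quadratic sort should already be noticeably slower, which is exactly the visceral lesson the Big-O notation alone doesn't give you.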
One interesting thing I have found is that the more you study the theoretical aspects of computer science, the easier coding becomes. At some point you stop thinking of things in a particular language and instead see them as the broader concepts of computer science.
To my mind, this is like asking whether all English professors should be capable of writing movies, TV series, novels, plays and poems. Similarly, imagine a math professor who never uses numbers, for an equally outlandish idea. That is to say, there are some basic elements that do give coding some importance in being able to teach basic computer science. The professor should know basic language syntax and how to write programs as sophisticated as the courses that the professor is teaching. If the professor is teaching compiler design and has never written a compiler, that would be a major problem. Imagine a chef baking a cake who has never cooked or eaten a cake before. Ay caramba.
While I can see some advantages to implementing an algorithm to know it, I doubt it is a requirement. After all, one could wonder how far down the rabbit hole of implementation one should go in understanding how an algorithm is implemented. For example, does someone have to implement any given algorithm under various paradigms like procedural, object-oriented, and functional programming to really know it? Do they have to know how compilers translate all the code and move the bits around on an electron-by-electron level, to be rather pedantic about it?
"I never code," does have an implication of containing the past as well as present tense in a way though. There can also be an implicit assumption of "coding" as a lowly thing that is below the professor for another way to view the statement that can carry a rather negative tone to it that may not go over well in some circles.
Your professor may be right to a degree, in that to be a professor you don't need to code, only to know the theory well. But that will not work outside the university's walls.
Despite being a professional software developer, I got a degree in Mechanical Engineering.
You can be a good mechanical designer with very little experience building and machining parts, leaving that job to machinists. But knowing how to build and machine parts will make you a significantly better engineer, because you can predict difficulties involved with fabricating and assembling whatever you are designing.
The same goes for software. A "coder" is a machinist or technician, while a software engineer is, well, the engineer. In many places, one person does both jobs. It's not impossible, and for some very abstract issues, an "engineering only" position might work.
But for the vast majority, there is absolutely no benefit from refusing to code.
Unless you're contemplating an end to the halting problem, there's always a use for coding in every aspect of Computer Science.
The only CS class I took with absolutely no programming was theory. I'd imagine there are plenty of physicists out there who say, "I never experiment" but they're probably also the ones who say, "I never discover anything". And I'd be surprised if they care.
As a Computer Science student, I think that at first it's better to understand the concepts that underlie software development. Once you have learned the ideas behind software and how it interacts with a computer, then it is time to start coding and dealing with specific implementation problems.
This is just like software exceptions: at first you only deal with them because you did something you weren't allowed to do. Then, once you understand them, you start using them in your own code to make it more expressive.
Well, I think the people who don't care about those concepts are like the programmers who use exceptions as a normal workflow in their applications. They know HOW but don't really get WHY.
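As a hedged illustration of that anti-pattern, here is a small Python sketch; the function names are made up for the example:

```python
def find_index_by_exception(items, target):
    # Anti-pattern: the "normal" path runs through an exception handler.
    try:
        return items.index(target)   # list.index raises ValueError when target is absent
    except ValueError:
        return -1

def find_index(items, target):
    # Clearer: absence is an expected outcome, not an exceptional condition.
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

colors = ["red", "green", "blue"]
print(find_index_by_exception(colors, "purple"))  # -1, via control flow by exception
print(find_index(colors, "purple"))               # -1, stated plainly
```

Both functions return the same answer, but the first one knows HOW to use exceptions without getting WHY they exist, which is the distinction being made above.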
I've got another idiom for your professor:
Those who can, do; those who can't, teach.
IMO, talk is cheap. Anyone can endlessly jabber on about 'theory' and call it 'computer science'. But until it's put into actual practice, theory isn't very useful because there's no way to validate it. I'd take a prof's opinion about something much more seriously if I knew he'd actually solved a particular problem in code than if he were just regurgitating 'theory' which may or may not have any supporting evidence to back up his point of view.