
I am building an API project, where I have a controller called C1, which calls service S1. Within this service, there are multiple method invocations to services S2, S3, and S4, as well as calls to the R1 repository and to the ER1 repository, which wraps calls to an external API.
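For illustration, the structure is roughly the following (a simplified plain-Java sketch; the real names, signatures, and framework differ):

    // Simplified placeholder sketch of the dependency graph described above.
    interface S2 { String enrich(String order); }
    interface S3 { boolean validate(String order); }
    interface S4 { void notifyShipping(String order); }
    interface R1 { void save(String order); }               // database repository
    interface ER1 { String fetchExternalData(String id); }  // wraps the external API

    class S1 {
        private final S2 s2; private final S3 s3; private final S4 s4;
        private final R1 r1; private final ER1 er1;

        S1(S2 s2, S3 s3, S4 s4, R1 r1, ER1 er1) {
            this.s2 = s2; this.s3 = s3; this.s4 = s4; this.r1 = r1; this.er1 = er1;
        }

        String process(String id) {
            String order = er1.fetchExternalData(id);        // external API via ER1
            if (!s3.validate(order)) throw new IllegalArgumentException("invalid order");
            String enriched = s2.enrich(order);
            r1.save(enriched);                               // persistence via R1
            s4.notifyShipping(enriched);
            return enriched;
        }
    }

    class C1 {
        private final S1 s1;
        C1(S1 s1) { this.s1 = s1; }
        String handleRequest(String id) { return s1.process(id); } // thin controller
    }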

This is my first time writing tests for an API but, as far as I know, each class should have a unit test*. However, when integration-testing, given that I have several services and repositories, that would mean integrating the services with each other (i.e., S1-S2, S1-S3, S1-S4), as well as with the repositories (S1-R1, S1-ER1). Is this the correct approach?

*Also, should the controller have unit tests, or would all be covered with the C1-S1 integration test?

2 Comments

  • A limited amount of end-to-end testing is often valuable. The key to any testing, however, is being pragmatic: focus on what has value, and test at multiple levels with the right balance between them. That may include unit tests, isolated service tests, integration tests, and end-to-end tests. These aren't choices to be made against each other, but different parts of an overall picture; the real choice is how much time and emphasis you put on each one, which is down to the discretion of you, your team, and your organisation.
  • Define "best". Not everyone has the same priorities in terms of effort, complexity, risk, test suite runtime, ...

2 Answers

5

each class should have a unit test

This is not strictly true. There is little value in blanket rules such as “each method should be covered” or “we should have 95% branch coverage.”

If you have parts of your code that contain complex logic, have caused you headaches in the past, and are modified often, make sure they have thorough testing.

If you have a class that is little more than a DTO, in a part of the code base that hasn't been modified for the past ten years, why would you add a test for it? What's the value?

that would mean integrating the services with each other

Exactly. This is why there are different types of tests. Unit tests are rather limited to, well, the unit they are testing. And since they test it in isolation, through stubs and mocks, they won't catch issues that arise when those units interact with each other. That is precisely the value of integration tests and system tests.
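To make that concrete, here is a minimal sketch of a unit test for the S1 from the question, assuming JUnit 5, Mockito, and the simplified interfaces sketched in the question (none of this is prescribed by the actual project):

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.mockito.Mockito.*;
    import org.junit.jupiter.api.Test;

    class S1UnitTest {
        // Unit test of S1 in isolation: every collaborator is a mock, so the test
        // only checks S1's own logic and its assumptions about the collaborators.
        @Test
        void validOrderIsEnrichedSavedAndShipped() {
            S2 s2 = mock(S2.class);
            S3 s3 = mock(S3.class);
            S4 s4 = mock(S4.class);
            R1 r1 = mock(R1.class);
            ER1 er1 = mock(ER1.class);

            when(er1.fetchExternalData("42")).thenReturn("raw-order");
            when(s3.validate("raw-order")).thenReturn(true);
            when(s2.enrich("raw-order")).thenReturn("enriched-order");

            S1 s1 = new S1(s2, s3, s4, r1, er1);

            assertEquals("enriched-order", s1.process("42"));
            verify(r1).save("enriched-order");
            verify(s4).notifyShipping("enriched-order");
            // If the real S2 returned something the real R1 rejects, this test would
            // never notice; that gap is exactly what integration tests cover.
        }
    }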

Make sure, however, to understand the limitations of integration and system tests. As they cover a larger part of the code base than a unit test, they:

  1. Are usually slower than unit tests. This means that you won't be able to have tens of thousands of them for a medium-size application.

  2. Can fail for numerous reasons, and won't point you to the location of the issue. By comparison, a unit test (if correctly designed) will necessarily point you to at least the class, and at best the method or the line of code which causes an issue—or the issue would be in the test itself or its stubs/mocks.

Integration and system tests are also usually prone to issues with the infrastructure itself. For instance, a misconfigured proxy server could easily make all your system tests go red. This is not necessarily a good thing: you should be able to identify such infrastructure issues through other means, such as smoke tests.
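A smoke test can be as cheap as a single request against the deployed environment. A minimal sketch, using JUnit 5 and Java's built-in HTTP client (the URL and the health endpoint are placeholders):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    class SmokeTest {
        // One cheap request against the test environment. If this fails, suspect
        // infrastructure (proxy, DNS, deployment) before blaming red integration
        // or system tests on the application code.
        @Test
        void healthEndpointResponds() throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://test-env.example.com/health")) // placeholder URL
                    .GET()
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            assertEquals(200, response.statusCode());
        }
    }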

Also, should the controller have unit tests, or would all be covered with the C1-S1 integration test?

If the controller has its own complex logic that is being actively modified, add unit tests. Or think about moving this logic to a dedicated class—a common practice is to keep controllers very basic.
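As a rough illustration, extracting the request-handling logic into its own class keeps C1 a thin pass-through, and the extracted class can be unit-tested without any HTTP plumbing (OrderRequestMapper is an invented name, not something from the question; C1 is a variant of the sketch in the question):

    // Instead of validating/mapping inside C1, pull that logic into its own class.
    class OrderRequestMapper {
        String toOrderId(String rawPathParam) {
            String trimmed = rawPathParam == null ? "" : rawPathParam.trim();
            if (trimmed.isEmpty()) throw new IllegalArgumentException("missing order id");
            return trimmed;
        }
    }

    class C1 {
        private final S1 s1;
        private final OrderRequestMapper mapper = new OrderRequestMapper();

        C1(S1 s1) { this.s1 = s1; }

        String handleRequest(String rawPathParam) {
            return s1.process(mapper.toOrderId(rawPathParam)); // controller stays trivial
        }
    }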

2

This is a "should" question, which necessarily is going to be answered by "It depends" - especially when it comes to testing. Despite what some zeaolots advocate, there really is no one-size-fits-all testing strategy. Generally, what you want to do is match your level of testing to your risk-tolerance level. Are you writing a browser-game that you and two of your friends are going to play? A couple of smoke or sanity tests and a few unit tests will probably be fine. Are you writing a fintech application that could bring down the NYSE if you insert a typo? Then pretty much ALL THE TESTING.

A couple of rules of thumb, though:

  1. Minimize duplicate testing, that is, testing the same code or functionality multiple times via unit tests, integration tests, and smoke tests. Some overlap is inevitable, but you probably shouldn't be testing the same functionality dozens of times.
  2. Usually you don't test other people's code. Either assume they did sufficient testing, or don't use their code. A couple of sanity tests just to make sure are usually a good idea, though.

Personally, I like to use integration or end-to-end tests for my own code, with stubs or mocks at the boundaries. I only use unit tests for debugging purposes: if an integration test fails, I run the unit tests to narrow down where the problem is. I don't try to be exhaustive with unit tests, because they are white-box and can just as easily mimic or shadow the very bug they are there to detect. Integration or end-to-end tests should be strongly tied to requirements, i.e., you could run them manually if you had to by checking observable outcomes. But that's just my preference. There really is no one "right" way, though there are plenty of ways that are definitely wrong.
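To illustrate that style against the question's structure: the sketch below drives the whole C1-S1 flow, replaces only the boundaries (the external API behind ER1 and persistence behind R1), and asserts on observable outcomes. The lambdas stand in for the real S2/S3/S4 implementations, which a real test would use as-is; JUnit 5 and the simplified interfaces from the question's sketch are assumed.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import java.util.ArrayList;
    import java.util.List;
    import org.junit.jupiter.api.Test;

    class OrderFlowTest {
        @Test
        void processedOrderIsPersisted() {
            List<String> saved = new ArrayList<>();        // in-memory stand-in for R1

            S1 s1 = new S1(
                    order -> "enriched " + order,          // stand-in for the real S2
                    order -> !order.isBlank(),             // stand-in for the real S3
                    order -> { /* shipping notification ignored in the test */ }, // S4
                    saved::add,                            // R1: persistence boundary
                    id -> "order-" + id                    // ER1: external API stubbed out
            );
            C1 controller = new C1(s1);

            String result = controller.handleRequest("42");

            // Assert on observable outcomes, not on internal call sequences.
            assertEquals("enriched order-42", result);
            assertEquals(List.of("enriched order-42"), saved);
        }
    }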

