I am using the book Ray Tracing from the Ground Up by Kevin Suffern to build my first ray tracers in Java. Based on the theory, I expected direct illumination and a simple path tracer to render the same scene equally bright (that is, once the resulting images are more or less converged and the path tracer only allows paths of maximal length 2). However, my path tracer renders much darker images. Here is an example:
The images on the left and right are the result of direct illumination and path tracing, respectively, with 225 rays per pixel. My path tracer does not use next event estimation; it returns black whenever a path exceeds a certain length without reaching the area light. The returned radiances are then averaged per pixel.
When I manually change the sensitivity, though (i.e. apply a simple post-render scaling of the colours in the image), the path-traced image becomes this:
Of course, the brightened image is not near convergence, but it should at least be clear that the path tracer is actually rendering the same scene.
Is this normal behaviour? If so, could I just pick a larger power for the light source when using a path tracer? If not, please find some relevant code of my path tracer underneath.
This is my code for shading an intersection point of a ray and a matte object. It is supposed to shoot a random ray from the intersection point, sampled from a cosine distribution (I have used a branch factor of 1 for my images).
```java
public RGBSpectrum shadePath(Shading shading, BranchPathTracer tracer, int length) {
    int factor = tracer.getBranchFactor();
    cosinusSampler sampler = new cosinusSampler(factor);
    List<Point> samples = sampler.computeSamples();
    RGBSpectrum L = RGBSpectrum.BLACK;
    for (Point sample : samples) {
        // build an orthonormal basis (u, v, w) around the shading normal
        Vector w = shading.getIntersectionPoint().getNormal();
        Random PRNG = new Random();
        Vector r = new Vector(PRNG.nextDouble(), PRNG.nextDouble(), PRNG.nextDouble());
        Vector v = r.cross(w).normalize();
        Vector u = v.cross(w);
        // express the cosine-distributed sample in world coordinates
        Vector wi = u.scale(sample.x).add(v.scale(sample.y)).add(w.scale(sample.z)).normalize();
        Ray ray = new Ray(shading.getIntersectionPoint().toPoint(shading.getRay()), wi);
        L = L.add(tracer.trace(ray, length + 1));
    }
    L = L.scale(1.0 / factor);
    return shading.getColour().scale(getDiffuseReflection()).multiply(L);
}
```

Here is how the cosine-distributed samples are computed:
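As an aside, before the sampler code: the reason `shadePath` multiplies by the diffuse colour only, with no explicit cos θ or 1/π, is that the cosine pdf p(ω) = cos θ/π cancels both factors in the estimator f · L · cos θ / p = (ρ/π) · L · cos θ · π/cos θ = ρ · L. The following standalone sketch (a hypothetical class, not part of my tracer) checks this cancellation numerically by comparing against uniform hemisphere sampling, where the factors do not cancel:

```java
import java.util.Random;

public class CosineEstimatorCheck {
    // Monte Carlo estimate of the reflected radiance of a Lambertian surface
    // (albedo rho) under constant incoming radiance li, using UNIFORM
    // hemisphere sampling (pdf = 1/(2*pi)). For uniformly distributed
    // directions on the hemisphere, cos(theta) is itself uniform in [0, 1).
    static double uniformHemisphereEstimate(double rho, double li, int n, long seed) {
        Random rng = new Random(seed);
        double sum = 0;
        for (int i = 0; i < n; i++) {
            double cosTheta = rng.nextDouble();
            // per-sample estimate: f * li * cosTheta / pdf
            //                    = (rho / pi) * li * cosTheta * 2 * pi
            sum += (rho / Math.PI) * li * cosTheta * 2 * Math.PI;
        }
        return sum / n;
    }

    public static void main(String[] args) {
        double rho = 0.7, li = 2.0;
        // with cosine-weighted sampling every sample contributes exactly
        // rho * li, so the uniform estimate must converge to that value
        System.out.println(uniformHemisphereEstimate(rho, li, 1_000_000, 42L));
        System.out.println(rho * li);
    }
}
```

Both estimators converge to ρ · L, which is why no extra factors appear in my matte shader.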
```java
public List<Point> computeSamples() {
    List<Point> samples = new ArrayList<>();
    for (int i = 0; i < getSamples(); i++) {
        double phi = 2 * Math.PI * Math.random();
        double cosPhi = Math.cos(phi);
        double sinPhi = Math.sin(phi);
        double cosTheta = Math.sqrt(1 - Math.random());
        double sinTheta = Math.sqrt(1 - cosTheta * cosTheta);
        double pu = sinTheta * cosPhi;
        double pv = sinTheta * sinPhi;
        double pw = cosTheta;
        Point sample = new Point(pu, pv, pw);
        samples.add(sample);
    }
    return samples;
}
```

This is the shading for emissive materials. `showPath` returns true by default.
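A side note on the cosine sampler above, before moving on: under the density cos θ/π, the expected z-component (cos θ) of a sample is ∫₀¹ √x dx = 2/3, and every sample should lie on the unit hemisphere. This standalone re-implementation of the same mapping (a hypothetical class, not part of my tracer) checks both properties:

```java
public class CosineSamplerCheck {
    // Same mapping as computeSamples(): phi uniform in [0, 2*pi),
    // cosTheta = sqrt(1 - u) with u uniform in [0, 1). Returns the average
    // z component over n samples and verifies each direction is unit length.
    static double meanCosTheta(int n, long seed) {
        java.util.Random rng = new java.util.Random(seed);
        double sum = 0;
        for (int i = 0; i < n; i++) {
            double phi = 2 * Math.PI * rng.nextDouble();
            double cosTheta = Math.sqrt(1 - rng.nextDouble());
            double sinTheta = Math.sqrt(1 - cosTheta * cosTheta);
            double pu = sinTheta * Math.cos(phi);
            double pv = sinTheta * Math.sin(phi);
            double pw = cosTheta;
            double len = Math.sqrt(pu * pu + pv * pv + pw * pw);
            if (Math.abs(len - 1.0) > 1e-9) {
                throw new IllegalStateException("sample not on the unit hemisphere");
            }
            sum += pw;
        }
        return sum / n;
    }

    public static void main(String[] args) {
        // E[cos theta] under the pdf cos(theta)/pi is 2/3
        System.out.println(meanCosTheta(1_000_000, 1L));
    }
}
```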
```java
public RGBSpectrum shadePath(Shading shading, BranchPathTracer tracer, int length) {
    Ray ray = shading.getRay();
    Vector n = shading.getIntersectionPoint().getNormal();
    boolean showPath = tracer.getShowOnlyLength() == 0 || length == tracer.getShowOnlyLength();
    // emit only towards the side the ray came from (front-facing hits)
    if (n.dot(ray.direction) < 0 && showPath) {
        return getTexture().getColour().scale(getPower());
    } else {
        return RGBSpectrum.BLACK;
    }
}
```

For reference, here is also my shading function for matte objects under direct illumination.
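Before that code, for reference: when the rendering equation is written as an integral over the light's surface, the textbook per-sample contribution of a rectangular light with emitted radiance Le and area A is Le · cos θ · cos φ · A / d² (both cosines taken with unit direction vectors), which is then multiplied by the BRDF ρ/π and averaged over the light samples. A minimal sketch of that light term, with hypothetical names rather than my tracer's API:

```java
public class AreaLightTerm {
    // Per-sample light term for a rectangular area light: emitted radiance le,
    // area of the rectangle, distance d from shading point to the sample, and
    // the surface/light cosines cosTheta and cosPhi (unit direction vectors).
    static double lightTerm(double le, double area, double d,
                            double cosTheta, double cosPhi) {
        return le * cosTheta * cosPhi * area / (d * d);
    }

    public static void main(String[] args) {
        // unit-area light seen head-on at distance 2: 1 * 1 * 1 * 1 / 4
        System.out.println(lightTerm(1.0, 1.0, 2.0, 1.0, 1.0)); // 0.25
    }
}
```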
```java
public RGBSpectrum shade(Shading shading) {
    RGBSpectrum colour = shading.getColour();
    double L = 0;
    final Point intersection = shading.getIntersectionPoint().toPoint(shading.getRay());
    for (Light light : shading.getScene().getLights()) {
        double l = 0;
        for (Point sample : light.computeSamples()) {
            Vector wi = sample.subtract(intersection);
            double cos = shading.getIntersectionPoint().getNormal().dot(wi);
            if (cos > 0) {
                Ray shadowRay = new Ray(intersection, wi);
                if (!light.inShadow(shadowRay, shading.getScene())) {
                    if (light instanceof PointLight) {
                        double denom = 4 * Math.PI * wi.lengthSquared();
                        l += light.getPower() * cos / denom;
                    } else if (light instanceof RectangularLight) {
                        Rectangle rectangle = ((RectangularLight) light).getRectangle();
                        double cosp = -wi.dot(rectangle.getNormal());
                        l += light.getPower() * cos * cosp / (wi.lengthSquared() * rectangle.getArea());
                    }
                }
            }
        }
        L += l / light.getSamplesNumber();
    }
    return colour.scale(getAmbientReflection() + L * getDiffuseReflection());
}
```

Note: As a last desperate attempt to find the cause of my dark images, I started randomly changing things in my code. Funnily enough, I found that averaging over the square root of the number of rays per pixel (i.e. changing n * n into n in the code underneath) seems to give the correct brightness. Maybe this is a coincidence, because I don't use the number of rays per pixel anywhere else; or maybe I'm still overlooking something.
```java
// iterate over the contents of the tile
for (int y = tile.yStart; y < tile.yEnd; y++) {
    for (int x = tile.xStart; x < tile.xEnd; x++) {
        RGBSpectrum colour = scene.getBackground();
        Random PRNG = new Random(x * width + y);
        // iterate over the n x n subpixel grid
        for (int p = 0; p < n; p++) {
            for (int q = 0; q < n; q++) {
                // create a jittered ray through the subpixel
                double jitter1 = PRNG.nextDouble();
                double jitter2 = PRNG.nextDouble();
                double rayX = x + (p + jitter1) / n;
                double rayY = y + (q + jitter2) / n;
                Ray ray = scene.getCamera().generateRay(new Sample(rayX, rayY));
                // test the scene for intersections
                colour = colour.add(scene.getTracer().trace(ray));
            }
        }
        // average the colours
        colour = colour.scale(1.0 / (n * n));
        // add the colour contribution to the pixel
        buffer.getPixel(x, y).add(colour);
    }
}
```
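The per-pixel averaging can be checked in isolation: the double loop produces n · n samples, so 1/(n · n) is the unbiased divisor, and dividing by n instead would scale every pixel up by a factor of n. A self-contained sketch, assuming a toy pixel function f(x, y) = x · y whose true pixel mean is 0.25 (hypothetical class, not my renderer):

```java
public class JitteredAverageCheck {
    // Average n*n jittered (stratified) samples of f(x, y) = x * y over the
    // unit pixel [0,1) x [0,1); the exact mean of f over the pixel is 0.25.
    static double pixelMean(int n, long seed) {
        java.util.Random rng = new java.util.Random(seed);
        double sum = 0;
        for (int p = 0; p < n; p++) {
            for (int q = 0; q < n; q++) {
                double x = (p + rng.nextDouble()) / n;
                double y = (q + rng.nextDouble()) / n;
                sum += x * y;
            }
        }
        // there is one term per subpixel sample, hence n * n in the divisor
        return sum / (n * n);
    }

    public static void main(String[] args) {
        // 15 * 15 = 225 rays per pixel, as in the images above
        System.out.println(pixelMean(15, 7L));
    }
}
```

So at least for this toy integrand, dividing by n · n recovers the correct mean, which makes me suspect the brightness change I observed when dividing by n comes from somewhere else.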






