Why does my ray tracer produce so much distortion at the edges?

Problem description

I am writing a ray tracer from scratch. The example renders two spheres using ray-sphere intersection. When the spheres are near the center of the screen, they look fine. However, when I move the camera, or reposition the spheres so they are closer to the edges, they become distorted.


Here is the ray-casting code:

void Renderer::RenderThread(int start, int span)
{

    // pCamera holds the position, rotation, and fov of the camera
    // prendertarget is the screen to render to

    // calculate the camera-space to world-space matrix
    Mat4 camSpaceMatrix = Mat4::Get3DTranslation(pCamera->position.x, pCamera->position.y, pCamera->position.z) *
        Mat4::GetRotation(pCamera->rotation.x, pCamera->rotation.y, pCamera->rotation.z);

    // use the camera's origin as the ray's origin
    Vec3 origin(0, 0, 0);
    origin = (camSpaceMatrix * origin.Vec4()).Vec3();

    // this for loop loops over all the pixels on the screen
    for ( int p = start; p < start + span; ++p ) {

        // get the pixel coordinates on the screen
        int px = p % prendertarget->GetWidth();
        int py = p / prendertarget->GetWidth();

        // in ray tracing, ndc space is from [0,1]
        Vec2 ndc((px + 0.75f) / prendertarget->GetWidth(), (py + 0.75f) / prendertarget->GetHeight());

        // in ray tracing, screen space is [-1,1]
        Vec2 screen(2 * ndc.x - 1, 1 - 2 * ndc.y);

        // scale x by aspect ratio
        screen.x *= (float)prendertarget->GetWidth() / prendertarget->GetHeight();

        // scale screen by the field of view
        // fov is currently set to 90
        screen *= tan((pCamera->fov / 2) * (PI / 180));

        // screen point is the pixel's point in camera space;
        // give it a z value of -1
        Vec3 camSpace(screen.x, screen.y, -1);
        camSpace = (camSpaceMatrix * camSpace.Vec4()).Vec3();

        // the ray's direction is its point on the camera's viewing plane
        // minus the camera's origin
        Vec3 dir = (camSpace - origin).normalized();

        Ray ray = { origin, dir };

        // find where the ray intersects with the spheres
        // using the ray-sphere intersection algorithm
        Vec4 color = TraceRay(ray);
        prendertarget->PutPixel(px, py, color);
    }

}

The FOV is set to 90. I have seen other people run into this problem, but in those cases they were using very high FOV values; I don't think 90 should cause an issue. The problem persists even when the camera is not moved at all: any object near the edge of the screen is distorted.

Solution

When in doubt, you can always check what other renderers are doing. I always compare my results and settings against Blender. For example, Blender 2.82's default field of view is 39.6 degrees.

I would also point out that this line is wrong:

 Vec2 ndc((px + 0.75f) / pRenderTarget->GetWidth(),(py + 0.75f) / pRenderTarget->GetHeight());

If you want the center of the pixel, it should be 0.5f:

 Vec2 ndc((px + 0.5f) / pRenderTarget->GetWidth(),(py + 0.5f) / pRenderTarget->GetHeight());

Also, and this is admittedly a nitpick, your intervals are open, not closed (as stated in your source comments). The image-plane coordinates never quite reach 0 or 1, and the camera-space coordinates are never exactly -1 or 1. Ultimately, the image-plane coordinates are derived from pixel coordinates, which lie in the half-open intervals [0, width) and [0, height).

Good luck with your ray tracer!