How to correctly use SCNSceneRenderer unprojectPoint in SceneKit on iOS (Swift)

I'm developing some SceneKit code on iOS, and in my code I want to determine the x and y coordinates on the global z plane, where z is 0.0 and x and y are determined by a tap gesture. My setup is as follows:
override func viewDidLoad() {
    super.viewDidLoad()

    // create a new scene
    let scene = SCNScene()

    // create and add a camera to the scene
    let cameraNode = SCNNode()
    let camera = SCNCamera()
    cameraNode.camera = camera
    scene.rootNode.addChildNode(cameraNode)
    // place the camera
    cameraNode.position = SCNVector3(x: 0, y: 0, z: 15)

    // create and add an ambient light to the scene
    let ambientLightNode = SCNNode()
    ambientLightNode.light = SCNLight()
    ambientLightNode.light.type = SCNLightTypeAmbient
    ambientLightNode.light.color = UIColor.darkGrayColor()
    scene.rootNode.addChildNode(ambientLightNode)

    let triangleNode = SCNNode()
    triangleNode.geometry = defineTriangle();
    scene.rootNode.addChildNode(triangleNode)

    // retrieve the SCNView
    let scnView = self.view as SCNView

    // set the scene to the view
    scnView.scene = scene

    // configure the view
    scnView.backgroundColor = UIColor.blackColor()
    // add a tap gesture recognizer
    let tapGesture = UITapGestureRecognizer(target: self, action: "handleTap:")
    let gestureRecognizers = NSMutableArray()
    gestureRecognizers.addObject(tapGesture)
    scnView.gestureRecognizers = gestureRecognizers
}

func handleTap(gestureRecognize: UIGestureRecognizer) {
    // retrieve the SCNView
    let scnView = self.view as SCNView
    // check what nodes are tapped
    let p = gestureRecognize.locationInView(scnView)
    // get the camera
    var camera = scnView.pointOfView.camera

    // screenZ is percentage between z near and far
    var screenZ = Float((15.0 - camera.zNear) / (camera.zFar - camera.zNear))
    var scenePoint = scnView.unprojectPoint(SCNVector3Make(Float(p.x), Float(p.y), screenZ))
    println("tapPoint: (\(p.x), \(p.y)) scenePoint: (\(scenePoint.x), \(scenePoint.y), \(scenePoint.z))")
}

func defineTriangle() -> SCNGeometry {

    // Vertices
    var vertices: [SCNVector3] = [
        SCNVector3Make(-2.0, -2.0, 0.0),
        SCNVector3Make(2.0, -2.0, 0.0),
        SCNVector3Make(0.0, 2.0, 0.0)
    ]

    let vertexData = NSData(bytes: vertices, length: vertices.count * sizeof(SCNVector3))
    var vertexSource = SCNGeometrySource(data: vertexData, semantic: SCNGeometrySourceSemanticVertex, vectorCount: vertices.count, floatComponents: true, componentsPerVector: 3, bytesPerComponent: sizeof(Float), dataOffset: 0, dataStride: sizeof(SCNVector3))

    // normals
    var normals: [SCNVector3] = [
        SCNVector3Make(0.0, 0.0, 1.0),
        SCNVector3Make(0.0, 0.0, 1.0),
        SCNVector3Make(0.0, 0.0, 1.0)
    ]

    let normalData = NSData(bytes: normals, length: normals.count * sizeof(SCNVector3))
    var normalSource = SCNGeometrySource(data: normalData, semantic: SCNGeometrySourceSemanticNormal, vectorCount: normals.count, floatComponents: true, componentsPerVector: 3, bytesPerComponent: sizeof(Float), dataOffset: 0, dataStride: sizeof(SCNVector3))

    // Indexes
    var indices: [CInt] = [0, 1, 2]
    var indexData = NSData(bytes: indices, length: sizeof(CInt) * indices.count)
    var indexElement = SCNGeometryElement(
        data: indexData, primitiveType: .Triangles, primitiveCount: 1, bytesPerIndex: sizeof(CInt)
    )

    var geo = SCNGeometry(sources: [vertexSource, normalSource], elements: [indexElement])

    // material
    var material = SCNMaterial()
    material.diffuse.contents  = UIColor.redColor()
    material.doubleSided = true
    material.shininess = 1.0;
    geo.materials = [material];

    return geo
}

As you can see, I have a triangle that is 4 units tall and 4 units wide, set on the z plane (z = 0) and centered at x, y (0.0, 0.0). The camera is the default SCNCamera, which looks in the negative z direction, and I have placed it at (0, 0, 15). The default values of zNear and zFar are 1.0 and 100.0, respectively. In my handleTap method, I take the x and y screen coordinates of the tap and try to find the x and y global scene coordinates where z = 0.0. I'm calling unprojectPoint for this.

The documentation for unprojectPoint says:

Unprojecting a point whose z-coordinate is 0.0 returns a point on the
near clipping plane; unprojecting a point whose z-coordinate is 1.0
returns a point on the far clipping plane.

While it doesn't specifically say that points between the near and far planes follow a linear relationship, I have made that assumption and computed screenZ as the percentage of the distance between the near and far planes at which the z = 0 plane sits. To check my answer, I can tap near the corners of the triangle, since I know where those are in global coordinates.

My problem is that I'm not getting the correct values, and I don't get consistent values when I start changing the zNear and zFar clipping planes on the camera. So my question is, how should I do this? In the end, I want to create a new piece of geometry and place it on the z plane at the location corresponding to where the user tapped.

Thanks in advance for your help.

Solution

Typical depth buffers in a 3D graphics pipeline are not linear. Perspective division causes depths in normalized device coordinates to be on a different scale. (See also here.)
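To get a feel for how far off the linear assumption is, here is a rough sketch in the same Swift 1-era syntax as the code above. The exact formula is an assumption about a GL-style perspective depth mapping rather than documented SceneKit behavior, but it shows the scale difference for this scene (camera at z = 15, zNear = 1, zFar = 100):

// Rough sketch; assumes a GL-style perspective depth mapping, which is an
// assumption about SceneKit's internals, not documented behavior.
let n: Float = 1.0        // zNear
let f: Float = 100.0      // zFar
let zEye: Float = -15.0   // eye-space z of the z = 0 plane for a camera at z = 15

// The question's assumption: linear interpolation between the near and far planes
let linearDepth = (-zEye - n) / (f - n)                              // ≈ 0.141

// Perspective projection: z_ndc = (f + n)/(f - n) + 2*f*n / ((f - n) * zEye)
let zNDC = (f + n) / (f - n) + (2.0 * f * n) / ((f - n) * zEye)
let actualDepth = (zNDC + 1.0) / 2.0                                 // ≈ 0.943

println("linear: \(linearDepth)  actual: \(actualDepth)")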

So the z coordinate you're feeding into unprojectPoint isn't actually the one you want.

So how do you find the normalized depth coordinate that matches a plane in world space? It helps if that plane is orthogonal to the camera's view direction, which yours is. Then all you need to do is project a point on that plane:

let projectedOrigin = gameView.projectPoint(SCNVector3Zero)

Now you have the position of the world origin in 3D view space plus normalized depth. To map other points in 2D view space onto this plane, use the z coordinate from this vector:

let vp = gestureRecognizer.locationInView(scnView)
let vpWithZ = SCNVector3(x: vp.x, y: vp.y, z: projectedOrigin.z)
let worldPoint = gameView.unprojectPoint(vpWithZ)

This gets you the click/tap location mapped onto the z = 0 plane in world space, suitable for use as a node's position if you want to show that location to the user.
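Putting the two calls together, a handleTap along these lines would place a marker at the tapped location on the z = 0 plane. This is only a sketch in the same Swift 1-era syntax as the question's code; the small sphere used as a marker is a hypothetical choice, not something from the answer above:

func handleTap(gestureRecognize: UIGestureRecognizer) {
    let scnView = self.view as SCNView

    // Normalized depth of the z = 0 plane (the world origin lies on it)
    let projectedOrigin = scnView.projectPoint(SCNVector3Zero)

    // Tap location in view space, combined with that depth, unprojected into world space
    let p = gestureRecognize.locationInView(scnView)
    let vpWithZ = SCNVector3(x: Float(p.x), y: Float(p.y), z: projectedOrigin.z)
    let worldPoint = scnView.unprojectPoint(vpWithZ)

    // Drop a small marker node at the tapped location on the z = 0 plane
    let markerNode = SCNNode(geometry: SCNSphere(radius: 0.1))
    markerNode.position = worldPoint
    scnView.scene!.rootNode.addChildNode(markerNode)
}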

(Note that this approach only works as long as you're mapping onto a plane perpendicular to the camera's view direction. If you want to map view coordinates onto a differently oriented surface, the normalized-depth value in vpWithZ won't be constant.)
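If you do need a differently oriented plane, one general workaround (a sketch of the usual ray-casting idea, not part of the answer above) is to unproject the tap at depths 0.0 and 1.0, exactly as the documentation quoted earlier describes, to get a world-space ray from the near to the far clipping plane, and then intersect that ray with your plane yourself. The helper below, worldPointOnPlane, is hypothetical:

// Hypothetical helper: intersect the tap ray with a plane given by a point and a normal.
// Returns nil when the ray is (nearly) parallel to the plane.
func worldPointOnPlane(view: SCNView, tap: CGPoint, planePoint: SCNVector3, planeNormal: SCNVector3) -> SCNVector3? {
    // Per the documentation: z = 0.0 unprojects to the near plane, z = 1.0 to the far plane
    let nearPoint = view.unprojectPoint(SCNVector3(x: Float(tap.x), y: Float(tap.y), z: 0.0))
    let farPoint = view.unprojectPoint(SCNVector3(x: Float(tap.x), y: Float(tap.y), z: 1.0))

    // Ray direction from near to far plane
    let dir = SCNVector3(x: farPoint.x - nearPoint.x, y: farPoint.y - nearPoint.y, z: farPoint.z - nearPoint.z)
    let denom = dir.x * planeNormal.x + dir.y * planeNormal.y + dir.z * planeNormal.z
    if abs(denom) < 1e-6 { return nil }

    // Ray parameter t where the ray meets the plane
    let diff = SCNVector3(x: planePoint.x - nearPoint.x, y: planePoint.y - nearPoint.y, z: planePoint.z - nearPoint.z)
    let t = (diff.x * planeNormal.x + diff.y * planeNormal.y + diff.z * planeNormal.z) / denom
    return SCNVector3(x: nearPoint.x + t * dir.x, y: nearPoint.y + t * dir.y, z: nearPoint.z + t * dir.z)
}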
