Camera position viewing 3D stimulus

Hi, I want to present 3D stimuli for an experiment. I have two monitors placed vertically, with a 120-degree angle between them so that together they cover a large field of view. My experiment requires the subject’s eyes to be at the height of the bottom edge of the monitor, not at the vertical center. The function “gluPerspective( fovy, aspect, zNear, zFar )” assumes a default eye position and looking direction: the eye at both the vertical and horizontal center, looking straight ahead. In my case, since the eye is not at the “center”, the scene is viewed from below, looking up, and the 3D stimuli become distorted when viewed this way. Is there a way to set the camera at a different position (by position I mean the actual location of the eye or camera relative to the monitor) so that the 3D stimuli stay undistorted?

To solve this, translate your scene so that its center is right where you want it to be. For example, if your eye is 30 cm below the center of the screen, move your scene up by 30 cm.
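
For illustration, a minimal fixed-function OpenGL sketch of that idea, assuming world units are centimeters and a hypothetical drawScene() routine; the fov and clip values are placeholders:

    #include <GL/gl.h>
    #include <GL/glu.h>

    void drawScene(void);  /* hypothetical routine that draws the stimulus */

    /* Render with the whole scene shifted up by the eye offset, as suggested
       above. Assumes world units are centimeters and a current GL context. */
    void renderShifted(double aspect)
    {
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        gluPerspective(60.0, aspect, 1.0, 1000.0);  /* placeholder fov and clip planes */

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslatef(0.0f, 30.0f, 0.0f);  /* move the scene up by 30 cm */
        drawScene();
    }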

I don’t really understand the description of your setup, but maybe what you are looking for is gluLookAt() for setting up the initial modelview matrix?
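
In case it helps, a small sketch of what that could look like; the coordinates are placeholders assuming centimeters, with the eye 30 cm below screen-center height and looking straight ahead:

    /* Placeholder camera setup with gluLookAt(): eye 30 cm below
       screen-center height, looking straight ahead. */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, -30.0,    0.0,    /* eye position */
              0.0, -30.0, -100.0,    /* look-at point straight ahead */
              0.0,   1.0,    0.0);   /* up vector */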

I do use gluLookAt() to move it up or down; perhaps the problem is my method. I use 3 monitors to present 3D visual stimuli covering a 270-degree FOV. Since I don’t want to (and don’t know how to) perform a geometric transformation, I set the front monitor to present a 90-degree front view, and the other 2 monitors are placed on the two sides to present 90-degree left and right views. So when the subject sits in the middle (the stereocenter), the setup shows 3D stimuli over a 270-degree FOV without any geometric transformation. It looks good when the eye is right at the stereocenter point; however, if the eye is not at the stereocenter, the stimuli around the corner where two screens meet become distorted. How to solve this issue is my question. I know how to geometrically transform a 2D stimulus into 3D, but I have no idea how to geometrically transform an already-3D stimulus.
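
For reference, the per-monitor rendering described above could look roughly like the sketch below: three views from the same stereocenter, each covering a 90-degree horizontal slice, one per monitor. The single-framebuffer viewport layout and the drawScene() routine are assumptions.

    #include <GL/gl.h>
    #include <GL/glu.h>
    #include <math.h>

    void drawScene(void);  /* hypothetical routine that draws the stimulus */

    /* Sketch of the described setup: three 90-degree views sharing one eye
       point (the stereocenter), one per monitor. Assumes the three monitors
       are driven as one wide framebuffer, each w x h pixels. */
    void renderThreeViews(int w, int h)
    {
        const float headings[3] = { -90.0f, 0.0f, 90.0f };  /* left, front, right */
        double aspect = (double)w / (double)h;
        /* Pick fovy so the horizontal field of view is exactly 90 degrees:
           tan(horizontal/2) = aspect * tan(fovy/2)  =>  fovy = 2*atan(1/aspect). */
        double fovyDeg = 2.0 * atan(1.0 / aspect) * 180.0 / 3.14159265358979323846;

        for (int i = 0; i < 3; ++i) {
            glViewport(i * w, 0, w, h);  /* one monitor per horizontal slice */

            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();
            gluPerspective(fovyDeg, aspect, 1.0, 1000.0);  /* placeholder clip planes */

            glMatrixMode(GL_MODELVIEW);
            glLoadIdentity();
            glRotatef(headings[i], 0.0f, 1.0f, 0.0f);  /* turn the view toward this monitor */
            drawScene();
        }
    }

Every view here is computed from the single stereocenter eye point, which is why the three images only join up correctly when the viewer’s eye is exactly there.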