The Driving Simulator Lab was founded in early 2005 and is located at the Smart House in the Oak Hammock community at the University of Florida.  The Lab was upgraded in Fall 2016 with a new system from Realtime Technologies Inc. (RTI).


The simulator is a full car cab (4-door sedan) with seven visual channels.  The three forward channels create a 180-degree field-of-view (FOV).  This wide FOV is achieved by joining three flat screens, each driven by a high-resolution LCoS projector.  The rear scene is also projected on a flat screen and viewed through the in-cab rear-view mirror.  The side-view mirrors and a virtual dash are simulated with LCD panels.  Together, the visual channels create an immersive and realistic driving experience.


A 5.1-channel audio system external to the car cab provides environmental sounds such as traffic, passing vehicles, and road noise.  An audio system internal to the car cab provides engine sounds and vibrations, as well as pre-programmed voice commands and any other scripted sounds.

Car Cab

A 4-door sedan allows the driver to operate normal accelerator, brake, steering, transmission-selection, and signaling controls, with the simulator responding accordingly.  Longitudinal and lateral movement allows the driver to speed up, slow down, come to a halt, and steer laterally, including lane changes and changes of direction at intersections.  All driver inputs are handled by software that interfaces with the electronics in the car cab.  For example, a high-fidelity control loading system provides feedback on the steering wheel that is directly coupled with the vehicle dynamics, sampled at 2000 Hz.
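As a rough illustration of how a control loading system couples steering feedback to vehicle dynamics, the sketch below models wheel torque as a sum of centering, damping, and road-feel terms.  All function names, field names, and gain values here are hypothetical assumptions for illustration only; they are not RTI's actual API.

```javascript
// Hypothetical sketch of a self-aligning steering torque model.
// Names and gains are illustrative assumptions, not the simulator's API.
function steeringFeedbackTorque(state, gains) {
  // Centering torque grows with steering angle (self-aligning effect)
  const centering = -gains.center * state.steerAngleRad;
  // Damping opposes steering rate to suppress oscillation
  const damping = -gains.damping * state.steerRateRadPerSec;
  // Lateral acceleration adds road feel proportional to cornering load
  const roadFeel = -gains.feel * state.lateralAccelMps2;
  return centering + damping + roadFeel;
}

// At a 2000 Hz sample rate, each control-loading update spans 0.5 ms
const DT = 1 / 2000;
```

In a real control loading loop, a torque like this would be recomputed at each 0.5 ms step from the current vehicle state and applied to the wheel by a motor.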

Operator Station

A control area situated to the rear of the vehicle overlooks the driver, vehicle and projection screens. At this workstation, the center visual channel is duplicated and a control monitor allows the experimenter to set parameters for each trial and to monitor the driver’s speed and other variables. Two-way communication is maintained via an intercom system between the vehicle and workstation.

Autonomous Vehicle Mode

Once autonomous mode is engaged, the vehicle is controlled through SimDriver™ JavaScript and stops accepting input from the driver. The scripting allows a range of conditions to be controlled by the system as part of a fully automated scenario. SimDriver™ keeps the vehicle on the road and maintains a specified headway distance or desired velocity.
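The headway-keeping behavior can be sketched as a simple speed-command rule: track the desired velocity when no lead vehicle is in range, otherwise adjust toward the lead vehicle's speed to close the headway error.  This is an illustrative sketch only; the function name, parameters, and gain are assumptions and do not reflect SimDriver™'s actual scripting interface.

```javascript
// Hypothetical headway/velocity logic an automated scenario might use.
// leadSpeed: lead vehicle speed (m/s); gap: current headway distance (m),
// or null if no lead vehicle is in range; desired: scenario parameters.
function commandedSpeed(leadSpeed, gap, desired) {
  // With no lead vehicle in range, track the desired velocity.
  if (gap === null) return desired.velocity;
  // Otherwise adjust speed to close on the specified headway distance.
  const error = gap - desired.headway;         // positive: too far back
  const adjusted = leadSpeed + desired.gain * error;
  return Math.min(adjusted, desired.velocity); // never exceed set speed
}
```

A scenario script would call a rule like this each simulation step and pass the result to the vehicle dynamics as the speed target.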

Measurement and Driver Performance

Empirical data can be captured and plotted within the simulator software. The specific data collected depend on the driving scenario being used and the assessment goals.  Examples of data that can be collected include:

  • Acceleration (longitudinal and lateral), accelerator pedal position, brake pedal force
  • Steering wheel position
  • Gear, engine RPM
  • Heading, heading error, headway distance and time
  • Tailway distance and time
  • Lane number and offset, road offset
  • Velocity (longitudinal, lateral, and vertical)
  • Position (X, Y, Z)
  • Roll, pitch, yaw
  • Tire slip (each)
  • User-defined
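Several of these measures are simple derivations from others; for instance, headway time is headway distance divided by longitudinal velocity.  The sketch below shows that post-processing step on a logged sample.  The field names are assumptions for illustration and are not the simulator's actual output schema.

```javascript
// Illustrative post-processing of a logged sample: deriving time headway
// (seconds to reach the lead vehicle's position at current speed) from
// headway distance and longitudinal velocity. Field names are assumed.
function timeHeadway(sample) {
  // A stopped or reversing vehicle has no meaningful time headway.
  if (sample.velocityLong <= 0) return Infinity;
  return sample.headwayDistance / sample.velocityLong;
}
```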

Additional data outputs can be created through scripting or the integration of external equipment, such as EEG, GSR, or O2 saturation.

Furthermore, the center visual channel can be recorded to a video file for playback.