Specifications outline the requirements for acquiring a turnkey laboratory system that integrates a physical scale-model autonomous vehicle platform, a matching high-fidelity digital twin environment, control and analysis software, and curriculum materials. The system will support both in-person and virtual instruction in autonomous vehicle theory and practice.

Specifications include, but are not limited to:
• Quanser QCar 2 platform (including Self-Driving Car Studio and Virtual QCar 2), or equivalent.

2. Objectives
• Provide students with hands-on experience using a professional-grade autonomous vehicle research platform.
• Enable parallel learning through a detailed digital twin when hardware access is limited.
• Support a semester-long undergraduate or graduate course in autonomous vehicles, covering perception, localization, planning, control, and V2X communications.
• Ensure compatibility between physical and virtual systems for seamless curriculum integration.

3. Technical Requirements – The following specifications are mandatory:

3.1 Physical Vehicle Platform
• Scale: Car length between 30 cm and 50 cm.
• Onboard Compute: NVIDIA Jetson™ AGX Orin (or equivalent in terms of CPU processing power, amount and speed of RAM, and capability of AI/ML acceleration hardware).
• Sensors:
  o 360° RGB cameras
  o RGB-D camera
  o LIDAR unit
  o IMU and wheel encoders
  o Audio and environmental sensors
• Power: Rechargeable LiPo battery pack with charger and battery level indicator. Operating time must be at least 20 min on a single charge. Two batteries must be supplied.
• Connectivity: Wi-Fi 6 and Gigabit Ethernet.

3.2 Self-Driving Car Environment (Physical)
• High-performance preconfigured PC (testbed and infrastructure server).
• Three monitors (resolution at least 1920 × 1080 pixels).
• Game controller for manual override.
• Two floor maps (not smaller than 4 m × 2 m and not larger than 6 m × 3 m).
• Environment border panels for branding/environment demarcation.
• At least four (4) programmable traffic lights with power sources.
• Scale signage and traffic pylons.
• Wireless router.
• Real-time control framework (e.g., QUARC™ or equivalent).
• Multi-language API support: Python, ROS 2, C++, MATLAB/Simulink (or equivalent); see the Python/ROS 2 sketch following this section.
• Supports ML frameworks: TensorFlow, PyTorch (or equivalent).
• Supports computer vision libraries: OpenCV, GStreamer, VPI (or equivalent).
• Software supports GPU acceleration for image processing tasks; see the smoke-test sketch following this section.
• Software installation via containerization (Docker) or via automated software package management tools (e.g., rpm, deb, or equivalent).
• Supports virtual environment integration with existing software (Gazebo).
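As context for the multi-language API requirement, the following is a minimal sketch, not vendor-supplied code, of the kind of Python/ROS 2 access the delivered platform is expected to allow. The topic name '/camera/color/image_raw' and the node name are illustrative assumptions; the actual interface exposed by the vehicle may differ.

# Minimal Python/ROS 2 (rclpy) sketch: subscribe to a camera stream and
# report frame arrival. Topic name is a hypothetical placeholder.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class CameraMonitor(Node):
    """Count incoming camera frames to confirm the Python API is usable."""

    def __init__(self):
        super().__init__('camera_monitor')
        # '/camera/color/image_raw' is an assumed topic name for illustration.
        self.subscription = self.create_subscription(
            Image, '/camera/color/image_raw', self.on_frame, 10)
        self.frames = 0

    def on_frame(self, msg: Image):
        self.frames += 1
        if self.frames % 30 == 0:
            self.get_logger().info(
                f'{self.frames} frames received ({msg.width}x{msg.height})')


def main():
    rclpy.init()
    node = CameraMonitor()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()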
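Similarly, the ML framework, computer vision, and GPU acceleration requirements could be exercised with a short smoke test along the following lines. This is a sketch under stated assumptions (camera exposed at device index 0, CUDA-capable acceleration hardware visible to PyTorch), not an acceptance procedure.

# Smoke-test sketch for the ML/CV/GPU requirements; device index 0 is an assumption.
import cv2
import torch

# Confirm the AI/ML acceleration hardware is visible to the ML framework.
print('CUDA available:', torch.cuda.is_available())

# Grab one frame from a camera and run a basic OpenCV operation on it.
cap = cv2.VideoCapture(0)            # hypothetical camera index
ok, frame = cap.read()
cap.release()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)  # simple edge-detection check
    cv2.imwrite('edges.png', edges)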