You will need: (1) Arduino Uno - knockoffs will work just fine. (1) Breadboard - for this project I took the +/- rail from one breadboard and used another, smaller breadboard; any size will do. (5) HC-SR04 ultrasonic sensors. (1) Potentiometer - used to control the speed of the car.

Driverless cars have their cameras trained on the road - and on those inside - making some wonder how that data will be used.

Intelligent Automation (IA) in automobiles combines robotic process automation and artificial intelligence, enabling digital transformation in autonomous vehicles. IA can completely replace human drivers, with better safety and more intelligent movement of vehicles. This work surveys recent methodologies that use artificial intelligence and machine learning, and provides a comparative analysis of them.

Self-driving vehicles employ a wide range of technologies - radar, cameras, ultrasound, and radio antennas - to navigate safely on our roads. In modern autonomous vehicles these technologies are used in conjunction with one another, as each provides a layer of autonomy that helps make the entire system more reliable and robust. LIDAR combines laser light pulses with other information captured by a vehicle to generate a three-dimensional view of the vehicle's surroundings, for instance the street on which the vehicle is traveling. The MonoCon technique recovers 3-D information from 2-D images by placing a "bounding box" around objects in the vehicle's surroundings.

Mainly two types of cameras are used in AVs: monocular and stereo. Monocular cameras provide a 2-D array of pixels that contains detailed information about the car's environment; their main disadvantage is the lack of depth perception. Stereo cameras capture two offset views, and the depth information recovered from them is used to determine an object's size and location. Self-driving tech leaders such as General Motors, Waymo, and Mercedes-Benz all rely on LiDAR and radar sensors, but not Tesla.
The Texas-based automaker used both radar and cameras to make its Autopilot semi-autonomous driving system possible, but in May 2021 it announced that it was dropping radar from the Model 3 and Model Y in North America.

The platform used by the V-Charge project is a VW Golf VI car modified for vision-guided autonomous driving. As shown in Figure 2, four fisheye cameras are used to build a multi-camera system. Each camera has a nominal FOV of 185° and outputs 1280×800 images at 12.5 frames per second (fps).

As one of the keys to autonomous technology, environmental perception adds eyes to autonomous cars: a variety of on-board sensors accurately perceive the surrounding environment to ensure safe and reliable driving. At present, the most commonly used on-board sensors are LiDAR, radar, and vision cameras.

With millions of camera-equipped cars sold across the world, Tesla is in a strong position to collect the data required to train its computer vision deep learning models. The Tesla self-driving team has accumulated 1.5 petabytes of data consisting of one million 10-second videos and 6 billion objects annotated with bounding boxes, depth, and velocity.