Welcome to the forum.
I would personally recommend Python; you can then leverage the many image-processing libraries such as OpenCV etc.
I did something similar with PiDog, ROS and SLAM a couple of years ago. The final SLAM maps are shown on the Facebook page in the link above.
I've also shared a lot of my code for other autonomous behaviours in various places on this forum, so you should be able to search for it and take a look if you're interested.
I found that the RPi 4 plus PiDog was at the limit of being able to process images and run SLAM for real-time navigation, and adding AI on top then caused overheating. I ended up using ROS messaging and offloading some of the processing to an external computer, which did work. I think the much faster, more mobile PiCar will really struggle in real time.
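To show the offload idea without pulling in ROS itself, here's a minimal plain-TCP sketch of the same pattern: the robot ships a frame reference to a workstation and blocks for the result. All the names and the fake "obstacle" reply are invented for illustration; in my setup this was ROS topics, not raw sockets.

```python
# Minimal sketch of the offload pattern (plain TCP sockets, not actual ROS).
# The robot sends a payload to a workstation, which replies with a result.
import json
import socket
import threading

def serve_once(srv):
    """Workstation side: accept one connection, reply with a fake result."""
    conn, _ = srv.accept()
    with conn:
        payload = json.loads(conn.makefile("r").readline())
        # Pretend a heavy recognition model ran here and found something.
        reply = {"frame": payload["frame"], "label": "obstacle"}
        conn.sendall((json.dumps(reply) + "\n").encode())
    srv.close()

def robot_request(frame_id, port):
    """Robot side: ship a frame reference and block for the result."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        conn.sendall((json.dumps({"frame": frame_id}) + "\n").encode())
        return json.loads(conn.makefile("r").readline())

srv = socket.create_server(("127.0.0.1", 0))   # port 0: pick any free port
port = srv.getsockname()[1]
t = threading.Thread(target=serve_once, args=(srv,))
t.start()
result = robot_request(42, port)
t.join()
print(result)  # {'frame': 42, 'label': 'obstacle'}
```

The point is only the shape of the loop: the Pi stays cool doing cheap sensing while the heavy inference happens elsewhere.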
So I decided to move away from complex AI-style behaviours and simplify. Subsumption architecture was my first attempt at dumbing things down.
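In case subsumption is new to you, here's a toy sketch of the core idea: behaviours are checked in priority order, and the first one that fires suppresses everything below it. The sensor fields and thresholds here are made up for illustration, not from my actual robot.

```python
# Toy subsumption sketch: layered behaviours, highest priority first.
# The first layer that returns an action suppresses the layers below it.

def avoid(sensors):
    # Highest priority: back off if the ultrasonic range is too short.
    if sensors["range_cm"] < 10:
        return "reverse"
    return None

def escape_light(sensors):
    # Middle priority: turn away from bright light.
    if sensors["light"] > 0.8:
        return "turn_away"
    return None

def wander(sensors):
    # Lowest priority: default behaviour when nothing else fires.
    return "forward"

LAYERS = [avoid, escape_light, wander]  # priority order

def arbitrate(sensors):
    for behaviour in LAYERS:
        action = behaviour(sensors)
        if action is not None:   # this layer subsumes the ones below
            return action

print(arbitrate({"range_cm": 50, "light": 0.2}))  # -> forward
print(arbitrate({"range_cm": 5, "light": 0.2}))   # -> reverse
```

Because each layer is a cheap pure function of the latest sensor readings, the whole loop runs comfortably in real time on the Pi.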
My final strategy was a combination of insect behavioural models inspired by the work of, for example, Barbara Webb at Edinburgh University, and Barnard machines. This all runs much faster. My main navigation was optical flow plus ultrasonics, which is fast, coupled to slower AI recognition that runs only when the robot is stationary. I also had to add a magnetic I2C IMU.
Hopefully the above gives you a few ideas, and I'm sure others will have their own thoughts.
It's a really interesting project, and I wish you the best of luck.