So I'm building a quadcopter and have started programming the basic controls for it. My goal is to make it autonomous using sensors and computer vision.
The sensor handling and the optical flow I can develop on the flight controller itself, but for computer vision (and preferably ROS) I need a full-size computer or an SBC.
For the SBC I was considering the NanoPi NEO Core2 because it's only 40 mm x 40 mm, but I'm wondering whether it's possible to do the CV and ROS computing on a laptop or smartphone instead, and just send the resulting commands to the flight controller to perform the actions.
So basically the quadcopter itself would carry only the flight controller with the basic controls, the sensors, and a camera streaming video to a laptop or smartphone; the ground station would do all the computing and send the actions back to the quadcopter. I hope that's understandable.
The drone frame is around 110-130 mm on the diagonal.
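To make the idea concrete, here's a minimal sketch of the offboard loop I have in mind, using plain UDP on loopback. Everything here is hypothetical for illustration: the ports, the frame format (a 2-byte width header plus raw grayscale bytes), and the stub "CV" (steer toward the brightest column) are all made up, not a real protocol.

```python
# Hypothetical offboard-processing loop: the quadcopter streams frames over
# UDP, the ground station runs the vision code and sends a command back.
# Ports, frame format, and the stub detector are assumptions, not a standard.
import socket
import struct

GROUND_ADDR = ("127.0.0.1", 14550)   # ground station (laptop/phone)
DRONE_ADDR = ("127.0.0.1", 14551)    # companion link on the quadcopter

def ground_station_step(sock: socket.socket) -> None:
    """Receive one frame, run a (stub) detector, send a yaw command back."""
    frame, drone = sock.recvfrom(65535)
    width = struct.unpack(">H", frame[:2])[0]   # 2-byte width header
    pixels = frame[2:]                          # flat grayscale bytes
    # Stub "CV": steer toward the brightest pixel column.
    col_sums = [sum(pixels[c::width]) for c in range(width)]
    target = col_sums.index(max(col_sums))
    yaw = (target - width // 2) / (width // 2)  # normalized to -1.0 .. 1.0
    sock.sendto(struct.pack(">f", yaw), drone)

def drone_send_frame(sock: socket.socket, width: int, pixels: bytes) -> float:
    """Quadcopter side: send one frame, block until the yaw command arrives."""
    sock.sendto(struct.pack(">H", width) + pixels, GROUND_ADDR)
    cmd, _ = sock.recvfrom(4)
    return struct.unpack(">f", cmd)[0]
```

In a real build the drone side would be the flight controller's companion link (e.g. MAVLink over telemetry radio or WiFi) rather than raw UDP, but the shape of the loop is the same: frame out, command in, every cycle.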
Just saw this, but it sounds pretty feasible to me. The only downside of having the ground station do the processing is the extra latency and the noise in the video signal. You may be able to train a neural network (or otherwise find/write an algorithm) to remove the static and then pass the clean frames along to the object-detection and control stages, but that adds another layer of processing time and latency.
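As one cheap alternative to a trained network, a per-pixel temporal median over the last few frames knocks out sparse analog-video static before detection. This is just a sketch under assumed conventions (frames as flat lists of grayscale ints); a real pipeline would do the same thing vectorized on image arrays.

```python
# Sketch: per-pixel temporal median filter to suppress sparse video static.
# Frames are assumed to be flat lists of grayscale ints, for illustration.
from collections import deque
from statistics import median

class TemporalMedianFilter:
    def __init__(self, window: int = 3):
        # Ring buffer of the most recent `window` frames.
        self.frames: deque = deque(maxlen=window)

    def push(self, frame: list) -> list:
        """Add a frame; return the per-pixel median of the buffered frames.

        A static burst corrupts a pixel in only one frame, so the median
        across the window rejects it as long as most frames are clean.
        """
        self.frames.append(frame)
        return [int(median(px)) for px in zip(*self.frames)]
```

Note this illustrates the reply's caveat directly: a window of N frames delays the signal by roughly N/2 frames, so the denoising itself contributes to the latency budget.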