The successful navigation of mobile robots depends on robust position and velocity information. In particular, undamped systems, such as micro aerial vehicles (MAVs) during vertical takeoff and landing, require sensors with a sufficient update rate and low latency to maintain tracking during operation.
In recent years, optical flow sensors based on computer mouse sensors have been successfully used for this purpose [6]. Facing the ground, these sensors provide accurate velocity measurements and, with integration, position measurements. However, mouse sensors require strong lighting to deliver accurate results. This issue can be alleviated with onboard active lighting in the infrared range, such as high-brightness red LEDs, but doing so conflicts with limitations on power consumption and ground distance. Automotive CMOS image sensors are substantially more light-sensitive and allow operation in indoor environments and during adverse outdoor conditions without artificial lighting. However, to our knowledge there is no CMOS-based, lightweight sensor available that could be easily integrated into a robotics research system. The Parrot AR.Drone has an onboard camera and computes optical flow in an embedded Linux environment [2], but both the hardware design and the software implementation are closed source and can only be modified within certain bounds.

In this work, we present PX4FLOW, an ARM Cortex M4 based sensor system that performs optical flow processing at 250 frames per second at a subsampled resolution of 64x64 pixels using a CMOS machine vision sensor. An ultrasonic range sensor is used to measure the distance to the scene and to scale optical flow values to metric velocity values. Angular velocity is
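The scaling of pixel optical flow to metric velocity using the measured ground distance can be sketched with the standard pinhole camera model. This is an illustrative sketch, not the sensor's actual firmware: the function name, the focal length, and the frame-rate values are assumptions, and gyroscope-based rotation compensation is omitted.

```python
# Hypothetical sketch: convert per-frame optical flow (pixels) into a
# metric ground velocity (m/s), assuming a pinhole camera model.
# All names and numeric values below are illustrative assumptions.

def flow_to_metric_velocity(flow_px: float,
                            distance_m: float,
                            focal_length_px: float,
                            frame_rate_hz: float) -> float:
    """Ground displacement per frame = flow_px * distance / focal_length;
    multiplying by the frame rate yields velocity in m/s."""
    ground_disp_m = flow_px * distance_m / focal_length_px
    return ground_disp_m * frame_rate_hz

# Example: 2 px of flow per frame at 1.5 m altitude,
# assumed focal length of 320 px, 250 frames per second.
v = flow_to_metric_velocity(2.0, 1.5, 320.0, 250.0)  # 2.34375 m/s
```

Note how the same pixel flow corresponds to a larger metric velocity at greater ground distance, which is why the ultrasonic range measurement is needed for metric scaling.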