Image buffer size is too short
-
Hello,
We are currently working on an AI model for classifying images of animals. We are supposed to take a photo with the camera, feed it to the model to get a class, and save the image to the SD card. We started doing this on Arduino, but we are facing a big problem:
Our model needs 224*224 images (MobileNet v1). So we started taking photos in YUV422 format, so that we can convert them to RGB565 for the model and to JPEG to save them on the SD card.
But the problem is: the buffer from the camera is of type uint8_t, so it is not enough to store the image in YUV format. And the conversions only go from YUV to other formats (RGB565, JPEG, and grayscale). So is it not possible to take photos of more than 2⁸ pixels in YUV?
Or is there another way to do the conversion? Thanks
Simon
-
Hey, @sisimonis-5-1-1!
Have you taken a look at the Arduino camera example?
The camera has two functions (sketched in code right below):
- a video stream function to get the camera preview image
- a capture picture function to get a high-resolution JPEG image
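A minimal sketch of both paths, based on the standard Spresense camera example (the file name and still-picture size are just placeholder choices):

```cpp
#include <Camera.h>
#include <SDHCI.h>

SDClass theSD;

// Preview path: the camera calls this for every streamed frame
// (QVGA 320x240, YUV422 by default).
void CamCB(CamImage img) {
  if (img.isAvailable()) {
    // process the preview frame here
  }
}

void setup() {
  theSD.begin();
  theCamera.begin();                      // video stream: QVGA / YUV422
  theCamera.startStreaming(true, CamCB);  // 1) preview images

  // 2) high-resolution still picture, JPEG-encoded in hardware
  theCamera.setStillPictureImageFormat(
      CAM_IMGSIZE_QUADVGA_H, CAM_IMGSIZE_QUADVGA_V, CAM_IMAGE_PIX_FMT_JPG);
  CamImage still = theCamera.takePicture();
  if (still.isAvailable()) {
    File f = theSD.open("PICT.JPG", FILE_WRITE);
    f.write(still.getImgBuff(), still.getImgSize());
    f.close();
  }
}

void loop() {}
```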
For AI applications we recommend the camera preview image.
https://developer.sony.com/develop/spresense/docs/arduino_developer_guide_en.html#_camera_library
This part of the documentation goes through the camera sketch, explaining it step by step.
In the callback function you convert the image from YUV422 to RGB565. This way you get an image that is 320 x 240 by default, but you can also use the clipAndResizeImageByHW function from this library to clip the image to the size you want for your model.
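For illustration, a minimal callback along those lines; the (48, 8) offsets are my own choice to center a 224x224 window in the 320x240 frame, and a 1:1 clip avoids any hardware resize-ratio constraints:

```cpp
#include <Camera.h>

void CamCB(CamImage img) {
  if (!img.isAvailable()) return;

  // Clip the central 224x224 region out of the 320x240 YUV422 frame.
  CamImage clipped;
  CamErr err = img.clipAndResizeImageByHW(
      clipped,
      48, 8,      // left-top corner of the clip window
      271, 231,   // right-bottom corner (48+224-1, 8+224-1)
      224, 224);  // output size (same as the clip, so no resize)
  if (err != CAM_ERR_SUCCESS) return;

  // Convert the clipped image to RGB565 for the model.
  clipped.convertPixFormat(CAM_IMAGE_PIX_FMT_RGB565);
  // clipped.getImgBuff() now holds 224*224*2 bytes of RGB565 pixels.
}
```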
I hope this helps in your application.
-
@CamilaSouza Thank you for the answer.
I have another question about what you said. Is there an easy way to get the pixels from RGB565 and feed them as input to the TensorFlow Lite interpreter? Is there a function for this purpose? We need to take several photos and classify them in a short period, and converting them manually costs a lot.
Thanks
-
Hey, @sisimonis-5-1-1
I don't believe there is a function exactly for this purpose.
I think you'll have to use memcpy to copy the image from the camera buffer to the input tensor.
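One caveat: a plain memcpy of the camera buffer only works if your model consumes raw RGB565 bytes directly. A standard MobileNet v1 float model expects unpacked RGB values, so you may need a small per-pixel loop instead. A minimal sketch, assuming a float32 input tensor of shape [1, 224, 224, 3] scaled to [-1, 1] (fillInputTensor is a hypothetical helper name, not a library function):

```cpp
#include <stdint.h>

// Unpack RGB565 pixels into a float RGB input tensor.
void fillInputTensor(const uint16_t* rgb565, float* input, int w, int h) {
  for (int i = 0; i < w * h; i++) {
    uint16_t p = rgb565[i];
    // Depending on the camera's byte order you may need to swap the
    // two bytes of each pixel first, e.g. p = (p >> 8) | (p << 8).
    uint8_t r = (p >> 11) << 3;          // 5 red bits   -> 8 bits
    uint8_t g = ((p >> 5) & 0x3F) << 2;  // 6 green bits -> 8 bits
    uint8_t b = (p & 0x1F) << 3;         // 5 blue bits  -> 8 bits
    // Scale to the [-1, 1] range the MobileNet v1 float model expects.
    *input++ = (r - 127.5f) / 127.5f;
    *input++ = (g - 127.5f) / 127.5f;
    *input++ = (b - 127.5f) / 127.5f;
  }
}
```

With TensorFlow Lite Micro you could then call something like fillInputTensor((const uint16_t*)img.getImgBuff(), interpreter->typed_input_tensor<float>(0), 224, 224); before invoking the interpreter.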