July 27, 2010

dmytro.poplavskiy dmytro.popla..
Lab Rat
10 posts

Qt Mobility Tech Preview - Camera API

 

Welcome to the Camera API forum on DevNet.

The Camera API is one of the new APIs being worked on for the Qt Mobility program. It allows you to access the available camera devices, take photos and capture videos.
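
To give a flavour of the API, here is a minimal sketch of still-image capture using the Tech Preview classes (the file name is illustrative and error handling is omitted):

#include <QApplication>
#include <QCamera>
#include <QCameraViewfinder>
#include <QCameraImageCapture>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QCamera camera;                    // default camera device
    QCameraViewfinder viewfinder;
    camera.setViewfinder(&viewfinder);
    viewfinder.show();

    QCameraImageCapture imageCapture(&camera);

    camera.setCaptureMode(QCamera::CaptureStillImage);
    camera.start();                    // viewfinder starts once the camera is active

    camera.searchAndLock();            // lock focus/exposure before capturing
    imageCapture.capture("test.jpg");  // illustrative file name
    camera.unlock();

    return app.exec();
}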

We would like to share our work so far and see what you think, we would especially like to hear your opinions and comments on what could be improved. This forum is your opportunity to help steer and shape the API so that you as developers will have the best possible API to use.

Today’s release is part of our “Technology Preview” package for the new APIs planned for QtMobility 1.1.0.
Qt Labs will still be used to announce the availability of new packages, but DevNet is where you have the opportunity to review and comment on the API, suggest changes, and so on.

Welcome to the forum, we look forward to your feedback on the new Camera API.

Kind regards,
Dmytro
Qt Development Team

14 replies

July 28, 2010

Faid Faid
Lab Rat
18 posts

Is there a way to check if the camera is taken by another process?

QCamera::isAvailable() returns true in my app, even if Cheese has grabbed the camera already. If I call QCamera::state, the state is IdleState. Is there another API I can use?

If the camera is used by another process, the Camera example application gets a callback from QMediaRecorder::error just after “libv4l2: error requesting 4 buffers: Device or resource busy” has been printed. However, in my case I only have a QCamera and a QCameraImageCapture, and neither QCamera::error nor QCameraImageCapture::error is emitted.

July 29, 2010

dmytro.poplavskiy dmytro.popla..
Lab Rat
10 posts

QCamera::isAvailable() is intended to check if the system has camera support (i.e. the backend plugin is installed). If the camera is used by another application, the transition to QCamera::ActiveState (in some cases to QCamera::IdleState) should fail with the QCamera::error() signal emitted. It may be worth adding a QCamera::Error code for this case. It looks like it’s a bug in the current backend implementation.
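
To illustrate the intended pattern, a small sketch of catching that failure (the helper class and slot names here are made up; the signal signatures are the ones from the API):

#include <QCamera>
#include <QObject>
#include <QDebug>

// Hypothetical helper that starts the camera and logs failures reported
// through QCamera::error(), e.g. when another application holds the device.
class CameraStarter : public QObject
{
    Q_OBJECT
public:
    explicit CameraStarter(QCamera *camera, QObject *parent = 0)
        : QObject(parent), m_camera(camera)
    {
        connect(m_camera, SIGNAL(error(QCamera::Error)),
                this, SLOT(onError(QCamera::Error)));
        connect(m_camera, SIGNAL(stateChanged(QCamera::State)),
                this, SLOT(onStateChanged(QCamera::State)));
        m_camera->start();   // request the transition to ActiveState
    }

private slots:
    void onError(QCamera::Error err)
    {
        // Until a dedicated "camera busy" code exists, this arrives as a
        // generic error; errorString() may still carry backend details.
        qWarning() << "camera error" << err << m_camera->errorString();
    }

    void onStateChanged(QCamera::State state)
    {
        qDebug() << "camera state changed to" << state;
    }

private:
    QCamera *m_camera;
};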

August 13, 2010

blueiqbal blueiqbal
Lab Rat
2 posts

Hello,

I plan to build an example camera application for the N900. How do I package the Qt Mobility 1.1.0 Technology Preview for the N900 so that I can get the Qt Mobility binaries onto the device?

September 16, 2010

Faid Faid
Lab Rat
18 posts

I’m looking for a way to get the current frame rate of the camera, to do platform performance testing of different resolutions, but I have not found any API for this. Is this possible, or would it be possible to add it to the camera APIs?

September 17, 2010

dmytro.poplavskiy dmytro.popla..
Lab Rat
10 posts
Faid wrote:
I'm looking for a way to get the current frame rate of the camera, to do platform performance testing of different resolutions, but I have not found any API for this. Is this possible, or would it be possible to add it to the camera APIs?

If you are interested in testing the viewfinder frame rate, I’d suggest writing a simple video surface (QAbstractVideoSurface) which only collects stats and discards the video frame data, and using it as the video output.
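
For example, a stats-only surface along these lines could be used (a sketch; the accepted formats and the one-second reporting interval are arbitrary choices):

#include <QAbstractVideoSurface>
#include <QVideoFrame>
#include <QTime>
#include <QDebug>

// Counts viewfinder frames and prints an approximate fps once per second;
// the frame data itself is simply discarded.
class FpsSurface : public QAbstractVideoSurface
{
public:
    explicit FpsSurface(QObject *parent = 0)
        : QAbstractVideoSurface(parent), m_frames(0)
    {
        m_clock.start();
    }

    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType type) const
    {
        if (type != QAbstractVideoBuffer::NoHandle)
            return QList<QVideoFrame::PixelFormat>();

        // Accept whatever the backend is likely to deliver; extend as needed.
        return QList<QVideoFrame::PixelFormat>()
                << QVideoFrame::Format_RGB32
                << QVideoFrame::Format_ARGB32
                << QVideoFrame::Format_YUV420P
                << QVideoFrame::Format_UYVY
                << QVideoFrame::Format_YUYV;
    }

    bool present(const QVideoFrame &)
    {
        ++m_frames;
        if (m_clock.elapsed() >= 1000) {
            qDebug() << "viewfinder fps:" << m_frames * 1000.0 / m_clock.elapsed();
            m_frames = 0;
            m_clock.restart();
        }
        return true;
    }

private:
    QTime m_clock;
    int m_frames;
};

How to hand such a surface to the camera is shown a couple of replies below.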

September 20, 2010

Faid Faid
Lab Rat
18 posts
dmytro.poplavskiy wrote:
Faid wrote:
I'm looking for a way to get the current frame rate of the camera, to do platform performance testing of different resolutions, but I have not found any API for this. Is this possible, or would it be possible to add it to the camera APIs?

If you are interested in testing the viewfinder frame rate, I'd suggest writing a simple video surface (QAbstractVideoSurface) which only collects stats and discards the video frame data, and using it as the video output.

Similar to the multimedia/videowidget example? How do I use QAbstractVideoSurface with QCamera, though? QCamera requires a QVideoWidget or a QGraphicsVideoItem to draw to, so I tried subclassing QVideoWidget (making it own a QAbstractVideoSurface, to which I would then forward “paint” events) and passing it to QCamera. When the camera draws, my overridden paintEvent does not get called. I also put breakpoints in QVideoWidget::paintEvent, and it doesn’t seem to be called either when the camera draws to the screen. The camera output shows up in the widget, though, so I don’t understand where the drawing occurs. I get the feeling I’m approaching this from the wrong angle.

September 21, 2010

dmytro.poplavskiy dmytro.popla..
Lab Rat
10 posts

It’s possible to skip QGraphicsVideoItem and QVideoWidget: request the QVideoRendererControl yourself and pass it your video surface for the camera to render frames to.

The code may look like:

class VideoSurface : public QAbstractVideoSurface
{
public:
    QList<QVideoFrame::PixelFormat> supportedPixelFormats(
            QAbstractVideoBuffer::HandleType handleType) const
    {
        // Return the list of formats you can handle.
        return QList<QVideoFrame::PixelFormat>();
    }

    bool present(const QVideoFrame &frame)
    {
        // Process the frame, or just count them.
        return true;
    }
};

QVideoRendererControl *control =
        camera->service()->requestControl<QVideoRendererControl *>();

if (control)
    control->setSurface(yourSurface);
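
One general note on the snippet above, based on the generic QMediaService contract rather than anything camera specific: requestControl() returns 0 when the backend does not provide a renderer control (hence the null check), and a control obtained this way is normally handed back to the service once you no longer need it:

// Hand the renderer control back when the surface is no longer needed.
camera->service()->releaseControl(control);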

September 21, 2010

Faid Faid
Lab Rat
18 posts

dmytro.poplavskiy wrote:
It's possible to skip QGraphicsVideoItem and QVideoWidget: request the QVideoRendererControl yourself and pass it your video surface for the camera to render frames to.


Thanks Dmytro, that worked very well!

November 5, 2010

asiraky asiraky
Lab Rat
1 posts

Hey guys,

Is there currently capability within Qt and the Camera API to take a photo and upload it to a web service?

February 22, 2011

ABag ABag
Lab Rat
8 posts

Hi Guys,

I am using QCamera and displaying the viewfinder in a QMainWindow. But if the camera is on and my application goes to the background and then comes back to the foreground, only the camera viewfinder is visible. The rest of my app does not get repainted, although the buttons are still there and I can perform operations.
How can I solve this repaint problem while the camera is on?

Thanks,
ABag

May 24, 2011

lapnd lapnd
Lab Rat
2 posts

Hi Dmytro,
I’m new to Qt, and I’m confused about how things really work all the way from Qt Mobility down to the camera hardware.
Let’s say there are two kinds of cameras:
1. Cameras with an encoder, whose output is compressed data (H.264, MPEG-4, H.263, …).
2. Cameras without an encoder, whose output is raw data (YUV, RGB, …).
This is my understanding of how Qt works:
In the first case, the video data needs to be decompressed and then displayed.
To decompress the video data, some software decoders are needed (coming from GStreamer plugins).

In the second case, the raw data needs to be encoded. The compressed data is then used for two purposes:

- sending to others (e.g. in a video conference we send compressed data instead of raw data);

- decompressing again to display on the local computer.

I hope that my thinking above is wrong (in the second case), because if we just want to preview video from a camera whose output is raw, it should be unnecessary to compress and decompress the data; we could display the raw data directly.

Could you please help me understand how it works from QtMobility —> GStreamer —> camera plugin —> V4L2 —> camera hardware?
Thank you very much!

May 27, 2011

dmytro.poplavskiy dmytro.popla..
Lab Rat
10 posts

Hi,

QCamera relies on the camerabin GStreamer element; I’m not sure it currently supports the first case, with compressed viewfinder frames.

In the second case there is no reason to compress and decompress frames for display: video frames from the camera (in uncompressed YUV or RGB formats) are displayed directly, and they are only encoded if video recording is active.
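
For completeness, a rough sketch of the recording path, which is where encoding actually kicks in (the output file name is illustrative and encoder settings are left at their defaults):

#include <QCamera>
#include <QMediaRecorder>
#include <QUrl>

// The viewfinder keeps receiving uncompressed frames; the encoder only runs
// between record() and stop().
void startRecording(QCamera *camera)
{
    QMediaRecorder *recorder = new QMediaRecorder(camera);

    camera->setCaptureMode(QCamera::CaptureVideo);
    camera->start();

    recorder->setOutputLocation(QUrl::fromLocalFile("clip.mp4"));  // illustrative path
    recorder->record();
    // ... later: recorder->stop();
}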

May 28, 2011

lapnd lapnd
Lab Rat
2 posts

Hi Dmytro,
Thank you very much!
I got the idea now.

October 16, 2011

mismail mismail
Lab Rat
79 posts

Hi Dmytro,
I am trying to get image frames from the camera on the N950.
I know that I should reimplement QAbstractVideoSurface, so I did that as follows:

VideoSurface::VideoSurface(QWidget* widget, VideoIF* target, QObject* parent)
    : QAbstractVideoSurface(parent)
{
    m_targetWidget = widget;
    m_target = target;
    m_imageFormat = QImage::Format_Invalid;
    orientationSensor = new QOrientationSensor();
    m_orientation = ORIENTATION_LANDSCAPE;
    orientationSensor->start();
}

VideoSurface::~VideoSurface()
{
    orientationSensor->stop();
    delete orientationSensor;
}

bool VideoSurface::start(const QVideoSurfaceFormat &format)
{
    m_videoFormat = format;
    const QImage::Format imageFormat = QVideoFrame::imageFormatFromPixelFormat(format.pixelFormat());
    const QSize size = format.frameSize();

    if (imageFormat != QImage::Format_Invalid && !size.isEmpty()) {
        m_imageFormat = imageFormat;
        QAbstractVideoSurface::start(format);
        return true;
    } else {
        return false;
    }
}

unsigned char* VideoSurface::createGrayscaleBuffer(const QImage &dstImage, const int dWidth, const int dHeight) const
{
    unsigned char* grayscaledBuffer = new unsigned char[dWidth * dHeight];
    int offset = 0;
    // default Qt grayscale
    for (int y = 0; y < dHeight; y++)
        for (int x = 0; x < dWidth; x++)
            grayscaledBuffer[offset++] = qGray(dstImage.pixel(x, y));

    return grayscaledBuffer;
}

bool VideoSurface::present(const QVideoFrame &frame)
{
    m_frame = frame;

    // number of frames received for display
    numFrames++;

    if (surfaceFormat().pixelFormat() != m_frame.pixelFormat() ||
            surfaceFormat().frameSize() != m_frame.size()) {
        stop();
        return false;
    } else {
        m_frame.map(QAbstractVideoBuffer::ReadOnly);

        iWidth = m_frame.width();
        iHeight = m_frame.height();
        int line = m_frame.bytesPerLine();

        // build a QImage from the frame
        m_completeImage = QImage(m_frame.bits(), iWidth, iHeight, line,
                                 QVideoFrame::imageFormatFromPixelFormat(m_frame.pixelFormat()));

        m_frame.unmap();

        QImage dstImage = scaleImage(m_completeImage);

        int dHeight = dstImage.height();
        int dWidth = dstImage.width();
        unsigned char* grayscaledBuffer = createGrayscaleBuffer(dstImage, dWidth, dHeight);

        m_orientation = ORIENTATION_CCW;

        QOrientationReading* reading = orientationSensor->reading();
        if (orientationSensor->isActive()) {
            if (reading->orientation() == QOrientationReading::RightUp) { // rotate by -90 (ccw)
                m_orientation = ORIENTATION_LANDSCAPE;
            }
        }

        // do some image processing work
        //////////////

        delete[] grayscaledBuffer;

        // convert points back to original size
        double iWi = (double)iWidth / dWidth;
        double iHi = (double)iHeight / dHeight;

        // should keep aspect ratio
        iWi = iHi = qMin(iWi, iHi);

        // enlarge faces
        int marginX, marginY;

        m_target->updateVideo();

        return true;
    }
}

QImage VideoSurface::scaleImage(const QImage &srcImage) const
{
    QImage dstImage;
    if (MAX_DIM < iWidth || MAX_DIM < iHeight) {
        if (iWidth > iHeight)
            dstImage = srcImage.scaledToWidth(MAX_DIM, Qt::SmoothTransformation);
        else
            dstImage = srcImage.scaledToHeight(MAX_DIM, Qt::SmoothTransformation);
    } else {
        dstImage = srcImage;
    }
    return dstImage;
}

void VideoSurface::paint(QPainter *painter)
{
    if (m_frame.map(QAbstractVideoBuffer::ReadOnly)) {
        QImage image(
                    m_frame.bits(),
                    m_frame.width(),
                    m_frame.height(),
                    m_frame.bytesPerLine(),
                    m_imageFormat);
        QRect r = m_targetWidget->rect();

        int shiftX = qAbs(r.size().width() - image.size().width()) / 2;
        int shiftY = qAbs(r.size().height() - image.size().height()) / 2;

        QPoint centerPic(shiftX, shiftY);

        if (!image.isNull()) {
            painter->drawImage(centerPic, image);
            // draw faces
        }
        m_frame.unmap();
    }
}

QList<QVideoFrame::PixelFormat> VideoSurface::supportedPixelFormats(
        QAbstractVideoBuffer::HandleType handleType) const
{
    if (handleType == QAbstractVideoBuffer::NoHandle) {
        return QList<QVideoFrame::PixelFormat>()
                << QVideoFrame::Format_RGB32
                << QVideoFrame::Format_ARGB32
                << QVideoFrame::Format_ARGB32_Premultiplied
                << QVideoFrame::Format_RGB565
                << QVideoFrame::Format_UYVY
                << QVideoFrame::Format_RGB555;
    } else {
        return QList<QVideoFrame::PixelFormat>();
    }
}

Note that I added QVideoFrame::Format_UYVY to the supported pixel formats, but the problem now is that I always get the following error:

Failed to start video surface
CameraBin error: "Internal data flow error."

I think the problem is that I need to convert the frame from UYVY to RGB, but I don’t know how to do this.
Could you please help me?

 