I am trying to build a simple augmented reality application that draws graphics over a video stream. I achieve this by capturing video frames with a GStreamer appsink (how to), converting them to QImage objects, and drawing them onto a QWidget with QPainter.

Because the appsink new_buffer callback is a static function, it cannot trigger a repaint directly. To repaint frames, I created a QTimer that ticks 30 times a second and calls a repaint method. This puts a lot of stress on the processor, and I also get random crashes that originate in the paintEvent function.
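
A minimal sketch of the timer-driven repaint described above, assuming CameraField is the QWidget that paints the frames (the class names are taken from the snippets below, the constructor body is my reconstruction):

```cpp
// Sketch: a ~30 Hz timer that schedules repaints on the GUI thread.
// QWidget::update() is a slot, so the timer can be connected directly.
CameraField::CameraField(QWidget *parent) : QWidget(parent)
{
    QTimer *timer = new QTimer(this);
    connect(timer, SIGNAL(timeout()), this, SLOT(update()));
    timer->start(33); // fires roughly 30 times per second
}
```

Note that update() coalesces pending repaint requests and runs paintEvent from the event loop, which is cheaper than calling repaint() (which forces an immediate, synchronous paint) on every tick.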

1.) Is this the way one should implement drawing over a video stream in Qt? Is it possible to trigger a repaint from a static function?
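
On the second part of question 1: a static callback running on the GStreamer thread cannot touch widgets directly, but a sketch like the following, using QMetaObject::invokeMethod with a queued connection, would let the callback schedule a repaint without a polling timer. The static widget pointer s_field is an assumed name, not from the original code:

```cpp
// Hypothetical static pointer to the widget instance (assumed name).
static CameraField *s_field;

// Runs on the GStreamer thread; the queued invocation posts an event to
// the GUI thread's event loop, where update() then executes safely.
void CameraN900::new_buffer(GstAppSink *_appsink, gpointer user_data)
{
    // ... pull and store the frame (see new_buffer below) ...
    QMetaObject::invokeMethod(s_field, "update", Qt::QueuedConnection);
}
```

With this approach paints happen only when a new frame actually arrives, instead of 30 times a second regardless.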
2.) Why is the application's performance so bad? It loads the processor almost to the maximum at a video resolution of 320×240. At this resolution the application achieves 30 FPS; however, if I convert the QImage to a QPixmap (which fixed the crashing bug), performance drops to 12 FPS. If I disable the repaint timer (i.e. the drawing of frames on the screen), processor activity stays very low even at 640×480, which suggests the sink and the conversion to QImage are not to blame.
3.) Why does my application crash when I use QImage but work OK when I use QPixmap?
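
A plain-C++ sketch (no Qt/GStreamer, all names invented for illustration) of the lifetime issue I suspect is behind question 3: the QImage(data, ...) constructor only stores the pointer it is given, while an explicit copy such as QImage::copy() (or the copy QPixmap::fromImage makes) owns its own pixels and survives the producer freeing the buffer.

```cpp
#include <cstring>

// ShallowFrame mirrors QImage(data, w, h, ...): it keeps only a pointer
// and dangles as soon as the producer frees or reuses the buffer.
struct ShallowFrame {
    const unsigned char *data;
};

// DeepFrame mirrors image.copy(): it owns a private copy of the pixels.
struct DeepFrame {
    unsigned char pixels[16];
};

// Wrap the buffer without copying.
inline ShallowFrame wrap(const unsigned char *src) {
    ShallowFrame f;
    f.data = src;
    return f;
}

// Copy the pixels out of the buffer before it can be reclaimed.
inline DeepFrame snapshot(const unsigned char *src) {
    DeepFrame f;
    std::memcpy(f.pixels, src, sizeof f.pixels);
    return f;
}
```

After the buffer is released (the analogue of gst_buffer_unref), a wrapped frame points at reclaimed memory while a snapshot stays valid, which would match "crashes with QImage, works with QPixmap".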

//callback function of appsink (GStreamer thread)
void CameraN900::new_buffer(GstAppSink *_appsink, gpointer user_data)
{
	//initialize appsink
	GstBuffer *buffer = gst_app_sink_pull_buffer(_appsink);
	image_data = (unsigned char *) GST_BUFFER_DATA(buffer); // image_data is a static unsigned char *
	gst_buffer_unref(buffer); // if I comment this out the application stops crashing, but memory fills up
}
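
A hedged guess at why that unref matters: once the buffer is unreffed, GStreamer may reclaim the memory behind GST_BUFFER_DATA while image_data still points into it. A sketch that copies the frame out before unreffing, assuming image_data is preallocated with at least GST_BUFFER_SIZE(buffer) bytes (that precondition is my assumption, not from the original code):

```cpp
void CameraN900::new_buffer(GstAppSink *_appsink, gpointer user_data)
{
    GstBuffer *buffer = gst_app_sink_pull_buffer(_appsink);
    // Copy the pixels out before releasing the buffer, so image_data
    // never dangles once GStreamer reclaims the memory.
    memcpy(image_data, GST_BUFFER_DATA(buffer), GST_BUFFER_SIZE(buffer));
    gst_buffer_unref(buffer); // now safe: we own a copy of the pixels
}
```
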

//static function
void CameraField::createImage()
{
	// wraps the raw bytes; note QImage does not copy the data here
	*image_buffer = QImage(CameraN900::image_data, buffer_width, buffer_height,
	                       3 * buffer_width, QImage::Format_RGB888);
}

//GUI thread. Repaint triggered by a timer.
void CameraField::paintEvent(QPaintEvent *event)
{
	QPainter painter(this);
	painter.drawImage(0, 0, *image_buffer); // draw the current frame
}