Awesome!
When I get a spare moment, I will try this out. I hope to do more work on the Qt4 GUI system, too. =D
I had about 15 minutes to play with your GUI.
It rocks. It's quite a nice replication of the current MFC one, and it does the job.
I'm impressed. :)
I think I found a little problem, but I don't have any idea how to fix it.
If you have the Composite extension enabled in xorg.conf, some kind of race condition occurs. When you're running a game and try to access the menu, the emulator can't decide whether the menu should be drawn on top of the game screen or vice versa.
I saw the same thing a while ago in gxine, but it's fixed now (actually, I think it was an upgrade of xine-lib that fixed it).
Thanks, though most of the credit should go to kxu, the original author of this GUI.
Hmm. I have the Composite extension enabled, and the menus are fine, even with Compiz running. Does it only happen with the OpenGL output? Maybe it is a problem with the graphics drivers or X.Org. I'm using the NVidia blob v169.12 and xorg-server 1.4.
Hey, as long as you dislike it rather than hate it, I'm willing to give this front end a go. :P (I love ATI, but you know, even if you hated it I'd still be willing to test it anyway.)
Well, yesterday I switched to kernel 2.6.25 and, due to my rather unusual kernel config, had to change from ati-drivers to xf86-video-ati. The problem is still there. But since the Qt4 GUI, while not working, doesn't crash anymore, I learned that this is not a GTK issue but an ATI one, because it is affected too.
It looks like I will need some input from bgK. Today, after many tests, I reached the following conclusions:
- the composite issue is a driver bug, but the funny part is that it works while DRI is disabled
- however, when DRI is enabled, both OpenGL and Cairo are terribly slow
- when AccelMethod is EXA (I haven't tested XAA, but it's probably affected too), Xvideo is broken: all pixels have a green tint, and the emu screen is split into three parts, a black strip at the bottom and a doubled side-by-side game screen; while this is a driver bug, the strange part is that xine-lib's Xvideo output is OK
SDL doesn't seem to be affected much by the driver change.
OK, this time it's a reply, because I found a bug and I'd like to know what a certain piece of code was supposed to do.
While the Cairo/OpenGL issues seem to be a driver problem, the Xvideo one is a bug in this app.
What was this block:
Code:
for (int i = 0; i < iNumFormats; i++)
{
    if (pFormats[i].id == 0x3 || pFormats[i].type == XvRGB)
    {
        /* Try to find a 32-bit mode */
        if (pFormats[i].bits_per_pixel == 32)
        {
            m_iFormat = pFormats[i].id;
        }
    }
}
supposed to do, because it is the cause of the Xvideo problem.
With xf86-video-ati, the Xvideo info is:
Code:
Number of image formats: 8
id: 0x41424752 (RGBA)
guid: 52474241-0000-0010-8000-00aa00389b71
bits per pixel: 32
number of planes: 1
type: RGB (packed)
depth: 32
red, green, blue masks: 0xff0000, 0xff00, 0xff
id: 0x54424752 (RGBT)
guid: 52474254-0000-0010-8000-00aa00389b71
bits per pixel: 16
number of planes: 1
type: RGB (packed)
depth: 16
red, green, blue masks: 0x7c00, 0x3e0, 0x1f
id: 0x32424752 (RGB2)
guid: 52474200-0000-0010-8000-00aa00389b71
bits per pixel: 16
number of planes: 1
type: RGB (packed)
depth: 16
red, green, blue masks: 0xf800, 0x7e0, 0x1f
id: 0x0
guid: 52474200-0000-0010-8000-00aa00389b71
bits per pixel: 24
number of planes: 1
type: RGB (packed)
depth: 24
red, green, blue masks: 0xff0000, 0xff00, 0xff
id: 0x32595559 (YUY2)
guid: 59555932-0000-0010-8000-00aa00389b71
bits per pixel: 16
number of planes: 1
type: YUV (packed)
id: 0x59565955 (UYVY)
guid: 55595659-0000-0010-8000-00aa00389b71
bits per pixel: 16
number of planes: 1
type: YUV (packed)
id: 0x32315659 (YV12)
guid: 59563132-0000-0010-8000-00aa00389b71
bits per pixel: 12
number of planes: 3
type: YUV (planar)
id: 0x30323449 (I420)
guid: 49343230-0000-0010-8000-00aa00389b71
bits per pixel: 12
number of planes: 3
type: YUV (planar)
I think this code makes the program select RGBA, and then the effect is as described above (it's actually four equal parts, but the two at the bottom are both black). When I cut out this block, Xvideo works again.
If I remember correctly, there is a bug in the Xv driver: it always converts the image to YUV, even if an RGB mode is available. I can't test this because my NVidia card doesn't offer RGB modes (I hate the NVidia drivers even more than ATI's).
Anyway, I was thinking about removing this output module, since it is buggy and the output quality is not so good because of the RGB32 -> YUY2 (16-bit) -> RGB conversion plus the mandatory bilinear filtering. Moreover, it is slower than the OpenGL output.
I'm sorry for not being very active lately, but I'm currently in the middle of my end-of-year exams.