Category Archives: Rants

Personal rants

Accepted for Summer of Code

Dear Applicant,
Congratulations! This email is being sent to inform you that your
application was accepted to take part in the Summer of Code.

Yeah, one of my two proposals has been accepted!

My first proposal was about working on GL O.B.S. under the Python Software Foundation, but unfortunately it was very likely going to be discarded.
I learned this from a mentor who contacted me: he wrote that my application was based on a personal program, that it would have been hard to find someone to mentor me, and that, moreover, I wouldn't have contributed to the Python community. He also added that I could be a good candidate for his own project; he is, indeed, Arc Riley, Project Manager of PySoy, and he invited me to apply there.
And so I did: I wrote another application and, this time, it was accepted. 🙂

My work will be to integrate multi-texturing in the PySoy rendering loop and API, document API additions, test the whole under many different free software drivers and then implement some related techniques, like bump or normal mapping.
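To give an idea of what the fixed-function combining stage I will be working with actually does, here is an illustrative sketch in plain Python (my own toy functions, not PySoy's actual API): a chain of texture units, all set to GL_MODULATE, multiplies each unit's texel into the incoming color, component by component.

```python
# Illustrative sketch only (not PySoy's actual API): in fixed-function
# multi-texturing each texture unit combines its texel with the result
# of the previous unit. GL_MODULATE multiplies them component-wise.

def modulate(incoming, texel):
    """Combine one unit's texel with the incoming color (GL_MODULATE)."""
    return tuple(a * b for a, b in zip(incoming, texel))

def combine_units(base_color, texels):
    """Run a fragment through a chain of units, all set to GL_MODULATE."""
    color = base_color
    for texel in texels:
        color = modulate(color, texel)
    return color

# A white fragment modulated by a half-gray diffuse map and a
# full-bright lightmap.
print(combine_units((1.0, 1.0, 1.0), [(0.5, 0.5, 0.5), (1.0, 1.0, 1.0)]))
# → (0.5, 0.5, 0.5)
```

Bump and normal mapping then replace this simple per-component product with a per-texel lighting computation, which is where the NV/ARB combiner extensions come into play.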

I’m really glad to have this opportunity: I will learn about many interesting OpenGL and Python topics and I will improve my design, teamwork and communication skills.
Thank you, Google! 😉

Easter gifts

While in Greece spending the holidays with my family I was able to get my hands on some interesting pieces of hardware to bring back to Italy for Electron: a 128MB module of SDR RAM to fill up the last motherboard slot and *two* different graphics cards (my old 5900XT and a 6200)! 🙂
Actually, all this was possible thanks to Easter gifts! 😉
My uncle installed a 512MB module of DDR RAM inside his old PC, while both a friend of mine and I replaced our old cards with the good and cheap Gigabyte GV-N76G256D-RH, a GeForce 7600GS card equipped with 256MB of DDR2 memory.

It is a completely silent card which delivers sufficient performance at a great price: it is capable of running Shader Model 3.0 vertex and fragment programs, supports OpenGL 2.1 and all the G70 extensions, and it even features an lm_sensors-compatible thermal sensor on the GPU core. 🙂

A reduced series of tests follows, all performed at 640×480 except for the Blender 2.43 draw benchmark, which was run at 1024×768, my desktop resolution.

Test       NoAA, NoAF   2xAA, 4xAF
glxgears   4966.6       3123.3
Blender    10855        10764
GL_shadow  1428.8       996.2
GL_pointz  631.0        617.6
GL_blit    2052.8       1223.8
GL_smoke   358.2        316.8
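A quick back-of-the-envelope check of the cost of enabling 2xAA and 4xAF, with the figures copied verbatim from the table above (plain Python):

```python
# Relative performance drop when going from NoAA, NoAF to 2xAA, 4xAF,
# using the figures from the table above.
results = {
    "glxgears":  (4966.6, 3123.3),
    "Blender":   (10855, 10764),
    "GL_shadow": (1428.8, 996.2),
    "GL_pointz": (631.0, 617.6),
    "GL_blit":   (2052.8, 1223.8),
    "GL_smoke":  (358.2, 316.8),
}

for test, (plain, filtered) in results.items():
    drop = 100.0 * (plain - filtered) / plain
    print(f"{test}: -{drop:.1f}%")
```

Notably, Blender loses less than 1%, which already hints at the CPU-limited behavior discussed below.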

The low resolution at which most of the tests were run may have made the results a bit CPU-limited, and some other games I’ve tested seem to confirm this theory; nevertheless, the card shows its strength and I’m satisfied with it.
I hope you got some nice Easter gifts too, and I wish you a happy Easter! 😉

The quest for the lost fragment

Yesterday I was given a new graphics card by a generous guy at the university, including cables, manual and bundled software: it is a nice MSI G4Ti4200-DT64 with a red PCB.

MSI G4Ti4200-DT64

It’s a good card, but unfortunately it has only a primitive version of pixel shaders: they are neither floating point (supporting at most the proprietary “HILO” format) nor GLSL compliant, and they cannot be used via GL_ARB_fragment_program (as a matter of fact, it is not present in the extensions array 🙁), but only through the GL_NV_register_combiners and GL_NV_texture_shader families of functions, which make use of the OpenGL state machine.
Nvidia Cg actually supports the fp20 profile, but its output is just an nvparse program, which has to be passed to a function that will set up the GL texture states.
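Checking whether an extension is really exported is a one-liner, but it has to be an exact-token match; here is a little sketch (the extensions string below is a shortened, made-up sample of what glGetString(GL_EXTENSIONS) returns):

```python
# GL_EXTENSIONS is a single space-separated string, so an exact-token
# check is needed: a plain substring test would wrongly match
# GL_NV_texture_shader inside GL_NV_texture_shader2.
def has_extension(extensions_string, name):
    return name in extensions_string.split()

# Shortened, made-up sample of a glGetString(GL_EXTENSIONS) result:
exts = "GL_NV_register_combiners GL_NV_texture_shader GL_NV_texture_shader2"

print(has_extension(exts, "GL_ARB_fragment_program"))  # → False, the missing one
print(has_extension(exts, "GL_NV_texture_shader"))     # → True
```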

Anyway, let’s analyze what’s new going from a GeForce4 MX440-8X (NV18) to a GeForce4 Ti 4200 (NV25):

  • The OpenGL version string hasn’t changed (mainly because of the lack of Shader Model 2.0), it is still 1.5.8.
  • Thanks to the Accuview AA Engine there are three new anti-aliasing modes: 4x Bilinear Multisampling, 4x Gaussian Multisampling and 2x Bilinear Multisampling by 4x Supersampling.
  • There are twenty new extensions available; most of them are related to multisampling, depth textures, occlusion queries, shadows and texture shaders:
    GL_ARB_depth_texture, GL_ARB_multisample, GL_ARB_occlusion_query, GL_ARB_shadow, GL_ARB_texture_border_clamp, GL_EXT_shadow_funcs, GL_EXT_texture3D, GL_EXT_timer_query, GL_HP_occlusion_test, GL_NV_copy_depth_to_color, GL_NV_depth_clamp, GL_NV_multisample_filter_hint, GL_NV_occlusion_query, GL_NV_register_combiners2, GL_NV_texture_compression_vtc, GL_NV_texture_shader, GL_NV_texture_shader2, GL_NV_texture_shader3, GL_SGIX_depth_texture, GL_SGIX_shadow.

Of course I performed some benchmarks too (have a look at Electron’s specs), all at 1024×768, except for the glxgears and globs tests, which were run at the default 640×480 resolution.
Quake 3 was tested on the four.dm_68 demo with sound, Blender 2.42 was tested with the draw benchmark, while the GLSLvp_pointz test uses only a vertex shader to move and color points.
Note that this last test is emulated in software on the NV18 while it runs in hardware on the NV25, but shaders plus full-scene anti-aliasing seem to be impossible to achieve on the latter.

Test           NV18                      NV25
               NoAA, NoAF   2xAA, 4xAF   NoAA, NoAF   2xAA, 4xAF
glxgears       1568.5       833.5        2976.7       1571.7
Blender        2580         1580         7000         4484
Quake 3        93.0         51.7         113.6        92.3
gl_shadow      296          148.4        674.6        360
gl_pointz      421.8        272.6        526.2        398.4
gl_blit        600.6        285.8        1197.4       612.4
gl_smoke       299          177.2        404.2        302
GLSLvp_pointz  118.6        103.8        317.6        X
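The NV18-to-NV25 speedup can be read off the NoAA, NoAF columns of the table above; a quick computation (plain Python, figures copied from the table):

```python
# NV25 vs NV18 speedup factors at NoAA, NoAF, from the table above.
nv18 = {"glxgears": 1568.5, "Blender": 2580, "Quake 3": 93.0,
        "gl_shadow": 296, "gl_pointz": 421.8, "gl_blit": 600.6,
        "gl_smoke": 299, "GLSLvp_pointz": 118.6}
nv25 = {"glxgears": 2976.7, "Blender": 7000, "Quake 3": 113.6,
        "gl_shadow": 674.6, "gl_pointz": 526.2, "gl_blit": 1197.4,
        "gl_smoke": 404.2, "GLSLvp_pointz": 317.6}

for test in nv18:
    print(f"{test}: {nv25[test] / nv18[test]:.2f}x")
```

Blender and GLSLvp_pointz gain the most, around 2.7x — the latter obviously because it goes from software emulation to hardware.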

The card is nice and fast, but the search for the fragment (extension) has not ended. 😉

Why I would have liked to buy a PS3

When I first read about the PS3 I was so astonished by its specs that I immediately fell in love with it.
When the Unreal Engine 3.0 was first presented at E3 2005, Tim Sweeney suggested a GeForce 6800 Ultra just for a decent framerate, and that was the top-of-the-line hardware of the time!
The RSX GPU inside the Playstation 3 was based on nVidia’s new G70 architecture; it had more power than anything on the consumer market in 2005.

Playstation 3

I was also intrigued by the Cell, a really innovative CPU which was attractive for its strongly multi-threaded design, with a PPE controlling seven SPUs for enhanced parallel performance.
A hard disk, which came pre-installed with Linux, and the fact that it would have used OpenGL ES completed the fantastic image I had of the PS3 as the perfect geek solution for high-powered computing (and awesome programming possibilities) at a reasonable price.

While not just me but the entire world was dreaming about the new Sony console, at Microsoft they were working hard, and by the end of the same year the Xbox 360 became reality.
Ok, it had no Blu-ray (which Sony claims to be a must-have for next-gen games…), it had a more ordinary (still great!) dual-threaded tri-core PowerPC Xenon, and instead of a “Reality Synthesizer” it had a more modest Xenos GPU.
The Xenos is actually modest only in name: its unified shader architecture was ahead of its time, something nVidia adopted only with the GeForce 8 (and it is not a DirectX 10 requirement, as someone could believe); the same could be said about its embedded DRAM, a solution never seen before.

What is really irritating, at least from my point of view, is the usual Microsoft way of doing things: they planned a thousand ways to raise a fence all around their console. They absolutely don’t want you to run anything other than their kernel, and they enforce this by taking every possible hardware measure, like hardwiring a different encryption key into every Xenon CPU produced!
I should also admit that Microsoft (like Sony) gives you some kind of support for homebrew games, but, since it is mainly a software company, you are again forced to use their software only: you cannot program on your console directly, running Linux, Vim and GCC, like you could on the Playstation 3; you must use XNA Game Studio Express (which can now be downloaded for free) on the PC, with its Visual Studio environment…

We have seen that the Xbox 360 is a great machine but not a viable solution for a geek seeking a flexible and powerful programming platform, at least until someone finds out how to crack it to install Linux, as was done on the first Xbox (though this time I think it will be much harder).
And so we come to the Playstation 3. It seems perfect: it has a parallel CPU that makes you learn and stress multi-threaded programming (and not only that, as the Cell is so innovative that it needs a whole new approach), a powerful high-end GPU and a price which is undeniably high but which, as time passes, can only come down.
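To show the kind of decomposition the Cell pushes you toward, here is a toy analogy in plain Python (multiprocessing workers stand in for the SPUs here, which is of course a loose comparison, not Cell code):

```python
# Toy analogy only: split a job into independent chunks and hand each to
# a worker, the way Cell code hands independent work units to the SPUs.
from multiprocessing import Pool

def spu_kernel(chunk):
    """Stand-in for an SPU program: process one independent slice of data."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1000))
    chunks = [data[i::7] for i in range(7)]  # seven "SPUs", as in the PS3
    with Pool(7) as pool:
        partials = pool.map(spu_kernel, chunks)
    print(sum(partials))  # prints 332833500, same as the serial result
```

The hard part on the real hardware is of course not the splitting itself but fitting each kernel into an SPU’s local store and feeding it via DMA, which is exactly the “new approach” the Cell demands.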

I thought: “Great, now I know what I will buy next, a Playstation Three!”. I was also thinking about graduating with a thesis on numerical analysis performed on RSX shaders; it would have been so cool and geeky. 😀
The only problem is that we’re not in 2005! We are in 2007: what was the future is now the present (like the Unreal Engine 3, like Gears of War, which runs only on the Xbox 360).
Now, I know that the PS3 is also the present, that it is real, but it is not yet distributed in Europe, not yet produced in high volumes and, what is worse, it has been on the market for only a couple of months and is sold at a high price that won’t decrease for a very long time!

I was feeling abandoned and betrayed, orphaned of what I thought would have been my future rendering and development platform, but then, little by little, PCs won back my confidence. 🙂
Now I think that a Core 2 Quad Q6400 and a GeForce 8600 Ultra (whose specs were rumored this very day) can’t be that bad. I’m eager to see them on shop shelves, and this time we can be sure they won’t be postponed for a year. 😉

Code Freeze

Semester is over, it’s time for university examinations.
This means that all my projects are frozen until the end of February.
Anyway, let’s take a look at what will come next.

Freeze or Stop

Mars. M3xican is studying too (we are from the same university, after all), so he has less time to dedicate to Mars; nevertheless, he’s currently involved in writing design documents for the game. In March we will get back to work on the gameplay side!

GL O.B.S.. The plan is to release a 0.2 version as soon as possible; I still need to adapt the submission code to the new site and DB interface developed by cowa. An RC version may also appear before the final release.

PyLOT. This is a project I started working on in December: a prototype 3D engine written in Python. At the time of writing it has more shortcomings than features, being little more than a conversion of an old C++ project plus code and ideas from some OpenGL demos I wrote, but in the future M3xican will eventually spend some time designing a game for it (we already have a bunch of ideas :)).
PyLOT. This is a project I start working on in December, a prototype 3d engine written in Python. At the time of writing it has more shortcomings than features, and it’s nothing more than a conversion of an old C++ project plus the addition of code and ideas from some OpenGL demos I wrote, but in the future M3xican will eventually spend some time in designing a game for it (we have already a bunch of ideas :)).