How does HLMV compress colors with transparency?


BillyNair
06-28-2004, 10:38 PM
I have played with the way HLMV compresses the colors in your BMP when you make part of it transparent. I have found that it will drop a 256-color BMP down to 64 colors. So I tried saving the skin as a 64-color BMP first, but it didn't look the same; there were still 64 colors, but the pixels were all moved around, as if the original pattern was squished into a bunch of large blobs.

I can understand the color count dropping from 256 to 64, and even the colors changing, but I cannot figure out why the patterns change.

My theory is that when HLMV drops the number of colors from 256 to 64, it chooses which 64 to keep. I tried to follow this process to see if there is a pattern, like keeping every 4th color or something, but did not see one. I looked on Planet HL's tutorial page on how to make transparencies, but only found other forum guys with the same question (well, not the same, but I didn't find any answers about this).

If anyone knows where to find an answer let me know.

BillyNair
06-28-2004, 10:42 PM
Sorry, forgot the screenshot. It shows HLMV with and without the transparent color, comparing the two right next to each other. Notice the pattern change. The original is 16 colors, and HLMV drops transparent BMPs to 64 colors, so there should be no change, or so I thought...

Trp. Jed
06-30-2004, 07:55 AM
HLMV doesn't do anything with colours; it's your graphics card and video drivers that do it when running in 16-bit.

RGB requires 8 bits per channel. When you add transparency you need another channel for the alpha map, so that's 32 bits in total. With 8 bits per channel, each channel can have one of 256 variations.

16-bit colour only allows 5 bits per channel, which gives you just 32 possible variations.

HLMV isn't doing it; it's your graphics card. HLMV processes all textures as full 32-bit and writes them into your graphics card's memory as 32-bit. It's the card and drivers that are dropping the colour depth to match your desktop colour depth.
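
To make that concrete, here's a rough C sketch (illustrative only, not HLMV's or any driver's actual code) of what happens to one colour channel when 8 bits get squeezed into the 5 bits of a 16-bit mode and stretched back out for display:

    #include <stdio.h>
    #include <stdint.h>

    /* Reduce an 8-bit channel (0-255) to 5 bits (0-31), then expand it back.
     * The round trip shows why nearby shades collapse into the same colour. */
    static uint8_t to5(uint8_t c)   { return c >> 3; }
    static uint8_t from5(uint8_t c) { return (uint8_t)((c << 3) | (c >> 2)); }

    int main(void)
    {
        for (int c = 120; c <= 127; c++)   /* eight distinct 8-bit shades... */
            printf("8-bit %3d -> 5-bit %2d -> back to %3d\n",
                   c, to5((uint8_t)c), from5(to5((uint8_t)c)));
        return 0;                          /* ...all land on the same value  */
    }

All eight input shades come back as the single value 123, which is exactly the sort of banding you're seeing.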

BillyNair
06-30-2004, 02:03 PM
Trp. Jed, I actually went to your site first to try and find the answer; I just thought you were AWOL from the forums, so I wasn't expecting anything from you. Since you did Jed's HLMV I was hoping you had more insight into how things work on the inside. I am guessing that my suspicions were completely wrong and that if I had a better graphics card it would look different.

I am just trying to figure out what you are saying:
If I created the skin on a 32-bit card, saved the BMP, went into HLMV and imported/saved the BMP with transparency, it would look the same to other players with a 32-bit card, but if I sent the same BMP to a guy with a 64-bit card, he would see it how it looked before the transparency?

If I bought a 64-bit card, would this fix the problem?
(I am planning on upgrading soon anyway, so this isn't that far away)

Trp. Jed
06-30-2004, 03:02 PM
64-bit has nothing to do with colour representation; I think you're getting confused with the width of the graphics card's data bus, which affects how much data the card can transfer at any one time.

A short bit of computer colour theory: computers, regardless of the graphics format, have to convert output data to RGB because that's how monitors (and LCD displays) make up their output: tiny dots of red, green and blue.

Basically, a byte can hold a value between 0 and 255, which is why in programs like Photoshop your red, green and blue values are in that range.

Including the 0, using one byte per colour channel gives you a maximum of 16,777,216 possible colours (256 x 256 x 256). Truth be told, the human eye can't actually distinguish that many, but that's a whole different part of colour theory.

If you take the 3 colour channels (RGB) and add up the 8 bits each takes, you get 24 bits. Now add an 8-bit greyscale alpha channel (giving you 256 degrees of transparency) and you get 32 bits. This is where the term "32-bit" colour comes from: 24-bit colour + 8-bit transparency.

Sometimes 32-bit is abbreviated as R8G8B8A8 or RGBA8, showing that each channel gets 8 bits. There is another reason for using 32 bits as well: it fits into what is called a DWORD, which in computer terms is 4 bytes and a nice number for the card to move around in memory. Plain 24-bit is an "odd" number and can be a bit slower, so often 24-bit colours have an alpha channel added to make them 32-bit.
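
As an illustration, here's a hypothetical C sketch of packing four 8-bit channels into one DWORD (the channel order here is arbitrary; real hardware formats vary):

    #include <stdio.h>
    #include <stdint.h>

    /* Pack four 8-bit channels into one 32-bit DWORD (order chosen arbitrarily). */
    static uint32_t pack_rgba8(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
    {
        return ((uint32_t)a << 24) | ((uint32_t)b << 16) | ((uint32_t)g << 8) | r;
    }

    int main(void)
    {
        printf("24-bit colours: %lu\n", 256UL * 256UL * 256UL);    /* 16777216 */
        printf("opaque orange:  0x%08X\n", pack_rgba8(255, 128, 0, 255));
        return 0;
    }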

The reason for 16-bit is that some older cards don't have enough graphics memory to hold all the textures they need at any one time. Dropping a texture to 16-bit halves the amount of graphics memory needed to hold it. It's also less physical data to move around, so if your application is dynamically swapping textures in and out of memory it takes half as long (in most cases; some hardware, like the PS2, actually handles 32-bit textures faster).
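
A quick back-of-the-envelope sketch of that saving, assuming a 256x256 texture (a size picked just for the sake of the example):

    #include <stdio.h>

    int main(void)
    {
        unsigned pixels = 256 * 256;                   /* one 256x256 texture */
        printf("32-bit: %u KB\n", pixels * 4 / 1024);  /* 256 KB              */
        printf("16-bit: %u KB\n", pixels * 2 / 1024);  /* 128 KB              */
        return 0;
    }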

Obviously 16-bit creates a problem because you can't fit 32 into 16; something has to give somewhere, so what usually happens is bits are "dropped" from each channel so that it fits in 16 bits. This is usually 5 bits per channel for RGB (although sometimes the green channel gets 6 bits), or even 4 bits per channel if an alpha channel is included.

In 4 bits you can only show 16 variations per channel, which is a significant drop from the 256 variations in 32-bit mode.
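
For illustration, here are two common 16-bit packings sketched in C (which one you get depends on the card, the driver and whether alpha has to fit; this is not any specific card's code):

    #include <stdio.h>
    #include <stdint.h>

    /* RGB565: 5 bits red, 6 bits green, 5 bits blue, no alpha.
     * RGBA4444: 4 bits per channel when alpha has to fit too. */
    static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b)
    {
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    static uint16_t pack4444(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
    {
        return (uint16_t)(((r >> 4) << 12) | ((g >> 4) << 8) | ((b >> 4) << 4) | (a >> 4));
    }

    int main(void)
    {
        printf("RGB565:   0x%04X\n", pack565(200, 100, 50));        /* 0xCB26 */
        printf("RGBA4444: 0x%04X\n", pack4444(200, 100, 50, 255));  /* 0xC63F */
        return 0;
    }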

This dropping of bits is usually done, as I said, in the graphics card itself or by the application's rasterizer, and sometimes different cards handle it differently.

At this point you've probably noticed: hang on, all the BMP textures in the models are 8-bit anyway, only 256 colours! This is a legacy thing from the days when 3D accelerator cards weren't that common and games had to handle their own 3D drawing; 8-bit is the most efficient for software mode in terms of transfer speed and colour depth. Rather than rewrite the MDL spec for 32-bit, HLMV and HL simply expand the 8-bit textures to 32-bit in hardware modes.
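
The expansion itself is simple: each 8-bit index is looked up in the 256-entry palette and written out as a 32-bit texel. A sketch of the idea (not the engine's actual code; in Half-Life, palette index 255 is treated as the transparent colour when a texture is flagged transparent):

    #include <stdio.h>
    #include <stdint.h>

    /* Expand 8-bit palettized pixels to 32-bit RGBA. Index 255 gets alpha 0
     * when the texture is flagged transparent. (Illustrative sketch only.)  */
    static void expand_to_rgba(const uint8_t *idx, const uint8_t pal[256][3],
                               uint32_t *out, int n, int transparent)
    {
        for (int i = 0; i < n; i++) {
            uint8_t a = (transparent && idx[i] == 255) ? 0 : 255;
            out[i] = (uint32_t)a << 24
                   | (uint32_t)pal[idx[i]][2] << 16   /* blue  */
                   | (uint32_t)pal[idx[i]][1] << 8    /* green */
                   | pal[idx[i]][0];                  /* red   */
        }
    }

    int main(void)
    {
        uint8_t pal[256][3] = {{0}};
        pal[0][0] = 255;                /* index 0: red                   */
        pal[255][2] = 255;              /* index 255: the usual pure blue */
        uint8_t pixels[2] = { 0, 255 };
        uint32_t out[2];
        expand_to_rgba(pixels, pal, out, 2, 1);
        printf("0x%08X 0x%08X\n", (unsigned)out[0], (unsigned)out[1]);
        return 0;                       /* second texel ends up with alpha 0 */
    }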

So there you go, that's the "full" explanation. If you don't want HLMV or Half-Life to make your textures all stripey, set your desktop to run in 32-bit colour mode, and the same for Half-Life. Be aware, though, that the increase in colour depth, especially in games, can sometimes slow things down; it varies from graphics card to graphics card. Some are optimised for 32-bit texture memory transfer and actually run slower in 16-bit mode (the PS2's GPU is like this).

BillyNair
06-30-2004, 04:10 PM
Thanks, I will try that, but I am at school now.

Still not sure why the colors change when I drop it to 16 colors or even 8 colors.
Is it because it is treated like 24-bit even when it really only has a few colors? Something to do with the fact that in order to place a transparent color in the palette it HAS to fill the other (14 * 16 - 1) colors to place something in the 256th spot?

I have found a solution: add a graphics editor to HLMV. Something even as lame as a color picker and a pencil would be cool.

Trp. Jed
06-30-2004, 05:12 PM
Well, as I explained above, the reduction in colours is because full RGB requires 24 bits, or 32 with alpha, and that just doesn't fit into 16 bits ;)

Hence the graphics card drops a bit of each colour channel to get it down to 16-bit, effectively reducing how many different variations of colour the texture has. It's this automatic reduction which makes the textures look banded or stripey.

There's pretty much nothing you can do. Even if you reduced the total number of colours in your original texture to just 16, HLMV and HL convert it into 32-bit before loading it into the graphics card's memory. If your desktop or game graphics mode is only 16-bit, it's the graphics card which decides how to reduce the colours down to 16-bit.

Even if you could make your original texture match what your graphics card would reduce the colours to, there is no guarantee that another brand of card (or even the same card with different drivers) would do the 32-bit to 16-bit conversion the same way.
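
For example (a hypothetical sketch, not any particular driver's code), truncating and rounding are both perfectly reasonable ways to cut 8 bits down to 5, yet they disagree on plenty of inputs:

    #include <stdio.h>
    #include <stdint.h>

    /* Two plausible driver strategies for reducing an 8-bit channel to 5 bits.
     * Both are sane; they just don't always agree on the result. */
    static uint8_t truncate5(uint8_t c) { return c >> 3; }
    static uint8_t round5(uint8_t c)
    {
        int v = (c + 4) >> 3;
        return (uint8_t)(v > 31 ? 31 : v);
    }

    int main(void)
    {
        int disagree = 0;
        for (int c = 0; c < 256; c++)
            if (truncate5((uint8_t)c) != round5((uint8_t)c))
                disagree++;
        printf("%d of 256 channel values convert differently\n", disagree);
        return 0;
    }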

It's a case of just having to live with it, really. If you want your textures to appear as you originally designed them, run in 32-bit mode, which is what the textures are (256-colour images are actually 24-bit; they just have a maximum of 256 unique colours).

BillyNair
07-02-2004, 10:56 PM
Jed, j00 r teh saxay ! !

I had my card set to Performance (force 32-bit to 16-bit). I set it to Quality (keep it 32-bit) and the skins are how they should look, just the same as they did before switching to transparency. Oh well. Thanks for letting me know what was going on, or I would still be working on a fix for the next few months.

Trp. Jed
07-02-2004, 11:05 PM
No problem. Sorry the answer was a bit verbose, but I felt it best to explain fully as there are a lot of myths surrounding HLMV and what it does or doesn't do to textures.

Day of Defeat Forum Archive created by Neil Jedrzejewski.

This is a partial archive of the old Day of Defeat forums originally hosted by Valve Software LLC.
Material has been archived for the purpose of creating a knowledge base from messages posted between 2003 and 2008.