WorldSpawn official WIP thread

  • Originally posted by MadGypsy View Post
    I had an idea and it works very well but, it treats the palette differently and isn't useful for quake stuff.
    Heh, once again you posted that while I was typing my comment... Yeah, the result is much better. As for being useful for Quake stuff, you're not making a Quake engine anyway, and if some game dev really wants to use Quake assets with Worldspawn, as you said it's not perfect but close enough.
    ♪ I'm skiiiiiiinnin' in the pain, just skiiiiiiinnin' in the pain ♪
    ♪ What a glorious feelin' I'm haaaaaaappy again ♪



    • @bfg

      The way I did it in that post is actually useless period.

      It would take 3 stored numbers to tell it the resulting 3 numbers... lol. At that point you might as well just save it as a PNG.

      Also, it's using its own color value as an index for another color... per channel.

      I mostly just wanted to check out the paletteMap feature of BitmapData. It's completely useless for wads.
      http://www.nextgenquake.com



      • Oh, OK. You said "not useful for Quake stuff", so I thought you found a way to render non-Quake colors more accurately.

        What's with that forest pic, exactly? As I understood it, your engine can generate a palette directly from textures, right? Did you generate a palette from this pic and then use it to convert the pic to 8-bit, or was it just processed through the Quake palette?



        • @your engine can generate a palette directly from textures, right?

          no.

          @or was it just processed through the Quake palette?

          yep. All images were processed through the Quake palette except for the couple-three that were processed through the paletteMap function.
          http://www.nextgenquake.com



          • drawn from the palette and mipmapped


            I tried numerous ways to make mipmaps. Each one I tried is better in some ways and worse in others. Consider these examples.

            way 1: simply skip 2, 4 or 8 pixels in width/height (respectively). This way isn't terrible but there are 2 things to consider. Doing it this way skips the right and bottom edge of the image. Also, if the image isn't a power of 2 the final mip will be jacked up.
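            Way 1 in a quick Python sketch (the flat-list image representation and the function name are mine, not the actual engine code):

```python
def mip_by_skipping(pixels, width, height, miplevel):
    """Downscale by keeping every (1 << miplevel)-th pixel.

    `pixels` is a flat, row-major list of palette indexes.
    Fast, but it drops the right/bottom edges and misbehaves on
    non-power-of-two sizes, as noted above.
    """
    step = 1 << miplevel           # 2, 4 or 8 for mip levels 1-3
    out = []
    for y in range(0, height, step):
        for x in range(0, width, step):
            out.append(pixels[y * width + x])
    return out

# 4x4 image reduced to 2x2: keeps the pixels at (0,0), (2,0), (0,2), (2,2)
img = list(range(16))
print(mip_by_skipping(img, 4, 4, 1))  # -> [0, 2, 8, 10]
```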

            way 2: scale the original image by 1 / (1 << miplevel) and draw it to a new bitmap. Then go over all the pixels of the scaled-down version and derive the palette index for each color. Going over every pixel this way is a little faster than a completely new conversion because all the values are already known. While you don't lose any edges this way, you do lose some quality.
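            Way 2, roughly, in Python (an illustrative sketch, not the engine's AS3: a box-average plus a nearest-palette-entry snap stand in for the scale-and-reconvert step, and `pixels`/`palette` hold (r, g, b) tuples):

```python
def mip_by_scaling(pixels, width, height, miplevel, palette):
    """Average each (step x step) block, then snap the average to the
    closest palette color. No edges are lost, but averaging softens
    detail, which is the quality loss mentioned above."""
    step = 1 << miplevel
    out = []
    for by in range(0, height, step):
        for bx in range(0, width, step):
            block = [pixels[y * width + x]
                     for y in range(by, min(by + step, height))
                     for x in range(bx, min(bx + step, width))]
            n = len(block)
            avg = tuple(sum(c[i] for c in block) // n for i in range(3))
            # nearest palette entry by summed channel difference
            out.append(min(range(len(palette)),
                           key=lambda p: sum(abs(palette[p][i] - avg[i])
                                             for i in range(3))))
    return out

# a 2x2 light-grey image at mip level 1 snaps to white in a 2-color palette
pal = [(0, 0, 0), (255, 255, 255)]
img = [(200, 200, 200)] * 4
print(mip_by_scaling(img, 2, 2, 1, pal))  # -> [1]
```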

            There are 2 more ways but, I don't feel like explaining them.
            http://www.nextgenquake.com



            • Using the skip pixel method, these are the results I get. I blew the images up 4x so it's easier to see.



              I didn't like that so, I wrote a new way that utilizes a scaling method and a reverse look-up dictionary. Make both of my images full size and compare the smaller mips. I feel like my second method is very good.

              http://www.nextgenquake.com



              • However, both methods have their place. Here is another skip pixel render. The mips look sharp.



                The scale method, in this case, makes it seem blurry. It's mostly the midground leaves. Above, they look almost 3D in comparison to below.



                If this isn't clear... all of this stuff is being stored, ready-to-go, as a wad entry. I've unified all wad-related stuff (wad_in/user_in/out/etc) to valid wad entries in the library. I don't just read/write wads. All of this is a system on top of reading/writing wads that actually uses wad entries as images. Just like opening a png or a jpg.
                Last edited by MadGypsy; 11-05-2016, 08:04 PM.
                http://www.nextgenquake.com



                • even cleaner
                  Code:
                  /** MATCH_IMG32_PIXELS_TO_PALETTE
                   * 		iterate over every pixel of the original image, comparing it to palette entries, until the closest possible entry is found
                   * 		stores the palette indexes in bytes and returns the bytes
                   * @param	bmd		- the img32 data to convert
                   * @param	lof		- level of forgiveness - higher numbers result in less accurate color matches
                   * @param	unfind		- true|false start a fresh reverse look-up dictionary
                   * @return	8 bit bytestream of palette indexes
                   */
                  public static function matchToPalette(bmd:BitmapData, lof:int = 0, unfind:Boolean = false):ByteArray
                  {	
                  	//determine if we should (re)create the dictionary of matched colors
                  	if ((unfind) || (matched == null))
                  		matched = reverseLookUp;
                  	
                  	//to store the current pixel value of the original image 
                  	var color:uint;
                  	
                  	//for storing the individual channels of the image pixel value and the palette value
                  	var img_r:uint, pal_r:uint;
                  	var img_g:uint, pal_g:uint;
                  	var img_b:uint, pal_b:uint;
                  	
                  	//positioning vars
                  	var diff:int, last_diff:int;
                  	var pnum:int, index:int = 0;
                  	
                  	//a container for stored indexes 
                  	var mip:ByteArray = new ByteArray();
                  	
                  	//prime loop iteration
                  	var size:int = bmd.width * bmd.height;
                  	var pixel:int = -1;
                  	
                  	//loop compare pixel data
                  	while((size - (++pixel)) > 0)
                  	{
                  		//store original image color
                  		color = bmd.getPixel32(pixel%bmd.width, Math.floor(pixel/bmd.width));
                  		
                  		//break color down into channels
                  		img_r = (color >> 16)& 0xFF;
                  		img_g = (color >> 8) & 0xFF;
                  		img_b = (color)	     & 0xFF;
                  		
                  		//prime offset to absolute max
                  		last_diff = 255*3;
                  		
                  		//if the color has already been found use it
                  		if (color in matched)
                  			mip.writeByte(matched[color]);
                  		else 
                  		{
                  			//loop through the palette comparing and storing the index for the least wrong color and sometimes even the right one
                  			for (pnum = 0; pnum < Palette.hexARGB.length; ++pnum)
                  			{	
                  				//break palette color down to channels
                  				pal_r = (Palette.hexARGB[pnum] >> 16)& 0xFF;
                  				pal_g = (Palette.hexARGB[pnum] >> 8) & 0xFF;
                  				pal_b = (Palette.hexARGB[pnum])	     & 0xFF;
                  				
                  				//store the current difference
                  				diff = Math.abs(pal_r - img_r) + Math.abs(pal_g - img_g) + Math.abs(pal_b - img_b);
                  				
                  				//if this diff is smaller, overwrite last_dif and store palette index 
                  				if (diff < last_diff)
                  				{	last_diff = diff;
                  					index = pnum;
                  				}
                  				
                  				//if the diff is less than the level of forgiveness, quit searching.
                  				if (diff <= lof) break;
                  			}
                  			
                  			mip.writeByte(index);		//write the alleged best palette index
                  			matched[color] = index;		//set reverse look-up for this color
                  		}
                  	}
                  	
                  	//prime position and return
                  	mip.position = 0;
                  	return mip;
                  }
                  This is almost the same as the last code post for converting an image to the palette but, I trimmed it down even more by considering the math differently.

                  Instead of 2 while loops navigating the x and y properties, I thought about it and was able to make it one while loop, with x and y being derived with clever math as opposed to being treated as iterators. This meant I could dump x and y entirely. Less vars, less loops... more comments.

                  x = currentPositionAsLength%width
                  y = Math.floor(currentPositionAsLength/width);
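                  The same derivation in Python (integer division stands in for Math.floor; the function name is mine):

```python
def xy_from_index(pixel, width):
    """Map a linear, row-major pixel index to (x, y) coordinates."""
    return pixel % width, pixel // width

# index 9 in a 4-pixel-wide image sits at column 1, row 2
print(xy_from_index(9, 4))  # -> (1, 2)
```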


                  I have it down to under 3 seconds to convert a 256x256 image to a wad image with all 3 miptexes. I don't know if 3 seconds is a long time but, it's a lot shorter than my original 38 seconds (for the same image). Each image will take a slightly different time because it really depends on how many colors are in the image. More colors = more searching and therefore a longer wait.

                  My reverse look-up system works really well but, I would like to shave 1 more second off the time. I'm thinking it's impossible but, if it isn't... I'll make it happen. I'm thinking the only way to make this faster would be to skip ahead somehow to the quadrant of the palette that most closely represents the color I am matching, thereby reducing the color look-up iterations. The only problem is... how much extra processing would I need to do to eliminate the loop processes?

                  I'm seriously splitting hairs here but, I am addicted to making things as fast as they can be. To have Flash doing all this matching/drawing in under 3 seconds is really good but, half that would be amazing.
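                  For what it's worth, one hypothetical shape for that "skip ahead to a quadrant of the palette" idea is a coarse bucket pre-filter, sketched here in Python (all names are mine; the catch is that the bucket winner isn't guaranteed to be the true global nearest entry, which is exactly the extra-processing trade-off being weighed):

```python
def build_buckets(palette, bits=2):
    """Index palette entries by the top `bits` of each channel so a
    lookup only scans entries in the same coarse color region."""
    shift = 8 - bits
    buckets = {}
    for i, (r, g, b) in enumerate(palette):
        buckets.setdefault((r >> shift, g >> shift, b >> shift), []).append(i)
    return buckets

def match_bucketed(color, palette, buckets, bits=2):
    """Scan only the color's coarse region; fall back to a full scan
    when that region holds no palette entries at all."""
    shift = 8 - bits
    r, g, b = color
    candidates = (buckets.get((r >> shift, g >> shift, b >> shift))
                  or range(len(palette)))
    return min(candidates,
               key=lambda i: sum(abs(palette[i][j] - color[j])
                                 for j in range(3)))

pal = [(0, 0, 0), (250, 10, 10), (10, 250, 10)]
b = build_buckets(pal)
print(match_bucketed((240, 20, 20), pal, b))  # -> 1 (only scans the red bucket)
```

                  The saving is real (fewer comparisons per unmatched color), but near bucket borders the answer can differ from a full scan, so it trades exactness for speed.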
                  Last edited by MadGypsy; 11-06-2016, 11:00 AM.
                  http://www.nextgenquake.com



                  • I got it down to 2.224 seconds (bottom left corner of app window). That's a shave of 0.4ish. I ran numerous tests before and after my little change I made. The shave is consistently in the 0.4 area. In other words, this isn't just the one lucky quick convert due to a lack of system farts, or something.


                    I'm gonna get it even faster.
                    http://www.nextgenquake.com



                    • I've had electricians at my house all day. They finally left. It was practically impossible to get any programming done with all the noise they made. One guy was telling me all about his life while I was very obviously ignoring him and trying to type. I got annoyed and snapped "Do I look like a fucking psychiatrist?...no? But I bet I do look like a guy that doesn't give a fuck about your life."

                      I felt bad after I said it but, I hate completely oblivious people. Anyway, I can finally sit down in peace and move my work forward. :cracks knuckles: :brews coffee:
                      http://www.nextgenquake.com



                      • player

                        I stripped the stream player out of my engine. I'm just sharing it cause I can. The included READ_ME tells you everything you need to know. Really the only thing I added was controls to move/minimize/close the app window. The player is otherwise identical to the one in my engine.

                        -----

                        edit:

                        Hah! I didn't do this on purpose, it just turned out to be true. See if you can find my player in the below image. It is fully visible and "in your face" even.


                        ~ Don't waste your life on this. The absolute best you could possibly do is guess. There are seriously no hints ~
                        click here for answer
                        Last edited by MadGypsy; 11-06-2016, 06:16 PM.
                        http://www.nextgenquake.com



                        • what's done is done, to keep rambling about the possibilities of what might have been is something totally useless (and even harmful)

                          Kind regards
                          the invasion has begun! hide your children, grab the guns, and pack sandwiches.

                          syluxman2803



                          • I advanced my image-to-palette matching. This is attempting to match as close as possible (in offset) even leaning toward the dominant color channel. The "forgiveness" here is 0.


                            By setting the forgiveness to 24 we can sort of fix the "purple problem" in the image. I'm not saying this is awesome or anything but, this seems like it's actually a thing. It "works" on more than just this one image.
                            http://www.nextgenquake.com



                            • Originally posted by MadGypsy View Post
                              @your engine can generate a palette directly from textures, right?

                              no.
                              OK, I misunderstood then. I thought I remembered you talking about something like deriving a palette from a texture.

                              All images were processed through the quake palette
                              Then like I said, using a more balanced palette instead would probably yield better results with the greens.

                              However, both methods have their place.
                              Yeah, funny how they behave differently depending on the texture... If it was possible to determine which method gives the best result per image and select it automatically, it would solve the issue.

                              That said, you've come to quite amazing results in little time.



                              • I'm about to make 2 simple "real-time" controls.

                                One control will be very simple: a text box that lets you assign the level of forgiveness. This number creates a range that the matched color can be off by. In other words, if the level of forgiveness is 0, the palette will be searched until it finds the exact color or runs out of palette indexes to check. The method still stores the color with the least amount of difference, and that becomes the fallback if an absolute match wasn't made. By setting the LOF higher, you're telling the function to stop searching as soon as its difference falls within the LOF.
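                                The forgiveness cutoff, reduced to a Python sketch (illustrative, not the actual AS3; it mirrors the search loop in the code posts above in spirit):

```python
def match_with_forgiveness(color, palette, lof=0):
    """Find the palette index with the smallest summed channel
    difference, quitting early once a candidate is within `lof`.
    With lof=0 only an exact match can end the search early."""
    best_index, best_diff = 0, 255 * 3   # prime to the worst case
    for i, (r, g, b) in enumerate(palette):
        diff = abs(r - color[0]) + abs(g - color[1]) + abs(b - color[2])
        if diff < best_diff:             # track the least-wrong entry
            best_diff, best_index = diff, i
        if diff <= lof:                  # close enough: stop searching
            break
    return best_index

pal = [(10, 10, 10), (100, 100, 100), (11, 10, 10)]
print(match_with_forgiveness((11, 10, 10), pal, lof=0))  # -> 2 (exact match)
print(match_with_forgiveness((11, 10, 10), pal, lof=5))  # -> 0 (close enough)
```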

                                I added another setting last night which will be the second control... when I was making my film noir setting for images in my engine I had to consider channel weights. The formula was red*.30, green*.59, blue*.11. What this did was balance (weight) the greyscale representation of the color. I was curious how that formula would affect palette matches, by applying those multipliers to each channel offset before adding up the overall difference. I got some very good results so, I decided to externalize the multipliers and see what I would get by playing with those values and various LOF values. The results made sense, and between the 2 settings you can manage the importance of each channel and how accurately to match it.
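                                The weighted difference, as a Python sketch (the weights are the ones from the post; the function name is mine):

```python
# Luma-style channel weights from the post: red .30, green .59, blue .11
WEIGHTS = (0.30, 0.59, 0.11)

def weighted_diff(img_rgb, pal_rgb, weights=WEIGHTS):
    """Sum of per-channel absolute offsets, each scaled by its weight.
    Heavily weighted channels contribute more to the total, so the
    match leans toward preserving them."""
    return sum(w * abs(a - b)
               for w, a, b in zip(weights, img_rgb, pal_rgb))

# a pure-green error counts for far more than an equal pure-blue error
print(round(weighted_diff((0, 100, 0), (0, 0, 0)), 4))  # -> 59.0
print(round(weighted_diff((0, 0, 100), (0, 0, 0)), 4))  # -> 11.0
```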

                                So...the second control will be a bar with 2 sliders. From the left of the bar to the first slider will represent the red multiplier. From the first slider to the second will represent the green multiplier and from the second slider to the right edge of the bar will represent the blue multiplier.
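                                One plausible mapping from the two slider positions to the three multipliers, sketched in Python (the function, positions, and bar width are mine; the real control is a UI widget):

```python
def sliders_to_multipliers(s1, s2, bar_width=100):
    """Split one bar into three spans via two slider positions:
    [0..s1] = red, [s1..s2] = green, [s2..bar_width] = blue.
    Each multiplier is its span's share of the bar."""
    assert 0 <= s1 <= s2 <= bar_width
    return (s1 / bar_width,
            (s2 - s1) / bar_width,
            (bar_width - s2) / bar_width)

# sliders at 30 and 89 reproduce the .30/.59/.11 greyscale weights
print(sliders_to_multipliers(30, 89))  # -> (0.3, 0.59, 0.11)
```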

                                Between just the 2 controls I mentioned you can get a whole lot of very different results and many of the possible results are usable depending on what you are going for. Through numerous tests last night I got results anywhere from very close to original TO corrected from original TO "style" from original ...and all 3 versions were acceptable or better for being a used texture.

                                here are the changes I made to my palette match loop.
                                Code:
                                //if the color has already been found use it
                                if (color in matched)
                                	mip.writeByte(matched[color]);
                                else 
                                {	
                                	//loop through the palette comparing and storing the index for the least wrong color and sometimes even the right one
                                	for (pnum = 0; pnum < Palette.hexARGB.length; ++pnum)
                                	{	
                                		//break palette color down to channels
                                		pal_r = (Palette.hexARGB[pnum] >> 16)& 0xFF;
                                		pal_g = (Palette.hexARGB[pnum] >> 8) & 0xFF;
                                		pal_b = (Palette.hexARGB[pnum])	     & 0xFF;
                                		
                                		//find the dominant channel
                                		fav_r = ((img_r > img_g) && (img_r > img_b));
                                		fav_g = ((img_g > img_b) && (img_g > img_r));
                                 		fav_b = ((img_b > img_r) && (img_b > img_g));
                                		
                                		ofs_r = pal_r - img_r;	//negative = needs more of this channel
                                		ofs_g = pal_g - img_g;	//	"			"			"
                                		ofs_b = pal_b - img_b;	//	"			"			"
                                		
                                		//if the dominant channel's offset is less than lof, don't store this color regardless of final offset
                                		skip = false;
                                		if (fav_r && (ofs_r < -lof)) skip = true;
                                		if (fav_g && (ofs_g < -lof)) skip = true;
                                		if (fav_b && (ofs_b < -lof)) skip = true;
                                		
                                		//set weight for each channel
                                		norm_r = Math.abs(ofs_r) * multiplier.red;
                                		norm_g = Math.abs(ofs_g) * multiplier.green;
                                		norm_b = Math.abs(ofs_b) * multiplier.blue;
                                		
                                		//normalize offset
                                		diff = norm_r + norm_g + norm_b;
                                		
                                		//if this diff is smaller, overwrite last_dif and store palette index 
                                		if ((diff < last_diff)  && !skip)
                                		{	last_diff = diff;
                                			index = pnum;
                                		}
                                		
                                		//if the diff is less than the level of forgiveness, quit searching.
                                		if (diff <= lof) break;
                                	}
                                	
                                	mip.writeByte(index);		//write the alleged best palette index
                                	matched[color] = index;		//set reverse look-up for this color
                                }
                                Last edited by MadGypsy; 11-07-2016, 12:26 PM.
                                http://www.nextgenquake.com

