Real Flash Quake

  • (Spike is giving you advice on what "good" or "perfect" would look like. It may be frustrating to you because your knowledge is a bit uneven, so you aren't a newbie or an expert or anything in between, but a combination, with knowledge gaps in some areas and not in others.)

    Anyway, texture filtering isn't a shader and should be available in your thingy, almost like flipping a switch I bet. It's basically a draw setting for a texture. You should have something like it available (i.e. this should be very easy).

    It just tells the 3D renderer which algorithm to use to draw a texture.

    The code below, for example, is from FitzQuake: it binds a certain texture and, based on flags, sets either "nearest" or "linear" as the draw filter.

    Where "linear" is what you want for 3D drawing, but nearest is what you want for 2D drawing (otherwise it can look a bit muddy).

    In FitzQuake it looks like this:

    Code:
    static void TexMgr_SetFilterModes (gltexture_t *glt)
    {
    	GL_Bind (glt); // <----- this sets a certain texture like wall_03 or whatever
    // Then this stuff down here .... sets the texture filter.
    	if (glt->flags & TEXPREF_NEAREST)
    	{
    		glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    		glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    	}
    	else if (glt->flags & TEXPREF_LINEAR)
    	{
    		glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    		glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    	}
    ...}
    Quakeone.com - Being exactly one-half good and one-half evil has advantages. When a portal opens to the antimatter universe, my opposite is just me with a goatee.

    So while you guys all have to fight your anti-matter counterparts, me and my evil twin will be drinking a beer laughing at you guys ...



    • You are never lying about knowledge gaps. I know spike knows his stuff.. absolute genius. Sometimes I feel like an employee being chastised though. That's just adding more stress to my project. I'm not some 3D genius. This shit is very complicated and I'm doing the best I can. I'm not gonna lie, sometimes, I just want to throw all of my computers in the garbage and go find some other life.

      These are all the filters available to me (doc tree). None of that looks like what I need.


      However, in my fragment shaders I have this. Well, that isn't actually IN my fragment shaders; it's just an example of frag shader code.
      ft0 = tex<2d,repeat,linear,nomip>( v0, fs0 );

      Shaders are confusing the hell out of me. Mostly because all the AGAL shader examples I can find are for "native" Adobe Stage3D and my API hides the associated stuff very well. Not to mention that all the examples I can find have nothing to do with what I am trying to accomplish. So, not only do I have to dig a giant hole in my API to figure out how to even connect this stuff, I have to transpose examples that have nothing to do with what I want to do.

      I could keep trying and trying to get shaders/filters whatever the hell I need to work but, the more time I spend on it in failure the more chance that I will eventually become completely fed up and scrap all of this for another method. I've been working on other things (cause let's face it there is tons to do) with the hope that as I move forward these mysteries will become more clear to me. It may be absolutely necessary to have linear filtering but, it isn't absolutely necessary today.

      Regarding the things Spike says - I have implemented damn near all of it. Even to the point of figuring out that even though I thought I was doing what he said at one point, I realized it was impossible that I was, and fixed it. Re: consolidating geometry by texture to a single set of buffers. I have even ditched that for interleaved buffers across the board (at his suggestion); all I have left there is plugging in the PVS. I did all of this and not even a week ago I had never even heard of interleaved buffers.

      Bitching about posting image code and missing linear filters is like ignoring ALL THIS OTHER SHIT I have accomplished. Do you guys realize that there is not one successful flash based BSP parser? Not one. Not one you can find with google, anyway. All of them fall apart pretty far back from where I am. And even the ones that render anything at all aren't doing it at 60fps, much less 60fps with the entire world rendered.

      You think my temporary faux lightmap filter sucks? Go look at my predecessors: they have pretty lightmaps and everything else sucks. My performance is second to none (regarding flash), even using huge expensive replacement textures. I'd be willing to bet that I have already completely eaten the original quake engine for breakfast, lunch, dinner and snacks, and I did it with a hell of a lot less code.

      CLAP DAMMIT! lol

      The one major flaw in my API that I have noticed is it greatly assumes that you will be using its dynamic lights, and in that it attempts to hide the very things I need to access. Shaders are buried in my API with classes built up around them offering me simple field access to accomplish most things. That is royally fucking me. I'm honestly so close to simply contacting Mr Fabrice Closier (Away3D author) personally and having him just straight up give me the answer. I reserve that kind of shit for stuck-beyond-belief and I'm basically there on this.
      Last edited by MadGypsy; 04-30-2016, 06:30 PM.
      http://www.nextgenquake.com



        • @gypsy Just wanna say dude, keep at it. There is a wealth of information here I'm absorbing. I'm sure others are too, but imma say it out loud. I can't join the conversation cause most of this doesn't make a lot of sense to me, but I'm scrapping together bits and pieces. Always something to take away from this.

          By the way, I haven't been around in a while...how ya been?
        'Replacement Player Models' Project



        • Originally posted by MadGypsy View Post
          You are never lying about knowledge gaps. I know spike knows his stuff.. absolute genius. Sometimes I feel like an employee being chastised though. That's just adding more stress to my project. I'm not some 3D genius. This shit is very complicated and I'm doing the best I can. I'm not gonna lie, sometimes, I just want to throw all of my computers in the garbage and go find some other life.
          I'm rather impressed how far you've gotten. I would have thought you would have given up a few times over by now.

          No one automatically knows things. Learning new things sucks, and takes some courage.

          Laughing stuff off usually helps.

          And for what it is worth, I've rarely seen someone who is literally figuring out so many different things all at the same time keep going this long.


          • @I can't join the conversation cause most of this doesn't make a lot of sense to me,

            Yeah, it don't make a lot of sense to me either. It takes more than that to deter me. Re: my tl;dr post last night..super powerful, and what-not.

            @How have I been

            Not so great, brother. Not so great, at all. I'm trying to just keep my chin up and eyes on the road, ya know. My chest is fuckin killing me. I can't ever sleep and I think I dumped my girlfriend. I'm not sure I care about that last part but, it doesn't indicate awesomeness. People keep trying to give me jobs and for some reason I can't bring myself to go to any of them. I wake up like I'm gonna go... and then I just don't go. I think I'm trying to completely sabotage myself if you want to know the brutal truth. Don't bother with the sympathy though. It takes a hell of a lot more than some depression to take me down. I know what's wrong and I'm gonna predict that within a week I will obtain a vicious fire in me and fix everything in one fuckin day. I'll go from bah-humbug to TRY ME WORLD!


            • @baker - those are some seriously uplifting words, my friend. My drive comes from the fact that in the last 5 years I have had numerous ideas and there always comes a point where I completely lose interest. You could probably name most of these things. It didn't use to be like that. I used to have this major fire to accomplish my ideas against all odds. I decided that if the fire is no longer in me then I will switch to sheer stubborn determination (which I have plenty of). A 5 year streak with no accomplishments is complete bullshit. I didn't learn 50 gabillion languages just to have some chance of understanding anything spike says (LMAO). I also didn't practice, practice, practice and study myself dizzy just to impress people with my potential. I'm actually a very talented programmer and I am going to do something with that. I'd like to add that my code images do not, at all, express my level of programming skills. I chose my current structure because I am doing this by the seat of my pants. It's easier to manage it the way I am doing it. Believe me when I say I could rewrite all of this to utilize design patterns: programming to interfaces instead of implementations and utilizing tried and true patterns for every bit of this. That's too spread out for this point. It's good to know everything you intend to do before you start writing flow charts and deriving patterns from them..right?

              @Laughing stuff off usually helps.

              I try. It's not always funny. I'm not at a point right now where laughing is easy. I'm struggling with staying civil. All of you know that I'm not meek or humble. I'm quite the opposite even if it includes shooting myself in the foot. I'll just use the other foot. Like when Spike gave me that BSP2 map, could you imagine how incredibly pissed I would have been if I wasted hours trying to fix code that isn't broken? Luckily, I'm not stupid.
              Last edited by MadGypsy; 04-30-2016, 07:35 PM.


              • bloom, blur, depth of field, etc are post processing effects. that is NOT texture filtering. think about it, wtf does it have to do with what is described here: https://en.wikipedia.org/wiki/Texture_filtering

                in opengl, texture filtering is a part of the texture itself. in every other api the sampler state (including texture filtering) is a separate 'sampler' state object.
                you have the minification filter (what happens if your texture is far too high res), the magnification filter (what happens when your texture is really low res), the mip filter (what happens when moving from one mipmap level to the next), and the (max) anisotropic level (higher values give nicer/pricier sampling on steeply angled surfaces).
                other bits of sampler state include wrapping/clamping/mirroring, border colours, lod clamps/biases. there's also hardware PCF compares too, of course.
                if you have linear mip+min+mag filters all enabled then you have trilinear filtering. yay.

                considering that you do have repeat/linear/nomip as options inside your fragment shader example, then that's probably where you need to look to enable linear filtering properly. that stuff isn't native to any real 3d api, so your flash stuff must be translating it and probably generating the sampler state accordingly for it (specifying immutable sampler state as part of the shaders themselves means the gpu can pick more appropriate opcodes to sample from it).

                in opengl, selecting linear/nearest/etc is trivially done by the calls baker gave you. in d3d9 they're done by trivially updating the samplerstate for each texture+draw (because more state is fun..). in d3d11+vulkan they're specified via a struct when creating a sampler object, and samplers MUST be enabled.
                In every single 3d API I know of, it is a single line change to change the mag filter to linear (ignoring switching it off again).
                If you think that it's annoying for me+baker to keep going on about this stuff, then think how annoying it is for us to have to STILL point out that you've still not fixed such a trivial thing, and how annoying it is to see you go and try to work around it in what appears to be a really painful waste of your time, instead of a 1-line change somewhere.

                I never said to ditch interleaved vertex buffers. by all means merge multiple batches into a single one. just don't split a batch into multiple vertex buffers because of it.

                regarding shaders, you can find fte's water (surface) warp shader for d3d9 here (looks like your api is built around d3d9's flaws, otherwise I would have linked the glsl stuff):
                https://sourceforge.net/p/fteqw/code...faultwarp.hlsl
                I'm far too lazy to translate it to whatever the hell the api you're using needs; figure it out yourself, it's just some fairly simple maths so it shouldn't be hard. and fix your sampler state while you're at it.
                Some Game Thing





                  • Ok, I at least got it to load. That's a start, but it's all liney and shitty. My output is tracing the frag shader for all the textures; <2d,linear,clamp> is for lightmaps. I tried <2d,linear,wrap> and that didn't work either. As far as I know, all the other options are nearest or mip related.

                    My lightmap "painter" code is set to res 1x (i.e. no resizing) but I'm gonna strip it out and go for the direct approach. I seriously doubt this is going to make a difference.




                  • PS> I knew this wasn't gonna work but, I thought it was pretty funny.

                    In this version (my BSPUtility) I'm not using real images for the lightmaps. I'm just building bitmapData on the fly. I don't know if or why that would make a difference but that doesn't mean it doesn't. I had a problem a while back with ATF textures, where lightmaps wouldn't display because the material they were attached to had mips but the lightmap didn't, and ATF didn't like that at all. I know why now. LightMapMethod basically just copies the shader from the texture it's attached to, and since the lightmap ATFs didn't have mips but they were being told to use them, it was breaking. I fixed that, well, in my head I fixed it but, it really will be fixed. I'm curious if I create some dxt5 or dxt1 ATFs and then put the dxt5/1 format in the shader script <2d,linear,dxt5,clamp> if it would make any difference at all. We're about to find out.

                    The reason why I am curious is twofold. 1) Right now, since it's just bitmapData, it's not injecting any format. I hard-typed rgba cause it's the closest and it made no difference. Plus rgba is still an ATF format. 2) I don't know what I'm talking about but, something about ATF textures is optimized, or maybe designed, for the GPU. Maybe that will help me in some way.

                    This is gonna take me about a half hour cause I have to rewrite script that I thought would never work due to the mip/atf thing I just mentioned and then I have to spend some time doing conversions. I have officially modified my API to make linear filtering of lightmaps happen and I think I would rather just clone the class and use the clone instead. A bunch of trivial shit but, it's time consuming. Even if this doesn't do shit, at least I can go back to ATF lightmaps instead of slow and expensive pngs.
                    Last edited by MadGypsy; 05-01-2016, 12:08 AM.


                    • I'm waiting for a bunch of conversions so I decided to think out loud about this a little more.

                      My best guess is that because the lightmap is atlased, linear filtering is somehow dragging surrounding lightmap edges into the mix. This is just a guess but let's just assume I am correct. If so, what can be done to mitigate this? I have a couple of ideas.

                      1) I could draw a border of a couple of pixels around each lightmap, where the border pixels are the exact color of the edge pixel they touch. Then simply move the UVs to the new proper spot. Hopefully, any "bleed over" (if that is the case) will disappear.

                      2) This is sort of the exact same idea but a little easier albeit not necessarily effective. Paint the entire lightmap atlas white, gap the lightmaps as they are being drawn, adjust the uvs for the gap and hope that would be good enough.

                      There are ways for me to make a small tester map that could give me an indication if I am on the right track. If I am correct then my faux lightmap filter wasn't such a waste of time as currently believed cause, I will have to use a version of it to border these lightmaps. Another test I could run would be to remove the atlas for a moment and spit the lightmaps out ...waaaaaaaait.hmmmm I'll be back.
                      Last edited by MadGypsy; 05-01-2016, 12:31 AM.


                      • see... linear filtering before rewriting atlasing 50 times.
                        grats for finally getting that working though.
                        regarding world textures, high-res ones look stupid with nearest filtering, but nearest is fine on low-res content (low res stuff should still use 'miplinear' though).

                        compressed formats should generally be a property of the texture (and if your api uses it, imageview) and not part of the sampler state. iiuc, your ATF textures contain multiple different formats of compressed texture (the whackamole solution), in order to allow them to work on both desktop and mobile. thus your dxt1/5 option would only apply when you create the file itself rather than when you load it.

                        regarding the bleeding, make sure you're rounding the lightmap coords outwards, with both floor AND ceil. presumably your resampling code is using floor for both and seeing that the texture is only a set number of samples across and then realigning everything.
                        even quake's software renderer used linear filtering of a sort. round outwards without realigning or other weird reformatting and you should get it fixed.
                        remember that the lightmap is aligned to the rounded positions rather than the actual extents of the geometry (this was to keep the 1:16 ratio simple in the software render without it being misaligned). violating that will of course result in the lightmap being nudged over by half a sample or whatever.
                        Last edited by Spike; 05-01-2016, 12:31 AM.


                        • Earlier

                          Right Now


                          Look how much lighter the lines are. I believe I have the answer. I uncommented my old calc_lightmap function and bumped the resolution up to 4x. This means that the lightmap UVs fall further apart and smudging (as I'm going to call it) has more room to happen between the UVs. This means that I am probably right about this smudging thing. It's not the only possibility I suppose but, I have to start somewhere. Since it's the easiest idea out of my 2 above, I'm gonna gap the lightmaps WITH PINK...see what I'm saying?


                          • @no sympathy

                            no worries. I'm not one for sympathy either. You'll get it figured out dude, you got the right mindset for it.

                            anyways, I don't wanna derail this thread, but it's damn good to see ya again bro. your efforts here are really inspiring me to learn some more of this type of shit.


                            • @dutch - you aren't derailing anything. I spam the crap out of this. You'll be 50 pages down by next week.

                              @learn more

                              I'll throw this out there. I have a source on GitHub. You are more than welcome to join as a contributor and I'll let you branch away. I can show you how to get set up PDQ with FlashDevelop. The good thing about doing it this way is you will always have access to my latest code and it doesn't matter if your branch doesn't work at all. You can play with it all you want and I can be your "spike" while you learn AS3. I know everything, even the things I don't know cause I have been programming in this language since it was invented, and both of the versions before it when they were invented. I have literally been programming flash since the very beginning of it including a language. 16 years. You could stick any API ever invented for flash in front of me and I will figure it out, docs or not. The good thing is, AS3's docs are excellent! Away3D on the other hand has bullshit docs.

                              The reason why I would let you branch even if it's broke is cause I can fix whatever you do without us having a constant flow of emails, link sharing and all that crap. I don't give a shit if my github is "pro"

                              look at the difference
                              AS3 API Reference for File Class
                              Away3d API Reference for TextureMaterial

                              Adobe goes out of their way to explain the crap out of everything. Away3D gives you barely a sentence to go on. Scroll the entire page for both of those links to really see how in the dark I am with Away3D docs. There might as well not even be docs cause reading that hasn't helped me one single time.

                              e: Meh, I guess I could have picked an away3d class with a little more to it for better illustration. Just imagine more of nothing and consider that most classes don't primarily consist of obvious stuff like alpha.
                              Last edited by MadGypsy; 05-01-2016, 03:34 PM.


                              • Alright, apparently I'm flying solo on that floor/ceil question so I dreamed about it for a bit. The easiest way that I can figure is to simply scale the UVs but, determining by how much has me stumped.

                                Oh wait, fuck...can't round UV coordinates lol. They're all decimals. Round up and down? You're going to have to be a little clearer than "floor/ceil coords". What coordinates? If I floor or ceil any uvs I'm just going to end up with a bunch of 1 and 0.

                                e: My best guess is that you are saying I need to floor/ceil somewhere in here. Could you tell me what I need to do to this (if you even mean this part).

                                Code:
                                var len:int, name:String;
                                for (name in _tindex)
                                {	atlasLightmaps(name);			//build the lightmap atlas for this texture batch
                                	len = Math.sqrt(_lmAtlas.length);	//cheater
                                	e = 0; do 
                                	{	n = _tindex[name][e];		//face index for this texture
                                	
                                		j = 0; do 
                                		{	face_t[n].lmuvs[j]    = (face_t[n].lmuvs[j] * face_t[n].s/len)+(face_t[n].light_s/len);
                                			face_t[n].lmuvs[++j]  = (face_t[n].lmuvs[j] * face_t[n].t/len)+(face_t[n].light_t/len);
                                		} while (face_t[n].lmuvs.length - (++j));
                                	} while (_tindex[name].length - (++e));	
                                	
                                	j= face_t[_tindex[name][0]].texid;	//cheater
                                	if( name.indexOf("*") == -1 ) tmmips[j].addMethod(new LightMapMethod(Cast.bitmapTexture($ltmaps[name]), BlendMode.MULTIPLY, true));
                                }
                                Last edited by MadGypsy; 05-01-2016, 11:56 AM.
