  • ChuckMcM2y

    My first video accelerator was the Nvidia NV-1 because a friend of mine was on the design team and he assured me that NURBs were going to be the dominant rendering model since you could do a sphere with just 6 of them, whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight fisted with development details and all their "secret sauce" none of my programs ever worked on it.

    Then I bought a 3DFx Voodoo card and started using Glide and it was night and day. I had something up the first day and every day thereafter it seemed to get more and more capable. That was a lot of fun.

    In my opinion, Direct X was what killed it most. OpenGL was well supported on the Voodoo cards and Microsoft was determined to kill anyone using OpenGL (which they didn't control) to program games if they could. After about 5 years (Direct X 7 or 8) it had reached feature parity but long before that the "co marketing" dollars Microsoft used to enforce their monopoly had done most of the work.

    Sigh.

    • flohofwoe2y

      Microsoft pushing D3D was a good thing: OpenGL drivers were an even bigger mess back then than today, and drivers for popular 3D accelerators only implemented the 'happy path' needed for running GLQuake; the rest of the API was either very slow or sloppily implemented.

      D3D was a terribly designed API in the beginning, but it caught up fast and starting at around DX7 was the objectively better API, and Microsoft forced GPU vendors to actually provide conforming and performant drivers.

      • ChuckMcM2y

        I see it a bit differently, but there is a lesson in here.

        Microsoft pushed D3D to support their own self-interest (which is totally an expected/okay thing for them to do), but the way they evolved it made it both Windows-only and ultimately incredibly complex: a lot of underlying GPU design leaks through the API into user code (or it did; I haven't written D3D code since DX10).

        The lesson though, is that APIs "succeed", no matter what the quality, based on how many engineers are invested in having them succeed. Microsoft created a system whereby not only could a GPU vendor create a new feature in their GPU, they could get Microsoft to make it part of the "standard" (see the discussion of the GeForce drivers elsewhere), and that incentivized the manufacturers both to continue writing drivers for Microsoft's standard and to push developers to use that standard, which keeps their product in demand.

        This is an old lesson (think Rail Gauge standards as a means of preferentially making one company's locomotives the "right" one to buy) and we see it repeated often. One of the places "Open Source" could make a huge impact on the world would be in "standards." It isn't quite there yet but I can see inklings of people who are coming around to that point of view.

        • deafpolygon2y

          Should they have evolved it to be cross-platform? What about Apple's Metal API? I don't get why people expect Microsoft to do things that would benefit their competitors.

          > The lesson though, is that APIs "succeed", no matter what the quality, based on how many engineers are invested in having them succeed.

          Exactly. Microsoft was willing to make things work for them. Something other vendors wouldn't do (including those who are ostensibly "open source").

      • nekoashide2y

        It wasn't just graphics, it was audio as well. People have it nice now but back then you were still fighting audio driver issues. AC'97 support made that situation livable but it took forever for everyone to support it.

        • rodgerd2y

          My opinion on this one, for games authors anyway, was changed by reading an early 2000s piece by someone rebutting a lot of the noise Carmack was making on the topic, focusing on exactly this point: by DirectX 6, you got an API that was a suite which gave you, yes, the graphics, but also the sound, the input handling, media streaming for cutscenes and so on. OpenGL vs Direct3D was a sideshow at that point for most developers: it was "solves one part of their problem" vs "solves all of their problems". And no-one involved in OpenGL showed any sign of being interested in those other problems.

          • jamesfinlayson2y

            I haven't played much with either OpenGL or DirectX but I remember wanting to render some text using a 3D API - DirectX supports it out of the box (need to pick the font etc - lots of settings) but at least when I last looked, OpenGL didn't offer that functionality, so I'd need to find a library that handles font loading and everything else associated with printing text.

          • animal5312y

            Input handling was pretty basic back in the day; there weren't that many different hardware options to support.

            For audio everyone was using 3rd party tools like Miles Sound System etc., but even OpenAL launched around 2000 already as an OpenGL companion. Video had the same thing happen with everyone using Bink which launched around 1999.

            In comparison, using OpenGL was a lot nicer than anything before probably DirectX 9. At that time in DX you needed pages and pages of boilerplate code just to set up your window, never mind to get anything done.

            Advanced GPU features of the time were also an issue: OpenGL would add them as extensions you could load, but in DirectX you were stuck until the next release.
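
            For anyone who never did the extension dance, it looked roughly like this - just a sketch from memory, assuming the Windows/wgl setup and glext.h headers of the era, with GL_ARB_multitexture as a representative example (the function and names here are illustrative, not from any particular engine):

              #include <windows.h>   // wglGetProcAddress
              #include <GL/gl.h>
              #include <GL/glext.h>  // PFNGLACTIVETEXTUREARBPROC etc.
              #include <cstring>

              static PFNGLACTIVETEXTUREARBPROC glActiveTextureARB = nullptr;

              // Classic pre-GL3 feature detection: parse the extension string, then fetch entry points.
              bool loadMultitexture() {
                  const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
                  if (!exts || !std::strstr(exts, "GL_ARB_multitexture"))
                      return false;  // not advertised: fall back to a multipass path
                  glActiveTextureARB = reinterpret_cast<PFNGLACTIVETEXTUREARBPROC>(
                      wglGetProcAddress("glActiveTextureARB"));  // extension entry points aren't exported by opengl32.dll
                  return glActiveTextureARB != nullptr;          // drivers sometimes advertised more than they delivered
              }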

            • tubs2y

              You can't make a window in OpenGL itself at all...

          • rob742y

            Well yeah, I mean, OpenGL (to quote Wikipedia) is "a cross-language, cross-platform API for rendering 2D and 3D vector graphics" - nothing more, nothing less. Whereas DirectX (which includes and is often conflated with Direct3D) was specifically designed by Microsoft to attract game developers to their platform (and lock them in) by taking care of all their needs. So it's kind of an apples to oranges comparison...

      • taeric2y

        This implies that they could not have pushed OpenGL to be less of a mess. Feels bad faith to argue, when you consider how bad all drivers were back then.

        • Jasper_2y

          Everybody wanted this. There was even an attempt at this, called "Longs Peak", that was ultimately voted down by the OpenGL committee after a long development road. Nobody else needed to sabotage OpenGL, Khronos was more than happy to do it themselves.

          • david-gpu2y

            > Khronos was more than happy to do it themselves.

            I was involved in a few of those committees, and sadly I have to agree.

            The reason Khronos is often so slow to adopt features is because of how hard it is for a group of competitors to agree on something. Everybody has an incentive to make the standard follow their hardware.

            A notable exception to this was the OpenCL committee, which was effectively strongarmed by Apple. Everybody wanted Apple's business, so nobody offered much resistance to what Apple wanted.

      • TazeTSchnitzel2y

        Even today, DirectX has stricter quality/precision/behaviour requirements for hardware. On the other hand Vulkan is a lot better specified in other areas and thus better documented. So even if a vendor doesn't officially support one or the other, they will care about both…

        • pjmlp2y

          Kind of; not only does Vulkan keep trailing behind DirectX, vendors collaborate with Microsoft on DirectX first, and eventually those features might come to Vulkan.

          There is also the issue of the development experience in the provided SDKs versus what others provide, and apparently Khronos would now rather adopt HLSL than try to improve GLSL.

          • skocznymroczny2y

            Standardization of features might lag behind D3D, but availability of features usually comes first on Vulkan because of the extension system. For example, NVidia brought raytracing to Vulkan long before DXR was a thing.
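
            Concretely, checking for a feature like that is just a string lookup against what the device reports. A minimal sketch, assuming a VkPhysicalDevice has already been picked (VK_NV_ray_tracing was NVIDIA's vendor extension; VK_KHR_ray_tracing_pipeline is the later cross-vendor one):

              #include <vulkan/vulkan.h>
              #include <vector>
              #include <cstring>

              // True if the physical device advertises the given extension name.
              bool hasDeviceExtension(VkPhysicalDevice gpu, const char* name) {
                  uint32_t count = 0;
                  vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
                  std::vector<VkExtensionProperties> props(count);
                  vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, props.data());
                  for (const auto& p : props)
                      if (std::strcmp(p.extensionName, name) == 0) return true;
                  return false;
              }

              // e.g. hasDeviceExtension(gpu, "VK_NV_ray_tracing")           // NVIDIA's vendor extension
              //      hasDeviceExtension(gpu, "VK_KHR_ray_tracing_pipeline") // the later cross-vendor one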

            • pjmlp2y

              Nope, it didn't; check your timelines with more care.

              Here is a tip: start with the Star Wars presentation from Unreal Engine.

          • TazeTSchnitzel2y

            Vulkan trails on some things, leads on others. DirectX doesn't exist on mobile for instance, so innovations with particular relevance to mobile GPUs tend to come to Vulkan first.

            Also, its extensions make it an interesting laboratory for various vendors' experiments.

        • nspattak2y

          if this is an argument about MS forcing DX and killing OGL, I remind you that Vulkan had/has Apple's support.

          • TazeTSchnitzel2y

            I'm not arguing any such thing, just pointing to the strengths of the two standards.

      • jbverschoor2y

        Well, Microsoft kind of deceived SGI during Fahrenheit

        • pjmlp2y

          Mostly. SGI also played their own part in the overall failure.

      • BearOso2y

        I agree. D3D7 was the first version actually widely used, and it came about as a glut of 3D card manufacturers appeared. Microsoft was willing to change it drastically to appease developers. OpenGL is still held back by the CAD companies. We're lucky Vulkan is here as an alternative now.

    • Aardwolf2y

      Around 1999 we had a PC with both a Riva TNT and a Voodoo 2. The main games I played were Half Life and Unreal 1 (in addition to various games that came bundled with hardware, like Monster Truck Madness and Urban Assault). I found the Riva TNT to work much better than the Voodoo 2 for the main games I played (e.g. when choosing in the game options, the D3D or OpenGL options had fewer glitches, better-looking translucency in Unreal, etc., than the options that used the Voodoo card), and in addition the Riva TNT supported 32-bit color while the Voodoo 2 only had 16-bit color and had this awkward passthrough.

      Maybe, it being 1999, it was just a little bit too late to still fully appreciate 3dfx, and modern-day D3D and OpenGL took over around that time, so I just missed the proper Voodoo era by a hair.

      Note that by OpenGL here I meant OpenGL using the Riva TNT (I assume the Voodoo card drivers must have been called Glide or 3DFx in the settings). I've always seen D3D and OpenGL existing side by side, performing very similarly in most games I played, and supporting the same cards, with GeForce cards etc that came later. I mainly game using Wine/Proton on Linux now by the way.

      • flohofwoe2y

        Yep, as soon as the TNT came out it was pretty much over for 3dfx. Quake II on a Riva TNT running in 1024x768 was a sight to behold.

        • smcl2y

          I feel like 3dfx still had a slight edge until the tail end of the Voodoo3 era (when the GF256 came out and blew everything away). But I think it depends what you prioritise.

          Like here, the V3 seems to have a pretty handy lead in most cases https://www.anandtech.com/show/288/14 - but that's a TNT2 (not a TNT2 Ultra) and it's all at 16-bit colour depth (32-bit not being supported by the V3).

          It was certainly an interesting time, and as a V3 owner I did envy that 32 bit colour depth on the TNT2 and the G400 MAX's gorgeous bump mapping :D

          • dwater2y

            I worked at CompUSA the summer of 1999 and there were demo machines for the Voodoo3 and TNT2, and what I remember most was that the Voodoo looked muddy while the TNT2 looked crisp. Frame rates weren’t different enough to have a clear winner, since there were 3 different Voodoo models and Nvidia had the ultra. I ended up getting a TNT2 Ultra and loved it. Never had any compatibility issues that I remember.

            • dbspin2y

              I owned a Voodoo 3, while all my friends had TNT2's. My experience was the opposite. Not sure if it was the handling of anisotropic filtering, or some kind of texture filtering - but my Voodoo 3 was notably sharper on all the games of the time.

              • smcl2y

                It may even vary depending on the game, the settings, whether D3D, OpenGL or Glide were used, or which versions of the drivers were installed.

                • dbspin2y

                  For sure, and I'm pretty certain I also tweaked the hell out of 3dfx tools too.

        • ChuckNorris892y

          3D gaming at 1024x768 in 1998 would be like 8K gaming today.

          • jonhohle2y

            I had a Rage 128 in 1999 and easily gamed at 1024x768 - and ran Windows 98 at 1600x1200!

            For years it was so hard to find LCDs that came close to the resolution I had on my CRT for a reasonable price.

            • doubled1122y

              And the LCDs of the day really struggled with contrast ratios and ghosting.

          • Maursault2y

            > 3D gaming at 1024x768 in 1998 would be like 8K gaming today.

            Whatever. In late 1996, I got a PowerMac 8500/180DP (PowerPC 604e) and a 1024x768 monitor. The 8500 didn't even have a graphics card, but had integrated/dedicated graphics on the motherboard with 4MB VRAM (also S-video and composite video in and out). It came bundled with Bungie's Marathon[1] (1994) which filled the screen in 16-bit color.

            [1] https://en.wikipedia.org/wiki/Marathon_(video_game)

            • bzzzt2y

              Probably cost a heck of a lot more than a comparable PC gaming setup in those days. Not to say it's not a great game, but Marathon uses a Doom-like 2.5D raycasting engine which doesn't need a 3D accelerator, just enough memory speed to draw each frame (which the PowerMac obviously had). Life gets a lot more complicated when you have to render perspective correct triangles with filtered textures, lighting and z-buffers in a true 3D space.

              • Maursault2y

                > Probably cost a heck of a lot more than a comparable PC gaming setup in those days.

                Until 2020, this was always a myth. When matching features and performance, the price of a Mac was always within $100 of a PC that is its equal. Not anymore with Apple Silicon. Now when matching performance and features you'll have a PC costing twice as much or more.

                • bzzzt2y

                  Here in the Netherlands Macs were outrageously expensive in the 90's. I only knew a few people who bothered to buy them (mostly because they wanted something simpler than a PC or because of Adobe stuff). Macs also used 'better' components at the time (SCSI drives instead of slow IDE, better sound, etc) so yes, if you wanted a comparable PC you had to pay up. But most people here had a much cheaper PC...

                  • Maursault2y

                    Depending on options, there are at least 2-4 different US-made SUV models that cost half as much as a BMW X1, which is not exactly expensive as far as BMWs go.

                    The "Apple is expensive" myth has been perpetuated since the days of 8-bit computing. Less expensive computers are cheaper because they have fewer features, use inferior parts, and are simply not as performant. But all that is behind us with Apple Silicon. Now you'd be hard-pressed to find a PC that performs half as well as the current line-up of low-end Macs for their price.

                    • bzzzt2y

                      There are workloads where a high-end PC outclasses a Mac for the same money (think top-end GPU or lots of memory, Apple doesn't have the first and wants a king sized markup for the second).

                      For most entry-level stuff performance is not that important, so that's not the metric customers focus on (price is). A desktop all-in-one from e.g. Lenovo starts at 600 euros, the cheapest iMac starts at 1500. A reasonable Windows laptop starts at around 400 euros while the MacBook Air starts at 1000 euros. It's not that the Apple machines aren't better, it's just that lots of folks here don't want to pay the entry fee.

                      Same reason most people here don't drive BMWs but cheaper cars.

          • anthk2y

            No, 4K. 1280x720 would be 640x480 back then, the minimum usable for a Windows 95/98 desktop and most multimedia-based software. Today that resolution is almost the minimum for 720p video and modern gaming.

            1920x1080 would be 800x600 back in the day, something everyone used for a viable (not just usable) desktop in order to be comfortable with daily tasks such as browsing and using a word processor. Not top-end, but most games would look nice enough, such as Unreal, Deus Ex and Max Payne at 800x600, which looked great.

            • int_19h2y

              It was much more common to run games at a lower resolution than the regular desktop, though. Even as 1024x768 mostly became the norm for the desktop, the crazy iteration rate of 3D hardware meant that most people who couldn't afford a new card every year would stick to 640x480 for the more recent 3D games.

              • anthk2y

                Yes, I did that even in the Geforce 2MX days. Games maxed @800x600, desktop at 1024x768.

                Ditto with today's 1920x1080 desktop resolution on my Intel NUC and games at 1280x720.

                But I could run 1280x1024@60 if I wanted. And a lot of games would run fine at 1024x768.

          • orbital-decay2y

            More like 4K gaming today. PII 450MHz + Riva 128ZX or TNT easily ran Half-Life at 1152x864. (FPS expectations were also lower, however - nobody expected 100fps+ like we do today)

            • Aardwolf2y

              > nobody expected 100fps+

              Not for game framerates indeed, but I did set my CRT monitor to 120 Hz to avoid eyestrain. You could effortlessly switch between many refresh rates from 60 Hz to 160 Hz or so on those monitors, and it was just a simple setting.

              Today there seem to exist LCD monitors that can do (much) more than 60 Hz, but somehow it has to come with all those vendor-lock-in-sounding brand names that make it all sound a bit unreliable [in the sense of overcomplicated and vendor-dependent] compared to back then, when it was just a number you could configure, a logical part of how the stuff worked.

              • syntheweave2y

                With respect to raw refresh rates, it's mostly the connectivity standards at fault. After VGA things got a bit out of hand with one connector after another in various form factors.

                The part you're probably thinking of is GSync vs Freesync which is a feature for making a tear-free dynamic refresh rate, something that was simply impossible in the CRT days but does add some perceptual smoothness and responsiveness in games. Not using a compatible monitor just means you're doing sync with the traditional fixed rate system.

                What has gotten way more complex is the software side of things because we're in a many-core, many-thread world and a game can't expect to achieve exact timing of their updates to hit a target refresh, so things are getting buffers on top of buffers and in-game configuration reflects that with various internal refresh rate settings.

                • Aardwolf2y

                  I don't buy that it has to be this complex.

                  We could write to buffers at 60 Hz effortlessly with computers from 1999, speeds have increased more than enough to write to buffers at 120 Hz and more, even with 16x more pixels.

                  1/120th of a second is a huge amount of time in CPU/GPU clock ticks, more than enough to compute a frame and write it to a double buffer to swap, and more threads should make that easier to do, not harder: more threads can compute pixels so pixels can be put in the buffer faster.

                  If there are problems with connector standards, the software side of things, or multithreading making it require third-party complexity, then that's a problem of those connector standards, the software, things like the LCD monitors themselves trying to be too smart and adding delay, etc... Take also, for example, the AI upscaling done on NVidia cards now: it adds yet more latency (since it needs multiple frames to compute) and complexity (and I've seen it create artefacts too; then I'd rather just have a predictable bicubic or Lanczos upscaling).

                  Same with audio: why do people tolerate such latency with Bluetooth audio? aptX had much less latency, but the latest headphones don't support it anymore, only huge delay.

                • rasz2y

                  >GSync vs Freesync which is a feature for making a tear-free dynamic refresh rate, something that was simply impossible in the CRT days

                  fun fact: the very same technique used by Freesync, delaying the vsync, works with CRTs

                  Genericness of Variable Refresh Rate (VRR works on any video source including DVI and VGA, even on MultiSync CRT tubes) https://forums.blurbusters.com/viewtopic.php?f=7&t=8889

            • ajnin2y

              I remember that the framerate was pretty important in Quake. Because player positions were interpolated linearly, you could miss the top of the parabola when jumping if your framerate was too low. I don't remember any numbers but getting a high framerate was definitely a concern when playing a bit competitively.

              • orbital-decay2y

                Competitive play, sure, especially at low resolutions and everything disabled. Single player games were running at pretty low framerates though, with a few exceptions like Quake/Half-Life. Even 60fps was more of a luxury than a rule.

            • eecc2y

              I had a Riva 128 and it was garbage, I was constantly kicking myself for cheaping out and not getting a Voodoo2

            • rasz2y

              >nobody expected 100fps+

              but that's exactly what you got on a Voodoo2 with a P2 450 back then in 640x480 https://www.bluesnews.com/benchmarks/081598.html

            • arglebargle1232y

              My 200+ fps quake 2 config sure did. If you weren't running 200+ fps for online quake 2 you were in for a bad time.

              Between a Celeron 333A running at 550MHz and a dual voodoo2 you could drive games at pretty ridiculous frame rates.

          • kodt2y

            I would say 4K, as it was the resolution those with a powerful PC could handle.

            Everyone had monitors that could do 1024x768 (usually up to 1600x1200), whereas 8K monitors are much less ubiquitous today in comparison.

          • jbverschoor2y

            Exactly. But now we are drowning in overhead

        • OOPMan2y

          I had one of those weird TNT2 M64 cards that you could fool into thinking it was a proper TNT2.

          Unfortunately for mine the foolery didn't work perfectly and I recall the card was a bit unstable when running that way, or maybe it didn't achieve the optimal performance possible for one of those fiddled M64s.

          Not really a huge surprise, IIRC it was a super cheap card...

        • rasz2y

          >Quake II on a Riva TNT running in 1024x768

          40 fps, 70 fps on V2 sli

          https://www.bluesnews.com/benchmarks/081598.html

      • time0ut2y

        I fondly remember the K6-2 system I had with a Voodoo 2 and 192MB of RAM. It was the first PC that was all mine. I also played HL1 and Unreal Tournament 1. The big games though were the HL mods TFC and Counter-Strike. I dragged that thing to so many LAN parties. A true golden age.

        It was also the first PC I ever installed Linux on. My dad would not let me do such a risky operation as dual booting Linux on the family computer. I don’t even remember what distro at this point.

        • metadat2y

          "Risky", it sounds so strange now to call dual booting "risky". But it was a different time, where remediating a borked software change wasn't google-able.

          Edit: @throwawayx38: I 100% agree with you! Thanks for your reply.

          • throwawayx382y

            To be fair, it may not have been the dual boot itself that was risky.

            Unless they had a dedicated hard drive, they would also need to resize existing FAT32/NTFS partitions and add a couple of new partitions for Linux. This process carried a certain risk.

            • bigger_cheese2y

              I have vague memories from early days of using Linux that you used to have to manually set the modeline for your videocard in X11 - if you got things wrong you could potentially permanently damage the monitor.

          • time0ut2y

            Heh, I meant it in a tongue in cheek way. My dad thought it was too risky.

            • metadat2y

              I understand, that's what I found amusing. It's easy to imagine a dad in the 90s, when computers were much more costly, being like "hell no you aren't doing that to the family computer!" :)

              Maybe I'm still misinterpreting, but this was where my mind went. Man, I wish I could've appreciated how distinct the 90s were as a kid, but I was too young and dumb to have a shred of hope of being that aware!

              • to11mtm2y

                In the era of that kind of PC, I remember the boot loaders being a bit more touchy.

                I never 'totally borked' a PC with LILO but definitely had to fix some things at least once, and that was with a nice thick Slackware 7.1 book to guide me.

                GRUB, IIRC, vastly improved things, but it took a little while to get there and become truly 'easy-peasy'.

        • wing-_-nuts2y

          >My dad would not let me do such a risky operation as dual booting Linux on the family computer.

          You were smarter than me. I wanted all those free compilers so badly I just went and installed redhat on the family pc. Ask me how well that conversation went with the old man...

      • jeffbee2y

        The pass-through was really awkward. Depending on the exact board implementation it either looked perfect or terrible, and you couldn’t trust the reviewers because they were all blind, using awful monitors, and tended to focus on the game performance instead of whether it looked like trash the other 90% of the time you spent looking at the display.

        • JohnBooty2y

          Yeah, this doesn't get mentioned enough. 3Dfx's first few cards were revelatory, but if you used the same computer for "actual work", the passthrough tended to kind of make your text look like ass in Windows/DOS.

      • selimnairb2y

        I had a Voodoo 1 and Voodoo 2. Running Quake with the Glide renderer for the first time was life-changing. However, the pass-through of the 2D card always felt like a hack. It also led to a noticeable reduction in 2D video quality. It always pained me to have to pass the beautiful output from my Matrox Millennium through the Voodoo.

      • rasz2y

        > in addition the Riva TNT supported 32-bit color while the Voodoo 2 only had 16-bit color and had this awkward passthrough.

        32-bit on TNT came at half the framerate; the performance hit was brutal. 16-bit on TNT was ugly AF due to bad internal precision, while 3dfx did some ~22-bit dithering magic

        "Voodoo2 Graphics uses a programmable color lookup table to allow for programmable gamma correction. The 16-bit dithered color data from the frame buffer is used an an index into the gamma-correction color table -- the 24-bit output of the gamma-correction color table is then fed to the monitor or Television."

        • to11mtm2y

          I was never a fan of the 3dfx dithering/filtering, personally. Things usually looked just a bit too muddled for my taste in most games. It wasn't as bad as others (I remember the Matrox M3D having some very interesting features, but Image quality was definitely worse than Voodoo. Marginally better than a Virge at least, lol.)

          16 bit on TNT was fine for most of what I played at the time, although at the time it was mostly Quake/Quake2 and a few other games. Admittedly I was much more into 2d (especially strategy) games at the time, so 2d perf (and good VESA compat for dos trash and emulators) was more important to me for the most part.

          I think 3dfx had a good product but lost the plot somewhere in a combination of cutting 3rd parties out of the market and not integrating as deeply as quickly, versus considering binning. VSA-100 was a good idea in theory, but the idea that they could make a working board with 4 chips in sync at an affordable cost was too bold, and probably a sign they needed to do some soul searching before going down that path.

          Now, it's possible that observation is only discernible in hindsight. After all, these folks had seemed like engineering geniuses with what they had already pulled off. OTOH, when we consider the cost jump of a '1 to 2 to 4 CPU' system back then... maybe everyone was a bit too optimistic.

      • bee_rider2y

        Urban Assault was such a cool game, what a shame it never caught on.

        • Aardwolf2y

          It was a gem! It came bundled with the Sidewinder Force Feedback Pro joystick

        • erikw2y

          I reached out to Microsoft to try to buy the rights to it several years ago. They referred me to the original developer in Germany, but they either weren’t interested, or the ownership was too complicated. I’d love a modern version of Urban Assault, although the original is still completely playable. I’m not aware of any other game that managed so well to combine RTS and FPS.

          • rhn_mk12y

            How much could the rights to a game like that cost? Depending on whether it's in range of the average person's disposable income, this could be a cool thing to do as a preservationist.

      • kodt2y

        A Voodoo 2 was kind of old and slow for 1999. A Voodoo 3 was a good card then, but the TNT2 did eclipse it in several games. Only a handful of games were still better in Glide.

      • hef198982y

        Ha, I had a Riva TNT in my first own PC! Totally forgot about that, and just how great Half Life was back in the day, and still is IMHO.

    • usefulcat2y

      > Microsoft was determined to kill anyone using OpenGL ... After about 5 years (Direct X 7 or 8) it had reached feature parity but long before that the "co marketing" dollars Microsoft used to enforce their monopoly had done most of the work.

      I was acutely aware of the various 3D API issues during this time and this rings very true.

      • wazoox2y

        Yup, remember when they "teamed up" with SGI to create "Fahrenheit"? Embrace, extend, extinguish...

        • jbverschoor2y

          I have the beta CDs here... Fahrenheit / XSG. One disc died though

        • astrange2y

          My "favorite" Microsoft fact is that DirectX and Xbox were codenamed Manhattan and Midway because Microsoft's gaming division just ran on "what codename would be the most racist towards Japanese people?". The dxdiag icon is an X because it was originally the radiation symbol!

          And this tradition is carried on to this day in milder form when Western game devs hear Elden Ring is more popular than their game or try to "fix" visual novels and JRPGs without playing any of them.

        • pjmlp2y

          As if SGI didn't have their share in Fahrenheit's failure.

          https://en.wikipedia.org/wiki/Fahrenheit_(graphics_API)

          • wazoox2y

            Holy cow I found a nest of Microsoft fans. From your link:

            > By 1999 it was clear that Microsoft had no intention of delivering Low Level; although officially working on it, almost no resources were dedicated to actually producing code.

            No kidding...

            Also, the CEO of SGI in the late 90s was ex-Microsoft and bet heavily on weird technical choices (remember the SGI 320 / 540? I do) that played no small role in sinking the boat. Extremely similar to the infamous Nokia suicide in the 2010s under another Microsoft alumnus. I think the similarity isn't due to chance.

            • pinewurst2y

              Don’t attribute to malice that which can be attributed to stupidity.

              My current employer has fairly recently hired a ton of ex-Google/Microsoft into upper management. They’re universally clueless about our business, spending most of their time trying to shiv one another for power.

              • sidlls2y

                We've recently hired a bunch of ex-Googlers (as well as Twitter-ers and other laid off people). They seem to spend most of their time making sure everyone here knows they're ex-${big_name} and how awesome things were there and we should change everything to do it the same way. It's a bit of a put-off.

                • pinewurst2y

                  If one thinks about it, they were minions in a monopoly, and a pretty dysfunctional one at that even if lucrative. What the heck do they know about how to run a competitive business?

                • ido2y

                  Is this not visible to the people with authority to fire them/not hire more such people? Or do they think differently than you?

                  • pinewurst2y

                    I think the attitude is that the industry analysts view “big names” as a structural positive not bothering to think about the actual reality. Also when you hire one of them in a senior position, suddenly you now have a gang of them as new subsidiary VPs.

                  • aflag2y

                    People are not usually fired for being obnoxious.

                    • ido2y

                      It sounds like the person we're replying to implied they were also not very effective at their jobs.

                      • aflag2y

                        I'd be surprised if he really spent most of his time repeating that he worked at Google. That's certainly hyperbole.

            • Keyframe2y

              TBH the SGI 320/540, well, namely Cobalt, were rather interesting tech-wise. Not sure if they could've gone against Microsoft at the time (NT/Softimage for example) and the rise of OpenGL 2 (with 3dlabs and all).

              • wazoox2y

                Yeah, they were really interesting machines but there were lots of weird technical choices : the non-PC EPROM that made them incompatible with any other OS than a special release of Windows 2000, the 3.3V PCI slots (at the time incompatible with 95% of available cards), the weird connector for the SGI 1600 screen... making the whole idea of "going Intel to conquer a larger market" moot from the start.

                Of course the main crime of the ex-Microsoft boss at the time wasn't that, but selling out most of SGI's IP to Microsoft and nVidia for some quick money.

                • Keyframe2y

                  Yeah, but that's SGI doing SGI things, basically. They were used to daylight-robbery pricing on systems they had ultimate control over, so this is nothing out of the ordinary for their thinking. The weird thinking was more: if you're doing a PC, then do a PC, not this... but maybe they initially didn't want to - an SGI workstation with PC components is more like it. The market ultimately showed them that's not what it wanted. The primary cool thing about it in my opinion was Cobalt and the CPU and GPU sharing RAM, but what was weird was that the split of how much the CPU gets and how much the GPU gets was not dynamic but static, and you had to set it up manually before boot. Dynamic sharing is what only now Apple is doing. Something AMD also explored if you had their vertical (CPU, mobo, GPU), but only for fast-path movement of data. I'd like to see more of that.

                  • wazoox2y

                    The shared memory architecture came directly from the SGI O2 in 1996. The O2 had dynamic sharing, but it was impossible to make it work in Windows.

                    O2 dynamic memory sharing allowed things impossible on all other machines with its infinite texture memory, like mapping several videos seamlessly on moving 3D objects (also thanks to the built-in MJPEG encoding/decoding hardware).

            • justsomehnguy2y

              >infamous Nokia suicide

              Nokia would have killed itself either way; with Elop it still tried to flop.

              Every Nokia fanboy cries about EEE, but blissfully forgets what a turd the 5800 XpressMusic was, which came out half a year later than the iPhone 3G.

              • dagmx2y

                Yeah, definitely agreed.

                RiM basically killed off a big chunk of the Nokia market, as it did for Windows CE as well.

                By the time the original iPhone came out, Nokia hadn’t really put out anything to capture the mindshare in a while. They were severely hampered by the split of their Symbian lineup (S30,40,60,90) and unable to adapt to the newest iteration of smartphones.

                They’d never have been able to adapt to compete without throwing out Symbian, which they held on to and tried to reinvent. Then there was the failure of MeeGo.

                Nokia would have been in the same spot they’re in today regardless of Microsoft. They’d be just another (sadly) washed up Android phone brand. Just like their biggest competitors at the time: Sony Ericsson and Motorola.

                But at least we got a lot of Qt development out of it.

              • Nursie2y

                Nokia was massive outside of the US; it had bigger name recognition than Apple in Europe even in 2009, and it still pumped out some gems like the N900.

                Yes it had huge systemic issues. Structural problems, too many departments pumping out too many phones with overlapping feature sets, and an incoherent platform strategy.

                But Elop flat-out murdered it with his burning platforms memo and then flogged the scraps to the mothership. It came across as a stitch-up from the word go.

                • justsomehnguy2y

                  By 2009 the writing was on the wall, and it didn't spell 'Nokia'.

                  You know why it was 'Xpress Music'? Because Nokia was years late to the 'music phone'. Even Moto had the E398, and SE had both music and photo lines. By 2009 Nokia had a cheap line-up for the brand zealots (eaten up by the Moto C-series and everyone else), a couple of fetishist's phones (remember those with the floral pattern, and the 8800?) and... overpriced 'communicators' with subpar internals (hell, late PDAs on XScale had more RAM and CPU power) and the incompatible-with-anything, including itself, mess of Symbian.

                  Elop not only allowed MS to test the waters with mobiles, but actually saved many, many workplaces for years. The alternative would have been bankruptcy around 2013.

                  • Nursie2y

                    That's some pretty revisionist history IMHO. By 2009 the company was in the shit, but it had money and market share.

                    > The alternative would have been bankruptcy around 2013.

                    The alternative would have been some restructuring by someone with a better idea than crashing the company and selling the name to his real bosses at MS.

                    The company was in trouble but salvageable. Elop flat-out murdered it, and it looked a lot like he did it to try to get a name brand for MS to use for its windows phones, which were failing badly (and continued to do so).

                    • justsomehnguy2y

                      > The company was in trouble but salvageable.

                      No. You should re-read that memo, specifically starting at "In 2008, Apple's market share in the $300+ price range was 25 percent; by 2010 it escalated to 61 percent." paragraph.

                      In 2010 nobody was interested in Symbian, no one else made phones on Symbian, no one would do apps for Symbian[0] - who would bother with all the Symbian shenanigans when even Nokia itself said it would move to MeeGo 'soon', along with ~10% of the smartphone market? The money was in Apple and Android.

                      To be salvageable you need something in demand on the market and Nokia only had a brand. You can't salvage an 18-wheeler running off the cliff.

                      Personally, I had the displeasure of trying to do something on a colleague's N8 somewhere in 2011-2012. Not only was it slow as molasses, but most of the apps which relied on Ovi were nonfunctional.

                      Insightful tidbit from N8 wiki page:

                      > At the time of its launch in November 2010 the Nokia N8 came with the "Comes With Music" service (also branded "Ovi Music Unlimited") in selected markets. In January 2011, Nokia stopped offering the Ovi Music Unlimited service in 27 of the 33 countries where it was offered.

                      So popular and salvageable that they discontinued the service 3 months after the launch? Should I remind you that Elop's memo was a month later, in February 2011?

                      [0] Yep, this is what really killed WP too - lack of momentum at the start, inability to persuade Instagram to make an app for WP, failing integrations (they worked at the start! Then Facebook decided it didn't want to be integrated anywhere because everyone should use their app) => declining market share => lack of interest from developers => declining market share => lack of...

                      • Nursie2y

                        Nobody said they had to stick with Symbian. Nobody said they were on any sort of right track.

                        But there were all sorts of options to restructure a company that big that had only just been surpassed by android at the time of the memo. Tanking what was left of the company and selling out to MS was probably the worst of them.

                        It's quite funny; the Guardian article on the memo, which reproduces it in full, is here - https://www.theguardian.com/technology/blog/2011/feb/09/noki...

                        First comment below the line "If Nokia go with MS rather than Android they are in even bigger trouble."

                        Everyone could see it, apart from Stephen Elop, who was determined to deliver the whole thing to MS regardless of how stupid a decision it was.

                        • justsomehnguy2y

                          > to restructure a company that big

                          That's the problem. Momentum, and more importantly - momentum in a big-ass company notorious for its bureaucracy, red tape and a cultural[0] policy of doing nothing beyond what's needed while jealously protecting one's own domain from anyone.

                          > Everyone could see it

                          You are forgetting that if Nokia pivoted to Android in Feb 2011, then the first usable, mass-produced unit (and the first one for the company, i.e. with all the bugs and errors you get in the first thing in a line) would arrive in Q4 2012 at the very, very best. A more sane estimate (considering their total unfamiliarity with the platform and, again, the cultural nuances) would be somewhere in Q1-Q2 2013. There it would then compete with the already established Android market (the Moto RAZR would be 1y+ old) and the iPhone 5.

                          Even if they somehow managed to do it in less than a year (like a fairy godmother came and magically did everything for them, including production and shipping), then they would have needed to compete with the iPhone 4S, which as we know was extremely popular and which people held onto for years.

                          No fucking way they could do anything to stay afloat. That is why I say they would have been bankrupt by 2013. You may dislike Elop and his actions as much as you want, but there was zero chance they could do anything themselves.

                          Oh, one more thing. Sure, in 2009 Nokia had the money. By 2011 creditors lowered Nokia's rating (reflected in the memo), which means that by 2012 they would have had no spare money, aka cash. And you are probably forgetting that Nokia had a ridiculous number of employees (clearly seen by how many were let go in 2013-2014). You need many, many monies to support that number of people, and you can't tell them to fuck off like in the US with at-will employment; you need to provide severance and pension funds for everyone. Without money the only thing you can do is close the doors and file for bankruptcy. If you want to continue your business, you first need to pay out the social obligations. And that costs money. And guess who not only had the money but was willing to pour them into Nokia?

                          [0] Literally. I've read the 'memoirs' of a guy who worked there before and during. I had a friend working in Finland some time later. The stories she told about passivity, lack of enthusiasm, always trying to evade responsibility - they just confirmed the things I already knew at that time, and no amount of naked sauna helps. Hell, they didn't even have the guts to fire her, instead doing macabre dances to force her to quit. Which, after more than half a year of doing literally nothing (their way to get her out), she gladly did, and went to MS.

                          Funny enough, she is at Google now, and some of the shit happening there directly resembles what was happening almost a decade ago in Finland.

                          • willywanker2y

                            Take a seat. The US was the most primitive mobile market in the world before the iPhone, with people drooling over the frigging Moto RAZR - a feature phone FFS - just because it was thin and flipped open. And imagine paying for incoming calls, or buying your phone from the operator with all useful features like copying your own ringtones disabled so that you were forced to buy whatever lame ones the operator offered. Symbian in the meanwhile was a full featured, multitasking OS with apps, themes, video calling and other features that took their time to reach iOS and Android. Nokia already knew that Symbian was on the way out, and they bought Qt to act as a bridge for developers between Symbian and Meego - it was to be the default app toolkit for Meego. Around 2009 onwards, Qt versions of popular Symbian apps started to appear. The first Meego device, the N9, had rave reviews but was intentionally hobbled by Elop choosing to go with dead in the water Windows Mobile and refusing to allow more production. This piece from back in the day is a detailed analysis of the fiasco - https://communities-dominate.blogs.com/brands/2013/09/the-fu...

                          • Nursie2y

                            > You are forgetting that if Nokia pivoted to Android in Feb 2011, then the first usable, mass-produced ...

                            I'm not forgetting anything. All of these issues are present in a switch to the Microsoft platform too, and it was already clear to most observers when they did that, that the MS platform was dead in the water.

                            > they would have needed to compete with the iPhone 4S

                            As did every other player, and outside the US the iPhone was not dominant in the same way.

                            > you can't tell them to fuck off like in the US

                            They had engineering units in the US which they could have done that to. And there are things you can do in most European countries too, when situations are dire.

                            > You may dislike Elop and his actions as much as you want, but there was zero chance they could do anything themselves.

                            I very much disagree, as do many observers. It could have been turned around with good management, but that doesn't seem to have been Elop's aim; his aim seemed to be to fulfill goals for MS.

                            > And guess who not only had the money but was willing to pour them into Nokia?

                            Yes, it was a stitch-up job for MS to buy an established name to try to save their dead mobile platform.

              • to11mtm2y

                I'm a Nokia fanboy and will sadly admit that you're right.

                They made a -lot- of stupid decisions, both sticking to 'what they knew' and making bad decisions with cutting-edge tech (the N900 comes to mind; you couldn't get one in the States with the right bands for 3G).

                I will always love the Lumia cameras however, even their shit tier models had great image quality.

                • asveikau2y

                  I had an N900 on 3G on T-Mobile USA.

                  They should have continued Maemo 5 and bet hard on it. The big rewrites for the N9, and the continued focus on Symbian as a cash cow, hurt them.

            • nix232y

              > Holy cow I found a nest of Microsoft fans. From your link:

              It's not a nest; he is mostly the only one.

    • rabf2y

      Part of the success of DirectX over OpenGL was that very few graphics card companies seemed capable of producing a fully functional OpenGL driver; for the longest time Nvidia was the only option.

      I recall ATI and Matrox both failing in this regard despite repeated promises.

      • avereveard2y

        Fully functional OpenGL was not exactly the issue, or not the only one.

        OpenGL was stagnating at the time vendors started a feature war. In OpenGL you can have vendor-specific extensions, because it was meant for tightly integrated hardware and software, and vendors started leaning heavily on extensions to one-up each other.

        The Khronos Group took ages to get moving and standardize modern features.

        By that time GL_EXT checks had become nightmarishly complicated, and cross-compatibility was further damaged by vendors lying about their actual extension support: drivers started claiming support for features, but using the extension caused the scene to not look right, or outright crashed.

        Developers looked at that and, no wonder, they didn't want to take part in any of it.

        This all beautifully exploded a few years later when Compiz started gaining a foothold, which required this or that GL extension and finally caused enough rage to get Khronos working on bringing the mess back under control.

        By that time MS was already at DirectX 9, you could use XNA to target different architectures, and it brought networking and IO libraries with it, making it a very convenient development environment.

        *This is all a recollection from the late nineties / early 2000s and it's by now a bit blurred; it's hard to fill in the details on the specific extensions. Nvidia was the one producing the most, but it's not like the blame is on them; Matrox and ATI's wonky catch-up support was overall more damaging. MS didn't really need to do much to win hearts with DX.

        • qwertox2y

          Plus MS was also trying to offer more than just graphics by adding audio and networking to the stack which kind of started to make the whole ecosystem attractive, even if it was painful to program against.

          I had my share of fun with DirectMusic.

          • mrguyorama2y

            DirectInput (and later XInput) had its faults, but it's probably the only reason you can just plug random first- and third-party controllers into a USB port and expect everything to just work.

        • moring2y

          How did Microsoft solve the "extensions problem"? Did they publish standardized APIs in time so vendors wouldn't come up with any extensions? Even then, how did MS prevent them from having the driver lie about the card's features to make it look better than it is?

          • flohofwoe2y

            MS had a rigorous certification and internal testing process, new D3D versions came out quickly to support new hardware features, and through the Xbox Microsoft had more real world experience for what games actually need than the GPU vendors themselves, which probably helped to rein in some of the more bizarre ideas of the GPU designers.

            I don't know how the D3D design process worked in detail, but it is obvious that Microsoft had a 'guiding hand' (or maybe rather 'iron fist') to harmonize new hardware features across GPU vendors.

            Over time there have been a handful of 'sanctioned' extensions that had to be activated with magic fourcc codes, but those were soon integrated into the core API (IIRC hardware instancing started like this).

            Also, one at the time controversial decision which worked really well in hindsight was that D3D was an entirely new API in each new major version, which allowed to leave historical baggage behind quickly and keep the API clean (while still supporting those 'frozen' old D3D versions in new Windows versions).

            • moring2y

              > Also, one at the time controversial decision which worked really well in hindsight was that D3D was an entirely new API in each new major version, which allowed to leave historical baggage behind quickly and keep the API clean (while still supporting those 'frozen' old D3D versions in new Windows versions).

              This is interesting. I have always wondered if that is a viable approach to API evolution, so it is good to know that it worked for MS. We will probably add a (possibly public) REST API to a service at work in the near future, and versioning / evolution is certainly going to be an issue there. Thanks!

              • becurious2y

                It’s a COM-based API, so everything is an interface described via IDL. If you add a new member or change the parameters of a method, you must create a new version of the interface with a new GUID. You can query any interface for other interfaces it supports, so it’s easy for clients to check for newer functionality on incremental versions of DirectX.

                In practice for DirectX you just use the header files that are in the SDK.
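
                As a concrete illustration of that query pattern - a minimal sketch using the D3D11 / D3D11.1 interface pair as an arbitrary example; the same QueryInterface mechanism applies between other DirectX interface revisions:

                  #include <d3d11_1.h>  // ID3D11Device1 (pulls in d3d11.h)

                  // Given an existing D3D11 device, check whether the newer 11.1 interface is available.
                  bool supportsD3D11_1(ID3D11Device* device) {
                      ID3D11Device1* device1 = nullptr;
                      HRESULT hr = device->QueryInterface(__uuidof(ID3D11Device1),
                                                          reinterpret_cast<void**>(&device1));
                      if (SUCCEEDED(hr)) {
                          device1->Release();  // we only wanted to know it's there
                          return true;         // caller can re-query and use the 11.1-only methods
                      }
                      return false;            // stick to the old interface; its GUID keeps working
                  }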

              • jamesfinlayson2y

                I've never had to migrate between DirectX versions but I don't imagine it's the easiest thing in the world due to this approach. Somewhat related I saw a library to translate DirectX 9 function calls to DirectX 12 because apparently so much of the world is still using DirectX 9.

          • avereveard2y

            The DirectX API, and a driver certification program

          • moth-fuzz2y

            At the time Microsoft worked directly with vendors to have many of what would be vendor extension on OpenGL become cross-vendor core DirectX features.

        • pjmlp2y

          They haven't learned much from it; see how many Vulkan extensions already exist.

      • tanepiper2y

        Back in 1999 when the Quake source code came out, I started working on "Quake 2000", which was an improvement on the rendering pipeline of the code. I ended up getting free cards shipped to me - one was a GeForce256, one was the Matrox G400 DualHead, and I think the other was the ATI Rage 128 Pro.

        The GeForce blew the other cards' performance out of the water. The Matrox was particularly bad, and the dual screen didn't add much; I remember maybe 2 games that supported it.

    • Razengan2y

      > he assured me that NURBs were going to be the dominant rendering model

      Wow, this sounds like those little cases where a few different decisions could have easily led us down into an alternate parallel world :)

      Can someone expand on why NURBs didn't/don't win out against polygons?

      Could this be like AI/ML/VR/Functional Programming, where the idea had been around for decades but could only be practically implemented now after we had sufficient hardware and advances in other fields?

      • rektide2y

        Because it's exactly like the parent said: Nvidia is, and always has been, a tight-fisted tightwad that makes everything they do ultra-proprietary. Nvidia never creates standards or participates in them.

        Sometimes, like with CUDA, they just have an early enough lead that they entrench.

        Vile player. They're worse than IBM. Soulless & domineering to the max, to every extent possible. What a sad story.

        • mschuetz2y

          > Sometimes, like with CUDA, they just have an early enough lead that they entrench.

          The problem in the case of CUDA isn't just that NVIDIA was there early, it's that AMD and Khronos still offer no viable alternative after more than a decade. I switched to CUDA half a year ago after trying to avoid it for years due to it being proprietary. Unfortunately I discovered that CUDA is absolutely amazing - it's easy to get started, developer-friendly in that it "just works" (which is never the case for Khronos APIs and environments), and it's incredibly powerful, kind of like programming C++17 for 80 x 128 SIMD processors. I wish there was a platform-independent alternative, but OpenCL, SYCL and ROCm aren't it.

          • ribs2y

            I keep hearing that ROCm is DOA, but there are a lot of supercomputing labs that are heavily investing in it, with engineers who are quite in favor of it.

            • doikor2y

              With supercomputers you write your code for that specific supercomputer. In such an environment ROCm works ok. Trying to make a piece of ROCm code work on different cards/setups is a real pain (and it's not that easy with CUDA either if you want good performance).

            • pixelesque2y

              If you want to run compute on AMD GPU hardware on Linux, it does work - however it's not as portable as CUDA, since you practically have to compile your code for every AMD GPU architecture, whereas with CUDA the NVIDIA drivers give you a forwards- and backwards-compatible abstraction layer (ish - it's really PTX which provides it), which makes it trivial to support new cards / generations of cards without recompiling anything.

            • mschuetz2y

              I hope it takes off; a platform-independent alternative to CUDA would be great. But if they want it to be successful outside of supercomputing labs, it needs to be as easy to use as CUDA. And I'd say being successful outside of supercomputer labs is important for overall adoption and success. For me personally, it would also need fast runtime compilation so that you can modify and hot-reload ROCm programs at runtime.

            • pjmlp2y

              Some random HPC lab with enough weight to have an AMD team drop by isn't the same thing as the average Joe and Jane developer.

        • rabf2y

          Nvidia has had driver parity for Linux, FreeBSD and Windows for many, many years. No other graphics card manufacturer has come close to the quality of their software stack across platforms. For that they have my gratitude.

          • foxhill2y

            DLSS was windows only for some time.

            linux’s amdgpu is far better than the nvidia-driver.

            • rabf2y

              ATI drivers were a horror show for the longest time on Windows, never mind Linux. What Nvidia did was have basically the same driver code for all operating systems with a compatibility shim. If you were using any sort of professional 3D software over the previous 2 decades, Nvidia were the only viable solution.

              Source: Was burned by ATI, Matrox, 3dlabs before finally coughing up the cash for Nvidia.

              • nick__m2y

                I was a big Matrox fan, mostly because I knew someone there, and was able to upgrade their products at a significant discount. This was important for me as a teenager whose only source of income was power washing eighteen-wheelers and their associated semi-trailers. It was a dirty and somewhat dangerous job, but I fondly remember my first job. Anyway, I digress, so let's get back to the topic of Matrox cards.

                The MGA Millennium had unprecedented image quality, and its RAMDAC was in a league of its own. The G200 had the best 3D image quality when it was released, but it was really slow and somewhat buggy outside of Direct3D, where it shone. However, even with my significant discount and my fanboyism, when the G400 was released I defected to NVIDIA, since the G400's relative performance was abysmal.

                • antod2y

                  One use case Matrox kept doing well was X11 multi-monitor desktops. The G400 era was about the time I was drifting away from games and moving to full-time Linux, so they suited me at least.

              • foxhill2y

                yes, i am very familiar with that pain. fglrx was hell compared to nvidia.

                nvidia being the only viable solution for 3d on linux is a bit of an exaggeration imo (source: i did it for 5 years), but that was a long time ago: we have amdgpu, which is far superior to nvidia’s closed source driver.

            • Gordonjcp2y

              Except it doesn't do GPU compute stuff, so it's no use for anything except games.

              • foxhill2y

                it doesn’t do CUDA, but it does do opencl, and vulkan compute

                • Gordonjcp2y

                  Maybe, but nothing really uses that, at least for video.

            • alanfranz2y

              amdgpu is better now. But was terrible for years, probably 2000-2015. That’s what gp is saying.

              • hulitu2y

                Huh ? Compared to open source nvidia driver which could do nothing ?

                I had a Riva TNT 2 card. The only "accelerated" thing it could do in X was DGA (direct graphics access). Switched to ATI and never looked back. Of course you could use the proprietary driver - if you had enough time to solve installation problems and didn't mind frequent crashes.

                • badsectoracula2y

                  > Compared to open source nvidia driver which could do nothing ?

                  Compared to the official Nvidia driver.

                  > If you had enough time to solve instalation problems and didn't mind frequent crashes

                  I used Nvidia GPUs from ~2001 to ~2018 on various machines with various GPUs and I never had any such issues on Linux. I always used the official driver installer and it worked perfectly fine.

                • onphonenow2y

                  Did people not try the nvidia driver back then? Even as a casual user at the time it was miles ahead - but it wasn’t open source

                • anthk2y

                  DGA and later XV.

              • foxhill2y

                amdgpu is new. you may be thinking about fglrx: a true hell.

                • alanfranz2y

                  No, I was thinking about amdgpu. amdgpu, the open source driver, has for the last 4-5 years been better than the nvidia closed source driver (excluding the cuda vs opencl/rocm debacle ofc).

                  fglrx was always a terrible experience indeed, so AMD was no match for the nvidia closed source driver.

                  So, once upon a time (I'd say 2000-2015) the best Linux driver for discrete GPUs was the nVidia closed source one. Nowadays it's the AMD open source one. Intel has always been good, but doesn't provide the same amount of power.

        • jemmyw2y

          I think any company who feels they are in the lead with something competitive would do the same. The ones who open their standards were behind to begin with and that's their way of combating the proprietary competition.

          • rektide2y

            Belief in your own technology, even if it is good, as it turns out, is often insufficient to really win. At some point, in computing, you need some ecosystem buy in, and you almost certainly will not be able to go it alone.

            Nvidia seems utterly uninterested in learning these lessons, decades in now: they just get more and more competitive, less and less participatory. It's wild. On the one hand they do a great job maintaining products like the Nvidia Shield TV. On the other hand, if you try anything other than Linux4Tegra (l4t) on most of their products (the Android devices won't work at all for anything but Android, btw) it probably won't work at all or will be miserable.

            Nvidia has one of the weirdest moats: being open-source-like & providing ok-ish open source mini-worlds, but you have to stay within 100m of the keep or it all falls apart. And yea, a lot of people simply don't notice. Nvidia has attracted a large camp-followers group, semi-tech folk, that they enable, but who don't really grasp the weird, limited context they're confined to.

            • bsder2y

              As much as I hate Nvidia, AMD and Intel have done themselves zero favors in the space.

              It's not that hard--you must provide a way to use CUDA on your hardware. Either support it directly, transcompile it, emulate it, provide shims, anything. After that, you can provide your own APIs that take advantage of every extra molecule of performance.

              And neither AMD nor Intel has thrown down the money to do it. That's all it is. Money. You have an army of folks in the space who would love to use anything other than Nvidia and who would do all the work if you just threw them money.

            • fud1012y

              What do they get right with shield?

        • agumonkey2y

          Some say the NURBS model also didn't fit the culture at the time and wasn't supported by either modeling or texturing tools. Game devs would get faster results with triangles than with NURBS. Not sure who should have footed the bill, game studios or nvidia.

        • DeathArrow2y

          How is NVIDIA different from Apple?

          • verall2y

            Nvidia makes superior graphics cards which are for dirty gamers while Apple makes superior webshit development machines.

      • MontyCarloHall2y

        My guess is that it’s much harder to develop rendering algorithms (e.g. shaders) for NURBSes. It’s easy and efficient to compute and interpolate surface normals for polygons (the Phong shader is dead simple [0], and thus easy to extend). Basic shading algorithms are much more complicated for a NURBS [1], and thus sufficiently computationally inefficient that you might as well discretize the NURBS to a polygonal mesh (indeed, this is what 3D modeling programs do). At that point, you might as well model the polygonal mesh directly; I don’t think NURBS-based modeling is significantly easier than mesh-based modeling for the 3D artist.

        [0] https://cs.nyu.edu/~perlin/courses/fall2005ugrad/phong.html

        [1] https://www.dgp.toronto.edu/public_user/lessig/talks/talk_al...
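        For context, the per-light Phong model that [0] walks through really is just a couple of dot products, something like

          $I = k_a i_a + k_d\,(\mathbf{L}\cdot\mathbf{N})\,i_d + k_s\,(\mathbf{R}\cdot\mathbf{V})^{\alpha}\,i_s$

        with N the interpolated surface normal, L the light direction, R the reflection of L about N, V the view direction, and the k/i factors the material and light terms - which is why it is so cheap per pixel once you have per-vertex normals to interpolate.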

      • ChuckMcM2y

        Given the available resources today it should be possible to create a NURB based renderer on something like the ECP5 FPGA. Not a project I have time for but something to think about.

      • adastra222y

        How do you direct render a curved surface? The most straightforward, most flexible way is to convert it into a polygon mesh.

        I suppose you could direct rasterize a projected 3D curved surface, but the math for doing so is hideously complicated, and it is not at all obvious it’d be faster.

        • somat2y

            I think the idea is that polygon meshes are the only way things are done on all existing graphics cards, and as such that is the only primitive used and the only primitive optimized for. Personally I suspect that triangle meshes were the correct way to go, but you can imagine an alternate past where we optimized for CSG-style solid primitives (POV-Ray), or perhaps for drawing point clouds (voxels), or perhaps for spline-based patches (NURBS). Just figure out how to draw the primitive and build hardware that is good at it. Right now the hardware is good at drawing triangle meshes, so that is the algorithm used.

          • adastra222y

            So just for the record, I've actually written a software 3D rasterizer for a video game back in the 90's, and did a first pass at porting the engine to Glide using the Voodoo 2 and Voodoo 3 hardware. I'm pulling on decades-old knowledge, but it was a formative time and I am pretty sure my memory here is accurate.

            At the point of rasterization in the pipeline you need some way to turn your 3D surface into actual pixels on the screen. What actual pixels do you fill in, and with what color values? For a triangle this is pretty trivial: project the three points to screen-space, then calculate the slope between the points (as seen on the 2D screen), and then run down the scanlines from top to bottom, incrementing or decrementing the horizontal start/stop pixels for each scanline by those slope values. Super easy stuff. The only hard part is that to get the colors/texture coords right you need to apply a nonlinear correction factor. This is what "perspective-correct texturing" is, support for which was one of 3dfx's marketing points. Technically this approach scales to any planar polygon as well, but you can also break a polygon into triangles and then the hardware only has to understand triangles, which is simpler.
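            For anyone who hasn't written one, that scanline fill really is only a few dozen lines. A rough flat-colour sketch (hypothetical framebuffer layout, and without the perspective-correct interpolation mentioned above):

                // Minimal flat-colour scanline triangle fill: sort the projected vertices
                // by y, then for each scanline intersect it with the long edge (a-c) and
                // the active short edge (a-b or b-c) and fill the span between them.
                #include <algorithm>
                #include <cmath>
                #include <cstdint>
                #include <vector>

                struct Vec2 { float x, y; };

                void fillTriangle(std::vector<uint32_t>& fb, int width, int height,
                                  Vec2 a, Vec2 b, Vec2 c, uint32_t colour) {
                    if (a.y > b.y) std::swap(a, b);   // sort top-to-bottom
                    if (a.y > c.y) std::swap(a, c);
                    if (b.y > c.y) std::swap(b, c);
                    if (c.y - a.y < 1e-6f) return;    // degenerate: zero height

                    // x coordinate where edge p->q crosses scanline y
                    auto edgeX = [](Vec2 p, Vec2 q, float y) {
                        return (q.y == p.y) ? q.x
                                            : p.x + (q.x - p.x) * (y - p.y) / (q.y - p.y);
                    };

                    int yTop = std::max(0, (int)std::ceil(a.y));
                    int yBot = std::min(height - 1, (int)std::floor(c.y));
                    for (int y = yTop; y <= yBot; ++y) {
                        float xl = edgeX(a, c, (float)y);
                        float xr = (y < b.y) ? edgeX(a, b, (float)y) : edgeX(b, c, (float)y);
                        if (xl > xr) std::swap(xl, xr);
                        int x0 = std::max(0, (int)std::ceil(xl));
                        int x1 = std::min(width - 1, (int)std::floor(xr));
                        for (int x = x0; x <= x1; ++x)
                            fb[(size_t)y * width + x] = colour;
                    }
                }

            Perspective-correct texturing then amounts to interpolating u/w, v/w and 1/w along those same spans and dividing per pixel, which is the nonlinear correction (and the hardware feature) mentioned above.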

            But how do you rasterize a Bézier curve or NURBS surface? How do you project the surface parameters to screen-space in a way that doesn't distort the shape of the curve, then interpolate that curve down scanlines? If you pick a specific curve type of small enough order it is doable, but good god is it complicated. Check out the code attached to the main answer of this Stack Overflow question:

            https://stackoverflow.com/questions/31757501/pixel-by-pixel-...

            I'm not sure that monstrosity of an algorithm gets perspective correct texturing right, which is a whole other complication on top.

            On the other hand, breaking these curved surfaces into discrete linear approximations (aka triangles) is exactly what the representation of these curves is designed around. Just keep recursively sampling the curve at its midpoint to create a new vertex, splitting the curve into two parts. Keep doing this until each curve is small enough (in the case of Pixar's Reyes renderer used for Toy Story, they keep splitting until the distance between vertices is less than 1/2 pixel). Then join the vertices, forming a triangle mesh. Simple, simple, simple.
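            The splitting step described here is similarly tiny. A rough 2D sketch of recursive midpoint (de Casteljau) subdivision for one cubic Bezier segment - a surface patch does the same thing in two parameters, and the flatness test here is a simple control-polygon check rather than Reyes' half-pixel rule:

                // Recursively split a cubic Bezier at its midpoint until each piece is
                // "flat enough", emitting vertices; joining them gives the line segments
                // (or, for patches, the triangles) that approximate the curve.
                #include <cmath>
                #include <vector>

                struct Pt { double x, y; };
                static Pt mid(Pt a, Pt b) { return {(a.x + b.x) / 2, (a.y + b.y) / 2}; }
                static double len(Pt a, Pt b) { return std::hypot(a.x - b.x, a.y - b.y); }

                void tessellate(Pt p0, Pt p1, Pt p2, Pt p3, double tol, std::vector<Pt>& out) {
                    // Flat enough: control polygon barely longer than the chord.
                    if (len(p0, p1) + len(p1, p2) + len(p2, p3) - len(p0, p3) < tol) {
                        out.push_back(p3);
                        return;
                    }
                    Pt q0 = mid(p0, p1), q1 = mid(p1, p2), q2 = mid(p2, p3);
                    Pt r0 = mid(q0, q1), r1 = mid(q1, q2);
                    Pt s  = mid(r0, r1);                  // the point on the curve at t = 0.5
                    tessellate(p0, q0, r0, s, tol, out);  // left half
                    tessellate(s, r1, q2, p3, tol, out);  // right half
                }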

            To use an analogy from a different field, we could design our supercomputer hardware around solving complex non-linear equations directly. But we don't. We instead optimize for solving linear equations (e.g. BLAS, LINPACK) only. We then approximate non-linear equations as a whole lot of many-weighted linear equations, and solve those. Why? Because it is a way easier, way simpler, way more general method that is easier to parallelize in hardware, and gets the same results.

            This isn't an accidental historical design choice that could have easily gone a different way, like the QWERTY keyboard. Rendering complex surfaces as triangles is really the only viable way to achieve performance and parallelism, so long as rasterization is the method for interpolating pixel values. (If we switch to ray tracing instead of rasterization, a different set of tradeoffs come into play and we will want to minimize geometry then, but that's a separate issue.)

        • pixelesque2y

          You'd probably convert it to bicubic patches or something, and then rasterise/ray-intersect those...

          I'm not really convinced curves are that useful as a modelling scheme for non-CAD/design stuff (i.e. games and VFX/CG): while you can essentially evaluate the limit surface, it's not really worth it once you start needing things like displacement that actually moves points around, and short of doing things like SDF modulations (which is probably possible, but not really artist-friendly in terms of driving things with texture maps), keeping things as micropolygons is what we do in the VFX industry and it seems that's what game engines are looking at as well (Nanite).

      • randomifcpfan2y

        Nah, NURBS are a dead end. They are difficult to model with and difficult to animate and render. Polygon-based subdivision surfaces entirely replaced NURBS as soon as the Pixar Renderman patents on subdivision surfaces expired.

      • jmiskovic2y

        NURBs are more high-level compared to triangles. A single triangle primitive cannot be ill-defined and is much easier to rasterize. There are other high-level contenders - for example SDFs and voxels. Instead of branching out the HW to offer acceleration for each of these, they can all be reduced to triangles and made to fit in the modern graphics pipeline.

        It's like having a basic VM, high-level languages are compiled to the intermediate representation where things are simpler and various optimizations can be applied.

      • flohofwoe2y

        My guess is: 'brute force and fast' always wins against 'elegant but slow'. And both the 3dfx products and triangle rasterization in general were 'brute force and fast'. Early 3D accelerator cards from different vendors were full of such weird ideas to differentiate themselves from the competition; thankfully they all went the way of the Dodo (because for game devs it was a PITA to support such non-standard features).

        Another reason might have been: early 3D games usually implemented a software rasterization fallback. Much easier and faster to do for triangles than nurbs.

      • 2y
        [deleted]
      • rzzzt2y

        Was it NURBs or quads? Maybe both.

    • jasode2y

      >In my opinion, Direct X was what killed it most. OpenGL was well supported on the Voodoo cards and Microsoft was determined to kill anyone using OpenGL

      I don't know the details myself but as a FYI... this famous answer covering the OpenGL vs DirectX history from StackExchange disagrees with your opinion and says OpenGL didn't keep up (ARB committee). It also mentions that the OpenGL implementation in Voodoo cards was incomplete and only enough to run Quake:

      https://softwareengineering.stackexchange.com/questions/6054...

      The author of that answer is active on HN so maybe he'll chime in.

      • spookthesunset2y

        > It also mentions that the OpenGL implementation in Voodoo cards was incomplete and only enough to run Quake

        That brings back some memories... I remember having to pick the rendering pipeline on some games, like Quake.

        I also remember the days of having to route the video cable from my graphics card to my 3dfx card then to my monitor.

        • mepian2y

          >I remember having to pick the rendering pipeline on some games, like Quake.

          You can still do that in some recent games, e.g. Doom 2016 and Half-Life: Alyx.

          • jamesfinlayson2y

            Yeah it's swung around again - it feels like 25 years ago you could choose between DirectX and OpenGL (and software too), then for AAA games at least there was DirectX only for the longest time, but now DirectX or Vulkan (or OpenGL, or DirectX 12 instead of DirectX 9/10/11).

      • 2y
        [deleted]
      • Keyframe2y

        An important puzzle piece in all that is also Microsoft's Fahrenheit diversion.

    • VBprogrammer2y

      I remember selling a bunch of them at a computer trade show for the computer shop I worked at in my teens. In probably the best marketing idea I've had to date, I installed Tomb Raider on two basically identical computers, except one had a 3dfx card. As soon as we started running that, the cards basically sold themselves.

    • pjmlp2y

      Like everything else, Khronos did not need help from Microsoft to mess up OpenGL, with its spaghetti soup and lack of tooling, which makes every graphics programming newbie start by hunting down and building from scratch the infrastructure needed to make a rendering engine.

      Vulkan is just as bad in this regard, with the complexity turned up to eleven. No wonder people call it a GPU hardware abstraction API, not a graphics API.

      And on the Web they couldn't come up with a better idea than throwing away all the existing GLSL and replacing it with a Rust-inspired shading language.

      • filmor2y

        Khronos has only been responsible for OpenGL since 2006 (essentially 3.0); DirectX 8 came out in 2000. In the relevant timeframe for the OP, the ARB was responsible, which has nothing to do with Vulkan.

        • pjmlp2y

          Around the Longs Peak debacle the ARB became Khronos; it was basically a renaming, and most of the people stayed the same.

          It has everything to do with Vulkan, given that the same organisation is handling it, and had it not been for AMD's Mantle, they would probably be discussing what OpenGL vNext should look like.

      • hnlmorg2y

        > No wonder people call it a GPU hardware abstraction API, not a graphics API.

        The entire point of Vulkan is that it’s a hardware abstraction. It was invented to offer an API around low level hardware operations rather than the typical approach of graphics libraries which come from the opposite direction.

        • pjmlp2y

          And with it they turned everyone into a device driver developer; no wonder it isn't taking off as much as desired outside of Android and GNU/Linux.

          • bsder2y

            > And with it turned everyone into a device driver developer

            Which is what graphics developers wanted.

            The problems with OpenGL and DirectX 11 were that you had to fight the device drivers to find the "happy path" that would allow you the maximum performance. And you had zero hope of doing solid concurrency.

            Vulkan and DirectX 12 directly expose the happy path and are increasingly exposing the vector units. If you want higher level, you use an engine.

            For game developers, this is a much better world. The big problem is that if you happen to be an application developer, this new world sucks. There is nowhere near the amount of money sloshing around to produce a decent "application engine" like there are "game engines".

            • astrange2y

              Hmm, "vector units" isn't the right term because GPUs usually don't use those. It's better to write your shaders in terms of scalars.

          • hnlmorg2y

            But it was never intended to be a general purpose graphics SDK.

            The way 3D rendering is done these days is drastically different from the days of OpenGL. The hardware is architecturally different, the approach people take to writing engines is different.

            Also most people don’t even target the graphics API directly these days and instead use off the shelf 3D engines.

            Vulkan was always intended to be low level. You have plenty of other APIs around still if you want something a little more abstracted.

          • vlovich1232y

            Metal and DirectX12 (if I’m remembering my version numbers correctly) are very very similar to Vulkan so I’m not really sure what point you’re trying to make.

            • dagmx2y

              They’re similar to Vulkan in being low level.

              But they’re significantly easier to target (less feature splitting) and much more ergonomic to develop with.

            • skocznymroczny2y

              I wouldn't put Metal next to DX12/Vulkan. I'd put Vulkan as the most low-level API, DX12 slightly above Vulkan (because of convenience features like committed resources and a slightly simpler memory model), but Metal I'd put somewhere in the middle between OGL/DX11 and VK/DX12. Metal introduces many optimizations such as pipeline objects, but it requires much less micromanagement of memory and synchronization than VK/DX12 do.

    • throwaway092232y

      "OpenGL was well supported on the Voodoo cards "

      Definitely not the case.

      Voodoo cards were notorious for not supporting OpenGL properly. They supported Glide instead.

      3dfx also provided a "minigl" which implemented the bare minimum functions designed around particular games (like Quake) -- because they did not provide a proper OpenGL driver.

      https://en.wikipedia.org/wiki/MiniGL

    • api2y

      > a friend of mine was on the design team and he assured me that NURBs were going to be the dominant rendering model since you could do a sphere with just 6 of them, whereas triangles needed like 50 and it still looked like crap. But Nvidia was so tight fisted with development details and all their "secret sauce" none of my programs ever worked on it.

      (1) Someone designs something clearly superior to other technology on the market.

      (2) They reason that they have a market advantage because it's superior and they're worried that people will just copy it, so they hold it close to the chest.

      (3) Inferior technologies are free or cheap and easy to copy so they win out.

      (4) We get crap.

      ... or the alternate scenario:

      (1) Someone designs something clearly superior to other technology on the market.

      (2) They understand that only things that are more open and unencumbered win out, so they release it liberally.

      (3) Large corporations take their work and outcompete them with superior marketing.

      (4) We get superior technology, the original inventors get screwed.

      So either free wins and we lose or free wins and the developers lose.

      Is there a scenario where the original inventors get a good deal and we get good technology in the end?

      • codeflo2y

        Good point. In this context, it wasn't even superior though, at least not in the long run. Memory got bigger so that storing more triangles wasn't a problem anymore, it's more about computational resources. There, NURBS are only clearly better for very smooth surfaces (like the mentioned sphere), which are rare in natural shapes. For everything else, you get more details per FLOP by just using more triangles, which is where the industry went.

      • svachalek2y

        This is what patents are for, but in the real world that gets complicated too.

    • ChuckNorris892y

      >In my opinion, Direct X was what killed it most.

      False, 3dfx killed themselves. Their graphics chips and their architecture quickly became outdated compared to the competition. Their latest efforts towards the end of their life resorted to simply putting more of the same outdated and inefficient chip designs on the same board, leading to monstrosities: GPUs with 4 chips that came with their own power supply. Nvidia and ATI were already eating their lunch.

      Also, their decision to build and sell graphics cards themselves directly to consumers, instead of focusing on the chips and letting board partners build and sell the cards, was another reason for their fall.

      Their Glide API alone would not be enough to save them from so many terrible business and engineering decisions.

      >OpenGL was well supported on the Voodoo cards and Microsoft was determined to kill anyone using OpenGL

      Again, false. OpenGL kinda killed itself on the Windows gaming scene. Microsoft didn't do anything to kill OpenGL on Windows. Windows 95 supported OpenGL just fine, as a first-class citizen just like Direct3D, but Direct3D was easier to use and had more features for Windows game dev, meaning quicker time to market and less dev effort, while OpenGL drivers from the big GPU makers still had big quality issues back then and OpenGL progress was stagnating.

      DirectX won because it was objectively better than OpenGL for Windows game dev, not because Microsoft somehow gimped OpenGL on Windows, which they didn't.

    • oppositelock2y

      I was directly involved in graphics during this time, and was also a tech lead at an Activision studio during these times of Direct3D vs OpenGL battle, and it's not that simple.

      OpenGL was the nicer API to use, on all platforms, because it hid all the nasty business of graphics buffer and context management, but in those days it was also targeted much more at CAD/CAM and other professional use. The games industry wasn't really a factor in road maps and features. Since OpenGL did so much in the driver for you, you were dependent on driver support for all kinds of use cases. Different hardware had different capabilities and GL's extension system was used to discover what was available, but it wasn't uncommon to have to write radically different code paths for some rendering features based on the capabilities present. These capabilities could change across driver versions, so your game could break when the user updated their drivers. The main issue here was quite sloppy support from driver vendors.

      DirectX was disgusting to work with. All the buffer management that OpenGL hid was now your responsibility, as was resource management for textures and vertex arrays. In DirectX, if some other 3D app was running at the same time, your textures and vertex buffers would be lost every frame, and you'd have to reconstruct everything. OpenGL did that automatically behind the scenes. This is just one example. What DirectX did have, though, was some form of certification, e.g. "DirectX 9", which guaranteed some level of features, so if you wrote your code to a DX spec, it was likely to work on lots of computers, because Microsoft did some thorough verification of drivers, and pushed manufacturers to do better. Windows was the most popular home OS, and MacOS was insignificant. OpenGL ruled on IRIX, SunOS/Solaris, HP/UX, etc, basically along the home/industry split, and that's where engineering effort went.

      So, we game developers targeted the best supported API on Windows, and that was DX, despite having to hold your nose to use it. It didn't hurt that Microsoft provided great compilers and debuggers, and when XBox came out, which used the same toolchain, that finally cinched DX's complete victory, because you could debug console apps in the same way you did desktop apps, making the dev cycle so much easier. The PS1/PS2 and GameCube were really annoying to work with from an API standpoint.

      Microsoft did kill OpenGL, but it was mainly because they provided a better alternative. They also did sabotage OpenGL directly, by limiting the DLLs shipped with Windows to OpenGL 1.1, so you ended up having to work around this by poking into your driver vendor's OpenGL DLL and looking up symbols by name before you could use them. Anticompetitive as they were technically, though, they did provide better tools.
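      For the curious, that workaround is the now-familiar "get the entry point by name" dance: anything beyond what opengl32.dll exported had to be fetched from the vendor's ICD at runtime. Roughly (a sketch, assuming the usual <GL/glext.h> typedefs are available):

          // Look up a newer GL entry point in the installed driver (ICD) by name.
          // wglGetProcAddress returns nullptr if the function isn't supported
          // (or if no GL context is current when you call it).
          #include <windows.h>
          #include <GL/gl.h>
          #include <GL/glext.h>

          PFNGLACTIVETEXTUREARBPROC glActiveTextureARB = nullptr;

          bool loadMultitexture() {
              glActiveTextureARB = reinterpret_cast<PFNGLACTIVETEXTUREARBPROC>(
                  wglGetProcAddress("glActiveTextureARB"));
              return glActiveTextureARB != nullptr;
          }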

      • pandaman2y

        I was also involved in game graphics at that time (preceding DirectX) and do not quite remember it as you do. Debugging graphics on PC was a pain: you had to either use WinDbg remote mode (which was a pain to set up to get source symbols) or SoftICE and an MDA monitor. That's just for the regular CPU debugger, because of the fullscreen mode. There was no graphics debugger until DX9. Meanwhile all consoles could be debugged from the same dev machine, and starting from the PS2 we had graphics debuggers and profilers. Even the OG Xbox had PIX, which introduced pixel debugging (though it was a bit of a pain to set up and needed the game to submit each frame twice).

        HW OpenGL was not available on the consumer machines (Win95, Win2K) at all. GLQuake used a so-called "mini-driver", which was just a wrapper around a few Glide APIs and was a way to circumvent id's contract with Rendition, which forbade them from using any proprietary APIs other than Verite (the first HW-accelerated game they released had been VQuake). By the time full consumer HW OpenGL drivers became available, circa OpenGL 2.0 time, DirectX 9 already reigned supreme. You can tell by the number of OpenGL games released after 2004 (mobile games did not use OpenGL but OpenGL ES, which is a different API).

        • oppositelock2y

          You must have worked on this earlier than me. I started with DX7 on Windows; before that I worked purely in OpenGL on workstations doing high-end visual simulation. Yes, in DX7 we used printf debugging, and for full-screen-only work you dumped to a text file or, as you say, used an MDA if necessary for interactive debugging, though we avoided that. DX9's visual debugger was great.

          I don't remember console development fondly. This is 25 years ago, so memory is hazy, but the GameCube compiler and toolchain were awful to work with, while the PS2 TOOL compile/test cycle was extremely slow and the APIs were hard to work with, but that was more hardware craziness than anything. XBox was the easiest when it came out. Dreamcast was on the way out, but I remember really enjoying being clever with the various SH4 math instructions. Anyhow, I think we're both right, just in different times. In the DX7 days, NVIDIA and ATI were shipping OpenGL libraries which were usable, but yes, by then DX was the 800lb gorilla on Windows. The only reason that OpenGL worked at all was due to professional applications and big companies pushing against Microsoft's restrictions.

          • pandaman2y

            I don't recall any slowness in PS2 development. I dreaded touching anything graphical on PC though, as graphics bugs tended to BSOD the whole machine, and rebooting 20+ times a day was not speeding anything up (all the Windows versions took their sweet time to boot, not to mention restarting all the tools you needed and recovering your workspace) lol.

    • samstave2y

      Don't forget the co-marketing from Intel's DRG (dev relations group), the group I worked in, which started in ~1995 or so for game optimization dev on SIMD and, later, AGP, OpenGL and the Unreal engine (our lab had some of the very first iterations of this - and NURBs were a major topic in the lab, especially for OpenGL render tests etc., if you recall the NURBs dolphin benchmark/demo).

      Intel would offer up to(?) (can't recall if it was a base amount, a set amount, or up to) $1 million in marketing funds if my buddy and I did our objective and subjective gaming tests between the two, looking for a subjective feel that the games ran better on Intel.

      The objective tests were to determine if the games were actually using the SIMD instructions...

    • agumonkey2y

      Wasn't MS on the OpenGL board at one point? They even bought Softimage for a while. They seemed to be interested in the whole 3D space.

      ps: the NV1 "mis-step" was really interesting. They somehow quickly realigned with the NV3, which was quite a success IIRC.

    • datpiff2y

      > he assured me that NURBs were going to be the dominant rendering mode

      How does this relate to the NV-1? I thought it used quads instead of triangles. Did it do accelerated NURBs as well?

      • rasz2y

        Reminds me of PowerVR's main selling point being tiled rendering, but they tried pushing some proprietary "infinite planes" BS

        https://vintage3d.org/pcx1.php

        "Thanks to volumes defined by infinite planes, shadows and lights can be cast from any object over any surface."

        • datpiff2y

          If it was marketing they didn't seem to do a great job... Very little online except for mentions of "NURBS", and this thread.

          It's not intuitively obvious to me how a rasterizer accelerator would render NURBS surfaces at all (edit: without just approximating the surface with triangles/quads in software, which any competing card could also do)

    • balaji12y

      Most of the upvotes on the original post seem to be nostalgic reactions to 3dfx just based on the title.

      Because the article is kinda hard to follow for the uninitiated. Heavy name and jargon dropping haha

      • x86x872y

        100%. If you didn't experience the delta between software-rendered and 3dfx hardware-accelerated live as it unfolded, 3dfx does not tell you anything.

    • ngcazz2y

      I liked the first advert. The world hunger one though really rubbed me the wrong way.

      • rl32y

        Well, I think it was taking a swipe at a particular brand of insincere feel-good corporate bullshit marketing that was fairly prevalent at the time. Faceless conglomerates trying to manage their image such that they appeared to be benevolent world citizens—not ruthless plunderers.

        That is, as opposed to, say, today's brand of insincere feel-good corporate bullshit marketing.

        • a3w2y

          They were making fun of greenwashing. Fun still does have its place in personal development, say most guides to the current climate catastrophe for human beings.

          But right now, I too wish we would have a more liveable planet instead of perfect entertainment in polluting, air-conditioned cities. But I studied the wrong topic for me to actually thrive in the former, and not the latter environment.

        • 2y
          [deleted]
    • echeese2y

      100 billion operations per second, what are we at now, 100 trillion?

      • tysam_and2y

        We are in the 4 Petaflops on a single card age currently, my friend: https://resources.nvidia.com/en-us-tensor-core/nvidia-tensor...

        It is quite insane. Now, getting to use all of them is difficult, but certainly possible with some clever planning. Hopefully as the tech matures we'll see higher and higher utilization rates (I think we're moving as fast as we were in the 90's in some ways, but the sheer size of the industry hides the absolutely insane rate of progress. Also, scale, I suppose).

        I remember George Hotz nearly falling out of his chair for example at a project that was running some deep learning computations at 50% peak GPU efficiency (i.e. used flops vs possible flops) (locally, one GPU, with some other interesting constraints). I hadn't personally realized how hard that is apparently to hit, for some things, though I guess it makes sense as there are few efficient applications that _also_ use every single available computing unit on a GPU.

        And FP8 should be very usable too in the right circumstances. I myself am very much looking forward to using it at some point in the future once proper support gets released for it. :)))) :3 :3 :3 :))))

        • dahart2y

          > We are in the 4 Petaflops on a single card age currently

          FP8 is really only useful for machine learning, which is why it is stuck inside tensor cores. FP8 is not useful for graphics, even FP16 is hard to use for anything general. I’d say 100 Tflops is more accurate as a summary without needing qualification. Calling it “4 petaflops” without saying FP8 in the same sentence could be pretty misleading, I think you should say “4 FP8 Petaflops”.

          • startupsfail2y

            At 1080p, yes, tensor cores are not used. But at 4K the majority of the pixels are filled by tensor cores (DLSS), so these FP8 ops are used.

            Of course the card linked above is a server card, not a desktop or workstation card optimized for rendering.

            What is that Megatron chat in the advertisement? Does it refer to a loser earth destroying character from Transformers? Rockfart?

            • dahart2y

              Oh yeah, excellent point, I should not draw lines between graphics and ML — graphics has seen and will continue to see more and more ML applications. I hope none of my coworkers see this.

              I guess Megatron is a language model framework https://developer.nvidia.com/blog/announcing-megatron-for-tr...

            • tysam_and2y

              Megatron is a Large Language Model -- unfortunately it seems they really undertrained it for the parameter counts it had, so it was more a numbers game of "hey, look how big this model is!" when they first released it.

              Many modern models are far more efficient for inference IIRC, though I guess it remains a good exercise in "how much can we fit through this silicon?" engineering. :D

          • tysam_and2y

            I did mention it, at the end! That's why I made the qualification, it is an important difference.

            Though as the other commenter noted, NVIDIA does like getting their money's worth out of the tensor cores, and FP8 will likely be a large part of what they're doing with it. Crazy stuff. Especially since the temporal domain is so darn exploitable when covering for precision/noise issues -- they seem to be stretching things a lot further than I would have expected.

            In any case -- crazy times.

        • swyx2y

          what is the usual range of flop utilization (10-30%?) and is there a resource for learning more about the contributing factors?

          • tysam_and2y

            I've seen anywhere from 20%-50% on large, fully-GPU-saturating models (that are transformers). And the one that Hotz was reacting to was a tiny CNN (< 10 MB) that still used the GPU pretty efficiently in the end.

            I think that's roughly the upper limit. Your contributing factors are going to be:

            1. How much can you use tensor cores + normal CUDA cores in parallel (likely something influenced by ahead-of-time compilation and methods friendly to parallel execution, I'd guess?)
            2. What's the memory format someone is using?
            3. What's the dataloader like? Is it all on GPU? Is it bottlenecked? Some sort of complex, involved prefetching madness?
            4. How many memory-bound operations are we using? Can we conceivably convert them to large matrix multiplies?
            5. How few total kernels can we run these calls in?
            6. Are my tensors in dimensions that are a factor of 64 by 64 (if possible), or if that's not really helpful/necessary/feasible, a factor of 8?
            7. Can I directly train in lower precision (to avoid the overhead of casting in any kind of way)?

            That should get you pretty far, off the top of my head. :penguin: :D :))))) <3 <3 :fireworks:

            • swyx2y

              thats a pretty dang good head. thank you!!

  • M4v3R2y

    I have fond memories of playing Quake 2 for some time and then buying a Voodoo card. It suddenly looked like a totally different game. It wasn’t just the resolution and texture filtering - Quake 2 in GL mode used a totally different, dynamic lighting system, and back then it was simply stunning.

    Comparison pics: https://www.marky.ca/3d/quake2/compare/content.html

    • deanCommie2y

      I...totally prefer the SVGA "lower resolution" version, and I remember feeling like that at the time too.

      The problem was the 3D version looked/looks just plain BLURRY.

      Something about my brain is able to understand when they see a pixelated video game that it's an artifice, and give it a ton of benefit of the doubt.

      I can't say it is in any way more realistic, but it feels more IMMERSIVE.

      Whereas with the higher resolution one, it's an uncanny valley. Everything is too smooth and Barbara Walters-y.

      Again, this is my memory of how I perceived it at the time, this isn't modern rose coloured glasses because obviously technology has improved dramatically.

      • TOGoS2y

        Same. I have fond memories of first playing Doom on a 386 with the resolution set to extra low (approximately halving 320x200). The space between the pixels gives your imagination something to do. Sort of like how you can make a room look better by turning off the ceiling lights and just having a small lamp on your desk. I kind of think even if we get 120fps perfectly raytraced VR, it'll never be able to quite achieve the big pixel immersion experience.

        • inDigiNeous2y

          I felt similar things with the original Oculus Rift DK1 prototype. There was something magical when the pixel resolution was lower; your mind had to do more work and it somehow felt more immersive due to this.

          At least in my eyes - it might also be because it was the first VR headset I tried, though. But the later DK2 and newer models did not capture that same feeling I had with the original DK.

      • CodeArtisan2y

        People were already having this debate back in the 90's with Playstation vs. Nintendo 64. PSX was too pixelated while the N64 was too blurry.

        • snek_case2y

          Probably because the textures were just too low resolution. You add bilinear filtering to low-res textures, you get something that looks blurry.

          • monocasa2y

            Yeah, the texture memory was anemic. 4KB total. If you had mipmapping on, that ate half leaving you with 2KB which is only enough to fit a single 32x32 pixel texture at 16-bit color.
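            For anyone checking the arithmetic, that budget works out exactly:

              $32 \times 32 \times 16\ \text{bits} = 16{,}384\ \text{bits} = 2048\ \text{bytes} = 2\ \text{KB}$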

        • deanCommie2y

          OMG I thought I was the only one. I was on team Playstation and my friends at the time were all raving about N64 and I was like "Am I the only one that genuinely thinks this looks WORSE?"

          • rasz2y

            Both teams were delusional, it all looked bad on consoles :) One had uncorrected perspective, fixed-point math resulting in gaps in geometry, and no filtering; the other had super low-res textures resulting in a blurry mess.

      • doubled1122y

        This is one of the reasons Quakespasm is my favourite Quake source port.

        I can run on a modern system at modern resolutions but it still looks like it did back in the day.

        Sometimes adding effects and filtering gives a strange broken feeling where the art on the screen no longer matches what is in my head or something.

        I can’t handle upscaling old console games either. Something is always weird.

      • anthk2y

        On 'modern' games, Max Payne 1/2 and Alan Wake didn't look "smooth and clean" like Unreal Engine based games, but crisp and "real world" like.

    • bberrry2y

      I was quite into QWTF (the original Team Fortress) and being able to use OpenGL allowed you to have gorgeous see-through water. This was a massive advantage because players tried to avoid snipers by going through the water. Also super useful on well6 in the flag room. Ah, the nostalgia.

      • doikor2y

        The original pay-to-win. Another classic one was being able to afford (or live in a university dorm with) a good internet connection.

    • _the_inflator2y

      Yes, day and night. I specifically bought my 3dfx accelerator card to boost Psygnosis' Formula 1. Adding the card was like switching from a C64 to a future PC monster.

      I never again experienced such a phenomenal gap in visible performance. Happy times.

    • 4RealFreedom2y

      I remember playing Quake 2 and colored lighting followed your bullets! I was floored.

    • johnwalkr2y

      Have you played Quake 2 RTX (ray tracing version)? 25 years later it's yet again a quantum leap in the same game!

    • actually_a_dog2y

      I loved being able to play Diablo 2 with my Voodoo 5. There could be 100 mobs, plus shitloads of fog effects and fire on screen, and I'd still get like 60 fps.

      • Aeolos2y

        Diablo 2 was actually programmed to run at 25 fps.

        Diablo 2: Resurrected allows you to switch between the new 4k60fps graphics and the original - it's fascinating to see the difference.

        • robbintt2y

          Oh you can go way above 60fps! I'm clocking a stable 144hz on ultra!

    • russdill2y

      I really liked that the Pure3D card [1] had a composite video output which allowed me to play Quake2 on my Apple //c monitor. Very bizarre retro future feel.

      1: https://www.tomshardware.com/reviews/3d-accelerator-card-rev...

      • NovaDudely2y

        I remember seeing this in a store back in 97 and being blown away by the idea of Windows Games running on a TV. I cannot remember the game they were demoing but I remember it wasn't the most amazing demonstration. I think it was just the difference between the bright colours of Mario 64 compared with some very muddy looking racer.

    • sedatk2y

      Yes, I had to play Quake 2 at 320x240 due to the low performance of my S3 Virge, but still got floored by the lighting.

      • 4RealFreedom2y

        I was trying to remember the first hardware acceleration I saw - it was the S3 Virge.

        • muro2y

          Did it do anything? I remember S3 cards saying you would get HW acceleration, but it was impossible to tell the difference from software rendering.

          • 4RealFreedom2y

            It worked at very low frames-per-second. Edges and textures were more well-defined. I was very disappointed. I picked up a Voodoo card soon after. S3 made a major mistake releasing unusable 3d acceleration.

          • rasz2y

            The Virge provided a hardware path at the approximate speed of a Pentium 100 rendering 320x200 in software. If your CPU was faster, switching to the Virge was a speed downgrade.

          • sedatk2y

            It had better lighting at least.

    • ranger_danger2y

      personally I like the SVGA screenshots much better

      • 107292872y

        Probably because it looks way different than what we play today, while the OpenGL version is more similar to a modern FPS, just not as polished. But back in the day, OpenGL Q2 was an out-of-this-world experience.

      • M4v3R2y

        Like the sibling comment said, it looked way more impressive back in the day. Plus it looks better in motion, precisely because of the dynamic lighting.

      • ido2y

        I thought the same (and I was around at the time & in the 90s thought the 3dfx version was amazing)! Today the SVGA version looks like a cool retro style, whereas the blurry/smooth 3dfx version just looks cheap rather than like a stylistic choice (in reality neither was a stylistic choice at the time & both were determined by the hardware available).

  • alexwasserman2y

    I had a PowerMac 6500/300 back in my mid-teens as my first computer.

    To play games I later bought a PC Voodoo 3 which you could flash with a Mac-version ROM. Much cheaper than buying an actual Mac version.

    Unreal was incredible. The whole card was incredible.

    Later I put a Linux distro on the computer too, and I needed a custom patch to the kernel (2.2.18, I think) to get the Voodoo drivers working for accelerated 2D and 3D rather than the software frame buffer. It was incredible to see a Gnome 1.4 or KDE 2 desktop in high res perform really well.

    It was also pretty buggy. Due to experimental drivers in both Linux and Mac, I assume. The weirdest was that sometimes a bug in Linux would dump whatever was in memory onto the screen, which shortly after booting from Mac to Linux could result in Mac OS 9 windows appearing on a Linux desktop. You obviously couldn’t interact; it would happen in a kernel-panic-type event where the computer would freeze with an odd mix of stuff on the screen.

    Was a fun intro to Linux, not getting a nice windowing environment till after I’d learnt to patch and compile a kernel.

    Back then it was normal to play with more OSs it seems like. That computer had Mac, Linux, BeOS and more on it.

  • cronix2y

    Nice trip down memory lane. I still remember when I popped the 3dfx in and played quake, after playing quake on some ATI or Matrox I had (and a hell of a lot of other games before that lol). It was a transformative experience. I was stunned at how smooth everything was. It was beautiful. It was more incredible than going from 320x200x16(colors) to 640x480x256 and then 1024x768x16.8M, which were all quite marvelous increments. I think Moore's Law was just more visible in the early days. You really felt each iterative change. Going from "PC Speaker" to an Adlib was also a massive transformation.

    • greggsy2y

      Interestingly, the transistor density of GPUs has been following a roughly logarithmic curve since 2000, compared to the linear increase in x86 processors [1].

      I totally agree that the incremental innovations observed in earlier GPU platforms felt much, much more ‘obvious’ though.

      It’s as if the ‘wow factor’ of graphics hardware doesn’t scale at the same rate as density.

      Or perhaps releases were more spread out than they are today (compared to the annual release cycle expected today) making the jumps more obvious.

      [1] https://www.researchgate.net/figure/Comparison-of-NVIDIA-gra...

      • Taniwha2y

        I was designing Mac graphics accelerators in the early 90s. One thing I learned is that perceived performance follows a sort of S curve: there's an area where everything is terribly slow and any change is good, but it's still slow. Then you hit the curve, where every performance increase is obvious, wonderful, and makes new stuff possible - that's where you start selling big if you're leading the pack. Then you hit an area where performance is "good enough"; double the performance and the perceived increase is far less - now you're fighting for "cheapest", which is not so good for your company.

        That was for 2D, bigger faster 3D enables new sorts of games so that market has been growing for far longer.

        • christkv2y

          It feels like we are two or three generations away from that in gpus for games.

          • beebeepka2y

            I admire your optimism, but with the next big thing being ray/path tracing, voxels and whatnot, I'd say we're not even ten generations away.

            Good thing nice graphics do not equal good games. My favourite multiplayer FPS games I prefer in glorious picmip 5 detail.

            • christkv2y

              Yeah, the amount of sameness in open world games is getting boring. It seems big budget games are afraid to take risks. Maybe generative AI will help smaller teams build bigger worlds for us.

              • beebeepka2y

                I've been enjoying the wave of excellent boomer shooters very much. Turbo Overkill, for example, has the best movement mechanics, and fantastic gun play. Art and enemies, while totally acceptable, could be better. Not sure how AI would help but imagination is not among the very short list of my talents

      • Gordonjcp2y

        I wonder if part of that is because GPU tasks are ridiculously parallelisable?

        If you can split the screen into 64 equal chunks, there's nothing except silicon real estate stopping you splitting it into 128, or 256, or 2048. Think about how SLI worked, in the Voodoo II olden days.

    • kyriakos2y

      Adlib - I'd forgotten it even existed, but you are correct. These were generational leaps in PC gaming.

    • sporkland2y

      I had the same experience. Running qw.exe and it just feeling substantial and beautiful compared to the software renderer. I have been chasing a transformative high like that since. I sat out the initial VR iterations hoping for the biggest wow factor when I finally strapped on the HTC vive connected to a PC. It was great but still not at that level.

      One aspect of this story I've never seen covered is how Nvidia managed, as a quiet dark horse, to come from behind and crush 3dfx in a few years with the TNT and TNT2. The article talks about the GeForce 256, but 3dfx's crown was stolen before that.

  • formerly_proven2y

    3dfx is another example of "worse is better" or "slope vs intercept". This article doesn't quite spell it out, but as far as I can tell, 3dfx had one GPU architecture that was tweaked from 1995 till their end. It was much better for a few years than what the competitors put out, but the competitors kept iterating. nVidia created several generations of GPUs in the same time frame. 3dfx started higher but had no slope. Everyone else started much lower, but had a lot of slope (nVidia and ATI in particular). Two of these went ahead and started creating a new "fastest ever GPU" every other year for a quarter century, the other tried putting more of the same GPU on bigger boards and folded.

    • rasz2y

      One of my favorite facts is about Nvidia's release cycle speed. At the peak of the Nvidia-3dfx war, new chips were coming out every 6-9 months:

      Riva 128 (April 1997) to TNT (June 15, 1998) took 14 months, TNT2 (March 15, 1999) 8 months, GF256 (October 11, 1999) 7 months, GF2 (April 26, 2000) 6 months, | 3dfx dies here |, GF3 (February 27, 2001) 9 months, GF4 (February 6, 2002) 12 months, FX (March 2003) 13 months, etc ...

      Nvidia had an army of hardware engineers always working on 2 future products in parallel; 3dfx had a few people in a room.

    • djmips2y

      Yes, I agree wholeheartedly

      Nvidia had a supercomputer and great hardware design software tools that were a trade secret, kept basically behind an off-limits curtain in the center of their office, and it helped them get chips out rapidly and on first turn. First turn means the first silicon coming back is good, without requiring fixes and another costly turn.

      I'd say 3dfx weren't poised to industrialize as well as Nvidia and they just couldn't keep up in the evolutionary race.

      I'm not sure I understand where your worse is better idiom fits because 3dfx was better and Nvidia was worse but iterated to get better than 3dfx and won the day. Truly if worse was better in this case 3Dfx would still be around?

      On the other hand, triangle-based rendering is a case of worse is better, and Nvidia learned that and switched course from their early attempts with NURB-based primitives.

      • 2y
        [deleted]
      • johnwalkr2y

        As soon as other cards had 32-bit color, Voodoo cards with 16-bit color looked a lot worse.

        • rasz2y

          Nvidia's 16-bit color depth up to the GF256 was rendered internally at 16-bit precision and looked really bad, while their 32-bit mode was just a marketing bullet point due to the 100% performance hit when enabled. 3dfx had a 16-bit frame buffer, but internally rendered at 32 bits and the output was dithered, resulting in >20-bit color: https://www.beyond3d.com/content/articles/59/

          "Voodoo2 Graphics uses a programmable color lookup table to allow for programmable gamma correction. The 16-bit dithered color data from the frame buffer is used an an index into the gamma-correction color table -- the 24-bit output of the gamma-correction color table is then fed to the monitor or Television."

      • Infernal2y

        Where can I learn more about this supercomputer and design tooling?

        • djmips2y

          I know about it because I visited Nvidia several times in the nineties as I was a driver engineer implementing their chips on OEM cards.

          I tried to find the information and the best I could find is this better than average discussion/podcast on the history of Nvidia.

          They briefly touch on the chip emulation software that they felt they desperately needed to get back into the game after the NV1 was relegated.

          The NV3 (Riva 128) was designed rapidly (six months) with the use of what I called their supercomputer (most likely a cluster of PCs or workstations) running the proprietary chip emulation software. This advantage continued through further evolutions of Nvidia hardware generations.

          IIRC the chip emulation startup was started by a university friend of Jensen. The podcast says they failed later which is unfortunate.

          https://www.acquired.fm/episodes/nvidia-the-gpu-company-1993...

  • Teslazar2y

    A few more historical bits that might be interesting.

    Brian Hook was the 3dfx engineer who was the original architect of Glide. He went on to work for id Software in 1997.

    Michael Abrash wrote the first 3d hardware support for Quake, while working for id Software, but it wasn't for 3dfx it was for the Rendition Verite 1000 and released as vQuake.

    John Carmack was, of course, the lead programmer of Quake.

    Hook, Abrash, and Carmack ended up working at the same company again at Oculus VR (and then Meta).

  • mk_stjames2y

    The retail box design [1] of the Voodoo 3 2000 is still burned into my memory even after 23 years. The purple woman's face, 6 million triangles per second... just, sold. 1999 was just an amazing time.

    [1]: https://archive.ph/BNLiX/f3757388e1ee7008f0bad22261c625f1dcf...

    • egeozcan2y

      My brand-new Voodoo graphics card in its shiny package was the only thing I took with me as a teenager when I was leaving the flat in a hurry after the big earthquake in 1999[0]. I remember noticing the missing silhouettes of distant buildings with some fires burning behind where they used to stand and thinking, oh boy we surely won't have electricity soon.

      You tend to have different priorities at that age, I guess.

      It's a dark memory, sure, but probably the packaging somehow made me get attached to a "stupid computer part" (not my words) and that's interesting.

      [0]: https://en.wikipedia.org/wiki/1999_%C4%B0zmit_earthquake

    • rayiner2y

      1999 was such a good tech year.

    • dehrmann2y

      The design of their second logo (the one in that image) held up pretty well.

      • moosedev2y

        In retrospect, it reminds me of the Amazon logo [0]. The orange swoopy thing evokes the Amazon A-to-Z "smile" (2000 onwards, although they had a more generic orange swoopy thing before that). Never noticed that similarity before.

        I also had a Voodoo 3 with that box design back then, but different colors (Voodoo 3 3000 model). Actually still have the card...

        [0] https://blog.logomyway.com/history-amazon-logo-design/

    • user_named2y

      Same

  • ddeck2y

    >On the 15th of December in 2000, NVIDIA announced that they would acquire 3dfx Interactive’s assets for $70 million ($121,613,821.14 in 2023) and 1 million shares of common stock.

    Those million shares are worth about $5.7 billion today ($238 stock, 1:24 cumulative split since then).
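
    (Checking the arithmetic as stated: 1,000,000 shares × 24 cumulative split × ~$238/share ≈ $5.7 billion.)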

    • nerpderp822y

      Heh, I owned 3dfx stock when they folded. I saw $0 of that transaction.

      • dexterdog2y

        Same here. It was the first and unfortunately last tech stock that I took a gamble on because it failed so gloriously and I was so sure of it.

      • cbm-vic-202y

        Same here! Retail investors got screwed on this deal.

      • djmips2y

        Pretty sneaky huh?

        • cowsandmilk2y

          What’s sneaky? The company was going bankrupt and the deal was approved by shareholders. When you go bankrupt or dissolve, creditors get paid before shareholders. That someone owned shares and saw nothing is how bankruptcy works (and should work).

          • wildermuthn2y

            It feels sneaky because in these deals someone always seems to get rich except those who were teased into investing their time (employees with “equity”) or invested with small money (stockholders) rather than big money (VCs, banks). I have no idea about the particulars of this deal, so maybe no individual got rich off it, and the VCs/banks only recouped a portion of their investment. But in my experience, a failure usually still ends up as a success for those who are in a position to negotiate a deal. In other words, Nvidia didn’t do this deal out of the kindness of their hearts: there was still something of value to them that they paid for, and I’d be quite surprised if the board of 3dfx didn’t find a way to capture some of that value personally. Saying that “creditors get paid before shareholders” doesn’t capture the full story of most of these sales, where some equity owners are special and some aren’t.

          • bcrosby952y

            Common shareholders get peanuts, the C-suite gets tens of millions in retention incentives. Not saying that's what happened here, but it has happened to me in the past. As a small-time shareholder, what do you do? Assuming it's even mathematically possible to do anything.

            • dehrmann2y

              It was bankruptcy, so I suspect all shareholders got nothing. You're probably thinking of a startup getting bought for less than investors put in and finding out about liquidation preferences.

  • MontyCarloHall2y

    >Voodoo Graphics and GLide were the standard in the PC graphics space for a time. 3dfx created an industry that is going strong today, and that industry has affected far more than just gaming. GPUs now power multiple functions in our computers, and they enable AI work as well.

    >This tale brings up many “what ifs.”

    What if 3dfx had realized early on that GPUs were excellent general-purpose linear algebra computers, and had incorporated GPGPU functionality into Glide in the late 90s?

    Given its SGI roots, this is not implausible. And given how NVidia still has a near stranglehold on the GPGPU market today, it’s also plausible that this would have kept 3dfx alive.

    • monocasa2y

      I don't think so. Part of what made consumer GPUs viable at that gate count was their laser-like focus on their specific rasterization workloads. More general linear algebra solutions wouldn't have been viable in the marketplace. You wouldn't see the viability of more general hardware until the late-90s advent of register combiner hardware.

    • aldrich2y

      History is funny, because at that time in the 90s there was a company called Bitboys Oy. That company was founded by some Finnish demoscene members and was developing a series of graphics cards, Pyramid3D and Glaze3D, with a programmable pipeline around 1997-1999 [1]. This was around 5 years before the first commercial shader-capable card was released.

      Even though Wikipedia classifies it as vaporware, there are prototype cards and manuals floating around showing that these cards were in fact designed and contained programmable pixel shaders, notably:

      - The Pyramid3D GPU datasheet: http://vgamuseum.info/images/doc/unreleased/pyramid3d/tr2520...

      - The pitch deck: http://vgamuseum.info/images/doc/unreleased/pyramid3d/tritec...

      - The hardware reference manual: http://vgamuseum.info/images/doc/unreleased/pyramid3d/vs203_... (shows even more internals!)

      (As far the companies go: VLSI Solution Oy / TriTech / Bitboys Oy were all related here.)

      They unfortunately went bust before they could release anything, due to a wrong bet on memory type (RDRAM, I think) that they let their architecture rely on, then running out of money, and perhaps some other problems. In the end their assets were bought by ATI.

      As for 3dfx, I would highly recommend watching the 3dfx Oral History Panel video from the Computer History Museum with 4 key people involved in 3dfx at the time [2]. It's quite fun, as it shows how 3dfx got ahead of the curve by using very clever engineering hacks and tricks to get more out of the silicon and data buses.

      It also suggests that their strategy was explicitly about squeezing as much performance out of the hardware as possible, and making sacrifices (quality, programmability) there, which made sense at the time. I do think they would've been pretty late to switch to the whole programmable pipeline show, for that reason alone. But who knows!

      [1] https://en.wikipedia.org/wiki/BitBoys

      [2] https://www.youtube.com/watch?v=3MghYhf-GhU

    • 2y
      [deleted]
    • zozbot2342y

      Early 3dfx cards did not do any linear algebra or even 3D rendering. They accelerated triangle rasterization in screen coordinates; everything else was done by the CPU. So, 2.5D at most really.
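
      To make that concrete, here's a minimal sketch (my own illustration, not Quake or 3dfx code) of the per-vertex work the CPU still had to do before handing a screen-space triangle to the card:

        #include <math.h>

        typedef struct { float x, y, z; } Vec3;               /* camera-space position */
        typedef struct { float sx, sy, oow; } ScreenVertex;   /* oow = 1/w, kept for perspective-correct texturing */

        /* Project a camera-space vertex (camera at origin, looking down -z)
           into pixel coordinates; only then does the rasterizer take over. */
        ScreenVertex project(Vec3 v, float fov_y, float aspect, int screen_w, int screen_h)
        {
            float f = 1.0f / tanf(fov_y * 0.5f);
            float w = -v.z;                        /* perspective divide term */
            ScreenVertex out;
            out.oow = 1.0f / w;
            out.sx  = (v.x * f / aspect) * out.oow * 0.5f * screen_w + 0.5f * screen_w;
            out.sy  = (-v.y * f)         * out.oow * 0.5f * screen_h + 0.5f * screen_h;
            return out;
        }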

      • NovaDudely2y

        It is funny playing titles from that era on modern hardware, as it is apparent just how much the CPU was doing. Mostly due to the lack of optimization in the geometry routines past basic use of MMX: frame rates are still high, just nowhere near as high as you would expect.

        I remember when a friend first got an i7 machine and we decided to see just how fast Turok 2 would go. Having seen Quake 3 go from barely 30 fps to up near 1,000 fps over the same time period, we figured it would be neat to see. Turns out it could barely break the 200 fps mark, even though the clock rate was a good 8 times that of the PC we originally played it on at near 60 fps.

        No use of SSE, no use of T&L units or vertex/pixel shaders. It is all very much just plain rasterisation at work.

        • rasz2y

          > past basic use of MMX

          MMX is fixed point and shares register space with FPU. Afaik not a single real shipped game ever used MMX for geometry. Intel did pay some game studios to fake MMX support. One was 1998 Ubisoft POD with a huge "Designed for Intel MMX" banner on all boxes https://www.mobygames.com/game/644/pod/cover/group-3790/cove... while MMX was used by one optional audio filter :). Amazingly someone working in Intel "developer relations group" at the time is on HN and chimed in https://news.ycombinator.com/item?id=28237085

          "I can tell you that Intel gave companies $1 million for "Optimized" games for marketing such."

          $1 million for one optional MMX optimized sound effect. And this scammy marketing worked! Multiple youtube reviewers remember vividly how POD "runs best/fastest on MMX" to this day (LGR is one example).

          • NovaDudely2y

            I was using MMX as just an off the top of my head example but you are completely right. SSE would have been a better example. ;)

            Also had no idea about them paying for those optimizations but I am not surprised one bit. It is very in character for Intel. ;)

  • djmips2y

    This is a really great retrospective on 3dfx from four of the founders.

    3dfx Oral History Panel with Ross Smith, Scott Sellers, Gary Tarolli, and Gordon Campbell (Computer History Museum)

    https://www.youtube.com/watch?v=3MghYhf-GhU

    • minimaul2y

      I really recommend sitting down and listening to this - it's a very good cover of 3DFX from start to finish (and I feel like it was probably one of the primary sources for this article).

      • hnarayanan2y

        Yes, it looks like the article was very heavily sourced from this.

    • hnarayanan2y

      This is so very good and even more engaging than the article.

      • muro2y

        The article has a weird writing style, something is off. Maybe it feels like a high school essay?

        • hnarayanan2y

          It does, doesn't it? It just feels very abrupt.

  • 4RealFreedom2y

    I remember getting my first Voodoo card. After playing games on it, I couldn't stand software rendering anymore. Purchased a Voodoo 2 when they came out. Reminds me of Unreal, Shogo: Mobile Armor Division, Thief, Mechwarrior 2 - so many fun older games! My next card was an Nvidia TNT2. I've owned Nvidia cards ever since.

    • creinhardt2y

      Dang, hadn't thought about Shogo since playing a demo a loooong time ago. Thanks for triggering a good memory!

  • casenmgreen2y

    > The company endured litigation for securities fraud for quite sometime, declared bankruptcy in December of 1994, and became Aureal Semiconductor Inc on the 9th of September in 1995. This company would also go bankrupt and later be acquired by Creative.

    Aureal produced revolutionary audio cards, with real 3D audio - and I mean real. Close your eyes and you would still track the sounds in the space around you with your eyes, purely from your hearing. It changed everything. I played HL with this, and CS too.

    I may be wrong, but my understanding is Creative sued them into bankruptcy, bought what remained, and never used that technology. I have never forgiven Creative for this.

    • 4RealFreedom2y

      I remember it like that, too. It wasn't a product war, it was a war of court battles. Creative had more money and was able to chip away at Aureal through litigation and caused their collapse.

  • bottlepalm2y

    Man, the first 3dfx was such a mind-blowing experience. There was nothing like it at the time. Games went from a rough pixelated mess to super smooth and sharp. The PS1 and N64 had an edge over PC graphics in the 90s; 3dfx turned the tables.

    • bennysonething2y

      This wasn't my experience. I bought a Voodoo 3, and playing Quake 1 and Quake 2 was just disappointing; at the time it felt like GoldenEye was next-gen compared to those flagship PC games. I desperately wanted it to be a step up. It didn't feel like it. Also, it's sort of odd to me that Quake 2 is seen as revolutionary. In my opinion GoldenEye blows it out of the water, and it was released in the same year.

      Edit, obviously this is all subjective, and it's about fond memories! :)

      • bottlepalm2y

        GoldenEye levels are very simple, boxy, and low poly count. Quake 1 levels were far more complex and impressive than GoldenEye's, though pixelated and low-res, which is what 3dfx revolutionized with the Voodoo 1.

  • uberduper2y

    I worked at 3dfx in their final months. I watched as workers from NVIDIA broke down and hauled off a large number of netapp shelves (seemingly without labeling them).

    3dfx was run like a frat house. It was fun while I was there, but it wasn't a surprise when they announced they were shutting down.

  • harel2y

    I was managing a CD-ROM shop when the Voodoo was a thing. We bought one for the store's 486DX computer and I remember the before-and-after shock of playing Tomb Raider and demoing it to shoppers. It felt like "The Future"...

  • BruceEel2y

    Also worth mentioning: the 3dfx Voodoo SDK, docs and examples. For those like me (zero previous experience with 3D APIs) it was a lot of fun to play with.

    And! Still available today: DOSBox-X has Voodoo1 support built-in (or host pass-through Glide support), and Open Watcom C++ works well in DOSBox and will compile most examples.
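
    For anyone curious what those examples look like, here is a from-memory skeleton of the Glide 2.x boilerplate; treat the exact signatures and constants as approximate and check them against glide.h in the SDK:

      /* From-memory sketch of a minimal Glide 2.x program; verify against glide.h. */
      #include <glide.h>

      int main(void)
      {
          GrHwConfiguration hw;
          GrVertex a, b, c;

          grGlideInit();
          if (!grSstQueryHardware(&hw)) return 1;   /* no Voodoo found */
          grSstSelect(0);
          grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                       GR_COLORFORMAT_ABGR, GR_ORIGIN_UPPER_LEFT, 2, 1);

          /* Flat-shaded constant color, no texturing. */
          grColorCombine(GR_COMBINE_FUNCTION_LOCAL, GR_COMBINE_FACTOR_NONE,
                         GR_COMBINE_LOCAL_CONSTANT, GR_COMBINE_OTHER_NONE, FXFALSE);
          grConstantColorValue(0x00FF8000);         /* packed color; packing follows the color format */

          /* Screen-space vertices; the CPU supplies everything pre-transformed. */
          a.x = 320.0f; a.y = 100.0f;
          b.x = 160.0f; b.y = 380.0f;
          c.x = 480.0f; c.y = 380.0f;

          grBufferClear(0x00000000, 0, GR_WDEPTHVALUE_FARTHEST);
          grDrawTriangle(&a, &b, &c);
          grBufferSwap(1);

          grGlideShutdown();
          return 0;
      }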

    • christkv2y

      Otherwise, PCem also has full Voodoo support.

  • userbinator2y

    Not too long ago someone recreated a Voodoo5 6000: https://news.ycombinator.com/item?id=32960140

    The (leaked?) datasheets and miscellaneous information for the GPUs seem to be widely available, but that's still quite impressive for a single person; on a similar level to making your own motherboard: https://news.ycombinator.com/item?id=29273829

    • dehrmann2y

      > on a similar level to making your own motherboard

      How hard would it be to make a motherboard from that era?

  • jvolkman2y

    I still remember installing a Voodoo and playing glquake. It was so amazing after relative disappointments like the S3 Virge. Lots of nostalgia for that period.

    The article touches on this a bit, but one of the quirky things about the original Voodoo and Voodoo 2 was that they lacked 2D entirely. You had to use a short VGA passthrough cable to some other 2D card. This also meant that they only supported fullscreen 3D, since 2D output was completely bypassed while the 3dfx card was in use.

    The Voodoo 3 finally came with 2D, but I think I jumped ship to Nvidia by then.

    • beebeepka2y

      I think the Voodoo Banshee was a full-blown 2D+3D solution that was released before the Voodoo 3. Don't take my word for it, though.

      • vikingerik2y

        It was, but the Banshee wasn't popular because the 3d side was only half a Voodoo 2, with one texture unit instead of two. It had a higher clock speed to try to compensate but that wasn't enough.

        The Voodoo Rush a year before that also had problems - it was a Voodoo 1 chip, but coupled to a slow 2d processor (back when that still mattered, like for just drawing windows or wallpaper in Windows) and had some compatibility problems.

        It took until 3dfx's third iteration, with the Voodoo 3, to finally get 2d/3d integration right.

      • smcl2y

        And the Voodoo Rush was a 2D/3D card that came out a little before that (August 1997, compared to the Banshee in late-1998). I didn't own either, mind, my only 3dfx card was a Voodoo 3 2000 PCI :D

        • beebeepka2y

          Ah, you're absolutely right. Forgot about the Voodoo Rush. I was a poor kid and couldn't afford either.

          • smcl2y

            Oh me too - my motherboard had no AGP slot (hence the V3 2000 PCI). If I was smart I would've saved up for a few months, bought a more future-proof motherboard (I was on the dying Socket 7 platform, with an AMD K6-2 466MHz) and I would've had a nice little setup that would've kept me going for a while. But I was an impatient kid, and I wanted to play Quake 3 and ended up doing piecemeal upgrades over the next few years that ultimately cost me more overall :D

    • astrostl2y

      > You had to use a short VGA passthrough cable to some other 2D card

      And the lossy, double-clutched analog nature of this crushed 2D clarity IME. I could immediately tell the difference just looking at a desktop screen. It was by far my biggest grievance and resulted in me staying away until fully-integrated solutions were available.

  • tablespoon2y

    > Gordon Campbell grew up in Minnesota, and as there wasn’t a market for technology there at the time, he felt he needed to go to Silicon Valley in 1976.

    That seems like a bit of anachronistic Silicon Valley mythology. Wasn't Minnesota something like the "Silicon Valley" of mainframes at the time? There were several then-major computer companies there: CDC, Cray, Honeywell, IBM, Univac, etc.

  • antiterra2y

    The 2D/3D Voodoo Rush performed so poorly that there was an offer to return the 3D daughterboard in order to get a proper standalone Voodoo card. You could still use what remained of the card for 2D. I did this and they sent me a retail Diamond Monster 3D.

    I thought I was buying something better than a Voodoo and got it at release before there were any reviews. Lesson learned.

    • bitwize2y

      Mesa had a trick that let it capture the framebuffer off your regular Voodoo card and display it in a window.

      I did this with my Voodoo 2. It was like having a Voodoo Banshee. Well, maybe not quite like, but my in-window Quake 2 framerates were quite decent.

    • christkv2y

      I remembered it being jokingly called the Voodoo Rushed.

  • Neil442y

    I remember the audible ‘clunk’ of the relays snapping in on those old pass-through cards. Always the sound of a good time. I also remember seeing a guy at a LAN party carrying his whole box by the pass-through cable once; that made a few people wince.

    • jeffybefffy5192y

      That is wild, but I can totally imagine it, as the pass-through cable was chunky.

  • exabrial2y

    800mb/s 23+ years ago is absolutely berserk.

    I remember being in high school, seeing the butter-smooth high frame rates (for the time) and just being like: wow.

  • unxdfa2y

    I remember these days. I spent a lot of time buying GPUs and eating ramen. Unreal Tournament 99 on a Voodoo 3 3000 was the peak for me.

    Now I have an Intel iGPU and don’t care. And I can afford to eat better :)

  • severino2y

    Funny thing: when I heard about those "3d graphics accelerator" cards in the late 90s, I just thought they were meant to make your 3D games run faster. But Quake II and similar games already ran very fast on my machine.

    It wasn't until I went to a cybercafe, where they had Quake II (among others) installed for network play on computers with one of those 3dfx cards, that I actually saw what an accelerator card could turn a game into, regardless of speed. I decided that I needed one of those.

    • AlecSchueler2y

      That sounds like a really cool experience. Could you expand further on the differences you saw?

      • severino2y

        Something similar to the first screenshot in the article. Until then, Quake II with software rendering was a cool game for me, but a game where you saw pixels everywhere, like every other game at the time. A discrete graphics card, on the other hand, made the game's graphics take a huge leap forward. It was a quality (and not just speed) I hadn't seen before, so much so that it was almost a different game for me. I guess it was 1999 or so.

      • corysama2y

        Going from low-res, point-sampled to high-res, bilinear-filtered (+ antialiasing if you were lucky) is a huge quality leap. Basically PlayStation 1 vs PS2.
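
        In code terms, the difference is roughly this (a plain-C sketch for a grayscale texture, not how any particular card implements it): point sampling grabs the nearest texel, bilinear blends the four surrounding ones.

          #include <stdint.h>
          #include <math.h>

          typedef struct { int w, h; const uint8_t *pixels; } Texture;  /* single channel for brevity */

          static uint8_t texel(const Texture *t, int x, int y)
          {
              if (x < 0) x = 0;
              if (y < 0) y = 0;
              if (x >= t->w) x = t->w - 1;
              if (y >= t->h) y = t->h - 1;
              return t->pixels[y * t->w + x];
          }

          /* "Point sampled": just take the texel the coordinate lands in. */
          uint8_t sample_point(const Texture *t, float u, float v)
          {
              return texel(t, (int)(u * t->w), (int)(v * t->h));
          }

          /* "Bilinear filtered": weighted blend of the 2x2 neighborhood. */
          uint8_t sample_bilinear(const Texture *t, float u, float v)
          {
              float x = u * t->w - 0.5f, y = v * t->h - 0.5f;
              int   x0 = (int)floorf(x), y0 = (int)floorf(y);
              float fx = x - x0, fy = y - y0;
              float top = texel(t, x0, y0)     * (1 - fx) + texel(t, x0 + 1, y0)     * fx;
              float bot = texel(t, x0, y0 + 1) * (1 - fx) + texel(t, x0 + 1, y0 + 1) * fx;
              return (uint8_t)(top * (1 - fy) + bot * fy + 0.5f);
          }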

  • ben77992y

    Count me in. I was in college when this happened. I remember being a freshman when SGI came to campus for recruiting with all these amazing demos on 6-figure hardware. Then in my junior year I had scrounged enough money to get a 3Dfx card, and my friends and I were all blown away by it. I didn't end up going into graphics, but I remember we did a project for computer graphics and I was able to run my code on the 3Dfx.

    It was an incredibly exciting time.

    • corysama2y

      Same. I was fortunate enough to be able to use one of those hundred thousand dollar SGI’s as a sophomore because of my lab job. So, in my dorm I had a Pentium60 capable of running Quake at 320x240 software rendered. And, in the lab, I could play at 1280x1024(!!!) on the Reality Engine with the same point filtered look. But, then I managed to get a Voodoo1 and my little PC could run Quake at 640x480 bilinear filtered. It was nuts to get such a huge speed up for a fraction of the cost of the base machine. Peak “Pop Moore’s Law” going on at that time.

  • pcdoodle2y

    I think what really did it for me and my friends at that time was that we could have better graphics at home than what was available at most arcades. Need for Speed 3 was the killer app for us, playing over a DB9 serial link or directly calling each other's house (and arguing about who was going to call who, because the ringers were still on in the parents' bedroom and the pickup was after the first ring).

  • williamDafoe2y

    Had a PC custom rebuilt in 1998 for $1200 just to run a Voodoo2 and play Half-Life 1 and Tomb Raider 1. Also Unreal Tournament GOTY Edition. Happy days. Only needed a Celeron 400 MHz chip. But 3dfx did not get the best engineers from SGI and they were doomed to fail eventually. The best engineers founded Nvidia ... Got a Riva TNT2 in 2000 and forgot about 3dfx ...

  • darkwater2y

    Oh, this brings back so many memories from when I actually bought and updated PC hardware every time I had some money saved! I remember having a Matrox Mystique (the one with a... joker or something like that on the box), the 3Dfx Voodoo and also a TNT2. I remember moving from the Mystique to the Voodoo playing Quake, and it was one of the biggest "wow" moments of my childhood.

  • Seanambers2y

    3DFX was basically the shit in the mid-to-late 90s. The first time seeing Quake in OpenGL was insane - before that, all games were rendered in software.

    • pjmlp2y

      Nope: the blitter engine on the Amiga, the tile engines on consoles like Nintendo's and Sega's, the early GPUs in arcades like the TMS34010, Glide on 3Dfx.

      Quake used miniGL, not full OpenGL.

  • jahnu2y

    While I was at university in Derby, England, I got a sneak preview of a prototype of the first consumer 3dfx card. This was at a gfx card company called DataPath, where I was doing some contract work via another company. The day I saw it working I started saving so I could buy one the instant it was available. I had quite the advantage at Quake LAN parties for a short while :)

  • wolpoli2y

    >Interestingly, 20 years later, the 3dfx Rampage was tested and found lacking against the NVIDIA GeForce 256 which would have been its competition had the Rampage been released.

    Does anyone know where I could find out more about this? The prevailing narrative for many years has been that the Rampage GPU would have provided a technological edge for a few years.

    • randomifcpfan2y

      A quick web search finds: https://hothardware.com/news/3dfx-rampage-gpu-performance-re...

      But you could argue that the Rampage drivers weren’t optimized.

      • wolpoli2y

        Thanks! I would argue that the drivers weren't optimized because the specification on the Spectre 1000 (Rampage) is roughly at the GeForce 2 GTS level.

    • iforgotpassword2y

      I also found that part interesting. The last time I read about the Rampage was about a decade ago; I even saw a video showing the prototype, but just the card itself, because "nobody has any drivers for it". Apparently they must have been found somewhere by now. Some fun stuff to research for a Sunday, I guess. :-)

  • tiffanyh2y

    In 1998, Tom’s Hardware had a long article on “NVIDIA vs 3DFX - The Wind Of Change“

    https://www.tomshardware.com/reviews/nvidia,87.html

    It’s a great article that predicted a lot of things.

    Side note: love that a 25-year-old article is still accessible.

  • smcl2y

    > ATI’s Radeon and NVIDIA’s RIVA TNT2 were now offering higher performance at roughly the same price

    Minor nitpick - Voodoo 3 and TNT2 weren't competing with Radeon. Among that generation were the 3dfx Voodoo 3, NVidia TNT2, S3 Savage4 and Matrox G400 - ATI's offering around this time was the Rage Fury.

    The ATI Radeon was part of the next generation, along with the Geforce 256 and S3 Savage 2000.

    edit: or possibly even the one after that? Wikipedia tells me the Rage Fury MAXX was out around the same time as the GeForce 256, and the Radeon only showed up around the time of the GeForce 2 family. The MAXX slipped from my memory; it was, iirc, a pretty finicky card (performing badly and having buggy drivers, possibly hardware?) and a bit of a disappointment.

  • fho2y

    ... that article triggered me hard ... I just hate when the first paragraph is interesting and then they start with something completely different (i.e. the life of "the guy that did the interesting thing") in the second paragraph ... here ... they just did that four times ...

  • jaimex22y

    I just remember Glide being that annoying thing you had to wrap to get the original N64 emulator working.

    • iforgotpassword2y

      Because the original emulator required a 3dfx chip. The N64 was based on an SGI chip, which coincidentally was quite similar, so it was easy to translate to Glide calls.

  • bee_rider2y

    It is always funny to read the comments on this site for a computer history post, you’ll get

    1) I remember buying that card with my first paycheck

    2) Oh yeah, me and/or my buddy worked on that

    As someone who was a little kid for most of the 90's, all I can say is thanks to everyone who worked on the stuff: it was a truly magical time, when every time your dad brought you to the computer shops they'd have some new demo set up that could do things that simply were not even imaginable the last time. Plus, since I was not even a teenager yet, I didn't have to learn about the battles over which model of rendering was better; it was just pure magic.

    Hopefully we’ll speed things up again soon. I’m worried that there may be a generation at some point that doesn’t realize we’re in the future.

  • christkv2y

    I could not afford the Voodoo card, so I had to settle for a Verite-based card. In retrospect that chip architecture was way before its time, being a RISC architecture. Too bad it could not compete on raw performance, but at least Quake ran great on it.

    • djmips2y

      Yes, it was intriguing. It was a programmable GPU in a sense. The drivers downloaded microcode for the triangle shading.

  • russ2y

    I remember my dad’s friend joined 3dfx in ‘94 and we became testers for all their prototype cards. Being able to play Virtua Fighter on our PC still ranks as one of the most magical moments I’ve had with technology.

  • Genbox2y

    Back in the day I bought a Savage 3D graphics card (1998) over the Voodoo Banshee because the price-to-value ratio was much better. At that time, 3Dfx was known for its 3D performance, but was still clunky on the 2D part, especially compared with the Matrox G400 GPU, which had dual VGA, amazing color correction and software support everywhere.

    It went so fast, going from having no 3D acceleration to having more 3D acceleration than you could imagine. It died down fast too, when ATI and NVIDIA became the only ones left, which is still true to this day.

  • quelsolaar2y

    This story misses some of the details of how Quake came to support 3Dfx. Carmack had an Intergraph workstation with OpenGL at the time and had written a version of Quake for it: GLQuake. This version was shown to 3Dfx, and 3Dfx realized they could implement an OpenGL driver that only supported the features Carmack had used for his Intergraph port. So a "MiniGL" driver was born. This was a huge deal for 3Dfx, and lots of people got their start programming OpenGL using the "MiniGL" subset.
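
    To illustrate the idea (purely a conceptual sketch, not 3dfx's code, and heavily simplified): a miniGL is a small driver DLL that exports only the handful of OpenGL entry points the game actually calls and maps them onto Glide, something shaped like:

      /* Conceptual sketch only. The GL function names are real; the bodies are
         stand-ins, and a real miniGL handled transforms, texturing, blending
         state, etc. for the GLQuake subset. */
      #include <glide.h>

      static GrVertex pending[3];
      static int      pending_count;

      void glClear(unsigned int mask)            /* GLbitfield */
      {
          (void)mask;
          grBufferClear(0x00000000, 0, GR_WDEPTHVALUE_FARTHEST);
      }

      void glBegin(unsigned int mode)            /* GLenum; pretend it's always GL_TRIANGLES */
      {
          (void)mode;
          pending_count = 0;
      }

      void glVertex3f(float x, float y, float z)
      {
          (void)z;                               /* a real driver projects and keeps 1/w */
          pending[pending_count].x = x;
          pending[pending_count].y = y;
          if (++pending_count == 3) {
              grDrawTriangle(&pending[0], &pending[1], &pending[2]);
              pending_count = 0;
          }
      }

      void glEnd(void)
      {
          pending_count = 0;
      }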

  • johnny_canuck2y

    I love reading about 3dfx as their hardware had a great deal of impact on my preteen years (gaming night and day). I'm always amazed to read how small some of these 90s tech companies were.

  • pentagrama2y

    Oh, I remember installing my Diamond Monster 2 video card as a kid, with high expectations of this thing called 3dfx and the Voodoo chip built in.

    The first game that I launched was Need for Speed 2 SE, which supported 3dfx, and oh boy oh boy, the difference was night and day; it blew my mind at the time. The textures, shadows, and overall look and feel were the best thing that I had ever seen. It also had exclusive features for 3dfx, like bug splatters on the screen in some parts.

    A leap forward in gaming to my teenage eyes at the end of the 90s.

  • a3w2y

    3dfx Voodoo 3 - ah yes, passive cooling. I lost some skin touching an aluminium fin. An active cooler was needed, but it was easily screwed on.

  • OnlyMortal2y

    My first card was a Voodoo 1 - for the Mac. The difference it made to Quake was dramatic.

    When the Voodoo 2 came out, I couldn’t find a vendor selling them for the Mac. I’m not even sure anyone did.

    I bought a Voodoo 2 card that worked with a patched version of Mesa but it was slow.

    I managed to find a driver on some peer sharing thing (Hotline??). I’ve no idea of its origin but it worked fine.

  • papito2y

    Man, I remember I had a crappy Celeron because we were dirt-poor. At some point I finally bought a Voodoo 3, popped it in, and fired up Tomb Raider. I was blown away. The game went from pixelated chop-o-rama to smooth and gorgeous (in those days). I could even play combat simulators like European Air War. Good Times.

  • hatsuseno2y

    > All of this meant that the Voodoo Rush was dead on arrival.

    Interesting that this should be the first video card I bought myself. I do remember there being issues using specific renderer libraries, swapping out Glide DLLs and that sort of stuff to make certain games work. For a DOA card I sure was happy with it!

  • bentt2y

    I loved my 3DFX cards, but there was a moment where I realized that an Nvidia Riva TNT would properly accelerate OpenGL 3D applications (3ds Max in my case), and it was over. 3DFX didn't implement OpenGL properly at the time. It always felt like something was broken if you were outside of Glide.

  • rjmunro2y

    I find the way this article converts money for inflation really annoying.

    $70 million in 2000 is not "$121,613,821.14 in 2023", it's "about $120 million in 2023".

    The number is changing at about 7c/second. Those 14 cents add no value whatsoever.

  • fabiensanglard2y

    The one game that always puzzled me was Diablo 2. I never saw it run better than on a Voodoo 2. It was butter smooth. I compared it to a TNT2 Ultra and maybe an ATI Rage 128, and none of these came even close.

    To this day, this remains a mystery.

  • Maxburn2y

    My love for the Voodoo 2 knows no bounds, it was the start of a beautiful thing for me.

  • ErneX2y

    I had one of the first Voodoo cards; it was a Diamond. When you ran a game that supported it, you could hear the mechanical relay switching from the regular video card to the Voodoo.

  • 2y
    [deleted]
  • drKarl2y

    I got the 3dfx Voodoo, which required a separate 2D card! Later on I got an Nvidia Riva TNT! It's been a long way; now I have an RTX 4090...

  • EricE2y

    I still have my Voodoo 2 - I just moved and stumbled on it recently while unpacking. What an era.

  • secalex2y

    I can still hear the clunk of the relay switching over to 3D (versus 2D pass-thru) when booting Wing Commander.

  • app4soft2y

    > Voodoo2, Diamond Monster 3D, image from [Russian site]

    Why the heck is this image claimed to be taken from [Russian site] when it is actually from Buyee? [0]

    [0] https://buyee.jp/mercari/item/m78940760663

  • ReptileMan2y

    And Glide is the only 3D accelerator API that made Diablo 2 look good. To this day.

  • stan_kirdey2y

    amazing read

  • 29athrowaway2y

    And then NVIDIA ate everything.

  • nix232y

    Playing Wizardry 8 right now (on Wine with the Glide patches).

  • kleer0012y

    [flagged]

    • saghm2y

      They work for me (Firefox on Arch Linux). Maybe they're using JavaScript for it somehow, and you have it disabled?

    • jdboyd2y

      On Chrome 110 on Ubuntu 22.04, both arrow keys and page up/down work for me.

  • mikezirak2y

    [flagged]

  • carabiner2y

    typo: loose/lose