In 1988, when I got a job working in MS-DOS, the programmers were using WordStar to edit source code, though one guy used QuickEdit and I used MicroEmacs. We all standardized on Brief a couple of years later.
My grandfather had Alzheimer's. He didn't recognize anyone, but if you put a harmonica in his hands he could still play the polkas he learned when he was young. I can't say whether the music had any other effects, but it certainly wouldn't surprise me.
There is some capacity in the primitive brain that allows some birds to mimic what amounts to a simplified audio file, storing and replaying it like data without any real understanding of the data or how they do it.
Others seem to inherit the song of their species genetically, whether or not they ever learn it from birds of a feather.
It can remain in storage for who knows how long; it's held in brain cells of some kind, and memory is what they do.
Some of them sing pretty well and, as nature intended, can be heard from quite far away, so others of their kind have been recognizing them by their song since the most prehistoric of times. Familiarity acts like it's built in from the get-go, too. How long have living things been recognizing each other by their song?
Like a digital file, it's the same data regardless of the playback tempo.
What better place would you want an abstract time-series data file to be stored in higher life forms? It would be good to have a part of the brain inherently prepared for flawless memorization, from long before anybody could possibly figure out how it's done.
Plus, there may be other parts of the more advanced brain that are bigger and more capable when suitably engaged, even if they can't match that effortless, flawless memorization and recognition of things as abstract as frequencies, timbres, and their patterns across time.
Put it all together and that could be a virtuoso or something.
One of my late partner's grandmothers ended up with Alzheimer's, and when it progressed to a certain point it became difficult to communicate with her because there was so much hesitation and being at a loss for words. She had been born a Louisiana Cajun and graduated from the first school in her town in the early 20th century. Same textbooks, radio programs, and TV shows (when they came along) as any other grandmother of that generation in other states, so she was a regular native US English speaker and nobody ever thought she had a real Cajun accent at all. Nothing like her parents or grandparents, but she was born into Cajun French, which was still spoken exclusively in her area during her early years.
Well, she left the backwoods behind when she married a Texan and moved there, and hardly ever spoke French again. She sounded like she was from Texas when she still had all her faculties, and that's like 60 years later. Then, years after that, there was serious decline, but you could play her a Cajun song and she could sing or speak perfect French and tell you what it was like getting ready to go to a high school dance back on the bayou. In detail.
I wonder if the idea was bad, or just that the PCI+VLB boards tended to be the last dregs of the 486 market, where the selling point was "the system integrator can clear out his cabinets of rapidly obsolescing VLB cards" or "the person doing a weirdly budget-constrained upgrade can avoid replacing the VLB video card". They weren't competing for the premium market and probably cut other corners; this was, after all, the era of fake cache.
The PCI chipsets of that time were really, really buggy; that's why.
They got better as time went on but it really took a number of years before people could get reasonably high-performance, reliable PCI implementations. For x86, aside from AMD's Irongate (750/760) chipsets (K7-era) and nVidia (nForce), pretty much only Intel had PCI working reasonably. ALI, VIA and SiS PCI implementations always had weird issues and quirks.
PCI IDE adapters are another good example: VIA's PCI IDE had all sorts of issues, and if you wanted high-performance PCI IDE, it worked best with Intel.
It's not that different with modern PCIe SATA either (history repeats): Marvell PCIe/SATA adapters have lingering oddities.
No, Irongate was very much internally developed by AMD. AMD was well aware of the "sketchy" nature of ALI/VIA/ULI/SiS and knew it was giving them a really bad rep, so they undertook their own designs.
AMD's chipsets were not as successful in the marketplace because, unsurprisingly, they cost more than the Taiwanese ones, and motherboard vendors (who are almost all based in Taiwan) stuck with their existing suppliers. The dual-socket (K7-based) AMD machines pretty much all had the 760MPX on them because, I think, only AMD had a multi-socket chipset that was reliable.
AMD also inherited a lot of DEC engineers, so it was no surprise that the K8-era 'HyperTransport' was really 'Lightning Transport', developed at DEC.
Interestingly, Micron had also developed a chipset for AMD in this era but never released it.
nForce was also not re-licensed from ALI/ULI, as far as I recall, but I have far less insight here. nVidia at the time had a 'total system play' in mind, so they were attempting to do GPU/audio/network/chipsets for AMD & Intel and got designed into the original Xbox (x86-based). Jensen, I think, used NRE money from MSFT to fund a lot of the nVidia chipset work.
> AMD was well aware of the "sketchy" nature of ALI/VIA/ULI/SiS
From my limited experience, in practice the AMD-751 was more problematic than the KT133/133A. Those were still the times when hardware reviews had sections dedicated to "Stability". For example: https://www.anandtech.com/show/718/5
"Even when running the DDR SDRAM at CAS 2 settings, the system did not crash once within 24 hours of our stress tests. We continued to run the stability tests and finally the first crash occurred after 34 hours of operation. Considering that this is FIC's first try at a DDR board we were very impressed with the stability of the AD11."
Personally, I don't remember many vendors sticking with AMD chipsets after the switch to Socket A; at least in Europe it was all VIA with some ALI/SiS. As for price, it's all on AMD; nobody forced them to manufacture at Dresden. If the chipset was so strategic, AMD should have sold it with minimal or no margin. The way I see it, AMD was just seeding the market, making sure to avoid a chicken-and-egg problem.
I wonder if there was also some business sabotage in effect.
It would have been extremely tempting for Intel to pressure motherboard manufacturers with some subtle messages like "Nice AMD750 board, too bad we don't have any more 440BX chips for you."
The Athlon debuted with only like three compatible mainboards, and two of them were minimally rebadged versions of the AMD reference design.
Of course, it was still too compelling a platform to ignore, and everyone got on board soon enough. But if R&D had stayed away from the Athlon market for an extra six months or a year, that could have manifested as worse board design and optimization for quite a while.
ULi got subsumed by nVidia shortly after PCI-Express became the norm.
They made a chipset that offered a fairly compatible AGP-like slot alongside PCIe, and one with two full x16 slots at a time when that otherwise required a very expensive nForce board. So of course, nVidia immediately blocked SLI support on it.
I installed ESIX (Everex's SysV Unix) off of 40-something 3.5" floppies in the mid-90s. Naturally there were a couple of bad disks, and the installation sat partially done for a few days while I waited for replacements to arrive in the mail.