
The trend of "authoritarian minimalism" design that seems to be working its way through the majority of newer software is very strange. I wonder who actually wants this stuff, and who is not content to merely make it a default, but instead forces it to be the one and only way.

Monitors are bigger than ever with huge resolutions, and yet UIs are being dumbed down to uselessness, alienating an increasing number of users.

A recent related article https://news.ycombinator.com/item?id=29954266 seems to indicate that not even people inside Microsoft --- who are being forced to use Win11, because MS --- have any say in the matter. It's almost like some tiny extremist faction has gained control of Windows and is determined to show everyone else how much power they have by making these changes and gloating sadistically at seeing everyone object, but still end up using Windows.

I wonder how much damage they will inflict before people start turning to WINE and a saner Linux distro, just to run their Windows applications.



> It's almost like some tiny extremist faction has gained control of Windows

This has been the case for a while. I worked on the Windows Desktop Experience Team from Win7 through Win10. Starting around Win8, the designers had full control, and, most crucially, essentially none of the designers use Windows.

I spent far too many years of my career sitting in conference rooms explaining to the newest designer (because they seem to rotate every 6-18 months) with a shiny Macbook why various ideas had been tried and failed in usability studies because our users want X, Y, and Z.

Sometimes, the "well, if you really want this it will take N dev-years" approach staved things off for a while, but just as often we were explicitly overruled. I fought passionately against things like the all-white title bars that made it impossible to tell active and inactive windows apart (was that Win10 or Win8? Either way, user feedback was so strong that it got reverted in the very next update), or the Edge title bar having no empty space on top, so that if your window hung off the right side and you opened too many tabs you could not move it, and so on. Others on my team fought battles against removing the Start button in Win8, or trying to get section labels added to the Win8 Start Screen so it was obvious you could scroll between them. In the end, the designers get what they want, the engineers who say "yes we can do that" get promoted, and those of us who argued most strongly for the users burnt out, retired, or left the team.

I probably still know a number of people on that team; I consider them friends and smart people. But after trying out Win11 in a VM, I really have an urge to sit down with some of them and ask what the heck happened. For now, this is the first consumer Windows release since ME that I haven't switched to right at release, and until they give me back my side taskbar, I'm not switching.


UX at MS is this weird consulting-style org: maximum visibility with minimal accountability. I think the Office 2007 Ribbon was the last new thing they did that I liked; it just needed a search function to quickly refine the buttons. That wasn't allowed externally, because needing a search function was an indication that the UX designer had failed. I know devs tend to hate the Ribbon, but I thought it was well thought out.

UX was constantly trying to remove the start menu, I remember hearing that it was Bill Gates who was adamant that it stay in. I had a good laugh at Win8 when they finally got their chance to remove it and totally messed it up.

Now the Win11 UI is so slow that it feels like every click must be hitting telemetry first. When I right-click, I see the normal context menu briefly before the simplified context menu. Sometimes it's slow enough that buttons move right before I click on them.

I now have a stutter in games that makes them unplayable. Having to boot into Linux for games is not something I expected to happen so soon. I still use Windows for legacy reasons, but few things would make me happier than deleting my Windows partition for good.


> Now the Win11 UI is so slow that it feels like every click must be hitting telemetry first.

This reminds me of an experience with Windows 10 a few years ago.

I tried to start cmd.exe and nothing happened. Tried a bunch of times more. Still nothing...

I was connected to a WiFi network that allowed ping to the entire Internet but not TCP connections. Then I switched to a WiFi network that allowed TCP connections to the Internet and 20+ cmd.exe instances started all at once.

I thought: "Wait, that can't be true", but it was reproducible.

This makes me wonder if launching programs takes longer on a slow internet connection.

I do not know if this still happens as I have only used Windows 10 on a handful of occasions since then.


That's bonkers. Why wouldn't you write telemetry to an on-machine log/queue that can trickle out as needed and not impact the performance of the system?


Maybe because they put their needs before yours?


I was recently on a boat with a bad connection. Searching in Windows (11) doesn't seem to work well in that case. Even when searching for local files, the response just sits there loading, since it appears to wait on some network request.


> That wasn’t allowed externally because needing a search function was an indication that the UX designer failed.

That's a fascinating dynamic. We can't make our product better, because that would suggest that it isn't already good enough.


I hated the Ribbon. Nothing was categorized where I expected it, and none of the icons or sizes matched their use or frequency of use.

Drop down menus are so much quicker to scan


> Drop down menus are so much quicker to scan

Another commenter mentioned OpenOffice (http://www.openoffice.org/) but I'd also like to suggest LibreOffice (https://www.libreoffice.org/) as a capable and reasonably compatible alternative to MS Office, which retains the old look of dropdown menus.

If you don't need to support very particular features of Office, then it might just let you retain your preferred way of navigating the UI of an office suite app. On my personal computers I have just LibreOffice (with which I finished my bachelor's and master's theses), whereas on my work computer I have both LibreOffice and Word, for those few exceptions when the latter is necessary.

That said, I can understand why many would also find the ribbon UI easy to navigate, once they get used to it. That's why some Office clones, like WPS Office (https://www.wps.com/office/linux/) and FreeOffice (https://www.freeoffice.com/en/), seem to copy it with varying degrees of success. Personally, I also like the open source nature of LibreOffice, and the file format itself not being proprietary.


If I understand correctly, LibreOffice is a fork/successor of OpenOffice. Here's a comparison on the LibreOffice website: https://www.libreoffice.org/discover/libreoffice-vs-openoffi.... It's a bit disingenuous though, since it makes it seem OpenOffice hasn't had a release since 2014, even though the last release was October 2021.


I found it quite accurate, not disingenuous at all. OpenOffice has been a maintenance-only project since 2014; they do release minor versions once in a while (like 4.1.11 in October 2021) with security fixes, some small bug fixes, plus whatever free updates they get from upstream libraries (e.g. updates to Unicode), but they don't have any plans to develop the software further: no new features. OpenOffice is an extraordinarily well maintained "legacy" software, and on that note, it is really cool that Apache stepped up to maintain it.


LibreOffice also offers an optional ribbon-style interface.


Oh yeah, totally forgot about it!

In case anyone would like to try it out, it's under:

  View > User Interface > (pick one of the Tabbed options, there are quite a few)
Now there's one more argument for open software - the developers caring enough to give us that many options to support the preferences of many people without needlessly deprecating anything!


Ranting about Ribbons :)

Still, I wonder how much it affects my productivity in Office, since I still sometimes have to search for functionality. In any case, it made me hate Office and switch to OpenOffice wherever I can. I also credit it with kicking off my quest for a plaintext workflow, after being f*ked over by feature bloat and designer circle-jerking one too many times.

Simplicity is king. Maybe it's just my age that makes me realize there are just too many layers of complexity.

I feel for the young learners who get thrown into this world and don't even understand the concept of a file system tree anymore. It has been abstracted away. It's not in the cloud! It's in the app, reachable within 3 clicks, or it doesn't exist.


I can't figure out how to reach my files in Windows 11.

There are around 6 Documents folders/libraries/whatever in File Explorer, pointing to a combination of my work OneDrive, my personal OneDrive and my local user folder. There's one weird Documents that goes to the Documents in both OneDrives. There's a My Documents that gives an error. I have a work SharePoint folder that was also named Documents. I tried renaming them, which didn't work, and it has just left me even more confused about how to find things. If I go to the command line and do "dir C:\Users\<me>\Documents", which of these do I find? And how do I get to the others? And which ones get backed up where?

It's a complete mess. I sympathise now with the people who keep every file on the desktop.

Every time I go back to Linux, I feel a wave of relief: I see my files in a single dir tree under /home, searching works predictably.

How did Microsoft get to this?


I installed Everything, disabled Windows search/indexing and I always start working with files from there. Windows Explorer is simply not worth the hassle.


I use the portable (no install) version of Everything. http://www.voidtools.com/


Don't let MS tell you where to save your files. C:\docs (or similar) will always work (until nothing is stored locally).

Avoid the "My Computer" paradigm.


All of our development machines have multiple drives mapped to specific functionality:

    C: System
    D: Data (business data and main work area for product design data)
    F: Library (stuff you rarely touch, PDF's, books, references, "knowledgebase", etc.)
    G: Backup (external)
    S: Swap and Scratch Files (this is actually a RAM-based 128 GB drive for speed)
    Z: Development (mostly for web development, VM storage, etc.)
The idea is that the C drive can be taken out and shredded and the most valuable part of the computer, the data, is unharmed. Decades ago I learned --the hard way-- that storing your data on the same drive as the OS/system files is a dangerous thing.

The other thing this facilitates is backup. You can backup and restore each of the logical/functional units separately, as full drives.

This also makes upgrading the system or the entire computer far simpler. The separation between OS and data makes it so.

Way back when, before the registry was a thing, you could upgrade your OS without having to reinstall the applications. While I understand the advantages of common DLL's and the centralized management of common settings and code, I do miss the ability to not only separate data from the OS, but also applications. I don't think that is ever coming back.


> S: Swap and Scratch Files (this is actually a RAM-based 128 GB drive for speed)

Isn't it better to just have the RAM as RAM than to have the same RAM used to provide swap space on a RAM disk?


That's what made me smile too :) Swap trying to swap RAM to RAM, then suddenly: swap recursion :)


Sure, yet, it depends on what you are doing.

One of the primary motivations for using a RAM disk was to not beat-up SSD's with the kind of access swap space gets.

The advantage of a hardware RAM-based drive is that it is extremely fast. When these machines were built, this was the fastest way to get data on and off a swap drive.

Today SSD's are super fast. On some machines we now have a separate dedicated 250 GB SSD for swap. If it craps out, you throw it away and pop in a new one. No worries about commingling valuable data with swap space.


This^.

I still have my muscle memory tuned to creating and using only:

  C:\Games
  C:\Music
  C:\Movies
  C:\Pictures
  C:\Torrents
  C:\Workspace
I hate the default

  C:\Users\ChuckNorris89\Pictures\ or whatever.
Much prefer the linux ~/Pictures instead.

Or maybe this just shows how old I am.


Well, ~ is basically just a shortcut to /users/ChuckNorris89. It's kind of surprising MS hasn't mapped ~ to the Windows home folder. In R on Windows you can use ~ to get there, at least, which is nice.


FWIW ~ works in PowerShell too.


  %CSIDL_MYPICTURES%
  %USERPROFILE%\Pictures
https://docs.microsoft.com/en-us/windows/deployment/usmt/usm...

You can also add your own environment variables.

  setx pix "%USERPROFILE%\Pictures"


It works great, but you won't escape the myriad of system folders in Documents or Pictures that way. Apps, games and the OS itself flooding these folders is the main reason file management on Windows sucks.


That even happens on Linux. Apps dump a thousand little files in ~.

Distros and package managers putting binaries in completely different spots. Is it /bin, /usr/bin, /usr/sbin? Who knows!


> /bin, /usr/bin, /usr/sbin? Who knows!

If anyone is curious:

Historically /bin and /sbin contained the binaries that were necessary to bring up the system (especially to mount the /usr partition, which was "best practice" to have separately from the root and /boot partitions). Nowadays most distros just symlink them to /usr/(s)bin

/usr/sbin is for utilities that only root should use, whereas /usr/bin is for regular applications managed by your system (i.e. your distro's default package manager).
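You can see the merged-/usr arrangement directly on most current distros; a quick sketch (output varies by distro, and on non-merged systems these will be real directories):

```shell
#!/bin/sh
# Show whether the classic binary directories are real directories
# or merged-/usr symlinks (e.g. /bin -> usr/bin on many modern distros).
for d in /bin /sbin /usr/sbin; do
    if [ -L "$d" ]; then
        echo "$d -> $(readlink "$d")"
    else
        echo "$d is a real directory"
    fi
done
```

On a merged-usr install all three typically point into /usr, which is why "which directory is it in?" matters less than it used to.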


I prefer the short paths myself. I create something similar and add it to the quick access in explorer.


>Avoid the "My Computer" paradigm.

The "My Computer" paradigm is long gone. They call it "This PC" or just "Computer" now. There is no "My".


I believe that started around Vista/7 and is a very telling sign of whose computer they really think it is now.


I understand this is (probably) a tongue-in-cheek comment, but the "My" prefix was infantile and terrible for scanning the alphabetically ordered list of folders ("My Documents", "My Music", "My Whatever"), so dropping this nonsense feels like a step in the right direction.

It still baffles me though how the Windows designers assumed it would be a great idea to bless me with a predefined "3D Objects" folder (right under "This PC"), which comes at the top of the list due to alphabetical ordering, and can't be removed. I wonder what percentage of users actually need this.


IIRC they introduced that one around 2012, when 3D printing was taking off and was thought to be the next big thing


They started off by being very bad at search, then compounded the issue by creating the Library concept that transparently blends multiple nodes of the tree structure together. Then they made it needlessly slow.


I finally caved in to the pressure and started using OneDrive instead of local storage, but it STILL seems to take umpteen clicks to load/save from Office. Is there any way to navigate the OneDrive folder hierarchy without clicking through to the old file dialog? If there is, I haven't found it.


> It's a complete mess. I sympathise now with the people who keep every file on the desktop.

The UI for navigating files in Windows is indeed a mess. A personal gripe of mine, even worse than some of what you named, is the file open/save dialog: there's not just one (which would let you write the file path in the bar and thus allow you to copy paths from Explorer into it), but many different ones for different programs! This makes navigating around the filesystem with some software, like GIMP, needlessly annoying!

Add on small annoyances like Windows search trying, by default, to look into the contents of many files, making search needlessly slow, and things like not supporting mounting remote directories over SFTP (which should get at least a bit of attention) instead of always having to use SMB or whatever, and you have a somewhat problematic daily experience.

Of course, there are also things it does better than some other systems/configurations, like the whole recycle bin concept, with Ctrl+Z allowing you to undo file deletions, moves or even renames, which makes dealing with user error easier! I just checked on Linux Mint: the recycle bin works as you'd expect, but there is no Ctrl+Z to restore the last deleted file automatically.

> Every time I go back to Linux, I feel a wave of relief: I see my files in a single dir tree under /home, searching works predictably.

About this, I'm also somewhat torn. The filesystem structure on most Linux distros also just feels weird, although it's that way for historical compatibility reasons (not quite as bad as Windows having two oddly named Program Files directories, but still). I doubt we couldn't structure things in a more reasonable manner than having all of the following just for executables, for example:

  /bin/
  /sbin/
  /usr/bin/
  /usr/local/bin/
  /usr/local/sbin/
In case anyone wants to learn more about these, here's the Filesystem Hierarchy Standard document, which is actually pretty well written: https://www.pathname.com/fhs/

Throw in /opt and /usr/share and whatnot, and it's a recipe for different bits of software using different configurations because of differing opinions about what suits their use cases. The same mess extends to what's in the /home directory: sure, in most cases you can back it up and restore it, but once again every piece of software has its own opinions about how to structure its data, be it a visible/hidden config file, a visible/hidden folder for the application itself, or something else.
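For what it's worth, the XDG Base Directory spec is an attempt to tame the /home side of this: applications are supposed to look up a handful of well-known, overridable locations instead of inventing their own dotfiles. A minimal sketch of how a program resolves them (many older apps simply ignore the spec and write to ~ directly):

```shell
#!/bin/sh
# XDG Base Directory lookups: use the environment variable if set,
# otherwise fall back to the spec's default under $HOME.
config_dir="${XDG_CONFIG_HOME:-$HOME/.config}"     # user-specific config
data_dir="${XDG_DATA_HOME:-$HOME/.local/share}"    # user-specific data
cache_dir="${XDG_CACHE_HOME:-$HOME/.cache}"        # non-essential cache
echo "config: $config_dir"
echo "data:   $data_dir"
echo "cache:  $cache_dir"
```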

Of course, Windows also has a similar mess going on with Program Data and AppData folders, as well as Documents and whatnot, so Linux isn't the only problematic system here.

In short, I think they all have their advantages and shortcomings. Here's hoping things improve in the coming decade!


> not quite as bad as Windows having two oddly named Program Files directories, but still

At least three, if you include "Roaming\Appdata" (or is it "Appdata\Roaming"?), which is also a program files directory nowadays.

As for Linux, yeah... Either stick with the Filesystem Hierarchy Standard, or -- if it's time to revolutionise the directory structure -- maybe something like what GoboLinux (et al?) are doing.


On my Guix systems, I've gone full obstinate idiot:

- / is on tmpfs

- the OS is mounted on /gnu

- system data is mounted on /var (and /etc is populated from /var/etc)

- user data is mounted on /usr

This way, I can avoid the plethora of separate tmpfs filesystems on most distro's (at least /dev, /run, /tmp, /dev/shm).

But I'm not running any desktop systems on Guix right now, just service containers. I would expect there to be plenty of Linux desktop software that can't handle /usr being used for user data.


I went the Everything way and gave up trying to remember what is where. Not a perfect solution but a huge productivity bump for me: https://en.wikipedia.org/wiki/Everything_(software)


> The UI in regards to navigating files in Windows is indeed a mess. A personal gripe that i have with it, that's even worse than some of what you named is the file open/save dialog - there's not just one (that would let you write the file path in the bar and thus allow you to copy paths from Explorer to it),

Note that even in the older-style file dialogs, that don't have that new (relatively speaking) breadcrumbs-style file path display at the top (like in Windows Explorer), you can still paste a path into the file name input box and it'll navigate to that folder. Relative paths also work. The only drawback is that the file name will then be reset to whatever it was when the dialog was initially opened, but other than that it works nicely.

> but many different ones for different programs! This essentially makes it so that using some software like GIMP and navigating around the filesystem with it is needlessly annoying!

The problem is a) backwards compatibility: programs can customise the file open/save dialogs to quite some extent, so they need to explicitly opt into using the new dialogs. Plus, software that wanted to support older Windows versions (pre-Vista, so these days it's probably not that relevant any more, but during the Vista/7 era it definitely mattered) then needs code to handle both kinds of file dialogs. And b) as far as I can tell, some cross-platform frameworks use completely self-written file dialogs, which 1) usually mimic the older (Windows XP and earlier) style of Windows' dialogs and 2) usually don't manage to copy all the features of Windows' native file dialogs, and instead often get some things subtly (or less subtly) wrong.

My personal gripe with the breadcrumbs file path navigation is that the breadcrumbs dropdowns don't support keyboard navigation like the main explorer window (or the file list in the file open/save dialogs) does, so when you have a folder with lots of subfolders and want to use the breadcrumbs to change the path, you then need to do lots of scrolling instead of simply being able to press a key.


Agreed that Linux directory structure is kind of a mess. The extent to which it's a mess varies from distribution to distribution, though. For example, on Arch, /bin, /sbin, and /usr/sbin are all just symlinks to /usr/bin.


I'd certainly try LibreOffice, which is much better supported and more modern than the sort-of-abandoned OpenOffice.


>I feel for the young learners that get thrown into this world and dont even understand the concept of a file system tree anymore.

Linking to my previous rant on this: https://news.ycombinator.com/item?id=28994133


I miss the ribbon every time I use LO.

I do think the categorization takes a little getting used to but once you do it’s fairly trivial.

It’s also important to recognize the horizontal categorizations within each ribbon, something that can easily be missed until you know about it.


I agree that the grouping doesn't make much sense at all in most programs. It would have been nice to at least be able to reorder things like in classical UIs, which I also find superior from a usability perspective.


But drop down menus were still there.

I thought the ribbon was great, but obviously you had to get used to it.


Windows 11 also got rid of the ribbon in Windows File Explorer.


Jensen Harris' blog was deleted when Microsoft replaced their dev blogs platform, but it went into great detail on the creation of the Ribbon. As far as I can remember, search couldn't be ready in time for the initial release.


That rings true. From memory there was a plug-in that added it later, and I asked why we didn’t release it and was told that if users were given the option it would subsume more and more functionality until it became unwieldy and difficult to use. Also since it would be cheap and flexible there would be a push to use search even when not ideal.


You liked ribbons?! As a Mac user ribbons made me really happy I was not on Windows although they did bring that poison over to the Mac with their office apps.

Occasionally I have to use windows and the abandonment of regular drop down menus and introduction of ribbons is really disorienting in my view. MS changes too much core UI. I like that Apple is more conservative and keep stuff that works. Keeping the menu bar but adding a search bar to it was a much better solution in my opinion. It allows us to use a well established UI paradigm while also making it easier to find well hidden entries.

Hunting through stuff in the ribbon is not easy.


The Ribbon was an excellent design pattern for new users. The problem is that it disrupted many power users, who reacted negatively.

I think software that changes nothing is more likely to keep its faithful users, but changes can be good if done in a utilitarian way that considers different types of users.


Power users are using the keyboard and probably hide the ribbons.


That's why ribbon search would have helped a lot. Plus, if it were fast enough, people could use a shortcut to get to search and then quickly type a few characters to reach a function, e.g. Meta+S then "pa" for Paste: much faster than finding a tiny button, and easier to remember than a combination of keys. Like the Jump feature in Winamp. If you forget the small subset of characters that gives you what you want, you can use the longer names. The results list could contain the key combos, if you wanted to use those next time instead. While searching, the Ribbon UI could show you which tab and button the currently selected search result refers to. It was really needed to bring it all together. And for power users, why not have customizable tool strips....


No UX designer worth their salt would ever say search wasn’t an essential part of most interfaces. It’s 101. Even I know that. Sounds like you just had bad UX designers, which fits the observed data.


I've certainly met more bad ones than good ones. A lot of them consider themselves artists and with a focus on appearance. Search is a box you put text into, not much room for expression. Search can be hard to get right and they tend to lose control of it to the devs who can easily screw it up and sink their UX. Even Apple does a terrible job with search.


> Now the Win11 UI is so slow that it feels like every click must be hitting telemetry first. When I right click I see the normal context menu briefly before the simplified context menu. Sometimes it’s so slow enough that buttons will move right before I click on them.

How is this an indication of UI/UX failure (where UX determines how things should work, but doesn't implement it)? It sounds like this one is purely on the development org, for writing something that has poor performance and/or not prioritising improving it.


A slow UI is bad UX. UX should have pushed back on releasing a slow UI until the developers fixed it.


Maybe they did?


UX tends to have more power at MS than devs for exactly this reason so it is unlikely. Additionally it can still be bad UX even if it's not the fault of the UX designer.


Microsoft shouldn't even have MacBooks available unless specifically for the minority developing for Apple platforms. It's just a concession of defeat otherwise.


Chasing after the Apple aesthetic doesn't work because they're not Apple. Design works holistically and you can't extricate it and expect success.

To make this obvious, let me use an analogy. If a vegan restaurant found a very successful steakhouse and decided to copy their decor of say game animal heads and rifles on the wall, it would probably decrease their sales regardless of how faithfully they match the aesthetics.

The context matters. Also the fact that both imaginary businesses are in the same industry of serving food isn't enough of a similarity. It's quite specific.

And even if the contexts match, the pursuit just makes them seen as a knockoff; a laggard, an imitation.

The goal isn't to meet the competition, it's either to beat it or be somewhere else.

Intentionally making yourself a 2nd-rate Apple isn't a positioning strategy; they could simply introduce a discount line and knock you out in a single blow. It's phenomenally stupid.


+1, context matters a lot. In my experience, the Dock alone is pretty terrible for window management. I use it mostly as a launcher and for easy access to files/directories. Its design works in macOS because it isn't the primary utility for managing windows there.

The center of window management in macOS, I'd say, is easy access to Mission Control (previously Exposé). All Macs come with either a huge multi-touch trackpad or a Magic Mouse with gestures to trigger it. I sometimes use my MacBook with a "normal" mouse; I've mapped all its extra buttons to trigger Mission Control commands, otherwise I wouldn't be able to navigate my way around effectively.

Moreover, full-screen windows are automatically made their own desktop, so one switches between these windows using desktop switching. I actually am just now remembering that the Dock supports window switching with a ctrl-click: I completely forgot since I only use it for rudimentary app switching and for windows it's all Mission Control and desktops.

This is context at the OS and hardware levels, but of course there's also the user base and what they're used to after years of using their computers.


Exactly, I'm also on mac and I don't really like some individual design choices, but when taken together, the whole is greater than the sum of the parts.

Copying some individual elements is just really dumb. For example, GNOME 3 copied the Mac menu bar, except menus don't actually go there (windows still have their own menu bars), negating the very point of having a unified menu bar.

The result is that my desktop now has an ugly-ass, useless, wasteful black bar on top. Why?

I have to agree with the OP, it's the triumph of form over function.


I and many others find the Apple UX experience to be awful. If I wanted to suffer through it I would just buy a MacBook. I purposely didn't get another MacBook when my last one died.


+1 We want an improved W2K... Linux is probably the best way to get it (with many tweaks... start by not using Gnome).


Win2K was great. When I used Win7, I used the Windows Classic theme which made it look like Win2K.

I use Windows Blinds on my Win10 system to make it look like Win98, which is close enough to 2K.

And for extra lulz, my screensaver is the Windows 95 loading screen.


At least GNOME designers apparently use GNOME, and they still let me configure my desktop the way I want it. I undid some man-years of their work and made use of some others.


Uh, I think GNOME3 suffers from the same authoritarian minimalism design problem.


Indeed, but at least they dogfood their work and they like the result (can't understand why).


Some people like Gnome. I like using gnome on my Lenovo Yoga as it is as efficient being keyboard driven as it is using the desktop as a tablet.


I usually prefer tiling WMs but among the more modern Linux desktops Gnome is my favorite. It does all I need in a polished way. At least since the performance improvements, initially Gnome 3 was just too sluggish.


I wouldn't conflate the state of Windows 11 with macOS at all, really. Just because you dislike macOS and Windows 11 is kind of bad doesn't mean Windows 11 is like macOS.

macOS isn’t perfect but Apple hasn’t mangled core features like the task bar and start menu for something that is superficially prettier but lacks a bunch of actually useful features that used to exist.


Awful compared to what, if I may ask?


All versions of Windows it has competed against except Windows 8 and Windows 8.1.


to any sane desktop environment so basically windows (except the 11 abomination) and KDE


it's simple, apple users want ugly and confusing UI while windows users want pretty and practical UI, bringing the apple one to windows will never work


For devs I think it matters less. But, yeah, for designers on the _Windows Desktop Experience Team_ to not be using their own product for day to day work is absurd to me. It sounds like a joke: "The people who designed this shitty feature don't even use it themselves", but alas, it appears to be the truth.


Why would you even hire designers that don't use Windows as their daily OS? It's not even a question of making designers use your software, but you could just hire people who do. Globally you probably have to go out of your way to hire a designer that wasn't Windows based.


I'm not sure this is true. Anecdotally, across three Fortune 500 tech companies I've worked at, designers / web devs / program managers had Macbooks (or had the option of choosing Macbooks) even when the team was developing primarily in Windows.


I think having some people daily-driving other systems may be beneficial, providing a fresh perspective.

But if the majority of your lead designers have never in their lives used the taskbar, they'll be surprised when replacing it with a dock causes an outcry.


> Globally you probably have to go out of your way to hire a designer that wasn't Windows based.

That likely depends heavily on your industry. Designers doing mobile application design, as well as web design, tend in my limited experience to gravitate towards Macs. Even within my college, the entire graphic design department was on Macs. Of course, if you are working on a Windows-based product you'd in theory expect the designers to use or be familiar enough with Windows, but you may be surprised.

I'd be curious to know if there are regional (country) differences in this.


The key word is ‘globally’. MacBooks are expensive, and because they are hardware they cannot be pirated. I’ve worked with Eastern European designers who were excellent but could not afford Apple hardware. You only have to look outside North America and Western Europe.

It’s tragic that because of elitism Microsoft does not hire those designers and instead hires designers who don’t care about their product.


Which country are you referring to as Eastern Europe? I live in Eastern Europe, and every designer and most of the devs I know are using Macs. I am not originally from here, and my home country is currently economically worse off than most Eastern European countries. Even there, the majority of designers use Macs.


You could say you don't want the designer who is supposed to design Win8 to be comfortably working with Win7. Instead you want them to be more inspired by different OSes.


If people wanted a fundamentally different UI than Windows has then they wouldn't use Windows in the first place. When we went from XP to Windows 7 that was a huge improvement. It was fundamentally still the same UI with the exact same workflows, just nicer in every way.

When you have over a billion users worldwide there's an expectation that you don't treat your product like a startup's art project. I haven't met a single person who wasn't completely confused by Windows 8's UI or who liked the changes. I've seen lifelong Windows users spend 10+ minutes desperately looking for the shutdown button.

There's "inspiration" by different products but in the end you need to know your own product in order to improve it.


Really, why though? Is there something inherently wrong with satisfying user expectations?

If you want innovation, how about performance, accessibility, compatibility,...


It's one thing to be inspired, it's another to be completely ignorant of how you are breaking workflows for a billion people


Designers who could tolerate a Windows laptop in the Win7 era are probably not good designers. Those laptops were mostly inexcusably ugly, and Adobe software support was known to be less than ideal until late in the Win8 era.


Anyone who tolerates laptops at all is not good at judging ergonomics and usability. They're only made so you can get the necessary minimum done when you can't access the main machine, and it shows.


I do all my work from the sofa or floor. Bad ergonomics but good usability.


Well, I know we all have our kinks and perversions, but it's a bit unfair to push them onto the unsuspecting Joe.


The physical and mental pain caused by the terrible "keyboards", the cooling systems, and the small, poorly positioned screens that laptops offer makes their usability nearly non-existent.


You haven't really coded until you've coded under a blanket.

And I'm not an indestructible 20-year-old either :)


it's literally the worst experience, though sure, it is an experience


So it's universally more ergonomic to have a device that you can’t take to where you need to be working? Unbelievable.


It's more ergonomic not having to use a device that actively tries to hurt you.


Now hear me out for a second; I think I will blow your mind.

All laptops have a very special feature called external monitor support. And for some laptops, you can even buy docks.

Boom!

I know, it's amazing.


And then you're still stuck with the inferior performance and acoustics. Why would you want to do that?


That is an expert's view, which is not valued today, which makes it untrue. I personally agree, I totally do; I mean, my setup has a triple display and plenty of RAM. But sorry!


I know way too many people who build Android apps but use an iPhone as their personal phone. And this is how you get things like the back button going back through tabs that you selected on the tab bar. Even some of Google's own apps are guilty of this.


"Dog food? No, thank you."


"Do you have any apples?"


"You're fired. How do you like them apples?"


I strongly disagree. Two of the major flaws of pre-Nadella Microsoft were arrogance and complacency.


Dogfooding your own product isn't arrogance or complacency; it's common sense. If your own product isn't good enough to keep you from preferring a competitor's, then it's not good enough to ship.

Arrogance is not doing any competitive analysis, initial or ongoing.


I mean, dogfooding is good but not everyone has to make the pinnacle product?

If someone is selling something that's out of reach, financially, for most people, then making a cheaper and worse product available can still be good. Doesn't mean you prefer it.

I'm guessing Microsoft don't want to think of themselves as "the poor man's OS" however. And, they're not, it costs something like £120.


> I mean, dogfooding is good but not everyone has to make the pinnacle product?

This discussion thread is getting lost in the weeds.

We were talking about people <<designing>> Windows.

For sure they should be using it, <<day-to-day>>. They should feel the same pain as their users are feeling and they should want to improve it.

Anything else is a travesty and that's how you end up with enterprise software (designed for the CEO, used by the peons), or Android apps (designed for iOS by people with iPhones, used by peons on Android).


I think there is a case for using, day to day, the product they want to improve, but there is also a case for knowing how competitors are doing it.

The hardest part, I think, is working and sharing work across two different systems at the same time. Not that technical solutions don't exist, but muscle memory will always make one system end up feeling unbearable, and that might not be because it's worse but because of resistance to change.


> but there is also a case for knowing how competitors are doing it.

In the case of UI/UX, I don't want this.

I use Windows because it's not MacOS. I absolutely hate MacOS. Microsoft UI/UX designers using MacOS as inspiration is a critical bug, not a feature, as far as I'm concerned.

I want my taskbar to show labels. I want multiple windows of the same app to be a separate item on the taskbar so that switching between multiple windows of the same app is a single click. I want each window to have its own menu bar, rather than a single menu bar at the top. I want a taskbar on each monitor, each showing only the items on that monitor.

Windows 10 has all these as an option. If Win11 is imitating MacOS, all those go away.


There's a bit of an issue there: most dogfooding devs don't feel the same pain as their users. I think a huge part of the Edge dev team uses Edge daily despite it being basically unusable, and they are happy and proud of that mess. The same goes for Chrome, Chropera, Quantum... so dogfooding isn't the silver bullet either.


> I'm guessing Microsoft don't want to think of themselves as "the poor man's OS" however. And, they're not, it costs something like £120.

"Whereas MacOS is free, it's included with the machine and Apple doesn't charge for it separately"?

But how much of the price you pay for that machine is because of the OS? That's an unknown. Maybe the actual development costs, distributed per device, are available somewhere deep in the bowels of Apple accounting, but that doesn't tell us how much of the (inflated) price you pay for the box is due to the OS. Microsoft's dev costs for Windows probably aren't £120 a copy either (especially considering what a tiny minority of users actually pays exactly that, as an explicit item separate from the hardware).

So I reckon Windows still actually is "the poor man's OS": You probably pay more than that for MacOS, only it's baked into the cost of the hardware so you don't know how much, exactly, it is you're paying.


> reckon Windows still actually is "the poor man's OS"

Ask the gamers who paid $3000 for the graphics card alone; I reckon their opinion will differ.


But the OS has a separate price tag there. Just because gamers rich enough to blow ridiculous money on their rigs also use it doesn't make Windows "the rich man's OS"; that expression means something only the rich man can afford. Rearrange the sentence a bit, and "the poor man's [whatever]" means the only such thing the poor man can afford.

MacOS sure ain't the poor man's OS.


Ever since Lion or something, MacOS has been free. Not free as in free upgrade, but free as in free to download and install and use on any compatible machine (only Macs, of course).

Windows always had a price, even if that price was included by the OEM in the price of the device.


> Not free as in free upgrade, but free as in free to download and install and use on any compatible machine (only Macs, of course).

Let me repeat that:

   only Macs, of course
So then it...

* IS just an upgrade. Those Macs all came with an OS originally, didn't they?

* ISN'T actually "free": You have to buy a Mac first. They cost money -- quite a lot of it, compared to most other PCs, I've heard.


How did the designers become absolute dictators?

I've dealt with this at a few companies and it's really crazy. I've seen them tank products more than once because they ignore everyone and are given unquestionable power.

It's absurd. They have as much capacity to be wrong as anybody else.


Designers focus hard on interpersonal skills. To non-technical leadership they look like tech guys who aren't weird and have better haircuts.


> How did the designers become absolute dictators?

When the iPhone became successful. Before that it was all utility; then fashion took priority when the iPhone outsold literally everything.


I've always been suspicious of that attribution.

Pretend everything else about the iPhone happened: the charisma and leadership of Steve Jobs, the brilliant marketing, the countless talented software engineers, but instead the device basically looks like a BlackBerry.

Physical keyboard, plastic case, more or less a knockoff aesthetically.

We're deep in counterfactual territory, but I think it can be argued that this imagined iPhone with a very conventional aesthetic would have done effectively just as well. It still connects to iTunes and hooks up to your Mac, having all the same technical features and functionality.

The point is, people see something successful and then assign one of its attributes as the driver of that success, and there seems to be very little reflection on the two core questions: (1) is it true? (2) is it generalizable to other companies?

And let's say it is. Are you really going to make something that's more iPhone-y than an iPhone? More macOS-y than macOS?

So even conceding those two points, which I think are utterly contestable, it still doesn't make any sense. Maybe that's why it's never worked. Maybe. Who knows?


>> When the iPhone became successful. Before that it was all utility; then fashion took priority when the iPhone outsold literally everything.

> Pretend everything else about the iPhone happened: the charisma and leadership of Steve Jobs, the brilliant marketing, the countless talented software engineers, but instead the device basically looks like a BlackBerry.

> Physical keyboard, plastic case, more or less a knockoff aesthetically.

> We're deep in counterfactual territory, but I think it can be argued that this imagined iPhone with a very conventional aesthetic would have done effectively just as well.

Yeah, but AFAICS that doesn't really contradict the hypothesis. Sure, I might agree with your counterfactual and also conclude that it wasn't really the design -- and, side note, I think what was originally meant here was the narrower sense of the ever-dumbed-down GUI, not physical characteristics like keyboard or not -- that led to the iPhone's breakout success... But just because you and I think so, CEOs didn't (necessarily) agree.

If they thought it was the cheery skeuomorphic (and later stylised, then rounded-corner, then sharp-edge-rectangular, then rounded-corner again... yadda yadda) graphical design that led to the market success, then they started listening only to graphical designers, and that is what has elevated graphical designers to "gods" over actual developers.

And that's not just a counter-hypothesis to yours, but my theory; it's how I think it actually happened.


the thing about the iPhone is: it didn't start the smartphone era like many say, it ended it

suddenly people started desiring devices as expensive as business models were, that could only do half of the things their mainstream ones could, so by market rules it wasn't profitable to struggle and do things right anymore

do you remember cheap Chinese portable music players that required dedicated software but we put up with it because they were cheap? so... the iPhone isn't cheap


Apple, and a number of other players, have long had the ability to craft a favourable narrative in most people's minds.

Heck even supposedly technically minded people will say things like "Apple invented the mouse" or they had "the first mp3 player" or "the first smartphone".

There's a circus about them that permits them to stomp out history before their first performance and mark it as irrelevant.

Musk is also a master of this craft. Founder of Tesla? You'd be surprised. Founder of PayPal? Go look that up.

These people could have been excellent actors if their businesses had failed


> There's a circus about them that permits them to stomp out history

And how do you think they would do that?

Apple sometimes uses a bit of hyperbole to describe itself (https://512pixels.net/2014/01/apple-boilerplate/), but I don’t think Apple ever claimed "Apple invented the mouse" or they had "the first mp3 player" or "the first smartphone".

I also have never met anybody who made such claims, only people saying there are people who make such claims.

It also would be weird for them to do so. Their marketing differentiates their products from the competition, and has been doing that for years.

The competition sells MP3 players, the iPod was a _music_player_ or just the iPod (look at https://youtube.com/watch?v=kN0SVBCJqLs, and count how often Steve says music before he says mp3. [1]); the iPhone an iPod with touch controls, a revolutionary mobile phone, and a breakthrough internet communications device (https://youtube.com/watch?v=x7qPAY9JqE4), the iPad just the iPad.

[1] There’s a claim in that video that Apple invented FireWire. I think that is mostly correct. They were the driving force there.


I'm similarly confused that anybody would even think Apple claim to have invented these things.

Particularly so since in the Keynotes where SJ introduced e.g. the iPod he shows the state of the competition and has a good critique before introducing his 'lame' replacement.


Zuckerberg, founder of Facebook...


Upvote for content -- but if I could have simultaneously downvoted for the near-unintelligibility of that run-on sentence, I would have.


in the text area it was separated into paragraphs, but oh well, I guess it pretends to be Markdown-formatted?


You need to put an extra newline to get separate paragraphs.

Like this.


Periods would have helped. They're standard even at the ends of paragraphs.


The iPhone being a flat glass screen and nothing else really was a big change.

It lets you have video and web browsing, and it's what actually lets apps be useful.

Before the iPhone, the UX was basically fixed and couldn't be iterated on, and the screen space was ~240p, or 144 tops.


Now that's some nonsense.

The iPhone was way ahead in usability compared to everything else back then.

For all the faults of Apple, somebody needed to define how smartphones look and act. Mobile OSes were a horrible pain to use. People tolerated that because there was nothing else out there.


The iPhone was always behind; basic things were always either convoluted or just impossible. iOS is the most painful mobile OS I have ever encountered.


It wasn't behind the first year or so.

It's not about buttons or even copy-paste as weird as that sounds for a techie.

But zooming on a map with your fingers was a game changer. Zooming an image with your fingers. Using your fingers to scroll a list, naturally. And lists are probably the most common UI element except for buttons, labels and images. They're for sure the most common complex element.


If it adds some ingenious new features but at the cost of removing some basics, it is still behind. That's the whole point of the W11 issue.


Yeah, but there's a term: "paradigm shift".

It wasn't a "few features" (techie speak), it was a "new way of working" (usually marketing speak, but here it was actually true).

W11 is a few missing features for an existing way of working.

The iPhone had a few missing features for a fundamentally new way of working that was much superior to existing smartphones.

Your complaint was like the handlebars on the new bike being hard to push (which I can work around by pushing harder, up to a point) while the old bike had square wheels and a chassis meant only for square wheels (which I could not work around).


The "new way of working" on the iPhone is using a fork to move soup from your pot to your bowl instead of a ladle; that simply doesn't work.

Pinch-to-zoom doesn't interfere in any way with easy app installation or easy file transfer; they can coexist (and they do, on Android). Sync simply doesn't work when you want to quickly drop that one specific file and keep moving. Sync doesn't work when one device has much bigger storage than the other. Sync doesn't work when you want to easily remove files from one device.

A "paradigm shift" to a golden cage is not a good thing. Apple intended to take all responsibility away from users, but at the cost of them being unable to do anything efficiently.


> A "paradigm shift" to a golden cage is not a good thing. Apple intended to take all responsibility away from users, but at the cost of them being unable to do anything efficiently.

The original iPhone didn't have an app store, apps were supposed to be web apps.

If you're going to rewrite history, at least do it well.

I get it, you're a techie, just like me. I use Android, I don't like iOS.

But to deny that for the average person the iPhone was the first usable smartphone is just silly at this point.


How did making everything harder make the iPhone "the first usable smartphone"? That makes no sense.


As an example, navigating a list directly with my fingers is much, much faster and more convenient than navigating it with arrow keys (physical or on-display ones).

Navigating a 2D space with my fingers, just dragging around or pinching to zoom, is also much, much faster and more convenient than doing the same with arrow keys (again, either physical or on-display ones).

And navigating through the OS and apps, through lists and 2D spaces (websites, images, maps, videos, etc.), is a lot more common than copying files around on phones. 100:1, probably. Again, some techie/advanced functionality was lost at the start, but the time and frustration savings from those basic yet intuitive features heavily outweighed that loss.

If you don't agree with this, I guess you either haven't tried pre-iPhone smartphones or you just have a very unorthodox opinion and I'm not very keen to continue this conversation.


I don't think that's a great description.

It's more like switching from using a spoon to move soup to your bowl to eating a sandwich instead.


What basic things are convoluted or impossible on the iPhone / iPhone OS?


Things on the home screen auto-arrange. I can't leave things at the bottom, near my thumbs, where it's most ergonomic.

It took until last year to get an app list and not have to have everything pinned to my home screen.


File transfer, installing apps unwelcome in the main store, using a proper contextual content blocker.


[flagged]


No. IMO they were awful. Sony had Symbian (?) licensed, as I recall, on some models, and that was the best of it. Moto, HTC, and BlackBerry - ugh. My last two pre-iPhone phones (and I've used many Androids in between - every other phone until about 4 years ago) were a Treo with a version of PalmOS that basically froze anytime you looked at it funny, and a Samsung Windows Mobile device. The hardware was decent enough, but Windows Mobile deserved the death it got.

Don't even get me started on the Blackberry devices of the era. Missed the keyboard still though.


Only the G1 (the first Android) was in the iPhone's class (and may have outclassed the iPhone a bit, with built-in GPS and support for 3G). No one else even got it. It was a complete game changer.


Most of the phones you might be thinking of are probably post-iPhone. I used a couple of pre-iPhone smartphone OSes, and oh boy, they sucked.


I still remember my HP iPAQ 514. I wouldn't wish it on my worst enemy.


Sorry, but none of the offerings were any good. I had Windows Mobile 4-6 machines, Palms, the Nokia N61, and BlackBerrys; they all sucked. The iPhone was 10 years ahead.


The iPhone / 3G / 3GS / 4 were so far ahead of what any of their competitors were doing that it can be difficult to conceptualise it in hindsight.

HTC, Sony, Moto, RIM - there's a reason these companies barely exist in the same form anymore. Samsung were the first to consistently catch up to Apple, largely because their official strategy at that point was 'copy Apple'.


That's a very wrong way to describe what happened.

Let me describe what the phone scene looked like just before the iPhone was released. Phones were fine for making calls, sending texts/e-mails (in BlackBerry's case), and taking an occasional photo if you had a high-end Nokia. That's it. The phones had very long lists of features nobody actually used. Yes, they had rudimentary browsers, but why browse if the pages looked nothing like the real ones and it was extremely expensive outside of being connected to wifi?

When the iPhone was released it was very obvious that it was on a completely different level from what we had until then. Yes, one could argue that it was missing some minor functionality, like the SMS character counter, which was very standard. But the first time you saw it, when you saw that the screen could be smooth-scrolled, that you use your fingers completely naturally for any kind of action (vs. styluses, which were crap), when you saw that everything was presented clearly and nicely, that the screen was nice and bright, and the GUI easy to use, it seemed like the device came from the future. In many ways it was like a pocket computer.

I got the second batch of the first iPhone (the one which could be unlocked by certain means, so I could use it in my country) and showed it to my friends at college, who all had high-end phones. You should have seen their faces. They had no idea what they were looking at. And no, they had never watched Jobs's presentation or anything. Most of them didn't even know who he was, nor were they exposed to any kind of Apple marketing. Actually it was the reverse: Nokia and similar were the established brands. The phone was judged on its own, and it was almost magical to us.

And I still didn't come to the best thing about it.

It brought the internet to your pocket. Not only would the browser render pages like they looked on your computer, it was also very usable, which was unseen before. Have you actually tried to browse the internet on a Nokia phone at that time? It was complete shit. That pinch to zoom in/out was actually revolutionary as well.

The second part of the revolution was that people actually wanted to use the phone to browse the internet, so Apple and users both put pressure on the ISPs to lower the prices of cellular data. And believe it or not, they did. They lowered the prices so much that cellular data went from one of the most expensive resources in the world to one of the cheapest. That alone is groundbreaking.

The third part, even though it happened some very short years after that, was that it enabled programs which were actually usable, paving the way for the whole mobile app ecosystem we have now. How many people were writing software for a living on a mobile device before the iPhone? How many do you think are doing it now? The iPhone kickstarted the whole industry.

And all of this was actually made possible by excellent software and hardware design (not fashion; they are usually diametrically opposed!). None of this would have happened if it weren't for that design. If the iPhone were just another N95, nothing would have happened, whatever people thought of Apple marketing. Nothing would have happened if the browser had been difficult to use, the UI lagging, or a stylus needed to use it. It all had to come together for it to work. And they managed to pull it off spectacularly. It makes it very easy to argue that Jobs's most important legacy is the iPhone.


I did have a device with Symbian S60v3. I browsed the internet and it worked perfectly fine, in Opera Mini/Mobile; pages rendered just as on the desktop, and you just zoomed in on the area you wanted to read. It had a proper filesystem and file explorer available. One time I registered a domain, wrote some placeholder HTML content (on a QWERTY keyboard, none of those crap touchscreen keyboards!), and uploaded it through FTP, all on the phone. It ran some games too, including SNES emulation and others. That felt like a pocket computer. I don't think you could do most of these things on the first iPhone; it was a feature phone by comparison.


Maybe it's the same as with most companies: the higher-ups don't really get UX/functionality and are moved purely by looks.

"Because many users are praising Apple's looks, by cloning Apple we'll also get praised by our users, right?" "Then we can get Apple users to use Windows too!"


But they don't. They've cloned some superficial UX bits.

I'm still dismayed that in Windows 10 you still basically have 2 separate UI/UX experiences to configure the damn OS.


Isn't that incredible? A trillion-dollar company cannot issue a top-down decree that the mixed UIs need to be fixed for the next OS.


Yes, they cannot, because the new UI is more or less useless where it touches core functionality (i.e. interaction with HW). Maybe they were told by kernel engineers not to touch that part.


A couple of days ago I ran into one of those windows in Task Scheduler. I had to type a long string of program arguments into a right-aligned, narrow text field (plenty of space to the left of it) inside a window that I could not resize to make that text input wider.

Luckily that was a VM on my Linux box (the luck being that I use Linux, not Windows.) The Linux equivalent would be either cron or systemd, which are configured with text files that I can edit full-screen if I care. Microsoft put a ton of work into building those graphical UIs, and in some cases the result is worse than not doing it and using text files. Note that GUIs are not magical: to do what I was up to, I still had to look for documentation and eventually had to type some XML into a text area. That was to start a task after another one completed.
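For the curious, here's roughly what that "start a task after another one completed" setup looks like as systemd units. This is a minimal sketch: the unit names and script paths are made up for illustration, and the `OnSuccess=` directive requires systemd 249 or newer (on older versions you'd need a different arrangement, e.g. chaining the commands or using a path/timer unit).

```ini
# first.service -- the task that runs first (hypothetical name/path)
[Unit]
Description=First task
# When this unit exits successfully, queue the follow-up unit (systemd >= 249)
OnSuccess=followup.service

[Service]
Type=oneshot
ExecStart=/usr/local/bin/first-task

# followup.service -- started only after first.service completes successfully
[Unit]
Description=Follow-up task

[Service]
Type=oneshot
ExecStart=/usr/local/bin/followup-task
```

Both files are plain text you can edit in any full-screen editor, which is exactly the point being made about the Task Scheduler dialog.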


I understand that OSs are really complicated. I've been exposed to every one now for almost four decades.

What's dismaying is that in that four decades it's now just regression. I admittedly have used MacOS for a couple of decades -- simple and it works well enough, but always run Windows as a secondary OS and Linux "as needed".

Windows has gotten so ersatz and bizarro that it's just strange to me when I do use it every few days.

Windows 2000 / Windows 98 had it down. It wasn't fancy; it was logical, and it worked. It's been a confusing mess since Windows 7.


Windows 7 was probably the best it's been for a long time.

But you're right, they completely screwed up the control panel. If not for that then it would've been the new pinnacle (let us never discuss Vista).


Everyone can give their opinion on UX choices; this is something the higher-ups can discuss at length. It is much more involved to discuss engineering topics, and the time scale is different. Overall it is easier for designers to develop a rapport with top management than it is for engineers. Engineers would love to do what designers do: propose massive changes every year. They don't, for engineering reasons.

Is there such a thing as design continuity? Or granular design change?


Who should override the experts in design?


Users.

They are the ones providing the money that pays for their employment.

"Expert" User Design is design that is recognised as such and accepted by Users.

If Users reject these designs then they can't claim to be "experts".


Any high-quality design team should be regularly interacting with users. They should be doing user studies, with both the shipping products and mocks of new features.

Any design team that thinks it can make changes and new features just by sketching in a notebook is mistaken and ineffective.

You can't have every user weighing in on each change, so the design team needs to be the voice of the user. They are talking to users anyway.


Management


I knew the Windows UX was dead the moment I tried to type a search into the Start menu and drag a program off the results onto the desktop to create a shortcut and it didn't work, and then when I went to right click on it "Send to Desktop (Create Shortcut)" was gone too.


In XP you had to revert to classic desktop to turn off the Fisher-Price.


Not that bad aesthetics aren't a problem, but I think loss of basic functionality is a bigger issue.


Meh, Fisher-Price was cute. I've never understood the fascination for those drab Windows 3.11 grays.

And Fisher-Price was also functional, you just got extra colors.


I never understood the fascination with everything being forced white. The gray widgets and white content were perfect for drawing the eye to the important parts... the content.

In addition, the "classic" desktop of the XP era wasn't flat grey. It was a warmer color, with a bit of beige; the Wikipedia image says #d4d0c8. That was a lot nicer on the eyes all day long. How far backward we've come, forced into narrow choices by folks who have never heard of "Chesterton's fence."


Windows 3.x by default had colourful (yellow) borders and white backgrounds (except maybe the "main container" window in MDI apps; was that perhaps gray? Or teal?). The grey came with W95. Both were fully user-configurable via a quite user-friendly dialogue box, as were NT 3.x, NT4, and W2K.

Can't recall whether XP, Vista, and 7 also allowed you to change colours directly -- if it was only the "Fisher-Price", "Glass", and, uh, "Glass 2?" that were locked-in in the standard UI setting -- or if you needed to switch to "Windows Classic", the 95--W2K-style UI, to change even the colours. Can't recall, because I of course switched to Classic first thing I did on any of those.

That all went away with Windows 8, of course, and AFAIK didn't come back on 10 either. Actually, they'd started to sabotage it a little earlier: About halfway into the lifecycle of W7, you couldn't change the window border width in the dialogue any more; you needed to hack that value in the Registry (and re-hack it after any change via the dialogue box). And towards the very end, even that hack didn't do anything any more.


XP for sure allowed you to change colors. The dialog wasn't even hidden, I think.

7 probably did, too, but the dialog was a bit hidden.


Yeah, it was probably either the dialogue before that, or (even more likely) a drop-down at the top of the dialogue that let you select "Classic".

It's utterly incomprehensible to me why they went and just ripped all of that functionality out in later versions.

(Also, why Linux desktop environments keep crowing about how many "themes" they have. However many thousands of them people cobble together... Just utterly pales in comparison to letting users set each detail of their systems up exactly as they want; that's in effect infinite "themes"!)


Not hidden at all. Right click anywhere on the desktop and hit "Personalize".


What we're talking about is probably not a secondary but at least tertiary or... Eh, fourth-level dialogue, buried at least three clicks deeper after that right-click. (Except in versions after Windows 7 it's nowhere to be found at all any more.)


Windows 3.x was white, it is Windows 95 that was gray.

Amusingly Win10 was the first time since 3.1 where using the color returned for GetSysColor(COLOR_WINDOW) would give you a "correct" color for the window background :-P.


For me it was when I couldn't right-click and choose "Open file location" from the Windows search.

What folder is Gears of War 5 installed to on my PC? Who knows, certainly not me. Also a reason why I hate the Windows Store (not that the mac App Store is any better)


My feeling too. In that sense the UI now feels more like a thin veneer; you could easily mistake it for a KDE theme or a macOS clone.


> most crucially essentially none of the designers use Windows

Why are they allowed to work on Windows? Even car salesmen are often required to drive the same car as the one they're selling.


I honestly don't get how this is even a problem. My best friend is a UX/UI designer who works on Mac but the guy also games and used to do crypto mining back in 2013 so he is very familiar with Windows as well. Why is Microsoft having problems finding designers that use and know both?


It's a problem for so many reasons it's not even funny, from the mundane to the esoteric-but-important!

A random example: Macs historically used a display gamma curve of 1.8 versus the PC's 2.2, and Macs now use a Display P3 color gamut by default, which has different color primaries than the PC sRGB standard.

Why does this matter? Because two shades that are easily distinguishable when viewed on a Mac may not be easily distinguishable on a PC! Colors will shift, and even 50% grey is represented differently. The same JPG or PNG file from a design will look different on a PC, even on an identical monitor.

Similarly, font weight, anti-aliasing, and kerning are very different on the two platforms. Any text design done on one has to be adapted for the other. It's not just the default fonts! The same font at the same size will look very visibly different.

Did I mention default fonts? Only Times New Roman, Arial, and Courier are common between the two platforms. All of those are physically different fonts that look different even if transported to the other platform.

Etc, etc, etc...
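To make the gamma point concrete, here's a rough back-of-the-envelope sketch using the simple power-law model with the 1.8 and 2.2 exponents mentioned above. (The real sRGB and Display P3 transfer functions also have a small linear segment near black, which this ignores.)

```python
# Illustration only: how the same 8-bit pixel value maps to different
# physical brightness under a gamma-1.8 vs. a gamma-2.2 display.
# Simple power-law model; real transfer functions (sRGB, Display P3)
# add a linear toe near black that is ignored here.

def to_linear(value_8bit: int, gamma: float) -> float:
    """Decode an 8-bit encoded channel value to linear light in [0, 1]."""
    return (value_8bit / 255.0) ** gamma

mid_gray = 128  # "50% grey" as stored in the file

on_gamma_18 = to_linear(mid_gray, 1.8)  # classic Mac gamma
on_gamma_22 = to_linear(mid_gray, 2.2)  # sRGB-style PC gamma

# The identical bytes display roughly 30% brighter on the 1.8 display,
# so two shades a designer can tell apart there may blend together
# under the darker 2.2 rendering.
print(f"gamma 1.8: {on_gamma_18:.3f} linear light")
print(f"gamma 2.2: {on_gamma_22:.3f} linear light")
print(f"ratio:     {on_gamma_18 / on_gamma_22:.2f}x")
```

The specific `mid_gray` value is arbitrary; the point is only that identical file bytes decode to different light levels depending on the display's transfer curve.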


This is an interesting reply, but very much off topic. The GP asked, and I quote:

> Why is Microsoft having problems finding designers that use and know both?

and you totally failed to answer that.


Well, not entirely their fault. Your GP (now my GGP) began by stating:

>> I honestly don't get how this is even a problem.

Which can be interpreted not only as "How can this even happen?", but also quite reasonably as "So what's wrong with that?". The parent comment to yours (GP to this) answered the latter.


> essentially none of the designers use Windows

This is ridiculous.


I mean why would someone who cares anything at all about how software looks even glance at Windows if they had any choice?

Imagine taking someone who is so passionate about UX that they made a career out of it, making them use their worst nightmare as their daily driver, and then giving them zero institutional power to fix any of it because of the n-dev years problem.

Windows, while extremely powerful and stable because of the commitment to not breaking things, is basically unfixable UI wise unless “forward” means going backward. There’s too much to rewrite and not enough value gained by it. So every iteration will be piling on even more inconsistencies and shifting around the “easy stuff” because that’s all you can do.

You just have to embrace that Windows is 50 years of UX trends all coexisting.


Good design is the intersection of form and function.

Imagine taking a role on the Windows Desktop Experience Team while caring so little about how the operating system functions that you don't even bother to use it as your main workstation.

This is the company where the phrase "eating your own dog food" originated, by the way.


>Imagine taking someone who is so passionate about UX that they made a career out of it, making them use their worst nightmare as their daily driver, and then giving them zero institutional power to fix any of it because of the n-dev years problem.

1. Those people are paid. As a dev I am paid to use JS and PHP; I am not crying here "poor me, I can't use my favorite language, Prolog or Lisp", so why should a designer cry if he does not get to work on his preferred shit? They can use Linux or OSX at home.

2. These designers call themselves UX people. That means they shouldn't just work with images of GUIs; they should in fact use the product as a user would use it. Example: if you are a UX dev for the Word product but you always use Word with one-page documents, you will never feel the pain your users with 50+ or 100+ page documents feel. As a dev or UX designer you should use the product you create.

3. If you are a designer in the sense that you create logos and icons, then it's probably fine that you don't use the product. But if you are a web designer and really love fucking with fonts and font colors, PLEASE borrow a regular laptop from a friend and check your website on it; your cool fonts might look different on a different machine, and maybe you will also be reminded that not everyone has a high-definition laptop screen.


I mean I don't know about you, but I do like to whinge a bit if I have to use PHP or JavaScript or some framework of misery for a job. Pretty sure I'm not the only one.

I still do the job though.


2. This reminds me, while we're documenting ms bugs: does anyone else's Word get insanely slow once there are around 40 equations in a doc? Like taking over a minute to copy from an equation?


> I mean why would someone who cares anything at all about how software looks even glance at Windows if they had any choice?

If they're building software for Windows??? Duh, WTF else are they supposed to look at? "Hey, my mockups of our Windows-only software look great on my Mac, so what's the problem?" is about the stupidest thing that designer could possibly say.

And if what they're designing is not just software for Windows, but Windows itself, it's ten times worse. Which is what's under discussion here.

> Imagine taking someone who is so passionate about UX that they made a career out of it, making them use their worst nightmare as their daily driver, and then giving them zero institutional power to fix any of it because of the n-dev years problem.

Yeah, that might be a fascinating hypothetical to discuss some other time, but this discussion is about the problem of designers "so passionate about UX" that they insist on making stuff look "good" at the cost of usability and consistency being given all the institutional power to break it despite the n-user-years problems this causes. The exact opposite of what you are imagining.


Oh, and returning to this bit:

>> Imagine taking someone who is so passionate about UX that they made a career out of it, making them use their worst nightmare as their daily driver, and then giving them zero institutional power to fix any of it because of the n-dev years problem.

And if the UI they're designing sucks so humongously that these poor "passionate about UX" designer snowflakes can't even deign to use it... Then whose fault is that? Whom can they blame for this, if not -- the designers of the UI they're designing, i.e. themselves?

Assholes, that's what they are. (And what kind of person defends assholes?) Of fucking course it sucks, when the assholes who designed it don't even have to use it, so they'd get to suffer if it sucks. They're getting away with inflicting the suffering only on others, while they themselves avoid it. No wonder they don't give a shit!

This is precisely what "eating your own dogfood" (aka "dogfooding") is all about. Look it up.


> I mean why would someone who cares anything at all about how software looks even glance at Windows if they had any choice?

Ever seen airplane controls? They are ugly, and UX people have to design them too, to enable pilots to act in emergencies, to highlight issues before an emergency even happens, etc.

You are conflating visual design and UX, which is common.

The former draw pretty pictures and they can fuck off to MacOS, nobody cares.

UX is about structure: where should we put the delete button? How many clicks does it take to access your files?


As an office power user i like the UI of windows 10. 11 sucks tho


Going from 7 to 10 was like being a juggler who was used to juggling 3 apples suddenly having to juggle oranges. A little different but basically the same.

Going from 10 to 11 is like being a juggler going from juggling oranges to being punched in the face for the entertainment of 10 year old boys.

Have you tried using an app other than photos to view images? How about being forced to manually set every single extensions default opener? Oh, did you expect to be able to right click on the taskbar? Screw you! Have a kick to the groin instead! That's what you get for thinking that this version of windows would operate anything like any other version of windows, you poor, pathetic worthless mineable data source! Now shut up and answer this survey about how much you like Windows 11!


yes, anyone who cares about looks would pick win10 pre-neon over mac, they would pick win8 over mac, they would pick win2000 over mac


"title bars that made it impossible to tell active and inactive windows apart": Win10 (and IIRC Win7; I never had Win8) is a mix-and-match. FreeCommander XE (the file manager I use) and Thunderbird both have title bars that change color, and Vivaldi allows it (although the contrast is not as great as I'd prefer). But none of the MS Office apps do. I think there's some kind of subtle change, maybe a 1-pixel-wide border or something, that shows up when they have keyboard focus, but I honestly can't tell. As a result, I not infrequently type into (or worse, delete something in) the wrong window.

Also, the title bars on MsOffice apps are so cluttered with controls that you can hardly find a safe place to click on them if you want to select that app with your mouse. Why, for example, is the search bar in Outlook up there, instead of some pop-up dialog box? (And the Outlook and Word search tools themselves are a mess, but I digress...)


To your last paragraph, my favourite thing about Mac is how if you click a window that doesn't have focus, the click ONLY makes the window focused. If you clicked on a button, it doesn't actually click the button.

A similar thing is what happens when you touch a phone when it dims right before sleeping. Some OSes only let the touch restore brightness, while others will do that AND register a tap wherever you touched. Super annoying IMO.


I think it would bother me to have to click twice when working back and forth between two windows. Maybe another approach could be to only register the click action if there is no overlap? Then again, the inconsistency would probably be frustrating with that approach.


Fun to see this come up because it is one of my biggest annoyances when switching between MacOS and Linux.

I have focus follows mouse on Linux so it might be an exaggerated problem, but it’s annoying to click a text box and it not actually select the text box.

There is an option in yabai for focus follows mouse, but I had to disable yabai because it’s really awkward and perceptibly slow on macos.


Funny, I use windows and Mac interchangeably, and I completely prefer the windows model.

I use 3 monitors for both, and frequently multi task and the extra click on Mac infuriates me.


Yes, it's a good feature for ease of use and makes things less confusing for my mum but I wish I could change that on Mac.


A really useful feature on MacOS is that you can scroll a window that doesn't have focus by hovering over it and using a mouse with a scroll wheel/surface.

MS - and especially Windows - has always seemed to me a company that takes smart people and makes them do really stupid things. The product culture seems incredibly broken.

It's never been great. But in the past usable versions like XP and 7 would fall out. 8 set a new baseline for idiocracy and user hostility. I see no evidence things have gotten better since.

I suppose - as per earlier comments here - if the culture is top-down design by people who don't even use the product and are trying to Make a Statement for career reasons, the future isn't encouraging.

The real question is what kind of management allows something so obviously nonsensical to happen.


> A really useful feature on MacOS is that you can scroll a window that doesn't have focus by hovering over it and using a mouse with a scroll wheel/surface.

Yeah, Windows has had that too for a while now. Dunno for sure how long; some five years (or eight? Ten?), I'd guess.


Citrix (a remote desktop app) breaks that; apparently no mouse scrolls are sent to an app on the remote desktop until you click in the Citrix window. Thunderbird partially breaks it: I just tried with TBird in the background; scrolling the list of emails works, but scrolling the text of a selected email does not.


Did that behavior change at some point? I'm on Big Sur and clicking on an unfocused window will focus it and trigger whatever button your mouse landed on.

Edit: it seems to depend on the kind of button(?)


The developer can make a choice there. https://developer.apple.com/documentation/appkit/nswindow:

> var worksWhenModal: Bool

> A Boolean value that indicates whether the window is able to receive keyboard and mouse events even when some other window is being run modally.

I think there are flags for doing that for non-modal ‘other windows’, too.


Native buttons and sliders have click-through enabled by default. Users can also ⌘-click to press buttons in background windows without changing the focus.


This is because of the toxic effect of economics and marketing. Everything has to be centered on making the platform more money. Even in cases where it makes no sense to do so!

I've seen open source projects embrace market driven design methodologies, because they mistakenly think it matters.

There are so many designers and developers who have only worked at startups, and dark patterns have become baked into our development culture.

Mostly this attitude that you need to manipulate the experience of the user, and disregard any feedback from them, because you are the expert, and they don't know what they really want. That's what Steve Jobs said, and he was a success, right? So you should do that also.


I love good design, but designers are also able to completely ruin things in short order. I've been using Windows for 30 years as my primary desktop but in the last 6 months I'm increasingly ready to abandon it and just switch to Linux, even though I'd miss a lot of small things.


> most crucially essentially none of the designers use Windows.

That explains a lot. Not having the designers dogfood their work is a serious oversight.

As an MSFT employee, I have no choice but to use win 11. And suffer from taskbar annoyance and bugs


In the same boat. I must have filed hundreds of internal feedback & bug reports, only to see none of them addressed. Nowadays I remember to double-click in the right-click context menu, because right-click + V doesn't open a file in gvim anymore...


I had used some registry hacks to disable the new context menu, and have now installed ExplorerPatcher (mentioned in the article and this thread) to fix other issues.

You should give it a shot!


As an MSFT employee, you also have a better chance of getting things fixed.

If everyone there (except the designers) complains, maybe the management might listen?


You'd think that, but as an internal employee "you're not our target demographic".

Not that this is exclusive to Microsoft. Every large company I've worked for has had some variation of that phrase.


Seems weird that you'd be forced to use an OS not designed for your use case.


Yeah, I've met two Metro/Modern designers during my internship at MS, and both used MacBooks. One of them did most of his work in Parallels.


That explains some long-lasting bugs on real hardware, like single clicks registered as double clicks.


> Windows Desktop Experience Team from Win7-Win10. Starting around Win8, the designers had full control, and most crucially essentially none of the designers use Windows.

How are they allowed to do this? This is awful and goes completely against the "eat your own dog food" philosophy.

Separately, I've noticed that developers in general are too deferent to designers, who then get to run amok with shitty ideas. Your pushback is exactly what is needed.


Pushback will get you a bad review, smaller bonus, and a performance improvement plan. The company has been too dysfunctional for too long. Fixing it would be like boiling the ocean.


Counterpoint: Different competencies. A UX designer telling a developer how to code would be laughed out of the room, the other way around is the same.


Difference being, the software (at least more or less) works; what's annoying people is the usability and consistency of the UI[1].

Designers deserve to be laughed out of the room; developers don't.

___

[1]: Until they get their shit in order, designers can kiss my ass for their pretentious "UX" claptrap. The actual experience is only max-half interface, min-half functionality. Where the latter is provided by lowly coders.


Hi, the equivalent here is not a UX designer telling a developer how to code, but rather telling the developer their code does not work. For example, "Hi, I clicked delete in Windows Explorer, but the file was not deleted". This is just user feedback.


To design the core UX for a system, you should understand its userbase. If you're stuck in a paradigm where there are documents instead of applications, where closing things doesn't terminate them, and where the default switching hotkey doesn't allow you to switch between windows of the same application, it affects how you perceive the whole system and puts you at odds with your user base.


Omg, the white title bars! I literally can't tell where the title bar is sometimes if there's several windows overlapping.


I want a taskbar, not a dock, and I also want it improved so I can disable grouping (who even thought it was a good idea to be unable to put buttons in the order you want?). I want the start screen (yes, I actively used full screen, and yes, I made use of tile grouping)

and I want to get rid of all these attention stealing animations, all these readability degrading blurs, pre-neon 10 was great, why break it?


I agree.

I also have the taskbar on the side of the screen, so there is absolutely no chance of me going to W11. (That, and their TPM 2.0 insistence for DRM purposes.)

Hopefully by the time W10 is EOL, Proton/Linux will be a viable alternative for gaming, even for games that require malware-like software to be installed to work.


To replace the W11 taskbar joke I installed StartAllBack -- after 30 days you pay about the price of a coffee, and it's all back and even better. Yes, I'm fanboying. https://www.startallback.com/


For the longest time I would fight with Windows to get a plain black task bar and active window title bar. And I mean #000000 as in fully black with white text. In Windows 7 you could kind of trick it into doing it with registry alone but after that I always had to do uxtheme.dll hacks to get it to work. Then one day I just stopped and started using Linux on my desktop and MacOS on my laptop. Whatever game MS thinks it's playing ... they are losing market share now in both business and education. All they really have left is DRM heavy gaming.


> All they really have left is DRM heavy gaming.

AFAIK, it's not the DRM that's a problem, it's anti-cheat. Cheat software needs to evade detection of both its install and execution, and anti-cheat has to be able to detect those. I imagine writing API emulation that works with anti-cheat is pretty difficult.

Now I'm curious about how anti-cheat could work in a Linux environment. Cheat software likely doesn't need root to benefit the player, but it will certainly need to run as root to avoid detection from the anti-cheat software which will definitely run as root.

Of course, Linux gamers are probably going to raise an eyebrow about a game component wanting root permissions, but how else are you going to detect cheats?


> DRM heavy gaming.

As someone who's now playing League of Legends from his Linux desktop, even that is going away.


Microsoft has a lot of gaming IP that they'll only release games for on Xbox and Windows. They'll likely try using DRM and anti-cheat to avoid the likes of WINE or maybe they won't even feel the need to. It's just more effort for users anyway. And Riot Games has VALORANT. DRM heavy gaming is not going away but only just beginning.

Off topic, but Microsoft can even bypass Steam and use their own (terrible) Microsoft Store and Xbox apps on Windows as they have hard to resist exclusive content. Just like Epic Games did with their store. Platforms, not services!


> most crucially essentially none of the designers use Windows. [...] the newest designer (because they seem to rotate every 6-18 months) with a shiny Macbook [...]

This is crazy. Essentially you get no product memory and no incentive to design a good product, because the designers won't experience the negative effects of their actions.

If this is by design, then tell me where Microsoft management wants Windows to go.

A saner dynamic would be

Build this please

I built it, use it for a couple of weeks

OK it's shit let's try this other one.


I think the problem is that there is no one product evangelist at the helm saying, this is where we are, this is where we are going to go, we are going to solve these problems along the way, and we will have these metrics to tell us where we are.

Instead it's every crab for themselves and the bucket is overflowing.


You just pinpointed my biggest issue right now with Windows. I found 8 or 9 different title bars among my normally open windows that have very little visual difference between active and inactive states, and furthermore often no visual difference between window and title bar. It feels like I more often than not click outside the intended window, and I also start typing with the wrong window active.


Right-click on Desktop -> Personalize

Colors -> scroll down to 'Choose your accent color'

   click a nice colour from the 'windows 'colors' tile.

   Scroll down to: 'show accent color on the following surfaces'

     click tickbox 'Title bars and window borders'

The homogeneous madness then starts to dissipate a bit...


> the Edge title bar having no empty space on top so if your window hung off the right side and you opened too many tabs you could not move it

This was extremely annoying, thank you for standing up for this.


Thank you for your effort & for fighting for the users.



> things like the all-white title bars that made it impossible to tell active and inactive windows apart (was that Win10 or Win8? Either way user feedback was so strong that that got reverted in the very next update)

Images online show that Windows 8 still had colored title bars, so it must have gone white in Windows 10. I do wish we had a way to get back colored title bars, since I find myself confused about which window is active in a multi-monitor world.


> Images online show that Windows 8 still had colored title bars

Depends on exactly which window, doesn't it? For example, the main Outlook window on my work Windows 10 PC has a blue title bar, but the pop-up-and-hover appointment reminder's is white. Don't think I ever used Outlook on Win 8, but I could well imagine it was the same.

And so on for a lot of other apps.


Whoever decided to implement the mouse-over thumbnail preview for open apps on the taskbar needs a whack to the head with a rolled-up newspaper. It's incredibly aggravating having all these small windows pop up when you're busy. Most times I end up clicking on something I didn't want just because the mouse cursor got near the taskbar.


It gets even worse in Windows 11.

I love the example of the Task View to show that they really don't use their own darn things. Since 2018 it has been incredibly buggy. Reproducibly even.


Funny that Windows gets forced simplicity while Apple goes the opposite way: more gestures and hot corners, which means there is no place to move the mouse that will not produce an effect.

On iOS, the fact that the Home button brought you back home every time was a selling point, so when you are in the sun, you click once and you know where you are. Then they introduced the 2-click on Home for the Control Center, 3-click, 4-click, and now it's a cycle, so you are never home. Same for the swipe up to bring up the Control Center, which you always have to do twice because the first swipe up generally triggers the local app's own swipe gesture. Apple became a model of stuffing in the maximum of gestures, while Windows went the way of simplicity. I can't wait for the day when Satya Nadella says "The problem with Apple guys is, they have no taste"…


It's not really evident you've actually held an iPhone or operated a Mac based on how strangely off your complaints are. Gesture and hot corner features and complexity hasn't really changed in over a decade. Getting to the home screen on any iOS device is still just one click or swipe and has never necessitated multiple clicks or allowed apps to override the OS level gestures.


> Starting around Win8, the designers had full control, and most crucially essentially none of the designers use Windows.

What happened to the culture of "dogfooding" that supposedly was a big thing at Microsoft, and often pointed as a contributing factor to its success?


I just want to say thanks, this is why I love HN. I'm a fairly new user, and the amount of "inside people" that show up in threads really blows my mind.


Can somebody forward this feedback to Mr. Bill Gates?

Not sure if he cares that his legacy goes down the corporate drain, becoming a failure.


Does that mean they're not doing usability studies anymore?


Like with every test the procedure is key to good results. They might be doing them only to confirm their actions.


>The Windows 11 taskbar is an annoying step backward

Hmm, somebody noticed.


> This has been the case for a while. I worked on the Windows Desktop Experience Team from Win7-Win10. Starting around Win8, the designers had full control, and most crucially essentially none of the designers use Windows.

The idea that designers circa Win7 "ruined" Windows UX/UI is just revisionist history. Windows has had pretty terrible UI/UX basically forever, and most Linux folks have known this since the late '90s, when KDE, GNOME, etc. were vastly superior to anything Microsoft was putting out. In fact, most ideas you see in macOS these days can find their roots in early (and often experimental) Linux interfaces (from multiple desktops, to launchers, to Exposé).

The Windows design team was always terrible, and not to mention lazy[1]. Don't mean to throw shade on your ex-team, and I'm sure there's dozens of brilliant people working there, but the design has been an absolute dumpster fire for decades. And this comes from an almost exclusive Windows power user. (I just recently bought my first ever Mac laptop).

[1] https://www.reddit.com/r/Windows10/comments/grhjuu/til_that_...


Ermm...no. I would say Microsoft's UI team was absolutely stellar and basically the gold-standard in the world during Windows 95 days.

I think you need to open your mind a little bit and read: https://socket3.wordpress.com/2018/02/03/designing-windows-9...

This is debatable, but personally, 2009's Windows 7 UX/UI is still better than the 2022 versions of GNOME, KDE, XFCE, what have you, in the Linux world. If anything, the complete opposite of what you're saying is true. Linux UX/UI is, let's just politely put it -- OK.


win7 was aesthetically awful, but usability was almost on point

but KDE is slightly better, at least in the aspects I care about. Of course it looks disgusting, but you don't need third-party tools to disable taskbar grouping, and that's a big thing


> win7 was aesthetically awful, but usability was almost on point

A bunch of clicks to get to the Control Panel, but once there just a few to get to the GUI configuration dialogue box, and somewhere in that one or two more to enable "Windows Classic" style, and you had the good old W95/NT4/98/2K aesthetics back -- and all the functionality of W7. (And then you could spend a day or two getting colours and fonts and GUI-element sizes juuuust... so, if you wanted to.)


I agree, Windows 7 wasn't as good as Win95/98. Win95 I think is just second to none in UI design. Perhaps, Cinema4D UI is a contender but that's an application. If you haven't checked, totally checkout C4D :-).


To each their own. Linux DMs (including KDE, Gnome, and many others) are still way behind in basic UX flows like multimon and snapping windows, and Mac has barely even tried. Keyboard shortcuts for moving windows between monitors, snapping, launching pinned apps, etc. are also sorely lacking on alternatives. I use Linux (Gnome 3 these days), OS X, Windows, and Chrome OS on a daily basis and still prefer Windows for my personal devices.


To each their own indeed. For me, as someone who uses KDE/Plasma by choice and Windows because sometimes I have to, it has been a source of humor for years and years how an environment literally called Windows can have such hilariously bad window management. No point to focus, no configurable window shortcuts, no send to front/back, no minimizing a window when its application is "busy", bad and inconsistent visual indication of which window has focus...


KDE has had 95% of the things you've mentioned for at least 5 years, if not longer (I've been using KDE for more than a decade, so I don't remember exact dates).

I clearly remember that when Windows introduced auto-snapping/auto-scaling windows, I looked and giggled as yet another KDE feature made its way to one of the "bigger" environments.


What? I can't speak about GNOME and other DMs, but the KDE experience with multi-monitor is better. It simply works out of the box. I don't get monitors swapped randomly. I can designate my primary screen without issues. Also, snapping windows was a feature in Linux DMs like decades ago!


Multi-monitor comfort on Linux is a relatively new thing. In the olden days, it was flat-out painful.

It works really well these days, but it was not like that until recently. I remember setting up xorg.conf just to get my single monitor working as it should.


Ha. I still had to do it yesterday because the PPI of my screen is 141 and so Nvidia set it to that and KDE half followed along making everything have larger title bars and fonts even at 100% desktop scaling. I had to set it to ignore the monitor's true ppi and just use 96.


What's "recently" for you? The last time I had to write Xorg modelines must have been before Ubuntu 6.06


5-6 years.

Yes, auto-configuring X.org is older than that, but drivers, multi-monitor setups, and other related things came later than that.

Even if X had proper support for multiple monitors, drivers were very finicky at the beginning.


weird, I wonder what distro you were using. Even 5-6 years ago, multi monitors just worked for me - hell, I had a retina screen at that time and it pretty much just worked too.


Editing xorg.conf became unnecessary around 2003 iirc. You either have a strange set up or a very good memory.


I've been using Linux for ~20 years at this point. I remember fine-tuning X.org later than that to tweak the behavior of ATI and nVidia drivers back in the day, around 2007-2008 IIRC.

While most of that work vanished for single-monitor setups around that time frame, multi-monitor configuration wasn't very reliable or deterministic until xrandr came around and drivers added native support for it.

*: randr came around 2003: https://cgit.freedesktop.org/xorg/xserver/log/randr/randr.c?...


I still had Ubuntu upgrades requiring me to edit the xorg.conf for a bog standard single CRT in about 2007 or so.

And I definitely touched xorg.conf a bunch of times even a few years later.


Jesus, those days of manual modeline hacking, all because I could get an enormous 42 kg (had to carry it 6 stories up and later down) 21" fixed-frequency monitor from an old workstation for small money...


You've clearly not used KDE in about 10 years if you think it doesn't have excellent snapping, multi-monitor, and multi-desktop support. Windows without PowerToys is at parity with maybe KDE 4 in this area.


> Keyboard shortcuts for moving windows between monitors, snapping, launching pinned apps, etc. are also sorely lacking.

Allow me to show you this tiling window manager…


>most Linux folks have known this since the early 90s, where KDE, GNOME, etc. were vastly superior to anything Microsoft was putting out.

Saying this while accusing others of revisionist history has got to be satire?

Early 90s Linux desktop? From what I can tell, KDE/GNOME started development in '97.


> From what I can tell KDE/gnome started development in 97.

KDE started in 1996, but I remember running one of the first alphas in 1997 (Alpha 2?) and being relieved that Linux finally had something that somewhat resembled the Windows 95 UI. Before that people were primarily running twm or fvwm (CDE was still closed-source), which was fine for Unix-heads, but nowhere close to the gold standard that Windows 95, NT 4, or classic Mac OS set.

(Ps. fvwm95 was released pretty soon after Windows 95, but only had the look of Windows 95, not really the usability.)


The idea that designers circa Win7 "ruined" windows UX/UI is just revisionist history. Windows has had pretty terrible UI/UX basically forever, and most Linux folks have known this since the early 90s, where KDE, GNOME, etc. were vastly superior to anything Microsoft was putting out.

Sorry, but talking about revisionist history. KDE had their first stable release in 1998, years after Windows 95. KDE was inspired by CDE, but almost equally by Windows 95 (see e.g. the launcher in early KDE versions). Before KDE, we were mostly using fvwm, twm, etc., which had extremely bad usability for non-expert users. The GNOME project was started after KDE, because some folks were unhappy that KDE used Qt, which didn't have a free software license at the time.

In fact, most ideas you see in MacOS these days can find their roots in early (and often experimental) Linux interfaces (from multiple desktops, to launchers, to Exposé).

Well, that's like saying "most ideas for the iPhone have their roots in...". Sure, there may have been environments that gave some overview of applications. But Exposé set the gold standard for this feature, and it relied very heavily on GPU-accelerated compositing (Quartz Extreme), something Linux desktop environments and Windows only got years later. Quickly after Exposé was added to Panther, you saw folks in the Linux world trying to copy the feature to KDE and GNOME. See e.g.:

https://store.kde.org/p/1220207

(Also see how clunky and ugly it looked compared to OS X Exposé. Since the icons all have different resolutions, some are sharp, others are very pixelated. The design is quite ugly, the Kopete window is cut in half. Etc.)

The Linux development has largely been reactive, even to this day. E.g. once hamburger menus started popping up in Google's design language, GNOME started to copy this idea. Thanks to which GNOME doesn't have application menus anymore (which is absolutely disastrous for desktops.)

> most ideas you see in

I think this also expresses an issue with the Linux-approach to the desktop (by and large). In the end it is not about features, ideas, or being there first. It is about forming a consistent and coherent interface. Linux desktops fail here in even basic things that have been solved in other desktops for decades, such as consistent keyboard shortcuts between applications.


I absolutely LOVE the Windows'98 UI : https://guidebookgallery.org/screenshots/win98


I always found the gradients of the title bar colours a little cheesy.


Windows 7 with the dock, searchable launcher and hotkeys for placing windows was when Windows became quite usable. I’ve never heard people complaining that it was ruined, just moaning about having to relearn stuff. But that is inevitable. We would still be using stone tools, if we had listened to these people in the Neolithic.


> In fact, most ideas you see in MacOS these days can find their roots in early (and often experimental) Linux interfaces (from multiple desktops, to launchers, to Exposé).

I am pretty sure it was the other way around.


Not at all. Multiple workspaces was common in Linux way before it was a thing in windows or Mac. And Compiz really helped in bringing new fancy effects to the desktop.


Multiple desktops were on Unix first. But Quartz Extreme preceded Compiz by three years:

https://en.wikipedia.org/wiki/Quartz_Compositor

Also, Exposé was introduced in 2003 before Linux counterparts (and requires a good compositor to do it well). In fact, there was a rush to write KDE/GNOME plugins after Panther was introduced. Sadly, most were pretty janky/bad until Compiz was introduced.

Spotlight (launcher/search) was introduced with Tiger in 2005, also much before Linux desktops started introducing it. Only after that, desktop search (e.g. Beagle) and search-based launchers like GNOME Do started showing up.


And it's not just Windows proper, but also things like PowerToys.

In every version of Windows since at least 2.0, Alt+Space has worked to bring up the system menu for the current window. This is the same menu you get if you click the app icon in the top left corner.

Once in a while, this has been a lifesaver for me when a window gets stuck offscreen with my multiple monitor configuration. It is a bit tricky, but even if you can't see the window, you can press Alt+Space to open the system menu, then M to select the Move menu item, then press any arrow key, and finally you can use the mouse to move the window around.

Yes, quite kludgy, but this has worked on every version of Windows for decades.
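For anyone who'd rather script the rescue than fumble through the system menu blind, the same fix can be sketched in a few lines. This is a hedged illustration, not anything Windows ships: `clamp_to_screen` is a hypothetical helper name, and the Win32 calls assume a single primary monitor anchored at the origin.

```python
import sys

def clamp_to_screen(left, top, width, height, screen_w, screen_h):
    """Pick a new (left, top) that pulls a window fully back on-screen."""
    new_left = max(0, min(left, screen_w - width))
    new_top = max(0, min(top, screen_h - height))
    return new_left, new_top

if sys.platform == "win32":
    import ctypes
    from ctypes import wintypes

    user32 = ctypes.windll.user32
    hwnd = user32.GetForegroundWindow()       # the stuck window, if focused
    rect = wintypes.RECT()
    user32.GetWindowRect(hwnd, ctypes.byref(rect))
    screen_w = user32.GetSystemMetrics(0)     # SM_CXSCREEN
    screen_h = user32.GetSystemMetrics(1)     # SM_CYSCREEN
    w, h = rect.right - rect.left, rect.bottom - rect.top
    x, y = clamp_to_screen(rect.left, rect.top, w, h, screen_w, screen_h)
    user32.MoveWindow(hwnd, x, y, w, h, True) # repaint after moving
```

You'd Alt+Tab to the lost window first so `GetForegroundWindow` picks it up; the Alt+Space, M, arrow-key trick is essentially the built-in, no-code version of exactly this.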

The PowerToys team decided to take over Alt+Space for their own Mac-style launcher, without even warning you at install time that they are disabling a fundamental Windows keyboard shortcut.

This caused me loss of data when I tried to use that trick for a window that got stuck offscreen. I forgot that I'd installed PowerToys a while back, and Alt+Space didn't work to rescue my window!

There have been multiple threads and GitHub issues raised on this, but the PowerToys team has seemed deaf to the problem. They closed one of the issues with a comment like "our metrics show that very few people use Alt+Space for its old purpose, so we felt OK about reassigning it."

I was like, "Your m-----f---ing metrics told you that? Your metrics just caused me loss of data! Do I need to go get Raymond Chen to explain to you the value of backward compatibility?" But I didn't post that, as there is no hope of getting through to people who think metrics should tell them what to do.

Let's say there are a billion Windows users. And only a tenth of one percent know about Alt+Space to open the system menu. No big deal, that's only a million users you have just inconvenienced or caused to lose data.

Please forgive the rant, but I just don't understand that line of thinking.


Reminds me how Mozilla removes various things from Firefox such as [Compact density](https://www.pcmag.com/news/firefox-redesign-will-see-compact...) because their telemetry tells them nobody uses them. And oh boy the backlash from the community...


> Reminds me how Mozilla removes various things from Firefox...because their telemetry tells them nobody uses them

God I hate telemetry. I usually turn it off (and I imagine many people do) because I don't like my stuff phoning back to the mothership, but then some brainless moron mindlessly uses their "data" for decision-making and rips out good features.

This is especially dumb for Mozilla. Who do they think still uses Firefox? The people who want some dumbed down minimalist lookalike? Their overall browser market-share isn't too much more than Samsung Internet, and probably the only reason anyone still uses it anymore is that it's not as dumbed down as Chrome. Maybe they should just run with being the browser for power users, with all kinds of interesting niche features and customization.

Honestly, UX telemetry probably should be banned for everything except error reporting. Nothing good seems to come of it. If you want to know what people do with your stuff, just try to empathize, and maybe have some focus groups.


You could argue that the majority of telemetry is produced by the most compliant subset of the users. Critical or privacy-savvy users will probably reject telemetry when asked. These users will accept all the bad UX ideas.


In other words, the data is just acting as confirmation bias.


> Maybe they should just run with being the browser for power users, with all kinds of interesting niche features and customization.

The ship has long sailed after they killed their addon ecosystem by killing XUL addons. The quality and feature-sets of addons took as much a nosedive as their user-share.

Since then, they were keen on continuing this trajectory, and introduced multiple UX/UI regressions that alienated all (most?) of their power users, sometimes even disallowing fixes through about:config.

That .gov analytics site that was posted here yesterday showed Firefox sitting at 2.4%, which is just about right given the string of horrible decisions they've made.

The horse is dead, Jim. And, unlike an old movie or a videogame, almost no one is going to have nostalgia feelings for a web browser.


> The horse is dead, Jim. And, unlike an old movie or a videogame, almost no one is going to have nostalgia feelings for a web browser.

I have more nostalgia for old Firefox [EDIT:] and for the old W95/NT4/98/2K GUI [/EDIT] than for any game or movie.


This is why power users and developers like us should be running with telemetry enabled, so that our actions, the features we use, and the way we use them get accounted for and stop being removed or nerfed into uselessness.

Yes I know there is talk about privacy and security issues with telemetry, and I don't necessarily like it myself, but if we don't vote with our data then the developers of these systems won't recognise what we use and we will lose it.

Probably not the most popular view around here but if companies use these metrics to decide what is important then maybe we should contribute some data.


Compact density, bookmark descriptions, rss support... They've removed so many "niche" features that I decided to try Chrome and found the only thing I miss are multi-account containers. I used a container for personal stuff and another one for work, and I've switched to using Chrome for work and Firefox for personal stuff so I don't even use that anymore, but I hope they don't remove it.


Mozilla removed add-on support from Firefox Android for non-approved add-ons. So now 90% of the add-ons I used to use are gone.

There is a workaround using custom AMO collections, but it's annoying to have a feature removed.


The initial versions of PowerToys Run used the old WindowWalker default shortcut of Win+Ctrl, which I found to be fantastic. Then they rewrote the whole hotkey detection code which made that hotkey combination nonfunctional.

Now I use Win+Ctrl+Space but it doesn't work at all if certain apps are focused, ironically VSCode and Windows Terminal, so you have to enable some "compatibility mode" that also doesn't work properly sometimes with those apps in focus. The PowerToys Run window pops up but it isn't always focused.

Might have to fork the damn thing and reinstate the old hotkey code :/


> I tried to use that trick for a window that got stuck offscreen.

Aside to your larger point, but WIN+SHIFT+DOWN ARROW (restore), WIN+SHIFT+LEFT or RIGHT ARROW (move monitor) are an alternative go-to for saving off-screen windows.

You can also reconfigure Powertoys Run to use a different key-combination than ALT-Space.


I didn’t expect to see a Powertoys rant in this thread. Seemed like one of the few teams at Microsoft that’s clearly power-user-focused and eats its own dogfood. Has made my life easier in several ways. Very difficult to live without once you become accustomed to it, has a number of features that should be shipped with Windows but aren’t for some reason.


It’s an optional power user add in you’re complaining about not a system wide change. Just disable that feature if you don’t like it.


I was describing how this metrics-driven style of development infected not just the core Windows team, but peripheral teams like the PowerToys group.

PowerToys used to be a rather benign system enhancement. It didn't take away any features I already relied on, it just added new features.

So when I saw that there was a new PowerToys for Windows 10, I installed it. I figured it wouldn't get in the way and I would discover its enhancements over time.

I didn't know at the time that it would disable a basic system keyboard shortcut that I had relied on for decades.


Couldn’t you just open PowerToys and disable the feature though, or does that not restore the other shortcut?


Of course I could, and I did, eventually.

The problem was that in the moment, I did not understand why my Alt+Space shortcut, which had always worked before, brought up this weird launcher window instead of the standard system menu I was expecting. After fiddling around for some time, I gave up, rebooted, and lost data.

Once I learned what was going on, I switched the shortcut in PowerToys.

That does not excuse the PowerToys team for disabling a fundamental Windows shortcut key combo.


The very existence of PowerToys is a testament to Windows's mismanagement. They wrapped search around Wox, an app that uses Everything, to appease users who were unhappy with Windows Search. Microsoft declined to use Everything to power their own search, by the way. Most PowerToys suggestions on their GitHub are just Windows suggestions and bug fixes.


Mine's set to Win+Space.

Finally a use for the windows key! (on top of win+shift+arrows)


They did?!? Please excuse me a sec, while I go uninstall that shit...


Old UIs were like woodshops, with brightly-colored tools stored in a series of labelled tool chests. Everything got reorganized once every other year, which was annoying but understandable.

New UIs are like shoe stores, with off-white shoes stored in unlabelled off-white shoeboxes. Everything is shuffled around randomly once per quarter, which is exhausting and not understandable.

Why did this happen?


> Why did this happen?

UI/UX people need to make their mark to justify their salary, that's why.


Uhh, I don't know if UX plays a part here. The new design really ignores UX. I guess it must be UI & marketing people.


My guess is that marketing roles just pay a whole lot better than anything else in society today, thanks to the extreme focus on the Web. It is plain stupid to seek a career in actual hard engineering when just consuming alcohol and socializing in suits lands you mid six figures. People criticized us for spending the finest of a generation on extracting clicks just a few years ago, but it's not even that anymore.


"UX" is mostly just the new "sexy" term for UI. Look how many UI designers there used to be; now they're (almost) all "UX designers".


To be fair, classic Windows had its share of poor UI choices. Remember those days of a system tray with 25 icons in it?


One that seems a lot more core to “old style” Windows to me was the labyrinthine dialog tunnels one had to traverse to change some settings, which persisted through XP and didn’t really start getting better until Vista/7. Half the secret to being “good with computers” back then was just remembering which branch of the cave system you needed to take to toggle a particular checkbox.


For good or for bad, we got used to the labyrinthine dialog tunnels.

It's no different than building up the option flags in a bash or powershell one liner. It's the direct graphical translation even.


Probably written by somebody who used their computer to play D&D all day. You are in a maze of twisty passages, all the same...


That's Zork, not D&D.


Oops. I guess that shows to go, I don't play computer games...


11 is better but there's still a few of these tunnel vestiges that show up, like removing a driver from a hardware device or adding NTP servers.


Applications have to explicitly add themselves to the tray, while a button in the taskbar is provided by default (there are some conditions on window styles and such which I won't bother to go into here).

The Win11 taskbar is like having only a systray with 25 icons in it.


I do, but I'd say that was an okay design and that bloatware bundlers abused it. It was simple to use and made a lot of sense for your "always on" applications like AIM, but definitely didn't need to be occupied by an icon for your printer drivers.


That sounds more like a problem with all the random apps you install than a "Windows" issue. Microsoft can take steps to mitigate it (e.g. by hiding system tray icons by default), but calling it a "poor UI choice" on their part doesn't make much sense. It's like blaming Firefox for allowing sites with annoying banners and not having an adblocker by default.


Am I supposed to practice driver minimalism to prevent a slew of tray icons from occupying my screen?

It is poor design to have that be the form in which quick access to settings is provided. The tray-icon paradigm simply invites clutter.


> Am I supposed to practice driver minimalism to prevent a slew of tray icons from occupying my screen?

The taskbar icons are added by programs that are running. Disabling autorun for them prevents them from starting, and therefore the icons from showing up. I'd say 90% of the time disabling autorun for the companion program doesn't break any functionality.


I've got the opposite problem here. I like showing all the active icons (although I also try to keep them to a minimum) and Windows 11 removed the option to always show all. Now any time you install something with a new icon, you have to go to the taskbar settings again and manually allow it to show.


or having to press the start button in the taskbar to stop/turn off your computer!


How do you turn off the engine on modern cars with a "START" button in stead of an old-fashioned metal-key ignition lock?

I used to repeat that line too, 25 years ago when I was still on Windows 3.x, but it's gotten a little old since then.


those buttons usually have both "start" and "stop" text on them


I find it's about fifty-fifty, not sure which side it tends towards. But I'm talking from what I've seen in Internet pictures and YouTube videos; I've only ever owned cars with metal-only ignition keys, and last I bought one, the buttons hadn't yet filtered down to my price class, so I can't recall if I've even test-driven one.


> It's almost like some tiny extremist faction has gained control of Windows

Specifically, this faction's name is WebXT. They're the ones responsible for adding all the ads, the forced Edge adoption, etc. And they're doing it with no oversight, because there is no longer any one person in charge of Windows; it's a pissing contest, and whoever's running that team wins because they can show short-term profits (no one seems to worry about the long-term effects of burning all the goodwill away). Such short-sightedness...


> Monitors are bigger than ever with huge resolutions, and yet UIs are being dumbed down to uselessness and alienating an increasing number of users.

> I wonder how much damage they will inflict before people start turning to WINE and a saner Linux distro, just to run their Windows applications.

The issue is, operating systems aren't something you have a whole lot of choice in; you use whatever OS goes with the software/hardware that you need to run. (It's like that classic "I need you to understand that friends don't go around recommending operating systems to one another" meme). Sure, you could try setting up Linux+WINE, but that is a lot of effort for a few taskbar improvements. Even most power users wouldn't go that far.


That's what I did about a year ago. I got sick of Windows forcing updates that take forever, dealing with garbage being installed on my computer, advertising, spying, etc. Linux UI/UX is now way better than Windows, depending on the desktop environment you choose, but it has other issues that I hope they will solve soon.

Installing a printer on Windows... you have to go through so much crap to get it running, but on Linux you just connect your printer to WiFi or to your PC and it's ready to print. No idea why Windows makes it so hard. I was also surprised to see that my Wacom tablet just works without doing anything, and its settings are in the settings panel. But GIMP's UI/UX is horrid; that's what you get when programmers try to do UX/UI.


>on Linux you just connect your printer to WIFI or to your pc and its ready to print

To be fair that is exactly what happens with my printer with Windows.


If you have a wacom tablet I heartily recommend Krita.


Yeah, I have it, but with how powerful Darktable is compared to Lightroom, I rarely need to use GIMP or anything else.


Ah yes for RAW work Darktable is my go-to software, it works great.


Or some "UX consultants" need to justify their high salaries and some managers want a promotion by giving the illusion of novelty to their superior.

There is absolutely nothing wrong with the Win10 taskbar, and MSFT didn't learn a damn thing from the Windows Metro fiasco.

Explorer and file browsers, on the other hand, are in dire need of change. For instance, I wish I could easily tag my files, no matter where they are. I shouldn't have to remember where I put this or that file on my hard drive. File search on Windows is still terrible.


It's the same thing with hardware; just look at laptops. Apple from 2005 would think we have gone too far. No dedicated volume controls, all keys must look exactly the same, keyboards must be exactly rectangular, no touchpad buttons, no LEDs to indicate battery status, the fucking power button that looks like a normal key and sits between other keys. At least the webcam shutters are coming back.


I actually have an issue with my Windows 11 right now where I have 7 (seven!) keyboard layouts to choose from, but according to windows settings only one language installed, with one keyboard layout.

It's gotten so bad that what they're enumerating in these mobile-style menus isn't from the same source the system is actually using. It's extremely annoying.

Edit: If anyone is curious about this specific issue, I solved it by adding the languages and layout Windows thought I had, just so I then could explicitly remove them again, which seems to have solved this issue... for now at least.

Edit 2: Don't get me started on the new context menu where you have to click "more" to get the rest of the options that applications can add.

Edit 3: Oh, and there's no setting for not grouping taskbar items, or always showing all systray items, in Windows 11.


Android 12 had the most bizarre UI changes. They made everything so big and spaced-out, it induced scrolling everywhere. Most infuriatingly, the default pull-down menu (with wifi, flashlight, airplane mode, etc.) became tall enough to crowd out the music player controls and induce scrolling. Who wants to scroll on such a constantly accessed part of the UI? And all in service of spacing out the button text by an absurd 10mm.

I can't imagine any justification for this other than some blind adherence to a design ideology over user testing.


> Most infuriatingly, the default pull-down menu (with wifi, flashlight, airplane mode, etc.) became tall enough to crowd out the music player controls and induce scrolling.

On many (most? all?) Android phones, you can move the icons in that menu around. (Press-and-hold, then drag, IIRC.) Takes a while to find out which ones you use and which you don't, but once you do you can then move the former before the latter.


"authoritarian minimalism"

Can we blame this on Apple?


I think we can blame it on a poor imitation of what Apple does.

However even Apple has copied the atrocious flat UI fashion that has destroyed the ability of users to decipher what can be interacted with, and how. If I have to deal with another goddamned toggle that has reinvented the wheel and in so doing destroyed the ability to decipher the current state and what toggling will cause to happen... well I'll be very unhappy and then continue to eat the dog food because I have no other choice.


Ok, so this light grey text and toggle means it's not selected, I assume? Click.

Oh, the text has now changed to the other option and the toggle is green, so that means the text is telling me what option is actually currently selected, whilst the toggle colour indicates which option the system thinks is equivalent to 'on'.

Yep, many interfaces I've come across require me to interact with the toggle to work out what the current setting is.


Many web interfaces use iOS-style toggles and somehow end up with "double negatives"(?). It's a binary decision, but they still somehow make it confusing, like you mentioned. Is the text label the result of what happens after you click, or is it showing the current state?


That's my pet peeve as well. It should read "activate feature x" not just "feature x" and then change to "deactivate feature x".


Worst one is when the button text changes. Does the new text represent the current state of the system, or does it represent the state which the system will be in when I press it?


Apple is terrible at this. How do you close all tabs in Safari on iOS? Press and hold the tabs button. God forbid there was a button when viewing tabs.


Yes. Years and years ago there was a funny chain email called 'if your car was an operating system' about how different operating systems would make a simple trip to the store unnecessarily difficult. The Mac entry was 'You get in your car to drive to the store, but it takes you to church instead.'

I've had an antipathy to Apple gear over this since the very first tech job where I worked with Mac SEs, because when they'd crash they'd display this amusing bomb graphic and an utterly unhelpful numeric code. There was no way to explain what had gone wrong to the person using the machine (who had just lost a pile of work) and more often than not the more experienced people in the support department would just shrug and reinstall the operating system.

This Fisher-Price toy mentality now pervades personal computing and is making problem solving itself obsolete. Many people just don't have the patience or inclination to figure something out when they could just reset it.


I'm not sure. Yes, latest macOS versions do have their share of misguided decisions, but at least most of the time it still feels like it's built by people knowing what they're doing and what their users need. It feels like there's some sort of contention going on between those who want shitloads of whitespace and other attributes of "UI like an art piece", and those who want their users to have great, empowering, and easy to use tools.


I want to throw in Tesla too. The minimal, buttonless dash is as dumb as the Windows 11 UI.


But it sells. And I agree by the way.


You would, with that moniker. Vroom, vroom!


Yes, the most valuable car company in the world. People copy everything, including the really bad ideas.

Just see how car makers have copied really bad stuff from Tesla, like touch-screen-only controls. Some are thankfully going back to tactile switches as they realized the safety concerns.


> Can we blame this on Apple?

We can always blame everything on Apple.

Quite a lot of the time, we'll even be right.


Your words remind me this is happening everywhere.

Relentless focus on minimalism but they don't know where to stop.

Apple is doing it.

So is Tesla. Their cars have fewer and fewer dedicated controls, removing control stalks and putting critical functions on a touchscreen (for Pete's sake).


Sure, for power users and people who have used the same UI for over a decade, the changes do not seem to make any sense. But that's why there are options to re-enable the legacy UI, as seen in TFA.

I suggest you give a laptop to a normal person who doesn't spend time on the computer and see how they use it. Give them a task that doesn't involve using the browser. I'm pretty sure you'll be surprised how easily they get lost in the UIs. These "minimalist" UIs actually do have a use for people who struggle with basic usage of their OS outside the browser.


> The strange trend of "authoritarian minimalism" design that seems to be working its way through the majority of newer software is very strange.

A lesson Microsoft could stand to relearn is from Windows XP, where the new grating Fisher-Price UI could easily be turned off to return to the sane defaults of Windows 2000.


The Windows 10/11 debacle... I only use Windows 10 to play Fortnite, and only because Epic doesn't release Fortnite and the Epic Launcher for Linux. Nearly every game that I play runs native on Linux or runs just fine under Wine/Proton. If I could, I would NEVER touch the dumpster fire that is Windows 11.

A sane GNU/Linux distro with KDE Plasma has a user experience that is far superior to Windows. It simply works out of the box, doesn't spam you, and doesn't spy on you. And it offers a familiar UI that the user can customize completely.


I don't think this is minimalism though; it seems somewhat informed by "a picture is worth a thousand words" (hence an icon suffices). Let's not kid ourselves, designers just need to justify getting paid.

A programmer suggesting a "refactoring" of the whole codebase is often frowned upon (yet it happens). On the design side it feels like the opposite: if a designer doesn't come up with some sort of overhaul every year, they, or their boss, will feel insecure or something.


> Monitors are bigger than ever with huge resolutions, and yet UIs are being dumbed down to uselessness and alienating an increasing number of users.

Looking at the screen resolution data of visitors to the web apps I’m working on, I’m still surprised by how many people are on 1024x768 screens and even smaller (often more than 50%). Don’t forget there’s a big crowd of people on small and cheap laptops in the developing world. New OSes should also serve this group.


>I’m still surprised by how many people are on 1024x768 screens and even smaller (often more than 50%)

I was also surprised when you said that, but after a few seconds of thought outside of my tech bubble where everyone has 4K/retina screens, I remembered that my parents, and my GF and her parents, all still use their 10+ year old PCs/laptops with low-res screens, so now I am not surprised anymore.


True. But the reason W11 UI is a horrible mess is not precisely to make the experience better for people with low res laptops.


> Don’t forget there’s a big crowd of people on small and cheap laptops in the developing world.

In the developed world too, I'd think. It's not like Grandma -- or the kids / grandkids buying her a Christmas present -- are going to splurge more than a few hundred bucks on a laptop for her to Facebook with.


>Monitors are bigger than ever with huge resolutions, and yet UIs are being dumbed down to uselessness and alienating an increasing number of users.

This! so much this!

Never have we had so much screen real estate, yet UI designers seem to want to compress the actionable items into as small a space as possible, with the stated goal of providing as much 'content' space as possible. The result is swathes of pointless unused whitespace, or overly large fonts to fill the space if anybody uses the app maximized.

Currently on Chrome, the tabs occupy what was once the title bar (why??!), and it's the same in Office apps, where the save and search functions take up space on the title bar. Please, I'm an MDI guy; I hardly ever maximize applications, and since my monitor(s) are so large, I like having an easy-to-click title bar to select my focus app or to move the window somewhere. I've been doing this since the GEM days, and that's how I like it!


I would guess that the designers have nice clean desks as well, so all that whitespace is giving the illusion of cleanliness.

I also imagine that those designers would maximise a window where there would only be a single page in the middle of the screen and whitespace all around.

Looks "nice" from a minimalist standpoint but a complete waste of space otherwise.


Gotta make use of those full-time UI people somehow... sigh.


It's always every other release. Then the next one is a massive 180, because all the power users (server admins, software developers, and other people who actually might affect MS market share) absolutely loathed that release.

We have seen this happen with Vista and Windows 8, and it will happen again with whatever comes after Windows 11. Microsoft does eventually listen to users, but unfortunately only after it's too late for release N. I still can't explain how things manage to get that bad before the correction happens.


I loved Vista. The difference between me and all the people who hated it is that I waited about 1.5 years after release before installing it. It ran flawlessly on my PC, and to this day I have no bad word to say about it.

I'm pretty sure Windows 11 will be mostly OK next year. What's the rush? It's not like Windows 10 is broken.

If it wasn't for telemetry I'd still be on Windows now.


If I remember correctly, Chrome started this minimalism sh*t. It kinda made sense back then, when most people, like me, only had a single 17" CRT at 1024x768 and screen real estate was quite limited. But now things have changed a lot. Most people have at least one 24" LCD at 1920x1080; I myself have two 27" 4K ones. Screen real estate is no longer a problem, yet these ridiculous trends are not going away but getting worse and worse. Hamburger menus are everywhere, and tabs come with huge icons/ribbons but still no place for menus, even when there is more than enough space. The result is that even with sufficient space, users still need at least one extra click to do what we could easily do in the past.

Making it worse, this BS is not limited to software but has also spread to hardware design. I bought a TV recently; in general I'm happy with the image quality etc., but the remote drove me nuts: it removed the video source button along with a whole bunch of other buttons. Now I have to press at least 5 times to get to the video source... and such a disaster is called optimization...

And you were right: I'm on the verge of ditching Windows, but unfortunately, there are some things that cannot run on Linux yet...


Or an eccentric billionaire to fund SerenityOS.


There is interest in getting WINE to run on it: https://github.com/SerenityOS/serenity/issues/6410

Compatibility is definitely the one big thing SerenityOS is missing. A nontrivial effort for sure, but if it can run Linux and Windows binaries, you are going to see much more adoption.


But then you've basically just made ReactOS again?


ReactOS is a kernel-level implementation of Windows NT. Serenity is a Unix-like.


For the kind of people who read Hacker News, this does sound like a significant distinction. For everyone else it's a mere implementation detail.


It looks like Mac OS's dock, which is IMHO the best way to present a taskbar. Good to see this in Linux distros as well, such as Elementary OS (which looks a bit ugly, but is getting there).

Minimalism is about getting out of people's way. Many extras are never used by most people and shouldn't be visible, or they are annoying. If you want those extras there should be options to enable them or make them visible. Maybe even specialized software. But don't force every obscure feature onto my desktop!


>Minimalism is about getting out of people's way.

Nothing says minimalism like shoving an 80-pixel dock in people's faces. :)) Here is my current taskbar, https://i.imgur.com/GkAfOMx.png ; this is minimalist.


Yes, but on macOS the Dock can still be put to the side of the screen, it supports drag and drop and right-clicking, etc.

Windows 11 broke most of that, which is the most common complaints I have (and hear). Windows’ SnapAssist is _amazing_, and I miss it on the Mac, but I am not happy with what was done to the taskbar.

(I use both OSes and Linux, so I am OK with change, just not losing flexibility)


>I wonder who actually wants this stuff

The same people who want the authoritarian minimalism buildings: their designers, who get to satisfy the design fashion, and show off to each other.


>before people start turning to WINE and a saner Linux distro, just to run their Windows applications

I jumped to Linux when Windows 11 was announced. I tried it (W11) in a VM and it convinced me that things are only going to get worse.

Must say that I couldn't be happier with the move. I still get to run the vast majority of my software via WINE (Affinity products need to up their game, though). Games are much better supported nowadays as well, so it's all good, and no Windows 11.


Just give me the Windows XP UI


Not sure I follow. What's the relation between monitor size and UI?


It is the Zeitgeist. "We know what is better for you, peasant; we will create this cozy little garden and you will never leave it, and if you complain you will be expelled, like Adam and Eve." Or, like the Xbox guy said recently, "If you are banned from one online service you should be automatically banned from all of them."


it's not "authoritarian minimalism"

it is laziness from microsoft



