Hacker News

I don't see why Bluetooth took off and WUSB didn't. It must have something to do with marketing.



Bluetooth had some early success in cellphones, mostly to support Bluetooth headsets and car radio integration, starting from about 1999. It could do other things, but the wireless headset was the killer app in its early days.

Bluetooth didn’t really hit mainstream until the arrival of chipsets that multiplexed Bluetooth and WiFi on the same radio+antenna. My memory is that happened sometime around 2007-2010.

At that point, the BOM cost of adding Bluetooth to a laptop or smart device became essentially zero, so why not include it? Modern smartphones with both Bluetooth and WiFi arrived at around the same time (I suspect these combo chipsets were originally developed for handheld devices, and laptops benefited).

And once Bluetooth was mainstream, we saw a steady rise in devices using Bluetooth.

WUSB operates on a completely different set of frequencies and technology and couldn’t share hardware with WiFi. Maybe it could have taken off if there was a killer app, but there never was.


> the wireless headset was the killer app in its early days

Don't forget music piracy.

At least over here, a lot of kids had phones that did Bluetooth, and the primary use case for it was sharing songs they liked with each other. You could use infrared (IRDA) for that, and some people did before Bluetooth was common, but it was much slower.

This was mostly on low-end Nokias, maybe with a bit of Sony Ericsson thrown into the mix. They definitely did not have WiFi; in fact, Nokia even tried to limit internet over Bluetooth for the usual carrier monopoly reasons, as far as I'm aware, but Bluetooth was definitely there.

For many here, the iPhone not doing file and ringtone sharing over Bluetooth was one of its main limitations, at least early on. It was a social network in its own way, and having a device that couldn't participate in it was no fun.


By "early days", I was more thinking about the 1999-2005 era, before low-end Nokias even got Bluetooth and the ability to play MP3s.

The wireless headset was the killer app that drove Bluetooth adoption within cellphones, driving down costs until eventually the lower-end models received it too. While sharing files was possible in the 1999-2005 era (especially with PDAs), most phones lacked enough flash storage to store anything worthwhile.

While I don't want to say file sharing wasn't a killer app, it does seem to have been limited to just schools during a certain time period.

A time period that I missed out on by a few years. At high school, we did all our file sharing by swapping burned CDs. Then we switched to dragging around laptops and USB hard drives at university (and using the private emule network on the university wired ethernet).


It may be worth clarifying that the Bluetooth headset here was specifically the little one-ear clip headset that executives and IT staff seemed to use to answer calls.

Remember companies like jawbone?

I vaguely remember a cultural stereotype of BMW drivers driving aggressively and wearing Bluetooth headsets. [edit: this is the clip https://youtu.be/UqfAMvXpSw4?t=25 from Top Gear of Jeremy Clarkson wearing a Bluetooth headset and sunglasses in a BMW, supposedly from Top Gear season 10, episode 10]


Yes... there is a very interesting generational thing going on here.

Bluetooth headsets were very popular among a certain market segment (business people who made a lot of phone calls), but saw very little adoption outside of that. At that time, you often had to buy an expensive business grade phone to get bluetooth functionality.

Then once Bluetooth was common in cheaper phones, we see a completely different market segment (students at schools) rapidly adopting Bluetooth for a completely different use case (file sharing). It's hard to find two market segments that are more isolated from each other.

I don't think file sharing could have ever driven Bluetooth to mass adoption on its own, partly because companies always overlook what school students are doing with technology, but mostly because the file sharing use case required mass deployment of the technology before it could take off.

When I was in high school, I had a Palm PDA with IrDA that could do file sharing. But did I ever use it for file sharing? No, because nobody else had devices with IrDA. IrDA never hit the market saturation it needed to be actually useful, so there never was much demand for it (despite the hardware being really cheap, especially compared to a dedicated Bluetooth radio).

Bluetooth headsets worked as a killer app in those early days, because a single BMW driver could buy both a high-end phone with bluetooth and a headset from the cellphone store and get the complete experience. It worked without market saturation.


Jabra! In all seriousness the blackberry Bluetooth earpiece was the killer app (hardware / device).

BlackBerry HS-655 Bluetooth Headset

Worked perfectly paired with a bb.


Wow, this unlocked a bunch of memories from middle school, where we would send each other the latest songs and games via Bluetooth. I remember pirating games for my Sony Ericsson and sharing them with my friends, and we would play these games in class. You could just share and install the .jar files. Good times.


> Don't forget music piracy.

What you describe is file sharing, not necessarily piracy :-). Just nitpicking, I understand what you mean of course!


At this point, the decision to add Bluetooth or not is literally just a product decision. If you don't want Bluetooth in your product, you actively have to disable the Bluetooth part of your WiFi chip, because you can't really get a WiFi chip without Bluetooth.


All of the BT+wifi chips I’ve worked with require active initialization of each feature; you have to signal the chip (over i2c or SPI or whatever) with what kind of BT operations you want.

No BT stack in your product, no BT radio initialization, no BT/wifi multiplexing. At least in the (admittedly limited) chips I’ve worked with.
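The gating described above can be sketched in code. This is a hypothetical model, not a real driver API: the point is simply that each radio block on a combo chip stays dark unless the host actually registers a stack for it and sends the init commands over the control bus.

```python
# Hypothetical sketch (illustrative names, not a real chip API): a combo
# WiFi+BT chip driver that only brings up a radio block when the host
# registers a stack for that feature.

class ComboChip:
    def __init__(self):
        self.initialized = set()

    def init_feature(self, feature, stack):
        """Send the init sequence for one radio block (over I2C/SPI in real
        hardware) only when a host stack actually asks for it."""
        if stack is None:
            return False  # no stack in the product -> radio block stays dark
        # ... a real driver would write init registers / load firmware here ...
        self.initialized.add(feature)
        return True

chip = ComboChip()
chip.init_feature("wifi", stack="wpa_supplicant")
chip.init_feature("bt", stack=None)   # product ships without a BT stack
print(sorted(chip.initialized))       # only the WiFi block came up
```

So "disabling Bluetooth" in such a product can be as simple as never shipping the BT stack and never issuing the BT init sequence.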


> the wireless headset was the killer app [for Bluetooth] in its early days

But the wireless headset is now a horrifying millstone making Bluetooth look like the world's stupidest trash fire. If you enable your microphone, you lose all audio from anything that doesn't want to use the microphone as the headset switches into "headset" mode and drops anything that wants to use "headphones" mode. There is no reason for there to even be two different modes.

Why is this still happening?


It is happening because it works the way that is most useful to most people. The number of people who want to use bluetooth earbuds with a different microphone is line noise in the consumer market.

Implementing special requirements is always inconvenient for users because no B2C wants to risk bad the-microphone-didn’t-work reviews, customer returns, and support tickets.

Nevermind coordinating with arbitrary USB microphone latency…I’ve got one with 250ms of it.


> It is happening because it works the way that is most useful to most people. The number of people who want to use bluetooth earbuds with a different microphone is line noise in the consumer market.

I don't think you have any idea what you're saying. The scenario I'm describing is when you want to use a bluetooth headset that includes a microphone. Using a different microphone is how you solve the problem.


I have never had a microphone problem with a bluetooth headset. They all always just work until something mechanical breaks through use.

If I had an issue, wired headphones seem a simpler solution than changing the bluetooth standard and more likely to work than wishing manufacturers changed their devices.


So, the standard defines behavior that is obviously pathological, actively working against the needs of all users. But because it's already codified, it's a bad idea to change it?


Works for my needs.

If it didn't, I'd use wires.

Or something else.


It happens because the Bluetooth profile for audio + microphone uses different codecs and has less bandwidth, since it's designed for realtime communication.

The Bluetooth audio streaming profile enables more codecs, but only for playback, and allows significantly higher latency than you would accept on a call.
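To put rough numbers on that bandwidth gap: the hands-free profile's voice link (SCO) carries about 64 kbit/s of 8 kHz mono telephone speech, while A2DP's mandatory SBC codec commonly streams around 345 kbit/s of 44.1 kHz stereo. These are typical figures, not hard spec limits:

```python
# Rough comparison of the two Bluetooth audio paths (typical values).
SCO_KBPS = 64          # HFP/HSP voice link: CVSD codec, 8 kHz mono
A2DP_SBC_KBPS = 345    # common SBC "high quality" bitrate, 44.1 kHz stereo

ratio = A2DP_SBC_KBPS / SCO_KBPS
print(f"A2DP carries roughly {ratio:.1f}x the bitrate of the HFP voice link")
```

That factor of five or so is why everything sounds like a telephone the moment the microphone is enabled.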


> and allows significantly higher latency that you wouldnt accept on a call

That latency is also a millstone. We can do much better, but the standard lags and implementations lag even more.


My personal bluetooth ear clips do something much worse than adding latency - if they're not currently playing something, and a sound is supposed to come through, they omit the beginning of the sound while they get ready to become active or whatever it is they're doing.

Just delaying the sound and playing all of it would be a big improvement.

(Though that would fail badly for watching videos. That's something that uses 'headphones' mode anyway - why is latency OK there? It isn't.

My ear clips do add some latency, a noticeable amount, when I'm watching a video with mpv, and I adjust that by altering the A/V sync setting. They don't do the same thing when I'm watching something on youtube. I'd like to know what's going on there.)


Well, you won't get any argument from me that things could be significantly better.

> why is latency OK there? It isn't.

Well... if you listen to music, you probably don't care too much, but if you're talking to someone, you do.

What's much worse is that the latency is not constant.


The absolute madness that is Bluetooth pairing between cars and cellphones is wild. If I get into my car and it decides to randomly pair with my wife's phone (who is inside the house) and I drive off, the whole infotainment system locks up and dies until I get to my destination and turn off the car.


What car do you drive? So I can make sure not to buy it.


Mine connects and starts playing silently, and you need to press play and pause on the head unit to make it make sound. Every two months or so it fails to connect to the phone and needs a complete forget-and-re-pair to happen. Toyota unit and iPhone.


My partner's hearing aid connects (via some radio protocol) to a device that then connects via bluetooth. Unfortunately, it presents itself as a headset, which causes... problems. For Android, they have to use an app from the play store that presents itself as an audio device and then sends that to the 'headset'.


Except for the "headphone" versus "headset" mode dichotomy, which is inherent to Bluetooth, all those other issues are due to stupid product decisions that most OSes make themselves, independently, in the same way.

If you use Linux + KDE, you can still use any microphone or headphone, many at the same time, or in whatever mode you want.


Linux + KDE user here.

It used to work on kde/plasma 5 at some point. And after a minor version update it stopped working.

Now the mic of my headset doesn't work because KDE insists that only the high quality sound output without mic is available. The mic + low quality output is gone from the settings.

Lucky for me, this update also brought proper handling of the stereo-positioned noise-cancelling microphones on my ThinkPad. So now I can actually enjoy the luxury of built-in microphones that work. Until the day it won't, I guess.


If you use PipeWire, it works fine in both Plasma 5 and 6.

But the audio quality in the sound + mic profile is very poor; the codec is only meant for speech.


I am using PipeWire. The option to select the headset mode is gone! I can only select the various output codecs with increasing quality, but not the mic & output mode. It's gone from the list...


Wow, brings back memories of when this used to be a problem on GNOME, like five years ago.


> If you use Linux + KDE, you can still use any microphone or headphone, many at the same time, or in whatever mode you want.

This doesn't really seem to respond to the problem. The problem is that I'd like to use a single bluetooth device that includes earpieces and a microphone. That doesn't work, because of the headphone-headset mode dichotomy. As I replied to another comment, using multiple devices would be a solution to the problem. It wouldn't be an example of the problem that I want solved.

Bluetooth is apparently incapable of simply delivering an audio stream to the earpieces while accepting one from the microphone. This is a baffling design. The assumption appears to be that there will never be more than one source of audio for output. But that's crazy.


obligatory xkcd: https://xkcd.com/2055/


Except this isn't a problem of trying to manage multiple connections. Bluetooth can't even handle maintaining a single connection between two devices.


Bluetooth was the main way to transfer music from one dumbphone to another.


The ironic thing is that Bluetooth ignored its audio use as much as possible for as long as possible. They wanted it to be used for tracking shoppers in stores...


How else would you legally talk on a phone in a car, without a headset / Bluetooth?


Back when Bluetooth was new, the alternative for wirelessly sharing data between mobile devices like phones was infrared.

IR was exceptionally slow, required line-of-sight and even at the time, felt like a shitty solution. So even though the early implementations of Bluetooth also left a lot to be desired (battery hungry, insecure, and also slow), it was still a massive improvement on what came before.

Wireless USB wasn’t a significant enough improvement to Bluetooth given that BT was already ubiquitous by that point, but also cheap and (by that point) battery efficient now too.


IR with palm devices was super nice - just point to the other device and send, then confirm on the other. No persistent pairing bullshit & you could also use it to control TVs.


That’s how BT originally worked too, but it got abused (I touched on this in my original comment when I said BT was insecure). The pairing is a security measure to protect people from abuse.

Back when BT was new, I used to get all sorts of random shit pushed onto my phone every Friday night on the drunk train home from London.


I guess that was a lot less of a problem with IRDA as it required line-of-sight, which limited the abuse potential significantly.

Some devices would even establish an IrDA connection automatically as soon as they found anything. I have friends whose laptop names suddenly appeared on lecture room projectors, because their laptop's IrDA receiver was in direct line of sight of the teacher's.

Not that you couldn't do that with Bluetooth, some early BT chipsets gave you a "<device name> wants to connect to you" dialog box any time somebody tried sending something to your device. This could be abused, to great student amusement, to display funny messages on that same projector if the lecturer's laptop had such a chipset.


I wonder why IR is slow. Shouldn’t there be plenty of bandwidth available at those frequencies?


These days, professional optical equipment, aka expensive lasers+supporting hardware, can do 10-Gbit over multiple kilometers through the air, so you're right that optical transmission through the air should be able to support higher data rates.

The problem with IrDA is that it's old. Technology has significantly advanced since the 90's, when IrDA was popular on cellphones, so a modern implementation could do better data rates even accounting for the significant interference from the environment. We barely had WiFi back then, and now it'll do a few hundred megabits per second without breaking a sweat (your ISP might, though). All the technology required to do that didn't exist in the 90's. We have Bluetooth now, though, so there's the same bootstrapping problem: you'd just use Bluetooth, and not spend a bunch of money building a system very few people are asking for, so there's little demand for a modern high-performance IrDA system in any devices.


It was harder to extract a clean signal due to ambient environmental conditions.

You could probably solve those issues with modern tech though. Things have advanced significantly since IR was popular. For example, back then Bluetooth was slow too.


Not through a tiny photodiode + amp on a spare UART RX, or a repurposed TV-remote phototransistor. Those can be slow.


I'm frankly baffled at all these reports of IR being unreliable and slow. It... wasn't. Not for the file sizes of the day. I exchanged plenty of files back in the day, even at 115200bps a picture would be 2-3 seconds tops (pictures were small!). And when devices started supporting 4Mbps, even a large-ish MP3 would go in 5-6 seconds. All without setup or pairing, beautiful. Huge files (like full resolution pictures from an SLR camera) would take a while - but frankly they took almost the same time with a cable! You'd just have to plug their memory card directly into your computer if you were in a hurry.

The only really clunky use case for me was internet access - keeping phone and laptop positioned and aligned for 30 minutes was limiting.

And yes there IS plenty of bandwidth at those frequencies. In fact latest IR standards reach 1Gbps, but it's pretty much extinct. There was an attempt called Li-Fi to use it for as a wireless networking but I don't think it went far.

What I really miss is OBEX (Object Exchange), which worked also over Bluetooth, and which Apple sadly chose not to implement: simplest protocol to just ship a file or a contact vCard over, no setup, just worked - and it's been a standard for 20+ years. Early Android had it too, it was since dropped I think. Sigh.


You’re either misremembering things or talking about an era after Bluetooth had already taken off.

In the days before Bluetooth, transferring MP3s over IR took multiple minutes, even on high end (for the time) handsets.

And the fact that you needed to keep line of sight during the whole process meant your phone couldn’t be used that whole time. Which was a real pain in the arse if you got a text message or phone call while trying to transfer a file.

IR was really more designed for swapping contacts. In fact that’s exactly how BlackBerry (or was it Palm?) marketed IR on their device: a convenient way to swap contact details. But you’re talking about a few KB vs several MBs for an MP3.

The tech has definitely moved on since. But then so has Bluetooth, WiFi and GSM et al too.


It would take me like 30 minutes to transfer 1MB.


At 9600bps. Almost every device supported 115200 - that would cut it down to 72 seconds. And as I mentioned - pictures were often small (20-30 kbytes) back in the day - that's barely 1-2 seconds at 115200. And the later 4Mbps speeds would move that megabyte in 2 seconds flat.
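The arithmetic behind those figures is easy to check. Ignoring protocol overhead (OBEX framing and the half-duplex turnaround would add some), moving 1 MiB over the common IrDA rates works out to:

```python
# Raw transfer time for 1 MiB at the common IrDA data rates,
# ignoring protocol overhead.
MIB_BITS = 1024 * 1024 * 8

for bps in (9_600, 115_200, 4_000_000):
    print(f"{bps:>9} bps -> {MIB_BITS / bps:7.1f} s")
```

That gives roughly 874 s (about 15 minutes) at 9600 bps, 73 s at 115200 bps, and 2.1 s at 4 Mbps - so the "30 minutes" figure above suggests either the 9600 bps floor plus heavy overhead, and the 72-second and 2-second claims line up with the raw math.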


You were pretty unlucky. The basic bitrate was 9.6kbps but much higher speeds were common.


Mostly just SNR issues.


My guess is just that Bluetooth and Wi-Fi came first, and when wireless USB entered the party, there wasn't a real need for it, as most of its use cases were already covered by Bluetooth and Wi-Fi.

It maybe could have worked with better marketing, but convincing potential customers to change something that works (somewhat, BT wasn't without issues) is hard. That's why we are keeping abominations like cigarette lighter sockets in cars even though they often can't even light cigarettes anymore. It is already well established and it works well enough as a power outlet.


Bluetooth took off before Wireless USB did and was already useful to people when WUSB came along. It was also lower power, so you could make peripherals that were smaller and longer-lived.


In my opinion, for computers, wireless mice (and eventually keyboards) were one of the killer apps that pushed for Bluetooth to become common in computers and laptops. "Not needing a dongle" was a huge added value compared to manufacturers' proprietary radio protocols.


Largest phone manufacturers of the time (Ericsson and Nokia) supporting and developing it surely helped.


Maybe the low-power features of Bluetooth helped it, such as the Bluetooth LE (Low Energy) spec: the way "services" and "characteristics" are defined allows for extremely low power consumption when no read operation is in progress, so the Bluetooth radio's power draw can be minimized. Also, Bluetooth's 2.4 GHz band is inherently less power-hungry than the higher frequency band of Wireless USB.

But I guess the other problems may have been, as the OP article brings up, due to WUSB being in a different frequency band. Bluetooth, on the other hand, shares the 2.4 GHz band with Wi-Fi (which is also the original Wi-Fi band), so for PC and mobile device OEMs it is one less antenna to integrate. Furthermore, the higher frequency of WUSB's band would have made through-wall and longer-distance communication less reliable. In this context, one of the few cool applications of Wireless USB would've been a wireless keyboard or mouse (but this is coming from someone who is working on an OEM computer keyboard and mouse).
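The services/characteristics structure mentioned above can be sketched as a simple lookup table. This is an illustrative model, not a real BLE stack, but the UUIDs are the actual Bluetooth SIG assigned numbers for the Battery Service; the battery reading itself is a made-up sample value:

```python
# Hedged sketch of a GATT attribute table: a BLE peripheral advertises
# services containing characteristics, and the radio only needs to wake
# when a central actually reads or subscribes to one of them.
# 0x180F / 0x2A19 are the real SIG-assigned Battery Service UUIDs.

gatt_table = {
    0x180F: {                      # Battery Service
        0x2A19: {                  # Battery Level characteristic
            "properties": ["read", "notify"],
            "value": 87,           # percent; made-up sample reading
        },
    },
}

def read_characteristic(table, service_uuid, char_uuid):
    """Model of a GATT read: the radio wakes only for this exchange."""
    return table[service_uuid][char_uuid]["value"]

print(read_characteristic(gatt_table, 0x180F, 0x2A19))
```

Between such exchanges the peripheral can sleep almost entirely, which is where BLE's power advantage comes from.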


Introducing a new wireless protocol is incredibly difficult. You basically have to have all the countries in the world to give you a chunk of their spectrum.

You have 2 mainstream protocols now, one for low energy, slow data transfers (Bluetooth) and one for fast, but more power hungry devices.

I don't see the use case for UWB.


And as far as I know, Wi-Fi and Bluetooth didn't have frequency given to them. They instead used frequencies that were already available for unlicensed use.


yes-and-no.

In my opinion, it was the timing and usefulness of Bluetooth in an era when Nokia ruled the world. Moreover, there are many other reasons too.


Bluetooth took off because it was the easiest path to wireless headsets when hands-free driving laws passed.





